The Ask AI node can connect to any public Model Context Protocol (MCP) server and call its tools during the conversation. You can also connect multiple MCP servers simultaneously, allowing the AI to use tools from different services in a single workflow. This lets you integrate specialized or internal services directly with AI without waiting for a dedicated Gumloop node.


DeepWiki remote-MCP demo

Why Use It

  • Extended reach – access APIs and MCP features outside the current node library
  • Multi-tool workflows – combine several endpoints in one prompt
  • Multi-server workflows – connect multiple MCP servers for cross-service integration
  • Model-orchestrated execution – the model selects parameters and handles sequencing

Key Capabilities

Capability            Details
--------------------  ----------------------------------------------------
Tool calls            OpenAI Responses API · Anthropic Messages API
Multi-server support  Connect multiple MCP servers in a single Ask AI node
Authentication        OAuth Bearer tokens
Transport             Streamable HTTP · Server-Sent Events (SSE)
Supported models      GPT-4.1 · Claude 3.7 Sonnet · Claude 4 Sonnet

Ask AI MCP vs Native MCP Node

                      Ask AI + MCP                       Native MCP node
--------------------  ---------------------------------  -------------------------
Configuration         Toggle inside Ask AI directly      Guided wizard
Multi-server support  Connect multiple servers           One server per node
Tool calls            Selected at run time by the model  Defined in node script
Recommended for       Flexible or evolving tasks         Stable, repeatable jobs
Confirmation prompt   Not available                      Optional
Error handling        Controlled in the prompt           Built-in retries & guards

Note: Since Gumloop already provides native integrations for popular services like GitHub, Salesforce, Slack, Gmail, and Hubspot, the Ask AI MCP feature is most valuable for accessing specialized or niche services that aren’t covered by our standard node library.

When to Use Ask AI MCP vs Native MCP Nodes

Understanding when to use each approach is crucial for building effective workflows:

Use Ask AI MCP When:

  • You need raw, unrestricted access to an MCP server
  • You’re working with custom or experimental MCP servers
  • You want to combine multiple tools from the same server in one conversation
  • You want to combine tools from multiple MCP servers in one workflow
  • You need fine-grained control over tool calling behavior
  • You’re prototyping or testing MCP server functionality

Use Native MCP Nodes When:

  • You want a guided, user-friendly experience
  • You need visual workflow building with drag-and-drop
  • You want approval prompts before actions
  • You’re building production workflows that others will use
  • You need error handling and retry logic


Setting Up Remote MCP Credentials

Before using MCP servers in the Ask AI node, you need to configure your credentials:

Step 1: Add MCP Server Credentials

  1. Navigate to Settings > Credentials
  2. Search for “MCP Server” in the available integrations
  3. Click “Add Credential”

Step 2: Configure Server Details

Fill in the required information:

Field                Description                                         Example
-------------------  --------------------------------------------------  ----------------------
URL                  The MCP server endpoint (must start with https://)  https://mcp.stripe.com
Label                Unique identifier for this server                   stripe-production
Authorization Token  OAuth Bearer token (if required)                    sk_live_... (optional)

Step 3: Save and Test

  1. Click “Save Credential”
  2. Repeat for additional MCP servers if needed
  3. Enable Connect to MCP Server and test with the Ask AI node

Enabling MCP in Ask AI Node

  1. Add Ask AI Node: Drag an Ask AI node onto your canvas
  2. Open Advanced Options: Click “Show more options” in the node configuration
  3. Enable MCP: Toggle “Connect MCP Server?” to ON
  4. Select Server(s): Choose your configured MCP server(s) from the dropdown
    • Single Server: Select one server for focused integration
    • Multiple Servers: Select multiple servers from the dropdown for cross-service workflows

Node Configuration with MCP

When MCP is enabled, the Ask AI node behavior changes:

  • Tool Discovery: The node automatically imports all available tools from the MCP server(s)
  • Model Selection: Only works with MCP-compatible models (GPT-4.1, Claude 3.7 Sonnet, and Claude 4 Sonnet)
  • Multi-Server Mode: When multiple servers are connected, tools from different servers can be used in the same conversation
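Tool discovery corresponds to the MCP `tools/list` method: the client sends a JSON-RPC 2.0 request to the server's Streamable HTTP endpoint and receives the tool catalog back. A minimal sketch of that request payload (the method name comes from the MCP specification; the helper itself is illustrative):

```python
import json

def tools_list_request(request_id: int = 1) -> str:
    """JSON-RPC 2.0 payload an MCP client POSTs to a Streamable HTTP
    endpoint to discover the server's tools (MCP method: tools/list)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    })
```

The response lists each tool's name, description, and input schema, which is what the model sees when deciding which tool to call.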

Example Use Cases

Example 1: DeepWiki Repository Research

Scenario: Research technical documentation and specifications from GitHub repositories using the DeepWiki MCP server.

Setup:

MCP Server: DeepWiki (https://mcp.deepwiki.com/mcp)
Model: GPT-4.1
Authorization: Not required (public server)

Prompt:

I need to understand the Model Context Protocol specification.
Using the modelcontextprotocol/modelcontextprotocol repository:

1. What are the core transport protocols supported?
2. How does the authentication flow work?
3. What are the main message types in the protocol?
4. Are there any recent changes to error handling?

Provide a technical summary that I can share with my development team.

Expected Behavior:

  1. The AI uses DeepWiki’s ask_question tool to query the repository
  2. Searches through documentation and code for relevant information
  3. Synthesizes findings into a coherent technical summary
  4. Provides specific references to documentation sections
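Behind step 1, each tool invocation is an MCP `tools/call` request. A sketch of the payload the client would send (`tools/call` is the MCP specification's method name; the `repoName`/`question` argument names for DeepWiki's ask_question tool are assumptions for illustration):

```python
import json

def tools_call_request(tool: str, arguments: dict, request_id: int = 2) -> str:
    """JSON-RPC 2.0 payload invoking one MCP tool with model-chosen arguments."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical arguments the model might fill in for the prompt above
payload = tools_call_request("ask_question", {
    "repoName": "modelcontextprotocol/modelcontextprotocol",
    "question": "What are the core transport protocols supported?",
})
```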

Example 2: Custom Internal Tool Integration

Scenario: Connect to your company’s custom MCP server for internal tool access.

Setup:

MCP Server: Internal Analytics (https://analytics.yourcompany.com/mcp)
Model: Claude 4 Sonnet
Authorization: Bearer internal_token_123

Prompt:

I need a quarterly performance report. Please:

1. Get user engagement metrics for Q4 2024
2. Compare with Q3 2024 performance
3. Identify top 3 performing features
4. Flag any metrics that show concerning trends
5. Generate executive summary with key insights

Format this as a presentation-ready report with specific numbers and percentages.

Why This Works:

  • Your internal MCP server exposes specific analytics tools
  • The AI can combine multiple data sources in a single conversation
  • Complex business logic is handled by the MCP server’s tools
  • You get formatted output without manual data compilation

Example 3: Multi-Server Research + Analytics Workflow

Scenario: Research a technology topic and analyze its market impact using multiple data sources.

Connected Servers:

  • DeepWiki: Technical documentation research
  • Internal Analytics: Company performance data

Prompt:

I need to prepare a strategic report on GraphQL adoption. Please:

1. Use DeepWiki to research the latest GraphQL specifications and best practices
2. Query our internal analytics to see how our GraphQL APIs are performing
3. Synthesize both findings into an executive summary with recommendations

Focus on performance implications and developer productivity gains.

Multi-Server Workflow:

  1. DeepWiki tools: Research GraphQL documentation and recent updates
  2. Analytics tools: Pull internal API performance metrics and usage data
  3. AI synthesis: Combine insights from both sources into a coherent report
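When several servers are connected, their tool lists are merged into one namespace for the model. One way such merging could disambiguate colliding names is a `server.tool` prefix (a sketch only — server and tool names are illustrative, and Gumloop's actual strategy may differ):

```python
def merge_toolsets(server_tools: dict[str, list[str]]) -> dict[str, tuple[str, str]]:
    """Merge tools from several MCP servers into one namespace.
    Names exposed by more than one server get a 'server.tool' prefix
    so each tool stays unambiguously addressable."""
    counts: dict[str, int] = {}
    for tools in server_tools.values():
        for t in tools:
            counts[t] = counts.get(t, 0) + 1
    merged = {}
    for server, tools in server_tools.items():
        for t in tools:
            key = f"{server}.{t}" if counts[t] > 1 else t
            merged[key] = (server, t)  # remember which server owns the tool
    return merged

merged = merge_toolsets({
    "deepwiki": ["ask_question", "search"],
    "analytics": ["get_metrics", "search"],
})
# "search" collides, so it is exposed as deepwiki.search / analytics.search
```

This is also why explicit prompts ("Use DeepWiki to research X") help: they tell the model which server's tool to pick.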

Important Limitations & Considerations

Technical Limitations

  1. HTTPS Only: Only publicly accessible HTTPS servers are supported

    • Local STDIO servers cannot be connected
    • Server must be internet-accessible
  2. Single or Multiple Servers: Each Ask AI node can connect to one or multiple MCP servers

    • Use multiple servers for cross-service workflows
    • Consider using subflows for complex multi-server workflows
  3. No Approval Prompts: The node cannot request confirmation before tool calls

    • ⚠️ Caution: Avoid destructive actions like deleting files
    • Test thoroughly with non-production data first

Security Considerations

  1. Direct Tool Access: All MCP server tools are immediately available to the AI

    • Review server documentation to understand available actions
    • Use appropriate authorization tokens to limit access scope
  2. Data Sharing: Information in your prompt may be sent to the MCP server(s)

    • Be mindful of sensitive data in prompts
    • Review MCP server privacy policies
    • With multiple servers, consider cross-server data sharing implications

Best Practices

Single Server Workflows

  • Reference tools when precision matters – “Use list_customers …”
  • Define context and expected output – “Return a short summary suitable for an email digest”
  • Start with discovery – “List available tools and required inputs”

Multi-Server Workflows

  • Be specific about sources – “Use DeepWiki to research X, then check our analytics for Y”
  • Define clear context – “Combine data from all sources into a single executive summary”
  • Handle cross-server dependencies – “First get customer data, then use that to query billing information”

Troubleshooting

Single Server Issues

Issue                 Checklist
--------------------  -------------------------------------------
Cannot connect        URL correct? HTTPS? Token valid? Server up?
Auth failure on call  Token expired / wrong scope?
AI ignoring tools     Mention tool names or clarify the goal.

Multi-Server Issues

Issue                    Checklist
-----------------------  ------------------------------------------------
Tool conflicts           Check if similar tool names exist across servers
AI confusion with tools  Be more explicit about which server to use
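To diagnose the tool-conflict case, you can compare each server's `tools/list` results and flag any name exposed by more than one server (a hedged helper sketch, not a Gumloop API; server and tool names are illustrative):

```python
def find_tool_conflicts(server_tools: dict[str, list[str]]) -> dict[str, list[str]]:
    """Map each conflicting tool name to the servers that expose it."""
    owners: dict[str, list[str]] = {}
    for server, tools in server_tools.items():
        for t in tools:
            owners.setdefault(t, []).append(server)
    # Keep only names that appear on more than one server
    return {t: servers for t, servers in owners.items() if len(servers) > 1}

conflicts = find_tool_conflicts({
    "deepwiki": ["ask_question", "search"],
    "analytics": ["get_metrics", "search"],
})
# "search" is exposed by both servers, so prompts should name the server explicitly
```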

Remote MCP access in Ask AI provides a flexible way to integrate external APIs, with support for both single and multi-server workflows, while native MCP nodes offer a guided, repeatable approach. Choose the option that best fits your workflow requirements.