Connecting Remote MCP Servers with Ask AI
The Ask AI node can connect to any public Model Context Protocol (MCP) server and call its tools during the conversation. You can also connect multiple MCP servers simultaneously, allowing the AI to use tools from different services in a single workflow. This lets you integrate specialized or internal services directly with AI without waiting for a dedicated Gumloop node.
*Demo: DeepWiki remote-MCP server in Ask AI*
Why Use It
- Extended reach – access APIs and MCP features outside the current node library
- Multi-tool workflows – combine several endpoints in one prompt
- Multi-server workflows – connect multiple MCP servers for cross-service integration
- Model-orchestrated execution – the model selects parameters and handles sequencing
Key Capabilities
| Capability | Details |
|---|---|
| Tool calls | OpenAI Responses API · Anthropic Messages API |
| Multi-server support | Connect multiple MCP servers in a single Ask AI node |
| Authentication | OAuth Bearer tokens |
| Server transport | Streamable HTTP · Server-Sent Events (SSE) |
| Supported models | GPT-4.1 · Claude 3.7 Sonnet · Claude 4 Sonnet |
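To make the tool-call row above concrete: with the OpenAI Responses API, a remote MCP server is passed to the model as an `mcp`-type tool entry. The sketch below only builds the request body (no call is sent); the server label, URL, and prompt are illustrative, not Gumloop internals.

```python
# Hedged sketch: the shape in which a host application might hand a remote
# MCP server to the OpenAI Responses API ("mcp" tool type). The label and
# URL are placeholders, and no request is actually sent here.
mcp_tool = {
    "type": "mcp",
    "server_label": "deepwiki",
    "server_url": "https://mcp.deepwiki.com/mcp",
    "require_approval": "never",  # Ask AI cannot pause for approval prompts
}

request = {
    "model": "gpt-4.1",
    "tools": [mcp_tool],
    "input": "Summarize the transports supported by modelcontextprotocol/python-sdk",
}
print(request["tools"][0]["server_label"])
```

At run time the model, not the workflow, decides when to call the server's tools, which is why the comparison table below contrasts this with the native MCP node's scripted calls.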
Ask AI MCP vs Native MCP Node
| | Ask AI + MCP | Native MCP node |
|---|---|---|
| Configuration | Toggle inside Ask AI directly | Guided wizard |
| Multi-server support | Connect multiple servers | One server per node |
| Tool calls | Selected at run time by the model | Defined in node script |
| Recommended for | Flexible or evolving tasks | Stable, repeatable jobs |
| Confirmation prompt | Not available | Optional |
| Error handling | Controlled in the prompt | Built-in retries & guards |
Note: Since Gumloop already provides native integrations for popular services like GitHub, Salesforce, Slack, Gmail, and HubSpot, the Ask AI MCP feature is most valuable for accessing specialized or niche services that aren’t covered by our standard node library.
When to Use Ask AI MCP vs Native MCP Nodes
Understanding when to use each approach is crucial for building effective workflows:
Use Ask AI MCP When:
- You need raw, unrestricted access to an MCP server
- You’re working with custom or experimental MCP servers
- You want to combine multiple tools from the same server in one conversation
- You want to combine tools from multiple MCP servers in one workflow
- You need fine-grained control over tool calling behavior
- You’re prototyping or testing MCP server functionality
Use Native MCP Nodes When:
- You want a guided, user-friendly experience
- You need visual workflow building with drag-and-drop
- You want approval prompts before actions
- You’re building production workflows that others will use
- You need error handling and retry logic
Setting Up Remote MCP Credentials
Before using MCP servers in the Ask AI node, you need to configure your credentials:
Step 1: Add MCP Server Credentials
- Navigate to Settings > Credentials
- Search for “MCP Server” in the available integrations
- Click “Add Credential”
Step 2: Configure Server Details
Fill in the required information:
| Field | Description | Example |
|---|---|---|
| URL | The MCP server endpoint (must start with https://) | https://mcp.stripe.com |
| Label | Unique identifier for this server | stripe-production |
| Authorization Token | OAuth Bearer token, if the server requires one | sk_live_... (optional) |
Step 3: Save and Test
- Click “Save Credential”
- Repeat for additional MCP servers if needed
- Enable “Connect to MCP Server” and test with the Ask AI node
Enabling MCP in Ask AI Node
- Add Ask AI Node: Drag an Ask AI node onto your canvas
- Open Advanced Options: Click “Show more options” in the node configuration
- Enable MCP: Toggle “Connect MCP Server?” to ON
- Select Server(s): Choose your configured MCP server(s) from the dropdown
  - Single Server: Select one server for focused integration
  - Multiple Servers: Select multiple servers from the dropdown for cross-service workflows
Node Configuration with MCP
When MCP is enabled, the Ask AI node behavior changes:
- Tool Discovery: The node automatically imports all available tools from the MCP server(s)
- Model Selection: Only MCP-compatible models are supported (GPT-4.1, Claude 3.7 Sonnet, and Claude 4 Sonnet)
- Multi-Server Mode: When multiple servers are connected, tools from different servers can be used in the same conversation
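Tool discovery works by asking each server for its tool catalog. The snippet below parses an illustrative JSON-RPC `tools/list` result; the tool shown mirrors DeepWiki’s `ask_question`, but the exact schema is an assumption for the example.

```python
import json

# Illustrative only: a typical JSON-RPC "tools/list" result from an MCP
# server, and how a client can index the discovered tools by name.
sample_response = json.loads("""
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "tools": [
      {
        "name": "ask_question",
        "description": "Ask a question about a GitHub repository",
        "inputSchema": {
          "type": "object",
          "properties": {
            "repoName": {"type": "string"},
            "question": {"type": "string"}
          },
          "required": ["repoName", "question"]
        }
      }
    ]
  }
}
""")

# Index tools by name; these names are what the model chooses from at run time
tools = {t["name"]: t for t in sample_response["result"]["tools"]}
print(sorted(tools))  # ['ask_question']
```

With multiple servers connected, the node merges each server’s catalog, which is why similar tool names across servers can confuse the model (see Troubleshooting).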
Example Use Cases
Example 1: DeepWiki Repository Research
Scenario: Research technical documentation and specifications from GitHub repositories using the DeepWiki MCP server.
Setup:
Prompt:
Expected Behavior:
- The AI uses DeepWiki’s `ask_question` tool to query the repository
- Searches through documentation and code for relevant information
- Synthesizes findings into a coherent technical summary
- Provides specific references to documentation sections
Example 2: Custom Internal Tool Integration
Scenario: Connect to your company’s custom MCP server for internal tool access.
Setup:
Prompt:
Why This Works:
- Your internal MCP server exposes specific analytics tools
- The AI can combine multiple data sources in a single conversation
- Complex business logic is handled by the MCP server’s tools
- You get formatted output without manual data compilation
Example 3: Multi-Server Research + Analytics Workflow
Scenario: Research a technology topic and analyze its market impact using multiple data sources.
Connected Servers:
- DeepWiki: Technical documentation research
- Internal Analytics: Company performance data
Prompt:
Multi-Server Workflow:
- DeepWiki tools: Research GraphQL documentation and recent updates
- Analytics tools: Pull internal API performance metrics and usage data
- AI synthesis: Combine insights from both sources into a coherent report
Important Limitations & Considerations
Technical Limitations
- HTTP Only: Only publicly accessible HTTP(S) servers are supported
  - Local STDIO servers cannot be connected
  - Server must be internet-accessible
- Single or Multiple Servers: Each Ask AI node can connect to one or multiple MCP servers
  - Use multiple servers for cross-service workflows
  - Consider using subflows for complex multi-server workflows
- No Approval Prompts: The node cannot request confirmation before tool calls
  - ⚠️ Caution: Avoid destructive actions like deleting files
  - Test thoroughly with non-production data first
Security Considerations
- Direct Tool Access: All MCP server tools are immediately available to the AI
  - Review server documentation to understand available actions
  - Use appropriate authorization tokens to limit access scope
- Data Sharing: Information in your prompt may be sent to the MCP server(s)
  - Be mindful of sensitive data in prompts
  - Review MCP server privacy policies
  - With multiple servers, consider cross-server data sharing implications
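Since the node itself cannot restrict tools, scope control belongs on the server side. If you operate the MCP server, one simple pattern is to filter the tool catalog against an allowlist before exposing it; the tool names below are hypothetical, and this is a sketch of the pattern, not a Gumloop feature.

```python
# Hedged sketch: a server-side allowlist that narrows which tools the AI
# can ever see. All tool names here are hypothetical.
ALLOWED_TOOLS = {"list_customers", "ask_question"}

discovered = [
    {"name": "list_customers"},
    {"name": "delete_customer"},  # destructive; dropped by the allowlist
    {"name": "ask_question"},
]

# Only allowlisted tools are returned from tools/list, so the model never
# learns the destructive tool exists
exposed = [t for t in discovered if t["name"] in ALLOWED_TOOLS]
print([t["name"] for t in exposed])  # ['list_customers', 'ask_question']
```

Pairing an allowlist like this with a narrowly scoped authorization token limits damage even when the node fires tool calls without confirmation.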
Best Practices
Single Server Workflows
- Reference tools when precision matters – “Use `list_customers` …”
- Define context and expected output – “Return a short summary suitable for an email digest”
- Start with discovery – “List available tools and required inputs”
Multi-Server Workflows
- Be specific about sources – “Use DeepWiki to research X, then check our analytics for Y”
- Define clear context – “Combine data from all sources into a single executive summary”
- Handle cross-server dependencies – “First get customer data, then use that to query billing information”
Troubleshooting
Single Server Issues
| Issue | Checklist |
|---|---|
| Cannot connect | URL correct? HTTPS? Token valid? Server up? |
| Auth failure on call | Token expired / wrong scope? |
| AI ignoring tools | Mention tool names or clarify goal. |
Multi-Server Issues
| Issue | Checklist |
|---|---|
| Tool conflicts | Check if similar tool names exist across servers |
| AI confusion with tools | Be more explicit about which server to use |
Remote MCP access in Ask AI provides a flexible way to integrate external APIs, with support for both single and multi-server workflows, while native MCP nodes offer a guided, repeatable approach. Choose the option that best fits your workflow requirements.