Connect to your BigQuery warehouse to explore projects, inspect datasets and tables, and run SQL using clear natural language. Build repeatable workflows that pull exactly the fields you care about as structured data ready for analysis, reporting, and downstream automation.
Work across your Google Cloud projects to list datasets and tables, inspect metadata, and execute SQL with structured outputs that plug directly into your workflows.

For a general overview of MCP nodes, see How to Use MCP Nodes.

What is Google BigQuery MCP?

The Google BigQuery MCP creates a customized node that understands BigQuery concepts like projects, datasets, tables, schemas, and SQL. You can ask for what you need in plain language and receive structured data outputs that flow into the rest of your automation. For best results, mention the Google Cloud project ID in your prompts to guide the AI to the right resources.

What Can It Do for You?

  • Discover which Google Cloud projects, datasets, and tables you can access
  • Inspect dataset and table metadata including schema, row counts, and size
  • Run SQL queries in BigQuery and return results as structured data
  • Power reliable data workflows by chaining outputs into analysis, reporting, or storage steps

Available Tools

  • List Project Ids: Lists all Google Cloud project IDs accessible by the authenticated user. Example: “List accessible Google Cloud project IDs and return structured data with project_id”
  • List Dataset Ids: Lists all dataset IDs in a BigQuery project. Example: “Using `project id`, list all dataset IDs in the project and return structured data with dataset_id”
  • List Table Ids: Lists all table IDs in a BigQuery dataset. Example: “Using `project id` and `dataset id`, list table IDs and return structured data with table_id”
  • Get Dataset Info: Gets metadata about a dataset, including description, location, and timestamps. Example: “Using `project id` and `dataset id`, get dataset metadata and return structured data with dataset_id, description, location, created, modified”
  • Get Table Info: Gets table metadata, including schema, row count, and size. Example: “Using `project id`, `dataset id`, and `table id`, get table metadata and return structured data with table_id, row_count, size_bytes, schema”
  • Execute Sql: Executes a SQL query on BigQuery and returns the results. Example: “In `project id`, run SQL `sql query` and return structured data with column names and rows”
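
You never run these calls yourself; the node translates your prompt into the right BigQuery operations. Purely as a reference for what the tools correspond to, here is a minimal sketch using the google-cloud-bigquery Python client, with placeholder project, dataset, and table IDs (the node’s actual implementation may differ):

```python
# Illustrative only: the MCP node performs these operations for you.
# Assumes the google-cloud-bigquery package and default credentials;
# "my-project", "my_dataset", and "my_table" are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# List Project Ids
project_ids = [p.project_id for p in client.list_projects()]

# List Dataset Ids / List Table Ids
dataset_ids = [d.dataset_id for d in client.list_datasets("my-project")]
table_ids = [t.table_id for t in client.list_tables("my-project.my_dataset")]

# Get Dataset Info
dataset = client.get_dataset("my-project.my_dataset")
dataset_info = {
    "dataset_id": dataset.dataset_id,
    "description": dataset.description,
    "location": dataset.location,
    "created": dataset.created,
    "modified": dataset.modified,
}

# Get Table Info
table = client.get_table("my-project.my_dataset.my_table")
table_info = {
    "table_id": table.table_id,
    "row_count": table.num_rows,
    "size_bytes": table.num_bytes,
    "schema": [field.name for field in table.schema],
}

# Execute Sql
rows = list(client.query("SELECT 1 AS example").result())
```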

How to Use

Step 1: Create Your Google BigQuery MCP Node

Go to your node library, search for Google BigQuery, and click “Create a node with AI”.
Step 2: Add Your Prompt

Drag the Google BigQuery MCP node to your canvas and add your prompt in the text box. For best results, include the Google Cloud project ID in your prompt when you know it.
Step 3: Test Your Node

Run the node to see the results. If it works as expected, you’re all set! If you run into issues, check the troubleshooting tips below.
Step 4: Save and Reuse

Once your Google BigQuery MCP node is working, save it to your library. You can now use this customized node in any workflow.

Example Prompts

Here are some prompts that work well with Google BigQuery MCP:
Discover Projects
List accessible Google Cloud project IDs and return structured data with project_id
List Datasets
Using `project id`, list all dataset IDs in the project and return structured data with dataset_id
List Tables
Using `project id` and `dataset id`, list table IDs and return structured data with table_id
Inspect Metadata
Using `project id`, `dataset id`, and `table id`, get table metadata and return table_id, row_count, size_bytes, schema
Run SQL
In `project id`, run SQL `sql query` and return structured data with column names and rows
Mention the Google Cloud project ID in your prompt to reduce ambiguity, for example “In `project id`, run SQL `sql query`”. Start simple with a single action per node, request only the fields you need, and chain nodes for multi-step workflows.
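
The exact shape of the structured output follows whatever fields you ask for in the prompt. As a rough illustration only (the field names and values below are hypothetical), the List Tables and Inspect Metadata prompts above might return something like:

```python
# Hypothetical output shapes; the actual fields depend on your prompt.
list_tables_output = [           # "List Tables" prompt
    {"table_id": "orders"},
    {"table_id": "customers"},
]

inspect_metadata_output = {      # "Inspect Metadata" prompt
    "table_id": "orders",
    "row_count": 120000,
    "size_bytes": 4500000,
    "schema": ["order_id", "order_date", "total"],
}
```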

Troubleshooting

If your Google BigQuery MCP node isn’t working as expected, try these best practices:

Keep Prompts Simple and Specific

  • Good: “Using `project id` and `dataset id`, list table IDs and return table_id”
  • Bad: “List all datasets across all projects, then run a query on each and summarize the results”
While this prompt might work, it’s more efficient to break it into separate nodes. Google BigQuery MCP works best with focused, single-action prompts.

Match What Google BigQuery Can Do

  • Good: “Using `project id`, `dataset id`, and `table id`, get table metadata and return row_count and size_bytes”
  • Bad: “Run a query, format a chart, and email it to `recipient email`”
Google BigQuery MCP focuses on BigQuery data operations. For emailing or formatting, combine it with your email or spreadsheet nodes in your workflow.

Break Complex Tasks Into Steps

Instead of trying to do everything in one prompt (which can cause timeouts and errors):
List datasets in `project id`, list tables updated after `date`, get their row counts, and email a summary using `email template`
Break this into smaller, focused nodes that each handle one task:
Step 1: List Datasets

In `project id`, list dataset IDs and return structured data with dataset_id
Step 2: List Tables

Using `project id` and `dataset id`, list table IDs and return structured data with table_id
Step 3: Get Table Info

Using `project id`, `dataset id`, and `table id`, get table metadata and return structured data with table_id, row_count, size_bytes, schema
In your workflow, connect these nodes sequentially. The dataset IDs from Step 1 become inputs to Step 2, and the table IDs from Step 2 feed into Step 3.
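
Conceptually, the chain behaves like the sketch below. The workflow passes these values between nodes for you; the code only illustrates the data flow, assuming the google-cloud-bigquery Python client and a placeholder project ID.

```python
# Illustrative data flow for Steps 1-3; the workflow handles this for you.
from google.cloud import bigquery

client = bigquery.Client()
project_id = "my-project"  # placeholder

# Step 1: list dataset IDs
dataset_ids = [d.dataset_id for d in client.list_datasets(project_id)]

# Step 2: each dataset ID from Step 1 becomes an input for listing tables
tables_by_dataset = {
    ds: [t.table_id for t in client.list_tables(f"{project_id}.{ds}")]
    for ds in dataset_ids
}

# Step 3: each (dataset, table) pair from Step 2 feeds the metadata lookup
table_info = []
for ds, table_ids in tables_by_dataset.items():
    for table_id in table_ids:
        table = client.get_table(f"{project_id}.{ds}.{table_id}")
        table_info.append({
            "table_id": table.table_id,
            "row_count": table.num_rows,
            "size_bytes": table.num_bytes,
            "schema": [field.name for field in table.schema],
        })
```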

Focus on Data Retrieval

Google BigQuery MCP is great at getting information from BigQuery. For analysis or content creation, connect it to other nodes. Example:
  • Good prompt: “In `project id`, run SQL `sql query` and return structured data with column names and rows”
  • Bad prompt: “Run SQL, analyze the trends, and draft an executive summary”
Use the Ask AI node for analysis and summarization. Connect the BigQuery results to Ask AI to generate insights or summaries.
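
For reference, a retrieval-only step maps to something like the sketch below, again assuming the google-cloud-bigquery Python client and a placeholder query; the trend analysis and summary writing would happen in a downstream Ask AI node.

```python
# Illustrative only: retrieve columns and rows, then hand them downstream.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID
sql = "SELECT order_date, SUM(total) AS revenue FROM my_dataset.orders GROUP BY order_date"  # placeholder query

result = client.query(sql).result()
columns = [field.name for field in result.schema]   # column names
rows = [list(row.values()) for row in result]       # row values

# Structured output for the next node (e.g. Ask AI) to analyze or summarize.
structured = {"columns": columns, "rows": rows}
```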

Troubleshooting Node Creation

  • If you see empty outputs in the node creation window (or, for a node you’ve already created, hover over it and click “Edit”), use the chat interface to prompt the AI to add debug logs and verify the API response. You can also click “Request changes” to ask the AI to adjust the implementation.
  • Describe what you expected versus what you received, then click “Request changes” to guide the AI to refine filters, fields, or logic.
  • Click “Fix with Gummie” first. If multiple attempts do not resolve the issue, simplify your prompt or contact support.
  • Use the chat interface in the node creation window to refine filters, output fields, or pagination. The AI will adjust the node based on your feedback.

Need More Help?