mcp-gsc is a Python MCP server that connects Google Search Console to Claude Code. Once connected, Claude can pull search analytics, inspect URLs, check indexing status, and manage sitemaps — all through tool calls.
I set it up for 0xinsider.com and ran an audit. The first thing it found was that my robots.ts file had disallow: ["/"], which tells crawlers to avoid the entire site. That had been live for weeks.
Without this, here’s how that would’ve gone: open Search Console, notice low impressions, scratch my head, check a few pages manually, eventually think to look at robots.txt, open VS Code, find the file, spot the conflict, fix it. Thirty minutes if I’m lucky, days if I don’t think to check robots.
With the MCP server, Claude pulled the performance data and the robots config in the same turn. It saw low impressions next to a misconfigured crawl rule and connected the dots. I typed “fix it” and the file was updated. The whole thing took seconds.
That’s the real value — not just reading data faster, but closing the loop between diagnosis and fix. Every SEO workflow I’ve done before involved jumping between the Search Console dashboard, a spreadsheet, and my editor. Now it’s one conversation.
Setup
The server uses OAuth to authenticate with your Google account. You need a Google Cloud project with the Search Console API enabled.
- Go to Google Cloud Console, create a project (or use an existing one)
- Enable the Search Console API
- Go to Credentials, create an OAuth 2.0 client ID (Desktop app type)
- Download the JSON file as client_secrets.json
Clone the server and install dependencies:
```bash
git clone https://github.com/AminForou/mcp-gsc.git
cd mcp-gsc
uv venv .venv && source .venv/bin/activate
uv pip install -r requirements.txt
```

Place your client_secrets.json in the mcp-gsc directory. On first run, the server opens a browser for OAuth consent. The token gets saved to token.json.
Add it to your Claude Code MCP config:
{ "mcpServers": { "gsc": { "command": "/path/to/mcp-gsc/.venv/bin/python", "args": ["/path/to/mcp-gsc/gsc_server.py"] } }}Restart Claude Code. You should see GSC tools available.
What it found on 0xinsider.com
I asked Claude to investigate search performance. It pulled performance overview, top queries, top pages, device breakdown, country data, sitemaps, and URL inspection results — all in parallel, in one turn.
28-day numbers:
| Metric | Value |
|---|---|
| Clicks | 59 |
| Impressions | 5,116 |
| Average CTR | 1.15% |
| Average Position | 6.2 |
Here’s what it found:
robots.ts was blocking crawlers. The file had both allow: "/" and disallow: ["/"]. When these conflict, the more specific rule wins — so Google was being told not to crawl anything. Here’s what it looked like:
```ts
// Before (broken)
rules: {
  userAgent: "*",
  allow: "/",
  disallow: ["/"],
}
```
```ts
// After (fixed)
rules: {
  userAgent: "*",
  allow: "/",
  disallow: ["/api/"],
}
```

The learn guide had a search intent mismatch. The page /learn/how-to-trade-prediction-markets had 1,411 impressions and 1 click — a 0.07% CTR. The queries driving those impressions were about “how to invest in prediction markets through stocks,” not about trading on Polymarket directly. The title said “How to Trade” but users were searching for “How to Invest.” Changed the title to match the actual queries.
Leaderboard wasn’t ranking for its own keyword. The /leaderboard page was showing up at position 12-48 for “polymarket leaderboard” queries. The title was “Top Polymarket Traders — Most Advanced Trader Rankings.” It didn’t contain the word “leaderboard.” Changed it to “Polymarket Leaderboard — Top Traders Ranked by Profit.”
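In a Next.js app-router project like this one, that kind of title change is usually a one-line edit to the page’s metadata export. A minimal sketch, assuming the page lives at app/leaderboard/page.tsx (the path and surrounding fields are my assumptions; only the title strings come from the change described above):

```ts
// app/leaderboard/page.tsx (assumed path)
import type { Metadata } from "next";

export const metadata: Metadata = {
  // Old: "Top Polymarket Traders — Most Advanced Trader Rankings"
  title: "Polymarket Leaderboard — Top Traders Ranked by Profit",
  // description and other fields unchanged
};
```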
Trader profiles had generic meta descriptions. Every profile page had the same template: “Deep dive into @name’s Polymarket strategy. Full P&L breakdown…” — no actual stats. Updated generateMetadata to fetch trader data and include real numbers: P&L, win rate, markets traded. So now a profile shows something like “@gabagool22 Polymarket Profile — $214K P&L, 52% win rate” in search results instead of a generic blurb.
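Here’s roughly what that change looks like. This is a sketch, not the site’s actual code: the route path, the getTrader helper, and its field names are all assumptions, and the title format mirrors the example above.

```ts
// app/trader/[address]/page.tsx (route, helper, and field names are assumptions)
import type { Metadata } from "next";
import { getTrader } from "@/lib/traders"; // hypothetical data helper

type Props = { params: { address: string } };

export async function generateMetadata({ params }: Props): Promise<Metadata> {
  const trader = await getTrader(params.address);

  // Keep a sane fallback if the trader can't be fetched
  if (!trader) {
    return { title: "Polymarket Trader Profile" };
  }

  const pnl = compactUsd(trader.pnl);               // e.g. "$214K"
  const winRate = Math.round(trader.winRate * 100); // e.g. 52

  return {
    title: `@${trader.handle} Polymarket Profile — ${pnl} P&L, ${winRate}% win rate`,
    description: `${pnl} P&L, ${winRate}% win rate across ${trader.marketsTraded} markets on Polymarket. Full trade history and open positions.`,
  };
}

// Format 214321 -> "$214K"
function compactUsd(value: number): string {
  return new Intl.NumberFormat("en-US", {
    style: "currency",
    currency: "USD",
    notation: "compact",
    maximumFractionDigits: 0,
  }).format(value);
}
```

The same fetched numbers feed the shorter description, which is also what the desktop CTR fix below leans on.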
Desktop CTR was 5x worse than mobile. Desktop: 4,321 impressions, 0.74% CTR. Mobile: 767 impressions, 3.52% CTR. The meta descriptions were too long and generic for desktop SERPs where Google shows more text. Shortened them and added specific numbers.
All seven fixes happened in the same Claude Code session where the data was pulled. No tab switching, no exporting CSVs, no copying query data into a spreadsheet.
The tools
The server exposes 19 tools. The ones I used most:
| Tool | What it does |
|---|---|
| get_performance_overview | Clicks, impressions, CTR, position by day |
| get_search_analytics | Top queries or pages, grouped by dimension |
| get_advanced_search_analytics | Same but with filters, sorting, pagination |
| batch_url_inspection | Check indexing and rich results for multiple URLs at once |
| get_search_by_page_query | See which queries drive traffic to a specific page |
| compare_search_periods | Diff two time ranges |
batch_url_inspection is the one that saves the most time. In the Search Console UI, you inspect URLs one at a time. With the MCP tool, you pass 5 or 10 URLs and get back indexing status and rich result detection for all of them in one call.
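Normally Claude is the one making these calls, but the server is just a stdio MCP process, so you can poke at it from a script too. Here’s a sketch using the official MCP TypeScript client SDK; the argument names I pass to batch_url_inspection are guesses on my part, which is why the sketch prints the tool schemas first.

```ts
// check-urls.ts: call the GSC MCP server directly, outside Claude Code
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Same command/args as the Claude Code MCP config above
  const transport = new StdioClientTransport({
    command: "/path/to/mcp-gsc/.venv/bin/python",
    args: ["/path/to/mcp-gsc/gsc_server.py"],
  });

  const client = new Client({ name: "gsc-audit-script", version: "0.1.0" });
  await client.connect(transport);

  // Print each tool's input schema; the argument names below are assumptions
  const { tools } = await client.listTools();
  for (const tool of tools) {
    console.log(tool.name, JSON.stringify(tool.inputSchema));
  }

  // Assumed argument names (site_url, urls); check the schema printed above
  const result = await client.callTool({
    name: "batch_url_inspection",
    arguments: {
      site_url: "https://0xinsider.com/",
      urls: [
        "https://0xinsider.com/leaderboard",
        "https://0xinsider.com/learn/how-to-trade-prediction-markets",
      ],
    },
  });

  console.log(JSON.stringify(result, null, 2));
  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

This is mostly useful for sanity-checking the OAuth setup and the tool schemas outside a conversation.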
get_search_by_page_query is useful for diagnosing CTR problems. You see a page with 30,000 impressions and 0.11% CTR, so you pull the queries for that page and discover the title doesn’t match what people are actually searching for.
What makes this different from just using the GSC dashboard
The dashboard shows data. Claude Code shows data and has access to the codebase at the same time.
When Claude finds that a page has high impressions and low CTR, it can read the page’s frontmatter, see the current title and meta description, compare them against the actual search queries, and propose a change — all without you navigating anywhere. The gap between finding a problem and fixing it is one sentence: “fix it.”
The robots.txt bug is a good example. The performance data alone didn’t explain why impressions were low. The robots data alone looked like a normal config file. It was the combination — seeing poor performance alongside a misconfigured robots file — that surfaced the real issue.