GA4 + GSC + Looker Studio Dashboard for Crypto SEO Retainers

The exact analytics stack we ship inside every Foundation engagement — GA4 events, GSC integration, Supabase storage, Looker Studio dashboards. Built for crypto verticals where compliance and tracking constraints intersect.

The dashboard a crypto SEO client gets at the end of a Foundation engagement is, in many ways, the most lasting artefact of the work. Articles age, schema gets refactored, but the dashboard keeps measuring whatever ranking and AI-citation work runs after — including in-house work after the agency relationship ends. We hand over the dashboard, the runbook, and the build pipeline.

This post is the actual stack: tools, events, schema, and the gotchas that bite specifically on crypto sites where geo-blocking and tracking-consent constraints are higher than usual.

Quick facts

Stack: GA4 + GSC + Supabase + Looker Studio (or Metabase)
Build time: 1.5–2 days inside a Foundation engagement
Daily ingestion: cron-triggered fetch from the GA4 API + GSC API into Supabase
Update cadence: hourly for live metrics; daily for ranking deltas
Data retention: 25 months (GA4 default) + Supabase (indefinite, client-owned)
Cost (monthly): $0 GA4 + $25 Supabase + $0 Looker Studio

What’s the architecture?

Four-layer pipeline. Layer 1 — collection: GA4 on the site for traffic + events; Google Search Console for query/page-level performance; Bing Webmaster Tools for the smaller-but-real Bing/Yahoo share (often 8–14% of crypto traffic in EN markets); Cloudflare Web Analytics as a privacy-friendly fallback.

Layer 2 — ingestion: nightly cron via n8n or GitHub Actions calling GA4 Data API v1 and GSC Search Analytics API. Outputs land in Supabase tables (one per source: ga4_sessions, gsc_queries, gsc_pages, bing_queries, etc.). Schema is denormalised — each row is one date × one query × one page so joins are trivial.
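As a sketch of the Layer 2 flattening step (function and field names are ours, not from the build — the response shape mirrors what the GSC Search Analytics API returns when queried with the `query` and `page` dimensions):

```python
from datetime import date

def flatten_gsc_rows(api_rows, fetch_date):
    """Flatten a GSC Search Analytics response into one record per
    date x query x page, matching the denormalised gsc_queries table."""
    records = []
    for row in api_rows:
        query, page = row["keys"]  # order follows the requested dimensions
        records.append({
            "date": fetch_date.isoformat(),
            "query": query,
            "page": page,
            "clicks": row["clicks"],
            "impressions": row["impressions"],
            "ctr": row["ctr"],
            "position": row["position"],
        })
    return records

# Illustrative response fragment, not client data:
sample = [
    {"keys": ["mica license cost", "/blog/mica-license"],
     "clicks": 12, "impressions": 340, "ctr": 0.035, "position": 6.2},
]
rows = flatten_gsc_rows(sample, date(2025, 1, 15))
# Records can then be upserted into Supabase, e.g. with supabase-py:
# supabase.table("gsc_queries").upsert(rows).execute()
```

Because every record already carries date, query, and page, downstream joins in Layers 3–4 are plain equality joins with no unnesting.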

Layer 3 — enrichment: a separate cron joins GSC query data with our internal keyword cluster taxonomy (which queries belong to which jurisdiction, which intent bucket). This is where the GA4–GSC link (which common dashboards skip) gets done correctly: clicks from GSC matched to landing pages → matched to GA4 conversions on the same landing pages within a 30-day window.
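The 30-day matching rule can be expressed as a small pure function (in the live build this runs as SQL inside Supabase; the Python version below is a sketch with hypothetical names):

```python
from datetime import date, timedelta

def attribute_conversions(gsc_clicks, ga4_conversions, window_days=30):
    """Count GA4 conversions that occurred on the same landing page
    within `window_days` after a GSC click date. Inputs are lists of
    dicts with 'page' (str) and 'date' (datetime.date) keys."""
    attributed = {}
    for conv in ga4_conversions:
        for click in gsc_clicks:
            in_window = click["date"] <= conv["date"] <= click["date"] + timedelta(days=window_days)
            if click["page"] == conv["page"] and in_window:
                attributed[conv["page"]] = attributed.get(conv["page"], 0) + 1
                break  # count each conversion at most once
    return attributed

clicks = [{"page": "/pricing", "date": date(2025, 1, 1)}]
convs = [{"page": "/pricing", "date": date(2025, 1, 20)},   # inside window
         {"page": "/pricing", "date": date(2025, 3, 1)}]    # outside window
print(attribute_conversions(clicks, convs))  # {'/pricing': 1}
```

Note this is page-level, not user-level, attribution: it answers "did this landing page convert search demand", which is the question the dashboard actually reports on.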

Layer 4 — visualisation: Looker Studio for client-facing dashboards (free, easy to share with view-only access); Metabase for internal SQL exploration (Authority-tier clients get a Metabase login too).

What events does GA4 actually need to track?

Five custom events beyond the GA4 defaults. audit_call_book — fires when someone clicks the “Book audit call” CTA. Parameters: cta_location (header/footer/inline), referrer_path (where they came from). This is the lead-attribution backbone.

pricing_view — fires after five seconds of engagement on /pricing (a proxy for actual reading vs. bounce). Parameters: scroll_depth, time_on_page. Used to identify which jurisdictions/services drive pricing-page traffic.

case_study_engaged — same idea for /cases/* pages. Parameter: case_slug.

outbound_link_click — outbound link clicks for citation analysis. Parameter: target_domain.

form_submit — split into form_submit_success and form_submit_error for better attribution accuracy.

We avoid the everything-is-an-event pattern; five events is enough to drive 95% of analysis questions, and more events fragment the data without adding insight.
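On the ingestion side, daily counts for these five events come out of the GA4 Data API with a single filtered report. A sketch of the request body (the property ID and date range are placeholders):

```python
# Events the pipeline pulls; form_submit is stored as its success variant.
CUSTOM_EVENTS = [
    "audit_call_book", "pricing_view", "case_study_engaged",
    "outbound_link_click", "form_submit_success",
]

def build_event_report_request(start_date, end_date):
    """GA4 Data API v1 runReport body: daily eventCount per custom event."""
    return {
        "dateRanges": [{"startDate": start_date, "endDate": end_date}],
        "dimensions": [{"name": "date"}, {"name": "eventName"}],
        "metrics": [{"name": "eventCount"}],
        "dimensionFilter": {
            "filter": {
                "fieldName": "eventName",
                "inListFilter": {"values": CUSTOM_EVENTS},
            }
        },
    }

req = build_event_report_request("2025-01-01", "2025-01-31")
# POST to https://analyticsdata.googleapis.com/v1beta/properties/{PROPERTY_ID}:runReport
```

Keeping the event list this short is what makes the report a single cheap API call instead of a paginated crawl.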

How do you handle crypto-specific tracking constraints?

Two recurring issues. Consent banners and GA4 consent mode: EU users need explicit consent for analytics under GDPR. GA4 Consent Mode v2 lets you collect aggregated, cookieless pings even before consent — without it, non-consenting EU traffic is simply invisible in GA4. We ship Consent Mode v2 with every Foundation engagement; it’s a 30-minute Tag Manager change that generic agencies commonly miss.

Geo-blocking impact on GA4: some crypto sites geo-block US/UK/sanctioned jurisdictions at the CDN layer. Geo-blocked users receive a 4xx before the GA4 script ever loads, so they’re invisible to GA4 entirely. This is fine for GA4 reporting (you don’t want to track them anyway), but it means CWV field data is measured only on the smaller geo-permitted user base. We add a CDN-edge counter (Cloudflare Workers Analytics) to surface the geo-blocked count separately.
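With the edge counter in place, the share of demand that never reaches GA4 is a one-line calculation the dashboard can report alongside sessions (numbers below are illustrative, not client data):

```python
def geo_block_share(edge_requests_blocked, ga4_sessions):
    """Share of total demand blocked at the CDN before GA4 could load.
    `edge_requests_blocked` comes from the Workers counter,
    `ga4_sessions` from GA4, both for the same reporting period."""
    total = edge_requests_blocked + ga4_sessions
    return edge_requests_blocked / total if total else 0.0

print(round(geo_block_share(2_000, 8_000), 2))  # 0.2
```

Reporting this share explicitly stops "traffic dropped" conversations that are really "the CDN rule changed" conversations.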

What does the client-facing dashboard show?

Eight panels on the monthly delta report. Panel 1 — Cluster ranking: average position per jurisdiction × per intent bucket, with month-over-month delta. Panel 2 — Indexed pages: count over time + new/lost pages this month. Panel 3 — GSC clicks + impressions: filtered to commercial intent queries only (we tag this with the keyword taxonomy). Panel 4 — AI citation share: from the prompt panel (Perplexity weekly, others monthly).

Panel 5 — Lead funnel: Audit call CTAs → form submits → qualified leads (manual tag from sales). Panel 6 — Top performing pages: by clicks, by conversions, by AI citations. Panel 7 — Content health: pages with stale dateModified, broken schema, low CTR. Panel 8 — Anomalies: any metric ≥1.5σ outside its 90-day mean, flagged for review.
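The Panel 8 anomaly rule is deliberately simple. A minimal sketch of the 1.5σ flag (function name and thresholds are ours; the live check runs per metric over its trailing 90 daily values):

```python
from statistics import mean, stdev

def flag_anomaly(history, current, threshold_sigma=1.5):
    """Flag `current` if it sits at least `threshold_sigma` standard
    deviations away from the mean of `history` (trailing daily values)."""
    if len(history) < 2:
        return False  # not enough data to estimate spread
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu  # flat history: any change is an anomaly
    return abs(current - mu) >= threshold_sigma * sigma

baseline = [100, 102, 98, 101, 99, 100, 103, 97]
print(flag_anomaly(baseline, 100))  # False
print(flag_anomaly(baseline, 140))  # True
```

Flagged metrics go to a human for review rather than into the client report directly, which keeps false positives from the 1.5σ threshold cheap.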

What’s the most-asked feature on the dashboard?

“Did this article move the needle?” — clients want to know which content investments paid back. We answer this with a per-article report: clicks attributed (GSC), AI citations attributed (panel diff), referring domains attributed (Ahrefs API), and revenue/leads attributed (form events). The answer is rarely “this exact article” — it’s usually “this cluster of articles plus the on-page rewrite”. The report shows the cluster effect rather than promising single-article ROI.
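Structurally, the per-article report is just a merge of per-source metrics keyed by page URL (field names below are hypothetical; the Ahrefs and citation-panel inputs arrive as the same URL-keyed maps):

```python
def per_article_report(pages, gsc_clicks, ai_citations, form_leads):
    """Merge per-source metrics into one row per article.
    Each metric input maps page URL -> count; missing pages count as 0."""
    report = []
    for page in pages:
        report.append({
            "page": page,
            "clicks": gsc_clicks.get(page, 0),
            "ai_citations": ai_citations.get(page, 0),
            "leads": form_leads.get(page, 0),
        })
    # Clients ask about single articles, but the honest unit is the
    # cluster: sum rows across the cluster before drawing conclusions.
    return report

rows = per_article_report(
    ["/blog/mica-license"],
    gsc_clicks={"/blog/mica-license": 420},
    ai_citations={"/blog/mica-license": 3},
    form_leads={},
)
print(rows[0]["leads"])  # 0
```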

Frequently asked questions

Why not just use Ahrefs or SE Ranking dashboards? Those are great for ranking data on its own, but they don’t merge it with GA4 conversions or AI citation panels. Our stack is the merge layer.

Why Supabase instead of BigQuery? GA4 → BigQuery is free up to a quota and works well for high-volume sites. For most crypto SEO clients (sub-100k monthly sessions), Supabase Postgres at $25/mo is simpler, faster to query, and cheaper than BigQuery’s slot pricing for the same workload. We use BigQuery for clients above ~250k sessions/month.

Can the client take the dashboard with them after the engagement? Yes — that’s the design. The Supabase project is set up under the client’s account from day one (we have temporary access to build it), and the Looker Studio dashboard is shared with the client’s email. After engagement closure, we revoke our access; the client retains everything.

Does this work for token launches with anti-tracking users? Partially. Privacy-aware crypto users disable GA4 with extensions; expected miss rate is 25–40%. Cloudflare Web Analytics (server-side, no client JS) catches the missed traffic at aggregate level. Combined view = ~95% accurate.

Oleh Bielyakov

Head of Ads · 7 yrs

Oleh leads paid acquisition and analytics across chyzh.agency engagements — Google Ads (full 2026 cert stack), Meta Ads, Bing, Microsoft Advertising, GA4 and GTM. Seven-plus years closing the loop between SEO content and paid demand. On crypto-seo.pro engagements he runs the analytics build (GA4, Looker Studio, Supabase pipeline), audits Merchant Center and feed quality where applicable, and benchmarks paid CAC against organic-attributed pipeline so the SEO retainer can be defended on commercial outcomes, not just ranking screenshots.

LinkedIn ↗

Want this thinking applied to your domain?

Free 30-minute audit call. Bring the domain and the keyword cluster — we'll tell you whether Foundation, Growth or Authority fits.