Enabling the GA4 BigQuery export sounds straightforward — and mechanically it is. But before you click through the setup, a handful of decisions about GCP project structure, dataset location, and export frequency have long-term consequences for cost, data organisation, and how easy the data is to work with later. This guide covers the setup steps and the thinking behind each decision.
Before you start — decisions to make first
These choices affect everything downstream. Making them deliberately before setup is significantly easier than changing them later.
Dataset location: for most organisations, US (multi-region) or EU (multi-region) is appropriate. If you have data residency requirements, choose accordingly. Mismatched regions between datasets you want to join will cause query failures.
Step 1: Set up a GCP project and enable BigQuery
Go to console.cloud.google.com. If you don't have a GCP project, create one — give it a name that will make sense when you're looking at billing and IAM six months from now. Something like analytics-prod or company-analytics is better than a generated name.
- In the GCP console, navigate to APIs & Services → Library
- Search for "BigQuery API" and enable it if not already enabled
- Navigate to Billing and confirm a billing account is attached to the project — the export will fail silently without one
- Note the Project ID — you'll need it when linking in GA4
Step 2: Grant the GA4 service account access
GA4 exports data using a Google-managed service account. This account needs permission to create datasets and write data to your BigQuery project. Without this, the export link will appear to work in GA4 but nothing will arrive in BigQuery.
- In your GCP project, go to IAM & Admin → IAM
- Click Grant Access
- In the "New principals" field, enter:
analytics-processing@system.gserviceaccount.com
- Assign the role: BigQuery Data Editor
- Save
This service account is the same for all GA4 properties — it's a Google-managed account, not one specific to your property. The Data Editor role gives it permission to create the dataset and write tables within your project without broader access to other GCP resources.
Step 3: Link BigQuery in GA4
The link is configured from GA4's side, not from BigQuery. You need Editor or Administrator role on the GA4 property to do this.
- In GA4, go to Admin → Property → BigQuery Links
- Click Link
- Click Choose a BigQuery project and select your GCP project from the list — if it doesn't appear, confirm you're logged in with a Google account that has access to both GA4 and the GCP project
- Choose the data location for the BigQuery dataset — this is the region decision from earlier
- Select export frequency: Daily, Streaming, or both
- Select which data streams to export — typically your main web stream, but you can select multiple if your property has app streams too
- Review and click Submit
What gets created automatically
After linking, GA4 creates a dataset in your BigQuery project named analytics_PROPERTYID, where PROPERTYID is your GA4 property's numeric ID. Within this dataset it creates date-sharded tables (one table per day) named events_YYYYMMDD for each day's export, and events_intraday_YYYYMMDD for intraday streaming data if enabled.
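Once the dataset exists, you can confirm which daily tables have landed with a metadata query. This is a sketch: my-project and analytics_123456789 are placeholders for your actual project ID and dataset name.

```sql
-- List exported event tables, newest first.
-- `my-project` and `analytics_123456789` are placeholders.
SELECT table_name
FROM `my-project.analytics_123456789.INFORMATION_SCHEMA.TABLES`
WHERE table_name LIKE 'events_%'
ORDER BY table_name DESC;
```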
What you get — the exported data structure
Understanding what's in the export before you start querying saves significant time. The events table is not structured like a GA4 report — it's raw event data, one row per event, with nested and repeated fields that require specific SQL to work with.
| Table | Contents | Updates |
|---|---|---|
| events_YYYYMMDD | Complete data for one day — all events, all parameters. This is the primary table for analysis. | Once per day, finalised |
| events_intraday_YYYYMMDD | Same-day events as they occur. Exists only if streaming export is enabled. Gets replaced by events_YYYYMMDD when the daily export completes. | Continuously, then deleted |
| pseudonymous_users_YYYYMMDD | User-level data including user properties and predicted audiences. Available only with certain GA4 360 configurations. | Once per day |
Key columns in the events table
| Column | Type | Notes |
|---|---|---|
| event_date | STRING | Format YYYYMMDD — use PARSE_DATE to convert for date arithmetic |
| event_timestamp | INTEGER | Microseconds since Unix epoch — divide by 1,000,000 for seconds |
| event_name | STRING | The event name — page_view, session_start, purchase, etc. |
| event_params | RECORD (REPEATED) | Array of key-value pairs — must UNNEST to access individual parameters |
| user_pseudo_id | STRING | Pseudonymous user identifier — consistent per browser/device |
| user_id | STRING | Your own user ID if set via gtag or GA4 tag — null if not implemented |
| traffic_source | RECORD | Source, medium, and campaign that first acquired the user (user-scoped first touch, not per-session) |
| geo | RECORD | Country, region, city, continent |
| device | RECORD | Category, browser, OS, mobile brand/model |
| ecommerce | RECORD | Purchase-level data — total_item_quantity, purchase_revenue, etc. |
| items | RECORD (REPEATED) | Item-level ecommerce data — one entry per product in the transaction |
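Putting a few of these columns together, a minimal page_view query might look like the sketch below. The project and dataset names are placeholders, and which parameters you pull out of event_params depends on your implementation.

```sql
-- One row per page_view event from a single day's table.
-- `my-project.analytics_123456789` is a placeholder dataset reference.
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS event_day,     -- STRING -> DATE
  TIMESTAMP_MICROS(event_timestamp) AS event_ts,     -- microseconds -> TIMESTAMP
  user_pseudo_id,
  traffic_source.source,
  traffic_source.medium,
  -- event_params is a repeated record: extract one parameter via a subquery
  (SELECT value.string_value
   FROM UNNEST(event_params)
   WHERE key = 'page_location') AS page_location
FROM `my-project.analytics_123456789.events_20240901`
WHERE event_name = 'page_view'
LIMIT 100;
```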
Pulling a single parameter out of event_params takes a subquery, e.g. (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location'). See the schema explained post for the full pattern library.
Verifying the export is working correctly
After 24–48 hours, confirm the export is running correctly before building anything on top of it.
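One way to eyeball export health is a per-day row count across recent daily tables. The dataset reference below is a placeholder; note that the numeric BETWEEN filter on _TABLE_SUFFIX also keeps intraday tables (whose suffix begins with "intraday_") out of the result.

```sql
-- Daily event volume for the last 7 finalised tables.
-- `my-project.analytics_123456789` is a placeholder.
SELECT
  _TABLE_SUFFIX AS day,
  COUNT(*) AS events
FROM `my-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN
      FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY))
  AND FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY day
ORDER BY day;
```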
- Confirm the analytics_PROPERTYID dataset is present.
- events_YYYYMMDD tables should exist for recent dates. If you only see today's date and not yesterday's, wait another 24 hours.
- Run SELECT COUNT(*) FROM `project.analytics_ID.events_*` WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)) and compare the row count against GA4's event count for the same date.
- Run SELECT event_name, COUNT(*) as count FROM ... GROUP BY event_name ORDER BY count DESC LIMIT 20 and verify you see expected events like page_view, session_start, and any conversion events.
Common problems after setup
No data after 48 hours
Most commonly caused by the service account not having the correct IAM role, or the billing account not being attached. Check both. In GA4's BigQuery Links page, the link status should show as Active — if it shows an error, it will usually describe the permission problem.
Dataset appears but tables are empty
The dataset was created but the export failed to write data. Usually a permissions issue — the service account can create the dataset but not write to it. Check the IAM role is BigQuery Data Editor and not a more restrictive role.
Row counts significantly lower than expected
If BigQuery row counts are 20%+ lower than GA4's event counts for the same date, check whether your property has Consent Mode v2 configured. Users who decline consent have their data modelled in GA4's interface but not exported to BigQuery — the raw export only contains events from users who consented. This is expected behaviour, not a bug.
Data stops appearing after a few days
Occasionally the BigQuery link needs to be refreshed after certain GA4 property changes. In GA4 Admin → BigQuery Links, check the link status. If it shows an error, try removing and re-adding the link.
