Enabling the GA4 BigQuery export sounds straightforward — and mechanically it is. But before you click through the setup, a handful of decisions about GCP project structure, dataset location, and export frequency have long-term consequences for cost, data organisation, and how easy the data is to work with later. This guide covers the setup steps and the thinking behind each decision.

The export is not retroactive. When you enable the GA4 BigQuery export, it begins exporting data from that date forward. It does not backfill historical data. Every day you wait is a day of raw event data that will never be available in BigQuery. If you're planning to enable the export, do it now — even if you're not ready to query the data yet.

Before you start — decisions to make first

These choices affect everything downstream. Making them deliberately before setup is significantly easier than changing them later.

Decision 1: Which GCP project to export to
If you manage multiple GA4 properties, decide now whether each property exports to its own GCP project or all properties export into a single shared project. A single shared project is easier to manage for cross-property analysis — one place to query, one billing account, one set of IAM permissions. Separate projects make sense when properties belong to different clients or organisations with strict data separation requirements.
Decision 2: Dataset location
BigQuery datasets have a region — once set, it cannot be changed without recreating the dataset. Choose a region close to your users or your other GCP infrastructure. For most businesses, US (multi-region) or EU (multi-region) is appropriate. If you have data residency requirements, choose accordingly. Mismatched regions between datasets you want to join will cause query failures.
Decision 3: Daily vs streaming export
GA4 offers two export modes. Daily export runs once per day and exports the previous day's complete data — reliable, cost-efficient, slight delay. Streaming export sends events to BigQuery in near-real-time as they occur — faster but significantly more expensive at scale, and the data structure differs slightly. For most use cases, daily export is the right choice. Enable streaming only if you have a specific real-time use case that justifies the cost.
Decision 4: Who needs access
Decide before setup who will need BigQuery access and at what permission level. Analysts who run queries need BigQuery Data Viewer and BigQuery Job User roles. People who create datasets and tables need BigQuery Data Editor. Decide whether to manage access at the project level or dataset level — dataset-level is more granular and useful when different properties belong to different teams.
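As a sketch, project-level grants can be made from the CLI rather than the console. The project ID and analyst email below are placeholders, and this assumes the gcloud CLI is installed and authenticated:

```shell
# Hypothetical project ID and user email; substitute your own.
# Grant the two roles an analyst needs: read data and run query jobs.
gcloud projects add-iam-policy-binding my-analytics-project \
  --member="user:analyst@example.com" \
  --role="roles/bigquery.dataViewer"

gcloud projects add-iam-policy-binding my-analytics-project \
  --member="user:analyst@example.com" \
  --role="roles/bigquery.jobUser"
```

Dataset-level access is managed separately (via the dataset's Sharing settings in the BigQuery console or the `bq` CLI) and is additive: it grants access to a single dataset without touching project-level roles.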

Step 1: Set up a GCP project and enable BigQuery

Google Cloud Platform: Create or select a GCP project and enable the BigQuery API

Go to console.cloud.google.com. If you don't have a GCP project, create one — give it a name that will make sense when you're looking at billing and IAM six months from now. Something like analytics-prod or company-analytics is better than a generated name.

  1. In the GCP console, navigate to APIs & Services → Library
  2. Search for "BigQuery API" and enable it if not already enabled
  3. Navigate to Billing and confirm a billing account is attached to the project — the export will fail silently without one
  4. Note the Project ID — you'll need it when linking in GA4
GCP free tier: BigQuery includes 1 TB of query processing and 10 GB of storage free per month. For most GA4 setups, monthly costs stay well under $10 if queries filter to a date range rather than scanning every daily table. The daily export itself is free; you pay only for storage and queries. Streaming export adds a per-gigabyte insertion charge on top.
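Steps 2 and 3 above can also be done from the CLI. A sketch, assuming the gcloud CLI is authenticated and `my-analytics-project` stands in for your project ID:

```shell
# Enable the BigQuery API for the project (hypothetical project ID):
gcloud services enable bigquery.googleapis.com --project=my-analytics-project

# Confirm billing is attached; the output should show billingEnabled: true
gcloud billing projects describe my-analytics-project
```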

Step 2: Grant the GA4 service account access

GCP IAM: Add the GA4 service account as BigQuery Data Editor

GA4 exports data using a Google-managed service account. This account needs permission to create datasets and write data to your BigQuery project. Without this, the export link will appear to work in GA4 but nothing will arrive in BigQuery.

  1. In your GCP project, go to IAM & Admin → IAM
  2. Click Grant Access
  3. In the "New principals" field, enter: firebase-measurement@system.gserviceaccount.com
  4. Assign the role: BigQuery Data Editor
  5. Save

This service account is the same for all GA4 properties — it's a Google-managed account, not one specific to your property. The Data Editor role gives it permission to create the dataset and write tables within your project without broader access to other GCP resources.
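The same grant can be made from the CLI. A sketch; the project ID is a placeholder, and the service account email must be the Google-managed GA4 export account from the numbered steps above:

```shell
# Placeholder project ID; SA_EMAIL is the Google-managed GA4 export
# service account from the numbered steps above.
SA_EMAIL="<ga4-export-service-account>"
gcloud projects add-iam-policy-binding my-analytics-project \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.dataEditor"
```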

Step 3: Link BigQuery in GA4

GA4 Admin: Create the BigQuery link from within GA4

The link is configured from GA4's side, not from BigQuery. You need Editor or Administrator role on the GA4 property to do this.

  1. In GA4, go to Admin → Property → BigQuery Links
  2. Click Link
  3. Click Choose a BigQuery project and select your GCP project from the list — if it doesn't appear, confirm you're logged in with a Google account that has access to both GA4 and the GCP project
  4. Choose the data location for the BigQuery dataset — this is the region decision from earlier
  5. Select export frequency: Daily, Streaming, or both
  6. Select which data streams to export — typically your main web stream, but you can select multiple if your property has app streams too
  7. Review and click Submit

What gets created automatically

After linking, GA4 creates a dataset in your BigQuery project named analytics_PROPERTYID, where PROPERTYID is your GA4 property's numeric ID. Within this dataset, it creates date-sharded tables named events_YYYYMMDD for each day's export, plus events_intraday_YYYYMMDD for intraday streaming data if enabled.
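To confirm what was created, you can query the dataset's INFORMATION_SCHEMA directly. A sketch with placeholder project and property IDs:

```sql
-- Placeholder project and property IDs; substitute your own.
SELECT table_name
FROM `my-analytics-project.analytics_123456789.INFORMATION_SCHEMA.TABLES`
ORDER BY table_name DESC;
```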

What you get — the exported data structure

Understanding what's in the export before you start querying saves significant time. The events table is not structured like a GA4 report — it's raw event data, one row per event, with nested and repeated fields that require specific SQL to work with.

| Table | Contents | Updates |
| --- | --- | --- |
| `events_YYYYMMDD` | Complete data for one day — all events, all parameters. This is the primary table for analysis. | Once per day, finalised |
| `events_intraday_YYYYMMDD` | Same-day events as they occur. Exists only if streaming export is enabled. Replaced by `events_YYYYMMDD` when the daily export completes. | Hourly, then deleted |
| `pseudonymous_users_YYYYMMDD` | User-level data including user properties and predicted audiences. Available only with certain GA4 360 configurations. | Once per day |

Key columns in the events table

| Column | Type | Notes |
| --- | --- | --- |
| `event_date` | STRING | Format `YYYYMMDD` — use `PARSE_DATE` to convert for date arithmetic |
| `event_timestamp` | INTEGER | Microseconds since the Unix epoch — divide by 1,000,000 for seconds |
| `event_name` | STRING | The event name — `page_view`, `session_start`, `purchase`, etc. |
| `event_params` | RECORD (REPEATED) | Array of key-value pairs — must `UNNEST` to access individual parameters |
| `user_pseudo_id` | STRING | Pseudonymous user identifier — consistent per browser/device |
| `user_id` | STRING | Your own user ID if set via gtag or the GA4 tag — null if not implemented |
| `traffic_source` | RECORD | Source, medium, and campaign that first acquired the user |
| `geo` | RECORD | Country, region, city, continent |
| `device` | RECORD | Category, browser, OS, mobile brand/model |
| `ecommerce` | RECORD | Purchase-level data — `total_item_quantity`, `purchase_revenue`, etc. |
| `items` | RECORD (REPEATED) | Item-level ecommerce data — one entry per product in the transaction |
event_params requires UNNEST. The single most common mistake for people new to the GA4 BigQuery schema is trying to access event parameters as direct columns. They don't exist as columns; they're stored in a repeated record that requires an UNNEST subquery to access. For example, to get the page URL: `(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location')`. See the schema explained post for the full pattern library.
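Putting the pieces together, here is a sketch of a typical first query (placeholder project, property ID, and date range) showing the `PARSE_DATE`, `TIMESTAMP_MICROS`, and `UNNEST` patterns from the table above:

```sql
-- Placeholder project/property IDs and date range; substitute your own.
SELECT
  PARSE_DATE('%Y%m%d', event_date) AS event_day,       -- STRING -> DATE
  TIMESTAMP_MICROS(event_timestamp) AS event_ts,       -- microseconds -> TIMESTAMP
  event_name,
  (SELECT value.string_value
     FROM UNNEST(event_params)
    WHERE key = 'page_location') AS page_location      -- pull one parameter out
FROM `my-analytics-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20250101' AND '20250107'  -- limits tables scanned (and cost)
LIMIT 100;
```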

Verifying the export is working correctly

After 24–48 hours, confirm the export is running correctly before building anything on top of it.

  1. Dataset exists: in BigQuery, navigate to your project and confirm the analytics_PROPERTYID dataset is present.
  2. Tables are being created: within the dataset, events_YYYYMMDD tables should exist for recent dates. If you only see today's date and not yesterday's, wait another 24 hours.
  3. Row counts are plausible: run `SELECT COUNT(*) FROM `project.analytics_ID.events_*` WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))` and compare the row count against GA4's event count for the same date.
  4. Key event names are present: run `SELECT event_name, COUNT(*) AS count FROM ... GROUP BY event_name ORDER BY count DESC LIMIT 20` and verify you see expected events like page_view, session_start, and any conversion events.
  5. event_params contain expected values: pick one event and inspect its parameters to confirm your custom event parameters are exporting correctly.
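The row-count and event-name checks can be run as a single pass. A sketch with placeholder project and property IDs:

```sql
-- Placeholder project/property IDs; substitute your own.
-- Yesterday's events, broken down by event name. The grand total
-- (the sum of count) is what you compare against GA4's event count.
SELECT
  event_name,
  COUNT(*) AS count
FROM `my-analytics-project.analytics_123456789.events_*`
WHERE _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
GROUP BY event_name
ORDER BY count DESC;
```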

Common problems after setup

No data after 48 hours

Most commonly caused by the service account not having the correct IAM role, or the billing account not being attached. Check both. In GA4's BigQuery Links page, the link status should show as Active — if it shows an error, it will usually describe the permission problem.

Dataset appears but tables are empty

The dataset was created but the export failed to write data. Usually a permissions issue — the service account can create the dataset but not write to it. Check the IAM role is BigQuery Data Editor and not a more restrictive role.

Row counts significantly lower than expected

If BigQuery row counts are 20%+ lower than GA4's event counts for the same date, check whether your property has Consent Mode v2 configured. Users who decline consent have their data modelled in GA4's interface but not exported to BigQuery — the raw export only contains events from users who consented. This is expected behaviour, not a bug.

Data stops appearing after a few days

Occasionally the BigQuery link needs to be refreshed after certain GA4 property changes. In GA4 Admin → BigQuery Links, check the link status. If it shows an error, try removing and re-adding the link.

Before querying, audit the GA4 data. The BigQuery export faithfully exports whatever GA4 recorded — including staging traffic, misattributed conversions, and zombie event data. GA4 Health Check identifies what's wrong with your property before you build analysis on top of it.
Travis Gunn
Founder of GA4 Health Check. Working with Google Analytics since 2013, with over 250 clients audited across almost every industry vertical. 100% Job Success on Upwork for over a decade.