Managing a network of blogs — whether it's ten sites or a hundred — creates an analytics problem that GA4 wasn't really designed to solve. Each property is a silo. The data is all there, but answering network-level questions requires either opening every property individually and building a mental picture, or building the infrastructure to combine the data into a single queryable layer.

The good news is that the infrastructure doesn't have to be complex. Depending on your network size and technical comfort, a Google Sheets and Apps Script approach can get you most of the way there without BigQuery at all. For larger networks or more advanced analysis, BigQuery is the natural next step — and the two approaches work well together.

What you're trying to answer

Before choosing tools, it's worth being specific about the questions. For a blog network, the most useful ones are typically:

  • Which sites broke their all-time daily session record yesterday?
  • Which sites are genuinely growing, as opposed to just getting traffic?
  • How does each site's traffic compare to the rest of the network?

These are different questions from "what are the total sessions this month" — they require historical context, cross-property comparison, and some notion of momentum rather than just volume.

Two approaches — choose based on your situation

Option A: Google Sheets + Apps Script

Pros:
  • No BigQuery setup or GCP costs
  • Accessible to anyone comfortable with Sheets
  • GA4 Data API is well-supported in Apps Script
  • Email alerts built in via Gmail service
  • Editable by non-technical team members

Cons:
  • Slower and less efficient at very large scale
  • Limited by Apps Script execution time limits
  • Harder to do complex SQL-style analysis
  • Looker Studio connection less flexible than BQ

Option B: BigQuery + Scheduled Queries

Pros:
  • Handles very large networks cleanly
  • Full SQL for complex analysis
  • Native Looker Studio integration
  • Scales without execution time limits
  • Historical depth — query all data at once

Cons:
  • Requires BigQuery export enabled on all properties
  • GCP costs (usually small but present)
  • More setup and technical overhead
  • Less accessible to non-technical collaborators
For most blog networks up to around 30–40 properties, the Sheets and Apps Script approach is the faster path to a working system. For larger networks, or where you need the full flexibility of SQL and long historical data ranges, BigQuery is worth the additional setup. The two also combine well — Apps Script to pull and write data, BigQuery for the heavy querying and analysis.

The architecture

Multiple GA4 properties (one per site in the network)
  → GA4 Data API (queried via Apps Script or BQ)
  → Unified sessions table (date · property · sessions)
  → record detection
  → Records log (new daily highs flagged)
  → Looker Studio dashboard (network view + property drill-down)
  + Email alert (fires when a site breaks its record)

Option A: Google Sheets and Apps Script

01 · Apps Script · Setup — Create the properties registry sheet

Start with a sheet named Properties with two columns: property_id (the GA4 property ID, e.g. 123456789) and site_name (a human-readable label for the dashboard). One row per site in the network. This is the list your Apps Script will iterate over — adding or removing a site from the network means editing this sheet, not the script.

Add a second sheet named Daily Sessions with columns: date, property_id, site_name, sessions. This accumulates one row per property per day as the script runs. A third sheet named Records tracks each property's all-time daily session high.
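If you prefer to bootstrap the workbook from code, a one-off setup function can create all three sheets with their header rows. This is a sketch that runs only inside Apps Script; the column names for the Records sheet are an assumption matching what the record-check script writes later.

```javascript
// One-off setup: create the three sheets with header rows.
// Safe to re-run — existing sheets are left untouched.
function setupSheets() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const sheets = {
    'Properties':     ['property_id', 'site_name'],
    'Daily Sessions': ['date', 'property_id', 'site_name', 'sessions'],
    // Records columns assumed from the record-detection script's output
    'Records':        ['date', 'site_name', 'sessions', 'previous_record', 'pct_above']
  };
  Object.entries(sheets).forEach(([name, headers]) => {
    if (!ss.getSheetByName(name)) {
      ss.insertSheet(name).appendRow(headers);
    }
  });
}
```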

02 · Apps Script · Data collection — Pull daily sessions from the GA4 Data API

The GA4 Data API is accessible from Apps Script through the AnalyticsData advanced service — enable "Google Analytics Data API" under Services in the script editor first. No OAuth setup is required as long as the account running the script has read access to each GA4 property. The script reads the properties list, queries yesterday's session count for each one, and appends a row to the Daily Sessions sheet.

// Apps Script — fetch yesterday's sessions for all properties
function fetchDailySessions() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const propsSheet = ss.getSheetByName('Properties');
  const dataSheet  = ss.getSheetByName('Daily Sessions');
  const properties = propsSheet.getDataRange().getValues().slice(1);

  // Yesterday's date in YYYY-MM-DD format
  const yesterday = new Date();
  yesterday.setDate(yesterday.getDate() - 1);
  const dateStr = Utilities.formatDate(
    yesterday, 'UTC', 'yyyy-MM-dd'
  );

  properties.forEach(([propertyId, siteName]) => {
    try {
      const response = AnalyticsData.Properties.runReport(
        {
          dateRanges: [{ startDate: dateStr, endDate: dateStr }],
          metrics:    [{ name: 'sessions' }]
        },
        `properties/${propertyId}`
      );

      const sessions = response.rows
        ? parseInt(response.rows[0].metricValues[0].value, 10)
        : 0;

      dataSheet.appendRow([dateStr, propertyId, siteName, sessions]);

    } catch(e) {
      // Log errors but continue — one failed property
      // shouldn't stop the rest of the run
      console.error(`Error for ${siteName}: ${e.message}`);
    }
  });
}

The try/catch per property is important — without it, a single property with a permissions error or API issue will stop the entire run and leave the rest of the network without data for that day.

Schedule this function via Apps Script triggers to run daily at 6am — after GA4 has processed most of the previous day's data (final numbers can still shift for up to 48 hours, so a record flagged on fresh data is occasionally revised). Go to Extensions → Apps Script → Triggers → Add Trigger, set function to fetchDailySessions, event source to Time-driven, type to Day timer, time to 6–7am.
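If you'd rather create the trigger from code than through the UI, ScriptApp can do the same thing — run this once from the editor:

```javascript
// One-off: schedule fetchDailySessions daily in the 6–7am window
function createDailyTrigger() {
  ScriptApp.newTrigger('fetchDailySessions')
    .timeBased()
    .everyDays(1)
    .atHour(6)   // hour is interpreted in the script's timezone
    .create();
}
```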

03 · Apps Script · Record detection — Detect new daily session records and send alerts

After fetching the day's sessions, compare each property's total against its historical maximum. If yesterday's count exceeds the previous record, log it to the Records sheet and send an email alert.

// Check for new records and send email alerts
function checkForRecords() {
  const ss          = SpreadsheetApp.getActiveSpreadsheet();
  const dataSheet   = ss.getSheetByName('Daily Sessions');
  const recordSheet = ss.getSheetByName('Records');
  const data        = dataSheet.getDataRange().getValues().slice(1);

  // Build a map of property_id → all-time max, excluding
  // yesterday's rows (the values being checked for records)
  const yesterday = Utilities.formatDate(
    new Date(Date.now() - 86400000), 'UTC', 'yyyy-MM-dd'
  );

  const historicMax   = {};
  const yesterdayRows = [];

  data.forEach(([date, propId, siteName, sessions]) => {
    // Sheets may coerce the date column to a Date object —
    // normalize to a YYYY-MM-DD string before comparing
    const dateStr = date instanceof Date
      ? Utilities.formatDate(date, 'UTC', 'yyyy-MM-dd')
      : String(date);
    if (dateStr === yesterday) {
      yesterdayRows.push({ propId, siteName, sessions });
    } else {
      historicMax[propId] = Math.max(
        historicMax[propId] || 0, sessions
      );
    }
  });

  // Check each of yesterday's rows against the historic max
  const newRecords = yesterdayRows.filter(({ propId, sessions }) =>
    sessions > (historicMax[propId] || 0)
  );

  if (newRecords.length === 0) return;

  // Build and send email
  const rows = newRecords.map(({ siteName, sessions, propId }) => {
    const prev = historicMax[propId] || 0;
    const pct  = prev > 0
      ? ((sessions - prev) / prev * 100).toFixed(1)
      : 'N/A';
    // Also log to Records sheet
    recordSheet.appendRow([yesterday, siteName, sessions, prev, pct + '%']);
    return `${siteName}: ${sessions} sessions (+${pct}% above record)`;
  });

  GmailApp.sendEmail(
    Session.getEffectiveUser().getEmail(),
    `🏆 ${newRecords.length} new daily record(s) — ${yesterday}`,
    rows.join('\n')
  );
}

Add a call to checkForRecords() at the end of fetchDailySessions() so the record check runs automatically after each data pull. Alternatively, set it as a separate trigger that runs 15 minutes after the fetch — giving the data time to write before the comparison runs.

Option B: BigQuery for larger networks

For networks above roughly 40 properties, or where you need complex historical analysis that would be slow in Sheets, BigQuery is the better foundation. The GA4 BigQuery export sends each property's event data to a separate dataset — you combine them using a UNION ALL query that reads sessions from each property's events_* table.

The core query pattern is the same as the Sheets approach conceptually — sessions per property per day — but expressed in SQL across the exported event tables:

-- Unified daily sessions across network properties
-- Add one UNION ALL block per property

SELECT
  PARSE_DATE('%Y%m%d', event_date)   AS date,
  'site-one.com'                     AS site_name,
  COUNT(DISTINCT CONCAT(user_pseudo_id, '.',
    CAST((SELECT value.int_value
          FROM UNNEST(event_params)
          WHERE key = 'ga_session_id') AS STRING))) AS sessions
FROM `project.analytics_111111111.events_*`
WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d',
  DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
  AND event_name = 'session_start'
GROUP BY 1, 2

UNION ALL

SELECT
  PARSE_DATE('%Y%m%d', event_date),
  'site-two.com',
  COUNT(DISTINCT CONCAT(user_pseudo_id, '.',
    CAST((SELECT value.int_value
          FROM UNNEST(event_params)
          WHERE key = 'ga_session_id') AS STRING)))
FROM `project.analytics_222222222.events_*`
WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d',
  DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY))
  AND event_name = 'session_start'
GROUP BY 1, 2

-- repeat for each property in the network

Schedule this as a BigQuery scheduled query set to INSERT INTO an existing unified table daily. Record detection then runs as a second scheduled query using a window function to compare each day's sessions against the historical maximum for that property — see the full BigQuery approach post for the complete record detection SQL.
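As a sketch of that second query, a window function that looks only at rows strictly before each day gives the prior all-time max per property. The table name here is a placeholder for whatever unified table your scheduled query writes to:

```sql
-- Sketch: flag yesterday's rows that beat each site's prior record.
-- Assumes a unified table `project.network.daily_sessions`
-- with columns (date, site_name, sessions).
SELECT date, site_name, sessions, prev_max
FROM (
  SELECT
    date, site_name, sessions,
    MAX(sessions) OVER (
      PARTITION BY site_name
      ORDER BY date
      ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING
    ) AS prev_max
  FROM `project.network.daily_sessions`
)
WHERE sessions > COALESCE(prev_max, 0)
  AND date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
```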

A practical hybrid approach: use Apps Script to pull data from the GA4 Data API into Sheets for the current-day and recent data, and BigQuery for deeper historical queries. The two connect naturally — Apps Script can write to BigQuery via the BigQuery service, and Looker Studio can read from both Google Sheets and BigQuery in blended data sources.
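A minimal sketch of that bridge, assuming the BigQuery advanced service is enabled in the script editor and the destination table already exists (the project, dataset, and table names are placeholders):

```javascript
// Append rows from the Daily Sessions sheet to a BigQuery table.
// rows: arrays of [date, propertyId, siteName, sessions].
function pushToBigQuery(rows) {
  const request = {
    rows: rows.map(([date, propertyId, siteName, sessions]) => ({
      json: {
        date: date,
        property_id: propertyId,
        site_name: siteName,
        sessions: sessions
      }
    }))
  };
  // Streaming insert into project "my-project",
  // dataset "network", table "daily_sessions"
  BigQuery.Tabledata.insertAll(
    request, 'my-project', 'network', 'daily_sessions'
  );
}
```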

Looker Studio dashboard

Whether you're using Sheets or BigQuery as the data source, the Looker Studio setup is similar: connect to the Google Sheet or the unified BigQuery table you built.

Dashboard structure for a network view

Network summary table — all sites ranked by sessions in the selected period. Columns: site name, sessions, percentage of network total. A date range filter defaulting to yesterday gives you the immediate daily leaderboard.

Trend chart — total network sessions by day as a line chart. Useful for spotting network-wide patterns — seasonal dips, traffic from a high-profile external link, the effect of a content push.

Records panel — a table sourced from the Records sheet or log, showing which sites broke their daily record recently. Keep this to the last 30 days and sort by date descending. This is the most editorially interesting view — it surfaces momentum rather than just volume.

Property filter — a dropdown of all site names. When an editor selects a specific site, all charts filter to show that site's trend in isolation, with a reference line showing the all-time record.

Defining "growing" vs "just getting traffic": the most meaningful measure of growth in a blog network is whether a site's trailing 30-day session average is higher than the previous 30-day average — not whether it had a good day last week. Add this as a calculated field in Looker Studio: a rolling 30-day average compared period-over-period gives a cleaner signal of genuine momentum than raw daily sessions.
Network-level analytics are only as good as the underlying GA4 data. Before building dashboards across multiple properties, it's worth confirming each property's data is reliable. GA4 Health Check audits a property in 60 seconds. Run the audit — $79 →
Travis Gunn
Founder of GA4 Health Check. Working with Google Analytics since 2013, with over 250 clients audited across almost every industry vertical. 100% Job Success on Upwork for over a decade.