
Hi fellow data crunchers,

I believe we are not the only team that misses Universal Analytics, with its filter options and views that made reporting easier. 

A very common issue we face is that we work with a team that manages only a certain vertical of their organisation (certain countries, sub-domains, etc.), and we want to pull data for just that vertical from GA4.

The option we have right now is to add an extra dimension to the data transfers in Supermetrics and then filter for the relevant vertical in BigQuery, so we can report on just that slice (roughly like the sketch below).
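A minimal sketch of what that looks like today, assuming a hypothetical transfer table `my_project.supermetrics.ga4_sessions` with the extra `country` dimension added to the transfer:

```python
# Rough sketch of the current workaround: the Supermetrics transfer includes an
# extra "country" dimension, and the vertical is filtered afterwards in BigQuery.
# The table and column names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT
      date,
      source_medium,
      SUM(sessions) AS sessions
    FROM `my_project.supermetrics.ga4_sessions`
    WHERE country = @country  -- the one vertical this team actually reports on
    GROUP BY date, source_medium
    ORDER BY date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("country", "STRING", "Finland")]
)

for row in client.query(query, job_config=job_config).result():
    print(row.date, row.source_medium, row.sessions)
```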

That workaround leaves the data badly fragmented, and in some cases cardinality issues can arise.

Do any of you have similar issues that a filter for transfers could solve?

Hi @David !

I’ve had nightmares about GA4 data for a long time, but I’ve come to accept some realities along the way. Below is a breakdown that could also help those with less knowledge of the situation, so apologies if some of the info isn’t new to you:

  • Fragmented reports are the new norm
    • Similar to UA, custom reports are limited by the API in how many dimensions and metrics can be pulled. It’s not possible to pull one big table with 5 UTM parameters, page path, country, region, city, event name, and the name of your property, along with things like views, sessions, event count, etc... This means we have to break custom reports down into smaller chunks, and sometimes this data doesn’t match up correctly. Heck, if I pull daily data for a month like September into a 30-row table, then pull a “September 2024” rollup table with one row, the numbers can be slightly off. The GA4 UI will also report something different, because the GA4 UI and the API have different ways of pulling the data. It’s a mess, but it leans into the next bullet.
  • Granularity is relative to how much the API pulls, and in some instances is capped
    • Unlike UA, where the system told you when data was filtered or capped based on how much data was being processed, GA4 doesn’t warn you. I could pull “all” of my data for 2024, but it may only return the top 100k rows (a paging sketch follows this list). Unless you read the API docs carefully, you could be pulling (and transferring to BQ) limited data that will always have cardinality issues, because GA4 won’t report everything stored through the API. It’s infuriating. Speaking of infuriating...
  • Data isn’t stable until 72 hours after it processes
    • If you work with UTMs (we deal with session-scoped UTMs since they seem to be the most consistent with matching back to advertising actions), and you’re trying to build a near-realtime report, anything reported or communicated to a stakeholder within 72 hours of that data appearing is subject to change. For example, if you pull data for a report on Monday, and you chat with a stakeholder on Tuesday, and they go back to the report on Friday to see Monday’s numbers, those numbers could be drastically different, or different enough to make that stakeholder think your numbers were reported incorrectly. Google’s data freshness support article goes into more detail, and this can greatly impact how our reporting is perceived.
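To make the row-cap point concrete, here’s a minimal sketch of paging through a GA4 Data API report with an explicit limit and offset so nothing gets silently truncated. The property ID, dimensions and metrics are placeholders; the client is the google-analytics-data Python package:

```python
# Sketch: request GA4 data with an explicit limit/offset and keep paging until
# the reported row_count is exhausted, so nothing is silently dropped.
# Property ID, dimensions and metrics below are placeholders.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest
)

client = BetaAnalyticsDataClient()
PAGE_SIZE = 100_000  # well under the per-request maximum; page explicitly

rows, offset = [], 0
while True:
    response = client.run_report(
        RunReportRequest(
            property="properties/123456789",
            dimensions=[Dimension(name="date"), Dimension(name="sessionSourceMedium")],
            metrics=[Metric(name="sessions"), Metric(name="eventCount")],
            date_ranges=[DateRange(start_date="2024-01-01", end_date="2024-12-31")],
            limit=PAGE_SIZE,
            offset=offset,
        )
    )
    rows.extend(response.rows)
    offset += PAGE_SIZE
    # row_count is the total number of rows on the server, not just this page
    if offset >= response.row_count:
        break

print(f"Fetched {len(rows)} rows; the API reports {response.row_count} in total.")
```

Comparing `len(rows)` against `row_count` is a cheap sanity check that a scheduled pull isn’t being quietly capped.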

This might be a bit of overkill for your question, but these three factors need to be taken into account when finding a solution to your problem. Fragmented reports are the way to go, which is a pain, but depending on the degree of reporting you’re communicating to a stakeholder, GA4 should be treated more as a “fairly accurate guide” than a final source of truth. My approach is to delay any meetings for that 72-hour window as much as possible and to stipulate that numbers are subject to change (because we’re not paying $300k for GA360, which won't help much anyway). Ultimately, the business goals and final transactions are recorded outside GA4, so as long as needs are being met and the business is aligned in the same general direction, everyone’s expectations stay leveled. Perfectionism with GA4 is an impossibility, and I fear they designed it that way.
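Since the 72-hour rule comes up in every stakeholder conversation, here’s a tiny sketch of how a report could be stamped with a “stable through” date. The three-day window is our reading of Google’s data freshness guidance, not an official constant:

```python
# Sketch: label reports with the last date whose GA4 numbers we treat as settled,
# so stakeholders know which days may still move. The 3-day window is an
# assumption based on Google's data freshness guidance, not an API value.
from datetime import date, timedelta
from typing import Optional

FRESHNESS_WINDOW_DAYS = 3  # roughly 72 hours

def stable_through(today: Optional[date] = None) -> date:
    """Last date we consider stable for reporting."""
    today = today or date.today()
    return today - timedelta(days=FRESHNESS_WINDOW_DAYS)

print(f"Numbers are considered final through {stable_through()}; later dates may still change.")
```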


Hi @gcfmineo 

Thanks for sharing these insights! We’re definitely in the same boat – GA4 has given us our fair share of nightmares too. The unstable data processing and the fragmented reporting are constant pain points, and, like you said, perfectionism with GA4 seems out of reach.

That’s where a pre-filter in Supermetrics could help: we wouldn’t have to pull all the dimensions into every single transfer, and the output would be truer to what the GA4 UI shows when you apply filters there.


Hey @David! We at Supermetrics are looking to bring filtering options to DWH transfers very soon, and we’d like to learn more about your use case to verify that what we’re planning makes sense.

I’ll reach out to you to schedule a call, if you’d be up for it?

 


Hi @Otto! That is brilliant :) Feel free to reach out.


Thanks, David – talk to you soon!

And for anyone else curious about using filters in DWH transfers, just drop us a comment here and let’s set something up. Your feedback really helps us build something that works for you.

