# Getting Started
Welcome to the SUSE Paid Advertising Data Warehouse. This page explains how the data flows, where it comes from, and what you need to know when working with the reports.
::: warning Work in Progress
This project is actively being developed. New tables, views, and features are added regularly. Looker Studio dashboards are also a work in progress and may change as the data model evolves.
:::
## How Data Gets Into BigQuery
We pull advertising data from six platforms, using two different ingestion methods:
### Automated Platforms (Dataslayer)
Data for these four platforms is pulled automatically every day by Dataslayer, a tool that connects to each platform's API and writes the results directly into BigQuery staging tables.
| Platform | Schedule | Lookback Window |
|---|---|---|
| Google Ads | Daily, ~06:00 UTC | 60 days |
| Microsoft Ads | Daily, ~06:00 UTC | 60 days |
| LinkedIn Ads | Daily, ~06:00 UTC | 15 days |
| Reddit Ads | Daily, ~05:00 UTC | 30 days |
You don't need to do anything for these -- data arrives automatically.
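The lookback window is how far back each daily pull re-fetches data, which is why figures for recent days can be restated from one day to the next. If you want to confirm that a morning pull landed, a minimal check looks like the sketch below; the project, dataset, staging table, and `date` column names are placeholders, not the real ones.

```sql
-- Hypothetical check that today's Dataslayer pull arrived.
-- Project, dataset, table, and column names are placeholders;
-- substitute the real staging table for the platform you care about.
SELECT
  MAX(date) AS latest_staged_date,
  COUNT(*)  AS row_count
FROM
  `your-project.your_dataset.suse_google_adlevel_staging`;
```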
### Manual Platforms (Google Sheets)
These two platforms are not supported by Dataslayer and require manual processing:
| Platform | How It Works |
|---|---|
| 6sense | Export CSV from 6sense dashboard, upload via Google Sheets menu |
| StackAdapt | Export CSV from StackAdapt dashboard, upload via Google Sheets menu |
To update 6sense or StackAdapt data:
- Export the CSV report from the platform's dashboard
- Open the SUSE Ads Google Sheet
- Use the HoD Menu to trigger processing:
  - 6sense: HoD Menu > 6sense > V3 Drive Processing (or Complete All)
  - StackAdapt: HoD Menu > StackAdapt > Complete All
- The script parses the CSV, formats it, and uploads directly to BigQuery
::: tip
6sense and StackAdapt data is only as fresh as the last manual upload. If the numbers look stale, check when the last upload was run.
:::
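A quick way to see how stale a manual platform is: check the latest date in its final table. The query below is a sketch; the project and dataset are placeholders and the `date` column name is an assumption (see the Data Dictionary for the real column).

```sql
-- Hypothetical staleness check for a manually uploaded platform.
-- `your-project.your_dataset` and the `date` column are assumptions.
SELECT
  MAX(date) AS latest_uploaded_date,
  DATE_DIFF(CURRENT_DATE(), MAX(date), DAY) AS days_since_last_data
FROM
  `your-project.your_dataset.suse_6sense_adlevel_final_v2`;
```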
## What Happens After Ingestion
Once data lands in BigQuery staging tables, scheduled refresh scripts run daily:
- 07:30 UTC -- Creative refresh (MERGE new/updated creative metadata)
- 08:00 UTC -- Adlevel refresh (DELETE old + INSERT new performance data)
- 08:30 UTC -- Budget refresh (update budget pacing numbers)
The refresh scripts move data from staging tables into final tables, which keep the full history.
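For orientation, here is a minimal sketch of the adlevel "DELETE old + INSERT new" pattern, assuming illustrative table names, a `date` column, and a 60-day window; the actual scheduled scripts are the source of truth.

```sql
-- Sketch of the "DELETE old + INSERT new" adlevel refresh pattern.
-- Table names, the `date` column, and the 60-day window are assumptions.

-- 1. Drop rows in the final table that fall inside the re-pulled window.
DELETE FROM `your-project.your_dataset.suse_google_adlevel_final_v2`
WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 60 DAY);

-- 2. Re-insert the freshly staged rows for the same window.
INSERT INTO `your-project.your_dataset.suse_google_adlevel_final_v2`
SELECT *
FROM `your-project.your_dataset.suse_google_adlevel_staging`
WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 60 DAY);
```

The creative refresh works differently: it uses MERGE, so existing creative rows are updated in place rather than deleted and re-inserted.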
## Where to Find Things
### In BigQuery
| What | Where |
|---|---|
| Daily performance data | suse_{platform}_adlevel_final_v2 tables |
| Creative metadata | suse_{platform}_creative_final_v2 tables |
| Campaign status (active/paused) | suse_{platform}_status_final_v2 tables |
| Cross-platform view | unified_performance_v2 view |
| Budget pacing | budget_pacing_summary_v2 view |
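Most ad-hoc questions can start from the cross-platform view. A typical query looks like the sketch below; only the view name comes from this page, while the project, dataset, and column names (`platform`, `date`, `spend`, `clicks`) are assumptions, so check the Data Dictionary for the actual schema.

```sql
-- Example: last 90 days of spend and clicks per platform, by month.
-- Only the view name comes from this page; columns are assumptions.
SELECT
  platform,
  DATE_TRUNC(date, MONTH) AS month,
  SUM(spend)  AS total_spend,
  SUM(clicks) AS total_clicks
FROM
  `your-project.your_dataset.unified_performance_v2`
WHERE
  date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
GROUP BY
  platform, month
ORDER BY
  month, platform;
```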
### In Looker Studio
::: warning
Looker Studio dashboards are a work in progress. They connect to the BigQuery views and tables listed above. As the data model evolves, dashboards may be updated or restructured.
:::
The dashboards pull from the final_v2 tables and views. If data looks wrong or missing, check:
- Is the platform automated or manual? If it's 6sense or StackAdapt, someone needs to manually run the upload.
- Did the refresh script run? Check refresh_log_v2 in BigQuery for today's entries (see the example query below).
- Is there a schema issue? If Dataslayer changed its output, the staging table schema might not match what the refresh script expects.
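To answer the second question, a query along these lines works, assuming refresh_log_v2 has a timestamp column; the `run_timestamp` name here is a guess, so adjust it to the real schema.

```sql
-- Did today's refresh scripts run? The refresh_log_v2 table comes from
-- this page; the run_timestamp column name is an assumption.
SELECT *
FROM `your-project.your_dataset.refresh_log_v2`
WHERE DATE(run_timestamp) = CURRENT_DATE()
ORDER BY run_timestamp DESC;
```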
## Quick Reference: Platform Data Sources
| Platform | Ingestion | Automation | Status Table |
|---|---|---|---|
| Google Ads | Dataslayer | Automatic | Yes |
| Microsoft Ads | Dataslayer | Automatic | Yes |
| LinkedIn Ads | Dataslayer | Automatic | Not yet |
| Reddit Ads | Dataslayer | Automatic | Not yet |
| 6sense | Google Sheets | Manual | Yes |
| StackAdapt | Google Sheets | Manual | Not yet |
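Where a status table exists, it can be joined to the adlevel data to separate active from paused campaigns. The sketch below assumes a shared `campaign_id` key plus `status`, `spend`, and `date` columns, none of which are documented on this page, so verify against the Data Dictionary first.

```sql
-- Hypothetical join of yesterday's performance with campaign status.
-- Table names follow the patterns above; the campaign_id, status,
-- spend, and date columns are assumptions.
SELECT
  a.campaign_id,
  a.date,
  a.spend,
  s.status
FROM `your-project.your_dataset.suse_google_adlevel_final_v2` AS a
LEFT JOIN `your-project.your_dataset.suse_google_status_final_v2` AS s
  ON a.campaign_id = s.campaign_id
WHERE a.date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY);
```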
## Need Help?
- Check the Daily Pipeline page for the full schedule
- Check Debugging Refreshes if data isn't flowing
- Review the Data Dictionary for column definitions