BigQuery Pricing for E-commerce: A Non-Technical Guide

Oskar Maciejek
11/25/2025

When you hear the term "Data Warehouse" or "BigQuery," two thoughts probably cross your mind. First: "This is the technology I need to connect data and scale my business." Second: "This sounds like a Google invoice I won't have control over."

Cloud pricing pages and many guides rely on concepts that are complicated for a non-technical person. They are full of terms like active storage, slot time, or streaming inserts. And many store owners ask themselves a simple question: "Will this ruin me?"

The answer is: No, if the data architecture is correct.

BigQuery is like a powerful industrial machine. It is cheap to maintain, as long as you use it wisely. In this guide, we will walk through each cost stage and show where the line lies between a "playground for analysts" and real business reporting.

Important Disclaimer: The easiest thing to write, as many guides do, would be "it depends...".
Cloud costs are fluid. Nevertheless, the calculations below are realistic estimates for a medium-to-large e-commerce business, meant to show the scale of costs and the difference between the "raw" and the optimized approach.

Part 1: Storage - We are safe here

Let's start with the basics. BigQuery is, simply put, a gigantic Excel in the cloud. The first cost is storing data (e.g., from GA4, CRM, ads).
You pay for how much space your data occupies on Google's disks.

Example: Store generating 10 million events monthly in GA4

Let's assume your e-commerce generates 10 million actions monthly (views, clicks, purchases).

  • GA4: 10 million events is approx. 10 GB monthly (assuming one event is roughly 0.5 - 1 KB of data)
  • Additionally, orders/products database from CRM and campaign data from the ad system: approx. 5-8 GB monthly.
  • Total: You accumulate approx. 15-18 GB of new data every month.

Estimated cost:
Google offers the first 10 GB of storage per month for free. Each subsequent gigabyte costs around $0.02.
Even if you accumulate a 3-year history (approx. 500 GB), the monthly cost of keeping this archive comes to around $10.
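
For readers who like to see the arithmetic, here is a minimal sketch of that estimate in Python, using the free tier and per-gigabyte rate quoted above (the 500 GB figure is the assumed size of the accumulated history):

```python
# Rough BigQuery storage cost estimate, using the numbers from this guide:
# first 10 GB free, then roughly $0.02 per GB per month (active storage).
FREE_GB = 10
PRICE_PER_GB = 0.02

def monthly_storage_cost(total_gb: float) -> float:
    """Return the estimated monthly storage bill in USD."""
    billable_gb = max(total_gb - FREE_GB, 0)
    return billable_gb * PRICE_PER_GB

# ~3 years of history at ~15-18 GB of new data per month -> roughly 500 GB
print(monthly_storage_cost(500))  # ~9.8 USD per month
```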
Conclusion: Just keeping data is very cheap.

Part 2: Data Delivery - in batches or instantly

You have two options for sending data to BigQuery.

Let's look at the example of data export from Google Analytics 4:

  1. Daily Export: Once a day, in the morning, Google packs yesterday's data and uploads it to your warehouse. Cost: $0.
  2. Streaming (Live): Data hits the database minutes after the event. Cost: approx. $0.05 per GB.

Conclusion: BigQuery does not charge for loading data in batches (so-called batch loading). With streaming (sending data live), you also have little to worry about as long as your scale isn't very large (10 GB × $0.05 = $0.50 per month).

Part 3: Processing (Query) - Two Worlds of Costs

Here we get to the heart of the matter. BigQuery makes money on reading data ($6.25 per 1 TB). But who reads this data? Here we must distinguish between two scenarios, because they determine your invoice.

World 1: Analyst in the console (Human Factor)

This is the situation where your analyst goes directly into BigQuery and writes SQL code to answer an "ad hoc" question, e.g., "Check why the conversion from iPhones dropped last Tuesday".

Here the cost depends 100% on human skill.

  • Junior Analyst: Might write a SELECT * query that mindlessly reads the entire database (hundreds of gigabytes of data) to find 5 rows. Cost of one query: $5.
  • Senior Analyst: Will use partitions, select only the necessary columns, and limit scanning. The same question will be answered for $0.05.

This is a "laboratory". Here costs are one-off and depend on skill. But business rarely sits in the console. Business sits in reports.

World 2: Business in Looker Studio (Automated)

This is your daily life. You have a dashboard in Looker Studio. You don't write SQL code there - you click filters, change dates.
But in the background, Looker Studio must send SQL queries to BigQuery to draw each chart.
And here the problem appears: the Automaton (Looker Studio) is not as clever as the Senior Analyst. If you connect it to raw data, it will generate heavy, expensive queries with every click you make.

We have often worked on projects where we optimized the cost of inefficient queries used in reports. In one of them, the report generated ever higher costs day after day, eventually reaching $53 per day. After small changes to the SQL queries (skill), the report kept working exactly the same way, but costs dropped almost to zero. This is best shown with a case study.

Part 4: Case Study - How much does a Quarterly Report cost?

Let's assume you have a sales dashboard from the last 90 days. It is used by 5 people (board, marketing).

Scenario A: The "Raw" Route (Looker Studio -> Raw Data)

You connect the report directly to tables with raw GA4 events.
One day of raw data weighs about 1 GB, so a 90-day report must "touch" 90 GB of data.

  1. Interaction Trap: Looker Studio is "wasteful". To display a dashboard with 10 elements (charts, counters), it can send 10 separate queries.
  2. No Cache: BigQuery has a cache, but it only works if you change nothing.
    • The Manager enters the report.
    • Clicks the "Black Friday Campaign" filter.
    • The cache no longer applies. BigQuery must sift through the 90 GB again to cut out just this campaign.
    • The Manager changes the table sorting. Another 90 GB.

Effect:
5 people × 10 filter changes daily × gigabytes of data.
The invoice becomes unpredictable. It could be $50, or it could be $300 if the team is very active. You pay every time someone touches the report.

Scenario B: The "Data Mart" Approach (WitCloud)

This approach is based on the principle: Let's prepare the data once, but properly.

The process looks like this:

  1. Automatic Task (Job): Every morning the system reads only yesterday's data (1 GB of raw data).
  2. Aggregation: The system extracts what is important and saves it in the Data Mart. The resulting "brick" from one day weighs, say, 20 MB (instead of 1,000 MB).
  3. Adding the brick: This small portion of data is appended to the main table as a new partition (see the sketch after this list).
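
For the technically curious, such a daily job can be a single short script. The sketch below is only an illustration: the project, dataset, and mart table names are assumed, and the Data Mart table is assumed to already exist as a date-partitioned table (a possible definition appears in Part 5). It reads only yesterday's raw GA4 export table, boils it down to a few summary rows, and appends them to the mart:

```python
from datetime import date, timedelta
from google.cloud import bigquery

client = bigquery.Client()
yesterday = date.today() - timedelta(days=1)
suffix = yesterday.strftime("%Y%m%d")  # GA4 export tables are named events_YYYYMMDD

# Read only yesterday's raw table (~1 GB), boil it down to a daily summary,
# and append the result to the date-partitioned Data Mart table.
sql = f"""
INSERT INTO `my-project.marts.daily_sales` (day, campaign, purchases, revenue)
SELECT
  DATE('{yesterday.isoformat()}')      AS day,
  traffic_source.name                  AS campaign,
  COUNTIF(event_name = 'purchase')     AS purchases,
  SUM(ecommerce.purchase_revenue)      AS revenue
FROM `my-project.analytics_123456.events_{suffix}`
GROUP BY day, campaign
"""
client.query(sql).result()  # the job scans one day of raw data, nothing more
```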

What happens in the report?
When the Manager changes dates, filters, and sorts, Looker Studio queries the Data Mart.
Instead of scanning 90 GB, it scans 90 small "bricks" (total 1.8 GB).
Estimated monthly cost:

  1. Daily processing: You pay in the morning for recalculating only one day of raw data. This is a fixed cost, approx. $15 - $25 monthly.
  2. Reporting: Because you work on lightweight data, hundreds of clicks by your managers generate a cost in the range of $2 - $5 monthly.

Total: $20 - $30 monthly. A fixed amount, independent of how often you check the results.

Part 5: Why does it work? Two pillars of savings

The secret to low costs in the Data Mart approach rests on two technical mechanisms. Partitioning is only half the success. The other half is Aggregation (reducing detail).

To understand this, imagine how a large supermarket works.

Pillar 1: Aggregation (Instead of a million receipts - a summary)

  • Raw Data: This is a giant sack where you keep all receipts from all cash registers. Each receipt has a list of products, time, cashier. If you have 10,000 customers daily, you have 10,000 long receipts (rows in the database).
    • When you ask BigQuery for revenue, the database must take every receipt in hand and sum up the amounts. This takes time and costs money because the database "crunches" a huge amount of information.
  • Data Mart (Aggregation): This is a situation where the accountant takes these 10,000 receipts once a day, calculates what is important, and writes it on a single sheet: "Day: Tuesday. Total sales: 50,000. Number of transactions: 10,000".
    • We put this single sheet (aggregation result) into the Data Mart.
    • Effect: Instead of keeping millions of rows about every click, we keep a dozen rows with the daily summary. The table becomes 1000x lighter.

Pillar 2: Partitioning (Order in the binder)

Since we already have these lightweight "daily summary sheets," we must arrange them well.

  • Without Partitions: Sheets with summaries lie in one pile. To find "July," you have to dig through everything.
  • With Partitions: Sheets are filed in a binder, where each plastic sleeve is labeled with a date (see the sketch after this list).
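
In BigQuery terms, the "binder with labeled sleeves" is simply a table partitioned by date. Here is a minimal sketch of creating such a Data Mart table; the names are assumed and match the daily-job sketch in Part 4:

```python
from google.cloud import bigquery

client = bigquery.Client()

# One summary row per day and campaign, filed into a separate "sleeve" (partition) per day.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.marts.daily_sales` (
  day       DATE,
  campaign  STRING,
  purchases INT64,
  revenue   FLOAT64
)
PARTITION BY day   -- the "binder": one partition per calendar day
"""
client.query(ddl).result()
```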

How does it work together?

When your Manager opens the Dashboard in Looker Studio and asks for results from the last quarter:

  1. Thanks to Partitioning, BigQuery opens only 90 specific sleeves in the binder (it doesn't touch the rest of the year).
  2. Thanks to Aggregation, inside each sleeve it finds not thousands of receipts, but one sheet with a summary.

That is why the report loads in a fraction of a second and costs fractions of a cent. BigQuery doesn't have to calculate anything anymore (because we calculated it in the morning) - it only reads the ready result.
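
To make this concrete, the kind of query a 90-day dashboard sends to the Data Mart looks roughly like the sketch below (names assumed, as before). The date filter lets BigQuery skip every partition outside the last 90 days, and each remaining partition contains only a handful of summary rows:

```python
from google.cloud import bigquery

client = bigquery.Client()

# A typical "last quarter" dashboard query against the Data Mart.
sql = """
SELECT campaign, SUM(revenue) AS revenue, SUM(purchases) AS purchases
FROM `my-project.marts.daily_sales`
WHERE day BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY) AND CURRENT_DATE()
GROUP BY campaign
"""

# Dry run: estimate the scan instead of executing the query.
job = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"~{job.total_bytes_processed / 1024**2:.0f} MB scanned")  # megabytes, not 90 GB
```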

Summary: So how much does it actually cost?

For a typical e-commerce with 10 million events monthly, the real bill for BigQuery with a well-designed architecture looks as follows:

Cost Type | Description | Estimated Amount
Storage | Maintaining data from several years (GA4, Ads, CRM). | $5 - $15 (depending on history)
Processing (ETL) | Daily recalculation of only new data to Data Marts. | $15 - $25
Reporting (Query) | Using dashboards in Looker Studio (on lightweight data). | $1 - $5
TOTAL | | $25 - $45 / monthly

BigQuery is not expensive - ignorance is expensive. It is risky to put raw data in the hands (and tools) of people who do not optimize queries. Implementing an intermediate layer (Data Marts) - whether manually or through platforms like WitCloud - turns an unpredictable invoice into a low, fixed subscription.