Usage → Table
Analyze BigQuery table access patterns, identify optimization opportunities, and find candidates for deprecation or cleanup.
The Table tab provides detailed insight into how your BigQuery tables are accessed, how much storage and compute they consume, and where optimization opportunities exist.
Overview
The Table usage view enables you to:
- Track which tables consume the most slots and bytes
- Identify tables that may be candidates for optimization or deprecation
- Understand partitioning efficiency across your data estate
- Monitor table usage trends over time
Page Controls
Project Filter
Filter metrics to specific BigQuery projects.
Cost Toggle
Switch between resource metrics (slots, bytes) and cost estimates (USD).
Cards & Tables
Partitioned Table Summary
Three summary cards break down your tables by partitioning status:
- Partitioned Table Count: Number of partitioned vs. unpartitioned tables
- Partitioned Bytes: Storage size breakdown by partitioning status
- Partitioned Rows: Row count breakdown by partitioning status
A high percentage of unpartitioned tables may indicate opportunities to improve query performance and reduce costs.
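If you want to sanity-check this outside the UI, the sketch below is one way to surface large unpartitioned tables with the google-cloud-bigquery Python client. The project and dataset names are placeholders, and this is an illustrative query, not how Revefi computes these cards.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Placeholder project/dataset. __TABLES__ provides size and row counts;
# a table with no column flagged as a partitioning column is unpartitioned.
sql = """
SELECT
  t.table_id,
  t.size_bytes / POW(1024, 3) AS size_gib,
  t.row_count
FROM `my-project.my_dataset.__TABLES__` AS t
LEFT JOIN `my-project.my_dataset`.INFORMATION_SCHEMA.COLUMNS AS c
  ON c.table_name = t.table_id AND c.is_partitioning_column = 'YES'
WHERE t.type = 1              -- regular tables only (not views)
  AND c.table_name IS NULL    -- no partitioning column found
ORDER BY t.size_bytes DESC
LIMIT 20
"""

for row in client.query(sql).result():
    print(f"{row.table_id}: {row.size_gib:.1f} GiB, {row.row_count} rows")
```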
Access Query Count by Table
The main table on the page shows access metrics for each BigQuery table:
- Query count (24h, 7d, 30d)
- Slots used or cost (24h, 7d, 30d)
- Bytes processed (24h, 7d, 30d)
Click on any table to see the queries that accessed it.
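For reference, a minimal sketch of how similar per-table access metrics can be derived from BigQuery's job history is shown below. The project and region names are placeholders, and the query is illustrative rather than the exact aggregation Revefi performs.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder project/region; INFORMATION_SCHEMA.JOBS retains ~180 days of history.
sql = """
SELECT
  ref.project_id || '.' || ref.dataset_id || '.' || ref.table_id AS table_name,
  COUNT(*) AS query_count_30d,
  IFNULL(SUM(j.total_slot_ms), 0) / 1000 AS slot_seconds_30d,
  IFNULL(SUM(j.total_bytes_processed), 0) / POW(1024, 4) AS tib_processed_30d
FROM `my-project`.`region-us`.INFORMATION_SCHEMA.JOBS AS j,
  UNNEST(j.referenced_tables) AS ref
WHERE j.job_type = 'QUERY'
  AND j.creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
GROUP BY table_name
ORDER BY slot_seconds_30d DESC
LIMIT 50
"""

for row in client.query(sql).result():
    print(row.table_name, row.query_count_30d,
          round(row.slot_seconds_30d), row.tib_processed_30d)
```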
Tables Queried but Not Updated
Tables that are actively queried but haven't been updated recently. This can indicate:
- Stale data being served to consumers
- Potential data freshness issues
- Tables that should have their update pipelines investigated
Filter by last queried time, last updated time, and partitioning status.
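To reproduce this kind of check yourself, the sketch below compares each table's most recent read (from INFORMATION_SCHEMA.JOBS) with its last modification time (from __TABLES__). The project, dataset, region, and time windows are placeholder assumptions; it is not the query Revefi runs.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder names; last_modified_time in __TABLES__ is epoch milliseconds.
sql = """
WITH last_read AS (
  SELECT
    ref.dataset_id,
    ref.table_id,
    MAX(j.creation_time) AS last_queried
  FROM `my-project`.`region-us`.INFORMATION_SCHEMA.JOBS AS j,
    UNNEST(j.referenced_tables) AS ref
  WHERE j.job_type = 'QUERY'
  GROUP BY ref.dataset_id, ref.table_id
)
SELECT
  t.table_id,
  r.last_queried,
  TIMESTAMP_MILLIS(t.last_modified_time) AS last_updated
FROM `my-project.my_dataset.__TABLES__` AS t
JOIN last_read AS r
  ON r.dataset_id = t.dataset_id AND r.table_id = t.table_id
WHERE r.last_queried >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND TIMESTAMP_MILLIS(t.last_modified_time)
        < TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
ORDER BY last_updated
"""

for row in client.query(sql).result():
    print(row.table_id, row.last_queried, row.last_updated)
```

Flipping the two conditions (recently updated, not recently queried) gives the complementary view described in the next section.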
Tables Updated but Not Queried
Tables that are actively updated but haven't been queried recently. This can indicate:
- Wasted compute resources on unused pipelines
- Tables that could be deprecated
- Development/test tables that should be cleaned up
Tables Not Queried Recently
Tables that haven't been queried in the last 90 days. These are candidates for:
- Archival or deletion to reduce storage costs
- Review of data retention policies
- Deprecation workflows
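As a rough way to spot such tables outside the UI, the sketch below anti-joins a dataset's tables against the last 90 days of job history (INFORMATION_SCHEMA.JOBS keeps roughly 180 days, so the window fits). Names are placeholders and the query is illustrative only.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder names; tables with no matching read in JOBS over the last 90 days.
sql = """
WITH recently_read AS (
  SELECT DISTINCT ref.dataset_id, ref.table_id
  FROM `my-project`.`region-us`.INFORMATION_SCHEMA.JOBS AS j,
    UNNEST(j.referenced_tables) AS ref
  WHERE j.job_type = 'QUERY'
    AND j.creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
)
SELECT
  t.table_id,
  t.size_bytes / POW(1024, 3) AS size_gib
FROM `my-project.my_dataset.__TABLES__` AS t
LEFT JOIN recently_read AS r
  ON r.dataset_id = t.dataset_id AND r.table_id = t.table_id
WHERE t.type = 1            -- regular tables only
  AND r.table_id IS NULL    -- no reads found in the window
ORDER BY t.size_bytes DESC
"""

for row in client.query(sql).result():
    print(f"{row.table_id}: {row.size_gib:.1f} GiB, not queried in 90+ days")
```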
Data Pipeline Cost
Shows aggregated cost metrics for data pipelines, including load and usage query counts, bytes processed, and execution times. This helps identify expensive pipelines and understand the downstream impact of source tables.
Cost Calculation
Storage Cost Estimation
Monthly storage costs are estimated using BigQuery's standard storage pricing:
| Storage Type | Price (USD/GiB/month) |
|---|---|
| Active Logical | $0.02 |
| Active Physical | $0.04 |
| Long-term Logical | $0.01 |
| Long-term Physical | $0.02 |
Note: If your organization's negotiated rates differ from BigQuery's defaults, Revefi can customize these calculations to match your actual costs. Contact our team to make sure your reports reflect your true rates.
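As a worked example of applying these rates, the sketch below estimates a monthly storage cost from per-type byte counts (for instance, the byte columns in INFORMATION_SCHEMA.TABLE_STORAGE). The helper name and billing-model flag are illustrative assumptions, and Revefi's own calculation may use your customized rates as noted above.

```python
GIB = 1024 ** 3

# Default BigQuery storage rates (USD per GiB per month), from the table above.
RATES = {
    "active_logical": 0.02,
    "active_physical": 0.04,
    "long_term_logical": 0.01,
    "long_term_physical": 0.02,
}

def monthly_storage_cost(bytes_by_type: dict, physical_billing: bool = False) -> float:
    """Estimate monthly storage cost from per-type byte counts.

    bytes_by_type maps 'active_logical', 'long_term_logical', 'active_physical',
    and 'long_term_physical' to byte counts. A dataset is billed on either its
    logical or its physical bytes, depending on its storage billing model.
    """
    kinds = ("active_physical", "long_term_physical") if physical_billing \
        else ("active_logical", "long_term_logical")
    return sum(bytes_by_type.get(k, 0) / GIB * RATES[k] for k in kinds)

# Example: 500 GiB active logical + 1024 GiB long-term logical
print(monthly_storage_cost({
    "active_logical": 500 * GIB,
    "long_term_logical": 1024 * GIB,
}))  # 500 * 0.02 + 1024 * 0.01 = 20.24 USD/month
```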
Use Cases
Identifying Optimization Opportunities
- Sort by cost to find the most expensive tables
- Look for tables with high bytes processed relative to their size
- Find large unpartitioned tables that would benefit from partitioning
Data Hygiene
- Review "Tables Queried but Not Updated" for data freshness issues
- Check "Tables Updated but Not Queried" for wasted compute
- Use "Tables Not Queried Recently" to identify cleanup opportunities
Cost Management
- Quantify the financial impact of optimizations
- Compare trends across time periods
- Focus efforts on high-cost tables
Related Pages
- Usage → Project: High-level project overview
- Usage → User: User-level usage analysis
- Usage → Query: Individual query analysis
- Usage → Failures: Failed query investigation
