Analytics Design
by yusuke-suzuki
---
name: analytics-design
description: Design data analysis from purpose clarification to visualization. Use when analyzing data, exploring BigQuery schemas, building queries, or creating Looker Studio reports.
---
# Analytics Design
## Workflow

Use `references/analytics-design-template.md` to document every analysis.

1. **Clarify Purpose**: What do you want to know? Why is this analysis needed? Who will use it? One-time or ongoing monitoring?

2. **Discover Data**: Explore available datasets and understand the schema.
   - Ask the user for project/dataset context and business background
   - `bq ls` and `bq show --schema` for BigQuery tables
   - `db/schema.rb` for Rails projects
   - API docs or sample data for external services
   - Understand table relationships (ER diagrams help)

3. **Build Query**: Write SQL based on the discovered schema.
   - Use CTEs for readability
   - Execute using the BigQuery CLI (see BigQuery Query Execution below)
   - Interpret results and document findings

4. **Create Dashboard** (if ongoing monitoring is needed):
   - Use `references/looker-studio-template.md` to design it
   - Define decisions: what actions will users take based on this dashboard?
   - Check existing resources: do similar dashboards or queries already exist?
   - Align time granularity with usage frequency (daily/weekly/monthly)
   - Design data sources, pages, and charts
## BigQuery Query Execution

### Prerequisites

Check the gcloud configuration before running queries:

```shell
gcloud config get-value project
```

- If you hit an authentication error, prompt the user to run `gcloud auth login`, then resume
- If the project is unset, prompt the user to run `gcloud config set project <PROJECT_ID>`
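The two prompts above can be wrapped in a small pre-flight check. A minimal sketch, assuming a POSIX shell; the `classify_project` helper, its messages, and the example project name are illustrative, not part of the skill:

```shell
#!/bin/sh
# Classify the output of `gcloud config get-value project` and tell the
# user what to run next. Helper name and messages are illustrative.
classify_project() {
  case "$1" in
    ""|"(unset)") echo "Project unset. Run: gcloud config set project <PROJECT_ID>"; return 1 ;;
    *)            echo "Using project: $1"; return 0 ;;
  esac
}

# Wire it to gcloud like this (stderr suppressed so a missing or
# unconfigured gcloud falls through to the "unset" branch):
#   classify_project "$(gcloud config get-value project 2>/dev/null)"
classify_project "my-analytics-project"
```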
### Execution Process

1. **Dry run**: Validate syntax and estimate cost.

   ```shell
   bq query --use_legacy_sql=false --dry_run "SELECT * FROM \`project.dataset.table\`"
   ```

   Cost is roughly $5/TB on demand. Under 1 GB is light; 2 GB or more needs optimization.

2. **Execute**: Run and confirm results.

   ```shell
   bq query --use_legacy_sql=false --format=csv "SELECT * FROM \`project.dataset.table\`"
   ```

Always use fully-qualified table names: `project.dataset.table`.
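The dry run reports the estimated bytes the query would process, so the thresholds above can be applied mechanically. A sketch (the `triage_scan` name and the "moderate" middle band between the two stated cutoffs are assumptions):

```shell
#!/bin/sh
# Triage a dry-run byte estimate against the thresholds above:
# under 1 GB is light, 2 GB or more needs optimization.
triage_scan() {
  gb=$(( $1 / 1000000000 ))
  if [ "$gb" -lt 1 ]; then
    echo "light (${gb} GB)"
  elif [ "$gb" -ge 2 ]; then
    echo "needs optimization (${gb} GB)"
  else
    echo "moderate (${gb} GB)"
  fi
}

# Example: a 3.5 GB estimated scan.
triage_scan 3500000000
```

In practice, feed it the byte count parsed from the dry run's "will process N bytes" message.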
### Query Design Tips
- Specify exact date ranges
- Filter partitioned tables by partition key
- Avoid correlated subqueries (use JOINs/CTEs)
- Filter early with CTEs before joining large tables
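Put together, the tips above yield one recurring query shape: an early-filtering CTE with an exact date range on the partition key, then a join. A sketch, with hypothetical project, table, and column names:

```shell
#!/bin/sh
# A query applying the tips above: exact date range, partition-key
# filter, early filtering in a CTE before the join, no correlated
# subqueries. All table and column names are made up for illustration.
SQL='
WITH recent_orders AS (           -- filter early, before joining
  SELECT user_id, order_total
  FROM `my-project.shop.orders`
  WHERE order_date BETWEEN "2024-01-01" AND "2024-01-31"  -- partition key
)
SELECT u.country, SUM(o.order_total) AS revenue
FROM recent_orders AS o
JOIN `my-project.shop.users` AS u ON u.id = o.user_id
GROUP BY u.country
'
# Dry-run first, then execute (the command is shown, not run, here):
echo "bq query --use_legacy_sql=false --dry_run \"$SQL\""
```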
## Looker Studio Best Practices

### Reference Documentation
- Field data types (Number, Text, Date & Time, Currency, Percent, etc.)
- Chart types (Time series, Combo chart, Table, etc.)
- Data source parameters
### Settings Documentation
- Verify setting names against actual Looker Studio UI before documenting
- Use exact terminology from the UI
### Data Source Design
- One data source per analytical purpose
- Pre-aggregate in SQL for performance
- Include bucket fields for distribution analysis
- Include sort-order fields for proper chart ordering
- Descriptive data source names
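A data-source query following these guidelines might look like the sketch below; the table name, bucket boundaries, and field names are all hypothetical:

```shell
#!/bin/sh
# Data-source query sketch: pre-aggregated in SQL, with a bucket field
# for distribution analysis and a numeric sort-order field so charts
# order the buckets correctly. Names and boundaries are illustrative.
SQL='
SELECT
  CASE
    WHEN order_total < 10  THEN "under $10"
    WHEN order_total < 100 THEN "$10-$99"
    ELSE "over $100"
  END AS total_bucket,            -- bucket field
  CASE
    WHEN order_total < 10  THEN 1
    WHEN order_total < 100 THEN 2
    ELSE 3
  END AS bucket_order,            -- sort-order field
  COUNT(*) AS orders              -- pre-aggregated metric
FROM `my-project.shop.orders`
WHERE order_date BETWEEN "2024-01-01" AND "2024-01-31"
GROUP BY total_bucket, bucket_order
'
printf '%s\n' "$SQL"
```

In Looker Studio, sort charts by `bucket_order` while displaying `total_bucket`.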
### Report Structure
- Separate pages by time granularity (daily/monthly)
- Group related metrics per page
- Consistent filter scopes within pages
### Chart Type Selection
| Purpose | Chart Type |
|---|---|
| KPI current value | Scorecard |
| Time series trend | Time series chart |
| Category breakdown over time | Stacked area / Stacked bar |
| Category comparison | Bar chart |
| Composition | Pie chart |
| Detailed data | Table |
| Distribution (percentile) | Time series (multiple metrics) |
