# TimescaleDB Management

by Chucks007

Guidelines for managing TimescaleDB hypertables, continuous aggregates, and migrations. Use this when creating new tables or modifying time-series data schemas.
When managing database migrations for the Cycle Navigator Dashboard, follow these mandatory steps:
- **Mandatory Backup**: Always back up the production database before any migration.
  - Linux (Bash):
    `pg_dump -h localhost -U cycle_user cycle_navigator > backup_$(date +%Y%m%d).sql`
  - Linux (Fish):
    `pg_dump -h localhost -U cycle_user cycle_navigator > backup_(date +%Y%m%d).sql`
  - Windows (PowerShell):
    `pg_dump -h localhost -U cycle_user cycle_navigator > backup_$(Get-Date -Format 'yyyyMMdd').sql`
  - Windows (CMD, assuming an `MM/DD/YYYY` locale date format):
    `pg_dump -h localhost -U cycle_user cycle_navigator > backup_%date:~10,4%%date:~4,2%%date:~7,2%.sql`
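The Bash and Fish variants differ only in command-substitution syntax; the dated filename they both produce can be sketched as:

```shell
# Derive the dated backup filename used above (Bash syntax)
STAMP=$(date +%Y%m%d)        # e.g. 20240115
BACKUP="backup_${STAMP}.sql"
echo "$BACKUP"
```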
- **Pre-Migration Checks**: Run the check script to verify compatibility before applying migrations. This script checks for schema conflicts, data type mismatches, and TimescaleDB extension availability.
  - Prerequisites: Ensure Python 3.x is installed and dependencies are met (run `pip install -r requirements.txt` if needed).
  - Command: `python3 scripts/run_timescale_migrations.py --check-only` (use `python` on Windows if aliased).
  - If checks fail, review the output for issues and resolve them before proceeding.
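A minimal sketch of gating the migration on the check step's exit code (this assumes, as is conventional, that the script exits non-zero when checks fail; a stand-in `run_checks` function keeps the sketch self-contained):

```shell
# Stand-in for: python3 scripts/run_timescale_migrations.py --check-only
run_checks() { return 0; }

if run_checks; then
    echo "checks passed: safe to apply migrations"
else
    echo "checks failed: resolve issues before migrating" >&2
    exit 1
fi
```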
- **Hypertable Conversion**: Converting a table to a hypertable is irreversible. Verify the `chunk_time_interval` before conversion (the standard is '1 month' for FRED data due to its lower update frequency, and '7 days' for crypto data due to high-frequency volatility).
  - Command: `python3 scripts/run_timescale_migrations.py --convert-table <table_name> --chunk-interval <interval>`
  - After conversion, run `SELECT * FROM timescaledb_information.hypertables;` to confirm the hypertable exists and the chunk settings are correct.
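Under the hood, the conversion presumably calls TimescaleDB's `create_hypertable` function; a sketch of the equivalent SQL, with an illustrative table name (`crypto_prices`) and time column (`time`):

```sql
-- Illustrative: convert a plain table into a hypertable partitioned on "time"
SELECT create_hypertable('crypto_prices', 'time',
                         chunk_time_interval => INTERVAL '7 days');
```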
- **Compression Policy**: Ensure new hypertables include a compression policy to optimize storage and query performance. The standard is 90 days for macro data (slower update cadence) and 30 days for crypto data (high-frequency updates).
  - Command: `python3 scripts/run_timescale_migrations.py --compress-table <table_name> --compress-after <days>`
  - Verify with: `SELECT * FROM timescaledb_information.compression_settings WHERE hypertable_name = '<table_name>';`
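The script likely wraps TimescaleDB's native compression API; a sketch of the equivalent SQL (table name and 30-day cutoff illustrative):

```sql
-- Illustrative: enable compression, then compress chunks older than 30 days
ALTER TABLE crypto_prices SET (timescaledb.compress);
SELECT add_compression_policy('crypto_prices', INTERVAL '30 days');
```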
- **Continuous Aggregates**: When pre-calculating metrics, use materialized views with a refresh policy to reduce query latency. Use hourly refreshes for crypto data (high-frequency volatility) and daily refreshes for macro data (slower update cadence).
  - Command: `python3 scripts/run_timescale_migrations.py --create-aggregate <aggregate_name> --source-table <table_name> --refresh-interval <interval>`
  - Example metrics: OHLCV candles, moving averages, volatility indices.
  - Verify with: `SELECT * FROM timescaledb_information.continuous_aggregates WHERE view_name = '<aggregate_name>';`
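In plain TimescaleDB SQL, an hourly OHLCV continuous aggregate with a refresh policy might look like the following (all table, view, and column names are illustrative; the script's actual output may differ):

```sql
-- Illustrative: hourly OHLCV candles as a continuous aggregate
CREATE MATERIALIZED VIEW crypto_prices_1h
WITH (timescaledb.continuous) AS
SELECT time_bucket('1 hour', time) AS bucket,
       symbol,
       first(price, time) AS open,
       max(price)         AS high,
       min(price)         AS low,
       last(price, time)  AS close,
       sum(volume)        AS volume
FROM crypto_prices
GROUP BY bucket, symbol;

-- Refresh the trailing few hours once per hour
SELECT add_continuous_aggregate_policy('crypto_prices_1h',
       start_offset      => INTERVAL '3 hours',
       end_offset        => INTERVAL '1 hour',
       schedule_interval => INTERVAL '1 hour');
```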