Db

by johnlindquist

Database operations for SQLite, PostgreSQL, and MySQL. Use for queries, schema inspection, migrations, and AI-assisted query generation.

name: db
description: Database operations for SQLite, PostgreSQL, and MySQL. Use for queries, schema inspection, migrations, and AI-assisted query generation.

Database Manager

Query and manage databases across SQLite, PostgreSQL, and MySQL.

Prerequisites

Install database CLIs as needed:

# SQLite (usually pre-installed on macOS/Linux)
sqlite3 --version

# PostgreSQL
brew install postgresql
# or
apt install postgresql-client

# MySQL
brew install mysql-client
# or
apt install mysql-client

CLI Reference

SQLite

# Connect to database
sqlite3 database.db

# Execute query
sqlite3 database.db "SELECT * FROM users LIMIT 10"

# Output as CSV
sqlite3 -csv database.db "SELECT * FROM users"

# Output as JSON (requires sqlite 3.33+)
sqlite3 -json database.db "SELECT * FROM users"

# Column headers
sqlite3 -header database.db "SELECT * FROM users"

# Execute SQL file
sqlite3 database.db < queries.sql

# Schema commands
sqlite3 database.db ".schema"
sqlite3 database.db ".tables"
sqlite3 database.db ".schema users"

PostgreSQL

# Connect
psql postgresql://user:pass@host:5432/dbname

# Execute query
psql -c "SELECT * FROM users LIMIT 10" postgresql://...

# Tuples only (no headers)
psql -t -c "SELECT count(*) FROM users" postgresql://...

# No alignment (machine-readable)
psql -t -A -c "SELECT id,name FROM users" postgresql://...

# Execute SQL file
psql -f queries.sql postgresql://...

# List tables
psql -c "\dt" postgresql://...

# Describe table
psql -c "\d users" postgresql://...

# Output format
psql -c "SELECT * FROM users" --csv postgresql://...
psql -c "SELECT * FROM users" --html postgresql://...

MySQL

# Connect
mysql -h host -u user -p dbname

# Execute query
mysql -h host -u user -p -e "SELECT * FROM users LIMIT 10" dbname

# Batch mode (no headers)
mysql -h host -u user -p -B -e "SELECT * FROM users" dbname

# Execute SQL file
mysql -h host -u user -p dbname < queries.sql

# Show tables
mysql -h host -u user -p -e "SHOW TABLES" dbname

# Describe table
mysql -h host -u user -p -e "DESCRIBE users" dbname

Common Operations

Schema Inspection

SQLite

# All tables
sqlite3 db.sqlite ".tables"

# Table schema
sqlite3 db.sqlite ".schema tablename"

# All schemas
sqlite3 db.sqlite ".schema"

PostgreSQL

# All tables
psql -c "\dt" $DATABASE_URL

# Table schema
psql -c "\d tablename" $DATABASE_URL

# Table with indexes
psql -c "\d+ tablename" $DATABASE_URL

MySQL

# All tables
mysql -e "SHOW TABLES" -h host -u user -p dbname

# Table schema
mysql -e "DESCRIBE tablename" -h host -u user -p dbname

# Create statement
mysql -e "SHOW CREATE TABLE tablename" -h host -u user -p dbname

Query Explanation

# SQLite
sqlite3 db.sqlite "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = 'x'"

# PostgreSQL
psql -c "EXPLAIN ANALYZE SELECT * FROM users WHERE email = 'x'" $DATABASE_URL

# MySQL
mysql -e "EXPLAIN SELECT * FROM users WHERE email = 'x'" dbname

Data Export

# SQLite to CSV
sqlite3 -csv -header db.sqlite "SELECT * FROM users" > users.csv

# PostgreSQL to CSV
psql -c "\COPY users TO 'users.csv' CSV HEADER" $DATABASE_URL

# MySQL to CSV (quick and dirty: tr does not quote fields, so values containing tabs or commas will break)
mysql -e "SELECT * FROM users" -B dbname | tr '\t' ',' > users.csv

AI-Assisted Query Generation

Use Gemini to help write queries:

# Describe what you want
gemini -m pro -o text -e "" "Write a SQL query to:
- Find all users who signed up in the last 30 days
- Who have made at least one purchase
- Order by purchase count descending

Table schemas:
- users (id, email, created_at)
- purchases (id, user_id, amount, created_at)

Output PostgreSQL-compatible SQL."

Safe Query Review

# Generate query
QUERY=$(gemini -m pro -o text -e "" "Write SQL for: [your request]")

# Review before executing
echo "Generated query:"
echo "$QUERY"

# Then execute if safe
# psql -c "$QUERY" $DATABASE_URL

Migration Patterns

Schema Changes

# Create migration file
cat > migrations/001_add_column.sql << 'EOF'
ALTER TABLE users ADD COLUMN status VARCHAR(50) DEFAULT 'active';
EOF

# Apply migration
psql -f migrations/001_add_column.sql $DATABASE_URL

Safe Migration Workflow

# 1. Test on copy first
createdb test_migration
pg_dump $DATABASE_URL | psql test_migration

# 2. Run migration on test
psql -f migration.sql test_migration

# 3. Verify
psql -c "\d tablename" test_migration

# 4. Apply to production
psql -f migration.sql $DATABASE_URL

# 5. Cleanup
dropdb test_migration

Environment Variables

Store connection strings securely:

# .env file (don't commit!)
DATABASE_URL=postgresql://user:pass@host:5432/dbname
SQLITE_DB=./data/app.db

# Usage
psql $DATABASE_URL
sqlite3 $SQLITE_DB
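
To make those variables available to the commands above, one common approach is to export everything in .env into the current shell (assuming simple KEY=VALUE lines with no spaces or quoting edge cases):

# Export every variable defined in .env, then turn auto-export back off
set -a
source .env
set +a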

Best Practices

  1. Never hardcode credentials - Use environment variables
  2. Review AI-generated queries - Before executing
  3. Use EXPLAIN - Check query performance
  4. Test migrations - On copy before production
  5. Backup before changes - Especially destructive ones
  6. Use transactions - For multi-statement changes (see the sketch after this list)
  7. Limit results - Always use LIMIT during exploration
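
For practices 5 and 6, a minimal PostgreSQL sketch (the backup filename, table, and statements are placeholders):

# 5. Backup before destructive changes
pg_dump $DATABASE_URL > backup_$(date +%Y%m%d).sql

# 6. Wrap multi-statement changes in a transaction so they apply atomically
psql -v ON_ERROR_STOP=1 $DATABASE_URL << 'EOF'
BEGIN;
ALTER TABLE users ADD COLUMN status VARCHAR(50) DEFAULT 'active';
UPDATE users SET status = 'inactive' WHERE created_at < now() - interval '1 year';
COMMIT;
EOF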
