Programmatic EDA

by nimrodfisher


name: programmatic-eda
description: Systematic exploratory data analysis following best practices. Use when analyzing any dataset to understand structure, identify data quality issues (duplicates, missing values, inconsistencies, outliers), examine distributions, detect correlations, and generate visualizations. Provides comprehensive data profiling with sanity checks before analysis.

Programmatic EDA

Quick Start

Execute systematic data quality checks, distribution analysis, and correlation detection on any dataset with automated sanity checks.
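As a quick-start sketch of distribution and correlation checks, assuming pandas and NumPy are available; the DataFrame and its column names here are synthetic stand-ins for a real dataset:

```python
import numpy as np
import pandas as pd

# Synthetic data standing in for a loaded dataset (column names are
# illustrative -- replace with your own data).
rng = np.random.default_rng(0)
df = pd.DataFrame({"price": rng.normal(100, 15, 200)})
df["revenue"] = df["price"] * 3 + rng.normal(0, 5, 200)  # correlated with price
df["noise"] = rng.normal(0, 1, 200)                      # unrelated column

# Distribution summary
print(df.describe())

# Pairwise correlations; flag strong pairs (|r| > 0.8 is a common heuristic)
corr = df.corr(numeric_only=True)
strong_pairs = [(a, b) for a in corr.index for b in corr.columns
                if a < b and 0.8 < abs(corr.loc[a, b]) < 1.0]
print(corr.round(2))
print("strongly correlated:", strong_pairs)
```

For a real analysis, the same `describe()`/`corr()` pass runs after the data-loading step below, with the flagged pairs feeding the visualization stage.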

Context Requirements

Before starting EDA, Claude needs:

  1. Dataset Access: The data file or database connection
  2. Business Context: What this data represents and what decisions it informs
  3. Quality Thresholds (optional): What % missing/outliers are acceptable

Context Gathering

If the dataset is not yet loaded:

"Please provide your dataset. I can work with:

  • CSV/Excel files (upload or provide path)
  • Database connection details
  • Pandas DataFrame (if already loaded in notebook)"

If business context is missing:

"To provide relevant insights, I need to understand:

  1. What does this dataset represent? (customers, transactions, events, etc.)
  2. What business question are you trying to answer?
  3. What time period does this cover?
  4. Are there any known data quality issues I should be aware of?"

For quality thresholds (if not provided, use defaults):

"I'll use standard thresholds unless you specify otherwise:

  • Missing values: Flag if >5% (warn if >30%)
  • Outliers: Flag using IQR method (1.5 × IQR)
  • Duplicates: Flag if >1%

Do these work for your use case, or should I adjust?"
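One way to encode these default thresholds is sketched below. This is an illustration, not a prescribed implementation; the function name and report layout are assumptions:

```python
import pandas as pd

# Default thresholds from above (adjust per use case).
MISSING_FLAG, MISSING_WARN = 0.05, 0.30   # flag >5%, warn >30%
DUP_FLAG = 0.01                           # flag >1% duplicate rows
IQR_K = 1.5                               # IQR multiplier for outliers

def quality_report(df: pd.DataFrame) -> dict:
    """Apply the default sanity-check thresholds to a DataFrame."""
    report = {}
    missing = df.isna().mean()
    report["missing_flagged"] = missing[missing > MISSING_FLAG].index.tolist()
    report["missing_warned"] = missing[missing > MISSING_WARN].index.tolist()
    report["duplicates_flagged"] = bool(df.duplicated().mean() > DUP_FLAG)

    # IQR outlier rule on each numeric column: outside [Q1 - k*IQR, Q3 + k*IQR]
    outliers = {}
    for col in df.select_dtypes(include="number").columns:
        q1, q3 = df[col].quantile([0.25, 0.75])
        iqr = q3 - q1
        lo, hi = q1 - IQR_K * iqr, q3 + IQR_K * iqr
        outliers[col] = int(((df[col] < lo) | (df[col] > hi)).sum())
    report["outlier_counts"] = outliers
    return report

# Toy example: "y" has a missing value, "x" has one extreme point.
df = pd.DataFrame({"x": [1, 2, 3, 4, 100], "y": [1, None, 3, 4, 5]})
print(quality_report(df))
```

If the user supplies their own thresholds, only the three constants at the top change; the checks themselves stay the same.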

Workflow

1. Data Loading & Overview
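A minimal sketch of this first step, assuming pandas; the inline CSV and column names are placeholders for a real file path or database query:

```python
import io
import pandas as pd

# Placeholder inline CSV standing in for a real file
# (swap io.StringIO(...) for a path such as pd.read_csv("orders.csv")).
csv_data = io.StringIO(
    "order_id,amount,region\n"
    "1,120.5,N\n"
    "2,85.0,S\n"
    "3,,E\n"
)
df = pd.read_csv(csv_data)

print(df.shape)                     # row and column counts
df.info()                           # dtypes and non-null counts per column
print(df.describe(include="all"))   # summary stats across all columns
```

The shape, dtypes, and non-null counts from this overview feed directly into the quality checks that follow.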

Related Skills

Xlsx

Comprehensive spreadsheet creation, editing, and analysis with support for formulas, formatting, data analysis, and visualization. When Claude needs to work with spreadsheets (.xlsx, .xlsm, .csv, .tsv, etc.) for: (1) Creating new spreadsheets with formulas and formatting, (2) Reading or analyzing data, (3) Modifying existing spreadsheets while preserving formulas, (4) Data analysis and visualization in spreadsheets, or (5) Recalculating formulas

Clickhouse Io

ClickHouse database patterns, query optimization, analytics, and data engineering best practices for high-performance analytical workloads.

Analyzing Financial Statements

This skill calculates key financial ratios and metrics from financial statement data for investment analysis

Data Storytelling

Transform data into compelling narratives using visualization, context, and persuasive structure. Use when presenting analytics to stakeholders, creating data reports, or building executive presentations.

Kpi Dashboard Design

Design effective KPI dashboards with metrics selection, visualization best practices, and real-time monitoring patterns. Use when building business dashboards, selecting metrics, or designing data visualization layouts.

Dbt Transformation Patterns

Master dbt (data build tool) for analytics engineering with model organization, testing, documentation, and incremental strategies. Use when building data transformations, creating data models, or implementing analytics engineering best practices.

Sql Optimization Patterns

Master SQL query optimization, indexing strategies, and EXPLAIN analysis to dramatically improve database performance and eliminate slow queries. Use when debugging slow queries, designing database schemas, or optimizing application performance.

Anndata

This skill should be used when working with annotated data matrices in Python, particularly for single-cell genomics analysis, managing experimental measurements with metadata, or handling large-scale biological datasets. Use when tasks involve AnnData objects, h5ad files, single-cell RNA-seq data, or integration with scanpy/scverse tools.

Skill Information

Category: Data
Last Updated: 1/11/2026