Log Analytics

by My3VM


Generate and execute Python code to analyze large log datasets, detect patterns, and extract actionable insights


---
name: log-analytics
description: Generate and execute Python code to analyze large log datasets, detect patterns, and extract actionable insights
---

Log Analytics Skill

Purpose: Generate and execute Python code to analyze large log datasets, detect patterns, and extract actionable insights.

When to Use: When you need to analyze 500+ log entries, detect error patterns, calculate statistics, or perform time-series analysis on log data.

🚨 CRITICAL SECURITY RULE: ALL file paths MUST be relative to the project directory and start with `analytics/`. NEVER use `/tmp/`, `/private/tmp/`, or any path outside the project workspace.


🎯 Skill Overview

This skill guides you through:

  1. Fetching raw log data (1000+ entries)
  2. Generating Python analysis code tailored to the data structure
  3. Executing the code and interpreting results

CRITICAL: This skill uses progressive disclosure. You MUST read phase files in order.


🚀 Workflow

MANDATORY FIRST STEP:

Before using any tools, use the Read tool to read:

.claude/skills/log-analytics/phases/data-fetch.md

This file contains Phase 1 instructions and tells you which file to read next.

DO NOT proceed with tool calls until you've read Phase 1.

The complete workflow consists of 3 phases:

  1. Data Fetch (1-2 min) → phases/data-fetch.md
  2. Code Generation (2-3 min) → phases/code-generation.md
  3. Analysis Execution (1-2 min) → phases/analysis-execution.md

Each phase file contains a "Next Step" section directing you to the next phase.


🔑 Key Principles

Progressive Disclosure: Phase files reveal detailed instructions progressively. Read each phase file in sequence; do not skip ahead or assume you know what to do.

Dynamic Code Generation: Generate Python code based on the ACTUAL log structure returned. Don't use generic templates.

Structured Output: Always provide analysis results in JSON format with counts, percentages, and trends.

Save Your Work: Save generated scripts to analytics/ directory for reuse and auditing.
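To make the structured-output principle concrete, a final summary might look like the sketch below. All keys and values here are hypothetical illustrations, not a fixed schema the skill mandates; the generated script should shape its output around the fields actually present in the logs.

```python
import json

# Hypothetical analysis summary: counts, percentages, and trends in one
# JSON-serializable structure. Field names are illustrative assumptions.
summary = {
    "total_entries": 1342,
    "error_counts": {"TimeoutError": 212, "ConnectionReset": 87},
    "error_rate_pct": 22.3,
    "errors_per_hour": {"10:00": 54, "11:00": 245},
    "latency_ms": {"p95": 870, "p99": 1940},
    "anomalies": ["error spike at 11:05 in checkout-service"],
}

# Emit as JSON so downstream skills can parse the result directly.
print(json.dumps(summary, indent=2))
```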


📊 Expected Outputs

By the end of this skill execution, you will have:

  1. Raw log data saved to analytics/incident_logs.json
  2. Python analysis script saved to analytics/parse_logs_[timestamp].py
  3. Analysis results in JSON format showing:
    • Error counts by type
    • Time-based error distribution
    • Service-level breakdown
    • Performance metrics (p95, p99)
    • Detected anomalies

🔗 Integration

This skill can be invoked by other skills (e.g., incident-analysis) when they need deep log analysis.

From incident-analysis skill:

When log data exceeds 500 entries, invoke the log-analytics skill:
Use Skill tool → "log-analytics"

📁 MCP Tools Used

This skill requires the log-analytics-server MCP server, which provides:

  • get_raw_logs(incident_id, timeframe) - Fetch large log datasets
  • execute_analysis_script(script_path) - Run generated Python code

Ready to begin?

Use the Read tool to read: .claude/skills/log-analytics/phases/data-fetch.md


Skill Information

Category: Technical
Last Updated: 11/28/2025