Community Metrics
by 6529-Collections
Create new community metrics by adding enum values, recording functions, wiring, backfill migrations, and API integration. Use when adding new community metrics, creating metrics, or tracking community activity.
Creating New Community Metrics
This skill guides you through the complete process of creating a new community metric in the 6529 SEIZE Backend.
Overview
Community metrics track various activities and statistics. Creating a new metric involves six steps:

1. Add enum value to `MetricRollupHourMetric`
2. Create recording function in `MetricsRecorder`
3. Wire up the recorder in relevant code locations
4. Create backfill migration for historical data
5. Add field to `openapi.yaml` in the `/community-metrics` endpoint
6. Wire up the metric field in `CommunityMetricsService.getCommunityMetricsSummary`
Required Information
Before implementing, gather these details using AskUserQuestion:
1. Metric Name
- What should the metric be called?
- Use UPPER_SNAKE_CASE for the enum value
- Example: `DROPS_CREATED`, `WAVE_PARTICIPATIONS`, `RATINGS_GIVEN`
2. Recording Function Arguments
- What parameters does the recording function need?
- Common patterns:
  - Simple counter: `recordMetric(metric: MetricRollupHourMetric)`
  - With entity ID: `recordMetric(metric: MetricRollupHourMetric, entityId: string)`
  - With amount: `recordMetric(metric: MetricRollupHourMetric, amount: number)`
  - With identity: `recordMetric(metric: MetricRollupHourMetric, identityId: string)`
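The argument patterns above can be collapsed into a single recorder entry point. The following is a minimal sketch only: the real `MetricsRecorder` writes to the rollup table, while this stand-in records events in memory, and the enum values shown are examples, not the real metric list.

```typescript
// Stand-in for the real enum; values here are illustrative.
enum MetricRollupHourMetric {
  DROPS_CREATED = 'DROPS_CREATED',
  RATINGS_GIVEN = 'RATINGS_GIVEN'
}

interface MetricEvent {
  metric: MetricRollupHourMetric;
  amount: number; // defaults to 1 for simple counters
  entityId?: string;
  identityId?: string;
}

// In-memory sketch of the recorder, illustrating the call shapes only.
class MetricsRecorderSketch {
  readonly events: MetricEvent[] = [];

  async recordMetric(
    metric: MetricRollupHourMetric,
    opts: { amount?: number; entityId?: string; identityId?: string } = {}
  ): Promise<void> {
    this.events.push({
      metric,
      amount: opts.amount ?? 1,
      entityId: opts.entityId,
      identityId: opts.identityId
    });
  }
}
```

A simple counter would call `recordMetric(MetricRollupHourMetric.DROPS_CREATED)`, while weighted or per-identity metrics pass `amount` or `identityId` in the options bag.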
3. Recording Locations
- Where in the codebase should this metric be recorded?
- Consider:
- Which API endpoints create/modify the tracked activity?
- Which background loops process related data?
- Which services handle the business logic?
- Examples:
  - Drop creation: `drops.api.service.ts` in `createDrop()`
  - Wave participation: `waves.api.service.ts` in multiple methods
  - Ratings: `ratings.api.service.ts` in rating submission methods
4. Summary Aggregation Strategy
- How should the metric be aggregated in `getCommunityMetricsSummary`?
- Options:
  - Sum: Total count across all time (e.g., total drops created)
  - Count: Number of distinct occurrences
  - Average: Mean value over time
  - Latest: Most recent value
  - Custom: Complex calculation requiring joins or subqueries
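Each strategy maps to a different SQL aggregate over `metric_rollup_hour`. A hedged sketch of what those queries might look like, using the `metric`/`hour`/`value` column names from the backfill pattern in this guide; verify them against the real schema:

```typescript
// Illustrative queries per strategy; :metric is the bound-parameter style
// used elsewhere in this guide. Column names are assumptions to verify.
const aggregationQueries: Record<string, string> = {
  sum: `SELECT COALESCE(SUM(value), 0) AS total
        FROM metric_rollup_hour WHERE metric = :metric`,
  count: `SELECT COUNT(*) AS total
          FROM metric_rollup_hour WHERE metric = :metric AND value > 0`,
  average: `SELECT COALESCE(AVG(value), 0) AS total
            FROM metric_rollup_hour WHERE metric = :metric`,
  latest: `SELECT value AS total
           FROM metric_rollup_hour WHERE metric = :metric
           ORDER BY hour DESC LIMIT 1`
};
```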
Implementation Steps
Step 1: Add Enum Value
Find `MetricRollupHourMetric` (likely in `src/entities/` or `src/enums/`) and add the new metric:

```typescript
export enum MetricRollupHourMetric {
  // ... existing metrics
  NEW_METRIC_NAME = 'NEW_METRIC_NAME'
}
```
Step 2: Create Recording Function
In the `MetricsRecorder` class (find with Glob or Grep), add a method to record the metric:

```typescript
async recordNewMetricName(args: any): Promise<void> {
  await this.recordMetric(
    MetricRollupHourMetric.NEW_METRIC_NAME
    // additional arguments as needed
  );
}
```
Step 3: Wire Up Recording
In the identified locations, call the recording function:
```typescript
await this.metricsRecorder.recordNewMetricName(args);
```
Important: Ensure `MetricsRecorder` is available in the service. Check the constructor for dependency injection.
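Constructor injection might look like the sketch below. `DropsApiService` and `createDrop` are stand-ins for whichever service you identified, and the stub recorder only counts calls instead of writing rollup rows:

```typescript
// Stub recorder that counts calls; the real one writes to metric_rollup_hour.
class MetricsRecorderStub {
  calls = 0;
  async recordNewMetricName(): Promise<void> {
    this.calls += 1;
  }
}

class DropsApiService {
  // MetricsRecorder arrives via constructor injection.
  constructor(private readonly metricsRecorder: MetricsRecorderStub) {}

  async createDrop(title: string): Promise<{ title: string }> {
    const drop = { title }; // persist the drop first...
    await this.metricsRecorder.recordNewMetricName(); // ...then record the metric
    return drop;
  }
}
```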
Step 4: Create Backfill Migration
Run the migration command:
```shell
npm run migrate:new backfill-metric-name-metric
```
This creates two files:

- `migrations/TIMESTAMP-backfill-metric-name-metric.up.sql`
- `migrations/TIMESTAMP-backfill-metric-name-metric.down.sql`
In the `.up.sql` file, write SQL to backfill historical data. Common pattern:

```sql
-- Insert historical metric data
INSERT INTO metric_rollup_hour (metric, hour, value, created_at)
SELECT
  'NEW_METRIC_NAME' as metric,
  DATE_FORMAT(created_at, '%Y-%m-%d %H:00:00') as hour,
  COUNT(*) as value,
  NOW() as created_at
FROM relevant_table
WHERE created_at IS NOT NULL
GROUP BY DATE_FORMAT(created_at, '%Y-%m-%d %H:00:00')
ON DUPLICATE KEY UPDATE
  value = VALUES(value);
```
Delete the `.down.sql` file and make the `down()` function in the generated `.js` migration file a no-op:

```javascript
async down() {
  // Do nothing - we don't roll back metric backfills
}
```
Step 5: Update OpenAPI Schema
Find the `/community-metrics` endpoint in `openapi.yaml` and add the new field to the response schema:

```yaml
CommunityMetricsSummary:
  type: object
  properties:
    # ... existing metrics
    new_metric_name:
      type: integer
      description: Description of what this metric represents
```
After editing, regenerate types:
```shell
cd src/api-serverless && npm run restructure-openapi && npm run generate
```
Step 6: Wire Up in CommunityMetricsService
Find `CommunityMetricsService.getCommunityMetricsSummary` and add the metric aggregation based on the chosen strategy:

For Sum Strategy:
```typescript
const newMetricName = await this.db.execute<{ total: number }>(
  `SELECT COALESCE(SUM(value), 0) as total
   FROM metric_rollup_hour
   WHERE metric = :metric`,
  { metric: MetricRollupHourMetric.NEW_METRIC_NAME }
);

return {
  // ... existing metrics
  new_metric_name: newMetricName[0]?.total ?? 0
};
```
For Custom Strategy: Implement the specific SQL query needed for the aggregation logic.
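As one possible shape for a custom strategy, the sketch below counts distinct hours with any activity. The `Db` interface is a stand-in for the real query layer, and the column names are assumptions drawn from the backfill pattern in this guide:

```typescript
// Stand-in for the real query layer; only the shape matters here.
interface Db {
  execute<T>(sql: string, params: Record<string, unknown>): Promise<T[]>;
}

// Custom aggregation example: how many distinct hours saw any activity.
async function countActiveHours(db: Db, metric: string): Promise<number> {
  const rows = await db.execute<{ active_hours: number }>(
    `SELECT COUNT(DISTINCT hour) AS active_hours
     FROM metric_rollup_hour
     WHERE metric = :metric AND value > 0`,
    { metric }
  );
  return rows[0]?.active_hours ?? 0;
}
```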
Verification Checklist
After implementation, verify:

- Enum value added to `MetricRollupHourMetric`
- Recording function created in `MetricsRecorder`
- Recording function called in all identified locations
- Backfill migration created and SQL written
- OpenAPI schema updated with new field
- Types regenerated (`npm run restructure-openapi && npm run generate`)
- Aggregation logic added to `getCommunityMetricsSummary`
- Tests pass (`npm test`)
- Code builds (`npm run build`)
Common Patterns
Pattern: Simple Activity Counter
- Use For: Counting occurrences (drops created, votes cast)
- Arguments: Just the metric enum
- Aggregation: SUM of all values
Pattern: Identity-Specific Metric
- Use For: Tracking per-user activity
- Arguments: Metric enum + identity ID
- Aggregation: SUM or COUNT with GROUP BY identity
Pattern: Weighted Metric
- Use For: Metrics with varying values (reputation changes, token amounts)
- Arguments: Metric enum + amount
- Aggregation: SUM of amounts
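For the identity-specific pattern, per-identity totals would need an identity column on the rollup rows. That column is an assumption about the schema (check it before relying on this shape), sketched as:

```typescript
// Assumes the rollup table carries an identity_id column; verify against
// the real metric_rollup_hour schema before using this query shape.
const perIdentityTotalsSql = `
  SELECT identity_id, SUM(value) AS total
  FROM metric_rollup_hour
  WHERE metric = :metric
  GROUP BY identity_id
  ORDER BY total DESC`;
```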
Files to Locate
Use these search patterns to find relevant files:
- `MetricRollupHourMetric`: Grep `"enum MetricRollupHourMetric"`
- `MetricsRecorder`: Glob `"**/*MetricsRecorder*"`
- `CommunityMetricsService`: Glob `"**/CommunityMetricsService*"`
- Migration files: Check `migrations/` directory
- OpenAPI: `openapi.yaml` in project root
Example: Adding "Comments Created" Metric
- Enum: Add `COMMENTS_CREATED` to `MetricRollupHourMetric`
- Recorder: `async recordCommentCreated() { await this.recordMetric(MetricRollupHourMetric.COMMENTS_CREATED); }`
- Wiring: Call in `comments.api.service.ts` after comment creation
- Backfill: Aggregate from `comments` table grouped by hour
- OpenAPI: Add `comments_created: integer` to response
- Service: SUM all `COMMENTS_CREATED` values
Next Steps
- Use `AskUserQuestion` to gather the four required pieces of information
- Search the codebase to locate the necessary files
- Implement each step in order
- Run tests and build to verify
- Create migration and test backfill locally if possible
