# CPython Memory Tracker

A memory benchmarking and analysis tool for CPython development, designed to track memory usage patterns across different commits and build configurations.
## Architecture

This project consists of three main components:

- **Backend** (`/backend/`) - FastAPI application with a PostgreSQL database for data storage and API endpoints (tests run against in-memory SQLite)
- **Frontend** (`/frontend/`) - Next.js React application with rich data visualization and analysis tools
- **Worker** (`/worker/`) - Python CLI tool for running memory benchmarks on CPython commits
## Quick Start

### Prerequisites

- Docker Engine 20.10+ and Docker Compose 2.0+
- CPython source repository (for benchmarking with the worker)
### Setup & Installation

```shell
# Copy environment config
cp .env.example .env

# Build and start all services
docker compose -f docker-compose.dev.yml up --build
```
### Testing

Backend tests use an in-memory SQLite database, independent of the PostgreSQL instance used in development. Each test gets a fresh database with empty tables. Fixtures in `backend/tests/conftest.py` provide pre-built model instances (commits, binaries, environments, runs, benchmark results, auth tokens) that tests can depend on as needed. Requests go through `httpx.AsyncClient` with FastAPI's ASGI transport, so the full request/response cycle (middleware, dependency injection, validation) is exercised without a running server.
Both checks run in CI on pushes to `main` and on pull requests.
### Updating Backend Dependencies

```shell
# Edit backend/requirements.in, then regenerate both lockfiles:
docker run --rm -v "$(pwd)/backend:/app" -w /app python:3.13-slim-bookworm \
    sh -c "pip install --quiet pip-tools && \
        pip-compile --strip-extras --generate-hashes \
            --output-file requirements.txt requirements.in && \
        pip-compile --strip-extras --generate-hashes \
            --output-file requirements-dev.txt requirements-dev.in"

# Rebuild the backend container:
docker compose -f docker-compose.dev.yml up --build -d backend
```
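For context, pip-tools compiles loose, top-level pins in `requirements.in` into fully resolved, hash-pinned lockfiles. A hypothetical input file (illustrative only — the real file's contents may differ) looks like:

```
# backend/requirements.in — top-level dependencies only; pip-compile
# resolves and hash-pins the full transitive set into requirements.txt.
fastapi
```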
### Worker Setup

### Worker Usage

```shell
# Set authentication token
export MEMORY_TRACKER_TOKEN=your_token_here

# List available binaries and environments
memory-tracker list-binaries
memory-tracker list-environments

# Run benchmarks on CPython commits
memory-tracker benchmark /path/to/cpython HEAD~5..HEAD \
    --binary-id default \
    --environment-id linux-x86_64

# Parallel processing with 4 workers
memory-tracker benchmark /path/to/cpython HEAD~10..HEAD \
    --binary-id default \
    --environment-id linux-x86_64 \
    --max-workers 4

# Local checkout mode (sequential only)
memory-tracker benchmark /path/to/cpython HEAD~5..HEAD \
    --binary-id default \
    --environment-id linux-x86_64 \
    --local-checkout
```
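The commit-range argument uses git's standard revision-range syntax, so `HEAD~5..HEAD` selects the five most recent commits. Which commits a range covers can be previewed with `git rev-list`, demonstrated here in a throwaway repository:

```shell
# Build a throwaway repo with 6 commits, then show that HEAD~5..HEAD
# selects exactly the 5 most recent commits (newest first).
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "dev@example.com"
git config user.name "dev"
for i in 1 2 3 4 5 6; do
  echo "$i" > f.txt
  git add f.txt
  git commit -qm "commit $i"
done
git rev-list HEAD~5..HEAD | wc -l
```

Running `git rev-list HEAD~5..HEAD` against the real CPython checkout shows exactly which commits a benchmark invocation will cover.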
## Docker Support

```shell
# Development with hot reload
docker compose -f docker-compose.dev.yml up

# Production deployment
docker compose up
```
## Local Development (not recommended)

Running services directly on the host is possible but not recommended. Docker Compose ensures consistent Python/Node versions, database setup, and dependency isolation across all platforms.

### Prerequisites

- Python 3.13+
- Node.js 20+
```shell
make setup        # Install deps, init DB, populate mock data
make dev          # Start frontend + backend with hot reload
make test         # Run backend tests
make reset-db     # Drop and recreate database with fresh data
make populate-db  # Populate the DB with mock data
make build        # Build frontend for production
make clean        # Clean up generated files and caches
```
## Usage Examples

### Analyzing Memory Trends

Navigate to `/trends` to view memory usage over time.
### Comparing Commits

Navigate to `/diff` for commit comparison.

### Running Benchmarks

Benchmarks are run with the worker CLI against a CPython checkout (see Worker Usage above).
## Contributing

## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Related Projects