Thank you for your interest in contributing to Carbon! This document provides guidelines and information for contributors to help make the contribution process smooth and effective.
- About Carbon
- Development Setup
- Project Structure
- Code Style and Standards
- Testing
- Adding New Features
- Adding New Decoders
- Adding New Datasources
- Submitting Changes
- Release Process
Carbon is an indexing framework for Solana that provides a modular pipeline for sourcing data, decoding updates, and processing them to build end-to-end indexers. The project is organized as a Rust workspace with multiple crates covering different aspects of the indexing ecosystem.
- Rust: Version 1.82 or higher (see `rust-toolchain.toml` for the exact version)
- Git: For version control
- Cargo: Rust's package manager (included with Rust)
1. Clone the repository:

   ```sh
   git clone https://github.com/sevenlabs-hq/carbon.git
   cd carbon
   ```

2. Install dependencies:

   ```sh
   cargo build
   ```

3. Run tests:

   ```sh
   cargo test
   ```
To activate the pre-commit hooks, run:

```sh
./.pre-commit.sh
```

This will register the following checks that run on each commit:

- fmt: Checks code formatting using `cargo fmt --check`
- clippy: Runs `cargo clippy` to catch potential issues
- cargo_sort: Uses `cargo-sort` to ensure `Cargo.toml` files are sorted correctly
- machete: Checks for unused Cargo dependencies using `cargo-machete`
The Carbon project is organized as a Rust workspace with the following main components:
- `carbon-core`: The main framework providing pipeline orchestration
- `carbon-cli`: Command-line interface for generating decoders and scaffolding projects
- `carbon-macros`: Procedural macros for the framework
- `carbon-proc-macros`: Additional procedural macros
- `carbon-test-utils`: Testing utilities and helpers
Data source implementations for various Solana data streams:
- `carbon-rpc-block-subscribe-datasource`: WebSocket-based block subscription
- `carbon-rpc-program-subscribe-datasource`: Program-specific account updates
- `carbon-yellowstone-grpc-datasource`: Yellowstone gRPC Geyser client
- `carbon-helius-atlas-ws-datasource`: Helius Atlas WebSocket integration
- `carbon-jito-shredstream-grpc-datasource`: Jito ShredStream integration
- `carbon-rpc-block-crawler-datasource`: Historical block crawling
- `carbon-rpc-transaction-crawler-datasource`: Historical transaction crawling
Program-specific decoders for popular Solana programs:
- `carbon-token-program-decoder`: SPL Token program
- `carbon-jupiter-swap-decoder`: Jupiter swap program
- `carbon-raydium-amm-v4-decoder`: Raydium AMM v4
- `carbon-kamino-lending-decoder`: Kamino lending
- And many more...
- `carbon-log-metrics`: Log-based metrics collection
- `carbon-prometheus-metrics`: Prometheus metrics export
Working examples demonstrating various use cases:
- `block-finality-alerts`: Block processing example
- `jupiter-swap-alerts`: Jupiter swap monitoring
- `kamino-alerts`: Kamino lending monitoring
- `token-indexing`: Token account indexing with PostgreSQL
- And more...
Carbon follows standard Rust conventions and best practices:
- Formatting: Use `cargo fmt` to format code
- Linting: Use `cargo clippy` for linting
- Documentation: Document all public APIs with doc comments
Run the following commands to ensure code quality:

```sh
# Format code
./scripts/cargo-fmt.sh

# Run clippy with strict settings
./scripts/cargo-clippy.sh

# Run tests
cargo test

# Check for unused dependencies
cargo machete
```

The project uses a strict clippy configuration defined in `clippy.toml`:

- Minimum Rust version: 1.82
- Maximum stack size for large types: 128 bytes
- Denies warnings, default trait access, arithmetic side effects, manual let-else, and used underscore bindings
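As a rough illustration, the first two settings could appear in a `clippy.toml` like the fragment below. The key names here are assumptions for illustration, and the denied lints themselves are typically configured through lint attributes or Cargo `[lints]` tables rather than `clippy.toml`; treat the repository's actual `clippy.toml` as authoritative.

```toml
# Hypothetical sketch -- check the real clippy.toml in the repository.
msrv = "1.82"
too-large-for-stack = 128
```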
```sh
# Run all tests
cargo test

# Run tests for a specific crate
cargo test -p carbon-core

# Run tests with output
cargo test -- --nocapture
```

- Unit tests: Located in `src/` files with `#[cfg(test)]` modules
- Integration tests: Located in `tests/` directories
- Examples: Working examples in the `examples/` directory serve as integration tests
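For instance, the in-file unit-test convention looks like this. The helper function is a made-up example for illustration, not part of carbon-core:

```rust
// Hypothetical helper, used only to show the #[cfg(test)] convention.
pub fn lamports_to_sol(lamports: u64) -> f64 {
    lamports as f64 / 1_000_000_000.0
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn converts_one_sol() {
        // 1 SOL is exactly one billion lamports.
        assert!((lamports_to_sol(1_000_000_000) - 1.0).abs() < f64::EPSILON);
    }
}
```

Run such tests with `cargo test` (or `cargo test -p your-crate` inside the workspace).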
The carbon-test-utils crate provides common testing utilities and fixtures for:
- Mock datasources
- Test data generation
- Common test setup patterns
1. Create a feature branch
2. Implement the feature following the code style guidelines
3. Add tests for your new functionality
4. Update documentation including README files and doc comments
5. Run quality checks:

   ```sh
   ./scripts/cargo-clippy.sh
   ./scripts/cargo-fmt.sh
   cargo test
   ```
When adding a new crate to the workspace:
- Create the crate structure in the appropriate directory
- Add the crate to the `members` list in the workspace `Cargo.toml`:

  ```toml
  members = [
      # ... existing members
      "your-new-crate",
  ]
  ```

- Add dependencies to the workspace dependencies section
- Update the publish script in `scripts/publish-crate.sh` if the crate should be published
Each decoder follows a consistent structure:
```
decoders/your-program-decoder/
├── Cargo.toml
├── README.md
├── src/
│   ├── lib.rs
│   ├── account/
│   │   ├── mod.rs
│   │   └── decoder.rs
│   ├── instruction/
│   │   ├── mod.rs
│   │   └── decoder.rs
│   └── types/
│       ├── mod.rs
│       └── types.rs
└── tests/
    └── fixtures/
```
1. Generate the decoder using the CLI (recommended):

   ```sh
   carbon-cli parse --idl program_address -u mainnet-beta --output ./decoders/your-program-decoder
   ```

2. Manual creation:

   - Create the directory structure
   - Implement the `AccountDecoder` and `InstructionDecoder` traits
   - Add proper error handling
   - Include comprehensive tests

3. Add to workspace:

   - Update the workspace `Cargo.toml` dependencies
   - Add to the publish script if needed
- Error handling: Use proper error types and provide meaningful error messages
- Documentation: Document all public APIs and complex logic
- Testing: Include unit tests and integration tests with real transaction data
- Performance: Optimize for performance, especially for high-frequency updates
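To make the shape of a decoder concrete, here is a heavily simplified, self-contained sketch of instruction decoding. The trait, enum, and struct below are stand-ins invented for illustration, not carbon-core's actual `InstructionDecoder` API; generate a real decoder with `carbon-cli` or study an existing one such as `carbon-token-program-decoder`.

```rust
// Stand-in types for illustration -- carbon-core's real trait differs.
#[derive(Debug, PartialEq)]
pub enum MyInstruction {
    Transfer { amount: u64 },
}

pub trait SimpleInstructionDecoder {
    // Return None for data this decoder does not recognize,
    // rather than panicking on unknown input.
    fn decode(&self, data: &[u8]) -> Option<MyInstruction>;
}

pub struct MyProgramDecoder;

impl SimpleInstructionDecoder for MyProgramDecoder {
    fn decode(&self, data: &[u8]) -> Option<MyInstruction> {
        // Dispatch on a one-byte discriminator, then parse the payload.
        let (&tag, rest) = data.split_first()?;
        match tag {
            3 if rest.len() >= 8 => {
                let amount = u64::from_le_bytes(rest[..8].try_into().ok()?);
                Some(MyInstruction::Transfer { amount })
            }
            _ => None,
        }
    }
}
```

Returning `Option` (or a `Result` with a descriptive error type) instead of panicking is what the error-handling guideline above means in practice: unknown or truncated instruction data must never bring down the pipeline.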
Each datasource follows a consistent structure:
```
datasources/your-datasource/
├── Cargo.toml
├── README.md
└── src/
    └── lib.rs
```
1. Implement the `Datasource` trait:

   ```rust
   use async_trait::async_trait;
   use carbon_core::datasource::{Datasource, Update, UpdateType};

   pub struct YourDatasource;

   #[async_trait]
   impl Datasource for YourDatasource {
       async fn consume(
           &self,
           sender: &tokio::sync::mpsc::UnboundedSender<Update>,
           cancellation_token: CancellationToken,
       ) -> CarbonResult<()> {
           // Implementation
       }

       fn update_types(&self) -> Vec<UpdateType> {
           vec![UpdateType::AccountUpdate, UpdateType::Transaction]
       }
   }
   ```

2. Add configuration options for flexibility
3. Include proper error handling and retry logic
4. Add comprehensive tests with mocked data
- Reliability: Implement proper error handling and retry mechanisms
- Performance: Optimize for throughput and low latency
- Configuration: Provide flexible configuration options
- Monitoring: Include metrics and logging for observability
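The retry guidance above can be sketched as a plain backoff loop. This is synchronous for brevity (a real datasource would use tokio timers and honor its cancellation token), and the function name is invented for illustration:

```rust
use std::{thread, time::Duration};

/// Retry `op` up to `max_attempts` times with exponential backoff.
/// Illustrative only: production code would also cap the delay and add jitter.
pub fn with_retries<T, E>(
    max_attempts: u32,
    base_delay: Duration,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 0;
    loop {
        match op() {
            Ok(value) => return Ok(value),
            Err(err) => {
                attempt += 1;
                if attempt >= max_attempts {
                    // Out of attempts: surface the last error to the caller.
                    return Err(err);
                }
                // Delay doubles after each failure: base, 2*base, 4*base, ...
                thread::sleep(base_delay * 2u32.saturating_pow(attempt - 1));
            }
        }
    }
}
```

A datasource's `consume` loop would wrap its connection setup in something like this so that transient network failures trigger reconnection instead of terminating the pipeline.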
1. Fork the repository and create a feature branch
2. Make your changes following the guidelines above
3. Test thoroughly:

   ```sh
   cargo test
   ./scripts/cargo-clippy.sh
   ./scripts/cargo-fmt.sh
   ```

4. Update documentation as needed
5. Create a pull request with:
   - Clear description of changes
   - Link to any related issues
   - Screenshots or examples if applicable
- Title: Use conventional commit format (e.g., "feat: add new decoder for X program")
- Description: Explain what, why, and how
- Tests: Ensure all tests pass
- Documentation: Update relevant documentation
- Breaking changes: Clearly mark and explain breaking changes
Use the conventional commit format:

```
type(scope): description

[optional body]

[optional footer]
```

Types: `feat`, `fix`, `docs`, `style`, `refactor`, `test`, `chore`
- Version: Managed in the `Cargo.toml` workspace package section
- Rust version: Specified in `Cargo.toml` and `clippy.toml`
- Dependencies: All workspace dependencies are centralized
- Issues: Use GitHub issues for bug reports and feature requests
- Discussions: Use GitHub discussions for questions and general discussion
- Documentation: Check the README and example projects
- Examples: Review the `examples/` directory for working implementations
Please be respectful and inclusive in all interactions. We welcome contributors from all backgrounds and experience levels.
By contributing to Carbon, you agree that your contributions will be licensed under the MIT License.
Thank you for contributing to Carbon! Your contributions help make Solana indexing more accessible and powerful for the entire ecosystem.