Supabase ETL

Stream your Postgres data anywhere in real-time. Simple Rust building blocks for change data capture (CDC) pipelines.

A Rust crate for quickly building replication solutions for Postgres. Use it to build data pipelines that continually copy data from Postgres to other systems.

This crate builds abstractions on top of Postgres's logical streaming replication protocol, steering users toward the pit of success without requiring them to worry about the protocol's low-level details.

Table of Contents

  • Features
  • Installation
  • Quickstart
  • Database Setup
  • Running Tests
  • Docker
  • Architecture
  • Troubleshooting
  • Performance Considerations
  • License

Features

The etl crate supports the following destinations:

  • BigQuery
  • Apache Iceberg (planned)
  • DuckDB (planned)

Installation

To use etl in your Rust project, add the core library and the desired destinations as git dependencies in Cargo.toml:

[dependencies]
etl = { git = "https://github.com/supabase/etl" }
etl-destinations = { git = "https://github.com/supabase/etl", features = ["bigquery"] }

The etl crate provides the core replication functionality, while etl-destinations contains the destination-specific implementations. Each destination is gated behind a feature of the same name in the etl-destinations crate. The git dependencies are needed for now because the crates are not yet published on crates.io.
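
With the bigquery feature enabled, the BigQuery destination becomes importable from etl-destinations. The import path below is an assumption for illustration rather than a confirmed module layout; consult the crate documentation for the actual path:

// Hypothetical, illustrative import path: the "bigquery" feature on
// etl-destinations is expected to expose the BigQuery destination
// roughly like this.
use etl_destinations::bigquery::BigQueryDestination;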

Quickstart

To quickly get started with etl, see the etl-examples crate, which contains practical examples and detailed setup instructions.
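
For orientation, the sketch below shows roughly what a pipeline looks like end to end. Every identifier in it (module paths, type names, constructor and method signatures) is an assumption made for illustration, not the crate's confirmed API, and it assumes an async runtime such as tokio. The etl-examples crate contains the real, working code.

// A minimal sketch only: every name below (module paths, types, methods)
// is assumed for illustration and is not the crate's verified API.
use etl::pipeline::Pipeline;                          // hypothetical path
use etl::config::PgConnectionConfig;                  // hypothetical path
use etl_destinations::bigquery::BigQueryDestination;  // hypothetical path

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Source database: logical replication must be enabled and a
    // publication created for the tables to be copied.
    let source = PgConnectionConfig::new("localhost", 5432, "mydb", "postgres", "password");

    // Destination, gated behind the "bigquery" feature of etl-destinations.
    let destination = BigQueryDestination::new("my-gcp-project", "my_dataset").await?;

    // The pipeline first copies existing table data, then streams ongoing
    // changes from the publication via logical replication.
    let mut pipeline = Pipeline::new(source, "my_publication", destination);
    pipeline.start().await?;
    Ok(())
}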

Database Setup

Before running the examples, tests, or the API and replicator components, you'll need a PostgreSQL database with logical replication enabled (Postgres requires wal_level = logical, and typically a publication covering the tables to replicate). We provide a convenient setup script; for detailed instructions on using it, please refer to our Database Setup Guide.

Running Tests

To run the test suite:

cargo test --all-features

Docker

The repository includes Docker support for both the replicator and api components:

# Build replicator image
docker build -f ./etl-replicator/Dockerfile .

# Build api image
docker build -f ./etl-api/Dockerfile .

Architecture

For a detailed explanation of the ETL architecture and design decisions, please refer to our Design Document.

Troubleshooting

Too Many Open Files Error

If you see the following error when running tests on macOS:

called `Result::unwrap()` on an `Err` value: Os { code: 24, kind: Uncategorized, message: "Too many open files" }

Raise the per-process open-file limit for the current shell session with:

ulimit -n 10000

Performance Considerations

Currently, the system parallelizes the copying of different tables, but each individual table is still copied in sequential batches. This limits performance for large tables. We plan to address this once the ETL system reaches greater stability.

License

Distributed under the Apache-2.0 License. See LICENSE for more information.
