This project contains an end-to-end pipeline for the `dvd_rental`
sample database. It:
- extracts data from a local database using Airbyte
- loads the raw data into Snowflake
- applies a series of transformations using dbt, including the creation of fact and dimension tables, and
- creates a couple of views for analytics use
To run this code locally:
- Download the sample database and load it into your local Postgres instance.
- Run `make get-airbyte` from the repo root to download the Airbyte repo, then `cd` into the repo and run `./run-ab-platform.sh` to run Airbyte locally on Docker.
- Navigate to `localhost:8000` and set up a source and a destination in order to extract the `dvd_rental` database from your local Postgres instance into your data warehouse.
- Once your data exists in your data warehouse, update your dbt settings with the necessary credentials, then `cd` into the `transformation` directory and apply the transformations with `dbt run`.
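The "update your dbt settings" step usually means editing `~/.dbt/profiles.yml`. A minimal sketch for a Snowflake target follows; the profile name must match the one in the project's `dbt_project.yml`, and every value shown here is a placeholder, not a credential from this project:

```yaml
# Hypothetical dbt profile for Snowflake -- all values are placeholders.
dvd_rental:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: your_account_id
      user: your_username
      password: your_password
      role: your_role
      database: your_database
      warehouse: your_warehouse
      schema: your_schema
      threads: 4
```

With this in place, `dbt debug` from the `transformation` directory is a quick way to confirm the connection before running `dbt run`.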
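For the first step, the restore might look like the following sketch. It assumes the standard `dvdrental.tar` dump from postgresqltutorial.com and a local superuser named `postgres`; both the filename and the user are assumptions, so adjust them to your setup:

```shell
# Create the target database and restore the sample dump into it.
# Database name, dump filename, and user are assumptions.
createdb -U postgres dvd_rental
pg_restore -U postgres -d dvd_rental dvdrental.tar
```

Airbyte will then read from this database when you configure it as a Postgres source.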