Repository: cnstlungu/portable-data-stack-dagster
Branch: main
Commit: 9183b35e9c67
Files: 13
Total size: 10.3 KB

Directory structure:
gitextract_5c6bhaqa/
├── .gitignore
├── LICENSE
├── README.md
├── dagster/
│   ├── Dockerfile
│   ├── definitions.py
│   └── entrypoint.sh
├── dbt/
│   └── Dockerfile
├── docker-compose.yml
├── generator/
│   └── Dockerfile
├── shared/
│   ├── db/
│   │   ├── .keep
│   │   └── datamart.duckdb.example
│   └── parquet/
│       └── .keep
└── superset/
    └── Dockerfile

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitignore
================================================
.venv
*.egg-info/
*.parquet
*.db
tmp*
__pycache__
target/
dbt_packages/
logs/
*.duckdb
*.env
**/.logs_queue
**/.nux
**/.telemetry
**/history
**/schedules
**/storage
dbt/postcard_company/*

# Keeps
!geography.csv
!dashboard.json

================================================
FILE: LICENSE
================================================
MIT License

Copyright (c) 2023 Constantin Lungu

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

================================================
FILE: README.md
================================================
# Portable Data Stack

This application is an analytics suite for an imaginary company selling postcards. The company sells both directly and through resellers in the majority of European countries.

## Stack

- Dagster
- Docker (Docker Compose)
- DuckDB
- dbt core
- Superset

## Interested in the data model?

Generation of the example data and the underlying dbt-core model are available in the [postcard-company-datamart](https://github.com/cnstlungu/postcard-company-datamart) project.

## For other stacks using the same dbt-core model, check out the following:

- [portable-data-stack-mage](https://github.com/cnstlungu/portable-data-stack-mage)
- [portable-data-stack-airflow](https://github.com/cnstlungu/portable-data-stack-airflow)
- [portable-data-stack-sqlmesh](https://github.com/cnstlungu/portable-data-stack-sqlmesh)

Implementations using other tools and a different data model:

- [portable-data-stack-bruin](https://github.com/cnstlungu/portable-data-stack-bruin)
- [postcard-company-dataform](https://github.com/cnstlungu/postcard-company-dataform)

For the legacy version involving OLTP, CSV and JSON sources, check out the `legacy-oltp` branch.

### System requirements

* [Docker](https://docs.docker.com/engine/install/)

## Setup

1. Rename the `.env.example` file to `.env` and set your desired password.
   Remember to never commit files containing passwords or any other sensitive information.
2. Rename `shared/db/datamart.duckdb.example` to `shared/db/datamart.duckdb`, or initialize an empty database file with that name there.
3. With **Docker Engine** installed, change directory to the root folder of the project (the one that contains `docker-compose.yml`) and run `docker compose up --build`. Note that this may take several minutes to complete. Check the console output to see when the Dagster interface is ready.
4. Once the Docker suite has finished loading, open [Dagster (dagit)](http://localhost:3000), go to `Assets`, select all and click `Materialize selected`.

   ![Dagit](resources/dagit.png "Dagit")

5. When the assets have been materialized, you can open the [Superset interface](http://localhost:8088).

### Demo Credentials

Demo credentials are set in the `.env` file mentioned above.

### Ports exposed locally

* Dagster (dagit): 3000
* Superset: 8088

Generated parquet files are saved in the **shared** folder.

The data is fictional and automatically generated. Any similarities with existing persons, entities, products or businesses are purely coincidental.

### General flow

1. Generate test data as parquet files using Python
2. Import the data into the staging area of the Data Warehouse (DuckDB), orchestrated by Dagster
3. Model the data, build fact and dimension tables, and load the Data Warehouse using dbt, which:
   - installs dbt dependencies
   - seeds the database with static data (e.g. geography)
   - runs the model
   - tests the model
4. Analyze and visually explore the data using Superset, or directly query the Data Warehouse database instance

For Superset, the default credentials are: user = `admin`, password = `admin`.

## Overview of architecture

The Docker Compose process will build the application suite. The suite is made up of the following components, each within its own Docker container:

* **generator**: a collection of Python scripts that generate, insert and export the example data
* **dbt**: the data model, sourced from the [postcard-company-datamart](https://github.com/cnstlungu/postcard-company-datamart) project
* **dagster**: the orchestrator that triggers the ETL tasks; its GUI is available locally on port 3000
* **superset**: the web-based Business Intelligence application we will use to explore the data; exposed on port 8088

Once the Docker build process has completed, we may open the Dagster GUI (locally: localhost:3000) to view and materialize our assets.
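
As an alternative to Superset, the materialized warehouse can also be inspected straight from the host with DuckDB's Python client. The following is only a minimal sketch (it assumes the `duckdb` package is installed locally and that it is run from the project root); it lists the objects that were built rather than querying a specific table, since table names depend on the dbt model:

```python
import duckdb

# Open the shared warehouse file read-only so we don't interfere with Dagster/dbt writes.
con = duckdb.connect("shared/db/datamart.duckdb", read_only=True)

# List every table and view built by the pipeline.
print(con.sql("SHOW ALL TABLES"))

# Example of an ad-hoc catalog query; swap in any table from the datamart instead.
print(con.sql("SELECT table_schema, table_name FROM information_schema.tables"))

con.close()
```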

![Dagster](resources/orchestration.png "Orchestration with Dagster")

After the assets have been materialized, you can either analyze the data using the querying and visualization tools provided by Superset (available locally on port 8088), or query the Data Warehouse directly (available as a DuckDB database), for example with the snippet shown above.

![Apache Superset](resources/superset.png "Superset")

## Credits

Inspired by:

- [Build a poor man’s data lake from scratch with DuckDB](https://dagster.io/blog/duckdb-data-lake)
- [Using dbt with Dagster software-defined assets](https://docs.dagster.io/integrations/dbt/using-dbt-with-dagster)

================================================
FILE: dagster/Dockerfile
================================================
FROM python:3.11-slim

RUN apt-get update && apt-get install -y --no-install-recommends curl && rm -rf /var/lib/apt/lists/*

RUN pip install uv

RUN uv pip install --system dagster==1.11.13 \
    dagster-dbt==0.27.13 \
    duckdb==1.4.0 \
    dbt-core==1.10.13 \
    dbt-duckdb==1.9.6 \
    dagster-duckdb==0.27.13 \
    dagster-webserver==1.11.13 \
    "pydantic<2.9.0" \
    "watchdog<5"

WORKDIR /

COPY dagster/definitions.py /definitions.py
COPY dagster/entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh

ENTRYPOINT ["/entrypoint.sh"]
CMD ["-h", "0.0.0.0", "-f", "/definitions.py"]

================================================
FILE: dagster/definitions.py
================================================
from pathlib import Path

from dagster import Definitions, AssetExecutionContext
from dagster_dbt import DbtCliResource, dbt_assets

# Paths inside the container
DBT_PROJECT_DIR = Path("/postcard_company")
DBT_PROFILES_DIR = Path("/postcard_company")  # profiles.yml lives in the same directory

# dbt CLI resource Dagster will use to run dbt
dbt_resource = DbtCliResource(
    project_dir=str(DBT_PROJECT_DIR),
    profiles_dir=str(DBT_PROFILES_DIR),
)

# dbt manifest produced by `dbt compile` or `dbt build`
MANIFEST_PATH = DBT_PROJECT_DIR / "target" / "manifest.json"


@dbt_assets(manifest=MANIFEST_PATH)
def postcard_company_dbt_assets(
    context: AssetExecutionContext,
    dbt: DbtCliResource,
):
    # You can change to ["run"], ["test"], etc.
    yield from dbt.cli(["build"], context=context).stream()


defs = Definitions(
    assets=[postcard_company_dbt_assets],
    resources={"dbt": dbt_resource},
)

================================================
FILE: dagster/entrypoint.sh
================================================
#!/bin/bash
set -e

echo "Running dbt setup..."
cd /postcard_company
dbt deps
dbt seed
dbt compile

echo "Starting Dagster..."
# Forward CMD arguments to Dagster
exec dagster dev "$@"

================================================
FILE: dbt/Dockerfile
================================================
FROM python:3.11-slim

RUN apt-get update && apt-get install -y git make automake gcc g++ subversion && rm -rf /var/lib/apt/lists/*

RUN git clone -n --depth=1 --filter=tree:0 https://github.com/cnstlungu/postcard-company-datamart.git /datamart

WORKDIR /datamart

RUN git sparse-checkout set --no-cone postcard_company

CMD ["git", "checkout"]

================================================
FILE: docker-compose.yml
================================================
services:
  generator:
    build:
      context: .
      dockerfile: ./generator/Dockerfile
    volumes:
      - ./shared:/shared
    environment:
      INPUT_FILES_PATH: /shared/parquet

  dbt:
    build:
      context: .
      dockerfile: ./dbt/Dockerfile
    volumes:
      - ./shared:/shared
      - ./dbt/postcard_company:/datamart/postcard_company
    environment:
      INPUT_FILES_PATH: /shared/parquet
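
  # The dagster service below runs the orchestrator: it mounts the shared DuckDB database
  # and the dbt project checked out by the dbt service, and it is started only after the
  # generator and dbt one-shot containers have exited successfully (see depends_on).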
  dagster:
    build:
      context: .
      dockerfile: ./dagster/Dockerfile
    restart: always
    environment:
      DUCKDB_FILE_PATH: /shared/db/datamart.duckdb
      INPUT_FILES_PATH: /shared/parquet
    volumes:
      - ./shared:/shared
      - ./dbt/postcard_company:/postcard_company
    ports:
      - "3000:3000"
    depends_on:
      generator:
        condition: service_completed_successfully
      dbt:
        condition: service_completed_successfully

  superset:
    build:
      context: .
      dockerfile: ./superset/Dockerfile
      args:
        SUPERSET_ADMIN: $SUPERSET_ADMIN
        SUPERSET_PASSWORD: $SUPERSET_PASSWORD
        SUPERSET_SECRET_KEY: ${SUPERSET_SECRET_KEY}
    environment:
      SUPERSET_SECRET_KEY: ${SUPERSET_SECRET_KEY}
    ports:
      - "8088:8088"
    command: gunicorn --bind "0.0.0.0:8088" --access-logfile '-' --error-logfile '-' --workers 1 --worker-class gthread --threads 20 --timeout 60 --limit-request-line 0 --limit-request-field_size 0 "superset.app:create_app()"
    post_start:
      - command: "superset import-dashboards -p ./dashboard.zip -u ${SUPERSET_ADMIN}"
    volumes:
      - ./shared/db:/app/superset_home/db
    depends_on:
      - dagster

================================================
FILE: generator/Dockerfile
================================================
FROM python:3.11-slim

RUN apt-get update && apt-get install -y git make automake gcc g++ subversion

RUN git clone -n --depth=1 --filter=tree:0 https://github.com/cnstlungu/postcard-company-datamart.git /generator

WORKDIR /generator

RUN git sparse-checkout set --no-cone generator && git checkout

WORKDIR /generator/generator

RUN pip install uv
RUN uv pip install --system -r requirements.txt

CMD ["python3", "generate.py"]

================================================
FILE: shared/db/.keep
================================================

================================================
FILE: shared/parquet/.keep
================================================

================================================
FILE: superset/Dockerfile
================================================
FROM apache/superset:4.1.1

ARG SUPERSET_ADMIN
ARG SUPERSET_PASSWORD
ARG SUPERSET_SECRET_KEY

# Switching to root to install the required packages
USER root

RUN pip install uv

COPY --chown=superset:superset ./superset/assets .

RUN uv pip install --system duckdb-engine==0.17.0 duckdb==1.4.4

USER superset

RUN superset fab create-admin \
    --username ${SUPERSET_ADMIN} \
    --firstname Superset \
    --lastname Admin \
    --email admin@example.com \
    --password ${SUPERSET_PASSWORD}

RUN superset db upgrade
RUN superset init
RUN superset set_database_uri -d DW -u duckdb:///superset_home/db/datamart.duckdb
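
# Note: the "DW" connection registered above points at the DuckDB file that
# docker-compose mounts from ./shared/db into the Superset container, i.e. the
# same datamart.duckdb the Dagster/dbt pipeline writes to.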