Repository: starburstdata/dbt-trino
Branch: master
Commit: 5813b01fa239
Files: 164
Total size: 451.5 KB
Directory structure:
gitextract_za4id763/
├── .changes/
│ ├── 0.0.0.md
│ ├── 1.10.0/
│ │ └── Features-20251210-194211.yaml
│ ├── 1.10.0.md
│ ├── 1.10.1/
│ │ └── Dependencies-20260115-092226.yaml
│ ├── 1.10.1.md
│ ├── header.tpl.md
│ └── unreleased/
│ └── .gitkeep
├── .changie.yaml
├── .flake8
├── .github/
│ ├── ISSUE_TEMPLATE/
│ │ ├── bug_report.yml
│ │ ├── config.yml
│ │ └── feature_request.yml
│ ├── dependabot.yml
│ ├── pull_request_template.md
│ └── workflows/
│ ├── bot-changelog.yml
│ ├── changelog-existence.yml
│ ├── ci.yml
│ ├── release.yml
│ ├── security.yml
│ └── version-bump.yml
├── .gitignore
├── .pre-commit-config.yaml
├── CHANGELOG.md
├── CONTRIBUTING.md
├── LICENSE.txt
├── Makefile
├── README.md
├── dbt/
│ ├── adapters/
│ │ └── trino/
│ │ ├── __init__.py
│ │ ├── __version__.py
│ │ ├── catalogs/
│ │ │ ├── __init__.py
│ │ │ ├── _relation.py
│ │ │ └── _trino_catalog_metastore.py
│ │ ├── column.py
│ │ ├── connections.py
│ │ ├── constants.py
│ │ ├── impl.py
│ │ ├── parse_model.py
│ │ └── relation.py
│ └── include/
│ └── trino/
│ ├── __init__.py
│ ├── dbt_project.yml
│ ├── macros/
│ │ ├── adapters.sql
│ │ ├── apply_grants.sql
│ │ ├── catalog.sql
│ │ ├── materializations/
│ │ │ ├── incremental.sql
│ │ │ ├── materialized_view.sql
│ │ │ ├── seeds/
│ │ │ │ └── helpers.sql
│ │ │ ├── snapshot.sql
│ │ │ ├── table.sql
│ │ │ └── view.sql
│ │ └── utils/
│ │ ├── any_value.sql
│ │ ├── array_append.sql
│ │ ├── array_concat.sql
│ │ ├── array_construct.sql
│ │ ├── bool_or.sql
│ │ ├── datatypes.sql
│ │ ├── date_spine.sql
│ │ ├── date_trunc.sql
│ │ ├── dateadd.sql
│ │ ├── datediff.sql
│ │ ├── hash.sql
│ │ ├── listagg.sql
│ │ ├── right.sql
│ │ ├── safe_cast.sql
│ │ ├── split_part.sql
│ │ └── timestamps.sql
│ └── sample_profiles.yml
├── dev_requirements.txt
├── docker/
│ ├── init_starburst.bash
│ ├── init_trino.bash
│ ├── remove_starburst.bash
│ ├── remove_trino.bash
│ ├── starburst/
│ │ ├── catalog/
│ │ │ ├── delta.properties
│ │ │ ├── hive.properties
│ │ │ ├── iceberg.properties
│ │ │ ├── memory.properties
│ │ │ ├── postgresql.properties
│ │ │ └── tpch.properties
│ │ └── etc/
│ │ ├── config.properties
│ │ ├── jvm.config
│ │ └── node.properties
│ └── trino/
│ ├── catalog/
│ │ ├── delta.properties
│ │ ├── hive.properties
│ │ ├── iceberg.properties
│ │ ├── memory.properties
│ │ ├── postgresql.properties
│ │ └── tpch.properties
│ └── etc/
│ ├── config.properties
│ ├── jvm.config
│ └── node.properties
├── docker-compose-starburst.yml
├── docker-compose-trino.yml
├── mypy.ini
├── pytest.ini
├── setup.py
├── tests/
│ ├── conftest.py
│ ├── functional/
│ │ └── adapter/
│ │ ├── behavior_flags/
│ │ │ └── test_require_certificate_validation.py
│ │ ├── catalog_integrations/
│ │ │ ├── fixtures.py
│ │ │ └── test_catalog_integration.py
│ │ ├── column_types/
│ │ │ ├── fixtures.py
│ │ │ └── test_column_types.py
│ │ ├── constraints/
│ │ │ ├── fixtures.py
│ │ │ └── test_constraints.py
│ │ ├── dbt_clone/
│ │ │ └── test_dbt_clone.py
│ │ ├── dbt_debug/
│ │ │ └── test_dbt_debug.py
│ │ ├── dbt_show/
│ │ │ └── test_dbt_show.py
│ │ ├── empty/
│ │ │ └── test_empty.py
│ │ ├── fixture_datediff.py
│ │ ├── hooks/
│ │ │ ├── data/
│ │ │ │ ├── seed_model.sql
│ │ │ │ └── seed_run.sql
│ │ │ ├── test_hooks_delete.py
│ │ │ ├── test_model_hooks.py
│ │ │ └── test_run_hooks.py
│ │ ├── materialization/
│ │ │ ├── fixtures.py
│ │ │ ├── test_incremental_delete_insert.py
│ │ │ ├── test_incremental_merge.py
│ │ │ ├── test_incremental_microbatch.py
│ │ │ ├── test_incremental_predicates.py
│ │ │ ├── test_incremental_schema.py
│ │ │ ├── test_incremental_views_enabled.py
│ │ │ ├── test_materialized_view.py
│ │ │ ├── test_on_table_exists.py
│ │ │ ├── test_prepared_statements.py
│ │ │ ├── test_snapshot.py
│ │ │ └── test_view_security.py
│ │ ├── materialized_view_tests/
│ │ │ ├── test_materialized_view_dbt_core.py
│ │ │ └── utils.py
│ │ ├── persist_docs/
│ │ │ ├── fixtures.py
│ │ │ └── test_persist_docs.py
│ │ ├── show/
│ │ │ ├── fixtures.py
│ │ │ └── test_show.py
│ │ ├── simple_seed/
│ │ │ ├── seed_bom.csv
│ │ │ ├── seeds.py
│ │ │ └── test_seed.py
│ │ ├── store_failures/
│ │ │ ├── fixtures.py
│ │ │ └── test_store_failures.py
│ │ ├── test_basic.py
│ │ ├── test_caching.py
│ │ ├── test_changing_relation_type.py
│ │ ├── test_concurrency.py
│ │ ├── test_custom_schema.py
│ │ ├── test_ephemeral.py
│ │ ├── test_get_incremental_tmp_relation_type_macro.py
│ │ ├── test_grants.py
│ │ ├── test_query_comments.py
│ │ ├── test_quote_policy.py
│ │ ├── test_sample_mode.py
│ │ ├── test_seeds_column_types_overrides.py
│ │ ├── test_session_property.py
│ │ ├── test_simple_copy.py
│ │ ├── test_simple_snapshot.py
│ │ ├── test_sql_status_output.py
│ │ ├── test_table_properties.py
│ │ ├── unit_testing/
│ │ │ └── test_unit_testing.py
│ │ └── utils/
│ │ ├── fixture_date_spine.py
│ │ ├── fixture_get_intervals_between.py
│ │ ├── test_data_types.py
│ │ ├── test_date_spine.py
│ │ ├── test_get_intervals_between.py
│ │ ├── test_timestamps.py
│ │ └── test_utils.py
│ └── unit/
│ ├── __init__.py
│ ├── test_adapter.py
│ └── utils.py
└── tox.ini
================================================
FILE CONTENTS
================================================
================================================
FILE: .changes/0.0.0.md
================================================
## Previous Releases
For information on prior major and minor releases, see their changelogs:
* [1.9](https://github.com/starburstdata/dbt-trino/blob/1.9.latest/CHANGELOG.md)
* [1.8](https://github.com/starburstdata/dbt-trino/blob/1.8.latest/CHANGELOG.md)
* [1.7](https://github.com/starburstdata/dbt-trino/blob/1.7.latest/CHANGELOG.md)
* [1.6](https://github.com/starburstdata/dbt-trino/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/starburstdata/dbt-trino/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/starburstdata/dbt-trino/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/starburstdata/dbt-trino/blob/1.3.latest/CHANGELOG.md)
* [1.2](https://github.com/starburstdata/dbt-trino/blob/1.2.latest/CHANGELOG.md)
* [1.1](https://github.com/starburstdata/dbt-trino/blob/1.1.latest/CHANGELOG.md)
* [1.0 and earlier](https://github.com/starburstdata/dbt-trino/blob/1.0.latest/CHANGELOG.md)
================================================
FILE: .changes/1.10.0/Features-20251210-194211.yaml
================================================
kind: Features
body: Add support for catalog integration
time: 2025-12-10T19:42:11.700646+01:00
custom:
  Author: damian3031
  Issue: ""
  PR: "502"
================================================
FILE: .changes/1.10.0.md
================================================
## dbt-trino 1.10.0 - December 16, 2025
### Features
- Add support for catalog integration ([#502](https://github.com/starburstdata/dbt-trino/pull/502))
### Contributors
- [@damian3031](https://github.com/damian3031) ([#502](https://github.com/starburstdata/dbt-trino/pull/502))
================================================
FILE: .changes/1.10.1/Dependencies-20260115-092226.yaml
================================================
kind: Dependencies
body: Bump dbt-adapters>=1.16,<2.0
time: 2026-01-15T09:22:26.968512-08:00
custom:
  Author: zqureshi
  Issue: "507"
  PR: "507"
================================================
FILE: .changes/1.10.1.md
================================================
## dbt-trino 1.10.1 - January 16, 2026
### Dependencies
- Bump dbt-adapters>=1.16,<2.0 ([#507](https://github.com/starburstdata/dbt-trino/issues/507), [#507](https://github.com/starburstdata/dbt-trino/pull/507))
### Contributors
- [@zqureshi](https://github.com/zqureshi) ([#507](https://github.com/starburstdata/dbt-trino/pull/507))
================================================
FILE: .changes/header.tpl.md
================================================
# dbt-trino Changelog
- This file provides a full account of all changes to `dbt-trino`
- Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/starburstdata/dbt-trino/blob/master/CONTRIBUTING.md#adding-changelog-entry)
================================================
FILE: .changes/unreleased/.gitkeep
================================================
================================================
FILE: .changie.yaml
================================================
changesDir: .changes
unreleasedDir: unreleased
headerPath: header.tpl.md
versionHeaderPath: ""
changelogPath: CHANGELOG.md
versionExt: md
versionFormat: '## dbt-trino {{.Version}} - {{.Time.Format "January 02, 2006"}}'
kindFormat: '### {{.Kind}}'
changeFormat: '- {{.Body}} ({{if ne .Custom.Issue ""}}[#{{.Custom.Issue}}](https://github.com/starburstdata/dbt-trino/issues/{{.Custom.Issue}}), {{end}}[#{{.Custom.PR}}](https://github.com/starburstdata/dbt-trino/pull/{{.Custom.PR}}))'
kinds:
  - label: Breaking Changes
  - label: Features
  - label: Fixes
  - label: Under the Hood
  - label: Dependencies
    changeFormat: '- {{.Body}} ({{if ne .Custom.Issue ""}}[#{{.Custom.Issue}}](https://github.com/starburstdata/dbt-trino/issues/{{.Custom.Issue}}), {{end}}[#{{.Custom.PR}}](https://github.com/starburstdata/dbt-trino/pull/{{.Custom.PR}}))'
  - label: Security
    changeFormat: '- {{.Body}} ({{if ne .Custom.Issue ""}}[#{{.Custom.Issue}}](https://github.com/starburstdata/dbt-trino/issues/{{.Custom.Issue}}), {{end}}[#{{.Custom.PR}}](https://github.com/starburstdata/dbt-trino/pull/{{.Custom.PR}}))'
newlines:
  beforeChangelogHeader: 1
custom:
  - key: Author
    label: GitHub Username(s) (separated by a single space if multiple)
    type: string
    minLength: 3
  - key: Issue
    label: GitHub Issue Number
    type: int
    minInt: 1
    optional: true
  - key: PR
    label: GitHub Pull Request Number
    type: int
    minInt: 1
footerFormat: |
  {{- $contributorDict := dict }}
  {{- range $change := .Changes }}
  {{- $authorList := splitList " " $change.Custom.Author }}
  {{- /* loop through all authors for a PR */}}
  {{- range $author := $authorList }}
  {{- $authorLower := lower $author }}
  {{- $prLink := $change.Kind }}
  {{- $prLink = "[#pr](https://github.com/starburstdata/dbt-trino/pull/pr)" | replace "pr" $change.Custom.PR }}
  {{- /* check if this contributor has other PRs associated with them already */}}
  {{- if hasKey $contributorDict $author }}
  {{- $prList := get $contributorDict $author }}
  {{- $prList = append $prList $prLink }}
  {{- $contributorDict := set $contributorDict $author $prList }}
  {{- else }}
  {{- $prList := list $prLink }}
  {{- $contributorDict := set $contributorDict $author $prList }}
  {{- end }}
  {{- end}}
  {{- end }}
  {{- /* no indentation here for formatting so the final markdown doesn't have unneeded indentations */}}
  {{- if $contributorDict}}
  ### Contributors
  {{- range $k,$v := $contributorDict }}
  - [@{{$k}}](https://github.com/{{$k}}) ({{ range $index, $element := $v }}{{if $index}}, {{end}}{{$element}}{{end}})
  {{- end }}
  {{- end }}
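Put together, the `changeFormat` templates and `custom` fields above mean each contributor-created entry under `.changes/unreleased/` is a small YAML file like the `Features-…` and `Dependencies-…` files shown earlier. A minimal sketch (all values are illustrative, not from a real entry):

```yaml
kind: Fixes
body: Example change description
time: 2026-01-20T10:00:00.000000+00:00
custom:
  Author: examplecontributor
  Issue: "123"
  PR: "124"
```

`kind` must match one of the `kinds` labels, and `Issue` may be left as `""` since it is marked `optional: true`.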
================================================
FILE: .flake8
================================================
[flake8]
select =
    E
    W
    F
ignore =
    W503,
    W504,
    E203,
    E741,
    E501,
exclude = test
================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.yml
================================================
---
name: Bug report
description: Report a bug or an issue you've found with dbt-trino
labels: bug
body:
  - type: textarea
    attributes:
      label: Expected behavior
      description: What do you think should have happened
      placeholder: >
        A clear and concise description of what you expected to happen.
    validations:
      required: true
  - type: textarea
    attributes:
      label: Actual behavior
      description: Describe what actually happened
      placeholder: >
        A clear and concise description of what actually happened.
    validations:
      required: true
  - type: textarea
    attributes:
      label: Steps To Reproduce
      description: This will help us reproduce your issue
      placeholder: >
        In as much detail as possible, please provide steps to reproduce the issue.
        Sample code that triggers the issue, relevant server settings, etc. are all very helpful here.
    validations:
      required: true
  - type: textarea
    attributes:
      label: Log output/Screenshots
      description: What do you think went wrong?
      placeholder: >
        If applicable, add log output and/or screenshots to help explain your problem.
  - type: input
    attributes:
      label: Operating System
      description: What Operating System are you using?
      placeholder: "You can get it via `cat /etc/os-release` for example"
    validations:
      required: true
  - type: input
    attributes:
      label: dbt version
      description: "Execute `dbt --version`"
      placeholder: Which version of dbt are you using?
    validations:
      required: true
  - type: input
    attributes:
      label: Trino Server version
      description: "Run `SELECT VERSION();` on your Trino server"
      placeholder: Which Trino server version are you using?
    validations:
      required: true
  - type: input
    attributes:
      label: Python version
      description: "You can get it via executing `python --version`"
      placeholder: What Python version are you using?
    validations:
      required: true
  - type: checkboxes
    attributes:
      label: Are you willing to submit PR?
      description: >
        This is absolutely not required, but we are happy to guide you in the contribution process,
        especially if you already have a good understanding of how to implement the feature.
      options:
        - label: Yes, I am willing to submit a PR!
  - type: markdown
    attributes:
      value: "Thanks for completing our form!"
================================================
FILE: .github/ISSUE_TEMPLATE/config.yml
================================================
---
contact_links:
  - name: Ask a question or get help around `dbt-trino` on Slack
    url: https://getdbt.slack.com/channels/db-presto-trino
    about: Get help and share your experiences around `dbt-trino` with the `dbt` Slack community.
FILE: .github/ISSUE_TEMPLATE/feature_request.yml
================================================
---
name: Feature request
description: Suggest an idea for dbt-trino
labels: enhancement
body:
  - type: textarea
    attributes:
      label: Describe the feature
      description: What would you like to happen?
      placeholder: >
        A clear and concise description of what you want to happen
        and what problem it would solve.
    validations:
      required: true
  - type: textarea
    attributes:
      label: Describe alternatives you've considered
      description: What did you try to make it happen?
      placeholder: >
        A clear and concise description of any alternative solutions or features you've considered.
  - type: textarea
    attributes:
      label: Who will benefit?
      placeholder: >
        What kind of use case will this feature be useful for? Please be specific and provide examples; this will help us prioritize properly.
  - type: checkboxes
    attributes:
      label: Are you willing to submit PR?
      description: >
        This is absolutely not required, but we are happy to guide you in the contribution process,
        especially if you already have a good understanding of how to implement the feature.
      options:
        - label: Yes, I am willing to submit a PR!
  - type: markdown
    attributes:
      value: "Thanks for completing our form!"
================================================
FILE: .github/dependabot.yml
================================================
version: 2
updates:
  # python dependencies
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "daily"
    rebase-strategy: "disabled"
    labels:
      - "Skip Changelog"
      - "dependencies"
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    rebase-strategy: "disabled"
================================================
FILE: .github/pull_request_template.md
================================================
## Overview
<!---
Include the number of the issue addressed by this PR above if applicable.
PRs for code changes without an associated issue *will not be merged*.
See CONTRIBUTING.md for more information.
Example:
resolves #1234
-->
## Checklist
- [ ] I have run this code in development and it appears to resolve the stated issue
- [ ] This PR includes tests, or tests are not required/relevant for this PR
- [ ] `README.md` updated and added information about my change
- [ ] I have run `changie new` to [create a changelog entry](https://github.com/starburstdata/dbt-trino/blob/master/CONTRIBUTING.md#Adding-CHANGELOG-Entry)
================================================
FILE: .github/workflows/bot-changelog.yml
================================================
# **what?**
# When bots create a PR, this action will add a corresponding changie yaml file to that
# PR when a specific label is added.
#
# The file is created off a template:
#
# kind: <per action matrix>
# body: <PR title>
# time: <current timestamp>
# custom:
#   Author: <PR User Login (generally the bot)>
#   Issue: 4904
#   PR: <PR number>
#
# **why?**
# Automate changelog generation for more visibility with automated bot PRs.
#
# **when?**
# Once a PR is created, label should be added to PR before or after creation. You can also
# manually trigger this by adding the appropriate label at any time.
#
# **how to add another bot?**
# Add the label and changie kind to the include matrix. That's it!
#
name: Bot Changelog
on:
  pull_request:
    # catch when the PR is opened with the label or when the label is added
    types: [opened, labeled]

permissions:
  contents: write
  pull-requests: read

jobs:
  generate_changelog:
    runs-on: ubuntu-latest
    steps:
      - name: Check out the repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 2

      - name: Create and commit changelog on bot PR
        id: bot_changelog
        uses: emmyoop/changie_bot@v1.0
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          commit_author_name: "starburstdata-automation"
          commit_author_email: "automation@starburstdata.com"
          commit_message: ${{ github.event.pull_request.title }}
          changie_kind: "Dependencies"
          label: "dependencies"
          custom_changelog_string: "custom:\n  Author: ${{ github.event.pull_request.user.login }}\n  Issue: ''\n  PR: ${{ github.event.pull_request.number }}"
================================================
FILE: .github/workflows/changelog-existence.yml
================================================
# **what?**
# Checks that a file has been committed under the /.changes directory
# as a new CHANGELOG entry. Cannot check for a specific filename as
# it is dynamically generated by change type and timestamp.
# This workflow should not require any secrets since it runs for PRs
# from forked repos.
# By default, secrets are not passed to workflows running from
# a forked repo.
# **why?**
# Ensure code change gets reflected in the CHANGELOG.
# **when?**
# This will run for all PRs going into master. It will
# run when they are opened, reopened, when any label is added or removed
# and when new code is pushed to the branch. The action will then get
# skipped if the 'Skip Changelog' label is present in any of the labels.
name: Check Changelog Entry
on:
  pull_request:
    types: [opened, reopened, labeled, unlabeled, synchronize]
  workflow_dispatch:

defaults:
  run:
    shell: bash

permissions:
  contents: read
  pull-requests: write

jobs:
  changelog:
    uses: dbt-labs/actions/.github/workflows/changelog-existence.yml@main
    with:
      changelog_comment: 'Thank you for your pull request! We could not find a changelog entry for this change. For details on how to document a change, see [the contributing guide](https://github.com/starburstdata/dbt-trino/blob/master/CONTRIBUTING.md#adding-changelog-entry).'
      skip_label: 'Skip Changelog'
    secrets: inherit
================================================
FILE: .github/workflows/ci.yml
================================================
name: dbt-trino tests
on:
  push:
    branches:
      - master
      - "*.*.latest"
    paths-ignore:
      - "**/*.md"
  pull_request:
    branches:
      - master
      - "*.*.latest"
    paths-ignore:
      - "**/*.md"

jobs:
  checks:
    runs-on: ubuntu-latest
    steps:
      - name: "Checkout the source code"
        uses: actions/checkout@v4
      - name: "Install Python"
        uses: actions/setup-python@v5
      - name: "Install dev requirements"
        run: pip install -r dev_requirements.txt
      - name: "Run pre-commit checks"
        run: pre-commit run --all-files

  test:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        engine:
          - "trino"
          - "starburst"
          - "starburst_galaxy"
        python:
          - "3.9"
          - "3.10"
          - "3.11"
          - "3.12"
          - "3.13"
        isStarburstBranch:
          - ${{ (github.event_name == 'pull_request' && contains(github.event.pull_request.head.repo.full_name, 'starburstdata')) || github.event_name != 'pull_request' }}
        exclude:
          - engine: "starburst_galaxy"
            python: "3.13"
            isStarburstBranch: false
          - engine: "starburst_galaxy"
            python: "3.12"
          - engine: "starburst_galaxy"
            python: "3.11"
          - engine: "starburst_galaxy"
            python: "3.10"
          - engine: "starburst_galaxy"
            python: "3.9"
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python }}
      - name: Run dbt-trino tests against ${{ matrix.engine }} on python ${{ matrix.python }}
        env:
          DBT_TESTS_STARBURST_GALAXY_HOST: ${{ secrets.DBT_TESTS_STARBURST_GALAXY_HOST }}
          DBT_TESTS_STARBURST_GALAXY_USER: ${{ secrets.DBT_TESTS_STARBURST_GALAXY_USER }}
          DBT_TESTS_STARBURST_GALAXY_PASSWORD: ${{ secrets.DBT_TESTS_STARBURST_GALAXY_PASSWORD }}
        run: |
          if [[ ${{ matrix.engine }} == "trino" || ${{ matrix.engine }} == "starburst" ]]; then
            make dbt-${{ matrix.engine }}-tests
          elif [[ ${{ matrix.engine }} == "starburst_galaxy" ]]; then
            python -m pip install -e . -r dev_requirements.txt
            python -m pytest tests/functional --profile starburst_galaxy
          fi
      - name: Remove container on failure
        if: failure()
        run: ./docker/remove_${{ matrix.engine }}.bash || true
================================================
FILE: .github/workflows/release.yml
================================================
name: dbt-trino release
on:
  workflow_dispatch:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.13"
      - name: Test release
        run: |
          python3 -m venv env
          source env/bin/activate
          pip install -r dev_requirements.txt
          pip install twine wheel setuptools
          python setup.py sdist bdist_wheel
          pip install dist/dbt_trino-*.tar.gz
          pip install dist/dbt_trino-*-py3-none-any.whl
          twine check dist/dbt_trino-*-py3-none-any.whl dist/dbt_trino-*.tar.gz

  github-release:
    name: GitHub release
    runs-on: ubuntu-latest
    needs: test
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.13"
      - name: Get dbt-trino version
        run: echo "version_number=$(cat dbt/adapters/trino/__version__.py | sed -n 's/version = "\(.*\)\"/\1/p')" >> $GITHUB_ENV
      # Need to set an output variable because env variables can't be taken as input
      # This is needed for the next step with releasing to GitHub
      - name: Find release type
        id: release_type
        env:
          IS_PRERELEASE: ${{ contains(env.version_number, 'rc') || contains(env.version_number, 'b') }}
        run: |
          echo "isPrerelease=$IS_PRERELEASE" >> $GITHUB_OUTPUT
      - name: Create GitHub release
        uses: actions/create-release@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token
        with:
          tag_name: v${{env.version_number}}
          release_name: v${{env.version_number}}
          prerelease: ${{ steps.release_type.outputs.isPrerelease }}
          body: |
            [Release notes](https://github.com/starburstdata/dbt-trino/blob/master/CHANGELOG.md)

            ```sh
            $ pip install dbt-trino==${{env.version_number}}
            ```

  pypi-release:
    name: Pypi release
    runs-on: ubuntu-latest
    needs: github-release
    environment: PypiProd
    permissions:
      id-token: write
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.13"
      - name: Get dbt-trino version
        run: echo "version_number=$(cat dbt/adapters/trino/__version__.py | sed -n 's/version = "\(.*\)\"/\1/p')" >> $GITHUB_ENV
      - name: Release to pypi
        run: |
          python3 -m venv env
          source env/bin/activate
          pip install -r dev_requirements.txt
          pip install twine wheel setuptools
          python setup.py sdist bdist_wheel
          twine upload --non-interactive dist/dbt_trino-${{env.version_number}}-py3-none-any.whl dist/dbt_trino-${{env.version_number}}.tar.gz
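The `sed` one-liner in the `Get dbt-trino version` steps extracts the version string from `dbt/adapters/trino/__version__.py`, and the `Find release type` step flags any version containing `rc` or `b` as a prerelease. A rough Python equivalent of both checks (the file text below is illustrative, not read from the repository):

```python
import re


def parse_version(version_file_text: str) -> str:
    """Mimic the workflow's sed -n 's/version = "\\(.*\\)"/\\1/p' extraction."""
    match = re.search(r'version = "(.*)"', version_file_text)
    if match is None:
        raise ValueError("no version line found")
    return match.group(1)


def is_prerelease(version: str) -> bool:
    """Mimic contains(version, 'rc') || contains(version, 'b') from the workflow."""
    return "rc" in version or "b" in version


print(parse_version('version = "1.10.1"'))  # 1.10.1
print(is_prerelease("1.11.0rc1"))           # True
print(is_prerelease("1.10.1"))              # False
```

Note the simple substring check mirrors the workflow exactly, so like the workflow it would also flag any version string that happens to contain the letter `b`.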
================================================
FILE: .github/workflows/security.yml
================================================
name: Veracode SCA
on:
  workflow_dispatch:

jobs:
  veracode-sca-task:
    runs-on: ubuntu-latest
    name: Scan repository for Issues
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Run Veracode SCA
        env:
          SRCCLR_API_TOKEN: ${{ secrets.SRCCLR_API_TOKEN }}
        uses: veracode/veracode-sca@v1.09
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          create-issues: true
          min-cvss-for-issue: 1
          fail-on-cvss: 11
================================================
FILE: .github/workflows/version-bump.yml
================================================
# **what?**
# This workflow will take the new version number to bump to. With that
# it will run versionbump to update the version number everywhere in the
# code base and then run changie to create the corresponding changelog.
# A PR will be created with the changes that can be reviewed before committing.
# **why?**
# This is to aid in releasing dbt-trino and making sure we have updated
# the version in all places and generated the changelog.
# **when?**
# This is triggered manually
name: Version Bump
on:
  workflow_dispatch:
    inputs:
      version_number:
        description: 'The version number to bump to (ex. 1.2.0, 1.3.0b1)'
        required: true

jobs:
  bump:
    runs-on: ubuntu-latest
    steps:
      - name: "[DEBUG] Print Variables"
        run: |
          echo "all variables defined as inputs"
          echo The version_number: ${{ github.event.inputs.version_number }}
      - name: Check out the repository
        uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.8"
      - name: Install brew
        run: |
          echo "/home/linuxbrew/.linuxbrew/bin:/home/linuxbrew/.linuxbrew/sbin" >> $GITHUB_PATH
      - name: Install python dependencies
        run: |
          python3 -m venv env
          source env/bin/activate
          pip install --upgrade pip
      - name: Audit Version and Parse Into Parts
        id: semver
        uses: dbt-labs/actions/parse-semver@v1
        with:
          version: ${{ github.event.inputs.version_number }}
      - name: Set branch value
        id: variables
        run: |
          echo "BRANCH_NAME=prep-release/${{ github.event.inputs.version_number }}_$GITHUB_RUN_ID" >> $GITHUB_OUTPUT
      - name: Create PR branch
        run: |
          git checkout -b ${{ steps.variables.outputs.BRANCH_NAME }}
          git push origin ${{ steps.variables.outputs.BRANCH_NAME }}
          git branch --set-upstream-to=origin/${{ steps.variables.outputs.BRANCH_NAME }} ${{ steps.variables.outputs.BRANCH_NAME }}
      - name: Bump version
        run: |
          echo -en "version = \"${{ github.event.inputs.version_number }}\"\n" > dbt/adapters/trino/__version__.py
          git status
      - name: Run changie
        run: |
          brew tap miniscruff/changie https://github.com/miniscruff/changie
          brew install changie
          if [[ ${{ steps.semver.outputs.is-pre-release }} -eq 1 ]]
          then
            changie batch ${{ steps.semver.outputs.base-version }} --move-dir '${{ steps.semver.outputs.base-version }}' --prerelease '${{ steps.semver.outputs.pre-release }}'
          else
            if [[ -d ".changes/${{ steps.semver.outputs.base-version }}" ]]
            then
              changie batch ${{ steps.semver.outputs.base-version }} --include '${{ steps.semver.outputs.base-version }}' --remove-prereleases
            else
              changie batch ${{ steps.semver.outputs.base-version }} --move-dir '${{ steps.semver.outputs.base-version }}'
            fi
          fi
          changie merge
          git status
      - name: Commit version bump to branch
        uses: EndBug/add-and-commit@v9
        with:
          author_name: 'Github Build Bot'
          author_email: 'automation@starburstdata.com'
          message: 'Bumping version to ${{ github.event.inputs.version_number }} and generate CHANGELOG'
          branch: '${{ steps.variables.outputs.BRANCH_NAME }}'
          push: 'origin origin/${{ steps.variables.outputs.BRANCH_NAME }}'
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v7
        with:
          author: 'Github Build Bot <automation@starburstdata.com>'
          base: ${{github.ref}}
          title: 'Bumping version to ${{ github.event.inputs.version_number }} and generate changelog'
          branch: '${{ steps.variables.outputs.BRANCH_NAME }}'
          labels: |
            Skip Changelog
================================================
FILE: .gitignore
================================================
*.egg-info
env/
__pycache__/
.tox/
.idea/
build/
dist/
dbt-integration-tests
docker/dbt/.user.yml
.DS_Store
.vscode/
logs/
.venv/
================================================
FILE: .pre-commit-config.yaml
================================================
# Configuration for pre-commit hooks (see https://pre-commit.com/).
# Eventually the hooks described here will be run as tests before merging each PR.
# TODO: remove global exclusion of tests when testing overhaul is complete
exclude: ^test/
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: check-yaml
        args: [--unsafe]
      - id: check-json
      - id: end-of-file-fixer
      - id: trailing-whitespace
        exclude_types:
          - "markdown"
      - id: check-case-conflict
  - repo: https://github.com/dbt-labs/pre-commit-hooks
    rev: v0.1.0a1
    hooks:
      - id: dbt-core-in-adapters-check
  - repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
      - id: black
        args:
          - "--line-length=99"
          - "--target-version=py38"
      - id: black
        alias: black-check
        stages: [manual]
        args:
          - "--line-length=99"
          - "--target-version=py38"
          - "--check"
          - "--diff"
  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort
        args: ["--profile", "black", "--filter-files"]
  - repo: https://github.com/pycqa/flake8
    rev: 7.1.2
    hooks:
      - id: flake8
      - id: flake8
        alias: flake8-check
        stages: [manual]
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.2.0
    hooks:
      - id: mypy
        # N.B.: Mypy is... a bit fragile.
        #
        # By using `language: system` we run this hook in the local
        # environment instead of a pre-commit isolated one. This is needed
        # to ensure mypy correctly parses the project.
        # It may cause trouble in that it adds environmental variables out
        # of our control to the mix. Unfortunately, there's nothing we can
        # do about it, per pre-commit's author.
        # See https://github.com/pre-commit/pre-commit/issues/730 for details.
        args: [--show-error-codes, --ignore-missing-imports]
        files: ^dbt/adapters/.*
        language: system
      - id: mypy
        alias: mypy-check
        stages: [manual]
        args: [--show-error-codes, --pretty, --ignore-missing-imports]
        files: ^dbt/adapters
        language: system
================================================
FILE: CHANGELOG.md
================================================
# dbt-trino Changelog
- This file provides a full account of all changes to `dbt-trino`
- Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/starburstdata/dbt-trino/blob/master/CONTRIBUTING.md#adding-changelog-entry)
## dbt-trino 1.10.1 - January 16, 2026
### Dependencies
- Bump dbt-adapters>=1.16,<2.0 ([#507](https://github.com/starburstdata/dbt-trino/issues/507), [#507](https://github.com/starburstdata/dbt-trino/pull/507))
### Contributors
- [@zqureshi](https://github.com/zqureshi) ([#507](https://github.com/starburstdata/dbt-trino/pull/507))
## dbt-trino 1.10.0 - December 16, 2025
### Features
- Add support for catalog integration ([#502](https://github.com/starburstdata/dbt-trino/pull/502))
### Contributors
- [@damian3031](https://github.com/damian3031) ([#502](https://github.com/starburstdata/dbt-trino/pull/502))
## Previous Releases
For information on prior major and minor releases, see their changelogs:
* [1.9](https://github.com/starburstdata/dbt-trino/blob/1.9.latest/CHANGELOG.md)
* [1.8](https://github.com/starburstdata/dbt-trino/blob/1.8.latest/CHANGELOG.md)
* [1.7](https://github.com/starburstdata/dbt-trino/blob/1.7.latest/CHANGELOG.md)
* [1.6](https://github.com/starburstdata/dbt-trino/blob/1.6.latest/CHANGELOG.md)
* [1.5](https://github.com/starburstdata/dbt-trino/blob/1.5.latest/CHANGELOG.md)
* [1.4](https://github.com/starburstdata/dbt-trino/blob/1.4.latest/CHANGELOG.md)
* [1.3](https://github.com/starburstdata/dbt-trino/blob/1.3.latest/CHANGELOG.md)
* [1.2](https://github.com/starburstdata/dbt-trino/blob/1.2.latest/CHANGELOG.md)
* [1.1](https://github.com/starburstdata/dbt-trino/blob/1.1.latest/CHANGELOG.md)
* [1.0 and earlier](https://github.com/starburstdata/dbt-trino/blob/1.0.latest/CHANGELOG.md)
================================================
FILE: CONTRIBUTING.md
================================================
# Contributing to `dbt-trino`
## Getting the code
### How to contribute?
You can contribute to `dbt-trino` by forking the `dbt-trino` repository. For a detailed overview on forking, check out the [GitHub docs on forking](https://help.github.com/en/articles/fork-a-repo). In short, you will need to:
1. Fork the `dbt-trino` repository
2. Clone your fork locally
3. Check out a new branch for your proposed changes
4. Push changes to your fork
5. Open a pull request against `starburstdata/dbt-trino` from your forked repository
## Setting up an environment
There are some tools that will be helpful to you in developing locally. While this is the list relevant for `dbt-trino` development, many of these tools are used commonly across open-source python projects.
### Tools
These are the tools used in `dbt-trino` development and testing:
- [`tox`](https://tox.readthedocs.io/en/latest/) to manage virtualenvs across python versions. We currently target the latest patch releases for Python 3.9, 3.10, 3.11, 3.12, and 3.13
- [`pytest`](https://docs.pytest.org/en/latest/) to define, discover, and run tests
- [`flake8`](https://flake8.pycqa.org/en/latest/) for code linting
- [`black`](https://github.com/psf/black) for code formatting
- [`isort`](https://pycqa.github.io/isort/) for sorting imports
- [`mypy`](https://mypy.readthedocs.io/en/stable/) for static type checking
- [`pre-commit`](https://pre-commit.com) to easily run those checks
- [`changie`](https://changie.dev/) to create changelog entries, without merge conflicts
- [`make`](https://users.cs.duke.edu/~ola/courses/programming/Makefiles/Makefiles.html) to run multiple setup or test steps in combination. Don't worry too much, nobody _really_ understands how `make` works, and our Makefile aims to be super simple.
- [GitHub Actions](https://github.com/features/actions) for automating tests and checks, once a PR is pushed to the `dbt-trino` repository
A deep understanding of these tools is not required to effectively contribute to `dbt-trino`, but we recommend checking out the linked documentation if you're interested in learning more about each one.
#### Virtual environments
We strongly recommend using virtual environments when developing code in `dbt-trino`. We recommend creating this virtualenv
in the root of the `dbt-trino` repository. To create a new virtualenv, run:
```sh
python3 -m venv env
source env/bin/activate
```
This will create and activate a new Python virtual environment.
#### Docker and `docker compose`
Docker and `docker compose` are both used in testing. Specific instructions for your OS can be found [here](https://docs.docker.com/get-docker/).
## Running `dbt-trino` in development
### Installation
First make sure that you set up your `virtualenv` as described in [Setting up an environment](#setting-up-an-environment). Also ensure you have the latest version of pip installed with `pip install --upgrade pip`. Next, install `dbt-trino` (and its dependencies) with:
```sh
pip install -e . -r dev_requirements.txt
```
When installed in this way, any changes you make to your local copy of the source code will be reflected immediately in your next `dbt` run.
### Running `dbt-trino`
With your virtualenv activated, the `dbt` script should point back to the source code you've cloned on your machine. You can verify this by running `which dbt`. This command should show you a path to an executable in your virtualenv.
Configure your [profile](https://docs.getdbt.com/docs/configure-your-profile) as necessary to connect to your target databases. It may be a good idea to add a new profile pointing to a local Trino instance if appropriate.
## Testing
Once you're able to manually test that your code change is working as expected, it's important to run existing automated tests, as well as adding some new ones. These tests will ensure that:
- Your code changes do not unexpectedly break other established functionality
- Your code changes can handle all known edge cases
- The functionality you're adding will _keep_ working in the future
### Initial setup
To be able to run the tests locally you will need a Trino or Starburst instance.
```sh
# to start Trino
make start-trino
# to start Starburst
make start-starburst
```
### Test commands
There are a few methods for running tests locally.
#### Makefile
There are multiple targets in the Makefile to run common test suites and code
checks, most notably:
```sh
# Runs integration tests on Trino
make dbt-trino-tests
# Runs integration tests on Starburst
make dbt-starburst-tests
```
> These make targets assume you have a local installation of a recent version of [`tox`](https://tox.readthedocs.io/en/latest/) for unit/integration testing and pre-commit for code quality checks,
> unless you choose to use a Docker container to run tests. Run `make help` for more info.
#### `pre-commit`
[`pre-commit`](https://pre-commit.com) takes care of running all code-checks for formatting and linting. Run `make dev` to install `pre-commit` in your local environment. Once this is done you can use any of the linter-based make targets as well as a git pre-commit hook that will ensure proper formatting and linting.
#### `tox`
[`tox`](https://tox.readthedocs.io/en/latest/) takes care of managing virtualenvs and installing dependencies in order to run tests. You can also run tests in parallel; for example, you can run unit tests for Python 3.9, 3.10, 3.11, 3.12, and 3.13 in parallel with `tox -p`. You can also run unit tests for a specific Python version with `tox -e py39`. The configuration for these tests is located in `tox.ini`.
#### `pytest`
Finally, you can also run a specific test or group of tests using [`pytest`](https://docs.pytest.org/en/latest/) directly. With a virtualenv active and dev dependencies installed you can do things like:
```sh
# run all unit tests in a file
python3 -m pytest tests/unit/utils.py
# run a specific unit test
python3 -m pytest tests/unit/test_adapter.py::TestTrinoAdapter::test_acquire_connection
# run integration tests
python3 -m pytest tests/functional
```
> See [pytest usage docs](https://docs.pytest.org/en/6.2.x/usage.html) for an overview of useful command-line options.
The catalog in the dbt profile can be set up through [pytest markers](https://docs.pytest.org/en/7.1.x/example/markers.html#registering-markers); if no marker is specified, the memory catalog is used.
For example, to make the dbt profile connect to the Delta Lake catalog, annotate your test with `@pytest.mark.delta` (supported markers are `postgresql`, `delta`, and `iceberg`).
```python
@pytest.mark.delta
def test_run_seed_test(self, project):
...
```
## Adding CHANGELOG Entry
We use [changie](https://changie.dev) to generate `CHANGELOG` entries. **Note:** Do not edit the `CHANGELOG.md` directly. Your modifications will be lost.
Follow the steps to [install `changie`](https://changie.dev/guide/installation/) for your system.
Once changie is installed and your PR is created, simply run `changie new` and changie will walk you through the process of creating a changelog entry. Commit the file that's created and your changelog entry is complete!
You don't need to worry about which `dbt-trino` version your change will go into. Just create the changelog entry with `changie`, and open your PR against the `master` branch.
## Submitting a Pull Request
A `dbt-trino` maintainer will review your PR. They may suggest code revisions for style or clarity, or request that you add unit or integration tests. These are good things! We believe that, with a little bit of help, anyone can contribute high-quality code.
Automated tests run via GitHub Actions. If you're a first-time contributor, all tests (including code checks and unit tests) will require approval from a maintainer before they run. Changes in the `dbt-trino` repository trigger integration tests against Trino and Starburst.
Once all tests are passing and your PR has been approved, a `dbt-trino` maintainer will merge your changes into the master branch. And that's it! Happy developing :tada:
================================================
FILE: LICENSE.txt
================================================
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2021 Starburst Data, Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
================================================
FILE: Makefile
================================================
.EXPORT_ALL_VARIABLES:
DBT_TEST_USER_1=user1
DBT_TEST_USER_2=user2
DBT_TEST_USER_3=user3
start-trino:
docker network create dbt-net || true
./docker/init_trino.bash
dbt-trino-tests: start-trino
pip install -e . -r dev_requirements.txt
tox -r
start-starburst:
docker network create dbt-net || true
./docker/init_starburst.bash
dbt-starburst-tests: start-starburst
pip install -e . -r dev_requirements.txt
tox -r
dev:
pre-commit install
================================================
FILE: README.md
================================================
# dbt-trino
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/Starburst_Logo_White%2BBlue.svg" width="98%">
<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/Starburst_Logo_Black%2BBlue.svg" width="98%">
<img alt="Starburst" src="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/Starburst_Logo_Black%2BBlue.svg">
</picture>
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/dbt-signature_tm_light.svg" width="45%">
<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/dbt-signature_tm.svg" width="45%">
<img alt="dbt" src="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/dbt-signature_tm.svg">
</picture>
       
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/trino-logo-dk-bg.svg" width="50%">
<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/trino-logo-w-bk.svg" width="50%">
<img alt="trino" src="https://raw.githubusercontent.com/starburstdata/dbt-trino/master/assets/images/trino-logo-w-bk.svg">
</picture>
[](https://github.com/starburstdata/dbt-trino/actions/workflows/ci.yml?query=workflow%3A%22dbt-trino+tests%22+branch%3Amaster+event%3Apush) [](https://getdbt.slack.com/channels/db-starburst-and-trino)
## Introduction
[dbt](https://docs.getdbt.com/docs/introduction) is a data transformation workflow tool that lets teams quickly and collaboratively deploy analytics code, following software engineering best practices like modularity, CI/CD, testing, and documentation. It enables anyone who knows SQL to build production-grade data pipelines.
One frequently asked question in the context of using the `dbt` tool is:
> Can I connect my dbt project to two databases?
(see the answered [question](https://docs.getdbt.com/faqs/connecting-to-two-dbs-not-allowed) on the dbt website).
**TL;DR** `dbt` stands for transformation, as in the `T` in `ELT` pipelines; it doesn't move data from a source into a warehouse.
The `dbt-trino` adapter uses [Trino](https://trino.io/) as its underlying query engine to perform query federation across disparate data sources. Trino connects to multiple, diverse data sources ([available connectors](https://trino.io/docs/current/connector.html)) via one dbt connection and processes SQL queries at scale. Transformations defined in dbt are passed to Trino, which handles these SQL transformation queries and translates them into queries specific to the systems it connects to, in order to create tables or views and manipulate data.
This repository is a fork of [dbt-presto](https://github.com/dbt-labs/dbt-presto), adapted to work with Trino.
## Compatibility
This dbt plugin has been tested against `Trino` version `478`, `Starburst Enterprise` version `477-e.1` and `Starburst Galaxy`.
## Setup & Configuration
For information on installing and configuring your profile to authenticate to Trino or Starburst, please refer to [Starburst and Trino Setup](https://docs.getdbt.com/reference/warehouse-setups/trino-setup) in the dbt docs.
### Trino- and Starburst-specific configuration
For Trino- and Starburst-specific configuration, you can refer to [Starburst (Trino) configurations](https://docs.getdbt.com/reference/resource-configs/trino-configs) on the dbt docs site.
## Contributing
- Want to report a bug or request a feature? Let us know on [Slack](http://community.getdbt.com/) in the [#db-starburst-and-trino](https://getdbt.slack.com/channels/db-starburst-and-trino) channel, or on [Trino slack](https://trino.io/slack.html) in the [#python](https://trinodb.slack.com/channels/python) channel, or open [an issue](https://github.com/starburstdata/dbt-trino/issues/new)
- Want to help us build dbt-trino? Check out the [Contributing Guide](https://github.com/starburstdata/dbt-trino/blob/HEAD/CONTRIBUTING.md)
### Release process
The first 5 steps are ONLY relevant when bumping the __minor__ version:
1. Create `1.x.latest` branch from the latest tag corresponding to current minor version, e.g. `git checkout -b 1.6.latest v1.6.2` (when bumping to 1.7). Push branch to remote. This branch will be used for potential backports.
2. Create a new branch (do not push the commits below to `1.x.latest`). Add a new entry in `.changes/0.0.0.md` that points to the newly created latest branch.
3. Run `changie merge` to update `CHANGELOG.md`. After that, remove the changie files and folders related to the current minor version. Commit.
4. Bump version of `dbt-tests-adapter`. Commit.
5. Merge these 2 commits into the master branch. Add a `Skip Changelog` label to the PR.
Continue with the next steps for a __minor__ version bump. Start from this point for a __patch__ version bump:
1. Run the `Version Bump` workflow. The major and minor parts of the dbt version are used to associate dbt-trino's version with the dbt version.
2. Merge the bump PR. Make sure that the test suite passes.
3. Run the `dbt-trino release` workflow to release `dbt-trino` to PyPI and GitHub.
### Backport process
Sometimes it is necessary to backport changes to older versions. In that case, create a branch from the appropriate `x.x.latest` branch. There is an `x.x.latest` branch for each minor version, e.g. `1.3.latest`. Make the fix and open a PR back to `x.x.latest`. Create a changelog entry with `changie new` as usual; a separate changelog for each minor version is kept on every `x.x.latest` branch.
After merging, to make a release of that version, just follow the instructions from the **Release process** section, but run every workflow on the `x.x.latest` branch.
## Code of Conduct
Everyone interacting in the dbt project's codebases, issue trackers, chat rooms, and mailing lists is expected
to follow the [PyPA Code of Conduct](https://www.pypa.io/en/latest/code-of-conduct/).
================================================
FILE: dbt/adapters/trino/__init__.py
================================================
from dbt.adapters.base import AdapterPlugin
from dbt.adapters.trino.column import TrinoColumn # noqa
from dbt.adapters.trino.connections import TrinoConnectionManager # noqa
from dbt.adapters.trino.connections import TrinoCredentialsFactory
from dbt.adapters.trino.relation import TrinoRelation # noqa
from dbt.adapters.trino.impl import TrinoAdapter # isort: split
from dbt.include import trino
Plugin = AdapterPlugin(
adapter=TrinoAdapter, # type: ignore
credentials=TrinoCredentialsFactory, # type: ignore
include_path=trino.PACKAGE_PATH,
)
================================================
FILE: dbt/adapters/trino/__version__.py
================================================
version = "1.10.1"
================================================
FILE: dbt/adapters/trino/catalogs/__init__.py
================================================
from dbt.adapters.trino.catalogs._relation import TrinoCatalogRelation
from dbt.adapters.trino.catalogs._trino_catalog_metastore import TrinoCatalogIntegration
__all__ = [
"TrinoCatalogIntegration",
"TrinoCatalogRelation",
]
================================================
FILE: dbt/adapters/trino/catalogs/_relation.py
================================================
from dataclasses import dataclass
from typing import Optional
from dbt.adapters.catalogs import CatalogRelation
from dbt.adapters.trino import constants
@dataclass
class TrinoCatalogRelation(CatalogRelation):
catalog_type: str = constants.DEFAULT_TRINO_CATALOG.catalog_type
catalog_name: Optional[str] = constants.DEFAULT_TRINO_CATALOG.name
table_format: Optional[str] = None
file_format: Optional[str] = None
external_volume: Optional[str] = None
storage_uri: Optional[str] = None
================================================
FILE: dbt/adapters/trino/catalogs/_trino_catalog_metastore.py
================================================
from typing import Optional
from dbt.adapters.catalogs import CatalogIntegration, CatalogIntegrationConfig
from dbt.adapters.contracts.relation import RelationConfig
from dbt.adapters.trino import constants
from dbt.adapters.trino.catalogs._relation import TrinoCatalogRelation
class TrinoCatalogIntegration(CatalogIntegration):
"""
Catalog type:
In Trino, the metastore for a catalog is set when configuring the connector.
This cannot be configured using dbt's generated SQL.
Documentation:
https://trino.io/docs/current/overview/concepts.html#catalog
https://trino.io/docs/current/object-storage/metastores.html
Table format:
For Trino and Starburst SEP, the table format is specified by the connector configuration.
Setting table_format here will result in an error, as the 'type' property is unavailable in Trino and Starburst SEP.
If you are using Starburst Galaxy, you can set the default table format to use for this catalog.
It will set the `type` property to the specified table format.
Documentation:
https://docs.starburst.io/starburst-galaxy/data-engineering/working-with-data-lakes/table-formats/index.html
"""
catalog_type = constants.TRINO_CATALOG_TYPE
allows_writes = True
def __init__(self, config: CatalogIntegrationConfig) -> None:
super().__init__(config)
self.storage_uri = config.adapter_properties.get("storage_uri")
def build_relation(self, model: RelationConfig) -> TrinoCatalogRelation:
return TrinoCatalogRelation(
catalog_type=self.catalog_type,
catalog_name=self.catalog_name,
table_format=self.table_format,
file_format=self.file_format,
external_volume=self.external_volume,
storage_uri=self._calculate_storage_uri(model),
)
def _calculate_storage_uri(self, model: RelationConfig) -> Optional[str]:
if not model.config:
return None
if model_storage_uri := model.config.get("storage_uri"):
return model_storage_uri
if not self.external_volume:
return None
# Default dbt behavior is that if base_location_root is not specified, `_dbt` prefix is added.
# Even if base_location_root is explicitly set to None, `_dbt` prefix is still added.
# Allow omitting the prefix by setting omit_base_location_root to True.
omit_base_location_root = model.config.get("omit_base_location_root")
if omit_base_location_root:
storage_uri = f"{self.external_volume}/{model.schema}/{model.name}"
else:
prefix = model.config.get("base_location_root") or "_dbt"
storage_uri = f"{self.external_volume}/{prefix}/{model.schema}/{model.name}"
if suffix := model.config.get("base_location_subpath"):
storage_uri = f"{storage_uri}/{suffix}"
return storage_uri
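The precedence rules in `_calculate_storage_uri` can be sketched as a standalone function. This is an illustrative mirror of the logic above, with a hypothetical name and a plain `dict` standing in for the dbt model config object; it is not part of the adapter's API:

```python
from typing import Optional

def calculate_storage_uri(
    config: dict, schema: str, name: str, external_volume: Optional[str]
) -> Optional[str]:
    # 1. An explicit storage_uri on the model wins outright.
    if model_storage_uri := config.get("storage_uri"):
        return model_storage_uri
    # 2. Without an external volume there is nothing to build a URI from.
    if not external_volume:
        return None
    # 3. omit_base_location_root drops the default `_dbt` (or custom) prefix.
    if config.get("omit_base_location_root"):
        storage_uri = f"{external_volume}/{schema}/{name}"
    else:
        prefix = config.get("base_location_root") or "_dbt"
        storage_uri = f"{external_volume}/{prefix}/{schema}/{name}"
    # 4. base_location_subpath is appended last, if present.
    if suffix := config.get("base_location_subpath"):
        storage_uri = f"{storage_uri}/{suffix}"
    return storage_uri

# With no overrides, the default `_dbt` prefix is inserted:
print(calculate_storage_uri({}, "analytics", "orders", "s3://bucket"))
# -> s3://bucket/_dbt/analytics/orders
```

Note that, per the comment above, even `base_location_root: None` still yields the `_dbt` prefix (because of the `or "_dbt"` fallback); only `omit_base_location_root: true` removes it.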
================================================
FILE: dbt/adapters/trino/column.py
================================================
import re
from dataclasses import dataclass
from typing import ClassVar, Dict
from dbt.adapters.base.column import Column
from dbt_common.exceptions import DbtRuntimeError
# Taken from the MAX_LENGTH variable in
# https://github.com/trinodb/trino/blob/master/core/trino-spi/src/main/java/io/trino/spi/type/VarcharType.java
TRINO_VARCHAR_MAX_LENGTH = 2147483646
@dataclass
class TrinoColumn(Column):
TYPE_LABELS: ClassVar[Dict[str, str]] = {
"STRING": "VARCHAR",
"FLOAT": "DOUBLE",
}
@property
def data_type(self):
# when varchar has no defined size, default to unbound varchar
# the super().data_type defaults to varchar(256)
if self.dtype.lower() == "varchar" and self.char_size is None:
return self.dtype
return super().data_type
def is_string(self) -> bool:
return self.dtype.lower() in ["varchar", "char"]
def is_float(self) -> bool:
return self.dtype.lower() in [
"real",
"double precision",
"double",
]
def is_integer(self) -> bool:
return self.dtype.lower() in [
"tinyint",
"smallint",
"integer",
"int",
"bigint",
]
def is_numeric(self) -> bool:
return self.dtype.lower() == "decimal"
@classmethod
def string_type(cls, size: int) -> str:
return "varchar({})".format(size)
def string_size(self) -> int:
# override the string_size function to handle the unbound varchar case
if self.dtype.lower() == "varchar" and self.char_size is None:
return TRINO_VARCHAR_MAX_LENGTH
return super().string_size()
@classmethod
def from_description(cls, name: str, raw_data_type: str) -> "Column":
# Most of the Trino data types specify a type and not a precision/scale/charsize
if not raw_data_type.lower().startswith(("varchar", "char", "decimal")):
return cls(name, raw_data_type)
# Trino data types that do specify a precision/scale/charsize:
match = re.match(
r"(?P<type>[^(]+)(?P<size>\([^)]+\))?(?P<type_suffix>[\w ]+)?", raw_data_type
)
if match is None:
raise DbtRuntimeError(f'Could not interpret data type "{raw_data_type}"')
data_type = match.group("type")
size_info = match.group("size")
data_type_suffix = match.group("type_suffix")
if data_type_suffix:
data_type += data_type_suffix
char_size = None
numeric_precision = None
numeric_scale = None
if size_info is not None:
# strip out the parentheses
size_info = size_info[1:-1]
parts = size_info.split(",")
if len(parts) == 1:
try:
char_size = int(parts[0])
except ValueError:
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[0]}" to an integer'
)
elif len(parts) == 2:
try:
numeric_precision = int(parts[0])
except ValueError:
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[0]}" to an integer'
)
try:
numeric_scale = int(parts[1])
except ValueError:
raise DbtRuntimeError(
f'Could not interpret data_type "{raw_data_type}": '
f'could not convert "{parts[1]}" to an integer'
)
return cls(name, data_type, char_size, numeric_precision, numeric_scale)
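The parsing in `from_description` can be illustrated standalone. `parse_type` below is a hypothetical helper that reuses the same regex to split a raw Trino type into a base type and its size parts:

```python
import re

TYPE_RE = re.compile(r"(?P<type>[^(]+)(?P<size>\([^)]+\))?(?P<type_suffix>[\w ]+)?")

def parse_type(raw_data_type):
    match = TYPE_RE.match(raw_data_type)
    data_type = match.group("type")
    # A trailing suffix such as " with time zone" is folded back into the type.
    if match.group("type_suffix"):
        data_type += match.group("type_suffix")
    size_info = match.group("size")
    if size_info is None:
        return data_type, ()
    # Strip the parentheses and split "precision,scale" or "charsize".
    return data_type, tuple(int(p) for p in size_info[1:-1].split(","))

parse_type("decimal(10,2)")  # ("decimal", (10, 2))
parse_type("varchar(255)")   # ("varchar", (255,))
parse_type("varchar")        # ("varchar", ())
```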
================================================
FILE: dbt/adapters/trino/connections.py
================================================
import decimal
import os
import re
from abc import ABCMeta, abstractmethod
from contextlib import contextmanager
from dataclasses import dataclass, field
from datetime import date, datetime
from enum import Enum
from typing import Any, Dict, List, Optional, Union
import sqlparse
import trino
from dbt.adapters.contracts.connection import AdapterResponse, Credentials
from dbt.adapters.events.logging import AdapterLogger
from dbt.adapters.exceptions.connection import FailedToConnectError
from dbt.adapters.sql import SQLConnectionManager
from dbt_common.exceptions import DbtDatabaseError, DbtRuntimeError
from dbt_common.helper_types import Port
from trino.transaction import IsolationLevel
from dbt.adapters.trino.__version__ import version
logger = AdapterLogger("Trino")
PREPARED_STATEMENTS_ENABLED_DEFAULT = True
class HttpScheme(Enum):
HTTP = "http"
HTTPS = "https"
class TrinoCredentialsFactory:
@classmethod
def _create_trino_profile(cls, profile):
if "method" in profile:
method = profile["method"]
if method == "ldap":
return TrinoLdapCredentials
elif method == "certificate":
return TrinoCertificateCredentials
elif method == "kerberos":
return TrinoKerberosCredentials
elif method == "jwt":
return TrinoJwtCredentials
elif method == "oauth":
return TrinoOauthCredentials
elif method == "oauth_console":
return TrinoOauthConsoleCredentials
return TrinoNoneCredentials
@classmethod
def translate_aliases(cls, kwargs: Dict[str, Any], recurse: bool = False) -> Dict[str, Any]:
klazz = cls._create_trino_profile(kwargs)
return klazz.translate_aliases(kwargs, recurse)
@classmethod
def validate(cls, data: Any):
klazz = cls._create_trino_profile(data)
return klazz.validate(data)
@classmethod
def from_dict(cls, data: Any):
klazz = cls._create_trino_profile(data)
return klazz.from_dict(data)
class TrinoCredentials(Credentials, metaclass=ABCMeta):
_ALIASES = {"catalog": "database"}
@property
def type(self):
return "trino"
@property
def unique_field(self):
return self.host
def _connection_keys(self):
return (
"method",
"host",
"port",
"user",
"database",
"schema",
"cert",
"prepared_statements_enabled",
)
@abstractmethod
def trino_auth(self) -> Optional[trino.auth.Authentication]:
pass
@dataclass
class TrinoNoneCredentials(TrinoCredentials):
host: str
port: Port
user: str
client_tags: Optional[List[str]] = None
roles: Optional[Dict[str, str]] = None
cert: Optional[Union[str, bool]] = None
http_scheme: HttpScheme = HttpScheme.HTTP
http_headers: Optional[Dict[str, str]] = None
session_properties: Dict[str, Any] = field(default_factory=dict)
prepared_statements_enabled: bool = PREPARED_STATEMENTS_ENABLED_DEFAULT
retries: Optional[int] = trino.constants.DEFAULT_MAX_ATTEMPTS
timezone: Optional[str] = None
suppress_cert_warning: Optional[bool] = None
@property
def method(self):
return "none"
def trino_auth(self):
return trino.constants.DEFAULT_AUTH
@dataclass
class TrinoCertificateCredentials(TrinoCredentials):
host: str
port: Port
client_certificate: str
client_private_key: str
user: Optional[str] = None
client_tags: Optional[List[str]] = None
roles: Optional[Dict[str, str]] = None
cert: Optional[Union[str, bool]] = None
http_headers: Optional[Dict[str, str]] = None
session_properties: Dict[str, Any] = field(default_factory=dict)
prepared_statements_enabled: bool = PREPARED_STATEMENTS_ENABLED_DEFAULT
retries: Optional[int] = trino.constants.DEFAULT_MAX_ATTEMPTS
timezone: Optional[str] = None
suppress_cert_warning: Optional[bool] = None
@property
def http_scheme(self):
return HttpScheme.HTTPS
@property
def method(self):
return "certificate"
def trino_auth(self):
return trino.auth.CertificateAuthentication(
self.client_certificate, self.client_private_key
)
@dataclass
class TrinoLdapCredentials(TrinoCredentials):
host: str
port: Port
user: str
password: str
impersonation_user: Optional[str] = None
client_tags: Optional[List[str]] = None
roles: Optional[Dict[str, str]] = None
cert: Optional[Union[str, bool]] = None
http_headers: Optional[Dict[str, str]] = None
session_properties: Dict[str, Any] = field(default_factory=dict)
prepared_statements_enabled: bool = PREPARED_STATEMENTS_ENABLED_DEFAULT
retries: Optional[int] = trino.constants.DEFAULT_MAX_ATTEMPTS
timezone: Optional[str] = None
suppress_cert_warning: Optional[bool] = None
@property
def http_scheme(self):
return HttpScheme.HTTPS
@property
def method(self):
return "ldap"
def trino_auth(self):
return trino.auth.BasicAuthentication(username=self.user, password=self.password)
@dataclass
class TrinoKerberosCredentials(TrinoCredentials):
host: str
port: Port
user: str
client_tags: Optional[List[str]] = None
roles: Optional[Dict[str, str]] = None
keytab: Optional[str] = None
principal: Optional[str] = None
krb5_config: Optional[str] = None
service_name: Optional[str] = "trino"
mutual_authentication: Optional[bool] = False
cert: Optional[Union[str, bool]] = None
http_headers: Optional[Dict[str, str]] = None
force_preemptive: Optional[bool] = False
hostname_override: Optional[str] = None
sanitize_mutual_error_response: Optional[bool] = True
delegate: Optional[bool] = False
session_properties: Dict[str, Any] = field(default_factory=dict)
prepared_statements_enabled: bool = PREPARED_STATEMENTS_ENABLED_DEFAULT
retries: Optional[int] = trino.constants.DEFAULT_MAX_ATTEMPTS
timezone: Optional[str] = None
suppress_cert_warning: Optional[bool] = None
@property
def http_scheme(self):
return HttpScheme.HTTPS
@property
def method(self):
return "kerberos"
def trino_auth(self):
os.environ["KRB5_CLIENT_KTNAME"] = self.keytab
return trino.auth.KerberosAuthentication(
config=self.krb5_config,
service_name=self.service_name,
principal=self.principal,
mutual_authentication=self.mutual_authentication,
ca_bundle=self.cert,
force_preemptive=self.force_preemptive,
hostname_override=self.hostname_override,
sanitize_mutual_error_response=self.sanitize_mutual_error_response,
delegate=self.delegate,
)
@dataclass
class TrinoJwtCredentials(TrinoCredentials):
host: str
port: Port
jwt_token: str
user: Optional[str] = None
client_tags: Optional[List[str]] = None
roles: Optional[Dict[str, str]] = None
cert: Optional[Union[str, bool]] = None
http_headers: Optional[Dict[str, str]] = None
session_properties: Dict[str, Any] = field(default_factory=dict)
prepared_statements_enabled: bool = PREPARED_STATEMENTS_ENABLED_DEFAULT
retries: Optional[int] = trino.constants.DEFAULT_MAX_ATTEMPTS
timezone: Optional[str] = None
suppress_cert_warning: Optional[bool] = None
@property
def http_scheme(self):
return HttpScheme.HTTPS
@property
def method(self):
return "jwt"
def trino_auth(self):
return trino.auth.JWTAuthentication(self.jwt_token)
@dataclass
class TrinoOauthCredentials(TrinoCredentials):
host: str
port: Port
user: Optional[str] = None
client_tags: Optional[List[str]] = None
roles: Optional[Dict[str, str]] = None
cert: Optional[Union[str, bool]] = None
http_headers: Optional[Dict[str, str]] = None
session_properties: Dict[str, Any] = field(default_factory=dict)
prepared_statements_enabled: bool = PREPARED_STATEMENTS_ENABLED_DEFAULT
retries: Optional[int] = trino.constants.DEFAULT_MAX_ATTEMPTS
timezone: Optional[str] = None
OAUTH = trino.auth.OAuth2Authentication(
redirect_auth_url_handler=trino.auth.WebBrowserRedirectHandler()
)
suppress_cert_warning: Optional[bool] = None
@property
def http_scheme(self):
return HttpScheme.HTTPS
@property
def method(self):
return "oauth"
def trino_auth(self):
return self.OAUTH
@dataclass
class TrinoOauthConsoleCredentials(TrinoCredentials):
host: str
port: Port
user: Optional[str] = None
client_tags: Optional[List[str]] = None
roles: Optional[Dict[str, str]] = None
cert: Optional[Union[str, bool]] = None
http_headers: Optional[Dict[str, str]] = None
session_properties: Dict[str, Any] = field(default_factory=dict)
prepared_statements_enabled: bool = PREPARED_STATEMENTS_ENABLED_DEFAULT
retries: Optional[int] = trino.constants.DEFAULT_MAX_ATTEMPTS
timezone: Optional[str] = None
OAUTH = trino.auth.OAuth2Authentication(
redirect_auth_url_handler=trino.auth.ConsoleRedirectHandler()
)
suppress_cert_warning: Optional[bool] = None
@property
def http_scheme(self):
return HttpScheme.HTTPS
@property
def method(self):
return "oauth_console"
def trino_auth(self):
return self.OAUTH
class ConnectionWrapper(object):
"""Wrap a Trino connection in a way that accomplishes two tasks:
- prefetch results from execute() calls so that trino calls actually
persist to the db but then present the usual cursor interface
- provide `cancel()` on the same object as `commit()`/`rollback()`/...
"""
def __init__(self, handle, prepared_statements_enabled):
self.handle = handle
self._cursor = None
self._fetch_result = None
self._prepared_statements_enabled = prepared_statements_enabled
def cursor(self):
self._cursor = self.handle.cursor()
return self
def cancel(self):
if self._cursor is not None:
self._cursor.cancel()
def close(self):
# this is a noop on trino, but pass it through anyway
self.handle.close()
def commit(self):
pass
def rollback(self):
pass
def start_transaction(self):
pass
def fetchall(self):
if self._cursor is None:
return None
if self._fetch_result is not None:
ret = self._fetch_result
self._fetch_result = None
return ret
return None
def fetchone(self):
if self._cursor is None:
return None
if self._fetch_result is not None:
ret = self._fetch_result[0]
self._fetch_result = None
return ret
return None
def fetchmany(self, size):
if self._cursor is None:
return None
if self._fetch_result is not None:
ret = self._fetch_result[:size]
self._fetch_result = None
return ret
return None
def execute(self, sql, bindings=None):
if not self._prepared_statements_enabled and bindings is not None:
# DEPRECATED: by default prepared statements are used.
# Code is left as an escape hatch if prepared statements
# are failing.
bindings = tuple(self._escape_value(b) for b in bindings)
sql = sql % bindings
result = self._cursor.execute(sql)
else:
result = self._cursor.execute(sql, params=bindings)
self._fetch_result = self._cursor.fetchall()
return result
@property
def description(self):
return self._cursor.description
@classmethod
def _escape_value(cls, value):
"""A not very comprehensive system for escaping bindings.
I think "'" (a single quote) is the only character that matters.
"""
numbers = (decimal.Decimal, int, float)
if value is None:
return "NULL"
elif isinstance(value, str):
return "'{}'".format(value.replace("'", "''"))
elif isinstance(value, numbers):
return value
elif isinstance(value, datetime):
time_formatted = value.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]
return "TIMESTAMP '{}'".format(time_formatted)
elif isinstance(value, date):
date_formatted = value.strftime("%Y-%m-%d")
return "DATE '{}'".format(date_formatted)
else:
raise ValueError("Cannot escape {}".format(type(value)))
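When `prepared_statements_enabled` is false, `execute()` interpolates the escaped bindings into the SQL with `%`-formatting. A small illustration of that path, using an abridged `escape_value` stand-in for the classmethod above (the datetime branch is omitted here):

```python
import decimal
from datetime import date

def escape_value(value):
    # Abridged version of the escaping rules above, shown standalone.
    if value is None:
        return "NULL"
    if isinstance(value, str):
        return "'{}'".format(value.replace("'", "''"))
    if isinstance(value, (decimal.Decimal, int, float)):
        return value
    if isinstance(value, date):
        return "DATE '{}'".format(value.strftime("%Y-%m-%d"))
    raise ValueError("Cannot escape {}".format(type(value)))

sql = "select * from users where name = %s and age > %s"
bindings = tuple(escape_value(b) for b in ("O'Brien", 30))
sql % bindings
# "select * from users where name = 'O''Brien' and age > 30"
```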
@dataclass
class TrinoAdapterResponse(AdapterResponse):
query: str = ""
query_id: str = ""
class TrinoConnectionManager(SQLConnectionManager):
TYPE = "trino"
behavior_flags = None
def __init__(self, profile, mp_context, behavior_flags=None) -> None:
super().__init__(profile, mp_context)
TrinoConnectionManager.behavior_flags = behavior_flags
@contextmanager
def exception_handler(self, sql):
try:
yield
except trino.exceptions.Error as e:
msg = str(e)
if "Failed to establish a new connection" in msg:
raise FailedToConnectError(msg) from e
if isinstance(e, trino.exceptions.TrinoQueryError):
logger.debug("Trino query id: {}".format(e.query_id))
logger.debug("Trino error: {}".format(msg))
raise DbtDatabaseError(msg)
except Exception as e:
msg = str(e)
if isinstance(e, DbtRuntimeError):
# during a sql query, an internal to dbt exception was raised.
# this sounds a lot like a signal handler and probably has
# useful information, so raise it without modification.
raise
raise DbtRuntimeError(msg) from e
# For connections in auto-commit mode there is no need to start a
# separate transaction. With auto-commit, the client creates a new
# transaction and commits/rolls back for each query.
def add_begin_query(self):
pass
def add_commit_query(self):
pass
@classmethod
def open(cls, connection):
if connection.state == "open":
logger.debug("Connection is already open, skipping open.")
return connection
credentials = connection.credentials
# set default `cert` value, according to
# require_certificate_validation behavior flag
if credentials.cert is None:
req_cert_val_flag = cls.behavior_flags.require_certificate_validation.setting
if req_cert_val_flag:
credentials.cert = True
if credentials.suppress_cert_warning:
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
# it's impossible for trino to fail here as 'connections' are actually
# just cursor factories.
trino_conn = trino.dbapi.connect(
host=credentials.host,
port=credentials.port,
user=credentials.impersonation_user
if getattr(credentials, "impersonation_user", None)
else credentials.user,
client_tags=credentials.client_tags,
roles=credentials.roles,
catalog=credentials.database,
schema=credentials.schema,
http_scheme=credentials.http_scheme.value,
http_headers=credentials.http_headers,
session_properties=credentials.session_properties,
auth=credentials.trino_auth(),
max_attempts=credentials.retries,
isolation_level=IsolationLevel.AUTOCOMMIT,
source=f"dbt-trino-{version}",
verify=credentials.cert,
timezone=credentials.timezone,
)
connection.state = "open"
connection.handle = ConnectionWrapper(trino_conn, credentials.prepared_statements_enabled)
return connection
@classmethod
def get_response(cls, cursor) -> TrinoAdapterResponse:
code = cursor._cursor.update_type
if code is None:
code = "SUCCESS"
rows_affected = cursor._cursor.rowcount
if rows_affected == -1:
message = f"{code}"
else:
message = f"{code} ({rows_affected:_} rows)"
return TrinoAdapterResponse(
_message=message,
query=cursor._cursor.query,
query_id=cursor._cursor.query_id,
rows_affected=rows_affected,
) # type: ignore
def cancel(self, connection):
connection.handle.cancel()
def add_query(self, sql, auto_begin=True, bindings=None, abridge_sql_log=False):
connection = None
cursor = None
# TODO: is this sufficient? Largely copy+pasted from snowflake, so
# there's some common behavior here we can maybe factor out into the
# SQLAdapter?
queries = [q.rstrip(";") for q in sqlparse.split(sql)]
for individual_query in queries:
# hack -- after the last ';', remove comments and don't run
# empty queries. this avoids using exceptions as flow control,
# and also allows us to return the status of the last cursor
without_comments = re.sub(
re.compile("^.*(--.*)$", re.MULTILINE), "", individual_query
).strip()
if without_comments == "":
continue
parent = super(TrinoConnectionManager, self)
connection, cursor = parent.add_query(
individual_query, auto_begin, bindings, abridge_sql_log
)
if cursor is None:
conn = self.get_thread_connection()
if conn is None or conn.name is None:
conn_name = "<None>"
else:
conn_name = conn.name
raise DbtRuntimeError(
"Tried to run an empty query on model '{}'. If you are "
"conditionally running\nsql, eg. in a model hook, make "
"sure your `else` clause contains valid sql!\n\n"
"Provided SQL:\n{}".format(conn_name, sql)
)
return connection, cursor
@classmethod
def data_type_code_to_name(cls, type_code) -> str:
return type_code.split("(")[0].upper()
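The comment-stripping check inside `add_query` can be reproduced with the same regex; `is_empty_statement` is an illustrative name for the inline check:

```python
import re

# Same pattern as in add_query: any line containing a `--` comment is
# removed wholesale before checking whether anything runnable remains.
COMMENT_LINE = re.compile("^.*(--.*)$", re.MULTILINE)

def is_empty_statement(query):
    return re.sub(COMMENT_LINE, "", query).strip() == ""

is_empty_statement("-- just a comment\n")  # True
is_empty_statement("select 1")             # False
```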
================================================
FILE: dbt/adapters/trino/constants.py
================================================
from types import SimpleNamespace
ADAPTER_TYPE = "trino"
TRINO_CATALOG_TYPE = "trino"
DEFAULT_TRINO_CATALOG = SimpleNamespace(
name="trino_default",
catalog_name="trino_default",
catalog_type="trino",
table_format=None,
file_format=None,
external_volume=None,
adapter_properties={},
)
================================================
FILE: dbt/adapters/trino/impl.py
================================================
from dataclasses import dataclass
from typing import Dict, List, Optional
import agate
from dbt.adapters.base.impl import AdapterConfig, ConstraintSupport
from dbt.adapters.base.meta import available
from dbt.adapters.capability import (
Capability,
CapabilityDict,
CapabilitySupport,
Support,
)
from dbt.adapters.catalogs import CatalogRelation
from dbt.adapters.contracts.relation import RelationConfig
from dbt.adapters.sql import SQLAdapter
from dbt_common.behavior_flags import BehaviorFlag
from dbt_common.contracts.constraints import ConstraintType
from dbt_common.exceptions import DbtDatabaseError
from dbt.adapters.trino import (
TrinoColumn,
TrinoConnectionManager,
TrinoRelation,
constants,
parse_model,
)
from dbt.adapters.trino.catalogs import TrinoCatalogIntegration
@dataclass
class TrinoConfig(AdapterConfig):
properties: Optional[Dict[str, str]] = None
view_security: Optional[str] = "definer"
class TrinoAdapter(SQLAdapter):
Relation = TrinoRelation
Column = TrinoColumn
ConnectionManager = TrinoConnectionManager
AdapterSpecificConfigs = TrinoConfig
CATALOG_INTEGRATIONS = [
TrinoCatalogIntegration,
]
CONSTRAINT_SUPPORT = {
ConstraintType.check: ConstraintSupport.NOT_SUPPORTED,
ConstraintType.not_null: ConstraintSupport.ENFORCED,
ConstraintType.unique: ConstraintSupport.NOT_SUPPORTED,
ConstraintType.primary_key: ConstraintSupport.NOT_SUPPORTED,
ConstraintType.foreign_key: ConstraintSupport.NOT_SUPPORTED,
}
_capabilities: CapabilityDict = CapabilityDict(
{
Capability.SchemaMetadataByRelations: CapabilitySupport(support=Support.Full),
# No information about last table modification in information_schema.tables
Capability.TableLastModifiedMetadata: CapabilitySupport(support=Support.Unsupported),
Capability.TableLastModifiedMetadataBatch: CapabilitySupport(
support=Support.Unsupported
),
}
)
def __init__(self, config, mp_context) -> None:
super().__init__(config, mp_context)
self.connections = self.ConnectionManager(config, mp_context, self.behavior)
self.add_catalog_integration(constants.DEFAULT_TRINO_CATALOG)
@property
def _behavior_flags(self) -> List[BehaviorFlag]:
return [
{ # type: ignore
"name": "require_certificate_validation",
"default": False,
"description": (
"SSL certificate validation is disabled by default. "
"It is legacy behavior which will be changed in future releases. "
"It is strongly advised to enable `require_certificate_validation` flag "
"or explicitly set `cert` configuration to `True` for security reasons. "
"You may receive an error after that if your SSL setup is incorrect."
),
}
]
@classmethod
def date_function(cls):
return "datenow()"
@classmethod
def convert_text_type(cls, agate_table, col_idx):
return "VARCHAR"
@classmethod
def convert_number_type(cls, agate_table, col_idx):
decimals = agate_table.aggregate(agate.MaxPrecision(col_idx))
return "DOUBLE" if decimals else "INTEGER"
@classmethod
def convert_datetime_type(cls, agate_table, col_idx):
return "TIMESTAMP"
@classmethod
def convert_date_type(cls, agate_table: agate.Table, col_idx: int) -> str:
return "DATE"
def timestamp_add_sql(self, add_to: str, number: int = 1, interval: str = "hour") -> str:
return f"{add_to} + interval '{number}' {interval}"
def get_columns_in_relation(self, relation):
try:
return super().get_columns_in_relation(relation)
except DbtDatabaseError as exc:
if "does not exist" in str(exc):
return []
else:
raise
def valid_incremental_strategies(self):
return ["append", "merge", "delete+insert", "microbatch"]
@available
def build_catalog_relation(self, model: RelationConfig) -> Optional[CatalogRelation]:
"""
Builds a relation for a given configuration.
This method uses the provided configuration to determine the appropriate catalog
integration and config parser for building the relation. It defaults to the trino
catalog if none is provided in the configuration for backward compatibility.
Args:
model (RelationConfig): `config.model` (not `model`) from the jinja context
Returns:
Optional[CatalogRelation]: The constructed relation object generated through the catalog integration and parser, or None if no catalog is resolved
"""
if catalog := parse_model.catalog_name(model):
catalog_integration = self.get_catalog_integration(catalog)
return catalog_integration.build_relation(model)
return None
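`timestamp_add_sql` builds interval arithmetic as plain string formatting; a standalone sketch of the same rendering:

```python
def timestamp_add_sql(add_to, number=1, interval="hour"):
    # Trino interval literals quote the magnitude, not the unit.
    return f"{add_to} + interval '{number}' {interval}"

timestamp_add_sql("order_ts", 3, "day")
# "order_ts + interval '3' day"
```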
================================================
FILE: dbt/adapters/trino/parse_model.py
================================================
from typing import Optional
from dbt.adapters.catalogs import CATALOG_INTEGRATION_MODEL_CONFIG_NAME # type: ignore
from dbt.adapters.contracts.relation import RelationConfig
from dbt.adapters.trino import constants
def catalog_name(model: RelationConfig) -> Optional[str]:
"""Extract catalog name from model configuration"""
if not hasattr(model, "config") or not model.config:
return None
if catalog := model.config.get(CATALOG_INTEGRATION_MODEL_CONFIG_NAME):
return catalog
return constants.DEFAULT_TRINO_CATALOG.name
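The fallback in `catalog_name` can be sketched with a plain dict standing in for `model.config`; `"catalog"` below is a stand-in key for the real `CATALOG_INTEGRATION_MODEL_CONFIG_NAME` constant:

```python
DEFAULT_CATALOG = "trino_default"

def catalog_name(config):
    # No (or empty) config: no catalog integration is resolved.
    if not config:
        return None
    # An explicitly configured catalog wins; otherwise fall back to the default.
    return config.get("catalog") or DEFAULT_CATALOG

catalog_name({"catalog": "iceberg_lake"})  # "iceberg_lake"
catalog_name({"materialized": "table"})    # "trino_default"
catalog_name({})                           # None
```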
================================================
FILE: dbt/adapters/trino/relation.py
================================================
from dataclasses import dataclass, field
from dbt.adapters.base.relation import BaseRelation, EventTimeFilter, Policy
from dbt.adapters.contracts.relation import ComponentName
@dataclass(frozen=True, eq=False, repr=False)
class TrinoRelation(BaseRelation):
quote_policy: Policy = field(default_factory=lambda: Policy())
require_alias: bool = False
# Overridden as Trino converts relation identifiers to lowercase
def _is_exactish_match(self, field: ComponentName, value: str) -> bool:
return self.path.get_lowered_part(field) == value.lower()
# Overridden because Trino cannot compare a TIMESTAMP column with a VARCHAR literal.
def _render_event_time_filtered(self, event_time_filter: EventTimeFilter) -> str:
"""
Returns "" if start and end are both None
"""
filter = ""
if event_time_filter.start and event_time_filter.end:
filter = f"{event_time_filter.field_name} >= TIMESTAMP '{event_time_filter.start}' and {event_time_filter.field_name} < TIMESTAMP '{event_time_filter.end}'"
elif event_time_filter.start:
filter = f"{event_time_filter.field_name} >= TIMESTAMP '{event_time_filter.start}'"
elif event_time_filter.end:
filter = f"{event_time_filter.field_name} < TIMESTAMP '{event_time_filter.end}'"
return filter
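The filter rendering above can be expressed as a plain function with explicit arguments instead of the `EventTimeFilter` object; a minimal sketch:

```python
def render_event_time_filter(field_name, start=None, end=None):
    # TIMESTAMP literals avoid comparing a timestamp column with a varchar.
    if start and end:
        return (f"{field_name} >= TIMESTAMP '{start}' "
                f"and {field_name} < TIMESTAMP '{end}'")
    if start:
        return f"{field_name} >= TIMESTAMP '{start}'"
    if end:
        return f"{field_name} < TIMESTAMP '{end}'"
    return ""

render_event_time_filter("loaded_at", start="2024-01-01", end="2024-02-01")
```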
================================================
FILE: dbt/include/trino/__init__.py
================================================
import os
PACKAGE_PATH = os.path.dirname(__file__)
================================================
FILE: dbt/include/trino/dbt_project.yml
================================================
name: dbt_trino
version: 1.0
config-version: 2
macro-paths: ["macros"]
================================================
FILE: dbt/include/trino/macros/adapters.sql
================================================
-- - get_catalog
-- - list_relations_without_caching
-- - get_columns_in_relation
{% macro trino__get_columns_in_relation(relation) -%}
{%- set sql -%}
select column_name, data_type
from {{ relation.information_schema() }}.columns
where
table_catalog = '{{ relation.database | lower }}'
and table_schema = '{{ relation.schema | lower }}'
and table_name = '{{ relation.identifier | lower }}'
{%- endset -%}
{%- set result = run_query(sql) -%}
{% set maximum = 10000 %}
{% if (result | length) >= maximum %}
{% set msg %}
Too many columns in relation {{ relation }}! dbt can only get
information about relations with fewer than {{ maximum }} columns.
{% endset %}
{% do exceptions.raise_compiler_error(msg) %}
{% endif %}
{% set columns = [] %}
{% for row in result %}
{% do columns.append(api.Column.from_description(row['column_name'].lower(), row['data_type'])) %}
{% endfor %}
{% do return(columns) %}
{% endmacro %}
{% macro trino__list_relations_without_caching(relation) %}
{% call statement('list_relations_without_caching', fetch_result=True) -%}
select
t.table_catalog as database,
t.table_name as name,
t.table_schema as schema,
case when mv.name is not null then 'materialized_view'
when t.table_type = 'BASE TABLE' then 'table'
when t.table_type = 'VIEW' then 'view'
else t.table_type
end as table_type
from {{ relation.information_schema() }}.tables t
left join (
select * from system.metadata.materialized_views
where catalog_name = '{{ relation.database | lower }}'
and schema_name = '{{ relation.schema | lower }}') mv
on mv.catalog_name = t.table_catalog and mv.schema_name = t.table_schema and mv.name = t.table_name
where t.table_schema = '{{ relation.schema | lower }}'
{% endcall %}
{{ return(load_result('list_relations_without_caching').table) }}
{% endmacro %}
{% macro trino__reset_csv_table(model, full_refresh, old_relation, agate_table) %}
{{ adapter.drop_relation(old_relation) }}
{{ return(create_csv_table(model, agate_table)) }}
{% endmacro %}
{% macro trino__create_csv_table(model, agate_table) %}
{%- set column_override = model['config'].get('column_types', {}) -%}
{%- set quote_seed_column = model['config'].get('quote_columns', None) -%}
{% set sql %}
create table {{ this.render() }} (
{%- for col_name in agate_table.column_names -%}
{%- set inferred_type = adapter.convert_type(agate_table, loop.index0) -%}
{%- set type = column_override.get(col_name, inferred_type) -%}
{%- set column_name = (col_name | string) -%}
{{ adapter.quote_seed_column(column_name, quote_seed_column) }} {{ type }} {%- if not loop.last -%}, {%- endif -%}
{%- endfor -%}
) {{ properties() }}
{% endset %}
{% call statement('_') -%}
{{ sql }}
{%- endcall %}
{{ return(sql) }}
{% endmacro %}
{% macro properties(temporary=False) %}
{%- set _properties = config.get('properties') -%}
{%- set table_format = config.get('table_format') -%}
{%- set file_format = config.get('file_format') -%}
{%- set catalog_relation = adapter.build_catalog_relation(config.model) -%}
{%- set catalog_table_format = catalog_relation.table_format -%}
{%- set catalog_file_format = catalog_relation.file_format -%}
{%- set catalog_storage_uri = catalog_relation.storage_uri -%}
{%- if file_format -%}
{%- if _properties -%}
{%- if _properties.format -%}
{% set msg %}
You can specify either 'file_format' or 'properties.format' configurations, but not both.
{% endset %}
{% do exceptions.raise_compiler_error(msg) %}
{%- else -%}
{%- do _properties.update({'format': "'" ~ file_format ~ "'"}) -%}
{%- endif -%}
{%- else -%}
{%- set _properties = {'format': "'" ~ file_format ~ "'"} -%}
{%- endif -%}
{%- elif (not _properties.format) and catalog_file_format -%}
{%- if _properties -%}
{%- do _properties.update({'format': "'" ~ catalog_file_format ~ "'"}) -%}
{%- else -%}
{%- set _properties = {'format': "'" ~ catalog_file_format ~ "'"} -%}
{%- endif -%}
{%- endif -%}
{%- if table_format -%}
{%- if _properties -%}
{%- if _properties.type -%}
{% set msg %}
You can specify either 'table_format' or 'properties.type' configurations, but not both.
{% endset %}
{% do exceptions.raise_compiler_error(msg) %}
{%- else -%}
{%- do _properties.update({'type': "'" ~ table_format ~ "'"}) -%}
{%- endif -%}
{%- else -%}
{%- set _properties = {'type': "'" ~ table_format ~ "'"} -%}
{%- endif -%}
{%- elif (not _properties.type) and (catalog_table_format is not none) -%}
{%- if _properties -%}
{%- do _properties.update({'type': "'" ~ catalog_table_format ~ "'"}) -%}
{%- else -%}
{%- set _properties = {'type': "'" ~ catalog_table_format ~ "'"} -%}
{%- endif -%}
{%- endif -%}
{%- if not _properties.location and catalog_storage_uri -%}
{%- if _properties -%}
{%- do _properties.update({'location': "'" ~ catalog_storage_uri ~ "'"}) -%}
{%- else -%}
{%- set _properties = {'location': "'" ~ catalog_storage_uri ~ "'"} -%}
{%- endif -%}
{%- endif -%}
{%- if temporary -%}
{%- if _properties -%}
{%- if _properties.location -%}
{%- do _properties.update({'location': _properties.location[:-1] ~ "__dbt_tmp'"}) -%}
{%- endif -%}
{%- endif -%}
{%- endif -%}
{%- if _properties is not none -%}
WITH (
{%- for key, value in _properties.items() -%}
{{ key }} = {{ value }}
{%- if not loop.last -%}{{ ',\n ' }}{%- endif -%}
{%- endfor -%}
)
{%- endif -%}
{%- endmacro -%}
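The final `WITH (...)` rendering at the end of the `properties()` macro amounts to joining pre-quoted key/value pairs. A Python sketch of that last step (values arrive already wrapped in single quotes, as the macro stores them via `"'" ~ file_format ~ "'"`):

```python
def render_with(props):
    # Values are pre-quoted strings, exactly as the macro keeps them.
    if not props:
        return ""
    pairs = ",\n    ".join(f"{key} = {value}" for key, value in props.items())
    return f"WITH (\n    {pairs}\n)"

print(render_with({"format": "'PARQUET'", "type": "'ICEBERG'"}))
# WITH (
#     format = 'PARQUET',
#     type = 'ICEBERG'
# )
```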
{% macro comment(comment) %}
{%- set persist_docs = model['config'].get('persist_docs') -%}
{%- if persist_docs -%}
{%- set persist_relation = persist_docs.get('relation') -%}
{%- if persist_relation and comment is not none and comment|length > 0 -%}
comment '{{ comment | replace("'", "''") }}'
{%- endif -%}
{%- endif -%}
{%- endmacro -%}
{% macro trino__create_table_as(temporary, relation, sql, on_exists=None) -%}
{%- set or_replace = ' or replace' if on_exists == 'replace' else '' -%}
{%- set if_not_exists = ' if not exists' if on_exists == 'skip' else '' -%}
{%- set contract_config = config.get('contract') -%}
{%- if contract_config.enforced -%}
create{{ or_replace }} table{{ if_not_exists }}
{{ relation }}
{{ get_table_columns_and_constraints() }}
{{ get_assert_columns_equivalent(sql) }}
{%- set sql = get_select_subquery(sql) %}
{{ comment(model.get('description')) }}
{{ properties(temporary) }}
;
insert into {{ relation }}
(
{{ sql }}
)
;
{%- else %}
create{{ or_replace }} table{{ if_not_exists }} {{ relation }}
{{ comment(model.get('description')) }}
{{ properties(temporary) }}
as (
{{ sql }}
);
{%- endif %}
{% endmacro %}
{% macro trino__create_view_as(relation, sql) -%}
{%- set view_security = config.get('view_security', 'definer') -%}
{%- if view_security not in ['definer', 'invoker'] -%}
{%- set log_message = 'Invalid value for view_security (%s) specified. Setting default value (%s).' % (view_security, 'definer') -%}
{% do log(log_message) %}
{%- set view_security = 'definer' -%}
{% endif %}
create or replace view
{{ relation }}
{%- set contract_config = config.get('contract') -%}
{%- if contract_config.enforced -%}
{{ get_assert_columns_equivalent(sql) }}
{%- endif %}
security {{ view_security }}
as
{{ sql }}
;
{% endmacro %}
{%- macro trino__get_drop_sql(relation) -%}
{% set relation_type = relation.type|replace("_", " ") %}
drop {{ relation_type }} if exists {{ relation }}
{% endmacro %}
{# see this issue: https://github.com/dbt-labs/dbt/issues/2267 #}
{% macro trino__information_schema_name(database) -%}
{%- if database -%}
{{ database }}.INFORMATION_SCHEMA
{%- else -%}
INFORMATION_SCHEMA
{%- endif -%}
{%- endmacro %}
{# On Trino, 'cascade' is not supported so we have to manually cascade. #}
{% macro trino__drop_schema(relation) -%}
{% for row in list_relations_without_caching(relation) %}
{% set rel_db = row[0] %}
{% set rel_identifier = row[1] %}
{% set rel_schema = row[2] %}
{% set rel_type = api.Relation.get_relation_type(row[3]) %}
{% set existing = api.Relation.create(database=rel_db, schema=rel_schema, identifier=rel_identifier, type=rel_type) %}
{% do drop_relation(existing) %}
{% endfor %}
{%- call statement('drop_schema') -%}
drop schema if exists {{ relation }}
{% endcall %}
{% endmacro %}
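{#
  Since Trino has no `drop schema ... cascade`, the macro above drops each
  contained relation first. Illustrative sketch of the emitted statement
  sequence (catalog, schema, and relation names are made up):

  drop table if exists "memory"."analytics"."orders";
  drop view if exists "memory"."analytics"."orders_v";
  drop schema if exists "memory"."analytics"
#}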
{% macro trino__rename_relation(from_relation, to_relation) -%}
{% set from_relation_type = from_relation.type|replace("_", " ") %}
{% call statement('rename_relation') -%}
alter {{ from_relation_type }} {{ from_relation }} rename to {{ to_relation }}
{%- endcall %}
{% endmacro %}
{% macro trino__alter_relation_comment(relation, relation_comment) -%}
comment on {{ relation.type }} {{ relation }} is '{{ relation_comment | replace("'", "''") }}';
{% endmacro %}
{% macro trino__alter_column_comment(relation, column_dict) %}
{% set existing_columns = adapter.get_columns_in_relation(relation) | map(attribute="name") | list %}
{% for column_name in column_dict if (column_name in existing_columns) %}
{% set comment = column_dict[column_name]['description'] %}
{%- if comment|length -%}
comment on column {{ relation }}.{{ adapter.quote(column_name) if column_dict[column_name]['quote'] else column_name }} is '{{ comment | replace("'", "''") }}';
{%- else -%}
comment on column {{ relation }}.{{ adapter.quote(column_name) if column_dict[column_name]['quote'] else column_name }} is null;
{%- endif -%}
{% endfor %}
{% endmacro %}
{% macro trino__list_schemas(database) -%}
{% call statement('list_schemas', fetch_result=True, auto_begin=False) %}
select schema_name
from {{ information_schema_name(database) }}.schemata
{% endcall %}
{{ return(load_result('list_schemas').table) }}
{% endmacro %}
{% macro trino__check_schema_exists(information_schema, schema) -%}
{% call statement('check_schema_exists', fetch_result=True, auto_begin=False) -%}
select count(*)
from {{ information_schema }}.schemata
where catalog_name = '{{ information_schema.database }}'
and schema_name = '{{ schema | lower }}'
{%- endcall %}
{{ return(load_result('check_schema_exists').table) }}
{% endmacro %}
{% macro trino__get_binding_char() %}
{%- if target.prepared_statements_enabled|as_bool -%}
{{ return('?') }}
{%- else -%}
{{ return('%s') }}
{%- endif -%}
{% endmacro %}
{% macro trino__alter_relation_add_remove_columns(relation, add_columns, remove_columns) %}
{% if add_columns is none %}
{% set add_columns = [] %}
{% endif %}
{% if remove_columns is none %}
{% set remove_columns = [] %}
{% endif %}
{% for column in add_columns %}
{% set sql -%}
alter {{ relation.type }} {{ relation }} add column {{ adapter.quote(column.name) }} {{ column.data_type }}
{%- endset -%}
{% do run_query(sql) %}
{% endfor %}
{% for column in remove_columns %}
{% set sql -%}
alter {{ relation.type }} {{ relation }} drop column {{ adapter.quote(column.name) }}
{%- endset -%}
{% do run_query(sql) %}
{% endfor %}
{% endmacro %}
{% macro create_or_replace_view() %}
{%- set identifier = model['alias'] -%}
{%- set old_relation = adapter.get_relation(database=database, schema=schema, identifier=identifier) -%}
{%- set exists_as_view = (old_relation is not none and old_relation.is_view) -%}
{%- set target_relation = api.Relation.create(
identifier=identifier, schema=schema, database=database,
type='view') -%}
{% set grant_config = config.get('grants') %}
{{ run_hooks(pre_hooks) }}
  -- If another object exists with this name, delete it
{%- if old_relation is not none and not old_relation.is_view -%}
{{ handle_existing_table(should_full_refresh(), old_relation) }}
{%- endif -%}
-- build model
{% call statement('main') -%}
{{ get_create_view_as_sql(target_relation, sql) }}
{%- endcall %}
  {% set should_revoke = should_revoke(exists_as_view, full_refresh_mode=True) %}
  {% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}
{{ run_hooks(post_hooks) }}
{{ return({'relations': [target_relation]}) }}
{% endmacro %}
{% macro trino__alter_column_type(relation, column_name, new_column_type) %}
{#
1. Create a new column (w/ temp name and correct type)
2. Copy data over to it
3. Drop the existing column
4. Rename the new column to existing column
#}
{%- set tmp_column = column_name + "__dbt_alter" -%}
{% call statement('alter_column_type') %}
alter table {{ relation }} add column {{ adapter.quote(tmp_column) }} {{ new_column_type }};
update {{ relation }} set {{ adapter.quote(tmp_column) }} = CAST({{ adapter.quote(column_name) }} AS {{ new_column_type }});
alter table {{ relation }} drop column {{ adapter.quote(column_name) }};
alter table {{ relation }} rename column {{ adapter.quote(tmp_column) }} to {{ adapter.quote(column_name) }}
{% endcall %}
{% endmacro %}
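{#
  The four steps in the comment above render to a batch like the following
  (relation, column, and type names are hypothetical, for illustration only):

  alter table "memory"."default"."items" add column "price__dbt_alter" decimal(18, 2);
  update "memory"."default"."items" set "price__dbt_alter" = CAST("price" AS decimal(18, 2));
  alter table "memory"."default"."items" drop column "price";
  alter table "memory"."default"."items" rename column "price__dbt_alter" to "price"
#}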
================================================
FILE: dbt/include/trino/macros/apply_grants.sql
================================================
{% macro trino__get_show_grant_sql(relation) -%}
select
grantee,
lower(privilege_type) as privilege_type
from information_schema.table_privileges
where table_catalog = '{{ relation.database }}'
and table_schema = '{{ relation.schema }}'
and table_name = '{{ relation.identifier }}'
{%- endmacro %}
{% macro trino__copy_grants() %}
{#
-- This macro should return true or false depending on the answer to
-- following question:
-- when an object is fully replaced on your database, do grants copy over?
-- e.g. on Postgres this is never true,
-- on Spark this is different for views vs. non-Delta tables vs. Delta tables,
-- on Snowflake it depends on the user-supplied copy_grants configuration.
    -- true by default, which means "play it safe": grants MIGHT have copied over,
-- so dbt will run an extra query to check them + calculate diffs.
#}
{{ return(False) }}
{% endmacro %}
{%- macro trino__get_grant_sql(relation, privilege, grantees) -%}
grant {{ privilege }} on {{ relation }} to {{ adapter.quote(grantees[0]) }}
{%- endmacro %}
{%- macro trino__support_multiple_grantees_per_dcl_statement() -%}
{#
-- This macro should return true or false depending on the answer to
-- following question:
-- does this database support grant {privilege} to user_a, user_b, ...?
-- or do user_a + user_b need their own separate grant statements?
#}
{{ return(False) }}
{%- endmacro -%}
{% macro trino__call_dcl_statements(dcl_statement_list) %}
{% for dcl_statement in dcl_statement_list %}
{% call statement('grant_or_revoke') %}
{{ dcl_statement }}
{% endcall %}
{% endfor %}
{% endmacro %}
================================================
FILE: dbt/include/trino/macros/catalog.sql
================================================
{% macro trino__get_catalog(information_schema, schemas) -%}
{% set query %}
with tables as (
{{ trino__get_catalog_tables_sql(information_schema) }}
{{ trino__get_catalog_schemas_where_clause_sql(schemas) }}
),
columns as (
{{ trino__get_catalog_columns_sql(information_schema) }}
{{ trino__get_catalog_schemas_where_clause_sql(schemas) }}
),
table_comment as (
{{ trino__get_catalog_table_comment_schemas_sql(information_schema, schemas) }}
)
{{ trino__get_catalog_results_sql() }}
{%- endset -%}
{{ return(run_query(query)) }}
{%- endmacro %}
{% macro trino__get_catalog_relations(information_schema, relations) -%}
{% set query %}
with tables as (
{{ trino__get_catalog_tables_sql(information_schema) }}
{{ trino__get_catalog_relations_where_clause_sql(relations) }}
),
columns as (
{{ trino__get_catalog_columns_sql(information_schema) }}
{{ trino__get_catalog_relations_where_clause_sql(relations) }}
),
table_comment as (
{{ trino__get_catalog_table_comment_relations_sql(information_schema, relations) }}
)
{{ trino__get_catalog_results_sql() }}
{%- endset -%}
{{ return(run_query(query)) }}
{%- endmacro %}
{% macro trino__get_catalog_tables_sql(information_schema) -%}
select
table_catalog as "table_database",
table_schema as "table_schema",
table_name as "table_name",
table_type as "table_type",
null as "table_owner"
from {{ information_schema }}.tables
{%- endmacro %}
{% macro trino__get_catalog_columns_sql(information_schema) -%}
select
table_catalog as "table_database",
table_schema as "table_schema",
table_name as "table_name",
column_name as "column_name",
ordinal_position as "column_index",
data_type as "column_type",
comment as "column_comment"
from {{ information_schema }}.columns
{%- endmacro %}
{% macro trino__get_catalog_table_comment_schemas_sql(information_schema, schemas) -%}
select
catalog_name as "table_database",
schema_name as "table_schema",
table_name as "table_name",
comment as "table_comment"
from system.metadata.table_comments
where
catalog_name = '{{ information_schema.database }}'
and
schema_name != 'information_schema'
and
schema_name in ('{{ schemas | join("','") | lower }}')
{%- endmacro %}
{% macro trino__get_catalog_table_comment_relations_sql(information_schema, relations) -%}
{%- for relation in relations %}
select
catalog_name as "table_database",
schema_name as "table_schema",
table_name as "table_name",
comment as "table_comment"
from system.metadata.table_comments
where
catalog_name = '{{ information_schema.database }}'
and
schema_name != 'information_schema'
and
{% if relation.schema and relation.identifier %}
(
schema_name = '{{ relation.schema | lower }}'
and table_name = '{{ relation.identifier | lower }}'
)
{% elif relation.schema %}
(
schema_name = '{{ relation.schema | lower }}'
)
{% else %}
{% do exceptions.raise_compiler_error(
'`get_catalog_relations` requires a list of relations, each with a schema'
) %}
{% endif %}
{%- if not loop.last %}
union all
{% endif -%}
{%- endfor -%}
{%- endmacro %}
{% macro trino__get_catalog_results_sql() -%}
select
table_database,
table_schema,
table_name,
table_type,
table_owner,
column_name,
column_index,
column_type,
column_comment,
table_comment
from tables
join columns using ("table_database", "table_schema", "table_name")
join table_comment using ("table_database", "table_schema", "table_name")
order by "column_index"
{%- endmacro %}
{% macro trino__get_catalog_schemas_where_clause_sql(schemas) -%}
where
table_schema != 'information_schema'
and
table_schema in ('{{ schemas | join("','") | lower }}')
{%- endmacro %}
{% macro trino__get_catalog_relations_where_clause_sql(relations) -%}
where
table_schema != 'information_schema'
and
(
{%- for relation in relations -%}
{% if relation.schema and relation.identifier %}
(
table_schema = '{{ relation.schema | lower }}'
and table_name = '{{ relation.identifier | lower }}'
)
{% elif relation.schema %}
(
table_schema = '{{ relation.schema | lower }}'
)
{% else %}
{% do exceptions.raise_compiler_error(
'`get_catalog_relations` requires a list of relations, each with a schema'
) %}
{% endif %}
{%- if not loop.last %} or {% endif -%}
{%- endfor -%}
)
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/materializations/incremental.sql
================================================
{% macro get_incremental_tmp_relation_type(strategy, unique_key, language) %}
/* {#
If we are running multiple statements (DELETE + INSERT),
we must first save the model query results as a temporary table
in order to guarantee consistent inputs to both statements.
If we are running a single statement (MERGE or INSERT alone),
we can save the model query definition as a view instead,
for faster overall incremental processing.
#} */
{%- set views_enabled = config.get('views_enabled', true) -%}
{% if language == 'sql' and (views_enabled and (strategy in ('default', 'append', 'merge') or (unique_key is none))) %}
{{ return('view') }}
{% else %} {#-- play it safe -- #}
{{ return('table') }}
{% endif %}
{% endmacro %}
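{# /*
  Under the rules above, the temp relation type works out roughly as follows
  (assuming `views_enabled` is left at its default of true; strategy/key
  combinations shown are illustrative):

  strategy='merge',  unique_key='id'           -> 'view'  (single MERGE statement)
  strategy='append', any unique_key            -> 'view'  (single INSERT statement)
  strategy='delete+insert', unique_key='id'    -> 'table' (DELETE + INSERT must see
                                                           identical inputs)
  language other than 'sql', or views_enabled=false -> 'table' (play it safe)
#} */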
{% materialization incremental, adapter='trino', supported_languages=['sql'] -%}
{#-- configs --#}
{%- set unique_key = config.get('unique_key') -%}
{%- set full_refresh_mode = (should_full_refresh()) -%}
{%- set on_schema_change = incremental_validate_on_schema_change(config.get('on_schema_change'), default='ignore') -%}
{%- set language = model['language'] -%}
{%- set on_table_exists = config.get('on_table_exists', 'rename') -%}
{% if on_table_exists not in ['rename', 'drop', 'replace'] %}
{%- set log_message = 'Invalid value for on_table_exists (%s) specified. Setting default value (%s).' % (on_table_exists, 'rename') -%}
{% do log(log_message) %}
{%- set on_table_exists = 'rename' -%}
{% endif %}
{#-- Get the incremental_strategy and the macro to use for the strategy --#}
{% set incremental_strategy = config.get('incremental_strategy') or 'default' %}
{% set incremental_predicates = config.get('predicates', none) or config.get('incremental_predicates', none) %}
{% set strategy_sql_macro_func = adapter.get_incremental_strategy_macro(context, incremental_strategy) %}
{#-- relations --#}
{%- set existing_relation = load_cached_relation(this) -%}
{%- set target_relation = this.incorporate(type='table') -%}
{#-- The temp relation will be a view (faster) or temp table, depending on upsert/merge strategy --#}
{%- set tmp_relation_type = get_incremental_tmp_relation_type(incremental_strategy, unique_key, language) -%}
{%- set tmp_relation = make_temp_relation(this).incorporate(type=tmp_relation_type) -%}
{%- set intermediate_relation = make_intermediate_relation(target_relation) -%}
{%- set backup_relation_type = 'table' if existing_relation is none else existing_relation.type -%}
{%- set backup_relation = make_backup_relation(target_relation, backup_relation_type) -%}
{#-- the temp_ and backup_ relation should not already exist in the database; get_relation
-- will return None in that case. Otherwise, we get a relation that we can drop
-- later, before we try to use this name for the current operation.#}
{%- set preexisting_tmp_relation = load_cached_relation(tmp_relation)-%}
{%- set preexisting_intermediate_relation = load_cached_relation(intermediate_relation)-%}
{%- set preexisting_backup_relation = load_cached_relation(backup_relation) -%}
  {#-- grab the current table's grants config for comparison later on #}
{% set grant_config = config.get('grants') %}
-- drop the temp relations if they exist already in the database
{{ drop_relation_if_exists(preexisting_tmp_relation) }}
{{ drop_relation_if_exists(preexisting_intermediate_relation) }}
{{ drop_relation_if_exists(preexisting_backup_relation) }}
{{ run_hooks(pre_hooks) }}
{% if existing_relation is none %}
{%- call statement('main', language=language) -%}
{{ create_table_as(False, target_relation, compiled_code, language) }}
{%- endcall -%}
{% elif existing_relation.is_view %}
{#-- Can't overwrite a view with a table - we must drop --#}
{{ log("Dropping relation " ~ target_relation ~ " because it is a view and this model is a table.") }}
{% do adapter.drop_relation(existing_relation) %}
{%- call statement('main', language=language) -%}
{{ create_table_as(False, target_relation, compiled_code, language) }}
{%- endcall -%}
{% elif full_refresh_mode %}
{#-- Create table with given `on_table_exists` mode #}
{% do on_table_exists_logic(on_table_exists, existing_relation, intermediate_relation, backup_relation, target_relation) %}
{% else %}
{#-- Create the temp relation, either as a view or as a temp table --#}
{% if tmp_relation_type == 'view' %}
{%- call statement('create_tmp_relation') -%}
{{ create_view_as(tmp_relation, compiled_code) }}
{%- endcall -%}
{% else %}
{%- call statement('create_tmp_relation', language=language) -%}
{{ create_table_as(True, tmp_relation, compiled_code, language) }}
{%- endcall -%}
{% endif %}
{% do adapter.expand_target_column_types(
from_relation=tmp_relation,
to_relation=target_relation) %}
{#-- Process schema changes. Returns dict of changes if successful. Use source columns for upserting/merging --#}
{% set dest_columns = process_schema_changes(on_schema_change, tmp_relation, existing_relation) %}
{% if not dest_columns %}
{% set dest_columns = adapter.get_columns_in_relation(existing_relation) %}
{% endif %}
{#-- Build the sql --#}
{% set strategy_arg_dict = ({'target_relation': target_relation, 'temp_relation': tmp_relation, 'unique_key': unique_key, 'dest_columns': dest_columns, 'incremental_predicates': incremental_predicates }) %}
{%- call statement('main') -%}
{{ strategy_sql_macro_func(strategy_arg_dict) }}
{%- endcall -%}
{% endif %}
{% do drop_relation_if_exists(tmp_relation) %}
{{ run_hooks(post_hooks) }}
{% set should_revoke =
should_revoke(existing_relation.is_table, full_refresh_mode) %}
{% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}
{% do persist_docs(target_relation, model) %}
{{ return({'relations': [target_relation]}) }}
{%- endmaterialization %}
{% macro trino__get_delete_insert_merge_sql(target, source, unique_key, dest_columns, incremental_predicates) -%}
{%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute="name")) -%}
{% if unique_key %}
{% if unique_key is sequence and unique_key is not string %}
delete from {{ target }}
where exists (
select 1
from {{ source }}
where
{% for key in unique_key %}
{{ target }}.{{ key }} = {{ source }}.{{ key }}
{{ "and " if not loop.last }}
{% endfor %}
)
{% if incremental_predicates %}
{% for predicate in incremental_predicates %}
and {{ predicate }}
{% endfor %}
{% endif %}
;
{% else %}
delete from {{ target }}
where (
{{ unique_key }}) in (
select {{ unique_key }}
from {{ source }}
)
{%- if incremental_predicates %}
{% for predicate in incremental_predicates %}
and {{ predicate }}
{% endfor %}
{%- endif -%};
{% endif %}
{% endif %}
insert into {{ target }} ({{ dest_cols_csv }})
(
select {{ dest_cols_csv }}
from {{ source }}
)
{%- endmacro %}
{% macro trino__get_merge_sql(target, source, unique_key, dest_columns, incremental_predicates) -%}
{%- set predicates = [] if incremental_predicates is none else [] + incremental_predicates -%}
{%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute="name")) -%}
{%- set dest_cols_csv_source = dest_cols_csv.split(', ') -%}
{%- set merge_update_columns = config.get('merge_update_columns') -%}
{%- set merge_exclude_columns = config.get('merge_exclude_columns') -%}
{%- set update_columns = get_merge_update_columns(merge_update_columns, merge_exclude_columns, dest_columns) -%}
{%- set sql_header = config.get('sql_header', none) -%}
{% if unique_key %}
{% if unique_key is sequence and unique_key is not mapping and unique_key is not string %}
{% for key in unique_key %}
{% set this_key_match %}
DBT_INTERNAL_SOURCE.{{ key }} = DBT_INTERNAL_DEST.{{ key }}
{% endset %}
{% do predicates.append(this_key_match) %}
{% endfor %}
{% else %}
{% set unique_key_match %}
DBT_INTERNAL_SOURCE.{{ unique_key }} = DBT_INTERNAL_DEST.{{ unique_key }}
{% endset %}
{% do predicates.append(unique_key_match) %}
{% endif %}
{{ sql_header if sql_header is not none }}
merge into {{ target }} as DBT_INTERNAL_DEST
using {{ source }} as DBT_INTERNAL_SOURCE
on {{"(" ~ predicates | join(") and (") ~ ")"}}
{% if unique_key %}
when matched then update set
{% for column_name in update_columns -%}
{{ column_name }} = DBT_INTERNAL_SOURCE.{{ column_name }}
{%- if not loop.last %}, {%- endif %}
{%- endfor %}
{% endif %}
when not matched then insert
({{ dest_cols_csv }})
values
({% for dest_cols in dest_cols_csv_source -%}
DBT_INTERNAL_SOURCE.{{ dest_cols }}
{%- if not loop.last %}, {% endif %}
{%- endfor %})
{% else %}
insert into {{ target }} ({{ dest_cols_csv }})
(
select {{ dest_cols_csv }}
from {{ source }}
)
{% endif %}
{% endmacro %}
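{#
  Sketch of the MERGE rendered by the macro above for unique_key='id'
  (relation and column names are hypothetical; exact quoting depends on
  get_quoted_csv and get_merge_update_columns):

  merge into "memory"."default"."target" as DBT_INTERNAL_DEST
  using "memory"."default"."target__dbt_tmp" as DBT_INTERNAL_SOURCE
  on (DBT_INTERNAL_SOURCE.id = DBT_INTERNAL_DEST.id)
  when matched then update set
      "value" = DBT_INTERNAL_SOURCE."value"
  when not matched then insert ("id", "value")
  values (DBT_INTERNAL_SOURCE."id", DBT_INTERNAL_SOURCE."value")
#}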
{% macro trino__get_incremental_microbatch_sql(arg_dict) %}
{%- set target = arg_dict["target_relation"] -%}
{%- set source = arg_dict["temp_relation"] -%}
{%- set dest_columns = arg_dict["dest_columns"] -%}
{%- set incremental_predicates = [] if arg_dict.get('incremental_predicates') is none else arg_dict.get('incremental_predicates') -%}
{#-- Add additional incremental_predicates to filter for batch --#}
{% if model.config.get("__dbt_internal_microbatch_event_time_start") -%}
{% do incremental_predicates.append(model.config.event_time ~ " >= TIMESTAMP '" ~ model.config.__dbt_internal_microbatch_event_time_start ~ "'") %}
{% endif %}
{% if model.config.get("__dbt_internal_microbatch_event_time_end") -%}
{% do incremental_predicates.append(model.config.event_time ~ " < TIMESTAMP '" ~ model.config.__dbt_internal_microbatch_event_time_end ~ "'") %}
{% endif %}
{% do arg_dict.update({'incremental_predicates': incremental_predicates}) %}
delete from {{ target }}
where (
{% for predicate in incremental_predicates %}
{%- if not loop.first %}and {% endif -%} {{ predicate }}
{% endfor %}
);
{%- set dest_cols_csv = get_quoted_csv(dest_columns | map(attribute="name")) -%}
insert into {{ target }} ({{ dest_cols_csv }})
(
select {{ dest_cols_csv }}
from {{ source }}
)
{% endmacro %}
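{#
  The event-time predicates appended above make the DELETE idempotent over the
  batch window. Hypothetical render, assuming `event_time: created_at` and a
  one-day batch (names and timestamps are illustrative):

  delete from "memory"."default"."events"
  where (
      created_at >= TIMESTAMP '2024-01-01 00:00:00'
      and created_at < TIMESTAMP '2024-01-02 00:00:00'
  );
  insert into "memory"."default"."events" ("id", "created_at")
  (
      select "id", "created_at"
      from "memory"."default"."events__dbt_tmp"
  )
#}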
================================================
FILE: dbt/include/trino/macros/materializations/materialized_view.sql
================================================
{%- macro trino__get_create_materialized_view_as_sql(target_relation, sql) -%}
create materialized view {{ target_relation }}
{%- set grace_period = config.get('grace_period') %}
{%- if grace_period is not none %}
grace period {{ grace_period }}
{%- endif %}
{{ properties() }}
as
{{ sql }}
;
{%- endmacro -%}
{% macro trino__get_replace_materialized_view_as_sql(relation, sql, existing_relation, backup_relation, intermediate_relation) %}
{{- trino__get_create_materialized_view_as_sql(intermediate_relation, sql) }}
{% if existing_relation is not none %}
{{ log("Found a " ~ existing_relation.type ~ " with same name. Will drop it", info=true) }}
alter {{ existing_relation.type|replace("_", " ") }} {{ existing_relation }} rename to {{ backup_relation }};
{% endif %}
alter materialized view {{ intermediate_relation }} rename to {{ relation }};
{% endmacro %}
{#-- Applying materialized view configuration changes via alter is not supported. --#}
{#-- Return None, so `refresh_materialized_view` macro is invoked even --#}
{#-- if materialized view configuration changes are made. --#}
{#-- After configuration change, full refresh needs to be performed on mv. --#}
{% macro trino__get_materialized_view_configuration_changes(existing_relation, new_config) %}
{% do return(None) %}
{% endmacro %}
{%- macro trino__refresh_materialized_view(relation) -%}
refresh materialized view {{ relation }}
{%- endmacro -%}
================================================
FILE: dbt/include/trino/macros/materializations/seeds/helpers.sql
================================================
{% macro trino__get_batch_size() %}
{{ return(1000) }}
{% endmacro %}
{% macro create_bindings(row, types) %}
{% set values = [] %}
{% set re = modules.re %}
{%- for item in row -%}
{%- set type = types[loop.index0] -%}
{%- set match_type = re.match("(\w+)(\(.*\))?", type) -%}
{%- if item is not none and item is string and 'interval' in match_type.group(1) -%}
{%- do values.append((none, match_type.group(1).upper() ~ " " ~ item)) -%}
{%- elif item is not none and item is string and 'varchar' not in type.lower() -%}
{%- do values.append((none, match_type.group(1).upper() ~ " '" ~ item ~ "'")) -%}
{%- elif item is not none and 'varchar' in type.lower() -%}
{%- do values.append((get_binding_char(), item|string())) -%}
{%- else -%}
{%- do values.append((get_binding_char(), item)) -%}
{% endif -%}
{%- endfor -%}
{{ return(values) }}
{% endmacro %}
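{#
  create_bindings mixes inline typed literals with prepared-statement bindings.
  Hypothetical walk-through for one seed row (values and types made up):

  row   = (5, '2023-01-01', 'abc')
  types = ('integer', 'date', 'varchar(10)')

  returned tuples, per item:
    (binding_char, 5)            -- non-string value, bound as a parameter
    (none, "DATE '2023-01-01'")  -- non-varchar string, inlined as a typed literal
    (binding_char, 'abc')        -- varchar value, bound as a parameter
#}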
{#
We need to override the default__load_csv_rows macro as Trino requires values to be typed according to the column type
as in following example:
create table "memory"."default"."string_type" ("varchar_example" varchar,"varchar_n_example" varchar(10),"char_example" char,"char_n_example" char(10),"varbinary_example" varbinary,"json_example" json)
insert into "memory"."default"."string_type" ("varchar_example", "varchar_n_example", "char_example", "char_n_example", "varbinary_example", "json_example") values
('test','abc',CHAR 'd',CHAR 'ghi',VARBINARY '65683F',JSON '{"k1":1,"k2":23,"k3":456}'),(NULL,NULL,NULL,NULL,NULL,NULL)
Usually a seed row's values come through as Python types via agate_table's data type detection; in that case typing is
handled by bindings in `ConnectionWrapper.execute`. However, dbt also allows overriding the data types of the created table
by setting `column_types`; that case is handled here, where we have the type information of the seed table.
#}
{% macro trino__load_csv_rows(model, agate_table) %}
{% set column_override = model['config'].get('column_types', {}) %}
{% set types = [] %}
{%- for col_name in agate_table.column_names -%}
{%- set inferred_type = adapter.convert_type(agate_table, loop.index0) -%}
{%- set type = column_override.get(col_name, inferred_type) -%}
{%- do types.append(type) -%}
{%- endfor -%}
{% set batch_size = get_batch_size() %}
{% set cols_sql = get_seed_column_quoted_csv(model, agate_table.column_names) %}
{% set bindings = [] %}
{% set statements = [] %}
{% for chunk in agate_table.rows | batch(batch_size) %}
{% set bindings = [] %}
{% set sql %}
insert into {{ this.render() }} ({{ cols_sql }}) values
{% for row in chunk -%}
({%- for tuple in create_bindings(row, types) -%}
{%- if tuple.0 is not none -%}
{{ tuple.0 }}
{%- do bindings.append(tuple.1) -%}
{%- else -%}
{{ tuple.1 }}
{%- endif -%}
{%- if not loop.last%},{%- endif %}
{%- endfor -%})
{%- if not loop.last%},{%- endif %}
{%- endfor %}
{% endset %}
{% do adapter.add_query(sql, bindings=bindings, abridge_sql_log=True) %}
{% if loop.index0 == 0 %}
{% do statements.append(sql) %}
{% endif %}
{% endfor %}
{# Return SQL so we can render it out into the compiled files #}
{{ return(statements[0]) }}
{% endmacro %}
================================================
FILE: dbt/include/trino/macros/materializations/snapshot.sql
================================================
{% materialization snapshot, adapter='trino' %}
{% if config.get('properties') %}
{% if config.get('properties').get('location') %}
{%- do exceptions.raise_compiler_error("Specifying 'location' property in snapshots is not supported.") -%}
{% endif %}
{% endif %}
{{ return(materialization_snapshot_default()) }}
{% endmaterialization %}
{% macro trino__snapshot_hash_arguments(args) -%}
lower(to_hex(md5(to_utf8(concat({%- for arg in args -%}
coalesce(cast({{ arg }} as varchar), ''){% if not loop.last %}, '|',{% endif -%}
{%- endfor -%}
)))))
{%- endmacro %}
{% macro trino__post_snapshot(staging_relation) %}
-- Clean up the snapshot temp table
{% do drop_relation(staging_relation) %}
{% endmacro %}
{% macro trino__snapshot_merge_sql(target, source, insert_cols) -%}
{%- set insert_cols_csv = insert_cols | join(', ') -%}
{%- set columns = config.get("snapshot_table_column_names") or get_snapshot_table_column_names() -%}
merge into {{ target.render() }} as DBT_INTERNAL_DEST
using {{ source }} as DBT_INTERNAL_SOURCE
on DBT_INTERNAL_SOURCE.{{ columns.dbt_scd_id }} = DBT_INTERNAL_DEST.{{ columns.dbt_scd_id }}
when matched
{% if config.get("dbt_valid_to_current") %}
and (DBT_INTERNAL_DEST.{{ columns.dbt_valid_to }} = {{ config.get('dbt_valid_to_current') }} or
DBT_INTERNAL_DEST.{{ columns.dbt_valid_to }} is null)
{% else %}
and DBT_INTERNAL_DEST.{{ columns.dbt_valid_to }} is null
{% endif %}
and DBT_INTERNAL_SOURCE.dbt_change_type in ('update', 'delete')
then update
set {{ columns.dbt_valid_to }} = DBT_INTERNAL_SOURCE.{{ columns.dbt_valid_to }}
when not matched
and DBT_INTERNAL_SOURCE.dbt_change_type = 'insert'
then insert ({{ insert_cols_csv }})
values ({% for insert_col in insert_cols -%}
DBT_INTERNAL_SOURCE.{{ insert_col }}
{%- if not loop.last %}, {% endif %}
{%- endfor %})
{% endmacro %}
================================================
FILE: dbt/include/trino/macros/materializations/table.sql
================================================
{% materialization table, adapter = 'trino' %}
{%- set on_table_exists = config.get('on_table_exists', 'rename') -%}
{% if on_table_exists not in ['rename', 'drop', 'replace', 'skip'] %}
{%- set log_message = 'Invalid value for on_table_exists (%s) specified. Setting default value (%s).' % (on_table_exists, 'rename') -%}
{% do log(log_message) %}
{%- set on_table_exists = 'rename' -%}
{% endif %}
{%- set existing_relation = load_cached_relation(this) -%}
{%- set target_relation = this.incorporate(type='table') %}
{% if on_table_exists == 'rename' %}
{%- set intermediate_relation = make_intermediate_relation(target_relation) -%}
-- the intermediate_relation should not already exist in the database; get_relation
-- will return None in that case. Otherwise, we get a relation that we can drop
-- later, before we try to use this name for the current operation
{%- set preexisting_intermediate_relation = load_cached_relation(intermediate_relation) -%}
{%- set backup_relation_type = 'table' if existing_relation is none else existing_relation.type -%}
{%- set backup_relation = make_backup_relation(target_relation, backup_relation_type) -%}
-- as above, the backup_relation should not already exist
{%- set preexisting_backup_relation = load_cached_relation(backup_relation) -%}
-- drop the temp relations if they exist already in the database
{{ drop_relation_if_exists(preexisting_intermediate_relation) }}
{{ drop_relation_if_exists(preexisting_backup_relation) }}
{% endif %}
{{ run_hooks(pre_hooks) }}
  -- grab the current table's grants config for comparison later on
{% set grant_config = config.get('grants') %}
{#-- Create table with given `on_table_exists` mode #}
{% do on_table_exists_logic(on_table_exists, existing_relation, intermediate_relation, backup_relation, target_relation) %}
{% do persist_docs(target_relation, model) %}
{% set should_revoke = should_revoke(existing_relation, full_refresh_mode=True) %}
{% do apply_grants(target_relation, grant_config, should_revoke=should_revoke) %}
{{ run_hooks(post_hooks) }}
{{ return({'relations': [target_relation]}) }}
{% endmaterialization %}
{% macro on_table_exists_logic(on_table_exists, existing_relation, intermediate_relation, backup_relation, target_relation) -%}
{#-- Create table with given `on_table_exists` mode #}
{% if on_table_exists == 'rename' %}
    {#-- table does not exist #}
{% if existing_relation is none %}
{% call statement('main') -%}
{{ create_table_as(False, target_relation, sql) }}
{%- endcall %}
    {#-- table already exists #}
    {% else %}
      {#-- build model #}
{% call statement('main') -%}
{{ create_table_as(False, intermediate_relation, sql) }}
{%- endcall %}
{#-- cleanup #}
{{ adapter.rename_relation(existing_relation, backup_relation) }}
{{ adapter.rename_relation(intermediate_relation, target_relation) }}
{#-- finally, drop the existing/backup relation after the commit #}
{{ drop_relation_if_exists(backup_relation) }}
{% endif %}
{% elif on_table_exists == 'drop' %}
{#-- cleanup #}
{%- if existing_relation is not none -%}
{{ adapter.drop_relation(existing_relation) }}
{%- endif -%}
{#-- build model #}
{% call statement('main') -%}
{{ create_table_as(False, target_relation, sql) }}
{%- endcall %}
{% elif on_table_exists == 'replace' %}
{#-- build model #}
{% call statement('main') -%}
{{ create_table_as(False, target_relation, sql, 'replace') }}
{%- endcall %}
{% elif on_table_exists == 'skip' %}
{#-- build model #}
{% call statement('main') -%}
{{ create_table_as(False, target_relation, sql, 'skip') }}
{%- endcall %}
{% endif %}
{% endmacro %}
================================================
FILE: dbt/include/trino/macros/materializations/view.sql
================================================
{% materialization view, adapter='trino' -%}
{% set to_return = create_or_replace_view() %}
{% set target_relation = this.incorporate(type='view') %}
{% do persist_docs(target_relation, model) %}
{% do return(to_return) %}
{%- endmaterialization %}
================================================
FILE: dbt/include/trino/macros/utils/any_value.sql
================================================
{% macro trino__any_value(expression) -%}
min({{ expression }})
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/array_append.sql
================================================
{% macro trino__array_append(array, new_element) -%}
{{ array_concat(array, array_construct([new_element])) }}
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/array_concat.sql
================================================
{% macro trino__array_concat(array_1, array_2) -%}
concat({{ array_1 }}, {{ array_2 }})
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/array_construct.sql
================================================
{% macro trino__array_construct(inputs, data_type) -%}
{%- if not inputs -%}
null
{%- else -%}
array[ {{ inputs|join(' , ') }} ]
{%- endif -%}
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/bool_or.sql
================================================
{% macro trino__bool_or(expression) -%}
bool_or({{ expression }})
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/datatypes.sql
================================================
{% macro trino__type_float() -%}
double
{%- endmacro %}
{% macro trino__type_string() -%}
varchar
{%- endmacro %}
{% macro trino__type_numeric() -%}
decimal(28, 6)
{%- endmacro %}
{%- macro trino__type_int() -%}
integer
{%- endmacro -%}
================================================
FILE: dbt/include/trino/macros/utils/date_spine.sql
================================================
{% macro trino__date_spine(datepart, start_date, end_date) %}
{# call as follows:
date_spine(
"day",
"to_date('01/01/2016', 'mm/dd/yyyy')",
"dbt.dateadd(week, 1, current_date)"
) #}
with rawdata as (
{{dbt.generate_series(
dbt.get_intervals_between(start_date, end_date, datepart)
)}}
),
all_periods as (
select (
{{
dbt.dateadd(
datepart,
"row_number() over (order by 1) - 1",
"cast(" ~ start_date ~ " as date)"
)
}}
) as date_{{datepart}}
from rawdata
),
filtered as (
select *
from all_periods
where date_{{datepart}} <= cast({{ end_date }} as date)
)
select * from filtered
{% endmacro %}
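The three CTEs above can be mirrored in Python as a sanity check, assuming dbt's `generate_series`/`get_intervals_between` semantics (one row per interval between the dates, so the spine excludes the end date). This is an illustrative analogue for the `day` datepart, not the macro itself:

```python
from datetime import date, timedelta

def day_spine(start_date: date, end_date: date) -> list:
    # rawdata: one row per interval between start and end (like generate_series).
    # all_periods: offset each (row_number - 1) from the start date.
    # filtered: keep only dates up to and including the end date.
    n_intervals = (end_date - start_date).days
    periods = [start_date + timedelta(days=i) for i in range(n_intervals)]
    return [d for d in periods if d <= end_date]

print(day_spine(date(2016, 1, 1), date(2016, 1, 3)))
# → [datetime.date(2016, 1, 1), datetime.date(2016, 1, 2)]
```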
================================================
FILE: dbt/include/trino/macros/utils/date_trunc.sql
================================================
{% macro trino__date_trunc(datepart, date) -%}
date_trunc('{{datepart}}', {{date}})
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/dateadd.sql
================================================
{% macro trino__dateadd(datepart, interval, from_date_or_timestamp) -%}
date_add('{{ datepart }}', {{ interval }}, {{ from_date_or_timestamp }})
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/datediff.sql
================================================
{% macro trino__datediff(first_date, second_date, datepart) -%}
{%- if datepart == 'year' -%}
(year(CAST({{ second_date }} AS TIMESTAMP)) - year(CAST({{ first_date }} AS TIMESTAMP)))
{%- elif datepart == 'quarter' -%}
({{ datediff(first_date, second_date, 'year') }} * 4) + quarter(CAST({{ second_date }} AS TIMESTAMP)) - quarter(CAST({{ first_date }} AS TIMESTAMP))
{%- elif datepart == 'month' -%}
({{ datediff(first_date, second_date, 'year') }} * 12) + month(CAST({{ second_date }} AS TIMESTAMP)) - month(CAST({{ first_date }} AS TIMESTAMP))
{%- elif datepart == 'day' -%}
((to_milliseconds((CAST(CAST({{ second_date }} AS TIMESTAMP) AS DATE) - CAST(CAST({{ first_date }} AS TIMESTAMP) AS DATE)))) / 86400000)
{%- elif datepart == 'week' -%}
({{ datediff(first_date, second_date, 'day') }} / 7 + case
when dow(CAST({{first_date}} AS TIMESTAMP)) <= dow(CAST({{second_date}} AS TIMESTAMP)) then
case when {{first_date}} <= {{second_date}} then 0 else -1 end
else
case when {{first_date}} <= {{second_date}} then 1 else 0 end
end)
{%- elif datepart == 'hour' -%}
({{ datediff(first_date, second_date, 'day') }} * 24 + hour(CAST({{ second_date }} AS TIMESTAMP)) - hour(CAST({{ first_date }} AS TIMESTAMP)))
{%- elif datepart == 'minute' -%}
({{ datediff(first_date, second_date, 'hour') }} * 60 + minute(CAST({{ second_date }} AS TIMESTAMP)) - minute(CAST({{ first_date }} AS TIMESTAMP)))
{%- elif datepart == 'second' -%}
({{ datediff(first_date, second_date, 'minute') }} * 60 + second(CAST({{ second_date }} AS TIMESTAMP)) - second(CAST({{ first_date }} AS TIMESTAMP)))
{%- elif datepart == 'millisecond' -%}
(to_milliseconds((CAST({{ second_date }} AS TIMESTAMP) - CAST({{ first_date }} AS TIMESTAMP))))
{%- else -%}
{% if execute %}{{ exceptions.raise_compiler_error("Unsupported datepart for macro datediff in Trino: {!r}".format(datepart)) }}{% endif %}
{%- endif -%}
{%- endmacro %}
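The cascading branches above build larger dateparts from smaller ones; the month branch, for instance, is the year difference times 12 plus the raw month delta. A Python analogue of that arithmetic for a quick sanity check (illustrative, not the macro itself):

```python
from datetime import datetime

def datediff_month(first: datetime, second: datetime) -> int:
    # Mirrors the macro's month branch: year difference * 12 plus month difference.
    year_diff = second.year - first.year
    return year_diff * 12 + (second.month - first.month)

print(datediff_month(datetime(2023, 11, 30), datetime(2024, 2, 1)))  # → 3
```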
================================================
FILE: dbt/include/trino/macros/utils/hash.sql
================================================
{% macro trino__hash(field) -%}
lower(to_hex(md5(to_utf8(cast({{field}} as varchar)))))
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/listagg.sql
================================================
{% macro trino__listagg(measure, delimiter_text, order_by_clause, limit_num) -%}
{% set collect_list %} array_agg({{ measure }} {% if order_by_clause -%}{{ order_by_clause }}{%- endif %}) {% endset %}
{% set limited %} slice({{ collect_list }}, 1, {{ limit_num }}) {% endset %}
{% set collected = limited if limit_num else collect_list %}
{% set final %} array_join({{ collected }}, {{ delimiter_text }}) {% endset %}
{% do return(final) %}
{%- endmacro %}
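The macro composes `array_agg`, an optional `slice`, and `array_join` into a single expression; a Python analogue of that pipeline (illustrative):

```python
def listagg(values, delimiter, limit=None):
    # array_agg → collect; slice(..., 1, limit) → optional truncation;
    # array_join → join with the delimiter.
    collected = values[:limit] if limit else values
    return delimiter.join(str(v) for v in collected)

print(listagg(["a", "b", "c"], ", ", limit=2))  # → a, b
```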
================================================
FILE: dbt/include/trino/macros/utils/right.sql
================================================
{% macro trino__right(string_text, length_expression) %}
case when {{ length_expression }} = 0
then ''
else
substr({{ string_text }}, -1 * ({{ length_expression }}))
end
{%- endmacro -%}
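The `case` expression above handles zero length explicitly, and otherwise takes the string's tail via `substr` with a negative start; a Python analogue (illustrative):

```python
def right(string_text: str, length: int) -> str:
    # Mirrors the macro: zero length yields an empty string; otherwise take
    # the last `length` characters (substr with a negative start in Trino).
    return "" if length == 0 else string_text[-length:]

print(right("trino", 3))  # → ino
```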
================================================
FILE: dbt/include/trino/macros/utils/safe_cast.sql
================================================
{% macro trino__safe_cast(field, type) -%}
try_cast({{field}} as {{type}})
{%- endmacro %}
================================================
FILE: dbt/include/trino/macros/utils/split_part.sql
================================================
{% macro trino__split_part(string_text, delimiter_text, part_number) %}
{% if part_number >= 0 %}
{{ dbt.default__split_part(string_text, delimiter_text, part_number) }}
{% else %}
{{ dbt._split_part_negative(string_text, delimiter_text, part_number) }}
{% endif %}
{% endmacro %}
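The dispatch above routes non-negative part numbers to dbt's default implementation and negative ones to a right-counting variant; a Python analogue of the intended 1-based semantics (illustrative, not dbt's implementation):

```python
def split_part(string_text: str, delimiter_text: str, part_number: int) -> str:
    # Positive part numbers count from the left (1-based); negative count from the right.
    parts = string_text.split(delimiter_text)
    index = part_number - 1 if part_number > 0 else part_number
    return parts[index]

print(split_part("a,b,c", ",", 2))   # → b
print(split_part("a,b,c", ",", -1))  # → c
```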
================================================
FILE: dbt/include/trino/macros/utils/timestamps.sql
================================================
{% macro trino__current_timestamp() -%}
current_timestamp
{%- endmacro %}
{% macro trino__snapshot_string_as_time(timestamp) %}
{%- set result = "timestamp '" ~ timestamp ~ "'" -%}
{{ return(result) }}
{% endmacro %}
================================================
FILE: dbt/include/trino/sample_profiles.yml
================================================
default:
outputs:
dev:
type: trino
method: none # optional, one of {none | ldap | kerberos}
user: [dev_user]
password: [password] # required if method is ldap or kerberos
database: [database name]
host: [hostname]
port: [port number]
schema: [dev_schema]
threads: [1 or more]
prod:
type: trino
method: none # optional, one of {none | ldap | kerberos}
user: [prod_user]
password: [prod_password] # required if method is ldap or kerberos
database: [database name]
host: [hostname]
port: [port number]
schema: [prod_schema]
threads: [1 or more]
target: dev
================================================
FILE: dev_requirements.txt
================================================
dbt-tests-adapter~=1.19.1
mypy==1.19.1 # patch updates have historically introduced breaking changes
pre-commit~=4.3
pytest~=8.4
tox~=4.30
================================================
FILE: docker/init_starburst.bash
================================================
#!/bin/bash
# move to wherever we are so docker things work
cd "$(dirname "${BASH_SOURCE[0]}")"
cd ..
set -exo pipefail
docker compose -f docker-compose-starburst.yml build
docker compose -f docker-compose-starburst.yml up -d --quiet-pull
timeout 5m bash -c -- 'while ! docker compose -f docker-compose-starburst.yml logs trino 2>&1 | tail -n 1 | grep "SERVER STARTED"; do sleep 2; done'
================================================
FILE: docker/init_trino.bash
================================================
#!/bin/bash
# move to wherever we are so docker things work
cd "$(dirname "${BASH_SOURCE[0]}")"
cd ..
set -exo pipefail
docker compose -f docker-compose-trino.yml build
docker compose -f docker-compose-trino.yml up -d --quiet-pull
timeout 5m bash -c -- 'while ! docker compose -f docker-compose-trino.yml logs trino 2>&1 | tail -n 1 | grep "SERVER STARTED"; do sleep 2; done'
================================================
FILE: docker/remove_starburst.bash
================================================
#!/bin/bash
# move to wherever we are so docker things work
cd "$(dirname "${BASH_SOURCE[0]}")"
cd ..
docker compose -f docker-compose-starburst.yml down
================================================
FILE: docker/remove_trino.bash
================================================
#!/bin/bash
# move to wherever we are so docker things work
cd "$(dirname "${BASH_SOURCE[0]}")"
cd ..
docker compose -f docker-compose-trino.yml down
================================================
FILE: docker/starburst/catalog/delta.properties
================================================
connector.name=delta-lake
delta.enable-non-concurrent-writes=true
fs.native-s3.enabled=true
s3.region=us-east-1
s3.endpoint=http://minio:9000
s3.path-style-access=true
hive.metastore.uri=thrift://hive-metastore:9083
s3.aws-access-key=minio
s3.aws-secret-key=minio123
hive.metastore-cache-ttl=0s
hive.metastore-refresh-interval=5s
delta.security=allow-all
================================================
FILE: docker/starburst/catalog/hive.properties
================================================
connector.name=hive
hive.metastore.uri=thrift://hive-metastore:9083
fs.native-s3.enabled=true
s3.region=us-east-1
s3.endpoint=http://minio:9000
s3.path-style-access=true
s3.aws-access-key=minio
s3.aws-secret-key=minio123
hive.metastore-cache-ttl=0s
hive.metastore-refresh-interval=5s
hive.security=sql-standard
================================================
FILE: docker/starburst/catalog/iceberg.properties
================================================
connector.name=iceberg
hive.metastore.uri=thrift://hive-metastore:9083
fs.native-s3.enabled=true
s3.region=us-east-1
s3.endpoint=http://minio:9000
s3.path-style-access=true
s3.aws-access-key=minio
s3.aws-secret-key=minio123
hive.metastore-cache-ttl=0s
hive.metastore-refresh-interval=5s
iceberg.unique-table-location=true
================================================
FILE: docker/starburst/catalog/memory.properties
================================================
connector.name=memory
memory.max-data-per-node=128MB
================================================
FILE: docker/starburst/catalog/postgresql.properties
================================================
connector.name=postgresql
connection-url=jdbc:postgresql://postgres:5432/dbt-trino
connection-user=dbt-trino
connection-password=dbt-trino
================================================
FILE: docker/starburst/catalog/tpch.properties
================================================
connector.name=tpch
================================================
FILE: docker/starburst/etc/config.properties
================================================
coordinator=true
node-scheduler.include-coordinator=true
http-server.http.port=8080
discovery.uri=http://localhost:8080
================================================
FILE: docker/starburst/etc/jvm.config
================================================
-server
-XX:InitialRAMPercentage=80
-XX:MaxRAMPercentage=80
-XX:G1HeapRegionSize=32M
-XX:+ExplicitGCInvokesConcurrent
-XX:+HeapDumpOnOutOfMemoryError
-XX:+ExitOnOutOfMemoryError
-XX:-OmitStackTraceInFastThrow
-XX:ReservedCodeCacheSize=256M
-XX:PerMethodRecompilationCutoff=10000
-XX:PerBytecodeRecompilationCutoff=10000
-Djdk.attach.allowAttachSelf=true
-Djdk.nio.maxCachedBufferSize=2000000
================================================
FILE: docker/starburst/etc/node.properties
================================================
node.environment=docker
node.data-dir=/data/starburst
================================================
FILE: docker/trino/catalog/delta.properties
================================================
connector.name=delta-lake
delta.enable-non-concurrent-writes=true
fs.native-s3.enabled=true
s3.region=us-east-1
s3.endpoint=http://minio:9000
s3.path-style-access=true
hive.metastore.uri=thrift://hive-metastore:9083
s3.aws-access-key=minio
s3.aws-secret-key=minio123
hive.metastore-cache-ttl=0s
hive.metastore-refresh-interval=5s
================================================
FILE: docker/trino/catalog/hive.properties
================================================
connector.name=hive
hive.metastore.uri=thrift://hive-metastore:9083
fs.native-s3.enabled=true
s3.region=us-east-1
s3.endpoint=http://minio:9000
s3.path-style-access=true
s3.aws-access-key=minio
s3.aws-secret-key=minio123
hive.metastore-cache-ttl=0s
hive.metastore-refresh-interval=5s
hive.security=sql-standard
================================================
FILE: docker/trino/catalog/iceberg.properties
================================================
connector.name=iceberg
hive.metastore.uri=thrift://hive-metastore:9083
fs.native-s3.enabled=true
s3.region=us-east-1
s3.endpoint=http://minio:9000
s3.path-style-access=true
s3.aws-access-key=minio
s3.aws-secret-key=minio123
hive.metastore-cache-ttl=0s
hive.metastore-refresh-interval=5s
================================================
FILE: docker/trino/catalog/memory.properties
================================================
connector.name=memory
memory.max-data-per-node=128MB
================================================
FILE: docker/trino/catalog/postgresql.properties
================================================
connector.name=postgresql
connection-url=jdbc:postgresql://postgres:5432/dbt-trino
connection-user=dbt-trino
connection-password=dbt-trino
================================================
FILE: docker/trino/catalog/tpch.properties
================================================
connector.name=tpch
================================================
FILE: docker/trino/etc/config.properties
================================================
coordinator=true
node-scheduler.include-coordinator=true
http-server.http.port=8080
discovery.uri=http://localhost:8080
================================================
FILE: docker/trino/etc/jvm.config
================================================
-server
-XX:InitialRAMPercentage=80
-XX:MaxRAMPercentage=80
-XX:G1HeapRegionSize=32M
-XX:+ExplicitGCInvokesConcurrent
-XX:+HeapDumpOnOutOfMemoryError
-XX:+ExitOnOutOfMemoryError
-XX:-OmitStackTraceInFastThrow
-XX:ReservedCodeCacheSize=256M
-XX:PerMethodRecompilationCutoff=10000
-XX:PerBytecodeRecompilationCutoff=10000
-Djdk.attach.allowAttachSelf=true
-Djdk.nio.maxCachedBufferSize=2000000
================================================
FILE: docker/trino/etc/node.properties
================================================
node.environment=docker
node.data-dir=/data/trino
================================================
FILE: docker-compose-starburst.yml
================================================
services:
trino:
ports:
- "8080:8080"
image: "starburstdata/starburst-enterprise:477-e.1"
volumes:
- ./docker/starburst/etc:/etc/starburst
- ./docker/starburst/catalog:/etc/starburst/catalog
environment:
- _JAVA_OPTIONS=-Dfile.encoding=UTF-8
postgres:
ports:
- "5432:5432"
image: postgres:18
environment:
POSTGRES_USER: dbt-trino
POSTGRES_PASSWORD: dbt-trino
metastore_db:
image: postgres:18
hostname: metastore_db
environment:
POSTGRES_USER: hive
POSTGRES_PASSWORD: hive
POSTGRES_DB: metastore
hive-metastore:
hostname: hive-metastore
image: 'starburstdata/hive:3.1.3-e.15'
ports:
- '9083:9083' # Metastore Thrift
environment:
HIVE_METASTORE_DRIVER: org.postgresql.Driver
HIVE_METASTORE_JDBC_URL: jdbc:postgresql://metastore_db:5432/metastore
HIVE_METASTORE_USER: hive
HIVE_METASTORE_PASSWORD: hive
HIVE_METASTORE_WAREHOUSE_DIR: s3://datalake/
S3_ENDPOINT: http://minio:9000
S3_ACCESS_KEY: minio
S3_SECRET_KEY: minio123
S3_PATH_STYLE_ACCESS: "true"
REGION: ""
GOOGLE_CLOUD_KEY_FILE_PATH: ""
AZURE_ADL_CLIENT_ID: ""
AZURE_ADL_CREDENTIAL: ""
AZURE_ADL_REFRESH_URL: ""
AZURE_ABFS_STORAGE_ACCOUNT: ""
AZURE_ABFS_ACCESS_KEY: ""
AZURE_WASB_STORAGE_ACCOUNT: ""
AZURE_ABFS_OAUTH: ""
AZURE_ABFS_OAUTH_TOKEN_PROVIDER: ""
AZURE_ABFS_OAUTH_CLIENT_ID: ""
AZURE_ABFS_OAUTH_SECRET: ""
AZURE_ABFS_OAUTH_ENDPOINT: ""
AZURE_WASB_ACCESS_KEY: ""
HIVE_METASTORE_USERS_IN_ADMIN_ROLE: "admin"
depends_on:
- metastore_db
minio:
hostname: minio
image: 'minio/minio:RELEASE.2025-09-07T16-13-09Z'
container_name: minio
ports:
- '9000:9000'
- '9001:9001'
environment:
MINIO_ACCESS_KEY: minio
MINIO_SECRET_KEY: minio123
command: server /data --console-address ":9001"
# This job will create the "datalake" bucket on Minio
mc-job:
image: 'minio/mc:RELEASE.2025-04-16T18-13-26Z'
entrypoint: |
/bin/bash -c "
sleep 5;
/usr/bin/mc config --quiet host add myminio http://minio:9000 minio minio123;
/usr/bin/mc mb --quiet myminio/datalake
"
depends_on:
- minio
networks:
default:
name: dbt-net
external: true
================================================
FILE: docker-compose-trino.yml
================================================
services:
trino:
ports:
- "8080:8080"
image: "trinodb/trino:478"
volumes:
- ./docker/trino/etc:/usr/lib/trino/etc:ro
- ./docker/trino/catalog:/etc/trino/catalog
postgres:
ports:
- "5432:5432"
image: postgres:18
container_name: postgres
environment:
POSTGRES_USER: dbt-trino
POSTGRES_PASSWORD: dbt-trino
metastore_db:
image: postgres:18
hostname: metastore_db
environment:
POSTGRES_USER: hive
POSTGRES_PASSWORD: hive
POSTGRES_DB: metastore
hive-metastore:
hostname: hive-metastore
image: 'starburstdata/hive:3.1.3-e.15'
ports:
- '9083:9083' # Metastore Thrift
environment:
HIVE_METASTORE_DRIVER: org.postgresql.Driver
HIVE_METASTORE_JDBC_URL: jdbc:postgresql://metastore_db:5432/metastore
HIVE_METASTORE_USER: hive
HIVE_METASTORE_PASSWORD: hive
HIVE_METASTORE_WAREHOUSE_DIR: s3://datalake/
S3_ENDPOINT: http://minio:9000
S3_ACCESS_KEY: minio
S3_SECRET_KEY: minio123
S3_PATH_STYLE_ACCESS: "true"
REGION: ""
GOOGLE_CLOUD_KEY_FILE_PATH: ""
AZURE_ADL_CLIENT_ID: ""
AZURE_ADL_CREDENTIAL: ""
AZURE_ADL_REFRESH_URL: ""
AZURE_ABFS_STORAGE_ACCOUNT: ""
AZURE_ABFS_ACCESS_KEY: ""
AZURE_WASB_STORAGE_ACCOUNT: ""
AZURE_ABFS_OAUTH: ""
AZURE_ABFS_OAUTH_TOKEN_PROVIDER: ""
AZURE_ABFS_OAUTH_CLIENT_ID: ""
AZURE_ABFS_OAUTH_SECRET: ""
AZURE_ABFS_OAUTH_ENDPOINT: ""
AZURE_WASB_ACCESS_KEY: ""
HIVE_METASTORE_USERS_IN_ADMIN_ROLE: "admin"
depends_on:
- metastore_db
minio:
hostname: minio
image: 'minio/minio:RELEASE.2025-09-07T16-13-09Z'
container_name: minio
ports:
- '9000:9000'
- '9001:9001'
environment:
MINIO_ACCESS_KEY: minio
MINIO_SECRET_KEY: minio123
command: server /data --console-address ":9001"
# This job will create the "datalake" bucket on Minio
mc-job:
image: 'minio/mc:RELEASE.2025-04-16T18-13-26Z'
entrypoint: |
/bin/bash -c "
sleep 5;
/usr/bin/mc config --quiet host add myminio http://minio:9000 minio minio123;
/usr/bin/mc mb --quiet myminio/datalake
"
depends_on:
- minio
networks:
default:
name: dbt-net
external: true
================================================
FILE: mypy.ini
================================================
[mypy]
namespace_packages = True
explicit_package_bases = True
================================================
FILE: pytest.ini
================================================
[pytest]
filterwarnings =
ignore:.*'soft_unicode' has been renamed to 'soft_str'*:DeprecationWarning
ignore:unclosed file .*:ResourceWarning
testpaths =
tests/unit
tests/functional
markers =
delta
iceberg
hive
postgresql
prepared_statements_disabled
skip_profile(profile)
================================================
FILE: setup.py
================================================
#!/usr/bin/env python
import os
import re
import sys
# require python 3.9 or newer
if sys.version_info < (3, 9):
print("Error: dbt does not support this version of Python.")
print("Please upgrade to Python 3.9 or higher.")
sys.exit(1)
# require version of setuptools that supports find_namespace_packages
from setuptools import setup
try:
from setuptools import find_namespace_packages
except ImportError:
# the user has a downlevel version of setuptools.
print("Error: dbt requires setuptools v40.1.0 or higher.")
print('Please upgrade setuptools with "pip install --upgrade setuptools" ' "and try again")
sys.exit(1)
this_directory = os.path.abspath(os.path.dirname(__file__))
with open(os.path.join(this_directory, "README.md")) as f:
long_description = f.read()
package_name = "dbt-trino"
# get this package's version from dbt/adapters/<name>/__version__.py
def _get_plugin_version_dict():
_version_path = os.path.join(this_directory, "dbt", "adapters", "trino", "__version__.py")
_semver = r"""(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"""
_pre = r"""((?P<prekind>a|b|rc)(?P<pre>\d+))?"""
_version_pattern = rf"""version\s*=\s*["']{_semver}{_pre}["']"""
with open(_version_path) as f:
match = re.search(_version_pattern, f.read().strip())
if match is None:
raise ValueError(f"invalid version at {_version_path}")
return match.groupdict()
def _dbt_trino_version():
parts = _get_plugin_version_dict()
trino_version = "{major}.{minor}.{patch}".format(**parts)
if parts["prekind"] and parts["pre"]:
trino_version += parts["prekind"] + parts["pre"]
return trino_version
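The regex above captures the semver components plus an optional pre-release suffix; a quick demonstration against a sample `__version__.py` line (the version value is illustrative, not the package's actual version):

```python
import re

_semver = r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"
_pre = r"((?P<prekind>a|b|rc)(?P<pre>\d+))?"
pattern = rf"""version\s*=\s*["']{_semver}{_pre}["']"""

# Only named groups appear in groupdict(); the outer pre-release group is unnamed.
m = re.search(pattern, 'version = "1.10.1rc1"')
print(m.groupdict())
# → {'major': '1', 'minor': '10', 'patch': '1', 'prekind': 'rc', 'pre': '1'}
```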
package_version = _dbt_trino_version()
description = """The trino adapter plugin for dbt (data build tool)"""
setup(
name=package_name,
version=package_version,
description=description,
long_description=long_description,
long_description_content_type="text/markdown",
platforms="any",
license="Apache License 2.0",
license_files=("LICENSE.txt",),
author="Starburst Data",
author_email="info@starburstdata.com",
url="https://github.com/starburstdata/dbt-trino",
packages=find_namespace_packages(include=["dbt", "dbt.*"]),
package_data={
"dbt": [
"include/trino/dbt_project.yml",
"include/trino/sample_profiles.yml",
"include/trino/macros/*.sql",
"include/trino/macros/*/*.sql",
"include/trino/macros/*/*/*.sql",
]
},
install_requires=[
"dbt-common>=1.25.0,<2.0",
"dbt-adapters>=1.16,<2.0",
"trino~=0.331",
# add dbt-core to ensure backwards compatibility of installation, this is not a functional dependency
"dbt-core>=1.8.0",
],
zip_safe=False,
classifiers=[
"Development Status :: 5 - Production/Stable",
"License :: OSI Approved :: Apache Software License",
"Operating System :: Microsoft :: Windows",
"Operating System :: MacOS :: MacOS X",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
],
python_requires=">=3.9",
)
================================================
FILE: tests/conftest.py
================================================
import os
import pytest
import trino
# Import the functional fixtures as a plugin
# Note: fixtures with session scope need to be local
pytest_plugins = ["dbt.tests.fixtures.project"]
def pytest_addoption(parser):
parser.addoption("--profile", action="store", default="trino_starburst", type=str)
# Skip tests for profiles marked with @pytest.mark.skip_profile
# See pytest docs for skipping based on command-line options:
# https://docs.pytest.org/en/latest/example/simple.html#control-skipping-of-tests-according-to-command-line-option
def pytest_collection_modifyitems(config, items):
profile_type = config.getoption("--profile")
for item in items:
if skip_profile_marker := item.get_closest_marker("skip_profile"):
if profile_type in skip_profile_marker.args:
skip_profile = pytest.mark.skip(reason=f"skipped on {profile_type} profile")
item.add_marker(skip_profile)
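A test opts out of a profile via the `skip_profile` marker handled above; a minimal sketch (test name and profile value are illustrative):

```python
import pytest

@pytest.mark.skip_profile("starburst_galaxy")
def test_memory_catalog_only():
    # Runs under trino_starburst; skipped when --profile starburst_galaxy is passed.
    assert True
```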
# The profile dictionary, used to write out profiles.yml
@pytest.fixture(scope="class")
def dbt_profile_target(request):
profile_type = request.config.getoption("--profile")
if profile_type == "trino_starburst":
target = get_trino_starburst_target()
elif profile_type == "starburst_galaxy":
target = get_galaxy_target()
else:
raise ValueError(f"Invalid profile type '{profile_type}'")
prepared_statements_disabled = request.node.get_closest_marker("prepared_statements_disabled")
if prepared_statements_disabled:
target.update({"prepared_statements_enabled": False})
postgresql = request.node.get_closest_marker("postgresql")
iceberg = request.node.get_closest_marker("iceberg")
delta = request.node.get_closest_marker("delta")
hive = request.node.get_closest_marker("hive")
if sum(bool(x) for x in (postgresql, iceberg, delta)) > 1:
raise ValueError("Only one of postgresql, iceberg, delta can be specified as a marker")
if postgresql:
target.update({"catalog": "postgresql"})
if delta:
target.update({"catalog": "delta"})
if iceberg:
target.update({"catalog": "iceberg"})
if hive:
target.update({"catalog": "hive"})
return target
def get_trino_starburst_target():
return {
"type": "trino",
"method": "none",
"threads": 4,
"host": "localhost",
"port": 8080,
"user": "admin",
"password": "",
"roles": {
"hive": "admin",
},
"catalog": "memory",
"schema": "default",
"timezone": "UTC",
}
def get_galaxy_target():
return {
"type": "trino",
"method": "ldap",
"threads": 4,
"retries": 5,
"host": os.environ.get("DBT_TESTS_STARBURST_GALAXY_HOST"),
"port": 443,
"user": os.environ.get("DBT_TESTS_STARBURST_GALAXY_USER"),
"password": os.environ.get("DBT_TESTS_STARBURST_GALAXY_PASSWORD"),
"catalog": "iceberg",
"schema": "default",
"timezone": "UTC",
}
@pytest.fixture(scope="class")
def trino_connection(dbt_profile_target):
if dbt_profile_target["method"] == "ldap":
return trino.dbapi.connect(
host=dbt_profile_target["host"],
port=dbt_profile_target["port"],
auth=trino.auth.BasicAuthentication(
dbt_profile_target["user"], dbt_profile_target["password"]
),
catalog=dbt_profile_target["catalog"],
schema=dbt_profile_target["schema"],
http_scheme="https",
)
else:
return trino.dbapi.connect(
host=dbt_profile_target["host"],
port=dbt_profile_target["port"],
user=dbt_profile_target["user"],
catalog=dbt_profile_target["catalog"],
schema=dbt_profile_target["schema"],
)
def get_engine_type(trino_connection):
conn = trino_connection
if "galaxy.starburst.io" in conn.host:
return "starburst_galaxy"
cur = conn.cursor()
cur.execute("SELECT version()")
version = cur.fetchone()
if "-e" in version[0]:
return "starburst_enterprise"
else:
return "trino"
@pytest.fixture(autouse=True)
def skip_by_engine_type(request, trino_connection):
engine_type = get_engine_type(trino_connection)
if request.node.get_closest_marker("skip_engine"):
for skip_engine_type in request.node.get_closest_marker("skip_engine").args:
if skip_engine_type == engine_type:
pytest.skip(f"skipped on {engine_type} engine")
================================================
FILE: tests/functional/adapter/behavior_flags/test_require_certificate_validation.py
================================================
import warnings
import pytest
from dbt.tests.util import run_dbt, run_dbt_and_capture
from urllib3.exceptions import InsecureRequestWarning
class TestRequireCertificateValidationDefault:
@pytest.fixture(scope="class")
def project_config_update(self):
return {"flags": {}}
def test_cert_default_value(self, project):
assert project.adapter.connections.profile.credentials.cert is None
def test_require_certificate_validation_logs(self, project):
dbt_args = ["show", "--inline", "select 1"]
_, logs = run_dbt_and_capture(dbt_args)
assert "It is strongly advised to enable `require_certificate_validation` flag" in logs
@pytest.mark.skip_profile("trino_starburst")
def test_require_certificate_validation_insecure_request_warning(self, project):
with warnings.catch_warnings(record=True) as w:
dbt_args = ["show", "--inline", "select 1"]
run_dbt(dbt_args)
# Check if any InsecureRequestWarning was raised
assert any(
issubclass(warning.category, InsecureRequestWarning) for warning in w
), "InsecureRequestWarning was not raised"
class TestRequireCertificateValidationFalse:
@pytest.fixture(scope="class")
def project_config_update(self):
return {"flags": {"require_certificate_validation": False}}
def test_cert_default_value(self, project):
assert project.adapter.connections.profile.credentials.cert is None
def test_require_certificate_validation_logs(self, project):
dbt_args = ["show", "--inline", "select 1"]
_, logs = run_dbt_and_capture(dbt_args)
assert "It is strongly advised to enable `require_certificate_validation` flag" in logs
@pytest.mark.skip_profile("trino_starburst")
def test_require_certificate_validation_insecure_request_warning(self, project):
with warnings.catch_warnings(record=True) as w:
dbt_args = ["show", "--inline", "select 1"]
run_dbt(dbt_args)
# Check if any InsecureRequestWarning was raised
assert any(
issubclass(warning.category, InsecureRequestWarning) for warning in w
), "InsecureRequestWarning was not raised"
class TestRequireCertificateValidationTrue:
@pytest.fixture(scope="class")
def project_config_update(self):
return {"flags": {"require_certificate_validation": True}}
def test_cert_default_value(self, project):
assert project.adapter.connections.profile.credentials.cert is True
def test_require_certificate_validation_logs(self, project):
dbt_args = ["show", "--inline", "select 1"]
_, logs = run_dbt_and_capture(dbt_args)
assert "It is strongly advised to enable `require_certificate_validation` flag" not in logs
@pytest.mark.skip_profile("trino_starburst")
def test_require_certificate_validation_insecure_request_warning(self, project):
with warnings.catch_warnings(record=True) as w:
dbt_args = ["show", "--inline", "select 1"]
run_dbt(dbt_args)
# Check that no InsecureRequestWarning was raised
assert not any(
issubclass(warning.category, InsecureRequestWarning) for warning in w
), "InsecureRequestWarning was unexpectedly raised"
================================================
FILE: tests/functional/adapter/catalog_integrations/fixtures.py
================================================
MODEL_WITHOUT_CATALOG = """
{{ config(
materialized='table',
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog'
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG_CONFIGS_TABLE_FORMAT = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog',
table_format='delta',
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG_CONFIGS_FILE_FORMAT = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog',
file_format='parquet',
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG_CONFIGS_LOCATION = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog',
storage_uri='s3://datalake/storage_uri',
properties= {
'location': "'s3://datalake/location'",
}
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG_CONFIGS_STORAGE_URI = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog',
storage_uri='s3://datalake/storage_uri',
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog',
base_location_root='foo',
base_location_subpath='bar',
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION_NONE = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog',
base_location_root=None,
) }}
select 1 as id, 'test' as name
"""
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION_NONE_OMIT_BASE_LOCATION_ROOT = """
{{ config(
materialized='table',
catalog_name='test_trino_catalog',
base_location_root=None,
omit_base_location_root=true,
) }}
select 1 as id, 'test' as name
"""
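The fixture strings above differ only in their `config()` kwargs; a small hypothetical helper (`make_model` is not part of this repo) could render them programmatically instead of repeating the template:

```python
def make_model(**config) -> str:
    """Render a dbt model string with a config() header, like the fixtures above."""
    def render(value):
        if value is None:
            return "None"
        if isinstance(value, bool):
            return "true" if value else "false"
        if isinstance(value, str):
            return f"'{value}'"
        return repr(value)

    body = "\n".join(f"    {key}={render(value)}," for key, value in config.items())
    return "{{ config(\n" + body + "\n) }}\nselect 1 as id, 'test' as name\n"

model = make_model(materialized="table", catalog_name="test_trino_catalog")
```

Keeping the fixtures as literal strings, as the file does, has the advantage that each test's model is readable at a glance.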
================================================
FILE: tests/functional/adapter/catalog_integrations/test_catalog_integration.py
================================================
import pytest
from dbt.tests.adapter.catalog_integrations.test_catalog_integration import (
BaseCatalogIntegrationValidation,
)
from dbt.tests.util import run_dbt_and_capture, write_file
from tests.functional.adapter.catalog_integrations.fixtures import (
MODEL_WITH_CATALOG,
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION,
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION_NONE,
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION_NONE_OMIT_BASE_LOCATION_ROOT,
MODEL_WITH_CATALOG_CONFIGS_FILE_FORMAT,
MODEL_WITH_CATALOG_CONFIGS_LOCATION,
MODEL_WITH_CATALOG_CONFIGS_STORAGE_URI,
MODEL_WITH_CATALOG_CONFIGS_TABLE_FORMAT,
MODEL_WITHOUT_CATALOG,
)
@pytest.mark.iceberg
class TestTrinoCatalogIntegrationFileFormat(BaseCatalogIntegrationValidation):
@pytest.fixture(scope="class")
def catalogs(self):
return {
"catalogs": [
{
"name": "test_trino_catalog",
"active_write_integration": "trino_integration",
"write_integrations": [
{
"name": "trino_integration",
"catalog_type": "trino",
"file_format": "orc",
}
],
}
]
}
def test_model_without_catalog(self, project):
# Create model without catalog configuration
write_file(MODEL_WITHOUT_CATALOG, project.project_root, "models", "test_model.sql")
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" not in logs
def test_model_with_catalog(self, project):
# Create model with catalog configuration
write_file(MODEL_WITH_CATALOG, project.project_root, "models", "test_model.sql")
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert "format = 'orc'" in logs
def test_model_with_catalog_configs_file_format(self, project):
# Create model with catalog configuration
write_file(
MODEL_WITH_CATALOG_CONFIGS_FILE_FORMAT,
project.project_root,
"models",
"test_model.sql",
)
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert "format = 'parquet'" in logs
@pytest.mark.iceberg
# Setting `type` property is available only in Starburst Galaxy
# https://docs.starburst.io/starburst-galaxy/data-engineering/working-with-data-lakes/table-formats/gl-iceberg.html
@pytest.mark.skip_profile("trino_starburst")
class TestTrinoCatalogIntegrationTableFormat(BaseCatalogIntegrationValidation):
@pytest.fixture(scope="class")
def catalogs(self):
return {
"catalogs": [
{
"name": "test_trino_catalog",
"active_write_integration": "trino_integration",
"write_integrations": [
{
"name": "trino_integration",
"catalog_type": "trino",
"table_format": "iceberg",
}
],
}
]
}
def test_model_with_catalog(self, project):
# Create model with catalog configuration
write_file(MODEL_WITH_CATALOG, project.project_root, "models", "test_model.sql")
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert "type = 'iceberg'" in logs
def test_model_with_catalog_configs_table_format(self, project):
# Create model with catalog configuration
write_file(
MODEL_WITH_CATALOG_CONFIGS_TABLE_FORMAT,
project.project_root,
"models",
"test_model.sql",
)
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert "type = 'delta'" in logs
@pytest.mark.iceberg
@pytest.mark.skip_profile("starburst_galaxy")
class TestTrinoCatalogIntegrationLocation(BaseCatalogIntegrationValidation):
@pytest.fixture(scope="class")
def catalogs(self):
return {
"catalogs": [
{
"name": "test_trino_catalog",
"active_write_integration": "trino_integration",
"write_integrations": [
{
"name": "trino_integration",
"catalog_type": "trino",
"external_volume": "s3://datalake",
}
],
}
]
}
def test_model_with_catalog(self, project):
# Create model with catalog configuration
write_file(MODEL_WITH_CATALOG, project.project_root, "models", "test_model.sql")
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert f"location = 's3://datalake/_dbt/{project.test_schema}/test_model'" in logs
def test_model_with_catalog_configs_location(self, project):
# Create model with catalog configuration
write_file(
MODEL_WITH_CATALOG_CONFIGS_LOCATION, project.project_root, "models", "test_model.sql"
)
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert "location = 's3://datalake/location'" in logs
def test_model_with_catalog_configs_storage_uri(self, project):
# Create model with catalog configuration
write_file(
MODEL_WITH_CATALOG_CONFIGS_STORAGE_URI,
project.project_root,
"models",
"test_model.sql",
)
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert "location = 's3://datalake/storage_uri'" in logs
def test_model_with_catalog_configs_base_location(self, project):
# Create model with catalog configuration
write_file(
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION,
project.project_root,
"models",
"test_model.sql",
)
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert f"location = 's3://datalake/foo/{project.test_schema}/test_model/bar'" in logs
def test_model_with_catalog_configs_base_location_none(self, project):
# Create model with catalog configuration
write_file(
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION_NONE,
project.project_root,
"models",
"test_model.sql",
)
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert f"location = 's3://datalake/_dbt/{project.test_schema}/test_model'" in logs
def test_model_with_catalog_configs_base_location_none_omit_base_location_root(self, project):
# Create model with catalog configuration
write_file(
MODEL_WITH_CATALOG_CONFIGS_BASE_LOCATION_NONE_OMIT_BASE_LOCATION_ROOT,
project.project_root,
"models",
"test_model.sql",
)
results, logs = run_dbt_and_capture(["--debug", "run"], expect_pass=True)
assert len(results) == 1
assert "CREATE TABLE" in logs
assert "WITH (" in logs
assert f"location = 's3://datalake/{project.test_schema}/test_model'" in logs
================================================
FILE: tests/functional/adapter/column_types/fixtures.py
================================================
model_sql = """
select
cast(0 as tinyint) as tinyint_col,
cast(1 as smallint) as smallint_col,
cast(2 as integer) as integer_col,
cast(2 as int) as int_col,
cast(3 as bigint) as bigint_col,
cast(4.0 as real) as real_col,
cast(5.0 as double) as double_col,
cast(5.5 as double precision) as double_precision_col,
cast(6.0 as decimal) as decimal_col,
cast('7' as char) as char_col,
cast('8' as varchar(20)) as varchar_col
"""
schema_yml = """
version: 2
models:
- name: model
tests:
- is_type:
column_map:
tinyint_col: ['integer', 'number']
smallint_col: ['integer', 'number']
integer_col: ['integer', 'number']
int_col: ['integer', 'number']
bigint_col: ['integer', 'number']
real_col: ['float', 'number']
double_col: ['float', 'number']
double_precision_col: ['float', 'number']
decimal_col: ['numeric', 'number']
char_col: ['string', 'not number']
varchar_col: ['string', 'not number']
"""
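The `is_type` column map above pairs each Trino column with the dbt type categories it must satisfy. A rough sketch of that categorisation (an illustration only, not dbt's actual implementation) could look like:

```python
# Maps a Trino type name to the tags asserted in schema_yml above.
INTEGER_TYPES = {"tinyint", "smallint", "integer", "int", "bigint"}
FLOAT_TYPES = {"real", "double", "double precision"}

def categorize(trino_type: str) -> list:
    t = trino_type.lower()
    if t in INTEGER_TYPES:
        return ["integer", "number"]
    if t in FLOAT_TYPES:
        return ["float", "number"]
    if t.startswith("decimal"):
        return ["numeric", "number"]
    if t.startswith(("char", "varchar")):
        return ["string", "not number"]
    return []

assert categorize("bigint") == ["integer", "number"]
assert categorize("varchar(20)") == ["string", "not number"]
```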
================================================
FILE: tests/functional/adapter/column_types/test_column_types.py
================================================
import pytest
from dbt.tests.adapter.column_types.test_column_types import BaseColumnTypes
from tests.functional.adapter.column_types.fixtures import model_sql, schema_yml
class TestTrinoColumnTypes(BaseColumnTypes):
@pytest.fixture(scope="class")
def models(self):
return {"model.sql": model_sql, "schema.yml": schema_yml}
def test_run_and_test(self, project):
self.run_and_test()
================================================
FILE: tests/functional/adapter/constraints/fixtures.py
================================================
trino_model_contract_sql_header_sql = """
{{
config(
materialized = "table"
)
}}
{% call set_sql_header(config) %}
set time zone 'Asia/Kolkata';
{%- endcall %}
select current_timezone() as column_name
"""
trino_model_incremental_contract_sql_header_sql = """
{{
config(
materialized = "incremental",
on_schema_change="append_new_columns"
)
}}
{% call set_sql_header(config) %}
set time zone 'Asia/Kolkata';
{%- endcall %}
select current_timezone() as column_name
"""
trino_model_schema_yml = """
version: 2
models:
- name: my_model
config:
contract:
enforced: true
columns:
- name: id
quote: true
data_type: integer
description: hello
constraints:
- type: not_null
- type: check
expression: (id > 0)
tests:
- unique
- name: color
data_type: varchar
- name: date_day
data_type: varchar
- name: my_model_error
config:
contract:
enforced: true
columns:
- name: id
data_type: integer
description: hello
constraints:
- type: not_null
- type: check
expression: (id > 0)
tests:
- unique
- name: color
data_type: varchar
- name: date_day
data_type: varchar
- name: my_model_wrong_order
config:
contract:
enforced: true
columns:
- name: id
data_type: integer
description: hello
constraints:
- type: not_null
- type: check
expression: (id > 0)
tests:
- unique
- name: color
data_type: varchar
- name: date_day
data_type: varchar
- name: my_model_wrong_name
config:
contract:
enforced: true
columns:
- name: id
data_type: integer
description: hello
constraints:
- type: not_null
- type: check
expression: (id > 0)
tests:
- unique
- name: color
data_type: varchar
- name: date_day
data_type: varchar
"""
trino_constrained_model_schema_yml = """
version: 2
models:
- name: my_model
config:
contract:
enforced: true
constraints:
- type: check
expression: (id > 0)
- type: primary_key
columns: [ id ]
- type: unique
columns: [ color, date_day ]
name: strange_uniqueness_requirement
columns:
- name: id
quote: true
data_type: integer
description: hello
constraints:
- type: not_null
tests:
- unique
- name: color
data_type: varchar
- name: date_day
data_type: varchar
"""
trino_model_quoted_column_schema_yml = """
version: 2
models:
- name: my_model
config:
contract:
enforced: true
materialized: table
constraints:
- type: check
# this one is on the user
expression: ("from" = 'blue')
columns: [ '"from"' ]
columns:
- name: id
data_type: integer
description: hello
constraints:
- type: not_null
tests:
- unique
- name: from # reserved word
quote: true
data_type: varchar
constraints:
- type: not_null
- name: date_day
data_type: varchar
"""
trino_model_contract_header_schema_yml = """
version: 2
models:
- name: my_model_contract_sql_header
config:
contract:
enforced: true
columns:
- name: column_name
data_type: varchar
"""
================================================
FILE: tests/functional/adapter/constraints/test_constraints.py
================================================
import pytest
from dbt.tests.adapter.constraints.fixtures import (
my_incremental_model_sql,
my_model_incremental_wrong_name_sql,
my_model_incremental_wrong_order_sql,
my_model_sql,
my_model_view_wrong_name_sql,
my_model_view_wrong_order_sql,
my_model_with_quoted_column_name_sql,
my_model_wrong_name_sql,
my_model_wrong_order_sql,
)
from dbt.tests.adapter.constraints.test_constraints import (
BaseConstraintQuotedColumn,
BaseConstraintsRollback,
BaseConstraintsRuntimeDdlEnforcement,
BaseIncrementalConstraintsColumnsEqual,
BaseIncrementalConstraintsRollback,
BaseIncrementalConstraintsRuntimeDdlEnforcement,
BaseIncrementalContractSqlHeader,
BaseModelConstraintsRuntimeEnforcement,
BaseTableConstraintsColumnsEqual,
BaseTableContractSqlHeader,
BaseViewConstraintsColumnsEqual,
)
from tests.functional.adapter.constraints.fixtures import (
trino_constrained_model_schema_yml,
trino_model_contract_header_schema_yml,
trino_model_contract_sql_header_sql,
trino_model_incremental_contract_sql_header_sql,
trino_model_quoted_column_schema_yml,
trino_model_schema_yml,
)
_expected_sql_trino = """
create table <model_identifier> (
"id" integer not null,
color varchar,
date_day varchar
) ;
insert into <model_identifier>
(
select
"id",
color,
date_day from
(
select
'blue' as color,
1 as id,
'2019-01-01' as date_day
) as model_subq
)
;
"""
class TrinoColumnEqualSetup:
@pytest.fixture
def string_type(self):
return "VARCHAR"
@pytest.fixture
def data_types(self, schema_int_type, int_type, string_type):
# sql_column_value, schema_data_type, error_data_type
return [
["1", schema_int_type, int_type],
["'1'", string_type, string_type],
["cast('2019-01-01' as date)", "date", "DATE"],
["true", "boolean", "BOOLEAN"],
["cast('2013-11-03 00:00:00-07' as TIMESTAMP)", "timestamp(6)", "TIMESTAMP"],
[
"cast('2013-11-03 00:00:00-07' as TIMESTAMP WITH TIME ZONE)",
"timestamp(6)",
"TIMESTAMP",
],
["ARRAY['a','b','c']", "ARRAY(VARCHAR)", "ARRAY"],
["ARRAY[1,2,3]", "ARRAY(INTEGER)", "ARRAY"],
["cast('1' as DECIMAL)", "DECIMAL", "DECIMAL"],
]
@pytest.mark.iceberg
class TestTrinoTableConstraintsColumnsEqual(
TrinoColumnEqualSetup, BaseTableConstraintsColumnsEqual
):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model_wrong_order.sql": my_model_wrong_order_sql,
"my_model_wrong_name.sql": my_model_wrong_name_sql,
"constraints_schema.yml": trino_model_schema_yml,
}
class TestTrinoViewConstraintsColumnsEqual(TrinoColumnEqualSetup, BaseViewConstraintsColumnsEqual):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model_wrong_order.sql": my_model_view_wrong_order_sql,
"my_model_wrong_name.sql": my_model_view_wrong_name_sql,
"constraints_schema.yml": trino_model_schema_yml,
}
@pytest.mark.iceberg
class TestTrinoIncrementalConstraintsColumnsEqual(
TrinoColumnEqualSetup, BaseIncrementalConstraintsColumnsEqual
):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model_wrong_order.sql": my_model_incremental_wrong_order_sql,
"my_model_wrong_name.sql": my_model_incremental_wrong_name_sql,
"constraints_schema.yml": trino_model_schema_yml,
}
@pytest.mark.iceberg
class TestTrinoTableConstraintsRuntimeDdlEnforcement(BaseConstraintsRuntimeDdlEnforcement):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model.sql": my_model_wrong_order_sql,
"constraints_schema.yml": trino_model_schema_yml,
}
@pytest.fixture(scope="class")
def expected_sql(self):
return _expected_sql_trino
@pytest.mark.iceberg
class TestTrinoTableConstraintsRollback(BaseConstraintsRollback):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model.sql": my_model_sql,
"constraints_schema.yml": trino_model_schema_yml,
}
@pytest.fixture(scope="class")
def expected_error_messages(self):
return ["NULL value not allowed for NOT NULL column: id"]
@pytest.mark.iceberg
class TestTrinoIncrementalConstraintsRuntimeDdlEnforcement(
BaseIncrementalConstraintsRuntimeDdlEnforcement
):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model.sql": my_model_incremental_wrong_order_sql,
"constraints_schema.yml": trino_model_schema_yml,
}
@pytest.fixture(scope="class")
def expected_sql(self):
return _expected_sql_trino
@pytest.mark.iceberg
class TestTrinoIncrementalConstraintsRollback(BaseIncrementalConstraintsRollback):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model.sql": my_incremental_model_sql,
"constraints_schema.yml": trino_model_schema_yml,
}
@pytest.fixture(scope="class")
def expected_error_messages(self):
return ["NULL value not allowed for NOT NULL column: id"]
class TestTrinoTableContractSqlHeader(BaseTableContractSqlHeader):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model_contract_sql_header.sql": trino_model_contract_sql_header_sql,
"constraints_schema.yml": trino_model_contract_header_schema_yml,
}
class TestTrinoIncrementalContractSqlHeader(BaseIncrementalContractSqlHeader):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model_contract_sql_header.sql": trino_model_incremental_contract_sql_header_sql,
"constraints_schema.yml": trino_model_contract_header_schema_yml,
}
@pytest.mark.iceberg
class TestTrinoModelConstraintsRuntimeEnforcement(BaseModelConstraintsRuntimeEnforcement):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model.sql": my_model_sql,
"constraints_schema.yml": trino_constrained_model_schema_yml,
}
@pytest.fixture(scope="class")
def expected_sql(self):
return """
create table <model_identifier> (
"id" integer not null,
color varchar,
date_day varchar
) ;
insert into <model_identifier>
(
select
"id",
color,
date_day from
(
select
1 as id,
'blue' as color,
'2019-01-01' as date_day
) as model_subq
)
;
"""
@pytest.mark.iceberg
class TestTrinoConstraintQuotedColumn(BaseConstraintQuotedColumn):
@pytest.fixture(scope="class")
def models(self):
return {
"my_model.sql": my_model_with_quoted_column_name_sql,
"constraints_schema.yml": trino_model_quoted_column_schema_yml,
}
@pytest.fixture(scope="class")
def expected_sql(self):
return """
create table <model_identifier> (
id integer not null,
"from" varchar not null,
date_day varchar
) ;
insert into <model_identifier>
(
select id, "from", date_day
from (
select
'blue' as "from",
1 as id,
'2019-01-01' as date_day
) as model_subq
);
"""
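The `expected_sql` fixtures above are written with loose layout because the base DDL-enforcement tests compare SQL after normalising whitespace (a sketch of such a comparison, not dbt's actual implementation):

```python
import re

def normalize_sql(sql: str) -> str:
    """Collapse all runs of whitespace so layout differences don't matter."""
    return re.sub(r"\s+", " ", sql).strip()

expected = """
create table <model_identifier> (
    "id" integer not null
) ;
"""
actual = 'create table <model_identifier> ( "id" integer not null ) ;'
assert normalize_sql(expected) == normalize_sql(actual)
```

This is why the fixtures can indent and wrap freely without making the tests brittle.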
================================================
FILE: tests/functional/adapter/dbt_clone/test_dbt_clone.py
================================================
import pytest
from dbt.tests.adapter.dbt_clone.fixtures import (
custom_can_clone_tables_false_macros_sql,
get_schema_name_sql,
infinite_macros_sql,
macros_sql,
)
from dbt.tests.adapter.dbt_clone.test_dbt_clone import BaseCloneNotPossible
iceberg_macro_override_sql = """
{% macro trino__current_timestamp() -%}
current_timestamp(6)
{%- endmacro %}
"""
class TestTrinoCloneNotPossible(BaseCloneNotPossible):
@pytest.fixture(scope="class")
def macros(self):
return {
"macros.sql": macros_sql,
"my_can_clone_tables.sql": custom_can_clone_tables_false_macros_sql,
"infinite_macros.sql": infinite_macros_sql,
"get_schema_name.sql": get_schema_name_sql,
"iceberg.sql": iceberg_macro_override_sql,
}
# TODO: below method probably should be implemented in base class (on dbt-core side)
@pytest.fixture(autouse=True)
def clean_up(self, project):
yield
with project.adapter.connection_named("__test"):
relation = project.adapter.Relation.create(
database=project.database, schema=f"{project.test_schema}_seeds"
)
project.adapter.drop_schema(relation)
relation = project.adapter.Relation.create(
database=project.database, schema=project.test_schema
)
project.adapter.drop_schema(relation)
================================================
FILE: tests/functional/adapter/dbt_debug/test_dbt_debug.py
================================================
import pytest
from dbt.tests.adapter.dbt_debug.test_dbt_debug import (
BaseDebug,
BaseDebugProfileVariable,
)
from dbt.tests.util import run_dbt
class TestDebugTrino(BaseDebug):
# TODO: below teardown method probably should be implemented in base class (on dbt-core side)
@pytest.fixture(scope="function", autouse=True)
def teardown_method(self, project):
yield
project.run_sql(f"drop schema if exists {project.test_schema}")
def test_ok_trino(self, project):
run_dbt(["debug"])
assert "ERROR" not in self.capsys.readouterr().out
class TestDebugProfileVariableTrino(BaseDebugProfileVariable):
# TODO: below teardown method probably should be implemented in base class (on dbt-core side)
@pytest.fixture(scope="function", autouse=True)
def teardown_method(self, project):
yield
project.run_sql(f"drop schema if exists {project.test_schema}")
def test_ok_trino(self, project):
run_dbt(["debug"])
assert "ERROR" not in self.capsys.readouterr().out
================================================
FILE: tests/functional/adapter/dbt_show/test_dbt_show.py
================================================
from dbt.tests.adapter.dbt_show.test_dbt_show import BaseShowLimit, BaseShowSqlHeader
class TestTrinoShowSqlHeader(BaseShowSqlHeader):
pass
class TestTrinoShowLimit(BaseShowLimit):
pass
================================================
FILE: tests/functional/adapter/empty/test_empty.py
================================================
SYMBOL INDEX (694 symbols across 63 files)
FILE: dbt/adapters/trino/catalogs/_relation.py
class TrinoCatalogRelation (line 10) | class TrinoCatalogRelation(CatalogRelation):
FILE: dbt/adapters/trino/catalogs/_trino_catalog_metastore.py
class TrinoCatalogIntegration (line 10) | class TrinoCatalogIntegration(CatalogIntegration):
method __init__ (line 33) | def __init__(self, config: CatalogIntegrationConfig) -> None:
method build_relation (line 37) | def build_relation(self, model: RelationConfig) -> TrinoCatalogRelation:
method _calculate_storage_uri (line 47) | def _calculate_storage_uri(self, model: RelationConfig) -> Optional[str]:
FILE: dbt/adapters/trino/column.py
class TrinoColumn (line 14) | class TrinoColumn(Column):
method data_type (line 21) | def data_type(self):
method is_string (line 29) | def is_string(self) -> bool:
method is_float (line 32) | def is_float(self) -> bool:
method is_integer (line 39) | def is_integer(self) -> bool:
method is_numeric (line 48) | def is_numeric(self) -> bool:
method string_type (line 52) | def string_type(cls, size: int) -> str:
method string_size (line 55) | def string_size(self) -> int:
method from_description (line 63) | def from_description(cls, name: str, raw_data_type: str) -> "Column":
FILE: dbt/adapters/trino/connections.py
class HttpScheme (line 27) | class HttpScheme(Enum):
class TrinoCredentialsFactory (line 32) | class TrinoCredentialsFactory:
method _create_trino_profile (line 34) | def _create_trino_profile(cls, profile):
method translate_aliases (line 52) | def translate_aliases(cls, kwargs: Dict[str, Any], recurse: bool = Fal...
method validate (line 57) | def validate(cls, data: Any):
method from_dict (line 62) | def from_dict(cls, data: Any):
class TrinoCredentials (line 67) | class TrinoCredentials(Credentials, metaclass=ABCMeta):
method type (line 71) | def type(self):
method unique_field (line 75) | def unique_field(self):
method _connection_keys (line 78) | def _connection_keys(self):
method trino_auth (line 91) | def trino_auth(self) -> Optional[trino.auth.Authentication]:
class TrinoNoneCredentials (line 96) | class TrinoNoneCredentials(TrinoCredentials):
method method (line 112) | def method(self):
method trino_auth (line 115) | def trino_auth(self):
class TrinoCertificateCredentials (line 120) | class TrinoCertificateCredentials(TrinoCredentials):
method http_scheme (line 137) | def http_scheme(self):
method method (line 141) | def method(self):
method trino_auth (line 144) | def trino_auth(self):
class TrinoLdapCredentials (line 151) | class TrinoLdapCredentials(TrinoCredentials):
method http_scheme (line 168) | def http_scheme(self):
method method (line 172) | def method(self):
method trino_auth (line 175) | def trino_auth(self):
class TrinoKerberosCredentials (line 180) | class TrinoKerberosCredentials(TrinoCredentials):
method http_scheme (line 204) | def http_scheme(self):
method method (line 208) | def method(self):
method trino_auth (line 211) | def trino_auth(self):
class TrinoJwtCredentials (line 227) | class TrinoJwtCredentials(TrinoCredentials):
method http_scheme (line 243) | def http_scheme(self):
method method (line 247) | def method(self):
method trino_auth (line 250) | def trino_auth(self):
class TrinoOauthCredentials (line 255) | class TrinoOauthCredentials(TrinoCredentials):
method http_scheme (line 273) | def http_scheme(self):
method method (line 277) | def method(self):
method trino_auth (line 280) | def trino_auth(self):
class TrinoOauthConsoleCredentials (line 285) | class TrinoOauthConsoleCredentials(TrinoCredentials):
method http_scheme (line 303) | def http_scheme(self):
method method (line 307) | def method(self):
method trino_auth (line 310) | def trino_auth(self):
class ConnectionWrapper (line 314) | class ConnectionWrapper(object):
method __init__ (line 323) | def __init__(self, handle, prepared_statements_enabled):
method cursor (line 329) | def cursor(self):
method cancel (line 333) | def cancel(self):
method close (line 337) | def close(self):
method commit (line 341) | def commit(self):
method rollback (line 344) | def rollback(self):
method start_transaction (line 347) | def start_transaction(self):
method fetchall (line 350) | def fetchall(self):
method fetchone (line 361) | def fetchone(self):
method fetchmany (line 372) | def fetchmany(self, size):
method execute (line 383) | def execute(self, sql, bindings=None):
method description (line 399) | def description(self):
method _escape_value (line 403) | def _escape_value(cls, value):
class TrinoAdapterResponse (line 426) | class TrinoAdapterResponse(AdapterResponse):
class TrinoConnectionManager (line 431) | class TrinoConnectionManager(SQLConnectionManager):
method __init__ (line 435) | def __init__(self, profile, mp_context, behavior_flags=None) -> None:
method exception_handler (line 441) | def exception_handler(self, sql):
method add_begin_query (line 467) | def add_begin_query(self):
method add_commit_query (line 470) | def add_commit_query(self):
method open (line 474) | def open(cls, connection):
method get_response (line 520) | def get_response(cls, cursor) -> TrinoAdapterResponse:
method cancel (line 537) | def cancel(self, connection):
method add_query (line 540) | def add_query(self, sql, auto_begin=True, bindings=None, abridge_sql_l...
method data_type_code_to_name (line 582) | def data_type_code_to_name(cls, type_code) -> str:
FILE: dbt/adapters/trino/impl.py
class TrinoConfig (line 31) | class TrinoConfig(AdapterConfig):
class TrinoAdapter (line 36) | class TrinoAdapter(SQLAdapter):
method __init__ (line 65) | def __init__(self, config, mp_context) -> None:
method _behavior_flags (line 71) | def _behavior_flags(self) -> List[BehaviorFlag]:
method date_function (line 87) | def date_function(cls):
method convert_text_type (line 91) | def convert_text_type(cls, agate_table, col_idx):
method convert_number_type (line 95) | def convert_number_type(cls, agate_table, col_idx):
method convert_datetime_type (line 100) | def convert_datetime_type(cls, agate_table, col_idx):
method convert_date_type (line 104) | def convert_date_type(cls, agate_table: agate.Table, col_idx: int) -> ...
method timestamp_add_sql (line 107) | def timestamp_add_sql(self, add_to: str, number: int = 1, interval: st...
method get_columns_in_relation (line 110) | def get_columns_in_relation(self, relation):
method valid_incremental_strategies (line 119) | def valid_incremental_strategies(self):
method build_catalog_relation (line 123) | def build_catalog_relation(self, model: RelationConfig) -> Optional[Ca...
FILE: dbt/adapters/trino/parse_model.py
function catalog_name (line 9) | def catalog_name(model: RelationConfig) -> Optional[str]:
FILE: dbt/adapters/trino/relation.py
class TrinoRelation (line 8) | class TrinoRelation(BaseRelation):
method _is_exactish_match (line 13) | def _is_exactish_match(self, field: ComponentName, value: str) -> bool:
method _render_event_time_filtered (line 17) | def _render_event_time_filtered(self, event_time_filter: EventTimeFilt...
FILE: dbt/include/trino/macros/materializations/seeds/helpers.sql
type "memory" (line 31) | create table "memory"."default"."string_type" ("varchar_example" varchar...
FILE: setup.py
function _get_plugin_version_dict (line 33) | def _get_plugin_version_dict():
function _dbt_trino_version (line 45) | def _dbt_trino_version():
FILE: tests/conftest.py
function pytest_addoption (line 12) | def pytest_addoption(parser):
function pytest_collection_modifyitems (line 19) | def pytest_collection_modifyitems(config, items):
function dbt_profile_target (line 30) | def dbt_profile_target(request):
function get_trino_starburst_target (line 66) | def get_trino_starburst_target():
function get_galaxy_target (line 84) | def get_galaxy_target():
function trino_connection (line 101) | def trino_connection(dbt_profile_target):
function get_engine_type (line 123) | def get_engine_type(trino_connection):
function skip_by_engine_type (line 137) | def skip_by_engine_type(request, trino_connection):
FILE: tests/functional/adapter/behavior_flags/test_require_certificate_validation.py
class TestRequireCertificateValidationDefault (line 8) | class TestRequireCertificateValidationDefault:
method project_config_update (line 10) | def project_config_update(self):
method test_cert_default_value (line 13) | def test_cert_default_value(self, project):
method test_require_certificate_validation_logs (line 16) | def test_require_certificate_validation_logs(self, project):
method test_require_certificate_validation_insecure_request_warning (line 22) | def test_require_certificate_validation_insecure_request_warning(self,...
class TestRequireCertificateValidationFalse (line 33) | class TestRequireCertificateValidationFalse:
method project_config_update (line 35) | def project_config_update(self):
method test_cert_default_value (line 38) | def test_cert_default_value(self, project):
method test_require_certificate_validation_logs (line 41) | def test_require_certificate_validation_logs(self, project):
method test_require_certificate_validation_insecure_request_warning (line 47) | def test_require_certificate_validation_insecure_request_warning(self,...
class TestRequireCertificateValidationTrue (line 58) | class TestRequireCertificateValidationTrue:
method project_config_update (line 60) | def project_config_update(self):
method test_cert_default_value (line 63) | def test_cert_default_value(self, project):
method test_require_certificate_validation_logs (line 66) | def test_require_certificate_validation_logs(self, project):
method test_require_certificate_validation_insecure_request_warning (line 72) | def test_require_certificate_validation_insecure_request_warning(self,...
FILE: tests/functional/adapter/catalog_integrations/test_catalog_integration.py
class TestTrinoCatalogIntegrationFileFormat (line 21) | class TestTrinoCatalogIntegrationFileFormat(BaseCatalogIntegrationValida...
method catalogs (line 23) | def catalogs(self):
method test_model_without_catalog (line 40) | def test_model_without_catalog(self, project):
method test_model_with_catalog (line 48) | def test_model_with_catalog(self, project):
method test_model_with_catalog_configs_file_format (line 57) | def test_model_with_catalog_configs_file_format(self, project):
class TestMyAdapterCatalogIntegration (line 76) | class TestMyAdapterCatalogIntegration(BaseCatalogIntegrationValidation):
method catalogs (line 78) | def catalogs(self):
method test_model_with_catalog (line 95) | def test_model_with_catalog(self, project):
method test_model_with_catalog_configs_table_format (line 104) | def test_model_with_catalog_configs_table_format(self, project):
class TestTrinoCatalogIntegrationLocation (line 121) | class TestTrinoCatalogIntegrationLocation(BaseCatalogIntegrationValidati...
method catalogs (line 123) | def catalogs(self):
method test_model_with_catalog (line 140) | def test_model_with_catalog(self, project):
method test_model_with_catalog_configs_location (line 149) | def test_model_with_catalog_configs_location(self, project):
method test_model_with_catalog_configs_storage_uri (line 160) | def test_model_with_catalog_configs_storage_uri(self, project):
method test_model_with_catalog_configs_base_location (line 174) | def test_model_with_catalog_configs_base_location(self, project):
method test_model_with_catalog_configs_base_location_none (line 188) | def test_model_with_catalog_configs_base_location_none(self, project):
method test_model_with_catalog_configs_base_location_none_omit_base_location_root (line 202) | def test_model_with_catalog_configs_base_location_none_omit_base_locat...
FILE: tests/functional/adapter/column_types/test_column_types.py
class TestTrinoColumnTypes (line 7) | class TestTrinoColumnTypes(BaseColumnTypes):
method models (line 9) | def models(self):
method test_run_and_test (line 12) | def test_run_and_test(self, project):
FILE: tests/functional/adapter/constraints/test_constraints.py
class TrinoColumnEqualSetup (line 59) | class TrinoColumnEqualSetup:
method string_type (line 61) | def string_type(self):
method data_types (line 65) | def data_types(self, schema_int_type, int_type, string_type):
class TestTrinoTableConstraintsColumnsEqual (line 85) | class TestTrinoTableConstraintsColumnsEqual(
method models (line 89) | def models(self):
class TestTrinoViewConstraintsColumnsEqual (line 97) | class TestTrinoViewConstraintsColumnsEqual(TrinoColumnEqualSetup, BaseVi...
method models (line 99) | def models(self):
class TestTrinoIncrementalConstraintsColumnsEqual (line 108) | class TestTrinoIncrementalConstraintsColumnsEqual(
method models (line 112) | def models(self):
class TestTrinoTableConstraintsRuntimeDdlEnforcement (line 121) | class TestTrinoTableConstraintsRuntimeDdlEnforcement(BaseConstraintsRunt...
method models (line 123) | def models(self):
method expected_sql (line 130) | def expected_sql(self):
class TestTrinoTableConstraintsRollback (line 135) | class TestTrinoTableConstraintsRollback(BaseConstraintsRollback):
method models (line 137) | def models(self):
method expected_error_messages (line 144) | def expected_error_messages(self):
class TestTrinoIncrementalConstraintsRuntimeDdlEnforcement (line 149) | class TestTrinoIncrementalConstraintsRuntimeDdlEnforcement(
method models (line 153) | def models(self):
method expected_sql (line 160) | def expected_sql(self):
class TestTrinoIncrementalConstraintsRollback (line 165) | class TestTrinoIncrementalConstraintsRollback(BaseIncrementalConstraints...
method models (line 167) | def models(self):
method expected_error_messages (line 174) | def expected_error_messages(self):
class TestTrinoTableContractSqlHeader (line 178) | class TestTrinoTableContractSqlHeader(BaseTableContractSqlHeader):
method models (line 180) | def models(self):
class TestTrinoIncrementalContractSqlHeader (line 187) | class TestTrinoIncrementalContractSqlHeader(BaseIncrementalContractSqlHe...
method models (line 189) | def models(self):
class TestTrinoModelConstraintsRuntimeEnforcement (line 197) | class TestTrinoModelConstraintsRuntimeEnforcement(BaseModelConstraintsRu...
method models (line 199) | def models(self):
method expected_sql (line 206) | def expected_sql(self):
class TestTrinoConstraintQuotedColumn (line 231) | class TestTrinoConstraintQuotedColumn(BaseConstraintQuotedColumn):
method models (line 233) | def models(self):
method expected_sql (line 240) | def expected_sql(self):
FILE: tests/functional/adapter/dbt_clone/test_dbt_clone.py
class TestTrinoCloneNotPossible (line 17) | class TestTrinoCloneNotPossible(BaseCloneNotPossible):
method macros (line 19) | def macros(self):
method clean_up (line 30) | def clean_up(self, project):
FILE: tests/functional/adapter/dbt_debug/test_dbt_debug.py
class TestDebugTrino (line 9) | class TestDebugTrino(BaseDebug):
method teardown_method (line 12) | def teardown_method(self, project):
method test_ok_trino (line 16) | def test_ok_trino(self, project):
class TestDebugProfileVariableTrino (line 21) | class TestDebugProfileVariableTrino(BaseDebugProfileVariable):
method teardown_method (line 24) | def teardown_method(self, project):
method test_ok_trino (line 28) | def test_ok_trino(self, project):
FILE: tests/functional/adapter/dbt_show/test_dbt_show.py
class TestTrinoShowSqlHeader (line 4) | class TestTrinoShowSqlHeader(BaseShowSqlHeader):
class TestTrinoShowLimit (line 8) | class TestTrinoShowLimit(BaseShowLimit):
FILE: tests/functional/adapter/empty/test_empty.py
class TestTrinoEmpty (line 7) | class TestTrinoEmpty(BaseTestEmpty):
class TestTrinoEmptyInlineSourceRef (line 11) | class TestTrinoEmptyInlineSourceRef(BaseTestEmptyInlineSourceRef):
FILE: tests/functional/adapter/hooks/test_hooks_delete.py
class BaseTestHooksDelete (line 29) | class BaseTestHooksDelete:
method seeds (line 31) | def seeds(self):
method models (line 37) | def models(self):
method project_config_update (line 43) | def project_config_update(self):
method test_pre_and_post_run_hooks (line 51) | def test_pre_and_post_run_hooks(self, project, dbt_profile_target):
class TestBaseTestHooksDeleteDelta (line 76) | class TestBaseTestHooksDeleteDelta(BaseTestHooksDelete):
class TestBaseTestHooksDeleteIceberg (line 81) | class TestBaseTestHooksDeleteIceberg(BaseTestHooksDelete):
FILE: tests/functional/adapter/hooks/test_model_hooks.py
class TestTrinoPrePostModelHooks (line 5) | class TestTrinoPrePostModelHooks(core_base.TestPrePostModelHooks):
method check_hooks (line 6) | def check_hooks(self, state, project, host, count=1):
class TestTrinoPrePostModelHooksUnderscores (line 10) | class TestTrinoPrePostModelHooksUnderscores(core_base.TestPrePostModelHo...
method check_hooks (line 11) | def check_hooks(self, state, project, host, count=1):
class TestTrinoHookRefs (line 15) | class TestTrinoHookRefs(core_base.TestHookRefs):
method check_hooks (line 16) | def check_hooks(self, state, project, host, count=1):
class TestTrinoPrePostModelHooksOnSeeds (line 21) | class TestTrinoPrePostModelHooksOnSeeds(core_base.TestPrePostModelHooksO...
method check_hooks (line 22) | def check_hooks(self, state, project, host, count=1):
method project_config_update (line 26) | def project_config_update(self):
FILE: tests/functional/adapter/hooks/test_run_hooks.py
class TestPrePostRunHooksTrino (line 8) | class TestPrePostRunHooksTrino(BasePrePostRunHooks):
method project_config_update (line 10) | def project_config_update(self):
method check_hooks (line 35) | def check_hooks(self, state, project, host):
class TestAfterRunHooksTrino (line 56) | class TestAfterRunHooksTrino(BaseAfterRunHooks):
FILE: tests/functional/adapter/materialization/test_incremental_delete_insert.py
class TrinoIncrementalUniqueKey (line 170) | class TrinoIncrementalUniqueKey(BaseIncrementalUniqueKey):
method seeds (line 172) | def seeds(self):
method models (line 180) | def models(self):
class TestIcebergIncrementalDeleteInsert (line 200) | class TestIcebergIncrementalDeleteInsert(TrinoIncrementalUniqueKey):
method project_config_update (line 202) | def project_config_update(self):
class TestDeltaIncrementalDeleteInsert (line 211) | class TestDeltaIncrementalDeleteInsert(TrinoIncrementalUniqueKey):
method test__no_unique_keys (line 212) | def test__no_unique_keys(self, project):
method project_config_update (line 216) | def project_config_update(self):
class TestIcebergIncrementalDeleteInsertWithLocation (line 226) | class TestIcebergIncrementalDeleteInsertWithLocation:
method models (line 228) | def models(self):
method test_temporary_table_location (line 233) | def test_temporary_table_location(self, project):
class TestIcebergCompositeUniqueKeys (line 252) | class TestIcebergCompositeUniqueKeys(BaseIncrementalPredicates):
method seeds (line 254) | def seeds(self):
method models (line 261) | def models(self):
method test__incremental_predicates_composite_keys (line 267) | def test__incremental_predicates_composite_keys(self, project):
FILE: tests/functional/adapter/materialization/test_incremental_merge.py
class TrinoIncrementalUniqueKey (line 101) | class TrinoIncrementalUniqueKey(BaseIncrementalUniqueKey):
method seeds (line 103) | def seeds(self):
method models (line 111) | def models(self):
class TestIcebergIncrementalMerge (line 131) | class TestIcebergIncrementalMerge(TrinoIncrementalUniqueKey):
method project_config_update (line 133) | def project_config_update(self):
class TestDeltaIncrementalMerge (line 142) | class TestDeltaIncrementalMerge(TrinoIncrementalUniqueKey):
method project_config_update (line 144) | def project_config_update(self):
FILE: tests/functional/adapter/materialization/test_incremental_microbatch.py
class TestTrinoMicrobatchIceberg (line 6) | class TestTrinoMicrobatchIceberg(BaseMicrobatch):
FILE: tests/functional/adapter/materialization/test_incremental_predicates.py
class TestIcebergPredicatesDeleteInsertTrino (line 8) | class TestIcebergPredicatesDeleteInsertTrino(BaseIncrementalPredicates):
method project_config_update (line 10) | def project_config_update(self):
class TestDeltaPredicatesDeleteInsertTrino (line 15) | class TestDeltaPredicatesDeleteInsertTrino(BaseIncrementalPredicates):
method project_config_update (line 17) | def project_config_update(self):
class TestIcebergIncrementalPredicatesMergeTrino (line 22) | class TestIcebergIncrementalPredicatesMergeTrino(BaseIncrementalPredicat...
method project_config_update (line 24) | def project_config_update(self):
class TestDeltaIncrementalPredicatesMergeTrino (line 34) | class TestDeltaIncrementalPredicatesMergeTrino(BaseIncrementalPredicates):
method project_config_update (line 36) | def project_config_update(self):
class TestIcebergPredicatesMergeTrino (line 46) | class TestIcebergPredicatesMergeTrino(BaseIncrementalPredicates):
method project_config_update (line 48) | def project_config_update(self):
class TestDeltaPredicatesMergeTrino (line 58) | class TestDeltaPredicatesMergeTrino(BaseIncrementalPredicates):
method project_config_update (line 60) | def project_config_update(self):
FILE: tests/functional/adapter/materialization/test_incremental_schema.py
class OnSchemaChangeBase (line 36) | class OnSchemaChangeBase:
method project_config_update (line 39) | def project_config_update(self):
method models (line 44) | def models(self):
method tests (line 64) | def tests(self):
method list_tests_and_assert (line 81) | def list_tests_and_assert(self, include, exclude, expected_tests):
method run_tests_and_assert (line 93) | def run_tests_and_assert(
method run_incremental_ignore (line 117) | def run_incremental_ignore(self, project):
method run_incremental_append_new_columns (line 136) | def run_incremental_append_new_columns(self, project):
method run_incremental_append_new_columns_remove_one (line 154) | def run_incremental_append_new_columns_remove_one(self, project):
method run_incremental_sync_all_columns (line 171) | def run_incremental_sync_all_columns(self, project):
method run_incremental_sync_all_columns_quoted (line 189) | def run_incremental_sync_all_columns_quoted(self, project):
method run_incremental_sync_all_columns_data_type_change (line 207) | def run_incremental_sync_all_columns_data_type_change(self, project):
method run_incremental_fail_on_schema_change (line 225) | def run_incremental_fail_on_schema_change(self, _):
method test_run_incremental_ignore (line 231) | def test_run_incremental_ignore(self, project):
method test_run_incremental_append_new_columns (line 234) | def test_run_incremental_append_new_columns(self, project):
method test_run_incremental_sync_all_columns (line 238) | def test_run_incremental_sync_all_columns(self, project):
method test_run_incremental_sync_all_columns_data_type_change (line 242) | def test_run_incremental_sync_all_columns_data_type_change(self, proje...
method test_run_incremental_fail_on_schema_change (line 245) | def test_run_incremental_fail_on_schema_change(self, project):
class TestIcebergOnSchemaChange (line 250) | class TestIcebergOnSchemaChange(OnSchemaChangeBase):
method project_config_update (line 252) | def project_config_update(self):
class TestDeltaOnSchemaChange (line 260) | class TestDeltaOnSchemaChange(OnSchemaChangeBase):
method project_config_update (line 262) | def project_config_update(self):
method test_run_incremental_sync_all_columns (line 272) | def test_run_incremental_sync_all_columns(self, project):
method test_run_incremental_sync_all_columns_data_type_change (line 276) | def test_run_incremental_sync_all_columns_data_type_change(self, proje...
FILE: tests/functional/adapter/materialization/test_incremental_views_enabled.py
class BaseViewsEnabled (line 7) | class BaseViewsEnabled:
method seeds (line 10) | def seeds(self):
method models (line 17) | def models(self):
class TestViewsEnabledTrue (line 23) | class TestViewsEnabledTrue(BaseViewsEnabled):
method project_config_update (line 29) | def project_config_update(self):
method test_run_seed_test (line 40) | def test_run_seed_test(self, project):
class TestViewsEnabledFalse (line 57) | class TestViewsEnabledFalse(BaseViewsEnabled):
method project_config_update (line 63) | def project_config_update(self):
method test_run_seed_test (line 74) | def test_run_seed_test(self, project):
FILE: tests/functional/adapter/materialization/test_materialized_view.py
class TestIcebergMaterializedViewBase (line 20) | class TestIcebergMaterializedViewBase:
method teardown_method (line 22) | def teardown_method(self, project):
class TestIcebergMaterializedViewExists (line 37) | class TestIcebergMaterializedViewExists(TestIcebergMaterializedViewBase):
method project_config_update (line 39) | def project_config_update(self):
method models (line 45) | def models(self):
method test_mv_is_dropped_when_model_runs_view (line 54) | def test_mv_is_dropped_when_model_runs_view(self, project):
class TestIcebergMaterializedViewWithCTE (line 77) | class TestIcebergMaterializedViewWithCTE(TestIcebergMaterializedViewBase):
method project_config_update (line 80) | def project_config_update(self):
method seeds (line 93) | def seeds(self):
method models (line 100) | def models(self):
method test_mv_with_cte_is_created (line 105) | def test_mv_with_cte_is_created(self, project):
class TestIcebergMaterializedViewCreate (line 112) | class TestIcebergMaterializedViewCreate(TestIcebergMaterializedViewBase):
method project_config_update (line 115) | def project_config_update(self):
method seeds (line 128) | def seeds(self):
method models (line 135) | def models(self):
method test_mv_is_created_and_refreshed (line 140) | def test_mv_is_created_and_refreshed(self, project):
class TestIcebergMaterializedViewDropAndCreate (line 171) | class TestIcebergMaterializedViewDropAndCreate(TestIcebergMaterializedVi...
method project_config_update (line 174) | def project_config_update(self):
method seeds (line 188) | def seeds(self):
method models (line 195) | def models(self):
method test_mv_overrides_relation (line 202) | def test_mv_overrides_relation(self, project):
class TestIcebergMaterializedViewProperties (line 239) | class TestIcebergMaterializedViewProperties(TestIcebergMaterializedViewB...
method project_config_update (line 242) | def project_config_update(self):
method seeds (line 256) | def seeds(self):
method models (line 263) | def models(self):
method test_set_mv_properties (line 268) | def test_set_mv_properties(self, project):
class TestIcebergMaterializedViewWithGracePeriod (line 287) | class TestIcebergMaterializedViewWithGracePeriod(TestIcebergMaterialized...
method project_config_update (line 290) | def project_config_update(self):
method seeds (line 301) | def seeds(self):
method models (line 308) | def models(self):
method test_set_mv_properties (line 313) | def test_set_mv_properties(self, project):
FILE: tests/functional/adapter/materialization/test_on_table_exists.py
class BaseOnTableExists (line 11) | class BaseOnTableExists:
method seeds (line 14) | def seeds(self):
method models (line 21) | def models(self):
class TestOnTableExistsRename (line 28) | class TestOnTableExistsRename(BaseOnTableExists):
method project_config_update (line 35) | def project_config_update(self):
method test_run_seed_test (line 46) | def test_run_seed_test(self, project):
class TestOnTableExistsRenameIncrementalFullRefresh (line 83) | class TestOnTableExistsRenameIncrementalFullRefresh(BaseOnTableExists):
method project_config_update (line 90) | def project_config_update(self):
method test_run_seed_test (line 101) | def test_run_seed_test(self, project):
class TestOnTableExistsDrop (line 138) | class TestOnTableExistsDrop(BaseOnTableExists):
method project_config_update (line 145) | def project_config_update(self):
method test_run_seed_test (line 156) | def test_run_seed_test(self, project):
class TestOnTableExistsDropIncrementalFullRefresh (line 173) | class TestOnTableExistsDropIncrementalFullRefresh(BaseOnTableExists):
method project_config_update (line 180) | def project_config_update(self):
method test_run_seed_test (line 191) | def test_run_seed_test(self, project):
class BaseOnTableExistsReplace (line 216) | class BaseOnTableExistsReplace(BaseOnTableExists):
method project_config_update (line 223) | def project_config_update(self):
method test_run_seed_test (line 234) | def test_run_seed_test(self, project):
class TestOnTableExistsReplaceIceberg (line 254) | class TestOnTableExistsReplaceIceberg(BaseOnTableExistsReplace):
class TestOnTableExistsReplaceDelta (line 259) | class TestOnTableExistsReplaceDelta(BaseOnTableExistsReplace):
class BaseOnTableExistsReplaceIncrementalFullRefresh (line 263) | class BaseOnTableExistsReplaceIncrementalFullRefresh(BaseOnTableExists):
method project_config_update (line 270) | def project_config_update(self):
method test_run_seed_test (line 281) | def test_run_seed_test(self, project):
class TestOnTableExistsReplaceIcebergIncrementalFullRefresh (line 301) | class TestOnTableExistsReplaceIcebergIncrementalFullRefresh(
class TestOnTableExistsReplaceDeltaIncrementalFullRefresh (line 308) | class TestOnTableExistsReplaceDeltaIncrementalFullRefresh(
class TestOnTableExistsSkip (line 314) | class TestOnTableExistsSkip(BaseOnTableExists):
method project_config_update (line 321) | def project_config_update(self):
method test_run_seed_test (line 332) | def test_run_seed_test(self, project):
FILE: tests/functional/adapter/materialization/test_prepared_statements.py
class PreparedStatementsBase (line 11) | class PreparedStatementsBase:
method project_config_update (line 19) | def project_config_update(self):
method seeds (line 29) | def seeds(self):
method models (line 36) | def models(self):
method retrieve_num_prepared_statements (line 42) | def retrieve_num_prepared_statements(self, trino_connection):
method run_seed_with_prepared_statements (line 50) | def run_seed_with_prepared_statements(
class TestPreparedStatementsDisabled (line 76) | class TestPreparedStatementsDisabled(PreparedStatementsBase):
method test_run_seed_with_prepared_statements_disabled (line 77) | def test_run_seed_with_prepared_statements_disabled(self, project, tri...
class TestPreparedStatementsEnabled (line 82) | class TestPreparedStatementsEnabled(PreparedStatementsBase):
method test_run_seed_with_prepared_statements_enabled (line 83) | def test_run_seed_with_prepared_statements_enabled(self, project, trin...
FILE: tests/functional/adapter/materialization/test_snapshot.py
class BaseTrinoSnapshotTimestamp (line 61) | class BaseTrinoSnapshotTimestamp(BaseSnapshotTimestamp):
method test_snapshot_timestamp (line 62) | def test_snapshot_timestamp(self, project):
class TestIcebergSnapshotCheckColsTrino (line 79) | class TestIcebergSnapshotCheckColsTrino(BaseSnapshotCheckCols):
method project_config_update (line 81) | def project_config_update(self):
method seeds (line 90) | def seeds(self):
method macros (line 97) | def macros(self):
class TestIcebergSnapshotTimestampTrino (line 102) | class TestIcebergSnapshotTimestampTrino(BaseTrinoSnapshotTimestamp):
method seeds (line 104) | def seeds(self):
method project_config_update (line 112) | def project_config_update(self):
class TestDeltaSnapshotCheckColsTrino (line 125) | class TestDeltaSnapshotCheckColsTrino(BaseSnapshotCheckCols):
method project_config_update (line 127) | def project_config_update(self):
method seeds (line 139) | def seeds(self):
class TestDeltaSnapshotTimestampTrino (line 147) | class TestDeltaSnapshotTimestampTrino(BaseTrinoSnapshotTimestamp):
method project_config_update (line 149) | def project_config_update(self):
method seeds (line 161) | def seeds(self):
class TestSnapshotLocationPropertyExceptionTrino (line 169) | class TestSnapshotLocationPropertyExceptionTrino(BaseSnapshotCheckCols):
method project_config_update (line 175) | def project_config_update(self):
method test_snapshot_check_cols (line 185) | def test_snapshot_check_cols(self, project):
FILE: tests/functional/adapter/materialization/test_view_security.py
class TestViewSecurity (line 11) | class TestViewSecurity:
method project_config_update (line 19) | def project_config_update(self):
method seeds (line 30) | def seeds(self):
method models (line 37) | def models(self):
method test_run_seed_test (line 45) | def test_run_seed_test(self, project):
FILE: tests/functional/adapter/materialized_view_tests/test_materialized_view_dbt_core.py
class TestTrinoMaterializedViewsBasic (line 12) | class TestTrinoMaterializedViewsBasic(MaterializedViewBasic):
method insert_record (line 14) | def insert_record(project, table: BaseRelation, record: Tuple[int, int]):
method refresh_materialized_view (line 19) | def refresh_materialized_view(project, materialized_view: BaseRelation):
method query_row_count (line 24) | def query_row_count(project, relation: BaseRelation) -> int:
method query_relation_type (line 29) | def query_relation_type(project, relation: BaseRelation) -> Optional[s...
method setup (line 34) | def setup(self, project, my_materialized_view):
method test_materialized_view_only_updates_after_refresh (line 65) | def test_materialized_view_only_updates_after_refresh(self):
FILE: tests/functional/adapter/materialized_view_tests/utils.py
function query_relation_type (line 8) | def query_relation_type(project, relation: BaseRelation) -> Optional[str]:
FILE: tests/functional/adapter/persist_docs/test_persist_docs.py
class TestPersistDocsBase (line 22) | class TestPersistDocsBase:
method schema (line 28) | def schema(self):
method seeds (line 33) | def seeds(self):
class TestPersistDocsTable (line 39) | class TestPersistDocsTable(TestPersistDocsBase):
method project_config_update (line 41) | def project_config_update(self):
method models (line 53) | def models(self):
method test_run_seed_test (line 59) | def test_run_seed_test(self, project):
class TestPersistDocsView (line 70) | class TestPersistDocsView(TestPersistDocsBase):
method project_config_update (line 72) | def project_config_update(self):
method models (line 88) | def models(self):
method test_run_seed_test (line 94) | def test_run_seed_test(self, project):
class TestPersistDocsIncremental (line 105) | class TestPersistDocsIncremental(TestPersistDocsBase):
method project_config_update (line 107) | def project_config_update(self):
method models (line 119) | def models(self):
method test_run_seed_test (line 125) | def test_run_seed_test(self, project):
class TestPersistDocs (line 139) | class TestPersistDocs(BasePersistDocs):
class TestPersistDocsColumnMissing (line 143) | class TestPersistDocsColumnMissing(BasePersistDocsColumnMissing):
class TestPersistDocsCommentOnQuotedColumn (line 147) | class TestPersistDocsCommentOnQuotedColumn(BasePersistDocsCommentOnQuote...
class BasePersistDocsDisabled (line 151) | class BasePersistDocsDisabled(BasePersistDocsBase):
method test_persist_docs_disabled (line 152) | def test_persist_docs_disabled(self, project):
class TestPersistDocsDisabledByDefault (line 163) | class TestPersistDocsDisabledByDefault(BasePersistDocsDisabled):
class TestPersistDocsRelationSetToFalse (line 174) | class TestPersistDocsRelationSetToFalse(BasePersistDocsDisabled):
method project_config_update (line 180) | def project_config_update(self):
class TestPersistDocsRelationNotSet (line 196) | class TestPersistDocsRelationNotSet(BasePersistDocsDisabled):
method project_config_update (line 202) | def project_config_update(self):
FILE: tests/functional/adapter/show/test_show.py
class TestShow (line 17) | class TestShow:
method models (line 19) | def models(self):
method seeds (line 28) | def seeds(self):
method test_none (line 31) | def test_none(self, project):
method test_select_model_text (line 38) | def test_select_model_text(self, project):
method test_select_multiple_model_text (line 47) | def test_select_multiple_model_text(self, project):
method test_select_single_model_json (line 56) | def test_select_single_model_json(self, project):
method test_inline_pass (line 65) | def test_inline_pass(self, project):
method test_inline_fail (line 74) | def test_inline_fail(self, project):
method test_inline_fail_database_error (line 78) | def test_inline_fail_database_error(self, project):
method test_ephemeral_model (line 82) | def test_ephemeral_model(self, project):
method test_second_ephemeral_model (line 87) | def test_second_ephemeral_model(self, project):
method test_limit (line 103) | def test_limit(self, project, args, expected):
method test_seed (line 109) | def test_seed(self, project):
method test_sql_header (line 113) | def test_sql_header(self, project):
class TestShowModelVersions (line 119) | class TestShowModelVersions:
method models (line 121) | def models(self):
method seeds (line 129) | def seeds(self):
method test_version_unspecified (line 132) | def test_version_unspecified(self, project):
method test_none (line 138) | def test_none(self, project):
class TestShowPrivateModel (line 145) | class TestShowPrivateModel:
method models (line 147) | def models(self):
method seeds (line 154) | def seeds(self):
method test_version_unspecified (line 157) | def test_version_unspecified(self, project):
FILE: tests/functional/adapter/simple_seed/test_seed.py
class TrinoSetUpFixture (line 45) | class TrinoSetUpFixture:
method setUp (line 47) | def setUp(self, project):
class TestTrinoBasicSeedTests (line 53) | class TestTrinoBasicSeedTests(TrinoSetUpFixture, CoreTestBasicSeedTests):
method test_simple_seed_full_refresh_flag (line 57) | def test_simple_seed_full_refresh_flag(self, project):
class TestTrinoSeedConfigFullRefreshOn (line 64) | class TestTrinoSeedConfigFullRefreshOn(TrinoSetUpFixture, CoreTestSeedCo...
class TestTrinoSeedConfigFullRefreshOff (line 68) | class TestTrinoSeedConfigFullRefreshOff(TrinoSetUpFixture, CoreTestSeedC...
class TestTrinoSeedCustomSchema (line 72) | class TestTrinoSeedCustomSchema(TrinoSetUpFixture, CoreTestSeedCustomSch...
class TestTrinoSeedWithUniqueDelimiter (line 76) | class TestTrinoSeedWithUniqueDelimiter(TrinoSetUpFixture, CoreTestSeedWi...
class TestTrinoSeedWithWrongDelimiter (line 80) | class TestTrinoSeedWithWrongDelimiter(TrinoSetUpFixture, CoreTestSeedWit...
method test_seed_with_wrong_delimiter (line 81) | def test_seed_with_wrong_delimiter(self, project):
class TestTrinoSeedWithEmptyDelimiter (line 87) | class TestTrinoSeedWithEmptyDelimiter(TrinoSetUpFixture, CoreTestSeedWit...
class TestTrinoSimpleSeedEnabledViaConfig (line 91) | class TestTrinoSimpleSeedEnabledViaConfig(CoreTestSimpleSeedEnabledViaCo...
class TestTrinoSeedParsing (line 95) | class TestTrinoSeedParsing(TrinoSetUpFixture, CoreTestSeedParsing):
class TestTrinoSimpleSeedWithBOM (line 99) | class TestTrinoSimpleSeedWithBOM(CoreTestSimpleSeedWithBOM):
method setUp (line 101) | def setUp(self, project):
class TestTrinoSeedSpecificFormats (line 113) | class TestTrinoSeedSpecificFormats(CoreTestSeedSpecificFormats):
FILE: tests/functional/adapter/store_failures/test_store_failures.py
class TestStoreFailuresTable (line 15) | class TestStoreFailuresTable:
method schema (line 17) | def schema(self):
method seeds (line 22) | def seeds(self):
method project_config_update (line 28) | def project_config_update(self):
method models (line 46) | def models(self):
method teardown_method (line 53) | def teardown_method(self, project):
method test_run_seed_test (line 61) | def test_run_seed_test(self, project):
class TestTrinoTestStoreTestFailures (line 75) | class TestTrinoTestStoreTestFailures(TestStoreTestFailures):
class TestStoreTestFailuresAsInteractions (line 79) | class TestStoreTestFailuresAsInteractions(basic.StoreTestFailuresAsInter...
class TestStoreTestFailuresAsProjectLevelOff (line 83) | class TestStoreTestFailuresAsProjectLevelOff(basic.StoreTestFailuresAsPr...
class TestStoreTestFailuresAsProjectLevelView (line 87) | class TestStoreTestFailuresAsProjectLevelView(basic.StoreTestFailuresAsP...
class TestStoreTestFailuresAsGeneric (line 91) | class TestStoreTestFailuresAsGeneric(basic.StoreTestFailuresAsGeneric):
class TestStoreTestFailuresAsProjectLevelEphemeral (line 95) | class TestStoreTestFailuresAsProjectLevelEphemeral(basic.StoreTestFailur...
class TestStoreTestFailuresAsExceptions (line 99) | class TestStoreTestFailuresAsExceptions(basic.StoreTestFailuresAsExcepti...
FILE: tests/functional/adapter/test_basic.py
class TestAdapterMethods (line 87) | class TestAdapterMethods(BaseAdapterMethod):
class TestSimpleMaterializationsTrino (line 94) | class TestSimpleMaterializationsTrino(BaseSimpleMaterializations):
method project_config_update (line 96) | def project_config_update(self):
method seeds (line 105) | def seeds(self):
class TestSingularTestsTrino (line 111) | class TestSingularTestsTrino(BaseSingularTests):
class TestSingularTestsEphemeralTrino (line 115) | class TestSingularTestsEphemeralTrino(BaseSingularTestsEphemeral):
method project_config_update (line 117) | def project_config_update(self):
method seeds (line 126) | def seeds(self):
class TestEmptyTrino (line 132) | class TestEmptyTrino(BaseEmpty):
class TestEphemeralTrino (line 136) | class TestEphemeralTrino(BaseEphemeral):
method project_config_update (line 138) | def project_config_update(self):
method seeds (line 147) | def seeds(self):
class TestIncrementalTrino (line 153) | class TestIncrementalTrino(BaseIncremental):
method project_config_update (line 155) | def project_config_update(self):
method seeds (line 164) | def seeds(self):
class TestIncrementalFullRefreshTrino (line 168) | class TestIncrementalFullRefreshTrino(TestIncrementalTrino):
method test_incremental (line 169) | def test_incremental(self, project):
class TestIncrementalNotSchemaChangeTrino (line 175) | class TestIncrementalNotSchemaChangeTrino(BaseIncrementalNotSchemaChange):
method models (line 177) | def models(self):
class TestGenericTestsTrino (line 181) | class TestGenericTestsTrino(BaseGenericTests):
method project_config_update (line 183) | def project_config_update(self):
method seeds (line 192) | def seeds(self):
class TestTrinoValidateConnection (line 196) | class TestTrinoValidateConnection(BaseValidateConnection):
class TestDocsGenerateTrino (line 200) | class TestDocsGenerateTrino(BaseDocsGenerate):
method project_config_update (line 202) | def project_config_update(self, unique_schema):
method seeds (line 218) | def seeds(self):
method expected_catalog (line 222) | def expected_catalog(self, project, profile_user):
FILE: tests/functional/adapter/test_caching.py
class TestCachingLowerCaseModel (line 8) | class TestCachingLowerCaseModel(BaseCachingLowercaseModel):
class TestCachingUppercaseModel (line 12) | class TestCachingUppercaseModel(BaseCachingUppercaseModel):
class TestCachingSelectedSchemaOnly (line 16) | class TestCachingSelectedSchemaOnly(BaseCachingSelectedSchemaOnly):
FILE: tests/functional/adapter/test_changing_relation_type.py
class TestTrinoChangeRelationTypes (line 6) | class TestTrinoChangeRelationTypes(BaseChangeRelationTypeValidator):
FILE: tests/functional/adapter/test_concurrency.py
class TestConcurrencyTrino (line 8) | class TestConcurrencyTrino(BaseConcurrency):
method test_concurrency (line 9) | def test_concurrency(self, project):
FILE: tests/functional/adapter/test_custom_schema.py
class CustomSchemaBase (line 15) | class CustomSchemaBase(ABC):
method table_type (line 26) | def table_type(self):
method materialization (line 31) | def materialization(self):
method custom_schema_model (line 35) | def custom_schema_model(self, materialization):
method project_config_update (line 47) | def project_config_update(self):
method seeds (line 56) | def seeds(self):
method models (line 63) | def models(self):
method teardown_method (line 71) | def teardown_method(self, project):
method test_custom_schema_trino (line 78) | def test_custom_schema_trino(self, project):
class TestCustomSchemaTable (line 104) | class TestCustomSchemaTable(CustomSchemaBase):
method materialization (line 105) | def materialization(self):
method table_type (line 108) | def table_type(self):
class TestCustomSchemaView (line 112) | class TestCustomSchemaView(CustomSchemaBase):
method materialization (line 113) | def materialization(self):
method table_type (line 116) | def table_type(self):
class TestCustomSchemaIncremental (line 120) | class TestCustomSchemaIncremental(CustomSchemaBase):
method materialization (line 121) | def materialization(self):
method table_type (line 124) | def table_type(self):
FILE: tests/functional/adapter/test_ephemeral.py
class TestEphemeralMultiTrino (line 9) | class TestEphemeralMultiTrino(BaseEphemeralMulti):
method test_ephemeral_multi (line 10) | def test_ephemeral_multi(self, project):
class TestEphemeralNestedTrino (line 19) | class TestEphemeralNestedTrino(BaseEphemeralNested):
method test_ephemeral_nested (line 20) | def test_ephemeral_nested(self, project):
class TestEphemeralErrorHandlingTrino (line 25) | class TestEphemeralErrorHandlingTrino(BaseEphemeralErrorHandling):
FILE: tests/functional/adapter/test_get_incremental_tmp_relation_type_macro.py
class CustomSchemaBase (line 7) | class CustomSchemaBase(ABC):
method expected_types (line 15) | def expected_types(self):
method incremental_model (line 21) | def incremental_model(self):
method project_config_update (line 35) | def project_config_update(self):
method models (line 40) | def models(self):
method test_get_incremental_tmp_relation_type (line 43) | def test_get_incremental_tmp_relation_type(self, project):
class TestViewsEnabled (line 59) | class TestViewsEnabled(CustomSchemaBase):
method expected_types (line 61) | def expected_types(self):
method project_config_update (line 65) | def project_config_update(self):
class TestViewsNotEnabled (line 71) | class TestViewsNotEnabled(CustomSchemaBase):
method expected_types (line 73) | def expected_types(self):
method project_config_update (line 79) | def project_config_update(self):
FILE: tests/functional/adapter/test_grants.py
class TestModelGrantsTrino (line 16) | class TestModelGrantsTrino(BaseModelGrants):
method assert_expected_grants_match_actual (line 17) | def assert_expected_grants_match_actual(self, project, relation_name, ...
class TestInvalidGrantsTrino (line 41) | class TestInvalidGrantsTrino(BaseInvalidGrants):
FILE: tests/functional/adapter/test_query_comments.py
class TestQueryCommentsTrino (line 11) | class TestQueryCommentsTrino(BaseQueryComments):
class TestMacroQueryCommentsTrino (line 15) | class TestMacroQueryCommentsTrino(BaseMacroQueryComments):
class TestMacroArgsQueryCommentsTrino (line 19) | class TestMacroArgsQueryCommentsTrino(BaseMacroArgsQueryComments):
class TestMacroInvalidQueryCommentsTrino (line 23) | class TestMacroInvalidQueryCommentsTrino(BaseMacroInvalidQueryComments):
class TestNullQueryCommentsTrino (line 27) | class TestNullQueryCommentsTrino(BaseNullQueryComments):
class TestEmptyQueryCommentsTrino (line 31) | class TestEmptyQueryCommentsTrino(BaseEmptyQueryComments):
FILE: tests/functional/adapter/test_quote_policy.py
function unique_schema (line 7) | def unique_schema(request, prefix) -> str:
class TestTrinoQuotePolicy (line 11) | class TestTrinoQuotePolicy(TestIncrementalTrino):
FILE: tests/functional/adapter/test_sample_mode.py
class TestTrinoSampleMode (line 4) | class TestTrinoSampleMode(BaseSampleModeTest):
FILE: tests/functional/adapter/test_seeds_column_types_overrides.py
function get_relation_columns (line 64) | def get_relation_columns(adapter, name):
class TestSeedsColumnTypesOverrides (line 78) | class TestSeedsColumnTypesOverrides:
method project_config_update (line 80) | def project_config_update(self):
method seeds (line 94) | def seeds(self):
method test_seeds_column_overrides (line 102) | def test_seeds_column_overrides(self, project):
FILE: tests/functional/adapter/test_session_property.py
class TestSessionProperty (line 7) | class TestSessionProperty:
method schema (line 15) | def schema(self):
method session_property_model (line 18) | def session_property_model(self, prehook):
method models (line 30) | def models(self):
method test_custom_schema_trino (line 33) | def test_custom_schema_trino(self, project):
FILE: tests/functional/adapter/test_simple_copy.py
class TestSimpleCopyBase (line 10) | class TestSimpleCopyBase(SimpleCopyBase):
method test_simple_copy_with_materialized_views (line 11) | def test_simple_copy_with_materialized_views(self, project):
function trino_get_tables_in_schema (line 41) | def trino_get_tables_in_schema(prj):
class TestEmptyModelsArentRun (line 57) | class TestEmptyModelsArentRun(EmptyModelsArentRunBase):
method test_dbt_doesnt_run_empty_models (line 58) | def test_dbt_doesnt_run_empty_models(self, project):
FILE: tests/functional/adapter/test_simple_snapshot.py
class TrinoSimpleSnapshot (line 15) | class TrinoSimpleSnapshot(BaseSimpleSnapshot):
method test_updates_are_captured_by_snapshot (line 16) | def test_updates_are_captured_by_snapshot(self, project):
method test_new_column_captured_by_snapshot (line 29) | def test_new_column_captured_by_snapshot(self, project):
class TrinoSnapshotCheck (line 50) | class TrinoSnapshotCheck(BaseSnapshotCheck):
method test_column_selection_is_reflected_in_snapshot (line 51) | def test_column_selection_is_reflected_in_snapshot(self, project):
class TestIcebergSimpleSnapshot (line 71) | class TestIcebergSimpleSnapshot(TrinoSimpleSnapshot):
method project_config_update (line 73) | def project_config_update(self):
class TestDeltaSimpleSnapshot (line 82) | class TestDeltaSimpleSnapshot(TrinoSimpleSnapshot):
class TestIcebergSnapshotCheck (line 87) | class TestIcebergSnapshotCheck(TrinoSnapshotCheck):
method macros (line 89) | def macros(self):
class TestDeltaSnapshotCheck (line 94) | class TestDeltaSnapshotCheck(TrinoSnapshotCheck):
FILE: tests/functional/adapter/test_sql_status_output.py
class TestSqlStatusOutput (line 17) | class TestSqlStatusOutput:
method seeds (line 23) | def seeds(self):
method models (line 29) | def models(self):
method project_config_update (line 36) | def project_config_update(self):
method test_run_seed_test (line 47) | def test_run_seed_test(self, project):
FILE: tests/functional/adapter/test_table_properties.py
class BaseTableProperties (line 7) | class BaseTableProperties:
method seeds (line 10) | def seeds(self):
method models (line 17) | def models(self):
class TestTableProperties (line 24) | class TestTableProperties(BaseTableProperties):
method project_config_update (line 27) | def project_config_update(self):
method test_table_properties (line 39) | def test_table_properties(self, project):
class TestFileFormatConfig (line 53) | class TestFileFormatConfig(BaseTableProperties):
method project_config_update (line 56) | def project_config_update(self):
method test_table_properties (line 65) | def test_table_properties(self, project):
class TestFileFormatConfigAndFormatTablePropertyFail (line 78) | class TestFileFormatConfigAndFormatTablePropertyFail(BaseTableProperties):
method project_config_update (line 81) | def project_config_update(self):
method test_table_properties (line 93) | def test_table_properties(self, project):
class TestTableFormatConfig (line 111) | class TestTableFormatConfig(BaseTableProperties):
method project_config_update (line 114) | def project_config_update(self):
method test_table_properties (line 123) | def test_table_properties(self, project):
class TestTableFormatConfigAndTypeTablePropertyFail (line 139) | class TestTableFormatConfigAndTypeTablePropertyFail(BaseTableProperties):
method project_config_update (line 142) | def project_config_update(self):
method test_table_properties (line 154) | def test_table_properties(self, project):
FILE: tests/functional/adapter/unit_testing/test_unit_testing.py
class TestTrinoUnitTestingTypesTrinoStarburst (line 10) | class TestTrinoUnitTestingTypesTrinoStarburst(BaseUnitTestingTypes):
method data_types (line 12) | def data_types(self):
class TestTrinoUnitTestingTypesGalaxy (line 31) | class TestTrinoUnitTestingTypesGalaxy(BaseUnitTestingTypes):
method data_types (line 33) | def data_types(self):
class TestTrinoUnitTestCaseInsensitivity (line 46) | class TestTrinoUnitTestCaseInsensitivity(BaseUnitTestCaseInsensivity):
class TestTrinoUnitTestInvalidInput (line 50) | class TestTrinoUnitTestInvalidInput(BaseUnitTestInvalidInput):
FILE: tests/functional/adapter/utils/test_data_types.py
class TestTypeBigInt (line 11) | class TestTypeBigInt(BaseTypeBigInt):
class TestTypeFloat (line 15) | class TestTypeFloat(BaseTypeFloat):
class TestTypeInt (line 19) | class TestTypeInt(BaseTypeInt):
class TestTypeNumeric (line 23) | class TestTypeNumeric(BaseTypeNumeric):
method numeric_fixture_type (line 24) | def numeric_fixture_type(self):
class TestTypeString (line 28) | class TestTypeString(BaseTypeString):
class TestTypeTimestamp (line 34) | class TestTypeTimestamp(BaseTypeTimestamp):
class TestTypeBoolean (line 38) | class TestTypeBoolean(BaseTypeBoolean):
FILE: tests/functional/adapter/utils/test_date_spine.py
class BaseDateSpine (line 10) | class BaseDateSpine(BaseUtils):
method models (line 12) | def models(self):
class TestDateSpine (line 21) | class TestDateSpine(BaseDateSpine):
FILE: tests/functional/adapter/utils/test_get_intervals_between.py
class BaseGetIntervalsBetween (line 12) | class BaseGetIntervalsBetween(BaseUtils):
method models (line 14) | def models(self):
class TestGetIntervalsBetween (line 23) | class TestGetIntervalsBetween(BaseGetIntervalsBetween):
FILE: tests/functional/adapter/utils/test_timestamps.py
class TestCurrentTimestampTrino (line 5) | class TestCurrentTimestampTrino(BaseCurrentTimestamps):
method models (line 7) | def models(self):
method expected_schema (line 13) | def expected_schema(self):
method expected_sql (line 17) | def expected_sql(self):
FILE: tests/functional/adapter/utils/test_utils.py
class TestAnyValue (line 58) | class TestAnyValue(BaseAnyValue):
class TestArrayAppend (line 64) | class TestArrayAppend(BaseArrayAppend):
method models (line 66) | def models(self):
class TestArrayConcat (line 75) | class TestArrayConcat(BaseArrayConcat):
method models (line 77) | def models(self):
class TestArrayConstruct (line 84) | class TestArrayConstruct(BaseArrayConstruct):
class TestBoolOr (line 88) | class TestBoolOr(BaseBoolOr):
class TestCastBoolToText (line 92) | class TestCastBoolToText(BaseCastBoolToText):
class TestConcat (line 96) | class TestConcat(BaseConcat):
class TestCurrentTimestamp (line 100) | class TestCurrentTimestamp(BaseCurrentTimestampAware):
class TestDateAdd (line 104) | class TestDateAdd(BaseDateAdd):
method project_config_update (line 106) | def project_config_update(self):
class TestDateDiff (line 118) | class TestDateDiff(BaseDateDiff):
method project_config_update (line 120) | def project_config_update(self):
method seeds (line 129) | def seeds(self):
method models (line 133) | def models(self):
class TestDateTrunc (line 142) | class TestDateTrunc(BaseDateTrunc):
method project_config_update (line 144) | def project_config_update(self):
class TestEquals (line 153) | class TestEquals(BaseEquals):
class TestEscapeSingleQuotes (line 157) | class TestEscapeSingleQuotes(BaseEscapeSingleQuotesQuote):
class TestExcept (line 161) | class TestExcept(BaseExcept):
class TestGenerateSeries (line 165) | class TestGenerateSeries(BaseGenerateSeries):
class TestGetPowersOfTwo (line 169) | class TestGetPowersOfTwo(BaseGetPowersOfTwo):
class TestHash (line 173) | class TestHash(BaseHash):
class TestIntersect (line 177) | class TestIntersect(BaseIntersect):
class TestLastDay (line 181) | class TestLastDay(BaseLastDay):
class TestLength (line 185) | class TestLength(BaseLength):
class TestListagg (line 189) | class TestListagg(BaseListagg):
class TestPosition (line 193) | class TestPosition(BasePosition):
class TestReplace (line 197) | class TestReplace(BaseReplace):
class TestRight (line 201) | class TestRight(BaseRight):
class TestSafeCast (line 205) | class TestSafeCast(BaseSafeCast):
class TestSplitPart (line 209) | class TestSplitPart(BaseSplitPart):
class TestStringLiteral (line 213) | class TestStringLiteral(BaseStringLiteral):
class TestValidateSqlMethod (line 217) | class TestValidateSqlMethod(BaseValidateSqlMethod):
FILE: tests/unit/test_adapter.py
class TestTrinoAdapter (line 30) | class TestTrinoAdapter(unittest.TestCase):
method setUp (line 31) | def setUp(self):
method adapter (line 74) | def adapter(self):
method test_acquire_connection (line 78) | def test_acquire_connection(self):
method test_cancel_open_connections_empty (line 85) | def test_cancel_open_connections_empty(self):
method test_cancel_open_connections_master (line 88) | def test_cancel_open_connections_master(self):
method test_database_exception (line 94) | def test_database_exception(self, get_thread_connection):
method test_failed_to_connect_exception (line 102) | def test_failed_to_connect_exception(self, get_thread_connection):
method test_dbt_exception (line 111) | def test_dbt_exception(self, get_thread_connection):
method _setup_mock_exception (line 116) | def _setup_mock_exception(self, get_thread_connection, exception):
class TestTrinoAdapterAuthenticationMethods (line 125) | class TestTrinoAdapterAuthenticationMethods(unittest.TestCase):
method setUp (line 126) | def setUp(self):
method acquire_connection_with_profile (line 129) | def acquire_connection_with_profile(self, profile):
method assert_default_connection_credentials (line 151) | def assert_default_connection_credentials(self, credentials):
method test_none_authentication (line 165) | def test_none_authentication(self):
method test_none_authentication_with_method (line 195) | def test_none_authentication_with_method(self):
method test_none_authentication_without_http_scheme (line 226) | def test_none_authentication_without_http_scheme(self):
method test_ldap_authentication (line 256) | def test_ldap_authentication(self):
method test_kerberos_authentication (line 292) | def test_kerberos_authentication(self):
method test_certificate_authentication (line 323) | def test_certificate_authentication(self):
method test_jwt_authentication (line 359) | def test_jwt_authentication(self):
method test_oauth_authentication (line 389) | def test_oauth_authentication(self):
method test_oauth_console_authentication (line 419) | def test_oauth_console_authentication(self):
class TestPreparedStatementsEnabled (line 450) | class TestPreparedStatementsEnabled(TestCase):
method setup_profile (line 451) | def setup_profile(self, credentials):
method test_default (line 474) | def test_default(self):
method test_false (line 489) | def test_false(self):
method test_true (line 505) | def test_true(self):
class TestAdapterConversions (line 522) | class TestAdapterConversions(TestCase):
method _get_tester_for (line 523) | def _get_tester_for(self, column_type):
method _make_table_of (line 533) | def _make_table_of(self, rows, column_types):
class TestTrinoAdapterConversions (line 543) | class TestTrinoAdapterConversions(TestAdapterConversions):
method test_convert_text_type (line 544) | def test_convert_text_type(self):
method test_convert_number_type (line 555) | def test_convert_number_type(self):
method test_convert_boolean_type (line 566) | def test_convert_boolean_type(self):
method test_convert_datetime_type (line 577) | def test_convert_datetime_type(self):
method test_convert_date_type (line 590) | def test_convert_date_type(self):
class TestTrinoColumn (line 602) | class TestTrinoColumn(unittest.TestCase):
method test_bound_varchar (line 603) | def test_bound_varchar(self):
method test_unbound_varchar (line 615) | def test_unbound_varchar(self):
FILE: tests/unit/utils.py
function normalize (line 13) | def normalize(path):
class Obj (line 26) | class Obj:
function mock_connection (line 31) | def mock_connection(name):
function profile_from_dict (line 37) | def profile_from_dict(profile, profile_name, cli_vars="{}"):
function project_from_dict (line 61) | def project_from_dict(project, profile, packages=None, selectors=None, c...
function config_from_parts_or_dicts (line 81) | def config_from_parts_or_dicts(project, profile, packages=None, selector...
function inject_plugin (line 117) | def inject_plugin(plugin):
function inject_adapter (line 124) | def inject_adapter(value, plugin):
class ContractTestCase (line 135) | class ContractTestCase(TestCase):
method setUp (line 138) | def setUp(self):
method assert_to_dict (line 142) | def assert_to_dict(self, obj, dct):
method assert_from_dict (line 145) | def assert_from_dict(self, obj, dct, cls=None):
method assert_symmetric (line 150) | def assert_symmetric(self, obj, dct, cls=None):
method assert_fails_validation (line 154) | def assert_fails_validation(self, dct, cls=None):
function generate_name_macros (line 163) | def generate_name_macros(package):
Condensed preview — 164 files, each showing path, character count, and a content snippet (493K chars of structured content in total).
[
{
"path": ".changes/0.0.0.md",
"chars": 918,
"preview": "## Previous Releases\n\nFor information on prior major and minor releases, see their changelogs:\n\n* [1.9](https://github.c"
},
{
"path": ".changes/1.10.0/Features-20251210-194211.yaml",
"chars": 149,
"preview": "kind: Features\nbody: Add support for catalog integration\ntime: 2025-12-10T19:42:11.700646+01:00\ncustom:\n Author: damian"
},
{
"path": ".changes/1.10.0.md",
"chars": 280,
"preview": "## dbt-trino 1.10.0 - December 16, 2025\n### Features\n- Add support for catalog integration ([#502](https://github.com/st"
},
{
"path": ".changes/1.10.1/Dependencies-20260115-092226.yaml",
"chars": 153,
"preview": "kind: Dependencies\nbody: Bump dbt-adapters>=1.16,<2.0\ntime: 2026-01-15T09:22:26.968512-08:00\ncustom:\n Author: zquresh"
},
{
"path": ".changes/1.10.1.md",
"chars": 335,
"preview": "## dbt-trino 1.10.1 - January 16, 2026\n### Dependencies\n- Bump dbt-adapters>=1.16,<2.0 ([#507](https://github.com/starbu"
},
{
"path": ".changes/header.tpl.md",
"chars": 638,
"preview": "# dbt-trino Changelog\n\n- This file provides a full account of all changes to `dbt-trino`\n- Changes are listed under the "
},
{
"path": ".changes/unreleased/.gitkeep",
"chars": 0,
"preview": ""
},
{
"path": ".changie.yaml",
"chars": 2703,
"preview": "changesDir: .changes\nunreleasedDir: unreleased\nheaderPath: header.tpl.md\nversionHeaderPath: \"\"\nchangelogPath: CHANGELOG."
},
{
"path": ".flake8",
"chars": 110,
"preview": "[flake8]\nselect =\n E\n W\n F\nignore =\n W503,\n W504,\n E203,\n E741,\n E501,\nexclude = test\n"
},
{
"path": ".github/ISSUE_TEMPLATE/bug_report.yml",
"chars": 2515,
"preview": "---\nname: Bug report\ndescription: Report a bug or an issue you've found with dbt-trino\nlabels: bug\nbody:\n - type: texta"
},
{
"path": ".github/ISSUE_TEMPLATE/config.yml",
"chars": 241,
"preview": "---\ncontact_links:\n - name: Ask a question or get help around `dbt-trino` on Slack\n url: https://getdbt.slack.com/ch"
},
{
"path": ".github/ISSUE_TEMPLATE/feature_request.yml",
"chars": 1315,
"preview": "---\nname: Feature request\ndescription: Suggest an idea for dbt-trino\nlabels: enhancement\nbody:\n - type: textarea\n at"
},
{
"path": ".github/dependabot.yml",
"chars": 352,
"preview": "version: 2\nupdates:\n # python dependencies\n - package-ecosystem: \"pip\"\n directory: \"/\"\n schedule:\n interval"
},
{
"path": ".github/pull_request_template.md",
"chars": 643,
"preview": "## Overview\n<!---\n Include the number of the issue addressed by this PR above if applicable.\n PRs for code changes wit"
},
{
"path": ".github/workflows/bot-changelog.yml",
"chars": 1694,
"preview": "# **what?**\n# When bots create a PR, this action will add a corresponding changie yaml file to that\n# PR when a specific"
},
{
"path": ".github/workflows/changelog-existence.yml",
"chars": 1396,
"preview": "\n\n# **what?**\n# Checks that a file has been committed under the /.changes directory\n# as a new CHANGELOG entry. Cannot "
},
{
"path": ".github/workflows/ci.yml",
"chars": 2549,
"preview": "name: dbt-trino tests\non:\n push:\n branches:\n - master\n - \"*.*.latest\"\n paths-ignore:\n - \"**/*.md\"\n"
},
{
"path": ".github/workflows/release.yml",
"chars": 3036,
"preview": "name: dbt-trino release\n\non:\n workflow_dispatch:\n\njobs:\n test:\n runs-on: ubuntu-latest\n steps:\n - name: Che"
},
{
"path": ".github/workflows/security.yml",
"chars": 508,
"preview": "name: Veracode SCA\n\non:\n workflow_dispatch:\n\njobs:\n veracode-sca-task:\n runs-on: ubuntu-latest\n name: Scan repos"
},
{
"path": ".github/workflows/version-bump.yml",
"chars": 3956,
"preview": "# **what?**\n# This workflow will take the new version number to bump to. With that\n# it will run versionbump to update t"
},
{
"path": ".gitignore",
"chars": 130,
"preview": "*.egg-info\nenv/\n__pycache__/\n.tox/\n.idea/\nbuild/\ndist/\ndbt-integration-tests\ndocker/dbt/.user.yml\n.DS_Store\n.vscode/\nlog"
},
{
"path": ".pre-commit-config.yaml",
"chars": 2270,
"preview": "# Configuration for pre-commit hooks (see https://pre-commit.com/).\n# Eventually the hooks described here will be run as"
},
{
"path": "CHANGELOG.md",
"chars": 2171,
"preview": "# dbt-trino Changelog\n\n- This file provides a full account of all changes to `dbt-trino`\n- Changes are listed under the "
},
{
"path": "CONTRIBUTING.md",
"chars": 8093,
"preview": "# Contributing to `dbt-trino`\n\n## Getting the code\n\n### How to contribute?\n\nYou can contribute to `dbt-trino` by forking"
},
{
"path": "LICENSE.txt",
"chars": 11350,
"preview": " Apache License\n Version 2.0, January 2004\n "
},
{
"path": "Makefile",
"chars": 450,
"preview": ".EXPORT_ALL_VARIABLES:\n\nDBT_TEST_USER_1=user1\nDBT_TEST_USER_2=user2\nDBT_TEST_USER_3=user3\n\nstart-trino:\n\tdocker network "
},
{
"path": "README.md",
"chars": 6453,
"preview": "# dbt-trino\n\n<picture>\n <source media=\"(prefers-color-scheme: dark)\" srcset=\"https://raw.githubusercontent.com/starburs"
},
{
"path": "dbt/adapters/trino/__init__.py",
"chars": 565,
"preview": "from dbt.adapters.base import AdapterPlugin\n\nfrom dbt.adapters.trino.column import TrinoColumn # noqa\nfrom dbt.adapters"
},
{
"path": "dbt/adapters/trino/__version__.py",
"chars": 19,
"preview": "version = \"1.10.1\"\n"
},
{
"path": "dbt/adapters/trino/catalogs/__init__.py",
"chars": 234,
"preview": "from dbt.adapters.trino.catalogs._relation import TrinoCatalogRelation\nfrom dbt.adapters.trino.catalogs._trino_catalog_m"
},
{
"path": "dbt/adapters/trino/catalogs/_relation.py",
"chars": 510,
"preview": "from dataclasses import dataclass\nfrom typing import Optional\n\nfrom dbt.adapters.catalogs import CatalogRelation\n\nfrom d"
},
{
"path": "dbt/adapters/trino/catalogs/_trino_catalog_metastore.py",
"chars": 2972,
"preview": "from typing import Optional\n\nfrom dbt.adapters.catalogs import CatalogIntegration, CatalogIntegrationConfig\nfrom dbt.ada"
},
{
"path": "dbt/adapters/trino/column.py",
"chars": 3879,
"preview": "import re\nfrom dataclasses import dataclass\nfrom typing import ClassVar, Dict\n\nfrom dbt.adapters.base.column import Colu"
},
{
"path": "dbt/adapters/trino/connections.py",
"chars": 18659,
"preview": "import decimal\nimport os\nimport re\nfrom abc import ABCMeta, abstractmethod\nfrom contextlib import contextmanager\nfrom da"
},
{
"path": "dbt/adapters/trino/constants.py",
"chars": 316,
"preview": "from types import SimpleNamespace\n\nADAPTER_TYPE = \"trino\"\n\nTRINO_CATALOG_TYPE = \"trino\"\n\nDEFAULT_TRINO_CATALOG = SimpleN"
},
{
"path": "dbt/adapters/trino/impl.py",
"chars": 5036,
"preview": "from dataclasses import dataclass\nfrom typing import Dict, List, Optional\n\nimport agate\nfrom dbt.adapters.base.impl impo"
},
{
"path": "dbt/adapters/trino/parse_model.py",
"chars": 559,
"preview": "from typing import Optional\n\nfrom dbt.adapters.catalogs import CATALOG_INTEGRATION_MODEL_CONFIG_NAME # type: ignore\nfro"
},
{
"path": "dbt/adapters/trino/relation.py",
"chars": 1358,
"preview": "from dataclasses import dataclass, field\n\nfrom dbt.adapters.base.relation import BaseRelation, EventTimeFilter, Policy\nf"
},
{
"path": "dbt/include/trino/__init__.py",
"chars": 52,
"preview": "import os\n\nPACKAGE_PATH = os.path.dirname(__file__)\n"
},
{
"path": "dbt/include/trino/dbt_project.yml",
"chars": 72,
"preview": "name: dbt_trino\nversion: 1.0\nconfig-version: 2\n\nmacro-paths: [\"macros\"]\n"
},
{
"path": "dbt/include/trino/macros/adapters.sql",
"chars": 13590,
"preview": "\n-- - get_catalog\n-- - list_relations_without_caching\n-- - get_columns_in_relation\n\n{% macro trino__get_columns_in_relat"
},
{
"path": "dbt/include/trino/macros/apply_grants.sql",
"chars": 1785,
"preview": "{% macro trino__get_show_grant_sql(relation) -%}\n select\n grantee,\n lower(privilege_type) as privilege_"
},
{
"path": "dbt/include/trino/macros/catalog.sql",
"chars": 5484,
"preview": "{% macro trino__get_catalog(information_schema, schemas) -%}\n\n {% set query %}\n with tables as (\n {"
},
{
"path": "dbt/include/trino/macros/materializations/incremental.sql",
"chars": 11255,
"preview": "{% macro get_incremental_tmp_relation_type(strategy, unique_key, language) %}\n\n /* {#\n If we are running multiple"
},
{
"path": "dbt/include/trino/macros/materializations/materialized_view.sql",
"chars": 1487,
"preview": "{%- macro trino__get_create_materialized_view_as_sql(target_relation, sql) -%}\n create materialized view {{ target_rela"
},
{
"path": "dbt/include/trino/macros/materializations/seeds/helpers.sql",
"chars": 3547,
"preview": "{% macro trino__get_batch_size() %}\n {{ return(1000) }}\n{% endmacro %}\n\n\n{% macro create_bindings(row, types) %}\n {% s"
},
{
"path": "dbt/include/trino/macros/materializations/snapshot.sql",
"chars": 2023,
"preview": "{% materialization snapshot, adapter='trino' %}\n {% if config.get('properties') %}\n {% if config.get('properti"
},
{
"path": "dbt/include/trino/macros/materializations/table.sql",
"chars": 3985,
"preview": "{% materialization table, adapter = 'trino' %}\n {%- set on_table_exists = config.get('on_table_exists', 'rename') -%}\n "
},
{
"path": "dbt/include/trino/macros/materializations/view.sql",
"chars": 267,
"preview": "{% materialization view, adapter='trino' -%}\n {% set to_return = create_or_replace_view() %}\n {% set target_relati"
},
{
"path": "dbt/include/trino/macros/utils/any_value.sql",
"chars": 84,
"preview": "{% macro trino__any_value(expression) -%}\n min({{ expression }})\n{%- endmacro %}\n"
},
{
"path": "dbt/include/trino/macros/utils/array_append.sql",
"chars": 131,
"preview": "{% macro trino__array_append(array, new_element) -%}\n {{ array_concat(array, array_construct([new_element])) }}\n{%- e"
},
{
"path": "dbt/include/trino/macros/utils/array_concat.sql",
"chars": 108,
"preview": "{% macro trino__array_concat(array_1, array_2) -%}\n concat({{ array_1 }}, {{ array_2 }})\n{%- endmacro %}\n"
},
{
"path": "dbt/include/trino/macros/utils/array_construct.sql",
"chars": 179,
"preview": "{% macro trino__array_construct(inputs, data_type) -%}\n {%- if not inputs -%}\n null\n {%- else -%}\n array[ {{"
},
{
"path": "dbt/include/trino/macros/utils/bool_or.sql",
"chars": 86,
"preview": "{% macro trino__bool_or(expression) -%}\n bool_or({{ expression }})\n{%- endmacro %}\n"
},
{
"path": "dbt/include/trino/macros/utils/datatypes.sql",
"chars": 256,
"preview": "{% macro trino__type_float() -%}\n double\n{%- endmacro %}\n\n{% macro trino__type_string() -%}\n varchar\n{%- endmacro "
},
{
"path": "dbt/include/trino/macros/utils/date_spine.sql",
"chars": 856,
"preview": "{% macro trino__date_spine(datepart, start_date, end_date) %}\n\n\n {# call as follows:\n\n date_spine(\n \"day\",\n"
},
{
"path": "dbt/include/trino/macros/utils/date_trunc.sql",
"chars": 104,
"preview": "{% macro trino__date_trunc(datepart, date) -%}\n date_trunc('{{datepart}}', {{date}})\n{%- endmacro %}\n"
},
{
"path": "dbt/include/trino/macros/utils/dateadd.sql",
"chars": 165,
"preview": "{% macro trino__dateadd(datepart, interval, from_date_or_timestamp) -%}\n date_add('{{ datepart }}', {{ interval }}, {"
},
{
"path": "dbt/include/trino/macros/utils/datediff.sql",
"chars": 2080,
"preview": "{% macro trino__datediff(first_date, second_date, datepart) -%}\n {%- if datepart == 'year' -%}\n (year(CAST({{ "
},
{
"path": "dbt/include/trino/macros/utils/hash.sql",
"chars": 108,
"preview": "{% macro trino__hash(field) -%}\n lower(to_hex(md5(to_utf8(cast({{field}} as varchar)))))\n{%- endmacro %}\n"
},
{
"path": "dbt/include/trino/macros/utils/listagg.sql",
"chars": 477,
"preview": "{% macro trino__listagg(measure, delimiter_text, order_by_clause, limit_num) -%}\n {% set collect_list %} array_agg({{"
},
{
"path": "dbt/include/trino/macros/utils/right.sql",
"chars": 215,
"preview": "{% macro trino__right(string_text, length_expression) %}\n case when {{ length_expression }} = 0\n then ''\n e"
},
{
"path": "dbt/include/trino/macros/utils/safe_cast.sql",
"chars": 95,
"preview": "{% macro trino__safe_cast(field, type) -%}\n try_cast({{field}} as {{type}})\n{%- endmacro %}\n"
},
{
"path": "dbt/include/trino/macros/utils/split_part.sql",
"chars": 295,
"preview": "{% macro trino__split_part(string_text, delimiter_text, part_number) %}\n {% if part_number >= 0 %}\n {{ dbt.default__"
},
{
"path": "dbt/include/trino/macros/utils/timestamps.sql",
"chars": 230,
"preview": "{% macro trino__current_timestamp() -%}\n current_timestamp\n{%- endmacro %}\n\n{% macro trino__snapshot_string_as_time(t"
},
{
"path": "dbt/include/trino/sample_profiles.yml",
"chars": 681,
"preview": "default:\n outputs:\n\n dev:\n type: trino\n method: none # optional, one of {none | ldap | kerberos}\n us"
},
{
"path": "dev_requirements.txt",
"chars": 140,
"preview": "dbt-tests-adapter~=1.19.1\nmypy==1.19.1 # patch updates have historically introduced breaking changes\npre-commit~=4.3\npy"
},
{
"path": "docker/init_starburst.bash",
"chars": 391,
"preview": "#!/bin/bash\n\n# move to wherever we are so docker things work\ncd \"$(dirname \"${BASH_SOURCE[0]}\")\"\ncd ..\n\nset -exo pipefai"
},
{
"path": "docker/init_trino.bash",
"chars": 379,
"preview": "#!/bin/bash\n\n# move to wherever we are so docker things work\ncd \"$(dirname \"${BASH_SOURCE[0]}\")\"\ncd ..\n\nset -exo pipefai"
},
{
"path": "docker/remove_starburst.bash",
"chars": 155,
"preview": "#!/bin/bash\n\n# move to wherever we are so docker things work\ncd \"$(dirname \"${BASH_SOURCE[0]}\")\"\ncd ..\ndocker compose -f"
},
{
"path": "docker/remove_trino.bash",
"chars": 151,
"preview": "#!/bin/bash\n\n# move to wherever we are so docker things work\ncd \"$(dirname \"${BASH_SOURCE[0]}\")\"\ncd ..\ndocker compose -f"
},
{
"path": "docker/starburst/catalog/delta.properties",
"chars": 355,
"preview": "connector.name=delta-lake\ndelta.enable-non-concurrent-writes=true\nfs.native-s3.enabled=true\ns3.region=us-east-1\ns3.endpo"
},
{
"path": "docker/starburst/catalog/hive.properties",
"chars": 311,
"preview": "connector.name=hive\nhive.metastore.uri=thrift://hive-metastore:9083\nfs.native-s3.enabled=true\ns3.region=us-east-1\ns3.end"
},
{
"path": "docker/starburst/catalog/iceberg.properties",
"chars": 322,
"preview": "connector.name=iceberg\nhive.metastore.uri=thrift://hive-metastore:9083\nfs.native-s3.enabled=true\ns3.region=us-east-1\ns3."
},
{
"path": "docker/starburst/catalog/memory.properties",
"chars": 53,
"preview": "connector.name=memory\nmemory.max-data-per-node=128MB\n"
},
{
"path": "docker/starburst/catalog/postgresql.properties",
"chars": 139,
"preview": "connector.name=postgresql\nconnection-url=jdbc:postgresql://postgres:5432/dbt-trino\nconnection-user=dbt-trino\nconnection-"
},
{
"path": "docker/starburst/catalog/tpch.properties",
"chars": 20,
"preview": "connector.name=tpch\n"
},
{
"path": "docker/starburst/etc/config.properties",
"chars": 120,
"preview": "coordinator=true\nnode-scheduler.include-coordinator=true\nhttp-server.http.port=8080\ndiscovery.uri=http://localhost:8080\n"
},
{
"path": "docker/starburst/etc/jvm.config",
"chars": 392,
"preview": "-server\n-XX:InitialRAMPercentage=80\n-XX:MaxRAMPercentage=80\n-XX:G1HeapRegionSize=32M\n-XX:+ExplicitGCInvokesConcurrent\n-X"
},
{
"path": "docker/starburst/etc/node.properties",
"chars": 54,
"preview": "node.environment=docker\nnode.data-dir=/data/starburst\n"
},
{
"path": "docker/trino/catalog/delta.properties",
"chars": 330,
"preview": "connector.name=delta-lake\ndelta.enable-non-concurrent-writes=true\nfs.native-s3.enabled=true\ns3.region=us-east-1\ns3.endpo"
},
{
"path": "docker/trino/catalog/hive.properties",
"chars": 311,
"preview": "connector.name=hive\nhive.metastore.uri=thrift://hive-metastore:9083\nfs.native-s3.enabled=true\ns3.region=us-east-1\ns3.end"
},
{
"path": "docker/trino/catalog/iceberg.properties",
"chars": 287,
"preview": "connector.name=iceberg\nhive.metastore.uri=thrift://hive-metastore:9083\nfs.native-s3.enabled=true\ns3.region=us-east-1\ns3."
},
{
"path": "docker/trino/catalog/memory.properties",
"chars": 53,
"preview": "connector.name=memory\nmemory.max-data-per-node=128MB\n"
},
{
"path": "docker/trino/catalog/postgresql.properties",
"chars": 139,
"preview": "connector.name=postgresql\nconnection-url=jdbc:postgresql://postgres:5432/dbt-trino\nconnection-user=dbt-trino\nconnection-"
},
{
"path": "docker/trino/catalog/tpch.properties",
"chars": 20,
"preview": "connector.name=tpch\n"
},
{
"path": "docker/trino/etc/config.properties",
"chars": 120,
"preview": "coordinator=true\nnode-scheduler.include-coordinator=true\nhttp-server.http.port=8080\ndiscovery.uri=http://localhost:8080\n"
},
{
"path": "docker/trino/etc/jvm.config",
"chars": 392,
"preview": "-server\n-XX:InitialRAMPercentage=80\n-XX:MaxRAMPercentage=80\n-XX:G1HeapRegionSize=32M\n-XX:+ExplicitGCInvokesConcurrent\n-X"
},
{
"path": "docker/trino/etc/node.properties",
"chars": 50,
"preview": "node.environment=docker\nnode.data-dir=/data/trino\n"
},
{
"path": "docker-compose-starburst.yml",
"chars": 2377,
"preview": "services:\n trino:\n ports:\n - \"8080:8080\"\n image: \"starburstdata/starburst-enterprise:477-e.1\"\n volumes:\n "
},
{
"path": "docker-compose-trino.yml",
"chars": 2315,
"preview": "services:\n trino:\n ports:\n - \"8080:8080\"\n image: \"trinodb/trino:478\"\n volumes:\n - ./docker/trino/etc"
},
{
"path": "mypy.ini",
"chars": 63,
"preview": "[mypy]\nnamespace_packages = True\nexplicit_package_bases = True\n"
},
{
"path": "pytest.ini",
"chars": 312,
"preview": "[pytest]\nfilterwarnings =\n ignore:.*'soft_unicode' has been renamed to 'soft_str'*:DeprecationWarning\n ignore:uncl"
},
{
"path": "setup.py",
"chars": 3401,
"preview": "#!/usr/bin/env python\nimport os\nimport re\nimport sys\n\n# require python 3.9 or newer\nif sys.version_info < (3, 9):\n pr"
},
{
"path": "tests/conftest.py",
"chars": 4590,
"preview": "import os\n\nimport pytest\nimport trino\n\n# Import the functional fixtures as a plugin\n# Note: fixtures with session scope "
},
{
"path": "tests/functional/adapter/behavior_flags/test_require_certificate_validation.py",
"chars": 3348,
"preview": "import warnings\n\nimport pytest\nfrom dbt.tests.util import run_dbt, run_dbt_and_capture\nfrom urllib3.exceptions import In"
},
{
"path": "tests/functional/adapter/catalog_integrations/fixtures.py",
"chars": 1797,
"preview": "MODEL_WITHOUT_CATALOG = \"\"\"\n{{ config(\n materialized='table',\n) }}\n\nselect 1 as id, 'test' as name\n\"\"\"\n\nMODEL_WITH_CA"
},
{
"path": "tests/functional/adapter/catalog_integrations/test_catalog_integration.py",
"chars": 8426,
"preview": "import pytest\nfrom dbt.tests.adapter.catalog_integrations.test_catalog_integration import (\n BaseCatalogIntegrationVa"
},
{
"path": "tests/functional/adapter/column_types/fixtures.py",
"chars": 1089,
"preview": "model_sql = \"\"\"\nselect\n cast(0 as tinyint) as tinyint_col,\n cast(1 as smallint) as smallint_col,\n cast(2 as int"
},
{
"path": "tests/functional/adapter/column_types/test_column_types.py",
"chars": 414,
"preview": "import pytest\nfrom dbt.tests.adapter.column_types.test_column_types import BaseColumnTypes\n\nfrom tests.functional.adapte"
},
{
"path": "tests/functional/adapter/constraints/fixtures.py",
"chars": 3667,
"preview": "trino_model_contract_sql_header_sql = \"\"\"\n{{\n config(\n materialized = \"table\"\n )\n}}\n\n{% call set_sql_header(config)"
},
{
"path": "tests/functional/adapter/constraints/test_constraints.py",
"chars": 7517,
"preview": "import pytest\nfrom dbt.tests.adapter.constraints.fixtures import (\n my_incremental_model_sql,\n my_model_incrementa"
},
{
"path": "tests/functional/adapter/dbt_clone/test_dbt_clone.py",
"chars": 1421,
"preview": "import pytest\nfrom dbt.tests.adapter.dbt_clone.fixtures import (\n custom_can_clone_tables_false_macros_sql,\n get_s"
},
{
"path": "tests/functional/adapter/dbt_debug/test_dbt_debug.py",
"chars": 1055,
"preview": "import pytest\nfrom dbt.tests.adapter.dbt_debug.test_dbt_debug import (\n BaseDebug,\n BaseDebugProfileVariable,\n)\nfr"
},
{
"path": "tests/functional/adapter/dbt_show/test_dbt_show.py",
"chars": 198,
"preview": "from dbt.tests.adapter.dbt_show.test_dbt_show import BaseShowLimit, BaseShowSqlHeader\n\n\nclass TestTrinoShowSqlHeader(Bas"
},
{
"path": "tests/functional/adapter/empty/test_empty.py",
"chars": 230,
"preview": "from dbt.tests.adapter.empty.test_empty import (\n BaseTestEmpty,\n BaseTestEmptyInlineSourceRef,\n)\n\n\nclass TestTrin"
},
{
"path": "tests/functional/adapter/fixture_datediff.py",
"chars": 2646,
"preview": "seeds__data_datediff_csv = \"\"\"first_date,second_date,datepart,result\n2018-01-01 01:00:00,2018-01-02 01:00:00,day,1\n2018-"
},
{
"path": "tests/functional/adapter/hooks/data/seed_model.sql",
"chars": 459,
"preview": "drop table if exists {schema}.on_model_hook;\n\ncreate table {schema}.on_model_hook (\n test_state VARCHAR, -- sta"
},
{
"path": "tests/functional/adapter/hooks/data/seed_run.sql",
"chars": 455,
"preview": "drop table if exists {schema}.on_run_hook;\n\ncreate table {schema}.on_run_hook (\n test_state VARCHAR, -- start|e"
},
{
"path": "tests/functional/adapter/hooks/test_hooks_delete.py",
"chars": 2184,
"preview": "# Test hooks with DELETE statement\nimport pytest\nfrom dbt.tests.util import run_dbt, run_sql_with_adapter\n\nseed = \"\"\"\nid"
},
{
"path": "tests/functional/adapter/hooks/test_model_hooks.py",
"chars": 1306,
"preview": "import pytest\nfrom dbt.tests.adapter.hooks import test_model_hooks as core_base\n\n\nclass TestTrinoPrePostModelHooks(core_"
},
{
"path": "tests/functional/adapter/hooks/test_run_hooks.py",
"chars": 2608,
"preview": "import pytest\nfrom dbt.tests.adapter.hooks.test_run_hooks import (\n BaseAfterRunHooks,\n BasePrePostRunHooks,\n)\n\n\nc"
},
{
"path": "tests/functional/adapter/materialization/fixtures.py",
"chars": 10762,
"preview": "seed_csv = \"\"\"\nid,name,some_date\n1,Easton,1981-05-20 06:46:51\n2,Lillian,1978-09-03 18:10:33\n3,Jeremiah,1982-03-11 03:59:"
},
{
"path": "tests/functional/adapter/materialization/test_incremental_delete_insert.py",
"chars": 8859,
"preview": "import pytest\nfrom dbt.tests.adapter.incremental.test_incremental_predicates import (\n BaseIncrementalPredicates,\n "
},
{
"path": "tests/functional/adapter/materialization/test_incremental_merge.py",
"chars": 4934,
"preview": "import pytest\nfrom dbt.tests.adapter.incremental.test_incremental_unique_id import (\n BaseIncrementalUniqueKey,\n m"
},
{
"path": "tests/functional/adapter/materialization/test_incremental_microbatch.py",
"chars": 181,
"preview": "import pytest\nfrom dbt.tests.adapter.incremental.test_incremental_microbatch import BaseMicrobatch\n\n\n@pytest.mark.iceber"
},
{
"path": "tests/functional/adapter/materialization/test_incremental_predicates.py",
"chars": 2037,
"preview": "import pytest\nfrom dbt.tests.adapter.incremental.test_incremental_predicates import (\n BaseIncrementalPredicates,\n)\n\n"
},
{
"path": "tests/functional/adapter/materialization/test_incremental_schema.py",
"chars": 13287,
"preview": "import pytest\nfrom dbt.tests.util import check_relations_equal, run_dbt\n\nfrom tests.functional.adapter.materialization.f"
},
{
"path": "tests/functional/adapter/materialization/test_incremental_views_enabled.py",
"chars": 2720,
"preview": "import pytest\nfrom dbt.tests.util import run_dbt, run_dbt_and_capture\n\nfrom tests.functional.adapter.materialization.fix"
},
{
"path": "tests/functional/adapter/materialization/test_materialized_view.py",
"chars": 10169,
"preview": "import pytest\nfrom dbt.tests.util import (\n check_relation_types,\n check_relations_equal,\n run_dbt,\n run_dbt"
},
{
"path": "tests/functional/adapter/materialization/test_on_table_exists.py",
"chars": 13528,
"preview": "import pytest\nfrom dbt.tests.util import check_relations_equal, run_dbt, run_dbt_and_capture\n\nfrom tests.functional.adap"
},
{
"path": "tests/functional/adapter/materialization/test_prepared_statements.py",
"chars": 2864,
"preview": "import pytest\nfrom dbt.tests.util import check_relations_equal, run_dbt\n\nfrom tests.functional.adapter.materialization.f"
},
{
"path": "tests/functional/adapter/materialization/test_snapshot.py",
"chars": 5640,
"preview": "import pytest\nfrom dbt.tests.adapter.basic.test_snapshot_check_cols import BaseSnapshotCheckCols\nfrom dbt.tests.adapter."
},
{
"path": "tests/functional/adapter/materialization/test_view_security.py",
"chars": 1748,
"preview": "import pytest\nfrom dbt.tests.util import check_relations_equal, run_dbt\n\nfrom tests.functional.adapter.materialization.f"
},
{
"path": "tests/functional/adapter/materialized_view_tests/test_materialized_view_dbt_core.py",
"chars": 2617,
"preview": "from typing import Optional, Tuple\n\nimport pytest\nfrom dbt.adapters.base.relation import BaseRelation\nfrom dbt.tests.ada"
},
{
"path": "tests/functional/adapter/materialized_view_tests/utils.py",
"chars": 1261,
"preview": "from typing import Optional\n\nfrom dbt.adapters.base.relation import BaseRelation\n\nfrom dbt.adapters.trino.relation impor"
},
{
"path": "tests/functional/adapter/persist_docs/fixtures.py",
"chars": 4818,
"preview": "seed_csv = \"\"\"\nid,name,date\n1,Easton,1981-05-20 06:46:51\n2,Lillian,1978-09-03 18:10:33\n3,Jeremiah,1982-03-11 03:59:51\n4,"
},
{
"path": "tests/functional/adapter/persist_docs/test_persist_docs.py",
"chars": 6342,
"preview": "import pytest\nfrom dbt.tests.adapter.persist_docs.test_persist_docs import (\n BasePersistDocs,\n BasePersistDocsBas"
},
{
"path": "tests/functional/adapter/show/fixtures.py",
"chars": 1571,
"preview": "models__sample_model = \"\"\"\nselect * from {{ ref('sample_seed') }}\n\"\"\"\n\nmodels__second_model = \"\"\"\nselect\n sample_num "
},
{
"path": "tests/functional/adapter/show/test_show.py",
"chars": 5769,
"preview": "import pytest\nfrom dbt.tests.util import run_dbt, run_dbt_and_capture\nfrom dbt_common.exceptions import DbtBaseException"
},
{
"path": "tests/functional/adapter/simple_seed/seed_bom.csv",
"chars": 33346,
"preview": "seed_id,first_name,email,ip_address,birthday\n1,Larry,lking0@miitbeian.gov.cn,69.135.206.194,2008-09-12 19:08:31\n2,Larry"
},
{
"path": "tests/functional/adapter/simple_seed/seeds.py",
"chars": 46151,
"preview": "trino_seeds__expected_sql_create_table = \"\"\"\ncreate table {schema}.seed_expected (\nseed_id INTEGER,\nfirst_name VARCHAR,\n"
},
{
"path": "tests/functional/adapter/simple_seed/test_seed.py",
"chars": 3919,
"preview": "from pathlib import Path\n\nimport pytest\nfrom dbt.tests.adapter.simple_seed.test_seed import (\n TestBasicSeedTests as "
},
{
"path": "tests/functional/adapter/store_failures/fixtures.py",
"chars": 606,
"preview": "seed_csv = \"\"\"\nid,value\n1,1\n2,2\n3,3\n4,4\n\"\"\".lstrip()\n\ntable_model = \"\"\"\nselect * from {{ ref('seed') }}\n\"\"\"\n\ntable_profi"
},
{
"path": "tests/functional/adapter/store_failures/test_store_failures.py",
"chars": 2764,
"preview": "import pytest\nfrom dbt.tests.adapter.store_test_failures_tests import basic\nfrom dbt.tests.adapter.store_test_failures_t"
},
{
"path": "tests/functional/adapter/test_basic.py",
"chars": 6829,
"preview": "import pytest\nfrom dbt.tests.adapter.basic.expected_catalog import base_expected_catalog, no_stats\nfrom dbt.tests.adapte"
},
{
"path": "tests/functional/adapter/test_caching.py",
"chars": 373,
"preview": "from dbt.tests.adapter.caching.test_caching import (\n BaseCachingLowercaseModel,\n BaseCachingSelectedSchemaOnly,\n "
},
{
"path": "tests/functional/adapter/test_changing_relation_type.py",
"chars": 189,
"preview": "from dbt.tests.adapter.relations.test_changing_relation_type import (\n BaseChangeRelationTypeValidator,\n)\n\n\nclass Tes"
},
{
"path": "tests/functional/adapter/test_concurrency.py",
"chars": 1179,
"preview": "from dbt.tests.adapter.concurrency.test_concurrency import (\n BaseConcurrency,\n seeds__update_csv,\n)\nfrom dbt.test"
},
{
"path": "tests/functional/adapter/test_custom_schema.py",
"chars": 3672,
"preview": "from abc import ABC, abstractmethod\n\nimport pytest\nfrom dbt.tests.util import run_dbt, run_sql_with_adapter\n\nseed_csv = "
},
{
"path": "tests/functional/adapter/test_ephemeral.py",
"chars": 768,
"preview": "from dbt.tests.adapter.ephemeral.test_ephemeral import (\n BaseEphemeralErrorHandling,\n BaseEphemeralMulti,\n Bas"
},
{
"path": "tests/functional/adapter/test_get_incremental_tmp_relation_type_macro.py",
"chars": 3077,
"preview": "from abc import ABC, abstractmethod\n\nimport pytest\nfrom dbt.tests.util import run_dbt, run_sql_with_adapter\n\n\nclass Cust"
},
{
"path": "tests/functional/adapter/test_grants.py",
"chars": 1841,
"preview": "import pytest\nfrom dbt.context.base import BaseContext # diff_of_two_dicts only\nfrom dbt.tests.adapter.grants.test_inva"
},
{
"path": "tests/functional/adapter/test_query_comments.py",
"chars": 670,
"preview": "from dbt.tests.adapter.query_comment.test_query_comment import (\n BaseEmptyQueryComments,\n BaseMacroArgsQueryComme"
},
{
"path": "tests/functional/adapter/test_quote_policy.py",
"chars": 254,
"preview": "import pytest\n\nfrom tests.functional.adapter.test_basic import TestIncrementalTrino\n\n\n@pytest.fixture(scope=\"class\")\ndef"
},
{
"path": "tests/functional/adapter/test_sample_mode.py",
"chars": 136,
"preview": "from dbt.tests.adapter.sample_mode.test_sample_mode import BaseSampleModeTest\n\n\nclass TestTrinoSampleMode(BaseSampleMode"
},
{
"path": "tests/functional/adapter/test_seeds_column_types_overrides.py",
"chars": 5885,
"preview": "import pytest\nfrom dbt.tests.util import get_connection, relation_from_name, run_dbt\n\nboolean_type = \"\"\"\nboolean_example"
},
{
"path": "tests/functional/adapter/test_session_property.py",
"chars": 1219,
"preview": "import pytest\nfrom dbt.tests.util import run_dbt\n\nset_session_property = \"set session query_max_run_time='20s'\"\n\n\nclass "
},
{
"path": "tests/functional/adapter/test_simple_copy.py",
"chars": 2491,
"preview": "import pytest\nfrom dbt.tests.adapter.simple_copy.test_simple_copy import (\n EmptyModelsArentRunBase,\n SimpleCopyBa"
},
{
"path": "tests/functional/adapter/test_simple_snapshot.py",
"chars": 3182,
"preview": "import pytest\nfrom dbt.tests.adapter.simple_snapshot.test_snapshot import (\n BaseSimpleSnapshot,\n BaseSnapshotChec"
},
{
"path": "tests/functional/adapter/test_sql_status_output.py",
"chars": 1664,
"preview": "import pytest\nfrom dbt.tests.util import run_dbt, run_dbt_and_capture\n\nseed_csv = \"\"\"\nid,name,some_date\n1,Easton,1981-05"
},
{
"path": "tests/functional/adapter/test_table_properties.py",
"chars": 5280,
"preview": "import pytest\nfrom dbt.tests.util import run_dbt, run_dbt_and_capture\n\nfrom tests.functional.adapter.materialization.fix"
},
{
"path": "tests/functional/adapter/unit_testing/test_unit_testing.py",
"chars": 1733,
"preview": "import pytest\nfrom dbt.tests.adapter.unit_testing.test_case_insensitivity import (\n BaseUnitTestCaseInsensivity,\n)\nfr"
},
{
"path": "tests/functional/adapter/utils/fixture_date_spine.py",
"chars": 1028,
"preview": "# If date_spine works properly, there should be no `null` values in the resulting model\n\nmodels__trino_test_date_spine_s"
},
{
"path": "tests/functional/adapter/utils/fixture_get_intervals_between.py",
"chars": 163,
"preview": "models__trino_test_get_intervals_between_sql = \"\"\"\nSELECT\n {{ get_intervals_between(\"'2023-09-01'\", \"'2023-09-12'\", \"da"
},
{
"path": "tests/functional/adapter/utils/test_data_types.py",
"chars": 1095,
"preview": "import pytest\nfrom dbt.tests.adapter.utils.data_types.test_type_bigint import BaseTypeBigInt\nfrom dbt.tests.adapter.util"
},
{
"path": "tests/functional/adapter/utils/test_date_spine.py",
"chars": 637,
"preview": "import pytest\nfrom dbt.tests.adapter.utils.base_utils import BaseUtils\nfrom dbt.tests.adapter.utils.fixture_date_spine i"
},
{
"path": "tests/functional/adapter/utils/test_get_intervals_between.py",
"chars": 775,
"preview": "import pytest\nfrom dbt.tests.adapter.utils.base_utils import BaseUtils\nfrom dbt.tests.adapter.utils.fixture_get_interval"
},
{
"path": "tests/functional/adapter/utils/test_timestamps.py",
"chars": 593,
"preview": "import pytest\nfrom dbt.tests.adapter.utils.test_timestamps import BaseCurrentTimestamps\n\n\nclass TestCurrentTimestampTrin"
},
{
"path": "tests/functional/adapter/utils/test_utils.py",
"chars": 5790,
"preview": "import pytest\nfrom dbt.tests.adapter.utils.fixture_datediff import models__test_datediff_yml\nfrom dbt.tests.adapter.util"
},
{
"path": "tests/unit/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "tests/unit/test_adapter.py",
"chars": 25176,
"preview": "import string\nimport unittest\nfrom multiprocessing import get_context\nfrom unittest import TestCase\nfrom unittest.mock i"
},
{
"path": "tests/unit/utils.py",
"chars": 5397,
"preview": "\"\"\"Unit test utility functions.\n\nNote that all imports should be inside the functions to avoid import/mocking\nissues.\n\"\""
},
{
"path": "tox.ini",
"chars": 563,
"preview": "[tox]\nskipsdist = True\nenvlist = unit, integration\n\n[testenv:unit]\ndescription = unit testing\nbasepython = python3\ncomma"
}
]
About this extraction
This page contains the full source code of the starburstdata/dbt-trino GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 164 files (451.5 KB), approximately 132.4k tokens, and a symbol index with 694 extracted functions, classes, methods, constants, and types.
Extracted by GitExtract. Built by Nikandr Surkov.
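The file index above is a JSON array of `{"path", "chars", "preview"}` objects. A minimal sketch of how a consumer script might load and summarize such an index — the two sample entries are copied from the listing, while the variable names and the idea of feeding it a string literal (rather than the downloaded file) are illustrative assumptions:

```python
import json

# Two entries copied from the index above, in the same
# {"path", "chars", "preview"} shape the extraction uses. In practice
# the whole array would be read from the downloaded extraction file.
raw = """[
  {"path": "dbt/include/trino/macros/utils/hash.sql", "chars": 108,
   "preview": "{% macro trino__hash(field) -%} ..."},
  {"path": "docker/trino/catalog/tpch.properties", "chars": 20,
   "preview": "connector.name=tpch"}
]"""

index = json.loads(raw)

# Filter to adapter macro files and sum the per-file character counts.
macros = [e for e in index if e["path"].startswith("dbt/include/trino/macros/")]
total_chars = sum(e["chars"] for e in index)
print(len(macros), total_chars)  # → 1 128
```

Because `chars` is the pre-truncation size of each file, summing it over the whole array reproduces the total repository size reported in the extraction header.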