master merge for 0.4.1 release #849

Merged: 69 commits, Dec 23, 2023
582448c
bumps devel to pre-release 0.4.1a0
rudolfix Nov 18, 2023
105795c
Parametrized destinations (#746)
steinitzu Nov 18, 2023
1d4995d
start implementation of platform trace support
sh-rp Oct 30, 2023
2a6bcb1
add beacon integration
sh-rp Oct 30, 2023
fea80fa
small changes
sh-rp Nov 1, 2023
f08b795
fix tests
sh-rp Nov 1, 2023
dc81f05
add sending on threads to platform connection
sh-rp Nov 20, 2023
d310949
fix config test
sh-rp Nov 20, 2023
5758e05
revert to adding tracestep on end tracestep
sh-rp Nov 20, 2023
03fcc21
add test for platform connection
sh-rp Nov 20, 2023
02bac98
schema contract (#594)
sh-rp Nov 21, 2023
3fb2c30
trigger devel docs deploy
sh-rp Nov 21, 2023
4b9adae
pr fixes
sh-rp Nov 21, 2023
5484e22
Merge branch 'devel' into d#/platform_connection
sh-rp Nov 21, 2023
4194284
fix tests
sh-rp Nov 21, 2023
1770d0d
fix linting
sh-rp Nov 21, 2023
28dbba6
source and schema changes (#769)
sh-rp Nov 21, 2023
83279f2
add configurable docusaurus root url
sh-rp Nov 22, 2023
bd3461d
add current version as tag to docusaurus
sh-rp Nov 22, 2023
35e6416
add black and makefile commands
sh-rp Aug 24, 2023
57e5b22
post rebase lockfile update
sh-rp Nov 22, 2023
eba666e
disable isort
sh-rp Nov 22, 2023
75373f7
update flake8 config
sh-rp Nov 22, 2023
c3ddbaa
format all files
sh-rp Nov 22, 2023
a2bb35d
fix snippets linting
sh-rp Nov 22, 2023
f9b4657
add second docs link to header
sh-rp Nov 22, 2023
49978c5
exclude venv from formatting
sh-rp Nov 23, 2023
2a0e008
fix one lint error
sh-rp Nov 23, 2023
e14461c
Merge pull request #583 from dlt-hub/d#/formatting
sh-rp Nov 23, 2023
3d35f9b
Create .git-blame-ignore-revs
sh-rp Nov 23, 2023
029d9bd
Merge branch 'devel' into d#/docs_tests
sh-rp Nov 23, 2023
2d280e1
Merge branch 'devel' into d#/platform_connection
sh-rp Nov 23, 2023
d4adf68
post formatting docs updates
sh-rp Nov 23, 2023
765e72f
add top bar on devel and stable / devel switch link
sh-rp Nov 23, 2023
02d52a3
fix switch links
sh-rp Nov 23, 2023
01998ef
Merge pull request #784 from dlt-hub/d#/docs_tests
sh-rp Nov 23, 2023
d453db3
small pr changes
sh-rp Nov 23, 2023
0601d14
improves data contract docs (#782)
rudolfix Nov 23, 2023
e48813f
fix: ensure accessor typing does not make static type checker error (…
z3z1ma Nov 23, 2023
cfb6e66
Merge pull request #727 from dlt-hub/d#/platform_connection
sh-rp Nov 24, 2023
1f94a3b
Rfix/load package extract (#790)
rudolfix Nov 29, 2023
fcb7d5e
destination config updates (#783)
sh-rp Dec 1, 2023
199e0c2
declares extract, normalize and extract step info and WithStateInfo
rudolfix Dec 3, 2023
1c0c60e
converts extract to class, implements extract step info
rudolfix Dec 3, 2023
690f8a7
implements WithStateInfo in normalize
rudolfix Dec 3, 2023
ee70217
implements WithStateInfo in load
rudolfix Dec 3, 2023
c6b1916
changes Pipeline class to use WithStateInfo and extract as class
rudolfix Dec 3, 2023
88ba90f
Merge pull request #801 from dlt-hub/rfix/step-info-refactor
sh-rp Dec 5, 2023
eea7d6d
adds utils to generate exception traces
rudolfix Dec 5, 2023
7593676
adds extended exception trace to step trace
rudolfix Dec 5, 2023
f60b935
skips config location checks for unknown modules
rudolfix Dec 6, 2023
df39b72
Merge pull request #806 from dlt-hub/rfix/adds-exception-traces
sh-rp Dec 6, 2023
5102f77
bumps to alpha 0.4.1a1
rudolfix Dec 7, 2023
6ba1f90
fix documentation typos (#820)
IlyaFaer Dec 13, 2023
dfc9c05
fixed attribute check: getuid -> geteuid (#823)
jorritsandbrink Dec 14, 2023
00c2725
allows to run parallel pipelines in separate threads (#813)
rudolfix Dec 14, 2023
bd37f9f
bumps to 0.4.1a2
rudolfix Dec 14, 2023
6abdd1c
Fix Windows lint issue and implement CI lint matrix strategy (#827)
jorritsandbrink Dec 17, 2023
c360820
791 test mssql credentialspy is odbc driver 18 dependent (#834)
jorritsandbrink Dec 18, 2023
0a683e5
sync master to devel (#842)
sh-rp Dec 19, 2023
d4d0ec0
exclude docs group from poetry install in lint workflow (#844)
jorritsandbrink Dec 21, 2023
ed364ec
adds extract and normalize traces (#839)
rudolfix Dec 22, 2023
e3347f3
bumps dlt version to 0.4.1 (#848)
rudolfix Dec 22, 2023
03f8cf6
Merge branch 'master' into devel
rudolfix Dec 22, 2023
a0301aa
fixes poetry lock
rudolfix Dec 22, 2023
842bb6a
removes pandas and pyarrow from snowflake extras
rudolfix Dec 22, 2023
efb2522
late instantiation of storages
rudolfix Dec 22, 2023
b1494ae
fixes package infos on deleted loaded packages
rudolfix Dec 22, 2023
ebc250b
fixes dev dependencies
rudolfix Dec 22, 2023
2 changes: 2 additions & 0 deletions .git-blame-ignore-revs
@@ -0,0 +1,2 @@
# introduce formatting with black
c3ddbaa6e61c44a3809e625c802cb4c7632934a3
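The file above lists the repo-wide black formatting commit (`c3ddbaa…`), but `git blame` only honors it if each clone opts in. A minimal sketch of that opt-in, using standard git behavior (git >= 2.23); the throwaway repo is just for demonstration — in a real dlt clone you would run only the `git config` line from the repo root:

```shell
# Demo in a throwaway repo so the commands are self-contained.
repo="$(mktemp -d)"
cd "$repo"
git init -q .
printf '# introduce formatting with black\nc3ddbaa6e61c44a3809e625c802cb4c7632934a3\n' > .git-blame-ignore-revs

# Opt in: git blame will now skip the listed bulk-formatting commit
# when assigning authorship to reformatted lines.
git config blame.ignoreRevsFile .git-blame-ignore-revs
git config blame.ignoreRevsFile   # prints: .git-blame-ignore-revs
```

A one-off alternative that leaves config untouched is `git blame --ignore-revs-file .git-blame-ignore-revs <path>`.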
40 changes: 29 additions & 11 deletions .github/workflows/lint.yml
@@ -13,11 +13,19 @@ jobs:
uses: ./.github/workflows/get_docs_changes.yml

run_lint:
name: Runs mypy, flake and bandit
name: Lint
needs: get_docs_changes
if: needs.get_docs_changes.outputs.changes_outside_docs == 'true'
strategy:
fail-fast: false
matrix:
os: ["ubuntu-latest", "macos-latest", "windows-latest"]
python-version: ["3.8.x", "3.9.x", "3.10.x", "3.11.x"]

runs-on: ubuntu-latest
defaults:
run:
shell: bash
runs-on: ${{ matrix.os }}

steps:

@@ -27,34 +27,44 @@ jobs:
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: "3.10.x"
python-version: ${{ matrix.python-version }}

- name: Install Poetry
uses: snok/install-poetry@v1
with:
virtualenvs-create: true
virtualenvs-in-project: true
installer-parallel: true
installer-parallel: true

- name: Load cached venv
id: cached-poetry-dependencies
uses: actions/cache@v3
with:
path: .venv
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}
key: venv-${{ matrix.os }}-${{ matrix.python-version }}-${{ hashFiles('**/poetry.lock') }}

- name: Install dependencies
# if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction --all-extras --with airflow

# - name: Install self
# run: poetry install --no-interaction
run: poetry install --all-extras --with airflow,providers,pipeline,sentry-sdk

- name: Run lint
run: make lint
- name: Run make lint
run: |
export PATH=$PATH:"/c/Program Files/usr/bin" # needed for Windows
make lint

# - name: print envs
# run: |
# echo "The GitHub Actor's username is: $GITHUB_ACTOR"
# echo "The GitHub repo owner is: $GITHUB_REPOSITORY_OWNER"
# echo "The GitHub repo is: $GITHUB_REPOSITORY"

matrix_job_required_check:
name: Lint results
needs: run_lint
runs-on: ubuntu-latest
if: always()
steps:
- name: Check matrix job results
if: contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled')
run: |
echo "One or more matrix job tests failed or were cancelled. You may need to re-run them." && exit 1
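The `matrix_job_required_check` job exists so branch protection can require a single `Lint results` status instead of one status per OS/Python combination. Its `contains(needs.*.result, ...)` guard reduces to a simple predicate over the matrix leg results — a sketch of the expression's semantics, not code from the repo:

```python
def matrix_jobs_passed(results: list[str]) -> bool:
    # Mirrors: contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled')
    # The aggregate check fails if any matrix leg failed or was cancelled;
    # 'skipped' legs (e.g. docs-only changes) do not fail the check.
    return not any(r in ("failure", "cancelled") for r in results)

print(matrix_jobs_passed(["success"] * 12))           # True: 3 OSes x 4 Pythons, all green
print(matrix_jobs_passed(["success", "cancelled"]))   # False: one cancelled leg sinks it
print(matrix_jobs_passed(["success", "skipped"]))     # True: skipped legs are tolerated
```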
2 changes: 1 addition & 1 deletion .github/workflows/test_airflow.yml
@@ -41,7 +41,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-airflow-runner

- name: Install dependencies
run: poetry install --no-interaction --with airflow -E duckdb -E parquet
run: poetry install --no-interaction --with airflow --with pipeline -E duckdb -E parquet --with sentry-sdk

- run: |
poetry run pytest tests/helpers/airflow_tests
65 changes: 46 additions & 19 deletions .github/workflows/test_common.yml
@@ -55,40 +55,67 @@ jobs:
virtualenvs-in-project: true
installer-parallel: true

- name: Load cached venv
id: cached-poetry-dependencies
uses: actions/cache@v3
with:
# path: ${{ steps.pip-cache.outputs.dir }}
path: .venv
key: venv-${{ matrix.os }}-${{ matrix.python-version }}-${{ hashFiles('**/poetry.lock') }}
# NOTE: do not cache. we want to have a clean state each run and we upgrade dependencies later
# - name: Load cached venv
# id: cached-poetry-dependencies
# uses: actions/cache@v3
# with:
# # path: ${{ steps.pip-cache.outputs.dir }}
# path: .venv
# key: venv-${{ matrix.os }}-${{ matrix.python-version }}-${{ hashFiles('**/poetry.lock') }}

- name: Install dependencies
run: poetry install --no-interaction --with sentry-sdk

- run: |
poetry run pytest tests/common tests/normalize tests/reflection tests/sources tests/load/test_dummy_client.py tests/extract/test_extract.py tests/extract/test_sources.py tests/pipeline/test_pipeline_state.py
if: runner.os != 'Windows'
name: Run common tests with minimum dependencies Linux/MAC
- run: |
poetry run pytest tests/common tests/normalize tests/reflection tests/sources tests/load/test_dummy_client.py tests/extract/test_extract.py tests/extract/test_sources.py tests/pipeline/test_pipeline_state.py -m "not forked"
if: runner.os == 'Windows'
name: Run common tests with minimum dependencies Windows
shell: cmd

- name: Install dependencies + sentry
run: poetry install --no-interaction -E parquet -E pydantic && pip install sentry-sdk
- name: Install duckdb dependencies
run: poetry install --no-interaction -E duckdb --with sentry-sdk

- run: |
poetry run pytest tests/common tests/normalize tests/reflection tests/sources
poetry run pytest tests/pipeline/test_pipeline.py
if: runner.os != 'Windows'
name: Run tests Linux/MAC
name: Run pipeline smoke tests with minimum deps Linux/MAC
- run: |
poetry run pytest tests/common tests/normalize tests/reflection tests/sources -m "not forked"
poetry run pytest tests/pipeline/test_pipeline.py
if: runner.os == 'Windows'
name: Run tests Windows
name: Run smoke tests with minimum deps Windows
shell: cmd

- name: Install extra dependencies
run: poetry install --no-interaction -E duckdb -E cli -E parquet -E pydantic
- name: Install pipeline dependencies
run: poetry install --no-interaction -E duckdb -E cli -E parquet --with sentry-sdk --with pipeline

- run: |
poetry run pytest tests/extract tests/pipeline tests/cli/common
poetry run pytest tests/extract tests/pipeline tests/libs tests/cli/common tests/destinations
if: runner.os != 'Windows'
name: Run extra tests Linux/MAC
name: Run extract and pipeline tests Linux/MAC
- run: |
poetry run pytest tests/extract tests/pipeline tests/cli/common
poetry run pytest tests/extract tests/pipeline tests/libs tests/cli/common tests/destinations
if: runner.os == 'Windows'
name: Run extra tests Windows
name: Run extract tests Windows
shell: cmd

# - name: Install Pydantic 1.0
# run: pip install "pydantic<2"

# - run: |
# poetry run pytest tests/libs
# if: runner.os != 'Windows'
# name: Run extract and pipeline tests Linux/MAC
# - run: |
# poetry run pytest tests/libs
# if: runner.os == 'Windows'
# name: Run extract tests Windows
# shell: cmd

matrix_job_required_check:
name: Common tests
needs: run_common
2 changes: 1 addition & 1 deletion .github/workflows/test_dbt_runner.yml
@@ -68,7 +68,7 @@ jobs:

- name: Install dependencies
# install dlt with postgres support
run: poetry install --no-interaction -E postgres -E dbt
run: poetry install --no-interaction -E postgres -E dbt --with sentry-sdk

- run: |
poetry run pytest tests/helpers/dbt_tests -k '(not venv)'
6 changes: 3 additions & 3 deletions .github/workflows/test_destination_athena.yml
@@ -9,9 +9,9 @@ on:
workflow_dispatch:

env:
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
DESTINATION__ATHENA__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
DESTINATION__ATHENA__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
DESTINATION__ATHENA__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
DESTINATION__ATHENA__CREDENTIALS__REGION_NAME: eu-central-1
DESTINATION__ATHENA__QUERY_RESULT_BUCKET: s3://dlt-athena-output
@@ -70,7 +70,7 @@ jobs:

- name: Install dependencies
# if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction -E athena
run: poetry install --no-interaction -E athena --with sentry-sdk --with pipeline

- run: |
poetry run pytest tests/load
6 changes: 3 additions & 3 deletions .github/workflows/test_destination_athena_iceberg.yml
@@ -9,9 +9,9 @@ on:
workflow_dispatch:

env:
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
DESTINATION__ATHENA__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
DESTINATION__ATHENA__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
DESTINATION__ATHENA__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
DESTINATION__ATHENA__CREDENTIALS__REGION_NAME: eu-central-1
DESTINATION__ATHENA__QUERY_RESULT_BUCKET: s3://dlt-athena-output
@@ -70,7 +70,7 @@ jobs:

- name: Install dependencies
# if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction -E athena
run: poetry install --no-interaction -E athena --with sentry-sdk --with pipeline

- run: |
poetry run pytest tests/load
4 changes: 2 additions & 2 deletions .github/workflows/test_destination_bigquery.yml
@@ -18,7 +18,7 @@ env:
CREDENTIALS__REFRESH_TOKEN: ${{ secrets.CREDENTIALS__REFRESH_TOKEN }}

# needed for bigquery staging tests
# DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
# DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
# DESTINATION__FILESYSTEM__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

RUNTIME__SENTRY_DSN: https://[email protected]/4504819859914752
@@ -79,7 +79,7 @@ jobs:

- name: Install dependencies
# if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction -E bigquery --with providers -E parquet
run: poetry install --no-interaction -E bigquery --with providers -E parquet --with sentry-sdk --with pipeline

# - name: Install self
# run: poetry install --no-interaction
2 changes: 1 addition & 1 deletion .github/workflows/test_destination_mssql.yml
@@ -65,7 +65,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-gcp

- name: Install dependencies
run: poetry install --no-interaction -E mssql -E s3 -E gs -E az -E parquet
run: poetry install --no-interaction -E mssql -E s3 -E gs -E az -E parquet --with sentry-sdk --with pipeline

- run: |
poetry run pytest tests/load --ignore tests/load/pipeline/test_dbt_helper.py
2 changes: 1 addition & 1 deletion .github/workflows/test_destination_qdrant.yml
@@ -59,7 +59,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-gcp

- name: Install dependencies
run: poetry install --no-interaction -E qdrant -E parquet
run: poetry install --no-interaction -E qdrant -E parquet --with sentry-sdk --with pipeline
- run: |
poetry run pytest tests/load/
if: runner.os != 'Windows'
4 changes: 2 additions & 2 deletions .github/workflows/test_destination_snowflake.yml
@@ -13,7 +13,7 @@ env:
CREDENTIALS__PASSWORD: ${{ secrets.PG_PASSWORD }}

# needed for snowflake staging tests
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
DESTINATION__FILESYSTEM__CREDENTIALS__PROJECT_ID: chat-analytics-rasa-ci
DESTINATION__FILESYSTEM__CREDENTIALS__CLIENT_EMAIL: chat-analytics-loader@chat-analytics-rasa-ci.iam.gserviceaccount.com
@@ -71,7 +71,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-gcp

- name: Install dependencies
run: poetry install --no-interaction -E snowflake -E s3 -E gs -E az
run: poetry install --no-interaction -E snowflake -E s3 -E gs -E az -E parquet --with sentry-sdk --with pipeline

- run: |
poetry run pytest tests/load
8 changes: 4 additions & 4 deletions .github/workflows/test_destination_synapse.yml
@@ -5,9 +5,9 @@ on:
branches:
- master
- devel

workflow_dispatch:

env:
DESTINATION__SYNAPSE__CREDENTIALS: ${{ secrets.SYNAPSE_CREDENTIALS }}
DESTINATION__SYNAPSE__CREDENTIALS__PASSWORD: ${{ secrets.SYNAPSE_PASSWORD }}
@@ -42,7 +42,7 @@ jobs:
runs-on: ${{ matrix.os }}

steps:

- name: Check out
uses: actions/checkout@master

@@ -70,7 +70,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-gcp

- name: Install dependencies
run: poetry install --no-interaction -E synapse -E s3 -E gs -E az
run: poetry install --no-interaction -E synapse -E s3 -E gs -E az --with sentry-sdk --with pipeline

- run: |
poetry run pytest tests/load --ignore tests/load/pipeline/test_dbt_helper.py
2 changes: 1 addition & 1 deletion .github/workflows/test_destination_weaviate.yml
@@ -61,7 +61,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-gcp

- name: Install dependencies
run: poetry install --no-interaction -E weaviate -E parquet
run: poetry install --no-interaction -E weaviate -E parquet --with sentry-sdk --with pipeline
- run: |
poetry run pytest tests/load/
if: runner.os != 'Windows'
6 changes: 3 additions & 3 deletions .github/workflows/test_destinations.yml
@@ -12,7 +12,7 @@ env:
DESTINATION__POSTGRES__CREDENTIALS: postgresql://[email protected]:5432/dlt_data
DESTINATION__DUCKDB__CREDENTIALS: duckdb:///_storage/test_quack.duckdb
DESTINATION__REDSHIFT__CREDENTIALS: postgresql://[email protected]:5439/dlt_ci
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
DESTINATION__FILESYSTEM__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
DESTINATION__FILESYSTEM__CREDENTIALS__AZURE_STORAGE_ACCOUNT_NAME: dltdata
DESTINATION__FILESYSTEM__CREDENTIALS__AZURE_STORAGE_ACCOUNT_KEY: ${{ secrets.AZURE_STORAGE_ACCOUNT_KEY }}
@@ -22,7 +22,7 @@ env:
TESTS__R2_AWS_SECRET_ACCESS_KEY: ${{ secrets.CLOUDFLARE_R2_SECRET_ACCESS_KEY }}
TESTS__R2_ENDPOINT_URL: https://9830548e4e4b582989be0811f2a0a97f.r2.cloudflarestorage.com

# DESTINATION__ATHENA__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4J46G55G4
# DESTINATION__ATHENA__CREDENTIALS__AWS_ACCESS_KEY_ID: AKIAT4QMVMC4LGORLZOK
# DESTINATION__ATHENA__CREDENTIALS__AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
# DESTINATION__ATHENA__CREDENTIALS__REGION_NAME: eu-central-1
# DESTINATION__ATHENA__QUERY_RESULT_BUCKET: s3://dlt-athena-output
@@ -87,7 +87,7 @@ jobs:

- name: Install dependencies
# if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction -E redshift -E gs -E s3 -E az -E parquet -E duckdb -E cli
run: poetry install --no-interaction -E redshift -E gs -E s3 -E az -E parquet -E duckdb -E cli --with sentry-sdk --with pipeline

# - name: Install self
# run: poetry install --no-interaction
2 changes: 1 addition & 1 deletion .github/workflows/test_doc_snippets.yml
@@ -63,7 +63,7 @@ jobs:

- name: Install dependencies
# if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
run: poetry install --no-interaction -E duckdb -E weaviate -E parquet -E qdrant --with docs --without airflow
run: poetry install --no-interaction -E duckdb -E weaviate -E parquet -E qdrant --with docs,sentry-sdk --without airflow

- name: Run linter and tests
run: make test-and-lint-snippets
2 changes: 1 addition & 1 deletion .github/workflows/test_local_destinations.yml
@@ -84,7 +84,7 @@ jobs:
key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('**/poetry.lock') }}-local-destinations

- name: Install dependencies
run: poetry install --no-interaction -E postgres -E duckdb -E parquet -E filesystem -E cli -E weaviate
run: poetry install --no-interaction -E postgres -E duckdb -E parquet -E filesystem -E cli -E weaviate --with sentry-sdk --with pipeline

- run: poetry run pytest tests/load && poetry run pytest tests/cli
name: Run tests Linux