Commit 2e86bfd

fix installation command

1 parent 6f7591e commit 2e86bfd

3 files changed: +3, -3 lines

docs/examples/postgres_to_postgres/postgres_to_postgres.py

Lines changed: 1 addition & 1 deletion

@@ -33,7 +33,7 @@
 Install `dlt` with `duckdb` as extra, also `connectorx`, Postgres adapter and progress bar tool:

 ```sh
-pip install dlt[duckdb] connectorx pyarrow psycopg2-binary alive-progress
+pip install "dlt[duckdb]" connectorx pyarrow psycopg2-binary alive-progress
 ```

 Run the example:

docs/website/blog/2024-01-10-dlt-mode.md

Lines changed: 1 addition & 1 deletion

@@ -124,7 +124,7 @@ With the model we just created, called Products, a chart can be instantly create
 In this demo, we’ll forego the authentication issues of connecting to a data warehouse, and choose the DuckDB destination to show how the Python environment within Mode can be used to initialize a data pipeline and dump normalized data into a destination. In order to see how it works, we first install dlt[duckdb] into the Python environment.

 ```sh
-!pip install dlt[duckdb]
+!pip install "dlt[duckdb]"
 ```

 Next, we initialize the dlt pipeline:

docs/website/docs/walkthroughs/dispatch-to-multiple-tables.md

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ We'll use the [GitHub API](https://docs.github.com/en/rest) to fetch the events
 1. Install dlt with duckdb support:

 ```sh
-pip install dlt[duckdb]
+pip install "dlt[duckdb]"
 ```

 2. Create a new file `github_events_dispatch.py` and paste the following code:
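All three diffs make the same fix: quoting the extras spec `dlt[duckdb]`. The likely motivation (not stated in the commit message, but a well-known pitfall) is that square brackets are glob characters in some shells, notably zsh, which aborts with "no matches found" on an unquoted `pip install dlt[duckdb]` unless a file happens to match the pattern. A minimal sketch of the difference:

```shell
# Square brackets are shell glob syntax: unquoted, a pattern like d[lt]
# would match existing files named "dl" or "dt". Quoting the argument
# passes the extras spec to pip literally instead of as a glob.
spec="dlt[duckdb]"
printf '%s\n' "$spec"        # prints: dlt[duckdb]

# In zsh, the unquoted form fails with "no matches found: dlt[duckdb]"
# before pip even runs; the quoted form works in every common shell:
# pip install "dlt[duckdb]"
```

Single quotes (`pip install 'dlt[duckdb]'`) or escaping the brackets would work equally well; the commit standardizes on double quotes.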

0 commit comments
