
Commit 5d2d1ec

updated installation command in destination docs and a few others (#1410)
1 parent: 993ac37

18 files changed (+25 / -25 lines)
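Every hunk below makes the same change: quoting the pip requirement specifier. The motivation is shell globbing rather than pip itself; a minimal sketch of the difference, using the `athena` extra as the representative example:

```shell
# In zsh (the default shell on macOS), an unquoted dlt[athena] is parsed
# as a character-class glob that would match "dlta", "dltt", etc. With no
# matching file, zsh aborts with "no matches found: dlt[athena]" before
# pip ever runs. Double quotes keep the brackets literal, so every
# POSIX-like shell passes the extras specifier to pip intact:
spec="dlt[athena]"
printf 'pip receives: %s\n' "$spec"   # -> pip receives: dlt[athena]

# The portable form used throughout the updated docs:
# pip install "dlt[athena]"
```

bash happens to tolerate the unquoted form because it leaves an unmatched glob pattern in place as a literal argument, which is why the old commands worked for many readers; quoting makes the documented command work in both shells.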

docs/website/docs/dlt-ecosystem/destinations/athena.md (2 additions, 2 deletions)

@@ -11,7 +11,7 @@ The Athena destination stores data as Parquet files in S3 buckets and creates [e
 ## Install dlt with Athena
 **To install the dlt library with Athena dependencies:**
 ```sh
-pip install dlt[athena]
+pip install "dlt[athena]"
 ```
 
 ## Setup Guide
@@ -30,7 +30,7 @@ First, install dependencies by running:
 ```sh
 pip install -r requirements.txt
 ```
-or with `pip install dlt[athena]`, which will install `s3fs`, `pyarrow`, `pyathena`, and `botocore` packages.
+or with `pip install "dlt[athena]"`, which will install `s3fs`, `pyarrow`, `pyathena`, and `botocore` packages.
 
 :::caution

docs/website/docs/dlt-ecosystem/destinations/bigquery.md (1 addition, 1 deletion)

@@ -11,7 +11,7 @@ keywords: [bigquery, destination, data warehouse]
 **To install the dlt library with BigQuery dependencies:**
 
 ```sh
-pip install dlt[bigquery]
+pip install "dlt[bigquery]"
 ```
 
 ## Setup Guide

docs/website/docs/dlt-ecosystem/destinations/clickhouse.md (2 additions, 2 deletions)

@@ -11,7 +11,7 @@ keywords: [ clickhouse, destination, data warehouse ]
 **To install the DLT library with ClickHouse dependencies:**
 
 ```sh
-pip install dlt[clickhouse]
+pip install "dlt[clickhouse]"
 ```
 
 ## Setup Guide
@@ -33,7 +33,7 @@ requirements file by executing it as follows:
 pip install -r requirements.txt
 ```
 
-or with `pip install dlt[clickhouse]`, which installs the `dlt` library and the necessary dependencies for working with ClickHouse as a destination.
+or with `pip install "dlt[clickhouse]"`, which installs the `dlt` library and the necessary dependencies for working with ClickHouse as a destination.
 
 ### 2. Setup ClickHouse database

docs/website/docs/dlt-ecosystem/destinations/databricks.md (1 addition, 1 deletion)

@@ -12,7 +12,7 @@ keywords: [Databricks, destination, data warehouse]
 ## Install dlt with Databricks
 **To install the dlt library with Databricks dependencies:**
 ```sh
-pip install dlt[databricks]
+pip install "dlt[databricks]"
 ```
 
 ## Set up your Databricks workspace

docs/website/docs/dlt-ecosystem/destinations/dremio.md (2 additions, 2 deletions)

@@ -9,7 +9,7 @@ keywords: [dremio, iceberg, aws, glue catalog]
 ## Install dlt with Dremio
 **To install the dlt library with Dremio and s3 dependencies:**
 ```sh
-pip install dlt[dremio,s3]
+pip install "dlt[dremio,s3]"
 ```
 
 ## Setup Guide
@@ -28,7 +28,7 @@ First install dependencies by running:
 ```sh
 pip install -r requirements.txt
 ```
-or with `pip install dlt[dremio,s3]` which will install `s3fs`, `pyarrow`, and `botocore` packages.
+or with `pip install "dlt[dremio,s3]"` which will install `s3fs`, `pyarrow`, and `botocore` packages.
 
 To edit the `dlt` credentials file with your secret info, open `.dlt/secrets.toml`. You will need to provide a `bucket_url` which holds the uploaded parquet files.

docs/website/docs/dlt-ecosystem/destinations/duckdb.md (1 addition, 1 deletion)

@@ -9,7 +9,7 @@ keywords: [duckdb, destination, data warehouse]
 ## Install dlt with DuckDB
 **To install the dlt library with DuckDB dependencies, run:**
 ```sh
-pip install dlt[duckdb]
+pip install "dlt[duckdb]"
 ```
 
 ## Setup Guide

docs/website/docs/dlt-ecosystem/destinations/filesystem.md (3 additions, 3 deletions)

@@ -6,7 +6,7 @@ The Filesystem destination stores data in remote file systems and bucket storage
 ## Install dlt with filesystem
 **To install the dlt library with filesystem dependencies:**
 ```sh
-pip install dlt[filesystem]
+pip install "dlt[filesystem]"
 ```
 
 This installs `s3fs` and `botocore` packages.
@@ -125,7 +125,7 @@ client_kwargs = '{"verify": "public.crt"}'
 ```
 
 #### Google Storage
-Run `pip install dlt[gs]` which will install the `gcfs` package.
+Run `pip install "dlt[gs]"` which will install the `gcfs` package.
 
 To edit the `dlt` credentials file with your secret info, open `.dlt/secrets.toml`.
 You'll see AWS credentials by default.
@@ -148,7 +148,7 @@ if you have default google cloud credentials in your environment (i.e. on cloud
 Use **Cloud Storage** admin to create a new bucket. Then assign the **Storage Object Admin** role to your service account.
 
 #### Azure Blob Storage
-Run `pip install dlt[az]` which will install the `adlfs` package to interface with Azure Blob Storage.
+Run `pip install "dlt[az]"` which will install the `adlfs` package to interface with Azure Blob Storage.
 
 Edit the credentials in `.dlt/secrets.toml`, you'll see AWS credentials by default replace them with your Azure credentials:

docs/website/docs/dlt-ecosystem/destinations/motherduck.md (1 addition, 1 deletion)

@@ -10,7 +10,7 @@ keywords: [MotherDuck, duckdb, destination, data warehouse]
 ## Install dlt with MotherDuck
 **To install the dlt library with MotherDuck dependencies:**
 ```sh
-pip install dlt[motherduck]
+pip install "dlt[motherduck]"
 ```
 
 :::tip

docs/website/docs/dlt-ecosystem/destinations/mssql.md (2 additions, 2 deletions)

@@ -9,7 +9,7 @@ keywords: [mssql, sqlserver, destination, data warehouse]
 ## Install dlt with MS SQL
 **To install the dlt library with MS SQL dependencies, use:**
 ```sh
-pip install dlt[mssql]
+pip install "dlt[mssql]"
 ```
 
 ## Setup guide
@@ -38,7 +38,7 @@ pip install -r requirements.txt
 ```
 or run:
 ```sh
-pip install dlt[mssql]
+pip install "dlt[mssql]"
 ```
 This will install `dlt` with the `mssql` extra, which contains all the dependencies required by the SQL server client.

docs/website/docs/dlt-ecosystem/destinations/postgres.md (1 addition, 1 deletion)

@@ -9,7 +9,7 @@ keywords: [postgres, destination, data warehouse]
 ## Install dlt with PostgreSQL
 **To install the dlt library with PostgreSQL dependencies, run:**
 ```sh
-pip install dlt[postgres]
+pip install "dlt[postgres]"
 ```
 
 ## Setup Guide
