Commit 154e1e6

committed: update changelogs
1 parent 489a530 commit 154e1e6

3 files changed, +7 -4 lines changed

docs/integrations/destinations/redshift.md

+3 -2
@@ -6,7 +6,7 @@ The Airbyte Redshift destination allows you to sync data to Redshift.
 
 This Redshift destination connector has two replication strategies:
 
-1. INSERT: Replicates data via SQL INSERT queries. This is built on top of the destination-jdbc code base and is configured to rely on JDBC 4.2 standard drivers provided by Amazon via Mulesoft [here](https://mvnrepository.com/artifact/com.amazon.redshift/redshift-jdbc42) as described in Redshift documentation [here](https://docs.aws.amazon.com/redshift/latest/mgmt/jdbc20-install.html). **Not recommended for production workloads as this does not scale well**.
+1. INSERT: Replicates data via SQL INSERT queries. This is built on top of the destination-jdbc code base and is configured to rely on JDBC 4.2 standard drivers provided by Amazon via Mulesoft [here](https://mvnrepository.com/artifact/com.amazon.redshift/redshift-jdbc42) as described in Redshift documentation [here](https://docs.aws.amazon.com/redshift/latest/mgmt/jdbc20-install.html). **Not recommended for production workloads as this does not scale well**.
 2. COPY: Replicates data by first uploading data to an S3 bucket and issuing a COPY command. This is the recommended loading approach described by Redshift [best practices](https://docs.aws.amazon.com/redshift/latest/dg/c_loading-data-best-practices.html). Requires an S3 bucket and credentials.
 
 Airbyte automatically picks an approach depending on the given configuration - if S3 configuration is present, Airbyte will use the COPY strategy and vice versa.
@@ -79,7 +79,7 @@ Provide the required S3 info.
 * Place the S3 bucket and the Redshift cluster in the same region to save on networking costs.
 * **Access Key Id**
 * See [this](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys) on how to generate an access key.
-* We recommend creating an Airbyte-specific user. This user will require [read and write permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html) to objects in the staging bucket.
+* We recommend creating an Airbyte-specific user. This user will require [read and write permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html) to objects in the staging bucket.
 * **Secret Access Key**
 * Corresponding key to the above key id.
 * **Part Size**
@@ -118,6 +118,7 @@ All Redshift connections are encrypted using SSL
 
 | Version | Date | Pull Request | Subject |
 | :------ | :-------- | :----- | :------ |
+| 0.3.21 | 2021-12-10 | [#8562](https://github.com/airbytehq/airbyte/pull/8562) | Moving classes around for better dependency management |
 | 0.3.20 | 2021-11-08 | [#7719](https://github.com/airbytehq/airbyte/pull/7719) | Improve handling of wide rows by buffering records based on their byte size rather than their count |
 | 0.3.19 | 2021-10-21 | [7234](https://github.com/airbytehq/airbyte/pull/7234) | Allow SSL traffic only |
 | 0.3.17 | 2021-10-12 | [6965](https://github.com/airbytehq/airbyte/pull/6965) | Added SSL Support |
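
The Redshift hunks above describe the two load paths: row-by-row INSERT over JDBC, and the recommended COPY path that stages files in S3 (using the Access Key Id and Secret Access Key from the S3 section) and loads them in bulk. As a rough, hedged sketch of what such a COPY-based load looks like in Redshift SQL, with placeholder bucket, object, and table names that are not taken from the connector itself:

```sql
-- Illustrative only: bulk-load a staged file from S3 into a Redshift table.
-- `_airbyte_raw_users` and `airbyte-staging-bucket` are placeholder names.
COPY _airbyte_raw_users
FROM 's3://airbyte-staging-bucket/staging/users.csv.gz'
CREDENTIALS 'aws_access_key_id=<ACCESS_KEY_ID>;aws_secret_access_key=<SECRET_ACCESS_KEY>'
CSV GZIP
REGION 'us-east-1';  -- keep the bucket and the cluster in the same region, as the doc advises
```

A single COPY of a large staged file scales far better than many small INSERT statements, which is why the doc flags the INSERT strategy as unsuitable for production workloads.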

docs/integrations/destinations/s3.md

+2 -1
@@ -223,7 +223,8 @@ Under the hood, an Airbyte data stream in Json schema is first converted to an A
 
 | Version | Date | Pull Request | Subject |
 | :--- | :--- | :--- | :--- |
-| 0.1.15 | 2021-12-03 | [\#9999](https://github.com/airbytehq/airbyte/pull/9999) | Remove excessive logging for Avro and Parquet invalid date strings. |
+| 0.1.16 | 2021-12-10 | [\#8562](https://github.com/airbytehq/airbyte/pull/8562) | Swap dependencies with destination-jdbc. |
+| 0.1.15 | 2021-12-03 | [\#8501](https://github.com/airbytehq/airbyte/pull/8501) | Remove excessive logging for Avro and Parquet invalid date strings. |
 | 0.1.14 | 2021-11-09 | [\#7732](https://github.com/airbytehq/airbyte/pull/7732) | Support timestamp in Avro and Parquet |
 | 0.1.13 | 2021-11-03 | [\#7288](https://github.com/airbytehq/airbyte/issues/7288) | Support Json `additionalProperties`. |
 | 0.1.12 | 2021-09-13 | [\#5720](https://github.com/airbytehq/airbyte/issues/5720) | Added configurable block size for stream. Each stream is limited to 10,000 by S3 |

docs/integrations/destinations/snowflake.md

+2 -1
@@ -162,7 +162,7 @@ First you will need to create a GCS bucket.
 
 Then you will need to run the script below:
 
-* You must run the script as the account admin for Snowflake.
+* You must run the script as the account admin for Snowflake.
 * You should replace `AIRBYTE_ROLE` with the role you used for Airbyte's Snowflake configuration.
 * Replace `YOURBUCKETNAME` with your bucket name
 * The stage name can be modified to any valid name.
@@ -194,6 +194,7 @@ Finally, you need to add read/write permissions to your bucket with that email.
 
 | Version | Date | Pull Request | Subject |
 | :------ | :-------- | :----- | :------ |
+| 0.3.20 | 2021-12-10 | [#8562](https://github.com/airbytehq/airbyte/pull/8562) | Moving classes around for better dependency management; compatibility fix for Java 17 |
 | 0.3.19 | 2021-12-06 | [#8528](https://github.com/airbytehq/airbyte/pull/8528) | Set Internal Staging as default choice |
 | 0.3.18 | 2021-11-26 | [#8253](https://github.com/airbytehq/airbyte/pull/8253) | Snowflake Internal Staging Support |
 | 0.3.17 | 2021-11-08 | [#7719](https://github.com/airbytehq/airbyte/pull/7719) | Improve handling of wide rows by buffering records based on their byte size rather than their count |
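
The first Snowflake hunk above refers to a GCS stage-creation script that is not part of this diff. Purely as a hedged sketch of what such a GCS staging setup can look like (the `gcs_airbyte_integration` and `gcs_airbyte_stage` names are illustrative, not taken from the doc's actual script; `YOURBUCKETNAME` and `AIRBYTE_ROLE` are the placeholders the doc tells you to replace):

```sql
-- Illustrative only: expose a GCS bucket to Snowflake and let Airbyte's role use it.
-- Must be run with account admin privileges, as the doc notes.
CREATE STORAGE INTEGRATION gcs_airbyte_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = GCS
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://YOURBUCKETNAME');

CREATE STAGE gcs_airbyte_stage
  URL = 'gcs://YOURBUCKETNAME'
  STORAGE_INTEGRATION = gcs_airbyte_integration;

GRANT USAGE ON INTEGRATION gcs_airbyte_integration TO ROLE AIRBYTE_ROLE;
GRANT USAGE ON STAGE gcs_airbyte_stage TO ROLE AIRBYTE_ROLE;

-- DESC STORAGE INTEGRATION gcs_airbyte_integration; lists the service account that
-- then needs the bucket read/write permissions mentioned in the second hunk's header.
```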
