
Commit b201955

edgao authored and schlattk committed
Bump connector versions for publishing (airbytehq#8717)
version bumps for airbytehq#8562
1 parent 049eb06 commit b201955

8 files changed: +16, -13 lines

airbyte-config/init/src/main/resources/seed/destination_definitions.yaml (+3, -3)

@@ -139,13 +139,13 @@
 - name: Redshift
   destinationDefinitionId: f7a7d195-377f-cf5b-70a5-be6b819019dc
   dockerRepository: airbyte/destination-redshift
-  dockerImageTag: 0.3.20
+  dockerImageTag: 0.3.21
   documentationUrl: https://docs.airbyte.io/integrations/destinations/redshift
   icon: redshift.svg
 - name: S3
   destinationDefinitionId: 4816b78f-1489-44c1-9060-4b19d5fa9362
   dockerRepository: airbyte/destination-s3
-  dockerImageTag: 0.1.15
+  dockerImageTag: 0.1.16
   documentationUrl: https://docs.airbyte.io/integrations/destinations/s3
   icon: s3.svg
 - name: SFTP-JSON
@@ -157,7 +157,7 @@
 - name: Snowflake
   destinationDefinitionId: 424892c4-daac-4491-b35d-c6688ba547ba
   dockerRepository: airbyte/destination-snowflake
-  dockerImageTag: 0.3.19
+  dockerImageTag: 0.3.20
   documentationUrl: https://docs.airbyte.io/integrations/destinations/snowflake
   icon: snowflake.svg
 - name: MariaDB ColumnStore
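
For context on what this seed file controls: each entry pins the docker image tag the platform launches for a destination, so a bump here has to match the connector's Dockerfile label. A minimal sketch of that cross-check (not Airbyte tooling; assumes PyYAML is installed and the script runs from the repo root):

```python
import re
import yaml

SEED = "airbyte-config/init/src/main/resources/seed/destination_definitions.yaml"

with open(SEED) as f:
    definitions = yaml.safe_load(f)

for definition in definitions:
    repo = definition["dockerRepository"]     # e.g. airbyte/destination-redshift
    tag = str(definition["dockerImageTag"])   # e.g. 0.3.21
    connector = repo.split("/")[-1]
    dockerfile = f"airbyte-integrations/connectors/{connector}/Dockerfile"
    try:
        with open(dockerfile) as d:
            text = d.read()
    except FileNotFoundError:
        continue  # not every definition has a connector directory in this layout
    match = re.search(r"LABEL io\.airbyte\.version=(\S+)", text)
    if match and match.group(1) != tag:
        print(f"{connector}: seed says {tag}, Dockerfile says {match.group(1)}")
```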

airbyte-config/init/src/main/resources/seed/destination_specs.yaml (+3, -3)

@@ -2713,7 +2713,7 @@
     supported_destination_sync_modes:
     - "overwrite"
     - "append"
-- dockerImage: "airbyte/destination-redshift:0.3.20"
+- dockerImage: "airbyte/destination-redshift:0.3.21"
   spec:
     documentationUrl: "https://docs.airbyte.io/integrations/destinations/redshift"
     connectionSpecification:
@@ -2838,7 +2838,7 @@
     - "overwrite"
     - "append"
     - "append_dedup"
-- dockerImage: "airbyte/destination-s3:0.1.15"
+- dockerImage: "airbyte/destination-s3:0.1.16"
   spec:
     documentationUrl: "https://docs.airbyte.io/integrations/destinations/s3"
     connectionSpecification:
@@ -3209,7 +3209,7 @@
     supported_destination_sync_modes:
     - "overwrite"
     - "append"
-- dockerImage: "airbyte/destination-snowflake:0.3.19"
+- dockerImage: "airbyte/destination-snowflake:0.3.20"
   spec:
     documentationUrl: "https://docs.airbyte.io/integrations/destinations/snowflake"
     connectionSpecification:
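
The specs file duplicates the image coordinates as a single `dockerImage` string, which is why the same bump has to land in both seed files. A hedged sketch of keeping the two in sync, again assuming PyYAML and the paths from this commit:

```python
import yaml

DEFS = "airbyte-config/init/src/main/resources/seed/destination_definitions.yaml"
SPECS = "airbyte-config/init/src/main/resources/seed/destination_specs.yaml"

with open(DEFS) as f:
    # Build the set of "repository:tag" strings the definitions expect.
    expected = {
        f'{d["dockerRepository"]}:{d["dockerImageTag"]}'
        for d in yaml.safe_load(f)
    }

with open(SPECS) as f:
    for spec in yaml.safe_load(f):
        if spec["dockerImage"] not in expected:
            print(f'stale spec: {spec["dockerImage"]}')
```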

airbyte-integrations/connectors/destination-redshift/Dockerfile (+1, -1)

@@ -8,5 +8,5 @@ COPY build/distributions/${APPLICATION}*.tar ${APPLICATION}.tar
 
 RUN tar xf ${APPLICATION}.tar --strip-components=1
 
-LABEL io.airbyte.version=0.3.20
+LABEL io.airbyte.version=0.3.21
 LABEL io.airbyte.name=airbyte/destination-redshift

airbyte-integrations/connectors/destination-s3/Dockerfile (+1, -1)

@@ -7,5 +7,5 @@ COPY build/distributions/${APPLICATION}*.tar ${APPLICATION}.tar
 
 RUN tar xf ${APPLICATION}.tar --strip-components=1
 
-LABEL io.airbyte.version=0.1.15
+LABEL io.airbyte.version=0.1.16
 LABEL io.airbyte.name=airbyte/destination-s3

airbyte-integrations/connectors/destination-snowflake/Dockerfile (+1, -1)

@@ -18,5 +18,5 @@ COPY build/distributions/${APPLICATION}*.tar ${APPLICATION}.tar
 
 RUN tar xf ${APPLICATION}.tar --strip-components=1
 
-LABEL io.airbyte.version=0.3.19
+LABEL io.airbyte.version=0.3.20
 LABEL io.airbyte.name=airbyte/destination-snowflake
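
All three Dockerfile changes follow the same pattern: the `io.airbyte.version` label is the version the publish workflow reads off the image. A minimal sketch of verifying the label on a locally built image, assuming `docker` is on PATH and the image has already been built:

```python
import json
import subprocess

def airbyte_version(image: str) -> str:
    # `docker inspect` returns a JSON array with one object per image.
    out = subprocess.check_output(["docker", "inspect", image])
    labels = json.loads(out)[0]["Config"]["Labels"] or {}
    return labels.get("io.airbyte.version", "")

print(airbyte_version("airbyte/destination-redshift:0.3.21"))  # expect 0.3.21
```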

docs/integrations/destinations/redshift.md (+3, -2)

@@ -6,7 +6,7 @@ The Airbyte Redshift destination allows you to sync data to Redshift.
 
 This Redshift destination connector has two replication strategies:
 
-1. INSERT: Replicates data via SQL INSERT queries. This is built on top of the destination-jdbc code base and is configured to rely on JDBC 4.2 standard drivers provided by Amazon via Mulesoft [here](https://mvnrepository.com/artifact/com.amazon.redshift/redshift-jdbc42) as described in Redshift documentation [here](https://docs.aws.amazon.com/redshift/latest/mgmt/jdbc20-install.html). **Not recommended for production workloads as this does not scale well**.
+1. INSERT: Replicates data via SQL INSERT queries. This is built on top of the destination-jdbc code base and is configured to rely on JDBC 4.2 standard drivers provided by Amazon via Mulesoft [here](https://mvnrepository.com/artifact/com.amazon.redshift/redshift-jdbc42) as described in Redshift documentation [here](https://docs.aws.amazon.com/redshift/latest/mgmt/jdbc20-install.html). **Not recommended for production workloads as this does not scale well**.
 2. COPY: Replicates data by first uploading data to an S3 bucket and issuing a COPY command. This is the recommended loading approach described by Redshift [best practices](https://docs.aws.amazon.com/redshift/latest/dg/c_loading-data-best-practices.html). Requires an S3 bucket and credentials.
 
 Airbyte automatically picks an approach depending on the given configuration - if S3 configuration is present, Airbyte will use the COPY strategy and vice versa.
@@ -79,7 +79,7 @@ Provide the required S3 info.
 * Place the S3 bucket and the Redshift cluster in the same region to save on networking costs.
 * **Access Key Id**
   * See [this](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys) on how to generate an access key.
-  * We recommend creating an Airbyte-specific user. This user will require [read and write permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html) to objects in the staging bucket.
+  * We recommend creating an Airbyte-specific user. This user will require [read and write permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html) to objects in the staging bucket.
 * **Secret Access Key**
   * Corresponding key to the above key id.
 * **Part Size**
@@ -118,6 +118,7 @@ All Redshift connections are encrypted using SSL
 
 | Version | Date | Pull Request | Subject |
 | :------ | :-------- | :----- | :------ |
+| 0.3.21 | 2021-12-10 | [#8562](https://github.com/airbytehq/airbyte/pull/8562) | Moving classes around for better dependency management |
 | 0.3.20 | 2021-11-08 | [#7719](https://github.com/airbytehq/airbyte/pull/7719) | Improve handling of wide rows by buffering records based on their byte size rather than their count |
 | 0.3.19 | 2021-10-21 | [7234](https://github.com/airbytehq/airbyte/pull/7234) | Allow SSL traffic only |
 | 0.3.17 | 2021-10-12 | [6965](https://github.com/airbytehq/airbyte/pull/6965) | Added SSL Support |
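
The INSERT-versus-COPY distinction in the doc text above comes down to the statement the connector ultimately issues. Purely as illustration, not the connector's actual code: the bucket, table, host, and credentials below are placeholders, and psycopg2 is just one client capable of sending the statement, but the COPY strategy has roughly this shape:

```python
import psycopg2  # placeholder client; any Redshift-compatible driver works

COPY_SQL = """
COPY public.users
FROM 's3://airbyte-staging-bucket/users/'
CREDENTIALS 'aws_access_key_id={key_id};aws_secret_access_key={secret}'
CSV GZIP;
"""

conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                        port=5439, dbname="dev", user="airbyte",
                        password="...")
with conn, conn.cursor() as cur:
    # One bulk COPY from staged S3 files replaces many row-by-row INSERTs,
    # which is why the doc recommends this strategy for production workloads.
    cur.execute(COPY_SQL.format(key_id="AKIA...", secret="..."))
conn.close()
```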

docs/integrations/destinations/s3.md (+2, -1)

@@ -223,7 +223,8 @@ Under the hood, an Airbyte data stream in Json schema is first converted to an A
 
 | Version | Date | Pull Request | Subject |
 | :--- | :--- | :--- | :--- |
-| 0.1.15 | 2021-12-03 | [\#9999](https://github.com/airbytehq/airbyte/pull/9999) | Remove excessive logging for Avro and Parquet invalid date strings. |
+| 0.1.16 | 2021-12-10 | [\#8562](https://github.com/airbytehq/airbyte/pull/8562) | Swap dependencies with destination-jdbc. |
+| 0.1.15 | 2021-12-03 | [\#8501](https://github.com/airbytehq/airbyte/pull/8501) | Remove excessive logging for Avro and Parquet invalid date strings. |
 | 0.1.14 | 2021-11-09 | [\#7732](https://github.com/airbytehq/airbyte/pull/7732) | Support timestamp in Avro and Parquet |
 | 0.1.13 | 2021-11-03 | [\#7288](https://github.com/airbytehq/airbyte/issues/7288) | Support Json `additionalProperties`. |
 | 0.1.12 | 2021-09-13 | [\#5720](https://github.com/airbytehq/airbyte/issues/5720) | Added configurable block size for stream. Each stream is limited to 10,000 by S3 |

docs/integrations/destinations/snowflake.md (+2, -1)

@@ -162,7 +162,7 @@ First you will need to create a GCS bucket.
 
 Then you will need to run the script below:
 
-* You must run the script as the account admin for Snowflake.
+* You must run the script as the account admin for Snowflake.
 * You should replace `AIRBYTE_ROLE` with the role you used for Airbyte's Snowflake configuration.
 * Replace `YOURBUCKETNAME` with your bucket name
 * The stage name can be modified to any valid name.
@@ -194,6 +194,7 @@ Finally, you need to add read/write permissions to your bucket with that email.
 
 | Version | Date | Pull Request | Subject |
 | :------ | :-------- | :----- | :------ |
+| 0.3.20 | 2021-12-10 | [#8562](https://github.com/airbytehq/airbyte/pull/8562) | Moving classes around for better dependency management; compatibility fix for Java 17 |
 | 0.3.19 | 2021-12-06 | [#8528](https://github.com/airbytehq/airbyte/pull/8528) | Set Internal Staging as default choice |
 | 0.3.18 | 2021-11-26 | [#8253](https://github.com/airbytehq/airbyte/pull/8253) | Snowflake Internal Staging Support |
 | 0.3.17 | 2021-11-08 | [#7719](https://github.com/airbytehq/airbyte/pull/7719) | Improve handling of wide rows by buffering records based on their byte size rather than their count |
