
Commit db04610

natikgadzhi, ChristoGrab, and octavia-squidington-iii authored
chore(destination-sftp-json): migrate to poetry and base image (#46873)
Co-authored-by: ChristoGrab <[email protected]>
Co-authored-by: Octavia Squidington III <[email protected]>
1 parent 521bce7 commit db04610

File tree: 17 files changed, +2426 −168 lines

airbyte-integrations/connectors/destination-sftp-json/.dockerignore (-5 lines)
This file was deleted.

airbyte-integrations/connectors/destination-sftp-json/Dockerfile (-13 lines)
This file was deleted.

airbyte-integrations/connectors/destination-sftp-json/README.md
@@ -1,118 +1,90 @@
-# Sftp Json Destination
+# Sftp-Json source connector
 
-This is the repository for the SFTP JSON destination connector, written in Python.
-Data received by this destination will be written to a JSON lines file at the path and filename supplied in the configuration.
-
-For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.io/integrations/destinations/sftp-json).
+This is the repository for the SFTP-JSON destination connector, written in Python.
+For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/destinations/sftp-json).
 
 ## Local development
 
 ### Prerequisites
+* Python (~=3.9)
+* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation)
 
-**To iterate on this connector, make sure to complete this prerequisites section.**
-
-#### Minimum Python version required `= 3.7.0`
-
-#### Build & Activate Virtual Environment and install dependencies
-
-From this connector directory, create a virtual environment:
-
-```
-python -m venv .venv
-```
-
-This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your
-development environment of choice. To activate it from the terminal, run:
 
+### Installing the connector
+From this connector directory, run:
+```bash
+poetry install --with dev
 ```
-source .venv/bin/activate
-pip install -r requirements.txt
-```
-
-If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
-
-Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is
-used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`.
-If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything
-should work as you expect.
 
-#### Create credentials
 
-**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.io/integrations/destinations/sftp-json)
-to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_sftp_json/spec.json` file.
-Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
-See `integration_tests/sample_config.json` for a sample config file.
+### Create credentials
+**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/sftp-json)
+to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_sftp_json/spec.yaml` file.
+Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
+See `sample_files/sample_config.json` for a sample config file.
 
-**If you are an Airbyte core member**, copy the credentials in Lastpass under the secret name `destination sftp-json test creds`
-and place them into `secrets/config.json`.
 
 ### Locally running the connector
-
 ```
-python main.py spec
-python main.py check --config secrets/config.json
-python main.py discover --config secrets/config.json
-python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
+poetry run destination-sftp-json spec
+poetry run destination-sftp-json check --config secrets/config.json
+poetry run destination-sftp-json discover --config secrets/config.json
+poetry run destination-sftp-json read --config secrets/config.json --catalog sample_files/configured_catalog.json
 ```
 
-### Locally running the connector docker image
-
-#### Build
-
-**Via [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md) (recommended):**
+### Running unit tests
+To run unit tests locally, from the connector directory run:
+```
+poetry run pytest unit_tests
+```
 
+### Building the docker image
+1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)
+2. Run the following command to build the docker image:
 ```bash
 airbyte-ci connectors --name=destination-sftp-json build
 ```
 
-An image will be built with the tag `airbyte/destination-sftp-json:dev`.
-
-**Via `docker build`:**
-
-```bash
-docker build -t airbyte/destination-sftp-json:dev .
-```
+An image will be available on your host with the tag `airbyte/destination-sftp-json:dev`.
 
-#### Run
 
+### Running as a docker container
 Then run any of the connector commands as follows:
-
 ```
 docker run --rm airbyte/destination-sftp-json:dev spec
 docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-sftp-json:dev check --config /secrets/config.json
-# messages.jsonl is a file containing line-separated JSON representing AirbyteMessages
-cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-sftp-json:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
+docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-sftp-json:dev discover --config /secrets/config.json
+docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-sftp-json:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
 ```
 
-## Testing
-
+### Running our CI test suite
 You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
-
 ```bash
 airbyte-ci connectors --name=destination-sftp-json test
 ```
 
 ### Customizing acceptance Tests
-
-Customize `acceptance-test-config.yml` file to configure tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
+Customize `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
 If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.
 
-## Dependency Management
-
-All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
-We split dependencies between two groups, dependencies that are:
-
-- required for your connector to work need to go to `MAIN_REQUIREMENTS` list.
-- required for the testing need to go to `TEST_REQUIREMENTS` list
+### Dependency Management
+All of your dependencies should be managed via Poetry.
+To add a new dependency, run:
+```bash
+poetry add <package-name>
+```
 
-### Publishing a new version of the connector
+Please commit the changes to `pyproject.toml` and `poetry.lock` files.
 
+## Publishing a new version of the connector
 You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
-
 1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-sftp-json test`
-2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors).
+2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
+   - bump the `dockerImageTag` value in `metadata.yaml`
+   - bump the `version` value in `pyproject.toml`
 3. Make sure the `metadata.yaml` content is up to date.
-4. Make the connector documentation and its changelog is up to date (`docs/integrations/destinations/sftp-json.md`).
+4. Make sure the connector documentation and its changelog is up to date (`docs/integrations/destinations/sftp-json.md`).
 5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
 6. Pat yourself on the back for being an awesome contributor.
 7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
+8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
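
The commands in the updated README map onto the connector's Python entry points, so the same flow can also be driven directly from Python. Below is a hedged sketch (the stream name `users`, the record payload, and the use of `sample_files/configured_catalog.json` are illustrative assumptions, not part of this commit) of pushing one record through the destination's `write()`:

```python
# Hypothetical driver script; assumes `poetry install --with dev` has been run
# and that secrets/config.json exists as described in the README.
import json

from airbyte_cdk.models import AirbyteMessage, AirbyteRecordMessage, ConfiguredAirbyteCatalog, Type
from destination_sftp_json.destination import DestinationSftpJson

with open("secrets/config.json") as f:
    config = json.load(f)

# The configured catalog tells the destination which streams to expect.
catalog = ConfiguredAirbyteCatalog.parse_file("sample_files/configured_catalog.json")

record = AirbyteMessage(
    type=Type.RECORD,
    record=AirbyteRecordMessage(stream="users", data={"id": 1}, emitted_at=0),
)

# write() yields back the state messages it has durably handled.
for message in DestinationSftpJson().write(config, catalog, [record]):
    print(message.json(exclude_unset=True))
```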

airbyte-integrations/connectors/destination-sftp-json/build_customization.py
@@ -0,0 +1,22 @@
+#
+# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
+#
+from __future__ import annotations
+
+from typing import TYPE_CHECKING
+
+
+if TYPE_CHECKING:
+    from dagger import Container
+
+
+async def pre_connector_install(base_image_container: Container) -> Container:
+    """
+    Docker compose is required to run the integration tests so we install Docker on top of the base image.
+    """
+    return (
+        base_image_container.with_exec(["sh", "-c", "apt-get update && apt-get install -y curl"], use_entrypoint=True)
+        .with_exec(["curl", "-fsSL", "https://get.docker.com", "-o", "/tmp/install-docker.sh"], use_entrypoint=True)
+        .with_exec(["sh", "/tmp/install-docker.sh", "--version", "23.0"], use_entrypoint=True)
+        .with_exec(["rm", "/tmp/install-docker.sh"], use_entrypoint=True)
+    )
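
For context, and hedged since it is not shown in this diff: `build_customization.py` is the hook module that the `airbyte-ci` build pipeline looks for in a connector directory; when present, its optional async `pre_connector_install` / `post_connector_install` functions are awaited with the dagger base-image container. The container returned above, with Docker layered in for the docker-compose-based integration tests, becomes the image the connector code is installed into.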

airbyte-integrations/connectors/destination-sftp-json/destination_sftp_json/client.py

-1
@@ -2,7 +2,6 @@
 # Copyright (c) 2023 Airbyte, Inc., all rights reserved.
 #
 
-
 import contextlib
 import errno
 import json

airbyte-integrations/connectors/destination-sftp-json/destination_sftp_json/destination.py

+4-3
@@ -3,11 +3,11 @@
 #
 
 
+import logging
 import traceback
 import uuid
 from typing import Any, Iterable, Mapping
 
-from airbyte_cdk import AirbyteLogger
 from airbyte_cdk.destinations import Destination
 from airbyte_cdk.models import AirbyteConnectionStatus, AirbyteMessage, ConfiguredAirbyteCatalog, DestinationSyncMode, Status, Type
 from destination_sftp_json.client import SftpClient
@@ -47,12 +47,13 @@ def write(
                     yield message
                 elif message.type == Type.RECORD:
                     record = message.record
-                    writer.write(record.stream, record.data)
+                    if record is not None:
+                        writer.write(record.stream, record.data)
                 else:
                     # ignore other message types for now
                     continue
 
-    def check(self, logger: AirbyteLogger, config: Mapping[str, Any]) -> AirbyteConnectionStatus:
+    def check(self, logger: logging.Logger, config: Mapping[str, Any]) -> AirbyteConnectionStatus:
         """
         Tests if the input configuration can be used to successfully connect to the destination with the needed permissions
         e.g: if a provided API token or password can be used to connect and write to the destination.
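
The `check()` change above replaces the CDK's deprecated `AirbyteLogger` with the standard library's `logging.Logger`. A minimal sketch of calling it under the new signature; the host, port, and credentials mirror the integration-test fixture below and are assumptions for illustration:

```python
# Hedged example: exercise check() with a stdlib logger against a local SFTP server.
import logging

from destination_sftp_json.destination import DestinationSftpJson

logger = logging.getLogger("airbyte")
config = {
    "host": "localhost",
    "port": 2222,          # assumed: the host port mapped by the test docker-compose file
    "username": "user1",   # assumed: test-fixture credentials
    "password": "abc123",
    "destination_path": "upload",
}

status = DestinationSftpJson().check(logger, config)
print(status.status)  # Status.SUCCEEDED when the SFTP server is reachable
```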

airbyte-integrations/connectors/destination-sftp-json/integration_tests/acceptance.py
@@ -0,0 +1,17 @@
+#
+# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
+#
+from __future__ import annotations
+
+import pytest
+
+
+pytest_plugins = ("connector_acceptance_test.plugin",)
+
+
+@pytest.fixture(scope="session", autouse=True)
+def connector_setup():
+    """This fixture is a placeholder for external resources that acceptance test might require."""
+    # TODO: setup test dependencies
+    yield
+    # TODO: clean up test dependencies

airbyte-integrations/connectors/destination-sftp-json/integration_tests/conftest.py
@@ -0,0 +1,49 @@
+#
+# Copyright (c) 2023 Airbyte, Inc., all rights reserved.
+#
+
+import socket
+from pathlib import Path
+from typing import Mapping
+
+import paramiko
+import paramiko.client
+import pytest
+from paramiko.ssh_exception import SSHException
+
+
+HERE = Path(__file__).parent.absolute()
+
+
+@pytest.fixture(scope="session")
+def docker_compose_file() -> Path:
+    return HERE / "docker-compose.yml"
+
+
+def is_sftp_ready(ip: str, config: Mapping) -> bool:
+    """Helper function that checks if sftp is served on provided ip address and port."""
+    try:
+        with paramiko.client.SSHClient() as ssh:
+            ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy)
+            # hardcoding the credentials is okay here, we're not testing them explicitly.
+            ssh.connect(
+                ip,
+                port=config["port"],
+                username=config["username"],
+                password=config["password"],
+            )
+        return True
+    except (SSHException, socket.error):
+        return False
+
+
+@pytest.fixture(scope="module")
+def config(docker_ip, docker_services) -> Mapping:
+    """
+    Provides the SFTP configuration using docker_services.
+    Waits for the docker container to become available before returning the config.
+    """
+    port = docker_services.port_for("sftp", 22)
+    config_data = {"host": docker_ip, "port": port, "username": "user1", "password": "abc123", "destination_path": "upload"}
+    docker_services.wait_until_responsive(timeout=30.0, pause=0.1, check=lambda: is_sftp_ready(docker_ip, config_data))
+    return config_data
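
The `config` fixture is what individual integration tests consume: pytest-docker's `docker_ip` and `docker_services` fixtures stand up the compose file returned by `docker_compose_file`, and the fixture blocks until the SFTP container answers. A hypothetical test using it (not part of this commit) could look like:

```python
# integration_tests/test_check.py -- hypothetical example for illustration.
import logging

from airbyte_cdk.models import Status
from destination_sftp_json.destination import DestinationSftpJson


def test_check_succeeds_against_dockerized_sftp(config):
    """check() should pass once the fixture's SFTP container is accepting logins."""
    status = DestinationSftpJson().check(logging.getLogger("airbyte"), config)
    assert status.status == Status.SUCCEEDED
```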

airbyte-integrations/connectors/destination-sftp-json/integration_tests/docker-compose.yml
@@ -0,0 +1,12 @@
+version: "3"
+services:
+  sftp:
+    image: atmoz/sftp
+    ports:
+      - "2222:22"
+    command: user1:abc123:1001:1001:upload
+    healthcheck:
+      test: ["CMD", "nc", "-z", "localhost", "22"]
+      interval: 2s
+      timeout: 1s
+      retries: 5
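
A note on the compose file: `atmoz/sftp` parses its command string as `user:password:uid:gid:directory`, so `user1:abc123:1001:1001:upload` creates user `user1` (uid/gid 1001) with an `upload` directory, matching the credentials and `destination_path` hardcoded in `conftest.py`. The healthcheck probes port 22 with `nc` until sshd accepts connections, and the `2222:22` mapping is what the hedged examples above assume when they connect to port 2222 locally.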

Comments (0)