# SFTP-JSON Destination Connector

This is the repository for the SFTP-JSON destination connector, written in Python.
Data received by this destination is written to a JSON Lines file at the path and filename supplied in the configuration.
For information about how to use this connector within Airbyte, see [the documentation](https://docs.airbyte.com/integrations/destinations/sftp-json).
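This destination writes each incoming record as one JSON object per line (JSON Lines). As a minimal sketch of that output format — using made-up records and a local temp file in place of the configured remote SFTP path:

```python
import json
import os
import tempfile

# Made-up records standing in for data received from a source.
records = [{"id": 1, "name": "ada"}, {"id": 2, "name": "grace"}]

# Lay them out the way a JSON Lines file is structured: one object per line.
# (The real connector writes to the path configured on the SFTP server.)
path = os.path.join(tempfile.mkdtemp(), "example.jsonl")
with open(path, "w") as f:
    for record in records:
        f.write(json.dumps(record) + "\n")

# Reading the file back line by line recovers the original records.
with open(path) as f:
    round_tripped = [json.loads(line) for line in f]
assert round_tripped == records
```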

## Local development

### Prerequisites
* Python (~=3.9)
* Poetry (~=1.7) - installation instructions [here](https://python-poetry.org/docs/#installation)

### Installing the connector
From this connector directory, run:
```bash
poetry install --with dev
```

### Create credentials
**If you are a community contributor**, follow the instructions in the [documentation](https://docs.airbyte.com/integrations/destinations/sftp-json)
to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `destination_sftp_json/spec.yaml` file.
Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `sample_files/sample_config.json` for a sample config file.
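For a rough idea of the config's shape, a sketch follows. The field names and defaults below are assumptions based on a typical SFTP setup, not the connector's confirmed schema — treat `destination_sftp_json/spec.yaml` as the source of truth:

```python
import json

# Hypothetical secrets/config.json contents. Every field name here is an
# illustrative assumption -- consult destination_sftp_json/spec.yaml for
# the authoritative list of fields and types.
sample_config = {
    "host": "sftp.example.com",
    "port": 22,
    "username": "airbyte",
    "password": "REPLACE_ME",
    "destination_path": "/upload",
}

config_json = json.dumps(sample_config, indent=2)
print(config_json)
```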

### Locally running the connector
```
poetry run destination-sftp-json spec
poetry run destination-sftp-json check --config secrets/config.json
# messages.jsonl is a file containing line-separated JSON representing AirbyteMessages
cat messages.jsonl | poetry run destination-sftp-json write --config secrets/config.json --catalog sample_files/configured_catalog.json
```

### Running unit tests
To run unit tests locally, from the connector directory run:
```
poetry run pytest unit_tests
```
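Tests under `unit_tests/` are plain `pytest`. As a self-contained illustration of the style (the helper below is hypothetical, not part of the connector's actual code):

```python
# test_example.py -- illustrative only; real unit tests for this connector
# exercise its own helpers under unit_tests/.
import posixpath


def remote_path(destination_path: str, stream_name: str) -> str:
    """Hypothetical helper: build the remote file path for a stream."""
    return posixpath.join(destination_path, f"{stream_name}.jsonl")


def test_remote_path():
    # pytest discovers any function named test_* and runs its assertions.
    assert remote_path("/upload", "users") == "/upload/users.jsonl"
```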

### Building the docker image
1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)
2. Run the following command to build the docker image:
```bash
airbyte-ci connectors --name=destination-sftp-json build
```

An image will be available on your host with the tag `airbyte/destination-sftp-json:dev`.

### Running as a docker container
Run any of the connector commands as follows:
```
docker run --rm airbyte/destination-sftp-json:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-sftp-json:dev check --config /secrets/config.json
# messages.jsonl is a file containing line-separated JSON representing AirbyteMessages
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-sftp-json:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```
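A destination's `write` command consumes AirbyteMessages, one JSON object per line, on stdin. A sketch of producing such a `messages.jsonl` input file — the stream name and record data are hypothetical, and the stream must match one declared in the configured catalog:

```python
import json

# One RECORD message per line, following the Airbyte protocol message shape.
# Stream name and data are made-up examples.
messages = [
    {
        "type": "RECORD",
        "record": {
            "stream": "users",
            "data": {"id": 1, "name": "ada"},
            "emitted_at": 1700000000000,  # epoch milliseconds
        },
    }
]

with open("messages.jsonl", "w") as f:
    for message in messages:
        f.write(json.dumps(message) + "\n")
```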

### Running our CI test suite
You can run our full test suite locally using [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md):
```bash
airbyte-ci connectors --name=destination-sftp-json test
```

### Customizing acceptance tests
Customize the `acceptance-test-config.yml` file to configure acceptance tests. See [Connector Acceptance Tests](https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency Management
All of your dependencies should be managed via Poetry.
To add a new dependency, run:
```bash
poetry add <package-name>
```

Please commit the resulting changes to the `pyproject.toml` and `poetry.lock` files.

## Publishing a new version of the connector
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=destination-sftp-json test`
2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
   - bump the `dockerImageTag` value in `metadata.yaml`
   - bump the `version` value in `pyproject.toml`
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/destinations/sftp-json.md`).
5. Create a Pull Request: use [our PR naming conventions](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#pull-request-title-convention).
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.