🐛 Destination Databricks: Fix the S3 table path to use schema from the configured schema instead of the default. #35396
Conversation
@gisripa What do you think?
@tanawatpan We had to pin the version.
@gisripa Sure, I'll run a forked build for now and watch for the updated connector.
Hi @gisripa, how soon is the revamp of the Databricks connector coming? Is there a roadmap or issue I can track?
What
The Airbyte Databricks connector for S3 currently constructs table paths using a default schema value, so when multiple schemas contain tables with the same name, the resulting S3 paths can overlap or collide.
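For illustration only (the bucket, prefix, and table names below are hypothetical and not taken from the connector), here is how a path built from a single default schema makes identically named tables from different source schemas collide:

```java
// Illustrative only: with a single default schema in the path, streams named
// "users" from two different source schemas resolve to the same S3 prefix.
public final class DefaultSchemaCollisionExample {
    public static void main(final String[] args) {
        final String defaultSchema = "default";
        final String fromSales     = "s3://bucket/data/" + defaultSchema + "/users"; // stream from schema "sales"
        final String fromMarketing = "s3://bucket/data/" + defaultSchema + "/users"; // stream from schema "marketing"
        System.out.println(fromSales.equals(fromMarketing)); // true -> the table paths overlap
    }
}
```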
How
Instead of using a default schema value, the connector will use the schema from the configured Destination Namespace. This change allows distinct, schema-specific paths for each table.
🚨 User Impact 🚨
The schema of the S3 destination location will be derived from the Destination Namespace (if specified).
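To make the intended behavior concrete, below is a minimal sketch of the schema-resolution idea, assuming hypothetical helper names (`resolveSchema`, `buildS3TablePath`) and a simplified path layout; it is not the connector's actual code:

```java
// Minimal sketch (not the connector's actual classes): resolve the schema for a
// stream before building its S3 table path, preferring the stream's configured
// Destination Namespace over the connector-level default schema.
import java.util.Locale;

public final class S3TablePathExample {

    // Hypothetical default, standing in for the connector's configured "schema" setting.
    private static final String DEFAULT_SCHEMA = "default";

    // Use the stream's namespace (Destination Namespace) when it is set,
    // otherwise fall back to the default schema.
    static String resolveSchema(final String streamNamespace) {
        return (streamNamespace == null || streamNamespace.isBlank())
            ? DEFAULT_SCHEMA
            : streamNamespace;
    }

    // Build a schema-qualified path so identically named tables in different
    // schemas no longer share a prefix.
    static String buildS3TablePath(final String bucketPath, final String schema, final String tableName) {
        return String.format(Locale.ROOT, "%s/%s/%s", bucketPath, schema, tableName);
    }

    public static void main(final String[] args) {
        System.out.println(buildS3TablePath("data", resolveSchema("sales"), "users"));     // data/sales/users
        System.out.println(buildS3TablePath("data", resolveSchema("marketing"), "users")); // data/marketing/users
        System.out.println(buildS3TablePath("data", resolveSchema(null), "users"));        // data/default/users
    }
}
```

The key point is the fallback order: the stream's Destination Namespace wins when present, and the connector-level default schema is only used when no namespace is configured.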
Test