
[wip] fix(ui): add external url button for ml entities for v2 #12893

Draft · wants to merge 2 commits into base: master
Merge branch 'master' into enable-ml-external-url

d782583
GitHub Actions / Unit Test Results (GX Plugin) failed Mar 17, 2025 in 0s

2 fail, 4 pass in 59s

6 tests ±0    4 ✅ ±0    2 ❌ ±0    0 💤 ±0
1 suites ±0   1 files ±0   59s ⏱️ -1s

Results for commit d782583. ± Comparison against earlier commit 6a8de85.

Annotations

Check warning on line 0 in tests.integration.test_great_expectations


@github-actions github-actions / Unit Test Results (GX Plugin)

test_ge_ingest[test_checkpoint-ge_mcps_golden.json] (tests.integration.test_great_expectations) failed

artifacts/Test Results (GX Plugin 3.11)/metadata-ingestion-modules/gx-plugin/junit.quick.xml [took 11s]
Raw output
NotImplementedError: DeltaInfoOperator needs to define a normalize_value_for_hashing method to be compatible with ignore_order=True or iterable_compare_func.
docker_compose_runner = <function docker_compose_runner.<locals>.run at 0x7f6f08b87100>
pytestconfig = <_pytest.config.Config object at 0x7f6f33f0aa10>
tmp_path = PosixPath('/tmp/pytest-of-runner/pytest-0/test_ge_ingest_test_checkpoint0')
checkpoint = 'test_checkpoint', golden_json = 'ge_mcps_golden.json', kwargs = {}
test_resources_dir = PosixPath('/home/runner/work/datahub/datahub/metadata-ingestion-modules/gx-plugin/tests/integration')
docker_services = Services(_docker_compose=DockerComposeExecutor(_compose_command='docker compose', _compose_files=[PosixPath('/home/run...x-plugin/tests/integration/docker-compose.yml')], _compose_project_name='pytest4080-great-expectations'), _services={})
mock_emit_mcp = <MagicMock name='emit_mcp' id='140114864208720'>
emitter = <test_great_expectations.MockDatahubEmitter object at 0x7f6f08b57950>
gx_context_folder_name = 'gx'
context = {
  "anonymous_usage_statistics": {
    "explicit_id": true,
    "data_context_id": "e6cb8e56-7193-436a-a131-7639f607e...d": true,
        "base_directory": "checkpoints/"
      }
    }
  },
  "validations_store_name": "validations_store"
}

    @freeze_time(FROZEN_TIME)
    @pytest.mark.integration
    @pytest.mark.parametrize(
        "checkpoint, golden_json",
        [
            ("test_checkpoint", "ge_mcps_golden.json"),
            ("test_checkpoint_2", "ge_mcps_golden_2.json"),
        ],
    )
    def test_ge_ingest(
        docker_compose_runner,
        pytestconfig,
        tmp_path,
        checkpoint,
        golden_json,
        **kwargs,
    ):
        test_resources_dir = pytestconfig.rootpath / "tests/integration"
    
        with docker_compose_runner(
            test_resources_dir / "docker-compose.yml", "great-expectations"
        ) as docker_services, mock.patch(
            "datahub.emitter.rest_emitter.DatahubRestEmitter.emit_mcp"
        ) as mock_emit_mcp:
            wait_for_port(docker_services, "ge_postgres", 5432)
    
            emitter = MockDatahubEmitter("")
            mock_emit_mcp.side_effect = emitter.emit_mcp
    
            gx_context_folder_name = "gx" if use_gx_folder else "great_expectations"
            shutil.copytree(
                test_resources_dir / "setup/great_expectations",
                tmp_path / gx_context_folder_name,
            )
    
            context = FileDataContext.create(tmp_path)
            context.run_checkpoint(checkpoint_name=checkpoint)
    
            emitter.write_to_file(tmp_path / "ge_mcps.json")
    
>           assert_metadata_files_equal(
                output_path=tmp_path / "ge_mcps.json",
                golden_path=test_resources_dir / golden_json,
                ignore_paths=[],
            )

tests/integration/test_great_expectations.py:79: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../metadata-ingestion/src/datahub/testing/compare_metadata_json.py:90: in assert_metadata_files_equal
    diff = diff_metadata_json(output, golden, ignore_paths, ignore_order=ignore_order)
../../metadata-ingestion/src/datahub/testing/compare_metadata_json.py:126: in diff_metadata_json
    return MCPDiff.create(
../../metadata-ingestion/src/datahub/testing/mcp_diff.py:174: in create
    diff = DeepDiff(
venv/lib/python3.11/site-packages/deepdiff/diff.py:351: in __init__
    self._diff(root, parents_ids=frozenset({id(t1)}), _original_type=_original_type)
venv/lib/python3.11/site-packages/deepdiff/diff.py:1723: in _diff
    self._diff_iterable(level, parents_ids, _original_type=_original_type, local_tree=local_tree)
venv/lib/python3.11/site-packages/deepdiff/diff.py:741: in _diff_iterable
    self._diff_iterable_with_deephash(level, parents_ids, _original_type=_original_type, local_tree=local_tree)
venv/lib/python3.11/site-packages/deepdiff/diff.py:1285: in _diff_iterable_with_deephash
    full_t1_hashtable = self._create_hashtable(level, 't1')
venv/lib/python3.11/site-packages/deepdiff/diff.py:1101: in _create_hashtable
    deep_hash = DeepHash(
venv/lib/python3.11/site-packages/deepdiff/deephash.py:224: in __init__
    self._hash(obj, parent=parent, parents_ids=frozenset({get_id(obj)}))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <[TypeError("object of type 'object' has no len()") raised in repr()] DeepHash object at 0x7f6f0aa58710>
obj = 
		{"urn": "urn:li:assertion:9729dfafea4bb2c2f114bc80e513a7ec", "change_type": "UPSERT", "aspect_name": "assertionInfo", "aspect": "<aspect>"}
parent = 'root[0]', parents_ids = frozenset({'!>*id140115375587472'})

    def _hash(self, obj, parent, parents_ids=EMPTY_FROZENSET):
        """The main hash method"""
        counts = 1
        if self.custom_operators is not None:
            for operator in self.custom_operators:
                func = getattr(operator, 'normalize_value_for_hashing', None)
                if func is None:
>                   raise NotImplementedError(f"{operator.__class__.__name__} needs to define a normalize_value_for_hashing method to be compatible with ignore_order=True or iterable_compare_func.".format(operator))
E                   NotImplementedError: DeltaInfoOperator needs to define a normalize_value_for_hashing method to be compatible with ignore_order=True or iterable_compare_func.

venv/lib/python3.11/site-packages/deepdiff/deephash.py:516: NotImplementedError
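Both parametrized failures trace to the same deepdiff requirement: when `ignore_order=True` is combined with `custom_operators`, every operator must expose a `normalize_value_for_hashing` method so that DeepHash can hash list items before comparing them. The sketch below is illustrative only — the class name, the no-op normalization, and the sample value are assumptions, not DataHub's actual `DeltaInfoOperator` implementation — but it shows the shape of the hook the guard in the traceback looks up:

```python
class DemoDeltaOperator:
    """Hypothetical stand-in for DataHub's DeltaInfoOperator, sketching the
    hook that deepdiff's DeepHash looks up before hashing iterable items."""

    def match(self, level) -> bool:
        # Standard deepdiff operator hook: return True to handle this level.
        return False

    def give_up_diffing(self, level, diff_instance) -> bool:
        # Standard deepdiff operator hook: return True to skip default diffing.
        return False

    def normalize_value_for_hashing(self, parent, obj):
        # The method the traceback says is missing. Returning the value
        # unchanged preserves prior diff behavior while satisfying the check.
        return obj


def check_operator(operator):
    """Mimics the DeepHash guard shown in the traceback above."""
    func = getattr(operator, "normalize_value_for_hashing", None)
    if func is None:
        raise NotImplementedError(
            f"{operator.__class__.__name__} needs to define a "
            "normalize_value_for_hashing method to be compatible with "
            "ignore_order=True or iterable_compare_func."
        )
    # Hand the guard a sample (parent, value) pair, as DeepHash would.
    return func("root[0]", {"urn": "urn:li:assertion:demo"})
```

With the method present, the guard returns the (unchanged) value instead of raising, and DeepHash can proceed to build its hashtable for the ordered-insensitive comparison.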

Check warning on line 0 in tests.integration.test_great_expectations


test_ge_ingest[test_checkpoint_2-ge_mcps_golden_2.json] (tests.integration.test_great_expectations) failed

artifacts/Test Results (GX Plugin 3.11)/metadata-ingestion-modules/gx-plugin/junit.quick.xml [took 6s]
Raw output
NotImplementedError: DeltaInfoOperator needs to define a normalize_value_for_hashing method to be compatible with ignore_order=True or iterable_compare_func.
docker_compose_runner = <function docker_compose_runner.<locals>.run at 0x7f6f08b87100>
pytestconfig = <_pytest.config.Config object at 0x7f6f33f0aa10>
tmp_path = PosixPath('/tmp/pytest-of-runner/pytest-0/test_ge_ingest_test_checkpoint1')
checkpoint = 'test_checkpoint_2', golden_json = 'ge_mcps_golden_2.json'
kwargs = {}
test_resources_dir = PosixPath('/home/runner/work/datahub/datahub/metadata-ingestion-modules/gx-plugin/tests/integration')
docker_services = Services(_docker_compose=DockerComposeExecutor(_compose_command='docker compose', _compose_files=[PosixPath('/home/run...x-plugin/tests/integration/docker-compose.yml')], _compose_project_name='pytest4080-great-expectations'), _services={})
mock_emit_mcp = <MagicMock name='emit_mcp' id='140115369048848'>
emitter = <test_great_expectations.MockDatahubEmitter object at 0x7f6f26cca9d0>
gx_context_folder_name = 'gx'
context = {
  "anonymous_usage_statistics": {
    "explicit_id": true,
    "data_context_id": "e6cb8e56-7193-436a-a131-7639f607e...d": true,
        "base_directory": "checkpoints/"
      }
    }
  },
  "validations_store_name": "validations_store"
}

    @freeze_time(FROZEN_TIME)
    @pytest.mark.integration
    @pytest.mark.parametrize(
        "checkpoint, golden_json",
        [
            ("test_checkpoint", "ge_mcps_golden.json"),
            ("test_checkpoint_2", "ge_mcps_golden_2.json"),
        ],
    )
    def test_ge_ingest(
        docker_compose_runner,
        pytestconfig,
        tmp_path,
        checkpoint,
        golden_json,
        **kwargs,
    ):
        test_resources_dir = pytestconfig.rootpath / "tests/integration"
    
        with docker_compose_runner(
            test_resources_dir / "docker-compose.yml", "great-expectations"
        ) as docker_services, mock.patch(
            "datahub.emitter.rest_emitter.DatahubRestEmitter.emit_mcp"
        ) as mock_emit_mcp:
            wait_for_port(docker_services, "ge_postgres", 5432)
    
            emitter = MockDatahubEmitter("")
            mock_emit_mcp.side_effect = emitter.emit_mcp
    
            gx_context_folder_name = "gx" if use_gx_folder else "great_expectations"
            shutil.copytree(
                test_resources_dir / "setup/great_expectations",
                tmp_path / gx_context_folder_name,
            )
    
            context = FileDataContext.create(tmp_path)
            context.run_checkpoint(checkpoint_name=checkpoint)
    
            emitter.write_to_file(tmp_path / "ge_mcps.json")
    
>           assert_metadata_files_equal(
                output_path=tmp_path / "ge_mcps.json",
                golden_path=test_resources_dir / golden_json,
                ignore_paths=[],
            )

tests/integration/test_great_expectations.py:79: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../metadata-ingestion/src/datahub/testing/compare_metadata_json.py:90: in assert_metadata_files_equal
    diff = diff_metadata_json(output, golden, ignore_paths, ignore_order=ignore_order)
../../metadata-ingestion/src/datahub/testing/compare_metadata_json.py:126: in diff_metadata_json
    return MCPDiff.create(
../../metadata-ingestion/src/datahub/testing/mcp_diff.py:174: in create
    diff = DeepDiff(
venv/lib/python3.11/site-packages/deepdiff/diff.py:351: in __init__
    self._diff(root, parents_ids=frozenset({id(t1)}), _original_type=_original_type)
venv/lib/python3.11/site-packages/deepdiff/diff.py:1723: in _diff
    self._diff_iterable(level, parents_ids, _original_type=_original_type, local_tree=local_tree)
venv/lib/python3.11/site-packages/deepdiff/diff.py:741: in _diff_iterable
    self._diff_iterable_with_deephash(level, parents_ids, _original_type=_original_type, local_tree=local_tree)
venv/lib/python3.11/site-packages/deepdiff/diff.py:1285: in _diff_iterable_with_deephash
    full_t1_hashtable = self._create_hashtable(level, 't1')
venv/lib/python3.11/site-packages/deepdiff/diff.py:1101: in _create_hashtable
    deep_hash = DeepHash(
venv/lib/python3.11/site-packages/deepdiff/deephash.py:224: in __init__
    self._hash(obj, parent=parent, parents_ids=frozenset({get_id(obj)}))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <[TypeError("object of type 'object' has no len()") raised in repr()] DeepHash object at 0x7f6f16190710>
obj = 
		{"urn": "urn:li:assertion:789be88e19f54e4cae03aebf28d5ce18", "change_type": "UPSERT", "aspect_name": "assertionInfo", "aspect": "<aspect>"}
parent = 'root[0]', parents_ids = frozenset({'!>*id140115368690512'})

    def _hash(self, obj, parent, parents_ids=EMPTY_FROZENSET):
        """The main hash method"""
        counts = 1
        if self.custom_operators is not None:
            for operator in self.custom_operators:
                func = getattr(operator, 'normalize_value_for_hashing', None)
                if func is None:
>                   raise NotImplementedError(f"{operator.__class__.__name__} needs to define a normalize_value_for_hashing method to be compatible with ignore_order=True or iterable_compare_func.".format(operator))
E                   NotImplementedError: DeltaInfoOperator needs to define a normalize_value_for_hashing method to be compatible with ignore_order=True or iterable_compare_func.

venv/lib/python3.11/site-packages/deepdiff/deephash.py:516: NotImplementedError
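Both failures share one root cause in the test harness's deepdiff dependency rather than in this PR's UI change. If updating the custom operator is not immediately feasible, one common stopgap is to pin deepdiff below the release that introduced this requirement. The boundary version below is an assumption and should be verified against the deepdiff changelog before committing:

```
# Hypothetical constraint for the gx-plugin test environment; verify the
# exact boundary version against the deepdiff changelog first.
deepdiff<8
```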

Check notice on line 0 in .github


6 tests found

There are 6 tests, see "Raw output" for the full list of tests.
Raw output
tests.integration.test_great_expectations ‑ test_ge_ingest[test_checkpoint-ge_mcps_golden.json]
tests.integration.test_great_expectations ‑ test_ge_ingest[test_checkpoint_2-ge_mcps_golden_2.json]
tests.unit.test_great_expectations_action ‑ test_DataHubValidationAction_graceful_failure
tests.unit.test_great_expectations_action ‑ test_DataHubValidationAction_not_supported
tests.unit.test_great_expectations_action ‑ test_DataHubValidationAction_pandas
tests.unit.test_great_expectations_action ‑ test_DataHubValidationAction_sqlalchemy