DataHub Python Builds

These prebuilt wheel files install our Python packages as built at a specific commit, without waiting for a PyPI release.

Build context

Built at 2026-04-24T05:28:18.234367+00:00.

{
  "timestamp": "2026-04-24T05:28:18.234367+00:00",
  "branch": "feat/sigma-dm-containers",
  "commit": {
    "hash": "0b6a6b088685d93f89eec306d06a7ef746d62545",
    "message": "fix(ingest/sigma): DM container description, /lineage pagination, pagination cycle guard\n\nAddresses PR review feedback on the DM ingestion hardening.\n\nM1 (Major): ``gen_containers`` for a Sigma DM received\n``description=data_model.description or \"\"``, so a DM with no\ndescription sent ``\"\"`` to GMS and would blank out any description the\nuser had edited in the DataHub UI on the next ingest. Pass ``None``\nthrough instead, matching the element-Dataset fix and the Qlik\nconnector pattern.\n\nM2 (Major): ``_get_data_model_lineage_entries`` read only the first\npage of ``/dataModels/{id}/lineage``. For a DM whose lineage spans\nmultiple pages, elements on page 2+ silently lost their upstreams\n(``data_model_element_upstreams_unresolved`` did not bump because the\nelement's ``source_ids`` were simply absent). Split the HTTP loop out\nof ``_paginated_entries`` into ``_paginated_raw_entries`` and have\n``/lineage`` consume it. Expected \"no lineage\" statuses (400/403/404/\n500) are still swallowed silently via a new ``silent_statuses``\nparameter applied only to the first page.\n\nM3 (Major): ``_paginated_raw_entries`` now tracks already-seen\n``nextPage`` / ``nextPageToken`` cursors in a ``Set[str]``. A broken\nSigma proxy that echoes the same cursor on every response used to loop\nforever and accumulate duplicates; it now breaks with a report\nwarning.\n\nm1 (polish): DM element Datasets now carry\n``externalUrl=?:nodeId=`` so users can\nclick from a DM-element Dataset back into Sigma. Mirrors the\ndeep-link shape used for workbook elements in ``get_page_elements``.\n\nm3 (polish): ``SigmaDataModelElement._discard_api_bare_string_columns``\nnow emits a ``logger.debug`` when it drops bare-string columns, so an\noperator investigating an empty-schema DM element can see the columns\nwere intentionally deferred to ``/columns``.\n\nm4 (polish): ``TestAssembleDataModelFileMetaFallback`` replaces the\n``patch.stopall()`` try/finally pattern with a nested\n``with patch.object(...)`` context-manager helper, guaranteeing\ncleanup even if an assertion raises mid-test.\n\nTests: adds ``TestPaginatedRawEntries`` covering multi-page via\n``nextPage`` and ``nextPageToken``, the cycle-protection break, the\nsilent-status swallow, and an end-to-end multi-page ``/lineage``\nregression for M2."
  },
  "base": {
    "hash": "af552cce4caa59c833522b3f3e5d3343bcb71ffe",
    "message": "fix: secrets can paginate now (#17172)"
  },
  "pr": {
    "number": 17173,
    "title": "feat(ingest/sigma): emit Data Models as Containers with per-element Datasets",
    "url": "https://github.com/datahub-project/datahub/pull/17173"
  }
}
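The M2/M3 pagination changes described in the commit message (follow `nextPage`/`nextPageToken` cursors, but break on a cursor that has already been seen) can be sketched as follows. This is an illustrative sketch, not the connector's actual code; `paginated_raw_entries` and the page shape are hypothetical stand-ins for `_paginated_raw_entries`.

```python
from typing import Callable, Iterator, Optional, Set


def paginated_raw_entries(fetch_page: Callable[[Optional[str]], dict]) -> Iterator[dict]:
    """Yield entries across all pages of a cursor-paginated endpoint.

    Tracks cursors already served in a set; a broken proxy that echoes
    the same cursor on every response would otherwise loop forever and
    duplicate entries, so we break instead (the real connector also
    emits a report warning at this point).
    """
    seen_cursors: Set[str] = set()
    cursor: Optional[str] = None
    while True:
        page = fetch_page(cursor)  # e.g. GET /dataModels/{id}/lineage?page=<cursor>
        yield from page.get("entries", [])
        cursor = page.get("nextPage") or page.get("nextPageToken")
        if cursor is None:
            break  # no more pages
        if cursor in seen_cursors:
            break  # cycle detected: stop instead of accumulating duplicates
        seen_cursors.add(cursor)


# A fake server whose second page echoes its own cursor forever:
pages = {
    None: {"entries": [{"id": 1}, {"id": 2}], "nextPage": "a"},
    "a": {"entries": [{"id": 3}], "nextPage": "a"},
}
print(list(paginated_raw_entries(lambda c: pages[c])))  # → [{'id': 1}, {'id': 2}, {'id': 3}]
```

Without the `seen_cursors` guard, the fake server above would be fetched for cursor `"a"` indefinitely, yielding `{"id": 3}` on every pass.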

Usage

Current base URL: unknown

| Package | Size | Install command |
| --- | --- | --- |
| acryl-datahub | 3.589 MB | `uv pip install 'acryl-datahub @ <base-url>/artifacts/wheels/acryl_datahub-0.0.0.dev1-py3-none-any.whl'` |
| acryl-datahub-actions | 0.105 MB | `uv pip install 'acryl-datahub-actions @ <base-url>/artifacts/wheels/acryl_datahub_actions-0.0.0.dev1-py3-none-any.whl'` |
| acryl-datahub-airflow-plugin | 0.108 MB | `uv pip install 'acryl-datahub-airflow-plugin @ <base-url>/artifacts/wheels/acryl_datahub_airflow_plugin-0.0.0.dev1-py3-none-any.whl'` |
| acryl-datahub-dagster-plugin | 0.020 MB | `uv pip install 'acryl-datahub-dagster-plugin @ <base-url>/artifacts/wheels/acryl_datahub_dagster_plugin-0.0.0.dev1-py3-none-any.whl'` |
| acryl-datahub-gx-plugin | 0.011 MB | `uv pip install 'acryl-datahub-gx-plugin @ <base-url>/artifacts/wheels/acryl_datahub_gx_plugin-0.0.0.dev1-py3-none-any.whl'` |
| prefect-datahub | 0.011 MB | `uv pip install 'prefect-datahub @ <base-url>/artifacts/wheels/prefect_datahub-0.0.0.dev1-py3-none-any.whl'` |
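Each install command is a PEP 508 direct-URL requirement, so `<base-url>` must be substituted with wherever this build's artifacts are actually hosted. A minimal sketch of scripting that substitution; the host below is a hypothetical stand-in, not a real endpoint:

```shell
# Hypothetical base URL -- replace with the real artifact host for this build.
BASE_URL="https://example.com/builds/0b6a6b0"
WHEEL="acryl_datahub-0.0.0.dev1-py3-none-any.whl"

# Assemble the PEP 508 direct-URL requirement and print the install command.
SPEC="acryl-datahub @ ${BASE_URL}/artifacts/wheels/${WHEEL}"
echo "uv pip install '${SPEC}'"
```

Plain `pip install` accepts the same `name @ url` requirement string if `uv` is not available.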