logs warning if deduplication state is large #1877

Open · wants to merge 3 commits into devel
8 changes: 7 additions & 1 deletion dlt/extract/incremental/__init__.py

@@ -502,7 +502,13 @@ def __call__(self, rows: TDataItems, meta: Any = None) -> Optional[TDataItems]:
# add directly computed hashes
unique_hashes.update(transformer.unique_hashes)
self._cached_state["unique_hashes"] = list(unique_hashes)

dedup_count = len(self._cached_state["unique_hashes"])
DEDUP_WARNING_THRESHOLD = 200
if dedup_count > DEDUP_WARNING_THRESHOLD:
Collaborator:

This will flood the user with messages. Please compare the number of hashes before and after unique_hashes.update(transformer.unique_hashes) to see whether the threshold is crossed, so that the warning is displayed only once.
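A minimal sketch of that crossing check (illustrative only, not the final implementation; it reuses unique_hashes, DEDUP_WARNING_THRESHOLD, and logger from the diff above):

    count_before = len(unique_hashes)
    unique_hashes.update(transformer.unique_hashes)
    self._cached_state["unique_hashes"] = list(unique_hashes)
    # warn exactly once: only when this update pushes the count over the threshold
    if count_before <= DEDUP_WARNING_THRESHOLD < len(unique_hashes):
        logger.warning(...)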

    logger.warning(
        f"There are {dedup_count} records to be deduplicated because"
        f" they share the same primary key `{self.primary_key}`."
Collaborator:

We need a much better explanation here:

  1. The primary key is sometimes not known; in that case we hash the row content instead.
  2. The root cause is not the primary key but the low resolution of the cursor column (e.g. a cursor with day precision). Use the cursor column name in the message.
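For example, something along these lines (wording illustrative only; assumes the incremental instance exposes cursor_path and that self.primary_key may be None):

    key_info = (
        f"the primary key `{self.primary_key}`"
        if self.primary_key
        else "a hash of the row content (no primary key is set)"
    )
    logger.warning(
        f"Large deduplication state: {dedup_count} rows share the same value of the"
        f" cursor column '{self.cursor_path}'. This usually means the cursor has low"
        f" resolution (e.g. day precision); rows are deduplicated using {key_info}."
    )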

    )
return rows


21 changes: 21 additions & 0 deletions tests/extract/test_incremental.py

@@ -2586,3 +2586,24 @@ def updated_is_int(updated_at=dlt.sources.incremental("updated_at", initial_valu
pipeline.run(updated_is_int())
assert isinstance(pip_ex.value.__cause__, IncrementalCursorInvalidCoercion)
assert pip_ex.value.__cause__.cursor_path == "updated_at"


@pytest.mark.parametrize("item_type", ALL_TEST_DATA_ITEM_FORMATS)
@pytest.mark.parametrize("primary_key", ["id", None])
def test_warning_large_deduplication_state(item_type: TestDataItemFormat, primary_key, mocker):
    @dlt.resource(primary_key=primary_key)
    def some_data(
        created_at=dlt.sources.incremental("created_at"),
    ):
        yield data_to_item_format(
            item_type,
            [{"id": i, "created_at": 1} for i in range(201)],
        )

    logger_spy = mocker.spy(dlt.common.logger, "warning")
    p = dlt.pipeline(pipeline_name=uniq_id())
    p.extract(some_data(1))
    logger_spy.assert_any_call(
Collaborator:

Make sure the warning is displayed only once here.

"There are 201 records to be deduplicated because they share the same primary key"
f" `{primary_key}`."
)
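One way to assert the warning fires exactly once (a sketch, assuming the warning text keeps a stable substring; it filters the spy's recorded calls):

    dedup_warnings = [
        c for c in logger_spy.call_args_list
        if "records to be deduplicated" in str(c.args[0])
    ]
    assert len(dedup_warnings) == 1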