
logs warning if deduplication state is large #1877

Open · wants to merge 3 commits into base: devel

Conversation

willi-mueller (Collaborator)

Description

As a first step towards resolving the issue below, this PR logs a warning when the deduplication state grows beyond a large number of primary key hashes.

Related Issues

#1131


netlify bot commented Sep 26, 2024

Deploy Preview for dlt-hub-docs ready!

Latest commit: 4e59fb7
Latest deploy log: https://app.netlify.com/sites/dlt-hub-docs/deploys/66f6a650716eb00008923626
Deploy Preview: https://deploy-preview-1877--dlt-hub-docs.netlify.app

@willi-mueller willi-mueller force-pushed the feat/1131-warn-large-deduplication-state branch from cf80f0d to aeca74a Compare September 26, 2024 08:53
@willi-mueller willi-mueller changed the title logs warning if deduplication state is large logs warning if deduplication state is large (feat/1331) Sep 28, 2024
@willi-mueller willi-mueller changed the title logs warning if deduplication state is large (feat/1331) logs warning if deduplication state is large Sep 28, 2024
@willi-mueller willi-mueller self-assigned this Sep 28, 2024

```python
dedup_count = len(self._cached_state["unique_hashes"])
DEDUP_WARNING_THRESHOLD = 200
if dedup_count > DEDUP_WARNING_THRESHOLD:
```
Collaborator:
this will flood the user with messages. Please compare the number of hashes before `unique_hashes.update(transformer.unique_hashes)` and after, to see whether the threshold is crossed, so that the warning is displayed only once.
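The reviewer's suggestion can be sketched as follows. This is a minimal, hypothetical illustration, not the PR's actual code: the function name, the dict-based state, and the `True`/`False` return value are assumptions for demonstration; the real implementation lives inside dlt's incremental transformer and uses `dlt.common.logger`.

```python
# Hypothetical sketch: warn only when the hash count first crosses the
# threshold, by comparing the count before and after the update.
DEDUP_WARNING_THRESHOLD = 200


def update_unique_hashes(state: dict, new_hashes: set) -> bool:
    """Update the dedup state; return True only if the warning fired."""
    count_before = len(state["unique_hashes"])
    state["unique_hashes"].update(new_hashes)
    count_after = len(state["unique_hashes"])
    # Fire the warning only on the transition across the threshold,
    # not on every subsequent batch.
    if count_before <= DEDUP_WARNING_THRESHOLD < count_after:
        print(
            f"Deduplication state holds {count_after} hashes, "
            "which may slow down extraction."
        )
        return True
    return False
```

Because the check compares the counts on both sides of the update, later batches that arrive while the state is already above the threshold do not re-trigger the warning.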

```python
if dedup_count > DEDUP_WARNING_THRESHOLD:
    logger.warning(
        f"There are {dedup_count} records to be deduplicated because"
        f" they share the same primary key `{self.primary_key}`."
    )
```
Collaborator:

we need a much better explanation here:

  1. the primary key is not always known; in that case we hash the row content instead.
  2. the root cause is not the primary key: it is the low resolution of the cursor column, e.g. when it only has day granularity. Use the cursor column name here.

```python
logger_spy = mocker.spy(dlt.common.logger, "warning")
p = dlt.pipeline(pipeline_name=uniq_id())
p.extract(some_data(1))
logger_spy.assert_any_call(
```
Collaborator:
make sure it is displayed only once here
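A way to assert "only once" is to check the mock's call count rather than `assert_any_call`. The sketch below simulates the extraction loop with a stand-in function (`extract_batches` is hypothetical; the real test would drive `p.extract` and spy on `dlt.common.logger.warning` as above) and uses `unittest.mock` instead of the pytest-mock spy:

```python
from unittest import mock

# Hypothetical stand-in for the extraction loop: each batch adds hashes,
# and the warning fires only when the total first crosses the threshold.
def extract_batches(warn, batch_sizes, threshold=200):
    total = 0
    for size in batch_sizes:
        before = total
        total += size
        if before <= threshold < total:
            warn("deduplication state is large")


warn_mock = mock.Mock()
extract_batches(warn_mock, [150, 100, 100])
# Exactly one warning despite two batches above the threshold.
assert warn_mock.call_count == 1
```

`call_count == 1` fails if the warning is emitted on every batch, which is exactly the regression this review comment wants the test to catch.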

Development

Successfully merging this pull request may close these issues.

disable rows deduplication if Incremental is attached to a resource with merge write disposition
2 participants