Fix schema violation for timezone data (#259) #278
base: master
Conversation
Codecov Report
@@            Coverage Diff             @@
##           master     #278      +/-   ##
==========================================
- Coverage   89.88%   89.86%    -0.02%
==========================================
  Files          39       39
  Lines        3746     3749        +3
  Branches      911      915        +4
==========================================
+ Hits         3367     3369        +2
  Misses        223      223
- Partials      156      157        +1
Continue to review full report at Codecov.
Force-pushed from 16d2233 to b7ea769 (compare)
@@ -78,6 +77,35 @@ def _schema_compat(self):

        schema = schema.remove_metadata()
        md = {b"pandas": _dict_to_binary(pandas_metadata)}
        # https://github.com/JDASoftwareGroup/kartothek/issues/259

        if schema is not None:
Why can `schema` be `None` here? Three lines above we just call `schema = schema.remove_metadata()`. Can this return `None`?
        if schema is not None:
            fields = []
            for f in schema:
A short comment before the entire massaging step would be nice. This function just does more and more, and it is hard to follow along.
                fields.append(f)
            return pa.schema(fields, schema.metadata)
        return schema
A direct return from the line above would work now as well.
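The fragments above only show part of the massaging step being discussed. As a rough sketch of the field-by-field rebuild pattern, under the assumption that the goal is to keep the stored Arrow schema consistent with the pandas metadata for timezone-aware columns (the `_normalize_field` helper and its body are hypothetical illustrations, not the actual kartothek change):

```python
import pyarrow as pa


def _normalize_field(field):
    # Hypothetical illustration: coerce timezone-aware timestamp fields to a
    # canonical representation so the stored schema matches the pandas metadata.
    if pa.types.is_timestamp(field.type) and field.type.tz is not None:
        return pa.field(field.name, pa.timestamp("us", tz=field.type.tz), field.nullable)
    return field


def rebuild_schema(schema, metadata):
    # Mirror the loop added in _schema_compat: rebuild the schema field by
    # field and reattach the metadata at the end.
    if schema is not None:
        fields = [_normalize_field(f) for f in schema]
        return pa.schema(fields, metadata=metadata)
    return schema
```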
@@ -53,4 +52,7 @@ def test_arrow_compat(arrow_version, reference_store, mocker):
    if arrow_version in ("0.14.1", "0.15.0", "0.16.0") and not ARROW_LARGER_EQ_0141:
        orig = orig.astype({"null": float})

    if LooseVersion(arrow_version) < "0.13.0":
Can you add a short comment on the reason? Is this because timezones cannot be preserved with these old versions?
Because the changes from @ged-steponavicius fix #259, enabling proper compatibility of datetimes with timezones (i.e. these columns), and we currently only support pyarrow >= 0.13.0.
Then why do we need to check `< "0.13.0"`? That's dead code, isn't it?
`arrow_version` here refers to the arrow version with which the reference data file was created.
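For context, the columns affected by #259 are timezone-aware datetime columns. A minimal illustration of such data (an editorial example, not taken from the PR or its reference files):

```python
import pandas as pd
import pyarrow as pa

# A timezone-aware datetime column of the kind affected by #259.
df = pd.DataFrame(
    {"ts": pd.date_range("2020-01-01", periods=3, freq="h", tz="Europe/Berlin")}
)

# Both the Arrow field type and the pandas metadata stored on the schema
# record the timezone.
table = pa.Table.from_pandas(df)
print(table.schema)  # ts: timestamp[ns, tz=Europe/Berlin] (plus pandas metadata)
```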
Suggested change:
if LooseVersion(arrow_version) < "0.13.0":
if LooseVersion(arrow_version) < "0.13.0":  # gh-259: compatibility of datetimes with timezones is only supported in kartothek versions with pyarrow >= 0.13.0
suggestion looks good to me, thanks :)
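To make the gist of that gated test path concrete, here is a minimal sketch, assuming the reference files are parametrized by the pyarrow version that wrote them and that the pre-0.13.0 branch compares tz-aware columns as naive datetimes (the helper name and its body are illustrative, not the actual test code):

```python
from distutils.version import LooseVersion

import pandas as pd


def expected_frame(orig: pd.DataFrame, arrow_version: str) -> pd.DataFrame:
    # gh-259: reference files written with pyarrow < 0.13.0 could not carry
    # timezone information, so tz-aware columns are compared as naive datetimes.
    if LooseVersion(arrow_version) < "0.13.0":
        orig = orig.copy()
        for col in orig.select_dtypes(include=["datetimetz"]).columns:
            orig[col] = orig[col].dt.tz_localize(None)
    return orig
```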
This PR now also includes the appropriate changelog entry, and I added some commits to update the reference Parquet files. @ged-steponavicius feel free to push to this branch if you'd like to address the review comments.
Supersedes #260
I tried to rebase this branch on top of master but it was quite the adventure to do so.
I excluded the changes to the changelog and it seems like some removal of comments is also missing, but the important parts should be there.
cc @ged-steponavicius