This issue is related to submission 0d9ebdfe-c920-11ed-a325-5376c1e42e63
In this submission we planned to map all of HuBMAP public primary data. I noticed that the submission is failing because, given the complexity of the entity relationships, several entities reference each other.
For example, multiple collections can reference the same sample, so the sample ID appears more than once in the table. We would like to keep the original IDs in the submission, since they are generated as part of our ingestion process.
Please advise on how to proceed with this matter.
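For context, the failure described above is a duplicate-key violation: the table declares the ID column unique, so a second row with the same ID fails validation. Here is a minimal stdlib sketch of that check (the `sample_id` column name and the TSV layout are assumptions for illustration, not the actual submission schema):

```python
import csv
import io

def find_duplicate_ids(tsv_text, key="sample_id"):
    """Return the set of key values that appear more than once."""
    seen, dupes = set(), set()
    for row in csv.DictReader(io.StringIO(tsv_text), delimiter="\t"):
        value = row[key]
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return dupes

# Two collections referencing the same sample yields a repeated ID.
tsv = "sample_id\tcollection\nHBM123\tcoll-a\nHBM123\tcoll-b\nHBM456\tcoll-a\n"
print(sorted(find_duplicate_ids(tsv)))  # ['HBM123']
```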
This is a class of error handled by frictionless validation. When I run it in the CLI, it groups the error under the specific subject.tsv file being checked. However, with the way we obtain a validation report from the frictionless API, we seem to lose that TSV file context. I've opened nih-cfde/cfde-deriva#396 as a potential enhancement to the submission pipeline, but I don't have a schedule for when it might be looked at.
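To illustrate what "losing the file context" means, here is a sketch of regrouping flattened validation errors back under their source file, mirroring what the CLI prints per checked TSV. The flat `(file, message)` input shape is an assumption for illustration and is not the actual frictionless report structure:

```python
from collections import defaultdict

def group_errors_by_file(flat_errors):
    """Group (file, message) pairs so each error keeps its TSV context."""
    grouped = defaultdict(list)
    for filename, message in flat_errors:
        grouped[filename].append(message)
    return dict(grouped)

# Hypothetical flattened errors as they might come out of an API report.
errors = [
    ("subject.tsv", 'duplicate value "HBM123" in primary key'),
    ("sample.tsv", 'duplicate value "HBM456" in primary key'),
    ("subject.tsv", "missing required column"),
]
report = group_errors_by_file(errors)
print(report["subject.tsv"])
```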