> this is erroring on a review (bibat), which means there is an outlier format in the archive that isn't being handled the way we would like it to be handled.
Whoops! Not sure how I missed this on the first go-around. I tracked this down, and it was because the JOSS DOI field was left blank. I have updated the parser/validator to handle missing data like this in b6b9f27.
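The actual change in b6b9f27 isn't shown here, but handling a blank DOI field could look something like this minimal sketch, assuming a pydantic-style model (the field names are illustrative, not the real pyosmeta schema):

```python
from typing import Optional

from pydantic import BaseModel, field_validator


class ReviewModel(BaseModel):
    # Hypothetical subset of the review metadata; field names are assumptions.
    package_name: str
    joss_doi: Optional[str] = None

    @field_validator("joss_doi", mode="before")
    @classmethod
    def empty_doi_to_none(cls, value):
        # Treat a blank or whitespace-only JOSS DOI field as missing
        # rather than letting it fail DOI validation downstream.
        if value is None or not str(value).strip():
            return None
        return str(value).strip()
```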
One thing I noticed when running it locally is that it appears to hang. I suspect it's slower because adding python-doi means we are parsing and checking URLs/DOIs for each package. For now, we might consider adding more output so a user knows it's doing something. Maybe printing the name of the package it's currently processing to the terminal would be good? I almost thought it was broken, and then I saw it was just processing, only slower than it was previously.
Ooof, I didn't realize how much slower it was until now. Indeed, it is the DOI validation step that is causing this slowdown. I can add a print statement, perhaps in this for-loop, to indicate which package is currently being processed.
With this dramatic a slowdown, I'm considering not doing the full validation of the DOI URLs to ensure they resolve correctly. Perhaps these could be gathered, and at the end of critical build steps (like the documentation build) we check these DOI URLs to validate that they resolve. For now, I'm adding a simple print statement in b25ebbc to indicate progress, but I think we should come back to this later.

Originally posted by @banesullivan in #243 (comment)
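A rough sketch of both ideas from the quoted comments, print progress inside the processing loop and defer the actual DOI resolution to a single batch step at the end. The function and field names here are assumptions, not the real pyosmeta code:

```python
import requests


def process_packages(packages):
    """Hypothetical processing loop: print progress and collect DOIs for later."""
    to_resolve = []
    for pkg in packages:
        # Progress output so a local run doesn't look like it's hanging.
        print(f"Processing: {pkg['package_name']}")
        doi = pkg.get("joss_doi")
        if doi:
            to_resolve.append((pkg["package_name"], doi))
    return to_resolve


def check_dois(doi_list, timeout=10):
    """Deferred validation step: resolve each DOI URL once, at the end of the build."""
    bad = []
    for name, doi in doi_list:
        try:
            resp = requests.head(
                f"https://doi.org/{doi}", allow_redirects=True, timeout=timeout
            )
            if resp.status_code >= 400:
                bad.append((name, doi, resp.status_code))
        except requests.RequestException as err:
            bad.append((name, doi, str(err)))
    return bad
```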
The above issue was noted by @banesullivan as we work on the DOI workflow.
One idea is that we build a separate DOI validation workflow that runs independently once a month to notify us of incorrect DOIs. In theory, once we have a DOI we shouldn't have to check it again. Maybe there is a way to adapt the workflow to only test new DOIs, and maybe we run it less frequently than on every build.
For instance, in our contributor workflow, we start by processing the existing contribs in the .yml file. I don't know if there is something we could learn from there.
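One thing we could borrow from that pattern is starting from what is already on disk and only checking DOIs we haven't validated before. A minimal sketch, assuming previously validated DOIs are cached in a small YAML file (the file name and keys are made up, not the real contributor workflow layout):

```python
import yaml


def dois_to_check(reviews, validated_file="validated_dois.yml"):
    """Return only the DOIs that have not already been validated."""
    try:
        with open(validated_file) as f:
            already_valid = set(yaml.safe_load(f) or [])
    except FileNotFoundError:
        # First run: nothing has been validated yet, so check everything.
        already_valid = set()

    new_dois = []
    for review in reviews:
        doi = review.get("joss_doi")
        if doi and doi not in already_valid:
            new_dois.append(doi)
    return new_dois
```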
> build a separate doi validation workflow that runs independently once a month to notify us of incorrect dois
I like this idea! I could easily set up a dedicated script and GHA workflow to do this. We could even have the GHA comment on the problematic review issue noting that the DOI is invalid, which would ping the original authors and editor for that review.
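The core of that dedicated script could look something like the sketch below. It resolves DOIs with plain requests rather than python-doi, and the input shape and function names are assumptions; a cron-scheduled GHA job could run it monthly and use the failing output to comment on the affected review issues:

```python
import sys

import requests


def doi_resolves(doi, timeout=10):
    """Return True if the DOI URL resolves; a sketch, not the python-doi API."""
    try:
        resp = requests.head(
            f"https://doi.org/{doi}", allow_redirects=True, timeout=timeout
        )
        return resp.status_code < 400
    except requests.RequestException:
        return False


def main(dois_by_issue):
    """dois_by_issue maps a review issue number to its DOIs (hypothetical input shape)."""
    failures = {}
    for issue, dois in dois_by_issue.items():
        bad = [doi for doi in dois if not doi_resolves(doi)]
        if bad:
            failures[issue] = bad
            print(f"Review issue #{issue}: invalid DOIs: {', '.join(bad)}")
    # A non-zero exit makes the scheduled workflow fail visibly when any DOI is broken.
    sys.exit(1 if failures else 0)
```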
lwasser changed the title from "Validation of DOI's" to "feat: Validate JOSS and Zenodo dois in a separate workflow" on Jan 22, 2025.