Logging to BigQuery can fail if the number of rows to insert is too large #3008
Labels
fixed; not released
Issues that have been fixed on the develop branch but have not yet been part of a tagged release.
Pulled from #2989. I ran into an issue where load_bq.py would fail because the number of rows was too large for insert_rows. My workaround was to batch over slices of the `jobs` list with a batch size of 10,000 (which I think is the upper limit for insert_rows). I can send a PR with the patch, but I was a bit hesitant given the lack of unit tests for the script; adding those would be a bit of work (faking the client, etc.).