Parts/Chapters again? #215
Comments
I'm having the same problem - maybe it's nested chapters?
There's a fix for nested chapters, but other stuff is broken on the master branch for me. So I manually flattened the chapters.json.
I went ahead and did the manual flattening thing. In case it helps anyone else, here's the structure. This represents a single chapter nested inside a "part". The offset and length of the "Part" are recorded after the chapter. This is the piece of the book where the reader says something like "Part eight". It's very short, and its starting offset is before the chapter above it.
To flatten it out, move the "Part" fields into their own set of curly braces and put that entry before the chapter. I don't know whether the order matters, but it worked for me. Like so:
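To make the before/after concrete, here is a hypothetical fragment in the shape described above. The titles, offsets, and lengths are invented for illustration, and the key names (`start_offset_ms`, `length_ms`) are an assumption based on the usual chapter metadata layout; check them against your own file. Nested, with the "Part" fields recorded after the chapter:

```json
{
  "chapters": [
    { "title": "Chapter 30", "start_offset_ms": 37100000, "length_ms": 1800000 }
  ],
  "title": "Part Eight",
  "start_offset_ms": 37000000,
  "length_ms": 100000
}
```

Flattened, with the "Part" pulled out into its own entry placed before the chapter:

```json
{ "title": "Part Eight", "start_offset_ms": 37000000, "length_ms": 100000 },
{ "title": "Chapter 30", "start_offset_ms": 37100000, "length_ms": 1800000 }
```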
Here you go (https://jqplay.org/s/nGtqx7eMzsm):
find all arrays within chapter_info recursively, delete the nested "chapters" arrays, then sort the chapters by offset
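For anyone who would rather not use jq, here is a rough Python sketch of the same recipe: recursively lift entries out of nested "chapters" arrays into one flat list, then sort by offset. It assumes the file layout is `content_metadata` → `chapter_info` → `chapters` and that entries carry a `start_offset_ms` key; those names may differ in your file, so verify before running.

```python
def flatten_chapters(chapters):
    """Lift entries out of nested "chapters" arrays into one flat list."""
    flat = []
    for ch in chapters:
        nested = ch.pop("chapters", None)  # drop the nested array from this entry
        flat.append(ch)
        if nested:
            flat.extend(flatten_chapters(nested))
    return flat

def flatten_chapter_file(data):
    """Flatten chapter_info in place and re-sort entries by start offset.

    Assumes the top-level layout data["content_metadata"]["chapter_info"].
    """
    info = data["content_metadata"]["chapter_info"]
    info["chapters"] = sorted(
        flatten_chapters(info["chapters"]),
        key=lambda ch: ch["start_offset_ms"],
    )
    return data
```

To use it, load the chapters file with `json.load`, pass the result through `flatten_chapter_file`, and write it back out with `json.dump`.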
Had the same problem; the following worked for me:
A little heads-up for anyone also using audiobook-binder: the default setting is to not create files longer than 12 hours. I recommend setting this to infinity, because having the whole book in one file is a lot more convenient than having 2-3 separate files to deal with. Also, the quality defaults to 128 kbps, which is a lot for voice-only AAC and can be lowered quite a bit without noticeable loss of quality. I wonder, though, whether it's possible to use the audible-cli data for chapters while outputting only one file, because when I remove the ...
I found #184 and thought it would solve my problem, so I pulled the latest code and tried again.
Unfortunately, I still get only 4 mp3 files when I try to convert ASIN B088C4DBYP. The total output is about 4 MB, while the AAXC file is nearly 1 GB. Here is my command line:
In case it helps, here's the JSON file too (as a .txt, because GitHub won't allow .json attachments).
Heavens_River_Bobiverse_Book_4-chapters.json.txt