A large number of older webcomics have large png and jpg files (some nicely optimized, some not). For long-term archiving it would be nice to have a space-saving alternative, such as converting files to webp automatically as they're downloaded. Currently I'm using cbxconverter, but that's a Windows-only GUI program that (at least on my laptop) takes longer to process each image than it takes to download it, and it can only be run after the comic has been zipped, not while it downloads.
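For comparison, libwebp's cwebp tool already handles this conversion from the command line, so a per-file hook wouldn't need much: the sketch below assumes the hook would pass the just-downloaded file as $image (a hypothetical calling convention, not an existing option).
cwebp -q 80 $image -o ${image%.*}.webp && rm $image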
If this is an unwanted feature, it would also be nice to have a more general post-processing option: a hook to run a custom command/script on each downloaded file, and another to run on each directory once a given comic is complete.
For individual post-processing, this would allow, for example:
converting an image to webp:
convert -trim $image $image.webp && rm $image
reducing image quality:
convert -quality 75 $image $image.jpg
For post-processing on directories, this would allow, for example:
creating customized cbz files:
for a in 01 02 03; do zip -m -0 $(dirname $PWD)_20$a.cbz 20$a*; done
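To sketch how the directory-level hook might look in practice, assuming it simply receives the finished comic's directory as $1 (again a hypothetical convention), a one-liner could pack everything into a cbz named after that directory:
cd "$1" && zip -m -0 "../$(basename "$1").cbz" ./*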