Humble request for commented ipynb notebook with example face training #117
AlongForTheR1de started this conversation in Ideas
-
I created one here: https://github.com/cian0/lora/tree/master/training_scripts (see the notebook file, the accompanying captioned training images, and the safetensors output). @brian6091 and @cloneofsimo were actively helping me understand how to use the repo, so I think we're in the same boat. I chose Simu Liu as the subject because I was having a hard time getting new concepts to train, but I got it working now. See the output images:
-
Hi folks,
I've been following all your awesome developments closely and have managed to get basic LoRA training running with a few modifications to the kohya_ss repo so it runs on Linux. My outputs were not stellar after multiple attempts, but there are so many different ways I could be failing, and the methods available there naturally lag a bit behind the latest and greatest.
For a basically literate but not experienced person like me, it would be amazing if someone could put together a well-commented Jupyter notebook that walks through the best training methods you've arrived at so far. If the training dataset could be made available too, so we could actually run through it tutorial-style, that would be fantastic. No need for a video or anything, just a script that starts with training data, produces the LoRA network, then applies the network to a txt2img prompt.
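For reference, my kohya-style training step boils down to a single command. Everything below is a sketch, not a recipe: the paths and hyperparameter values are illustrative placeholders, and exact flag availability depends on your sd-scripts version, so check `python train_network.py --help` against your checkout.

```shell
# hypothetical paths and settings; adjust to your own dataset and base model
python train_network.py \
  --pretrained_model_name_or_path="runwayml/stable-diffusion-v1-5" \
  --train_data_dir="./train_images" \
  --output_dir="./output" \
  --output_name="my_lora" \
  --network_module=networks.lora \
  --save_model_as=safetensors \
  --resolution=512 \
  --train_batch_size=1 \
  --learning_rate=1e-4 \
  --max_train_steps=1000
```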
Also, it would be wonderful if this could include whatever the stand-alone command is to convert .pt to .safetensors format, so the final output can be used in the Auto1111 extension. I've figured out that I can save a safetensors file by passing --save_model_as=safetensors to kohya's train_network.py script, but for the life of me I cannot figure out what that flag actually does under the hood, as their code comments are a bit sparse.
Thanks!!