make clearer #1046
base: main
Conversation
06_gpu_and_ml/comfyui/comfyapp.py
Outdated
@@ -72,126 +48,105 @@
.run_commands(  # use comfy-cli to install the ComfyUI repo and its dependencies
    "comfy --skip-prompt install --nvidia"
Minor note: I don't think there's a way to pin which version of ComfyUI you're installing, which represents some risk.
Ah, I was under the impression the ComfyUI version was pinned by the comfy-cli version, but maybe that's not the case.
🚀 The docs preview is ready! Check it out here: https://modal-labs-examples--frontend-preview-7e1f16b.modal.run
06_gpu_and_ml/comfyui/comfyapp.py
Outdated
#
# 2. Stand up the ComfyUI server in development mode:
# To run this simple text-to-image [Flux Schnell workflow](https://github.com/modal-labs/modal-examples/blob/main/06_gpu_and_ml/comfyui/workflow_api.json) as an API:
# 1. Stand up the ComfyUI server in development mode:
Suggested change:
- # 1. Stand up the ComfyUI server in development mode:
+ # 1. Start up the ComfyUI server in development mode:
06_gpu_and_ml/comfyui/comfyapp.py
Outdated
@@ -72,126 +48,105 @@
.run_commands(  # use comfy-cli to install the ComfyUI repo and its dependencies
    "comfy --skip-prompt install --nvidia"
)
.add_local_file(
.add_local_file(  # copy ComfyUI workflow JSON to the container
    Path(__file__).parent / "workflow_api.json",
    "/root/workflow_api.json",
    copy=True,
We don't need `copy=True` here if we move this step to the end of the image build, since we don't use the file in the subsequent steps. If we remove `copy=True` and make it the last step of the image build, we will avoid rebuilding subsequent image layers whenever we modify the workflow file.
vol = modal.Volume.from_name("comfyui-models", create_if_missing=True)
# symlink the model to the right ComfyUI directory
Hm, why are we symlinking in the `hf_download` function that is run as an image layer? Shouldn't we only need to symlink it in the `@enter` method of the function, or just before we start the comfy server in the interactive function?
I did it in the `hf_download` function because `hf_hub_download` returns the filepath where the model is saved, so I wanted to do the symlink within the same scope, if that makes sense?
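The pattern under discussion can be sketched with plain stdlib calls (this is illustrative only: the function name mirrors the example's `hf_download`, but a stub file stands in for the real `hf_hub_download` call, and the directory layout is assumed):

```python
import os
from pathlib import Path


def hf_download(cache_dir: Path, models_dir: Path) -> Path:
    """Download a model and symlink it into ComfyUI's models directory
    in the same scope, since the download call returns the file path."""
    # stand-in for: model_path = hf_hub_download(repo_id=..., filename=..., cache_dir=cache_dir)
    model_path = cache_dir / "flux1-schnell.safetensors"
    model_path.write_bytes(b"fake weights")

    # symlink the downloaded model into the directory ComfyUI scans
    target = models_dir / "checkpoints" / model_path.name
    target.parent.mkdir(parents=True, exist_ok=True)
    os.symlink(model_path, target)
    return target
```

The trade-off both comments circle around: doing this at image-build time bakes the symlink into a layer, while doing it in `@enter` re-creates it at container start.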
06_gpu_and_ml/comfyui/comfyapp.py
Outdated
# To run a workflow as an API:
# 1. Stand up a "headless" ComfyUI server in the background when the app starts.
# 2. Define an `infer` method that takes in a workflow path and runs the workflow on the ComfyUI server.
# 3. Stand up a web handler `api` with `web_endpoint`, so that we can run our workflow as a service and accept inputs from clients.
Suggested change:
- # 3. Stand up a web handler `api` with `web_endpoint`, so that we can run our workflow as a service and accept inputs from clients.
+ # 3. Start up a web handler `api` with `web_endpoint`, so that we can run our workflow as a service and accept inputs from clients.
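The three steps in that comment can be sketched as a Modal class (a sketch only, not the example's actual code: the app name, GPU choice, image, and `comfy` CLI invocations are assumptions, and newer modal client versions may name the endpoint decorator differently):

```python
import subprocess

import modal

app = modal.App("comfyui-api-sketch")  # hypothetical app name

image = modal.Image.debian_slim()  # placeholder; the real example builds a ComfyUI image


@app.cls(image=image, gpu="any")
class ComfyUI:
    @modal.enter()
    def launch_comfy_background(self):
        # 1. stand up a "headless" ComfyUI server when the container starts
        subprocess.Popen("comfy launch --background", shell=True)

    @modal.method()
    def infer(self, workflow_path: str = "/root/workflow_api.json"):
        # 2. run a workflow JSON against the running ComfyUI server
        subprocess.run(f"comfy run --workflow {workflow_path} --wait", shell=True, check=True)

    @modal.web_endpoint(method="POST")
    def api(self, item: dict):
        # 3. accept client inputs (e.g. a prompt) and run the workflow as a service
        ...
```

Since this is a deployment sketch rather than runnable logic, the method bodies are indicative only.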
Great feedback @kramstrom, working on it.
🚀 The docs preview is ready! Check it out here: https://modal-labs-examples--frontend-preview-b3fb736.modal.run
06_gpu_and_ml/comfyui/comfyapp.py
Outdated
mounts=[  # mount workflow JSON that we want to serve to the container
    modal.Mount.from_local_file(
        Path(__file__).parent / "workflow_api.json",
        remote_path="/root/workflow_api.json",
    )
],
Sorry, this was my bad: mounts on classes / functions are about to be deprecated (I think there's already a warning about it, or about to be), so the previous implementation was correct with `add_local_file` on the Image. I was just suggesting to put it as the very last step of the image build, so we don't need the `copy=True` and subsequent steps aren't invalidated whenever the workflow file changes.
🚀 The docs preview is ready! Check it out here: https://modal-labs-examples--frontend-preview-9d64a08.modal.run
Type of Change

Checklist
- `lambda-test: false` is added to example frontmatter (`---`)
- `modal run` or an alternative `cmd` is provided in the example frontmatter (e.g. `cmd: ["modal", "deploy"]`)
- `args` are provided in the example frontmatter (e.g. `args: ["--prompt", "Formula for room temperature superconductor:"]`)
- latest `python_version` for the base image, if it is used
- dependencies are pinned to `~=x.y.z` or `==x.y`; dependencies with version < 1 are pinned to patch version, `==0.y.z`
Outside contributors
You're great! Thanks for your contribution.