
RunPod ComfyUI Template

RunPod ComfyUI Template - Is this just how things are with the current GPU shortage, or is it specific to RunPod? Several users report that RunPod is very rough around the edges and not production-worthy: community cloud instances advertise 800 Mbps yet get throttled to 500 kbps, and RunPod does little to detect or explain this state, leaving it to the user to discover for now. Others ask whether anyone has successfully deployed a ComfyUI workflow serverless. One contributor has been building Docker images for use on cloud providers (Vast, RunPod, TensorDock, etc.) and has just made them compatible with RunPod serverless for image generation. With their experience so far, some cannot recommend it for anything beyond simple experimentation. There are also open questions about a rough cost estimate for training an SD 1.5 LoRA, and whether AWS or RunPod is the better option.

Is RunPod still the best choice for both using and training SD 1.5 and SDXL checkpoints and LoRAs? Opinions are split: some are otherwise pretty happy RunPod customers, while others say RunPod's prices have increased and that it now hides important details about server quality. Tooling also exists to upload your SD models and such to a RunPod instance (or another server) with one click. The sections below collect these threads.

Related resources:

ComfyUILauncher/cloud/RUNPOD.md at main
ComfyFlow RunPod Template (ComfyFlow)
ComfyUI Tutorial: How To Install ComfyUI On Windows and RunPod
Manage Pod Templates (RunPod Documentation)
Blibla ComfyUI on RunPod
GitHub: ComfyUI Docker images
GitHub: Docker image for RunPod

RunPod's Prices Have Increased And They Now Hide Important Details About Server Quality.

With my experience so far, I cannot recommend it for anything beyond simple experimentation. Is RunPod still the best choice for both using and training SD 1.5 and SDXL checkpoints and LoRAs? Are there any alternatives similar to RunPod? Also, does anyone have a rough cost estimate for training an SD 1.5 LoRA?
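The cost question above comes down to simple arithmetic once you fix a GPU hourly rate and a training time. The sketch below illustrates that calculation; the specific numbers used in the example (a GPU at $0.44/hr, a 2-hour run, a 20 GB volume) are illustrative assumptions, not RunPod's actual pricing.

```python
def estimate_lora_cost(hourly_rate_usd: float, training_hours: float,
                       storage_gb: float = 0.0,
                       storage_rate_per_gb_hr: float = 0.0) -> float:
    """Estimate a cloud training bill: GPU time plus optional volume storage."""
    gpu_cost = hourly_rate_usd * training_hours
    storage_cost = storage_gb * storage_rate_per_gb_hr * training_hours
    return round(gpu_cost + storage_cost, 2)

# Illustrative figures only - check the provider's current price sheet.
print(estimate_lora_cost(0.44, 2.0, storage_gb=20, storage_rate_per_gb_hr=0.0001))
```

Under these assumed numbers an SD 1.5 LoRA run costs well under a dollar in GPU time; in practice the dominant variables are how many training steps you run and which GPU tier you rent.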

Has Anyone Successfully Deployed A ComfyUI Workflow Serverless?

Upload your SD models and such to a RunPod instance (or another server) with one click. Is anyone willing to share some insights? RunPod is very rough around the edges, and definitely not production-worthy: community cloud instances advertise 800 Mbps, yet I get throttled to 500 kbps.
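For the serverless question above, RunPod's Python SDK wires a handler function to its job queue. The outline below is a minimal sketch, not a complete deployment: `run_workflow` is a hypothetical stand-in for whatever actually executes the ComfyUI graph inside the worker image.

```python
import json


def run_workflow(workflow: dict) -> dict:
    # Hypothetical stand-in: a real worker would submit the graph to a local
    # ComfyUI instance and collect the generated images.
    return {"status": "ok", "nodes": len(workflow)}


def handler(job: dict) -> dict:
    """RunPod serverless entry point: receives a job dict with an 'input' payload."""
    workflow = job.get("input", {}).get("workflow")
    if workflow is None:
        return {"error": "no workflow supplied"}
    if isinstance(workflow, str):  # the workflow may arrive as a JSON string
        workflow = json.loads(workflow)
    return run_workflow(workflow)


if __name__ == "__main__":
    # Inside a RunPod serverless worker image, the SDK drives the job loop.
    import runpod  # assumes the runpod package is installed in the image
    runpod.serverless.start({"handler": handler})
```

The handler itself is plain Python, so it can be unit-tested locally without the SDK; only the guarded block at the bottom depends on running inside a RunPod worker.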

I've Been Building Docker Images For Use On Cloud Providers (Vast, RunPod, TensorDock, Etc.) And I've Just Made Them Compatible With RunPod Serverless For Image Generation.

RunPod, on the other hand, works 100% of the time, but the network throttling is ridiculous. I wish RunPod did a better job of detecting and explaining this state; unfortunately, for now they leave it up to the user to discover. After getting one too many low-quality servers, I'm not using RunPod anymore. Maybe on AWS or RunPod?
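A quick way to sanity-check the advertised-versus-actual bandwidth complaint above is to time a download from the pod and convert it to megabits per second. The helper below is a minimal sketch using only the standard library; the URL in the guarded section is a placeholder you would swap for a large file reachable from your own pod.

```python
import time
import urllib.request


def mbps(num_bytes: int, seconds: float) -> float:
    """Convert a transfer of num_bytes over seconds into megabits per second."""
    return (num_bytes * 8) / 1_000_000 / seconds


def measure_download(url: str, chunk_size: int = 1 << 20,
                     max_bytes: int = 50 << 20) -> float:
    """Download up to max_bytes from url and return the observed Mbps."""
    start = time.monotonic()
    total = 0
    with urllib.request.urlopen(url) as resp:
        while total < max_bytes:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            total += len(chunk)
    return mbps(total, time.monotonic() - start)


if __name__ == "__main__":
    # Placeholder URL - point this at a large file reachable from the pod.
    print(f"{measure_download('https://example.com/testfile'):.1f} Mbps")
```

Running this a few times at different hours makes it easy to document the gap between an advertised 800 Mbps link and the throughput you actually get.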

Or Is This Just How It Is With The Current GPU Shortage?

Is there a way to…? Aside from this, I'm a pretty happy RunPod customer.
