How to run ComfyUI on WSL
AMD has unveiled a new way to get the most out of its GPUs on Windows. The approach uses the Windows Subsystem for Linux (WSL) to install ROCm (AMD's GPU computing platform) and run ComfyUI, a node-based interface for AI image-generation models.
AMD recently detailed in a blog post how Windows users with its Radeon™ graphics cards can install ROCm and run PyTorch and ComfyUI in a WSL-based Ubuntu environment.
This combination provides powerful performance and flexibility, especially for developers looking to run large AI models, such as Stable Diffusion, locally.
To summarize the installation process: WSL and Ubuntu installation begins by running wsl --install -d Ubuntu-24.04 in PowerShell. Next, download and install the amdgpu-install package from AMD's official repository.
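The first two steps can be sketched as the commands below. The amdgpu-install download URL and version are placeholders, not taken from the article — substitute the current release listed on AMD's repository (repo.radeon.com) for your Ubuntu version:

```shell
# In PowerShell (run as Administrator): install WSL with Ubuntu 24.04
wsl --install -d Ubuntu-24.04

# Inside the Ubuntu shell: fetch and install the amdgpu-install package.
# <version> is a placeholder -- check https://repo.radeon.com/amdgpu-install/
# for the current release before downloading.
wget https://repo.radeon.com/amdgpu-install/<version>/ubuntu/noble/amdgpu-install_<version>_all.deb
sudo apt install ./amdgpu-install_<version>_all.deb
```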
To install ROCm, build the ROCm environment with the amdgpu-install -y --usecase=wsl,rocm --no-dkms command.
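Concretely, the ROCm step uses the flags quoted above; running rocminfo afterwards is a simple way to confirm the GPU is visible (a sketch — the verification step is an assumption, drawn from the guide's later mention of rocminfo):

```shell
# Install the WSL + ROCm use case. --no-dkms skips the kernel driver module,
# since WSL uses the GPU driver provided by the Windows host instead.
amdgpu-install -y --usecase=wsl,rocm --no-dkms

# Verify the installation: rocminfo should list the Radeon GPU as an agent.
rocminfo
```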
Installing ComfyUI involves setting up a Python virtual environment, manually installing the ROCm-optimized PyTorch package, and then cloning ComfyUI from GitHub.
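Those three sub-steps might look like the following sketch. The virtual-environment name and the PyTorch wheel index URL are placeholders — use the exact wheel location given in AMD's instructions; the ComfyUI repository is the upstream comfyanonymous/ComfyUI project on GitHub:

```shell
# Create and activate an isolated Python virtual environment
python3 -m venv comfyui-env
source comfyui-env/bin/activate

# Install the ROCm-optimized PyTorch wheels. <rocm-version> is a
# placeholder -- take the real index URL from AMD's blog post.
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/<rocm-version>

# Clone ComfyUI from GitHub
git clone https://github.com/comfyanonymous/ComfyUI.git
cd ComfyUI
```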
AMD recommends using ROCm-specific wheel files instead of the official packages when installing PyTorch in a ROCm environment.
This ensures the build takes full advantage of GPU acceleration. To prevent conflicts, the torch, torchaudio, and torchvision entries in ComfyUI's requirements.txt are commented out before the remaining dependencies are installed.
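Commenting out the three packages can be done by hand in an editor, or with a one-line sed edit, shown here on a small sample requirements.txt (the sample file contents are illustrative, not the real ComfyUI list):

```shell
# Sample requirements file standing in for ComfyUI's requirements.txt
printf 'torch\ntorchsde\ntorchvision\ntorchaudio\n' > requirements.txt

# Prefix the torch, torchvision, and torchaudio lines with '#' so pip
# does not overwrite the ROCm wheels installed earlier. The \b word
# boundary keeps similarly named packages like torchsde untouched.
sed -i -E 's/^(torch|torchvision|torchaudio)\b/#\1/' requirements.txt

cat requirements.txt
# Afterwards, install the remaining dependencies as usual:
#   pip install -r requirements.txt
```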
This guide goes beyond a simple installation manual and is carefully structured to cover everything from checking GPU information (rocminfo) to configuring virtual environments and preventing library conflicts.
In particular, running ComfyUI on ROCm-based PyTorch under WSL gives Windows users a high-performance, Linux-based AI development environment without leaving Windows.
AMD's latest offering goes beyond simple technical guidance: it gives Windows-based AI developers a new way of working that combines the flexibility of Linux with the power of GPU acceleration.