nixos/ollama: ROCm broken for RX 6900 XT #375910
Try adding
I'll try that tomorrow and report. Thank you!
https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html The RX 6900 XT isn't currently supported. Refer to ROCm/ROCm#4276
But I think this can be closed as
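For reference, the two workarounds discussed in this thread target the same ROCm override. A minimal sketch, assuming the standard NixOS ollama module (gfx version 10.3.0 is the RDNA2 target used for the RX 6900 XT later in this thread):

```nix
{
  # Dedicated module option (used further down in this thread):
  services.ollama.rocmOverrideGfx = "10.3.0";

  # Equivalent environment-variable form, which the option wraps:
  # services.ollama.environmentVariables.HSA_OVERRIDE_GFX_VERSION = "10.3.0";
}
```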
You could also use the option `services.ollama.rocmOverrideGfx` instead of the environment variable. This works for me at the moment, together with `nixpkgs.config.rocmSupport = true;`:

```nix
services = {
  ollama = {
    enable = true;
    acceleration = "rocm";
    # Ollama port 11434/tcp
    host = "0.0.0.0";
    port = 11434;
    openFirewall = true;
    # Pin Ollama v0.5.7 until the nixpkgs update
    # https://github.com/NixOS/nixpkgs/issues/375359
    package = pinPackage {
      name = "ollama";
      commit = "d0169965cf1ce1cd68e50a63eabff7c8b8959743";
      sha256 = "sha256:1hh0p0p42yqrm69kqlxwzx30m7i7xqw9m8f224i3bm6wsj4dxm05";
    };
    rocmOverrideGfx = "10.3.0";
    # Additional environment variables
    # environmentVariables = { HSA_OVERRIDE_GFX_VERSION = "10.3.0"; };
  };
};
```

My `pinPackage` function definition:

```nix
pinPackage =
  {
    name,
    commit,
    sha256,
  }:
  (import (builtins.fetchTarball {
    inherit sha256;
    url = "https://github.com/NixOS/nixpkgs/archive/${commit}.tar.gz";
  }) { system = pkgs.system; }).${name};
```

Unfortunately, due to the issue "Build failure: rocmPackages.llvm.libcxx", I am currently unable to upgrade to the latest NixOS/nixpkgs revision 5df43628fdf08d642be8ba5b3625a6c70731c19c (2025-01-16 21:27:11), so there seems to be something wrong with ROCm at the moment. That also leaves me stuck on Ollama v0.5.4, which is why I had to override the package version with `services.ollama.package`. If your nixpkgs already contains Ollama v0.5.7, you can probably omit the package pin.

Very interesting that your nixpkgs seems to contain the new Ollama version but is not yet affected by #375745.

Config: I kept `services.ollama.acceleration` from a previous system configuration, but the documentation says that it also automatically follows `nixpkgs.config.rocmSupport`.
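The config above references `pinPackage` before its definition appears. A minimal sketch of how the two pieces could fit together in a single NixOS module (an assumption about file layout, not the author's exact setup; commit and hash are copied from the comment above):

```nix
{ pkgs, ... }:
let
  # Import a single package from a pinned nixpkgs revision.
  pinPackage =
    { name, commit, sha256 }:
    (import (builtins.fetchTarball {
      inherit sha256;
      url = "https://github.com/NixOS/nixpkgs/archive/${commit}.tar.gz";
    }) { system = pkgs.system; }).${name};
in
{
  services.ollama.package = pinPackage {
    name = "ollama";
    commit = "d0169965cf1ce1cd68e50a63eabff7c8b8959743";
    sha256 = "sha256:1hh0p0p42yqrm69kqlxwzx30m7i7xqw9m8f224i3bm6wsj4dxm05";
  };
}
```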
What do you mean? ROCm works great for me in Blender, and Ollama lists support for my GPU. I'll try the
Nixpkgs master that I use for the ollama module: Also, just in case, my ollama configuration:
Nixpkgs version
Describe the bug
ROCm works fine in Blender, but when run with Ollama I get this in the logs:
This might be similar to #375359, but that fix didn't solve my issue.
System metadata
"x86_64-linux"
Linux 6.12.9, NixOS, 24.11 (Vicuna), 24.11.20250119.107d5ef
yes
yes
nix-env (Nix) 2.24.11
""
/nix/store/sbnwhhx7zjp45n53bkmnj4whg79aw8pq-source
Notify maintainers
@abysssol
@dit7ya
@elohmeier
@RoyDubnium
Note for maintainers: Please tag this issue in your pull request description (i.e. Resolves #ISSUE).

I assert that this issue is relevant for Nixpkgs
Is this issue important to you?
Add a 👍 reaction to issues you find important.