
nixos/ollama rocm broken for RX 6900 XT. #375910

Open
voronind-com opened this issue Jan 22, 2025 · 7 comments
Labels
0.kind: bug Something is broken 6.topic: nixos Issues or PRs affecting NixOS modules, or package usability issues specific to NixOS

Comments

@voronind-com
Contributor

voronind-com commented Jan 22, 2025

Nixpkgs version

  • Unstable (25.05)

Describe the bug

ROCm works fine in Blender, but when running Ollama I get this in the logs:

Jan 22 17:52:59 desktop ollama[196524]: time=2025-01-22T17:52:59.580+03:00 level=WARN source=amd_linux.go:349 msg="unable to verify rocm library: no suitable rocm found, falling back to CPU"
$ ollama --version
Warning: could not connect to a running Ollama instance
Warning: client version is 0.5.7

This might be similar to #375359, but that fix didn't solve my issue.

System metadata

  • system: "x86_64-linux"
  • host os: Linux 6.12.9, NixOS, 24.11 (Vicuna), 24.11.20250119.107d5ef
  • multi-user?: yes
  • sandbox: yes
  • version: nix-env (Nix) 2.24.11
  • channels(root): ""
  • nixpkgs: /nix/store/sbnwhhx7zjp45n53bkmnj4whg79aw8pq-source

Notify maintainers

@abysssol
@dit7ya
@elohmeier
@RoyDubnium

Note for maintainers: Please tag this issue in your pull request description. (i.e. Resolves #ISSUE.)

I assert that this issue is relevant for Nixpkgs


@voronind-com voronind-com added 0.kind: bug Something is broken 6.topic: nixos Issues or PRs affecting NixOS modules, or package usability issues specific to NixOS labels Jan 22, 2025
@SigmaSquadron
Contributor

SigmaSquadron commented Jan 22, 2025

Try adding services.ollama.environmentVariables.HSA_OVERRIDE_GFX_VERSION = "10.3.0"; to your system configuration. I have a RX 6800 XT, so you may need to adjust the GFX version.
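For context, a minimal sketch of where that line sits in a NixOS configuration (the option paths are from the nixos/ollama module as used in this thread; the override value is the one suggested above and may need adjusting for other GPUs):

```nix
# Hedged sketch: applies the HSA override suggested above for RDNA2 cards.
# "10.3.0" corresponds to gfx1030; adjust to match your GPU's architecture.
{
  services.ollama = {
    enable = true;
    acceleration = "rocm";
    environmentVariables = {
      HSA_OVERRIDE_GFX_VERSION = "10.3.0";
    };
  };
}
```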

@voronind-com
Contributor Author

Try adding services.ollama.environmentVariables.HSA_OVERRIDE_GFX_VERSION = "10.3.0"; to your system configuration. I have a RX 6800 XT, so you may need to adjust the GFX version.

I'll try that tomorrow and report. Thank you!

@fabianhjr
Member

@SigmaSquadron
Contributor

SigmaSquadron commented Jan 22, 2025

Unsupported has never stopped anyone from trying :)

But I think this can be closed as wontfix, even if the override works.

@zierf

zierf commented Jan 23, 2025

Try adding services.ollama.environmentVariables.HSA_OVERRIDE_GFX_VERSION = "10.3.0"; to your system configuration. I have a RX 6800 XT, so you may need to adjust the GFX version.

You could also use the option services.ollama.rocmOverrideGfx instead of the environment variable.


This works for me at the moment with an RX 6900 XT.

  nixpkgs.config.rocmSupport = true;
  services = {
    ollama = {
      enable = true;
      acceleration = "rocm";
      # Ollama Port 11434/tcp
      host = "0.0.0.0";
      port = 11434;
      openFirewall = true;
      # pin ollama v0.5.7 until nixpkgs update
      # https://github.com/NixOS/nixpkgs/issues/375359
      package = (pinPackage {
        name = "ollama";
        commit = "d0169965cf1ce1cd68e50a63eabff7c8b8959743";
        sha256 = "sha256:1hh0p0p42yqrm69kqlxwzx30m7i7xqw9m8f224i3bm6wsj4dxm05";
      });
      rocmOverrideGfx = "10.3.0";
      # additional environment variables
      # environmentVariables = { HSA_OVERRIDE_GFX_VERSION="10.3.0"; };
    };
  };
My pinPackage function definition:
      pinPackage =
        {
          name,
          commit,
          sha256,
        }:
        (import (builtins.fetchTarball {
          inherit sha256;
          url = "https://github.com/NixOS/nixpkgs/archive/${commit}.tar.gz";
        }) { system = pkgs.system; }).${name};

Unfortunately, due to the issue "Build failure: rocmPackages.llvm.libcxx", I am currently unable to upgrade to the latest available nixpkgs and continue using this revision:

NixOS/nixpkgs/5df43628fdf08d642be8ba5b3625a6c70731c19c (2025-01-16 21:27:11)

So there seems to be something wrong with ROCm at the moment. But now I'm also stuck on Ollama v0.5.4, so I had to override the package version with services.ollama.package. If your nixpkgs already contains Ollama v0.5.7, you can probably omit the services.ollama.package property.

Very interesting that your nixpkgs seems to contain the new Ollama version but is not yet affected by #375745.
Could you also tell me the revision of the nixpkgs you are currently using?


Setting nixpkgs.config.rocmSupport combined with services.ollama.rocmOverrideGfx = "10.3.0"; should be enough to enable ROCm for our cards.

I kept services.ollama.acceleration from a previous system configuration, but the documentation says it automatically follows nixpkgs.config.rocmSupport, so you can try without setting it explicitly.
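Putting those points together, a minimal sketch of a configuration that should be sufficient on its own (assuming your nixpkgs already carries a working Ollama, so no package pin is needed; option names are from the nixos/ollama module discussed above):

```nix
# Hedged sketch: ROCm acceleration for an RX 6900 XT without a package pin.
{
  nixpkgs.config.rocmSupport = true;
  services.ollama = {
    enable = true;
    # Per the module documentation, acceleration follows
    # nixpkgs.config.rocmSupport, so this line is optional:
    # acceleration = "rocm";
    rocmOverrideGfx = "10.3.0"; # gfx1030 (RDNA2)
  };
}
```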

@voronind-com
Contributor Author

voronind-com commented Jan 23, 2025


https://rocm.docs.amd.com/projects/install-on-linux/en/latest/reference/system-requirements.html

6900XT isn't currently supported

Refer to ROCm/ROCm#4276

What do you mean? ROCm works great for me in Blender, and Ollama lists support for my GPU.

I'll try the rocmOverrideGfx = "10.3.0"; today, thank you very much everyone!

@voronind-com
Contributor Author

voronind-com commented Jan 23, 2025

Maybe you can also tell me the revision of the nixpkgs you are currently using?

Nixpkgs Master that I use for the ollama module: 2fc5aeb049f44ed4f9e877cda8a1c334612e1d7a
Unstable that I use for the ollama package: 9e4d5190a9482a1fb9d18adf0bdb83c6e506eaab
Stable that I currently use for the whole system (just in case): 107d5ef05c0b1119749e381451389eded30fb0d5

Also, just in case, my ollama configuration.
