diff --git a/dev/CHANGELOG/index.html b/dev/CHANGELOG/index.html index 78e4cde41..2cb96ae4f 100644 --- a/dev/CHANGELOG/index.html +++ b/dev/CHANGELOG/index.html @@ -467,6 +467,15 @@ -

Documentation#

[0.27.1] - 2024-08-09#

Documentation#

[0.24.2] - 2024-06-14#

Documentation#

@@ -3059,7 +3086,7 @@

Changed#1406
  • Add backward compat logic for older lock files by @nichmor in #1425
Pixi solves both the conda and PyPI dependencies, where the PyPI dependencies use the conda packages as a base, so you can be sure that the packages are compatible with each other. -These solvers are split between the rattler and rip library, these control the heavy lifting of the solving process, which is executed by our custom SAT solver: resolvo. +These solvers are split between the rattler and uv library, which control the heavy lifting of the solving process, executed by our custom SAT solver: resolvo. resolvo is able to solve multiple ecosystems like conda and PyPI. It implements the lazy solving process for PyPI packages, which means that it only downloads the metadata of the packages that are needed to solve the environment. It also supports the conda way of solving, which means that it downloads the metadata of all the packages at once and then solves in one go.

    -

    For the [pypi-dependencies], rip implements sdist building to retrieve the metadata of the packages, and wheel building to install the packages. +

For the [pypi-dependencies], uv implements sdist building to retrieve the metadata of the packages, and wheel building to install the packages. For this building step, pixi requires python to first be installed in the conda [dependencies] section of the pixi.toml file. This will always be slower than the pure conda solves, so for the best pixi experience you should stay within the [dependencies] section of the pixi.toml file.

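If you do need PyPI packages, a minimal sketch of that order is to add python as a conda dependency first and only then add the PyPI package (flask here is purely illustrative):

pixi add python\npixi add --pypi flask\n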
    Caching#

    diff --git a/dev/ide_integration/devcontainer/index.html b/dev/ide_integration/devcontainer/index.html index 7a81d0eda..92f36f0d9 100644 --- a/dev/ide_integration/devcontainer/index.html +++ b/dev/ide_integration/devcontainer/index.html @@ -1558,7 +1558,7 @@

    Use pixi inside of a devcontainer.devcontainer directory:

    .devcontainer/Dockerfile
    FROM mcr.microsoft.com/devcontainers/base:jammy
     
    -ARG PIXI_VERSION=v0.31.0
    +ARG PIXI_VERSION=v0.32.1
     
     RUN curl -L -o /usr/local/bin/pixi -fsSL --compressed "https://github.com/prefix-dev/pixi/releases/download/${PIXI_VERSION}/pixi-$(uname -m)-unknown-linux-musl" \
         && chmod +x /usr/local/bin/pixi \
    diff --git a/dev/install.ps1 b/dev/install.ps1
    index cd3df90c9..49589c33d 100644
    --- a/dev/install.ps1
    +++ b/dev/install.ps1
    @@ -18,7 +18,7 @@
     .LINK
         https://github.com/prefix-dev/pixi
     .NOTES
    -    Version: v0.32.0
    +    Version: v0.32.1
     #>
     param (
         [string] $PixiVersion = 'latest',
    diff --git a/dev/install.sh b/dev/install.sh
    index cd163b29a..0c9fd7528 100644
    --- a/dev/install.sh
    +++ b/dev/install.sh
    @@ -1,6 +1,6 @@
     #!/usr/bin/env bash
     set -euo pipefail
    -# Version: v0.32.0
    +# Version: v0.32.1
     
     __wrap__() {
     
    diff --git a/dev/schema/manifest/schema.json b/dev/schema/manifest/schema.json
    index bd035f31e..497aaa143 100644
    --- a/dev/schema/manifest/schema.json
    +++ b/dev/schema/manifest/schema.json
    @@ -1,6 +1,6 @@
     {
       "$schema": "http://json-schema.org/draft-07/schema#",
    -  "$id": "https://pixi.sh/v0.32.0/schema/manifest/schema.json",
    +  "$id": "https://pixi.sh/v0.32.1/schema/manifest/schema.json",
       "title": "`pixi.toml` manifest file",
       "description": "The configuration for a [`pixi`](https://pixi.sh) project.",
       "type": "object",
    @@ -13,7 +13,7 @@
           "title": "Schema",
           "description": "The schema identifier for the project's configuration",
           "type": "string",
    -      "default": "https://pixi.sh/v0.32.0/schema/manifest/schema.json",
    +      "default": "https://pixi.sh/v0.32.1/schema/manifest/schema.json",
           "format": "uri-reference"
         },
         "activation": {
    diff --git a/dev/search/search_index.json b/dev/search/search_index.json
    index 904ad3a64..bde37f95e 100644
    --- a/dev/search/search_index.json
    +++ b/dev/search/search_index.json
    @@ -1 +1 @@
    -{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Getting Started","text":"

Pixi is a package management tool for developers. It allows the developer to install libraries and applications in a reproducible way. Use pixi cross-platform, on Windows, macOS and Linux.

    "},{"location":"#installation","title":"Installation","text":"

    To install pixi you can run the following command in your terminal:

    Linux & macOSWindows
    curl -fsSL https://pixi.sh/install.sh | bash\n

    The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to ~/.pixi/bin. If this directory does not already exist, the script will create it.

    The script will also update your ~/.bash_profile to include ~/.pixi/bin in your PATH, allowing you to invoke the pixi command from anywhere.

    PowerShell:

    iwr -useb https://pixi.sh/install.ps1 | iex\n
    winget:
    winget install prefix-dev.pixi\n
    The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to LocalAppData/pixi/bin. If this directory does not already exist, the script will create it.

    The command will also automatically add LocalAppData/pixi/bin to your path allowing you to invoke pixi from anywhere.

    Tip

    You might need to restart your terminal or source your shell for the changes to take effect.

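For example, after a fresh install on Linux or macOS you can reload the profile the script modified and check that pixi is found (assuming a bash setup as described above):

source ~/.bash_profile\npixi --version\n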
    You can find more options for the installation script here.

    "},{"location":"#autocompletion","title":"Autocompletion","text":"

    To get autocompletion follow the instructions for your shell. Afterwards, restart the shell or source the shell config file.

    "},{"location":"#bash-default-on-most-linux-systems","title":"Bash (default on most Linux systems)","text":"
    echo 'eval \"$(pixi completion --shell bash)\"' >> ~/.bashrc\n
    "},{"location":"#zsh-default-on-macos","title":"Zsh (default on macOS)","text":"
    echo 'eval \"$(pixi completion --shell zsh)\"' >> ~/.zshrc\n
    "},{"location":"#powershell-pre-installed-on-all-windows-systems","title":"PowerShell (pre-installed on all Windows systems)","text":"
    Add-Content -Path $PROFILE -Value '(& pixi completion --shell powershell) | Out-String | Invoke-Expression'\n

    Failure because no profile file exists

    Make sure your profile file exists, otherwise create it with:

    New-Item -Path $PROFILE -ItemType File -Force\n

    "},{"location":"#fish","title":"Fish","text":"
    echo 'pixi completion --shell fish | source' > ~/.config/fish/completions/pixi.fish\n
    "},{"location":"#nushell","title":"Nushell","text":"

    Add the following to the end of your Nushell env file (find it by running $nu.env-path in Nushell):

    mkdir ~/.cache/pixi\npixi completion --shell nushell | save -f ~/.cache/pixi/completions.nu\n

    And add the following to the end of your Nushell configuration (find it by running $nu.config-path):

    use ~/.cache/pixi/completions.nu *\n
    "},{"location":"#elvish","title":"Elvish","text":"
    echo 'eval (pixi completion --shell elvish | slurp)' >> ~/.elvish/rc.elv\n
    "},{"location":"#alternative-installation-methods","title":"Alternative installation methods","text":"

    Although we recommend installing pixi through the above method we also provide additional installation methods.

    "},{"location":"#homebrew","title":"Homebrew","text":"

    Pixi is available via homebrew. To install pixi via homebrew simply run:

    brew install pixi\n
    "},{"location":"#windows-installer","title":"Windows installer","text":"

    We provide an msi installer on our GitHub releases page. The installer will download pixi and add it to the path.

    "},{"location":"#install-from-source","title":"Install from source","text":"

    pixi is 100% written in Rust, and therefore it can be installed, built and tested with cargo. To start using pixi from a source build run:

    cargo install --locked --git https://github.com/prefix-dev/pixi.git pixi\n

We don't publish to crates.io anymore, so you need to install it from the repository. The reason for this is that we depend on some unpublished crates, which prevents us from publishing to crates.io.

    or when you want to make changes use:

    cargo build\ncargo test\n

If you have any issues building because of the dependency on rattler, check out its compile steps.

    "},{"location":"#installer-script-options","title":"Installer script options","text":"Linux & macOSWindows

    The installation script has several options that can be manipulated through environment variables.

| Variable | Description | Default Value |
|---|---|---|
| PIXI_VERSION | The version of pixi getting installed, can be used to up- or down-grade. | latest |
| PIXI_HOME | The location of the binary folder. | $HOME/.pixi |
| PIXI_ARCH | The architecture the pixi version was built for. | uname -m |
| PIXI_NO_PATH_UPDATE | If set, the $PATH will not be updated to add pixi to it. | |
| TMP_DIR | The temporary directory the script uses to download to and unpack the binary from. | /tmp |

    For example, on Apple Silicon, you can force the installation of the x86 version:

    curl -fsSL https://pixi.sh/install.sh | PIXI_ARCH=x86_64 bash\n
    Or set the version
    curl -fsSL https://pixi.sh/install.sh | PIXI_VERSION=v0.18.0 bash\n

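Several of the variables from the table above can also be combined in a single invocation; the values below are just an illustration:

curl -fsSL https://pixi.sh/install.sh | PIXI_HOME=/opt/pixi PIXI_NO_PATH_UPDATE=1 bash\n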
    The installation script has several options that can be manipulated through environment variables.

| Variable | Environment variable | Description | Default Value |
|---|---|---|---|
| PixiVersion | PIXI_VERSION | The version of pixi getting installed, can be used to up- or down-grade. | latest |
| PixiHome | PIXI_HOME | The location of the installation. | $Env:USERPROFILE\.pixi |
| NoPathUpdate | | If set, the $PATH will not be updated to add pixi to it. | |

    For example, set the version using:

    iwr -useb https://pixi.sh/install.ps1 | iex -Args \"-PixiVersion v0.18.0\"\n
    "},{"location":"#update","title":"Update","text":"

Updating is as simple as installing; rerunning the installation script gets you the latest version.

    pixi self-update\n
    Or get a specific pixi version using:
    pixi self-update --version x.y.z\n

    Note

If you've used a package manager like brew, mamba, conda, paru, etc. to install pixi, it's preferable to use that package manager's built-in update mechanism, e.g. brew upgrade pixi.

    "},{"location":"#uninstall","title":"Uninstall","text":"

    To uninstall pixi from your system, simply remove the binary.

    Linux & macOSWindows
    rm ~/.pixi/bin/pixi\n
    $PIXI_BIN = \"$Env:LocalAppData\\pixi\\bin\\pixi\"; Remove-Item -Path $PIXI_BIN\n

    After this command, you can still use the tools you installed with pixi. To remove these as well, just remove the whole ~/.pixi directory and remove the directory from your path.
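A full cleanup on Linux or macOS could therefore look like this (assuming the default install location and a bash profile):

rm -rf ~/.pixi\n# then remove the line that adds ~/.pixi/bin to PATH from e.g. ~/.bash_profile\n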

    "},{"location":"Community/","title":"Community","text":"

    When you want to show your users and contributors that they can use pixi in your repo, you can use the following badge:

    [![Pixi Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/prefix-dev/pixi/main/assets/badge/v0.json)](https://pixi.sh)\n

    Customize your badge

    To further customize the look and feel of your badge, you can add &style=<custom-style> at the end of the URL. See the documentation on shields.io for more info.

    "},{"location":"Community/#built-using-pixi","title":"Built using Pixi","text":"
    • Deltares:
      • Ribasim: Water resources model
      • Ribasim-NL: Ribasim water resources modeling in the Netherlands
      • iMOD Python: Make massive MODFLOW models
      • iMOD Coupler: Application for coupling hydrological kernels
      • iMOD Documentation: Documentation of the iMOD suite.
      • Xugrid: Xarray and unstructured grids
      • Numba celltree: Celltree data structure for searching for points, lines, boxes, and cells (convex polygons) in a two dimensional unstructured mesh.
      • QGIS-Tim: QGIS plugin and utilities for TimML multi-layer analytic element model
      • Pandamesh: From geodataframe to mesh
      • Wflow: Hydrological modeling framework
      • HydroMT: Automated and reproducible model building and analysis
      • HydroMT SFINCS: SFINCS plugin for HydroMT
      • PyFlwDir: Fast methods to work with hydro- and topography data in pure Python.
    • USGS:
      • MODFLOW 6: USGS modular hydrological model
    • QuantCo:
      • glum: High performance Python GLMs with all the features!
      • tabmat: Efficient matrix representations for working with tabular data
      • pixi-pack: A tool to pack and unpack conda environments created with pixi
      • polarify: Simplifying conditional Polars Expressions with Python \ud83d\udc0d \ud83d\udc3b\u200d\u2744\ufe0f
      • copier-template-python-open-source: Copier template for python projects using pixi
      • datajudge: Assessing whether data from database complies with reference information
      • ndonnx: ONNX-backed array library that is compliant with the Array API standard
      • multiregex: Quickly match many regexes against a string
      • slim-trees: Pickle your ML models more efficiently for deployment \ud83d\ude80
      • sqlcompyre: Compare SQL tables and databases
      • metalearners: MetaLearners for CATE estimation
      • ndonnx: ONNX-backed array library that is compliant with the Array API standard
      • tabulardelta: Simplify table comparisons
      • pydiverse.pipedag: A library for data pipeline orchestration optimizing high development iteration speed
      • pydiverse.transform: Pipe based dataframe manipulation library that can also transform data on SQL databases
    • pixi-pycharm: Conda shim for PyCharm that proxies pixi
    • pixi-diff-to-markdown: Generate markdown summaries from pixi update
    • jiaxiyang/cpp_project_guideline: Guide the way beginners make their c++ projects.
    • karelze/tclf: A python library for trade classification\u26a1
    • hex-inc/vegafusion: Serverside scaling of Vega and Altair visualizations in Rust, Python, WASM, and Java
    • pablovela5620/arxiv-researcher: Summarize PDF's and Arixv papers with Langchain and Nougat \ud83e\udd89
    • HaoZeke/xtsci-dist: Incremental scipy port using xtensor
    • jslorrma/keyrings.artifacts: Keyring backend that provides authentication for publishing or consuming Python packages to or from Azure Artifacts feeds within Azure DevOps
    • LFortran: A modern cross-platform Fortran compiler
    • Rerun: Rerun is an SDK for building time aware visualizations of multimodal data.
    • conda-auth: a conda plugin providing more secure authentication support to conda.
    • py-rattler: Build your own conda environment manager using the python wrapper of our Rattler backend.
    • array-api-extra: Extra array functions built on top of the Python array API standard.
    "},{"location":"FAQ/","title":"Frequently asked questions","text":""},{"location":"FAQ/#what-is-the-difference-with-conda-mamba-poetry-pip","title":"What is the difference with conda, mamba, poetry, pip","text":"Tool Installs python Builds packages Runs predefined tasks Has lock files builtin Fast Use without python Conda \u2705 \u274c \u274c \u274c \u274c \u274c Mamba \u2705 \u274c \u274c \u274c \u2705 \u2705 Pip \u274c \u2705 \u274c \u274c \u274c \u274c Pixi \u2705 \ud83d\udea7 \u2705 \u2705 \u2705 \u2705 Poetry \u274c \u2705 \u274c \u2705 \u274c \u274c"},{"location":"FAQ/#why-the-name-pixi","title":"Why the name pixi","text":"

Starting with the name prefix, we iterated until we had a name that was easy to pronounce, spell and remember, and that no CLI tool was using yet, unlike px, pex, pax, etc. We think it sparks curiosity and fun; if you don't agree, I'm sorry, but you can always alias it to whatever you like.

    Linux & macOSWindows
    alias not_pixi=\"pixi\"\n

    PowerShell:

    New-Alias -Name not_pixi -Value pixi\n

    "},{"location":"FAQ/#where-is-pixi-build","title":"Where is pixi build","text":"

TL;DR: It's coming, we promise!

    pixi build is going to be the subcommand that can generate a conda package out of a pixi project. This requires a solid build tool which we're creating with rattler-build which will be used as a library in pixi.

    "},{"location":"basic_usage/","title":"Basic usage","text":"

Ensure you've got pixi set up. If running pixi doesn't show the help, see the getting started guide.

    pixi\n

    Initialize a new project and navigate to the project directory.

    pixi init pixi-hello-world\ncd pixi-hello-world\n

    Add the dependencies you would like to use.

    pixi add python\n

    Create a file named hello_world.py in the directory and paste the following code into the file.

    hello_world.py
    def hello():\n    print(\"Hello World, to the new revolution in package management.\")\n\nif __name__ == \"__main__\":\n    hello()\n

    Run the code inside the environment.

    pixi run python hello_world.py\n

    You can also put this run command in a task.

    pixi task add hello python hello_world.py\n

    After adding the task, you can run the task using its name.

    pixi run hello\n

    Use the shell command to activate the environment and start a new shell in there.

    pixi shell\npython\nexit()\n

    You've just learned the basic features of pixi:

    1. initializing a project
    2. adding a dependency.
    3. adding a task, and executing it.
    4. running a program.

Feel free to play around with what you just learned, like adding more tasks, dependencies or code; a few ideas are sketched below.

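For example (the task and package names are purely illustrative), you could extend the project like this:

pixi add pytest\npixi task add test pytest\npixi task list\n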
    Happy coding!

    "},{"location":"basic_usage/#use-pixi-as-a-global-installation-tool","title":"Use pixi as a global installation tool","text":"

    Use pixi to install tools on your machine.

    Some notable examples:

    # Awesome cross shell prompt, huge tip when using pixi!\npixi global install starship\n\n# Want to try a different shell?\npixi global install fish\n\n# Install other prefix.dev tools\npixi global install rattler-build\n\n# Install a linter you want to use in multiple projects.\npixi global install ruff\n
    "},{"location":"basic_usage/#using-the-no-activation-option","title":"Using the --no-activation option","text":"

    When installing packages globally, you can use the --no-activation option to prevent the insertion of environment activation code into the installed executable scripts. This means that when you run the installed executable, it won't modify the PATH or CONDA_PREFIX environment variables beforehand.

    Example:

    # Install a package without inserting activation code\npixi global install ruff --no-activation\n

    This option can be useful in scenarios where you want more control over the environment activation or if you're using the installed executables in contexts where automatic activation might interfere with other processes.

    "},{"location":"basic_usage/#use-pixi-in-github-actions","title":"Use pixi in GitHub Actions","text":"

    You can use pixi in GitHub Actions to install dependencies and run commands. It supports automatic caching of your environments.

    - uses: prefix-dev/setup-pixi@v0.5.1\n- run: pixi run cowpy \"Thanks for using pixi\"\n

    See the GitHub Actions for more details.

    "},{"location":"vision/","title":"Vision","text":"

We created pixi because we want to have a cargo/npm/yarn-like package management experience for conda. We really love what the conda packaging ecosystem achieves, but we think that the user experience can be improved a lot. Modern package managers like cargo have shown us how great a package manager can be. We want to bring that experience to the conda ecosystem.

    "},{"location":"vision/#pixi-values","title":"Pixi values","text":"

    We want to make pixi a great experience for everyone, so we have a few values that we want to uphold:

    1. Fast. We want to have a fast package manager, that is able to solve the environment in a few seconds.
2. User Friendly. We want to have a package manager that puts user friendliness front and center, providing easy, accessible and intuitive commands that have the element of least surprise.
    3. Isolated Environment. We want to have isolated environments, that are reproducible and easy to share. Ideally, it should run on all common platforms. The Conda packaging system provides an excellent base for this.
4. Single Tool. We want to integrate most common uses when working on a development project with Pixi, so it should support at least dependency management, command management, building and uploading packages. You should not need to reach for another external tool for this.
    5. Fun. It should be fun to use pixi and not cause frustrations, you should not need to think about it a lot and it should generally just get out of your way.
    "},{"location":"vision/#conda","title":"Conda","text":"

We are building on top of the conda packaging ecosystem; this means that we have a huge number of packages available for different platforms on conda-forge. We believe the conda packaging ecosystem provides a solid base to manage your dependencies. Conda-forge is community maintained, very open to contributions, widely used in data science, scientific computing, robotics and other fields, and has a proven track record.

    "},{"location":"vision/#target-languages","title":"Target languages","text":"

Essentially, we are language agnostic; we are targeting any language that can be installed with conda, including C++, Python, Rust, Zig, etc. But we do believe the Python ecosystem can benefit from a good package manager that is based on conda, so we are trying to provide an alternative to existing solutions there. We also think we can provide a good solution for C++ projects, as there are a lot of libraries available on conda-forge today. Pixi also truly shines when using it for multi-language projects, e.g. a mix of C++ and Python, because we provide a nice way to build everything up to and including system level packages.

    "},{"location":"advanced/authentication/","title":"Authenticate pixi with a server","text":"

    You can authenticate pixi with a server like prefix.dev, a private quetz instance or anaconda.org. Different servers use different authentication methods. In this documentation page, we detail how you can authenticate against the different servers and where the authentication information is stored.

    Usage: pixi auth login [OPTIONS] <HOST>\n\nArguments:\n  <HOST>  The host to authenticate with (e.g. repo.prefix.dev)\n\nOptions:\n      --token <TOKEN>              The token to use (for authentication with prefix.dev)\n      --username <USERNAME>        The username to use (for basic HTTP authentication)\n      --password <PASSWORD>        The password to use (for basic HTTP authentication)\n      --conda-token <CONDA_TOKEN>  The token to use on anaconda.org / quetz authentication\n  -v, --verbose...                 More output per occurrence\n  -q, --quiet...                   Less output per occurrence\n  -h, --help                       Print help\n

    The different options are \"token\", \"conda-token\" and \"username + password\".

The token variant implements a standard "Bearer Token" authentication as is used on the prefix.dev platform. A Bearer Token is sent with every request as an additional header of the form Authorization: Bearer <TOKEN>.

    The conda-token option is used on anaconda.org and can be used with a quetz server. With this option, the token is sent as part of the URL following this scheme: conda.anaconda.org/t/<TOKEN>/conda-forge/linux-64/....

The last option, username & password, is used for "Basic HTTP Authentication". This is the equivalent of adding http://user:password@myserver.com/.... This authentication method can be configured quite easily with a reverse NGINX or Apache server and is thus commonly used in self-hosted systems.

    "},{"location":"advanced/authentication/#examples","title":"Examples","text":"

    Login to prefix.dev:

    pixi auth login prefix.dev --token pfx_jj8WDzvnuTHEGdAhwRZMC1Ag8gSto8\n

    Login to anaconda.org:

    pixi auth login anaconda.org --conda-token xy-72b914cc-c105-4ec7-a969-ab21d23480ed\n

    Login to a basic HTTP secured server:

    pixi auth login myserver.com --username user --password password\n
    "},{"location":"advanced/authentication/#where-does-pixi-store-the-authentication-information","title":"Where does pixi store the authentication information?","text":"

    The storage location for the authentication information is system-dependent. By default, pixi tries to use the keychain to store this sensitive information securely on your machine.

    On Windows, the credentials are stored in the \"credentials manager\". Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

    On macOS, the passwords are stored in the keychain. To access the password, you can use the Keychain Access program that comes pre-installed on macOS. Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

    On Linux, one can use GNOME Keyring (or just Keyring) to access credentials that are securely stored by libsecret. Searching for rattler should list all the credentials stored by pixi and other rattler-based programs.

    "},{"location":"advanced/authentication/#fallback-storage","title":"Fallback storage","text":"

    If you run on a server with none of the aforementioned keychains available, then pixi falls back to store the credentials in an insecure JSON file. This JSON file is located at ~/.rattler/credentials.json and contains the credentials.

    "},{"location":"advanced/authentication/#override-the-authentication-storage","title":"Override the authentication storage","text":"

    You can use the RATTLER_AUTH_FILE environment variable to override the default location of the credentials file. When this environment variable is set, it provides the only source of authentication data that is used by pixi.

    E.g.

    export RATTLER_AUTH_FILE=$HOME/credentials.json\n# You can also specify the file in the command line\npixi global install --auth-file $HOME/credentials.json ...\n

    The JSON should follow the following format:

    {\n    \"*.prefix.dev\": {\n        \"BearerToken\": \"your_token\"\n    },\n    \"otherhost.com\": {\n        \"BasicHTTP\": {\n            \"username\": \"your_username\",\n            \"password\": \"your_password\"\n        }\n    },\n    \"conda.anaconda.org\": {\n        \"CondaToken\": \"your_token\"\n    }\n}\n

    Note: if you use a wildcard in the host, any subdomain will match (e.g. *.prefix.dev also matches repo.prefix.dev).

    Lastly you can set the authentication override file in the global configuration file.

    "},{"location":"advanced/authentication/#pypi-authentication","title":"PyPI authentication","text":"

    Currently, we support the following methods for authenticating against PyPI:

    1. keyring authentication.
    2. .netrc file authentication.

    We want to add more methods in the future, so if you have a specific method you would like to see, please let us know.

    "},{"location":"advanced/authentication/#keyring-authentication","title":"Keyring authentication","text":"

Currently, pixi supports the uv method of authentication through the Python keyring library. To enable this, use the CLI flag --pypi-keyring-provider, which can be set to either subprocess (activated) or disabled.

    # From an existing pixi project\npixi install --pypi-keyring-provider subprocess\n

    This option can also be set in the global configuration file under pypi-config.

    "},{"location":"advanced/authentication/#installing-keyring","title":"Installing keyring","text":"

    To install keyring you can use pixi global install:

    Either use:

    pixi global install keyring\n
    GCP and other backends

The downside of this method is that, because you cannot inject into a pixi global environment just yet, installing different keyring backends is currently not possible; only the default keyring backend can be used. Give the issue a \ud83d\udc4d if you would like to see inject as a feature.

    Or alternatively, you can install keyring using pipx:

    # Install pipx if you haven't already\npixi global install pipx\npipx install keyring\n\n# For Google Artifact Registry, also install and initialize its keyring backend.\n# Inject this into the pipx environment\npipx inject keyring keyrings.google-artifactregistry-auth --index-url https://pypi.org/simple\ngcloud auth login\n
    "},{"location":"advanced/authentication/#using-keyring-with-basic-auth","title":"Using keyring with Basic Auth","text":"

    Use keyring to store your credentials e.g:

    keyring set https://my-index/simple your_username\n# prompt will appear for your password\n
    "},{"location":"advanced/authentication/#configuration","title":"Configuration","text":"

    Make sure to include username@ in the URL of the registry. An example of this would be:

    [pypi-options]\nindex-url = \"https://username@custom-registry.com/simple\"\n
    "},{"location":"advanced/authentication/#gcp","title":"GCP","text":"

    For Google Artifact Registry, you can use the Google Cloud SDK to authenticate. Make sure to have run gcloud auth login before using pixi. Another thing to note is that you need to add oauth2accesstoken to the URL of the registry. An example of this would be:

    "},{"location":"advanced/authentication/#configuration_1","title":"Configuration","text":"
    # rest of the pixi.toml\n#\n# Add's the following options to the default feature\n[pypi-options]\nextra-index-urls = [\"https://oauth2accesstoken@<location>-python.pkg.dev/<project>/<repository>/simple\"]\n

    Note

Include the /simple at the end, and replace <location>, <project> and <repository> with your own location, project and repository.

    To find this URL more easily, you can use the gcloud command:

    gcloud artifacts print-settings python --project=<project> --repository=<repository> --location=<location>\n
    "},{"location":"advanced/authentication/#azure-devops","title":"Azure DevOps","text":"

    Similarly for Azure DevOps, you can use the Azure keyring backend for authentication. The backend, along with installation instructions can be found at keyring.artifacts.

    After following the instructions and making sure that keyring works correctly, you can use the following configuration:

    "},{"location":"advanced/authentication/#configuration_2","title":"Configuration","text":"

    # rest of the pixi.toml\n#\n# Adds the following options to the default feature\n[pypi-options]\nextra-index-urls = [\"https://VssSessionToken@pkgs.dev.azure.com/{organization}/{project}/_packaging/{feed}/pypi/simple/\"]\n
    This should allow for getting packages from the Azure DevOps artifact registry.

    "},{"location":"advanced/authentication/#installing-your-environment","title":"Installing your environment","text":"

To actually install, either configure your Global Config or use the flag:

    pixi install --pypi-keyring-provider subprocess\n

    "},{"location":"advanced/authentication/#netrc-file","title":".netrc file","text":"

    pixi allows you to access private registries securely by authenticating with credentials stored in a .netrc file.

    • The .netrc file can be stored in your home directory ($HOME/.netrc for Unix-like systems)
    • or in the user profile directory on Windows (%HOME%\\_netrc).
• You can also set up a different location for it using the NETRC variable (export NETRC=/my/custom/location/.netrc), as shown in the example below.

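For example, to point pixi at a custom .netrc location for a single run:

export NETRC=/my/custom/location/.netrc\npixi install\n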
    In the .netrc file, you store authentication details like this:

    machine registry-name\nlogin admin\npassword admin\n
    For more details, you can access the .netrc docs.

    "},{"location":"advanced/channel_priority/","title":"Channel Logic","text":"

All logic regarding the decision of which dependencies can be installed from which channel is handled by the instructions we give the solver.

    The actual code regarding this is in the rattler_solve crate. This might however be hard to read. Therefore, this document will continue with simplified flow charts.

    "},{"location":"advanced/channel_priority/#channel-specific-dependencies","title":"Channel specific dependencies","text":"

    When a user defines a channel per dependency, the solver needs to know the other channels are unusable for this dependency.

[project]\nchannels = [\"conda-forge\", \"my-channel\"]\n\n[dependencies]\npackagex = { version = \"*\", channel = \"my-channel\" }\n
    In the packagex example, the solver will understand that the package is only available in my-channel and will not look for it in conda-forge.

    The flowchart of the logic that excludes all other channels:

    flowchart TD\n    A[Start] --> B[Given a Dependency]\n    B --> C{Channel Specific Dependency?}\n    C -->|Yes| D[Exclude All Other Channels for This Package]\n    C -->|No| E{Any Other Dependencies?}\n    E -->|Yes| B\n    E -->|No| F[End]\n    D --> E
    "},{"location":"advanced/channel_priority/#channel-priority","title":"Channel priority","text":"

    Channel priority is dictated by the order in the project.channels array, where the first channel is the highest priority. For instance:

    [project]\nchannels = [\"conda-forge\", \"my-channel\", \"your-channel\"]\n
    If the package is found in conda-forge the solver will not look for it in my-channel and your-channel, because it tells the solver they are excluded. If the package is not found in conda-forge the solver will look for it in my-channel and if it is found there it will tell the solver to exclude your-channel for this package. This diagram explains the logic:
    flowchart TD\n    A[Start] --> B[Given a Dependency]\n    B --> C{Loop Over Channels}\n    C --> D{Package in This Channel?}\n    D -->|No| C\n    D -->|Yes| E{\"This the first channel\n     for this package?\"}\n    E -->|Yes| F[Include Package in Candidates]\n    E -->|No| G[Exclude Package from Candidates]\n    F --> H{Any Other Channels?}\n    G --> H\n    H -->|Yes| C\n    H -->|No| I{Any Other Dependencies?}\n    I -->|No| J[End]\n    I -->|Yes| B

    This method ensures the solver only adds a package to the candidates if it's found in the highest priority channel available. If you have 10 channels and the package is found in the 5th channel it will exclude the next 5 channels from the candidates if they also contain the package.

    "},{"location":"advanced/channel_priority/#use-case-pytorch-and-nvidia-with-conda-forge","title":"Use case: pytorch and nvidia with conda-forge","text":"

    A common use case is to use pytorch with nvidia drivers, while also needing the conda-forge channel for the main dependencies.

    [project]\nchannels = [\"nvidia/label/cuda-11.8.0\", \"nvidia\", \"conda-forge\", \"pytorch\"]\nplatforms = [\"linux-64\"]\n\n[dependencies]\ncuda = {version = \"*\", channel=\"nvidia/label/cuda-11.8.0\"}\npytorch = {version = \"2.0.1.*\", channel=\"pytorch\"}\ntorchvision = {version = \"0.15.2.*\", channel=\"pytorch\"}\npytorch-cuda = {version = \"11.8.*\", channel=\"pytorch\"}\npython = \"3.10.*\"\n
    What this will do is get as much as possible from the nvidia/label/cuda-11.8.0 channel, which is actually only the cuda package.

    Then it will get all packages from the nvidia channel, which is a little more and some packages overlap the nvidia and conda-forge channel. Like the cuda-cudart package, which will now only be retrieved from the nvidia channel because of the priority logic.

    Then it will get the packages from the conda-forge channel, which is the main channel for the dependencies.

    But the user only wants the pytorch packages from the pytorch channel, which is why pytorch is added last and the dependencies are added as channel specific dependencies.

We don't define the pytorch channel before conda-forge because we want to get as much as possible from conda-forge, as the pytorch channel does not always ship the best versions of all packages.

For example, it also ships the ffmpeg package, but only an old version which doesn't work with the newer pytorch versions, thus breaking the installation if we were to skip the conda-forge channel for ffmpeg with the priority logic.

    "},{"location":"advanced/channel_priority/#force-a-specific-channel-priority","title":"Force a specific channel priority","text":"

If you want to force a specific priority for a channel, you can use the priority (int) key in the channel definition. The higher the number, the higher the priority. Unspecified priorities are set to 0, but the index in the array still counts as a priority, where the first in the list has the highest priority.

    This priority definition is mostly important for multiple environments with different channel priorities, as by default feature channels are prepended to the project channels.

    [project]\nname = \"test_channel_priority\"\nplatforms = [\"linux-64\", \"osx-64\", \"win-64\", \"osx-arm64\"]\nchannels = [\"conda-forge\"]\n\n[feature.a]\nchannels = [\"nvidia\"]\n\n[feature.b]\nchannels = [ \"pytorch\", {channel = \"nvidia\", priority = 1}]\n\n[feature.c]\nchannels = [ \"pytorch\", {channel = \"nvidia\", priority = -1}]\n\n[environments]\na = [\"a\"]\nb = [\"b\"]\nc = [\"c\"]\n
This example creates 4 environments, a, b, c, and the default environment, which will have the following channel order:

| Environment | Resulting channel order |
|---|---|
| default | conda-forge |
| a | nvidia, conda-forge |
| b | nvidia, pytorch, conda-forge |
| c | pytorch, conda-forge, nvidia |

Check priority result with pixi info

    Using pixi info you can check the priority of the channels in the environment.

    pixi info\nEnvironments\n------------\n       Environment: default\n          Features: default\n          Channels: conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: a\n          Features: a, default\n          Channels: nvidia, conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: b\n          Features: b, default\n          Channels: nvidia, pytorch, conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: c\n          Features: c, default\n          Channels: pytorch, conda-forge, nvidia\nDependency count: 0\nTarget platforms: linux-64\n

    "},{"location":"advanced/explain_info_command/","title":"Info command","text":"

    pixi info prints out useful information to debug a situation or to get an overview of your machine/project. This information can also be retrieved in json format using the --json flag, which can be useful for programmatically reading it.

    Running pixi info in the pixi repo
    \u279c pixi info\n      Pixi version: 0.13.0\n          Platform: linux-64\n  Virtual packages: __unix=0=0\n                  : __linux=6.5.12=0\n                  : __glibc=2.36=0\n                  : __cuda=12.3=0\n                  : __archspec=1=x86_64\n         Cache dir: /home/user/.cache/rattler/cache\n      Auth storage: /home/user/.rattler/credentials.json\n\nProject\n------------\n           Version: 0.13.0\n     Manifest file: /home/user/development/pixi/pixi.toml\n      Last updated: 25-01-2024 10:29:08\n\nEnvironments\n------------\ndefault\n          Features: default\n          Channels: conda-forge\n  Dependency count: 10\n      Dependencies: pre-commit, rust, openssl, pkg-config, git, mkdocs, mkdocs-material, pillow, cairosvg, compilers\n  Target platforms: linux-64, osx-arm64, win-64, osx-64\n             Tasks: docs, test-all, test, build, lint, install, build-docs\n
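For scripting, the same information can be consumed programmatically; a minimal sketch, assuming jq is installed:

pixi info --json | jq .\n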
    "},{"location":"advanced/explain_info_command/#global-info","title":"Global info","text":"

    The first part of the info output is information that is always available and tells you what pixi can read on your machine.

    "},{"location":"advanced/explain_info_command/#platform","title":"Platform","text":"

    This defines the platform you're currently on according to pixi. If this is incorrect, please file an issue on the pixi repo.

    "},{"location":"advanced/explain_info_command/#virtual-packages","title":"Virtual packages","text":"

    The virtual packages that pixi can find on your machine.

In the Conda ecosystem, you can depend on virtual packages. These packages aren't real dependencies that are going to be installed, but rather are being used in the solve step to find if a package can be installed on the machine. A simple example: when a package depends on CUDA drivers being present on the host machine, it can do that by depending on the __cuda virtual package. In that case, if pixi cannot find the __cuda virtual package on your machine, the installation will fail.

    "},{"location":"advanced/explain_info_command/#cache-dir","title":"Cache dir","text":"

The directory where pixi stores its cache. Check out the cache documentation for more information.

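If you need the cache on another disk, the location can typically be overridden via an environment variable before running pixi; the variable name below is an assumption, check the cache documentation linked above:

export PIXI_CACHE_DIR=/mnt/big-disk/pixi-cache\npixi install\n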
    "},{"location":"advanced/explain_info_command/#auth-storage","title":"Auth storage","text":"

    Check the authentication documentation

    "},{"location":"advanced/explain_info_command/#cache-size","title":"Cache size","text":"

    [requires --extended]

    The size of the previously mentioned \"Cache dir\" in Mebibytes.

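To include the cache size in the output, pass the flag mentioned above:

pixi info --extended\n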
    "},{"location":"advanced/explain_info_command/#project-info","title":"Project info","text":"

    Everything below Project is info about the project you're currently in. This info is only available if your path has a manifest file.

    "},{"location":"advanced/explain_info_command/#manifest-file","title":"Manifest file","text":"

    The path to the manifest file that describes the project.

    "},{"location":"advanced/explain_info_command/#last-updated","title":"Last updated","text":"

    The last time the lock file was updated, either manually or by pixi itself.

    "},{"location":"advanced/explain_info_command/#environment-info","title":"Environment info","text":"

    The environment info defined per environment. If you don't have any environments defined, this will only show the default environment.

    "},{"location":"advanced/explain_info_command/#features","title":"Features","text":"

This lists which features are enabled in the environment. For the default environment this is only default.

    "},{"location":"advanced/explain_info_command/#channels","title":"Channels","text":"

    The list of channels used in this environment.

    "},{"location":"advanced/explain_info_command/#dependency-count","title":"Dependency count","text":"

The number of dependencies defined for this environment (not the number of installed dependencies).

    "},{"location":"advanced/explain_info_command/#dependencies","title":"Dependencies","text":"

    The list of dependencies defined for this environment.

    "},{"location":"advanced/explain_info_command/#target-platforms","title":"Target platforms","text":"

    The platforms the project has defined.

    "},{"location":"advanced/github_actions/","title":"GitHub Action","text":"

    We created prefix-dev/setup-pixi to facilitate using pixi in CI.

    "},{"location":"advanced/github_actions/#usage","title":"Usage","text":"
    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    pixi-version: v0.32.0\n    cache: true\n    auth-host: prefix.dev\n    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}\n- run: pixi run test\n

    Pin your action versions

    Since pixi is not yet stable, the API of this action may change between minor versions. Please pin the versions of this action to a specific version (i.e., prefix-dev/setup-pixi@v0.8.0) to avoid breaking changes. You can automatically update the version of this action by using Dependabot.

    Put the following in your .github/dependabot.yml file to enable Dependabot for your GitHub Actions:

    .github/dependabot.yml
    version: 2\nupdates:\n  - package-ecosystem: github-actions\n    directory: /\n    schedule:\n      interval: monthly # (1)!\n    groups:\n      dependencies:\n        patterns:\n          - \"*\"\n
    1. or daily, weekly
    "},{"location":"advanced/github_actions/#features","title":"Features","text":"

    To see all available input arguments, see the action.yml file in setup-pixi. The most important features are described below.

    "},{"location":"advanced/github_actions/#caching","title":"Caching","text":"

    The action supports caching of the pixi environment. By default, caching is enabled if a pixi.lock file is present. It will then use the pixi.lock file to generate a hash of the environment and cache it. If the cache is hit, the action will skip the installation and use the cached environment. You can specify the behavior by setting the cache input argument.

    Customize your cache key

    If you need to customize your cache-key, you can use the cache-key input argument. This will be the prefix of the cache key. The full cache key will be <cache-key><conda-arch>-<hash>.

    Only save caches on main

In order not to hit the 10 GB cache size limit too quickly, you might want to restrict when the cache is saved. This can be done by setting the cache-write argument.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    cache: true\n    cache-write: ${{ github.event_name == 'push' && github.ref_name == 'main' }}\n
    "},{"location":"advanced/github_actions/#multiple-environments","title":"Multiple environments","text":"

    With pixi, you can create multiple environments for different requirements. You can also specify which environment(s) you want to install by setting the environments input argument. This will install all environments that are specified and cache them.

    [project]\nname = \"my-package\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"]\n\n[dependencies]\npython = \">=3.11\"\npip = \"*\"\npolars = \">=0.14.24,<0.21\"\n\n[feature.py311.dependencies]\npython = \"3.11.*\"\n[feature.py312.dependencies]\npython = \"3.12.*\"\n\n[environments]\npy311 = [\"py311\"]\npy312 = [\"py312\"]\n
    "},{"location":"advanced/github_actions/#multiple-environments-using-a-matrix","title":"Multiple environments using a matrix","text":"

    The following example will install the py311 and py312 environments in different jobs.

    test:\n  runs-on: ubuntu-latest\n  strategy:\n    matrix:\n      environment: [py311, py312]\n  steps:\n  - uses: actions/checkout@v4\n  - uses: prefix-dev/setup-pixi@v0.8.0\n    with:\n      environments: ${{ matrix.environment }}\n
    "},{"location":"advanced/github_actions/#install-multiple-environments-in-one-job","title":"Install multiple environments in one job","text":"

    The following example will install both the py311 and the py312 environment on the runner.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    environments: >- # (1)!\n      py311\n      py312\n- run: |\n  pixi run -e py311 test\n  pixi run -e py312 test\n
    1. separated by spaces, equivalent to

      environments: py311 py312\n

    Caching behavior if you don't specify environments

    If you don't specify any environment, the default environment will be installed and cached, even if you use other environments.

    "},{"location":"advanced/github_actions/#authentication","title":"Authentication","text":"

    There are currently three ways to authenticate with pixi:

    • using a token
    • using a username and password
    • using a conda-token

    For more information, see Authentication.

    Handle secrets with care

    Please only store sensitive information using GitHub secrets. Do not store them in your repository. When your sensitive information is stored in a GitHub secret, you can access it using the ${{ secrets.SECRET_NAME }} syntax. These secrets will always be masked in the logs.

    "},{"location":"advanced/github_actions/#token","title":"Token","text":"

    Specify the token using the auth-token input argument. This form of authentication (bearer token in the request headers) is mainly used at prefix.dev.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: prefix.dev\n    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}\n
    "},{"location":"advanced/github_actions/#username-and-password","title":"Username and password","text":"

    Specify the username and password using the auth-username and auth-password input arguments. This form of authentication (HTTP Basic Auth) is used in some enterprise environments with artifactory for example.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: custom-artifactory.com\n    auth-username: ${{ secrets.PIXI_USERNAME }}\n    auth-password: ${{ secrets.PIXI_PASSWORD }}\n
    "},{"location":"advanced/github_actions/#conda-token","title":"Conda-token","text":"

    Specify the conda-token using the conda-token input argument. This form of authentication (token is encoded in URL: https://my-quetz-instance.com/t/<token>/get/custom-channel) is used at anaconda.org or with quetz instances.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: anaconda.org # (1)!\n    conda-token: ${{ secrets.CONDA_TOKEN }}\n
    1. or my-quetz-instance.com
    "},{"location":"advanced/github_actions/#custom-shell-wrapper","title":"Custom shell wrapper","text":"

setup-pixi allows you to run commands inside of the pixi environment by specifying a custom shell wrapper with shell: pixi run bash -e {0}. This can be useful if you want to run commands inside of the pixi environment, but don't want to use the pixi run command for each command.

    - run: | # (1)!\n    python --version\n    pip install --no-deps -e .\n  shell: pixi run bash -e {0}\n
    1. everything here will be run inside of the pixi environment

    You can even run Python scripts like this:

    - run: | # (1)!\n    import my_package\n    print(\"Hello world!\")\n  shell: pixi run python {0}\n
    1. everything here will be run inside of the pixi environment

    If you want to use PowerShell, you need to specify -Command as well.

    - run: | # (1)!\n    python --version | Select-String \"3.11\"\n  shell: pixi run pwsh -Command {0} # pwsh works on all platforms\n
    1. everything here will be run inside of the pixi environment

    How does it work under the hood?

    Under the hood, the shell: xyz {0} option is implemented by creating a temporary script file and calling xyz with that script file as an argument. This file does not have the executable bit set, so you cannot use shell: pixi run {0} directly but instead have to use shell: pixi run bash {0}. There are some custom shells provided by GitHub that have slightly different behavior, see jobs.<job_id>.steps[*].shell in the documentation. See the official documentation and ADR 0277 for more information about how the shell: input works in GitHub Actions.

    "},{"location":"advanced/github_actions/#one-off-shell-wrapper-using-pixi-exec","title":"One-off shell wrapper using pixi exec","text":"

    With pixi exec, you can also run a one-off command inside a temporary pixi environment.

    - run: | # (1)!\n    zstd --version\n  shell: pixi exec --spec zstd -- bash -e {0}\n
    1. everything here will be run inside of the temporary pixi environment
    - run: | # (1)!\n    import ruamel.yaml\n    # ...\n  shell: pixi exec --spec python=3.11.* --spec ruamel.yaml -- python {0}\n
    1. everything here will be run inside of the temporary pixi environment

    See here for more information about pixi exec.

    "},{"location":"advanced/github_actions/#environment-activation","title":"Environment activation","text":"

    Instead of using a custom shell wrapper, you can also make all pixi-installed binaries available to subsequent steps by \"activating\" the installed environment in the currently running job. To this end, setup-pixi adds all environment variables set when executing pixi run to $GITHUB_ENV and, similarly, adds all path modifications to $GITHUB_PATH. As a result, all installed binaries can be accessed without having to call pixi run.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    activate-environment: true\n

    If you are installing multiple environments, you will need to specify the name of the environment that you want to be activated.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    environments: >-\n      py311\n      py312\n    activate-environment: py311\n

    Activating an environment may be more useful than using a custom shell wrapper as it allows non-shell based steps to access binaries on the path. However, be aware that this option augments the environment of your job.

    "},{"location":"advanced/github_actions/#-frozen-and-locked","title":"--frozen and --locked","text":"

    You can specify whether setup-pixi should run pixi install --frozen or pixi install --locked depending on the frozen or the locked input argument. See the official documentation for more information about the --frozen and --locked flags.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    locked: true\n    # or\n    frozen: true\n

    If you don't specify anything, the default behavior is to run pixi install --locked if a pixi.lock file is present and pixi install otherwise.

    "},{"location":"advanced/github_actions/#debugging","title":"Debugging","text":"

    There are two types of debug logging that you can enable.

    "},{"location":"advanced/github_actions/#debug-logging-of-the-action","title":"Debug logging of the action","text":"

The first one is the debug logging of the action itself. This can be enabled by re-running the action in debug mode:

    Debug logging documentation

    For more information about debug logging in GitHub Actions, see the official documentation.

    "},{"location":"advanced/github_actions/#debug-logging-of-pixi","title":"Debug logging of pixi","text":"

    The second type is the debug logging of the pixi executable. This can be specified by setting the log-level input.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    log-level: vvv # (1)!\n
    1. One of q, default, v, vv, or vvv.

    If nothing is specified, log-level will default to default or vv depending on if debug logging is enabled for the action.

    "},{"location":"advanced/github_actions/#self-hosted-runners","title":"Self-hosted runners","text":"

    On self-hosted runners, it may happen that some files are persisted between jobs. This can lead to problems or secrets getting leaked between job runs. To avoid this, you can use the post-cleanup input to specify the post cleanup behavior of the action (i.e., what happens after all your commands have been executed).

    If you set post-cleanup to true, the action will delete the following files:

    • .pixi environment
    • the pixi binary
    • the rattler cache
    • other rattler files in ~/.rattler

    If nothing is specified, post-cleanup will default to true.

    On self-hosted runners, you also might want to alter the default pixi install location to a temporary location. You can use pixi-bin-path: ${{ runner.temp }}/bin/pixi to do this.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    post-cleanup: true\n    pixi-bin-path: ${{ runner.temp }}/bin/pixi # (1)!\n
    1. ${{ runner.temp }}\\Scripts\\pixi.exe on Windows

    You can also use a preinstalled local version of pixi on the runner by not setting any of the pixi-version, pixi-url or pixi-bin-path inputs. This action will then try to find a local version of pixi in the runner's PATH.

    "},{"location":"advanced/github_actions/#using-the-pyprojecttoml-as-a-manifest-file-for-pixi","title":"Using the pyproject.toml as a manifest file for pixi.","text":"

    setup-pixi will automatically pick up the pyproject.toml if it contains a [tool.pixi.project] section and no pixi.toml. This can be overwritten by setting the manifest-path input argument.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    manifest-path: pyproject.toml\n
    "},{"location":"advanced/github_actions/#more-examples","title":"More examples","text":"

    If you want to see more examples, you can take a look at the GitHub Workflows of the setup-pixi repository.

    "},{"location":"advanced/production_deployment/","title":"Bringing pixi to production","text":"

You can bring pixi projects into production either by containerizing them using tools like Docker or by using quantco/pixi-pack.

    @pavelzw from QuantCo wrote a blog post about bringing pixi to production. You can read it here.

    "},{"location":"advanced/production_deployment/#docker","title":"Docker","text":"

    We provide a simple docker image at pixi-docker that contains the pixi executable on top of different base images.

    The images are available on ghcr.io/prefix-dev/pixi.

    There are different tags for different base images available:

    • latest - based on ubuntu:jammy
    • focal - based on ubuntu:focal
    • bullseye - based on debian:bullseye
    • jammy-cuda-12.2.2 - based on nvidia/cuda:12.2.2-jammy
    • ... and more

    All tags

    For all tags, take a look at the build script.

    "},{"location":"advanced/production_deployment/#example-usage","title":"Example usage","text":"

    The following example uses the pixi docker image as a base image for a multi-stage build. It also makes use of pixi shell-hook to not rely on pixi being installed in the production container.

    More examples

    For more examples, take a look at pavelzw/pixi-docker-example.

    FROM ghcr.io/prefix-dev/pixi:0.31.0 AS build\n\n# copy source code, pixi.toml and pixi.lock to the container\nWORKDIR /app\nCOPY . .\n# install dependencies to `/app/.pixi/envs/prod`\n# use `--locked` to ensure the lockfile is up to date with pixi.toml\nRUN pixi install --locked -e prod\n# create the shell-hook bash script to activate the environment\nRUN pixi shell-hook -e prod -s bash > /shell-hook\nRUN echo \"#!/bin/bash\" > /app/entrypoint.sh\nRUN cat /shell-hook >> /app/entrypoint.sh\n# extend the shell-hook script to run the command passed to the container\nRUN echo 'exec \"$@\"' >> /app/entrypoint.sh\n\nFROM ubuntu:24.04 AS production\nWORKDIR /app\n# only copy the production environment into prod container\n# please note that the \"prefix\" (path) needs to stay the same as in the build container\nCOPY --from=build /app/.pixi/envs/prod /app/.pixi/envs/prod\nCOPY --from=build --chmod=0755 /app/entrypoint.sh /app/entrypoint.sh\n# copy your project code into the container as well\nCOPY ./my_project /app/my_project\n\nEXPOSE 8000\nENTRYPOINT [ \"/app/entrypoint.sh\" ]\n# run your app inside the pixi environment\nCMD [ \"uvicorn\", \"my_project:app\", \"--host\", \"0.0.0.0\" ]\n
    "},{"location":"advanced/production_deployment/#pixi-pack","title":"pixi-pack","text":"

    pixi-pack is a simple tool that takes a pixi environment and packs it into a compressed archive that can be shipped to the target machine.

    It can be installed via

    pixi global install pixi-pack\n

    Or by downloading our pre-built binaries from the releases page.

    Instead of installing pixi-pack globally, you can also use pixi exec to run pixi-pack in a temporary environment:

    pixi exec pixi-pack pack\npixi exec pixi-pack unpack environment.tar\n

    You can pack an environment with

    pixi-pack pack --manifest-file pixi.toml --environment prod --platform linux-64\n

This will create an environment.tar file that contains all conda packages required to create the environment.

    # environment.tar\n| pixi-pack.json\n| environment.yml\n| channel\n|    \u251c\u2500\u2500 noarch\n|    |    \u251c\u2500\u2500 tzdata-2024a-h0c530f3_0.conda\n|    |    \u251c\u2500\u2500 ...\n|    |    \u2514\u2500\u2500 repodata.json\n|    \u2514\u2500\u2500 linux-64\n|         \u251c\u2500\u2500 ca-certificates-2024.2.2-hbcca054_0.conda\n|         \u251c\u2500\u2500 ...\n|         \u2514\u2500\u2500 repodata.json\n
    "},{"location":"advanced/production_deployment/#unpacking-an-environment","title":"Unpacking an environment","text":"

    With pixi-pack unpack environment.tar, you can unpack the environment on your target system. This will create a new conda environment in ./env that contains all packages specified in your pixi.toml. It also creates an activate.sh (or activate.bat on Windows) file that lets you activate the environment without needing to have conda or micromamba installed.
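A minimal sketch of that flow on a Linux or macOS machine (assuming the archive is called environment.tar as in the examples above, and that the generated activate.sh lands in the current directory):

pixi-pack unpack environment.tar\nsource ./activate.sh # use activate.bat on Windows\n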

    "},{"location":"advanced/production_deployment/#cross-platform-packs","title":"Cross-platform packs","text":"

    Since pixi-pack just downloads the .conda and .tar.bz2 files from the conda repositories, you can trivially create packs for different platforms.

    pixi-pack pack --platform win-64\n

    You can only unpack a pack on a system that has the same platform as the pack was created for.

    "},{"location":"advanced/production_deployment/#inject-additional-packages","title":"Inject additional packages","text":"

    You can inject additional packages into the environment that are not specified in pixi.lock by using the --inject flag:

pixi-pack pack --inject local-package-1.0.0-hbefa133_0.conda --manifest-file pixi.toml\n

    This can be particularly useful if you build the project itself and want to include the built package in the environment but still want to use pixi.lock from the project.

    "},{"location":"advanced/production_deployment/#unpacking-without-pixi-pack","title":"Unpacking without pixi-pack","text":"

    If you don't have pixi-pack available on your target system, you can still install the environment if you have conda or micromamba available. Just unarchive the environment.tar, then you have a local channel on your system where all necessary packages are available. Next to this local channel, you will find an environment.yml file that contains the environment specification. You can then install the environment using conda or micromamba:

    tar -xvf environment.tar\nmicromamba create -p ./env --file environment.yml\n# or\nconda env create -p ./env --file environment.yml\n

The environment.yml and repodata.json files exist only for this use case; pixi-pack unpack does not use them.

    "},{"location":"advanced/pyproject_toml/","title":"pyproject.toml in pixi","text":"

We support the use of the pyproject.toml as our manifest file in pixi. This allows the user to keep one file with all configuration. The pyproject.toml file is a standard for Python projects. We don't advise using the pyproject.toml file for anything other than Python projects; the pixi.toml is better suited for other types of projects.

    "},{"location":"advanced/pyproject_toml/#initial-setup-of-the-pyprojecttoml-file","title":"Initial setup of the pyproject.toml file","text":"

When you already have a pyproject.toml file in your project, you can run pixi init in that folder. Pixi will automatically

    • Add a [tool.pixi.project] section to the file, with the platform and channel information required by pixi;
    • Add the current project as an editable pypi dependency;
    • Add some defaults to the .gitignore and .gitattributes files.

If you do not have an existing pyproject.toml file, you can run pixi init --format pyproject in your project folder. In that case, pixi will create a pyproject.toml manifest from scratch with some sane defaults.

    "},{"location":"advanced/pyproject_toml/#python-dependency","title":"Python dependency","text":"

The pyproject.toml file supports the requires-python field. Pixi understands that field and automatically adds the version to the dependencies.

This is an example of a pyproject.toml file with the requires-python field, which will be used as the python dependency:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n

    Which is equivalent to:

    equivalent pixi.toml
    [project]\nname = \"my_project\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[dependencies]\npython = \">=3.9\"\n
    "},{"location":"advanced/pyproject_toml/#dependency-section","title":"Dependency section","text":"

    The pyproject.toml file supports the dependencies field. Pixi understands that field and automatically adds the dependencies to the project as [pypi-dependencies].

    This is an example of a pyproject.toml file with the dependencies field:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n

    Which is equivalent to:

    equivalent pixi.toml
    [project]\nname = \"my_project\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[pypi-dependencies]\nnumpy = \"*\"\npandas = \"*\"\nmatplotlib = \"*\"\n\n[dependencies]\npython = \">=3.9\"\n

    You can overwrite these with conda dependencies by adding them to the dependencies field:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tool.pixi.dependencies]\nnumpy = \"*\"\npandas = \"*\"\nmatplotlib = \"*\"\n

This would result in the conda dependencies being installed and the pypi dependencies being ignored, as pixi takes the conda dependencies over the pypi dependencies.

    "},{"location":"advanced/pyproject_toml/#optional-dependencies","title":"Optional dependencies","text":"

    If your python project includes groups of optional dependencies, pixi will automatically interpret them as pixi features of the same name with the associated pypi-dependencies.

You can add them to pixi environments manually, or use pixi init to set up the project, which will create one environment per feature. Self-references to other groups of optional dependencies are also handled.

    For instance, imagine you have a project folder with a pyproject.toml file similar to:

    [project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[project.optional-dependencies]\ntest = [\"pytest\"]\nall = [\"package2\",\"my_project[test]\"]\n

    Running pixi init in that project folder will transform the pyproject.toml file into:

    [project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[project.optional-dependencies]\ntest = [\"pytest\"]\nall = [\"package2\",\"my_project[test]\"]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"] # if executed on linux\n\n[tool.pixi.environments]\ndefault = {features = [], solve-group = \"default\"}\ntest = {features = [\"test\"], solve-group = \"default\"}\nall = {features = [\"all\", \"test\"], solve-group = \"default\"}\n

    In this example, three environments will be created by pixi:

    • default with 'package1' as pypi dependency
    • test with 'package1' and 'pytest' as pypi dependencies
    • all with 'package1', 'package2' and 'pytest' as pypi dependencies

    All environments will be solved together, as indicated by the common solve-group, and added to the lock file. You can edit the [tool.pixi.environments] section manually to adapt it to your use case (e.g. if you do not need a particular environment).
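For example, if you do not need the all environment, trimming the generated section by hand could look like this (a sketch based on the file shown above):

[tool.pixi.environments]\ndefault = {features = [], solve-group = \"default\"}\ntest = {features = [\"test\"], solve-group = \"default\"}\n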

    "},{"location":"advanced/pyproject_toml/#example","title":"Example","text":"

As the pyproject.toml file supports the full pixi spec with [tool.pixi] prepended, an example would look like this:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n    \"ruff\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tool.pixi.dependencies]\ncompilers = \"*\"\ncmake = \"*\"\n\n[tool.pixi.tasks]\nstart = \"python my_project/main.py\"\nlint = \"ruff lint\"\n\n[tool.pixi.system-requirements]\ncuda = \"11.0\"\n\n[tool.pixi.feature.test.dependencies]\npytest = \"*\"\n\n[tool.pixi.feature.test.tasks]\ntest = \"pytest\"\n\n[tool.pixi.environments]\ntest = [\"test\"]\n
    "},{"location":"advanced/pyproject_toml/#build-system-section","title":"Build-system section","text":"

    The pyproject.toml file normally contains a [build-system] section. Pixi will use this section to build and install the project if it is added as a pypi path dependency.

    If the pyproject.toml file does not contain any [build-system] section, pixi will fall back to uv's default, which is equivalent to the below:

    pyproject.toml
    [build-system]\nrequires = [\"setuptools >= 40.8.0\"]\nbuild-backend = \"setuptools.build_meta:__legacy__\"\n

    Including a [build-system] section is highly recommended. If you are not sure of the build-backend you want to use, including the [build-system] section below in your pyproject.toml is a good starting point. pixi init --format pyproject defaults to hatchling. The advantages of hatchling over setuptools are outlined on its website.

    pyproject.toml
    [build-system]\nbuild-backend = \"hatchling.build\"\nrequires = [\"hatchling\"]\n
    "},{"location":"advanced/updates_github_actions/","title":"Update lockfiles with GitHub Actions","text":"

    You can leverage GitHub Actions in combination with pavelzw/pixi-diff-to-markdown to automatically update your lockfiles similar to dependabot or renovate in other ecosystems.

    Dependabot/Renovate support for pixi

    You can track native Dependabot support for pixi in dependabot/dependabot-core #2227 and for Renovate in renovatebot/renovate #2213.

    "},{"location":"advanced/updates_github_actions/#how-to-use","title":"How to use","text":"

    To get started, create a new GitHub Actions workflow file in your repository.

    .github/workflows/update-lockfiles.yml
    name: Update lockfiles\n\npermissions: # (1)!\n  contents: write\n  pull-requests: write\n\non:\n  workflow_dispatch:\n  schedule:\n    - cron: 0 5 1 * * # (2)!\n\njobs:\n  pixi-update:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Set up pixi\n        uses: prefix-dev/setup-pixi@v0.8.1\n        with:\n          run-install: false\n      - name: Update lockfiles\n        run: |\n          set -o pipefail\n          pixi update --json | pixi exec pixi-diff-to-markdown >> diff.md\n      - name: Create pull request\n        uses: peter-evans/create-pull-request@v6\n        with:\n          token: ${{ secrets.GITHUB_TOKEN }}\n          commit-message: Update pixi lockfile\n          title: Update pixi lockfile\n          body-path: diff.md\n          branch: update-pixi\n          base: main\n          labels: pixi\n          delete-branch: true\n          add-paths: pixi.lock\n
    1. Needed for peter-evans/create-pull-request
    2. Runs at 05:00, on day 1 of the month

    In order for this workflow to work, you need to set \"Allow GitHub Actions to create and approve pull requests\" to true in your repository settings (in \"Actions\" -> \"General\").

    Tip

    If you don't have any pypi-dependencies, you can use pixi update --json --no-install to speed up diff generation.
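With that flag, the update step from the workflow above would become (only the pipeline line changes):

pixi update --json --no-install | pixi exec pixi-diff-to-markdown >> diff.md\n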

    "},{"location":"advanced/updates_github_actions/#triggering-ci-in-automated-prs","title":"Triggering CI in automated PRs","text":"

In order to prevent accidental recursive GitHub Workflow runs, GitHub decided to not trigger any workflows on automated PRs when using the default GITHUB_TOKEN. There are a couple of ways to work around this limitation. You can find excellent documentation for this in peter-evans/create-pull-request, see here.

    "},{"location":"advanced/updates_github_actions/#customizing-the-summary","title":"Customizing the summary","text":"

    You can customize the summary by either using command-line-arguments of pixi-diff-to-markdown or by specifying the configuration in pixi.toml under [tool.pixi-diff-to-markdown]. See the pixi-diff-to-markdown documentation or run pixi-diff-to-markdown --help for more information.

    "},{"location":"advanced/updates_github_actions/#using-reusable-workflows","title":"Using reusable workflows","text":"

    If you want to use the same workflow in multiple repositories in your GitHub organization, you can create a reusable workflow. You can find more information in the GitHub documentation.

    "},{"location":"design_proposals/pixi_global_manifest/","title":"Pixi Global Manifest","text":"

    Feedback wanted

    This document is work in progress, and community feedback is greatly appreciated. Please share your thoughts at our GitHub discussion.

    "},{"location":"design_proposals/pixi_global_manifest/#motivation","title":"Motivation","text":"

    pixi global is currently limited to imperatively managing CLI packages. The next iteration of this feature should fulfill the following needs:

    • Shareable global environments.
    • Managing complex environments with multiple packages as dependencies
    • Flexible exposure of executables
    "},{"location":"design_proposals/pixi_global_manifest/#design-considerations","title":"Design Considerations","text":"

    There are a few things we wanted to keep in mind in the design:

1. User-friendliness: Pixi is a user-focused tool that goes beyond developers. The feature should have good error reporting and helpful documentation from the start.
    2. Keep it simple: The CLI should be all you strictly need to interact with global environments.
3. Unsurprising: Simple commands should behave similarly to traditional package managers.
    4. Human Readable: Any file created by this feature should be human-readable and modifiable.
    "},{"location":"design_proposals/pixi_global_manifest/#manifest","title":"Manifest","text":"

The global environments and exposed executables will be managed by a human-readable manifest. This manifest will stick to conventions set by pixi.toml where possible. Among other things, it will be written in the TOML format, be named pixi-global.toml and be placed at ~/.pixi/manifests/pixi-global.toml. The motivation for the location is discussed further below.

    pixi-global.toml
    # The name of the environment is `python`\n[envs.python]\nchannels = [\"conda-forge\"]\n# optional, defaults to your current OS\nplatform = \"osx-64\"\n# It will expose python, python3 and python3.11, but not pip\n[envs.python.dependencies]\npython = \"3.11.*\"\npip = \"*\"\n\n[envs.python.exposed]\npython = \"python\"\npython3 = \"python3\"\n\"python3.11\" = \"python3.11\"\n\n# The name of the environment is `python3-10`\n[envs.python3-10]\nchannels = [\"https://fast.prefix.dev/conda-forge\"]\n# It will expose python3.10\n[envs.python3-10.dependencies]\npython = \"3.10.*\"\n\n[envs.python3-10.exposed]\n\"python3.10\" = \"python\"\n
    "},{"location":"design_proposals/pixi_global_manifest/#cli","title":"CLI","text":"

Install one or more packages PACKAGE and expose their executables. If --environment has been given, all packages will be installed in the same environment. --expose can be given if --environment is given as well or if only a single PACKAGE will be installed. The syntax for MAPPING is exposed_name=executable_name, so for example python3.10=python. --platform sets the platform of the environment to PLATFORM. Multiple channels can be specified by using --channel multiple times. By default, if no channel is provided, the default-channels key in the pixi configuration is used, which again defaults to \"conda-forge\".

    pixi global install [--expose MAPPING] [--environment ENV] [--platform PLATFORM] [--no-activation] [--channel CHANNEL]... PACKAGE...\n

    Remove environments ENV.

    pixi global uninstall <ENV>...\n

Update PACKAGE if --package is given. If not, all packages in environments ENV will be updated. If the update leads to executables being removed, it will offer to remove the mappings. If the user declines, the update process will stop. If the update leads to executables being added, it will offer to expose each binary individually. --assume-yes will assume yes as the answer for every question that would otherwise be asked interactively.

    pixi global update [--package PACKAGE] [--assume-yes] <ENV>...\n

Updates all packages in all environments. If the update leads to executables being removed, it will offer to remove the mappings. If the user declines, the update process will stop. If the update leads to executables being added, it will offer to expose each binary individually. --assume-yes will assume yes as the answer for every question that would otherwise be asked interactively.

    pixi global update-all [--assume-yes]\n

    Add one or more packages PACKAGE into an existing environment ENV. If environment ENV does not exist, it will return with an error. Without --expose no binary will be exposed. If you don't mention a spec like python=3.8.*, the spec will be unconstrained with *. The syntax for MAPPING is exposed_name=executable_name, so for example python3.10=python.

    pixi global add --environment ENV [--expose MAPPING] <PACKAGE>...\n

Remove package PACKAGE from environment ENV. If it was the last package, the whole environment will be removed and that information printed to the console. If this leads to executables being removed, it will offer to remove the mappings. If the user declines, the removal process will stop.

    pixi global remove --environment ENV PACKAGE\n

    Add one or more MAPPING for environment ENV which describe which executables are exposed. The syntax for MAPPING is exposed_name=executable_name, so for example python3.10=python.

    pixi global expose add --environment ENV  <MAPPING>...\n

Remove one or more exposed BINARY from environment ENV.

    pixi global expose remove --environment ENV <BINARY>...\n

    Ensure that the environments on the machine reflect the state in the manifest. The manifest is the single source of truth. Only if there's no manifest, will the data from existing environments be used to create a manifest. pixi global sync is implied by most other pixi global commands.

    pixi global sync\n

    List all environments, their specs and exposed executables

    pixi global list\n

    Set the channels CHANNEL for a certain environment ENV in the pixi global manifest.

    pixi global channel set --environment ENV <CHANNEL>...\n

    Set the platform PLATFORM for a certain environment ENV in the pixi global manifest.

    pixi global platform set --environment ENV PLATFORM\n

    "},{"location":"design_proposals/pixi_global_manifest/#simple-workflow","title":"Simple workflow","text":"

    Create environment python, install package python=3.10.* and expose all executables of that package

    pixi global install python=3.10.*\n

    Update all packages in environment python

    pixi global update python\n

    Remove environment python

    pixi global uninstall python\n

Create environments python and pip, install the corresponding packages and expose all executables of those packages

    pixi global install python pip\n

    Remove environments python and pip

    pixi global uninstall python pip\n

    Create environment python-pip, install python and pip in the same environment and expose all executables of these packages

    pixi global install --environment python-pip python pip\n

    "},{"location":"design_proposals/pixi_global_manifest/#adding-dependencies","title":"Adding dependencies","text":"

Create environment python, install package python and expose all executables of that package. Then add package hypercorn to environment python without exposing its executables.

    pixi global install python\npixi global add --environment python hypercorn\n

    Update package cryptography (a dependency of hypercorn) to 43.0.0 in environment python

pixi global update --environment python cryptography=43.0.0\n

    Then remove hypercorn again.

    pixi global remove --environment python hypercorn\n

    "},{"location":"design_proposals/pixi_global_manifest/#specifying-which-executables-to-expose","title":"Specifying which executables to expose","text":"

    Make a new environment python3-10 with package python=3.10 and expose the python executable as python3.10.

    pixi global install --environment python3-10 --expose \"python3.10=python\" python=3.10\n

    Now python3.10 is available.

    Run the following in order to expose python from environment python3-10 as python3-10 instead.

    pixi global expose remove --environment python3-10 python3.10\npixi global expose add --environment python3-10 \"python3-10=python\"\n

    Now python3-10 is available, but python3.10 isn't anymore.

    "},{"location":"design_proposals/pixi_global_manifest/#syncing","title":"Syncing","text":"

    Most pixi global sub commands imply a pixi global sync.

    • Users should be able to change the manifest by hand (creating or modifying (adding or removing))
    • Users should be able to \"export\" their existing environments into the manifest, if non-existing.
    • The manifest is always \"in sync\" after install/remove/inject/other global command.

First time, on a clean computer, running the following creates the manifest and ~/.pixi/envs/python.

    pixi global install python\n

Deleting ~/.pixi/envs and syncing should add environment python again, as described in the manifest:

rm -r ~/.pixi/envs\npixi global sync\n

    If there's no manifest, but existing environments, pixi will create a manifest that matches your current environments. It is to be decided whether the user should be asked if they want an empty manifest instead, or if it should always import the data from the environments.

    rm <manifest>\npixi global sync\n

    If we remove the python environment from the manifest, running pixi global sync will also remove the ~/.pixi/envs/python environment from the file system.

    vim <manifest>\npixi global sync\n

    "},{"location":"design_proposals/pixi_global_manifest/#open-questions","title":"Open Questions","text":""},{"location":"design_proposals/pixi_global_manifest/#should-we-version-the-manifest","title":"Should we version the manifest?","text":"

    Something like:

    [manifest]\nversion = 1\n

    We still have to figure out which existing programs do something similar and how they benefit from it.

    "},{"location":"design_proposals/pixi_global_manifest/#multiple-manifests","title":"Multiple manifests","text":"

We could go for one default manifest, but also parse other manifests in the same directory. The only requirement to be parsed as a manifest is a .toml extension. In order to modify those with the CLI, one would have to add an option --manifest to select the correct one.

    • pixi-global.toml: Default
    • pixi-global-company-tools.toml
    • pixi-global-from-my-dotfiles.toml

It is unclear whether the first implementation already needs to support this. At the very least we should put the manifest into its own folder, like ~/.pixi/global/manifests/pixi-global.toml.

    "},{"location":"design_proposals/pixi_global_manifest/#discovery-via-config-key","title":"Discovery via config key","text":"

In order to make it easier to manage manifests in version control, we could allow setting the manifest path via a key in the pixi configuration.

    config.toml
    global_manifests = \"/path/to/your/manifests\"\n
    "},{"location":"design_proposals/pixi_global_manifest/#no-activation","title":"No activation","text":"

    The current pixi global install features --no-activation. When this flag is set, CONDA_PREFIX and PATH will not be set when running the exposed executable. This is useful when installing Python package managers or shells.

    Assuming that this needs to be set per mapping, one way to expose this functionality would be to allow the following:

    [envs.pip.exposed]\npip = { executable=\"pip\", activation=false }\n
    "},{"location":"examples/cpp-sdl/","title":"SDL example","text":"

    The cpp-sdl example is located in the pixi repository.

    git clone https://github.com/prefix-dev/pixi.git\n

    Move to the example folder

    cd pixi/examples/cpp-sdl\n

    Run the start command

    pixi run start\n

Using the depends-on feature, you only needed to run the start task, but under the hood it also runs the following tasks.

    # Configure the CMake project\npixi run configure\n\n# Build the executable\npixi run build\n\n# Start the build executable\npixi run start\n
    "},{"location":"examples/opencv/","title":"Opencv example","text":"

    The opencv example is located in the pixi repository.

    git clone https://github.com/prefix-dev/pixi.git\n

    Move to the example folder

    cd pixi/examples/opencv\n
    "},{"location":"examples/opencv/#face-detection","title":"Face detection","text":"

    Run the start command to start the face detection algorithm.

    pixi run start\n

    The screen that starts should look like this:

Check out webcam_capture.py to see how we detect a face.

    "},{"location":"examples/opencv/#camera-calibration","title":"Camera Calibration","text":"

    Next to face recognition, a camera calibration example is also included.

    You'll need a checkerboard for this to work. Print this:

    Then run

    pixi run calibrate\n

To take a picture for calibration, press SPACE. Do this approximately 10 times with the checkerboard in view of the camera.

After that, press ESC, which will start the calibration.

    When the calibration is done, the camera will be used again to find the distance to the checkerboard.

    "},{"location":"examples/ros2-nav2/","title":"Navigation 2 example","text":"

    The nav2 example is located in the pixi repository.

    git clone https://github.com/prefix-dev/pixi.git\n

    Move to the example folder

    cd pixi/examples/ros2-nav2\n

    Run the start command

    pixi run start\n
    "},{"location":"features/advanced_tasks/","title":"Advanced tasks","text":"

    When building a package, you often have to do more than just run the code. Steps like formatting, linting, compiling, testing, benchmarking, etc. are often part of a project. With pixi tasks, this should become much easier to do.

    Here are some quick examples

    pixi.toml
    [tasks]\n# Commands as lists so you can also add documentation in between.\nconfigure = { cmd = [\n    \"cmake\",\n    # Use the cross-platform Ninja generator\n    \"-G\",\n    \"Ninja\",\n    # The source is in the root directory\n    \"-S\",\n    \".\",\n    # We wanna build in the .build directory\n    \"-B\",\n    \".build\",\n] }\n\n# Depend on other tasks\nbuild = { cmd = [\"ninja\", \"-C\", \".build\"], depends-on = [\"configure\"] }\n\n# Using environment variables\nrun = \"python main.py $PIXI_PROJECT_ROOT\"\nset = \"export VAR=hello && echo $VAR\"\n\n# Cross platform file operations\ncopy = \"cp pixi.toml pixi_backup.toml\"\nclean = \"rm pixi_backup.toml\"\nmove = \"mv pixi.toml backup.toml\"\n
    "},{"location":"features/advanced_tasks/#depends-on","title":"Depends on","text":"

    Just like packages can depend on other packages, our tasks can depend on other tasks. This allows for complete pipelines to be run with a single command.

    An obvious example is compiling before running an application.

Check out our cpp_sdl example for a running example. In that package we have some tasks that depend on each other, so we can ensure that when you run pixi run start, everything is set up as expected.

    pixi task add configure \"cmake -G Ninja -S . -B .build\"\npixi task add build \"ninja -C .build\" --depends-on configure\npixi task add start \".build/bin/sdl_example\" --depends-on build\n

    Results in the following lines added to the pixi.toml

    pixi.toml
    [tasks]\n# Configures CMake\nconfigure = \"cmake -G Ninja -S . -B .build\"\n# Build the executable but make sure CMake is configured first.\nbuild = { cmd = \"ninja -C .build\", depends-on = [\"configure\"] }\n# Start the built executable\nstart = { cmd = \".build/bin/sdl_example\", depends-on = [\"build\"] }\n
    pixi run start\n

    The tasks will be executed after each other:

    • First configure because it has no dependencies.
    • Then build as it only depends on configure.
• Then start as all its dependencies have run.

If one of the commands fails (exits with a non-zero code), it will stop and the next one will not be started.

    With this logic, you can also create aliases as you don't have to specify any command in a task.

    pixi task add fmt ruff\npixi task add lint pylint\n
    pixi task alias style fmt lint\n

    Results in the following pixi.toml.

    pixi.toml
    fmt = \"ruff\"\nlint = \"pylint\"\nstyle = { depends-on = [\"fmt\", \"lint\"] }\n

    Now run both tools with one command.

    pixi run style\n
    "},{"location":"features/advanced_tasks/#working-directory","title":"Working directory","text":"

    Pixi tasks support the definition of a working directory.

    cwd\" stands for Current Working Directory. The directory is relative to the pixi package root, where the pixi.toml file is located.

    Consider a pixi project structured as follows:

    \u251c\u2500\u2500 pixi.toml\n\u2514\u2500\u2500 scripts\n    \u2514\u2500\u2500 bar.py\n

    To add a task to run the bar.py file, use:

    pixi task add bar \"python bar.py\" --cwd scripts\n

This will add the following line to the manifest file:

    pixi.toml
    [tasks]\nbar = { cmd = \"python bar.py\", cwd = \"scripts\" }\n
    "},{"location":"features/advanced_tasks/#caching","title":"Caching","text":"

    When you specify inputs and/or outputs to a task, pixi will reuse the result of the task.

    For the cache, pixi checks that the following are true:

    • No package in the environment has changed.
    • The selected inputs and outputs are the same as the last time the task was run. We compute fingerprints of all the files selected by the globs and compare them to the last time the task was run.
    • The command is the same as the last time the task was run.

    If all of these conditions are met, pixi will not run the task again and instead use the existing result.

    Inputs and outputs can be specified as globs, which will be expanded to all matching files.

    pixi.toml
    [tasks]\n# This task will only run if the `main.py` file has changed.\nrun = { cmd = \"python main.py\", inputs = [\"main.py\"] }\n\n# This task will remember the result of the `curl` command and not run it again if the file `data.csv` already exists.\ndownload_data = { cmd = \"curl -o data.csv https://example.com/data.csv\", outputs = [\"data.csv\"] }\n\n# This task will only run if the `src` directory has changed and will remember the result of the `make` command.\nbuild = { cmd = \"make\", inputs = [\"src/*.cpp\", \"include/*.hpp\"], outputs = [\"build/app.exe\"] }\n

    Note: if you want to debug the globs you can use the --verbose flag to see which files are selected.

    # shows info logs of all files that were selected by the globs\npixi run -v start\n
    "},{"location":"features/advanced_tasks/#environment-variables","title":"Environment variables","text":"

    You can set environment variables for a task. These are seen as \"default\" values for the variables as you can overwrite them from the shell.

    pixi.toml

    [tasks]\necho = { cmd = \"echo $ARGUMENT\", env = { ARGUMENT = \"hello\" } }\n
    If you run pixi run echo it will output hello. When you set the environment variable ARGUMENT before running the task, it will use that value instead.

    ARGUMENT=world pixi run echo\n\u2728 Pixi task (echo in default): echo $ARGUMENT\nworld\n

    These variables are not shared over tasks, so you need to define these for every task you want to use them in.
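For example, two tasks that should both see the same default value each repeat the definition (hypothetical task names, purely for illustration): pixi.toml

[tasks]\ngreet = { cmd = \"echo $GREETING world\", env = { GREETING = \"hello\" } }\nshout = { cmd = \"echo $GREETING WORLD\", env = { GREETING = \"hello\" } }\n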

    Extend instead of overwrite

If you use the same environment variable in the value as in the key of the map, the variable will be extended rather than overwritten. For example, extending PATH: pixi.toml

    [tasks]\necho = { cmd = \"echo $PATH\", env = { PATH = \"/tmp/path:$PATH\" } }\n
    This will output /tmp/path:/usr/bin:/bin instead of the original /usr/bin:/bin.

    "},{"location":"features/advanced_tasks/#clean-environment","title":"Clean environment","text":"

    You can make sure the environment of a task is \"pixi only\". Here pixi will only include the minimal required environment variables for your platform to run the command in. The environment will contain all variables set by the conda environment like \"CONDA_PREFIX\". It will however include some default values from the shell, like: \"DISPLAY\", \"LC_ALL\", \"LC_TIME\", \"LC_NUMERIC\", \"LC_MEASUREMENT\", \"SHELL\", \"USER\", \"USERNAME\", \"LOGNAME\", \"HOME\", \"HOSTNAME\",\"TMPDIR\", \"XPC_SERVICE_NAME\", \"XPC_FLAGS\"

    [tasks]\nclean_command = { cmd = \"python run_in_isolated_env.py\", clean-env = true}\n
    This setting can also be set from the command line with pixi run --clean-env TASK_NAME.

    clean-env not supported on Windows

On Windows it's hard to create a \"clean environment\" as conda-forge doesn't ship Windows compilers and Windows needs a lot of base variables. This makes the feature not worth implementing, as the number of edge cases would make it unusable.

    "},{"location":"features/advanced_tasks/#our-task-runner-deno_task_shell","title":"Our task runner: deno_task_shell","text":"

To support the different operating systems (Windows, macOS and Linux), pixi integrates a shell that can run on all of them. This is deno_task_shell. The task shell is a limited implementation of a Bourne shell interface.

    "},{"location":"features/advanced_tasks/#built-in-commands","title":"Built-in commands","text":"

Next to running actual executables like ./myprogram, cmake or python, the shell has some built-in commands.

    • cp: Copies files.
    • mv: Moves files.
    • rm: Remove files or directories. Ex: rm -rf [FILE]... - Commonly used to recursively delete files or directories.
    • mkdir: Makes directories. Ex. mkdir -p DIRECTORY... - Commonly used to make a directory and all its parents with no error if it exists.
    • pwd: Prints the name of the current/working directory.
    • sleep: Delays for a specified amount of time. Ex. sleep 1 to sleep for 1 second, sleep 0.5 to sleep for half a second, or sleep 1m to sleep a minute
    • echo: Displays a line of text.
    • cat: Concatenates files and outputs them on stdout. When no arguments are provided, it reads and outputs stdin.
    • exit: Causes the shell to exit.
    • unset: Unsets environment variables.
    • xargs: Builds arguments from stdin and executes a command.
    "},{"location":"features/advanced_tasks/#syntax","title":"Syntax","text":"
    • Boolean list: use && or || to separate two commands.
      • &&: if the command before && succeeds continue with the next command.
      • ||: if the command before || fails continue with the next command.
    • Sequential lists: use ; to run two commands without checking if the first command failed or succeeded.
    • Environment variables:
      • Set env variable using: export ENV_VAR=value
      • Use env variable using: $ENV_VAR
      • unset env variable using unset ENV_VAR
    • Shell variables: Shell variables are similar to environment variables, but won\u2019t be exported to spawned commands.
      • Set them: VAR=value
      • use them: VAR=value && echo $VAR
• Pipelines: Use the stdout output of a command as the stdin of a following command
      • |: echo Hello | python receiving_app.py
      • |&: use this to also get the stderr as input.
    • Command substitution: $() to use the output of a command as input for another command.
      • python main.py $(git rev-parse HEAD)
• Negate exit code: ! before any command will negate the exit code from 1 to 0 or vice versa.
    • Redirects: > to redirect the stdout to a file.
      • echo hello > file.txt will put hello in file.txt and overwrite existing text.
      • python main.py 2> file.txt will put the stderr output in file.txt.
      • python main.py &> file.txt will put the stderr and stdout in file.txt.
      • echo hello >> file.txt will append hello to the existing file.txt.
    • Glob expansion: * to expand all options.
      • echo *.py will echo all filenames that end with .py
      • echo **/*.py will echo all filenames that end with .py in this directory and all descendant directories.
      • echo data[0-9].csv will echo all filenames that have a single number after data and before .csv

    More info in deno_task_shell documentation.

    "},{"location":"features/environment/","title":"Environments","text":"

    Pixi is a tool to manage virtual environments. This document explains what an environment looks like and how to use it.

    "},{"location":"features/environment/#structure","title":"Structure","text":"

    A pixi environment is located in the .pixi/envs directory of the project. This location is not configurable as it is a specific design decision to keep the environments in the project directory. This keeps your machine and your project clean and isolated from each other, and makes it easy to clean up after a project is done.

If you look at the .pixi/envs directory, you will see a directory for each environment; default is the one that is normally used, and if you specify a custom environment, the name you specified will be used.

    .pixi\n\u2514\u2500\u2500 envs\n    \u251c\u2500\u2500 cuda\n    \u2502   \u251c\u2500\u2500 bin\n    \u2502   \u251c\u2500\u2500 conda-meta\n    \u2502   \u251c\u2500\u2500 etc\n    \u2502   \u251c\u2500\u2500 include\n    \u2502   \u251c\u2500\u2500 lib\n    \u2502   ...\n    \u2514\u2500\u2500 default\n        \u251c\u2500\u2500 bin\n        \u251c\u2500\u2500 conda-meta\n        \u251c\u2500\u2500 etc\n        \u251c\u2500\u2500 include\n        \u251c\u2500\u2500 lib\n        ...\n

These directories are conda environments, and you can use them as such, but you cannot manually edit them; changes should always go through the pixi.toml. Pixi will always make sure the environment is in sync with the pixi.lock file. If this is not the case, all the commands that use the environment will automatically update it, e.g. pixi run, pixi shell.

    "},{"location":"features/environment/#cleaning-up","title":"Cleaning up","text":"

    If you want to clean up the environments, you can simply delete the .pixi/envs directory, and pixi will recreate the environments when needed.

    # either:\nrm -rf .pixi/envs\n\n# or per environment:\nrm -rf .pixi/envs/default\nrm -rf .pixi/envs/cuda\n
    "},{"location":"features/environment/#activation","title":"Activation","text":"

An environment is nothing more than a set of files installed into a certain location that somewhat mimics a global system install. You need to activate the environment to use it. In the simplest sense, that means adding the bin directory of the environment to the PATH variable. But there is more to it in a conda environment, as it also sets some environment variables.

    To do the activation we have multiple options:

    • Use the pixi shell command to open a shell with the environment activated.
    • Use the pixi shell-hook command to print the command to activate the environment in your current shell.
    • Use the pixi run command to run a command in the environment.

The run command is special, as it runs its own cross-platform shell and has the ability to run tasks. More information about tasks can be found in the tasks documentation.
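In practice, any of the following gives you an activated environment (start here is a hypothetical task name, used only for illustration):

pixi shell\neval \"$(pixi shell-hook)\"\npixi run start\n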

Using pixi shell-hook, you would get the following output:

    export PATH=\"/home/user/development/pixi/.pixi/envs/default/bin:/home/user/.local/bin:/home/user/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/home/user/.pixi/bin\"\nexport CONDA_PREFIX=\"/home/user/development/pixi/.pixi/envs/default\"\nexport PIXI_PROJECT_NAME=\"pixi\"\nexport PIXI_PROJECT_ROOT=\"/home/user/development/pixi\"\nexport PIXI_PROJECT_VERSION=\"0.12.0\"\nexport PIXI_PROJECT_MANIFEST=\"/home/user/development/pixi/pixi.toml\"\nexport CONDA_DEFAULT_ENV=\"pixi\"\nexport PIXI_ENVIRONMENT_PLATFORMS=\"osx-64,linux-64,win-64,osx-arm64\"\nexport PIXI_ENVIRONMENT_NAME=\"default\"\nexport PIXI_PROMPT=\"(pixi) \"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-binutils_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gcc_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gfortran_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gxx_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/libglib_activate.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/rust.sh\"\n

    It sets the PATH and some more environment variables. But more importantly it also runs activation scripts that are presented by the installed packages. An example of this would be the libglib_activate.sh script. Thus, just adding the bin directory to the PATH is not enough.

    "},{"location":"features/environment/#traditional-conda-activate-like-activation","title":"Traditional conda activate-like activation","text":"

    If you prefer to use the traditional conda activate-like activation, you could use the pixi shell-hook command.

    $ which python\npython not found\n$ eval \"$(pixi shell-hook)\"\n$ (default) which python\n/path/to/project/.pixi/envs/default/bin/python\n

    Warning

    It is not encouraged to use the traditional conda activate-like activation, as deactivating the environment is not really possible. Use pixi shell instead.

    "},{"location":"features/environment/#using-pixi-with-direnv","title":"Using pixi with direnv","text":"Installing direnv

Of course you can use pixi to install direnv globally. We recommend running

    pixi global install direnv

    to install the latest version of direnv on your computer.

    This allows you to use pixi in combination with direnv. Enter the following into your .envrc file:

    .envrc
    watch_file pixi.lock # (1)!\neval \"$(pixi shell-hook)\" # (2)!\n
    1. This ensures that every time your pixi.lock changes, direnv invokes the shell-hook again.
    2. This installs if needed, and activates the environment. direnv ensures that the environment is deactivated when you leave the directory.
    $ cd my-project\ndirenv: error /my-project/.envrc is blocked. Run `direnv allow` to approve its content\n$ direnv allow\ndirenv: loading /my-project/.envrc\n\u2714 Project in /my-project is ready to use!\ndirenv: export +CONDA_DEFAULT_ENV +CONDA_PREFIX +PIXI_ENVIRONMENT_NAME +PIXI_ENVIRONMENT_PLATFORMS +PIXI_PROJECT_MANIFEST +PIXI_PROJECT_NAME +PIXI_PROJECT_ROOT +PIXI_PROJECT_VERSION +PIXI_PROMPT ~PATH\n$ which python\n/my-project/.pixi/envs/default/bin/python\n$ cd ..\ndirenv: unloading\n$ which python\npython not found\n
    "},{"location":"features/environment/#environment-variables","title":"Environment variables","text":"

    The following environment variables are set by pixi, when using the pixi run, pixi shell, or pixi shell-hook command:

    • PIXI_PROJECT_ROOT: The root directory of the project.
    • PIXI_PROJECT_NAME: The name of the project.
    • PIXI_PROJECT_MANIFEST: The path to the manifest file (pixi.toml).
    • PIXI_PROJECT_VERSION: The version of the project.
    • PIXI_PROMPT: The prompt to use in the shell, also used by pixi shell itself.
    • PIXI_ENVIRONMENT_NAME: The name of the environment, defaults to default.
    • PIXI_ENVIRONMENT_PLATFORMS: Comma separated list of platforms supported by the project.
    • CONDA_PREFIX: The path to the environment. (Used by multiple tools that already understand conda environments)
    • CONDA_DEFAULT_ENV: The name of the environment. (Used by multiple tools that already understand conda environments)
    • PATH: We prepend the bin directory of the environment to the PATH variable, so you can use the tools installed in the environment directly.
    • INIT_CWD: ONLY IN pixi run: The directory where the command was run from.
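As a quick sanity check, you could print a couple of these from within the environment (a minimal sketch):

pixi run 'echo $PIXI_PROJECT_NAME $PIXI_PROJECT_ROOT'\n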

    Note

Even though these are environment variables, they cannot be overridden. E.g. you cannot change the root of the project by setting PIXI_PROJECT_ROOT in the environment.

    "},{"location":"features/environment/#solving-environments","title":"Solving environments","text":"

    When you run a command that uses the environment, pixi will check if the environment is in sync with the pixi.lock file. If it is not, pixi will solve the environment and update it. This means that pixi will retrieve the best set of packages for the dependency requirements that you specified in the pixi.toml and will put the output of the solve step into the pixi.lock file. Solving is a mathematical problem and can take some time, but we take pride in the way we solve environments, and we are confident that we can solve your environment in a reasonable time. If you want to learn more about the solving process, you can read these:

    • Rattler(conda) resolver blog
    • Rip(PyPI) resolver blog

Pixi solves both the conda and PyPI dependencies, where the PyPI dependencies use the conda packages as a base, so you can be sure that the packages are compatible with each other. These solvers are split between the rattler and uv libraries; these control the heavy lifting of the solving process, which is executed by our custom SAT solver: resolvo. resolvo is able to solve multiple ecosystems like conda and PyPI. It implements the lazy solving process for PyPI packages, which means that it only downloads the metadata of the packages that are needed to solve the environment. It also supports the conda way of solving, which means that it downloads the metadata of all the packages at once and then solves in one go.

For the [pypi-dependencies], uv implements sdist building to retrieve the metadata of the packages, and wheel building to install the packages. For this building step, pixi requires python to be installed first via the (conda) [dependencies] section of the pixi.toml file. This will always be slower than the pure conda solves, so for the best pixi experience you should stay within the [dependencies] section of the pixi.toml file.

    "},{"location":"features/environment/#caching","title":"Caching","text":"

    Pixi caches all previously downloaded packages in a cache folder. This cache folder is shared between all pixi projects and globally installed tools.

    Normally the location would be the following platform-specific default cache folder:

    • Linux: $XDG_CACHE_HOME/rattler or $HOME/.cache/rattler
    • macOS: $HOME/Library/Caches/rattler
    • Windows: %LOCALAPPDATA%\\rattler

    This location is configurable by setting the PIXI_CACHE_DIR or RATTLER_CACHE_DIR environment variable.
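For example, to redirect the cache for a single invocation (the path here is just an illustration; any writable directory works):

PIXI_CACHE_DIR=\"$HOME/pixi-cache\" pixi install\n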

    When you want to clean the cache, you can simply delete the cache directory, and pixi will re-create the cache when needed.

    The cache contains multiple folders concerning different caches from within pixi.

    • pkgs: Contains the downloaded/unpacked conda packages.
    • repodata: Contains the conda repodata cache.
• uv-cache: Contains the uv cache. This includes multiple caches, e.g. built wheels and wheel archives
    • http-cache: Contains the conda-pypi mapping cache.
    "},{"location":"features/lockfile/","title":"The pixi.lock lock file","text":"

    A lock file is the protector of the environments, and pixi is the key to unlock it.

    "},{"location":"features/lockfile/#what-is-a-lock-file","title":"What is a lock file?","text":"

    A lock file locks the environment in a specific state. Within pixi a lock file is a description of the packages in an environment. The lock file contains two definitions:

    • The environments that are used in the project with their complete set of packages. e.g.:

      environments:\n    default:\n        channels:\n          - url: https://conda.anaconda.org/conda-forge/\n        packages:\n            linux-64:\n            ...\n            - conda: https://conda.anaconda.org/conda-forge/linux-64/python-3.12.2-hab00c5b_0_cpython.conda\n            ...\n            osx-64:\n            ...\n            - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda\n            ...\n
      • The definition of the packages themselves. e.g.:

        - kind: conda\n  name: python\n  version: 3.12.2\n  build: h9f0c242_0_cpython\n  subdir: osx-64\n  url: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda\n  sha256: 7647ac06c3798a182a4bcb1ff58864f1ef81eb3acea6971295304c23e43252fb\n  md5: 0179b8007ba008cf5bec11f3b3853902\n  depends:\n    - bzip2 >=1.0.8,<2.0a0\n    - libexpat >=2.5.0,<3.0a0\n    - libffi >=3.4,<4.0a0\n    - libsqlite >=3.45.1,<4.0a0\n    - libzlib >=1.2.13,<1.3.0a0\n    - ncurses >=6.4,<7.0a0\n    - openssl >=3.2.1,<4.0a0\n    - readline >=8.2,<9.0a0\n    - tk >=8.6.13,<8.7.0a0\n    - tzdata\n    - xz >=5.2.6,<6.0a0\n  constrains:\n    - python_abi 3.12.* *_cp312\n  license: Python-2.0\n  size: 14596811\n  timestamp: 1708118065292\n
    "},{"location":"features/lockfile/#why-a-lock-file","title":"Why a lock file","text":"

    Pixi uses the lock file for the following reasons:

    • To save a working installation state, without copying the entire environment's data.
    • To ensure the project configuration is aligned with the installed environment.
    • To give the user a file that contains all the information about the environment.

This gives you (and your collaborators) a way to really reproduce the environment you are working in. Using tools such as docker suddenly becomes much less necessary.

    "},{"location":"features/lockfile/#when-is-a-lock-file-generated","title":"When is a lock file generated?","text":"

    A lock file is generated when you install a package. More specifically, a lock file is generated from the solve step of the installation process. The solve will return a list of packages that are to be installed, and the lock file will be generated from this list. This diagram tries to explain the process:

    graph TD\n    A[Install] --> B[Solve]\n    B --> C[Generate and write lock file]\n    C --> D[Install Packages]
    "},{"location":"features/lockfile/#how-to-use-a-lock-file","title":"How to use a lock file","text":"

    Do not edit the lock file

    A lock file is a machine only file, and should not be edited by hand.

    That said, the pixi.lock is human-readable, so it's easy to track the changes in the environment. We recommend you track the lock file in git or other version control systems. This will ensure that the environment is always reproducible and that you can always revert back to a working state, in case something goes wrong. The pixi.lock and the manifest file pixi.toml/pyproject.toml should always be in sync.

    Running the following commands will check and automatically update the lock file if you changed any dependencies:

    • pixi install
    • pixi run
    • pixi shell
    • pixi shell-hook
    • pixi tree
    • pixi list
    • pixi add
    • pixi remove

    All the commands that support the interaction with the lock file also include some lock file usage options:

    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file[^1]. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
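For example, both options can be passed either as a flag or via the corresponding environment variable:

pixi install --locked\n# or\nPIXI_FROZEN=true pixi install\n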

    Syncing the lock file with the manifest file

    The lock file is always matched with the whole configuration in the manifest file. This means that if you change the manifest file, the lock file will be updated.

    flowchart TD\n    C[manifest] --> A[lockfile] --> B[environment]

    "},{"location":"features/lockfile/#lockfile-satisfiability","title":"Lockfile satisfiability","text":"

    The lock file is a description of the environment, and it should always be satisfiable. Satisfiable means that the given manifest file and the created environment are in sync with the lockfile. If the lock file is not satisfiable, pixi will generate a new lock file automatically.

    Steps to check if the lock file is satisfiable:

    • All environments in the manifest file are in the lock file
    • All channels in the manifest file are in the lock file
    • All packages in the manifest file are in the lock file, and the versions in the lock file are compatible with the requirements in the manifest file, for both conda and pypi packages.
      • Conda packages use a matchspec which can match on all the information we store in the lockfile, even timestamp, subdir and license.
    • If pypi-dependencies are added, all conda packages that are python packages in the lock file have a purls field.
    • All hashes for the pypi editable packages are correct.
    • There is only a single entry for every package in the lock file.
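
    For example, bumping a requirement in the manifest is enough to make the existing lock entry unsatisfiable; the next command that touches the environment will then re-solve and rewrite pixi.lock. The package and versions below are purely illustrative:

    [dependencies]\n# previously locked as numpy 1.26.*\nnumpy = \">=2\"  # the locked 1.26.* no longer satisfies this, so pixi re-solves\n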

    If you want more details, check out the actual code, as the list above is a simplification.

    "},{"location":"features/lockfile/#the-version-of-the-lock-file","title":"The version of the lock file","text":"

    The lock file has a version number; this ensures that the lock file is compatible with the local version of pixi.

    version: 4\n

    Pixi is backward compatible with the lock file, but not forward compatible. This means that you can use an older lock file with a newer version of pixi, but not the other way around.

    "},{"location":"features/lockfile/#your-lock-file-is-big","title":"Your lock file is big","text":"

    The lock file can grow quite large, especially if you have a lot of packages installed. This is because the lock file contains all the information about the packages.

    1. We try to keep the lock file as small as possible.
    2. It's always smaller than a docker image.
    3. Downloading the lock file is always faster than downloading the incorrect packages.
    "},{"location":"features/lockfile/#you-dont-need-a-lock-file-because","title":"You don't need a lock file because...","text":"

    If you cannot think of a case where you would benefit from a fast reproducible environment, then you don't need a lock file.

    But take note of the following:

    • A lock file allows you to run the same environment on different machines, think CI systems.
    • It also allows you to go back to a working state if you have made a mistake.
    • It helps other users onboard to your project as they don't have to figure out the environment setup or solve dependency issues.
    "},{"location":"features/lockfile/#removing-the-lock-file","title":"Removing the lock file","text":"

    If you want to remove the lock file, you can simply delete it.

    rm pixi.lock\n

    This will remove the lock file, and the next time you run a command that requires the lock file, it will be generated again.

    Note

    This does remove the locked state of the environment, and the environment will be updated to the latest version of the packages.

    "},{"location":"features/multi_environment/","title":"Multi Environment Support","text":""},{"location":"features/multi_environment/#motivating-example","title":"Motivating Example","text":"

    There are multiple scenarios where multiple environments are useful.

    • Testing of multiple package versions, e.g. py39 and py310 or polars 0.12 and 0.13.
    • Smaller single tool environments, e.g. lint or docs.
    • Large developer environments, that combine all the smaller environments, e.g. dev.
    • Strict supersets of environments, e.g. prod and test-prod where test-prod is a strict superset of prod.
    • Multiple machines from one project, e.g. a cuda environment and a cpu environment.
    • And many more. (Feel free to edit this document in our GitHub and add your use case.)

    This prepares pixi for use in large projects with multiple use-cases, multiple developers and different CI needs.

    "},{"location":"features/multi_environment/#design-considerations","title":"Design Considerations","text":"

    There are a few things we wanted to keep in mind in the design:

    1. User-friendliness: Pixi is a user-focused tool that goes beyond developers. The feature should have good error reporting and helpful documentation from the start.
    2. Keep it simple: Not understanding the multiple environments feature shouldn't keep a user from using pixi. The feature should be \"invisible\" to the non-multi-env use-cases.
    3. No Automatic Combinatorial: To ensure the dependency resolution process remains manageable, the solution should avoid a combinatorial explosion of dependency sets, by making the environments user defined rather than automatically inferred from a matrix of features.
    4. Single environment Activation: The design should allow only one environment to be active at any given time, simplifying the resolution process and preventing conflicts.
    5. Fixed lock files: It's crucial to preserve fixed lock files for consistency and predictability. Solutions must ensure reliability not just for authors but also for end-users, particularly at the time of lock file creation.
    "},{"location":"features/multi_environment/#feature-environment-set-definitions","title":"Feature & Environment Set Definitions","text":"

    Introduce environment sets into the pixi.toml; these describe environments based on features. Introduce features into the pixi.toml that can describe parts of environments. As an environment goes beyond just dependencies, the features should be described with the following fields:

    • dependencies: The conda package dependencies
    • pypi-dependencies: The pypi package dependencies
    • system-requirements: The system requirements of the environment
    • activation: The activation information for the environment
    • platforms: The platforms the environment can be run on.
    • channels: The channels used to create the environment. A priority field can be added to a channel to allow concatenation of channels instead of overwriting.
    • target: All the above features but also separated by targets.
    • tasks: Feature-specific tasks; tasks in one environment are selected as default tasks for the environment.
    Default features
    [dependencies] # short for [feature.default.dependencies]\npython = \"*\"\nnumpy = \"==2.3\"\n\n[pypi-dependencies] # short for [feature.default.pypi-dependencies]\npandas = \"*\"\n\n[system-requirements] # short for [feature.default.system-requirements]\nlibc = \"2.33\"\n\n[activation] # short for [feature.default.activation]\nscripts = [\"activate.sh\"]\n
    Different dependencies per feature
    [feature.py39.dependencies]\npython = \"~=3.9.0\"\n[feature.py310.dependencies]\npython = \"~=3.10.0\"\n[feature.test.dependencies]\npytest = \"*\"\n
    Full set of environment modification in one feature
    [feature.cuda]\ndependencies = {cuda = \"x.y.z\", cudnn = \"12.0\"}\npypi-dependencies = {torch = \"1.9.0\"}\nplatforms = [\"linux-64\", \"osx-arm64\"]\nactivation = {scripts = [\"cuda_activation.sh\"]}\nsystem-requirements = {cuda = \"12\"}\n# Channels concatenate using a priority instead of overwrite, so the default channels are still used.\n# Using the priority the concatenation is controlled, default is 0, the default channels are used last.\n# Highest priority comes first.\nchannels = [\"nvidia\", {channel = \"pytorch\", priority = -1}] # Results in:  [\"nvidia\", \"conda-forge\", \"pytorch\"] when the default is `conda-forge`\ntasks = { warmup = \"python warmup.py\" }\ntarget.osx-arm64 = {dependencies = {mlx = \"x.y.z\"}}\n
    Define tasks as defaults of an environment
    [feature.test.tasks]\ntest = \"pytest\"\n\n[environments]\ntest = [\"test\"]\n\n# `pixi run test` == `pixi run --environment test test`\n

    The environment definition should contain the following fields:

    • features: Vec<Feature>: The features that are included in the environment set, which is also the default field in the environments.
    • solve-group: String: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional dependencies. For instance when testing a production environment with additional test dependencies.
    Creating environments from features
    [environments]\n# implicit: default = [\"default\"]\ndefault = [\"py39\"] # implicit: default = [\"py39\", \"default\"]\npy310 = [\"py310\"] # implicit: py310 = [\"py310\", \"default\"]\ntest = [\"test\"] # implicit: test = [\"test\", \"default\"]\ntest39 = [\"test\", \"py39\"] # implicit: test39 = [\"test\", \"py39\", \"default\"]\n
    Testing a production environment with additional dependencies
    [environments]\n# Creating a `prod` environment which is the minimal set of dependencies used for production.\nprod = {features = [\"py39\"], solve-group = \"prod\"}\n# Creating a `test_prod` environment which is the `prod` environment plus the `test` feature.\ntest_prod = {features = [\"py39\", \"test\"], solve-group = \"prod\"}\n# Using the `solve-group` to solve the `prod` and `test_prod` environments together\n# Which makes sure the tested environment has the same version of the dependencies as the production environment.\n
    Creating environments without including the default feature
    [dependencies]\npython = \"*\"\nnumpy = \"*\"\n\n[feature.lint.dependencies]\npre-commit = \"*\"\n\n[environments]\n# Create a custom environment which only has the `lint` feature (numpy isn't part of that env).\nlint = {features = [\"lint\"], no-default-feature = true}\n
    "},{"location":"features/multi_environment/#lock-file-structure","title":"lock file Structure","text":"

    Within the pixi.lock file, a package may now include an additional environments field, specifying the environments to which it belongs. To avoid duplication, the package's environments field may contain multiple environments, so the lock file is of minimal size.

    - platform: linux-64\n  name: pre-commit\n  version: 3.3.3\n  category: main\n  environments:\n    - dev\n    - test\n    - lint\n  ...:\n- platform: linux-64\n  name: python\n  version: 3.9.3\n  category: main\n  environments:\n    - dev\n    - test\n    - lint\n    - py39\n    - default\n  ...:\n
    "},{"location":"features/multi_environment/#user-interface-environment-activation","title":"User Interface Environment Activation","text":"

    Users can manually activate the desired environment via the command line or configuration. This approach guarantees a conflict-free environment by allowing only one feature set to be active at a time. For the user, the CLI looks like this:

    Default behavior
    \u279c pixi run python\n# Runs python in the `default` environment\n
    Activating a specific environment
    \u279c pixi run -e test pytest\n\u279c pixi run --environment test pytest\n# Runs `pytest` in the `test` environment\n
    Activating a shell in an environment
    \u279c pixi shell -e cuda\npixi shell --environment cuda\n# Starts a shell in the `cuda` environment\n
    Running any command in an environment
    \u279c pixi run -e test any_command\n# Runs any_command in the `test` environment which doesn't require to be predefined as a task.\n
    "},{"location":"features/multi_environment/#ambiguous-environment-selection","title":"Ambiguous Environment Selection","text":"

    It's possible to define tasks in multiple environments; in this case, the user will be prompted to select the environment.

    Here is a simple example of a task only manifest:

    pixi.toml

    [project]\nname = \"test_ambiguous_env\"\nchannels = []\nplatforms = [\"linux-64\", \"win-64\", \"osx-64\", \"osx-arm64\"]\n\n[tasks]\ndefault = \"echo Default\"\nambi = \"echo Ambi::Default\"\n[feature.test.tasks]\ntest = \"echo Test\"\nambi = \"echo Ambi::Test\"\n\n[feature.dev.tasks]\ndev = \"echo Dev\"\nambi = \"echo Ambi::Dev\"\n\n[environments]\ndefault = [\"test\", \"dev\"]\ntest = [\"test\"]\ndev = [\"dev\"]\n
    Trying to run the ambi task will prompt the user to select the environment, as it is available in all environments.

    Interactive selection of environments if task is in multiple environments
    \u279c pixi run ambi\n? The task 'ambi' can be run in multiple environments.\n\nPlease select an environment to run the task in: \u203a\n\u276f default # selecting default\n  test\n  dev\n\n\u2728 Pixi task (ambi in default): echo Ambi::Test\nAmbi::Test\n

    As you can see, it runs the task defined in the test feature, but it runs in the default environment. This happens because the ambi task defined in the default [tasks] table is overwritten by the feature tasks in the default environment, so that definition is no longer reachable from any environment.

    Some other results running in this example:

    \u279c pixi run --environment test ambi\n\u2728 Pixi task (ambi in test): echo Ambi::Test\nAmbi::Test\n\n\u279c pixi run --environment dev ambi\n\u2728 Pixi task (ambi in dev): echo Ambi::Dev\nAmbi::Dev\n\n# dev is run in the default environment\n\u279c pixi run dev\n\u2728 Pixi task (dev in default): echo Dev\nDev\n\n# dev is run in the dev environment\n\u279c pixi run -e dev dev\n\u2728 Pixi task (dev in dev): echo Dev\nDev\n

    "},{"location":"features/multi_environment/#important-links","title":"Important links","text":"
    • Initial writeup of the proposal: GitHub Gist by 0xbe7a
    • GitHub project: #10
    "},{"location":"features/multi_environment/#real-world-example-use-cases","title":"Real world example use cases","text":"Polarify test setup

    In polarify they want to test multiple Python versions combined with multiple versions of polars. This is currently done by using a matrix in GitHub Actions, which can be replaced by using multiple environments.

    pixi.toml
    [project]\nname = \"polarify\"\n# ...\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tasks]\npostinstall = \"pip install --no-build-isolation --no-deps --disable-pip-version-check -e .\"\n\n[dependencies]\npython = \">=3.9\"\npip = \"*\"\npolars = \">=0.14.24,<0.21\"\n\n[feature.py39.dependencies]\npython = \"3.9.*\"\n[feature.py310.dependencies]\npython = \"3.10.*\"\n[feature.py311.dependencies]\npython = \"3.11.*\"\n[feature.py312.dependencies]\npython = \"3.12.*\"\n[feature.pl017.dependencies]\npolars = \"0.17.*\"\n[feature.pl018.dependencies]\npolars = \"0.18.*\"\n[feature.pl019.dependencies]\npolars = \"0.19.*\"\n[feature.pl020.dependencies]\npolars = \"0.20.*\"\n\n[feature.test.dependencies]\npytest = \"*\"\npytest-md = \"*\"\npytest-emoji = \"*\"\nhypothesis = \"*\"\n[feature.test.tasks]\ntest = \"pytest\"\n\n[feature.lint.dependencies]\npre-commit = \"*\"\n[feature.lint.tasks]\nlint = \"pre-commit run --all\"\n\n[environments]\npl017 = [\"pl017\", \"py39\", \"test\"]\npl018 = [\"pl018\", \"py39\", \"test\"]\npl019 = [\"pl019\", \"py39\", \"test\"]\npl020 = [\"pl020\", \"py39\", \"test\"]\npy39 = [\"py39\", \"test\"]\npy310 = [\"py310\", \"test\"]\npy311 = [\"py311\", \"test\"]\npy312 = [\"py312\", \"test\"]\n
    .github/workflows/test.yml
    jobs:\n  tests-per-env:\n    runs-on: ubuntu-latest\n    strategy:\n      matrix:\n        environment: [py311, py312]\n    steps:\n      - uses: actions/checkout@v4\n      - uses: prefix-dev/setup-pixi@v0.5.1\n        with:\n          environments: ${{ matrix.environment }}\n      - name: Run tasks\n        run: |\n          pixi run --environment ${{ matrix.environment }} test\n  tests-with-multiple-envs:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - uses: prefix-dev/setup-pixi@v0.5.1\n        with:\n          environments: pl017 pl018\n      - run: |\n          pixi run -e pl017 test\n          pixi run -e pl018 test\n
    Test vs Production example

    This is an example of a project that has a test feature and prod environment. The prod environment is a production environment that contains the run dependencies. The test feature is a set of dependencies and tasks that we want to put on top of the previously solved prod environment. This is a common use case where we want to test the production environment with additional dependencies.

    pixi.toml

    [project]\nname = \"my-app\"\n# ...\nchannels = [\"conda-forge\"]\nplatforms = [\"osx-arm64\", \"linux-64\"]\n\n[tasks]\npostinstall-e = \"pip install --no-build-isolation --no-deps --disable-pip-version-check -e .\"\npostinstall = \"pip install --no-build-isolation --no-deps --disable-pip-version-check .\"\ndev = \"uvicorn my_app.app:main --reload\"\nserve = \"uvicorn my_app.app:main\"\n\n[dependencies]\npython = \">=3.12\"\npip = \"*\"\npydantic = \">=2\"\nfastapi = \">=0.105.0\"\nsqlalchemy = \">=2,<3\"\nuvicorn = \"*\"\naiofiles = \"*\"\n\n[feature.test.dependencies]\npytest = \"*\"\npytest-md = \"*\"\npytest-asyncio = \"*\"\n[feature.test.tasks]\ntest = \"pytest --md=report.md\"\n\n[environments]\n# both default and prod will have exactly the same dependency versions when they share a dependency\ndefault = {features = [\"test\"], solve-group = \"prod-group\"}\nprod = {features = [], solve-group = \"prod-group\"}\n
    In CI, you would run the following commands:
    pixi run postinstall-e && pixi run test\n
    Locally you would run the following command:
    pixi run postinstall-e && pixi run dev\n

    Then in a Dockerfile you would run the following command:
    Dockerfile

    FROM ghcr.io/prefix-dev/pixi:latest # this doesn't exist yet\nWORKDIR /app\nCOPY . .\nRUN pixi run --environment prod postinstall\nEXPOSE 8080\nCMD [\"/usr/local/bin/pixi\", \"run\", \"--environment\", \"prod\", \"serve\"]\n

    Multiple machines from one project

    This is an example of an ML project that should be executable on a machine that supports cuda and mlx. It should also be executable on machines that don't support cuda or mlx; we use the cpu feature for this.

    pixi.toml
    [project]\nname = \"my-ml-project\"\ndescription = \"A project that does ML stuff\"\nauthors = [\"Your Name <your.name@gmail.com>\"]\nchannels = [\"conda-forge\", \"pytorch\"]\n# All platforms that are supported by the project as the features will take the intersection of the platforms defined there.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[tasks]\ntrain-model = \"python train.py\"\nevaluate-model = \"python test.py\"\n\n[dependencies]\npython = \"3.11.*\"\npytorch = {version = \">=2.0.1\", channel = \"pytorch\"}\ntorchvision = {version = \">=0.15\", channel = \"pytorch\"}\npolars = \">=0.20,<0.21\"\nmatplotlib-base = \">=3.8.2,<3.9\"\nipykernel = \">=6.28.0,<6.29\"\n\n[feature.cuda]\nplatforms = [\"win-64\", \"linux-64\"]\nchannels = [\"nvidia\", {channel = \"pytorch\", priority = -1}]\nsystem-requirements = {cuda = \"12.1\"}\n\n[feature.cuda.tasks]\ntrain-model = \"python train.py --cuda\"\nevaluate-model = \"python test.py --cuda\"\n\n[feature.cuda.dependencies]\npytorch-cuda = {version = \"12.1.*\", channel = \"pytorch\"}\n\n[feature.mlx]\nplatforms = [\"osx-arm64\"]\n# MLX is only available on macOS >=13.5 (>14.0 is recommended)\nsystem-requirements = {macos = \"13.5\"}\n\n[feature.mlx.tasks]\ntrain-model = \"python train.py --mlx\"\nevaluate-model = \"python test.py --mlx\"\n\n[feature.mlx.dependencies]\nmlx = \">=0.16.0,<0.17.0\"\n\n[feature.cpu]\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[environments]\ncuda = [\"cuda\"]\nmlx = [\"mlx\"]\ndefault = [\"cpu\"]\n
    Running the project on a cuda machine
    pixi run train-model --environment cuda\n# will execute `python train.py --cuda`\n# fails if not on linux-64 or win-64 with cuda 12.1\n
    Running the project with mlx
    pixi run train-model --environment mlx\n# will execute `python train.py --mlx`\n# fails if not on osx-arm64\n
    Running the project on a machine without cuda or mlx
    pixi run train-model\n
    "},{"location":"features/multi_platform_configuration/","title":"Multi platform config","text":"

    Pixi's vision includes being supported on all major platforms. Sometimes that needs some extra configuration to work well. On this page, you will learn what you can configure to align better with the platform you are making your application for.

    Here is an example manifest file that highlights some of the features:

    pixi.toml
    [project]\n# Default project info....\n# A list of platforms you are supporting with your package.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[dependencies]\npython = \">=3.8\"\n\n[target.win-64.dependencies]\n# Overwrite the needed python version only on win-64\npython = \"3.7\"\n\n\n[activation]\nscripts = [\"setup.sh\"]\n\n[target.win-64.activation]\n# Overwrite activation scripts only for windows\nscripts = [\"setup.bat\"]\n
    pyproject.toml
    [tool.pixi.project]\n# Default project info....\n# A list of platforms you are supporting with your package.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[tool.pixi.dependencies]\npython = \">=3.8\"\n\n[tool.pixi.target.win-64.dependencies]\n# Overwrite the needed python version only on win-64\npython = \"~=3.7.0\"\n\n\n[tool.pixi.activation]\nscripts = [\"setup.sh\"]\n\n[tool.pixi.target.win-64.activation]\n# Overwrite activation scripts only for windows\nscripts = [\"setup.bat\"]\n
    "},{"location":"features/multi_platform_configuration/#platform-definition","title":"Platform definition","text":"

    The project.platforms defines which platforms your project supports. When multiple platforms are defined, pixi determines which dependencies to install for each platform individually. All of this is stored in a lock file.

    Running pixi install on a platform that is not configured will warn the user that it is not set up for that platform:

    \u276f pixi install\n  \u00d7 the project is not configured for your current platform\n   \u256d\u2500[pixi.toml:6:1]\n 6 \u2502 channels = [\"conda-forge\"]\n 7 \u2502 platforms = [\"osx-64\", \"osx-arm64\", \"win-64\"]\n   \u00b7             \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n   \u00b7                             \u2570\u2500\u2500 add 'linux-64' here\n 8 \u2502\n   \u2570\u2500\u2500\u2500\u2500\n  help: The project needs to be configured to support your platform (linux-64).\n
    "},{"location":"features/multi_platform_configuration/#target-specifier","title":"Target specifier","text":"

    With the target specifier, you can overwrite the original configuration specifically for a single platform. If you are targeting a specific platform in your target specifier that was not specified in your project.platforms then pixi will throw an error.

    "},{"location":"features/multi_platform_configuration/#dependencies","title":"Dependencies","text":"

    It might happen that you want to install a certain dependency only on a specific platform, or you might want to use a different version on different platforms.

    pixi.toml
    [dependencies]\npython = \">=3.8\"\n\n[target.win-64.dependencies]\nmsmpi = \"*\"\npython = \"3.8\"\n

    In the above example, we specify that we depend on msmpi only on Windows. We also specifically want python 3.8 when installing on Windows. This will overwrite the dependencies from the generic set of dependencies. This will not touch any of the other platforms.

    You can use pixi's cli to add these dependencies to the manifest file.

    pixi add --platform win-64 posix\n

    This also works for the host and build dependencies.

    pixi add --host --platform win-64 posix\npixi add --build --platform osx-64 clang\n

    This results in the following:

    pixi.toml
    [target.win-64.host-dependencies]\nposix = \"1.0.0.*\"\n\n[target.osx-64.build-dependencies]\nclang = \"16.0.6.*\"\n
    "},{"location":"features/multi_platform_configuration/#activation","title":"Activation","text":"

    Pixi's vision is to enable completely cross-platform projects, but you often need to run tools that are not built by your projects. Generated activation scripts are often in this category: the default scripts on Unix are bash, and on Windows they are bat.

    To deal with this, you can define your activation scripts using the target definition.

    pixi.toml

    [activation]\nscripts = [\"setup.sh\", \"local_setup.bash\"]\n\n[target.win-64.activation]\nscripts = [\"setup.bat\", \"local_setup.bat\"]\n
    When this project is run on win-64, it will only execute the target scripts, not the scripts specified in the default activation.scripts.

    "},{"location":"features/system_requirements/","title":"System Requirements in pixi","text":"

    System requirements define the minimal system specifications necessary during dependency resolution for a project. For instance, specifying a Unix system with a particular minimal libc version ensures that dependencies are compatible with the project's environment.

    System specifications are closely related to virtual packages, allowing for flexible and accurate dependency management.

    "},{"location":"features/system_requirements/#default-system-requirements","title":"Default System Requirements","text":"

    The following configurations outline the default minimal system requirements for different operating systems:

    # Default system requirements for Linux\n[system-requirements]\nlinux = \"4.18\"\nlibc = { family = \"glibc\", version = \"2.28\" }\n

    Windows currently has no minimal system requirements defined. If your project requires specific Windows configurations, you should define them accordingly.

    # Default system requirements for macOS\n[system-requirements]\nmacos = \"13.0\"\n
    # Default system requirements for macOS ARM64\n[system-requirements]\nmacos = \"13.0\"\n
    "},{"location":"features/system_requirements/#customizing-system-requirements","title":"Customizing System Requirements","text":"

    You only need to define system requirements if your project necessitates a different set from the defaults. This is common when installing environments on older or newer versions of operating systems.

    "},{"location":"features/system_requirements/#adjusting-for-older-systems","title":"Adjusting for Older Systems","text":"

    If you're encountering an error like:

    \u00d7 The current system has a mismatching virtual package. The project requires '__linux' to be at least version '4.18' but the system has version '4.12.14'\n

    This indicates that the project's system requirements are higher than your current system's specifications. To resolve this, you can lower the system requirements in your project's configuration:

    [system-requirements]\nlinux = \"4.12.14\"\n

    This adjustment informs the dependency resolver to accommodate the older system version.

    "},{"location":"features/system_requirements/#using-cuda-in-pixi","title":"Using CUDA in pixi","text":"

    To utilize CUDA in your project, you must specify the desired CUDA version in the system-requirements table. This ensures that CUDA is recognized and appropriately locked into the lock file if necessary.

    Example Configuration

    [system-requirements]\ncuda = \"12\"  # Replace \"12\" with the specific CUDA version you intend to use\n
    "},{"location":"features/system_requirements/#setting-system-requirements-environment-specific","title":"Setting System Requirements environment specific","text":"

    This can be set per feature in the manifest file.

    [feature.cuda.system-requirements]\ncuda = \"12\"\n\n[environments]\ncuda = [\"cuda\"]\n
    "},{"location":"features/system_requirements/#available-override-options","title":"Available Override Options","text":"

    In certain scenarios, you might need to override the system requirements detected on your machine. This can be particularly useful when working on systems that do not meet the project's default requirements.

    You can override virtual packages by setting the following environment variables:

    • CONDA_OVERRIDE_CUDA: Sets the CUDA version. Usage example: CONDA_OVERRIDE_CUDA=11
    • CONDA_OVERRIDE_GLIBC: Sets the glibc version. Usage example: CONDA_OVERRIDE_GLIBC=2.28
    • CONDA_OVERRIDE_OSX: Sets the macOS version. Usage example: CONDA_OVERRIDE_OSX=13.0
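
    For example, to resolve an environment on a machine where the CUDA driver is not detected, you could set the override just for one command, or export it for the whole shell session (the values below are illustrative):

    CONDA_OVERRIDE_CUDA=12.0 pixi install\n# or\nexport CONDA_OVERRIDE_GLIBC=2.28\npixi install\n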
    "},{"location":"features/system_requirements/#additional-resources","title":"Additional Resources","text":"

    For more detailed information on managing virtual packages and overriding system requirements, refer to the Conda Documentation.

    "},{"location":"ide_integration/devcontainer/","title":"Use pixi inside of a devcontainer","text":"

    VSCode Devcontainers are a popular tool to develop on a project with a consistent environment. They are also used in GitHub Codespaces which makes it a great way to develop on a project without having to install anything on your local machine.

    To use pixi inside of a devcontainer, follow these steps:

    Create a new directory .devcontainer in the root of your project. Then, create the following two files in the .devcontainer directory:

    .devcontainer/Dockerfile
    FROM mcr.microsoft.com/devcontainers/base:jammy\n\nARG PIXI_VERSION=v0.31.0\n\nRUN curl -L -o /usr/local/bin/pixi -fsSL --compressed \"https://github.com/prefix-dev/pixi/releases/download/${PIXI_VERSION}/pixi-$(uname -m)-unknown-linux-musl\" \\\n    && chmod +x /usr/local/bin/pixi \\\n    && pixi info\n\n# set some user and workdir settings to work nicely with vscode\nUSER vscode\nWORKDIR /home/vscode\n\nRUN echo 'eval \"$(pixi completion -s bash)\"' >> /home/vscode/.bashrc\n
    .devcontainer/devcontainer.json
    {\n    \"name\": \"my-project\",\n    \"build\": {\n      \"dockerfile\": \"Dockerfile\",\n      \"context\": \"..\",\n    },\n    \"customizations\": {\n      \"vscode\": {\n        \"settings\": {},\n        \"extensions\": [\"ms-python.python\", \"charliermarsh.ruff\", \"GitHub.copilot\"]\n      }\n    },\n    \"features\": {\n      \"ghcr.io/devcontainers/features/docker-in-docker:2\": {}\n    },\n    \"mounts\": [\"source=${localWorkspaceFolderBasename}-pixi,target=${containerWorkspaceFolder}/.pixi,type=volume\"],\n    \"postCreateCommand\": \"sudo chown vscode .pixi && pixi install\"\n}\n

    Put .pixi in a mount

    In the above example, we mount the .pixi directory into a volume. This is needed since the .pixi directory shouldn't be on a case-insensitive filesystem (the default on macOS and Windows) but instead in its own volume. There are some conda packages (for example ncurses-feedstock#73) that contain files that only differ in case, which leads to errors on case-insensitive filesystems.

    "},{"location":"ide_integration/devcontainer/#secrets","title":"Secrets","text":"

    If you want to authenticate to a private conda channel, you can add secrets to your devcontainer.

    .devcontainer/devcontainer.json
    {\n    \"build\": \"Dockerfile\",\n    \"context\": \"..\",\n    \"options\": [\n        \"--secret\",\n        \"id=prefix_dev_token,env=PREFIX_DEV_TOKEN\",\n    ],\n    // ...\n}\n
    .devcontainer/Dockerfile
    # ...\nRUN --mount=type=secret,id=prefix_dev_token,uid=1000 \\\n    test -s /run/secrets/prefix_dev_token \\\n    && pixi auth login --token \"$(cat /run/secrets/prefix_dev_token)\" https://repo.prefix.dev\n

    These secrets need to be present either as an environment variable when starting the devcontainer locally or in your GitHub Codespaces settings under Secrets.
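
    For local use, a minimal sketch is to export the secret in the shell that launches the devcontainer; the token value below is a placeholder:

    export PREFIX_DEV_TOKEN=<your-token>\n# then start or rebuild the devcontainer from this shell\n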

    "},{"location":"ide_integration/jupyterlab/","title":"JupyterLab Integration","text":""},{"location":"ide_integration/jupyterlab/#basic-usage","title":"Basic usage","text":"

    Using JupyterLab with pixi is very simple. You can just create a new pixi project and add the jupyterlab package to it. The full example is provided under the following GitHub link.

    pixi init\npixi add jupyterlab\n

    This will create a new pixi project and add the jupyterlab package to it. You can then start JupyterLab using the following command:

    pixi run jupyter lab\n

    If you want to add more \"kernels\" to JupyterLab, you can simply add them to your current project \u2013 as well as any dependencies from the scientific stack you might need.

    pixi add bash_kernel ipywidgets matplotlib numpy pandas  # ...\n
    "},{"location":"ide_integration/jupyterlab/#what-kernels-are-available","title":"What kernels are available?","text":"

    You can easily install more \"kernels\" for JupyterLab. The conda-forge repository has a number of interesting additional kernels - not just Python!

    • bash_kernel A kernel for bash
    • xeus-cpp A C++ kernel based on the new clang-repl
    • xeus-cling A C++ kernel based on the slightly older Cling
    • xeus-lua A Lua kernel
    • xeus-sql A kernel for SQL
    • r-irkernel An R kernel
    "},{"location":"ide_integration/jupyterlab/#advanced-usage","title":"Advanced usage","text":"

    If you want to have only one instance of JupyterLab running but still want per-directory Pixi environments, you can use one of the kernels provided by the pixi-kernel package.

    "},{"location":"ide_integration/jupyterlab/#configuring-jupyterlab","title":"Configuring JupyterLab","text":"

    To get started, create a Pixi project, add jupyterlab and pixi-kernel and then start JupyterLab:

    pixi init\npixi add jupyterlab pixi-kernel\npixi run jupyter lab\n

    This will start JupyterLab and open it in your browser.

    pixi-kernel searches for a manifest file, either pixi.toml or pyproject.toml, in the same directory as your notebook or in any parent directory. When it finds one, it will use the environment specified in the manifest file to start the kernel and run your notebooks.

    "},{"location":"ide_integration/jupyterlab/#binder","title":"Binder","text":"

    If you just want to check a JupyterLab environment running in the cloud using pixi-kernel, you can visit Binder.

    "},{"location":"ide_integration/pycharm/","title":"PyCharm Integration","text":"

    You can use PyCharm with pixi environments by using the conda shim provided by the pixi-pycharm package.

    "},{"location":"ide_integration/pycharm/#how-to-use","title":"How to use","text":"

    To get started, add pixi-pycharm to your pixi project.

    pixi add pixi-pycharm\n

    This will ensure that the conda shim is installed in your project's environment.

    Having pixi-pycharm installed, you can now configure PyCharm to use your pixi environments. Go to the Add Python Interpreter dialog (bottom right corner of the PyCharm window) and select Conda Environment. Set Conda Executable to the full path of the conda file (on Windows: conda.bat) which is located in .pixi/envs/default/libexec. You can get the path using the following command:

    Linux & macOS:
    pixi run 'echo $CONDA_PREFIX/libexec/conda'\n
    Windows:
    pixi run 'echo $CONDA_PREFIX\\\\libexec\\\\conda.bat'\n

    This is an executable that tricks PyCharm into thinking it's the proper conda executable. Under the hood it redirects all calls to the corresponding pixi equivalent.

    Use the conda shim from this pixi project

    Please make sure that this is the conda shim from this pixi project and not another one. If you use multiple pixi projects, you might have to adjust the path accordingly as PyCharm remembers the path to the conda executable.

    Having selected the environment, PyCharm will now use the Python interpreter from your pixi environment.

    PyCharm should now be able to show you the installed packages as well.

    You can now run your programs and tests as usual.

    Mark .pixi as excluded

    In order for PyCharm to not get confused about the .pixi directory, please mark it as excluded.

    Also, when using a remote interpreter, you should exclude the .pixi directory on the remote machine. Instead, you should run pixi install on the remote machine and select the conda shim from there.

    "},{"location":"ide_integration/pycharm/#multiple-environments","title":"Multiple environments","text":"

    If your project uses multiple environments to test different Python versions or dependencies, you can add multiple environments to PyCharm by specifying Use existing environment in the Add Python Interpreter dialog.

    You can then specify the corresponding environment in the bottom right corner of the PyCharm window.

    "},{"location":"ide_integration/pycharm/#multiple-pixi-projects","title":"Multiple pixi projects","text":"

    When using multiple pixi projects, remember to select the correct Conda Executable for each project as mentioned above. It might also come up that you have multiple environments with the same name.

    It is recommended to rename the environments to something unique.

    "},{"location":"ide_integration/pycharm/#debugging","title":"Debugging","text":"

    Logs are written to ~/.cache/pixi-pycharm.log. You can use them to debug problems. Please attach the logs when filing a bug report.

    "},{"location":"ide_integration/r_studio/","title":"Developing R scripts in RStudio","text":"

    You can use pixi to manage your R dependencies. The conda-forge channel contains a wide range of R packages that can be installed using pixi.

    "},{"location":"ide_integration/r_studio/#installing-r-packages","title":"Installing R packages","text":"

    R packages are usually prefixed with r- in the conda-forge channel. To install an R package, you can use the following command:

    pixi add r-<package-name>\n# for example\npixi add r-ggplot2\n
    "},{"location":"ide_integration/r_studio/#using-r-packages-in-rstudio","title":"Using R packages in RStudio","text":"

    To use the R packages installed by pixi in RStudio, you need to run rstudio from an activated environment. This can be achieved by running RStudio from pixi shell or from a task in the pixi.toml file.
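
    As a minimal sketch, assuming RStudio is installed system-wide and available as rstudio on your PATH, you can launch it from an activated shell:

    pixi shell\nrstudio\n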

    "},{"location":"ide_integration/r_studio/#full-example","title":"Full example","text":"

    The full example can be found here: RStudio example. Here is an example of a pixi.toml file that sets up an RStudio task:

    [project]\nname = \"r\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[target.linux.tasks]\nrstudio = \"rstudio\"\n\n[target.osx.tasks]\nrstudio = \"open -a rstudio\"\n# or alternatively with the full path:\n# rstudio = \"/Applications/RStudio.app/Contents/MacOS/RStudio\"\n\n[dependencies]\nr = \">=4.3,<5\"\nr-ggplot2 = \">=3.5.0,<3.6\"\n

    Once RStudio has loaded, you can execute the following R code that uses the ggplot2 package:

    # Load the ggplot2 package\nlibrary(ggplot2)\n\n# Load the built-in 'mtcars' dataset\ndata <- mtcars\n\n# Create a scatterplot of 'mpg' vs 'wt'\nggplot(data, aes(x = wt, y = mpg)) +\n  geom_point() +\n  labs(x = \"Weight (1000 lbs)\", y = \"Miles per Gallon\") +\n  ggtitle(\"Fuel Efficiency vs. Weight\")\n

    Note

    This example assumes that you have installed RStudio system-wide. We are working on updating RStudio as well as the R interpreter builds on Windows for maximum compatibility with pixi.

    "},{"location":"reference/cli/","title":"Commands","text":""},{"location":"reference/cli/#global-options","title":"Global options","text":"
    • --verbose (-v|vv|vvv): Increase the verbosity of the output messages; -v, -vv and -vvv increase the verbosity level respectively.
    • --help (-h) Shows help information, use -h to get the short version of the help.
    • --version (-V): shows the version of pixi that is used.
    • --quiet (-q): Decreases the amount of output.
    • --color <COLOR>: Whether the log needs to be colored [env: PIXI_COLOR=] [default: auto] [possible values: always, never, auto]. Pixi also honors the FORCE_COLOR and NO_COLOR environment variables. They both take precedence over --color and PIXI_COLOR.
    • --no-progress: Disables the progress bar. [env: PIXI_NO_PROGRESS] [default: false]
    "},{"location":"reference/cli/#init","title":"init","text":"

    This command is used to create a new project. It initializes a pixi.toml file and also prepares a .gitignore to prevent the environment from being added to git.

    It also supports the pyproject.toml file: if you have a pyproject.toml file in the directory where you run pixi init, it appends the pixi data to the pyproject.toml instead of creating a new pixi.toml file.
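
    As a sketch, the table pixi init appends to an existing pyproject.toml looks along these lines (the exact channels and platforms depend on your setup):

    [tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\"]\n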

    "},{"location":"reference/cli/#arguments","title":"Arguments","text":"
    1. [PATH]: Where to place the project (defaults to current path) [default: .]
    "},{"location":"reference/cli/#options","title":"Options","text":"
    • --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
    • --platform <PLATFORM> (-p): specify a platform that the project supports. (Allowed to be used more than once)
    • --import <ENV_FILE> (-i): Import an existing conda environment file, e.g. environment.yml.
    • --format <FORMAT>: Specify the format of the project file, either pyproject or pixi. [default: pixi]

    Importing an environment.yml

    When importing an environment, the pixi.toml will be created with the dependencies from the environment file. The pixi.lock will be created when you install the environment. We don't support git+ URLs as dependencies for pip packages, and for the defaults channel we use main, r and msys2 as the default channels.

    pixi init myproject\npixi init ~/myproject\npixi init  # Initializes directly in the current directory.\npixi init --channel conda-forge --channel bioconda myproject\npixi init --platform osx-64 --platform linux-64 myproject\npixi init --import environment.yml\npixi init --format pyproject\npixi init --format pixi\n
    "},{"location":"reference/cli/#add","title":"add","text":"

    Adds dependencies to the manifest file. It will only add the package if the package with its version constraint is able to work with the rest of the dependencies in the project. More info on multi-platform configuration.

    If the project manifest is a pyproject.toml, adding a pypi dependency will add it to the native pyproject project.dependencies array, or to the native project.optional-dependencies table if a feature is specified:

    • pixi add --pypi boto3 would add boto3 to the project.dependencies array
    • pixi add --pypi boto3 --feature aws would add boto3 to the project.optional-dependencies.aws array

    These dependencies will be read by pixi as if they had been added to the pixi pypi-dependencies tables of the default or a named feature.
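
    For illustration, after the two commands above the native pyproject.toml tables would contain entries along these lines (pixi may also write a version constraint, omitted here):

    [project]\ndependencies = [\"boto3\"]\n\n[project.optional-dependencies]\naws = [\"boto3\"]\n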

    "},{"location":"reference/cli/#arguments_1","title":"Arguments","text":"
    1. [SPECS]: The package(s) to add, space separated. The version constraint is optional.
    "},{"location":"reference/cli/#options_1","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --host: Specifies a host dependency, important for building a package.
    • --build: Specifies a build dependency, important for building a package.
    • --pypi: Specifies a PyPI dependency, not a conda package. Parses dependencies as PEP508 requirements, supporting extras and versions. See configuration for details.
    • --no-install: Don't install the package to the environment, only add the package to the lock-file.
    • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
    • --platform <PLATFORM> (-p): The platform for which the dependency should be added. (Allowed to be used more than once)
    • --feature <FEATURE> (-f): The feature for which the dependency should be added.
    • --editable: Specifies an editable dependency, only use in combination with --pypi.
    pixi add numpy # (1)!\npixi add numpy pandas \"pytorch>=1.8\" # (2)!\npixi add \"numpy>=1.22,<1.24\" # (3)!\npixi add --manifest-path ~/myproject/pixi.toml numpy # (4)!\npixi add --host \"python>=3.9.0\" # (5)!\npixi add --build cmake # (6)!\npixi add --platform osx-64 clang # (7)!\npixi add --no-install numpy # (8)!\npixi add --no-lockfile-update numpy # (9)!\npixi add --feature featurex numpy # (10)!\n\n# Add a pypi dependency\npixi add --pypi requests[security] # (11)!\npixi add --pypi Django==5.1rc1 # (12)!\npixi add --pypi \"boltons>=24.0.0\" --feature lint # (13)!\npixi add --pypi \"boltons @ https://files.pythonhosted.org/packages/46/35/e50d4a115f93e2a3fbf52438435bb2efcf14c11d4fcd6bdcd77a6fc399c9/boltons-24.0.0-py3-none-any.whl\" # (14)!\npixi add --pypi \"exchangelib @ git+https://github.com/ecederstrand/exchangelib\" # (15)!\npixi add --pypi \"project @ file:///absolute/path/to/project\" # (16)!\npixi add --pypi \"project@file:///absolute/path/to/project\" --editable # (17)!\n
    1. This will add the numpy package to the project with the latest available for the solved environment.
    2. This will add multiple packages to the project solving them all together.
    3. This will add the numpy package with the version constraint.
    4. This will add the numpy package to the project of the manifest file at the given path.
    5. This will add the python package as a host dependency. There is currently no different behavior for host dependencies.
    6. This will add the cmake package as a build dependency. There is currently no different behavior for build dependencies.
    7. This will add the clang package only for the osx-64 platform.
    8. This will add the numpy package to the manifest and lockfile, without installing it in an environment.
    9. This will add the numpy package to the manifest without updating the lockfile or installing it in the environment.
    10. This will add the numpy package in the feature featurex.
    11. This will add the requests package as pypi dependency with the security extra.
    12. This will add the pre-release version of Django to the project as a pypi dependency.
    13. This will add the boltons package in the feature lint as pypi dependency.
    14. This will add the boltons package with the given url as pypi dependency.
    15. This will add the exchangelib package with the given git url as pypi dependency.
    16. This will add the project package with the given file url as pypi dependency.
    17. This will add the project package with the given file url as an editable package as pypi dependency.

    Tip

    If you want to use a non default pinning strategy, you can set it using pixi's configuration.

    pixi config set pinning-strategy no-pin --global\n
    The default is semver, which will pin the dependencies to the latest major version, or to the minor version for v0 versions.

    "},{"location":"reference/cli/#install","title":"install","text":"

    Installs an environment based on the manifest file. If there is no pixi.lock file or it is not up-to-date with the manifest file, it will (re-)generate the lock file.

    pixi install only installs one environment at a time; if you have multiple environments, you can select the right one with the --environment flag. If you don't provide an environment, the default environment will be installed.

    Running pixi install is not required before running other commands, as all commands interacting with the environment will first run the install command if the environment is not ready, to make sure you always run in a correct state. E.g. pixi run, pixi shell, pixi shell-hook, pixi add, pixi remove, to name a few.

    "},{"location":"reference/cli/#options_2","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --environment <ENVIRONMENT> (-e): The environment to install, if none are provided the default environment will be used.
    pixi install\npixi install --manifest-path ~/myproject/pixi.toml\npixi install --frozen\npixi install --locked\npixi install --environment lint\npixi install -e lint\n
    "},{"location":"reference/cli/#update","title":"update","text":"

    The update command checks if there are newer versions of the dependencies and updates the pixi.lock file and environments accordingly. It will only update the lock file if the dependencies in the manifest file are still compatible with the new versions.

    "},{"location":"reference/cli/#arguments_2","title":"Arguments","text":"
    1. [PACKAGES]... The packages to update, space separated. If no packages are provided, all packages will be updated.
    "},{"location":"reference/cli/#options_3","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --environment <ENVIRONMENT> (-e): The environment to update; if none is provided, all the environments are updated.
    • --platform <PLATFORM> (-p): The platform for which the dependencies should be updated.
    • --dry-run (-n): Only show the changes that would be made, without actually updating the lock file or environment.
    • --no-install: Don't install the (solve) environment needed for solving pypi-dependencies.
    • --json: Output the changes in json format.
    pixi update numpy\npixi update numpy pandas\npixi update --manifest-path ~/myproject/pixi.toml numpy\npixi update --environment lint python\npixi update -e lint -e schema -e docs pre-commit\npixi update --platform osx-arm64 mlx\npixi update -p linux-64 -p osx-64 numpy\npixi update --dry-run\npixi update --no-install boto3\n
    "},{"location":"reference/cli/#run","title":"run","text":"

    The run command first checks if the environment is ready to use. If you didn't run pixi install, the run command will do that for you. The custom tasks defined in the manifest file are also available through the run command.

    You cannot run pixi run source setup.bash, as source is not available in the deno_task_shell commands and is not an executable.

    "},{"location":"reference/cli/#arguments_3","title":"Arguments","text":"
    1. [TASK]... The task you want to run in the project's environment; this can also be a normal command. All arguments after the task will be passed to the task.
    "},{"location":"reference/cli/#options_4","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --environment <ENVIRONMENT> (-e): The environment to run the task in, if none are provided the default environment will be used or a selector will be given to select the right environment.
    • --clean-env: Run the task in a clean environment; this will remove all environment variables of the shell environment except for the ones pixi sets. This doesn't work on Windows.
      pixi run python\npixi run cowpy \"Hey pixi user\"\npixi run --manifest-path ~/myproject/pixi.toml python\npixi run --frozen python\npixi run --locked python\n# If you have specified a custom task in the pixi.toml you can run it with run as well\npixi run build\n# Extra arguments will be passed to the tasks command.\npixi run task argument1 argument2\n\n# If you have multiple environments you can select the right one with the --environment flag.\npixi run --environment cuda python\n\n# THIS DOESN'T WORK ON WINDOWS\n# If you want to run a command in a clean environment you can use the --clean-env flag.\n# The PATH should only contain the pixi environment here.\npixi run --clean-env \"echo \\$PATH\"\n

    Info

    In pixi, deno_task_shell is the underlying runner of the run command. Check out their documentation for the syntax and available commands. This is done so that the run commands can be run across all platforms.
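
    For example, the following uses deno_task_shell syntax and therefore behaves the same on Windows, macOS, and Linux; the command itself is illustrative:

    pixi run \"echo building && echo done\"\n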

    Cross environment tasks

    If you're using the depends-on feature of the tasks, the tasks will be run in the order you specified them. depends-on can be used across environments, e.g. when you have this pixi.toml:

    pixi.toml
    [tasks]\nstart = { cmd = \"python start.py\", depends-on = [\"build\"] }\n\n[feature.build.tasks]\nbuild = \"cargo build\"\n[feature.build.dependencies]\nrust = \">=1.74\"\n\n[environments]\nbuild = [\"build\"]\n

    Then you're able to run the build from the build environment and start from the default environment, by only calling:

    pixi run start\n

    "},{"location":"reference/cli/#exec","title":"exec","text":"

    Runs a command in a temporary environment disconnected from any project. This can be useful to quickly test out a certain package or version.

    Temporary environments are cached. If the same command is run again, the same environment will be reused.

    Cleaning temporary environments

    Currently, temporary environments can only be cleaned up manually. Environments for pixi exec are stored under cached-envs-v0/ in the cache directory. Run pixi info to find the cache directory.

    "},{"location":"reference/cli/#arguments_4","title":"Arguments","text":"
    1. <COMMAND>: The command to run.
    "},{"location":"reference/cli/#options_5","title":"Options:","text":"
    • --spec <SPECS> (-s): Matchspecs of packages to install. If this is not provided, the package is guessed from the command.
    • --channel <CHANNELS> (-c): The channel to install the packages from. If not specified the default channel is used.
    • --force-reinstall: If specified, a new environment is always created, even if one already exists.
    pixi exec python\n\n# Add a constraint to the python version\npixi exec -s python=3.9 python\n\n# Run ipython and include the py-rattler package in the environment\npixi exec -s ipython -s py-rattler ipython\n\n# Force reinstall to recreate the environment and get the latest package versions\npixi exec --force-reinstall -s ipython -s py-rattler ipython\n
    "},{"location":"reference/cli/#remove","title":"remove","text":"

    Removes dependencies from the manifest file.

    If the project manifest is a pyproject.toml, removing a pypi dependency with the --pypi flag will remove it from either:

    • the native pyproject project.dependencies array, or the native project.optional-dependencies table if a feature is specified
    • the pixi pypi-dependencies tables of the default or a named feature (if a feature is specified)

    "},{"location":"reference/cli/#arguments_5","title":"Arguments","text":"
    1. <DEPS>...: List of dependencies you wish to remove from the project.
    "},{"location":"reference/cli/#options_6","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --host: Specifies a host dependency, important for building a package.
    • --build: Specifies a build dependency, important for building a package.
    • --pypi: Specifies a PyPI dependency, not a conda package.
    • --platform <PLATFORM> (-p): The platform from which the dependency should be removed.
    • --feature <FEATURE> (-f): The feature from which the dependency should be removed.
    • --no-install: Don't install the environment, only remove the package from the lock-file and manifest.
    • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
    pixi remove numpy\npixi remove numpy pandas pytorch\npixi remove --manifest-path ~/myproject/pixi.toml numpy\npixi remove --host python\npixi remove --build cmake\npixi remove --pypi requests\npixi remove --platform osx-64 --build clang\npixi remove --feature featurex clang\npixi remove --feature featurex --platform osx-64 clang\npixi remove --feature featurex --platform osx-64 --build clang\npixi remove --no-install numpy\n
    "},{"location":"reference/cli/#task","title":"task","text":"

    If you want to make a shorthand for a specific command you can add a task for it.

    "},{"location":"reference/cli/#options_7","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    "},{"location":"reference/cli/#task-add","title":"task add","text":"

    Add a task to the manifest file, use --depends-on to add tasks you want to run before this task, e.g. build before an execute task.

    "},{"location":"reference/cli/#arguments_6","title":"Arguments","text":"
    1. <NAME>: The name of the task.
    2. <COMMAND>: The command to run. This can be more than one word.

    Info

    If you are using $ for env variables they will be resolved before adding them to the task. If you want to use $ in the task you need to escape it with a \\, e.g. echo \\$HOME.

    "},{"location":"reference/cli/#options_8","title":"Options","text":"
    • --platform <PLATFORM> (-p): the platform for which this task should be added.
• --feature <FEATURE> (-f): the feature for which the task is added; if none is provided, the task is added to the default tasks table.
• --depends-on <DEPENDS_ON>: the task it depends on, which is run before the one you're adding.
    • --cwd <CWD>: the working directory for the task relative to the root of the project.
    • --env <ENV>: the environment variables as key=value pairs for the task, can be used multiple times, e.g. --env \"VAR1=VALUE1\" --env \"VAR2=VALUE2\".
    • --description <DESCRIPTION>: a description of the task.
    pixi task add cow cowpy \"Hello User\"\npixi task add tls ls --cwd tests\npixi task add test cargo t --depends-on build\npixi task add build-osx \"METAL=1 cargo build\" --platform osx-64\npixi task add train python train.py --feature cuda\npixi task add publish-pypi \"hatch publish --yes --repo main\" --feature build --env HATCH_CONFIG=config/hatch.toml --description \"Publish the package to pypi\"\n

    This adds the following to the manifest file:

    [tasks]\ncow = \"cowpy \\\"Hello User\\\"\"\ntls = { cmd = \"ls\", cwd = \"tests\" }\ntest = { cmd = \"cargo t\", depends-on = [\"build\"] }\n\n[target.osx-64.tasks]\nbuild-osx = \"METAL=1 cargo build\"\n\n[feature.cuda.tasks]\ntrain = \"python train.py\"\n\n[feature.build.tasks]\npublish-pypi = { cmd = \"hatch publish --yes --repo main\", env = { HATCH_CONFIG = \"config/hatch.toml\" }, description = \"Publish the package to pypi\" }\n

    Which you can then run with the run command:

    pixi run cow\n# Extra arguments will be passed to the tasks command.\npixi run test --test test1\n
    "},{"location":"reference/cli/#task-remove","title":"task remove","text":"

    Remove the task from the manifest file

    "},{"location":"reference/cli/#arguments_7","title":"Arguments","text":"
    • <NAMES>: The names of the tasks, space separated.
    "},{"location":"reference/cli/#options_9","title":"Options","text":"
    • --platform <PLATFORM> (-p): the platform for which this task is removed.
    • --feature <FEATURE> (-f): the feature for which the task is removed.
    pixi task remove cow\npixi task remove --platform linux-64 test\npixi task remove --feature cuda task\n
    "},{"location":"reference/cli/#task-alias","title":"task alias","text":"

    Create an alias for a task.

    "},{"location":"reference/cli/#arguments_8","title":"Arguments","text":"
    1. <ALIAS>: The alias name
    2. <DEPENDS_ON>: The names of the tasks you want to execute on this alias, order counts, first one runs first.
    "},{"location":"reference/cli/#options_10","title":"Options","text":"
    • --platform <PLATFORM> (-p): the platform for which this alias is created.
    pixi task alias test-all test-py test-cpp test-rust\npixi task alias --platform linux-64 test test-linux\npixi task alias moo cow\n
    "},{"location":"reference/cli/#task-list","title":"task list","text":"

    List all tasks in the project.

    "},{"location":"reference/cli/#options_11","title":"Options","text":"
• --environment (-e): the environment whose tasks to list; if none is provided, the default tasks will be listed.
• --summary (-s): list the tasks per environment.
    pixi task list\npixi task list --environment cuda\npixi task list --summary\n
    "},{"location":"reference/cli/#list","title":"list","text":"

    List project's packages. Highlighted packages are explicit dependencies.

    "},{"location":"reference/cli/#options_12","title":"Options","text":"
    • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform
    • --json: Whether to output in json format.
    • --json-pretty: Whether to output in pretty json format
    • --sort-by <SORT_BY>: Sorting strategy [default: name] [possible values: size, name, type]
    • --explicit (-x): Only list the packages that are explicitly added to the manifest file.
• --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
• --environment (-e): The environment's packages to list; if none is provided, the default environment's packages will be listed.
• --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --no-install: Don't install the environment for pypi solving, only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
    pixi list\npixi list --json-pretty\npixi list --explicit\npixi list --sort-by size\npixi list --platform win-64\npixi list --environment cuda\npixi list --frozen\npixi list --locked\npixi list --no-install\n

    Output will look like this, where python will be green as it is the package that was explicitly added to the manifest file:

    \u279c pixi list\n Package           Version     Build               Size       Kind   Source\n _libgcc_mutex     0.1         conda_forge         2.5 KiB    conda  _libgcc_mutex-0.1-conda_forge.tar.bz2\n _openmp_mutex     4.5         2_gnu               23.1 KiB   conda  _openmp_mutex-4.5-2_gnu.tar.bz2\n bzip2             1.0.8       hd590300_5          248.3 KiB  conda  bzip2-1.0.8-hd590300_5.conda\n ca-certificates   2023.11.17  hbcca054_0          150.5 KiB  conda  ca-certificates-2023.11.17-hbcca054_0.conda\n ld_impl_linux-64  2.40        h41732ed_0          688.2 KiB  conda  ld_impl_linux-64-2.40-h41732ed_0.conda\n libexpat          2.5.0       hcb278e6_1          76.2 KiB   conda  libexpat-2.5.0-hcb278e6_1.conda\n libffi            3.4.2       h7f98852_5          56.9 KiB   conda  libffi-3.4.2-h7f98852_5.tar.bz2\n libgcc-ng         13.2.0      h807b86a_4          755.7 KiB  conda  libgcc-ng-13.2.0-h807b86a_4.conda\n libgomp           13.2.0      h807b86a_4          412.2 KiB  conda  libgomp-13.2.0-h807b86a_4.conda\n libnsl            2.0.1       hd590300_0          32.6 KiB   conda  libnsl-2.0.1-hd590300_0.conda\n libsqlite         3.44.2      h2797004_0          826 KiB    conda  libsqlite-3.44.2-h2797004_0.conda\n libuuid           2.38.1      h0b41bf4_0          32.8 KiB   conda  libuuid-2.38.1-h0b41bf4_0.conda\n libxcrypt         4.4.36      hd590300_1          98 KiB     conda  libxcrypt-4.4.36-hd590300_1.conda\n libzlib           1.2.13      hd590300_5          60.1 KiB   conda  libzlib-1.2.13-hd590300_5.conda\n ncurses           6.4         h59595ed_2          863.7 KiB  conda  ncurses-6.4-h59595ed_2.conda\n openssl           3.2.0       hd590300_1          2.7 MiB    conda  openssl-3.2.0-hd590300_1.conda\n python            3.12.1      hab00c5b_1_cpython  30.8 MiB   conda  python-3.12.1-hab00c5b_1_cpython.conda\n readline          8.2         h8228510_1          274.9 KiB  conda  readline-8.2-h8228510_1.conda\n tk                8.6.13      noxft_h4845f30_101  3.2 MiB    conda  tk-8.6.13-noxft_h4845f30_101.conda\n tzdata            2023d       h0c530f3_0          116.8 KiB  conda  tzdata-2023d-h0c530f3_0.conda\n xz                5.2.6       h166bdaf_0          408.6 KiB  conda  xz-5.2.6-h166bdaf_0.tar.bz2\n
    "},{"location":"reference/cli/#tree","title":"tree","text":"

    Display the project's packages in a tree. Highlighted packages are those specified in the manifest.

The package tree can also be inverted (-i), to see which packages require a specific dependency.

    "},{"location":"reference/cli/#arguments_9","title":"Arguments","text":"
• REGEX: optional regex of which dependencies to filter the tree to, or which dependencies to start with when inverting the tree.
    "},{"location":"reference/cli/#options_13","title":"Options","text":"
• --invert (-i): Invert the dependency tree; that is, given a REGEX pattern that matches some packages, show all the packages that depend on those.
    • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform
• --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
• --environment (-e): The environment's packages to list; if none is provided, the default environment's packages will be listed.
• --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --no-install: Don't install the environment for pypi solving, only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
    pixi tree\npixi tree pre-commit\npixi tree -i yaml\npixi tree --environment docs\npixi tree --platform win-64\n

    Warning

    Use -v to show which pypi packages are not yet parsed correctly. The extras and markers parsing is still under development.

    Output will look like this, where direct packages in the manifest file will be green. Once a package has been displayed once, the tree won't continue to recurse through its dependencies (compare the first time python appears, vs the rest), and it will instead be marked with a star (*).

    Version numbers are colored by the package type, yellow for Conda packages and blue for PyPI.

    \u279c pixi tree\n\u251c\u2500\u2500 pre-commit v3.3.3\n\u2502   \u251c\u2500\u2500 cfgv v3.3.1\n\u2502   \u2502   \u2514\u2500\u2500 python v3.12.2\n\u2502   \u2502       \u251c\u2500\u2500 bzip2 v1.0.8\n\u2502   \u2502       \u251c\u2500\u2500 libexpat v2.6.2\n\u2502   \u2502       \u251c\u2500\u2500 libffi v3.4.2\n\u2502   \u2502       \u251c\u2500\u2500 libsqlite v3.45.2\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13\n\u2502   \u2502       \u251c\u2500\u2500 libzlib v1.2.13 (*)\n\u2502   \u2502       \u251c\u2500\u2500 ncurses v6.4.20240210\n\u2502   \u2502       \u251c\u2500\u2500 openssl v3.2.1\n\u2502   \u2502       \u251c\u2500\u2500 readline v8.2\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 ncurses v6.4.20240210 (*)\n\u2502   \u2502       \u251c\u2500\u2500 tk v8.6.13\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13 (*)\n\u2502   \u2502       \u2514\u2500\u2500 xz v5.2.6\n\u2502   \u251c\u2500\u2500 identify v2.5.35\n\u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n...\n\u2514\u2500\u2500 tbump v6.9.0\n...\n    \u2514\u2500\u2500 tomlkit v0.12.4\n        \u2514\u2500\u2500 python v3.12.2 (*)\n

    A regex pattern can be specified to filter the tree to just those that show a specific direct, or transitive dependency:

    \u279c pixi tree pre-commit\n\u2514\u2500\u2500 pre-commit v3.3.3\n    \u251c\u2500\u2500 virtualenv v20.25.1\n    \u2502   \u251c\u2500\u2500 filelock v3.13.1\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2\n    \u2502   \u2502       \u251c\u2500\u2500 libexpat v2.6.2\n    \u2502   \u2502       \u251c\u2500\u2500 readline v8.2\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 ncurses v6.4.20240210\n    \u2502   \u2502       \u251c\u2500\u2500 libsqlite v3.45.2\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13\n    \u2502   \u2502       \u251c\u2500\u2500 bzip2 v1.0.8\n    \u2502   \u2502       \u251c\u2500\u2500 libzlib v1.2.13 (*)\n    \u2502   \u2502       \u251c\u2500\u2500 libffi v3.4.2\n    \u2502   \u2502       \u251c\u2500\u2500 tk v8.6.13\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13 (*)\n    \u2502   \u2502       \u251c\u2500\u2500 xz v5.2.6\n    \u2502   \u2502       \u251c\u2500\u2500 ncurses v6.4.20240210 (*)\n    \u2502   \u2502       \u2514\u2500\u2500 openssl v3.2.1\n    \u2502   \u251c\u2500\u2500 platformdirs v4.2.0\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u2502   \u251c\u2500\u2500 distlib v0.3.8\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u251c\u2500\u2500 pyyaml v6.0.1\n...\n

    Additionally, the tree can be inverted, and it can show which packages depend on a regex pattern. The packages specified in the manifest will also be highlighted (in this case cffconvert and pre-commit would be).

    \u279c pixi tree -i yaml\n\nruamel.yaml v0.18.6\n\u251c\u2500\u2500 pykwalify v1.8.0\n\u2502   \u2514\u2500\u2500 cffconvert v2.0.0\n\u2514\u2500\u2500 cffconvert v2.0.0\n\npyyaml v6.0.1\n\u2514\u2500\u2500 pre-commit v3.3.3\n\nruamel.yaml.clib v0.2.8\n\u2514\u2500\u2500 ruamel.yaml v0.18.6\n    \u251c\u2500\u2500 pykwalify v1.8.0\n    \u2502   \u2514\u2500\u2500 cffconvert v2.0.0\n    \u2514\u2500\u2500 cffconvert v2.0.0\n\nyaml v0.2.5\n\u2514\u2500\u2500 pyyaml v6.0.1\n    \u2514\u2500\u2500 pre-commit v3.3.3\n
    "},{"location":"reference/cli/#shell","title":"shell","text":"

    This command starts a new shell in the project's environment. To exit the pixi shell, simply run exit.

    "},{"location":"reference/cli/#options_14","title":"Options","text":"
    • --change-ps1 <true or false>: When set to false, the (pixi) prefix in the shell prompt is removed (default: true). The default behavior can be configured globally.
• --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
• --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --environment <ENVIRONMENT> (-e): The environment to activate the shell in, if none are provided the default environment will be used or a selector will be given to select the right environment.
    pixi shell\nexit\npixi shell --manifest-path ~/myproject/pixi.toml\nexit\npixi shell --frozen\nexit\npixi shell --locked\nexit\npixi shell --environment cuda\nexit\n
    "},{"location":"reference/cli/#shell-hook","title":"shell-hook","text":"

    This command prints the activation script of an environment.

    "},{"location":"reference/cli/#options_15","title":"Options","text":"
    • --shell <SHELL> (-s): The shell for which the activation script should be printed. Defaults to the current shell. Currently supported variants: [bash, zsh, xonsh, cmd, powershell, fish, nushell]
• --manifest-path: the path to the manifest file; by default it searches for one in the parent directories.
• --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --environment <ENVIRONMENT> (-e): The environment to activate, if none are provided the default environment will be used or a selector will be given to select the right environment.
    • --json: Print all environment variables that are exported by running the activation script as JSON. When specifying this option, --shell is ignored.
    pixi shell-hook\npixi shell-hook --shell bash\npixi shell-hook --shell zsh\npixi shell-hook -s powershell\npixi shell-hook --manifest-path ~/myproject/pixi.toml\npixi shell-hook --frozen\npixi shell-hook --locked\npixi shell-hook --environment cuda\npixi shell-hook --json\n

    Example use-case, when you want to get rid of the pixi executable in a Docker container.

    pixi shell-hook --shell bash > /etc/profile.d/pixi.sh\nrm ~/.pixi/bin/pixi # Now the environment will be activated without the need for the pixi executable.\n
    "},{"location":"reference/cli/#search","title":"search","text":"

Search for a package; the output will list the latest version of the package.

    "},{"location":"reference/cli/#arguments_10","title":"Arguments","text":"
    1. <PACKAGE>: Name of package to search, it's possible to use wildcards (*).
    "},{"location":"reference/cli/#options_16","title":"Options","text":"
• --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
    • --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
    • --limit <LIMIT> (-l): optionally limit the number of search results
    • --platform <PLATFORM> (-p): specify a platform that you want to search for. (default: current platform)
    pixi search pixi\npixi search --limit 30 \"py*\"\n# search in a different channel and for a specific platform\npixi search -c robostack --platform linux-64 \"plotjuggler*\"\n
    "},{"location":"reference/cli/#self-update","title":"self-update","text":"

    Update pixi to the latest version or a specific version. If the pixi binary is not found in the default location (e.g. ~/.pixi/bin/pixi), pixi won't update to prevent breaking the current installation (Homebrew, etc.). The behaviour can be overridden with the --force flag

    "},{"location":"reference/cli/#options_17","title":"Options","text":"
    • --version <VERSION>: The desired version (to downgrade or upgrade to). Update to the latest version if not specified.
    • --force: Force the update even if the pixi binary is not found in the default location.
    pixi self-update\npixi self-update --version 0.13.0\npixi self-update --force\n
    "},{"location":"reference/cli/#info","title":"info","text":"

    Shows helpful information about the pixi installation, cache directories, disk usage, and more. More information here.

    "},{"location":"reference/cli/#options_18","title":"Options","text":"
• --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
    • --extended: extend the information with more slow queries to the system, like directory sizes.
    • --json: Get a machine-readable version of the information as output.
    pixi info\npixi info --json --extended\n
    "},{"location":"reference/cli/#clean","title":"clean","text":"

    Clean the parts of your system which are touched by pixi. Defaults to cleaning the environments and task cache. Use the cache subcommand to clean the cache

    "},{"location":"reference/cli/#options_19","title":"Options","text":"
• --manifest-path <MANIFEST_PATH>: the path to the manifest file; by default it searches for one in the parent directories.
    • --environment <ENVIRONMENT> (-e): The environment to clean, if none are provided all environments will be removed.
    pixi clean\n
    "},{"location":"reference/cli/#clean-cache","title":"clean cache","text":"

    Clean the pixi cache on your system.

    "},{"location":"reference/cli/#options_20","title":"Options","text":"
    • --pypi: Clean the pypi cache.
    • --conda: Clean the conda cache.
    • --yes: Skip the confirmation prompt.
    pixi clean cache # clean all pixi caches\npixi clean cache --pypi # clean only the pypi cache\npixi clean cache --conda # clean only the conda cache\npixi clean cache --yes # skip the confirmation prompt\n
    "},{"location":"reference/cli/#upload","title":"upload","text":"

    Upload a package to a prefix.dev channel

    "},{"location":"reference/cli/#arguments_11","title":"Arguments","text":"
    1. <HOST>: The host + channel to upload to.
    2. <PACKAGE_FILE>: The package file to upload.
    pixi upload https://prefix.dev/api/v1/upload/my_channel my_package.conda\n
    "},{"location":"reference/cli/#auth","title":"auth","text":"

    This command is used to authenticate the user's access to remote hosts such as prefix.dev or anaconda.org for private channels.

    "},{"location":"reference/cli/#auth-login","title":"auth login","text":"

    Store authentication information for given host.

    Tip

The host is the real hostname, not a channel.

    "},{"location":"reference/cli/#arguments_12","title":"Arguments","text":"
    1. <HOST>: The host to authenticate with.
    "},{"location":"reference/cli/#options_21","title":"Options","text":"
    • --token <TOKEN>: The token to use for authentication with prefix.dev.
    • --username <USERNAME>: The username to use for basic HTTP authentication
    • --password <PASSWORD>: The password to use for basic HTTP authentication.
    • --conda-token <CONDA_TOKEN>: The token to use on anaconda.org / quetz authentication.
    pixi auth login repo.prefix.dev --token pfx_JQEV-m_2bdz-D8NSyRSaAndHANx0qHjq7f2iD\npixi auth login anaconda.org --conda-token ABCDEFGHIJKLMNOP\npixi auth login https://myquetz.server --username john --password xxxxxx\n
    "},{"location":"reference/cli/#auth-logout","title":"auth logout","text":"

    Remove authentication information for a given host.

    "},{"location":"reference/cli/#arguments_13","title":"Arguments","text":"
    1. <HOST>: The host to authenticate with.
    pixi auth logout <HOST>\npixi auth logout repo.prefix.dev\npixi auth logout anaconda.org\n
    "},{"location":"reference/cli/#config","title":"config","text":"

    Use this command to manage the configuration.

    "},{"location":"reference/cli/#options_22","title":"Options","text":"
    • --system (-s): Specify management scope to system configuration.
    • --global (-g): Specify management scope to global configuration.
    • --local (-l): Specify management scope to local configuration.

Check out the pixi configuration for more information about the locations.

    "},{"location":"reference/cli/#config-edit","title":"config edit","text":"

    Edit the configuration file in the default editor.

    pixi config edit --system\npixi config edit --local\npixi config edit -g\n
    "},{"location":"reference/cli/#config-list","title":"config list","text":"

    List the configuration

    "},{"location":"reference/cli/#arguments_14","title":"Arguments","text":"
    1. [KEY]: The key to list the value of. (all if not provided)
    "},{"location":"reference/cli/#options_23","title":"Options","text":"
    • --json: Output the configuration in JSON format.
    pixi config list default-channels\npixi config list --json\npixi config list --system\npixi config list -g\n
    "},{"location":"reference/cli/#config-prepend","title":"config prepend","text":"

    Prepend a value to a list configuration key.

    "},{"location":"reference/cli/#arguments_15","title":"Arguments","text":"
    1. <KEY>: The key to prepend the value to.
    2. <VALUE>: The value to prepend.
    pixi config prepend default-channels conda-forge\n
    "},{"location":"reference/cli/#config-append","title":"config append","text":"

    Append a value to a list configuration key.

    "},{"location":"reference/cli/#arguments_16","title":"Arguments","text":"
    1. <KEY>: The key to append the value to.
    2. <VALUE>: The value to append.
    pixi config append default-channels robostack\npixi config append default-channels bioconda --global\n
    "},{"location":"reference/cli/#config-set","title":"config set","text":"

    Set a configuration key to a value.

    "},{"location":"reference/cli/#arguments_17","title":"Arguments","text":"
    1. <KEY>: The key to set the value of.
    2. [VALUE]: The value to set. (if not provided, the key will be removed)
    pixi config set default-channels '[\"conda-forge\", \"bioconda\"]'\npixi config set --global mirrors '{\"https://conda.anaconda.org/\": [\"https://prefix.dev/conda-forge\"]}'\npixi config set repodata-config.disable-zstd true --system\npixi config set --global detached-environments \"/opt/pixi/envs\"\npixi config set detached-environments false\n
    "},{"location":"reference/cli/#config-unset","title":"config unset","text":"

    Unset a configuration key.

    "},{"location":"reference/cli/#arguments_18","title":"Arguments","text":"
    1. <KEY>: The key to unset.
    pixi config unset default-channels\npixi config unset --global mirrors\npixi config unset repodata-config.disable-zstd --system\n
    "},{"location":"reference/cli/#global","title":"global","text":"

Global is the main entry point for the part of pixi that executes on the global (system) level.

    Tip

Binaries and environments installed globally are stored in ~/.pixi by default; this can be changed by setting the PIXI_HOME environment variable.
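For instance, a minimal sketch assuming a POSIX shell; the directory /opt/pixi-global is an arbitrary example location:

# install global tools into a custom location instead of ~/.pixi\nexport PIXI_HOME=/opt/pixi-global\npixi global install ruff\n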

    "},{"location":"reference/cli/#global-install","title":"global install","text":"

This command installs package(s) into their own environment and adds the binaries to PATH, allowing you to access them anywhere on your system without activating the environment.

    "},{"location":"reference/cli/#arguments_19","title":"Arguments","text":"

1. <PACKAGE>: The package(s) to install; this can also include a version constraint.

    "},{"location":"reference/cli/#options_24","title":"Options","text":"
    • --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
    • --platform <PLATFORM> (-p): specify a platform that you want to install the package for. (default: current platform)
    • --no-activation: Do not insert conda_prefix, path modifications, and activation script into the installed executable script.
    pixi global install ruff\n# multiple packages can be installed at once\npixi global install starship rattler-build\n# specify the channel(s)\npixi global install --channel conda-forge --channel bioconda trackplot\n# Or in a more concise form\npixi global install -c conda-forge -c bioconda trackplot\n\n# Support full conda matchspec\npixi global install python=3.9.*\npixi global install \"python [version='3.11.0', build_number=1]\"\npixi global install \"python [version='3.11.0', build=he550d4f_1_cpython]\"\npixi global install python=3.11.0=h10a6764_1_cpython\n\n# Install for a specific platform, only useful on osx-arm64\npixi global install --platform osx-64 ruff\n\n# Install without inserting activation code into the executable script\npixi global install ruff --no-activation\n

    Tip

    Running osx-64 on Apple Silicon will install the Intel binary but run it using Rosetta

    pixi global install --platform osx-64 ruff\n

    After using global install, you can use the package you installed anywhere on your system.

    "},{"location":"reference/cli/#global-list","title":"global list","text":"

This command shows the currently installed global environments, including the binaries that come with them. A globally installed package/environment may contain multiple binaries, and they will all be listed in the command output. Here is an example of a few installed packages:

    > pixi global list\nGlobal install location: /home/hanabi/.pixi\n\u251c\u2500\u2500 bat 0.24.0\n|   \u2514\u2500 exec: bat\n\u251c\u2500\u2500 conda-smithy 3.31.1\n|   \u2514\u2500 exec: feedstocks, conda-smithy\n\u251c\u2500\u2500 rattler-build 0.13.0\n|   \u2514\u2500 exec: rattler-build\n\u251c\u2500\u2500 ripgrep 14.1.0\n|   \u2514\u2500 exec: rg\n\u2514\u2500\u2500 uv 0.1.17\n    \u2514\u2500 exec: uv\n
    "},{"location":"reference/cli/#global-upgrade","title":"global upgrade","text":"

    This command upgrades a globally installed package (to the latest version by default).

    "},{"location":"reference/cli/#arguments_20","title":"Arguments","text":"
    1. <PACKAGE>: The package to upgrade.
    "},{"location":"reference/cli/#options_25","title":"Options","text":"
• --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. Note that the channel the package was installed from will always be used for the upgrade. (Allowed to be used more than once)
    • --platform <PLATFORM> (-p): specify a platform that you want to upgrade the package for. (default: current platform)
    pixi global upgrade ruff\npixi global upgrade --channel conda-forge --channel bioconda trackplot\n# Or in a more concise form\npixi global upgrade -c conda-forge -c bioconda trackplot\n\n# Conda matchspec is supported\n# You can specify the version to upgrade to when you don't want the latest version\n# or you can even use it to downgrade a globally installed package\npixi global upgrade python=3.10\n
    "},{"location":"reference/cli/#global-upgrade-all","title":"global upgrade-all","text":"

    This command upgrades all globally installed packages to their latest version.

    "},{"location":"reference/cli/#options_26","title":"Options","text":"
• --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. Note that the channel the package was installed from will always be used for the upgrade. (Allowed to be used more than once)
pixi global upgrade-all\npixi global upgrade-all --channel conda-forge --channel bioconda\n# Or in a more concise form\npixi global upgrade-all -c conda-forge -c bioconda\n
    "},{"location":"reference/cli/#global-remove","title":"global remove","text":"

    Removes a package previously installed into a globally accessible location via pixi global install

Use pixi global list to find out which package name belongs to the tool you want to remove.

    "},{"location":"reference/cli/#arguments_21","title":"Arguments","text":"
    1. <PACKAGE>: The package(s) to remove.
    pixi global remove pre-commit\n\n# multiple packages can be removed at once\npixi global remove pre-commit starship\n
    "},{"location":"reference/cli/#project","title":"project","text":"

    This subcommand allows you to modify the project configuration through the command line interface.

    "},{"location":"reference/cli/#options_27","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    "},{"location":"reference/cli/#project-channel-add","title":"project channel add","text":"

    Add channels to the channel list in the project configuration. When you add channels, the channels are tested for existence, added to the lock file and the environment is reinstalled.

    "},{"location":"reference/cli/#arguments_22","title":"Arguments","text":"
    1. <CHANNEL>: The channels to add, name or URL.
    "},{"location":"reference/cli/#options_28","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the channel is added.
    pixi project channel add robostack\npixi project channel add bioconda conda-forge robostack\npixi project channel add file:///home/user/local_channel\npixi project channel add https://repo.prefix.dev/conda-forge\npixi project channel add --no-install robostack\npixi project channel add --feature cuda nvidia\n
    "},{"location":"reference/cli/#project-channel-list","title":"project channel list","text":"

    List the channels in the manifest file

    "},{"location":"reference/cli/#options_29","title":"Options","text":"
• --urls: show the URLs of the channels instead of the names.
    $ pixi project channel list\nEnvironment: default\n- conda-forge\n\n$ pixi project channel list --urls\nEnvironment: default\n- https://conda.anaconda.org/conda-forge/\n
    "},{"location":"reference/cli/#project-channel-remove","title":"project channel remove","text":"

Remove channels from the channel list in the manifest file.

    "},{"location":"reference/cli/#arguments_23","title":"Arguments","text":"
    1. <CHANNEL>...: The channels to remove, name(s) or URL(s).
    "},{"location":"reference/cli/#options_30","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the channel is removed.
    pixi project channel remove conda-forge\npixi project channel remove https://conda.anaconda.org/conda-forge/\npixi project channel remove --no-install conda-forge\npixi project channel remove --feature cuda nvidia\n
    "},{"location":"reference/cli/#project-description-get","title":"project description get","text":"

    Get the project description.

    $ pixi project description get\nPackage management made easy!\n
    "},{"location":"reference/cli/#project-description-set","title":"project description set","text":"

    Set the project description.

    "},{"location":"reference/cli/#arguments_24","title":"Arguments","text":"
    1. <DESCRIPTION>: The description to set.
    pixi project description set \"my new description\"\n
    "},{"location":"reference/cli/#project-environment-add","title":"project environment add","text":"

    Add an environment to the manifest file.

    "},{"location":"reference/cli/#arguments_25","title":"Arguments","text":"
    1. <NAME>: The name of the environment to add.
    "},{"location":"reference/cli/#options_31","title":"Options","text":"
    • -f, --feature <FEATURES>: Features to add to the environment.
    • --solve-group <SOLVE_GROUP>: The solve-group to add the environment to.
    • --no-default-feature: Don't include the default feature in the environment.
    • --force: Update the manifest even if the environment already exists.
    pixi project environment add env1 --feature feature1 --feature feature2\npixi project environment add env2 -f feature1 --solve-group test\npixi project environment add env3 -f feature1 --no-default-feature\npixi project environment add env3 -f feature1 --force\n
    "},{"location":"reference/cli/#project-environment-remove","title":"project environment remove","text":"

    Remove an environment from the manifest file.

    "},{"location":"reference/cli/#arguments_26","title":"Arguments","text":"
    1. <NAME>: The name of the environment to remove.
    pixi project environment remove env1\n
    "},{"location":"reference/cli/#project-environment-list","title":"project environment list","text":"

    List the environments in the manifest file.

    pixi project environment list\n
    "},{"location":"reference/cli/#project-export-conda_environment","title":"project export conda_environment","text":"

    Exports a conda environment.yml file. The file can be used to create a conda environment using conda/mamba:

    pixi project export conda-environment environment.yml\nmamba create --name <env> --file environment.yml\n
    "},{"location":"reference/cli/#arguments_27","title":"Arguments","text":"
    1. <OUTPUT_PATH>: Optional path to render environment.yml to. Otherwise it will be printed to standard out.
    "},{"location":"reference/cli/#options_32","title":"Options","text":"
    • --environment <ENVIRONMENT> (-e): Environment to render.
    • --platform <PLATFORM> (-p): The platform to render.
    pixi project export conda-environment --environment lint\npixi project export conda-environment --platform linux-64 environment.linux-64.yml\n
    "},{"location":"reference/cli/#project-export-conda_explicit_spec","title":"project export conda_explicit_spec","text":"

Render a platform-specific conda explicit specification file for an environment. The file can then be used to create a conda environment using conda/mamba:

    mamba create --name <env> --file <explicit spec file>\n

    As the explicit specification file format does not support pypi-dependencies, use the --ignore-pypi-errors option to ignore those dependencies.

    "},{"location":"reference/cli/#arguments_28","title":"Arguments","text":"
    1. <OUTPUT_DIR>: Output directory for rendered explicit environment spec files.
    "},{"location":"reference/cli/#options_33","title":"Options","text":"
    • --environment <ENVIRONMENT> (-e): Environment to render. Can be repeated for multiple envs. Defaults to all environments.
    • --platform <PLATFORM> (-p): The platform to render. Can be repeated for multiple platforms. Defaults to all platforms available for selected environments.
    • --ignore-pypi-errors: PyPI dependencies are not supported in the conda explicit spec file. This flag allows creating the spec file even if PyPI dependencies are present.
    pixi project export conda_explicit_spec output\npixi project export conda_explicit_spec -e default -e test -p linux-64 output\n
    "},{"location":"reference/cli/#project-platform-add","title":"project platform add","text":"

Adds platform(s) to the manifest file and updates the lock file.

    "},{"location":"reference/cli/#arguments_29","title":"Arguments","text":"
    1. <PLATFORM>...: The platforms to add.
    "},{"location":"reference/cli/#options_34","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the platform will be added.
    pixi project platform add win-64\npixi project platform add --feature test win-64\n
    "},{"location":"reference/cli/#project-platform-list","title":"project platform list","text":"

    List the platforms in the manifest file.

    $ pixi project platform list\nosx-64\nlinux-64\nwin-64\nosx-arm64\n
    "},{"location":"reference/cli/#project-platform-remove","title":"project platform remove","text":"

Removes platform(s) from the manifest file and updates the lock file.

    "},{"location":"reference/cli/#arguments_30","title":"Arguments","text":"
    1. <PLATFORM>...: The platforms to remove.
    "},{"location":"reference/cli/#options_35","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the platform will be removed.
    pixi project platform remove win-64\npixi project platform remove --feature test win-64\n
    "},{"location":"reference/cli/#project-version-get","title":"project version get","text":"

    Get the project version.

    $ pixi project version get\n0.11.0\n
    "},{"location":"reference/cli/#project-version-set","title":"project version set","text":"

    Set the project version.

    "},{"location":"reference/cli/#arguments_31","title":"Arguments","text":"
    1. <VERSION>: The version to set.
    pixi project version set \"0.13.0\"\n
    "},{"location":"reference/cli/#project-version-majorminorpatch","title":"project version {major|minor|patch}","text":"

    Bump the project version to {MAJOR|MINOR|PATCH}.

    pixi project version major\npixi project version minor\npixi project version patch\n
    1. An up-to-date lock file means that the dependencies in the lock file are allowed by the dependencies in the manifest file. For example

      • a manifest with python = \">= 3.11\" is up-to-date with a name: python, version: 3.11.0 in the pixi.lock.
      • a manifest with python = \">= 3.12\" is not up-to-date with a name: python, version: 3.11.0 in the pixi.lock.

  Being up-to-date does not mean that the lock file holds the latest version available on the channel for the given dependency.

    "},{"location":"reference/pixi_configuration/","title":"The configuration of pixi itself","text":"

Apart from the project-specific configuration, pixi supports configuration options which are not required for the project to work but are local to the machine. The configuration is loaded in the following order:

Linux
1. /etc/pixi/config.toml: System-wide configuration
2. $XDG_CONFIG_HOME/pixi/config.toml: XDG compliant user-specific configuration
3. $HOME/.config/pixi/config.toml: User-specific configuration
4. $PIXI_HOME/config.toml: Global configuration in the user home directory. PIXI_HOME defaults to ~/.pixi
5. your_project/.pixi/config.toml: Project-specific configuration
6. Command line arguments (--tls-no-verify, --change-ps1=false, etc.): Configuration via command line arguments

macOS
1. /etc/pixi/config.toml: System-wide configuration
2. $XDG_CONFIG_HOME/pixi/config.toml: XDG compliant user-specific configuration
3. $HOME/Library/Application Support/pixi/config.toml: User-specific configuration
4. $PIXI_HOME/config.toml: Global configuration in the user home directory. PIXI_HOME defaults to ~/.pixi
5. your_project/.pixi/config.toml: Project-specific configuration
6. Command line arguments (--tls-no-verify, --change-ps1=false, etc.): Configuration via command line arguments

Windows
1. C:\ProgramData\pixi\config.toml: System-wide configuration
2. %APPDATA%\pixi\config.toml: User-specific configuration
3. $PIXI_HOME\config.toml: Global configuration in the user home directory. PIXI_HOME defaults to %USERPROFILE%/.pixi
4. your_project\.pixi\config.toml: Project-specific configuration
5. Command line arguments (--tls-no-verify, --change-ps1=false, etc.): Configuration via command line arguments

    Note

    The highest priority wins. If a configuration file is found in a higher priority location, the values from the configuration read from lower priority locations are overwritten.

    Note

    To find the locations where pixi looks for configuration files, run pixi with -vv.
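For example, running any subcommand with the repeated verbosity flag should print the configuration file locations that were considered:

pixi info -vv\n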

    "},{"location":"reference/pixi_configuration/#reference","title":"Reference","text":"Casing In Configuration

In versions of pixi 0.20.1 and older the global configuration used snake_case; we've since changed to kebab-case for consistency with the rest of the configuration. The old snake_case spelling is still supported for the following configuration options (an example follows the list):

    • default_channels
    • change_ps1
    • tls_no_verify
    • authentication_override_file
    • mirrors and sub-options
    • repodata-config and sub-options
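As an example, either spelling below configures the default channels; the kebab-case form is the current one:

config.toml
# old spelling (still accepted)\ndefault_channels = [\"conda-forge\"]\n# current spelling\ndefault-channels = [\"conda-forge\"]\n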

    The following reference describes all available configuration options.

    "},{"location":"reference/pixi_configuration/#default-channels","title":"default-channels","text":"

    The default channels to select when running pixi init or pixi global install. This defaults to only conda-forge. config.toml

    default-channels = [\"conda-forge\"]\n

    Note

    The default-channels are only used when initializing a new project. Once initialized the channels are used from the project manifest.

    "},{"location":"reference/pixi_configuration/#change-ps1","title":"change-ps1","text":"

    When set to false, the (pixi) prefix in the shell prompt is removed. This applies to the pixi shell subcommand. You can override this from the CLI with --change-ps1.

    config.toml
    change-ps1 = true\n
    "},{"location":"reference/pixi_configuration/#tls-no-verify","title":"tls-no-verify","text":"

    When set to true, the TLS certificates are not verified.

    Warning

    This is a security risk and should only be used for testing purposes or internal networks.

    You can override this from the CLI with --tls-no-verify.

    config.toml
    tls-no-verify = false\n
    "},{"location":"reference/pixi_configuration/#authentication-override-file","title":"authentication-override-file","text":"

    Override from where the authentication information is loaded. Usually, we try to use the keyring to load authentication data from, and only use a JSON file as a fallback. This option allows you to force the use of a JSON file. Read more in the authentication section. config.toml

    authentication-override-file = \"/path/to/your/override.json\"\n

    "},{"location":"reference/pixi_configuration/#detached-environments","title":"detached-environments","text":"

    The directory where pixi stores the project environments, what would normally be placed in the .pixi/envs folder in a project's root. It doesn't affect the environments built for pixi global. The location of environments created for a pixi global installation can be controlled using the PIXI_HOME environment variable.

    Warning

    We recommend against using this because any environment created for a project is no longer placed in the same folder as the project. This creates a disconnect between the project and its environments and manual cleanup of the environments is required when deleting the project.

    However, in some cases, this option can still be very useful, for instance to:

    • force the installation on a specific filesystem/drive.
    • install environments locally but keep the project on a network drive.
    • let a system-administrator have more control over all environments on a system.

    This field can consist of two types of input.

• A boolean value, true or false, which will enable or disable the feature respectively. (Not the quoted strings \"true\" or \"false\"; a quoted string is read as false.)
    • A string value, which will be the absolute path to the directory where the environments will be stored.

    config.toml

    detached-environments = true\n
    or: config.toml
    detached-environments = \"/opt/pixi/envs\"\n

    The environments will be stored in the cache directory when this option is true. When you specify a custom path the environments will be stored in that directory.

    The resulting directory structure will look like this: config.toml

    detached-environments = \"/opt/pixi/envs\"\n
    /opt/pixi/envs\n\u251c\u2500\u2500 pixi-6837172896226367631\n\u2502   \u2514\u2500\u2500 envs\n\u2514\u2500\u2500 NAME_OF_PROJECT-HASH_OF_ORIGINAL_PATH\n    \u251c\u2500\u2500 envs # the runnable environments\n    \u2514\u2500\u2500 solve-group-envs # If there are solve groups\n

    "},{"location":"reference/pixi_configuration/#pinning-strategy","title":"pinning-strategy","text":"

    The strategy to use for pinning dependencies when running pixi add. The default is semver but you can set the following:

• no-pin: No pinning, resulting in an unconstrained dependency: *.
    • semver: Pinning to the latest version that satisfies the semver constraint. Resulting in a pin to major for most versions and to minor for v0 versions.
    • exact-version: Pinning to the exact version, 1.2.3 -> ==1.2.3.
    • major: Pinning to the major version, 1.2.3 -> >=1.2.3, <2.
    • minor: Pinning to the minor version, 1.2.3 -> >=1.2.3, <1.3.
    • latest-up: Pinning to the latest version, 1.2.3 -> >=1.2.3.
    config.toml
    pinning-strategy = \"no-pin\"\n
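As a hypothetical illustration (package names and versions are invented), with the default semver strategy pixi add would be expected to write pins like the following into the [dependencies] table of pixi.toml:

[dependencies]\n# added while the latest foo release was 1.2.3 (pinned up to the next major)\nfoo = \">=1.2.3, <2\"\n# added while the latest bar release was 0.4.1 (v0 versions pin up to the next minor)\nbar = \">=0.4.1, <0.5\"\n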
    "},{"location":"reference/pixi_configuration/#mirrors","title":"mirrors","text":"

    Configuration for conda channel-mirrors, more info below.

    config.toml
    [mirrors]\n# redirect all requests for conda-forge to the prefix.dev mirror\n\"https://conda.anaconda.org/conda-forge\" = [\n    \"https://prefix.dev/conda-forge\"\n]\n\n# redirect all requests for bioconda to one of the three listed mirrors\n# Note: for repodata we try the first mirror first.\n\"https://conda.anaconda.org/bioconda\" = [\n    \"https://conda.anaconda.org/bioconda\",\n    # OCI registries are also supported\n    \"oci://ghcr.io/channel-mirrors/bioconda\",\n    \"https://prefix.dev/bioconda\",\n]\n
    "},{"location":"reference/pixi_configuration/#repodata-config","title":"repodata-config","text":"

    Configuration for repodata fetching. config.toml

    [repodata-config]\n# disable fetching of jlap, bz2 or zstd repodata files.\n# This should only be used for specific old versions of artifactory and other non-compliant\n# servers.\ndisable-jlap = true  # don't try to download repodata.jlap\ndisable-bzip2 = true # don't try to download repodata.json.bz2\ndisable-zstd = true  # don't try to download repodata.json.zst\n

    "},{"location":"reference/pixi_configuration/#pypi-config","title":"pypi-config","text":"

To set up a number of defaults for the usage of PyPI registries, you can use the following configuration options:

    • index-url: The default index URL to use for PyPI packages. This will be added to a manifest file on a pixi init.
    • extra-index-urls: A list of additional URLs to use for PyPI packages. This will be added to a manifest file on a pixi init.
    • keyring-provider: Allows the use of the keyring python package to store and retrieve credentials.
    config.toml
    [pypi-config]\n# Main index url\nindex-url = \"https://pypi.org/simple\"\n# list of additional urls\nextra-index-urls = [\"https://pypi.org/simple2\"]\n# can be \"subprocess\" or \"disabled\"\nkeyring-provider = \"subprocess\"\n

    index-url and extra-index-urls are not globals

Unlike pip, these settings, with the exception of keyring-provider, will only modify the pixi.toml/pyproject.toml file and are not globally interpreted when not present in the manifest. This is because we want to keep the manifest file as complete and reproducible as possible.
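As a sketch of what this means in practice, and assuming the manifest exposes the same option names under its pypi-options table, running pixi init with the configuration above would be expected to add something like this to the generated manifest:

[pypi-options]\nindex-url = \"https://pypi.org/simple\"\nextra-index-urls = [\"https://pypi.org/simple2\"]\n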

    "},{"location":"reference/pixi_configuration/#mirror-configuration","title":"Mirror configuration","text":"

    You can configure mirrors for conda channels. We expect that mirrors are exact copies of the original channel. The implementation will look for the mirror key (a URL) in the mirrors section of the configuration file and replace the original URL with the mirror URL.

    To also include the original URL, you have to repeat it in the list of mirrors.

    The mirrors are prioritized based on the order of the list. We attempt to fetch the repodata (the most important file) from the first mirror in the list. The repodata contains all the SHA256 hashes of the individual packages, so it is important to get this file from a trusted source.

    You can also specify mirrors for an entire \"host\", e.g.

    config.toml
    [mirrors]\n\"https://conda.anaconda.org\" = [\n    \"https://prefix.dev/\"\n]\n

This will forward all requests for channels on anaconda.org to prefix.dev. Channels that are not currently mirrored on prefix.dev will fail in the above example.

    "},{"location":"reference/pixi_configuration/#oci-mirrors","title":"OCI Mirrors","text":"

    You can also specify mirrors on the OCI registry. There is a public mirror on the Github container registry (ghcr.io) that is maintained by the conda-forge team. You can use it like this:

    config.toml
    [mirrors]\n\"https://conda.anaconda.org/conda-forge\" = [\n    \"oci://ghcr.io/channel-mirrors/conda-forge\"\n]\n

    The GHCR mirror also contains bioconda packages. You can search the available packages on Github.

    "},{"location":"reference/project_configuration/","title":"Configuration","text":"

    The pixi.toml is the pixi project configuration file, also known as the project manifest.

A TOML file is structured into different tables. This document will explain the usage of the different tables. For more technical documentation, check pixi on crates.io.

    Tip

We also support the pyproject.toml file. It has the same structure as the pixi.toml file, except that you need to prepend the tables with tool.pixi instead of just the table name. For example, the [project] table becomes [tool.pixi.project]. There are also some small extras that are available in the pyproject.toml file; check out the pyproject.toml documentation for more information.
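A minimal sketch of that prefixing in a pyproject.toml; the values are placeholders and the python requirement is only an example:

[tool.pixi.project]\nchannels = [\"conda-forge\"]\nname = \"project-name\"\nplatforms = [\"linux-64\"]\n\n[tool.pixi.dependencies]\npython = \">=3.11\"\n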

    "},{"location":"reference/project_configuration/#the-project-table","title":"The project table","text":"

    The minimally required information in the project table is:

    [project]\nchannels = [\"conda-forge\"]\nname = \"project-name\"\nplatforms = [\"linux-64\"]\n
    "},{"location":"reference/project_configuration/#name","title":"name","text":"

    The name of the project.

    name = \"project-name\"\n
    "},{"location":"reference/project_configuration/#channels","title":"channels","text":"

    This is a list that defines the channels used to fetch the packages from. If you want to use channels hosted on anaconda.org you only need to use the name of the channel directly.

    channels = [\"conda-forge\", \"robostack\", \"bioconda\", \"nvidia\", \"pytorch\"]\n

    Channels situated on the file system are also supported with absolute file paths:

    channels = [\"conda-forge\", \"file:///home/user/staged-recipes/build_artifacts\"]\n

    To access private or public channels on prefix.dev or Quetz use the url including the hostname:

    channels = [\"conda-forge\", \"https://repo.prefix.dev/channel-name\"]\n
    "},{"location":"reference/project_configuration/#platforms","title":"platforms","text":"

    Defines the list of platforms that the project supports. Pixi solves the dependencies for all these platforms and puts them in the lock file (pixi.lock).

    platforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n

    The available platforms are listed here: link

    Special macOS behavior

    macOS has two platforms: osx-64 for Intel Macs and osx-arm64 for Apple Silicon Macs. To support both, include both in your platforms list. Fallback: If osx-arm64 can't resolve, use osx-64. Running osx-64 on Apple Silicon uses Rosetta for Intel binaries.

    "},{"location":"reference/project_configuration/#version-optional","title":"version (optional)","text":"

    The version of the project. This should be a valid version based on the conda Version Spec. See the version documentation, for an explanation of what is allowed in a Version Spec.

    version = \"1.2.3\"\n
    "},{"location":"reference/project_configuration/#authors-optional","title":"authors (optional)","text":"

    This is a list of authors of the project.

    authors = [\"John Doe <j.doe@prefix.dev>\", \"Marie Curie <mss1867@gmail.com>\"]\n
    "},{"location":"reference/project_configuration/#description-optional","title":"description (optional)","text":"

    This should contain a short description of the project.

    description = \"A simple description\"\n
    "},{"location":"reference/project_configuration/#license-optional","title":"license (optional)","text":"

    The license as a valid SPDX string (e.g. MIT AND Apache-2.0)

    license = \"MIT\"\n
    "},{"location":"reference/project_configuration/#license-file-optional","title":"license-file (optional)","text":"

    Relative path to the license file.

    license-file = \"LICENSE.md\"\n
    "},{"location":"reference/project_configuration/#readme-optional","title":"readme (optional)","text":"

    Relative path to the README file.

    readme = \"README.md\"\n
    "},{"location":"reference/project_configuration/#homepage-optional","title":"homepage (optional)","text":"

    URL of the project homepage.

    homepage = \"https://pixi.sh\"\n
    "},{"location":"reference/project_configuration/#repository-optional","title":"repository (optional)","text":"

    URL of the project source repository.

    repository = \"https://github.com/prefix-dev/pixi\"\n
    "},{"location":"reference/project_configuration/#documentation-optional","title":"documentation (optional)","text":"

    URL of the project documentation.

    documentation = \"https://pixi.sh\"\n
    "},{"location":"reference/project_configuration/#conda-pypi-map-optional","title":"conda-pypi-map (optional)","text":"

A mapping of channel name or URL to the location of a mapping file, which can be a URL or path. The mapping file should be structured in JSON format as conda_name: pypi_package_name. Example:

    local/robostack_mapping.json
    {\n  \"jupyter-ros\": \"my-name-from-mapping\",\n  \"boltons\": \"boltons-pypi\"\n}\n

If conda-forge is not present in conda-pypi-map, pixi will use the prefix.dev mapping for it.

    conda-pypi-map = { \"conda-forge\" = \"https://example.com/mapping\", \"https://repo.prefix.dev/robostack\" = \"local/robostack_mapping.json\"}\n
    "},{"location":"reference/project_configuration/#channel-priority-optional","title":"channel-priority (optional)","text":"

    This is the setting for the priority of the channels in the solver step.

    Options:

• strict: Default. The channels are used in the order they are defined in the channels list. Only packages from the first channel that has the package are used. This ensures that different variants for a single package are not mixed from different channels. Using packages from different incompatible channels like conda-forge and main can lead to hard-to-debug ABI incompatibilities. We strongly recommend not switching the default.

• disabled: There is no priority; all package variants from all channels are grouped per package name and solved as one. Care should be taken when using this option. Since package variants can come from any channel in this mode, packages might not be compatible. This can cause hard-to-debug ABI incompatibilities. We strongly discourage using this option.

    channel-priority = \"disabled\"\n

    channel-priority = \"disabled\" is a security risk

    Disabling channel priority may lead to unpredictable dependency resolutions. This is a possible security risk as it may lead to packages being installed from unexpected channels. It's advisable to maintain the default strict setting and order channels thoughtfully. If necessary, specify a channel directly for a dependency.

    [project]\n# Putting conda-forge first solves most issues\nchannels = [\"conda-forge\", \"channel-name\"]\n[dependencies]\npackage = {version = \"*\", channel = \"channel-name\"}\n

    "},{"location":"reference/project_configuration/#the-tasks-table","title":"The tasks table","text":"

    Tasks are a way to automate certain custom commands in your project. For example, a lint or format step. Tasks in a pixi project are essentially cross-platform shell commands, with a unified syntax across platforms. For more in-depth information, check the Advanced tasks documentation. Pixi's tasks are run in a pixi environment using pixi run and are executed using the deno_task_shell.

    [tasks]\nsimple = \"echo This is a simple task\"\ncmd = { cmd=\"echo Same as a simple task but now more verbose\"}\ndepending = { cmd=\"echo run after simple\", depends-on=\"simple\"}\nalias = { depends-on=[\"depending\"]}\ndownload = { cmd=\"curl -o file.txt https://example.com/file.txt\" , outputs=[\"file.txt\"]}\nbuild = { cmd=\"npm build\", cwd=\"frontend\", inputs=[\"frontend/package.json\", \"frontend/*.js\"]}\nrun = { cmd=\"python run.py $ARGUMENT\", env={ ARGUMENT=\"value\" }}\nformat = { cmd=\"black $INIT_CWD\" } # runs black where you run pixi run format\nclean-env = { cmd = \"python isolated.py\", clean-env = true} # Only on Unix!\n

    You can modify this table using pixi task.

    Note

    Specify different tasks for different platforms using the target table

    Info

    If you want to hide a task from showing up with pixi task list or pixi info, you can prefix the name with _. For example, if you want to hide depending, you can rename it to _depending.
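
    As a small sketch of that rename, based on the tasks shown above:

    [tasks]\nsimple = \"echo This is a simple task\"\n_depending = { cmd=\"echo run after simple\", depends-on=\"simple\"}\n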

    "},{"location":"reference/project_configuration/#the-system-requirements-table","title":"The system-requirements table","text":"

    The system requirements are used to define minimal system specifications used during dependency resolution.

    For example, we can define a unix system with a specific minimal libc version.

    [system-requirements]\nlibc = \"2.28\"\n
    or make the project depend on a specific version of cuda:
    [system-requirements]\ncuda = \"12\"\n

    The options are:

    • linux: The minimal version of the linux kernel.
    • libc: The minimal version of the libc library. Also allows specifying the family of the libc library. e.g. libc = { family=\"glibc\", version=\"2.28\" }
    • macos: The minimal version of the macOS operating system.
    • cuda: The minimal version of the CUDA library.
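
    As an illustrative sketch combining several of these options (the version numbers are placeholders, not recommendations):

    [system-requirements]\nlinux = \"4.18\"\nlibc = { family=\"glibc\", version=\"2.28\" }\ncuda = \"12\"\n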

    More information in the system requirements documentation.

    "},{"location":"reference/project_configuration/#the-pypi-options-table","title":"The pypi-options table","text":"

    The pypi-options table is used to define options that are specific to PyPI registries. These options can be specified either at the root level, which adds them to the default feature, or at the feature level, which creates a union of these options when the features are included in the environment (see the sketch below the list of options).

    The options that can be defined are:

    • index-url: replaces the main index url.
    • extra-index-urls: adds an extra index url.
    • find-links: similar to --find-links option in pip.
    • no-build-isolation: disables build isolation, can only be set per package.
    • index-strategy: allows for specifying the index strategy to use.

    These options are explained in the sections below. Most of these options are taken directly, or with slight modifications, from the uv settings. If any option you need is missing, feel free to create an issue requesting it.
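
    As a sketch of the root-level versus feature-level form mentioned above (the feature name test and the URLs are illustrative):

    [pypi-options]\nindex-url = \"https://pypi.org/simple\"\n\n[feature.test.pypi-options]\nextra-index-urls = [\"https://example.com/simple\"]\n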

    "},{"location":"reference/project_configuration/#alternative-registries","title":"Alternative registries","text":"

    Strict Index Priority

    Unlike pip, pixi uses uv, which has a strict index priority. This means that the first index on which a package can be found is used. The order is determined by the order in the TOML file, with the extra-index-urls preferred over the index-url. Read more about this in the uv docs.

    Often you might want to use an alternative or extra index for your project. This can be done by adding the pypi-options table to your pixi.toml file; the following options are available:

    • index-url: replaces the main index url. If this is not set the default index used is https://pypi.org/simple. Only one index-url can be defined per environment.
    • extra-index-urls: adds extra index urls. The urls are used in the order they are defined and are preferred over the index-url. These are merged across features into an environment.
    • find-links: which can either be a path {path = './links'} or a url {url = 'https://example.com/links'}. This is similar to the --find-links option in pip. These are merged across features into an environment.

    An example:

    [pypi-options]\nindex-url = \"https://pypi.org/simple\"\nextra-index-urls = [\"https://example.com/simple\"]\nfind-links = [{path = './links'}]\n

    There are some examples in the pixi repository that make use of this feature.

    Authentication Methods

    To read about existing authentication methods for private registries, please check the PyPI Authentication section.

    "},{"location":"reference/project_configuration/#no-build-isolation","title":"No Build Isolation","text":"

    Even though build isolation is a good default, you can choose not to isolate the build for a certain package name, which allows the build to access the pixi environment. This is convenient if you want to use torch or something similar for your build process.

    [dependencies]\npytorch = \"2.4.0\"\n\n[pypi-options]\nno-build-isolation = [\"detectron2\"]\n\n[pypi-dependencies]\ndetectron2 = { git = \"https://github.com/facebookresearch/detectron2.git\", rev = \"5b72c27ae39f99db75d43f18fd1312e1ea934e60\"}\n

    Conda dependencies define the build environment

    To use no-build-isolation effectively, use conda dependencies to define the build environment. These are installed before the PyPI dependencies are resolved, so they are available during the build process. In the example above, adding torch as a PyPI dependency would be ineffective, as it would not yet be installed during the PyPI resolution phase.

    "},{"location":"reference/project_configuration/#index-strategy","title":"Index Strategy","text":"

    The strategy to use when resolving against multiple index URLs. Description modified from the uv documentation:

    By default, uv, and thus pixi, will stop at the first index on which a given package is available and limit resolution to the versions present on that first index (first-match). This prevents dependency confusion attacks, whereby an attacker could upload a malicious package under the same name to a secondary index.

    One index strategy per environment

    Only one index-strategy can be defined per environment or solve-group, otherwise, an error will be shown.

    "},{"location":"reference/project_configuration/#possible-values","title":"Possible values:","text":"
    • \"first-index\": Only use results from the first index that returns a match for a given package name
    • \"unsafe-first-match\": Search for every package name across all indexes, exhausting the versions from the first index before moving on to the next. Meaning if the package a is available on index x and y, it will prefer the version from x unless you've requested a package version that is only available on y.
    • \"unsafe-best-match\": Search for every package name across all indexes, preferring the best version found. If a package version is in multiple indexes, only look at the entry for the first index. So given index, x and y that both contain package a, it will take the best version from either x or y, but should that version be available on both indexes it will prefer x.

    PyPI only

    The index-strategy only changes PyPI package resolution and not conda package resolution.

    "},{"location":"reference/project_configuration/#the-dependencies-tables","title":"The dependencies table(s)","text":"

    This section defines what dependencies you would like to use for your project.

    There are multiple dependencies tables. The default is [dependencies], which are dependencies that are shared across platforms.

    Dependencies are defined using a VersionSpec. A VersionSpec combines a Version with an optional operator.

    Some examples are:

    # Use this exact package version\npackage0 = \"1.2.3\"\n# Use 1.2.3 up to 1.3.0\npackage1 = \"~=1.2.3\"\n# Use greater than 1.2 and lower than or equal to 1.4\npackage2 = \">1.2,<=1.4\"\n# Greater than or equal to 1.2.3, or lower than 1.0.0\npackage3 = \">=1.2.3|<1.0.0\"\n

    Dependencies can also be defined as a mapping, in which case a matchspec is used:

    package0 = { version = \">=1.2.3\", channel=\"conda-forge\" }\npackage1 = { version = \">=1.2.3\", build=\"py34_0\" }\n

    Tip

    The dependencies can be easily added using the pixi add command line. Running add for an existing dependency will replace it with the newest version it can use.

    Note

    To specify different dependencies for different platforms use the target table

    "},{"location":"reference/project_configuration/#dependencies","title":"dependencies","text":"

    Add any conda package dependency that you want to install into the environment. Don't forget to add the channel to the project table should you use anything other than conda-forge. Even if a dependency defines a channel, that channel should be added to the project.channels list.

    [dependencies]\npython = \">3.9,<=3.11\"\nrust = \"1.72\"\npytorch-cpu = { version = \"~=1.1\", channel = \"pytorch\" }\n
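
    For the pytorch channel used above, the project table would then also need to list that channel (a sketch, assuming conda-forge as the primary channel):

    [project]\nchannels = [\"conda-forge\", \"pytorch\"]\n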
    "},{"location":"reference/project_configuration/#pypi-dependencies","title":"pypi-dependencies","text":"Details regarding the PyPI integration

    We use uv, which is a new fast pip replacement written in Rust.

    We integrate uv as a library, so we use the uv resolver, to which we pass the conda packages as 'locked'. This prevents uv from installing these dependencies itself and ensures it uses the exact versions of these packages in the resolution. This is unique amongst conda-based package managers, which usually just call pip from a subprocess.

    The uv resolution is included in the lock file directly.

    Pixi directly supports depending on PyPI packages; the PyPA calls a distributed package a 'distribution'. There are source and binary distributions, both of which are supported by pixi. These distributions are installed into the environment after the conda environment has been resolved and installed. PyPI packages are not indexed on prefix.dev but can be viewed on pypi.org.

    Important considerations

    • Stability: PyPI packages might be less stable than their conda counterparts. Prefer using conda packages in the dependencies table where possible.
    "},{"location":"reference/project_configuration/#version-specification","title":"Version specification:","text":"

    These dependencies don't follow the conda matchspec specification. The version is a string specification of the version according to PEP 440/PyPA. Additionally, a list of extras can be included, which are essentially optional dependencies. Note that this version is distinct from the conda MatchSpec type. See the example below to see how this is used in practice:

    [dependencies]\n# When using pypi-dependencies, python is needed to resolve pypi dependencies\n# make sure to include this\npython = \">=3.6\"\n\n[pypi-dependencies]\nfastapi = \"*\"  # This means any version (the wildcard `*` is a pixi addition, not part of the specification)\npre-commit = \"~=3.5.0\" # This is a single version specifier\n# Using the toml map allows the user to add `extras`\npandas = { version = \">=1.0.0\", extras = [\"dataframe\", \"sql\"]}\n\n# git dependencies\n# With ssh\nflask = { git = \"ssh://git@github.com/pallets/flask\" }\n# With https and a specific revision\nrequests = { git = \"https://github.com/psf/requests.git\", rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\" }\n# TODO: will support later -> branch = '' or tag = '' to specify a branch or tag\n\n# You can also directly add a source dependency from a path, tip: keep this relative to the root of the project.\nminimal-project = { path = \"./minimal-project\", editable = true}\n\n# You can also use a direct url, to either a `.tar.gz` or `.zip`, or a `.whl` file\nclick = { url = \"https://github.com/pallets/click/releases/download/8.1.7/click-8.1.7-py3-none-any.whl\" }\n\n# You can also just use the default git repo, it will check out the default branch\npytest = { git = \"https://github.com/pytest-dev/pytest.git\"}\n
    "},{"location":"reference/project_configuration/#full-specification","title":"Full specification","text":"

    The full specification of a PyPI dependency that pixi supports can be split into the following fields:

    "},{"location":"reference/project_configuration/#extras","title":"extras","text":"

    A list of extras to install with the package, e.g. [\"dataframe\", \"sql\"]. The extras field works with all other version specifiers, as it is an addition to the version specifier.

    pandas = { version = \">=1.0.0\", extras = [\"dataframe\", \"sql\"]}\npytest = { git = \"URL\", extras = [\"dev\"]}\nblack = { url = \"URL\", extras = [\"cli\"]}\nminimal-project = { path = \"./minimal-project\", editable = true, extras = [\"dev\"]}\n
    "},{"location":"reference/project_configuration/#version","title":"version","text":"

    The version of the package to install, e.g. \">=1.0.0\" or *, which stands for any version (this is pixi specific). version is the default field, so using no inline table ({}) will default to this field.

    py-rattler = \"*\"\nruff = \"~=1.0.0\"\npytest = {version = \"*\", extras = [\"dev\"]}\n
    "},{"location":"reference/project_configuration/#git","title":"git","text":"

    A git repository to install from. This supports both https:// and ssh:// urls.

    Use git in combination with rev or subdirectory:

    • rev: A specific revision to install. e.g. rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\"
    • subdirectory: A subdirectory to install from. subdirectory = \"src\" or subdirectory = \"src/packagex\"
    # Note don't forget the `ssh://` or `https://` prefix!\npytest = { git = \"https://github.com/pytest-dev/pytest.git\"}\nrequests = { git = \"https://github.com/psf/requests.git\", rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\" }\npy-rattler = { git = \"ssh://git@github.com/mamba-org/rattler.git\", subdirectory = \"py-rattler\" }\n
    "},{"location":"reference/project_configuration/#path","title":"path","text":"

    A local path to install from, e.g. path = \"./path/to/package\". We would advise keeping your path projects inside the project, and using a relative path.

    Set editable to true to install in editable mode; this is highly recommended, as it is hard to reinstall if you're not using editable mode. e.g. editable = true

    minimal-project = { path = \"./minimal-project\", editable = true}\n
    "},{"location":"reference/project_configuration/#url","title":"url","text":"

    A URL to install a wheel or sdist directly from.

    pandas = {url = \"https://files.pythonhosted.org/packages/3d/59/2afa81b9fb300c90531803c0fd43ff4548074fa3e8d0f747ef63b3b5e77a/pandas-2.2.1.tar.gz\"}\n
    Did you know you can use: add --pypi?

    Use the --pypi flag with the add command to quickly add PyPI packages from the CLI. E.g. pixi add --pypi flask

    This does not support all the features of the pypi-dependencies table yet.

    "},{"location":"reference/project_configuration/#source-dependencies-sdist","title":"Source dependencies (sdist)","text":"

    The Source Distribution Format is a source-based format (sdist for short) that a package can include alongside the binary wheel format. Because these distributions need to be built, they need a Python executable to do this. This is why python needs to be present in the conda environment. Sdists usually depend on system packages to be built, especially when compiling C/C++ based Python bindings. Think for example of Python SDL2 bindings depending on the C library SDL2. To help build these dependencies we activate the conda environment that includes these pypi dependencies before resolving. This way, when a source distribution depends on gcc for example, it's used from the conda environment instead of the system.

    "},{"location":"reference/project_configuration/#host-dependencies","title":"host-dependencies","text":"

    This table contains dependencies that are needed to build your project but which should not be included when your project is installed as part of another project. In other words, these dependencies are available during the build but are no longer available when your project is installed. Dependencies listed in this table are installed for the architecture of the target machine.

    [host-dependencies]\npython = \"~=3.10.3\"\n

    Typical examples of host dependencies are:

    • Base interpreters: a Python package would list python here and an R package would list mro-base or r-base.
    • Libraries your project links against during compilation like openssl, rapidjson, or xtensor.
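
    A sketch combining both kinds of host dependencies from the list above (package versions are illustrative):

    [host-dependencies]\npython = \"3.10.*\"\nopenssl = \"*\"\nxtensor = \"*\"\n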
    "},{"location":"reference/project_configuration/#build-dependencies","title":"build-dependencies","text":"

    This table contains dependencies that are needed to build the project. Different from dependencies and host-dependencies, these packages are installed for the architecture of the build machine. This enables cross-compiling from one machine architecture to another.

    [build-dependencies]\ncmake = \"~=3.24\"\n

    Typical examples of build dependencies are:

    • Compilers are invoked on the build machine, but they generate code for the target machine. If the project is cross-compiled, the architecture of the build and target machine might differ.
    • cmake is invoked on the build machine to generate additional code- or project-files which are then included in the compilation process.

    Info

    The build target refers to the machine that will execute the build. Programs and libraries installed by these dependencies will be executed on the build machine.

    For example, if you compile on a MacBook with an Apple Silicon chip but target Linux x86_64 then your build platform is osx-arm64 and your host platform is linux-64.

    "},{"location":"reference/project_configuration/#the-activation-table","title":"The activation table","text":"

    The activation table is used for specialized activation operations that need to be run when the environment is activated.

    There are two types of activation operations a user can modify in the manifest:

    • scripts: A list of scripts that are run when the environment is activated.
    • env: A mapping of environment variables that are set when the environment is activated.

    These activation operations will be run before the pixi run and pixi shell commands.

    Note

    The activation operations are run by the system shell interpreter, as they run before an environment is available. This means that they run as cmd.exe on Windows and bash on Linux and macOS (Unix). Only .sh, .bash and .bat files are supported.

    The environment variables are set in the shell that runs the activation script, so take note when using e.g. $ (Unix) or % (Windows).

    If you have scripts or environment variables per platform, use the target table.

    [activation]\nscripts = [\"env_setup.sh\"]\nenv = { ENV_VAR = \"value\" }\n\n# To support windows platforms as well add the following\n[target.win-64.activation]\nscripts = [\"env_setup.bat\"]\n\n[target.linux-64.activation.env]\nENV_VAR = \"linux-value\"\n\n# You can also reference existing environment variables, but this has\n# to be done separately for unix-like operating systems and Windows\n[target.unix.activation.env]\nENV_VAR = \"$OTHER_ENV_VAR/unix-value\"\n\n[target.win.activation.env]\nENV_VAR = \"%OTHER_ENV_VAR%\\\\windows-value\"\n
    "},{"location":"reference/project_configuration/#the-target-table","title":"The target table","text":"

    The target table allows for platform-specific configuration, letting you define different sets of tasks or dependencies per platform.

    The target table is currently implemented for the following sub-tables:

    • activation
    • dependencies
    • tasks

    The target table is defined using [target.PLATFORM.SUB-TABLE], e.g. [target.linux-64.dependencies].

    The platform can be any of:

    • win, osx, linux or unix (unix matches linux and osx)
    • or any of the (more) specific target platforms, e.g. linux-64, osx-arm64

    The sub-table can be any of those specified above.

    To make it a bit more clear, let's look at an example below. Currently, pixi combines the top-level tables like dependencies with the target-specific ones into a single set, which, in the case of dependencies, can both add and overwrite dependencies. In the example below, cmake is used for all targets, but on osx-64 or osx-arm64 a different version of python will be selected.

    [dependencies]\ncmake = \"3.26.4\"\npython = \"3.10\"\n\n[target.osx.dependencies]\npython = \"3.11\"\n

    Here are some more examples:

    [target.win-64.activation]\nscripts = [\"setup.bat\"]\n\n[target.win-64.dependencies]\nmsmpi = \"~=10.1.1\"\n\n[target.win-64.build-dependencies]\nvs2022_win-64 = \"19.36.32532\"\n\n[target.win-64.tasks]\ntmp = \"echo $TEMP\"\n\n[target.osx-64.dependencies]\nclang = \">=16.0.6\"\n
    "},{"location":"reference/project_configuration/#the-feature-and-environments-tables","title":"The feature and environments tables","text":"

    The feature table allows you to define features that can be used to create different [environments]. The [environments] table allows you to define different environments. The design is explained in this design document.

    Simplest example
    [feature.test.dependencies]\npytest = \"*\"\n\n[environments]\ntest = [\"test\"]\n

    This will create an environment called test that has pytest installed.

    "},{"location":"reference/project_configuration/#the-feature-table","title":"The feature table","text":"

    The feature table allows you to define the following fields per feature.

    • dependencies: Same as the dependencies.
    • pypi-dependencies: Same as the pypi-dependencies.
    • pypi-options: Same as the pypi-options.
    • system-requirements: Same as the system-requirements.
    • activation: Same as the activation.
    • platforms: Same as the platforms. Unless overridden, the platforms of the feature will be those defined at project level.
    • channels: Same as the channels. Unless overridden, the channels of the feature will be those defined at project level.
    • channel-priority: Same as the channel-priority.
    • target: Same as the target.
    • tasks: Same as the tasks.

    These tables are all also available without the feature prefix. When used without the prefix, we call them the default feature. This is a protected name that you cannot use for your own feature.

    Cuda feature table example
    [feature.cuda]\nactivation = {scripts = [\"cuda_activation.sh\"]}\n# Results in:  [\"nvidia\", \"conda-forge\"] when the default is `conda-forge`\nchannels = [\"nvidia\"]\ndependencies = {cuda = \"x.y.z\", cudnn = \"12.0\"}\npypi-dependencies = {torch = \"==1.9.0\"}\nplatforms = [\"linux-64\", \"osx-arm64\"]\nsystem-requirements = {cuda = \"12\"}\ntasks = { warmup = \"python warmup.py\" }\ntarget.osx-arm64 = {dependencies = {mlx = \"x.y.z\"}}\n
    Cuda feature table example but written as separate tables
    [feature.cuda.activation]\nscripts = [\"cuda_activation.sh\"]\n\n[feature.cuda.dependencies]\ncuda = \"x.y.z\"\ncudnn = \"12.0\"\n\n[feature.cuda.pypi-dependencies]\ntorch = \"==1.9.0\"\n\n[feature.cuda.system-requirements]\ncuda = \"12\"\n\n[feature.cuda.tasks]\nwarmup = \"python warmup.py\"\n\n[feature.cuda.target.osx-arm64.dependencies]\nmlx = \"x.y.z\"\n\n# Channels and Platforms are not available as separate tables as they are implemented as lists\n[feature.cuda]\nchannels = [\"nvidia\"]\nplatforms = [\"linux-64\", \"osx-arm64\"]\n
    "},{"location":"reference/project_configuration/#the-environments-table","title":"The environments table","text":"

    The [environments] table allows you to define environments that are created using the features defined in the [feature] tables.

    The environments table is defined using the following fields:

    • features: The features that are included in the environment. Unless no-default-feature is set to true, the default feature is implicitly included in the environment.
    • solve-group: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional dependencies. For instance, when testing a production environment with additional test dependencies. These dependencies will then have the same version in all environments that share the same solve group, but the different environments contain different subsets of the solve group's dependency set.
    • no-default-feature: Whether to include the default feature in that environment. The default is false, meaning the default feature is included.

    Full environments table specification

    [environments]\ntest = {features = [\"test\"], solve-group = \"test\"}\nprod = {features = [\"prod\"], solve-group = \"test\"}\nlint = {features = [\"lint\"], no-default-feature = true}\n
    As shown in the example above, in the simplest of cases, it is possible to define an environment only by listing its features:

    Simplest example
    [environments]\ntest = [\"test\"]\n

    is equivalent to

    Simplest example expanded
    [environments]\ntest = {features = [\"test\"]}\n

    When an environment comprises several features (including the default feature):

    • The activation and tasks of the environment are the union of the activation and tasks of all its features.
    • The dependencies and pypi-dependencies of the environment are the union of the dependencies and pypi-dependencies of all its features. This means that if several features define a requirement for the same package, both requirements will be combined. Beware of conflicting requirements across features added to the same environment.
    • The system-requirements of the environment is the union of the system-requirements of all its features. If multiple features specify a requirement for the same system package, the highest version is chosen.
    • The channels of the environment is the union of the channels of all its features. Channel priorities can be specified in each feature, to ensure channels are considered in the right order in the environment.
    • The platforms of the environment is the intersection of the platforms of all its features. Be aware that the platforms supported by a feature (including the default feature) will be considered as the platforms defined at project level (unless overridden in the feature). This means that it is usually a good idea to set the project platforms to all platforms it can support across its environments.
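
    As a sketch of the platform intersection rule (the feature names and platform lists here are illustrative):

    [project]\nplatforms = [\"linux-64\", \"osx-arm64\", \"win-64\"]\n\n[feature.cuda]\nplatforms = [\"linux-64\", \"win-64\"]\n\n[feature.mlx]\nplatforms = [\"linux-64\", \"osx-arm64\"]\n\n[environments]\n# This environment only supports linux-64, the intersection of the two features' platform lists\ngpu = [\"cuda\", \"mlx\"]\n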

    "},{"location":"reference/project_configuration/#global-configuration","title":"Global configuration","text":"

    The global configuration options are documented in the global configuration section.

    "},{"location":"switching_from/conda/","title":"Transitioning from the conda or mamba to pixi","text":"

    Welcome to the guide designed to ease your transition from conda or mamba to pixi. This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. With pixi, you'll experience a project-based workflow, enhancing your development process, and allowing for easy sharing of your work.

    "},{"location":"switching_from/conda/#why-pixi","title":"Why Pixi?","text":"

    Pixi builds upon the foundation of the conda ecosystem, introducing a project-centric approach rather than focusing solely on environments. This shift towards projects offers a more organized and efficient way to manage dependencies and run code, tailored to modern development practices.

    "},{"location":"switching_from/conda/#key-differences-at-a-glance","title":"Key Differences at a Glance","text":"Task Conda/Mamba Pixi Installation Requires an installer Download and add to path (See installation) Creating an Environment conda create -n myenv -c conda-forge python=3.8 pixi init myenv followed by pixi add python=3.8 Activating an Environment conda activate myenv pixi shell within the project directory Deactivating an Environment conda deactivate exit from the pixi shell Running a Task conda run -n myenv python my_program.py pixi run python my_program.py (See run) Installing a Package conda install numpy pixi add numpy Uninstalling a Package conda remove numpy pixi remove numpy

    No base environment

    Conda has a base environment, which is the default environment when you start a new shell. Pixi does not have a base environment and requires you to install the tools you need in the project or globally. Using pixi global install bat will install bat in a global environment, which is not the same as the base environment in conda.

    Activating pixi environment in the current shell

    For some advanced use-cases, you can activate the environment in the current shell. This uses pixi shell-hook, which prints the activation script that can be used to activate the environment in the current shell without pixi itself.

    ~/myenv > eval \"$(pixi shell-hook)\"\n

    "},{"location":"switching_from/conda/#environment-vs-project","title":"Environment vs Project","text":"

    Conda and mamba focus on managing environments, while pixi emphasizes projects. In pixi, a project is a folder containing a manifest (pixi.toml/pyproject.toml) file that describes the project, a pixi.lock lock-file that describes the exact dependencies, and a .pixi folder that contains the environment.

    This project-centric approach allows for easy sharing and collaboration, as the project folder contains all the necessary information to recreate the environment. It manages more than one environment for more than one platform in a single project, and allows for easy switching between them. (See multiple environments)

    "},{"location":"switching_from/conda/#global-environments","title":"Global environments","text":"

    conda installs all environments in one global location. If this is important to you for filesystem reasons, you can use the detached-environments feature of pixi.

    pixi config set detached-environments true\n# or a specific location\npixi config set detached-environments /path/to/envs\n
    This doesn't allow you to activate the environments using pixi shell -n but it will make the installation of the environments go to the same folder.

    pixi does have the pixi global command to install tools on your machine. (See global) This is not a replacement for conda but works the same as pipx and condax. It creates a single isolated environment for the given requirement and installs the binaries into the global path.

    pixi global install bat\nbat pixi.toml\n

    Never install pip with pixi global

    Installations with pixi global get their own isolated environment. Installing pip with pixi global will create a new isolated environment with its own pip binary. Using that pip binary will install packages into that isolated environment, making them unreachable from anywhere as you can't activate it.
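
    If you need pip, add it to a project's dependencies instead, for example (a minimal sketch):

    [dependencies]\npip = \"*\"\n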

    "},{"location":"switching_from/conda/#automated-switching","title":"Automated switching","text":"

    With pixi you can import environment.yml files into a pixi project. (See import)

    pixi init --import environment.yml\n
    This will create a new project with the dependencies from the environment.yml file.

    Exporting your environment

    If you are working with Conda users or systems, you can export your environment to an environment.yml file to share it.

    pixi project export conda\n
    Additionally, you can export a conda explicit specification.

    "},{"location":"switching_from/conda/#troubleshooting","title":"Troubleshooting","text":"

    Encountering issues? Here are solutions to some common problems you might run into when you're used to the conda workflow:

    • Dependency is excluded because due to strict channel priority not using this option from: 'https://conda.anaconda.org/conda-forge/' This error occurs when the package is in multiple channels. pixi uses a strict channel priority. See channel priority for more information.
    • pixi global install pip, pip doesn't work. pip is installed in the global isolated environment. Use pixi add pip in a project to install pip in the project environment and use that project.
    • pixi global install <Any Library> -> import <Any Library> -> ModuleNotFoundError: No module named '<Any Library>' The library is installed in the global isolated environment. Use pixi add <Any Library> in a project to install the library in the project environment and use that project.
    "},{"location":"switching_from/poetry/","title":"Transitioning from poetry to pixi","text":"

    Welcome to the guide designed to ease your transition from poetry to pixi. This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. With pixi, you'll experience a project-based workflow similar to poetry while including the conda ecosystem and allowing for easy sharing of your work.

    "},{"location":"switching_from/poetry/#why-pixi","title":"Why Pixi?","text":"

    In the Python ecosystem, Poetry is most likely the closest tool to pixi in terms of project management. On top of the PyPI ecosystem, pixi adds the power of the conda ecosystem, allowing for more flexible and powerful environment management.

    "},{"location":"switching_from/poetry/#quick-look-at-the-differences","title":"Quick look at the differences","text":"Task Poetry Pixi Creating an Environment poetry new myenv pixi init myenv Running a Task poetry run which python pixi run which python pixi uses a built-in cross platform shell for run where poetry uses your shell. Installing a Package poetry add numpy pixi add numpy adds the conda variant. pixi add --pypi numpy adds the PyPI variant. Uninstalling a Package poetry remove numpy pixi remove numpy removes the conda variant. pixi remove --pypi numpy removes the PyPI variant. Building a package poetry build We've yet to implement package building and publishing Publishing a package poetry publish We've yet to implement package building and publishing Reading the pyproject.toml [tool.poetry] [tool.pixi] Defining dependencies [tool.poetry.dependencies] [tool.pixi.dependencies] for conda, [tool.pixi.pypi-dependencies] or [project.dependencies] for PyPI dependencies Dependency definition - numpy = \"^1.2.3\"- numpy = \"~1.2.3\"- numpy = \"*\" - numpy = \">=1.2.3 <2.0.0\"- numpy = \">=1.2.3 <1.3.0\"- numpy = \"*\" Lock file poetry.lock pixi.lock Environment directory ~/.cache/pypoetry/virtualenvs/myenv ./.pixi Defaults to the project folder, move this using the detached-environments"},{"location":"switching_from/poetry/#support-both-poetry-and-pixi-in-my-project","title":"Support both poetry and pixi in my project","text":"

    You can allow users to use poetry and pixi in the same project; they will not touch each other's parts of the configuration or system. It's best to duplicate the dependencies, basically making an exact copy of tool.poetry.dependencies into tool.pixi.pypi-dependencies. Make sure that python is only defined in tool.pixi.dependencies and not in tool.pixi.pypi-dependencies, as sketched below.
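
    A sketch of what that duplication could look like in pyproject.toml (the package names and version ranges are illustrative):

    [tool.poetry.dependencies]\npython = \"^3.11\"\nnumpy = \"^1.2.3\"\n\n[tool.pixi.dependencies]\n# python lives only in the conda dependencies\npython = \"3.11.*\"\n\n[tool.pixi.pypi-dependencies]\n# an exact copy of the poetry dependencies, minus python\nnumpy = \">=1.2.3,<2.0.0\"\n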

    Mixing pixi and poetry

    It's possible to use poetry in pixi environments but this is advised against. Pixi supports PyPI dependencies in a different way than poetry does, and mixing them can lead to unexpected behavior. As you can only use one package manager at a time, it's best to stick to one.

    If using poetry on top of a pixi project, you'll always need to install the poetry environment after the pixi environment, and let pixi handle the python and poetry installation.

    "},{"location":"tutorials/python/","title":"Tutorial: Doing Python development with Pixi","text":"

    In this tutorial, we will show you how to create a simple Python project with pixi. We will show some of the features that pixi provides that are currently not a part of pdm, poetry, etc.

    "},{"location":"tutorials/python/#why-is-this-useful","title":"Why is this useful?","text":"

    Pixi builds upon the conda ecosystem, which allows you to create a Python environment with all the dependencies you need. This is especially useful when you are working with multiple Python interpreters and bindings to C and C++ libraries. For example, GDAL from PyPI does not have binary C dependencies, but the conda package does. On the other hand, some packages are only available through PyPI, which pixi can also install for you. Best of both worlds, let's give it a go!

    "},{"location":"tutorials/python/#pixitoml-and-pyprojecttoml","title":"pixi.toml and pyproject.toml","text":"

    We support two manifest formats: pyproject.toml and pixi.toml. In this tutorial, we will use the pyproject.toml format because it is the most common format for Python projects.

    "},{"location":"tutorials/python/#lets-get-started","title":"Let's get started","text":"

    Let's start out by making a directory and creating a new pyproject.toml file.

    pixi init pixi-py --format pyproject\n

    This gives you the following pyproject.toml:

    [project]\nname = \"pixi-py\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [{name = \"Tim de Jager\", email = \"tim@prefix.dev\"}]\nrequires-python = \">= 3.11\"\ndependencies = []\n\n[build-system]\nbuild-backend = \"hatchling.build\"\nrequires = [\"hatchling\"]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"osx-arm64\"]\n\n[tool.pixi.pypi-dependencies]\npixi-py = { path = \".\", editable = true }\n\n[tool.pixi.tasks]\n

    Let's add the Python project to the tree:

    Linux & macOSWindows
    cd pixi-py # move into the project directory\nmkdir pixi_py\ntouch pixi_py/__init__.py\n
    cd pixi-py\nmkdir pixi_py\ntype nul > pixi_py\\__init__.py\n

    We now have the following directory structure:

    .\n\u251c\u2500\u2500 pixi_py\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 __init__.py\n\u2514\u2500\u2500 pyproject.toml\n

    We've used a flat-layout here but pixi supports both flat- and src-layouts.

    "},{"location":"tutorials/python/#whats-in-the-pyprojecttoml","title":"What's in the pyproject.toml?","text":"

    Okay, so let's have a look at what sections have been added and how we can modify the pyproject.toml.

    These first entries were added to the pyproject.toml file:

    # Main pixi entry\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\n# This is your machine platform by default\nplatforms = [\"osx-arm64\"]\n

    The channels and platforms are added to the [tool.pixi.project] section. Channels like conda-forge manage packages similar to PyPI but allow for different packages across languages. The keyword platforms determines what platform the project supports.

    The pixi_py package itself is added as an editable dependency. This means that the package is installed in editable mode, so you can make changes to the package and see the changes reflected in the environment, without having to re-install the environment.

    # Editable installs\n[tool.pixi.pypi-dependencies]\npixi-py = { path = \".\", editable = true }\n

    In pixi, unlike other package managers, this is explicitly stated in the pyproject.toml file. The main reason is so that you can choose which environment this package should be included in.

    "},{"location":"tutorials/python/#managing-both-conda-and-pypi-dependencies-in-pixi","title":"Managing both conda and PyPI dependencies in pixi","text":"

    Our projects usually depend on other packages.

    $ pixi add black\nAdded black\n

    This will result in the following addition to the pyproject.toml:

    # Dependencies\n[tool.pixi.dependencies]\nblack = \">=24.4.2,<24.5\"\n

    But we can also be strict about the version that should be used with pixi add black=24, resulting in

    [tool.pixi.dependencies]\nblack = \"24.*\"\n

    Now, let's add some optional dependencies:

    pixi add --pypi --feature test pytest\n

    Which results in the following fields added to the pyproject.toml:

    [project.optional-dependencies]\ntest = [\"pytest\"]\n

    After we have added the optional dependencies to the pyproject.toml, pixi automatically creates a feature, which can contain a collection of dependencies, tasks, channels, and more.

    Sometimes there are packages that aren't available on conda channels but are published on PyPI. We can add these as well, which pixi will solve together with the default dependencies.

    $ pixi add black --pypi\nAdded black\nAdded these as pypi-dependencies.\n

    which results in the addition to the dependencies key in the pyproject.toml

    dependencies = [\"black\"]\n

    When using pypi-dependencies, you can make use of the optional-dependencies that other packages make available. For example, black makes the cli extra available, which can be added with the --pypi keyword:

    $ pixi add black[cli] --pypi\nAdded black[cli]\nAdded these as pypi-dependencies.\n

    which updates the dependencies entry to

    dependencies = [\"black[cli]\"]\n
    Optional dependencies in pixi.toml

    This tutorial focuses on the use of the pyproject.toml, but in case you're curious, the pixi.toml would contain the following entry after the installation of a PyPI package including an optional dependency:

    [pypi-dependencies]\nblack = { version = \"*\", extras = [\"cli\"] }\n

    "},{"location":"tutorials/python/#installation-pixi-install","title":"Installation: pixi install","text":"

    Now let's install the project with pixi install:

    $ pixi install\n\u2714 Project in /path/to/pixi-py is ready to use!\n

    We now have a new directory called .pixi in the project root. This directory contains the environment that was created when we ran pixi install. The environment is a conda environment that contains the dependencies that we specified in the pyproject.toml file. We can also install the test environment with pixi install -e test. We can use these environments for executing code.

    We also have a new file called pixi.lock in the project root. This file contains the exact versions of the dependencies that were installed in the environment across platforms.

    "},{"location":"tutorials/python/#whats-in-the-environment","title":"What's in the environment?","text":"

    Using pixi list, you can see what's in the environment; this is essentially a nicer view on the lock file:

    $ pixi list\nPackage          Version       Build               Size       Kind   Source\nbzip2            1.0.8         h93a5062_5          119.5 KiB  conda  bzip2-1.0.8-h93a5062_5.conda\nblack            24.4.2                            3.8 MiB    pypi   black-24.4.2-cp312-cp312-win_amd64.http.whl\nca-certificates  2024.2.2      hf0a4a13_0          152.1 KiB  conda  ca-certificates-2024.2.2-hf0a4a13_0.conda\nlibexpat         2.6.2         hebf3989_0          62.2 KiB   conda  libexpat-2.6.2-hebf3989_0.conda\nlibffi           3.4.2         h3422bc3_5          38.1 KiB   conda  libffi-3.4.2-h3422bc3_5.tar.bz2\nlibsqlite        3.45.2        h091b4b1_0          806 KiB    conda  libsqlite-3.45.2-h091b4b1_0.conda\nlibzlib          1.2.13        h53f4e23_5          47 KiB     conda  libzlib-1.2.13-h53f4e23_5.conda\nncurses          6.4.20240210  h078ce10_0          801 KiB    conda  ncurses-6.4.20240210-h078ce10_0.conda\nopenssl          3.2.1         h0d3ecfb_1          2.7 MiB    conda  openssl-3.2.1-h0d3ecfb_1.conda\npython           3.12.3        h4a7b5fc_0_cpython  12.6 MiB   conda  python-3.12.3-h4a7b5fc_0_cpython.conda\nreadline         8.2           h92ec313_1          244.5 KiB  conda  readline-8.2-h92ec313_1.conda\ntk               8.6.13        h5083fa2_1          3 MiB      conda  tk-8.6.13-h5083fa2_1.conda\ntzdata           2024a         h0c530f3_0          117 KiB    conda  tzdata-2024a-h0c530f3_0.conda\npixi-py          0.1.0                                        pypi   . (editable)\nxz               5.2.6         h57fd34a_0          230.2 KiB  conda  xz-5.2.6-h57fd34a_0.tar.bz2\n

    Python

    The Python interpreter is also installed in the environment. This is because the Python interpreter version is read from the requires-python field in the pyproject.toml file. This is used to determine the Python version to install in the environment. This way, pixi automatically manages/bootstraps the Python interpreter for you, so no more brew, apt or other system install steps.

    Here, you can see the different conda and PyPI packages listed. As you can see, the pixi-py package that we are working on is installed in editable mode. Every environment in pixi is isolated but reuses files that are hard-linked from a central cache directory. This means that you can have multiple environments with the same packages but only have the individual files stored once on disk.

    We can create the default and test environments based on our own test feature from the optional-dependency:

    pixi project environment add default --solve-group default\npixi project environment add test --feature test --solve-group default\n

    Which results in:

    # Environments\n[tool.pixi.environments]\ndefault = { solve-group = \"default\" }\ntest = { features = [\"test\"], solve-group = \"default\" }\n
    Solve Groups

    Solve groups are a way to group dependencies together. This is useful when you have multiple environments that share the same dependencies. For example, maybe pytest is a dependency that influences the dependencies of the default environment. By putting these in the same solve group, you ensure that the versions in test and default are exactly the same.

    The default environment is created when you run pixi install. The test environment is created from the optional dependencies in the pyproject.toml file. You can execute commands in this environment with e.g. pixi run -e test python

    "},{"location":"tutorials/python/#getting-code-to-run","title":"Getting code to run","text":"

    Let's add some code to the pixi-py package. We will add a new function to the pixi_py/__init__.py file:

    from rich import print\n\ndef hello():\n    return \"Hello, [bold magenta]World[/bold magenta]!\", \":vampire:\"\n\ndef say_hello():\n    print(*hello())\n

    Now add the rich dependency from PyPI using: pixi add --pypi rich.

    Let's see if this works by running:

    pixi r python -c \"import pixi_py; pixi_py.say_hello()\"\nHello, World! \ud83e\udddb\n
    Slow?

    This might be slow (2 minutes) the first time because pixi installs the project, but it will be near instant the second time.

    Pixi runs the Python interpreter it installed itself. Then, we import the pixi_py package, which is installed in editable mode. The code calls the say_hello function that we just added. And it works! Cool!

    "},{"location":"tutorials/python/#testing-this-code","title":"Testing this code","text":"

    Okay, so let's add a test for this function. Let's add a tests/test_me.py file in the root of the project.

    Giving us the following project structure:

    .\n\u251c\u2500\u2500 pixi.lock\n\u251c\u2500\u2500 pixi_py\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 __init__.py\n\u251c\u2500\u2500 pyproject.toml\n\u2514\u2500\u2500 tests/test_me.py\n
    from pixi_py import hello\n\ndef test_pixi_py():\n    assert hello() == (\"Hello, [bold magenta]World[/bold magenta]!\", \":vampire:\")\n

    Let's add an easy task for running the tests.

    $ pixi task add --feature test test \"pytest\"\n\u2714 Added task `test`: pytest .\n

    So pixi has a task system to make it easy to run commands, similar to npm scripts or something you would specify in a Justfile.

    Pixi tasks

    Tasks are actually a pretty cool pixi feature that is powerful and runs in a cross-platform shell. You can do caching, dependencies and more. Read more about tasks in the tasks section.

    $ pixi r test\n\u2728 Pixi task (test): pytest .\n================================================================================================= test session starts =================================================================================================\nplatform darwin -- Python 3.12.2, pytest-8.1.1, pluggy-1.4.0\nrootdir: /private/tmp/pixi-py\nconfigfile: pyproject.toml\ncollected 1 item\n\ntest_me.py .                                                                                                                                                                                                    [100%]\n\n================================================================================================== 1 passed in 0.00s =================================================================================================\n

    Neat! It seems to be working!

    "},{"location":"tutorials/python/#test-vs-default-environment","title":"Test vs Default environment","text":"

    Something interesting appears if we compare the output of the two environments:

    pixi list -e test\n# v.s. default environment\npixi list\n

    The test environment has:

    package          version       build               size       kind   source\n...\npytest           8.1.1                             1.1 mib    pypi   pytest-8.1.1-py3-none-any.whl\n...\n

    But the default environment is missing this package. This way, you can fine-tune your environments to only have the packages that are needed for that environment. E.g. you could also have a dev environment that has pytest and ruff installed, but omit these from the prod environment. There is a docker example that shows how to set up a minimal prod environment and copy from there.

    "},{"location":"tutorials/python/#replacing-pypi-packages-with-conda-packages","title":"Replacing PyPI packages with conda packages","text":"

    Lastly, pixi provides the ability for PyPI packages to depend on conda packages. Let's confirm this with pixi list:

    $ pixi list\nPackage          Version       Build               Size       Kind   Source\n...\npygments         2.17.2                            4.1 MiB    pypi   pygments-2.17.2-py3-none-any.http.whl\n...\n

    Let's explicitly add pygments, which is a dependency of the rich package, to the pyproject.toml file.

    pixi add pygments\n

    This will add the following to the pyproject.toml file:

    [tool.pixi.dependencies]\npygments = \">=2.17.2,<2.18\"\n

    We can now see that the pygments package is installed as a conda package.

    $ pixi list\nPackage          Version       Build               Size       Kind   Source\n...\npygments         2.17.2        pyhd8ed1ab_0        840.3 KiB  conda  pygments-2.17.2-pyhd8ed1ab_0.conda\n

    This way, PyPI dependencies and conda dependencies can be mixed and matched to seamlessly interoperate.

    $  pixi r python -c \"import pixi_py; pixi_py.say_hello()\"\nHello, World! \ud83e\udddb\n

    And it still works!

    "},{"location":"tutorials/python/#conclusion","title":"Conclusion","text":"

    In this tutorial, you've seen how easy it is to use a pyproject.toml to manage your pixi dependencies and environments. We have also explored how to use PyPI and conda dependencies seamlessly together in the same project and install optional dependencies to manage Python packages.

    Hopefully, this provides a flexible and powerful way to manage your Python projects and a fertile base for further exploration with Pixi.

    Thanks for reading! Happy Coding \ud83d\ude80

    Any questions? Feel free to reach out or share this tutorial on X, join our Discord, send us an e-mail or follow our GitHub.

    "},{"location":"tutorials/ros2/","title":"Tutorial: Develop a ROS 2 package with pixi","text":"

    In this tutorial, we will show you how to develop a ROS 2 package using pixi. The tutorial is written to be executed from top to bottom; missing steps might result in errors.

    The audience for this tutorial is developers who are familiar with ROS 2 and are interested in trying pixi for their development workflow.

    "},{"location":"tutorials/ros2/#prerequisites","title":"Prerequisites","text":"
    • You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. The crux of this tutorial is to show you only need pixi!
    • On Windows, it's advised to enable Developer mode. Go to Settings -> Update & Security -> For developers -> Developer mode.

    If you're new to pixi, you can check out the basic usage guide. This will teach you the basics of a pixi project within 3 minutes.

    "},{"location":"tutorials/ros2/#create-a-pixi-project","title":"Create a pixi project.","text":"
    pixi init my_ros2_project -c robostack-staging -c conda-forge\ncd my_ros2_project\n

    It should have created a directory structure like this:

    my_ros2_project\n\u251c\u2500\u2500 .gitattributes\n\u251c\u2500\u2500 .gitignore\n\u2514\u2500\u2500 pixi.toml\n

    The pixi.toml file is the manifest file for your project. It should look like this:

    pixi.toml
    [project]\nname = \"my_ros2_project\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [\"User Name <user.name@email.url>\"]\nchannels = [\"robostack-staging\", \"conda-forge\"]\n# Your project can support multiple platforms, the current platform will be automatically added.\nplatforms = [\"linux-64\"]\n\n[tasks]\n\n[dependencies]\n

    The channels you added to the init command are repositories of packages; you can search these repositories through our prefix.dev website. The platforms are the systems you want to support. In pixi you can support multiple platforms, but you have to define which ones, so pixi can test whether those are supported by your dependencies. For the rest of the fields, you can fill them in as you see fit.

    "},{"location":"tutorials/ros2/#add-ros-2-dependencies","title":"Add ROS 2 dependencies","text":"

    To use a pixi project you don't need any dependencies on your system; all the dependencies you need should be added through pixi, so other users can use your project without any issues.

    Let's start with the turtlesim example.

    pixi add ros-humble-desktop ros-humble-turtlesim\n

    This will add the ros-humble-desktop and ros-humble-turtlesim packages to your manifest. Depending on your internet speed this might take a minute, as it will also install ROS in your project folder (.pixi).

    Now run the turtlesim example.

    pixi run ros2 run turtlesim turtlesim_node\n

    Or use the shell command to start an activated environment in your terminal.

    pixi shell\nros2 run turtlesim turtlesim_node\n

    Congratulations you have ROS 2 running on your machine with pixi!

    Some more fun with the turtle

    To control the turtle you can run the following command in a new terminal

    cd my_ros2_project\npixi run ros2 run turtlesim turtle_teleop_key\n

    Now you can control the turtle with the arrow keys on your keyboard.

    "},{"location":"tutorials/ros2/#add-a-custom-python-node","title":"Add a custom Python node","text":"

    As ROS works with custom nodes, let's add a custom node to our project.

    pixi run ros2 pkg create --build-type ament_python --destination-directory src --node-name my_node my_package\n

    To build the package we need some more dependencies:

    pixi add colcon-common-extensions \"setuptools<=58.2.0\"\n

    Add the created initialization script for the ros workspace to your manifest file.

    Then run the build command

    pixi run colcon build\n

This will create a sourceable script in the install folder; you can source this script through an activation script to use your custom node. Normally this would be the script you add to your .bashrc, but now you tell pixi to use it.

Linux & macOS pixi.toml
[activation]\nscripts = [\"install/setup.sh\"]\n
Windows pixi.toml
    [activation]\nscripts = [\"install/setup.bat\"]\n
    Multi platform support

    You can add multiple activation scripts for different platforms, so you can support multiple platforms with one project. Use the following example to add support for both Linux and Windows, using the target syntax.

    [project]\nplatforms = [\"linux-64\", \"win-64\"]\n\n[activation]\nscripts = [\"install/setup.sh\"]\n[target.win-64.activation]\nscripts = [\"install/setup.bat\"]\n

    Now you can run your custom node with the following command

    pixi run ros2 run my_package my_node\n
    "},{"location":"tutorials/ros2/#simplify-the-user-experience","title":"Simplify the user experience","text":"

In pixi we have a feature called tasks, which allows you to define a task in your manifest file and run it with a simple command. Let's add a task to run the turtlesim example and the custom node.

    pixi task add sim \"ros2 run turtlesim turtlesim_node\"\npixi task add build \"colcon build --symlink-install\"\npixi task add hello \"ros2 run my_package my_node\"\n

Now you can run these tasks by simply running:

    pixi run sim\npixi run build\npixi run hello\n
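For reference, the three pixi task add commands above write these entries into the manifest:

pixi.toml
[tasks]\nsim = \"ros2 run turtlesim turtlesim_node\"\nbuild = \"colcon build --symlink-install\"\nhello = \"ros2 run my_package my_node\"\n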
    Advanced task usage

    Tasks are a powerful feature in pixi.

    • You can add depends-on to the tasks to create a task chain.
    • You can add cwd to the tasks to run the task in a different directory from the root of the project.
    • You can add inputs and outputs to the tasks to create a task that only runs when the inputs are changed.
• You can use the target syntax to run specific tasks on specific machines. A sketch combining cwd and the target syntax follows the example below.
    [tasks]\nsim = \"ros2 run turtlesim turtlesim_node\"\nbuild = {cmd = \"colcon build --symlink-install\", inputs = [\"src\"]}\nhello = { cmd = \"ros2 run my_package my_node\", depends-on = [\"build\"] }\n
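As a hedged sketch of how the cwd and target options could be combined (the cwd value and the win-64 override below are illustrative, not part of the tutorial project):

pixi.toml
[tasks]\n# cwd runs the command from a subdirectory of the project root (illustrative)\nsim = { cmd = \"ros2 run turtlesim turtlesim_node\", cwd = \"src\" }\n\n[target.win-64.tasks]\n# Overrides the sim task only on win-64 machines (illustrative)\nsim = \"ros2 run turtlesim turtlesim_node\"\n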
    "},{"location":"tutorials/ros2/#build-a-c-node","title":"Build a C++ node","text":"

To build a C++ node you need to add ament_cmake and some other build dependencies to your manifest file.

    pixi add ros-humble-ament-cmake-auto compilers pkg-config cmake ninja\n

    Now you can create a C++ node with the following command

    pixi run ros2 pkg create --build-type ament_cmake --destination-directory src --node-name my_cpp_node my_cpp_package\n

    Now you can build it again and run it with the following commands

# Pass arguments to the build command to build with Ninja; add them to the manifest if you want to default to Ninja.\npixi run build --cmake-args -G Ninja\npixi run ros2 run my_cpp_package my_cpp_node\n
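If you want Ninja to be the default, one hedged option is to bake the CMake arguments into the build task itself (a sketch that replaces the build task defined earlier; adjust the flags as needed):

pixi.toml
[tasks]\n# Defaults the colcon build to the Ninja generator (sketch)\nbuild = \"colcon build --symlink-install --cmake-args -G Ninja\"\n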
    Tip

    Add the cpp task to the manifest file to simplify the user experience.

    pixi task add hello-cpp \"ros2 run my_cpp_package my_cpp_node\"\n
    "},{"location":"tutorials/ros2/#conclusion","title":"Conclusion","text":"

In this tutorial, we showed you how to create a Python & CMake ROS 2 project using pixi. We also showed you how to add dependencies to your project using pixi, and how to run your project using pixi run. This way you can make sure that your project is reproducible on all your machines that have pixi installed.

    "},{"location":"tutorials/ros2/#show-off-your-work","title":"Show Off Your Work!","text":"

    Finished with your project? We'd love to see what you've created! Share your work on social media using the hashtag #pixi and tag us @prefix_dev. Let's inspire the community together!

    "},{"location":"tutorials/ros2/#frequently-asked-questions","title":"Frequently asked questions","text":""},{"location":"tutorials/ros2/#what-happens-with-rosdep","title":"What happens with rosdep?","text":"

Currently, we don't support rosdep in a pixi environment, so you'll have to add the packages using pixi add. rosdep would call conda install, which isn't supported in a pixi environment.

    "},{"location":"tutorials/rust/","title":"Tutorial: Develop a Rust package using pixi","text":"

In this tutorial, we will show you how to develop a Rust package using pixi. The tutorial is written to be executed from top to bottom; skipping steps might result in errors.

The audience for this tutorial is developers who are familiar with Rust and cargo and are interested in trying pixi for their development workflow. The benefit of pixi in a Rust workflow is that you lock both Rust and the C/system dependencies your project might be using; e.g. tokio users will almost certainly use openssl.

If you're new to pixi, you can check out the basic usage guide. This will teach you the basics of a pixi project in 3 minutes.

    "},{"location":"tutorials/rust/#prerequisites","title":"Prerequisites","text":"
• You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. The crux of this tutorial is to show that you only need pixi!
    "},{"location":"tutorials/rust/#create-a-pixi-project","title":"Create a pixi project.","text":"
    pixi init my_rust_project\ncd my_rust_project\n

    It should have created a directory structure like this:

    my_rust_project\n\u251c\u2500\u2500 .gitattributes\n\u251c\u2500\u2500 .gitignore\n\u2514\u2500\u2500 pixi.toml\n

    The pixi.toml file is the manifest file for your project. It should look like this:

    pixi.toml
    [project]\nname = \"my_rust_project\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [\"User Name <user.name@email.url>\"]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"] # (1)!\n\n[tasks]\n\n[dependencies]\n
1. The platforms field is set to your system's platform by default. You can change it to any platforms you want to support, e.g. [\"linux-64\", \"osx-64\", \"osx-arm64\", \"win-64\"].
    "},{"location":"tutorials/rust/#add-rust-dependencies","title":"Add Rust dependencies","text":"

To use a pixi project you don't need any dependencies on your system; all the dependencies you need should be added through pixi, so other users can use your project without any issues.

    pixi add rust\n

This will add the rust package to your pixi.toml file under [dependencies], which includes the Rust toolchain and cargo.
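After running the command, the manifest gains a pin similar to the following (the exact range is illustrative; pixi writes the latest compatible version for you):

pixi.toml
[dependencies]\n# Illustrative pin; the actual range depends on the latest rust package in conda-forge\nrust = \">=1.81.0,<2\"\n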

    "},{"location":"tutorials/rust/#add-a-cargo-project","title":"Add a cargo project","text":"

    Now that you have rust installed, you can create a cargo project in your pixi project.

    pixi run cargo init\n

pixi run is pixi's way to run commands in the pixi environment; it will make sure that the environment is set up correctly for the command to run. It runs its own cross-platform shell; if you want more information, check out the tasks documentation. You can also activate the environment in your own shell by running pixi shell; after that you don't need pixi run ... anymore.

    Now we can build a cargo project using pixi.

    pixi run cargo build\n
    To simplify the build process, you can add a build task to your pixi.toml file using the following command:
    pixi task add build \"cargo build\"\n
This creates the following field in the pixi.toml file: pixi.toml
    [tasks]\nbuild = \"cargo build\"\n

    And now you can build your project using:

    pixi run build\n

    You can also run your project using:

    pixi run cargo run\n
You can simplify this with a task again.
    pixi task add start \"cargo run\"\n

    So you should get the following output:

    pixi run start\nHello, world!\n

    Congratulations, you have a Rust project running on your machine with pixi!

    "},{"location":"tutorials/rust/#next-steps-why-is-this-useful-when-there-is-rustup","title":"Next steps, why is this useful when there is rustup?","text":"

Cargo is not a binary package manager, but a source-based package manager. This means that you need to have the Rust compiler installed on your system to use it, and possibly other dependencies that are not included in the cargo package manager. For example, you might need to install openssl or libssl-dev on your system to build a package. This is the case for pixi as well, but pixi will install these dependencies in your project folder, so you don't have to worry about them.

    Add the following dependencies to your cargo project:

    pixi run cargo add git2\n

If your system is not preconfigured to build C code and does not have the libssl-dev package installed, you will not be able to build the project:

    pixi run build\n...\nCould not find directory of OpenSSL installation, and this `-sys` crate cannot\nproceed without this knowledge. If OpenSSL is installed and this crate had\ntrouble finding it,  you can set the `OPENSSL_DIR` environment variable for the\ncompilation process.\n\nMake sure you also have the development packages of openssl installed.\nFor example, `libssl-dev` on Ubuntu or `openssl-devel` on Fedora.\n\nIf you're in a situation where you think the directory *should* be found\nautomatically, please open a bug at https://github.com/sfackler/rust-openssl\nand include information about your system as well as this message.\n\n$HOST = x86_64-unknown-linux-gnu\n$TARGET = x86_64-unknown-linux-gnu\nopenssl-sys = 0.9.102\n\n\nIt looks like you're compiling on Linux and also targeting Linux. Currently this\nrequires the `pkg-config` utility to find OpenSSL but unfortunately `pkg-config`\ncould not be found. If you have OpenSSL installed you can likely fix this by\ninstalling `pkg-config`.\n...\n
You can fix this by adding the necessary dependencies for building git2 with pixi:
    pixi add openssl pkg-config compilers\n

    Now you should be able to build your project again:

    pixi run build\n...\n   Compiling git2 v0.18.3\n   Compiling my_rust_project v0.1.0 (/my_rust_project)\n    Finished dev [unoptimized + debuginfo] target(s) in 7.44s\n     Running `target/debug/my_rust_project`\n

    "},{"location":"tutorials/rust/#extra-add-more-tasks","title":"Extra: Add more tasks","text":"

    You can add more tasks to your pixi.toml file to simplify your workflow.

    For example, you can add a test task to run your tests:

    pixi task add test \"cargo test\"\n

    And you can add a clean task to clean your project:

    pixi task add clean \"cargo clean\"\n

    You can add a formatting task to your project:

    pixi task add fmt \"cargo fmt\"\n

    You can extend these tasks to run multiple commands with the use of the depends-on field.

    pixi task add lint \"cargo clippy\" --depends-on fmt\n
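With all the tasks from this section added, the [tasks] table in your manifest should look roughly like this (earlier tasks repeated for context):

pixi.toml
[tasks]\nbuild = \"cargo build\"\nstart = \"cargo run\"\ntest = \"cargo test\"\nclean = \"cargo clean\"\nfmt = \"cargo fmt\"\nlint = { cmd = \"cargo clippy\", depends-on = [\"fmt\"] }\n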

    "},{"location":"tutorials/rust/#conclusion","title":"Conclusion","text":"

    In this tutorial, we showed you how to create a Rust project using pixi. We also showed you how to add dependencies to your project using pixi. This way you can make sure that your project is reproducible on any system that has pixi installed.

    "},{"location":"tutorials/rust/#show-off-your-work","title":"Show Off Your Work!","text":"

    Finished with your project? We'd love to see what you've created! Share your work on social media using the hashtag #pixi and tag us @prefix_dev. Let's inspire the community together!

    "},{"location":"CHANGELOG/","title":"Changelog","text":"

    All notable changes to this project will be documented in this file.

    The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

    "},{"location":"CHANGELOG/#0320-2024-10-08","title":"[0.32.0] - 2024-10-08","text":""},{"location":"CHANGELOG/#highlights","title":"\u2728 Highlights","text":"

The biggest fix in this release is the move to the latest rattler, as it came with some major bug fixes for macOS and Rust 1.81 compatibility.

    "},{"location":"CHANGELOG/#changed","title":"Changed","text":"
    • Correctly implement total ordering for dependency provider by @tdejager in rattler/#892
    "},{"location":"CHANGELOG/#fixed","title":"Fixed","text":"
    • Fixed self-clobber issue when up/down grading packages by @wolfv in rattler/#893
    • Check environment name before returning not found print by @ruben-arts in #2198
    • Turn off symlink follow for task cache by @ruben-arts in #2209
    "},{"location":"CHANGELOG/#0310-2024-10-03","title":"[0.31.0] - 2024-10-03","text":""},{"location":"CHANGELOG/#highlights_1","title":"\u2728 Highlights","text":"

Thanks to our maintainer @baszalmstra! He sped up the resolver for all cases we could think of in #2162. Check the resulting times it takes to solve the environments in our test set:

    "},{"location":"CHANGELOG/#added","title":"Added","text":"
    • Add nodefaults to imported conda envs by @ruben-arts in #2097
    • Add newline to .gitignore by @ruben-arts in #2095
    • Add --no-activation option to prevent env activation during global install/upgrade by @183amir in #1980
    • Add --priority arg to project channel add by @minrk in #2086
    "},{"location":"CHANGELOG/#changed_1","title":"Changed","text":"
    • Use pixi spec for conda environment yml by @ruben-arts in #2096
    • Update rattler by @nichmor in #2120
    • Update README.md by @ruben-arts in #2129
    • Follow symlinks while walking files by @0xbe7a in #2141
    "},{"location":"CHANGELOG/#documentation","title":"Documentation","text":"
    • Adapt wording in pixi global proposal by @Hofer-Julian in #2098
    • Community: add array-api-extra by @lucascolley in #2107
    • pixi global mention no-activation by @Hofer-Julian in #2109
    • Add minimal constructor example by @bollwyvl in #2102
    • Update global manifest install by @Hofer-Julian in #2128
    • Add description for pixi update --json by @scottamain in #2160
    • Fixes backticks for doc strings by @rachfop in #2174
    "},{"location":"CHANGELOG/#fixed_1","title":"Fixed","text":"
    • Sort exported conda explicit spec topologically by @synapticarbors in #2101
    • --import env_file breaks channel priority by @fecet in #2113
    • Allow exact yanked pypi packages by @nichmor in #2116
    • Check if files are same in self-update by @apoorvkh in #2132
    • get_or_insert_nested_table by @Hofer-Julian in #2167
    • Improve install.sh PATH handling and general robustness by @Arcitec in #2189
    • Output tasks on pixi run without input by @ruben-arts in #2193
    "},{"location":"CHANGELOG/#performance","title":"Performance","text":"
    • Significantly speed up conda resolution by @baszalmstra in #2162
    "},{"location":"CHANGELOG/#new-contributors","title":"New Contributors","text":"
    • @Arcitec made their first contribution in #2189
    • @rachfop made their first contribution in #2174
    • @scottamain made their first contribution in #2160
    • @apoorvkh made their first contribution in #2132
    • @0xbe7a made their first contribution in #2141
    • @fecet made their first contribution in #2113
    • @minrk made their first contribution in #2086
    • @183amir made their first contribution in #1980
    • @lucascolley made their first contribution in #2107
    "},{"location":"CHANGELOG/#0300-2024-09-19","title":"[0.30.0] - 2024-09-19","text":""},{"location":"CHANGELOG/#highlights_2","title":"\u2728 Highlights","text":"

I want to thank @synapticarbors and @abkfenris for starting the work on pixi project export. Pixi now supports the export of a conda environment.yml file and a conda explicit specification file. This is a great addition to the project and will help users share their projects with non-pixi users.

    "},{"location":"CHANGELOG/#added_1","title":"Added","text":"
    • Export conda explicit specification file from project by @synapticarbors in #1873
    • Add flag to pixi search by @Hofer-Julian in #2018
    • Adds the ability to set the index strategy by @tdejager in #1986
    • Export conda environment.yml by @abkfenris in #2003
    "},{"location":"CHANGELOG/#changed_2","title":"Changed","text":"
    • Improve examples/docker by @jennydaman in #1965
    • Minimal pre-commit tasks by @Hofer-Julian in #1984
    • Improve error and feedback when target does not exist by @tdejager in #1961
    • Move the rectangle using a mouse in SDL by @certik in #2069
    "},{"location":"CHANGELOG/#documentation_1","title":"Documentation","text":"
    • Update cli.md by @xela-95 in #2047
    • Update system-requirements information by @ruben-arts in #2079
    • Append to file syntax in task docs by @nicornk in #2013
    • Change documentation of pixi upload to refer to correct API endpoint by @traversaro in #2074
    "},{"location":"CHANGELOG/#testing","title":"Testing","text":"
    • Add downstream nerfstudio test by @tdejager in #1996
    • Run pytests in parallel by @tdejager in #2027
    • Testing common wheels by @tdejager in #2031
    "},{"location":"CHANGELOG/#fixed_2","title":"Fixed","text":"
    • Lock file is always outdated for pypi path dependencies by @nichmor in #2039
    • Fix error message for export conda explicit spec by @synapticarbors in #2048
    • Use conda-pypi-map for feature channels by @nichmor in #2038
    • Constrain feature platforms in schema by @bollwyvl in #2055
    • Split tag creation functions by @tdejager in #2062
    • Tree print to pipe by @ruben-arts in #2064
    • subdirectory in pypi url by @ruben-arts in #2065
    • Create a GUI application on Windows, not Console by @certik in #2067
    • Make dashes underscores in python package names by @ruben-arts in #2073
    • Give better errors on broken pyproject.toml by @ruben-arts in #2075
    "},{"location":"CHANGELOG/#refactor","title":"Refactor","text":"
    • Stop duplicating strip_channel_alias from rattler by @Hofer-Julian in #2017
    • Follow-up wheels tests by @Hofer-Julian in #2063
    • Integration test suite by @Hofer-Julian in #2081
    • Remove psutils by @Hofer-Julian in #2083
    • Add back older caching method by @tdejager in #2046
    • Release script by @Hofer-Julian in #1978
    • Activation script by @Hofer-Julian in #2014
    • Pins python version in add_pypi_functionality by @tdejager in #2040
    • Improve the lock_file_usage flags and behavior. by @ruben-arts in #2078
    • Move matrix to workflow that it is used in by @tdejager in #1987
    • Refactor manifest into more generic approach by @nichmor in #2015
    "},{"location":"CHANGELOG/#new-contributors_1","title":"New Contributors","text":"
    • @certik made their first contribution in #2069
    • @xela-95 made their first contribution in #2047
    • @nicornk made their first contribution in #2013
    • @jennydaman made their first contribution in #1965
    "},{"location":"CHANGELOG/#0290-2024-09-04","title":"[0.29.0] - 2024-09-04","text":""},{"location":"CHANGELOG/#highlights_3","title":"\u2728 Highlights","text":"
    • Add build-isolation options, for more details check out our docs
    • Allow to use virtual package overrides from environment variables (PR)
    • Many bug fixes
    "},{"location":"CHANGELOG/#added_2","title":"Added","text":"
    • Add build-isolation options by @tdejager in #1909
    • Add release script by @Hofer-Julian in #1971
    "},{"location":"CHANGELOG/#changed_3","title":"Changed","text":"
    • Use rustls-tls instead of native-tls per default by @Hofer-Julian in #1929
    • Upgrade to uv 0.3.4 by @tdejager in #1936
    • Upgrade to uv 0.4.0 by @tdejager in #1944
    • Better error for when the target or platform are missing by @tdejager in #1959
    • Improve integration tests by @Hofer-Julian in #1958
    • Improve release script by @Hofer-Julian in #1974
    "},{"location":"CHANGELOG/#fixed_3","title":"Fixed","text":"
    • Update env variables in installation docs by @lev112 in #1937
    • Always overwrite when pixi adding the dependency by @ruben-arts in #1935
    • Typo in schema.json by @SobhanMP in #1948
    • Using file url as mapping by @nichmor in #1930
    • Offline mapping should not request by @nichmor in #1968
    • pixi init for pyproject.toml by @Hofer-Julian in #1947
    • Use two in memory indexes, for resolve and builds by @tdejager in #1969
    • Minor issues and todos by @KGrewal1 in #1963
    "},{"location":"CHANGELOG/#refactor_1","title":"Refactor","text":"
    • Improve integration tests by @Hofer-Julian in #1942
    "},{"location":"CHANGELOG/#new-contributors_2","title":"New Contributors","text":"
    • @SobhanMP made their first contribution in #1948
    • @lev112 made their first contribution in #1937
    "},{"location":"CHANGELOG/#0282-2024-08-28","title":"[0.28.2] - 2024-08-28","text":""},{"location":"CHANGELOG/#changed_4","title":"Changed","text":"
    • Use mold on linux by @Hofer-Julian in #1914
    "},{"location":"CHANGELOG/#documentation_2","title":"Documentation","text":"
    • Fix global manifest by @Hofer-Julian in #1912
    • Document azure keyring usage by @tdejager in #1913
    "},{"location":"CHANGELOG/#fixed_4","title":"Fixed","text":"
    • Let init add dependencies independent of target and don't install by @ruben-arts in #1916
    • Enable use of manylinux wheeltags once again by @tdejager in #1925
    • The bigger runner by @ruben-arts in #1902
    "},{"location":"CHANGELOG/#0281-2024-08-26","title":"[0.28.1] - 2024-08-26","text":""},{"location":"CHANGELOG/#changed_5","title":"Changed","text":"
    • Uv upgrade to 0.3.2 by @tdejager in #1900
    "},{"location":"CHANGELOG/#documentation_3","title":"Documentation","text":"
    • Add keyrings.artifacts to the list of project built with pixi by @jslorrma in #1908
    "},{"location":"CHANGELOG/#fixed_5","title":"Fixed","text":"
    • Use default indexes if non where given by the lockfile by @ruben-arts in #1910
    "},{"location":"CHANGELOG/#new-contributors_3","title":"New Contributors","text":"
    • @jslorrma made their first contribution in #1908
    "},{"location":"CHANGELOG/#0280-2024-08-22","title":"[0.28.0] - 2024-08-22","text":""},{"location":"CHANGELOG/#highlights_4","title":"\u2728 Highlights","text":"
    • Bug Fixes: Major fixes in general but especially for PyPI installation issues and better error messaging.
    • Compatibility: Default Linux version downgraded to 4.18 for broader support.
    • New Features: Added INIT_CWD in pixi run, improved logging, and more cache options.
    "},{"location":"CHANGELOG/#added_3","title":"Added","text":"
    • Add INIT_CWD to activated env pixi run by @ruben-arts in #1798
    • Add context to error when parsing conda-meta files by @baszalmstra in #1854
    • Add some logging for when packages are actually overridden by conda by @tdejager in #1874
    • Add package when extra is added by @ruben-arts in #1856
    "},{"location":"CHANGELOG/#changed_6","title":"Changed","text":"
    • Use new gateway to get the repodata for global install by @nichmor in #1767
    • Pixi global proposal by @Hofer-Julian in #1757
    • Upgrade to new uv 0.2.37 by @tdejager in #1829
    • Use new gateway for pixi search by @nichmor in #1819
    • Extend pixi clean cache with more cache options by @ruben-arts in #1872
    • Downgrade __linux default to 4.18 by @ruben-arts in #1887
    "},{"location":"CHANGELOG/#documentation_4","title":"Documentation","text":"
    • Fix instructions for update github actions by @Hofer-Julian in #1774
    • Fix fish completion script by @dennis-wey in #1789
    • Expands the environment variable examples in the reference section by @travishathaway in #1779
    • Community feedback pixi global by @Hofer-Julian in #1800
    • Additions to the pixi global proposal by @Hofer-Julian in #1803
    • Stop using invalid environment name in pixi global proposal by @Hofer-Julian in #1826
    • Extend pixi global proposal by @Hofer-Julian in #1861
    • Make channels required in pixi global manifest by @Hofer-Julian in #1868
    • Fix linux minimum version in project_configuration docs by @traversaro in #1888
    "},{"location":"CHANGELOG/#fixed_6","title":"Fixed","text":"
    • Try to increase rlimit by @baszalmstra in #1766
    • Add test for invalid environment names by @Hofer-Julian in #1825
    • Show global config in info command by @ruben-arts in #1807
    • Correct documentation of PIXI_ENVIRONMENT_PLATFORMS by @traversaro in #1842
    • Format in docs/features/environment.md by @cdeil in #1846
    • Make proper use of NamedChannelOrUrl by @ruben-arts in #1820
    • Trait impl override by @baszalmstra in #1848
    • Tame pixi search by @baszalmstra in #1849
    • Fix pixi tree -i duplicate output by @baszalmstra in #1847
    • Improve spec parsing error messages by @baszalmstra in #1786
    • Parse matchspec from CLI Lenient by @baszalmstra in #1852
    • Improve parsing of pypi-dependencies by @baszalmstra in #1851
    • Don't enforce system requirements for task tests by @baszalmstra in #1855
    • Satisfy when there are no pypi packages in the lockfile by @ruben-arts in #1862
    • Ssh url should not contain colon by @baszalmstra in #1865
    • find-links with manifest-path by @baszalmstra in #1864
    • Increase stack size in debug mode on windows by @baszalmstra in #1867
    • Solve-group-envs should reside in .pixi folder by @baszalmstra in #1866
    • Move package-override logging by @tdejager in #1883
    • Pinning logic for minor and major by @baszalmstra in #1885
    • Docs manifest tests by @ruben-arts in #1879
    "},{"location":"CHANGELOG/#refactor_2","title":"Refactor","text":"
    • Encapsulate channel resolution logic for CLI by @olivier-lacroix in #1781
    • Move to pub(crate) fn in order to detect and remove unused functions by @Hofer-Julian in #1805
    • Only compile TaskNode::full_command for tests by @Hofer-Julian in #1809
    • Derive Default for more structs by @Hofer-Julian in #1824
    • Rename get_up_to_date_prefix to update_prefix by @Hofer-Julian in #1837
    • Make HasSpecs implementation more functional by @Hofer-Julian in #1863
    "},{"location":"CHANGELOG/#new-contributors_4","title":"New Contributors","text":"
    • @cdeil made their first contribution in #1846
    "},{"location":"CHANGELOG/#0271-2024-08-09","title":"[0.27.1] - 2024-08-09","text":""},{"location":"CHANGELOG/#documentation_5","title":"Documentation","text":"
    • Fix mlx feature in \"multiple machines\" example by @rgommers in #1762
    • Update some of the cli and add osx rosetta mention by @ruben-arts in #1760
    • Fix typo by @pavelzw in #1771
    "},{"location":"CHANGELOG/#fixed_7","title":"Fixed","text":"
    • User agent string was wrong by @wolfv in #1759
    • Dont accidentally wipe pyproject.toml on init by @ruben-arts in #1775
    "},{"location":"CHANGELOG/#refactor_3","title":"Refactor","text":"
    • Add pixi_spec crate by @baszalmstra in #1741
    "},{"location":"CHANGELOG/#new-contributors_5","title":"New Contributors","text":"
    • @rgommers made their first contribution in #1762
    "},{"location":"CHANGELOG/#0270-2024-08-07","title":"[0.27.0] - 2024-08-07","text":""},{"location":"CHANGELOG/#highlights_5","title":"\u2728 Highlights","text":"

This release contains a lot of refactoring and improvements to the codebase, in preparation for future features and improvements. Along with that, we've fixed a ton of bugs. To make sure we're not breaking anything we've added a lot of tests and CI checks. But let us know if you find any issues!

    As a reminder, you can update pixi using pixi self-update and move to a specific version, including backwards, with pixi self-update --version 0.27.0.

    "},{"location":"CHANGELOG/#added_4","title":"Added","text":"
    • Add pixi run completion for fish shell by @dennis-wey in #1680
    "},{"location":"CHANGELOG/#changed_7","title":"Changed","text":"
    • Move examples from setuptools to hatchling by @Hofer-Julian in #1692
    • Let pixi init create hatchling pyproject.toml by @Hofer-Julian in #1693
    • Make [project] table optional for pyproject.toml manifests by @olivier-lacroix in #1732
    "},{"location":"CHANGELOG/#documentation_6","title":"Documentation","text":"
    • Improve the fish completions location by @tdejager in #1647
    • Explain why we use hatchling by @Hofer-Julian
    • Update install CLI doc now that the update command exist by @olivier-lacroix in #1690
    • Mention pixi exec in GHA docs by @pavelzw in #1724
    • Update to correct spelling by @ahnsn in #1730
    • Ensure hatchling is used everywhere in documentation by @olivier-lacroix in #1733
    • Add readme to WASM example by @wolfv in #1703
    • Fix typo by @pavelzw in #1660
    • Fix typo by @DimitriPapadopoulos in #1743
    • Fix typo by @SeaOtocinclus in #1651
    "},{"location":"CHANGELOG/#testing_1","title":"Testing","text":"
    • Added script and tasks for testing examples by @tdejager in #1671
    • Add simple integration tests by @ruben-arts in #1719
    "},{"location":"CHANGELOG/#fixed_8","title":"Fixed","text":"
    • Prepend pixi to path instead of appending by @vigneshmanick in #1644
    • Add manifest tests and run them in ci by @ruben-arts in #1667
    • Use hashed pypi mapping by @baszalmstra in #1663
    • Depend on pep440_rs from crates.io and use replace by @baszalmstra in #1698
    • pixi add with more than just package name and version by @ruben-arts in #1704
    • Ignore pypi logic on non pypi projects by @ruben-arts in #1705
    • Fix and refactor --no-lockfile-update by @ruben-arts in #1683
    • Changed example to use hatchling by @tdejager in #1729
    • Todo clean up by @KGrewal1 in #1735
    • Allow for init to pixi.toml when pyproject.toml is available. by @ruben-arts in #1640
    • Test on macos-13 by @ruben-arts in #1739
    • Make sure pixi vars are available before activation.env vars are by @ruben-arts in #1740
    • Authenticate exec package download by @olivier-lacroix in #1751
    "},{"location":"CHANGELOG/#refactor_4","title":"Refactor","text":"
    • Extract pixi_manifest by @baszalmstra in #1656
    • Delay channel config url evaluation by @baszalmstra in #1662
    • Split out pty functionality by @tdejager in #1678
    • Make project manifest loading DRY and consistent by @olivier-lacroix in #1688
    • Refactor channel add and remove CLI commands by @olivier-lacroix in #1689
    • Refactor pixi::consts and pixi::config into separate crates by @tdejager in #1684
    • Move dependencies to pixi_manifest by @tdejager in #1700
    • Moved pypi environment modifiers by @tdejager in #1699
    • Split HasFeatures by @tdejager in #1712
    • Move, splits and renames the HasFeatures trait by @tdejager in #1717
    • Merge utils by @tdejager in #1718
    • Move fancy to its own crate by @tdejager in #1722
    • Move config to repodata functions by @tdejager in #1723
    • Move pypi-mapping to its own crate by @tdejager in #1725
    • Split utils into 2 crates by @tdejager in #1736
    • Add progress bar as a crate by @nichmor in #1727
    • Split up pixi_manifest lib by @tdejager in #1661
    "},{"location":"CHANGELOG/#new-contributors_6","title":"New Contributors","text":"
    • @DimitriPapadopoulos made their first contribution in #1743
    • @KGrewal1 made their first contribution in #1735
    • @ahnsn made their first contribution in #1730
    • @dennis-wey made their first contribution in #1680
    "},{"location":"CHANGELOG/#0261-2024-07-22","title":"[0.26.1] - 2024-07-22","text":""},{"location":"CHANGELOG/#fixed_9","title":"Fixed","text":"
    • Make sure we also build the msi installer by @ruben-arts in #1645
    "},{"location":"CHANGELOG/#0260-2024-07-19","title":"[0.26.0] - 2024-07-19","text":""},{"location":"CHANGELOG/#highlights_6","title":"\u2728 Highlights","text":"
• Specify how pixi pins your dependencies with the pinning-strategy in the config (e.g. semver -> >=1.2.3,<2 and no-pin -> *) #1516
    • Specify how pixi solves multiple channels with channel-priority in the manifest. #1631
    "},{"location":"CHANGELOG/#added_5","title":"Added","text":"
    • Add short options to config location flags by @ruben-arts in #1586
    • Add a file guard to indicate if an environment is being installed by @baszalmstra in #1593
    • Add pinning-strategy to the configuration by @ruben-arts in #1516
    • Add channel-priority to the manifest and solve by @ruben-arts in #1631
    • Add nushell completion by @Hofer-Julian in #1599
    • Add nushell completions for pixi run by @Hofer-Julian in #1627
    • Add completion for pixi run --environment for nushell by @Hofer-Julian in #1636
    "},{"location":"CHANGELOG/#changed_8","title":"Changed","text":"
    • Upgrade uv 0.2.18 by @tdejager in #1540
    • Refactor pyproject.toml parser by @nichmor in #1592
    • Interactive warning for packages in pixi global install by @ruben-arts in #1626
    "},{"location":"CHANGELOG/#documentation_7","title":"Documentation","text":"
    • Add WASM example with JupyterLite by @wolfv in #1623
    • Added LLM example by @ytjhai in #1545
    • Add note to mark directory as excluded in pixi-pycharm by @pavelzw in #1579
    • Add changelog to docs by @vigneshmanick in #1574
    • Updated the values of the system requirements by @tdejager in #1575
    • Tell cargo install which bin to install by @ruben-arts in #1584
    • Update conflict docs for cargo add by @Hofer-Julian in #1600
    • Revert \"Update conflict docs for cargo add \" by @Hofer-Julian in #1605
    • Add reference documentation for the exec command by @baszalmstra in #1587
    • Add transitioning docs for poetry and conda by @ruben-arts in #1624
    • Add pixi-pack by @pavelzw in #1629
    • Use '-' instead of '_' for package name by @olivier-lacroix in #1628
    "},{"location":"CHANGELOG/#fixed_10","title":"Fixed","text":"
    • Flaky task test by @tdejager in #1581
    • Pass command line arguments verbatim by @baszalmstra in #1582
    • Run clippy on all targets by @Hofer-Julian in #1588
    • Pre-commit install pixi task by @Hofer-Julian in #1590
    • Add clap_complete_nushell to dependencies by @Hofer-Julian in #1625
    • Write to stdout for machine readable output by @Hofer-Julian in #1639
    "},{"location":"CHANGELOG/#refactor_5","title":"Refactor","text":"
    • Migrate to workspace by @baszalmstra in #1597
    "},{"location":"CHANGELOG/#removed","title":"Removed","text":"
    • Remove double manifest warning by @tdejager in #1580
    "},{"location":"CHANGELOG/#new-contributors_7","title":"New Contributors","text":"
    • @ytjhai made their first contribution in #1545
    "},{"location":"CHANGELOG/#0250-2024-07-05","title":"[0.25.0] - 2024-07-05","text":""},{"location":"CHANGELOG/#highlights_7","title":"\u2728 Highlights","text":"
    • pixi exec command, execute commands in temporary environments, useful for testing in short-lived sessions.
• We've bumped the default system-requirements to higher defaults: glibc (2.17 -> 2.28), osx-64 (10.15 -> 13.0), osx-arm64 (11.0 -> 13.0). Let us know if this causes any issues. To keep the previous values, please use a system-requirements table; this is explained here
    "},{"location":"CHANGELOG/#changed_9","title":"Changed","text":"
    • Bump system requirements by @wolfv in #1553
    • Better error when exec is missing a cmd by @tdejager in #1565
    • Make exec use authenticated client by @tdejager in #1568
    "},{"location":"CHANGELOG/#documentation_8","title":"Documentation","text":"
    • Automatic updating using github actions by @pavelzw in #1456
    • Describe the --change-ps1 option for pixi shell by @Yura52 in #1536
    • Add some other quantco repos by @pavelzw in #1542
    • Add example using geos-rs by @Hofer-Julian in #1563
    "},{"location":"CHANGELOG/#fixed_11","title":"Fixed","text":"
    • Tiny error in basic_usage.md by @Sjouks in #1513
    • Lazy initialize client by @baszalmstra in #1511
    • URL typos in rtd examples by @kklein in #1538
    • Fix satisfiability for short sha hashes by @tdejager in #1530
    • Wrong path passed to dynamic check by @tdejager in #1552
    • Don't error if no tasks is available on platform by @hoxbro in #1550
    "},{"location":"CHANGELOG/#refactor_6","title":"Refactor","text":"
    • Add to use update code by @baszalmstra in #1508
    "},{"location":"CHANGELOG/#new-contributors_8","title":"New Contributors","text":"
    • @kklein made their first contribution in #1538
    • @Yura52 made their first contribution in #1536
    • @Sjouks made their first contribution in #1513
    "},{"location":"CHANGELOG/#0242-2024-06-14","title":"[0.24.2] - 2024-06-14","text":""},{"location":"CHANGELOG/#documentation_9","title":"Documentation","text":"
    • Add readthedocs examples by @bollwyvl in #1423
    • Fix typo in project_configuration.md by @RaulPL in #1502
    "},{"location":"CHANGELOG/#fixed_12","title":"Fixed","text":"
    • Too much shell variables in activation of pixi shell by @ruben-arts in #1507
    "},{"location":"CHANGELOG/#0241-2024-06-12","title":"[0.24.1] - 2024-06-12","text":""},{"location":"CHANGELOG/#fixed_13","title":"Fixed","text":"
    • Replace http code %2b with + by @ruben-arts in #1500
    "},{"location":"CHANGELOG/#0240-2024-06-12","title":"[0.24.0] - 2024-06-12","text":""},{"location":"CHANGELOG/#highlights_8","title":"\u2728 Highlights","text":"
    • You can now run in a more isolated environment on unix machines, using pixi run --clean-env TASK_NAME.
• You can now easily clean your environment with pixi clean or the cache with pixi clean cache
    "},{"location":"CHANGELOG/#added_6","title":"Added","text":"
    • Add pixi clean command by @ruben-arts in #1325
    • Add --clean-env flag to tasks and run command by @ruben-arts in #1395
    • Add description field to task by @jjjermiah in #1479
    • Add pixi file to the environment to add pixi specific details by @ruben-arts in #1495
    "},{"location":"CHANGELOG/#changed_10","title":"Changed","text":"
    • Project environment cli by @baszalmstra in #1433
    • Update task list console output by @vigneshmanick in #1443
    • Upgrade uv by @tdejager in #1436
    • Sort packages in list_global_packages by @dhirschfeld in #1458
    • Added test for special chars wheel filename by @tdejager in #1454
    "},{"location":"CHANGELOG/#documentation_10","title":"Documentation","text":"
    • Improve multi env tasks documentation by @ruben-arts in #1494
    "},{"location":"CHANGELOG/#fixed_14","title":"Fixed","text":"
    • Use the activated environment when running a task by @tdejager in #1461
    • Fix authentication pypi-deps for download from lockfile by @tdejager in #1460
    • Display channels correctly in pixi info by @ruben-arts in #1459
    • Render help for --frozen by @ruben-arts in #1468
    • Don't record purl for non conda-forge channels by @nichmor in #1451
    • Use best_platform to verify the run platform by @ruben-arts in #1472
    • Creation of parent dir of symlink by @ruben-arts in #1483
    • pixi install --all output missing newline by @vigneshmanick in #1487
    • Don't error on already existing dependency by @ruben-arts in #1449
    • Remove debug true in release by @ruben-arts in #1477
    "},{"location":"CHANGELOG/#new-contributors_9","title":"New Contributors","text":"
    • @dhirschfeld made their first contribution in #1458

    Full commit history

    "},{"location":"CHANGELOG/#0230-2024-05-27","title":"[0.23.0] - 2024-05-27","text":""},{"location":"CHANGELOG/#highlights_9","title":"\u2728 Highlights","text":"
    • This release adds two new commands pixi config and pixi update
      • pixi config allows you to edit, set, unset, append, prepend and list your local/global or system configuration.
  • pixi update re-solves the full lockfile, or use pixi update PACKAGE to only update PACKAGE, making sure your project is using the latest versions that the manifest allows.
    "},{"location":"CHANGELOG/#added_7","title":"Added","text":"
    • Add pixi config command by @chawyehsu in #1339
    • Add pixi list --explicit flag command by @jjjermiah in #1403
    • Add [activation.env] table for environment variables by @ruben-arts in #1156
    • Allow installing multiple envs, including --all at once by @tdejager in #1413
    • Add pixi update command to re-solve the lockfile by @baszalmstra in #1431 (fixes 20 :thumbsup:)
    • Add detached-environments to the config, move environments outside the project folder by @ruben-arts in #1381 (fixes 11 :thumbsup:)
    "},{"location":"CHANGELOG/#changed_11","title":"Changed","text":"
    • Use the gateway to fetch repodata by @baszalmstra in #1307
    • Switch to compressed mapping by @nichmor in #1335
    • Warn on pypi conda clobbering by @nichmor in #1353
    • Align remove arguments with add by @olivier-lacroix in #1406
    • Add backward compat logic for older lock files by @nichmor in #1425
    "},{"location":"CHANGELOG/#documentation_11","title":"Documentation","text":"
    • Fix small screen by removing getting started section. by @ruben-arts in #1393
    • Improve caching docs by @ruben-arts in #1422
    • Add example, python library using gcp upload by @tdejager in #1380
    • Correct typos with --no-lockfile-update. by @tobiasraabe in #1396
    "},{"location":"CHANGELOG/#fixed_15","title":"Fixed","text":"
    • Trim channel url when filter packages_for_prefix_mapping by @zen-xu in #1391
    • Use the right channels when upgrading global packages by @olivier-lacroix in #1326
    • Fish prompt display looks wrong in tide by @tfriedel in #1424
    • Use local mapping instead of remote by @nichmor in #1430
    "},{"location":"CHANGELOG/#refactor_7","title":"Refactor","text":"
    • Remove unused fetch_sparse_repodata by @olivier-lacroix in #1411
    • Remove project level method that are per environment by @olivier-lacroix in #1412
    • Update lockfile functionality for reusability by @baszalmstra in #1426
    "},{"location":"CHANGELOG/#new-contributors_10","title":"New Contributors","text":"
    • @tfriedel made their first contribution in #1424
    • @jjjermiah made their first contribution in #1403
    • @tobiasraabe made their first contribution in #1396

    Full commit history

    "},{"location":"CHANGELOG/#0220-2024-05-13","title":"[0.22.0] - 2024-05-13","text":""},{"location":"CHANGELOG/#highlights_10","title":"\u2728 Highlights","text":"
    • Support for source pypi dependencies through the cli:
  • pixi add --pypi 'package @ package.whl', perfect for adding just-built wheels to your environment in CI.
      • pixi add --pypi 'package_from_git @ git+https://github.com/org/package.git', to add a package from a git repository.
      • pixi add --pypi 'package_from_path @ file:///path/to/package' --editable, to add a package from a local path.
    "},{"location":"CHANGELOG/#added_8","title":"Added","text":"
    • Implement more functions for pixi add --pypi by @wolfv in #1244
    "},{"location":"CHANGELOG/#documentation_12","title":"Documentation","text":"
    • Update install cli doc by @vigneshmanick in #1336
    • Replace empty default example with no-default-feature by @beenje in #1352
    • Document the add & remove cli behaviour with pyproject.toml manifest by @olivier-lacroix in #1338
    • Add environment activation to GitHub actions docs by @pavelzw in #1371
    • Clarify in CLI that run can also take commands by @twrightsman in #1368
    "},{"location":"CHANGELOG/#fixed_16","title":"Fixed","text":"
    • Automated update of install script in pixi.sh by @ruben-arts in #1351
    • Wrong description on pixi project help by @notPlancha in #1358
    • Don't need a python interpreter when not having pypi dependencies. by @ruben-arts in #1366
    • Don't error on not editable not path by @ruben-arts in #1365
    • Align shell-hook cli with shell by @ruben-arts in #1364
    • Only write prefix file if needed by @ruben-arts in #1363
    "},{"location":"CHANGELOG/#refactor_8","title":"Refactor","text":"
    • Lock-file resolve functionality in separated modules by @tdejager in #1337
    • Use generic for RepoDataRecordsByName and PypiRecordsByName by @olivier-lacroix in #1341
    "},{"location":"CHANGELOG/#new-contributors_11","title":"New Contributors","text":"
    • @twrightsman made their first contribution in #1368
    • @notPlancha made their first contribution in #1358
    • @vigneshmanick made their first contribution in #1336

    Full commit history

    "},{"location":"CHANGELOG/#0211-2024-05-07","title":"[0.21.1] - 2024-05-07","text":""},{"location":"CHANGELOG/#fixed_17","title":"Fixed","text":"
    • Use read timeout, not global timeout by @wolfv in #1329
    • Channel priority logic by @ruben-arts in #1332

    Full commit history

    "},{"location":"CHANGELOG/#0210-2024-05-06","title":"[0.21.0] - 2024-05-06","text":""},{"location":"CHANGELOG/#highlights_11","title":"\u2728 Highlights","text":"
    • This release adds support for configuring PyPI settings globally, to use alternative PyPI indexes and load credentials with keyring.
    • We now support cross-platform running, for osx-64 on osx-arm64 and wasm environments.
    • There is now a no-default-feature option to simplify usage of environments.
    "},{"location":"CHANGELOG/#added_9","title":"Added","text":"
    • Add pypi config for global local config file + keyring support by @wolfv in #1279
    • Allow for cross-platform running, for osx-64 on osx-arm64 and wasm environments by @wolfv in #1020
    "},{"location":"CHANGELOG/#changed_12","title":"Changed","text":"
    • Add no-default-feature option to environments by @olivier-lacroix in #1092
    • Add /etc/pixi/config.toml to global configuration search paths by @pavelzw in #1304
    • Change global config fields to kebab-case by @tdejager in #1308
    • Show all available task with task list by @Hoxbro in #1286
    • Allow to emit activation environment variables as JSON by @borchero in #1317
    • Use locked pypi packages as preferences in the pypi solve to get minimally updating lock files by @ruben-arts in #1320
    • Allow to upgrade several global packages at once by @olivier-lacroix in #1324
    "},{"location":"CHANGELOG/#documentation_13","title":"Documentation","text":"
    • Typo in tutorials python by @carschandler in #1297
    • Python Tutorial: Dependencies, PyPI, Order, Grammar by @JesperDramsch in #1313
    "},{"location":"CHANGELOG/#fixed_18","title":"Fixed","text":"
    • Schema version and add it to tbump by @ruben-arts in #1284
    • Make integration test fail in ci and fix ssh issue by @ruben-arts in #1301
    • Automate adding install scripts to the docs by @ruben-arts in #1302
    • Do not always request for prefix mapping by @nichmor in #1300
    • Align CLI aliases and add missing by @ruben-arts in #1316
    • Alias depends_on to depends-on by @ruben-arts in #1310
    • Add error if channel or platform doesn't exist on remove by @ruben-arts in #1315
    • Allow spec in pixi q instead of only name by @ruben-arts in #1314
    • Remove dependency on sysroot for linux by @ruben-arts in #1319
    • Fix linking symlink issue, by updating to the latest rattler by @baszalmstra in #1327
    "},{"location":"CHANGELOG/#refactor_9","title":"Refactor","text":"
    • Use IndexSet instead of Vec for collections of unique elements by @olivier-lacroix in #1289
    • Use generics over PyPiDependencies and CondaDependencies by @olivier-lacroix in #1303
    "},{"location":"CHANGELOG/#new-contributors_12","title":"New Contributors","text":"
    • @borchero made their first contribution in #1317
    • @JesperDramsch made their first contribution in #1313
    • @Hoxbro made their first contribution in #1286
    • @carschandler made their first contribution in #1297

    Full commit history

    "},{"location":"CHANGELOG/#0201-2024-04-26","title":"[0.20.1] - 2024-04-26","text":""},{"location":"CHANGELOG/#highlights_12","title":"\u2728 Highlights","text":"
    • Big improvements on the pypi-editable installs.
    "},{"location":"CHANGELOG/#fixed_19","title":"Fixed","text":"
    • Editable non-satisfiable by @baszalmstra in #1251
    • Satisfiability with pypi extras by @baszalmstra in #1253
    • Change global install activation script permission from 0o744 -> 0o755 by @zen-xu in #1250
    • Avoid creating Empty TOML tables by @olivier-lacroix in #1270
    • Uses the special-case uv path handling for both built and source by @tdejager in #1263
    • Modify test before attempting to write to .bash_profile in install.sh by @bruchim-cisco in #1267
    • Parse properly 'default' as environment Cli argument by @olivier-lacroix in #1247
    • Apply schema.json normalization, add to docs by @bollwyvl in #1265
    • Improve absolute path satisfiability by @tdejager in #1252
    • Improve parse deno error and make task a required field in the cli by @ruben-arts in #1260
    "},{"location":"CHANGELOG/#new-contributors_13","title":"New Contributors","text":"
    • @bollwyvl made their first contribution in #1265
    • @bruchim-cisco made their first contribution in #1267
    • @zen-xu made their first contribution in #1250

    Full commit history

    "},{"location":"CHANGELOG/#0200-2024-04-19","title":"[0.20.0] - 2024-04-19","text":""},{"location":"CHANGELOG/#highlights_13","title":"\u2728 Highlights","text":"
• We now support env variables in the task definition; these can also be used as default values for parameters in your task, which you can overwrite with your shell's env variables. e.g. task = { cmd = \"task to run\", env = { VAR=\"value1\", PATH=\"my/path:$PATH\" } }
    • We made a big effort on fixing issues and improving documentation!
    "},{"location":"CHANGELOG/#added_10","title":"Added","text":"
    • Add env to the tasks to specify tasks specific environment variables by @wolfv in https://github.com/prefix-dev/pixi/pull/972
    "},{"location":"CHANGELOG/#changed_13","title":"Changed","text":"
    • Add --pyproject option to pixi init with a pyproject.toml by @olivier-lacroix in #1188
    • Upgrade to new uv version 0.1.32 by @tdejager in #1208
    "},{"location":"CHANGELOG/#documentation_14","title":"Documentation","text":"
    • Document pixi.lock by @ruben-arts in #1209
    • Document channel priority definition by @ruben-arts in #1234
    • Add rust tutorial including openssl example by @ruben-arts in #1155
    • Add python tutorial to documentation by @tdejager in #1179
    • Add JupyterLab integration docs by @renan-r-santos in #1147
    • Add Windows support for PyCharm integration by @pavelzw in #1192
    • Setup_pixi for local pixi installation by @ytausch in #1181
    • Update pypi docs by @Hofer-Julian in #1215
    • Fix order of --no-deps when pip installing in editable mode by @glemaitre in #1220
    • Fix frozen documentation by @ruben-arts in #1167
    "},{"location":"CHANGELOG/#fixed_20","title":"Fixed","text":"
    • Small typo in list cli by @tdejager in #1169
    • Issue with invalid solve group by @baszalmstra in #1190
    • Improve error on parsing lockfile by @ruben-arts in #1180
    • Replace _ with - when creating environments from features by @wolfv in #1203
    • Prevent duplicate direct dependencies in tree by @abkfenris in #1184
    • Use project root directory instead of task.working_directory for base dir when hashing by @wolfv in #1202
    • Do not leak env vars from bat scripts in cmd.exe by @wolfv in #1205
    • Make file globbing behave more as expected by @wolfv in #1204
    • Fix for using file::// in pyproject.toml dependencies by @tdejager in #1196
    • Improve pypi version conversion in pyproject.toml dependencies by @wolfv in #1201
    • Update to the latest rattler by @wolfv in #1235
    "},{"location":"CHANGELOG/#breaking","title":"BREAKING","text":"
    • task = { cmd = \"task to run\", cwd = \"folder\", inputs = \"input.txt\", output = \"output.txt\"} Where input.txt and output.txt where previously in folder they are now relative the project root. This changed in: #1202
    • task = { cmd = \"task to run\", inputs = \"input.txt\"} previously searched for all input.txt files now only for the ones in the project root. This changed in: #1204
    "},{"location":"CHANGELOG/#new-contributors_14","title":"New Contributors","text":"
    • @glemaitre made their first contribution in #1220

    Full commit history

    "},{"location":"CHANGELOG/#0191-2024-04-11","title":"[0.19.1] - 2024-04-11","text":""},{"location":"CHANGELOG/#highlights_14","title":"\u2728 Highlights","text":"

    This fixes the issue where pixi would generate broken environments/lockfiles when a mapping for a brand-new version of a package is missing.

    "},{"location":"CHANGELOG/#changed_14","title":"Changed","text":"
    • Add fallback mechanism for missing mapping by @nichmor in #1166

    Full commit history

    "},{"location":"CHANGELOG/#0190-2024-04-10","title":"[0.19.0] - 2024-04-10","text":""},{"location":"CHANGELOG/#highlights_15","title":"\u2728 Highlights","text":"
    • This release adds a new pixi tree command to show the dependency tree of the project.
    • Pixi now persists the manifest and environment when activating a shell, so you can use pixi as if you are in that folder while in the shell.
    "},{"location":"CHANGELOG/#added_11","title":"Added","text":"
    • pixi tree command to show dependency tree by @abkfenris in #1069
    • Persistent shell manifests by @abkfenris in #1080
    • Add to pypi in feature (pixi add --feature test --pypi package) by @ruben-arts in #1135
    • Use new mapping by @nichmor in #888
    • --no-progress to disable all progress bars by @baszalmstra in #1105
    • Create a table if channel is specified (pixi add conda-forge::rattler-build) by @baszalmstra in #1079
    "},{"location":"CHANGELOG/#changed_15","title":"Changed","text":"
    • Add the project itself as an editable dependency by @olivier-lacroix in #1084
    • Get tool.pixi.project.name from project.name by @olivier-lacroix in #1112
    • Create features and environments from extras by @olivier-lacroix in #1077
    • Pypi supports come out of Beta by @olivier-lacroix in #1120
    • Enable to force PIXI_ARCH for pixi installation by @beenje in #1129
    • Improve tool.pixi.project detection logic by @olivier-lacroix in #1127
    • Add purls for packages if adding pypi dependencies by @nichmor in #1148
    • Add env name if not default to tree and list commands by @ruben-arts in #1145
    "},{"location":"CHANGELOG/#documentation_15","title":"Documentation","text":"
    • Add MODFLOW 6 to community docs by @Hofer-Julian in #1125
    • Addition of ros2 tutorial by @ruben-arts in #1116
    • Improve install script docs by @ruben-arts in #1136
    • More structured table of content by @tdejager in #1142
    "},{"location":"CHANGELOG/#fixed_21","title":"Fixed","text":"
    • Amend syntax in conda-meta/history to prevent conda.history.History.parse() error by @jaimergp in #1117
    • Fix docker example and include pyproject.toml by @tdejager in #1121
    "},{"location":"CHANGELOG/#new-contributors_15","title":"New Contributors","text":"
    • @abkfenris made their first contribution in #1069
    • @beenje made their first contribution in #1129
    • @jaimergp made their first contribution in #1117

    Full commit history

    "},{"location":"CHANGELOG/#0180-2024-04-02","title":"[0.18.0] - 2024-04-02","text":""},{"location":"CHANGELOG/#highlights_16","title":"\u2728 Highlights","text":"
    • This release adds support for pyproject.toml, now pixi reads from the [tool.pixi] table.
    • We now support editable PyPI dependencies, and PyPI source dependencies, including git, path, and url dependencies.

[!TIP] These new features are part of the ongoing effort to make pixi more flexible, powerful, and comfortable for Python users. They are still in progress, so expect more improvements on these features soon. Please report any issues you encounter and follow our next releases!

    "},{"location":"CHANGELOG/#added_12","title":"Added","text":"
    • Support for pyproject.toml by @olivier-lacroix in #999
    • Support for PyPI source dependencies by @tdejager in #985
    • Support for editable PyPI dependencies by @tdejager in #1044
    "},{"location":"CHANGELOG/#changed_16","title":"Changed","text":"
    • XDG_CONFIG_HOME and XDG_CACHE_HOME compliance by @chawyehsu in #1050
    • Build pixi for windows arm by @baszalmstra in #1053
    • Platform literals by @baszalmstra in #1054
• Cli docs: --user is actually --username by @henryiii in #1063
    • Fixed error in auth example (CLI docs) by @ytausch in #1076
    "},{"location":"CHANGELOG/#documentation_16","title":"Documentation","text":"
    • Add lockfile update description in preparation for pixi update by @ruben-arts in #1073
    • zsh may be used for installation on macOS by @pya in #1091
    • Fix typo in pixi auth documentation by @ytausch in #1076
    • Add rstudio to the IDE integration docs by @wolfv in #1144
    "},{"location":"CHANGELOG/#fixed_22","title":"Fixed","text":"
    • Test failure on riscv64 by @hack3ric in #1045
    • Validation test was testing on a wrong pixi.toml by @ruben-arts in #1056
    • Pixi list shows path and editable by @baszalmstra in #1100
    • Docs ci by @ruben-arts in #1074
    • Add error for unsupported pypi dependencies by @baszalmstra in #1052
    • Interactively delete environment when it was relocated by @baszalmstra in #1102
    • Allow solving for different platforms by @baszalmstra in #1101
    • Don't allow extra keys in pypi requirements by @baszalmstra in #1104
    • Solve when moving dependency from conda to pypi by @baszalmstra in #1099
    "},{"location":"CHANGELOG/#new-contributors_16","title":"New Contributors","text":"
    • @pya made their first contribution in #1091
    • @ytausch made their first contribution in #1076
    • @hack3ric made their first contribution in #1045
    • @olivier-lacroix made their first contribution in #999
    • @henryiii made their first contribution in #1063

    Full commit history

    "},{"location":"CHANGELOG/#0171-2024-03-21","title":"[0.17.1] - 2024-03-21","text":""},{"location":"CHANGELOG/#highlights_17","title":"\u2728 Highlights","text":"

    A quick bug-fix release for pixi list.

    "},{"location":"CHANGELOG/#documentation_17","title":"Documentation","text":"
    • Fix typo by @pavelzw in #1028
    "},{"location":"CHANGELOG/#fixed_23","title":"Fixed","text":"
    • Remove the need for a python interpreter in pixi list by @baszalmstra in #1033
    "},{"location":"CHANGELOG/#0170-2024-03-19","title":"[0.17.0] - 2024-03-19","text":""},{"location":"CHANGELOG/#highlights_18","title":"\u2728 Highlights","text":"
    • This release greatly improves pixi global commands, thanks to @chawyehsu!
    • We now support global (or local) configuration for pixi's own behavior, including mirrors and OCI registries.
    • We support channel mirrors for corporate environments!
    • Faster task execution thanks to caching \ud83d\ude80 Tasks that already executed successfully can be skipped based on the hash of the inputs and outputs.
    • PyCharm and GitHub Actions integration thanks to @pavelzw \u2013 read more about it in the docs!
    "},{"location":"CHANGELOG/#added_13","title":"Added","text":"
    • Add citation file by @ruben-arts in #908
    • Add a pixi badge by @ruben-arts in #961
    • Add deserialization of pypi source dependencies from toml by @ruben-arts and @wolf in #895 #984
    • Implement mirror and OCI settings by @wolfv in #988
    • Implement inputs and outputs hash based task skipping by @wolfv in #933
    "},{"location":"CHANGELOG/#changed_17","title":"Changed","text":"
    • Refined global upgrade commands by @chawyehsu in #948
    • Global upgrade supports matchspec by @chawyehsu in #962
    • Improve pixi search with platform selection and making limit optional by @wolfv in #979
    • Implement global config options by @wolfv in #960 #1015 #1019
    • Update auth to use rattler cli by @kassoulait by @ruben-arts in #986
    "},{"location":"CHANGELOG/#documentation_18","title":"Documentation","text":"
    • Remove cache: true from setup-pixi by @pavelzw in #950
    • Add GitHub Actions documentation by @pavelzw in #955
    • Add PyCharm documentation by @pavelzw in #974
    • Mention watch_file in direnv usage by @pavelzw in #983
    • Add tip to help users when no PROFILE file exists by @ruben-arts in #991
    • Move yaml comments into mkdocs annotations by @pavelzw in #1003
    • Fix --env and extend actions examples by @ruben-arts in #1005
    • Add Wflow to projects built with pixi by @Hofer-Julian in #1006
    • Removed linenums to avoid buggy visualization by @ruben-arts in #1002
    • Fix typos by @pavelzw in #1016
    "},{"location":"CHANGELOG/#fixed_24","title":"Fixed","text":"
    • Pypi dependencies not being removed by @tdejager in #952
    • Permissions for lint pr by @ruben-arts in #852
    • Install Windows executable with install.sh in Git Bash by @jdblischak in #966
    • Proper scanning of the conda-meta folder for json entries by @wolfv in #971
    • Global shim scripts for Windows by @wolfv in #975
    • Correct fish prompt by @wolfv in #981
    • Prefix_file rename by @ruben-arts in #959
    • Conda transitive dependencies of pypi packages are properly extracted by @baszalmstra in #967
    • Make tests more deterministic and use single * for glob expansion by @wolfv in #987
    • Create conda-meta/history file by @pavelzw in #995
    • Pypi dependency parsing was too lenient by @wolfv in #984
    • Add reactivation of the environment in pixi shell by @wolfv in #982
    • Add tool to strict json schema by @ruben-arts in #969
    "},{"location":"CHANGELOG/#new-contributors_17","title":"New Contributors","text":"
    • @jdblischak made their first contribution in #966
    • @kassoulait made their first contribution in #986

    Full commit history

    "},{"location":"CHANGELOG/#0161-2024-03-11","title":"[0.16.1] - 2024-03-11","text":""},{"location":"CHANGELOG/#fixed_25","title":"Fixed","text":"
    • Parse lockfile matchspecs lenient, fixing bug introduced in 0.16.0 by @ruben-arts in #951

    Full commit history

    "},{"location":"CHANGELOG/#0160-2024-03-09","title":"[0.16.0] - 2024-03-09","text":""},{"location":"CHANGELOG/#highlights_19","title":"\u2728 Highlights","text":"
    • This release removes rip and adds uv as the PyPI resolver and installer.
    "},{"location":"CHANGELOG/#added_14","title":"Added","text":"
    • Add tcsh install support by @obust in #898
    • Add user agent to pixi http client by @baszalmstra in #892
    • Add a schema for the pixi.toml by @ruben-arts in #936
    "},{"location":"CHANGELOG/#changed_18","title":"Changed","text":"
    • Switch from rip to uv by @tdejager in #863
    • Move uv options into context by @tdejager in #911
    • Add Deltares projects to Community.md by @Hofer-Julian in #920
    • Upgrade to uv 0.1.16, updated for changes in the API by @tdejager in #935
    "},{"location":"CHANGELOG/#fixed_26","title":"Fixed","text":"
    • Made the uv re-install logic a bit more clear by @tdejager in #894
    • Avoid duplicate pip dependency while importing environment.yaml by @sumanth-manchala in #890
    • Handle custom channels when importing from env yaml by @sumanth-manchala in #901
    • Pip editable installs getting uninstalled by @renan-r-santos in #902
    • Highlight pypi deps in pixi list by @sumanth-manchala in #907
    • Default to the default environment if possible by @ruben-arts in #921
    • Switching channels by @baszalmstra in #923
    • Use correct name of the channel on adding by @ruben-arts in #928
    • Turn back on jlap for faster repodata fetching by @ruben-arts in #937
    • Remove dists site-packages's when python interpreter changes by @tdejager in #896
    "},{"location":"CHANGELOG/#new-contributors_18","title":"New Contributors","text":"
    • @obust made their first contribution in #898
    • @renan-r-santos made their first contribution in #902

    Full Commit history

    "},{"location":"CHANGELOG/#0152-2024-02-29","title":"[0.15.2] - 2024-02-29","text":""},{"location":"CHANGELOG/#changed_19","title":"Changed","text":"
    • Add more info to a failure of activation by @ruben-arts in #873
    "},{"location":"CHANGELOG/#fixed_27","title":"Fixed","text":"
    • Improve global list UX when there is no global env dir created by @sumanth-manchala in #865
    • Update rattler to v0.19.0 by @AliPiccioniQC in #885
    • Error on pixi run if platform is not supported by @ruben-arts in #878
    "},{"location":"CHANGELOG/#new-contributors_19","title":"New Contributors","text":"
    • @sumanth-manchala made their first contribution in #865
    • @AliPiccioniQC made their first contribution in #885

    Full commit history

    "},{"location":"CHANGELOG/#0151-2024-02-26","title":"[0.15.1] - 2024-02-26","text":""},{"location":"CHANGELOG/#added_15","title":"Added","text":"
    • Add prefix to project info json output by @baszalmstra in #859
    "},{"location":"CHANGELOG/#changed_20","title":"Changed","text":"
    • New pixi global list display format by @chawyehsu in #723
    • Add direnv usage by @pavelzw in #845
    • Add docker example by @pavelzw in #846
    • Install/remove multiple packages globally by @chawyehsu in #854
    "},{"location":"CHANGELOG/#fixed_28","title":"Fixed","text":"
    • Prefix file in init --import by @ruben-arts in #855
    • Environment and feature names in pixi info --json by @baszalmstra in #857

    Full commit history

    "},{"location":"CHANGELOG/#0150-2024-02-23","title":"[0.15.0] - 2024-02-23","text":""},{"location":"CHANGELOG/#highlights_20","title":"\u2728 Highlights","text":"
    • [pypi-dependencies] now get built in the created environment, so they use the conda-installed build tools.
    • pixi init --import env.yml to import an existing conda environment file.
    • [target.unix.dependencies] to specify dependencies for unix systems instead of per platform.

    [!WARNING] This version's build failed; use v0.15.1 instead.

    "},{"location":"CHANGELOG/#added_16","title":"Added","text":"
    • pass environment variables during pypi resolution and install (#818)
    • skip micromamba style selector lines and warn about them (#830)
    • add import yml flag (#792)
    • check duplicate dependencies (#717)
    • (ci) check conventional PR title (#820)
    • add --feature to pixi add (#803)
    • add windows, macos, linux and unix to targets (#832)
    "},{"location":"CHANGELOG/#fixed_29","title":"Fixed","text":"
    • cache and retry pypi name mapping (#839)
    • check duplicates while adding dependencies (#829)
    • logic PIXI_NO_PATH_UPDATE variable (#822)
    "},{"location":"CHANGELOG/#other","title":"Other","text":"
    • add mike to the documentation and update looks (#809)
    • add instructions for installing on Alpine Linux (#828)
    • more error reporting in self-update (#823)
    • disabled jlap for now (#836)

    Full commit history

    "},{"location":"CHANGELOG/#0140-2024-02-15","title":"[0.14.0] - 2024-02-15","text":""},{"location":"CHANGELOG/#highlights_21","title":"\u2728 Highlights","text":"

    Now, solve-groups can be used in [environments] to ensure dependency alignment across different environments without simultaneous installation. This feature is particularly beneficial for managing identical dependencies in test and production environments. Example configuration:

    [environments]\ntest = { features = [\"prod\", \"test\"], solve-groups = [\"group1\"] }\nprod = { features = [\"prod\"], solve-groups = [\"group1\"] }\n
    This setup simplifies managing dependencies that must be consistent across test and production.

    "},{"location":"CHANGELOG/#added_17","title":"Added","text":"
    • Add index field to pypi requirements by @vlad-ivanov-name in #784
    • Add -f/--feature to the pixi project platform command by @ruben-arts in #785
    • Warn user when unused features are defined by @ruben-arts in #762
    • Disambiguate tasks interactive by @baszalmstra in #766
    • Solve groups for conda by @baszalmstra in #783
    • Pypi solve groups by @baszalmstra in #802
    • Enable reflinks by @baszalmstra in #729
    "},{"location":"CHANGELOG/#changed_21","title":"Changed","text":"
    • Add environment name to the progress by @ruben-arts in #788
    • Set color scheme by @ruben-arts in #773
    • Update lock on pixi list by @ruben-arts in #775
    • Use default env if task available in it. by @ruben-arts in #772
    • Color environment name in install step by @ruben-arts in #795
    "},{"location":"CHANGELOG/#fixed_30","title":"Fixed","text":"
    • Running cuda env and using those tasks. by @ruben-arts in #764
    • Make svg a gif by @ruben-arts in #782
    • Fmt by @ruben-arts
    • Check for correct platform in task env creation by @ruben-arts in #759
    • Remove using source name by @ruben-arts in #765
    • Auto-guessing of the shell in the shell-hook by @ruben-arts in https://github.com/prefix-dev/pixi/pull/811
    • sdist with direct references by @nichmor in https://github.com/prefix-dev/pixi/pull/813
    "},{"location":"CHANGELOG/#miscellaneous","title":"Miscellaneous","text":"
    • Add slim-trees to community projects by @pavelzw in #760
    • Add test to default env in polarify example
    • Add multiple machine example by @ruben-arts in #757
    • Add more documentation on environments by @ruben-arts in #790
    • Update rip and rattler by @wolfv in #798
    • Rattler 0.18.0 by @baszalmstra in #805
    • Rip 0.8.0 by @nichmor in #806
    • Fix authentication path by @pavelzw in #796
    • Initial addition of integration test by @ruben-arts in https://github.com/prefix-dev/pixi/pull/804
    "},{"location":"CHANGELOG/#new-contributors_20","title":"New Contributors","text":"
    • @vlad-ivanov-name made their first contribution in #784
    • @nichmor made their first contribution in #806

    Full commit history

    "},{"location":"CHANGELOG/#0130-2024-02-01","title":"[0.13.0] - 2024-02-01","text":""},{"location":"CHANGELOG/#highlights_22","title":"\u2728 Highlights","text":"

    This release is packed with features! The major ones are: - We added support for multiple environments. :tada: Check out the documentation - We added support for sdist installation, which greatly increases the number of packages that can be installed from PyPI. :rocket:

    [!IMPORTANT]

    Renaming of PIXI_PACKAGE_* variables:

    PIXI_PACKAGE_ROOT -> PIXI_PROJECT_ROOT\nPIXI_PACKAGE_NAME ->  PIXI_PROJECT_NAME\nPIXI_PACKAGE_MANIFEST -> PIXI_PROJECT_MANIFEST\nPIXI_PACKAGE_VERSION -> PIXI_PROJECT_VERSION\nPIXI_PACKAGE_PLATFORMS -> PIXI_ENVIRONMENT_PLATFORMS\n
    Check documentation here: https://pixi.sh/environment/

    [!IMPORTANT]

    The .pixi/env/ folder has been moved to accommodate multiple environments. If you only have one environment it is now named .pixi/envs/default.

    "},{"location":"CHANGELOG/#added_18","title":"Added","text":"
    • Add support for multiple environment:
      • Update to rattler lock v4 by @baszalmstra in #698
      • Multi-env installation and usage by @baszalmstra in #721
      • Update all environments in the lock-file when requesting an environment by @baszalmstra in #711
      • Run tasks in the env they are defined by @baszalmstra in #731
      • polarify use-case as an example by @ruben-arts in #735
      • Make environment name parsing strict by @ruben-arts in #673
      • Use named environments (only \"default\" for now) by @ruben-arts in #674
      • Use task graph instead of traversal by @baszalmstra in #725
      • Multi env documentation by @ruben-arts in #703
      • pixi info -e/--environment option by @ruben-arts in #676
      • pixi channel add -f/--feature option by @ruben-arts in #700
      • pixi channel remove -f/--feature option by @ruben-arts in #706
      • pixi remove -f/--feature option by @ruben-arts in #680
      • pixi task list -e/--environment option by @ruben-arts in #694
      • pixi task remove -f/--feature option by @ruben-arts in #694
      • pixi install -e/--environment option by @ruben-arts in #722
    • Support for sdists in pypi-dependencies by @tdejager in #664
    • Add pre-release support to pypi-dependencies by @tdejager in #716
    • Support adding dependencies for project's unsupported platforms by @orhun in #668
    • Add pixi list command by @hadim in #665
    • Add pixi shell-hook command by @orhun in #672#679 #684
    • Use env variable to configure locked, frozen and color by @hadim in #726
    • pixi self-update by @hadim in #675
    • Add PIXI_NO_PATH_UPDATE for PATH update suppression by @chawyehsu in #692
    • Set the cache directory by @ruben-arts in #683
    "},{"location":"CHANGELOG/#changed_22","title":"Changed","text":"
    • Use consistent naming for tests module by @orhun in #678
    • Install pixi and add to the path in docker example by @ruben-arts in #743
    • Simplify the deserializer of PyPiRequirement by @orhun in #744
    • Use tabwriter instead of comfy_table by @baszalmstra in #745
    • Document environment variables by @ruben-arts in #746
    "},{"location":"CHANGELOG/#fixed_31","title":"Fixed","text":"
    • Quote part of the task that has brackets ([ or ]) by @JafarAbdi in #677
    • Package clobber and __pycache__ removal issues by @wolfv in #573
    • Non-global reqwest client by @tdejager in #693
    • Fix broken pipe error during search by @orhun in #699
    • Make pixi search result correct by @chawyehsu in #713
    • Allow the tasks for all platforms to be shown in pixi info by @ruben-arts in #728
    • Flaky tests while installing pypi dependencies by @baszalmstra in #732
    • Linux install script by @mariusvniekerk in #737
    • Download wheels in parallel to avoid deadlock by @baszalmstra in #752
    "},{"location":"CHANGELOG/#new-contributors_21","title":"New Contributors","text":"
    • @JafarAbdi made their first contribution in #677
    • @mariusvniekerk made their first contribution in #737

    Full commit history

    "},{"location":"CHANGELOG/#0120-2024-01-15","title":"[0.12.0] - 2024-01-15","text":""},{"location":"CHANGELOG/#highlights_23","title":"\u2728 Highlights","text":"
    • Some great community contributions: pixi global upgrade, pixi project version commands, and a PIXI_HOME variable.
    • A ton of refactor work to prepare for the multi-environment feature.
      • Note that no extra environments are created yet, but you can already specify them in the pixi.toml file.
      • Next we'll build the actual environments.
    "},{"location":"CHANGELOG/#added_19","title":"Added","text":"
    • Add global upgrade command to pixi by @trueleo in #614
    • Add configurable PIXI_HOME by @chawyehsu in #627
    • Add --pypi option to pixi remove by @marcelotrevisani in https://github.com/prefix-dev/pixi/pull/602
    • PrioritizedChannels to specify channel priority by @ruben-arts in https://github.com/prefix-dev/pixi/pull/658
    • Add project version {major,minor,patch} CLIs by @hadim in https://github.com/prefix-dev/pixi/pull/633
    "},{"location":"CHANGELOG/#changed_23","title":"Changed","text":"
    • Refactored project model using targets, features and environments by @baszalmstra in https://github.com/prefix-dev/pixi/pull/616
    • Move code from Project to Environment by @baszalmstra in #630
    • Refactored system-requirements from Environment by @baszalmstra in #632
    • Extract activation.scripts into Environment by @baszalmstra in #659
    • Extract pypi-dependencies from Environment by @baszalmstra in https://github.com/prefix-dev/pixi/pull/656
    • De-serialization of features and environments by @ruben-arts in https://github.com/prefix-dev/pixi/pull/636
    "},{"location":"CHANGELOG/#fixed_32","title":"Fixed","text":"
    • Make install.sh also work with wget if curl is not available by @wolfv in #644
    • Use source build for rattler by @ruben-arts
    • Check for pypi-dependencies before amending the pypi purls by @ruben-arts in #661
    • Don't allow the use of reflinks by @ruben-arts in #662
    "},{"location":"CHANGELOG/#removed_1","title":"Removed","text":"
    • Remove windows and unix system requirements by @baszalmstra in #635
    "},{"location":"CHANGELOG/#documentation_19","title":"Documentation","text":"
    • Document the channel logic by @ruben-arts in https://github.com/prefix-dev/pixi/pull/610
    • Update the instructions for installing on Arch Linux by @orhun in https://github.com/prefix-dev/pixi/pull/653
    • Update Community.md by @KarelZe in https://github.com/prefix-dev/pixi/pull/654
    • Replace contributions.md with contributing.md and make it more standardized by @ruben-arts in https://github.com/prefix-dev/pixi/pull/649
    • Remove windows and unix system requirements by @baszalmstra in https://github.com/prefix-dev/pixi/pull/635
    • Add CODE_OF_CONDUCT.md by @ruben-arts in https://github.com/prefix-dev/pixi/pull/648
    • Removed remaining .ps1 references by @bahugo in https://github.com/prefix-dev/pixi/pull/643
    "},{"location":"CHANGELOG/#new-contributors_22","title":"New Contributors","text":"
    • @marcelotrevisani made their first contribution in https://github.com/prefix-dev/pixi/pull/602
    • @trueleo made their first contribution in https://github.com/prefix-dev/pixi/pull/614
    • @bahugo made their first contribution in https://github.com/prefix-dev/pixi/pull/643
    • @KarelZe made their first contribution in https://github.com/prefix-dev/pixi/pull/654

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.11.0...v0.12.0

    "},{"location":"CHANGELOG/#0111-2024-01-06","title":"[0.11.1] - 2024-01-06","text":""},{"location":"CHANGELOG/#fixed_33","title":"Fixed","text":"
    • Upgrading rattler to fix pixi auth in #642
    "},{"location":"CHANGELOG/#0110-2024-01-05","title":"[0.11.0] - 2024-01-05","text":""},{"location":"CHANGELOG/#highlights_24","title":"\u2728 Highlights","text":"
    • Lots of important work and preparations for the PyPI sdist and multi-environment features
    • Lots of new contributors that help pixi improve!
    "},{"location":"CHANGELOG/#added_20","title":"Added","text":"
    • Add new commands for pixi project {version|channel|platform|description} by @hadim in #579
    • Add dependabot.yml by @pavelzw in #606
    "},{"location":"CHANGELOG/#changed_24","title":"Changed","text":"
    • winget-releaser gets correct identifier by @ruben-arts in #561
    • Task run code by @baszalmstra in #556
    • No ps1 in activation scripts by @ruben-arts in #563
    • Changed some names for clarity by @tdejager in #568
    • Change font and make it dark mode by @ruben-arts in #576
    • Moved pypi installation into its own module by @tdejager in #589
    • Move alpha to beta feature and toggle it off with env var by @ruben-arts in #604
    • Improve UX activation scripts by @ruben-arts in #560
    • Add sanity check by @tdejager in #569
    • Refactor manifest by @ruben-arts in #572
    • Improve search by @Johnwillliam in #578
    • Split pypi and conda solve steps by @tdejager in #601
    "},{"location":"CHANGELOG/#fixed_34","title":"Fixed","text":"
    • Save file after lockfile is correctly updated by @ruben-arts in #555
    • Limit the number of concurrent solves by @baszalmstra in #571
    • Use project virtual packages in add command by @msegado in #609
    • Improved mapped dependency by @ruben-arts in #574
    "},{"location":"CHANGELOG/#documentation_20","title":"Documentation","text":"
    • Change font and make it dark mode by @ruben-arts in #576
    • typo: no ps1 in activation scripts by @ruben-arts in #563
    • Document adding CUDA to system-requirements by @ruben-arts in #595
    • Multi env proposal documentation by @ruben-arts in #584
    • Fix multiple typos in configuration.md by @SeaOtocinclus in #608
    • Add multiple machines from one project example by @pavelzw in #605
    "},{"location":"CHANGELOG/#new-contributors_23","title":"New Contributors","text":"
    • @hadim made their first contribution in #579
    • @msegado made their first contribution in #609
    • @Johnwillliam made their first contribution in #578
    • @SeaOtocinclus made their first contribution in #608

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.10.0...v0.11.0

    "},{"location":"CHANGELOG/#0100-2023-12-8","title":"[0.10.0] - 2023-12-8","text":""},{"location":"CHANGELOG/#highlights_25","title":"Highlights","text":"
    • Better pypi-dependencies support; you can now install even more PyPI packages.
    • pixi add --pypi command to add a pypi package to your project.
    "},{"location":"CHANGELOG/#added_21","title":"Added","text":"
    • Use range (>=1.2.3, <1.3) when adding requirement, instead of 1.2.3.* by @baszalmstra in https://github.com/prefix-dev/pixi/pull/536
    • Update rip to fix by @tdejager in https://github.com/prefix-dev/pixi/pull/543
      • Better Bytecode compilation (.pyc) support by @baszalmstra
      • Recognize .data directory headers by @baszalmstra
    • Also print arguments given to a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/545
    • Add pixi add --pypi command by @ruben-arts in https://github.com/prefix-dev/pixi/pull/539
    "},{"location":"CHANGELOG/#fixed_35","title":"Fixed","text":"
    • space in global install path by @ruben-arts in https://github.com/prefix-dev/pixi/pull/513
    • Glibc version/family parsing by @baszalmstra in https://github.com/prefix-dev/pixi/pull/535
    • Use build and host specs while getting the best version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/538
    "},{"location":"CHANGELOG/#miscellaneous_1","title":"Miscellaneous","text":"
    • docs: add update manual by @ruben-arts in https://github.com/prefix-dev/pixi/pull/521
    • add lightgbm demo by @partrita in https://github.com/prefix-dev/pixi/pull/492
    • Update documentation link by @williamjamir in https://github.com/prefix-dev/pixi/pull/525
    • Update Community.md by @jiaxiyang in https://github.com/prefix-dev/pixi/pull/527
    • Add winget releaser by @ruben-arts in https://github.com/prefix-dev/pixi/pull/547
    • Custom rerun-sdk example, force driven graph of pixi.lock by @ruben-arts in https://github.com/prefix-dev/pixi/pull/548
    • Better document pypi part by @ruben-arts in https://github.com/prefix-dev/pixi/pull/546
    "},{"location":"CHANGELOG/#new-contributors_24","title":"New Contributors","text":"
    • @partrita made their first contribution in https://github.com/prefix-dev/pixi/pull/492
    • @williamjamir made their first contribution in https://github.com/prefix-dev/pixi/pull/525
    • @jiaxiyang made their first contribution in https://github.com/prefix-dev/pixi/pull/527

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.1...v0.10.0

    "},{"location":"CHANGELOG/#091-2023-11-29","title":"[0.9.1] - 2023-11-29","text":""},{"location":"CHANGELOG/#highlights_26","title":"Highlights","text":"
    • PyPI's scripts are now fixed. For example: https://github.com/prefix-dev/pixi/issues/516
    "},{"location":"CHANGELOG/#fixed_36","title":"Fixed","text":"
    • Remove attr (unused) and update all dependencies by @wolfv in https://github.com/prefix-dev/pixi/pull/510
    • Remove empty folders on python uninstall by @baszalmstra in https://github.com/prefix-dev/pixi/pull/512
    • Bump rip to add scripts by @baszalmstra in https://github.com/prefix-dev/pixi/pull/517

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.0...v0.9.1

    "},{"location":"CHANGELOG/#090-2023-11-28","title":"[0.9.0] - 2023-11-28","text":""},{"location":"CHANGELOG/#highlights_27","title":"Highlights","text":"
    • You can now run pixi remove or pixi rm to remove a package from the environment
    • Fixed the pip install -e issue that was introduced by release v0.8.0: https://github.com/prefix-dev/pixi/issues/507
    "},{"location":"CHANGELOG/#added_22","title":"Added","text":"
    • pixi remove command by @Wackyator in https://github.com/prefix-dev/pixi/pull/483
    "},{"location":"CHANGELOG/#fixed_37","title":"Fixed","text":"
    • Install entrypoints for [pypi-dependencies] @baszalmstra in https://github.com/prefix-dev/pixi/pull/508
    • Only uninstall pixi installed packages by @baszalmstra in https://github.com/prefix-dev/pixi/pull/509

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.8.0...v0.9.0

    "},{"location":"CHANGELOG/#080-2023-11-27","title":"[0.8.0] - 2023-11-27","text":""},{"location":"CHANGELOG/#highlights_28","title":"Highlights","text":"
    • \ud83c\udf89\ud83d\udc0d[pypi-dependencies] ALPHA RELEASE\ud83d\udc0d\ud83c\udf89, you can now add PyPI dependencies to your pixi project.
    • UX of pixi run has been improved with better errors and showing what task is run.

    [!NOTE] [pypi-dependencies] support is still incomplete; missing functionality is listed here: https://github.com/orgs/prefix-dev/projects/6. Our intent is not to have 100% feature parity with pip; our goal is that you only need pixi for conda and PyPI packages alike.

    "},{"location":"CHANGELOG/#added_23","title":"Added","text":"
    • Bump rattler @ruben-arts in https://github.com/prefix-dev/pixi/pull/496
    • Implement lock-file satisfiability with pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/494
    • List pixi tasks when command not found is returned by @ruben-arts in https://github.com/prefix-dev/pixi/pull/488
    • Show which command is run as a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/491 && https://github.com/prefix-dev/pixi/pull/493
    • Add progress info to conda install by @baszalmstra in https://github.com/prefix-dev/pixi/pull/470
    • Install pypi dependencies (alpha) by @baszalmstra in https://github.com/prefix-dev/pixi/pull/452
    "},{"location":"CHANGELOG/#fixed_38","title":"Fixed","text":"
    • Add install scripts to pixi.sh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/458 && https://github.com/prefix-dev/pixi/pull/459 && https://github.com/prefix-dev/pixi/pull/460
    • Fix RECORD not found issue by @baszalmstra in https://github.com/prefix-dev/pixi/pull/495
    • Actually add to the .gitignore and give better errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/490
    • Support macOS for pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/478
    • Custom pypi-dependencies type by @ruben-arts in https://github.com/prefix-dev/pixi/pull/471
    • pypi-dependencies parsing errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/479
    • Progress issues by @baszalmstra in https://github.com/prefix-dev/pixi/pull/4
    "},{"location":"CHANGELOG/#miscellaneous_2","title":"Miscellaneous","text":"
    • Example: ctypes by @liquidcarbon in https://github.com/prefix-dev/pixi/pull/441
    • Mention the AUR package by @orhun in https://github.com/prefix-dev/pixi/pull/464
    • Update rerun example by @ruben-arts in https://github.com/prefix-dev/pixi/pull/489
    • Document pypi-dependencies by @ruben-arts in https://github.com/prefix-dev/pixi/pull/481
    • Ignore docs paths on rust workflow by @ruben-arts in https://github.com/prefix-dev/pixi/pull/482
    • Fix flaky tests, run serially by @baszalmstra in https://github.com/prefix-dev/pixi/pull/477
    "},{"location":"CHANGELOG/#new-contributors_25","title":"New Contributors","text":"
    • @liquidcarbon made their first contribution in https://github.com/prefix-dev/pixi/pull/441
    • @orhun made their first contribution in https://github.com/prefix-dev/pixi/pull/464

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.7.0...v0.8.0

    "},{"location":"CHANGELOG/#070-2023-11-14","title":"[0.7.0] - 2023-11-14","text":""},{"location":"CHANGELOG/#highlights_29","title":"Highlights","text":"
    • Channel priority: channels = [\"conda-forge\", \"pytorch\"]. Packages found in conda-forge will not be taken from pytorch.
    • Channel specific dependencies: pytorch = { version=\"*\", channel=\"pytorch\"}
    • Autocompletion on pixi run <TABTAB>
    • Moved all pixi documentation into this repo, try it with pixi run docs!
    • Lots of new contributors!
    "},{"location":"CHANGELOG/#added_24","title":"Added","text":"
    • Bump rattler to its newest version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/395 * Some notable changes: * Add channel priority (If a package is found in the first listed channel it will not be looked for in the other channels). * Fix JLAP using wrong hash. * Lockfile forward compatibility error.
    • Add nushell support by @wolfv in https://github.com/prefix-dev/pixi/pull/360
    • Autocomplete tasks on pixi run for bash and zsh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/390
    • Add prefix location file to avoid copy error by @ruben-arts in https://github.com/prefix-dev/pixi/pull/422
    • Channel specific dependencies python = { version = \"*\" channel=\"conda-forge\" } by @ruben-arts in https://github.com/prefix-dev/pixi/pull/439
    "},{"location":"CHANGELOG/#changed_25","title":"Changed","text":"
    • project.version as optional field in the pixi.toml by @ruben-arts in https://github.com/prefix-dev/pixi/pull/400
    "},{"location":"CHANGELOG/#fixed_39","title":"Fixed","text":"
    • Deny unknown fields in pixi.toml to help users find errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/396
    • install.sh to create dot file if not present by @humphd in https://github.com/prefix-dev/pixi/pull/408
    • Ensure order of repodata fetches by @baszalmstra in https://github.com/prefix-dev/pixi/pull/405
    • Strip Linux binaries by @baszalmstra in https://github.com/prefix-dev/pixi/pull/414
    • Sort task list by @ruben-arts in https://github.com/prefix-dev/pixi/pull/431
    • Fix global install path on windows by @ruben-arts in https://github.com/prefix-dev/pixi/pull/449
    • Let PIXI_BIN_PATH use backslashes by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/442
    • Print more informative error if created file is empty by @traversaro in https://github.com/prefix-dev/pixi/pull/447
    "},{"location":"CHANGELOG/#docs","title":"Docs","text":"
    • Move to mkdocs with all documentation by @ruben-arts in https://github.com/prefix-dev/pixi/pull/435
    • Fix typing errors by @FarukhS52 in https://github.com/prefix-dev/pixi/pull/426
    • Add social cards to the pages by @ruben-arts in https://github.com/prefix-dev/pixi/pull/445
    • Enhance README.md: Added Table of Contents, Grammar Improvements by @adarsh-jha-dev in https://github.com/prefix-dev/pixi/pull/421
    • Adding conda-auth to community examples by @travishathaway in https://github.com/prefix-dev/pixi/pull/433
    • Minor grammar correction by @tylere in https://github.com/prefix-dev/pixi/pull/406
    • Make capitalization of tab titles consistent by @tylere in https://github.com/prefix-dev/pixi/pull/407
    "},{"location":"CHANGELOG/#new-contributors_26","title":"New Contributors","text":"
    • @tylere made their first contribution in https://github.com/prefix-dev/pixi/pull/406
    • @humphd made their first contribution in https://github.com/prefix-dev/pixi/pull/408
    • @adarsh-jha-dev made their first contribution in https://github.com/prefix-dev/pixi/pull/421
    • @FarukhS52 made their first contribution in https://github.com/prefix-dev/pixi/pull/426
    • @travishathaway made their first contribution in https://github.com/prefix-dev/pixi/pull/433
    • @traversaro made their first contribution in https://github.com/prefix-dev/pixi/pull/447

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.6.0...v0.7.0

    "},{"location":"CHANGELOG/#060-2023-10-17","title":"[0.6.0] - 2023-10-17","text":""},{"location":"CHANGELOG/#highlights_30","title":"Highlights","text":"

    This release fixes some bugs and adds the --cwd option to the tasks.

    "},{"location":"CHANGELOG/#fixed_40","title":"Fixed","text":"
    • Improve shell prompts by @ruben-arts in https://github.com/prefix-dev/pixi/pull/385 https://github.com/prefix-dev/pixi/pull/388
    • Change --frozen logic to error when there is no lockfile by @ruben-arts in https://github.com/prefix-dev/pixi/pull/373
    • Don't remove the '.11' from 'python3.11' binary file name by @ruben-arts in https://github.com/prefix-dev/pixi/pull/366
    "},{"location":"CHANGELOG/#changed_26","title":"Changed","text":"
    • Update rerun example to v0.9.1 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/389
    "},{"location":"CHANGELOG/#added_25","title":"Added","text":"
    • Add the current working directory (--cwd) in pixi tasks by @ruben-arts in https://github.com/prefix-dev/pixi/pull/380

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.5.0...v0.6.0

    "},{"location":"CHANGELOG/#050-2023-10-03","title":"[0.5.0] - 2023-10-03","text":""},{"location":"CHANGELOG/#highlights_31","title":"Highlights","text":"

    We rebuilt pixi shell, fixing the issue where your rc file would overrule the environment activation.

    "},{"location":"CHANGELOG/#fixed_41","title":"Fixed","text":"
    • Change how shell works and make activation more robust by @wolfv in https://github.com/prefix-dev/pixi/pull/316
    • Documentation: use quotes in cli by @pavelzw in https://github.com/prefix-dev/pixi/pull/367
    "},{"location":"CHANGELOG/#added_26","title":"Added","text":"
    • Create or append to the .gitignore and .gitattributes files by @ruben-arts in https://github.com/prefix-dev/pixi/pull/359
    • Add --locked and --frozen to getting an up-to-date prefix by @ruben-arts in https://github.com/prefix-dev/pixi/pull/363
    • Documentation: improvement/update by @ruben-arts in https://github.com/prefix-dev/pixi/pull/355
    • Example: how to build a docker image using pixi by @ruben-arts in https://github.com/prefix-dev/pixi/pull/353 & https://github.com/prefix-dev/pixi/pull/365
    • Update to the newest rattler by @baszalmstra in https://github.com/prefix-dev/pixi/pull/361
    • Periodic cargo upgrade --all --incompatible by @wolfv in https://github.com/prefix-dev/pixi/pull/358

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.4.0...v0.5.0

    "},{"location":"CHANGELOG/#040-2023-09-22","title":"[0.4.0] - 2023-09-22","text":""},{"location":"CHANGELOG/#highlights_32","title":"Highlights","text":"

    This release adds the start of a new CLI command, pixi project, which will allow users to interact with the project configuration from the command line.

    "},{"location":"CHANGELOG/#fixed_42","title":"Fixed","text":"
    • Align with latest rattler version 0.9.0 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/350
    "},{"location":"CHANGELOG/#added_27","title":"Added","text":"
    • Add codespell (config, workflow) to catch typos + catch and fix some of those by @yarikoptic in https://github.com/prefix-dev/pixi/pull/329
    • remove atty and use stdlib by @wolfv in https://github.com/prefix-dev/pixi/pull/337
    • xtsci-dist to Community.md by @HaoZeke in https://github.com/prefix-dev/pixi/pull/339
    • ribasim to Community.md by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/340
    • LFortran to Community.md by @wolfv in https://github.com/prefix-dev/pixi/pull/341
    • Give tip to resolve virtual package issue by @ruben-arts in https://github.com/prefix-dev/pixi/pull/348
    • pixi project channel add subcommand by @baszalmstra and @ruben-arts in https://github.com/prefix-dev/pixi/pull/347
    "},{"location":"CHANGELOG/#new-contributors_27","title":"New Contributors","text":"
    • @yarikoptic made their first contribution in https://github.com/prefix-dev/pixi/pull/329
    • @HaoZeke made their first contribution in https://github.com/prefix-dev/pixi/pull/339

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.3.0...v0.4.0

    "},{"location":"CHANGELOG/#030-2023-09-11","title":"[0.3.0] - 2023-09-11","text":""},{"location":"CHANGELOG/#highlights_33","title":"Highlights","text":"

    This release fixes a lot of issues encountered by the community and includes some awesome community contributions, like the addition of pixi global list and pixi global remove.

    "},{"location":"CHANGELOG/#fixed_43","title":"Fixed","text":"
    • Properly detect Cuda on linux using our build binaries, by @baszalmstra (#290)
    • Package names are now case-insensitive, by @baszalmstra (#285)
    • Issue with starts-with and compatibility operator, by @tdejager (#296)
    • Lock files are now consistently sorted, by @baszalmstra (#295 & #307)
    • Improved xonsh detection and powershell env-var escaping, by @wolfv (#307)
    • system-requirements are properly filtered by platform, by @ruben-arts (#299)
    • Powershell completion install script, by @chawyehsu (#325)
    • Simplified and improved shell quoting, by @baszalmstra (#313)
    • Issue where platform specific subdirs were required, by @baszalmstra (#333)
    • thread 'tokio-runtime-worker' has overflowed its stack issue, by @baszalmstra (#28)
    "},{"location":"CHANGELOG/#added_28","title":"Added","text":"
    • Certificates from the OS certificate store are now used, by @baszalmstra (#310)
    • pixi global list and pixi global remove commands, by @cjfuller (#318)
    "},{"location":"CHANGELOG/#changed_27","title":"Changed","text":"
    • --manifest-path must point to a pixi.toml file, by @baszalmstra (#324)
    "},{"location":"CHANGELOG/#020-2023-08-22","title":"[0.2.0] - 2023-08-22","text":""},{"location":"CHANGELOG/#highlights_34","title":"Highlights","text":"
    • Added pixi search command to search for packages, by @Wackyator. (#244)
    • Added target specific tasks, eg. [target.win-64.tasks], by @ruben-arts. (#269)
    • Flaky install caused by the download of packages, by @baszalmstra. (#281)
    "},{"location":"CHANGELOG/#fixed_44","title":"Fixed","text":"
    • Install instructions, by @baszalmstra. (#258)
    • Typo in getting started, by @RaulPL. (#266)
    • Don't execute alias tasks, by @baszalmstra. (#274)
    "},{"location":"CHANGELOG/#added_29","title":"Added","text":"
    • Rerun example, by @ruben-arts. (#236)
    • Reduction of pixi's binary size, by @baszalmstra (#256)
    • Updated pixi banner, including webp file for faster loading, by @baszalmstra. (#257)
    • Set linguist attributes for pixi.lock automatically, by @spenserblack. (#265)
    • Contribution manual for pixi, by @ruben-arts. (#268)
    • GitHub issue templates, by @ruben-arts. (#271)
    • Links to prefix.dev in readme, by @tdejager. (#279)
    "},{"location":"CHANGELOG/#010-2023-08-11","title":"[0.1.0] - 2023-08-11","text":"

    As this is our first Semantic Versioning release, we'll change from the prototype to the developing phase, as semver describes. A 0.x release could be anything from a new major feature to a breaking change, while the 0.0.x releases will be bugfixes or small improvements.

    "},{"location":"CHANGELOG/#highlights_35","title":"Highlights","text":"
    • Update to the latest rattler version, by @baszalmstra. (#249)
    "},{"location":"CHANGELOG/#fixed_45","title":"Fixed","text":"
    • Only add shebang to activation scripts on unix platforms, by @baszalmstra. (#250)
    • Use official crates.io releases for all dependencies, by @baszalmstra. (#252)
    "},{"location":"CHANGELOG/#008-2023-08-01","title":"[0.0.8] - 2023-08-01","text":""},{"location":"CHANGELOG/#highlights_36","title":"Highlights","text":"
    • Much better error printing using miette, by @baszalmstra. (#211)
    • You can now use pixi on aarch64-linux, by @pavelzw. (#233)
    • Use the Rust port of libsolv as the default solver, by @ruben-arts. (#209)
    "},{"location":"CHANGELOG/#added_30","title":"Added","text":"
    • Add mention to condax in the docs, by @maresb. (#207)
    • Add brew installation instructions, by @wolfv. (#208)
    • Add activation.scripts to the pixi.toml to configure environment activation, by @ruben-arts. (#217)
    • Add pixi upload command to upload packages to prefix.dev, by @wolfv. (#127)
    • Add more metadata fields to the pixi.toml, by @wolfv. (#218)
    • Add pixi task list to show all tasks in the project, by @tdejager. (#228)
    • Add --color to configure the colors in the output, by @baszalmstra. (#243)
    • Examples, ROS2 Nav2, JupyterLab and QGIS, by @ruben-arts.
    "},{"location":"CHANGELOG/#fixed_46","title":"Fixed","text":"
    • Add trailing newline to pixi.toml and .gitignore, by @pavelzw. (#216)
    • Deny unknown fields and rename license-file in pixi.toml, by @wolfv. (#220)
    • Overwrite PS1 variable when going into a pixi shell, by @ruben-arts. (#201)
    "},{"location":"CHANGELOG/#changed_28","title":"Changed","text":"
    • Install environment when adding a dependency using pixi add, by @baszalmstra. (#213)
    • Improve and speedup CI, by @baszalmstra. (#241)
    "},{"location":"CHANGELOG/#007-2023-07-11","title":"[0.0.7] - 2023-07-11","text":""},{"location":"CHANGELOG/#highlights_37","title":"Highlights","text":"
    • Transitioned the run subcommand to use the deno_task_shell for improved cross-platform functionality. More details in the Deno Task Runner documentation.
    • Added an info subcommand to retrieve system-specific information understood by pixi.
    "},{"location":"CHANGELOG/#breaking-changes","title":"BREAKING CHANGES","text":"
    • [commands] in the pixi.toml is now called [tasks]. (#177)
    "},{"location":"CHANGELOG/#added_31","title":"Added","text":"
    • The pixi info command to get more system information by @wolfv in (#158)
    • Documentation on how to use the cli by @ruben-arts in (#160)
    • Use the deno_task_shell to execute commands in pixi run by @baszalmstra in (#173)
    • Use new solver backend from rattler by @baszalmstra in (#178)
    • The pixi command command to the cli by @tdejager in (#177)
    • Documentation on how to use the pixi auth command by @wolfv in (#183)
    • Use the newest rattler 0.6.0 by @baszalmstra in (#185)
    • Build with pixi section to the documentation by @tdejager in (#196)
    "},{"location":"CHANGELOG/#fixed_47","title":"Fixed","text":"
    • Running tasks sequentially when using depends_on by @tdejager in (#161)
    • Don't add PATH variable where it is already set by @baszalmstra in (#169)
    • Fix README by @Hofer-Julian in (#182)
    • Fix Ctrl+C signal in pixi run by @tdejager in (#190)
    • Add the correct license information to the lockfiles by @wolfv in (#191)
    "},{"location":"CHANGELOG/#006-2023-06-30","title":"[0.0.6] - 2023-06-30","text":""},{"location":"CHANGELOG/#highlights_38","title":"Highlights","text":"

    Improving reliability is important to us, so we added an integration testing framework; we can now test as close as possible to the CLI level using cargo.

    "},{"location":"CHANGELOG/#added_32","title":"Added","text":"
    • An integration test harness, to test as close as possible to the user experience but in rust. (#138, #140, #156)
    • Add different levels of dependencies in preparation for pixi build, allowing host- and build- dependencies (#149)
    "},{"location":"CHANGELOG/#fixed_48","title":"Fixed","text":"
    • Use correct folder name on pixi init (#144)
    • Fix windows cli installer (#152)
    • Fix global install path variable (#147)
    • Fix macOS binary notarization (#153)
    "},{"location":"CHANGELOG/#005-2023-06-26","title":"[0.0.5] - 2023-06-26","text":"

    Fixing Windows installer build in CI. (#145)

    "},{"location":"CHANGELOG/#004-2023-06-26","title":"[0.0.4] - 2023-06-26","text":""},{"location":"CHANGELOG/#highlights_39","title":"Highlights","text":"

    A new command, auth, which can be used to authenticate against the host of the package channels. A new command, shell, which can be used to start a shell in the pixi environment of a project. A refactor of the install command: it is changed to global install, and install now installs a pixi project if you run it in the project directory. Platform-specific dependencies using [target.linux-64.dependencies] instead of [dependencies] in the pixi.toml.

    Lots and lots of fixes and improvements to make things easier for the user, where bumping to the new version of rattler helped a lot.

    "},{"location":"CHANGELOG/#added_33","title":"Added","text":"
    • Platform specific dependencies and helpful error reporting on pixi.toml issues(#111)
    • Windows installer, which is very useful for users that want to start using pixi on windows. (#114)
    • shell command to use the pixi environment without pixi run. (#116)
    • Verbosity options using -v, -vv, -vvv (#118)
    • auth command to be able to login or logout of a host like repo.prefix.dev if you're using private channels. (#120)
    • New examples: CPP sdl: #121, Opencv camera calibration #125
    • Apple binary signing and notarization. (#137)
    "},{"location":"CHANGELOG/#changed_29","title":"Changed","text":"
    • pixi install moved to pixi global install and pixi install became the installation of a project using the pixi.toml (#124)
    "},{"location":"CHANGELOG/#fixed_49","title":"Fixed","text":"
    • pixi run uses default shell (#119)
    • pixi add command is fixed. (#132)
    • Community issues fixed: #70, #72, #90, #92, #94, #96
    "}]} \ No newline at end of file +{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Getting Started","text":"

    Pixi is a package management tool for developers. It allows the developer to install libraries and applications in a reproducible way. Use pixi cross-platform, on Windows, macOS, and Linux.

    "},{"location":"#installation","title":"Installation","text":"

    To install pixi you can run the following command in your terminal:

    Linux & macOSWindows
    curl -fsSL https://pixi.sh/install.sh | bash\n

    The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to ~/.pixi/bin. If this directory does not already exist, the script will create it.

    The script will also update your ~/.bash_profile to include ~/.pixi/bin in your PATH, allowing you to invoke the pixi command from anywhere.

    PowerShell:

    iwr -useb https://pixi.sh/install.ps1 | iex\n
    winget:
    winget install prefix-dev.pixi\n
    The above invocation will automatically download the latest version of pixi, extract it, and move the pixi binary to LocalAppData/pixi/bin. If this directory does not already exist, the script will create it.

    The command will also automatically add LocalAppData/pixi/bin to your PATH, allowing you to invoke pixi from anywhere.

    Tip

    You might need to restart your terminal or source your shell for the changes to take effect.
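    For example, with bash and the default profile file written by the install script (a minimal sketch assuming that setup), you can reload it with:

    source ~/.bash_profile\n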

    You can find more options for the installation script here.

    "},{"location":"#autocompletion","title":"Autocompletion","text":"

    To get autocompletion follow the instructions for your shell. Afterwards, restart the shell or source the shell config file.

    "},{"location":"#bash-default-on-most-linux-systems","title":"Bash (default on most Linux systems)","text":"
    echo 'eval \"$(pixi completion --shell bash)\"' >> ~/.bashrc\n
    "},{"location":"#zsh-default-on-macos","title":"Zsh (default on macOS)","text":"
    echo 'eval \"$(pixi completion --shell zsh)\"' >> ~/.zshrc\n
    "},{"location":"#powershell-pre-installed-on-all-windows-systems","title":"PowerShell (pre-installed on all Windows systems)","text":"
    Add-Content -Path $PROFILE -Value '(& pixi completion --shell powershell) | Out-String | Invoke-Expression'\n

    Failure because no profile file exists

    Make sure your profile file exists, otherwise create it with:

    New-Item -Path $PROFILE -ItemType File -Force\n

    "},{"location":"#fish","title":"Fish","text":"
    echo 'pixi completion --shell fish | source' > ~/.config/fish/completions/pixi.fish\n
    "},{"location":"#nushell","title":"Nushell","text":"

    Add the following to the end of your Nushell env file (find it by running $nu.env-path in Nushell):

    mkdir ~/.cache/pixi\npixi completion --shell nushell | save -f ~/.cache/pixi/completions.nu\n

    And add the following to the end of your Nushell configuration (find it by running $nu.config-path):

    use ~/.cache/pixi/completions.nu *\n
    "},{"location":"#elvish","title":"Elvish","text":"
    echo 'eval (pixi completion --shell elvish | slurp)' >> ~/.elvish/rc.elv\n
    "},{"location":"#alternative-installation-methods","title":"Alternative installation methods","text":"

    Although we recommend installing pixi through the above method, we also provide additional installation methods.

    "},{"location":"#homebrew","title":"Homebrew","text":"

    Pixi is available via Homebrew. To install pixi via Homebrew, simply run:

    brew install pixi\n
    "},{"location":"#windows-installer","title":"Windows installer","text":"

    We provide an MSI installer on our GitHub releases page. The installer will download pixi and add it to your PATH.

    "},{"location":"#install-from-source","title":"Install from source","text":"

    pixi is 100% written in Rust and can therefore be installed, built, and tested with cargo. To start using pixi from a source build, run:

    cargo install --locked --git https://github.com/prefix-dev/pixi.git pixi\n

    We don't publish to crates.io anymore, so you need to install it from the repository. The reason is that we depend on some unpublished crates, which prevents us from publishing to crates.io.

    or, when you want to make changes, use:

    cargo build\ncargo test\n

    If you have any issues building because of the dependency on rattler, check out its compile steps.

    "},{"location":"#installer-script-options","title":"Installer script options","text":"Linux & macOSWindows

    The installation script has several options that can be manipulated through environment variables.

    Variable Description Default Value PIXI_VERSION The version of pixi getting installed, can be used to up- or down-grade. latest PIXI_HOME The location of the binary folder. $HOME/.pixi PIXI_ARCH The architecture the pixi version was built for. uname -m PIXI_NO_PATH_UPDATE If set the $PATH will not be updated to add pixi to it. TMP_DIR The temporary directory the script uses to download to and unpack the binary from. /tmp

    For example, on Apple Silicon, you can force the installation of the x86 version:

    curl -fsSL https://pixi.sh/install.sh | PIXI_ARCH=x86_64 bash\n
    Or set the version
    curl -fsSL https://pixi.sh/install.sh | PIXI_VERSION=v0.18.0 bash\n

    The installation script has several options that can be manipulated through environment variables.

    Variable Environment variable Description Default Value PixiVersion PIXI_VERSION The version of pixi getting installed, can be used to up- or down-grade. latest PixiHome PIXI_HOME The location of the installation. $Env:USERPROFILE\\.pixi NoPathUpdate If set, the $PATH will not be updated to add pixi to it.

    For example, set the version using:

    iwr -useb https://pixi.sh/install.ps1 | iex -Args \"-PixiVersion v0.18.0\"\n
    "},{"location":"#update","title":"Update","text":"

    Updating is as simple as installing; rerunning the installation script gets you the latest version.

    pixi self-update\n
    Or get a specific pixi version using:
    pixi self-update --version x.y.z\n

    Note

    If you've used a package manager like brew, mamba, conda, paru, etc. to install pixi, it's preferable to use its built-in update mechanism, e.g. brew upgrade pixi.

    "},{"location":"#uninstall","title":"Uninstall","text":"

    To uninstall pixi from your system, simply remove the binary.

    Linux & macOSWindows
    rm ~/.pixi/bin/pixi\n
    $PIXI_BIN = \"$Env:LocalAppData\\pixi\\bin\\pixi\"; Remove-Item -Path $PIXI_BIN\n

    After this command, you can still use the tools you installed with pixi. To remove these as well, just remove the whole ~/.pixi directory and remove the directory from your PATH.
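    For example, on Linux or macOS a full cleanup could look like this (a minimal sketch assuming the default install location and that the PATH entry was added to ~/.bash_profile):

    rm -rf ~/.pixi  # also delete the line that adds ~/.pixi/bin to PATH from ~/.bash_profile\n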

    "},{"location":"Community/","title":"Community","text":"

    When you want to show your users and contributors that they can use pixi in your repo, you can use the following badge:

    [![Pixi Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/prefix-dev/pixi/main/assets/badge/v0.json)](https://pixi.sh)\n

    Customize your badge

    To further customize the look and feel of your badge, you can add &style=<custom-style> at the end of the URL. See the documentation on shields.io for more info.
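    For example, using the flat-square style (one of the styles listed by shields.io):

    [![Pixi Badge](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/prefix-dev/pixi/main/assets/badge/v0.json&style=flat-square)](https://pixi.sh)\n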

    "},{"location":"Community/#built-using-pixi","title":"Built using Pixi","text":"
    • Deltares:
      • Ribasim: Water resources model
      • Ribasim-NL: Ribasim water resources modeling in the Netherlands
      • iMOD Python: Make massive MODFLOW models
      • iMOD Coupler: Application for coupling hydrological kernels
      • iMOD Documentation: Documentation of the iMOD suite.
      • Xugrid: Xarray and unstructured grids
      • Numba celltree: Celltree data structure for searching for points, lines, boxes, and cells (convex polygons) in a two dimensional unstructured mesh.
      • QGIS-Tim: QGIS plugin and utilities for TimML multi-layer analytic element model
      • Pandamesh: From geodataframe to mesh
      • Wflow: Hydrological modeling framework
      • HydroMT: Automated and reproducible model building and analysis
      • HydroMT SFINCS: SFINCS plugin for HydroMT
      • PyFlwDir: Fast methods to work with hydro- and topography data in pure Python.
    • USGS:
      • MODFLOW 6: USGS modular hydrological model
    • QuantCo:
      • glum: High performance Python GLMs with all the features!
      • tabmat: Efficient matrix representations for working with tabular data
      • pixi-pack: A tool to pack and unpack conda environments created with pixi
      • polarify: Simplifying conditional Polars Expressions with Python \ud83d\udc0d \ud83d\udc3b\u200d\u2744\ufe0f
      • copier-template-python-open-source: Copier template for python projects using pixi
      • datajudge: Assessing whether data from database complies with reference information
      • ndonnx: ONNX-backed array library that is compliant with the Array API standard
      • multiregex: Quickly match many regexes against a string
      • slim-trees: Pickle your ML models more efficiently for deployment \ud83d\ude80
      • sqlcompyre: Compare SQL tables and databases
      • metalearners: MetaLearners for CATE estimation
      • tabulardelta: Simplify table comparisons
      • pydiverse.pipedag: A library for data pipeline orchestration optimizing high development iteration speed
      • pydiverse.transform: Pipe based dataframe manipulation library that can also transform data on SQL databases
    • pixi-pycharm: Conda shim for PyCharm that proxies pixi
    • pixi-diff-to-markdown: Generate markdown summaries from pixi update
    • jiaxiyang/cpp_project_guideline: Guide the way beginners make their c++ projects.
    • karelze/tclf: A python library for trade classification\u26a1
    • hex-inc/vegafusion: Serverside scaling of Vega and Altair visualizations in Rust, Python, WASM, and Java
    • pablovela5620/arxiv-researcher: Summarize PDFs and arXiv papers with Langchain and Nougat \ud83e\udd89
    • HaoZeke/xtsci-dist: Incremental scipy port using xtensor
    • jslorrma/keyrings.artifacts: Keyring backend that provides authentication for publishing or consuming Python packages to or from Azure Artifacts feeds within Azure DevOps
    • LFortran: A modern cross-platform Fortran compiler
    • Rerun: Rerun is an SDK for building time aware visualizations of multimodal data.
    • conda-auth: a conda plugin providing more secure authentication support to conda.
    • py-rattler: Build your own conda environment manager using the python wrapper of our Rattler backend.
    • array-api-extra: Extra array functions built on top of the Python array API standard.
    "},{"location":"FAQ/","title":"Frequently asked questions","text":""},{"location":"FAQ/#what-is-the-difference-with-conda-mamba-poetry-pip","title":"What is the difference with conda, mamba, poetry, pip","text":"Tool Installs python Builds packages Runs predefined tasks Has lock files builtin Fast Use without python Conda \u2705 \u274c \u274c \u274c \u274c \u274c Mamba \u2705 \u274c \u274c \u274c \u2705 \u2705 Pip \u274c \u2705 \u274c \u274c \u274c \u274c Pixi \u2705 \ud83d\udea7 \u2705 \u2705 \u2705 \u2705 Poetry \u274c \u2705 \u274c \u2705 \u274c \u274c"},{"location":"FAQ/#why-the-name-pixi","title":"Why the name pixi","text":"

    Starting with the name prefix, we iterated until we had a name that was easy to pronounce, spell and remember, and that wasn't already used by another CLI tool (unlike px, pex, pax, etc.). We think it sparks curiosity and fun; if you don't agree, I'm sorry, but you can always alias it to whatever you like.

    Linux & macOSWindows
    alias not_pixi=\"pixi\"\n

    PowerShell:

    New-Alias -Name not_pixi -Value pixi\n

    "},{"location":"FAQ/#where-is-pixi-build","title":"Where is pixi build","text":"

    TL;DR: It's coming we promise!

    pixi build is going to be the subcommand that can generate a conda package out of a pixi project. This requires a solid build tool, which we're creating with rattler-build; it will be used as a library in pixi.

    "},{"location":"basic_usage/","title":"Basic usage","text":"

    Ensure you've got pixi set up. If running pixi doesn't show the help text, see the getting started guide.

    pixi\n

    Initialize a new project and navigate to the project directory.

    pixi init pixi-hello-world\ncd pixi-hello-world\n

    Add the dependencies you would like to use.

    pixi add python\n

    Create a file named hello_world.py in the directory and paste the following code into the file.

    hello_world.py
    def hello():\n    print(\"Hello World, to the new revolution in package management.\")\n\nif __name__ == \"__main__\":\n    hello()\n

    Run the code inside the environment.

    pixi run python hello_world.py\n

    You can also put this run command in a task.

    pixi task add hello python hello_world.py\n

    After adding the task, you can run the task using its name.

    pixi run hello\n

    Use the shell command to activate the environment and start a new shell in there.

    pixi shell\npython\nexit()\n

    You've just learned the basic features of pixi:

    1. initializing a project
    2. adding a dependency
    3. adding a task and executing it
    4. running a program

    Feel free to play around with what you just learned like adding more tasks, dependencies or code.

    Happy coding!

    "},{"location":"basic_usage/#use-pixi-as-a-global-installation-tool","title":"Use pixi as a global installation tool","text":"

    Use pixi to install tools on your machine.

    Some notable examples:

    # Awesome cross shell prompt, huge tip when using pixi!\npixi global install starship\n\n# Want to try a different shell?\npixi global install fish\n\n# Install other prefix.dev tools\npixi global install rattler-build\n\n# Install a linter you want to use in multiple projects.\npixi global install ruff\n
    "},{"location":"basic_usage/#using-the-no-activation-option","title":"Using the --no-activation option","text":"

    When installing packages globally, you can use the --no-activation option to prevent the insertion of environment activation code into the installed executable scripts. This means that when you run the installed executable, it won't modify the PATH or CONDA_PREFIX environment variables beforehand.

    Example:

    # Install a package without inserting activation code\npixi global install ruff --no-activation\n

    This option can be useful in scenarios where you want more control over the environment activation or if you're using the installed executables in contexts where automatic activation might interfere with other processes.

    "},{"location":"basic_usage/#use-pixi-in-github-actions","title":"Use pixi in GitHub Actions","text":"

    You can use pixi in GitHub Actions to install dependencies and run commands. It supports automatic caching of your environments.

    - uses: prefix-dev/setup-pixi@v0.5.1\n- run: pixi run cowpy \"Thanks for using pixi\"\n

    See the GitHub Actions for more details.

    "},{"location":"vision/","title":"Vision","text":"

    We created pixi because we want to have a cargo/npm/yarn-like package management experience for conda. We really love what the conda packaging ecosystem achieves, but we think that the user experience can be improved a lot. Modern package managers like cargo have shown us how great a package manager can be. We want to bring that experience to the conda ecosystem.

    "},{"location":"vision/#pixi-values","title":"Pixi values","text":"

    We want to make pixi a great experience for everyone, so we have a few values that we want to uphold:

    1. Fast. We want to have a fast package manager, that is able to solve the environment in a few seconds.
    2. User Friendly. We want to have a package manager that puts user friendliness on the front-line, providing easy, accessible and intuitive commands that have the element of least surprise.
    3. Isolated Environment. We want to have isolated environments that are reproducible and easy to share. Ideally, it should run on all common platforms. The Conda packaging system provides an excellent base for this.
    4. Single Tool. We want to integrate most common uses when working on a development project with Pixi, so it should support at least dependency management, command management, building and uploading packages. You should not need to reach for another external tool for this.
    5. Fun. It should be fun to use pixi and not cause frustrations; you should not need to think about it a lot and it should generally just get out of your way.
    "},{"location":"vision/#conda","title":"Conda","text":"

    We are building on top of the conda packaging ecosystem, which means that we have a huge number of packages available for different platforms on conda-forge. We believe the conda packaging ecosystem provides a solid base to manage your dependencies. Conda-forge is community maintained and very open to contributions. It is widely used in data science, scientific computing, robotics and other fields, and has a proven track record.

    "},{"location":"vision/#target-languages","title":"Target languages","text":"

    Essentially, we are language agnostic; we are targeting any language that can be installed with conda, including C++, Python, Rust, Zig, etc. But we do believe the Python ecosystem can benefit from a good package manager that is based on conda, so we are trying to provide an alternative to existing solutions there. We also think we can provide a good solution for C++ projects, as there are a lot of libraries available on conda-forge today. Pixi also truly shines when using it for multi-language projects, e.g. a mix of C++ and Python, because we provide a nice way to build everything up to and including system-level packages.

    "},{"location":"advanced/authentication/","title":"Authenticate pixi with a server","text":"

    You can authenticate pixi with a server like prefix.dev, a private quetz instance or anaconda.org. Different servers use different authentication methods. In this documentation page, we detail how you can authenticate against the different servers and where the authentication information is stored.

    Usage: pixi auth login [OPTIONS] <HOST>\n\nArguments:\n  <HOST>  The host to authenticate with (e.g. repo.prefix.dev)\n\nOptions:\n      --token <TOKEN>              The token to use (for authentication with prefix.dev)\n      --username <USERNAME>        The username to use (for basic HTTP authentication)\n      --password <PASSWORD>        The password to use (for basic HTTP authentication)\n      --conda-token <CONDA_TOKEN>  The token to use on anaconda.org / quetz authentication\n  -v, --verbose...                 More output per occurrence\n  -q, --quiet...                   Less output per occurrence\n  -h, --help                       Print help\n

    The different options are \"token\", \"conda-token\" and \"username + password\".

    The token variant implements a standard \"Bearer Token\" authentication as is used on the prefix.dev platform. A Bearer Token is sent with every request as an additional header of the form Authentication: Bearer <TOKEN>.

    The conda-token option is used on anaconda.org and can be used with a quetz server. With this option, the token is sent as part of the URL following this scheme: conda.anaconda.org/t/<TOKEN>/conda-forge/linux-64/....

    The last option, username & password, is used for \"Basic HTTP Authentication\". This is the equivalent of adding http://user:password@myserver.com/.... This authentication method can be configured quite easily with a reverse proxy like NGINX or Apache and is thus commonly used in self-hosted systems.

    "},{"location":"advanced/authentication/#examples","title":"Examples","text":"

    Login to prefix.dev:

    pixi auth login prefix.dev --token pfx_jj8WDzvnuTHEGdAhwRZMC1Ag8gSto8\n

    Login to anaconda.org:

    pixi auth login anaconda.org --conda-token xy-72b914cc-c105-4ec7-a969-ab21d23480ed\n

    Login to a basic HTTP secured server:

    pixi auth login myserver.com --username user --password password\n
    "},{"location":"advanced/authentication/#where-does-pixi-store-the-authentication-information","title":"Where does pixi store the authentication information?","text":"

    The storage location for the authentication information is system-dependent. By default, pixi tries to use the keychain to store this sensitive information securely on your machine.

    On Windows, the credentials are stored in the \"credentials manager\". Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

    On macOS, the passwords are stored in the keychain. To access the password, you can use the Keychain Access program that comes pre-installed on macOS. Searching for rattler (the underlying library pixi uses) you should find any credentials stored by pixi (or other rattler-based programs).

    On Linux, one can use GNOME Keyring (or just Keyring) to access credentials that are securely stored by libsecret. Searching for rattler should list all the credentials stored by pixi and other rattler-based programs.

    "},{"location":"advanced/authentication/#fallback-storage","title":"Fallback storage","text":"

    If you run on a server with none of the aforementioned keychains available, then pixi falls back to store the credentials in an insecure JSON file. This JSON file is located at ~/.rattler/credentials.json and contains the credentials.

    "},{"location":"advanced/authentication/#override-the-authentication-storage","title":"Override the authentication storage","text":"

    You can use the RATTLER_AUTH_FILE environment variable to override the default location of the credentials file. When this environment variable is set, it provides the only source of authentication data that is used by pixi.

    E.g.

    export RATTLER_AUTH_FILE=$HOME/credentials.json\n# You can also specify the file in the command line\npixi global install --auth-file $HOME/credentials.json ...\n

    The JSON should follow the following format:

    {\n    \"*.prefix.dev\": {\n        \"BearerToken\": \"your_token\"\n    },\n    \"otherhost.com\": {\n        \"BasicHTTP\": {\n            \"username\": \"your_username\",\n            \"password\": \"your_password\"\n        }\n    },\n    \"conda.anaconda.org\": {\n        \"CondaToken\": \"your_token\"\n    }\n}\n

    Note: if you use a wildcard in the host, any subdomain will match (e.g. *.prefix.dev also matches repo.prefix.dev).

    Lastly you can set the authentication override file in the global configuration file.
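
    As a sketch, such an entry could look like the snippet below; the authentication-override-file key name and the config file location (for example ~/.pixi/config.toml) are assumptions here, so check the global configuration documentation for the exact details.

    # e.g. in your global pixi config file\nauthentication-override-file = \"/path/to/credentials.json\"\n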

    "},{"location":"advanced/authentication/#pypi-authentication","title":"PyPI authentication","text":"

    Currently, we support the following methods for authenticating against PyPI:

    1. keyring authentication.
    2. .netrc file authentication.

    We want to add more methods in the future, so if you have a specific method you would like to see, please let us know.

    "},{"location":"advanced/authentication/#keyring-authentication","title":"Keyring authentication","text":"

    Currently, pixi supports the uv method of authentication through the python keyring library. To enable this, use the CLI flag --pypi-keyring-provider which can either be set to subprocess (activated) or disabled.

    # From an existing pixi project\npixi install --pypi-keyring-provider subprocess\n

    This option can also be set in the global configuration file under pypi-config.
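
    A minimal sketch of that configuration (assuming the keyring-provider key under pypi-config; see the global configuration documentation for the exact key names):

    [pypi-config]\nkeyring-provider = \"subprocess\"\n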

    "},{"location":"advanced/authentication/#installing-keyring","title":"Installing keyring","text":"

    To install keyring you can use pixi global install:

    Either use:

    pixi global install keyring\n
    GCP and other backends

    The downside of this method is that, because you cannot inject into a pixi global environment just yet, installing different keyring backends is not possible; only the default keyring backend can be used. Give the issue a \ud83d\udc4d if you would like to see inject as a feature.

    Or alternatively, you can install keyring using pipx:

    # Install pipx if you haven't already\npixi global install pipx\npipx install keyring\n\n# For Google Artifact Registry, also install and initialize its keyring backend.\n# Inject this into the pipx environment\npipx inject keyring keyrings.google-artifactregistry-auth --index-url https://pypi.org/simple\ngcloud auth login\n
    "},{"location":"advanced/authentication/#using-keyring-with-basic-auth","title":"Using keyring with Basic Auth","text":"

    Use keyring to store your credentials, e.g.:

    keyring set https://my-index/simple your_username\n# prompt will appear for your password\n
    "},{"location":"advanced/authentication/#configuration","title":"Configuration","text":"

    Make sure to include username@ in the URL of the registry. An example of this would be:

    [pypi-options]\nindex-url = \"https://username@custom-registry.com/simple\"\n
    "},{"location":"advanced/authentication/#gcp","title":"GCP","text":"

    For Google Artifact Registry, you can use the Google Cloud SDK to authenticate. Make sure to have run gcloud auth login before using pixi. Another thing to note is that you need to add oauth2accesstoken to the URL of the registry. An example of this would be:

    "},{"location":"advanced/authentication/#configuration_1","title":"Configuration","text":"
    # rest of the pixi.toml\n#\n# Adds the following options to the default feature\n[pypi-options]\nextra-index-urls = [\"https://oauth2accesstoken@<location>-python.pkg.dev/<project>/<repository>/simple\"]\n

    Note

    Include the /simple at the end, and replace <location>, <project> and <repository> with your own location, project and repository.

    To find this URL more easily, you can use the gcloud command:

    gcloud artifacts print-settings python --project=<project> --repository=<repository> --location=<location>\n
    "},{"location":"advanced/authentication/#azure-devops","title":"Azure DevOps","text":"

    Similarly for Azure DevOps, you can use the Azure keyring backend for authentication. The backend, along with installation instructions can be found at keyring.artifacts.

    After following the instructions and making sure that keyring works correctly, you can use the following configuration:

    "},{"location":"advanced/authentication/#configuration_2","title":"Configuration","text":"

    # rest of the pixi.toml\n#\n# Adds the following options to the default feature\n[pypi-options]\nextra-index-urls = [\"https://VssSessionToken@pkgs.dev.azure.com/{organization}/{project}/_packaging/{feed}/pypi/simple/\"]\n
    This should allow for getting packages from the Azure DevOps artifact registry.

    "},{"location":"advanced/authentication/#installing-your-environment","title":"Installing your environment","text":"

    To actually install using keyring authentication, either configure your Global Config, or use the flag:

    pixi install --pypi-keyring-provider subprocess\n

    "},{"location":"advanced/authentication/#netrc-file","title":".netrc file","text":"

    pixi allows you to access private registries securely by authenticating with credentials stored in a .netrc file.

    • The .netrc file can be stored in your home directory ($HOME/.netrc for Unix-like systems)
    • or in the user profile directory on Windows (%HOME%\\_netrc).
    • You can also set up a different location for it using the NETRC variable (export NETRC=/my/custom/location/.netrc), e.g. export NETRC=/my/custom/location/.netrc pixi install

    In the .netrc file, you store authentication details like this:

    machine registry-name\nlogin admin\npassword admin\n
    For more details, you can access the .netrc docs.

    "},{"location":"advanced/channel_priority/","title":"Channel Logic","text":"

    All logic regarding the decision of which dependencies can be installed from which channel is handled by the instructions we give the solver.

    The actual code regarding this is in the rattler_solve crate. This might however be hard to read. Therefore, this document will continue with simplified flow charts.

    "},{"location":"advanced/channel_priority/#channel-specific-dependencies","title":"Channel specific dependencies","text":"

    When a user defines a channel per dependency, the solver needs to know the other channels are unusable for this dependency.

    [project]\nchannels = [\"conda-forge\", \"my-channel\"]\n\n[dependencies]\npackagex = { version = \"*\", channel = \"my-channel\" }\n
    In the packagex example, the solver will understand that the package is only available in my-channel and will not look for it in conda-forge.

    The flowchart of the logic that excludes all other channels:

    flowchart TD\n    A[Start] --> B[Given a Dependency]\n    B --> C{Channel Specific Dependency?}\n    C -->|Yes| D[Exclude All Other Channels for This Package]\n    C -->|No| E{Any Other Dependencies?}\n    E -->|Yes| B\n    E -->|No| F[End]\n    D --> E
    "},{"location":"advanced/channel_priority/#channel-priority","title":"Channel priority","text":"

    Channel priority is dictated by the order in the project.channels array, where the first channel is the highest priority. For instance:

    [project]\nchannels = [\"conda-forge\", \"my-channel\", \"your-channel\"]\n
    If the package is found in conda-forge, the solver will not look for it in my-channel and your-channel, because the priority logic tells the solver they are excluded. If the package is not found in conda-forge, the solver will look for it in my-channel, and if it is found there, it will tell the solver to exclude your-channel for this package. This diagram explains the logic:
    flowchart TD\n    A[Start] --> B[Given a Dependency]\n    B --> C{Loop Over Channels}\n    C --> D{Package in This Channel?}\n    D -->|No| C\n    D -->|Yes| E{\"This the first channel\n     for this package?\"}\n    E -->|Yes| F[Include Package in Candidates]\n    E -->|No| G[Exclude Package from Candidates]\n    F --> H{Any Other Channels?}\n    G --> H\n    H -->|Yes| C\n    H -->|No| I{Any Other Dependencies?}\n    I -->|No| J[End]\n    I -->|Yes| B

    This method ensures the solver only adds a package to the candidates if it's found in the highest priority channel available. If you have 10 channels and the package is found in the 5th channel it will exclude the next 5 channels from the candidates if they also contain the package.

    "},{"location":"advanced/channel_priority/#use-case-pytorch-and-nvidia-with-conda-forge","title":"Use case: pytorch and nvidia with conda-forge","text":"

    A common use case is to use pytorch with nvidia drivers, while also needing the conda-forge channel for the main dependencies.

    [project]\nchannels = [\"nvidia/label/cuda-11.8.0\", \"nvidia\", \"conda-forge\", \"pytorch\"]\nplatforms = [\"linux-64\"]\n\n[dependencies]\ncuda = {version = \"*\", channel=\"nvidia/label/cuda-11.8.0\"}\npytorch = {version = \"2.0.1.*\", channel=\"pytorch\"}\ntorchvision = {version = \"0.15.2.*\", channel=\"pytorch\"}\npytorch-cuda = {version = \"11.8.*\", channel=\"pytorch\"}\npython = \"3.10.*\"\n
    What this will do is get as much as possible from the nvidia/label/cuda-11.8.0 channel, which is actually only the cuda package.

    Then it will get all packages from the nvidia channel, which is a little more, and some packages overlap between the nvidia and conda-forge channels, like the cuda-cudart package, which will now only be retrieved from the nvidia channel because of the priority logic.

    Then it will get the packages from the conda-forge channel, which is the main channel for the dependencies.

    But the user only wants the pytorch packages from the pytorch channel, which is why pytorch is added last and the dependencies are added as channel specific dependencies.

    We don't define the pytorch channel before conda-forge because we want to get as much as possible from conda-forge, as the pytorch channel is not always shipping the best versions of all packages.

    For example, it also ships the ffmpeg package, but only an old version which doesn't work with the newer pytorch versions, thus breaking the installation if we were to skip the conda-forge channel for ffmpeg with the priority logic.

    "},{"location":"advanced/channel_priority/#force-a-specific-channel-priority","title":"Force a specific channel priority","text":"

    If you want to force a specific priority for a channel, you can use the priority (int) key in the channel definition. The higher the number, the higher the priority. Unspecified priorities are set to 0, but the index in the array still counts as a priority, where the first in the list has the highest priority.

    This priority definition is mostly important for multiple environments with different channel priorities, as by default feature channels are prepended to the project channels.

    [project]\nname = \"test_channel_priority\"\nplatforms = [\"linux-64\", \"osx-64\", \"win-64\", \"osx-arm64\"]\nchannels = [\"conda-forge\"]\n\n[feature.a]\nchannels = [\"nvidia\"]\n\n[feature.b]\nchannels = [ \"pytorch\", {channel = \"nvidia\", priority = 1}]\n\n[feature.c]\nchannels = [ \"pytorch\", {channel = \"nvidia\", priority = -1}]\n\n[environments]\na = [\"a\"]\nb = [\"b\"]\nc = [\"c\"]\n
    This example creates 4 environments, a, b, c, and the default environment, which will have the following channel order:

    | Environment | Resulting Channels order |
    | --- | --- |
    | default | conda-forge |
    | a | nvidia, conda-forge |
    | b | nvidia, pytorch, conda-forge |
    | c | pytorch, conda-forge, nvidia |

    Check priority result with pixi info

    Using pixi info you can check the priority of the channels in the environment.

    pixi info\nEnvironments\n------------\n       Environment: default\n          Features: default\n          Channels: conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: a\n          Features: a, default\n          Channels: nvidia, conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: b\n          Features: b, default\n          Channels: nvidia, pytorch, conda-forge\nDependency count: 0\nTarget platforms: linux-64\n\n       Environment: c\n          Features: c, default\n          Channels: pytorch, conda-forge, nvidia\nDependency count: 0\nTarget platforms: linux-64\n

    "},{"location":"advanced/explain_info_command/","title":"Info command","text":"

    pixi info prints out useful information to debug a situation or to get an overview of your machine/project. This information can also be retrieved in json format using the --json flag, which can be useful for programmatically reading it.

    Running pixi info in the pixi repo
    \u279c pixi info\n      Pixi version: 0.13.0\n          Platform: linux-64\n  Virtual packages: __unix=0=0\n                  : __linux=6.5.12=0\n                  : __glibc=2.36=0\n                  : __cuda=12.3=0\n                  : __archspec=1=x86_64\n         Cache dir: /home/user/.cache/rattler/cache\n      Auth storage: /home/user/.rattler/credentials.json\n\nProject\n------------\n           Version: 0.13.0\n     Manifest file: /home/user/development/pixi/pixi.toml\n      Last updated: 25-01-2024 10:29:08\n\nEnvironments\n------------\ndefault\n          Features: default\n          Channels: conda-forge\n  Dependency count: 10\n      Dependencies: pre-commit, rust, openssl, pkg-config, git, mkdocs, mkdocs-material, pillow, cairosvg, compilers\n  Target platforms: linux-64, osx-arm64, win-64, osx-64\n             Tasks: docs, test-all, test, build, lint, install, build-docs\n
    "},{"location":"advanced/explain_info_command/#global-info","title":"Global info","text":"

    The first part of the info output is information that is always available and tells you what pixi can read on your machine.

    "},{"location":"advanced/explain_info_command/#platform","title":"Platform","text":"

    This defines the platform you're currently on according to pixi. If this is incorrect, please file an issue on the pixi repo.

    "},{"location":"advanced/explain_info_command/#virtual-packages","title":"Virtual packages","text":"

    The virtual packages that pixi can find on your machine.

    In the Conda ecosystem, you can depend on virtual packages. These packages aren't real dependencies that are going to be installed, but rather are being used in the solve step to find if a package can be installed on the machine. A simple example: When a package depends on Cuda drivers being present on the host machine it can do that by depending on the __cuda virtual package. In that case, if pixi cannot find the __cuda virtual package on your machine the installation will fail.
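
    As a sketch of how this shows up in practice, a project that needs CUDA can raise the requirement through the system-requirements table, so the solver checks it against the __cuda virtual package found on your machine (the version number here is only an example):

    [system-requirements]\ncuda = \"12\"\n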

    "},{"location":"advanced/explain_info_command/#cache-dir","title":"Cache dir","text":"

    The directory where pixi stores its cache. Check out the cache documentation for more information.

    "},{"location":"advanced/explain_info_command/#auth-storage","title":"Auth storage","text":"

    Check the authentication documentation

    "},{"location":"advanced/explain_info_command/#cache-size","title":"Cache size","text":"

    [requires --extended]

    The size of the previously mentioned \"Cache dir\" in Mebibytes.

    "},{"location":"advanced/explain_info_command/#project-info","title":"Project info","text":"

    Everything below Project is info about the project you're currently in. This info is only available if your path has a manifest file.

    "},{"location":"advanced/explain_info_command/#manifest-file","title":"Manifest file","text":"

    The path to the manifest file that describes the project.

    "},{"location":"advanced/explain_info_command/#last-updated","title":"Last updated","text":"

    The last time the lock file was updated, either manually or by pixi itself.

    "},{"location":"advanced/explain_info_command/#environment-info","title":"Environment info","text":"

    The environment info defined per environment. If you don't have any environments defined, this will only show the default environment.

    "},{"location":"advanced/explain_info_command/#features","title":"Features","text":"

    This lists which features are enabled in the environment. For the default environment this is only default.

    "},{"location":"advanced/explain_info_command/#channels","title":"Channels","text":"

    The list of channels used in this environment.

    "},{"location":"advanced/explain_info_command/#dependency-count","title":"Dependency count","text":"

    The number of dependencies defined for this environment (not the number of installed dependencies).

    "},{"location":"advanced/explain_info_command/#dependencies","title":"Dependencies","text":"

    The list of dependencies defined for this environment.

    "},{"location":"advanced/explain_info_command/#target-platforms","title":"Target platforms","text":"

    The platforms the project has defined.

    "},{"location":"advanced/github_actions/","title":"GitHub Action","text":"

    We created prefix-dev/setup-pixi to facilitate using pixi in CI.

    "},{"location":"advanced/github_actions/#usage","title":"Usage","text":"
    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    pixi-version: v0.32.1\n    cache: true\n    auth-host: prefix.dev\n    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}\n- run: pixi run test\n

    Pin your action versions

    Since pixi is not yet stable, the API of this action may change between minor versions. Please pin the versions of this action to a specific version (i.e., prefix-dev/setup-pixi@v0.8.0) to avoid breaking changes. You can automatically update the version of this action by using Dependabot.

    Put the following in your .github/dependabot.yml file to enable Dependabot for your GitHub Actions:

    .github/dependabot.yml
    version: 2\nupdates:\n  - package-ecosystem: github-actions\n    directory: /\n    schedule:\n      interval: monthly # (1)!\n    groups:\n      dependencies:\n        patterns:\n          - \"*\"\n
    1. or daily, weekly
    "},{"location":"advanced/github_actions/#features","title":"Features","text":"

    To see all available input arguments, see the action.yml file in setup-pixi. The most important features are described below.

    "},{"location":"advanced/github_actions/#caching","title":"Caching","text":"

    The action supports caching of the pixi environment. By default, caching is enabled if a pixi.lock file is present. It will then use the pixi.lock file to generate a hash of the environment and cache it. If the cache is hit, the action will skip the installation and use the cached environment. You can specify the behavior by setting the cache input argument.

    Customize your cache key

    If you need to customize your cache-key, you can use the cache-key input argument. This will be the prefix of the cache key. The full cache key will be <cache-key><conda-arch>-<hash>.
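
    A minimal sketch with a custom prefix (my-pixi- is just an example value) could look like this:

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    cache: true\n    cache-key: my-pixi- # resulting key: my-pixi-<conda-arch>-<hash>\n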

    Only save caches on main

    In order not to exceed the 10 GB cache size limit too quickly, you might want to restrict when the cache is saved. This can be done by setting the cache-write argument.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    cache: true\n    cache-write: ${{ github.event_name == 'push' && github.ref_name == 'main' }}\n
    "},{"location":"advanced/github_actions/#multiple-environments","title":"Multiple environments","text":"

    With pixi, you can create multiple environments for different requirements. You can also specify which environment(s) you want to install by setting the environments input argument. This will install all environments that are specified and cache them.

    [project]\nname = \"my-package\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"]\n\n[dependencies]\npython = \">=3.11\"\npip = \"*\"\npolars = \">=0.14.24,<0.21\"\n\n[feature.py311.dependencies]\npython = \"3.11.*\"\n[feature.py312.dependencies]\npython = \"3.12.*\"\n\n[environments]\npy311 = [\"py311\"]\npy312 = [\"py312\"]\n
    "},{"location":"advanced/github_actions/#multiple-environments-using-a-matrix","title":"Multiple environments using a matrix","text":"

    The following example will install the py311 and py312 environments in different jobs.

    test:\n  runs-on: ubuntu-latest\n  strategy:\n    matrix:\n      environment: [py311, py312]\n  steps:\n  - uses: actions/checkout@v4\n  - uses: prefix-dev/setup-pixi@v0.8.0\n    with:\n      environments: ${{ matrix.environment }}\n
    "},{"location":"advanced/github_actions/#install-multiple-environments-in-one-job","title":"Install multiple environments in one job","text":"

    The following example will install both the py311 and the py312 environment on the runner.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    environments: >- # (1)!\n      py311\n      py312\n- run: |\n  pixi run -e py311 test\n  pixi run -e py312 test\n
    1. separated by spaces, equivalent to

      environments: py311 py312\n

    Caching behavior if you don't specify environments

    If you don't specify any environment, the default environment will be installed and cached, even if you use other environments.

    "},{"location":"advanced/github_actions/#authentication","title":"Authentication","text":"

    There are currently three ways to authenticate with pixi:

    • using a token
    • using a username and password
    • using a conda-token

    For more information, see Authentication.

    Handle secrets with care

    Please only store sensitive information using GitHub secrets. Do not store them in your repository. When your sensitive information is stored in a GitHub secret, you can access it using the ${{ secrets.SECRET_NAME }} syntax. These secrets will always be masked in the logs.

    "},{"location":"advanced/github_actions/#token","title":"Token","text":"

    Specify the token using the auth-token input argument. This form of authentication (bearer token in the request headers) is mainly used at prefix.dev.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: prefix.dev\n    auth-token: ${{ secrets.PREFIX_DEV_TOKEN }}\n
    "},{"location":"advanced/github_actions/#username-and-password","title":"Username and password","text":"

    Specify the username and password using the auth-username and auth-password input arguments. This form of authentication (HTTP Basic Auth) is used in some enterprise environments with artifactory for example.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: custom-artifactory.com\n    auth-username: ${{ secrets.PIXI_USERNAME }}\n    auth-password: ${{ secrets.PIXI_PASSWORD }}\n
    "},{"location":"advanced/github_actions/#conda-token","title":"Conda-token","text":"

    Specify the conda-token using the conda-token input argument. This form of authentication (token is encoded in URL: https://my-quetz-instance.com/t/<token>/get/custom-channel) is used at anaconda.org or with quetz instances.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    auth-host: anaconda.org # (1)!\n    conda-token: ${{ secrets.CONDA_TOKEN }}\n
    1. or my-quetz-instance.com
    "},{"location":"advanced/github_actions/#custom-shell-wrapper","title":"Custom shell wrapper","text":"

    setup-pixi allows you to run commands inside of the pixi environment by specifying a custom shell wrapper with shell: pixi run bash -e {0}. This can be useful if you want to run commands inside of the pixi environment, but don't want to use the pixi run command for each command.

    - run: | # (1)!\n    python --version\n    pip install --no-deps -e .\n  shell: pixi run bash -e {0}\n
    1. everything here will be run inside of the pixi environment

    You can even run Python scripts like this:

    - run: | # (1)!\n    import my_package\n    print(\"Hello world!\")\n  shell: pixi run python {0}\n
    1. everything here will be run inside of the pixi environment

    If you want to use PowerShell, you need to specify -Command as well.

    - run: | # (1)!\n    python --version | Select-String \"3.11\"\n  shell: pixi run pwsh -Command {0} # pwsh works on all platforms\n
    1. everything here will be run inside of the pixi environment

    How does it work under the hood?

    Under the hood, the shell: xyz {0} option is implemented by creating a temporary script file and calling xyz with that script file as an argument. This file does not have the executable bit set, so you cannot use shell: pixi run {0} directly but instead have to use shell: pixi run bash {0}. There are some custom shells provided by GitHub that have slightly different behavior, see jobs.<job_id>.steps[*].shell in the documentation. See the official documentation and ADR 0277 for more information about how the shell: input works in GitHub Actions.

    "},{"location":"advanced/github_actions/#one-off-shell-wrapper-using-pixi-exec","title":"One-off shell wrapper using pixi exec","text":"

    With pixi exec, you can also run a one-off command inside a temporary pixi environment.

    - run: | # (1)!\n    zstd --version\n  shell: pixi exec --spec zstd -- bash -e {0}\n
    1. everything here will be run inside of the temporary pixi environment
    - run: | # (1)!\n    import ruamel.yaml\n    # ...\n  shell: pixi exec --spec python=3.11.* --spec ruamel.yaml -- python {0}\n
    1. everything here will be run inside of the temporary pixi environment

    See here for more information about pixi exec.

    "},{"location":"advanced/github_actions/#environment-activation","title":"Environment activation","text":"

    Instead of using a custom shell wrapper, you can also make all pixi-installed binaries available to subsequent steps by \"activating\" the installed environment in the currently running job. To this end, setup-pixi adds all environment variables set when executing pixi run to $GITHUB_ENV and, similarly, adds all path modifications to $GITHUB_PATH. As a result, all installed binaries can be accessed without having to call pixi run.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    activate-environment: true\n

    If you are installing multiple environments, you will need to specify the name of the environment that you want to be activated.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    environments: >-\n      py311\n      py312\n    activate-environment: py311\n

    Activating an environment may be more useful than using a custom shell wrapper as it allows non-shell based steps to access binaries on the path. However, be aware that this option augments the environment of your job.

    "},{"location":"advanced/github_actions/#-frozen-and-locked","title":"--frozen and --locked","text":"

    You can specify whether setup-pixi should run pixi install --frozen or pixi install --locked depending on the frozen or the locked input argument. See the official documentation for more information about the --frozen and --locked flags.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    locked: true\n    # or\n    frozen: true\n

    If you don't specify anything, the default behavior is to run pixi install --locked if a pixi.lock file is present and pixi install otherwise.

    "},{"location":"advanced/github_actions/#debugging","title":"Debugging","text":"

    There are two types of debug logging that you can enable.

    "},{"location":"advanced/github_actions/#debug-logging-of-the-action","title":"Debug logging of the action","text":"

    The first one is the debug logging of the action itself. This can be enabled by re-running the action in debug mode:

    Debug logging documentation

    For more information about debug logging in GitHub Actions, see the official documentation.

    "},{"location":"advanced/github_actions/#debug-logging-of-pixi","title":"Debug logging of pixi","text":"

    The second type is the debug logging of the pixi executable. This can be specified by setting the log-level input.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    log-level: vvv # (1)!\n
    1. One of q, default, v, vv, or vvv.

    If nothing is specified, log-level will default to default or vv depending on whether debug logging is enabled for the action.

    "},{"location":"advanced/github_actions/#self-hosted-runners","title":"Self-hosted runners","text":"

    On self-hosted runners, it may happen that some files are persisted between jobs. This can lead to problems or secrets getting leaked between job runs. To avoid this, you can use the post-cleanup input to specify the post cleanup behavior of the action (i.e., what happens after all your commands have been executed).

    If you set post-cleanup to true, the action will delete the following files:

    • .pixi environment
    • the pixi binary
    • the rattler cache
    • other rattler files in ~/.rattler

    If nothing is specified, post-cleanup will default to true.

    On self-hosted runners, you also might want to alter the default pixi install location to a temporary location. You can use pixi-bin-path: ${{ runner.temp }}/bin/pixi to do this.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    post-cleanup: true\n    pixi-bin-path: ${{ runner.temp }}/bin/pixi # (1)!\n
    1. ${{ runner.temp }}\\Scripts\\pixi.exe on Windows

    You can also use a preinstalled local version of pixi on the runner by not setting any of the pixi-version, pixi-url or pixi-bin-path inputs. This action will then try to find a local version of pixi in the runner's PATH.
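
    A minimal sketch of that setup (no pixi-version, pixi-url or pixi-bin-path is given, so the action falls back to the pixi already on the runner's PATH):

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    cache: true\n- run: pixi run test\n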

    "},{"location":"advanced/github_actions/#using-the-pyprojecttoml-as-a-manifest-file-for-pixi","title":"Using the pyproject.toml as a manifest file for pixi.","text":"

    setup-pixi will automatically pick up the pyproject.toml if it contains a [tool.pixi.project] section and no pixi.toml. This can be overwritten by setting the manifest-path input argument.

    - uses: prefix-dev/setup-pixi@v0.8.0\n  with:\n    manifest-path: pyproject.toml\n
    "},{"location":"advanced/github_actions/#more-examples","title":"More examples","text":"

    If you want to see more examples, you can take a look at the GitHub Workflows of the setup-pixi repository.

    "},{"location":"advanced/production_deployment/","title":"Bringing pixi to production","text":"

    You can bring pixi projects into production by either containerizing it using tools like Docker or by using quantco/pixi-pack.

    @pavelzw from QuantCo wrote a blog post about bringing pixi to production. You can read it here.

    "},{"location":"advanced/production_deployment/#docker","title":"Docker","text":"

    We provide a simple docker image at pixi-docker that contains the pixi executable on top of different base images.

    The images are available on ghcr.io/prefix-dev/pixi.

    There are different tags for different base images available:

    • latest - based on ubuntu:jammy
    • focal - based on ubuntu:focal
    • bullseye - based on debian:bullseye
    • jammy-cuda-12.2.2 - based on nvidia/cuda:12.2.2-jammy
    • ... and more

    All tags

    For all tags, take a look at the build script.

    "},{"location":"advanced/production_deployment/#example-usage","title":"Example usage","text":"

    The following example uses the pixi docker image as a base image for a multi-stage build. It also makes use of pixi shell-hook to not rely on pixi being installed in the production container.

    More examples

    For more examples, take a look at pavelzw/pixi-docker-example.

    FROM ghcr.io/prefix-dev/pixi:0.32.1 AS build\n\n# copy source code, pixi.toml and pixi.lock to the container\nWORKDIR /app\nCOPY . .\n# install dependencies to `/app/.pixi/envs/prod`\n# use `--locked` to ensure the lockfile is up to date with pixi.toml\nRUN pixi install --locked -e prod\n# create the shell-hook bash script to activate the environment\nRUN pixi shell-hook -e prod -s bash > /shell-hook\nRUN echo \"#!/bin/bash\" > /app/entrypoint.sh\nRUN cat /shell-hook >> /app/entrypoint.sh\n# extend the shell-hook script to run the command passed to the container\nRUN echo 'exec \"$@\"' >> /app/entrypoint.sh\n\nFROM ubuntu:24.04 AS production\nWORKDIR /app\n# only copy the production environment into prod container\n# please note that the \"prefix\" (path) needs to stay the same as in the build container\nCOPY --from=build /app/.pixi/envs/prod /app/.pixi/envs/prod\nCOPY --from=build --chmod=0755 /app/entrypoint.sh /app/entrypoint.sh\n# copy your project code into the container as well\nCOPY ./my_project /app/my_project\n\nEXPOSE 8000\nENTRYPOINT [ \"/app/entrypoint.sh\" ]\n# run your app inside the pixi environment\nCMD [ \"uvicorn\", \"my_project:app\", \"--host\", \"0.0.0.0\" ]\n
    "},{"location":"advanced/production_deployment/#pixi-pack","title":"pixi-pack","text":"

    pixi-pack is a simple tool that takes a pixi environment and packs it into a compressed archive that can be shipped to the target machine.

    It can be installed via

    pixi global install pixi-pack\n

    Or by downloading our pre-built binaries from the releases page.

    Instead of installing pixi-pack globally, you can also use pixi exec to run pixi-pack in a temporary environment:

    pixi exec pixi-pack pack\npixi exec pixi-pack unpack environment.tar\n

    You can pack an environment with

    pixi-pack pack --manifest-file pixi.toml --environment prod --platform linux-64\n

    This will create an environment.tar file that contains all conda packages required to create the environment.

    # environment.tar\n| pixi-pack.json\n| environment.yml\n| channel\n|    \u251c\u2500\u2500 noarch\n|    |    \u251c\u2500\u2500 tzdata-2024a-h0c530f3_0.conda\n|    |    \u251c\u2500\u2500 ...\n|    |    \u2514\u2500\u2500 repodata.json\n|    \u2514\u2500\u2500 linux-64\n|         \u251c\u2500\u2500 ca-certificates-2024.2.2-hbcca054_0.conda\n|         \u251c\u2500\u2500 ...\n|         \u2514\u2500\u2500 repodata.json\n
    "},{"location":"advanced/production_deployment/#unpacking-an-environment","title":"Unpacking an environment","text":"

    With pixi-pack unpack environment.tar, you can unpack the environment on your target system. This will create a new conda environment in ./env that contains all packages specified in your pixi.toml. It also creates an activate.sh (or activate.bat on Windows) file that lets you activate the environment without needing to have conda or micromamba installed.
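
    For example, on the target machine the flow could look like the sketch below (it assumes the activate.sh script is written next to the newly created ./env directory):

    pixi-pack unpack environment.tar\nsource ./activate.sh\npython --version # tools from the unpacked environment are now on PATH\n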

    "},{"location":"advanced/production_deployment/#cross-platform-packs","title":"Cross-platform packs","text":"

    Since pixi-pack just downloads the .conda and .tar.bz2 files from the conda repositories, you can trivially create packs for different platforms.

    pixi-pack pack --platform win-64\n

    You can only unpack a pack on a system that has the same platform as the pack was created for.

    "},{"location":"advanced/production_deployment/#inject-additional-packages","title":"Inject additional packages","text":"

    You can inject additional packages into the environment that are not specified in pixi.lock by using the --inject flag:

    pixi-pack pack --inject local-package-1.0.0-hbefa133_0.conda --manifest-pack pixi.toml\n

    This can be particularly useful if you build the project itself and want to include the built package in the environment but still want to use pixi.lock from the project.

    "},{"location":"advanced/production_deployment/#unpacking-without-pixi-pack","title":"Unpacking without pixi-pack","text":"

    If you don't have pixi-pack available on your target system, you can still install the environment if you have conda or micromamba available. Just unarchive the environment.tar, then you have a local channel on your system where all necessary packages are available. Next to this local channel, you will find an environment.yml file that contains the environment specification. You can then install the environment using conda or micromamba:

    tar -xvf environment.tar\nmicromamba create -p ./env --file environment.yml\n# or\nconda env create -p ./env --file environment.yml\n

    The environment.yml and repodata.json files are only for this use case, pixi-pack unpack does not use them.

    "},{"location":"advanced/pyproject_toml/","title":"pyproject.toml in pixi","text":"

    We support the use of the pyproject.toml as our manifest file in pixi. This allows the user to keep one file with all configuration. The pyproject.toml file is a standard for Python projects. We don't advise using the pyproject.toml file for anything other than Python projects; the pixi.toml is better suited for other types of projects.

    "},{"location":"advanced/pyproject_toml/#initial-setup-of-the-pyprojecttoml-file","title":"Initial setup of the pyproject.toml file","text":"

    When you already have a pyproject.toml file in your project, you can run pixi init in that folder. Pixi will automatically

    • Add a [tool.pixi.project] section to the file, with the platform and channel information required by pixi;
    • Add the current project as an editable pypi dependency;
    • Add some defaults to the .gitignore and .gitattributes files.

    If you do not have an existing pyproject.toml file, you can run pixi init --format pyproject in your project folder. In that case, pixi will create a pyproject.toml manifest from scratch with some sane defaults.

    "},{"location":"advanced/pyproject_toml/#python-dependency","title":"Python dependency","text":"

    The pyproject.toml file supports the requires-python field. Pixi understands that field and automatically adds the version to the dependencies.

    This is an example of a pyproject.toml file with the requires-python field, which will be used as the python dependency:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n

    Which is equivalent to:

    equivalent pixi.toml
    [project]\nname = \"my_project\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[dependencies]\npython = \">=3.9\"\n
    "},{"location":"advanced/pyproject_toml/#dependency-section","title":"Dependency section","text":"

    The pyproject.toml file supports the dependencies field. Pixi understands that field and automatically adds the dependencies to the project as [pypi-dependencies].

    This is an example of a pyproject.toml file with the dependencies field:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n

    Which is equivalent to:

    equivalent pixi.toml
    [project]\nname = \"my_project\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[pypi-dependencies]\nnumpy = \"*\"\npandas = \"*\"\nmatplotlib = \"*\"\n\n[dependencies]\npython = \">=3.9\"\n

    You can overwrite these with conda dependencies by adding them to the dependencies field:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tool.pixi.dependencies]\nnumpy = \"*\"\npandas = \"*\"\nmatplotlib = \"*\"\n

    This would result in the conda dependencies being installed and the pypi dependencies being ignored, as pixi takes the conda dependencies over the pypi dependencies.

    "},{"location":"advanced/pyproject_toml/#optional-dependencies","title":"Optional dependencies","text":"

    If your python project includes groups of optional dependencies, pixi will automatically interpret them as pixi features of the same name with the associated pypi-dependencies.

    You can add them to pixi environments manually, or use pixi init to set up the project, which will create one environment per feature. Self-references to other groups of optional dependencies are also handled.

    For instance, imagine you have a project folder with a pyproject.toml file similar to:

    [project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[project.optional-dependencies]\ntest = [\"pytest\"]\nall = [\"package2\",\"my_project[test]\"]\n

    Running pixi init in that project folder will transform the pyproject.toml file into:

    [project]\nname = \"my_project\"\ndependencies = [\"package1\"]\n\n[project.optional-dependencies]\ntest = [\"pytest\"]\nall = [\"package2\",\"my_project[test]\"]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"] # if executed on linux\n\n[tool.pixi.environments]\ndefault = {features = [], solve-group = \"default\"}\ntest = {features = [\"test\"], solve-group = \"default\"}\nall = {features = [\"all\", \"test\"], solve-group = \"default\"}\n

    In this example, three environments will be created by pixi:

    • default with 'package1' as pypi dependency
    • test with 'package1' and 'pytest' as pypi dependencies
    • all with 'package1', 'package2' and 'pytest' as pypi dependencies

    All environments will be solved together, as indicated by the common solve-group, and added to the lock file. You can edit the [tool.pixi.environments] section manually to adapt it to your use case (e.g. if you do not need a particular environment).
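
    For instance, if you do not need the all environment, the trimmed-down section could look like this:

    [tool.pixi.environments]\ndefault = {features = [], solve-group = \"default\"}\ntest = {features = [\"test\"], solve-group = \"default\"}\n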

    "},{"location":"advanced/pyproject_toml/#example","title":"Example","text":"

    As the pyproject.toml file supports the full pixi spec with [tool.pixi] prepended, an example would look like this:

    pyproject.toml
    [project]\nname = \"my_project\"\nrequires-python = \">=3.9\"\ndependencies = [\n    \"numpy\",\n    \"pandas\",\n    \"matplotlib\",\n    \"ruff\",\n]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tool.pixi.dependencies]\ncompilers = \"*\"\ncmake = \"*\"\n\n[tool.pixi.tasks]\nstart = \"python my_project/main.py\"\nlint = \"ruff lint\"\n\n[tool.pixi.system-requirements]\ncuda = \"11.0\"\n\n[tool.pixi.feature.test.dependencies]\npytest = \"*\"\n\n[tool.pixi.feature.test.tasks]\ntest = \"pytest\"\n\n[tool.pixi.environments]\ntest = [\"test\"]\n
    "},{"location":"advanced/pyproject_toml/#build-system-section","title":"Build-system section","text":"

    The pyproject.toml file normally contains a [build-system] section. Pixi will use this section to build and install the project if it is added as a pypi path dependency.
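
    As a sketch, the project itself added as an editable pypi path dependency (this is what pixi init sets up for an existing pyproject.toml) would look roughly like this:

    [tool.pixi.pypi-dependencies]\nmy_project = { path = \".\", editable = true }\n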

    If the pyproject.toml file does not contain any [build-system] section, pixi will fall back to uv's default, which is equivalent to the below:

    pyproject.toml
    [build-system]\nrequires = [\"setuptools >= 40.8.0\"]\nbuild-backend = \"setuptools.build_meta:__legacy__\"\n

    Including a [build-system] section is highly recommended. If you are not sure of the build-backend you want to use, including the [build-system] section below in your pyproject.toml is a good starting point. pixi init --format pyproject defaults to hatchling. The advantages of hatchling over setuptools are outlined on its website.

    pyproject.toml
    [build-system]\nbuild-backend = \"hatchling.build\"\nrequires = [\"hatchling\"]\n
    "},{"location":"advanced/updates_github_actions/","title":"Update lockfiles with GitHub Actions","text":"

    You can leverage GitHub Actions in combination with pavelzw/pixi-diff-to-markdown to automatically update your lockfiles, similarly to dependabot or renovate in other ecosystems.

    Dependabot/Renovate support for pixi

    You can track native Dependabot support for pixi in dependabot/dependabot-core #2227 and for Renovate in renovatebot/renovate #2213.

    "},{"location":"advanced/updates_github_actions/#how-to-use","title":"How to use","text":"

    To get started, create a new GitHub Actions workflow file in your repository.

    .github/workflows/update-lockfiles.yml
    name: Update lockfiles\n\npermissions: # (1)!\n  contents: write\n  pull-requests: write\n\non:\n  workflow_dispatch:\n  schedule:\n    - cron: 0 5 1 * * # (2)!\n\njobs:\n  pixi-update:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Set up pixi\n        uses: prefix-dev/setup-pixi@v0.8.1\n        with:\n          run-install: false\n      - name: Update lockfiles\n        run: |\n          set -o pipefail\n          pixi update --json | pixi exec pixi-diff-to-markdown >> diff.md\n      - name: Create pull request\n        uses: peter-evans/create-pull-request@v6\n        with:\n          token: ${{ secrets.GITHUB_TOKEN }}\n          commit-message: Update pixi lockfile\n          title: Update pixi lockfile\n          body-path: diff.md\n          branch: update-pixi\n          base: main\n          labels: pixi\n          delete-branch: true\n          add-paths: pixi.lock\n
    1. Needed for peter-evans/create-pull-request
    2. Runs at 05:00, on day 1 of the month

    In order for this workflow to work, you need to set \"Allow GitHub Actions to create and approve pull requests\" to true in your repository settings (in \"Actions\" -> \"General\").

    Tip

    If you don't have any pypi-dependencies, you can use pixi update --json --no-install to speed up diff generation.

    "},{"location":"advanced/updates_github_actions/#triggering-ci-in-automated-prs","title":"Triggering CI in automated PRs","text":"

    In order to prevent accidental recursive GitHub Workflow runs, GitHub decided to not trigger any workflows on automated PRs when using the default GITHUB_TOKEN. There are a couple of ways to work around this limitation. You can find excellent documentation for this in peter-evans/create-pull-request, see here.

    "},{"location":"advanced/updates_github_actions/#customizing-the-summary","title":"Customizing the summary","text":"

    You can customize the summary by either using command-line-arguments of pixi-diff-to-markdown or by specifying the configuration in pixi.toml under [tool.pixi-diff-to-markdown]. See the pixi-diff-to-markdown documentation or run pixi-diff-to-markdown --help for more information.

    "},{"location":"advanced/updates_github_actions/#using-reusable-workflows","title":"Using reusable workflows","text":"

    If you want to use the same workflow in multiple repositories in your GitHub organization, you can create a reusable workflow. You can find more information in the GitHub documentation.

    "},{"location":"design_proposals/pixi_global_manifest/","title":"Pixi Global Manifest","text":"

    Feedback wanted

    This document is a work in progress, and community feedback is greatly appreciated. Please share your thoughts at our GitHub discussion.

    "},{"location":"design_proposals/pixi_global_manifest/#motivation","title":"Motivation","text":"

    pixi global is currently limited to imperatively managing CLI packages. The next iteration of this feature should fulfill the following needs:

    • Shareable global environments.
    • Managing complex environments with multiple packages as dependencies.
    • Flexible exposure of executables.
    "},{"location":"design_proposals/pixi_global_manifest/#design-considerations","title":"Design Considerations","text":"

    There are a few things we wanted to keep in mind in the design:

    1. User-friendliness: Pixi is a user-focused tool that goes beyond developers. The feature should have good error reporting and helpful documentation from the start.
    2. Keep it simple: The CLI should be all you strictly need to interact with global environments.
    3. Unsurprising: Simple commands should behave similarly to traditional package managers.
    4. Human Readable: Any file created by this feature should be human-readable and modifiable.
    "},{"location":"design_proposals/pixi_global_manifest/#manifest","title":"Manifest","text":"

    The global environments and exposed executables will be managed by a human-readable manifest. This manifest will stick to conventions set by pixi.toml where possible. Among other things it will be written in the TOML format, be named pixi-global.toml and be placed at ~/.pixi/manifests/pixi-global.toml. The motivation for the location is discussed further below.

    pixi-global.toml
    # The name of the environment is `python`\n[envs.python]\nchannels = [\"conda-forge\"]\n# optional, defaults to your current OS\nplatform = \"osx-64\"\n# It will expose python, python3 and python3.11, but not pip\n[envs.python.dependencies]\npython = \"3.11.*\"\npip = \"*\"\n\n[envs.python.exposed]\npython = \"python\"\npython3 = \"python3\"\n\"python3.11\" = \"python3.11\"\n\n# The name of the environment is `python3-10`\n[envs.python3-10]\nchannels = [\"https://fast.prefix.dev/conda-forge\"]\n# It will expose python3.10\n[envs.python3-10.dependencies]\npython = \"3.10.*\"\n\n[envs.python3-10.exposed]\n\"python3.10\" = \"python\"\n
    "},{"location":"design_proposals/pixi_global_manifest/#cli","title":"CLI","text":"

    Install one or more packages PACKAGE and expose their executables. If --environment has been given, all packages will be installed in the same environment. --expose can be given if --environment is given as well or if only a single PACKAGE will be installed. The syntax for MAPPING is exposed_name=executable_name, so for example python3.10=python. --platform sets the platform of the environment to PLATFORM. Multiple channels can be specified by using --channel multiple times. By default, if no channel is provided, the default-channels key in the pixi configuration is used, which again defaults to \"conda-forge\".

    pixi global install [--expose MAPPING] [--environment ENV] [--platform PLATFORM] [--no-activation] [--channel CHANNEL]... PACKAGE...\n

    Remove environments ENV.

    pixi global uninstall <ENV>...\n

    Update PACKAGE if --package is given. If not, all packages in environments ENV will be updated. If the update leads to executables being removed, it will offer to remove the mappings. If the user declines, the update process will stop. If the update leads to executables being added, it will offer to expose each new binary individually. --assume-yes will assume yes as the answer for every question that would otherwise be asked interactively.

    pixi global update [--package PACKAGE] [--assume-yes] <ENV>...\n

    Updates all packages in all environments. If the update leads to executables being removed, it will offer to remove the mappings. If the user declines, the update process will stop. If the update leads to executables being added, it will offer to expose each new binary individually. --assume-yes will assume yes as the answer for every question that would otherwise be asked interactively.

    pixi global update-all [--assume-yes]\n

    Add one or more packages PACKAGE into an existing environment ENV. If environment ENV does not exist, it will return with an error. Without --expose, no binary will be exposed. If you don't specify a version constraint like python=3.8.*, the spec will be unconstrained with *. The syntax for MAPPING is exposed_name=executable_name, so for example python3.10=python.

    pixi global add --environment ENV [--expose MAPPING] <PACKAGE>...\n

    Remove package PACKAGE from environment ENV. If that was the last package, the whole environment will be removed and that information printed to the console. If this leads to executables being removed, it will offer to remove the mappings. If the user declines, the removal process will stop.

    pixi global remove --environment ENV PACKAGE\n

    Add one or more MAPPING for environment ENV which describe which executables are exposed. The syntax for MAPPING is exposed_name=executable_name, so for example python3.10=python.

    pixi global expose add --environment ENV  <MAPPING>...\n

    Remove one or more exposed BINARY from environment ENV

    pixi global expose remove --environment ENV <BINARY>...\n

    Ensure that the environments on the machine reflect the state in the manifest. The manifest is the single source of truth. Only if there's no manifest will the data from existing environments be used to create one. pixi global sync is implied by most other pixi global commands.

    pixi global sync\n

    List all environments, their specs and exposed executables

    pixi global list\n

    Set the channels CHANNEL for a certain environment ENV in the pixi global manifest.

    pixi global channel set --environment ENV <CHANNEL>...\n

    Set the platform PLATFORM for a certain environment ENV in the pixi global manifest.

    pixi global platform set --environment ENV PLATFORM\n

    "},{"location":"design_proposals/pixi_global_manifest/#simple-workflow","title":"Simple workflow","text":"

    Create environment python, install package python=3.10.* and expose all executables of that package

    pixi global install python=3.10.*\n

    Update all packages in environment python

    pixi global update python\n

    Remove environment python

    pixi global uninstall python\n

    Create environments python and pip, install the corresponding packages and expose all executables of those packages

    pixi global install python pip\n

    Remove environments python and pip

    pixi global uninstall python pip\n

    Create environment python-pip, install python and pip in the same environment and expose all executables of these packages

    pixi global install --environment python-pip python pip\n

    "},{"location":"design_proposals/pixi_global_manifest/#adding-dependencies","title":"Adding dependencies","text":"

    Create environment python, install package python and expose all executables of that package. Then add package hypercorn to environment python without exposing its executables.

    pixi global install python\npixi global add --environment python hypercorn\n
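
    After these two commands, the global manifest could look roughly like the following sketch; the channel, version specs and the exact set of exposed executables depend on your setup and are only illustrative here. Note that hypercorn gets no entry under exposed because --expose was not used.

    pixi-global.toml
    # Sketch only: channel, specs and exposed executables are illustrative\n[envs.python]\nchannels = [\"conda-forge\"]\n\n[envs.python.dependencies]\npython = \"*\"\nhypercorn = \"*\"\n\n# hypercorn's executables are not listed because --expose was not given\n[envs.python.exposed]\npython = \"python\"\npython3 = \"python3\"\n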

    Update package cryptography (a dependency of hypercorn) to 43.0.0 in environment python

    pixi global update --environment python cryptography=43.0.0\n

    Then remove hypercorn again.

    pixi global remove --environment python hypercorn\n

    "},{"location":"design_proposals/pixi_global_manifest/#specifying-which-executables-to-expose","title":"Specifying which executables to expose","text":"

    Make a new environment python3-10 with package python=3.10 and expose the python executable as python3.10.

    pixi global install --environment python3-10 --expose \"python3.10=python\" python=3.10\n

    Now python3.10 is available.

    Run the following in order to expose python from environment python3-10 as python3-10 instead.

    pixi global expose remove --environment python3-10 python3.10\npixi global expose add --environment python3-10 \"python3-10=python\"\n

    Now python3-10 is available, but python3.10 isn't anymore.

    "},{"location":"design_proposals/pixi_global_manifest/#syncing","title":"Syncing","text":"

    Most pixi global sub commands imply a pixi global sync.

    • Users should be able to change the manifest by hand (creating it or modifying it by adding or removing entries)
    • Users should be able to \"export\" their existing environments into the manifest, if non-existing.
    • The manifest is always \"in sync\" after install/remove/inject/other global command.

    First time, on a clean computer: running the following creates the manifest and ~/.pixi/envs/python.

    pixi global install python\n

    Deleting ~/.pixi/envs and syncing should add the environment python again, as described in the manifest.

    rm -rf ~/.pixi/envs\npixi global sync\n

    If there's no manifest, but existing environments, pixi will create a manifest that matches your current environments. It is to be decided whether the user should be asked if they want an empty manifest instead, or if it should always import the data from the environments.

    rm <manifest>\npixi global sync\n

    If we remove the python environment from the manifest, running pixi global sync will also remove the ~/.pixi/envs/python environment from the file system.

    vim <manifest>\npixi global sync\n

    "},{"location":"design_proposals/pixi_global_manifest/#open-questions","title":"Open Questions","text":""},{"location":"design_proposals/pixi_global_manifest/#should-we-version-the-manifest","title":"Should we version the manifest?","text":"

    Something like:

    [manifest]\nversion = 1\n

    We still have to figure out which existing programs do something similar and how they benefit from it.

    "},{"location":"design_proposals/pixi_global_manifest/#multiple-manifests","title":"Multiple manifests","text":"

    We could go for one default manifest, but also parse other manifests in the same directory. The only requirement to be parsed as a manifest is a .toml extension. In order to modify those with the CLI, one would have to add an option --manifest to select the correct one.

    • pixi-global.toml: Default
    • pixi-global-company-tools.toml
    • pixi-global-from-my-dotfiles.toml

    It is unclear whether the first implementation already needs to support this. At the very least we should put the manifest into its own folder, like ~/.pixi/global/manifests/pixi-global.toml.

    "},{"location":"design_proposals/pixi_global_manifest/#discovery-via-config-key","title":"Discovery via config key","text":"

    In order to make it easier to manage manifests in version control, we could allow setting the manifest path via a key in the pixi configuration.

    config.toml
    global_manifests = \"/path/to/your/manifests\"\n
    "},{"location":"design_proposals/pixi_global_manifest/#no-activation","title":"No activation","text":"

    The current pixi global install features --no-activation. When this flag is set, CONDA_PREFIX and PATH will not be set when running the exposed executable. This is useful when installing Python package managers or shells.

    Assuming that this needs to be set per mapping, one way to expose this functionality would be to allow the following:

    [envs.pip.exposed]\npip = { executable=\"pip\", activation=false }\n
    "},{"location":"examples/cpp-sdl/","title":"SDL example","text":"

    The cpp-sdl example is located in the pixi repository.

    git clone https://github.com/prefix-dev/pixi.git\n

    Move to the example folder

    cd pixi/examples/cpp-sdl\n

    Run the start command

    pixi run start\n

    Using the depends-on feature, you only need to run the start task, but under the hood it runs the following tasks.

    # Configure the CMake project\npixi run configure\n\n# Build the executable\npixi run build\n\n# Start the build executable\npixi run start\n
    "},{"location":"examples/opencv/","title":"Opencv example","text":"

    The opencv example is located in the pixi repository.

    git clone https://github.com/prefix-dev/pixi.git\n

    Move to the example folder

    cd pixi/examples/opencv\n
    "},{"location":"examples/opencv/#face-detection","title":"Face detection","text":"

    Run the start command to start the face detection algorithm.

    pixi run start\n

    The screen that starts should look like this:

    Check out the webcame_capture.py to see how we detect a face.

    "},{"location":"examples/opencv/#camera-calibration","title":"Camera Calibration","text":"

    Next to face recognition, a camera calibration example is also included.

    You'll need a checkerboard for this to work. Print this:

    Then run

    pixi run calibrate\n

    To take a picture for calibration, press SPACE. Do this approximately 10 times with the checkerboard in view of the camera.

    After that, press ESC, which will start the calibration.

    When the calibration is done, the camera will be used again to find the distance to the checkerboard.

    "},{"location":"examples/ros2-nav2/","title":"Navigation 2 example","text":"

    The nav2 example is located in the pixi repository.

    git clone https://github.com/prefix-dev/pixi.git\n

    Move to the example folder

    cd pixi/examples/ros2-nav2\n

    Run the start command

    pixi run start\n
    "},{"location":"features/advanced_tasks/","title":"Advanced tasks","text":"

    When building a package, you often have to do more than just run the code. Steps like formatting, linting, compiling, testing, benchmarking, etc. are often part of a project. With pixi tasks, this should become much easier to do.

    Here are some quick examples

    pixi.toml
    [tasks]\n# Commands as lists so you can also add documentation in between.\nconfigure = { cmd = [\n    \"cmake\",\n    # Use the cross-platform Ninja generator\n    \"-G\",\n    \"Ninja\",\n    # The source is in the root directory\n    \"-S\",\n    \".\",\n    # We wanna build in the .build directory\n    \"-B\",\n    \".build\",\n] }\n\n# Depend on other tasks\nbuild = { cmd = [\"ninja\", \"-C\", \".build\"], depends-on = [\"configure\"] }\n\n# Using environment variables\nrun = \"python main.py $PIXI_PROJECT_ROOT\"\nset = \"export VAR=hello && echo $VAR\"\n\n# Cross platform file operations\ncopy = \"cp pixi.toml pixi_backup.toml\"\nclean = \"rm pixi_backup.toml\"\nmove = \"mv pixi.toml backup.toml\"\n
    "},{"location":"features/advanced_tasks/#depends-on","title":"Depends on","text":"

    Just like packages can depend on other packages, our tasks can depend on other tasks. This allows for complete pipelines to be run with a single command.

    An obvious example is compiling before running an application.

    Check out our cpp_sdl example for a running example. In that package we have some tasks that depend on each other, so we can ensure that when you run pixi run start everything is set up as expected.

    pixi task add configure \"cmake -G Ninja -S . -B .build\"\npixi task add build \"ninja -C .build\" --depends-on configure\npixi task add start \".build/bin/sdl_example\" --depends-on build\n

    Results in the following lines added to the pixi.toml

    pixi.toml
    [tasks]\n# Configures CMake\nconfigure = \"cmake -G Ninja -S . -B .build\"\n# Build the executable but make sure CMake is configured first.\nbuild = { cmd = \"ninja -C .build\", depends-on = [\"configure\"] }\n# Start the built executable\nstart = { cmd = \".build/bin/sdl_example\", depends-on = [\"build\"] }\n
    pixi run start\n

    The tasks will be executed after each other:

    • First configure because it has no dependencies.
    • Then build as it only depends on configure.
    • Then start as all its dependencies have run.

    If one of the commands fails (exits with a non-zero code), it will stop and the next one will not be started.

    With this logic, you can also create aliases as you don't have to specify any command in a task.

    pixi task add fmt ruff\npixi task add lint pylint\n
    pixi task alias style fmt lint\n

    Results in the following pixi.toml.

    pixi.toml
    fmt = \"ruff\"\nlint = \"pylint\"\nstyle = { depends-on = [\"fmt\", \"lint\"] }\n

    Now run both tools with one command.

    pixi run style\n
    "},{"location":"features/advanced_tasks/#working-directory","title":"Working directory","text":"

    Pixi tasks support the definition of a working directory.

    cwd\" stands for Current Working Directory. The directory is relative to the pixi package root, where the pixi.toml file is located.

    Consider a pixi project structured as follows:

    \u251c\u2500\u2500 pixi.toml\n\u2514\u2500\u2500 scripts\n    \u2514\u2500\u2500 bar.py\n

    To add a task to run the bar.py file, use:

    pixi task add bar \"python bar.py\" --cwd scripts\n

    This will add the following line to the manifest file:

    pixi.toml
    [tasks]\nbar = { cmd = \"python bar.py\", cwd = \"scripts\" }\n
    "},{"location":"features/advanced_tasks/#caching","title":"Caching","text":"

    When you specify inputs and/or outputs to a task, pixi will reuse the result of the task.

    For the cache, pixi checks that the following are true:

    • No package in the environment has changed.
    • The selected inputs and outputs are the same as the last time the task was run. We compute fingerprints of all the files selected by the globs and compare them to the last time the task was run.
    • The command is the same as the last time the task was run.

    If all of these conditions are met, pixi will not run the task again and instead use the existing result.

    Inputs and outputs can be specified as globs, which will be expanded to all matching files.

    pixi.toml
    [tasks]\n# This task will only run if the `main.py` file has changed.\nrun = { cmd = \"python main.py\", inputs = [\"main.py\"] }\n\n# This task will remember the result of the `curl` command and not run it again if the file `data.csv` already exists.\ndownload_data = { cmd = \"curl -o data.csv https://example.com/data.csv\", outputs = [\"data.csv\"] }\n\n# This task will only run if the `src` directory has changed and will remember the result of the `make` command.\nbuild = { cmd = \"make\", inputs = [\"src/*.cpp\", \"include/*.hpp\"], outputs = [\"build/app.exe\"] }\n

    Note: if you want to debug the globs you can use the --verbose flag to see which files are selected.

    # shows info logs of all files that were selected by the globs\npixi run -v start\n
    "},{"location":"features/advanced_tasks/#environment-variables","title":"Environment variables","text":"

    You can set environment variables for a task. These are seen as \"default\" values for the variables as you can overwrite them from the shell.

    pixi.toml

    [tasks]\necho = { cmd = \"echo $ARGUMENT\", env = { ARGUMENT = \"hello\" } }\n
    If you run pixi run echo it will output hello. When you set the environment variable ARGUMENT before running the task, it will use that value instead.

    ARGUMENT=world pixi run echo\n\u2728 Pixi task (echo in default): echo $ARGUMENT\nworld\n

    These variables are not shared between tasks, so you need to define them for every task you want to use them in.
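
    A minimal sketch of what that looks like in practice; the task names and the variable value are only illustrative:

    pixi.toml
    [tasks]\n# Each task that wants ARGUMENT has to define it itself\nbuild = { cmd = \"echo building $ARGUMENT\", env = { ARGUMENT = \"hello\" } }\ncheck = { cmd = \"echo checking $ARGUMENT\", env = { ARGUMENT = \"hello\" } }\n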

    Extend instead of overwrite

    If you use the same environment variable in the value as in the key of the map, you overwrite the variable with a value that still contains the original, effectively extending it instead of replacing it. For example, extending PATH: pixi.toml

    [tasks]\necho = { cmd = \"echo $PATH\", env = { PATH = \"/tmp/path:$PATH\" } }\n
    This will output /tmp/path:/usr/bin:/bin instead of the original /usr/bin:/bin.

    "},{"location":"features/advanced_tasks/#clean-environment","title":"Clean environment","text":"

    You can make sure the environment of a task is \"pixi only\". Here pixi will only include the minimal required environment variables for your platform to run the command in. The environment will contain all variables set by the conda environment like \"CONDA_PREFIX\". It will however include some default values from the shell, like: \"DISPLAY\", \"LC_ALL\", \"LC_TIME\", \"LC_NUMERIC\", \"LC_MEASUREMENT\", \"SHELL\", \"USER\", \"USERNAME\", \"LOGNAME\", \"HOME\", \"HOSTNAME\",\"TMPDIR\", \"XPC_SERVICE_NAME\", \"XPC_FLAGS\"

    [tasks]\nclean_command = { cmd = \"python run_in_isolated_env.py\", clean-env = true}\n
    This setting can also be set from the command line with pixi run --clean-env TASK_NAME.

    clean-env not supported on Windows

    On Windows it's hard to create a \"clean environment\" as conda-forge doesn't ship Windows compilers and Windows needs a lot of base variables. This makes the feature not worth implementing, as the number of edge cases would make it unusable.

    "},{"location":"features/advanced_tasks/#our-task-runner-deno_task_shell","title":"Our task runner: deno_task_shell","text":"

    To support the different operating systems (Windows, macOS and Linux), pixi integrates a shell that can run on all of them. This is deno_task_shell. The task shell is a limited implementation of a Bourne shell interface.

    "},{"location":"features/advanced_tasks/#built-in-commands","title":"Built-in commands","text":"

    Next to running actual executables like ./myprogram, cmake or python, the shell has some built-in commands.

    • cp: Copies files.
    • mv: Moves files.
    • rm: Remove files or directories. Ex: rm -rf [FILE]... - Commonly used to recursively delete files or directories.
    • mkdir: Makes directories. Ex. mkdir -p DIRECTORY... - Commonly used to make a directory and all its parents with no error if it exists.
    • pwd: Prints the name of the current/working directory.
    • sleep: Delays for a specified amount of time. Ex. sleep 1 to sleep for 1 second, sleep 0.5 to sleep for half a second, or sleep 1m to sleep a minute
    • echo: Displays a line of text.
    • cat: Concatenates files and outputs them on stdout. When no arguments are provided, it reads and outputs stdin.
    • exit: Causes the shell to exit.
    • unset: Unsets environment variables.
    • xargs: Builds arguments from stdin and executes a command.
    "},{"location":"features/advanced_tasks/#syntax","title":"Syntax","text":"
    • Boolean list: use && or || to separate two commands.
      • &&: if the command before && succeeds continue with the next command.
      • ||: if the command before || fails continue with the next command.
    • Sequential lists: use ; to run two commands without checking if the first command failed or succeeded.
    • Environment variables:
      • Set env variable using: export ENV_VAR=value
      • Use env variable using: $ENV_VAR
      • unset env variable using unset ENV_VAR
    • Shell variables: Shell variables are similar to environment variables, but won't be exported to spawned commands.
      • Set them: VAR=value
      • use them: VAR=value && echo $VAR
    • Pipelines: Pipe the stdout output of a command into the stdin of the following command
      • |: echo Hello | python receiving_app.py
      • |&: use this to also get the stderr as input.
    • Command substitution: $() to use the output of a command as input for another command.
      • python main.py $(git rev-parse HEAD)
    • Negate exit code: ! before any command will negate the exit code from 1 to 0 or vice versa.
    • Redirects: > to redirect the stdout to a file.
      • echo hello > file.txt will put hello in file.txt and overwrite existing text.
      • python main.py 2> file.txt will put the stderr output in file.txt.
      • python main.py &> file.txt will put the stderr and stdout in file.txt.
      • echo hello >> file.txt will append hello to the existing file.txt.
    • Glob expansion: * to expand all options.
      • echo *.py will echo all filenames that end with .py
      • echo **/*.py will echo all filenames that end with .py in this directory and all descendant directories.
      • echo data[0-9].csv will echo all filenames that have a single number after data and before .csv

    More info in deno_task_shell documentation.
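
    As a rough sketch of how several of these constructs can be combined inside a single task (the file names and commands are only illustrative):

    pixi.toml
    [tasks]\n# Shell variable, boolean list, redirect, sequential list and glob expansion in one command\nreport = \"NAME=pixi && echo $NAME > name.txt; cat *.txt\"\n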

    "},{"location":"features/environment/","title":"Environments","text":"

    Pixi is a tool to manage virtual environments. This document explains what an environment looks like and how to use it.

    "},{"location":"features/environment/#structure","title":"Structure","text":"

    A pixi environment is located in the .pixi/envs directory of the project. This location is not configurable as it is a specific design decision to keep the environments in the project directory. This keeps your machine and your project clean and isolated from each other, and makes it easy to clean up after a project is done.

    If you look at the .pixi/envs directory, you will see a directory for each environment: default is the one that is normally used, and if you specify a custom environment, the name you specified will be used.

    .pixi\n\u2514\u2500\u2500 envs\n    \u251c\u2500\u2500 cuda\n    \u2502   \u251c\u2500\u2500 bin\n    \u2502   \u251c\u2500\u2500 conda-meta\n    \u2502   \u251c\u2500\u2500 etc\n    \u2502   \u251c\u2500\u2500 include\n    \u2502   \u251c\u2500\u2500 lib\n    \u2502   ...\n    \u2514\u2500\u2500 default\n        \u251c\u2500\u2500 bin\n        \u251c\u2500\u2500 conda-meta\n        \u251c\u2500\u2500 etc\n        \u251c\u2500\u2500 include\n        \u251c\u2500\u2500 lib\n        ...\n

    These directories are conda environments, and you can use them as such, but you cannot manually edit them; changes should always go through the pixi.toml. Pixi will always make sure the environment is in sync with the pixi.lock file. If this is not the case, then all the commands that use the environment will automatically update the environment, e.g. pixi run, pixi shell.

    "},{"location":"features/environment/#cleaning-up","title":"Cleaning up","text":"

    If you want to clean up the environments, you can simply delete the .pixi/envs directory, and pixi will recreate the environments when needed.

    # either:\nrm -rf .pixi/envs\n\n# or per environment:\nrm -rf .pixi/envs/default\nrm -rf .pixi/envs/cuda\n
    "},{"location":"features/environment/#activation","title":"Activation","text":"

    An environment is nothing more than a set of files installed into a certain location that somewhat mimics a global system install. You need to activate the environment to use it. In the simplest sense, that means adding the bin directory of the environment to the PATH variable. But there is more to it in a conda environment, as it also sets some environment variables.

    To do the activation we have multiple options:

    • Use the pixi shell command to open a shell with the environment activated.
    • Use the pixi shell-hook command to print the command to activate the environment in your current shell.
    • Use the pixi run command to run a command in the environment.

    The run command is special, as it runs its own cross-platform shell and has the ability to run tasks. More information about tasks can be found in the tasks documentation.

    Using pixi shell-hook you would get the following output:

    export PATH=\"/home/user/development/pixi/.pixi/envs/default/bin:/home/user/.local/bin:/home/user/bin:/usr/local/bin:/usr/local/sbin:/usr/bin:/home/user/.pixi/bin\"\nexport CONDA_PREFIX=\"/home/user/development/pixi/.pixi/envs/default\"\nexport PIXI_PROJECT_NAME=\"pixi\"\nexport PIXI_PROJECT_ROOT=\"/home/user/development/pixi\"\nexport PIXI_PROJECT_VERSION=\"0.12.0\"\nexport PIXI_PROJECT_MANIFEST=\"/home/user/development/pixi/pixi.toml\"\nexport CONDA_DEFAULT_ENV=\"pixi\"\nexport PIXI_ENVIRONMENT_PLATFORMS=\"osx-64,linux-64,win-64,osx-arm64\"\nexport PIXI_ENVIRONMENT_NAME=\"default\"\nexport PIXI_PROMPT=\"(pixi) \"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-binutils_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gcc_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gfortran_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/activate-gxx_linux-64.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/libglib_activate.sh\"\n. \"/home/user/development/pixi/.pixi/envs/default/etc/conda/activate.d/rust.sh\"\n

    It sets the PATH and some more environment variables. But more importantly it also runs activation scripts that are presented by the installed packages. An example of this would be the libglib_activate.sh script. Thus, just adding the bin directory to the PATH is not enough.

    "},{"location":"features/environment/#traditional-conda-activate-like-activation","title":"Traditional conda activate-like activation","text":"

    If you prefer to use the traditional conda activate-like activation, you could use the pixi shell-hook command.

    $ which python\npython not found\n$ eval \"$(pixi shell-hook)\"\n$ (default) which python\n/path/to/project/.pixi/envs/default/bin/python\n

    Warning

    It is not encouraged to use the traditional conda activate-like activation, as deactivating the environment is not really possible. Use pixi shell instead.

    "},{"location":"features/environment/#using-pixi-with-direnv","title":"Using pixi with direnv","text":"Installing direnv

    Of course you can use pixi to install direnv globally. We recommend running

    pixi global install direnv

    to install the latest version of direnv on your computer.

    This allows you to use pixi in combination with direnv. Enter the following into your .envrc file:

    .envrc
    watch_file pixi.lock # (1)!\neval \"$(pixi shell-hook)\" # (2)!\n
    1. This ensures that every time your pixi.lock changes, direnv invokes the shell-hook again.
    2. This installs if needed, and activates the environment. direnv ensures that the environment is deactivated when you leave the directory.
    $ cd my-project\ndirenv: error /my-project/.envrc is blocked. Run `direnv allow` to approve its content\n$ direnv allow\ndirenv: loading /my-project/.envrc\n\u2714 Project in /my-project is ready to use!\ndirenv: export +CONDA_DEFAULT_ENV +CONDA_PREFIX +PIXI_ENVIRONMENT_NAME +PIXI_ENVIRONMENT_PLATFORMS +PIXI_PROJECT_MANIFEST +PIXI_PROJECT_NAME +PIXI_PROJECT_ROOT +PIXI_PROJECT_VERSION +PIXI_PROMPT ~PATH\n$ which python\n/my-project/.pixi/envs/default/bin/python\n$ cd ..\ndirenv: unloading\n$ which python\npython not found\n
    "},{"location":"features/environment/#environment-variables","title":"Environment variables","text":"

    The following environment variables are set by pixi, when using the pixi run, pixi shell, or pixi shell-hook command:

    • PIXI_PROJECT_ROOT: The root directory of the project.
    • PIXI_PROJECT_NAME: The name of the project.
    • PIXI_PROJECT_MANIFEST: The path to the manifest file (pixi.toml).
    • PIXI_PROJECT_VERSION: The version of the project.
    • PIXI_PROMPT: The prompt to use in the shell, also used by pixi shell itself.
    • PIXI_ENVIRONMENT_NAME: The name of the environment, defaults to default.
    • PIXI_ENVIRONMENT_PLATFORMS: Comma separated list of platforms supported by the project.
    • CONDA_PREFIX: The path to the environment. (Used by multiple tools that already understand conda environments)
    • CONDA_DEFAULT_ENV: The name of the environment. (Used by multiple tools that already understand conda environments)
    • PATH: We prepend the bin directory of the environment to the PATH variable, so you can use the tools installed in the environment directly.
    • INIT_CWD: ONLY IN pixi run: The directory where the command was run from.

    Note

    Even though the variables are environment variables, they cannot be overridden. E.g. you cannot change the root of the project by setting PIXI_PROJECT_ROOT in the environment.
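
    For example, a task can read these variables directly; this is just a sketch and the task name is illustrative:

    pixi.toml
    [tasks]\n# Prints the project root and the active environment name that pixi sets\nwhere = \"echo $PIXI_PROJECT_ROOT $PIXI_ENVIRONMENT_NAME\"\n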

    "},{"location":"features/environment/#solving-environments","title":"Solving environments","text":"

    When you run a command that uses the environment, pixi will check if the environment is in sync with the pixi.lock file. If it is not, pixi will solve the environment and update it. This means that pixi will retrieve the best set of packages for the dependency requirements that you specified in the pixi.toml and will put the output of the solve step into the pixi.lock file. Solving is a mathematical problem and can take some time, but we take pride in the way we solve environments, and we are confident that we can solve your environment in a reasonable time. If you want to learn more about the solving process, you can read these:

    • Rattler(conda) resolver blog
    • UV(PyPI) resolver blog

    Pixi solves both the conda and PyPI dependencies, where the PyPI dependencies use the conda packages as a base, so you can be sure that the packages are compatible with each other. These solvers are split between the rattler and uv libraries, which handle the heavy lifting of the solving process, executed by our custom SAT solver: resolvo. resolvo is able to solve multiple ecosystems like conda and PyPI. It implements the lazy solving process for PyPI packages, which means that it only downloads the metadata of the packages that are needed to solve the environment. It also supports the conda way of solving, which means that it downloads the metadata of all the packages at once and then solves in one go.

    For the [pypi-dependencies], uv implements sdist building to retrieve the metadata of the packages, and wheel building to install the packages. For this building step, pixi requires python to be installed first via the (conda) [dependencies] section of the pixi.toml file. This will always be slower than the pure conda solves, so for the best pixi experience you should stay within the [dependencies] section of the pixi.toml file.
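
    A minimal sketch of such a manifest; the version constraint and the pypi package name are only examples:

    pixi.toml
    [dependencies]\n# python must come from the conda dependencies so the pypi build step can run\npython = \">=3.11\"\n\n[pypi-dependencies]\n# example pypi package resolved on top of the conda base\nhttpx = \"*\"\n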

    "},{"location":"features/environment/#caching","title":"Caching","text":"

    Pixi caches all previously downloaded packages in a cache folder. This cache folder is shared between all pixi projects and globally installed tools.

    Normally the location would be the following platform-specific default cache folder:

    • Linux: $XDG_CACHE_HOME/rattler or $HOME/.cache/rattler
    • macOS: $HOME/Library/Caches/rattler
    • Windows: %LOCALAPPDATA%\\rattler

    This location is configurable by setting the PIXI_CACHE_DIR or RATTLER_CACHE_DIR environment variable.

    When you want to clean the cache, you can simply delete the cache directory, and pixi will re-create the cache when needed.

    The cache contains multiple folders concerning different caches from within pixi.

    • pkgs: Contains the downloaded/unpacked conda packages.
    • repodata: Contains the conda repodata cache.
    • uv-cache: Contains the uv cache. This includes multiple caches, e.g. built-wheels, wheels, archives
    • http-cache: Contains the conda-pypi mapping cache.
    "},{"location":"features/lockfile/","title":"The pixi.lock lock file","text":"

    A lock file is the protector of the environments, and pixi is the key to unlock it.

    "},{"location":"features/lockfile/#what-is-a-lock-file","title":"What is a lock file?","text":"

    A lock file locks the environment in a specific state. Within pixi a lock file is a description of the packages in an environment. The lock file contains two definitions:

    • The environments that are used in the project with their complete set of packages. e.g.:

      environments:\n    default:\n        channels:\n          - url: https://conda.anaconda.org/conda-forge/\n        packages:\n            linux-64:\n            ...\n            - conda: https://conda.anaconda.org/conda-forge/linux-64/python-3.12.2-hab00c5b_0_cpython.conda\n            ...\n            osx-64:\n            ...\n            - conda: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda\n            ...\n
      • The definition of the packages themselves. e.g.:

        - kind: conda\n  name: python\n  version: 3.12.2\n  build: h9f0c242_0_cpython\n  subdir: osx-64\n  url: https://conda.anaconda.org/conda-forge/osx-64/python-3.12.2-h9f0c242_0_cpython.conda\n  sha256: 7647ac06c3798a182a4bcb1ff58864f1ef81eb3acea6971295304c23e43252fb\n  md5: 0179b8007ba008cf5bec11f3b3853902\n  depends:\n    - bzip2 >=1.0.8,<2.0a0\n    - libexpat >=2.5.0,<3.0a0\n    - libffi >=3.4,<4.0a0\n    - libsqlite >=3.45.1,<4.0a0\n    - libzlib >=1.2.13,<1.3.0a0\n    - ncurses >=6.4,<7.0a0\n    - openssl >=3.2.1,<4.0a0\n    - readline >=8.2,<9.0a0\n    - tk >=8.6.13,<8.7.0a0\n    - tzdata\n    - xz >=5.2.6,<6.0a0\n  constrains:\n    - python_abi 3.12.* *_cp312\n  license: Python-2.0\n  size: 14596811\n  timestamp: 1708118065292\n
    "},{"location":"features/lockfile/#why-a-lock-file","title":"Why a lock file","text":"

    Pixi uses the lock file for the following reasons:

    • To save a working installation state, without copying the entire environment's data.
    • To ensure the project configuration is aligned with the installed environment.
    • To give the user a file that contains all the information about the environment.

    This gives you (and your collaborators) a way to really reproduce the environment you are working in. Using tools such as docker suddenly becomes much less necessary.

    "},{"location":"features/lockfile/#when-is-a-lock-file-generated","title":"When is a lock file generated?","text":"

    A lock file is generated when you install a package. More specifically, a lock file is generated from the solve step of the installation process. The solve will return a list of packages that are to be installed, and the lock file will be generated from this list. This diagram tries to explain the process:

    graph TD\n    A[Install] --> B[Solve]\n    B --> C[Generate and write lock file]\n    C --> D[Install Packages]
    "},{"location":"features/lockfile/#how-to-use-a-lock-file","title":"How to use a lock file","text":"

    Do not edit the lock file

    A lock file is a machine only file, and should not be edited by hand.

    That said, the pixi.lock is human-readable, so it's easy to track the changes in the environment. We recommend you track the lock file in git or other version control systems. This will ensure that the environment is always reproducible and that you can always revert back to a working state, in case something goes wrong. The pixi.lock and the manifest file pixi.toml/pyproject.toml should always be in sync.

    Running the following commands will check and automatically update the lock file if you changed any dependencies:

    • pixi install
    • pixi run
    • pixi shell
    • pixi shell-hook
    • pixi tree
    • pixi list
    • pixi add
    • pixi remove

    All the commands that support the interaction with the lock file also include some lock file usage options:

    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file[^1]. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.

    Syncing the lock file with the manifest file

    The lock file is always matched with the whole configuration in the manifest file. This means that if you change the manifest file, the lock file will be updated.

    flowchart TD\n    C[manifest] --> A[lockfile] --> B[environment]

    "},{"location":"features/lockfile/#lockfile-satisfiability","title":"Lockfile satisfiability","text":"

    The lock file is a description of the environment, and it should always be satisfiable. Satisfiable means that the given manifest file and the created environment are in sync with the lockfile. If the lock file is not satisfiable, pixi will generate a new lock file automatically.

    Steps to check if the lock file is satisfiable:

    • All environments in the manifest file are in the lock file
    • All channels in the manifest file are in the lock file
    • All packages in the manifest file are in the lock file, and the versions in the lock file are compatible with the requirements in the manifest file, for both conda and pypi packages.
      • Conda packages use a matchspec which can match on all the information we store in the lockfile, even timestamp, subdir and license.
    • If pypi-dependencies are added, all conda packages that are python packages in the lock file have a purls field.
    • All hashes for the pypi editable packages are correct.
    • There is only a single entry for every package in the lock file.

    If you want to get more details, check out the actual code, as this is a simplification.

    "},{"location":"features/lockfile/#the-version-of-the-lock-file","title":"The version of the lock file","text":"

    The lock file has a version number, this is to ensure that the lock file is compatible with the local version of pixi.

    version: 4\n

    Pixi is backward compatible with the lock file, but not forward compatible. This means that you can use an older lock file with a newer version of pixi, but not the other way around.

    "},{"location":"features/lockfile/#your-lock-file-is-big","title":"Your lock file is big","text":"

    The lock file can grow quite large, especially if you have a lot of packages installed. This is because the lock file contains all the information about the packages.

    1. We try to keep the lock file as small as possible.
    2. It's always smaller than a docker image.
    3. Downloading the lock file is always faster than downloading the incorrect packages.
    "},{"location":"features/lockfile/#you-dont-need-a-lock-file-because","title":"You don't need a lock file because...","text":"

    If you can not think of a case where you would benefit from a fast reproducible environment, then you don't need a lock file.

    But take note of the following:

    • A lock file allows you to run the same environment on different machines, think CI systems.
    • It also allows you to go back to a working state if you have made a mistake.
    • It helps other users onboard to your project as they don't have to figure out the environment setup or solve dependency issues.
    "},{"location":"features/lockfile/#removing-the-lock-file","title":"Removing the lock file","text":"

    If you want to remove the lock file, you can simply delete it.

    rm pixi.lock\n

    This will remove the lock file, and the next time you run a command that requires the lock file, it will be generated again.

    Note

    This does remove the locked state of the environment, and the environment will be updated to the latest version of the packages.

    "},{"location":"features/multi_environment/","title":"Multi Environment Support","text":""},{"location":"features/multi_environment/#motivating-example","title":"Motivating Example","text":"

    There are multiple scenarios where multiple environments are useful.

    • Testing of multiple package versions, e.g. py39 and py310 or polars 0.12 and 0.13.
    • Smaller single tool environments, e.g. lint or docs.
    • Large developer environments, that combine all the smaller environments, e.g. dev.
    • Strict supersets of environments, e.g. prod and test-prod where test-prod is a strict superset of prod.
    • Multiple machines from one project, e.g. a cuda environment and a cpu environment.
    • And many more. (Feel free to edit this document in our GitHub and add your use case.)

    This prepares pixi for use in large projects with multiple use-cases, multiple developers and different CI needs.

    "},{"location":"features/multi_environment/#design-considerations","title":"Design Considerations","text":"

    There are a few things we wanted to keep in mind in the design:

    1. User-friendliness: Pixi is a user-focused tool that goes beyond developers. The feature should have good error reporting and helpful documentation from the start.
    2. Keep it simple: Not understanding the multiple environments feature shouldn't keep a user from using pixi. The feature should be \"invisible\" to the non-multi env use-cases.
    3. No Automatic Combinatorial: To ensure the dependency resolution process remains manageable, the solution should avoid a combinatorial explosion of dependency sets. By making the environments user defined and not automatically inferred by testing a matrix of the features.
    4. Single environment Activation: The design should allow only one environment to be active at any given time, simplifying the resolution process and preventing conflicts.
    5. Fixed lock files: It's crucial to preserve fixed lock files for consistency and predictability. Solutions must ensure reliability not just for authors but also for end-users, particularly at the time of lock file creation.
    "},{"location":"features/multi_environment/#feature-environment-set-definitions","title":"Feature & Environment Set Definitions","text":"

    Introduce environment sets into the pixi.toml; these describe environments based on features. Introduce features into the pixi.toml that can describe parts of environments. As an environment goes beyond just dependencies, the features should be described with the following fields:

    • dependencies: The conda package dependencies
    • pypi-dependencies: The pypi package dependencies
    • system-requirements: The system requirements of the environment
    • activation: The activation information for the environment
    • platforms: The platforms the environment can be run on.
    • channels: The channels used to create the environment. Adding the priority field to the channels to allow concatenation of channels instead of overwriting.
    • target: All the above features but also separated by targets.
    • tasks: Feature-specific tasks; tasks in one environment are selected as default tasks for the environment.
    Default features
    [dependencies] # short for [feature.default.dependencies]\npython = \"*\"\nnumpy = \"==2.3\"\n\n[pypi-dependencies] # short for [feature.default.pypi-dependencies]\npandas = \"*\"\n\n[system-requirements] # short for [feature.default.system-requirements]\nlibc = \"2.33\"\n\n[activation] # short for [feature.default.activation]\nscripts = [\"activate.sh\"]\n
    Different dependencies per feature
    [feature.py39.dependencies]\npython = \"~=3.9.0\"\n[feature.py310.dependencies]\npython = \"~=3.10.0\"\n[feature.test.dependencies]\npytest = \"*\"\n
    Full set of environment modification in one feature
    [feature.cuda]\ndependencies = {cuda = \"x.y.z\", cudnn = \"12.0\"}\npypi-dependencies = {torch = \"1.9.0\"}\nplatforms = [\"linux-64\", \"osx-arm64\"]\nactivation = {scripts = [\"cuda_activation.sh\"]}\nsystem-requirements = {cuda = \"12\"}\n# Channels concatenate using a priority instead of overwrite, so the default channels are still used.\n# Using the priority the concatenation is controlled, default is 0, the default channels are used last.\n# Highest priority comes first.\nchannels = [\"nvidia\", {channel = \"pytorch\", priority = -1}] # Results in:  [\"nvidia\", \"conda-forge\", \"pytorch\"] when the default is `conda-forge`\ntasks = { warmup = \"python warmup.py\" }\ntarget.osx-arm64 = {dependencies = {mlx = \"x.y.z\"}}\n
    Define tasks as defaults of an environment
    [feature.test.tasks]\ntest = \"pytest\"\n\n[environments]\ntest = [\"test\"]\n\n# `pixi run test` == `pixi run --environment test test`\n

    The environment definition should contain the following fields:

    • features: Vec<Feature>: The features that are included in the environment set, which is also the default field in the environments.
    • solve-group: String: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional dependencies. For instance when testing a production environment with additional test dependencies.
    Creating environments from features
    [environments]\n# implicit: default = [\"default\"]\ndefault = [\"py39\"] # implicit: default = [\"py39\", \"default\"]\npy310 = [\"py310\"] # implicit: py310 = [\"py310\", \"default\"]\ntest = [\"test\"] # implicit: test = [\"test\", \"default\"]\ntest39 = [\"test\", \"py39\"] # implicit: test39 = [\"test\", \"py39\", \"default\"]\n
    Testing a production environment with additional dependencies
    [environments]\n# Creating a `prod` environment which is the minimal set of dependencies used for production.\nprod = {features = [\"py39\"], solve-group = \"prod\"}\n# Creating a `test_prod` environment which is the `prod` environment plus the `test` feature.\ntest_prod = {features = [\"py39\", \"test\"], solve-group = \"prod\"}\n# Using the `solve-group` to solve the `prod` and `test_prod` environments together\n# Which makes sure the tested environment has the same version of the dependencies as the production environment.\n
    Creating environments without including the default feature
    [dependencies]\npython = \"*\"\nnumpy = \"*\"\n\n[feature.lint.dependencies]\npre-commit = \"*\"\n\n[environments]\n# Create a custom environment which only has the `lint` feature (numpy isn't part of that env).\nlint = {features = [\"lint\"], no-default-feature = true}\n
    "},{"location":"features/multi_environment/#lock-file-structure","title":"lock file Structure","text":"

    Within the pixi.lock file, a package may now include an additional environments field, specifying the environments to which it belongs. To avoid duplication, the package's environments field may contain multiple environments, so the lock file stays minimal in size.

    - platform: linux-64\n  name: pre-commit\n  version: 3.3.3\n  category: main\n  environments:\n    - dev\n    - test\n    - lint\n  ...:\n- platform: linux-64\n  name: python\n  version: 3.9.3\n  category: main\n  environments:\n    - dev\n    - test\n    - lint\n    - py39\n    - default\n  ...:\n
    "},{"location":"features/multi_environment/#user-interface-environment-activation","title":"User Interface Environment Activation","text":"

    Users can manually activate the desired environment via command line or configuration. This approach guarantees a conflict-free environment by allowing only one feature set to be active at a time. For the user, the CLI would look like this:

    Default behavior
    \u279c pixi run python\n# Runs python in the `default` environment\n
    Activating a specific environment
    \u279c pixi run -e test pytest\n\u279c pixi run --environment test pytest\n# Runs `pytest` in the `test` environment\n
    Activating a shell in an environment
    \u279c pixi shell -e cuda\npixi shell --environment cuda\n# Starts a shell in the `cuda` environment\n
    Running any command in an environment
    \u279c pixi run -e test any_command\n# Runs any_command in the `test` environment which doesn't require to be predefined as a task.\n
    "},{"location":"features/multi_environment/#ambiguous-environment-selection","title":"Ambiguous Environment Selection","text":"

    It's possible to define tasks in multiple environments; in this case the user should be prompted to select the environment.

    Here is a simple example of a task only manifest:

    pixi.toml

    [project]\nname = \"test_ambiguous_env\"\nchannels = []\nplatforms = [\"linux-64\", \"win-64\", \"osx-64\", \"osx-arm64\"]\n\n[tasks]\ndefault = \"echo Default\"\nambi = \"echo Ambi::Default\"\n[feature.test.tasks]\ntest = \"echo Test\"\nambi = \"echo Ambi::Test\"\n\n[feature.dev.tasks]\ndev = \"echo Dev\"\nambi = \"echo Ambi::Dev\"\n\n[environments]\ndefault = [\"test\", \"dev\"]\ntest = [\"test\"]\ndev = [\"dev\"]\n
    Trying to run the ambi task will prompt the user to select the environment, as it is available in all environments.

    Interactive selection of environments if task is in multiple environments
    \u279c pixi run ambi\n? The task 'ambi' can be run in multiple environments.\n\nPlease select an environment to run the task in: \u203a\n\u276f default # selecting default\n  test\n  dev\n\n\u2728 Pixi task (ambi in default): echo Ambi::Test\nAmbi::Test\n

    As you can see, it runs the task defined in the test feature, but it is run in the default environment. This happens because the ambi task defined in the test feature overwrites the one defined under [tasks] for the default environment. So the default feature's ambi task is now unreachable from any environment.

    Some other results running in this example:

    \u279c pixi run --environment test ambi\n\u2728 Pixi task (ambi in test): echo Ambi::Test\nAmbi::Test\n\n\u279c pixi run --environment dev ambi\n\u2728 Pixi task (ambi in dev): echo Ambi::Dev\nAmbi::Dev\n\n# dev is run in the default environment\n\u279c pixi run dev\n\u2728 Pixi task (dev in default): echo Dev\nDev\n\n# dev is run in the dev environment\n\u279c pixi run -e dev dev\n\u2728 Pixi task (dev in dev): echo Dev\nDev\n

    "},{"location":"features/multi_environment/#important-links","title":"Important links","text":"
    • Initial writeup of the proposal: GitHub Gist by 0xbe7a
    • GitHub project: #10
    "},{"location":"features/multi_environment/#real-world-example-use-cases","title":"Real world example use cases","text":"Polarify test setup

In polarify they want to test multiple Python versions combined with multiple versions of polars. This is currently done using a matrix in GitHub Actions, which can be replaced by multiple environments.

    pixi.toml
    [project]\nname = \"polarify\"\n# ...\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-arm64\", \"osx-64\", \"win-64\"]\n\n[tasks]\npostinstall = \"pip install --no-build-isolation --no-deps --disable-pip-version-check -e .\"\n\n[dependencies]\npython = \">=3.9\"\npip = \"*\"\npolars = \">=0.14.24,<0.21\"\n\n[feature.py39.dependencies]\npython = \"3.9.*\"\n[feature.py310.dependencies]\npython = \"3.10.*\"\n[feature.py311.dependencies]\npython = \"3.11.*\"\n[feature.py312.dependencies]\npython = \"3.12.*\"\n[feature.pl017.dependencies]\npolars = \"0.17.*\"\n[feature.pl018.dependencies]\npolars = \"0.18.*\"\n[feature.pl019.dependencies]\npolars = \"0.19.*\"\n[feature.pl020.dependencies]\npolars = \"0.20.*\"\n\n[feature.test.dependencies]\npytest = \"*\"\npytest-md = \"*\"\npytest-emoji = \"*\"\nhypothesis = \"*\"\n[feature.test.tasks]\ntest = \"pytest\"\n\n[feature.lint.dependencies]\npre-commit = \"*\"\n[feature.lint.tasks]\nlint = \"pre-commit run --all\"\n\n[environments]\npl017 = [\"pl017\", \"py39\", \"test\"]\npl018 = [\"pl018\", \"py39\", \"test\"]\npl019 = [\"pl019\", \"py39\", \"test\"]\npl020 = [\"pl020\", \"py39\", \"test\"]\npy39 = [\"py39\", \"test\"]\npy310 = [\"py310\", \"test\"]\npy311 = [\"py311\", \"test\"]\npy312 = [\"py312\", \"test\"]\n
    .github/workflows/test.yml
    jobs:\n  tests-per-env:\n    runs-on: ubuntu-latest\n    strategy:\n      matrix:\n        environment: [py311, py312]\n    steps:\n    - uses: actions/checkout@v4\n      - uses: prefix-dev/setup-pixi@v0.5.1\n        with:\n          environments: ${{ matrix.environment }}\n      - name: Run tasks\n        run: |\n          pixi run --environment ${{ matrix.environment }} test\n  tests-with-multiple-envs:\n    runs-on: ubuntu-latest\n    steps:\n    - uses: actions/checkout@v4\n    - uses: prefix-dev/setup-pixi@v0.5.1\n      with:\n       environments: pl017 pl018\n    - run: |\n        pixi run -e pl017 test\n        pixi run -e pl018 test\n
    Test vs Production example

    This is an example of a project that has a test feature and prod environment. The prod environment is a production environment that contains the run dependencies. The test feature is a set of dependencies and tasks that we want to put on top of the previously solved prod environment. This is a common use case where we want to test the production environment with additional dependencies.

    pixi.toml

    [project]\nname = \"my-app\"\n# ...\nchannels = [\"conda-forge\"]\nplatforms = [\"osx-arm64\", \"linux-64\"]\n\n[tasks]\npostinstall-e = \"pip install --no-build-isolation --no-deps --disable-pip-version-check -e .\"\npostinstall = \"pip install --no-build-isolation --no-deps --disable-pip-version-check .\"\ndev = \"uvicorn my_app.app:main --reload\"\nserve = \"uvicorn my_app.app:main\"\n\n[dependencies]\npython = \">=3.12\"\npip = \"*\"\npydantic = \">=2\"\nfastapi = \">=0.105.0\"\nsqlalchemy = \">=2,<3\"\nuvicorn = \"*\"\naiofiles = \"*\"\n\n[feature.test.dependencies]\npytest = \"*\"\npytest-md = \"*\"\npytest-asyncio = \"*\"\n[feature.test.tasks]\ntest = \"pytest --md=report.md\"\n\n[environments]\n# both default and prod will have exactly the same dependency versions when they share a dependency\ndefault = {features = [\"test\"], solve-group = \"prod-group\"}\nprod = {features = [], solve-group = \"prod-group\"}\n
In CI, you would run the following commands:
    pixi run postinstall-e && pixi run test\n
    Locally you would run the following command:
    pixi run postinstall-e && pixi run dev\n

Then in a Dockerfile, you would use the following: Dockerfile

    FROM ghcr.io/prefix-dev/pixi:latest # this doesn't exist yet\nWORKDIR /app\nCOPY . .\nRUN pixi run --environment prod postinstall\nEXPOSE 8080\nCMD [\"/usr/local/bin/pixi\", \"run\", \"--environment\", \"prod\", \"serve\"]\n

    Multiple machines from one project

This is an example of an ML project that should be executable on machines that support cuda or mlx, as well as on machines that support neither; the cpu feature covers the latter case.

    pixi.toml
    [project]\nname = \"my-ml-project\"\ndescription = \"A project that does ML stuff\"\nauthors = [\"Your Name <your.name@gmail.com>\"]\nchannels = [\"conda-forge\", \"pytorch\"]\n# All platforms that are supported by the project as the features will take the intersection of the platforms defined there.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[tasks]\ntrain-model = \"python train.py\"\nevaluate-model = \"python test.py\"\n\n[dependencies]\npython = \"3.11.*\"\npytorch = {version = \">=2.0.1\", channel = \"pytorch\"}\ntorchvision = {version = \">=0.15\", channel = \"pytorch\"}\npolars = \">=0.20,<0.21\"\nmatplotlib-base = \">=3.8.2,<3.9\"\nipykernel = \">=6.28.0,<6.29\"\n\n[feature.cuda]\nplatforms = [\"win-64\", \"linux-64\"]\nchannels = [\"nvidia\", {channel = \"pytorch\", priority = -1}]\nsystem-requirements = {cuda = \"12.1\"}\n\n[feature.cuda.tasks]\ntrain-model = \"python train.py --cuda\"\nevaluate-model = \"python test.py --cuda\"\n\n[feature.cuda.dependencies]\npytorch-cuda = {version = \"12.1.*\", channel = \"pytorch\"}\n\n[feature.mlx]\nplatforms = [\"osx-arm64\"]\n# MLX is only available on macOS >=13.5 (>14.0 is recommended)\nsystem-requirements = {macos = \"13.5\"}\n\n[feature.mlx.tasks]\ntrain-model = \"python train.py --mlx\"\nevaluate-model = \"python test.py --mlx\"\n\n[feature.mlx.dependencies]\nmlx = \">=0.16.0,<0.17.0\"\n\n[feature.cpu]\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[environments]\ncuda = [\"cuda\"]\nmlx = [\"mlx\"]\ndefault = [\"cpu\"]\n
    Running the project on a cuda machine
    pixi run train-model --environment cuda\n# will execute `python train.py --cuda`\n# fails if not on linux-64 or win-64 with cuda 12.1\n
    Running the project with mlx
    pixi run train-model --environment mlx\n# will execute `python train.py --mlx`\n# fails if not on osx-arm64\n
    Running the project on a machine without cuda or mlx
    pixi run train-model\n
    "},{"location":"features/multi_platform_configuration/","title":"Multi platform config","text":"

    Pixi's vision includes being supported on all major platforms. Sometimes that needs some extra configuration to work well. On this page, you will learn what you can configure to align better with the platform you are making your application for.

    Here is an example manifest file that highlights some of the features:

pixi.toml
    [project]\n# Default project info....\n# A list of platforms you are supporting with your package.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[dependencies]\npython = \">=3.8\"\n\n[target.win-64.dependencies]\n# Overwrite the needed python version only on win-64\npython = \"3.7\"\n\n\n[activation]\nscripts = [\"setup.sh\"]\n\n[target.win-64.activation]\n# Overwrite activation scripts only for windows\nscripts = [\"setup.bat\"]\n
    pyproject.toml
    [tool.pixi.project]\n# Default project info....\n# A list of platforms you are supporting with your package.\nplatforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[tool.pixi.dependencies]\npython = \">=3.8\"\n\n[tool.pixi.target.win-64.dependencies]\n# Overwrite the needed python version only on win-64\npython = \"~=3.7.0\"\n\n\n[tool.pixi.activation]\nscripts = [\"setup.sh\"]\n\n[tool.pixi.target.win-64.activation]\n# Overwrite activation scripts only for windows\nscripts = [\"setup.bat\"]\n
    "},{"location":"features/multi_platform_configuration/#platform-definition","title":"Platform definition","text":"

    The project.platforms defines which platforms your project supports. When multiple platforms are defined, pixi determines which dependencies to install for each platform individually. All of this is stored in a lock file.

Running pixi install on a platform that is not configured will warn the user that the project is not set up for that platform:

    \u276f pixi install\n  \u00d7 the project is not configured for your current platform\n   \u256d\u2500[pixi.toml:6:1]\n 6 \u2502 channels = [\"conda-forge\"]\n 7 \u2502 platforms = [\"osx-64\", \"osx-arm64\", \"win-64\"]\n   \u00b7             \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n   \u00b7                             \u2570\u2500\u2500 add 'linux-64' here\n 8 \u2502\n   \u2570\u2500\u2500\u2500\u2500\n  help: The project needs to be configured to support your platform (linux-64).\n
    "},{"location":"features/multi_platform_configuration/#target-specifier","title":"Target specifier","text":"

    With the target specifier, you can overwrite the original configuration specifically for a single platform. If you are targeting a specific platform in your target specifier that was not specified in your project.platforms then pixi will throw an error.

    "},{"location":"features/multi_platform_configuration/#dependencies","title":"Dependencies","text":"

    It might happen that you want to install a certain dependency only on a specific platform, or you might want to use a different version on different platforms.

    pixi.toml
    [dependencies]\npython = \">=3.8\"\n\n[target.win-64.dependencies]\nmsmpi = \"*\"\npython = \"3.8\"\n

In the above example, we specify that we depend on msmpi only on Windows. We also specifically want python 3.8 when installing on Windows. This will overwrite the dependencies from the generic set of dependencies. This will not touch any of the other platforms.

    You can use pixi's cli to add these dependencies to the manifest file.

    pixi add --platform win-64 posix\n

    This also works for the host and build dependencies.

    pixi add --host --platform win-64 posix\npixi add --build --platform osx-64 clang\n

    Which results in this.

    pixi.toml
    [target.win-64.host-dependencies]\nposix = \"1.0.0.*\"\n\n[target.osx-64.build-dependencies]\nclang = \"16.0.6.*\"\n
    "},{"location":"features/multi_platform_configuration/#activation","title":"Activation","text":"

Pixi's vision is to enable completely cross-platform projects, but you often need to run tools that are not built by your project. Generated activation scripts are often in this category: the default scripts on Unix are bash and on Windows they are bat.

    To deal with this, you can define your activation scripts using the target definition.

    pixi.toml

    [activation]\nscripts = [\"setup.sh\", \"local_setup.bash\"]\n\n[target.win-64.activation]\nscripts = [\"setup.bat\", \"local_setup.bat\"]\n
When this project is run on win-64, it will only execute the target scripts, not the scripts specified in the default activation.scripts.
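As an illustration, the platform-specific scripts could look something like this (a minimal sketch; the variable name MY_PROJECT_ROOT and the script contents are hypothetical):

setup.sh
# setup.sh (unix) -- hypothetical example\nexport MY_PROJECT_ROOT=\"$(pwd)\"\n
setup.bat
:: setup.bat (windows) -- hypothetical example\nset MY_PROJECT_ROOT=%cd%\n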

    "},{"location":"features/system_requirements/","title":"System Requirements in pixi","text":"

    System requirements define the minimal system specifications necessary during dependency resolution for a project. For instance, specifying a Unix system with a particular minimal libc version ensures that dependencies are compatible with the project's environment.

    System specifications are closely related to virtual packages, allowing for flexible and accurate dependency management.

    "},{"location":"features/system_requirements/#default-system-requirements","title":"Default System Requirements","text":"

    The following configurations outline the default minimal system requirements for different operating systems:

Linux / Windows / osx-64 / osx-arm64
    # Default system requirements for Linux\n[system-requirements]\nlinux = \"4.18\"\nlibc = { family = \"glibc\", version = \"2.28\" }\n

    Windows currently has no minimal system requirements defined. If your project requires specific Windows configurations, you should define them accordingly.

    # Default system requirements for macOS\n[system-requirements]\nmacos = \"13.0\"\n
    # Default system requirements for macOS ARM64\n[system-requirements]\nmacos = \"13.0\"\n
    "},{"location":"features/system_requirements/#customizing-system-requirements","title":"Customizing System Requirements","text":"

    You only need to define system requirements if your project necessitates a different set from the defaults. This is common when installing environments on older or newer versions of operating systems.

    "},{"location":"features/system_requirements/#adjusting-for-older-systems","title":"Adjusting for Older Systems","text":"

    If you're encountering an error like:

    \u00d7 The current system has a mismatching virtual package. The project requires '__linux' to be at least version '4.18' but the system has version '4.12.14'\n

    This indicates that the project's system requirements are higher than your current system's specifications. To resolve this, you can lower the system requirements in your project's configuration:

    [system-requirements]\nlinux = \"4.12.14\"\n

    This adjustment informs the dependency resolver to accommodate the older system version.

    "},{"location":"features/system_requirements/#using-cuda-in-pixi","title":"Using CUDA in pixi","text":"

    To utilize CUDA in your project, you must specify the desired CUDA version in the system-requirements table. This ensures that CUDA is recognized and appropriately locked into the lock file if necessary.

    Example Configuration

    [system-requirements]\ncuda = \"12\"  # Replace \"12\" with the specific CUDA version you intend to use\n
    "},{"location":"features/system_requirements/#setting-system-requirements-environment-specific","title":"Setting System Requirements environment specific","text":"

This can be set per feature in the manifest file.

    [feature.cuda.system-requirements]\ncuda = \"12\"\n\n[environments]\ncuda = [\"cuda\"]\n
    "},{"location":"features/system_requirements/#available-override-options","title":"Available Override Options","text":"

    In certain scenarios, you might need to override the system requirements detected on your machine. This can be particularly useful when working on systems that do not meet the project's default requirements.

    You can override virtual packages by setting the following environment variables:

    • CONDA_OVERRIDE_CUDA - Sets the CUDA version. Usage example: CONDA_OVERRIDE_CUDA=11
    • CONDA_OVERRIDE_GLIBC - Sets the glibc version. Usage example: CONDA_OVERRIDE_GLIBC=2.28
    • CONDA_OVERRIDE_OSX - Sets the macOS version. Usage example: CONDA_OVERRIDE_OSX=13.0
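For example, to solve a CUDA-enabled environment on a machine where no (or a different) CUDA driver is detected, you could override the detected version when invoking pixi (a sketch; the version number and the train-model task are only illustrative):

CONDA_OVERRIDE_CUDA=12 pixi install\nCONDA_OVERRIDE_CUDA=12 pixi run train-model\n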
    "},{"location":"features/system_requirements/#additional-resources","title":"Additional Resources","text":"

    For more detailed information on managing virtual packages and overriding system requirements, refer to the Conda Documentation.

    "},{"location":"ide_integration/devcontainer/","title":"Use pixi inside of a devcontainer","text":"

    VSCode Devcontainers are a popular tool to develop on a project with a consistent environment. They are also used in GitHub Codespaces which makes it a great way to develop on a project without having to install anything on your local machine.

    To use pixi inside of a devcontainer, follow these steps:

    Create a new directory .devcontainer in the root of your project. Then, create the following two files in the .devcontainer directory:

    .devcontainer/Dockerfile
    FROM mcr.microsoft.com/devcontainers/base:jammy\n\nARG PIXI_VERSION=v0.32.1\n\nRUN curl -L -o /usr/local/bin/pixi -fsSL --compressed \"https://github.com/prefix-dev/pixi/releases/download/${PIXI_VERSION}/pixi-$(uname -m)-unknown-linux-musl\" \\\n    && chmod +x /usr/local/bin/pixi \\\n    && pixi info\n\n# set some user and workdir settings to work nicely with vscode\nUSER vscode\nWORKDIR /home/vscode\n\nRUN echo 'eval \"$(pixi completion -s bash)\"' >> /home/vscode/.bashrc\n
    .devcontainer/devcontainer.json
    {\n    \"name\": \"my-project\",\n    \"build\": {\n      \"dockerfile\": \"Dockerfile\",\n      \"context\": \"..\",\n    },\n    \"customizations\": {\n      \"vscode\": {\n        \"settings\": {},\n        \"extensions\": [\"ms-python.python\", \"charliermarsh.ruff\", \"GitHub.copilot\"]\n      }\n    },\n    \"features\": {\n      \"ghcr.io/devcontainers/features/docker-in-docker:2\": {}\n    },\n    \"mounts\": [\"source=${localWorkspaceFolderBasename}-pixi,target=${containerWorkspaceFolder}/.pixi,type=volume\"],\n    \"postCreateCommand\": \"sudo chown vscode .pixi && pixi install\"\n}\n

    Put .pixi in a mount

    In the above example, we mount the .pixi directory into a volume. This is needed since the .pixi directory shouldn't be on a case insensitive filesystem (default on macOS, Windows) but instead in its own volume. There are some conda packages (for example ncurses-feedstock#73) that contain files that only differ in case which leads to errors on case insensitive filesystems.

    "},{"location":"ide_integration/devcontainer/#secrets","title":"Secrets","text":"

    If you want to authenticate to a private conda channel, you can add secrets to your devcontainer.

    .devcontainer/devcontainer.json
    {\n    \"build\": \"Dockerfile\",\n    \"context\": \"..\",\n    \"options\": [\n        \"--secret\",\n        \"id=prefix_dev_token,env=PREFIX_DEV_TOKEN\",\n    ],\n    // ...\n}\n
    .devcontainer/Dockerfile
    # ...\nRUN --mount=type=secret,id=prefix_dev_token,uid=1000 \\\n    test -s /run/secrets/prefix_dev_token \\\n    && pixi auth login --token \"$(cat /run/secrets/prefix_dev_token)\" https://repo.prefix.dev\n

    These secrets need to be present either as an environment variable when starting the devcontainer locally or in your GitHub Codespaces settings under Secrets.

    "},{"location":"ide_integration/jupyterlab/","title":"JupyterLab Integration","text":""},{"location":"ide_integration/jupyterlab/#basic-usage","title":"Basic usage","text":"

Using JupyterLab with pixi is very simple. You can just create a new pixi project and add the jupyterlab package to it. The full example is provided under the following GitHub link.

    pixi init\npixi add jupyterlab\n

    This will create a new pixi project and add the jupyterlab package to it. You can then start JupyterLab using the following command:

    pixi run jupyter lab\n

    If you want to add more \"kernels\" to JupyterLab, you can simply add them to your current project \u2013 as well as any dependencies from the scientific stack you might need.

    pixi add bash_kernel ipywidgets matplotlib numpy pandas  # ...\n
    "},{"location":"ide_integration/jupyterlab/#what-kernels-are-available","title":"What kernels are available?","text":"

    You can easily install more \"kernels\" for JupyterLab. The conda-forge repository has a number of interesting additional kernels - not just Python!

    • bash_kernel A kernel for bash
    • xeus-cpp A C++ kernel based on the new clang-repl
    • xeus-cling A C++ kernel based on the slightly older Cling
    • xeus-lua A Lua kernel
    • xeus-sql A kernel for SQL
    • r-irkernel An R kernel
    "},{"location":"ide_integration/jupyterlab/#advanced-usage","title":"Advanced usage","text":"

    If you want to have only one instance of JupyterLab running but still want per-directory Pixi environments, you can use one of the kernels provided by the pixi-kernel package.

    "},{"location":"ide_integration/jupyterlab/#configuring-jupyterlab","title":"Configuring JupyterLab","text":"

    To get started, create a Pixi project, add jupyterlab and pixi-kernel and then start JupyterLab:

    pixi init\npixi add jupyterlab pixi-kernel\npixi run jupyter lab\n

    This will start JupyterLab and open it in your browser.

pixi-kernel searches for a manifest file, either pixi.toml or pyproject.toml, in the same directory as your notebook or in any parent directory. When it finds one, it will use the environment specified in the manifest file to start the kernel and run your notebooks.

    "},{"location":"ide_integration/jupyterlab/#binder","title":"Binder","text":"

    If you just want to check a JupyterLab environment running in the cloud using pixi-kernel, you can visit Binder.

    "},{"location":"ide_integration/pycharm/","title":"PyCharm Integration","text":"

    You can use PyCharm with pixi environments by using the conda shim provided by the pixi-pycharm package.

    "},{"location":"ide_integration/pycharm/#how-to-use","title":"How to use","text":"

    To get started, add pixi-pycharm to your pixi project.

    pixi add pixi-pycharm\n

    This will ensure that the conda shim is installed in your project's environment.

    Having pixi-pycharm installed, you can now configure PyCharm to use your pixi environments. Go to the Add Python Interpreter dialog (bottom right corner of the PyCharm window) and select Conda Environment. Set Conda Executable to the full path of the conda file (on Windows: conda.bat) which is located in .pixi/envs/default/libexec. You can get the path using the following command:

Linux & macOS / Windows
    pixi run 'echo $CONDA_PREFIX/libexec/conda'\n
    pixi run 'echo $CONDA_PREFIX\\\\libexec\\\\conda.bat'\n

    This is an executable that tricks PyCharm into thinking it's the proper conda executable. Under the hood it redirects all calls to the corresponding pixi equivalent.

    Use the conda shim from this pixi project

    Please make sure that this is the conda shim from this pixi project and not another one. If you use multiple pixi projects, you might have to adjust the path accordingly as PyCharm remembers the path to the conda executable.

    Having selected the environment, PyCharm will now use the Python interpreter from your pixi environment.

    PyCharm should now be able to show you the installed packages as well.

    You can now run your programs and tests as usual.

    Mark .pixi as excluded

    In order for PyCharm to not get confused about the .pixi directory, please mark it as excluded.

    Also, when using a remote interpreter, you should exclude the .pixi directory on the remote machine. Instead, you should run pixi install on the remote machine and select the conda shim from there.

    "},{"location":"ide_integration/pycharm/#multiple-environments","title":"Multiple environments","text":"

If your project uses multiple environments to test different Python versions or dependencies, you can add multiple environments to PyCharm by specifying Use existing environment in the Add Python Interpreter dialog.

    You can then specify the corresponding environment in the bottom right corner of the PyCharm window.

    "},{"location":"ide_integration/pycharm/#multiple-pixi-projects","title":"Multiple pixi projects","text":"

When using multiple pixi projects, remember to select the correct Conda Executable for each project as mentioned above. It might also come up that you have multiple environments with the same name.

    It is recommended to rename the environments to something unique.

    "},{"location":"ide_integration/pycharm/#debugging","title":"Debugging","text":"

    Logs are written to ~/.cache/pixi-pycharm.log. You can use them to debug problems. Please attach the logs when filing a bug report.

    "},{"location":"ide_integration/r_studio/","title":"Developing R scripts in RStudio","text":"

    You can use pixi to manage your R dependencies. The conda-forge channel contains a wide range of R packages that can be installed using pixi.

    "},{"location":"ide_integration/r_studio/#installing-r-packages","title":"Installing R packages","text":"

    R packages are usually prefixed with r- in the conda-forge channel. To install an R package, you can use the following command:

    pixi add r-<package-name>\n# for example\npixi add r-ggplot2\n
    "},{"location":"ide_integration/r_studio/#using-r-packages-in-rstudio","title":"Using R packages in RStudio","text":"

    To use the R packages installed by pixi in RStudio, you need to run rstudio from an activated environment. This can be achieved by running RStudio from pixi shell or from a task in the pixi.toml file.
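For example, on Linux you could start a shell in the environment and launch RStudio from it (a minimal sketch, assuming RStudio is installed system-wide as described in the note below):

pixi shell\nrstudio\n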

    "},{"location":"ide_integration/r_studio/#full-example","title":"Full example","text":"

    The full example can be found here: RStudio example. Here is an example of a pixi.toml file that sets up an RStudio task:

    [project]\nname = \"r\"\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\", \"osx-64\", \"osx-arm64\"]\n\n[target.linux.tasks]\nrstudio = \"rstudio\"\n\n[target.osx.tasks]\nrstudio = \"open -a rstudio\"\n# or alternatively with the full path:\n# rstudio = \"/Applications/RStudio.app/Contents/MacOS/RStudio\"\n\n[dependencies]\nr = \">=4.3,<5\"\nr-ggplot2 = \">=3.5.0,<3.6\"\n

    Once RStudio has loaded, you can execute the following R code that uses the ggplot2 package:

    # Load the ggplot2 package\nlibrary(ggplot2)\n\n# Load the built-in 'mtcars' dataset\ndata <- mtcars\n\n# Create a scatterplot of 'mpg' vs 'wt'\nggplot(data, aes(x = wt, y = mpg)) +\n  geom_point() +\n  labs(x = \"Weight (1000 lbs)\", y = \"Miles per Gallon\") +\n  ggtitle(\"Fuel Efficiency vs. Weight\")\n

    Note

    This example assumes that you have installed RStudio system-wide. We are working on updating RStudio as well as the R interpreter builds on Windows for maximum compatibility with pixi.

    "},{"location":"reference/cli/","title":"Commands","text":""},{"location":"reference/cli/#global-options","title":"Global options","text":"
    • --verbose (-v|vv|vvv) Increase the verbosity of the output messages; -v, -vv and -vvv increase the level of verbosity respectively.
    • --help (-h) Shows help information, use -h to get the short version of the help.
    • --version (-V): shows the version of pixi that is used.
    • --quiet (-q): Decreases the amount of output.
    • --color <COLOR>: Whether the log needs to be colored [env: PIXI_COLOR=] [default: auto] [possible values: always, never, auto]. Pixi also honors the FORCE_COLOR and NO_COLOR environment variables. They both take precedence over --color and PIXI_COLOR.
    • --no-progress: Disables the progress bar. [env: PIXI_NO_PROGRESS] [default: false]
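These options apply to every subcommand, for example (a sketch, where build is a hypothetical task):

pixi install -vvv --no-progress\nNO_COLOR=1 pixi run build\n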
    "},{"location":"reference/cli/#init","title":"init","text":"

    This command is used to create a new project. It initializes a pixi.toml file and also prepares a .gitignore to prevent the environment from being added to git.

It also supports the pyproject.toml file: if you have a pyproject.toml file in the directory where you run pixi init, it appends the pixi data to the pyproject.toml instead of creating a new pixi.toml file.

    "},{"location":"reference/cli/#arguments","title":"Arguments","text":"
    1. [PATH]: Where to place the project (defaults to current path) [default: .]
    "},{"location":"reference/cli/#options","title":"Options","text":"
    • --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
    • --platform <PLATFORM> (-p): specify a platform that the project supports. (Allowed to be used more than once)
    • --import <ENV_FILE> (-i): Import an existing conda environment file, e.g. environment.yml.
    • --format <FORMAT>: Specify the format of the project file, either pyproject or pixi. [default: pixi]

    Importing an environment.yml

When importing an environment, the pixi.toml will be created with the dependencies from the environment file. The pixi.lock will be created when you install the environment. We don't support git+ URLs as dependencies for pip packages, and for the defaults channel we use main, r and msys2 as the default channels.

    pixi init myproject\npixi init ~/myproject\npixi init  # Initializes directly in the current directory.\npixi init --channel conda-forge --channel bioconda myproject\npixi init --platform osx-64 --platform linux-64 myproject\npixi init --import environment.yml\npixi init --format pyproject\npixi init --format pixi\n
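As an illustration, an environment.yml like the following could be imported (a hypothetical minimal file):

environment.yml
name: myproject\nchannels:\n  - conda-forge\ndependencies:\n  - python=3.11\n  - numpy\n  - pip:\n      - requests\n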
    "},{"location":"reference/cli/#add","title":"add","text":"

Adds dependencies to the manifest file. It will only add if the package with its version constraint is able to work with the rest of the dependencies in the project. More info on multi-platform configuration.

    If the project manifest is a pyproject.toml, adding a pypi dependency will add it to the native pyproject project.dependencies array, or to the native project.optional-dependencies table if a feature is specified:

    • pixi add --pypi boto3 would add boto3 to the project.dependencies array
    • pixi add --pypi boto3 --feature aws would add boto3 to the project.optional-dependencies.aws array

    These dependencies will be read by pixi as if they had been added to the pixi pypi-dependencies tables of the default or a named feature.
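As a sketch, after running the two commands above the pyproject.toml could contain entries along these lines (pixi normally also adds a version constraint, omitted here for brevity):

pyproject.toml
[project]\ndependencies = [\"boto3\"]\n\n[project.optional-dependencies]\naws = [\"boto3\"]\n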

    "},{"location":"reference/cli/#arguments_1","title":"Arguments","text":"
    1. [SPECS]: The package(s) to add, space separated. The version constraint is optional.
    "},{"location":"reference/cli/#options_1","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --host: Specifies a host dependency, important for building a package.
    • --build: Specifies a build dependency, important for building a package.
    • --pypi: Specifies a PyPI dependency, not a conda package. Parses dependencies as PEP508 requirements, supporting extras and versions. See configuration for details.
    • --no-install: Don't install the package to the environment, only add the package to the lock-file.
    • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
    • --platform <PLATFORM> (-p): The platform for which the dependency should be added. (Allowed to be used more than once)
    • --feature <FEATURE> (-f): The feature for which the dependency should be added.
    • --editable: Specifies an editable dependency, only use in combination with --pypi.
    pixi add numpy # (1)!\npixi add numpy pandas \"pytorch>=1.8\" # (2)!\npixi add \"numpy>=1.22,<1.24\" # (3)!\npixi add --manifest-path ~/myproject/pixi.toml numpy # (4)!\npixi add --host \"python>=3.9.0\" # (5)!\npixi add --build cmake # (6)!\npixi add --platform osx-64 clang # (7)!\npixi add --no-install numpy # (8)!\npixi add --no-lockfile-update numpy # (9)!\npixi add --feature featurex numpy # (10)!\n\n# Add a pypi dependency\npixi add --pypi requests[security] # (11)!\npixi add --pypi Django==5.1rc1 # (12)!\npixi add --pypi \"boltons>=24.0.0\" --feature lint # (13)!\npixi add --pypi \"boltons @ https://files.pythonhosted.org/packages/46/35/e50d4a115f93e2a3fbf52438435bb2efcf14c11d4fcd6bdcd77a6fc399c9/boltons-24.0.0-py3-none-any.whl\" # (14)!\npixi add --pypi \"exchangelib @ git+https://github.com/ecederstrand/exchangelib\" # (15)!\npixi add --pypi \"project @ file:///absolute/path/to/project\" # (16)!\npixi add --pypi \"project@file:///absolute/path/to/project\" --editable # (17)!\n
    1. This will add the numpy package to the project with the latest version available for the solved environment.
    2. This will add multiple packages to the project solving them all together.
    3. This will add the numpy package with the version constraint.
    4. This will add the numpy package to the project of the manifest file at the given path.
    5. This will add the python package as a host dependency. There is currently no different behavior for host dependencies.
    6. This will add the cmake package as a build dependency. There is currently no different behavior for build dependencies.
    7. This will add the clang package only for the osx-64 platform.
    8. This will add the numpy package to the manifest and lockfile, without installing it in an environment.
    9. This will add the numpy package to the manifest without updating the lockfile or installing it in the environment.
    10. This will add the numpy package in the feature featurex.
    11. This will add the requests package as pypi dependency with the security extra.
    12. This will add the pre-release version of Django to the project as a pypi dependency.
    13. This will add the boltons package in the feature lint as pypi dependency.
    14. This will add the boltons package with the given url as pypi dependency.
    15. This will add the exchangelib package with the given git url as pypi dependency.
    16. This will add the project package with the given file url as pypi dependency.
    17. This will add the project package with the given file url as an editable package as pypi dependency.

    Tip

    If you want to use a non default pinning strategy, you can set it using pixi's configuration.

    pixi config set pinning-strategy no-pin --global\n
    The default is semver which will pin the dependencies to the latest major version or minor for v0 versions.
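For example, with the default semver strategy, adding packages could result in constraints along these lines in the manifest (the version numbers are purely illustrative):

[dependencies]\nnumpy = \">=1.26.4,<2\"      # >=1 packages are pinned up to the next major version\npolars = \">=0.20.5,<0.21\"  # 0.x packages are pinned up to the next minor version\n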

    "},{"location":"reference/cli/#install","title":"install","text":"

    Installs an environment based on the manifest file. If there is no pixi.lock file or it is not up-to-date with the manifest file, it will (re-)generate the lock file.

pixi install only installs one environment at a time; if you have multiple environments, you can select the right one with the --environment flag. If you don't provide an environment, the default environment will be installed.

Running pixi install is not required before running other commands, as all commands interacting with the environment will first run the install command if the environment is not ready, to make sure you always run in a correct state. E.g. pixi run, pixi shell, pixi shell-hook, pixi add and pixi remove, to name a few.

    "},{"location":"reference/cli/#options_2","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --environment <ENVIRONMENT> (-e): The environment to install, if none are provided the default environment will be used.
    pixi install\npixi install --manifest-path ~/myproject/pixi.toml\npixi install --frozen\npixi install --locked\npixi install --environment lint\npixi install -e lint\n
    "},{"location":"reference/cli/#update","title":"update","text":"

    The update command checks if there are newer versions of the dependencies and updates the pixi.lock file and environments accordingly. It will only update the lock file if the dependencies in the manifest file are still compatible with the new versions.

    "},{"location":"reference/cli/#arguments_2","title":"Arguments","text":"
    1. [PACKAGES]... The packages to update, space separated. If no packages are provided, all packages will be updated.
    "},{"location":"reference/cli/#options_3","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --environment <ENVIRONMENT> (-e): The environment to update; if none is provided, all environments are updated.
    • --platform <PLATFORM> (-p): The platform for which the dependencies should be updated.
    • --dry-run (-n): Only show the changes that would be made, without actually updating the lock file or environment.
    • --no-install: Don't install the (solve) environment needed for solving pypi-dependencies.
    • --json: Output the changes in json format.
    pixi update numpy\npixi update numpy pandas\npixi update --manifest-path ~/myproject/pixi.toml numpy\npixi update --environment lint python\npixi update -e lint -e schema -e docs pre-commit\npixi update --platform osx-arm64 mlx\npixi update -p linux-64 -p osx-64 numpy\npixi update --dry-run\npixi update --no-install boto3\n
    "},{"location":"reference/cli/#run","title":"run","text":"

The run command first checks if the environment is ready to use. If you didn't run pixi install, the run command will do that for you. The custom tasks defined in the manifest file are also available through the run command.

You cannot run pixi run source setup.bash, as source is not one of the deno_task_shell commands and not an executable.

    "},{"location":"reference/cli/#arguments_3","title":"Arguments","text":"
    1. [TASK]... The task you want to run in the project's environment; this can also be a normal command. All arguments after the task will be passed to the task.
    "},{"location":"reference/cli/#options_4","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --environment <ENVIRONMENT> (-e): The environment to run the task in, if none are provided the default environment will be used or a selector will be given to select the right environment.
    • --clean-env: Run the task in a clean environment; this will remove all environment variables of the shell environment except for the ones pixi sets. This doesn't work on Windows.
      pixi run python\npixi run cowpy \"Hey pixi user\"\npixi run --manifest-path ~/myproject/pixi.toml python\npixi run --frozen python\npixi run --locked python\n# If you have specified a custom task in the pixi.toml you can run it with run as well\npixi run build\n# Extra arguments will be passed to the tasks command.\npixi run task argument1 argument2\n\n# If you have multiple environments you can select the right one with the --environment flag.\npixi run --environment cuda python\n\n# THIS DOESN'T WORK ON WINDOWS\n# If you want to run a command in a clean environment you can use the --clean-env flag.\n# The PATH should only contain the pixi environment here.\npixi run --clean-env \"echo \\$PATH\"\n

    Info

In pixi, the deno_task_shell is the underlying runner of the run command. Check out their documentation for the syntax and available commands. This is done so that the run commands can be run across all platforms.
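Because tasks run through deno_task_shell, you can rely on its cross-platform built-in commands and operators, for example (a sketch; the tasks and paths are hypothetical):

pixi.toml
[tasks]\nclean = \"rm -rf build && echo cleaned\"\nprepare = \"mkdir -p build; cp config.toml build/\"\n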

    Cross environment tasks

    If you're using the depends-on feature of the tasks, the tasks will be run in the order you specified them. The depends-on can be used cross environment, e.g. you have this pixi.toml:

    pixi.toml
    [tasks]\nstart = { cmd = \"python start.py\", depends-on = [\"build\"] }\n\n[feature.build.tasks]\nbuild = \"cargo build\"\n[feature.build.dependencies]\nrust = \">=1.74\"\n\n[environments]\nbuild = [\"build\"]\n

Then you're able to run the build from the build environment and start from the default environment, by only calling:

    pixi run start\n

    "},{"location":"reference/cli/#exec","title":"exec","text":"

    Runs a command in a temporary environment disconnected from any project. This can be useful to quickly test out a certain package or version.

    Temporary environments are cached. If the same command is run again, the same environment will be reused.

    Cleaning temporary environments

    Currently, temporary environments can only be cleaned up manually. Environments for pixi exec are stored under cached-envs-v0/ in the cache directory. Run pixi info to find the cache directory.

    "},{"location":"reference/cli/#arguments_4","title":"Arguments","text":"
    1. <COMMAND>: The command to run.
    "},{"location":"reference/cli/#options_5","title":"Options:","text":"
    • --spec <SPECS> (-s): Matchspecs of packages to install. If this is not provided, the package is guessed from the command.
    • --channel <CHANNELS> (-c): The channel to install the packages from. If not specified the default channel is used.
    • --force-reinstall If specified a new environment is always created even if one already exists.
    pixi exec python\n\n# Add a constraint to the python version\npixi exec -s python=3.9 python\n\n# Run ipython and include the py-rattler package in the environment\npixi exec -s ipython -s py-rattler ipython\n\n# Force reinstall to recreate the environment and get the latest package versions\npixi exec --force-reinstall -s ipython -s py-rattler ipython\n
    "},{"location":"reference/cli/#remove","title":"remove","text":"

    Removes dependencies from the manifest file.

If the project manifest is a pyproject.toml, removing a pypi dependency with the --pypi flag will remove it from either:
    • the native pyproject project.dependencies array, or the native project.optional-dependencies table if a feature is specified
    • the pixi pypi-dependencies tables of the default or a named feature (if a feature is specified)

    "},{"location":"reference/cli/#arguments_5","title":"Arguments","text":"
    1. <DEPS>...: List of dependencies you wish to remove from the project.
    "},{"location":"reference/cli/#options_6","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    • --host: Specifies a host dependency, important for building a package.
    • --build: Specifies a build dependency, important for building a package.
    • --pypi: Specifies a PyPI dependency, not a conda package.
    • --platform <PLATFORM> (-p): The platform from which the dependency should be removed.
    • --feature <FEATURE> (-f): The feature from which the dependency should be removed.
    • --no-install: Don't install the environment, only remove the package from the lock-file and manifest.
    • --no-lockfile-update: Don't update the lock-file, implies the --no-install flag.
    pixi remove numpy\npixi remove numpy pandas pytorch\npixi remove --manifest-path ~/myproject/pixi.toml numpy\npixi remove --host python\npixi remove --build cmake\npixi remove --pypi requests\npixi remove --platform osx-64 --build clang\npixi remove --feature featurex clang\npixi remove --feature featurex --platform osx-64 clang\npixi remove --feature featurex --platform osx-64 --build clang\npixi remove --no-install numpy\n
    "},{"location":"reference/cli/#task","title":"task","text":"

    If you want to make a shorthand for a specific command you can add a task for it.

    "},{"location":"reference/cli/#options_7","title":"Options","text":"
    • --manifest-path <MANIFEST_PATH>: the path to manifest file, by default it searches for one in the parent directories.
    "},{"location":"reference/cli/#task-add","title":"task add","text":"

    Add a task to the manifest file, use --depends-on to add tasks you want to run before this task, e.g. build before an execute task.

    "},{"location":"reference/cli/#arguments_6","title":"Arguments","text":"
    1. <NAME>: The name of the task.
    2. <COMMAND>: The command to run. This can be more than one word.

    Info

If you are using $ for environment variables, they will be resolved before being added to the task. If you want to use $ in the task, you need to escape it with a \\, e.g. echo \\$HOME.
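For example (a minimal sketch; show-home is a hypothetical task name):

# $HOME is resolved when the task is added; the expanded path is stored in the task\npixi task add show-home \"echo $HOME\"\n\n# \\$HOME is stored literally; the variable is expanded when the task runs\npixi task add show-home \"echo \\$HOME\"\n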

    "},{"location":"reference/cli/#options_8","title":"Options","text":"
    • --platform <PLATFORM> (-p): the platform for which this task should be added.
    • --feature <FEATURE> (-f): the feature for which the task is added; if none is provided, the task is added to the default tasks.
    • --depends-on <DEPENDS_ON>: the task it depends on, to be run before the one you're adding.
    • --cwd <CWD>: the working directory for the task relative to the root of the project.
    • --env <ENV>: the environment variables as key=value pairs for the task, can be used multiple times, e.g. --env \"VAR1=VALUE1\" --env \"VAR2=VALUE2\".
    • --description <DESCRIPTION>: a description of the task.
    pixi task add cow cowpy \"Hello User\"\npixi task add tls ls --cwd tests\npixi task add test cargo t --depends-on build\npixi task add build-osx \"METAL=1 cargo build\" --platform osx-64\npixi task add train python train.py --feature cuda\npixi task add publish-pypi \"hatch publish --yes --repo main\" --feature build --env HATCH_CONFIG=config/hatch.toml --description \"Publish the package to pypi\"\n

    This adds the following to the manifest file:

    [tasks]\ncow = \"cowpy \\\"Hello User\\\"\"\ntls = { cmd = \"ls\", cwd = \"tests\" }\ntest = { cmd = \"cargo t\", depends-on = [\"build\"] }\n\n[target.osx-64.tasks]\nbuild-osx = \"METAL=1 cargo build\"\n\n[feature.cuda.tasks]\ntrain = \"python train.py\"\n\n[feature.build.tasks]\npublish-pypi = { cmd = \"hatch publish --yes --repo main\", env = { HATCH_CONFIG = \"config/hatch.toml\" }, description = \"Publish the package to pypi\" }\n

    Which you can then run with the run command:

    pixi run cow\n# Extra arguments will be passed to the tasks command.\npixi run test --test test1\n
    "},{"location":"reference/cli/#task-remove","title":"task remove","text":"

    Remove the task from the manifest file

    "},{"location":"reference/cli/#arguments_7","title":"Arguments","text":"
    • <NAMES>: The names of the tasks, space separated.
    "},{"location":"reference/cli/#options_9","title":"Options","text":"
    • --platform <PLATFORM> (-p): the platform for which this task is removed.
    • --feature <FEATURE> (-f): the feature for which the task is removed.
    pixi task remove cow\npixi task remove --platform linux-64 test\npixi task remove --feature cuda task\n
    "},{"location":"reference/cli/#task-alias","title":"task alias","text":"

    Create an alias for a task.

    "},{"location":"reference/cli/#arguments_8","title":"Arguments","text":"
    1. <ALIAS>: The alias name
    2. <DEPENDS_ON>: The names of the tasks you want to execute on this alias, order counts, first one runs first.
    "},{"location":"reference/cli/#options_10","title":"Options","text":"
    • --platform <PLATFORM> (-p): the platform for which this alias is created.
    pixi task alias test-all test-py test-cpp test-rust\npixi task alias --platform linux-64 test test-linux\npixi task alias moo cow\n
    "},{"location":"reference/cli/#task-list","title":"task list","text":"

    List all tasks in the project.

    "},{"location":"reference/cli/#options_11","title":"Options","text":"
    • --environment (-e): the environment for which to list the tasks; if none is provided, the default environment's tasks will be listed.
    • --summary (-s): list the tasks per environment.
    pixi task list\npixi task list --environment cuda\npixi task list --summary\n
    "},{"location":"reference/cli/#list","title":"list","text":"

    List project's packages. Highlighted packages are explicit dependencies.

    "},{"location":"reference/cli/#options_12","title":"Options","text":"
    • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform
    • --json: Whether to output in json format.
    • --json-pretty: Whether to output in pretty json format
    • --sort-by <SORT_BY>: Sorting strategy [default: name] [possible values: size, name, type]
    • --explicit (-x): Only list the packages that are explicitly added to the manifest file.
    • --manifest-path <MANIFEST_PATH>: The path to manifest file, by default it searches for one in the parent directories.
    • --environment (-e): The environment's packages to list; if none is provided, the default environment's packages will be listed.
    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --no-install: Don't install the environment for pypi solving, only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
    pixi list\npixi list --json-pretty\npixi list --explicit\npixi list --sort-by size\npixi list --platform win-64\npixi list --environment cuda\npixi list --frozen\npixi list --locked\npixi list --no-install\n

    Output will look like this, where python will be green as it is the package that was explicitly added to the manifest file:

    \u279c pixi list\n Package           Version     Build               Size       Kind   Source\n _libgcc_mutex     0.1         conda_forge         2.5 KiB    conda  _libgcc_mutex-0.1-conda_forge.tar.bz2\n _openmp_mutex     4.5         2_gnu               23.1 KiB   conda  _openmp_mutex-4.5-2_gnu.tar.bz2\n bzip2             1.0.8       hd590300_5          248.3 KiB  conda  bzip2-1.0.8-hd590300_5.conda\n ca-certificates   2023.11.17  hbcca054_0          150.5 KiB  conda  ca-certificates-2023.11.17-hbcca054_0.conda\n ld_impl_linux-64  2.40        h41732ed_0          688.2 KiB  conda  ld_impl_linux-64-2.40-h41732ed_0.conda\n libexpat          2.5.0       hcb278e6_1          76.2 KiB   conda  libexpat-2.5.0-hcb278e6_1.conda\n libffi            3.4.2       h7f98852_5          56.9 KiB   conda  libffi-3.4.2-h7f98852_5.tar.bz2\n libgcc-ng         13.2.0      h807b86a_4          755.7 KiB  conda  libgcc-ng-13.2.0-h807b86a_4.conda\n libgomp           13.2.0      h807b86a_4          412.2 KiB  conda  libgomp-13.2.0-h807b86a_4.conda\n libnsl            2.0.1       hd590300_0          32.6 KiB   conda  libnsl-2.0.1-hd590300_0.conda\n libsqlite         3.44.2      h2797004_0          826 KiB    conda  libsqlite-3.44.2-h2797004_0.conda\n libuuid           2.38.1      h0b41bf4_0          32.8 KiB   conda  libuuid-2.38.1-h0b41bf4_0.conda\n libxcrypt         4.4.36      hd590300_1          98 KiB     conda  libxcrypt-4.4.36-hd590300_1.conda\n libzlib           1.2.13      hd590300_5          60.1 KiB   conda  libzlib-1.2.13-hd590300_5.conda\n ncurses           6.4         h59595ed_2          863.7 KiB  conda  ncurses-6.4-h59595ed_2.conda\n openssl           3.2.0       hd590300_1          2.7 MiB    conda  openssl-3.2.0-hd590300_1.conda\n python            3.12.1      hab00c5b_1_cpython  30.8 MiB   conda  python-3.12.1-hab00c5b_1_cpython.conda\n readline          8.2         h8228510_1          274.9 KiB  conda  readline-8.2-h8228510_1.conda\n tk                8.6.13      noxft_h4845f30_101  3.2 MiB    conda  tk-8.6.13-noxft_h4845f30_101.conda\n tzdata            2023d       h0c530f3_0          116.8 KiB  conda  tzdata-2023d-h0c530f3_0.conda\n xz                5.2.6       h166bdaf_0          408.6 KiB  conda  xz-5.2.6-h166bdaf_0.tar.bz2\n
    "},{"location":"reference/cli/#tree","title":"tree","text":"

    Display the project's packages in a tree. Highlighted packages are those specified in the manifest.

The package tree can also be inverted (-i), to see which packages require a specific dependency.

    "},{"location":"reference/cli/#arguments_9","title":"Arguments","text":"
    • REGEX: an optional regex to filter the tree to matching dependencies, or the dependencies to start from when inverting the tree.
    "},{"location":"reference/cli/#options_13","title":"Options","text":"
    • --invert (-i): Invert the dependency tree, that is given a REGEX pattern that matches some packages, show all the packages that depend on those.
    • --platform <PLATFORM> (-p): The platform to list packages for. Defaults to the current platform
    • --manifest-path <MANIFEST_PATH>: The path to manifest file, by default it searches for one in the parent directories.
    • --environment (-e): The environment's packages to list; if none is provided, the default environment's packages will be listed.
    • --frozen: install the environment as defined in the lock file, doesn't update pixi.lock if it isn't up-to-date with manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
    • --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
    • --no-install: Don't install the environment for pypi solving, only update the lock-file if it can solve without installing. (Implied by --frozen and --locked)
    pixi tree\npixi tree pre-commit\npixi tree -i yaml\npixi tree --environment docs\npixi tree --platform win-64\n

    Warning

    Use -v to show which pypi packages are not yet parsed correctly. The extras and markers parsing is still under development.

    Output will look like this, where direct packages in the manifest file will be green. Once a package has been displayed once, the tree won't continue to recurse through its dependencies (compare the first time python appears, vs the rest), and it will instead be marked with a star (*).

    Version numbers are colored by the package type, yellow for Conda packages and blue for PyPI.

    \u279c pixi tree\n\u251c\u2500\u2500 pre-commit v3.3.3\n\u2502   \u251c\u2500\u2500 cfgv v3.3.1\n\u2502   \u2502   \u2514\u2500\u2500 python v3.12.2\n\u2502   \u2502       \u251c\u2500\u2500 bzip2 v1.0.8\n\u2502   \u2502       \u251c\u2500\u2500 libexpat v2.6.2\n\u2502   \u2502       \u251c\u2500\u2500 libffi v3.4.2\n\u2502   \u2502       \u251c\u2500\u2500 libsqlite v3.45.2\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13\n\u2502   \u2502       \u251c\u2500\u2500 libzlib v1.2.13 (*)\n\u2502   \u2502       \u251c\u2500\u2500 ncurses v6.4.20240210\n\u2502   \u2502       \u251c\u2500\u2500 openssl v3.2.1\n\u2502   \u2502       \u251c\u2500\u2500 readline v8.2\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 ncurses v6.4.20240210 (*)\n\u2502   \u2502       \u251c\u2500\u2500 tk v8.6.13\n\u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13 (*)\n\u2502   \u2502       \u2514\u2500\u2500 xz v5.2.6\n\u2502   \u251c\u2500\u2500 identify v2.5.35\n\u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n...\n\u2514\u2500\u2500 tbump v6.9.0\n...\n    \u2514\u2500\u2500 tomlkit v0.12.4\n        \u2514\u2500\u2500 python v3.12.2 (*)\n

    A regex pattern can be specified to filter the tree to just those that show a specific direct, or transitive dependency:

    \u279c pixi tree pre-commit\n\u2514\u2500\u2500 pre-commit v3.3.3\n    \u251c\u2500\u2500 virtualenv v20.25.1\n    \u2502   \u251c\u2500\u2500 filelock v3.13.1\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2\n    \u2502   \u2502       \u251c\u2500\u2500 libexpat v2.6.2\n    \u2502   \u2502       \u251c\u2500\u2500 readline v8.2\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 ncurses v6.4.20240210\n    \u2502   \u2502       \u251c\u2500\u2500 libsqlite v3.45.2\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13\n    \u2502   \u2502       \u251c\u2500\u2500 bzip2 v1.0.8\n    \u2502   \u2502       \u251c\u2500\u2500 libzlib v1.2.13 (*)\n    \u2502   \u2502       \u251c\u2500\u2500 libffi v3.4.2\n    \u2502   \u2502       \u251c\u2500\u2500 tk v8.6.13\n    \u2502   \u2502       \u2502   \u2514\u2500\u2500 libzlib v1.2.13 (*)\n    \u2502   \u2502       \u251c\u2500\u2500 xz v5.2.6\n    \u2502   \u2502       \u251c\u2500\u2500 ncurses v6.4.20240210 (*)\n    \u2502   \u2502       \u2514\u2500\u2500 openssl v3.2.1\n    \u2502   \u251c\u2500\u2500 platformdirs v4.2.0\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u2502   \u251c\u2500\u2500 distlib v0.3.8\n    \u2502   \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u2502   \u2514\u2500\u2500 python v3.12.2 (*)\n    \u251c\u2500\u2500 pyyaml v6.0.1\n...\n

    Additionally, the tree can be inverted, and it can show which packages depend on a regex pattern. The packages specified in the manifest will also be highlighted (in this case cffconvert and pre-commit would be).

    \u279c pixi tree -i yaml\n\nruamel.yaml v0.18.6\n\u251c\u2500\u2500 pykwalify v1.8.0\n\u2502   \u2514\u2500\u2500 cffconvert v2.0.0\n\u2514\u2500\u2500 cffconvert v2.0.0\n\npyyaml v6.0.1\n\u2514\u2500\u2500 pre-commit v3.3.3\n\nruamel.yaml.clib v0.2.8\n\u2514\u2500\u2500 ruamel.yaml v0.18.6\n    \u251c\u2500\u2500 pykwalify v1.8.0\n    \u2502   \u2514\u2500\u2500 cffconvert v2.0.0\n    \u2514\u2500\u2500 cffconvert v2.0.0\n\nyaml v0.2.5\n\u2514\u2500\u2500 pyyaml v6.0.1\n    \u2514\u2500\u2500 pre-commit v3.3.3\n
    "},{"location":"reference/cli/#shell","title":"shell","text":"

    This command starts a new shell in the project's environment. To exit the pixi shell, simply run exit.

    "},{"location":"reference/cli/#options_14","title":"Options","text":"
    • --change-ps1 <true or false>: When set to false, the (pixi) prefix in the shell prompt is removed (default: true). The default behavior can be configured globally.
• --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
• --frozen: Install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
• --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
• --environment <ENVIRONMENT> (-e): The environment to activate the shell in; if none is provided the default environment will be used, or a selector will be shown to pick the right environment.
    pixi shell\nexit\npixi shell --manifest-path ~/myproject/pixi.toml\nexit\npixi shell --frozen\nexit\npixi shell --locked\nexit\npixi shell --environment cuda\nexit\n
    "},{"location":"reference/cli/#shell-hook","title":"shell-hook","text":"

    This command prints the activation script of an environment.

    "},{"location":"reference/cli/#options_15","title":"Options","text":"
    • --shell <SHELL> (-s): The shell for which the activation script should be printed. Defaults to the current shell. Currently supported variants: [bash, zsh, xonsh, cmd, powershell, fish, nushell]
• --manifest-path: The path to the manifest file; by default it searches for one in the parent directories.
• --frozen: Install the environment as defined in the lock file; doesn't update pixi.lock if it isn't up-to-date with the manifest file. It can also be controlled by the PIXI_FROZEN environment variable (example: PIXI_FROZEN=true).
• --locked: Only install if the pixi.lock is up-to-date with the manifest file1. It can also be controlled by the PIXI_LOCKED environment variable (example: PIXI_LOCKED=true). Conflicts with --frozen.
• --environment <ENVIRONMENT> (-e): The environment to activate; if none is provided the default environment will be used, or a selector will be shown to pick the right environment.
    • --json: Print all environment variables that are exported by running the activation script as JSON. When specifying this option, --shell is ignored.
    pixi shell-hook\npixi shell-hook --shell bash\npixi shell-hook --shell zsh\npixi shell-hook -s powershell\npixi shell-hook --manifest-path ~/myproject/pixi.toml\npixi shell-hook --frozen\npixi shell-hook --locked\npixi shell-hook --environment cuda\npixi shell-hook --json\n

An example use case: getting rid of the pixi executable in a Docker container.

    pixi shell-hook --shell bash > /etc/profile.d/pixi.sh\nrm ~/.pixi/bin/pixi # Now the environment will be activated without the need for the pixi executable.\n
    "},{"location":"reference/cli/#search","title":"search","text":"

Search for a package; the output will list the latest version of the package.

    "},{"location":"reference/cli/#arguments_10","title":"Arguments","text":"
1. <PACKAGE>: Name of the package to search for; it's possible to use wildcards (*).
    "},{"location":"reference/cli/#options_16","title":"Options","text":"
• --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
• --channel <CHANNEL> (-c): Specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
• --limit <LIMIT> (-l): Optionally limit the number of search results.
    • --platform <PLATFORM> (-p): specify a platform that you want to search for. (default: current platform)
    pixi search pixi\npixi search --limit 30 \"py*\"\n# search in a different channel and for a specific platform\npixi search -c robostack --platform linux-64 \"plotjuggler*\"\n
    "},{"location":"reference/cli/#self-update","title":"self-update","text":"

Update pixi to the latest version or a specific version. If the pixi binary is not found in the default location (e.g. ~/.pixi/bin/pixi), pixi won't update, to prevent breaking the current installation (Homebrew, etc.). The behaviour can be overridden with the --force flag.

    "},{"location":"reference/cli/#options_17","title":"Options","text":"
    • --version <VERSION>: The desired version (to downgrade or upgrade to). Update to the latest version if not specified.
    • --force: Force the update even if the pixi binary is not found in the default location.
    pixi self-update\npixi self-update --version 0.13.0\npixi self-update --force\n
    "},{"location":"reference/cli/#info","title":"info","text":"

    Shows helpful information about the pixi installation, cache directories, disk usage, and more. More information here.

    "},{"location":"reference/cli/#options_18","title":"Options","text":"
• --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
• --extended: Extend the information with slower queries to the system, like directory sizes.
    • --json: Get a machine-readable version of the information as output.
    pixi info\npixi info --json --extended\n
    "},{"location":"reference/cli/#clean","title":"clean","text":"

Clean the parts of your system which are touched by pixi. Defaults to cleaning the environments and task cache. Use the cache subcommand to clean the cache.

    "},{"location":"reference/cli/#options_19","title":"Options","text":"
• --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
• --environment <ENVIRONMENT> (-e): The environment to clean; if none is provided, all environments will be removed.
    pixi clean\n
    "},{"location":"reference/cli/#clean-cache","title":"clean cache","text":"

    Clean the pixi cache on your system.

    "},{"location":"reference/cli/#options_20","title":"Options","text":"
    • --pypi: Clean the pypi cache.
    • --conda: Clean the conda cache.
    • --yes: Skip the confirmation prompt.
    pixi clean cache # clean all pixi caches\npixi clean cache --pypi # clean only the pypi cache\npixi clean cache --conda # clean only the conda cache\npixi clean cache --yes # skip the confirmation prompt\n
    "},{"location":"reference/cli/#upload","title":"upload","text":"

Upload a package to a prefix.dev channel.

    "},{"location":"reference/cli/#arguments_11","title":"Arguments","text":"
    1. <HOST>: The host + channel to upload to.
    2. <PACKAGE_FILE>: The package file to upload.
    pixi upload https://prefix.dev/api/v1/upload/my_channel my_package.conda\n
    "},{"location":"reference/cli/#auth","title":"auth","text":"

    This command is used to authenticate the user's access to remote hosts such as prefix.dev or anaconda.org for private channels.

    "},{"location":"reference/cli/#auth-login","title":"auth login","text":"

    Store authentication information for given host.

    Tip

The host is the real hostname, not a channel.

    "},{"location":"reference/cli/#arguments_12","title":"Arguments","text":"
    1. <HOST>: The host to authenticate with.
    "},{"location":"reference/cli/#options_21","title":"Options","text":"
    • --token <TOKEN>: The token to use for authentication with prefix.dev.
• --username <USERNAME>: The username to use for basic HTTP authentication.
    • --password <PASSWORD>: The password to use for basic HTTP authentication.
    • --conda-token <CONDA_TOKEN>: The token to use on anaconda.org / quetz authentication.
    pixi auth login repo.prefix.dev --token pfx_JQEV-m_2bdz-D8NSyRSaAndHANx0qHjq7f2iD\npixi auth login anaconda.org --conda-token ABCDEFGHIJKLMNOP\npixi auth login https://myquetz.server --username john --password xxxxxx\n
    "},{"location":"reference/cli/#auth-logout","title":"auth logout","text":"

    Remove authentication information for a given host.

    "},{"location":"reference/cli/#arguments_13","title":"Arguments","text":"
    1. <HOST>: The host to authenticate with.
    pixi auth logout <HOST>\npixi auth logout repo.prefix.dev\npixi auth logout anaconda.org\n
    "},{"location":"reference/cli/#config","title":"config","text":"

    Use this command to manage the configuration.

    "},{"location":"reference/cli/#options_22","title":"Options","text":"
    • --system (-s): Specify management scope to system configuration.
    • --global (-g): Specify management scope to global configuration.
    • --local (-l): Specify management scope to local configuration.

Check out the pixi configuration for more information about the locations.

    "},{"location":"reference/cli/#config-edit","title":"config edit","text":"

    Edit the configuration file in the default editor.

    pixi config edit --system\npixi config edit --local\npixi config edit -g\n
    "},{"location":"reference/cli/#config-list","title":"config list","text":"

List the configuration.

    "},{"location":"reference/cli/#arguments_14","title":"Arguments","text":"
    1. [KEY]: The key to list the value of. (all if not provided)
    "},{"location":"reference/cli/#options_23","title":"Options","text":"
    • --json: Output the configuration in JSON format.
    pixi config list default-channels\npixi config list --json\npixi config list --system\npixi config list -g\n
    "},{"location":"reference/cli/#config-prepend","title":"config prepend","text":"

    Prepend a value to a list configuration key.

    "},{"location":"reference/cli/#arguments_15","title":"Arguments","text":"
    1. <KEY>: The key to prepend the value to.
    2. <VALUE>: The value to prepend.
    pixi config prepend default-channels conda-forge\n
    "},{"location":"reference/cli/#config-append","title":"config append","text":"

    Append a value to a list configuration key.

    "},{"location":"reference/cli/#arguments_16","title":"Arguments","text":"
    1. <KEY>: The key to append the value to.
    2. <VALUE>: The value to append.
    pixi config append default-channels robostack\npixi config append default-channels bioconda --global\n
    "},{"location":"reference/cli/#config-set","title":"config set","text":"

    Set a configuration key to a value.

    "},{"location":"reference/cli/#arguments_17","title":"Arguments","text":"
    1. <KEY>: The key to set the value of.
    2. [VALUE]: The value to set. (if not provided, the key will be removed)
    pixi config set default-channels '[\"conda-forge\", \"bioconda\"]'\npixi config set --global mirrors '{\"https://conda.anaconda.org/\": [\"https://prefix.dev/conda-forge\"]}'\npixi config set repodata-config.disable-zstd true --system\npixi config set --global detached-environments \"/opt/pixi/envs\"\npixi config set detached-environments false\n
    "},{"location":"reference/cli/#config-unset","title":"config unset","text":"

    Unset a configuration key.

    "},{"location":"reference/cli/#arguments_18","title":"Arguments","text":"
    1. <KEY>: The key to unset.
    pixi config unset default-channels\npixi config unset --global mirrors\npixi config unset repodata-config.disable-zstd --system\n
    "},{"location":"reference/cli/#global","title":"global","text":"

global is the main entry point for the part of pixi that operates at the global (system) level.

    Tip

Binaries and environments installed globally are stored in ~/.pixi by default; this can be changed by setting the PIXI_HOME environment variable.
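
    For example, a minimal shell sketch of relocating global installs (the directory is illustrative, and the bin layout assumes the default $PIXI_HOME/bin location for executables):

    export PIXI_HOME=\"$HOME/pixi-tools\"\npixi global install ruff\n# executables are then placed under $PIXI_HOME/bin instead of ~/.pixi/bin\n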

    "},{"location":"reference/cli/#global-install","title":"global install","text":"

    This command installs package(s) into its own environment and adds the binary to PATH, allowing you to access it anywhere on your system without activating the environment.

    "},{"location":"reference/cli/#arguments_19","title":"Arguments","text":"

1. <PACKAGE>: The package(s) to install; this can also include a version constraint.

    "},{"location":"reference/cli/#options_24","title":"Options","text":"
    • --channel <CHANNEL> (-c): specify a channel that the project uses. Defaults to conda-forge. (Allowed to be used more than once)
    • --platform <PLATFORM> (-p): specify a platform that you want to install the package for. (default: current platform)
    • --no-activation: Do not insert conda_prefix, path modifications, and activation script into the installed executable script.
    pixi global install ruff\n# multiple packages can be installed at once\npixi global install starship rattler-build\n# specify the channel(s)\npixi global install --channel conda-forge --channel bioconda trackplot\n# Or in a more concise form\npixi global install -c conda-forge -c bioconda trackplot\n\n# Support full conda matchspec\npixi global install python=3.9.*\npixi global install \"python [version='3.11.0', build_number=1]\"\npixi global install \"python [version='3.11.0', build=he550d4f_1_cpython]\"\npixi global install python=3.11.0=h10a6764_1_cpython\n\n# Install for a specific platform, only useful on osx-arm64\npixi global install --platform osx-64 ruff\n\n# Install without inserting activation code into the executable script\npixi global install ruff --no-activation\n

    Tip

Running osx-64 on Apple Silicon will install the Intel binary but run it using Rosetta.

    pixi global install --platform osx-64 ruff\n

    After using global install, you can use the package you installed anywhere on your system.

    "},{"location":"reference/cli/#global-list","title":"global list","text":"

This command shows the currently installed global environments, including the binaries that come with them. A globally installed package/environment can contain multiple binaries, and they will be listed in the command output. Here is an example of a few installed packages:

    > pixi global list\nGlobal install location: /home/hanabi/.pixi\n\u251c\u2500\u2500 bat 0.24.0\n|   \u2514\u2500 exec: bat\n\u251c\u2500\u2500 conda-smithy 3.31.1\n|   \u2514\u2500 exec: feedstocks, conda-smithy\n\u251c\u2500\u2500 rattler-build 0.13.0\n|   \u2514\u2500 exec: rattler-build\n\u251c\u2500\u2500 ripgrep 14.1.0\n|   \u2514\u2500 exec: rg\n\u2514\u2500\u2500 uv 0.1.17\n    \u2514\u2500 exec: uv\n
    "},{"location":"reference/cli/#global-upgrade","title":"global upgrade","text":"

    This command upgrades a globally installed package (to the latest version by default).

    "},{"location":"reference/cli/#arguments_20","title":"Arguments","text":"
    1. <PACKAGE>: The package to upgrade.
    "},{"location":"reference/cli/#options_25","title":"Options","text":"
• --channel <CHANNEL> (-c): Specify a channel that the project uses. Defaults to conda-forge. Note that the channel the package was installed from will always be used for the upgrade. (Allowed to be used more than once)
    • --platform <PLATFORM> (-p): specify a platform that you want to upgrade the package for. (default: current platform)
    pixi global upgrade ruff\npixi global upgrade --channel conda-forge --channel bioconda trackplot\n# Or in a more concise form\npixi global upgrade -c conda-forge -c bioconda trackplot\n\n# Conda matchspec is supported\n# You can specify the version to upgrade to when you don't want the latest version\n# or you can even use it to downgrade a globally installed package\npixi global upgrade python=3.10\n
    "},{"location":"reference/cli/#global-upgrade-all","title":"global upgrade-all","text":"

    This command upgrades all globally installed packages to their latest version.

    "},{"location":"reference/cli/#options_26","title":"Options","text":"
• --channel <CHANNEL> (-c): Specify a channel that the project uses. Defaults to conda-forge. Note that the channel the package was installed from will always be used for the upgrade. (Allowed to be used more than once)
pixi global upgrade-all\npixi global upgrade-all --channel conda-forge --channel bioconda\n# Or in a more concise form\npixi global upgrade-all -c conda-forge -c bioconda\n
    "},{"location":"reference/cli/#global-remove","title":"global remove","text":"

Removes a package previously installed into a globally accessible location via pixi global install.

Use pixi global list to find out which package the tool you want to remove belongs to.

    "},{"location":"reference/cli/#arguments_21","title":"Arguments","text":"
    1. <PACKAGE>: The package(s) to remove.
    pixi global remove pre-commit\n\n# multiple packages can be removed at once\npixi global remove pre-commit starship\n
    "},{"location":"reference/cli/#project","title":"project","text":"

    This subcommand allows you to modify the project configuration through the command line interface.

    "},{"location":"reference/cli/#options_27","title":"Options","text":"
• --manifest-path <MANIFEST_PATH>: The path to the manifest file; by default it searches for one in the parent directories.
    "},{"location":"reference/cli/#project-channel-add","title":"project channel add","text":"

Add channels to the channel list in the project configuration. When you add channels, they are tested for existence, added to the lock file, and the environment is reinstalled.

    "},{"location":"reference/cli/#arguments_22","title":"Arguments","text":"
    1. <CHANNEL>: The channels to add, name or URL.
    "},{"location":"reference/cli/#options_28","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the channel is added.
    pixi project channel add robostack\npixi project channel add bioconda conda-forge robostack\npixi project channel add file:///home/user/local_channel\npixi project channel add https://repo.prefix.dev/conda-forge\npixi project channel add --no-install robostack\npixi project channel add --feature cuda nvidia\n
    "},{"location":"reference/cli/#project-channel-list","title":"project channel list","text":"

List the channels in the manifest file.

    "},{"location":"reference/cli/#options_29","title":"Options","text":"
• --urls: Show the URLs of the channels instead of the names.
    $ pixi project channel list\nEnvironment: default\n- conda-forge\n\n$ pixi project channel list --urls\nEnvironment: default\n- https://conda.anaconda.org/conda-forge/\n
    "},{"location":"reference/cli/#project-channel-remove","title":"project channel remove","text":"

Remove channels from the manifest file.

    "},{"location":"reference/cli/#arguments_23","title":"Arguments","text":"
    1. <CHANNEL>...: The channels to remove, name(s) or URL(s).
    "},{"location":"reference/cli/#options_30","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the channel is removed.
    pixi project channel remove conda-forge\npixi project channel remove https://conda.anaconda.org/conda-forge/\npixi project channel remove --no-install conda-forge\npixi project channel remove --feature cuda nvidia\n
    "},{"location":"reference/cli/#project-description-get","title":"project description get","text":"

    Get the project description.

    $ pixi project description get\nPackage management made easy!\n
    "},{"location":"reference/cli/#project-description-set","title":"project description set","text":"

    Set the project description.

    "},{"location":"reference/cli/#arguments_24","title":"Arguments","text":"
    1. <DESCRIPTION>: The description to set.
    pixi project description set \"my new description\"\n
    "},{"location":"reference/cli/#project-environment-add","title":"project environment add","text":"

    Add an environment to the manifest file.

    "},{"location":"reference/cli/#arguments_25","title":"Arguments","text":"
    1. <NAME>: The name of the environment to add.
    "},{"location":"reference/cli/#options_31","title":"Options","text":"
    • -f, --feature <FEATURES>: Features to add to the environment.
    • --solve-group <SOLVE_GROUP>: The solve-group to add the environment to.
    • --no-default-feature: Don't include the default feature in the environment.
    • --force: Update the manifest even if the environment already exists.
    pixi project environment add env1 --feature feature1 --feature feature2\npixi project environment add env2 -f feature1 --solve-group test\npixi project environment add env3 -f feature1 --no-default-feature\npixi project environment add env3 -f feature1 --force\n
    "},{"location":"reference/cli/#project-environment-remove","title":"project environment remove","text":"

    Remove an environment from the manifest file.

    "},{"location":"reference/cli/#arguments_26","title":"Arguments","text":"
    1. <NAME>: The name of the environment to remove.
    pixi project environment remove env1\n
    "},{"location":"reference/cli/#project-environment-list","title":"project environment list","text":"

    List the environments in the manifest file.

    pixi project environment list\n
    "},{"location":"reference/cli/#project-export-conda_environment","title":"project export conda_environment","text":"

    Exports a conda environment.yml file. The file can be used to create a conda environment using conda/mamba:

    pixi project export conda-environment environment.yml\nmamba create --name <env> --file environment.yml\n
    "},{"location":"reference/cli/#arguments_27","title":"Arguments","text":"
    1. <OUTPUT_PATH>: Optional path to render environment.yml to. Otherwise it will be printed to standard out.
    "},{"location":"reference/cli/#options_32","title":"Options","text":"
    • --environment <ENVIRONMENT> (-e): Environment to render.
    • --platform <PLATFORM> (-p): The platform to render.
    pixi project export conda-environment --environment lint\npixi project export conda-environment --platform linux-64 environment.linux-64.yml\n
    "},{"location":"reference/cli/#project-export-conda_explicit_spec","title":"project export conda_explicit_spec","text":"

Render a platform-specific conda explicit specification file for an environment. The file can then be used to create a conda environment using conda/mamba:

    mamba create --name <env> --file <explicit spec file>\n

    As the explicit specification file format does not support pypi-dependencies, use the --ignore-pypi-errors option to ignore those dependencies.

    "},{"location":"reference/cli/#arguments_28","title":"Arguments","text":"
    1. <OUTPUT_DIR>: Output directory for rendered explicit environment spec files.
    "},{"location":"reference/cli/#options_33","title":"Options","text":"
    • --environment <ENVIRONMENT> (-e): Environment to render. Can be repeated for multiple envs. Defaults to all environments.
    • --platform <PLATFORM> (-p): The platform to render. Can be repeated for multiple platforms. Defaults to all platforms available for selected environments.
    • --ignore-pypi-errors: PyPI dependencies are not supported in the conda explicit spec file. This flag allows creating the spec file even if PyPI dependencies are present.
    pixi project export conda_explicit_spec output\npixi project export conda_explicit_spec -e default -e test -p linux-64 output\n
    "},{"location":"reference/cli/#project-platform-add","title":"project platform add","text":"

Adds platform(s) to the manifest file and updates the lock file.

    "},{"location":"reference/cli/#arguments_29","title":"Arguments","text":"
    1. <PLATFORM>...: The platforms to add.
    "},{"location":"reference/cli/#options_34","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the platform will be added.
    pixi project platform add win-64\npixi project platform add --feature test win-64\n
    "},{"location":"reference/cli/#project-platform-list","title":"project platform list","text":"

    List the platforms in the manifest file.

    $ pixi project platform list\nosx-64\nlinux-64\nwin-64\nosx-arm64\n
    "},{"location":"reference/cli/#project-platform-remove","title":"project platform remove","text":"

Removes platform(s) from the manifest file and updates the lock file.

    "},{"location":"reference/cli/#arguments_30","title":"Arguments","text":"
    1. <PLATFORM>...: The platforms to remove.
    "},{"location":"reference/cli/#options_35","title":"Options","text":"
    • --no-install: do not update the environment, only add changed packages to the lock-file.
    • --feature <FEATURE> (-f): The feature for which the platform will be removed.
    pixi project platform remove win-64\npixi project platform remove --feature test win-64\n
    "},{"location":"reference/cli/#project-version-get","title":"project version get","text":"

    Get the project version.

    $ pixi project version get\n0.11.0\n
    "},{"location":"reference/cli/#project-version-set","title":"project version set","text":"

    Set the project version.

    "},{"location":"reference/cli/#arguments_31","title":"Arguments","text":"
    1. <VERSION>: The version to set.
    pixi project version set \"0.13.0\"\n
    "},{"location":"reference/cli/#project-version-majorminorpatch","title":"project version {major|minor|patch}","text":"

    Bump the project version to {MAJOR|MINOR|PATCH}.

    pixi project version major\npixi project version minor\npixi project version patch\n
    1. An up-to-date lock file means that the dependencies in the lock file are allowed by the dependencies in the manifest file. For example

      • a manifest with python = \">= 3.11\" is up-to-date with a name: python, version: 3.11.0 in the pixi.lock.
      • a manifest with python = \">= 3.12\" is not up-to-date with a name: python, version: 3.11.0 in the pixi.lock.

Being up-to-date does not mean that the lock file holds the latest version available on the channel for the given dependency.
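
      As a small illustration of this rule (the versions are only examples):

      [dependencies]\npython = \">= 3.11\"\n# a locked python 3.11.0 satisfies this spec, so the lock file counts as up-to-date\n# a locked python 3.10.4 would not, and pixi would need to update the lock file\n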

    "},{"location":"reference/pixi_configuration/","title":"The configuration of pixi itself","text":"

Apart from the project-specific configuration, pixi supports configuration options that are not required for the project to work but are local to the machine. The configuration is loaded in the following order:

    Linux (in order of increasing priority):
    1. /etc/pixi/config.toml: System-wide configuration
    2. $XDG_CONFIG_HOME/pixi/config.toml: XDG compliant user-specific configuration
    3. $HOME/.config/pixi/config.toml: User-specific configuration
    4. $PIXI_HOME/config.toml: Global configuration in the user home directory. PIXI_HOME defaults to ~/.pixi
    5. your_project/.pixi/config.toml: Project-specific configuration
    6. Command line arguments (--tls-no-verify, --change-ps1=false, etc.): Configuration via command line arguments

    macOS (in order of increasing priority):
    1. /etc/pixi/config.toml: System-wide configuration
    2. $XDG_CONFIG_HOME/pixi/config.toml: XDG compliant user-specific configuration
    3. $HOME/Library/Application Support/pixi/config.toml: User-specific configuration
    4. $PIXI_HOME/config.toml: Global configuration in the user home directory. PIXI_HOME defaults to ~/.pixi
    5. your_project/.pixi/config.toml: Project-specific configuration
    6. Command line arguments (--tls-no-verify, --change-ps1=false, etc.): Configuration via command line arguments

    Windows (in order of increasing priority):
    1. C:\ProgramData\pixi\config.toml: System-wide configuration
    2. %APPDATA%\pixi\config.toml: User-specific configuration
    3. $PIXI_HOME\config.toml: Global configuration in the user home directory. PIXI_HOME defaults to %USERPROFILE%/.pixi
    4. your_project\.pixi\config.toml: Project-specific configuration
    5. Command line arguments (--tls-no-verify, --change-ps1=false, etc.): Configuration via command line arguments

    Note

    The highest priority wins. If a configuration file is found in a higher priority location, the values from the configuration read from lower priority locations are overwritten.
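
    As a minimal sketch (the values are illustrative and assume per-key overriding as described in the note above), a project-local file overrides a user-level file:

    # $HOME/.config/pixi/config.toml (lower priority)\ndefault-channels = [\"conda-forge\"]\ntls-no-verify = false\n\n# your_project/.pixi/config.toml (higher priority)\ndefault-channels = [\"conda-forge\", \"bioconda\"]\n# effective result: default-channels from the project file, tls-no-verify from the user file\n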

    Note

    To find the locations where pixi looks for configuration files, run pixi with -vv.

    "},{"location":"reference/pixi_configuration/#reference","title":"Reference","text":"Casing In Configuration

In versions of pixi 0.20.1 and older, the global configuration used snake_case. We've since changed to kebab-case for consistency with the rest of the configuration, but we still support the old snake_case spelling for the following options (see the snippet after this list):

    • default_channels
    • change_ps1
    • tls_no_verify
    • authentication_override_file
    • mirrors and sub-options
    • repodata-config and sub-options

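    For example, both spellings below are still accepted for these options (a rough sketch):

    config.toml
    # old snake_case spelling (still supported)\ndefault_channels = [\"conda-forge\"]\ntls_no_verify = false\n\n# preferred kebab-case spelling\ndefault-channels = [\"conda-forge\"]\ntls-no-verify = false\n
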
    The following reference describes all available configuration options.

    "},{"location":"reference/pixi_configuration/#default-channels","title":"default-channels","text":"

    The default channels to select when running pixi init or pixi global install. This defaults to only conda-forge. config.toml

    default-channels = [\"conda-forge\"]\n

    Note

    The default-channels are only used when initializing a new project. Once initialized the channels are used from the project manifest.

    "},{"location":"reference/pixi_configuration/#change-ps1","title":"change-ps1","text":"

    When set to false, the (pixi) prefix in the shell prompt is removed. This applies to the pixi shell subcommand. You can override this from the CLI with --change-ps1.

    config.toml
    change-ps1 = true\n
    "},{"location":"reference/pixi_configuration/#tls-no-verify","title":"tls-no-verify","text":"

    When set to true, the TLS certificates are not verified.

    Warning

    This is a security risk and should only be used for testing purposes or internal networks.

    You can override this from the CLI with --tls-no-verify.

    config.toml
    tls-no-verify = false\n
    "},{"location":"reference/pixi_configuration/#authentication-override-file","title":"authentication-override-file","text":"

    Override from where the authentication information is loaded. Usually, we try to use the keyring to load authentication data from, and only use a JSON file as a fallback. This option allows you to force the use of a JSON file. Read more in the authentication section. config.toml

    authentication-override-file = \"/path/to/your/override.json\"\n

    "},{"location":"reference/pixi_configuration/#detached-environments","title":"detached-environments","text":"

The directory where pixi stores the project environments; these would normally be placed in the .pixi/envs folder in a project's root. It doesn't affect the environments built for pixi global. The location of environments created for a pixi global installation can be controlled using the PIXI_HOME environment variable.

    Warning

We recommend against using this, because any environment created for a project is no longer placed in the same folder as the project. This creates a disconnect between the project and its environments, and manual cleanup of the environments is required when deleting the project.

    However, in some cases, this option can still be very useful, for instance to:

    • force the installation on a specific filesystem/drive.
    • install environments locally but keep the project on a network drive.
    • let a system-administrator have more control over all environments on a system.

    This field can consist of two types of input.

• A boolean value, true or false, which will enable or disable the feature respectively (not the quoted strings \"true\" or \"false\"; those are read as false).
    • A string value, which will be the absolute path to the directory where the environments will be stored.

    config.toml

    detached-environments = true\n
    or: config.toml
    detached-environments = \"/opt/pixi/envs\"\n

    The environments will be stored in the cache directory when this option is true. When you specify a custom path the environments will be stored in that directory.

    The resulting directory structure will look like this: config.toml

    detached-environments = \"/opt/pixi/envs\"\n
    /opt/pixi/envs\n\u251c\u2500\u2500 pixi-6837172896226367631\n\u2502   \u2514\u2500\u2500 envs\n\u2514\u2500\u2500 NAME_OF_PROJECT-HASH_OF_ORIGINAL_PATH\n    \u251c\u2500\u2500 envs # the runnable environments\n    \u2514\u2500\u2500 solve-group-envs # If there are solve groups\n

    "},{"location":"reference/pixi_configuration/#pinning-strategy","title":"pinning-strategy","text":"

    The strategy to use for pinning dependencies when running pixi add. The default is semver but you can set the following:

• no-pin: No pinning, resulting in an unconstrained dependency. *
    • semver: Pinning to the latest version that satisfies the semver constraint. Resulting in a pin to major for most versions and to minor for v0 versions.
    • exact-version: Pinning to the exact version, 1.2.3 -> ==1.2.3.
    • major: Pinning to the major version, 1.2.3 -> >=1.2.3, <2.
    • minor: Pinning to the minor version, 1.2.3 -> >=1.2.3, <1.3.
    • latest-up: Pinning to the latest version, 1.2.3 -> >=1.2.3.
    config.toml
    pinning-strategy = \"no-pin\"\n
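
    For illustration (the package name and versions are hypothetical), with 1.2.3 as the latest matching version, pixi add example-package would roughly record the following under each strategy:

    # semver (default)\nexample-package = \">=1.2.3,<2\"\n# minor\nexample-package = \">=1.2.3,<1.3\"\n# exact-version\nexample-package = \"==1.2.3\"\n# no-pin\nexample-package = \"*\"\n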
    "},{"location":"reference/pixi_configuration/#mirrors","title":"mirrors","text":"

Configuration for conda channel mirrors; more info below.

    config.toml
    [mirrors]\n# redirect all requests for conda-forge to the prefix.dev mirror\n\"https://conda.anaconda.org/conda-forge\" = [\n    \"https://prefix.dev/conda-forge\"\n]\n\n# redirect all requests for bioconda to one of the three listed mirrors\n# Note: for repodata we try the first mirror first.\n\"https://conda.anaconda.org/bioconda\" = [\n    \"https://conda.anaconda.org/bioconda\",\n    # OCI registries are also supported\n    \"oci://ghcr.io/channel-mirrors/bioconda\",\n    \"https://prefix.dev/bioconda\",\n]\n
    "},{"location":"reference/pixi_configuration/#repodata-config","title":"repodata-config","text":"

    Configuration for repodata fetching. config.toml

    [repodata-config]\n# disable fetching of jlap, bz2 or zstd repodata files.\n# This should only be used for specific old versions of artifactory and other non-compliant\n# servers.\ndisable-jlap = true  # don't try to download repodata.jlap\ndisable-bzip2 = true # don't try to download repodata.json.bz2\ndisable-zstd = true  # don't try to download repodata.json.zst\n

    "},{"location":"reference/pixi_configuration/#pypi-config","title":"pypi-config","text":"

To set up a number of defaults for the usage of PyPI registries, you can use the following configuration options:

    • index-url: The default index URL to use for PyPI packages. This will be added to a manifest file on a pixi init.
    • extra-index-urls: A list of additional URLs to use for PyPI packages. This will be added to a manifest file on a pixi init.
    • keyring-provider: Allows the use of the keyring python package to store and retrieve credentials.
    config.toml
    [pypi-config]\n# Main index url\nindex-url = \"https://pypi.org/simple\"\n# list of additional urls\nextra-index-urls = [\"https://pypi.org/simple2\"]\n# can be \"subprocess\" or \"disabled\"\nkeyring-provider = \"subprocess\"\n

    index-url and extra-index-urls are not globals

Unlike pip, these settings, with the exception of keyring-provider, will only modify the pixi.toml/pyproject.toml file and are not interpreted globally when not present in the manifest. This is because we want to keep the manifest file as complete and reproducible as possible.

    "},{"location":"reference/pixi_configuration/#mirror-configuration","title":"Mirror configuration","text":"

    You can configure mirrors for conda channels. We expect that mirrors are exact copies of the original channel. The implementation will look for the mirror key (a URL) in the mirrors section of the configuration file and replace the original URL with the mirror URL.

    To also include the original URL, you have to repeat it in the list of mirrors.

    The mirrors are prioritized based on the order of the list. We attempt to fetch the repodata (the most important file) from the first mirror in the list. The repodata contains all the SHA256 hashes of the individual packages, so it is important to get this file from a trusted source.

    You can also specify mirrors for an entire \"host\", e.g.

    config.toml
    [mirrors]\n\"https://conda.anaconda.org\" = [\n    \"https://prefix.dev/\"\n]\n

This will forward all requests for channels on anaconda.org to prefix.dev. Channels that are not currently mirrored on prefix.dev will fail in the above example.

    "},{"location":"reference/pixi_configuration/#oci-mirrors","title":"OCI Mirrors","text":"

You can also specify mirrors on an OCI registry. There is a public mirror on the GitHub Container Registry (ghcr.io) that is maintained by the conda-forge team. You can use it like this:

    config.toml
    [mirrors]\n\"https://conda.anaconda.org/conda-forge\" = [\n    \"oci://ghcr.io/channel-mirrors/conda-forge\"\n]\n

The GHCR mirror also contains bioconda packages. You can search the available packages on GitHub.

    "},{"location":"reference/project_configuration/","title":"Configuration","text":"

    The pixi.toml is the pixi project configuration file, also known as the project manifest.

A TOML file is structured into different tables. This document explains the usage of the different tables. For more technical documentation, check pixi on crates.io.

    Tip

We also support the pyproject.toml file. It has the same structure as the pixi.toml file, except that you need to prepend the tables with tool.pixi instead of just the table name. For example, the [project] table becomes [tool.pixi.project]. There are also some small extras available in the pyproject.toml file; check out the pyproject.toml documentation for more information.
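
    As a minimal sketch, the project table from the next section would look like this when prefixed with tool.pixi (the project name usually comes from the standard [project] table of the pyproject.toml):

    pyproject.toml
    [tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"]\n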

    "},{"location":"reference/project_configuration/#the-project-table","title":"The project table","text":"

    The minimally required information in the project table is:

    [project]\nchannels = [\"conda-forge\"]\nname = \"project-name\"\nplatforms = [\"linux-64\"]\n
    "},{"location":"reference/project_configuration/#name","title":"name","text":"

    The name of the project.

    name = \"project-name\"\n
    "},{"location":"reference/project_configuration/#channels","title":"channels","text":"

    This is a list that defines the channels used to fetch the packages from. If you want to use channels hosted on anaconda.org you only need to use the name of the channel directly.

    channels = [\"conda-forge\", \"robostack\", \"bioconda\", \"nvidia\", \"pytorch\"]\n

    Channels situated on the file system are also supported with absolute file paths:

    channels = [\"conda-forge\", \"file:///home/user/staged-recipes/build_artifacts\"]\n

To access private or public channels on prefix.dev or Quetz, use the URL including the hostname:

    channels = [\"conda-forge\", \"https://repo.prefix.dev/channel-name\"]\n
    "},{"location":"reference/project_configuration/#platforms","title":"platforms","text":"

    Defines the list of platforms that the project supports. Pixi solves the dependencies for all these platforms and puts them in the lock file (pixi.lock).

    platforms = [\"win-64\", \"linux-64\", \"osx-64\", \"osx-arm64\"]\n

    The available platforms are listed here: link

    Special macOS behavior

    macOS has two platforms: osx-64 for Intel Macs and osx-arm64 for Apple Silicon Macs. To support both, include both in your platforms list. Fallback: If osx-arm64 can't resolve, use osx-64. Running osx-64 on Apple Silicon uses Rosetta for Intel binaries.

    "},{"location":"reference/project_configuration/#version-optional","title":"version (optional)","text":"

    The version of the project. This should be a valid version based on the conda Version Spec. See the version documentation, for an explanation of what is allowed in a Version Spec.

    version = \"1.2.3\"\n
    "},{"location":"reference/project_configuration/#authors-optional","title":"authors (optional)","text":"

    This is a list of authors of the project.

    authors = [\"John Doe <j.doe@prefix.dev>\", \"Marie Curie <mss1867@gmail.com>\"]\n
    "},{"location":"reference/project_configuration/#description-optional","title":"description (optional)","text":"

    This should contain a short description of the project.

    description = \"A simple description\"\n
    "},{"location":"reference/project_configuration/#license-optional","title":"license (optional)","text":"

    The license as a valid SPDX string (e.g. MIT AND Apache-2.0)

    license = \"MIT\"\n
    "},{"location":"reference/project_configuration/#license-file-optional","title":"license-file (optional)","text":"

    Relative path to the license file.

    license-file = \"LICENSE.md\"\n
    "},{"location":"reference/project_configuration/#readme-optional","title":"readme (optional)","text":"

    Relative path to the README file.

    readme = \"README.md\"\n
    "},{"location":"reference/project_configuration/#homepage-optional","title":"homepage (optional)","text":"

    URL of the project homepage.

    homepage = \"https://pixi.sh\"\n
    "},{"location":"reference/project_configuration/#repository-optional","title":"repository (optional)","text":"

    URL of the project source repository.

    repository = \"https://github.com/prefix-dev/pixi\"\n
    "},{"location":"reference/project_configuration/#documentation-optional","title":"documentation (optional)","text":"

    URL of the project documentation.

    documentation = \"https://pixi.sh\"\n
    "},{"location":"reference/project_configuration/#conda-pypi-map-optional","title":"conda-pypi-map (optional)","text":"

Mapping of a channel name or URL to the location of a mapping file, which can be a URL or path. The mapping should be structured in JSON format as conda_name: pypi_package_name. Example:

    local/robostack_mapping.json
    {\n  \"jupyter-ros\": \"my-name-from-mapping\",\n  \"boltons\": \"boltons-pypi\"\n}\n

If conda-forge is not present in conda-pypi-map, pixi will use the prefix.dev mapping for it.

    conda-pypi-map = { \"conda-forge\" = \"https://example.com/mapping\", \"https://repo.prefix.dev/robostack\" = \"local/robostack_mapping.json\"}\n
    "},{"location":"reference/project_configuration/#channel-priority-optional","title":"channel-priority (optional)","text":"

    This is the setting for the priority of the channels in the solver step.

    Options:

• strict: Default. The channels are used in the order they are defined in the channels list. Only packages from the first channel that has the package are used. This ensures that different variants of a single package are not mixed from different channels. Using packages from different incompatible channels like conda-forge and main can lead to hard-to-debug ABI incompatibilities. We strongly recommend not switching the default.

• disabled: There is no priority; all package variants from all channels are considered per package name and solved as one. Care should be taken when using this option. Since package variants can come from any channel in this mode, packages might not be compatible. This can cause hard-to-debug ABI incompatibilities. We strongly discourage using this option.

    channel-priority = \"disabled\"\n

    channel-priority = \"disabled\" is a security risk

    Disabling channel priority may lead to unpredictable dependency resolutions. This is a possible security risk as it may lead to packages being installed from unexpected channels. It's advisable to maintain the default strict setting and order channels thoughtfully. If necessary, specify a channel directly for a dependency.

    [project]\n# Putting conda-forge first solves most issues\nchannels = [\"conda-forge\", \"channel-name\"]\n[dependencies]\npackage = {version = \"*\", channel = \"channel-name\"}\n

    "},{"location":"reference/project_configuration/#the-tasks-table","title":"The tasks table","text":"

    Tasks are a way to automate certain custom commands in your project. For example, a lint or format step. Tasks in a pixi project are essentially cross-platform shell commands, with a unified syntax across platforms. For more in-depth information, check the Advanced tasks documentation. Pixi's tasks are run in a pixi environment using pixi run and are executed using the deno_task_shell.

    [tasks]\nsimple = \"echo This is a simple task\"\ncmd = { cmd=\"echo Same as a simple task but now more verbose\"}\ndepending = { cmd=\"echo run after simple\", depends-on=\"simple\"}\nalias = { depends-on=[\"depending\"]}\ndownload = { cmd=\"curl -o file.txt https://example.com/file.txt\" , outputs=[\"file.txt\"]}\nbuild = { cmd=\"npm build\", cwd=\"frontend\", inputs=[\"frontend/package.json\", \"frontend/*.js\"]}\nrun = { cmd=\"python run.py $ARGUMENT\", env={ ARGUMENT=\"value\" }}\nformat = { cmd=\"black $INIT_CWD\" } # runs black where you run pixi run format\nclean-env = { cmd = \"python isolated.py\", clean-env = true} # Only on Unix!\n

    You can modify this table using pixi task.

    Note

    Specify different tasks for different platforms using the target table

    Info

    If you want to hide a task from showing up with pixi task list or pixi info, you can prefix the name with _. For example, if you want to hide depending, you can rename it to _depending.

    "},{"location":"reference/project_configuration/#the-system-requirements-table","title":"The system-requirements table","text":"

    The system requirements are used to define minimal system specifications used during dependency resolution.

    For example, we can define a unix system with a specific minimal libc version.

    [system-requirements]\nlibc = \"2.28\"\n
    or make the project depend on a specific version of cuda:
    [system-requirements]\ncuda = \"12\"\n

    The options are:

    • linux: The minimal version of the linux kernel.
    • libc: The minimal version of the libc library. Also allows specifying the family of the libc library. e.g. libc = { family=\"glibc\", version=\"2.28\" }
    • macos: The minimal version of the macOS operating system.
    • cuda: The minimal version of the CUDA library.

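    Putting these together, a sketch combining several options (the version numbers are illustrative):

    [system-requirements]\nlinux = \"5.10\"\nlibc = { family=\"glibc\", version=\"2.28\" }\ncuda = \"12\"\n
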
    More information in the system requirements documentation.

    "},{"location":"reference/project_configuration/#the-pypi-options-table","title":"The pypi-options table","text":"

The pypi-options table is used to define options that are specific to PyPI registries. These options can be specified either at the root level, which adds them to the default options feature, or at the feature level, which creates a union of these options when the features are included in the environment.

    The options that can be defined are:

    • index-url: replaces the main index url.
    • extra-index-urls: adds an extra index url.
    • find-links: similar to --find-links option in pip.
    • no-build-isolation: disables build isolation, can only be set per package.
    • index-strategy: allows for specifying the index strategy to use.

These options are explained in the sections below. Most of these options are taken directly, or with slight modifications, from the uv settings. If any that you need are missing, feel free to create an issue requesting them.
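
    As a sketch of the feature-level form mentioned above (the feature name test and the extra index URL are illustrative, and the [feature.<name>.pypi-options] table form is assumed), options defined on a feature are unioned into any environment that includes that feature:

    [pypi-options]\nindex-url = \"https://pypi.org/simple\"\n\n[feature.test.pypi-options]\nextra-index-urls = [\"https://example.com/test-index\"]\n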

    "},{"location":"reference/project_configuration/#alternative-registries","title":"Alternative registries","text":"

    Strict Index Priority

Unlike pip, because we make use of uv, we have a strict index priority. This means that the first index on which a given package can be found is used. The order is determined by the order in the TOML file, where the extra-index-urls are preferred over the index-url. Read more about this in the uv docs.

Often you might want to use an alternative or extra index for your project. This can be done by adding the pypi-options table to your pixi.toml file; the following options are available:

    • index-url: replaces the main index url. If this is not set the default index used is https://pypi.org/simple. Only one index-url can be defined per environment.
    • extra-index-urls: adds an extra index url. The urls are used in the order they are defined. And are preferred over the index-url. These are merged across features into an environment.
    • find-links: which can either be a path {path = './links'} or a url {url = 'https://example.com/links'}. This is similar to the --find-links option in pip. These are merged across features into an environment.

    An example:

    [pypi-options]\nindex-url = \"https://pypi.org/simple\"\nextra-index-urls = [\"https://example.com/simple\"]\nfind-links = [{path = './links'}]\n

There are some examples in the pixi repository that make use of this feature.

    Authentication Methods

    To read about existing authentication methods for private registries, please check the PyPI Authentication section.

    "},{"location":"reference/project_configuration/#no-build-isolation","title":"No Build Isolation","text":"

Even though build isolation is a good default, one can choose not to isolate the build for a certain package name; this allows the build to access the pixi environment. This is convenient if you want to use torch or something similar for your build process.

    [dependencies]\npytorch = \"2.4.0\"\n\n[pypi-options]\nno-build-isolation = [\"detectron2\"]\n\n[pypi-dependencies]\ndetectron2 = { git = \"https://github.com/facebookresearch/detectron2.git\", rev = \"5b72c27ae39f99db75d43f18fd1312e1ea934e60\"}\n

    Conda dependencies define the build environment

To use no-build-isolation effectively, use conda dependencies to define the build environment. These are installed before the PyPI dependencies are resolved; this way they are available during the build process. In the example above, adding torch as a PyPI dependency would be ineffective, as it would not yet be installed during the PyPI resolution phase.

    "},{"location":"reference/project_configuration/#index-strategy","title":"Index Strategy","text":"

    The strategy to use when resolving against multiple index URLs. Description modified from the uv documentation:

By default, uv, and thus pixi, will stop at the first index on which a given package is available, and limit resolutions to those present on that first index (first-match). This prevents dependency confusion attacks, whereby an attacker can upload a malicious package under the same name to a secondary index.

    One index strategy per environment

Only one index-strategy can be defined per environment or solve-group; otherwise, an error will be shown.

    "},{"location":"reference/project_configuration/#possible-values","title":"Possible values:","text":"
    • \"first-index\": Only use results from the first index that returns a match for a given package name
    • \"unsafe-first-match\": Search for every package name across all indexes, exhausting the versions from the first index before moving on to the next. Meaning if the package a is available on index x and y, it will prefer the version from x unless you've requested a package version that is only available on y.
    • \"unsafe-best-match\": Search for every package name across all indexes, preferring the best version found. If a package version is in multiple indexes, only look at the entry for the first index. So given index, x and y that both contain package a, it will take the best version from either x or y, but should that version be available on both indexes it will prefer x.

    PyPI only

    The index-strategy only changes PyPI package resolution and not conda package resolution.
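
    A short sketch of setting the strategy through the pypi-options table (the value is taken from the list above):

    [pypi-options]\nindex-strategy = \"unsafe-best-match\"\n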

    "},{"location":"reference/project_configuration/#the-dependencies-tables","title":"The dependencies table(s)","text":"

    This section defines what dependencies you would like to use for your project.

    There are multiple dependencies tables. The default is [dependencies], which are dependencies that are shared across platforms.

    Dependencies are defined using a VersionSpec. A VersionSpec combines a Version with an optional operator.

    Some examples are:

# Use this exact package version\npackage0 = \"1.2.3\"\n# Use 1.2.3 up to 1.3.0\npackage1 = \"~=1.2.3\"\n# Use a version larger than 1.2 and lower than or equal to 1.4\npackage2 = \">1.2,<=1.4\"\n# Greater than or equal to 1.2.3, or lower than 1.0.0\npackage3 = \">=1.2.3|<1.0.0\"\n

    Dependencies can also be defined as a mapping where it is using a matchspec:

    package0 = { version = \">=1.2.3\", channel=\"conda-forge\" }\npackage1 = { version = \">=1.2.3\", build=\"py34_0\" }\n

    Tip

The dependencies can be easily added using the pixi add command line. Running add for an existing dependency will replace it with the newest version it can use.

    Note

    To specify different dependencies for different platforms use the target table

    "},{"location":"reference/project_configuration/#dependencies","title":"dependencies","text":"

Add any conda package dependency that you want to install into the environment. Don't forget to add the channel to the project table should you use anything other than conda-forge. Even if a dependency defines a channel, that channel should be added to the project.channels list.

    [dependencies]\npython = \">3.9,<=3.11\"\nrust = \"1.72\"\npytorch-cpu = { version = \"~=1.1\", channel = \"pytorch\" }\n
    "},{"location":"reference/project_configuration/#pypi-dependencies","title":"pypi-dependencies","text":"Details regarding the PyPI integration

    We use uv, which is a new fast pip replacement written in Rust.

    We integrate uv as a library, so we use the uv resolver, to which we pass the conda packages as 'locked'. This disallows uv from installing these dependencies itself, and ensures it uses the exact version of these packages in the resolution. This is unique amongst conda based package managers, which usually just call pip from a subprocess.

    The uv resolution is included in the lock file directly.

Pixi directly supports depending on PyPI packages; the PyPA calls a distributed package a 'distribution'. There are source and binary distributions, both of which are supported by pixi. These distributions are installed into the environment after the conda environment has been resolved and installed. PyPI packages are not indexed on prefix.dev but can be viewed on pypi.org.

    Important considerations

    • Stability: PyPI packages might be less stable than their conda counterparts. Prefer using conda packages in the dependencies table where possible.
    "},{"location":"reference/project_configuration/#version-specification","title":"Version specification:","text":"

These dependencies don't follow the conda matchspec specification. The version is a string specification of the version according to PEP 440/PyPA. Additionally, a list of extras can be included, which are essentially optional dependencies. Note that this version is distinct from the conda MatchSpec type. See the example below to see how this is used in practice:

    [dependencies]\n# When using pypi-dependencies, python is needed to resolve pypi dependencies\n# make sure to include this\npython = \">=3.6\"\n\n[pypi-dependencies]\nfastapi = \"*\"  # This means any version (the wildcard `*` is a pixi addition, not part of the specification)\npre-commit = \"~=3.5.0\" # This is a single version specifier\n# Using the toml map allows the user to add `extras`\npandas = { version = \">=1.0.0\", extras = [\"dataframe\", \"sql\"]}\n\n# git dependencies\n# With ssh\nflask = { git = \"ssh://git@github.com/pallets/flask\" }\n# With https and a specific revision\nrequests = { git = \"https://github.com/psf/requests.git\", rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\" }\n# TODO: will support later -> branch = '' or tag = '' to specify a branch or tag\n\n# You can also directly add a source dependency from a path, tip keep this relative to the root of the project.\nminimal-project = { path = \"./minimal-project\", editable = true}\n\n# You can also use a direct url, to either a `.tar.gz` or `.zip`, or a `.whl` file\nclick = { url = \"https://github.com/pallets/click/releases/download/8.1.7/click-8.1.7-py3-none-any.whl\" }\n\n# You can also just the default git repo, it will checkout the default branch\npytest = { git = \"https://github.com/pytest-dev/pytest.git\"}\n
    "},{"location":"reference/project_configuration/#full-specification","title":"Full specification","text":"

The full specification of a PyPI dependency that pixi supports can be split into the following fields:

    "},{"location":"reference/project_configuration/#extras","title":"extras","text":"

A list of extras to install with the package, e.g. [\"dataframe\", \"sql\"]. The extras field can be combined with all other specifiers, as it is an addition to them.

    pandas = { version = \">=1.0.0\", extras = [\"dataframe\", \"sql\"]}\npytest = { git = \"URL\", extras = [\"dev\"]}\nblack = { url = \"URL\", extras = [\"cli\"]}\nminimal-project = { path = \"./minimal-project\", editable = true, extras = [\"dev\"]}\n
    "},{"location":"reference/project_configuration/#version","title":"version","text":"

The version of the package to install, e.g. \">=1.0.0\" or *, which stands for any version (the wildcard is pixi specific). Version is the default field, so a specification without an inline table ({}) maps to this field.

    py-rattler = \"*\"\nruff = \"~=1.0.0\"\npytest = {version = \"*\", extras = [\"dev\"]}\n
    "},{"location":"reference/project_configuration/#git","title":"git","text":"

A git repository to install from. This supports both https:// and ssh:// URLs.

    Use git in combination with rev or subdirectory:

• rev: A specific revision to install. e.g. rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\"
    • subdirectory: A subdirectory to install from. subdirectory = \"src\" or subdirectory = \"src/packagex\"
    # Note don't forget the `ssh://` or `https://` prefix!\npytest = { git = \"https://github.com/pytest-dev/pytest.git\"}\nrequests = { git = \"https://github.com/psf/requests.git\", rev = \"0106aced5faa299e6ede89d1230bd6784f2c3660\" }\npy-rattler = { git = \"ssh://git@github.com/mamba-org/rattler.git\", subdirectory = \"py-rattler\" }\n
    "},{"location":"reference/project_configuration/#path","title":"path","text":"

A local path to install from, e.g. path = \"./path/to/package\". We advise keeping path dependencies inside the project and using a relative path.

Set editable = true to install in editable mode. This is highly recommended, as reinstalling after changes is cumbersome when the package is not installed in editable mode.

    minimal-project = { path = \"./minimal-project\", editable = true}\n
    "},{"location":"reference/project_configuration/#url","title":"url","text":"

A URL to install a wheel or sdist distribution directly from.

    pandas = {url = \"https://files.pythonhosted.org/packages/3d/59/2afa81b9fb300c90531803c0fd43ff4548074fa3e8d0f747ef63b3b5e77a/pandas-2.2.1.tar.gz\"}\n
    Did you know you can use: add --pypi?

Use the --pypi flag with the add command to quickly add PyPI packages from the CLI, e.g. pixi add --pypi flask.

    This does not support all the features of the pypi-dependencies table yet.

    "},{"location":"reference/project_configuration/#source-dependencies-sdist","title":"Source dependencies (sdist)","text":"

The Source Distribution Format (sdist for short) is a source-based format that a package can publish alongside the binary wheel format. Because these distributions need to be built, they need a python executable to do so; this is why python needs to be present in the conda environment. Sdists usually also depend on system packages to be built, especially when compiling C/C++ based python bindings. Think, for example, of Python SDL2 bindings depending on the C library SDL2. To help build these dependencies, we activate the conda environment that includes these pypi dependencies before resolving. This way, when a source distribution depends on gcc for example, it is used from the conda environment instead of the system.
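As a hedged sketch of what this can look like in a manifest (my-sdl2-bindings is a hypothetical sdist-only package; the other names are conda-forge packages), the interpreter, build tools and the C library come from the conda [dependencies] while the sdist itself is a [pypi-dependencies] entry:

[dependencies]\n# The interpreter used to build and install the sdist\npython = \">=3.11\"\n# Build tools and the C library come from conda and are activated during the build\ncompilers = \"*\"\nsdl2 = \"*\"\n\n[pypi-dependencies]\n# Hypothetical sdist-only package that compiles against the SDL2 library above\nmy-sdl2-bindings = \"*\"\n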

    "},{"location":"reference/project_configuration/#host-dependencies","title":"host-dependencies","text":"

    This table contains dependencies that are needed to build your project but which should not be included when your project is installed as part of another project. In other words, these dependencies are available during the build but are no longer available when your project is installed. Dependencies listed in this table are installed for the architecture of the target machine.

    [host-dependencies]\npython = \"~=3.10.3\"\n

    Typical examples of host dependencies are:

    • Base interpreters: a Python package would list python here and an R package would list mro-base or r-base.
    • Libraries your project links against during compilation like openssl, rapidjson, or xtensor.
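Combining both bullets, a hedged sketch of a Python extension that links against OpenSSL could list:

[host-dependencies]\n# Base interpreter the package is built for\npython = \"~=3.10.3\"\n# Library the project links against during compilation\nopenssl = \"*\"\n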
    "},{"location":"reference/project_configuration/#build-dependencies","title":"build-dependencies","text":"

This table contains dependencies that are needed to build the project. Different from dependencies and host-dependencies, these packages are installed for the architecture of the build machine. This enables cross-compiling from one machine architecture to another.

    [build-dependencies]\ncmake = \"~=3.24\"\n

    Typical examples of build dependencies are:

    • Compilers are invoked on the build machine, but they generate code for the target machine. If the project is cross-compiled, the architecture of the build and target machine might differ.
• cmake is invoked on the build machine to generate additional code- or project-files which are then included in the compilation process.

    Info

    The build target refers to the machine that will execute the build. Programs and libraries installed by these dependencies will be executed on the build machine.

    For example, if you compile on a MacBook with an Apple Silicon chip but target Linux x86_64 then your build platform is osx-arm64 and your host platform is linux-64.

    "},{"location":"reference/project_configuration/#the-activation-table","title":"The activation table","text":"

    The activation table is used for specialized activation operations that need to be run when the environment is activated.

    There are two types of activation operations a user can modify in the manifest:

    • scripts: A list of scripts that are run when the environment is activated.
    • env: A mapping of environment variables that are set when the environment is activated.

    These activation operations will be run before the pixi run and pixi shell commands.

    Note

The activation operations are run by the system shell interpreter as they run before an environment is available. This means that they run as cmd.exe on Windows and bash on Linux and macOS (Unix). Only .sh, .bash and .bat files are supported.

The environment variables are set in the shell that runs the activation script, so take note when using e.g. $ or %.

If you have scripts or env variables per platform, use the target table.

    [activation]\nscripts = [\"env_setup.sh\"]\nenv = { ENV_VAR = \"value\" }\n\n# To support windows platforms as well add the following\n[target.win-64.activation]\nscripts = [\"env_setup.bat\"]\n\n[target.linux-64.activation.env]\nENV_VAR = \"linux-value\"\n\n# You can also reference existing environment variables, but this has\n# to be done separately for unix-like operating systems and Windows\n[target.unix.activation.env]\nENV_VAR = \"$OTHER_ENV_VAR/unix-value\"\n\n[target.win.activation.env]\nENV_VAR = \"%OTHER_ENV_VAR%\\\\windows-value\"\n
    "},{"location":"reference/project_configuration/#the-target-table","title":"The target table","text":"

The target table allows for platform-specific configuration, letting you define different sets of tasks or dependencies per platform.

    The target table is currently implemented for the following sub-tables:

    • activation
    • dependencies
    • tasks

The target table is defined using [target.PLATFORM.SUB-TABLE], e.g. [target.linux-64.dependencies].

    The platform can be any of:

    • win, osx, linux or unix (unix matches linux and osx)
    • or any of the (more) specific target platforms, e.g. linux-64, osx-arm64

The sub-table can be any of those specified above.

To make this a bit clearer, let's look at the example below. Currently, pixi combines the top-level tables like dependencies with the target-specific ones into a single set, which, in the case of dependencies, can both add and overwrite dependencies. In the example below, cmake is used for all targets, but on osx-64 or osx-arm64 a different version of python will be selected.

    [dependencies]\ncmake = \"3.26.4\"\npython = \"3.10\"\n\n[target.osx.dependencies]\npython = \"3.11\"\n

    Here are some more examples:

    [target.win-64.activation]\nscripts = [\"setup.bat\"]\n\n[target.win-64.dependencies]\nmsmpi = \"~=10.1.1\"\n\n[target.win-64.build-dependencies]\nvs2022_win-64 = \"19.36.32532\"\n\n[target.win-64.tasks]\ntmp = \"echo $TEMP\"\n\n[target.osx-64.dependencies]\nclang = \">=16.0.6\"\n
    "},{"location":"reference/project_configuration/#the-feature-and-environments-tables","title":"The feature and environments tables","text":"

The feature table allows you to define features that can be used to create different [environments]. The [environments] table allows you to define different environments. The design is explained in this design document.

    Simplest example
    [feature.test.dependencies]\npytest = \"*\"\n\n[environments]\ntest = [\"test\"]\n

    This will create an environment called test that has pytest installed.

    "},{"location":"reference/project_configuration/#the-feature-table","title":"The feature table","text":"

    The feature table allows you to define the following fields per feature.

    • dependencies: Same as the dependencies.
    • pypi-dependencies: Same as the pypi-dependencies.
    • pypi-options: Same as the pypi-options.
    • system-requirements: Same as the system-requirements.
    • activation: Same as the activation.
    • platforms: Same as the platforms. Unless overridden, the platforms of the feature will be those defined at project level.
    • channels: Same as the channels. Unless overridden, the channels of the feature will be those defined at project level.
    • channel-priority: Same as the channel-priority.
    • target: Same as the target.
    • tasks: Same as the tasks.

These tables are all also available without the feature prefix. When used without the prefix, we call them the default feature. This is a protected name that you cannot use for your own feature.
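A small sketch to illustrate this: the un-prefixed tables below make up the default feature, while the prefixed table belongs to a separate test feature:

# These tables belong to the default feature\n[dependencies]\npython = \"3.11.*\"\n\n[tasks]\nstart = \"python main.py\"\n\n# While this table belongs to the test feature\n[feature.test.dependencies]\npytest = \"*\"\n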

    Cuda feature table example
    [feature.cuda]\nactivation = {scripts = [\"cuda_activation.sh\"]}\n# Results in:  [\"nvidia\", \"conda-forge\"] when the default is `conda-forge`\nchannels = [\"nvidia\"]\ndependencies = {cuda = \"x.y.z\", cudnn = \"12.0\"}\npypi-dependencies = {torch = \"==1.9.0\"}\nplatforms = [\"linux-64\", \"osx-arm64\"]\nsystem-requirements = {cuda = \"12\"}\ntasks = { warmup = \"python warmup.py\" }\ntarget.osx-arm64 = {dependencies = {mlx = \"x.y.z\"}}\n
    Cuda feature table example but written as separate tables
    [feature.cuda.activation]\nscripts = [\"cuda_activation.sh\"]\n\n[feature.cuda.dependencies]\ncuda = \"x.y.z\"\ncudnn = \"12.0\"\n\n[feature.cuda.pypi-dependencies]\ntorch = \"==1.9.0\"\n\n[feature.cuda.system-requirements]\ncuda = \"12\"\n\n[feature.cuda.tasks]\nwarmup = \"python warmup.py\"\n\n[feature.cuda.target.osx-arm64.dependencies]\nmlx = \"x.y.z\"\n\n# Channels and Platforms are not available as separate tables as they are implemented as lists\n[feature.cuda]\nchannels = [\"nvidia\"]\nplatforms = [\"linux-64\", \"osx-arm64\"]\n
    "},{"location":"reference/project_configuration/#the-environments-table","title":"The environments table","text":"

    The [environments] table allows you to define environments that are created using the features defined in the [feature] tables.

    The environments table is defined using the following fields:

    • features: The features that are included in the environment. Unless no-default-feature is set to true, the default feature is implicitly included in the environment.
• solve-group: The solve group is used to group environments together at the solve stage. This is useful for environments that need to have the same dependencies but might extend them with additional dependencies, for instance when testing a production environment with additional test dependencies. These dependencies will then be the same version in all environments that have the same solve group, but the different environments contain different subsets of the solve group's dependency set.
    • no-default-feature: Whether to include the default feature in that environment. The default is false, to include the default feature.

    Full environments table specification

    [environments]\ntest = {features = [\"test\"], solve-group = \"test\"}\nprod = {features = [\"prod\"], solve-group = \"test\"}\nlint = {features = [\"lint\"], no-default-feature = true}\n
    As shown in the example above, in the simplest of cases, it is possible to define an environment only by listing its features:

    Simplest example
    [environments]\ntest = [\"test\"]\n

    is equivalent to

    Simplest example expanded
    [environments]\ntest = {features = [\"test\"]}\n

When an environment comprises several features (including the default feature):

• The activation and tasks of the environment are the union of the activation and tasks of all its features.
• The dependencies and pypi-dependencies of the environment are the union of the dependencies and pypi-dependencies of all its features. This means that if several features define a requirement for the same package, both requirements will be combined. Beware of conflicting requirements across features added to the same environment.
• The system-requirements of the environment is the union of the system-requirements of all its features. If multiple features specify a requirement for the same system package, the highest version is chosen.
• The channels of the environment is the union of the channels of all its features. Channel priorities can be specified in each feature, to ensure channels are considered in the right order in the environment.
• The platforms of the environment is the intersection of the platforms of all its features. Be aware that the platforms supported by a feature (including the default feature) will be considered as the platforms defined at project level (unless overridden in the feature). This means that it is usually a good idea to set the project platforms to all platforms it can support across its environments.
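A hedged sketch of these rules (the feature and package names are only illustrative): the test-cuda environment below gets the union of the dependencies of its features, but only the intersection of their platforms:

[project]\nplatforms = [\"linux-64\", \"osx-arm64\", \"win-64\"]\n\n[dependencies]\npython = \"3.11.*\"\n\n[feature.test.dependencies]\npytest = \"*\"\n\n[feature.cuda]\nplatforms = [\"linux-64\", \"win-64\"]\nsystem-requirements = {cuda = \"12\"}\n\n[environments]\n# Dependencies: python + pytest (union); platforms: linux-64 and win-64 (intersection)\ntest-cuda = [\"test\", \"cuda\"]\n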

    "},{"location":"reference/project_configuration/#global-configuration","title":"Global configuration","text":"

    The global configuration options are documented in the global configuration section.

    "},{"location":"switching_from/conda/","title":"Transitioning from the conda or mamba to pixi","text":"

    Welcome to the guide designed to ease your transition from conda or mamba to pixi. This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. With pixi, you'll experience a project-based workflow, enhancing your development process, and allowing for easy sharing of your work.

    "},{"location":"switching_from/conda/#why-pixi","title":"Why Pixi?","text":"

    Pixi builds upon the foundation of the conda ecosystem, introducing a project-centric approach rather than focusing solely on environments. This shift towards projects offers a more organized and efficient way to manage dependencies and run code, tailored to modern development practices.

    "},{"location":"switching_from/conda/#key-differences-at-a-glance","title":"Key Differences at a Glance","text":"Task Conda/Mamba Pixi Installation Requires an installer Download and add to path (See installation) Creating an Environment conda create -n myenv -c conda-forge python=3.8 pixi init myenv followed by pixi add python=3.8 Activating an Environment conda activate myenv pixi shell within the project directory Deactivating an Environment conda deactivate exit from the pixi shell Running a Task conda run -n myenv python my_program.py pixi run python my_program.py (See run) Installing a Package conda install numpy pixi add numpy Uninstalling a Package conda remove numpy pixi remove numpy

    No base environment

Conda has a base environment, which is the default environment when you start a new shell. Pixi does not have a base environment and requires you to install the tools you need either in the project or globally. Using pixi global install bat will install bat in a global environment, which is not the same as the base environment in conda.

    Activating pixi environment in the current shell

    For some advanced use-cases, you can activate the environment in the current shell. This uses the pixi shell-hook which prints the activation script, which can be used to activate the environment in the current shell without pixi itself.

    ~/myenv > eval \"$(pixi shell-hook)\"\n

    "},{"location":"switching_from/conda/#environment-vs-project","title":"Environment vs Project","text":"

Conda and mamba focus on managing environments, while pixi emphasizes projects. In pixi, a project is a folder containing a manifest file (pixi.toml/pyproject.toml) that describes the project, a pixi.lock lock-file that describes the exact dependencies, and a .pixi folder that contains the environment.

    This project-centric approach allows for easy sharing and collaboration, as the project folder contains all the necessary information to recreate the environment. It manages more than one environment for more than one platform in a single project, and allows for easy switching between them. (See multiple environments)

    "},{"location":"switching_from/conda/#global-environments","title":"Global environments","text":"

conda installs all environments in one global location. If this is important to you for filesystem reasons, you can use the detached-environments feature of pixi.

    pixi config set detached-environments true\n# or a specific location\npixi config set detached-environments /path/to/envs\n
    This doesn't allow you to activate the environments using pixi shell -n but it will make the installation of the environments go to the same folder.

    pixi does have the pixi global command to install tools on your machine. (See global) This is not a replacement for conda but works the same as pipx and condax. It creates a single isolated environment for the given requirement and installs the binaries into the global path.

    pixi global install bat\nbat pixi.toml\n

    Never install pip with pixi global

Installations with pixi global get their own isolated environment. Installing pip with pixi global will create a new isolated environment with its own pip binary. Using that pip binary will install packages in the pip environment, making them unreachable from anywhere, as you can't activate it.

    "},{"location":"switching_from/conda/#automated-switching","title":"Automated switching","text":"

    With pixi you can import environment.yml files into a pixi project. (See import)

    pixi init --import environment.yml\n
    This will create a new project with the dependencies from the environment.yml file.

    Exporting your environment

If you are working with Conda users or systems, you can export your environment to an environment.yml file to share it.

    pixi project export conda\n
Additionally, you can export a conda explicit specification file.

    "},{"location":"switching_from/conda/#troubleshooting","title":"Troubleshooting","text":"

Encountering issues? Here are solutions to some common problems you may run into when you are used to the conda workflow:

    • Dependency is excluded because due to strict channel priority not using this option from: 'https://conda.anaconda.org/conda-forge/' This error occurs when the package is in multiple channels. pixi uses a strict channel priority. See channel priority for more information.
    • pixi global install pip, pip doesn't work. pip is installed in the global isolated environment. Use pixi add pip in a project to install pip in the project environment and use that project.
    • pixi global install <Any Library> -> import <Any Library> -> ModuleNotFoundError: No module named '<Any Library>' The library is installed in the global isolated environment. Use pixi add <Any Library> in a project to install the library in the project environment and use that project.
    "},{"location":"switching_from/poetry/","title":"Transitioning from poetry to pixi","text":"

    Welcome to the guide designed to ease your transition from poetry to pixi. This document compares key commands and concepts between these tools, highlighting pixi's unique approach to managing environments and packages. With pixi, you'll experience a project-based workflow similar to poetry while including the conda ecosystem and allowing for easy sharing of your work.

    "},{"location":"switching_from/poetry/#why-pixi","title":"Why Pixi?","text":"

In the Python ecosystem, poetry is most likely the closest tool to pixi in terms of project management. On top of the PyPI ecosystem, pixi adds the power of the conda ecosystem, allowing for a more flexible and powerful environment management.

    "},{"location":"switching_from/poetry/#quick-look-at-the-differences","title":"Quick look at the differences","text":"Task Poetry Pixi Creating an Environment poetry new myenv pixi init myenv Running a Task poetry run which python pixi run which python pixi uses a built-in cross platform shell for run where poetry uses your shell. Installing a Package poetry add numpy pixi add numpy adds the conda variant. pixi add --pypi numpy adds the PyPI variant. Uninstalling a Package poetry remove numpy pixi remove numpy removes the conda variant. pixi remove --pypi numpy removes the PyPI variant. Building a package poetry build We've yet to implement package building and publishing Publishing a package poetry publish We've yet to implement package building and publishing Reading the pyproject.toml [tool.poetry] [tool.pixi] Defining dependencies [tool.poetry.dependencies] [tool.pixi.dependencies] for conda, [tool.pixi.pypi-dependencies] or [project.dependencies] for PyPI dependencies Dependency definition - numpy = \"^1.2.3\"- numpy = \"~1.2.3\"- numpy = \"*\" - numpy = \">=1.2.3 <2.0.0\"- numpy = \">=1.2.3 <1.3.0\"- numpy = \"*\" Lock file poetry.lock pixi.lock Environment directory ~/.cache/pypoetry/virtualenvs/myenv ./.pixi Defaults to the project folder, move this using the detached-environments"},{"location":"switching_from/poetry/#support-both-poetry-and-pixi-in-my-project","title":"Support both poetry and pixi in my project","text":"

You can allow users to use poetry and pixi in the same project; they will not touch each other's parts of the configuration or system. It's best to duplicate the dependencies, basically making an exact copy of the tool.poetry.dependencies into tool.pixi.pypi-dependencies. Make sure that python is only defined in the tool.pixi.dependencies and not in the tool.pixi.pypi-dependencies.
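A minimal sketch of such a shared pyproject.toml (the numpy version is illustrative), with the dependency duplicated and python only listed for pixi:

[tool.poetry.dependencies]\npython = \"^3.11\"\nnumpy = \"^1.26\"\n\n[tool.pixi.dependencies]\n# python lives here for pixi, not in tool.pixi.pypi-dependencies\npython = \">=3.11\"\n\n[tool.pixi.pypi-dependencies]\n# Duplicate of the poetry dependency, written as a PyPI specifier\nnumpy = \">=1.26,<2\"\n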

    Mixing pixi and poetry

    It's possible to use poetry in pixi environments but this is advised against. Pixi supports PyPI dependencies in a different way than poetry does, and mixing them can lead to unexpected behavior. As you can only use one package manager at a time, it's best to stick to one.

If using poetry on top of a pixi project, you'll always need to install the poetry environment after the pixi environment, and let pixi handle the python and poetry installation.

    "},{"location":"tutorials/python/","title":"Tutorial: Doing Python development with Pixi","text":"

    In this tutorial, we will show you how to create a simple Python project with pixi. We will show some of the features that pixi provides, that are currently not a part of pdm, poetry etc.

    "},{"location":"tutorials/python/#why-is-this-useful","title":"Why is this useful?","text":"

Pixi builds upon the conda ecosystem, which allows you to create a Python environment with all the dependencies you need. This is especially useful when you are working with multiple Python interpreters and bindings to C and C++ libraries. For example, GDAL from PyPI does not have binary C dependencies, but the conda package does. On the other hand, some packages are only available through PyPI, which pixi can also install for you. Best of both worlds, let's give it a go!

    "},{"location":"tutorials/python/#pixitoml-and-pyprojecttoml","title":"pixi.toml and pyproject.toml","text":"

    We support two manifest formats: pyproject.toml and pixi.toml. In this tutorial, we will use the pyproject.toml format because it is the most common format for Python projects.

    "},{"location":"tutorials/python/#lets-get-started","title":"Let's get started","text":"

    Let's start out by making a directory and creating a new pyproject.toml file.

    pixi init pixi-py --format pyproject\n

    This gives you the following pyproject.toml:

    [project]\nname = \"pixi-py\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [{name = \"Tim de Jager\", email = \"tim@prefix.dev\"}]\nrequires-python = \">= 3.11\"\ndependencies = []\n\n[build-system]\nbuild-backend = \"hatchling.build\"\nrequires = [\"hatchling\"]\n\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\nplatforms = [\"osx-arm64\"]\n\n[tool.pixi.pypi-dependencies]\npixi-py = { path = \".\", editable = true }\n\n[tool.pixi.tasks]\n

    Let's add the Python project to the tree:

    Linux & macOSWindows
    cd pixi-py # move into the project directory\nmkdir pixi_py\ntouch pixi_py/__init__.py\n
    cd pixi-py\nmkdir pixi_py\ntype nul > pixi_py\\__init__.py\n

    We now have the following directory structure:

.\n├── pixi_py\n│   ├── __init__.py\n└── pyproject.toml\n

    We've used a flat-layout here but pixi supports both flat- and src-layouts.

    "},{"location":"tutorials/python/#whats-in-the-pyprojecttoml","title":"What's in the pyproject.toml?","text":"

Okay, so let's have a look at what sections have been added and how we can modify the pyproject.toml.

    These first entries were added to the pyproject.toml file:

    # Main pixi entry\n[tool.pixi.project]\nchannels = [\"conda-forge\"]\n# This is your machine platform by default\nplatforms = [\"osx-arm64\"]\n

The channels and platforms are added to the [tool.pixi.project] section. Channels like conda-forge manage packages similar to PyPI but allow for different packages across languages. The keyword platforms determines what platforms the project supports.

    The pixi_py package itself is added as an editable dependency. This means that the package is installed in editable mode, so you can make changes to the package and see the changes reflected in the environment, without having to re-install the environment.

    # Editable installs\n[tool.pixi.pypi-dependencies]\npixi-py = { path = \".\", editable = true }\n

In pixi, unlike other package managers, this is explicitly stated in the pyproject.toml file. The main reason is so that you can choose which environments this package should be included in.

    "},{"location":"tutorials/python/#managing-both-conda-and-pypi-dependencies-in-pixi","title":"Managing both conda and PyPI dependencies in pixi","text":"

    Our projects usually depend on other packages.

    $ pixi add black\nAdded black\n

    This will result in the following addition to the pyproject.toml:

    # Dependencies\n[tool.pixi.dependencies]\nblack = \">=24.4.2,<24.5\"\n

    But we can also be strict about the version that should be used with pixi add black=24, resulting in

    [tool.pixi.dependencies]\nblack = \"24.*\"\n

    Now, let's add some optional dependencies:

    pixi add --pypi --feature test pytest\n

    Which results in the following fields added to the pyproject.toml:

    [project.optional-dependencies]\ntest = [\"pytest\"]\n

    After we have added the optional dependencies to the pyproject.toml, pixi automatically creates a feature, which can contain a collection of dependencies, tasks, channels, and more.

    Sometimes there are packages that aren't available on conda channels but are published on PyPI. We can add these as well, which pixi will solve together with the default dependencies.

    $ pixi add black --pypi\nAdded black\nAdded these as pypi-dependencies.\n

    which results in the addition to the dependencies key in the pyproject.toml

    dependencies = [\"black\"]\n

When using pypi-dependencies you can make use of the optional-dependencies (extras) that other packages make available. For example, black makes the cli extra available, which can be added with the --pypi keyword:

    $ pixi add black[cli] --pypi\nAdded black[cli]\nAdded these as pypi-dependencies.\n

    which updates the dependencies entry to

    dependencies = [\"black[cli]\"]\n
    Optional dependencies in pixi.toml

    This tutorial focuses on the use of the pyproject.toml, but in case you're curious, the pixi.toml would contain the following entry after the installation of a PyPI package including an optional dependency:

    [pypi-dependencies]\nblack = { version = \"*\", extras = [\"cli\"] }\n

    "},{"location":"tutorials/python/#installation-pixi-install","title":"Installation: pixi install","text":"

    Now let's install the project with pixi install:

    $ pixi install\n\u2714 Project in /path/to/pixi-py is ready to use!\n

    We now have a new directory called .pixi in the project root. This directory contains the environment that was created when we ran pixi install. The environment is a conda environment that contains the dependencies that we specified in the pyproject.toml file. We can also install the test environment with pixi install -e test. We can use these environments for executing code.

    We also have a new file called pixi.lock in the project root. This file contains the exact versions of the dependencies that were installed in the environment across platforms.

    "},{"location":"tutorials/python/#whats-in-the-environment","title":"What's in the environment?","text":"

Using pixi list, you can see what's in the environment; this is essentially a nicer view of the lock file:

    $ pixi list\nPackage          Version       Build               Size       Kind   Source\nbzip2            1.0.8         h93a5062_5          119.5 KiB  conda  bzip2-1.0.8-h93a5062_5.conda\nblack            24.4.2                            3.8 MiB    pypi   black-24.4.2-cp312-cp312-win_amd64.http.whl\nca-certificates  2024.2.2      hf0a4a13_0          152.1 KiB  conda  ca-certificates-2024.2.2-hf0a4a13_0.conda\nlibexpat         2.6.2         hebf3989_0          62.2 KiB   conda  libexpat-2.6.2-hebf3989_0.conda\nlibffi           3.4.2         h3422bc3_5          38.1 KiB   conda  libffi-3.4.2-h3422bc3_5.tar.bz2\nlibsqlite        3.45.2        h091b4b1_0          806 KiB    conda  libsqlite-3.45.2-h091b4b1_0.conda\nlibzlib          1.2.13        h53f4e23_5          47 KiB     conda  libzlib-1.2.13-h53f4e23_5.conda\nncurses          6.4.20240210  h078ce10_0          801 KiB    conda  ncurses-6.4.20240210-h078ce10_0.conda\nopenssl          3.2.1         h0d3ecfb_1          2.7 MiB    conda  openssl-3.2.1-h0d3ecfb_1.conda\npython           3.12.3        h4a7b5fc_0_cpython  12.6 MiB   conda  python-3.12.3-h4a7b5fc_0_cpython.conda\nreadline         8.2           h92ec313_1          244.5 KiB  conda  readline-8.2-h92ec313_1.conda\ntk               8.6.13        h5083fa2_1          3 MiB      conda  tk-8.6.13-h5083fa2_1.conda\ntzdata           2024a         h0c530f3_0          117 KiB    conda  tzdata-2024a-h0c530f3_0.conda\npixi-py          0.1.0                                        pypi   . (editable)\nxz               5.2.6         h57fd34a_0          230.2 KiB  conda  xz-5.2.6-h57fd34a_0.tar.bz2\n

    Python

    The Python interpreter is also installed in the environment. This is because the Python interpreter version is read from the requires-python field in the pyproject.toml file. This is used to determine the Python version to install in the environment. This way, pixi automatically manages/bootstraps the Python interpreter for you, so no more brew, apt or other system install steps.

Here, you can see the different conda and PyPI packages listed. As you can see, the pixi-py package that we are working on is installed in editable mode. Every environment in pixi is isolated but reuses files that are hard-linked from a central cache directory. This means that you can have multiple environments with the same packages but only have the individual files stored once on disk.

We can create the default and test environments based on our own test feature that came from the optional dependencies:

    pixi project environment add default --solve-group default\npixi project environment add test --feature test --solve-group default\n

    Which results in:

    # Environments\n[tool.pixi.environments]\ndefault = { solve-group = \"default\" }\ntest = { features = [\"test\"], solve-group = \"default\" }\n
    Solve Groups

    Solve groups are a way to group dependencies together. This is useful when you have multiple environments that share the same dependencies. For example, maybe pytest is a dependency that influences the dependencies of the default environment. By putting these in the same solve group, you ensure that the versions in test and default are exactly the same.

    The default environment is created when you run pixi install. The test environment is created from the optional dependencies in the pyproject.toml file. You can execute commands in this environment with e.g. pixi run -e test python

    "},{"location":"tutorials/python/#getting-code-to-run","title":"Getting code to run","text":"

    Let's add some code to the pixi-py package. We will add a new function to the pixi_py/__init__.py file:

    from rich import print\n\ndef hello():\n    return \"Hello, [bold magenta]World[/bold magenta]!\", \":vampire:\"\n\ndef say_hello():\n    print(*hello())\n

    Now add the rich dependency from PyPI using: pixi add --pypi rich.

    Let's see if this works by running:

    pixi r python -c \"import pixi_py; pixi_py.say_hello()\"\nHello, World! \ud83e\udddb\n
    Slow?

This might be slow (2 minutes) the first time because pixi installs the project, but it will be near instant the second time.

Pixi runs the self-installed Python interpreter. Then, we are importing the pixi_py package, which is installed in editable mode. The code calls the say_hello function that we just added. And it works! Cool!

    "},{"location":"tutorials/python/#testing-this-code","title":"Testing this code","text":"

    Okay, so let's add a test for this function. Let's add a tests/test_me.py file in the root of the project.

    Giving us the following project structure:

.\n├── pixi.lock\n├── pixi_py\n│   ├── __init__.py\n├── pyproject.toml\n└── tests/test_me.py\n
    from pixi_py import hello\n\ndef test_pixi_py():\n    assert hello() == (\"Hello, [bold magenta]World[/bold magenta]!\", \":vampire:\")\n

    Let's add an easy task for running the tests.

    $ pixi task add --feature test test \"pytest\"\n\u2714 Added task `test`: pytest .\n

So pixi has a task system to make it easy to run commands, similar to npm scripts or something you would specify in a Justfile.

    Pixi tasks

    Tasks are actually a pretty cool pixi feature that is powerful and runs in a cross-platform shell. You can do caching, dependencies and more. Read more about tasks in the tasks section.

    $ pixi r test\n\u2728 Pixi task (test): pytest .\n================================================================================================= test session starts =================================================================================================\nplatform darwin -- Python 3.12.2, pytest-8.1.1, pluggy-1.4.0\nrootdir: /private/tmp/pixi-py\nconfigfile: pyproject.toml\ncollected 1 item\n\ntest_me.py .                                                                                                                                                                                                    [100%]\n\n================================================================================================== 1 passed in 0.00s =================================================================================================\n

    Neat! It seems to be working!

    "},{"location":"tutorials/python/#test-vs-default-environment","title":"Test vs Default environment","text":"

The interesting thing is what we see if we compare the output of the two environments:

    pixi list -e test\n# v.s. default environment\npixi list\n

The test environment has:

    package          version       build               size       kind   source\n...\npytest           8.1.1                             1.1 mib    pypi   pytest-8.1.1-py3-none-any.whl\n...\n

But the default environment is missing this package. This way, you can fine-tune your environments to only have the packages that are needed for that environment. E.g. you could also have a dev environment that has pytest and ruff installed, but you could omit these from the prod environment. There is a docker example that shows how to set up a minimal prod environment and copy from there.
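For example, a hedged sketch of such a setup adds the tooling through a dev feature while the default environment stays lean:

[tool.pixi.feature.dev.dependencies]\npytest = \"*\"\nruff = \"*\"\n\n[tool.pixi.environments]\n# dev gets pytest and ruff on top of the default feature; default stays without them\ndev = [\"dev\"]\n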

    "},{"location":"tutorials/python/#replacing-pypi-packages-with-conda-packages","title":"Replacing PyPI packages with conda packages","text":"

    Last thing, pixi provides the ability for pypi packages to depend on conda packages. Let's confirm this with pixi list:

    $ pixi list\nPackage          Version       Build               Size       Kind   Source\n...\npygments         2.17.2                            4.1 MiB    pypi   pygments-2.17.2-py3-none-any.http.whl\n...\n

Let's explicitly add pygments, a dependency of the rich package, to the pyproject.toml file.

    pixi add pygments\n

    This will add the following to the pyproject.toml file:

    [tool.pixi.dependencies]\npygments = \">=2.17.2,<2.18\"\n

We can now see that the pygments package is installed as a conda package.

    $ pixi list\nPackage          Version       Build               Size       Kind   Source\n...\npygments         2.17.2        pyhd8ed1ab_0        840.3 KiB  conda  pygments-2.17.2-pyhd8ed1ab_0.conda\n

    This way, PyPI dependencies and conda dependencies can be mixed and matched to seamlessly interoperate.

    $  pixi r python -c \"import pixi_py; pixi_py.say_hello()\"\nHello, World! \ud83e\udddb\n

    And it still works!

    "},{"location":"tutorials/python/#conclusion","title":"Conclusion","text":"

    In this tutorial, you've seen how easy it is to use a pyproject.toml to manage your pixi dependencies and environments. We have also explored how to use PyPI and conda dependencies seamlessly together in the same project and install optional dependencies to manage Python packages.

    Hopefully, this provides a flexible and powerful way to manage your Python projects and a fertile base for further exploration with Pixi.

    Thanks for reading! Happy Coding \ud83d\ude80

    Any questions? Feel free to reach out or share this tutorial on X, join our Discord, send us an e-mail or follow our GitHub.

    "},{"location":"tutorials/ros2/","title":"Tutorial: Develop a ROS 2 package with pixi","text":"

In this tutorial, we will show you how to develop a ROS 2 package using pixi. The tutorial is written to be executed from top to bottom; skipping steps might result in errors.

The audience for this tutorial is developers who are familiar with ROS 2 and who are interested in trying pixi for their development workflow.

    "},{"location":"tutorials/ros2/#prerequisites","title":"Prerequisites","text":"
    • You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. The crux of this tutorial is to show you only need pixi!
    • On Windows, it's advised to enable Developer mode. Go to Settings -> Update & Security -> For developers -> Developer mode.

If you're new to pixi, you can check out the basic usage guide. This will teach you the basics of a pixi project within 3 minutes.

    "},{"location":"tutorials/ros2/#create-a-pixi-project","title":"Create a pixi project.","text":"
    pixi init my_ros2_project -c robostack-staging -c conda-forge\ncd my_ros2_project\n

    It should have created a directory structure like this:

my_ros2_project\n├── .gitattributes\n├── .gitignore\n└── pixi.toml\n

    The pixi.toml file is the manifest file for your project. It should look like this:

    pixi.toml
    [project]\nname = \"my_ros2_project\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [\"User Name <user.name@email.url>\"]\nchannels = [\"robostack-staging\", \"conda-forge\"]\n# Your project can support multiple platforms, the current platform will be automatically added.\nplatforms = [\"linux-64\"]\n\n[tasks]\n\n[dependencies]\n

The channels you added to the init command are repositories of packages; you can search these repositories through our prefix.dev website. The platforms are the systems you want to support. In pixi you can support multiple platforms, but you have to define which ones, so pixi can check whether your dependencies are available for those platforms. For the rest of the fields, you can fill them in as you see fit.

    "},{"location":"tutorials/ros2/#add-ros-2-dependencies","title":"Add ROS 2 dependencies","text":"

    To use a pixi project you don't need any dependencies on your system, all the dependencies you need should be added through pixi, so other users can use your project without any issues.

    Let's start with the turtlesim example

    pixi add ros-humble-desktop ros-humble-turtlesim\n

    This will add the ros-humble-desktop and ros-humble-turtlesim packages to your manifest. Depending on your internet speed this might take a minute, as it will also install ROS in your project folder (.pixi).

    Now run the turtlesim example.

    pixi run ros2 run turtlesim turtlesim_node\n

    Or use the shell command to start an activated environment in your terminal.

    pixi shell\nros2 run turtlesim turtlesim_node\n

    Congratulations you have ROS 2 running on your machine with pixi!

    Some more fun with the turtle

    To control the turtle you can run the following command in a new terminal

    cd my_ros2_project\npixi run ros2 run turtlesim turtle_teleop_key\n

    Now you can control the turtle with the arrow keys on your keyboard.

    "},{"location":"tutorials/ros2/#add-a-custom-python-node","title":"Add a custom Python node","text":"

As ROS works with custom nodes, let's add a custom node to our project.

    pixi run ros2 pkg create --build-type ament_python --destination-directory src --node-name my_node my_package\n

    To build the package we need some more dependencies:

    pixi add colcon-common-extensions \"setuptools<=58.2.0\"\n

    Add the created initialization script for the ros workspace to your manifest file.

    Then run the build command

    pixi run colcon build\n

This will create a sourceable script in the install folder; you can source this script through an activation script to use your custom node. Normally this would be the script you add to your .bashrc, but now you tell pixi to use it.

    Linux & macOSWindows pixi.toml
    [activation]\nscripts = [\"install/setup.sh\"]\n
    pixi.toml
    [activation]\nscripts = [\"install/setup.bat\"]\n
    Multi platform support

    You can add multiple activation scripts for different platforms, so you can support multiple platforms with one project. Use the following example to add support for both Linux and Windows, using the target syntax.

    [project]\nplatforms = [\"linux-64\", \"win-64\"]\n\n[activation]\nscripts = [\"install/setup.sh\"]\n[target.win-64.activation]\nscripts = [\"install/setup.bat\"]\n

    Now you can run your custom node with the following command

    pixi run ros2 run my_package my_node\n
    "},{"location":"tutorials/ros2/#simplify-the-user-experience","title":"Simplify the user experience","text":"

    In pixi we have a feature called tasks, this allows you to define a task in your manifest file and run it with a simple command. Let's add a task to run the turtlesim example and the custom node.

    pixi task add sim \"ros2 run turtlesim turtlesim_node\"\npixi task add build \"colcon build --symlink-install\"\npixi task add hello \"ros2 run my_package my_node\"\n

Now you can run these tasks by simply running

    pixi run sim\npixi run build\npixi run hello\n
    Advanced task usage

    Tasks are a powerful feature in pixi.

    • You can add depends-on to the tasks to create a task chain.
    • You can add cwd to the tasks to run the task in a different directory from the root of the project.
    • You can add inputs and outputs to the tasks to create a task that only runs when the inputs are changed.
    • You can use the target syntax to run specific tasks on specific machines.
    [tasks]\nsim = \"ros2 run turtlesim turtlesim_node\"\nbuild = {cmd = \"colcon build --symlink-install\", inputs = [\"src\"]}\nhello = { cmd = \"ros2 run my_package my_node\", depends-on = [\"build\"] }\n
    "},{"location":"tutorials/ros2/#build-a-c-node","title":"Build a C++ node","text":"

To build a C++ node you need to add ament_cmake and some other build dependencies to your manifest file.

    pixi add ros-humble-ament-cmake-auto compilers pkg-config cmake ninja\n

    Now you can create a C++ node with the following command

    pixi run ros2 pkg create --build-type ament_cmake --destination-directory src --node-name my_cpp_node my_cpp_package\n

    Now you can build it again and run it with the following commands

    # Passing arguments to the build command to build with Ninja, add them to the manifest if you want to default to ninja.\npixi run build --cmake-args -G Ninja\npixi run ros2 run my_cpp_package my_cpp_node\n
    Tip

    Add the cpp task to the manifest file to simplify the user experience.

    pixi task add hello-cpp \"ros2 run my_cpp_package my_cpp_node\"\n
    "},{"location":"tutorials/ros2/#conclusion","title":"Conclusion","text":"

    In this tutorial, we showed you how to create a Python & CMake ROS2 project using pixi. We also showed you how to add dependencies to your project using pixi, and how to run your project using pixi run. This way you can make sure that your project is reproducible on all your machines that have pixi installed.

    "},{"location":"tutorials/ros2/#show-off-your-work","title":"Show Off Your Work!","text":"

    Finished with your project? We'd love to see what you've created! Share your work on social media using the hashtag #pixi and tag us @prefix_dev. Let's inspire the community together!

    "},{"location":"tutorials/ros2/#frequently-asked-questions","title":"Frequently asked questions","text":""},{"location":"tutorials/ros2/#what-happens-with-rosdep","title":"What happens with rosdep?","text":"

    Currently, we don't support rosdep in a pixi environment, so you'll have to add the packages using pixi add. rosdep will call conda install which isn't supported in a pixi environment.

    "},{"location":"tutorials/rust/","title":"Tutorial: Develop a Rust package using pixi","text":"

In this tutorial, we will show you how to develop a Rust package using pixi. The tutorial is written to be executed from top to bottom; skipping steps might result in errors.

The audience for this tutorial is developers who are familiar with Rust and cargo and who are interested in trying pixi for their development workflow. The benefit is that within a Rust workflow you lock both Rust and the C/system dependencies your project might be using, e.g. tokio users will almost certainly use openssl.

If you're new to pixi, you can check out the basic usage guide. This will teach you the basics of a pixi project within 3 minutes.

    "},{"location":"tutorials/rust/#prerequisites","title":"Prerequisites","text":"
    • You need to have pixi installed. If you haven't installed it yet, you can follow the instructions in the installation guide. The crux of this tutorial is to show you only need pixi!
    "},{"location":"tutorials/rust/#create-a-pixi-project","title":"Create a pixi project.","text":"
    pixi init my_rust_project\ncd my_rust_project\n

    It should have created a directory structure like this:

my_rust_project\n├── .gitattributes\n├── .gitignore\n└── pixi.toml\n

    The pixi.toml file is the manifest file for your project. It should look like this:

    pixi.toml
    [project]\nname = \"my_rust_project\"\nversion = \"0.1.0\"\ndescription = \"Add a short description here\"\nauthors = [\"User Name <user.name@email.url>\"]\nchannels = [\"conda-forge\"]\nplatforms = [\"linux-64\"] # (1)!\n\n[tasks]\n\n[dependencies]\n
    1. The platforms is set to your system's platform by default. You can change it to any platform you want to support. e.g. [\"linux-64\", \"osx-64\", \"osx-arm64\", \"win-64\"].
    "},{"location":"tutorials/rust/#add-rust-dependencies","title":"Add Rust dependencies","text":"

    To use a pixi project you don't need any dependencies on your system, all the dependencies you need should be added through pixi, so other users can use your project without any issues.

    pixi add rust\n

This will add the rust package to your pixi.toml file under [dependencies], which includes the rust toolchain and cargo.
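The resulting entry will look roughly like this (the exact pin depends on the rust version available when you run the command, so the version shown here is illustrative):

[dependencies]\nrust = \">=1.77.2,<1.78\"  # illustrative pin; pixi add picks the latest available version\n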

    "},{"location":"tutorials/rust/#add-a-cargo-project","title":"Add a cargo project","text":"

    Now that you have rust installed, you can create a cargo project in your pixi project.

    pixi run cargo init\n

pixi run is pixi's way to run commands in the pixi environment; it will make sure that the environment is set up correctly for the command to run. It runs in its own cross-platform shell; if you want more information, check out the tasks documentation. You can also activate the environment in your own shell by running pixi shell; after that, you don't need pixi run ... anymore.

    Now we can build a cargo project using pixi.

    pixi run cargo build\n
    To simplify the build process, you can add a build task to your pixi.toml file using the following command:
    pixi task add build \"cargo build\"\n
    Which creates this field in the pixi.toml file: pixi.toml
    [tasks]\nbuild = \"cargo build\"\n

    And now you can build your project using:

    pixi run build\n

    You can also run your project using:

    pixi run cargo run\n
    Which you can simplify with a task again.
    pixi task add start \"cargo run\"\n
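Which would leave the tasks table looking roughly like this:

[tasks]\nbuild = \"cargo build\"\nstart = \"cargo run\"\n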

    So you should get the following output:

    pixi run start\nHello, world!\n

    Congratulations, you have a Rust project running on your machine with pixi!

    "},{"location":"tutorials/rust/#next-steps-why-is-this-useful-when-there-is-rustup","title":"Next steps, why is this useful when there is rustup?","text":"

    Cargo is not a binary package manager, but a source-based package manager. This means that you need to have the Rust compiler installed on your system to use it. And possibly other dependencies that are not included in the cargo package manager. For example, you might need to install openssl or libssl-dev on your system to build a package. This is the case for pixi as well, but pixi will install these dependencies in your project folder, so you don't have to worry about them.

    Add the following dependencies to your cargo project:

    pixi run cargo add git2\n

If your system is not preconfigured to build C and does not have the libssl-dev package installed, you will not be able to build the project:

    pixi run build\n...\nCould not find directory of OpenSSL installation, and this `-sys` crate cannot\nproceed without this knowledge. If OpenSSL is installed and this crate had\ntrouble finding it,  you can set the `OPENSSL_DIR` environment variable for the\ncompilation process.\n\nMake sure you also have the development packages of openssl installed.\nFor example, `libssl-dev` on Ubuntu or `openssl-devel` on Fedora.\n\nIf you're in a situation where you think the directory *should* be found\nautomatically, please open a bug at https://github.com/sfackler/rust-openssl\nand include information about your system as well as this message.\n\n$HOST = x86_64-unknown-linux-gnu\n$TARGET = x86_64-unknown-linux-gnu\nopenssl-sys = 0.9.102\n\n\nIt looks like you're compiling on Linux and also targeting Linux. Currently this\nrequires the `pkg-config` utility to find OpenSSL but unfortunately `pkg-config`\ncould not be found. If you have OpenSSL installed you can likely fix this by\ninstalling `pkg-config`.\n...\n
    You can fix this, by adding the necessary dependencies for building git2, with pixi:
    pixi add openssl pkg-config compilers\n

    Now you should be able to build your project again:

    pixi run build\n...\n   Compiling git2 v0.18.3\n   Compiling my_rust_project v0.1.0 (/my_rust_project)\n    Finished dev [unoptimized + debuginfo] target(s) in 7.44s\n     Running `target/debug/my_rust_project`\n

    "},{"location":"tutorials/rust/#extra-add-more-tasks","title":"Extra: Add more tasks","text":"

    You can add more tasks to your pixi.toml file to simplify your workflow.

    For example, you can add a test task to run your tests:

    pixi task add test \"cargo test\"\n

    And you can add a clean task to clean your project:

    pixi task add clean \"cargo clean\"\n

    You can add a formatting task to your project:

    pixi task add fmt \"cargo fmt\"\n

    You can extend these tasks to run multiple commands with the use of the depends-on field.

    pixi task add lint \"cargo clippy\" --depends-on fmt\n
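Assuming the tasks added above, the manifest would then contain something along these lines:

[tasks]\nbuild = \"cargo build\"\nstart = \"cargo run\"\ntest = \"cargo test\"\nclean = \"cargo clean\"\nfmt = \"cargo fmt\"\nlint = { cmd = \"cargo clippy\", depends-on = [\"fmt\"] }\n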

    "},{"location":"tutorials/rust/#conclusion","title":"Conclusion","text":"

    In this tutorial, we showed you how to create a Rust project using pixi. We also showed you how to add dependencies to your project using pixi. This way you can make sure that your project is reproducible on any system that has pixi installed.

    "},{"location":"tutorials/rust/#show-off-your-work","title":"Show Off Your Work!","text":"

    Finished with your project? We'd love to see what you've created! Share your work on social media using the hashtag #pixi and tag us @prefix_dev. Let's inspire the community together!

    "},{"location":"CHANGELOG/","title":"Changelog","text":"

    All notable changes to this project will be documented in this file.

    The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

    "},{"location":"CHANGELOG/#0321-2024-10-08","title":"[0.32.1] - 2024-10-08","text":""},{"location":"CHANGELOG/#fixes","title":"Fixes","text":"
    • Bump Rust version to 1.81 by @wolfv in #2227
    "},{"location":"CHANGELOG/#documentation","title":"Documentation","text":"
    • Pixi-pack, docker, devcontainer by @pavelzw in #2220
    "},{"location":"CHANGELOG/#0320-2024-10-08","title":"[0.32.0] - 2024-10-08","text":""},{"location":"CHANGELOG/#highlights","title":"\u2728 Highlights","text":"

The biggest fix in this release is the move to the latest rattler, as it came with some major bug fixes for macOS and Rust 1.81 compatibility.

    "},{"location":"CHANGELOG/#changed","title":"Changed","text":"
    • Correctly implement total ordering for dependency provider by @tdejager in rattler/#892
    "},{"location":"CHANGELOG/#fixed","title":"Fixed","text":"
    • Fixed self-clobber issue when up/down grading packages by @wolfv in rattler/#893
    • Check environment name before returning not found print by @ruben-arts in #2198
    • Turn off symlink follow for task cache by @ruben-arts in #2209
    "},{"location":"CHANGELOG/#0310-2024-10-03","title":"[0.31.0] - 2024-10-03","text":""},{"location":"CHANGELOG/#highlights_1","title":"\u2728 Highlights","text":"

Thanks to our maintainer @baszamstra! He sped up the resolver for all cases we could think of in #2162. Check the time it takes to solve the environments in our test set:

    "},{"location":"CHANGELOG/#added","title":"Added","text":"
    • Add nodefaults to imported conda envs by @ruben-arts in #2097
    • Add newline to .gitignore by @ruben-arts in #2095
    • Add --no-activation option to prevent env activation during global install/upgrade by @183amir in #1980
    • Add --priority arg to project channel add by @minrk in #2086
    "},{"location":"CHANGELOG/#changed_1","title":"Changed","text":"
    • Use pixi spec for conda environment yml by @ruben-arts in #2096
    • Update rattler by @nichmor in #2120
    • Update README.md by @ruben-arts in #2129
    • Follow symlinks while walking files by @0xbe7a in #2141
    "},{"location":"CHANGELOG/#documentation_1","title":"Documentation","text":"
    • Adapt wording in pixi global proposal by @Hofer-Julian in #2098
    • Community: add array-api-extra by @lucascolley in #2107
    • pixi global mention no-activation by @Hofer-Julian in #2109
    • Add minimal constructor example by @bollwyvl in #2102
    • Update global manifest install by @Hofer-Julian in #2128
    • Add description for pixi update --json by @scottamain in #2160
    • Fixes backticks for doc strings by @rachfop in #2174
    "},{"location":"CHANGELOG/#fixed_1","title":"Fixed","text":"
    • Sort exported conda explicit spec topologically by @synapticarbors in #2101
    • --import env_file breaks channel priority by @fecet in #2113
    • Allow exact yanked pypi packages by @nichmor in #2116
    • Check if files are same in self-update by @apoorvkh in #2132
    • get_or_insert_nested_table by @Hofer-Julian in #2167
    • Improve install.sh PATH handling and general robustness by @Arcitec in #2189
    • Output tasks on pixi run without input by @ruben-arts in #2193
    "},{"location":"CHANGELOG/#performance","title":"Performance","text":"
    • Significantly speed up conda resolution by @baszalmstra in #2162
    "},{"location":"CHANGELOG/#new-contributors","title":"New Contributors","text":"
    • @Arcitec made their first contribution in #2189
    • @rachfop made their first contribution in #2174
    • @scottamain made their first contribution in #2160
    • @apoorvkh made their first contribution in #2132
    • @0xbe7a made their first contribution in #2141
    • @fecet made their first contribution in #2113
    • @minrk made their first contribution in #2086
    • @183amir made their first contribution in #1980
    • @lucascolley made their first contribution in #2107
    "},{"location":"CHANGELOG/#0300-2024-09-19","title":"[0.30.0] - 2024-09-19","text":""},{"location":"CHANGELOG/#highlights_2","title":"\u2728 Highlights","text":"

    I want to thank @synapticarbors and @abkfenris for starting the work on pixi project export. Pixi now supports exporting a conda environment.yml file and a conda explicit specification file. This is a great addition to the project and will help users share their projects with non-pixi users.

    "},{"location":"CHANGELOG/#added_1","title":"Added","text":"
    • Export conda explicit specification file from project by @synapticarbors in #1873
    • Add flag to pixi search by @Hofer-Julian in #2018
    • Adds the ability to set the index strategy by @tdejager in #1986
    • Export conda environment.yml by @abkfenris in #2003
    "},{"location":"CHANGELOG/#changed_2","title":"Changed","text":"
    • Improve examples/docker by @jennydaman in #1965
    • Minimal pre-commit tasks by @Hofer-Julian in #1984
    • Improve error and feedback when target does not exist by @tdejager in #1961
    • Move the rectangle using a mouse in SDL by @certik in #2069
    "},{"location":"CHANGELOG/#documentation_2","title":"Documentation","text":"
    • Update cli.md by @xela-95 in #2047
    • Update system-requirements information by @ruben-arts in #2079
    • Append to file syntax in task docs by @nicornk in #2013
    • Change documentation of pixi upload to refer to correct API endpoint by @traversaro in #2074
    "},{"location":"CHANGELOG/#testing","title":"Testing","text":"
    • Add downstream nerfstudio test by @tdejager in #1996
    • Run pytests in parallel by @tdejager in #2027
    • Testing common wheels by @tdejager in #2031
    "},{"location":"CHANGELOG/#fixed_2","title":"Fixed","text":"
    • Lock file is always outdated for pypi path dependencies by @nichmor in #2039
    • Fix error message for export conda explicit spec by @synapticarbors in #2048
    • Use conda-pypi-map for feature channels by @nichmor in #2038
    • Constrain feature platforms in schema by @bollwyvl in #2055
    • Split tag creation functions by @tdejager in #2062
    • Tree print to pipe by @ruben-arts in #2064
    • subdirectory in pypi url by @ruben-arts in #2065
    • Create a GUI application on Windows, not Console by @certik in #2067
    • Make dashes underscores in python package names by @ruben-arts in #2073
    • Give better errors on broken pyproject.toml by @ruben-arts in #2075
    "},{"location":"CHANGELOG/#refactor","title":"Refactor","text":"
    • Stop duplicating strip_channel_alias from rattler by @Hofer-Julian in #2017
    • Follow-up wheels tests by @Hofer-Julian in #2063
    • Integration test suite by @Hofer-Julian in #2081
    • Remove psutils by @Hofer-Julian in #2083
    • Add back older caching method by @tdejager in #2046
    • Release script by @Hofer-Julian in #1978
    • Activation script by @Hofer-Julian in #2014
    • Pins python version in add_pypi_functionality by @tdejager in #2040
    • Improve the lock_file_usage flags and behavior. by @ruben-arts in #2078
    • Move matrix to workflow that it is used in by @tdejager in #1987
    • Refactor manifest into more generic approach by @nichmor in #2015
    "},{"location":"CHANGELOG/#new-contributors_1","title":"New Contributors","text":"
    • @certik made their first contribution in #2069
    • @xela-95 made their first contribution in #2047
    • @nicornk made their first contribution in #2013
    • @jennydaman made their first contribution in #1965
    "},{"location":"CHANGELOG/#0290-2024-09-04","title":"[0.29.0] - 2024-09-04","text":""},{"location":"CHANGELOG/#highlights_3","title":"\u2728 Highlights","text":"
    • Add build-isolation options; for more details, check out our docs
    • Allow the use of virtual package overrides from environment variables (PR); see the usage sketch after this list
    • Many bug fixes
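
    For the virtual package overrides, a minimal usage sketch (assuming the conventional CONDA_OVERRIDE_* variables apply; check the linked PR for the exact names):

    CONDA_OVERRIDE_CUDA=12.0 pixi install\n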
    "},{"location":"CHANGELOG/#added_2","title":"Added","text":"
    • Add build-isolation options by @tdejager in #1909
    • Add release script by @Hofer-Julian in #1971
    "},{"location":"CHANGELOG/#changed_3","title":"Changed","text":"
    • Use rustls-tls instead of native-tls per default by @Hofer-Julian in #1929
    • Upgrade to uv 0.3.4 by @tdejager in #1936
    • Upgrade to uv 0.4.0 by @tdejager in #1944
    • Better error for when the target or platform are missing by @tdejager in #1959
    • Improve integration tests by @Hofer-Julian in #1958
    • Improve release script by @Hofer-Julian in #1974
    "},{"location":"CHANGELOG/#fixed_3","title":"Fixed","text":"
    • Update env variables in installation docs by @lev112 in #1937
    • Always overwrite when pixi adding the dependency by @ruben-arts in #1935
    • Typo in schema.json by @SobhanMP in #1948
    • Using file url as mapping by @nichmor in #1930
    • Offline mapping should not request by @nichmor in #1968
    • pixi init for pyproject.toml by @Hofer-Julian in #1947
    • Use two in memory indexes, for resolve and builds by @tdejager in #1969
    • Minor issues and todos by @KGrewal1 in #1963
    "},{"location":"CHANGELOG/#refactor_1","title":"Refactor","text":"
    • Improve integration tests by @Hofer-Julian in #1942
    "},{"location":"CHANGELOG/#new-contributors_2","title":"New Contributors","text":"
    • @SobhanMP made their first contribution in #1948
    • @lev112 made their first contribution in #1937
    "},{"location":"CHANGELOG/#0282-2024-08-28","title":"[0.28.2] - 2024-08-28","text":""},{"location":"CHANGELOG/#changed_4","title":"Changed","text":"
    • Use mold on linux by @Hofer-Julian in #1914
    "},{"location":"CHANGELOG/#documentation_3","title":"Documentation","text":"
    • Fix global manifest by @Hofer-Julian in #1912
    • Document azure keyring usage by @tdejager in #1913
    "},{"location":"CHANGELOG/#fixed_4","title":"Fixed","text":"
    • Let init add dependencies independent of target and don't install by @ruben-arts in #1916
    • Enable use of manylinux wheeltags once again by @tdejager in #1925
    • The bigger runner by @ruben-arts in #1902
    "},{"location":"CHANGELOG/#0281-2024-08-26","title":"[0.28.1] - 2024-08-26","text":""},{"location":"CHANGELOG/#changed_5","title":"Changed","text":"
    • Uv upgrade to 0.3.2 by @tdejager in #1900
    "},{"location":"CHANGELOG/#documentation_4","title":"Documentation","text":"
    • Add keyrings.artifacts to the list of project built with pixi by @jslorrma in #1908
    "},{"location":"CHANGELOG/#fixed_5","title":"Fixed","text":"
    • Use default indexes if none were given by the lockfile by @ruben-arts in #1910
    "},{"location":"CHANGELOG/#new-contributors_3","title":"New Contributors","text":"
    • @jslorrma made their first contribution in #1908
    "},{"location":"CHANGELOG/#0280-2024-08-22","title":"[0.28.0] - 2024-08-22","text":""},{"location":"CHANGELOG/#highlights_4","title":"\u2728 Highlights","text":"
    • Bug Fixes: Major fixes in general but especially for PyPI installation issues and better error messaging.
    • Compatibility: Default Linux version downgraded to 4.18 for broader support.
    • New Features: Added INIT_CWD in pixi run, improved logging, and more cache options.
    "},{"location":"CHANGELOG/#added_3","title":"Added","text":"
    • Add INIT_CWD to activated env pixi run by @ruben-arts in #1798
    • Add context to error when parsing conda-meta files by @baszalmstra in #1854
    • Add some logging for when packages are actually overridden by conda by @tdejager in #1874
    • Add package when extra is added by @ruben-arts in #1856
    "},{"location":"CHANGELOG/#changed_6","title":"Changed","text":"
    • Use new gateway to get the repodata for global install by @nichmor in #1767
    • Pixi global proposal by @Hofer-Julian in #1757
    • Upgrade to new uv 0.2.37 by @tdejager in #1829
    • Use new gateway for pixi search by @nichmor in #1819
    • Extend pixi clean cache with more cache options by @ruben-arts in #1872
    • Downgrade __linux default to 4.18 by @ruben-arts in #1887
    "},{"location":"CHANGELOG/#documentation_5","title":"Documentation","text":"
    • Fix instructions for update github actions by @Hofer-Julian in #1774
    • Fix fish completion script by @dennis-wey in #1789
    • Expands the environment variable examples in the reference section by @travishathaway in #1779
    • Community feedback pixi global by @Hofer-Julian in #1800
    • Additions to the pixi global proposal by @Hofer-Julian in #1803
    • Stop using invalid environment name in pixi global proposal by @Hofer-Julian in #1826
    • Extend pixi global proposal by @Hofer-Julian in #1861
    • Make channels required in pixi global manifest by @Hofer-Julian in #1868
    • Fix linux minimum version in project_configuration docs by @traversaro in #1888
    "},{"location":"CHANGELOG/#fixed_6","title":"Fixed","text":"
    • Try to increase rlimit by @baszalmstra in #1766
    • Add test for invalid environment names by @Hofer-Julian in #1825
    • Show global config in info command by @ruben-arts in #1807
    • Correct documentation of PIXI_ENVIRONMENT_PLATFORMS by @traversaro in #1842
    • Format in docs/features/environment.md by @cdeil in #1846
    • Make proper use of NamedChannelOrUrl by @ruben-arts in #1820
    • Trait impl override by @baszalmstra in #1848
    • Tame pixi search by @baszalmstra in #1849
    • Fix pixi tree -i duplicate output by @baszalmstra in #1847
    • Improve spec parsing error messages by @baszalmstra in #1786
    • Parse matchspec from CLI Lenient by @baszalmstra in #1852
    • Improve parsing of pypi-dependencies by @baszalmstra in #1851
    • Don't enforce system requirements for task tests by @baszalmstra in #1855
    • Satisfy when there are no pypi packages in the lockfile by @ruben-arts in #1862
    • Ssh url should not contain colon by @baszalmstra in #1865
    • find-links with manifest-path by @baszalmstra in #1864
    • Increase stack size in debug mode on windows by @baszalmstra in #1867
    • Solve-group-envs should reside in .pixi folder by @baszalmstra in #1866
    • Move package-override logging by @tdejager in #1883
    • Pinning logic for minor and major by @baszalmstra in #1885
    • Docs manifest tests by @ruben-arts in #1879
    "},{"location":"CHANGELOG/#refactor_2","title":"Refactor","text":"
    • Encapsulate channel resolution logic for CLI by @olivier-lacroix in #1781
    • Move to pub(crate) fn in order to detect and remove unused functions by @Hofer-Julian in #1805
    • Only compile TaskNode::full_command for tests by @Hofer-Julian in #1809
    • Derive Default for more structs by @Hofer-Julian in #1824
    • Rename get_up_to_date_prefix to update_prefix by @Hofer-Julian in #1837
    • Make HasSpecs implementation more functional by @Hofer-Julian in #1863
    "},{"location":"CHANGELOG/#new-contributors_4","title":"New Contributors","text":"
    • @cdeil made their first contribution in #1846
    "},{"location":"CHANGELOG/#0271-2024-08-09","title":"[0.27.1] - 2024-08-09","text":""},{"location":"CHANGELOG/#documentation_6","title":"Documentation","text":"
    • Fix mlx feature in \"multiple machines\" example by @rgommers in #1762
    • Update some of the cli and add osx rosetta mention by @ruben-arts in #1760
    • Fix typo by @pavelzw in #1771
    "},{"location":"CHANGELOG/#fixed_7","title":"Fixed","text":"
    • User agent string was wrong by @wolfv in #1759
    • Dont accidentally wipe pyproject.toml on init by @ruben-arts in #1775
    "},{"location":"CHANGELOG/#refactor_3","title":"Refactor","text":"
    • Add pixi_spec crate by @baszalmstra in #1741
    "},{"location":"CHANGELOG/#new-contributors_5","title":"New Contributors","text":"
    • @rgommers made their first contribution in #1762
    "},{"location":"CHANGELOG/#0270-2024-08-07","title":"[0.27.0] - 2024-08-07","text":""},{"location":"CHANGELOG/#highlights_5","title":"\u2728 Highlights","text":"

    This release contains a lot of refactoring and improvements to the codebase, in preparation for future features. Along with that, we've fixed a ton of bugs. To make sure we're not breaking anything, we've added a lot of tests and CI checks. But let us know if you find any issues!

    As a reminder, you can update pixi using pixi self-update and move to a specific version, including backwards, with pixi self-update --version 0.27.0.

    "},{"location":"CHANGELOG/#added_4","title":"Added","text":"
    • Add pixi run completion for fish shell by @dennis-wey in #1680
    "},{"location":"CHANGELOG/#changed_7","title":"Changed","text":"
    • Move examples from setuptools to hatchling by @Hofer-Julian in #1692
    • Let pixi init create hatchling pyproject.toml by @Hofer-Julian in #1693
    • Make [project] table optional for pyproject.toml manifests by @olivier-lacroix in #1732
    "},{"location":"CHANGELOG/#documentation_7","title":"Documentation","text":"
    • Improve the fish completions location by @tdejager in #1647
    • Explain why we use hatchling by @Hofer-Julian
    • Update install CLI doc now that the update command exist by @olivier-lacroix in #1690
    • Mention pixi exec in GHA docs by @pavelzw in #1724
    • Update to correct spelling by @ahnsn in #1730
    • Ensure hatchling is used everywhere in documentation by @olivier-lacroix in #1733
    • Add readme to WASM example by @wolfv in #1703
    • Fix typo by @pavelzw in #1660
    • Fix typo by @DimitriPapadopoulos in #1743
    • Fix typo by @SeaOtocinclus in #1651
    "},{"location":"CHANGELOG/#testing_1","title":"Testing","text":"
    • Added script and tasks for testing examples by @tdejager in #1671
    • Add simple integration tests by @ruben-arts in #1719
    "},{"location":"CHANGELOG/#fixed_8","title":"Fixed","text":"
    • Prepend pixi to path instead of appending by @vigneshmanick in #1644
    • Add manifest tests and run them in ci by @ruben-arts in #1667
    • Use hashed pypi mapping by @baszalmstra in #1663
    • Depend on pep440_rs from crates.io and use replace by @baszalmstra in #1698
    • pixi add with more than just package name and version by @ruben-arts in #1704
    • Ignore pypi logic on non pypi projects by @ruben-arts in #1705
    • Fix and refactor --no-lockfile-update by @ruben-arts in #1683
    • Changed example to use hatchling by @tdejager in #1729
    • Todo clean up by @KGrewal1 in #1735
    • Allow for init to pixi.toml when pyproject.toml is available. by @ruben-arts in #1640
    • Test on macos-13 by @ruben-arts in #1739
    • Make sure pixi vars are available before activation.env vars are by @ruben-arts in #1740
    • Authenticate exec package download by @olivier-lacroix in #1751
    "},{"location":"CHANGELOG/#refactor_4","title":"Refactor","text":"
    • Extract pixi_manifest by @baszalmstra in #1656
    • Delay channel config url evaluation by @baszalmstra in #1662
    • Split out pty functionality by @tdejager in #1678
    • Make project manifest loading DRY and consistent by @olivier-lacroix in #1688
    • Refactor channel add and remove CLI commands by @olivier-lacroix in #1689
    • Refactor pixi::consts and pixi::config into separate crates by @tdejager in #1684
    • Move dependencies to pixi_manifest by @tdejager in #1700
    • Moved pypi environment modifiers by @tdejager in #1699
    • Split HasFeatures by @tdejager in #1712
    • Move, splits and renames the HasFeatures trait by @tdejager in #1717
    • Merge utils by @tdejager in #1718
    • Move fancy to its own crate by @tdejager in #1722
    • Move config to repodata functions by @tdejager in #1723
    • Move pypi-mapping to its own crate by @tdejager in #1725
    • Split utils into 2 crates by @tdejager in #1736
    • Add progress bar as a crate by @nichmor in #1727
    • Split up pixi_manifest lib by @tdejager in #1661
    "},{"location":"CHANGELOG/#new-contributors_6","title":"New Contributors","text":"
    • @DimitriPapadopoulos made their first contribution in #1743
    • @KGrewal1 made their first contribution in #1735
    • @ahnsn made their first contribution in #1730
    • @dennis-wey made their first contribution in #1680
    "},{"location":"CHANGELOG/#0261-2024-07-22","title":"[0.26.1] - 2024-07-22","text":""},{"location":"CHANGELOG/#fixed_9","title":"Fixed","text":"
    • Make sure we also build the msi installer by @ruben-arts in #1645
    "},{"location":"CHANGELOG/#0260-2024-07-19","title":"[0.26.0] - 2024-07-19","text":""},{"location":"CHANGELOG/#highlights_6","title":"\u2728 Highlights","text":"
    • Specify how pixi pins your dependencies with the pinning-strategy in the config, e.g. semver -> >=1.2.3,<2 and no-pin -> * (a configuration sketch follows after this list) #1516
    • Specify how pixi solves multiple channels with channel-priority in the manifest. #1631
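
    A minimal configuration sketch for both options, assuming pinning-strategy lives in your pixi config file and channel-priority sits in the [project] table of the manifest:

    # config.toml\npinning-strategy = \"semver\"\n\n# pixi.toml\n[project]\nchannel-priority = \"strict\"\n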
    "},{"location":"CHANGELOG/#added_5","title":"Added","text":"
    • Add short options to config location flags by @ruben-arts in #1586
    • Add a file guard to indicate if an environment is being installed by @baszalmstra in #1593
    • Add pinning-strategy to the configuration by @ruben-arts in #1516
    • Add channel-priority to the manifest and solve by @ruben-arts in #1631
    • Add nushell completion by @Hofer-Julian in #1599
    • Add nushell completions for pixi run by @Hofer-Julian in #1627
    • Add completion for pixi run --environment for nushell by @Hofer-Julian in #1636
    "},{"location":"CHANGELOG/#changed_8","title":"Changed","text":"
    • Upgrade uv 0.2.18 by @tdejager in #1540
    • Refactor pyproject.toml parser by @nichmor in #1592
    • Interactive warning for packages in pixi global install by @ruben-arts in #1626
    "},{"location":"CHANGELOG/#documentation_8","title":"Documentation","text":"
    • Add WASM example with JupyterLite by @wolfv in #1623
    • Added LLM example by @ytjhai in #1545
    • Add note to mark directory as excluded in pixi-pycharm by @pavelzw in #1579
    • Add changelog to docs by @vigneshmanick in #1574
    • Updated the values of the system requirements by @tdejager in #1575
    • Tell cargo install which bin to install by @ruben-arts in #1584
    • Update conflict docs for cargo add by @Hofer-Julian in #1600
    • Revert \"Update conflict docs for cargo add \" by @Hofer-Julian in #1605
    • Add reference documentation for the exec command by @baszalmstra in #1587
    • Add transitioning docs for poetry and conda by @ruben-arts in #1624
    • Add pixi-pack by @pavelzw in #1629
    • Use '-' instead of '_' for package name by @olivier-lacroix in #1628
    "},{"location":"CHANGELOG/#fixed_10","title":"Fixed","text":"
    • Flaky task test by @tdejager in #1581
    • Pass command line arguments verbatim by @baszalmstra in #1582
    • Run clippy on all targets by @Hofer-Julian in #1588
    • Pre-commit install pixi task by @Hofer-Julian in #1590
    • Add clap_complete_nushell to dependencies by @Hofer-Julian in #1625
    • Write to stdout for machine readable output by @Hofer-Julian in #1639
    "},{"location":"CHANGELOG/#refactor_5","title":"Refactor","text":"
    • Migrate to workspace by @baszalmstra in #1597
    "},{"location":"CHANGELOG/#removed","title":"Removed","text":"
    • Remove double manifest warning by @tdejager in #1580
    "},{"location":"CHANGELOG/#new-contributors_7","title":"New Contributors","text":"
    • @ytjhai made their first contribution in #1545
    "},{"location":"CHANGELOG/#0250-2024-07-05","title":"[0.25.0] - 2024-07-05","text":""},{"location":"CHANGELOG/#highlights_7","title":"\u2728 Highlights","text":"
    • pixi exec command, execute commands in temporary environments, useful for testing in short-lived sessions.
    • We've bumped the default system-requirements to higher defaults: glibc (2.17 -> 2.28), osx-64 (10.15 -> 13.0), osx-arm64 (11.0 -> 13.0). Let us know if this causes any issues. To keep the previous values, use a system-requirements table; this is explained here (a sketch follows after this list)
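
    A minimal sketch of such a table in pixi.toml, assuming you want to keep the previous glibc and macOS defaults:

    [system-requirements]\nlibc = { family = \"glibc\", version = \"2.17\" }\nmacos = \"10.15\"  # use 11.0 on osx-arm64\n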
    "},{"location":"CHANGELOG/#changed_9","title":"Changed","text":"
    • Bump system requirements by @wolfv in #1553
    • Better error when exec is missing a cmd by @tdejager in #1565
    • Make exec use authenticated client by @tdejager in #1568
    "},{"location":"CHANGELOG/#documentation_9","title":"Documentation","text":"
    • Automatic updating using github actions by @pavelzw in #1456
    • Describe the --change-ps1 option for pixi shell by @Yura52 in #1536
    • Add some other quantco repos by @pavelzw in #1542
    • Add example using geos-rs by @Hofer-Julian in #1563
    "},{"location":"CHANGELOG/#fixed_11","title":"Fixed","text":"
    • Tiny error in basic_usage.md by @Sjouks in #1513
    • Lazy initialize client by @baszalmstra in #1511
    • URL typos in rtd examples by @kklein in #1538
    • Fix satisfiability for short sha hashes by @tdejager in #1530
    • Wrong path passed to dynamic check by @tdejager in #1552
    • Don't error if no tasks is available on platform by @hoxbro in #1550
    "},{"location":"CHANGELOG/#refactor_6","title":"Refactor","text":"
    • Add to use update code by @baszalmstra in #1508
    "},{"location":"CHANGELOG/#new-contributors_8","title":"New Contributors","text":"
    • @kklein made their first contribution in #1538
    • @Yura52 made their first contribution in #1536
    • @Sjouks made their first contribution in #1513
    "},{"location":"CHANGELOG/#0242-2024-06-14","title":"[0.24.2] - 2024-06-14","text":""},{"location":"CHANGELOG/#documentation_10","title":"Documentation","text":"
    • Add readthedocs examples by @bollwyvl in #1423
    • Fix typo in project_configuration.md by @RaulPL in #1502
    "},{"location":"CHANGELOG/#fixed_12","title":"Fixed","text":"
    • Too much shell variables in activation of pixi shell by @ruben-arts in #1507
    "},{"location":"CHANGELOG/#0241-2024-06-12","title":"[0.24.1] - 2024-06-12","text":""},{"location":"CHANGELOG/#fixed_13","title":"Fixed","text":"
    • Replace http code %2b with + by @ruben-arts in #1500
    "},{"location":"CHANGELOG/#0240-2024-06-12","title":"[0.24.0] - 2024-06-12","text":""},{"location":"CHANGELOG/#highlights_8","title":"\u2728 Highlights","text":"
    • You can now run in a more isolated environment on unix machines, using pixi run --clean-env TASK_NAME.
    • You can now easily clean your environment with pixi clean, or the cache with pixi clean cache (usage sketches follow after this list).
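
    A quick usage sketch of the new commands:

    pixi run --clean-env TASK_NAME  # run a task in a cleaned environment (unix only)\npixi clean                      # remove the project's environments\npixi clean cache                # clear the pixi cache\n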
    "},{"location":"CHANGELOG/#added_6","title":"Added","text":"
    • Add pixi clean command by @ruben-arts in #1325
    • Add --clean-env flag to tasks and run command by @ruben-arts in #1395
    • Add description field to task by @jjjermiah in #1479
    • Add pixi file to the environment to add pixi specific details by @ruben-arts in #1495
    "},{"location":"CHANGELOG/#changed_10","title":"Changed","text":"
    • Project environment cli by @baszalmstra in #1433
    • Update task list console output by @vigneshmanick in #1443
    • Upgrade uv by @tdejager in #1436
    • Sort packages in list_global_packages by @dhirschfeld in #1458
    • Added test for special chars wheel filename by @tdejager in #1454
    "},{"location":"CHANGELOG/#documentation_11","title":"Documentation","text":"
    • Improve multi env tasks documentation by @ruben-arts in #1494
    "},{"location":"CHANGELOG/#fixed_14","title":"Fixed","text":"
    • Use the activated environment when running a task by @tdejager in #1461
    • Fix authentication pypi-deps for download from lockfile by @tdejager in #1460
    • Display channels correctly in pixi info by @ruben-arts in #1459
    • Render help for --frozen by @ruben-arts in #1468
    • Don't record purl for non conda-forge channels by @nichmor in #1451
    • Use best_platform to verify the run platform by @ruben-arts in #1472
    • Creation of parent dir of symlink by @ruben-arts in #1483
    • pixi install --all output missing newline by @vigneshmanick in #1487
    • Don't error on already existing dependency by @ruben-arts in #1449
    • Remove debug true in release by @ruben-arts in #1477
    "},{"location":"CHANGELOG/#new-contributors_9","title":"New Contributors","text":"
    • @dhirschfeld made their first contribution in #1458

    Full commit history

    "},{"location":"CHANGELOG/#0230-2024-05-27","title":"[0.23.0] - 2024-05-27","text":""},{"location":"CHANGELOG/#highlights_9","title":"\u2728 Highlights","text":"
    • This release adds two new commands pixi config and pixi update
      • pixi config allows you to edit, set, unset, append, prepend and list your local/global or system configuration.
      • pixi update re-solves the full lockfile, or use pixi update PACKAGE to update only PACKAGE, making sure your project uses the latest versions that the manifest allows (usage sketches for both commands follow after this list).
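
    A minimal usage sketch of the two commands (the key, value, and package names here are placeholders):

    pixi config list                 # show the merged configuration\npixi config set some-key value   # set a value (placeholder key/value)\npixi update                      # re-solve the full lockfile\npixi update PACKAGE              # only update PACKAGE\n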
    "},{"location":"CHANGELOG/#added_7","title":"Added","text":"
    • Add pixi config command by @chawyehsu in #1339
    • Add pixi list --explicit flag command by @jjjermiah in #1403
    • Add [activation.env] table for environment variables by @ruben-arts in #1156
    • Allow installing multiple envs, including --all at once by @tdejager in #1413
    • Add pixi update command to re-solve the lockfile by @baszalmstra in #1431 (fixes 20 :thumbsup:)
    • Add detached-environments to the config, move environments outside the project folder by @ruben-arts in #1381 (fixes 11 :thumbsup:)
    "},{"location":"CHANGELOG/#changed_11","title":"Changed","text":"
    • Use the gateway to fetch repodata by @baszalmstra in #1307
    • Switch to compressed mapping by @nichmor in #1335
    • Warn on pypi conda clobbering by @nichmor in #1353
    • Align remove arguments with add by @olivier-lacroix in #1406
    • Add backward compat logic for older lock files by @nichmor in #1425
    "},{"location":"CHANGELOG/#documentation_12","title":"Documentation","text":"
    • Fix small screen by removing getting started section. by @ruben-arts in #1393
    • Improve caching docs by @ruben-arts in #1422
    • Add example, python library using gcp upload by @tdejager in #1380
    • Correct typos with --no-lockfile-update. by @tobiasraabe in #1396
    "},{"location":"CHANGELOG/#fixed_15","title":"Fixed","text":"
    • Trim channel url when filter packages_for_prefix_mapping by @zen-xu in #1391
    • Use the right channels when upgrading global packages by @olivier-lacroix in #1326
    • Fish prompt display looks wrong in tide by @tfriedel in #1424
    • Use local mapping instead of remote by @nichmor in #1430
    "},{"location":"CHANGELOG/#refactor_7","title":"Refactor","text":"
    • Remove unused fetch_sparse_repodata by @olivier-lacroix in #1411
    • Remove project level method that are per environment by @olivier-lacroix in #1412
    • Update lockfile functionality for reusability by @baszalmstra in #1426
    "},{"location":"CHANGELOG/#new-contributors_10","title":"New Contributors","text":"
    • @tfriedel made their first contribution in #1424
    • @jjjermiah made their first contribution in #1403
    • @tobiasraabe made their first contribution in #1396

    Full commit history

    "},{"location":"CHANGELOG/#0220-2024-05-13","title":"[0.22.0] - 2024-05-13","text":""},{"location":"CHANGELOG/#highlights_10","title":"\u2728 Highlights","text":"
    • Support for source pypi dependencies through the cli:
      • pixi add --pypi 'package @ package.whl', perfect for adding just-built wheels to your environment in CI.
      • pixi add --pypi 'package_from_git @ git+https://github.com/org/package.git', to add a package from a git repository.
      • pixi add --pypi 'package_from_path @ file:///path/to/package' --editable, to add a package from a local path.
    "},{"location":"CHANGELOG/#added_8","title":"Added","text":"
    • Implement more functions for pixi add --pypi by @wolfv in #1244
    "},{"location":"CHANGELOG/#documentation_13","title":"Documentation","text":"
    • Update install cli doc by @vigneshmanick in #1336
    • Replace empty default example with no-default-feature by @beenje in #1352
    • Document the add & remove cli behaviour with pyproject.toml manifest by @olivier-lacroix in #1338
    • Add environment activation to GitHub actions docs by @pavelzw in #1371
    • Clarify in CLI that run can also take commands by @twrightsman in #1368
    "},{"location":"CHANGELOG/#fixed_16","title":"Fixed","text":"
    • Automated update of install script in pixi.sh by @ruben-arts in #1351
    • Wrong description on pixi project help by @notPlancha in #1358
    • Don't need a python interpreter when not having pypi dependencies. by @ruben-arts in #1366
    • Don't error on not editable not path by @ruben-arts in #1365
    • Align shell-hook cli with shell by @ruben-arts in #1364
    • Only write prefix file if needed by @ruben-arts in #1363
    "},{"location":"CHANGELOG/#refactor_8","title":"Refactor","text":"
    • Lock-file resolve functionality in separated modules by @tdejager in #1337
    • Use generic for RepoDataRecordsByName and PypiRecordsByName by @olivier-lacroix in #1341
    "},{"location":"CHANGELOG/#new-contributors_11","title":"New Contributors","text":"
    • @twrightsman made their first contribution in #1368
    • @notPlancha made their first contribution in #1358
    • @vigneshmanick made their first contribution in #1336

    Full commit history

    "},{"location":"CHANGELOG/#0211-2024-05-07","title":"[0.21.1] - 2024-05-07","text":""},{"location":"CHANGELOG/#fixed_17","title":"Fixed","text":"
    • Use read timeout, not global timeout by @wolfv in #1329
    • Channel priority logic by @ruben-arts in #1332

    Full commit history

    "},{"location":"CHANGELOG/#0210-2024-05-06","title":"[0.21.0] - 2024-05-06","text":""},{"location":"CHANGELOG/#highlights_11","title":"\u2728 Highlights","text":"
    • This release adds support for configuring PyPI settings globally, to use alternative PyPI indexes and load credentials with keyring.
    • We now support cross-platform running, for osx-64 on osx-arm64 and wasm environments.
    • There is now a no-default-feature option to simplify usage of environments (a manifest sketch follows after this list).
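
    A minimal manifest sketch of no-default-feature, assuming an environment named lint with a feature of the same name:

    [environments]\nlint = { features = [\"lint\"], no-default-feature = true }\n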
    "},{"location":"CHANGELOG/#added_9","title":"Added","text":"
    • Add pypi config for global local config file + keyring support by @wolfv in #1279
    • Allow for cross-platform running, for osx-64 on osx-arm64 and wasm environments by @wolfv in #1020
    "},{"location":"CHANGELOG/#changed_12","title":"Changed","text":"
    • Add no-default-feature option to environments by @olivier-lacroix in #1092
    • Add /etc/pixi/config.toml to global configuration search paths by @pavelzw in #1304
    • Change global config fields to kebab-case by @tdejager in #1308
    • Show all available task with task list by @Hoxbro in #1286
    • Allow to emit activation environment variables as JSON by @borchero in #1317
    • Use locked pypi packages as preferences in the pypi solve to get minimally updating lock files by @ruben-arts in #1320
    • Allow to upgrade several global packages at once by @olivier-lacroix in #1324
    "},{"location":"CHANGELOG/#documentation_14","title":"Documentation","text":"
    • Typo in tutorials python by @carschandler in #1297
    • Python Tutorial: Dependencies, PyPI, Order, Grammar by @JesperDramsch in #1313
    "},{"location":"CHANGELOG/#fixed_18","title":"Fixed","text":"
    • Schema version and add it to tbump by @ruben-arts in #1284
    • Make integration test fail in ci and fix ssh issue by @ruben-arts in #1301
    • Automate adding install scripts to the docs by @ruben-arts in #1302
    • Do not always request for prefix mapping by @nichmor in #1300
    • Align CLI aliases and add missing by @ruben-arts in #1316
    • Alias depends_on to depends-on by @ruben-arts in #1310
    • Add error if channel or platform doesn't exist on remove by @ruben-arts in #1315
    • Allow spec in pixi q instead of only name by @ruben-arts in #1314
    • Remove dependency on sysroot for linux by @ruben-arts in #1319
    • Fix linking symlink issue, by updating to the latest rattler by @baszalmstra in #1327
    "},{"location":"CHANGELOG/#refactor_9","title":"Refactor","text":"
    • Use IndexSet instead of Vec for collections of unique elements by @olivier-lacroix in #1289
    • Use generics over PyPiDependencies and CondaDependencies by @olivier-lacroix in #1303
    "},{"location":"CHANGELOG/#new-contributors_12","title":"New Contributors","text":"
    • @borchero made their first contribution in #1317
    • @JesperDramsch made their first contribution in #1313
    • @Hoxbro made their first contribution in #1286
    • @carschandler made their first contribution in #1297

    Full commit history

    "},{"location":"CHANGELOG/#0201-2024-04-26","title":"[0.20.1] - 2024-04-26","text":""},{"location":"CHANGELOG/#highlights_12","title":"\u2728 Highlights","text":"
    • Big improvements to pypi-editable installs.
    "},{"location":"CHANGELOG/#fixed_19","title":"Fixed","text":"
    • Editable non-satisfiable by @baszalmstra in #1251
    • Satisfiability with pypi extras by @baszalmstra in #1253
    • Change global install activation script permission from 0o744 -> 0o755 by @zen-xu in #1250
    • Avoid creating Empty TOML tables by @olivier-lacroix in #1270
    • Uses the special-case uv path handling for both built and source by @tdejager in #1263
    • Modify test before attempting to write to .bash_profile in install.sh by @bruchim-cisco in #1267
    • Parse properly 'default' as environment Cli argument by @olivier-lacroix in #1247
    • Apply schema.json normalization, add to docs by @bollwyvl in #1265
    • Improve absolute path satisfiability by @tdejager in #1252
    • Improve parse deno error and make task a required field in the cli by @ruben-arts in #1260
    "},{"location":"CHANGELOG/#new-contributors_13","title":"New Contributors","text":"
    • @bollwyvl made their first contribution in #1265
    • @bruchim-cisco made their first contribution in #1267
    • @zen-xu made their first contribution in #1250

    Full commit history

    "},{"location":"CHANGELOG/#0200-2024-04-19","title":"[0.20.0] - 2024-04-19","text":""},{"location":"CHANGELOG/#highlights_13","title":"\u2728 Highlights","text":"
    • We now support env variables in the task definition, these can also be used as default values for parameters in your task which you can overwrite with your shell's env variables. e.g. task = { cmd = \"task to run\", env = { VAR=\"value1\", PATH=\"my/path:$PATH\" } }
    • We made a big effort on fixing issues and improving documentation!
    "},{"location":"CHANGELOG/#added_10","title":"Added","text":"
    • Add env to the tasks to specify tasks specific environment variables by @wolfv in https://github.com/prefix-dev/pixi/pull/972
    "},{"location":"CHANGELOG/#changed_13","title":"Changed","text":"
    • Add --pyproject option to pixi init with a pyproject.toml by @olivier-lacroix in #1188
    • Upgrade to new uv version 0.1.32 by @tdejager in #1208
    "},{"location":"CHANGELOG/#documentation_15","title":"Documentation","text":"
    • Document pixi.lock by @ruben-arts in #1209
    • Document channel priority definition by @ruben-arts in #1234
    • Add rust tutorial including openssl example by @ruben-arts in #1155
    • Add python tutorial to documentation by @tdejager in #1179
    • Add JupyterLab integration docs by @renan-r-santos in #1147
    • Add Windows support for PyCharm integration by @pavelzw in #1192
    • Setup_pixi for local pixi installation by @ytausch in #1181
    • Update pypi docs by @Hofer-Julian in #1215
    • Fix order of --no-deps when pip installing in editable mode by @glemaitre in #1220
    • Fix frozen documentation by @ruben-arts in #1167
    "},{"location":"CHANGELOG/#fixed_20","title":"Fixed","text":"
    • Small typo in list cli by @tdejager in #1169
    • Issue with invalid solve group by @baszalmstra in #1190
    • Improve error on parsing lockfile by @ruben-arts in #1180
    • Replace _ with - when creating environments from features by @wolfv in #1203
    • Prevent duplicate direct dependencies in tree by @abkfenris in #1184
    • Use project root directory instead of task.working_directory for base dir when hashing by @wolfv in #1202
    • Do not leak env vars from bat scripts in cmd.exe by @wolfv in #1205
    • Make file globbing behave more as expected by @wolfv in #1204
    • Fix for using file::// in pyproject.toml dependencies by @tdejager in #1196
    • Improve pypi version conversion in pyproject.toml dependencies by @wolfv in #1201
    • Update to the latest rattler by @wolfv in #1235
    "},{"location":"CHANGELOG/#breaking","title":"BREAKING","text":"
    • task = { cmd = \"task to run\", cwd = \"folder\", inputs = \"input.txt\", output = \"output.txt\"} Where input.txt and output.txt were previously relative to folder, they are now relative to the project root. This changed in: #1202
    • task = { cmd = \"task to run\", inputs = \"input.txt\"} previously searched for all input.txt files; it now only searches for the ones in the project root. This changed in: #1204
    "},{"location":"CHANGELOG/#new-contributors_14","title":"New Contributors","text":"
    • @glemaitre made their first contribution in #1220

    Full commit history

    "},{"location":"CHANGELOG/#0191-2024-04-11","title":"[0.19.1] - 2024-04-11","text":""},{"location":"CHANGELOG/#highlights_14","title":"\u2728 Highlights","text":"

    This fixes the issue where pixi would generate broken environments/lockfiles when a mapping for a brand-new version of a package is missing.

    "},{"location":"CHANGELOG/#changed_14","title":"Changed","text":"
    • Add fallback mechanism for missing mapping by @nichmor in #1166

    Full commit history

    "},{"location":"CHANGELOG/#0190-2024-04-10","title":"[0.19.0] - 2024-04-10","text":""},{"location":"CHANGELOG/#highlights_15","title":"\u2728 Highlights","text":"
    • This release adds a new pixi tree command to show the dependency tree of the project.
    • Pixi now persists the manifest and environment when activating a shell, so you can use pixi as if you are in that folder while in the shell.
    "},{"location":"CHANGELOG/#added_11","title":"Added","text":"
    • pixi tree command to show dependency tree by @abkfenris in #1069
    • Persistent shell manifests by @abkfenris in #1080
    • Add to pypi in feature (pixi add --feature test --pypi package) by @ruben-arts in #1135
    • Use new mapping by @nichmor in #888
    • --no-progress to disable all progress bars by @baszalmstra in #1105
    • Create a table if channel is specified (pixi add conda-forge::rattler-build) by @baszalmstra in #1079
    "},{"location":"CHANGELOG/#changed_15","title":"Changed","text":"
    • Add the project itself as an editable dependency by @olivier-lacroix in #1084
    • Get tool.pixi.project.name from project.name by @olivier-lacroix in #1112
    • Create features and environments from extras by @olivier-lacroix in #1077
    • Pypi supports come out of Beta by @olivier-lacroix in #1120
    • Enable to force PIXI_ARCH for pixi installation by @beenje in #1129
    • Improve tool.pixi.project detection logic by @olivier-lacroix in #1127
    • Add purls for packages if adding pypi dependencies by @nichmor in #1148
    • Add env name if not default to tree and list commands by @ruben-arts in #1145
    "},{"location":"CHANGELOG/#documentation_16","title":"Documentation","text":"
    • Add MODFLOW 6 to community docs by @Hofer-Julian in #1125
    • Addition of ros2 tutorial by @ruben-arts in #1116
    • Improve install script docs by @ruben-arts in #1136
    • More structured table of content by @tdejager in #1142
    "},{"location":"CHANGELOG/#fixed_21","title":"Fixed","text":"
    • Amend syntax in conda-meta/history to prevent conda.history.History.parse() error by @jaimergp in #1117
    • Fix docker example and include pyproject.toml by @tdejager in #1121
    "},{"location":"CHANGELOG/#new-contributors_15","title":"New Contributors","text":"
    • @abkfenris made their first contribution in #1069
    • @beenje made their first contribution in #1129
    • @jaimergp made their first contribution in #1117

    Full commit history

    "},{"location":"CHANGELOG/#0180-2024-04-02","title":"[0.18.0] - 2024-04-02","text":""},{"location":"CHANGELOG/#highlights_16","title":"\u2728 Highlights","text":"
    • This release adds support for pyproject.toml; pixi now reads from the [tool.pixi] table.
    • We now support editable PyPI dependencies, and PyPI source dependencies, including git, path, and url dependencies.

    [!TIP] These new features are part of the ongoing effort to make pixi more flexible, powerful, and comfortable for Python users. They are still in progress, so expect more improvements on these features soon. Please report any issues you encounter and follow our next releases!

    "},{"location":"CHANGELOG/#added_12","title":"Added","text":"
    • Support for pyproject.toml by @olivier-lacroix in #999
    • Support for PyPI source dependencies by @tdejager in #985
    • Support for editable PyPI dependencies by @tdejager in #1044
    "},{"location":"CHANGELOG/#changed_16","title":"Changed","text":"
    • XDG_CONFIG_HOME and XDG_CACHE_HOME compliance by @chawyehsu in #1050
    • Build pixi for windows arm by @baszalmstra in #1053
    • Platform literals by @baszalmstra in #1054
    • Cli docs: --user is actually --username
    • Fixed error in auth example (CLI docs) by @ytausch in #1076
    "},{"location":"CHANGELOG/#documentation_17","title":"Documentation","text":"
    • Add lockfile update description in preparation for pixi update by @ruben-arts in #1073
    • zsh may be used for installation on macOS by @pya in #1091
    • Fix typo in pixi auth documentation by @ytausch in #1076
    • Add rstudio to the IDE integration docs by @wolfv in #1144
    "},{"location":"CHANGELOG/#fixed_22","title":"Fixed","text":"
    • Test failure on riscv64 by @hack3ric in #1045
    • Validation test was testing on a wrong pixi.toml by @ruben-arts in #1056
    • Pixi list shows path and editable by @baszalmstra in #1100
    • Docs ci by @ruben-arts in #1074
    • Add error for unsupported pypi dependencies by @baszalmstra in #1052
    • Interactively delete environment when it was relocated by @baszalmstra in #1102
    • Allow solving for different platforms by @baszalmstra in #1101
    • Don't allow extra keys in pypi requirements by @baszalmstra in #1104
    • Solve when moving dependency from conda to pypi by @baszalmstra in #1099
    "},{"location":"CHANGELOG/#new-contributors_16","title":"New Contributors","text":"
    • @pya made their first contribution in #1091
    • @ytausch made their first contribution in #1076
    • @hack3ric made their first contribution in #1045
    • @olivier-lacroix made their first contribution in #999
    • @henryiii made their first contribution in #1063

    Full commit history

    "},{"location":"CHANGELOG/#0171-2024-03-21","title":"[0.17.1] - 2024-03-21","text":""},{"location":"CHANGELOG/#highlights_17","title":"\u2728 Highlights","text":"

    A quick bug-fix release for pixi list.

    "},{"location":"CHANGELOG/#documentation_18","title":"Documentation","text":"
    • Fix typo by @pavelzw in #1028
    "},{"location":"CHANGELOG/#fixed_23","title":"Fixed","text":"
    • Remove the need for a python interpreter in pixi list by @baszalmstra in #1033
    "},{"location":"CHANGELOG/#0170-2024-03-19","title":"[0.17.0] - 2024-03-19","text":""},{"location":"CHANGELOG/#highlights_18","title":"\u2728 Highlights","text":"
    • This release greatly improves pixi global commands, thanks to @chawyehsu!
    • We now support global (or local) configuration for pixi's own behavior, including mirrors, and OCI registries.
    • We support channel mirrors for corporate environments!
    • Faster task execution thanks to caching \ud83d\ude80 Tasks that already executed successfully can be skipped based on the hash of the inputs and outputs.
    • PyCharm and GitHub Actions integration thanks to @pavelzw \u2013 read more about it in the docs!
    "},{"location":"CHANGELOG/#added_13","title":"Added","text":"
    • Add citation file by @ruben-arts in #908
    • Add a pixi badge by @ruben-arts in #961
    • Add deserialization of pypi source dependencies from toml by @ruben-arts and @wolfv in #895 #984
    • Implement mirror and OCI settings by @wolfv in #988
    • Implement inputs and outputs hash based task skipping by @wolfv in #933
    "},{"location":"CHANGELOG/#changed_17","title":"Changed","text":"
    • Refined global upgrade commands by @chawyehsu in #948
    • Global upgrade supports matchspec by @chawyehsu in #962
    • Improve pixi search with platform selection and making limit optional by @wolfv in #979
    • Implement global config options by @wolfv in #960 #1015 #1019
    • Update auth to use rattler cli by @kassoulait by @ruben-arts in #986
    "},{"location":"CHANGELOG/#documentation_19","title":"Documentation","text":"
    • Remove cache: true from setup-pixi by @pavelzw in #950
    • Add GitHub Actions documentation by @pavelzw in #955
    • Add PyCharm documentation by @pavelzw in #974
    • Mention watch_file in direnv usage by @pavelzw in #983
    • Add tip to help users when no PROFILE file exists by @ruben-arts in #991
    • Move yaml comments into mkdocs annotations by @pavelzw in #1003
    • Fix --env and extend actions examples by @ruben-arts in #1005
    • Add Wflow to projects built with pixi by @Hofer-Julian in #1006
    • Removed linenums to avoid buggy visualization by @ruben-arts in #1002
    • Fix typos by @pavelzw in #1016
    "},{"location":"CHANGELOG/#fixed_24","title":"Fixed","text":"
    • Pypi dependencies not being removed by @tdejager in #952
    • Permissions for lint pr by @ruben-arts in #852
    • Install Windows executable with install.sh in Git Bash by @jdblischak in #966
    • Proper scanning of the conda-meta folder for json entries by @wolfv in #971
    • Global shim scripts for Windows by @wolfv in #975
    • Correct fish prompt by @wolfv in #981
    • Prefix_file rename by @ruben-arts in #959
    • Conda transitive dependencies of pypi packages are properly extracted by @baszalmstra in #967
    • Make tests more deterministic and use single * for glob expansion by @wolfv in #987
    • Create conda-meta/history file by @pavelzw in #995
    • Pypi dependency parsing was too lenient by @wolfv in #984
    • Add reactivation of the environment in pixi shell by @wolfv in #982
    • Add tool to strict json schema by @ruben-arts in #969
    "},{"location":"CHANGELOG/#new-contributors_17","title":"New Contributors","text":"
    • @jdblischak made their first contribution in #966
    • @kassoulait made their first contribution in #986

    Full commit history

    "},{"location":"CHANGELOG/#0161-2024-03-11","title":"[0.16.1] - 2024-03-11","text":""},{"location":"CHANGELOG/#fixed_25","title":"Fixed","text":"
    • Parse lockfile matchspecs lenient, fixing bug introduced in 0.16.0 by @ruben-arts in #951

    Full commit history

    "},{"location":"CHANGELOG/#0160-2024-03-09","title":"[0.16.0] - 2024-03-09","text":""},{"location":"CHANGELOG/#highlights_19","title":"\u2728 Highlights","text":"
    • This release removes rip and adds uv as the PyPI resolver and installer.
    "},{"location":"CHANGELOG/#added_14","title":"Added","text":"
    • Add tcsh install support by @obust in #898
    • Add user agent to pixi http client by @baszalmstra in #892
    • Add a schema for the pixi.toml by @ruben-arts in #936
    "},{"location":"CHANGELOG/#changed_18","title":"Changed","text":"
    • Switch from rip to uv by @tdejager in #863
    • Move uv options into context by @tdejager in #911
    • Add Deltares projects to Community.md by @Hofer-Julian in #920
    • Upgrade to uv 0.1.16, updated for changes in the API by @tdejager in #935
    "},{"location":"CHANGELOG/#fixed_26","title":"Fixed","text":"
    • Made the uv re-install logic a bit more clear by @tdejager in #894
    • Avoid duplicate pip dependency while importing environment.yaml by @sumanth-manchala in #890
    • Handle custom channels when importing from env yaml by @sumanth-manchala in #901
    • Pip editable installs getting uninstalled by @renan-r-santos in #902
    • Highlight pypi deps in pixi list by @sumanth-manchala in #907
    • Default to the default environment if possible by @ruben-arts in #921
    • Switching channels by @baszalmstra in #923
    • Use correct name of the channel on adding by @ruben-arts in #928
    • Turn back on jlap for faster repodata fetching by @ruben-arts in #937
    • Remove dists site-packages's when python interpreter changes by @tdejager in #896
    "},{"location":"CHANGELOG/#new-contributors_18","title":"New Contributors","text":"
    • @obust made their first contribution in #898
    • @renan-r-santos made their first contribution in #902

    Full Commit history

    "},{"location":"CHANGELOG/#0152-2024-02-29","title":"[0.15.2] - 2024-02-29","text":""},{"location":"CHANGELOG/#changed_19","title":"Changed","text":"
    • Add more info to a failure of activation by @ruben-arts in #873
    "},{"location":"CHANGELOG/#fixed_27","title":"Fixed","text":"
    • Improve global list UX when there is no global env dir created by @sumanth-manchala in #865
    • Update rattler to v0.19.0 by @AliPiccioniQC in #885
    • Error on pixi run if platform is not supported by @ruben-arts in #878
    "},{"location":"CHANGELOG/#new-contributors_19","title":"New Contributors","text":"
    • @sumanth-manchala made their first contribution in #865
    • @AliPiccioniQC made their first contribution in #885

    Full commit history

    "},{"location":"CHANGELOG/#0151-2024-02-26","title":"[0.15.1] - 2024-02-26","text":""},{"location":"CHANGELOG/#added_15","title":"Added","text":"
    • Add prefix to project info json output by @baszalmstra in #859
    "},{"location":"CHANGELOG/#changed_20","title":"Changed","text":"
    • New pixi global list display format by @chawyehsu in #723
    • Add direnv usage by @pavelzw in #845
    • Add docker example by @pavelzw in #846
    • Install/remove multiple packages globally by @chawyehsu in #854
    "},{"location":"CHANGELOG/#fixed_28","title":"Fixed","text":"
    • Prefix file in init --import by @ruben-arts in #855
    • Environment and feature names in pixi info --json by @baszalmstra in #857

    Full commit history

    "},{"location":"CHANGELOG/#0150-2024-02-23","title":"[0.15.0] - 2024-02-23","text":""},{"location":"CHANGELOG/#highlights_20","title":"\u2728 Highlights","text":"
    • [pypi-dependencies] now get built in the created environment, so they use the conda-installed build tools.
    • pixi init --import env.yml to import an existing conda environment file.
    • [target.unix.dependencies] to specify dependencies for unix systems instead of per platform.

    [!WARNING] This version's build failed, use v0.15.1

    "},{"location":"CHANGELOG/#added_16","title":"Added","text":"
    • pass environment variables during pypi resolution and install (#818)
    • skip micromamba style selector lines and warn about them (#830)
    • add import yml flag (#792)
    • check duplicate dependencies (#717)
    • (ci) check conventional PR title (#820)
    • add --feature to pixi add (#803)
    • add windows, macos, linux and unix to targets (#832)
    "},{"location":"CHANGELOG/#fixed_29","title":"Fixed","text":"
    • cache and retry pypi name mapping (#839)
    • check duplicates while adding dependencies (#829)
    • logic PIXI_NO_PATH_UPDATE variable (#822)
    "},{"location":"CHANGELOG/#other","title":"Other","text":"
    • add mike to the documentation and update looks (#809)
    • add instructions for installing on Alpine Linux (#828)
    • more error reporting in self-update (#823)
    • disabled jlap for now (#836)

    Full commit history

    "},{"location":"CHANGELOG/#0140-2024-02-15","title":"[0.14.0] - 2024-02-15","text":""},{"location":"CHANGELOG/#highlights_21","title":"\u2728 Highlights","text":"

    Now, solve-groups can be used in [environments] to ensure dependency alignment across different environments without simultaneous installation. This feature is particularly beneficial for managing identical dependencies in test and production environments. Example configuration:

    [environments]\ntest = { features = [\"prod\", \"test\"], solve-groups = [\"group1\"] }\nprod = { features = [\"prod\"], solve-groups = [\"group1\"] }\n
    This setup simplifies managing dependencies that must be consistent across test and production.

    "},{"location":"CHANGELOG/#added_17","title":"Added","text":"
    • Add index field to pypi requirements by @vlad-ivanov-name in #784
    • Add -f/--feature to the pixi project platform command by @ruben-arts in #785
    • Warn user when unused features are defined by @ruben-arts in #762
    • Disambiguate tasks interactive by @baszalmstra in #766
    • Solve groups for conda by @baszalmstra in #783
    • Pypi solve groups by @baszalmstra in #802
    • Enable reflinks by @baszalmstra in #729
    "},{"location":"CHANGELOG/#changed_21","title":"Changed","text":"
    • Add environment name to the progress by @ruben-arts in #788
    • Set color scheme by @ruben-arts in #773
    • Update lock on pixi list by @ruben-arts in #775
    • Use default env if task available in it. by @ruben-arts in #772
    • Color environment name in install step by @ruben-arts in #795
    "},{"location":"CHANGELOG/#fixed_30","title":"Fixed","text":"
    • Running cuda env and using those tasks. by @ruben-arts in #764
    • Make svg a gif by @ruben-arts in #782
    • Fmt by @ruben-arts
    • Check for correct platform in task env creation by @ruben-arts in #759
    • Remove using source name by @ruben-arts in #765
    • Auto-guessing of the shell in the shell-hook by @ruben-arts in https://github.com/prefix-dev/pixi/pull/811
    • sdist with direct references by @nichmor in https://github.com/prefix-dev/pixi/pull/813
    "},{"location":"CHANGELOG/#miscellaneous","title":"Miscellaneous","text":"
    • Add slim-trees to community projects by @pavelzw in #760
    • Add test to default env in polarify example
    • Add multiple machine example by @ruben-arts in #757
    • Add more documentation on environments by @ruben-arts in #790
    • Update rip and rattler by @wolfv in #798
    • Rattler 0.18.0 by @baszalmstra in #805
    • Rip 0.8.0 by @nichmor in #806
    • Fix authentication path by @pavelzw in #796
    • Initial addition of integration test by @ruben-arts in https://github.com/prefix-dev/pixi/pull/804
    "},{"location":"CHANGELOG/#new-contributors_20","title":"New Contributors","text":"
    • @vlad-ivanov-name made their first contribution in #784
    • @nichmor made their first contribution in #806

    Full commit history

    "},{"location":"CHANGELOG/#0130-2024-02-01","title":"[0.13.0] - 2024-02-01","text":""},{"location":"CHANGELOG/#highlights_22","title":"\u2728 Highlights","text":"

    This release is pretty crazy in the number of features! The major ones are:
    • We added support for multiple environments. :tada: Check out the documentation
    • We added support for sdist installation, which greatly improves the number of packages that can be installed from PyPI. :rocket:

    [!IMPORTANT]

    Renaming of PIXI_PACKAGE_* variables:

    PIXI_PACKAGE_ROOT -> PIXI_PROJECT_ROOT
    PIXI_PACKAGE_NAME -> PIXI_PROJECT_NAME
    PIXI_PACKAGE_MANIFEST -> PIXI_PROJECT_MANIFEST
    PIXI_PACKAGE_VERSION -> PIXI_PROJECT_VERSION
    PIXI_PACKAGE_PLATFORMS -> PIXI_ENVIRONMENT_PLATFORMS
    Check documentation here: https://pixi.sh/environment/
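    For instance, a task that previously echoed PIXI_PACKAGE_ROOT now needs the renamed variable. A minimal, hypothetical pixi.toml sketch (the task name and command are illustrative and assume a POSIX-style shell):

    [tasks]
    # Prints the project root; uses PIXI_PROJECT_ROOT (formerly PIXI_PACKAGE_ROOT)
    show-root = "echo $PIXI_PROJECT_ROOT"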

    [!IMPORTANT]

    The .pixi/env/ folder has been moved to accommodate multiple environments. If you only have one environment it is now named .pixi/envs/default.

    "},{"location":"CHANGELOG/#added_18","title":"Added","text":"
    • Add support for multiple environments:
      • Update to rattler lock v4 by @baszalmstra in #698
      • Multi-env installation and usage by @baszalmstra in #721
      • Update all environments in the lock-file when requesting an environment by @baszalmstra in #711
      • Run tasks in the env they are defined by @baszalmstra in #731
      • polarify use-case as an example by @ruben-arts in #735
      • Make environment name parsing strict by @ruben-arts in #673
      • Use named environments (only \"default\" for now) by @ruben-arts in #674
      • Use task graph instead of traversal by @baszalmstra in #725
      • Multi env documentation by @ruben-arts in #703
      • pixi info -e/--environment option by @ruben-arts in #676
      • pixi channel add -f/--feature option by @ruben-arts in #700
      • pixi channel remove -f/--feature option by @ruben-arts in #706
      • pixi remove -f/--feature option by @ruben-arts in #680
      • pixi task list -e/--environment option by @ruben-arts in #694
      • pixi task remove -f/--feature option by @ruben-arts in #694
      • pixi install -e/--environment option by @ruben-arts in #722
    • Support for sdists in pypi-dependencies by @tdejager in #664
    • Add pre-release support to pypi-dependencies by @tdejager in #716
    • Support adding dependencies for project's unsupported platforms by @orhun in #668
    • Add pixi list command by @hadim in #665
    • Add pixi shell-hook command by @orhun in #672#679 #684
    • Use env variable to configure locked, frozen and color by @hadim in #726
    • pixi self-update by @hadim in #675
    • Add PIXI_NO_PATH_UPDATE for PATH update suppression by @chawyehsu in #692
    • Set the cache directory by @ruben-arts in #683
    "},{"location":"CHANGELOG/#changed_22","title":"Changed","text":"
    • Use consistent naming for tests module by @orhun in #678
    • Install pixi and add to the path in docker example by @ruben-arts in #743
    • Simplify the deserializer of PyPiRequirement by @orhun in #744
    • Use tabwriter instead of comfy_table by @baszalmstra in #745
    • Document environment variables by @ruben-arts in #746
    "},{"location":"CHANGELOG/#fixed_31","title":"Fixed","text":"
    • Quote part of the task that has brackets ([ or ]) by @JafarAbdi in #677
    • Package clobber and __pycache__ removal issues by @wolfv in #573
    • Non-global reqwest client by @tdejager in #693
    • Fix broken pipe error during search by @orhun in #699
    • Make pixi search result correct by @chawyehsu in #713
    • Allow the tasks for all platforms to be shown in pixi info by @ruben-arts in #728
    • Flaky tests while installing pypi dependencies by @baszalmstra in #732
    • Linux install script by @mariusvniekerk in #737
    • Download wheels in parallel to avoid deadlock by @baszalmstra in #752
    "},{"location":"CHANGELOG/#new-contributors_21","title":"New Contributors","text":"
    • @JafarAbdi made their first contribution in #677
    • @mariusvniekerk made their first contribution in #737

    Full commit history

    "},{"location":"CHANGELOG/#0120-2024-01-15","title":"[0.12.0] - 2024-01-15","text":""},{"location":"CHANGELOG/#highlights_23","title":"\u2728 Highlights","text":"
    • Some great community contributions: pixi global upgrade, pixi project version commands, and a PIXI_HOME variable.
    • A ton of refactor work to prepare for the multi-environment feature.
      • Note that no extra environments are created yet, but you can already specify them in the pixi.toml file.
      • Next we'll build the actual environments.
    "},{"location":"CHANGELOG/#added_19","title":"Added","text":"
    • Add global upgrade command to pixi by @trueleo in #614
    • Add configurable PIXI_HOME by @chawyehsu in #627
    • Add --pypi option to pixi remove by @marcelotrevisani in https://github.com/prefix-dev/pixi/pull/602
    • PrioritizedChannels to specify channel priority by @ruben-arts in https://github.com/prefix-dev/pixi/pull/658
    • Add project version {major,minor,patch} CLIs by @hadim in https://github.com/prefix-dev/pixi/pull/633
    "},{"location":"CHANGELOG/#changed_23","title":"Changed","text":"
    • Refactored project model using targets, features and environments by @baszalmstra in https://github.com/prefix-dev/pixi/pull/616
    • Move code from Project to Environment by @baszalmstra in #630
    • Refactored system-requirements from Environment by @baszalmstra in #632
    • Extract activation.scripts into Environment by @baszalmstra in #659
    • Extract pypi-dependencies from Environment by @baszalmstra in https://github.com/prefix-dev/pixi/pull/656
    • De-serialization of features and environments by @ruben-arts in https://github.com/prefix-dev/pixi/pull/636
    "},{"location":"CHANGELOG/#fixed_32","title":"Fixed","text":"
    • Make install.sh also work with wget if curl is not available by @wolfv in #644
    • Use source build for rattler by @ruben-arts
    • Check for pypi-dependencies before amending the pypi purls by @ruben-arts in #661
    • Don't allow the use of reflinks by @ruben-arts in #662
    "},{"location":"CHANGELOG/#removed_1","title":"Removed","text":"
    • Remove windows and unix system requirements by @baszalmstra in #635
    "},{"location":"CHANGELOG/#documentation_20","title":"Documentation","text":"
    • Document the channel logic by @ruben-arts in https://github.com/prefix-dev/pixi/pull/610
    • Update the instructions for installing on Arch Linux by @orhun in https://github.com/prefix-dev/pixi/pull/653
    • Update Community.md by @KarelZe in https://github.com/prefix-dev/pixi/pull/654
    • Replace contributions.md with contributing.md and make it more standardized by @ruben-arts in https://github.com/prefix-dev/pixi/pull/649
    • Remove windows and unix system requirements by @baszalmstra in https://github.com/prefix-dev/pixi/pull/635
    • Add CODE_OF_CONDUCT.md by @ruben-arts in https://github.com/prefix-dev/pixi/pull/648
    • Removed remaining .ps1 references by @bahugo in https://github.com/prefix-dev/pixi/pull/643
    "},{"location":"CHANGELOG/#new-contributors_22","title":"New Contributors","text":"
    • @marcelotrevisani made their first contribution in https://github.com/prefix-dev/pixi/pull/602
    • @trueleo made their first contribution in https://github.com/prefix-dev/pixi/pull/614
    • @bahugo made their first contribution in https://github.com/prefix-dev/pixi/pull/643
    • @KarelZe made their first contribution in https://github.com/prefix-dev/pixi/pull/654

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.11.0...v0.12.0

    "},{"location":"CHANGELOG/#0111-2024-01-06","title":"[0.11.1] - 2024-01-06","text":""},{"location":"CHANGELOG/#fixed_33","title":"Fixed","text":"
    • Upgrading rattler to fix pixi auth in #642
    "},{"location":"CHANGELOG/#0110-2024-01-05","title":"[0.11.0] - 2024-01-05","text":""},{"location":"CHANGELOG/#highlights_24","title":"\u2728 Highlights","text":"
    • Lots of important fixes and preparations for the PyPI sdist and multi-environment features
    • Lots of new contributors helping pixi improve!
    "},{"location":"CHANGELOG/#added_20","title":"Added","text":"
    • Add new commands for pixi project {version|channel|platform|description} by @hadim in #579
    • Add dependabot.yml by @pavelzw in #606
    "},{"location":"CHANGELOG/#changed_24","title":"Changed","text":"
    • winget-releaser gets correct identifier by @ruben-arts in #561
    • Task run code by @baszalmstra in #556
    • No ps1 in activation scripts by @ruben-arts in #563
    • Changed some names for clarity by @tdejager in #568
    • Change font and make it dark mode by @ruben-arts in #576
    • Moved pypi installation into its own module by @tdejager in #589
    • Move alpha to beta feature and toggle it off with env var by @ruben-arts in #604
    • Improve UX activation scripts by @ruben-arts in #560
    • Add sanity check by @tdejager in #569
    • Refactor manifest by @ruben-arts in #572
    • Improve search by @Johnwillliam in #578
    • Split pypi and conda solve steps by @tdejager in #601
    "},{"location":"CHANGELOG/#fixed_34","title":"Fixed","text":"
    • Save file after lockfile is correctly updated by @ruben-arts in #555
    • Limit the number of concurrent solves by @baszalmstra in #571
    • Use project virtual packages in add command by @msegado in #609
    • Improved mapped dependency by @ruben-arts in #574
    "},{"location":"CHANGELOG/#documentation_21","title":"Documentation","text":"
    • Change font and make it dark mode by @ruben-arts in #576
    • typo: no ps1 in activation scripts by @ruben-arts in #563
    • Document adding CUDA to system-requirements by @ruben-arts in #595
    • Multi env proposal documentation by @ruben-arts in #584
    • Fix multiple typos in configuration.md by @SeaOtocinclus in #608
    • Add multiple machines from one project example by @pavelzw in #605
    "},{"location":"CHANGELOG/#new-contributors_23","title":"New Contributors","text":"
    • @hadim made their first contribution in #579
    • @msegado made their first contribution in #609
    • @Johnwillliam made their first contribution in #578
    • @SeaOtocinclus made their first contribution in #608

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.10.0...v0.11.0

    "},{"location":"CHANGELOG/#0100-2023-12-8","title":"[0.10.0] - 2023-12-8","text":""},{"location":"CHANGELOG/#highlights_25","title":"Highlights","text":"
    • Better pypi-dependencies support: even more PyPI packages can now be installed.
    • pixi add --pypi command to add a PyPI package to your project.
    "},{"location":"CHANGELOG/#added_21","title":"Added","text":"
    • Use range (>=1.2.3, <1.3) when adding requirement, instead of 1.2.3.* by @baszalmstra in https://github.com/prefix-dev/pixi/pull/536
    • Update rip, which brings the following fixes, by @tdejager in https://github.com/prefix-dev/pixi/pull/543:
      • Better Bytecode compilation (.pyc) support by @baszalmstra
      • Recognize .data directory headers by @baszalmstra
    • Also print arguments given to a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/545
    • Add pixi add --pypi command by @ruben-arts in https://github.com/prefix-dev/pixi/pull/539
    "},{"location":"CHANGELOG/#fixed_35","title":"Fixed","text":"
    • space in global install path by @ruben-arts in https://github.com/prefix-dev/pixi/pull/513
    • Glibc version/family parsing by @baszalmstra in https://github.com/prefix-dev/pixi/pull/535
    • Use build and host specs while getting the best version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/538
    "},{"location":"CHANGELOG/#miscellaneous_1","title":"Miscellaneous","text":"
    • docs: add update manual by @ruben-arts in https://github.com/prefix-dev/pixi/pull/521
    • add lightgbm demo by @partrita in https://github.com/prefix-dev/pixi/pull/492
    • Update documentation link by @williamjamir in https://github.com/prefix-dev/pixi/pull/525
    • Update Community.md by @jiaxiyang in https://github.com/prefix-dev/pixi/pull/527
    • Add winget releaser by @ruben-arts in https://github.com/prefix-dev/pixi/pull/547
    • Custom rerun-sdk example, force driven graph of pixi.lock by @ruben-arts in https://github.com/prefix-dev/pixi/pull/548
    • Better document pypi part by @ruben-arts in https://github.com/prefix-dev/pixi/pull/546
    "},{"location":"CHANGELOG/#new-contributors_24","title":"New Contributors","text":"
    • @partrita made their first contribution in https://github.com/prefix-dev/pixi/pull/492
    • @williamjamir made their first contribution in https://github.com/prefix-dev/pixi/pull/525
    • @jiaxiyang made their first contribution in https://github.com/prefix-dev/pixi/pull/527

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.1...v0.10.0

    "},{"location":"CHANGELOG/#091-2023-11-29","title":"[0.9.1] - 2023-11-29","text":""},{"location":"CHANGELOG/#highlights_26","title":"Highlights","text":"
    • Scripts from PyPI packages are now fixed. For example: https://github.com/prefix-dev/pixi/issues/516
    "},{"location":"CHANGELOG/#fixed_36","title":"Fixed","text":"
    • Remove attr (unused) and update all dependencies by @wolfv in https://github.com/prefix-dev/pixi/pull/510
    • Remove empty folders on python uninstall by @baszalmstra in https://github.com/prefix-dev/pixi/pull/512
    • Bump rip to add scripts by @baszalmstra in https://github.com/prefix-dev/pixi/pull/517

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.9.0...v0.9.1

    "},{"location":"CHANGELOG/#090-2023-11-28","title":"[0.9.0] - 2023-11-28","text":""},{"location":"CHANGELOG/#highlights_27","title":"Highlights","text":"
    • You can now run pixi remove (or pixi rm) to remove a package from the environment
    • Fix the pip install -e issue introduced by release v0.8.0: https://github.com/prefix-dev/pixi/issues/507
    "},{"location":"CHANGELOG/#added_22","title":"Added","text":"
    • pixi remove command by @Wackyator in https://github.com/prefix-dev/pixi/pull/483
    "},{"location":"CHANGELOG/#fixed_37","title":"Fixed","text":"
    • Install entrypoints for [pypi-dependencies] by @baszalmstra in https://github.com/prefix-dev/pixi/pull/508
    • Only uninstall pixi installed packages by @baszalmstra in https://github.com/prefix-dev/pixi/pull/509

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.8.0...v0.9.0

    "},{"location":"CHANGELOG/#080-2023-11-27","title":"[0.8.0] - 2023-11-27","text":""},{"location":"CHANGELOG/#highlights_28","title":"Highlights","text":"
    • 🎉🐍 [pypi-dependencies] ALPHA RELEASE 🐍🎉: you can now add PyPI dependencies to your pixi project.
    • UX of pixi run has been improved with better errors and showing what task is run.

    [!NOTE] [pypi-dependencies] support is still incomplete; missing functionality is listed here: https://github.com/orgs/prefix-dev/projects/6. Our intent is not to reach 100% feature parity with pip; our goal is that pixi is all you need for both conda and PyPI packages.

    "},{"location":"CHANGELOG/#added_23","title":"Added","text":"
    • Bump rattler by @ruben-arts in https://github.com/prefix-dev/pixi/pull/496
    • Implement lock-file satisfiability with pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/494
    • List pixi tasks when command not found is returned by @ruben-arts in https://github.com/prefix-dev/pixi/pull/488
    • Show which command is run as a pixi task by @ruben-arts in https://github.com/prefix-dev/pixi/pull/491 && https://github.com/prefix-dev/pixi/pull/493
    • Add progress info to conda install by @baszalmstra in https://github.com/prefix-dev/pixi/pull/470
    • Install pypi dependencies (alpha) by @baszalmstra in https://github.com/prefix-dev/pixi/pull/452
    "},{"location":"CHANGELOG/#fixed_38","title":"Fixed","text":"
    • Add install scripts to pixi.sh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/458 && https://github.com/prefix-dev/pixi/pull/459 && https://github.com/prefix-dev/pixi/pull/460
    • Fix RECORD not found issue by @baszalmstra in https://github.com/prefix-dev/pixi/pull/495
    • Actually add to the .gitignore and give better errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/490
    • Support macOS for pypi-dependencies by @baszalmstra in https://github.com/prefix-dev/pixi/pull/478
    • Custom pypi-dependencies type by @ruben-arts in https://github.com/prefix-dev/pixi/pull/471
    • pypi-dependencies parsing errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/479
    • Progress issues by @baszalmstra in https://github.com/prefix-dev/pixi/pull/4
    "},{"location":"CHANGELOG/#miscellaneous_2","title":"Miscellaneous","text":"
    • Example: ctypes by @liquidcarbon in https://github.com/prefix-dev/pixi/pull/441
    • Mention the AUR package by @orhun in https://github.com/prefix-dev/pixi/pull/464
    • Update rerun example by @ruben-arts in https://github.com/prefix-dev/pixi/pull/489
    • Document pypi-dependencies by @ruben-arts in https://github.com/prefix-dev/pixi/pull/481
    • Ignore docs paths on rust workflow by @ruben-arts in https://github.com/prefix-dev/pixi/pull/482
    • Fix flaky tests, run serially by @baszalmstra in https://github.com/prefix-dev/pixi/pull/477
    "},{"location":"CHANGELOG/#new-contributors_25","title":"New Contributors","text":"
    • @liquidcarbon made their first contribution in https://github.com/prefix-dev/pixi/pull/441
    • @orhun made their first contribution in https://github.com/prefix-dev/pixi/pull/464

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.7.0...v0.8.0

    "},{"location":"CHANGELOG/#070-2023-11-14","title":"[0.7.0] - 2023-11-14","text":""},{"location":"CHANGELOG/#highlights_29","title":"Highlights","text":"
    • Channel priority: with channels = ["conda-forge", "pytorch"], packages found in conda-forge will not be taken from pytorch (see the sketch after this list).
    • Channel specific dependencies: pytorch = { version = "*", channel = "pytorch" }
    • Autocompletion on pixi run <TABTAB>
    • Moved all pixi documentation into this repo, try it with pixi run docs!
    • Lots of new contributors!
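    A minimal pixi.toml sketch combining both highlights above (the project name and platform are illustrative):

    [project]
    name = "my-project"
    channels = ["conda-forge", "pytorch"]   # conda-forge has priority over pytorch
    platforms = ["linux-64"]

    [dependencies]
    # Channel specific dependency: always pulled from the pytorch channel
    pytorch = { version = "*", channel = "pytorch" }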
    "},{"location":"CHANGELOG/#added_24","title":"Added","text":"
    • Bump rattler to its newest version by @ruben-arts in https://github.com/prefix-dev/pixi/pull/395. Some notable changes:
      • Add channel priority (if a package is found in the first listed channel it will not be looked for in the other channels).
      • Fix JLAP using wrong hash.
      • Lockfile forward compatibility error.
    • Add nushell support by @wolfv in https://github.com/prefix-dev/pixi/pull/360
    • Autocomplete tasks on pixi run for bash and zsh by @ruben-arts in https://github.com/prefix-dev/pixi/pull/390
    • Add prefix location file to avoid copy error by @ruben-arts in https://github.com/prefix-dev/pixi/pull/422
    • Channel specific dependencies python = { version = "*", channel = "conda-forge" } by @ruben-arts in https://github.com/prefix-dev/pixi/pull/439
    "},{"location":"CHANGELOG/#changed_25","title":"Changed","text":"
    • project.version as optional field in the pixi.toml by @ruben-arts in https://github.com/prefix-dev/pixi/pull/400
    "},{"location":"CHANGELOG/#fixed_39","title":"Fixed","text":"
    • Deny unknown fields in pixi.toml to help users find errors by @ruben-arts in https://github.com/prefix-dev/pixi/pull/396
    • install.sh to create dot file if not present by @humphd in https://github.com/prefix-dev/pixi/pull/408
    • Ensure order of repodata fetches by @baszalmstra in https://github.com/prefix-dev/pixi/pull/405
    • Strip Linux binaries by @baszalmstra in https://github.com/prefix-dev/pixi/pull/414
    • Sort task list by @ruben-arts in https://github.com/prefix-dev/pixi/pull/431
    • Fix global install path on windows by @ruben-arts in https://github.com/prefix-dev/pixi/pull/449
    • Let PIXI_BIN_PATH use backslashes by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/442
    • Print more informative error if created file is empty by @traversaro in https://github.com/prefix-dev/pixi/pull/447
    "},{"location":"CHANGELOG/#docs","title":"Docs","text":"
    • Move to mkdocs with all documentation by @ruben-arts in https://github.com/prefix-dev/pixi/pull/435
    • Fix typing errors by @FarukhS52 in https://github.com/prefix-dev/pixi/pull/426
    • Add social cards to the pages by @ruben-arts in https://github.com/prefix-dev/pixi/pull/445
    • Enhance README.md: Added Table of Contents, Grammar Improvements by @adarsh-jha-dev in https://github.com/prefix-dev/pixi/pull/421
    • Adding conda-auth to community examples by @travishathaway in https://github.com/prefix-dev/pixi/pull/433
    • Minor grammar correction by @tylere in https://github.com/prefix-dev/pixi/pull/406
    • Make capitalization of tab titles consistent by @tylere in https://github.com/prefix-dev/pixi/pull/407
    "},{"location":"CHANGELOG/#new-contributors_26","title":"New Contributors","text":"
    • @tylere made their first contribution in https://github.com/prefix-dev/pixi/pull/406
    • @humphd made their first contribution in https://github.com/prefix-dev/pixi/pull/408
    • @adarsh-jha-dev made their first contribution in https://github.com/prefix-dev/pixi/pull/421
    • @FarukhS52 made their first contribution in https://github.com/prefix-dev/pixi/pull/426
    • @travishathaway made their first contribution in https://github.com/prefix-dev/pixi/pull/433
    • @traversaro made their first contribution in https://github.com/prefix-dev/pixi/pull/447

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.6.0...v0.7.0

    "},{"location":"CHANGELOG/#060-2023-10-17","title":"[0.6.0] - 2023-10-17","text":""},{"location":"CHANGELOG/#highlights_30","title":"Highlights","text":"

    This release fixes some bugs and adds the --cwd option to the tasks.

    "},{"location":"CHANGELOG/#fixed_40","title":"Fixed","text":"
    • Improve shell prompts by @ruben-arts in https://github.com/prefix-dev/pixi/pull/385 https://github.com/prefix-dev/pixi/pull/388
    • Change --frozen logic to error when there is no lockfile by @ruben-arts in https://github.com/prefix-dev/pixi/pull/373
    • Don't remove the '.11' from 'python3.11' binary file name by @ruben-arts in https://github.com/prefix-dev/pixi/pull/366
    "},{"location":"CHANGELOG/#changed_26","title":"Changed","text":"
    • Update rerun example to v0.9.1 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/389
    "},{"location":"CHANGELOG/#added_25","title":"Added","text":"
    • Add the current working directory (--cwd) in pixi tasks by @ruben-arts in https://github.com/prefix-dev/pixi/pull/380
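    As a hypothetical illustration, assuming the --cwd flag corresponds to a cwd field on a task in pixi.toml (the task name, command and directory are made up):

    [tasks]
    # Run the docs build from the docs/ subdirectory instead of the project root
    build-docs = { cmd = "mkdocs build", cwd = "docs" }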

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.5.0...v0.6.0

    "},{"location":"CHANGELOG/#050-2023-10-03","title":"[0.5.0] - 2023-10-03","text":""},{"location":"CHANGELOG/#highlights_31","title":"Highlights","text":"

    We rebuilt pixi shell, fixing the issue where your rc file would overrule the environment activation.

    "},{"location":"CHANGELOG/#fixed_41","title":"Fixed","text":"
    • Change how shell works and make activation more robust by @wolfv in https://github.com/prefix-dev/pixi/pull/316
    • Documentation: use quotes in cli by @pavelzw in https://github.com/prefix-dev/pixi/pull/367
    "},{"location":"CHANGELOG/#added_26","title":"Added","text":"
    • Create or append to the .gitignore and .gitattributes files by @ruben-arts in https://github.com/prefix-dev/pixi/pull/359
    • Add --locked and --frozen to getting an up-to-date prefix by @ruben-arts in https://github.com/prefix-dev/pixi/pull/363
    • Documentation: improvement/update by @ruben-arts in https://github.com/prefix-dev/pixi/pull/355
    • Example: how to build a docker image using pixi by @ruben-arts in https://github.com/prefix-dev/pixi/pull/353 & https://github.com/prefix-dev/pixi/pull/365
    • Update to the newest rattler by @baszalmstra in https://github.com/prefix-dev/pixi/pull/361
    • Periodic cargo upgrade --all --incompatible by @wolfv in https://github.com/prefix-dev/pixi/pull/358

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.4.0...v0.5.0

    "},{"location":"CHANGELOG/#040-2023-09-22","title":"[0.4.0] - 2023-09-22","text":""},{"location":"CHANGELOG/#highlights_32","title":"Highlights","text":"

    This release adds the start of a new CLI command, pixi project, which will allow users to interact with the project configuration from the command line.

    "},{"location":"CHANGELOG/#fixed_42","title":"Fixed","text":"
    • Align with latest rattler version 0.9.0 by @ruben-arts in https://github.com/prefix-dev/pixi/pull/350
    "},{"location":"CHANGELOG/#added_27","title":"Added","text":"
    • Add codespell (config, workflow) to catch typos + catch and fix some of those by @yarikoptic in https://github.com/prefix-dev/pixi/pull/329
    • remove atty and use stdlib by @wolfv in https://github.com/prefix-dev/pixi/pull/337
    • xtsci-dist to Community.md by @HaoZeke in https://github.com/prefix-dev/pixi/pull/339
    • ribasim to Community.md by @Hofer-Julian in https://github.com/prefix-dev/pixi/pull/340
    • LFortran to Community.md by @wolfv in https://github.com/prefix-dev/pixi/pull/341
    • Give tip to resolve virtual package issue by @ruben-arts in https://github.com/prefix-dev/pixi/pull/348
    • pixi project channel add subcommand by @baszalmstra and @ruben-arts in https://github.com/prefix-dev/pixi/pull/347
    "},{"location":"CHANGELOG/#new-contributors_27","title":"New Contributors","text":"
    • @yarikoptic made their first contribution in https://github.com/prefix-dev/pixi/pull/329
    • @HaoZeke made their first contribution in https://github.com/prefix-dev/pixi/pull/339

    Full Changelog: https://github.com/prefix-dev/pixi/compare/v0.3.0...v0.4.0

    "},{"location":"CHANGELOG/#030-2023-09-11","title":"[0.3.0] - 2023-09-11","text":""},{"location":"CHANGELOG/#highlights_33","title":"Highlights","text":"

    This release fixes a lot of issues encountered by the community and includes some awesome community contributions, like the addition of pixi global list and pixi global remove.

    "},{"location":"CHANGELOG/#fixed_43","title":"Fixed","text":"
    • Properly detect CUDA on Linux using our build binaries, by @baszalmstra (#290)
    • Package names are now case-insensitive, by @baszalmstra (#285)
    • Issue with starts-with and compatibility operator, by @tdejager (#296)
    • Lock files are now consistently sorted, by @baszalmstra (#295 & #307)
    • Improved xonsh detection and powershell env-var escaping, by @wolfv (#307)
    • system-requirements are properly filtered by platform, by @ruben-arts (#299)
    • Powershell completion install script, by @chawyehsu (#325)
    • Simplified and improved shell quoting, by @baszalmstra (#313)
    • Issue where platform specific subdirs were required, by @baszalmstra (#333)
    • thread 'tokio-runtime-worker' has overflowed its stack issue, by @baszalmstra (#28)
    "},{"location":"CHANGELOG/#added_28","title":"Added","text":"
    • Certificates from the OS certificate store are now used, by @baszalmstra (#310)
    • pixi global list and pixi global remove commands, by @cjfuller (#318)
    "},{"location":"CHANGELOG/#changed_27","title":"Changed","text":"
    • --manifest-path must point to a pixi.toml file, by @baszalmstra (#324)
    "},{"location":"CHANGELOG/#020-2023-08-22","title":"[0.2.0] - 2023-08-22","text":""},{"location":"CHANGELOG/#highlights_34","title":"Highlights","text":"
    • Added pixi search command to search for packages, by @Wackyator. (#244)
    • Added target specific tasks, e.g. [target.win-64.tasks], by @ruben-arts. (#269) See the sketch after this list.
    • Flaky install caused by the download of packages, by @baszalmstra. (#281)
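    A hypothetical sketch of a target specific task that overrides the default task on Windows (the task names and commands are illustrative):

    [tasks]
    # Default task for all platforms
    build = "cargo build"

    [target.win-64.tasks]
    # Override used when running on win-64
    build = "cargo build --features windows"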
    "},{"location":"CHANGELOG/#fixed_44","title":"Fixed","text":"
    • Install instructions, by @baszalmstra. (#258)
    • Typo in getting started, by @RaulPL. (#266)
    • Don't execute alias tasks, by @baszalmstra. (#274)
    "},{"location":"CHANGELOG/#added_29","title":"Added","text":"
    • Rerun example, by @ruben-arts. (#236)
    • Reduction of pixi's binary size, by @baszalmstra (#256)
    • Updated pixi banner, including webp file for faster loading, by @baszalmstra. (#257)
    • Set linguist attributes for pixi.lock automatically, by @spenserblack. (#265)
    • Contribution manual for pixi, by @ruben-arts. (#268)
    • GitHub issue templates, by @ruben-arts. (#271)
    • Links to prefix.dev in readme, by @tdejager. (#279)
    "},{"location":"CHANGELOG/#010-2023-08-11","title":"[0.1.0] - 2023-08-11","text":"

    As this is our first Semantic Versioning release, we move from the prototype phase to the development phase, as semver describes. A 0.x release can be anything from a new major feature to a breaking change, while the 0.0.x releases will be bugfixes or small improvements.

    "},{"location":"CHANGELOG/#highlights_35","title":"Highlights","text":"
    • Update to the latest rattler version, by @baszalmstra. (#249)
    "},{"location":"CHANGELOG/#fixed_45","title":"Fixed","text":"
    • Only add shebang to activation scripts on unix platforms, by @baszalmstra. (#250)
    • Use official crates.io releases for all dependencies, by @baszalmstra. (#252)
    "},{"location":"CHANGELOG/#008-2023-08-01","title":"[0.0.8] - 2023-08-01","text":""},{"location":"CHANGELOG/#highlights_36","title":"Highlights","text":"
    • Much better error printing using miette, by @baszalmstra. (#211)
    • You can now use pixi on aarch64-linux, by @pavelzw. (#233)
    • Use the Rust port of libsolv as the default solver, by @ruben-arts. (#209)
    "},{"location":"CHANGELOG/#added_30","title":"Added","text":"
    • Add mention to condax in the docs, by @maresb. (#207)
    • Add brew installation instructions, by @wolfv. (#208)
    • Add activation.scripts to the pixi.toml to configure environment activation, by @ruben-arts. (#217) See the sketch after this list.
    • Add pixi upload command to upload packages to prefix.dev, by @wolfv. (#127)
    • Add more metadata fields to the pixi.toml, by @wolfv. (#218)
    • Add pixi task list to show all tasks in the project, by @tdejager. (#228)
    • Add --color to configure the colors in the output, by @baszalmstra. (#243)
    • Examples, ROS2 Nav2, JupyterLab and QGIS, by @ruben-arts.
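    A hypothetical sketch of what such an activation configuration could look like in pixi.toml (the script path is made up, and the exact table layout is an assumption based on the activation.scripts name):

    [activation]
    # Scripts sourced when the environment is activated
    scripts = ["scripts/setup_env.sh"]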
    "},{"location":"CHANGELOG/#fixed_46","title":"Fixed","text":"
    • Add trailing newline to pixi.toml and .gitignore, by @pavelzw. (#216)
    • Deny unknown fields and rename license-file in pixi.toml, by @wolfv. (#220)
    • Overwrite PS1 variable when going into a pixi shell, by @ruben-arts. (#201)
    "},{"location":"CHANGELOG/#changed_28","title":"Changed","text":"
    • Install environment when adding a dependency using pixi add, by @baszalmstra. (#213)
    • Improve and speedup CI, by @baszalmstra. (#241)
    "},{"location":"CHANGELOG/#007-2023-07-11","title":"[0.0.7] - 2023-07-11","text":""},{"location":"CHANGELOG/#highlights_37","title":"Highlights","text":"
    • Transitioned the run subcommand to use the deno_task_shell for improved cross-platform functionality. More details in the Deno Task Runner documentation.
    • Added an info subcommand to retrieve system-specific information understood by pixi.
    "},{"location":"CHANGELOG/#breaking-changes","title":"BREAKING CHANGES","text":"
    • [commands] in the pixi.toml is now called [tasks]. (#177)
    "},{"location":"CHANGELOG/#added_31","title":"Added","text":"
    • The pixi info command to get more system information by @wolfv in (#158)
    • Documentation on how to use the cli by @ruben-arts in (#160)
    • Use the deno_task_shell to execute commands in pixi run by @baszalmstra in (#173)
    • Use new solver backend from rattler by @baszalmstra in (#178)
    • The pixi command command to the cli by @tdejager in (#177)
    • Documentation on how to use the pixi auth command by @wolfv in (#183)
    • Use the newest rattler 0.6.0 by @baszalmstra in (#185)
    • Build with pixi section to the documentation by @tdejager in (#196)
    "},{"location":"CHANGELOG/#fixed_47","title":"Fixed","text":"
    • Running tasks sequentially when using depends_on by @tdejager in (#161)
    • Don't add PATH variable where it is already set by @baszalmstra in (#169)
    • Fix README by @Hofer-Julian in (#182)
    • Fix Ctrl+C signal in pixi run by @tdejager in (#190)
    • Add the correct license information to the lockfiles by @wolfv in (#191)
    "},{"location":"CHANGELOG/#006-2023-06-30","title":"[0.0.6] - 2023-06-30","text":""},{"location":"CHANGELOG/#highlights_38","title":"Highlights","text":"

    Improving reliability is important to us, so we added an integration testing framework; we can now test as close as possible to the CLI level using cargo.

    "},{"location":"CHANGELOG/#added_32","title":"Added","text":"
    • An integration test harness, to test as close to the user experience as possible, but in Rust. (#138, #140, #156)
    • Add different levels of dependencies in preparation for pixi build, allowing host- and build-dependencies (#149)
    "},{"location":"CHANGELOG/#fixed_48","title":"Fixed","text":"
    • Use correct folder name on pixi init (#144)
    • Fix windows cli installer (#152)
    • Fix global install path variable (#147)
    • Fix macOS binary notarization (#153)
    "},{"location":"CHANGELOG/#005-2023-06-26","title":"[0.0.5] - 2023-06-26","text":"

    Fixing Windows installer build in CI. (#145)

    "},{"location":"CHANGELOG/#004-2023-06-26","title":"[0.0.4] - 2023-06-26","text":""},{"location":"CHANGELOG/#highlights_39","title":"Highlights","text":"

    • A new command, auth, which can be used to authenticate against the host of the package channels.
    • A new command, shell, which can be used to start a shell in the pixi environment of a project.
    • A refactor of the install command: it is changed to global install, and install now installs a pixi project when you run it in the project directory.
    • Platform specific dependencies using [target.linux-64.dependencies] instead of [dependencies] in the pixi.toml (see the sketch below).

    Lots and lots of fixes and improvements to make pixi easier to use, where bumping to the new version of rattler helped a lot.
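    A minimal, hypothetical sketch of platform specific dependencies (the package names and versions are illustrative):

    [dependencies]
    # Available on every platform the project targets
    python = "3.11.*"

    [target.linux-64.dependencies]
    # Only resolved and installed on linux-64
    patchelf = "*"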

    "},{"location":"CHANGELOG/#added_33","title":"Added","text":"
    • Platform specific dependencies and helpful error reporting on pixi.toml issues (#111)
    • Windows installer, which is very useful for users that want to start using pixi on Windows. (#114)
    • shell command to use the pixi environment without pixi run. (#116)
    • Verbosity options using -v, -vv, -vvv (#118)
    • auth command to be able to log in or log out of a host like repo.prefix.dev if you're using private channels. (#120)
    • New examples: C++ SDL (#121), OpenCV camera calibration (#125)
    • Apple binary signing and notarization. (#137)
    "},{"location":"CHANGELOG/#changed_29","title":"Changed","text":"
    • pixi install moved to pixi global install, and pixi install now installs a project using its pixi.toml (#124)
    "},{"location":"CHANGELOG/#fixed_49","title":"Fixed","text":"
    • pixi run uses default shell (#119)
    • pixi add command is fixed. (#132)
    • Community issues fixed: #70, #72, #90, #92, #94, #96
    "}]} \ No newline at end of file diff --git a/dev/sitemap.xml.gz b/dev/sitemap.xml.gz index afd6d08c0..74528c92a 100644 Binary files a/dev/sitemap.xml.gz and b/dev/sitemap.xml.gz differ