
Bump jaxlib from 0.4.8 to 0.4.13 #59

Open

wants to merge 1 commit into base: main
Conversation

dependabot[bot]

@dependabot dependabot bot commented on behalf of github Jul 1, 2023

Bumps jaxlib from 0.4.8 to 0.4.13.

Release notes

Sourced from jaxlib's releases.

jaxlib release v0.4.13

  • Changes

    • Added Windows CPU-only wheels to the jaxlib PyPI release.
  • Bug fixes

    • __cuda_array_interface__ was broken in previous jaxlib versions and is now fixed (#16440).
    • Concurrent CUDA kernel tracing is now enabled by default on NVIDIA GPUs.

jaxlib release v0.4.12

No release notes provided.

jaxlib release v0.4.11

No release notes provided.

jaxlib release v0.4.10

No release notes provided.

jaxlib release v0.4.9

No release notes provided.

Changelog

Sourced from jaxlib's changelog.

jax 0.4.13 (June 22, 2023)

  • Changes

    • jax.jit now allows None to be passed to in_shardings and out_shardings. The semantics are as follows:
      • For in_shardings, JAX will mark it as replicated, but this behavior can change in the future.
      • For out_shardings, we will rely on the XLA GSPMD partitioner to determine the output shardings.
    • jax.experimental.pjit.pjit also allows None to be passed to in_shardings and out_shardings. The semantics are as follows:
      • If the mesh context manager is not provided, JAX has the freedom to choose whatever sharding it wants.
        • For in_shardings, JAX will mark it as replicated, but this behavior can change in the future.
        • For out_shardings, we will rely on the XLA GSPMD partitioner to determine the output shardings.
      • If the mesh context manager is provided, None will imply that the value will be replicated on all devices of the mesh.
    • Executable.cost_analysis() works on Cloud TPU
    • Added a warning if a non-allowlisted jaxlib plugin is in use.
    • Added jax.tree_util.tree_leaves_with_path.
    • None is not a valid input to jax.experimental.multihost_utils.host_local_array_to_global_array or jax.experimental.multihost_utils.global_array_to_host_local_array. Please use jax.sharding.PartitionSpec() if you want to replicate your input.
  • Bug fixes

    • Fixed incorrect wheel name in CUDA 12 releases (#16362); the correct wheel is named cudnn89 instead of cudnn88.
  • Deprecations

    • The native_serialization_strict_checks parameter to jax.experimental.jax2tf.convert is deprecated in favor of the new native_serialization_disabled_checks (#16347).
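To make the new sharding semantics concrete, here is a minimal sketch (my own illustration, not part of the quoted changelog, assuming jax/jaxlib >= 0.4.13 on any backend) of passing None to in_shardings/out_shardings, the new jax.tree_util.tree_leaves_with_path helper, and the PartitionSpec() replacement for None in the multihost utils:

```python
import jax
import jax.numpy as jnp

# Passing None to in_shardings/out_shardings: inputs are marked as
# replicated, and the XLA GSPMD partitioner decides the output shardings.
double = jax.jit(lambda x: 2 * x, in_shardings=None, out_shardings=None)
print(double(jnp.arange(3)))

# jax.tree_util.tree_leaves_with_path pairs each leaf with its path
# into the pytree; keystr renders the path for display.
tree = {"a": 1, "b": (2, 3)}
for path, leaf in jax.tree_util.tree_leaves_with_path(tree):
    print(jax.tree_util.keystr(path), leaf)

# An empty PartitionSpec (not None) is now the way to ask
# jax.experimental.multihost_utils to replicate an input.
replicated = jax.sharding.PartitionSpec()
```

On a single-device CPU install the None shardings are a no-op; the semantics only become visible on multi-device meshes.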

jaxlib 0.4.13 (June 22, 2023)

  • Changes

    • Added Windows CPU-only wheels to the jaxlib PyPI release.
  • Bug fixes

    • __cuda_array_interface__ was broken in previous jaxlib versions and is now fixed (#16440).
    • Concurrent CUDA kernel tracing is now enabled by default on NVIDIA GPUs.
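The __cuda_array_interface__ fix only matters for arrays on NVIDIA GPUs; a hedged sketch of consuming it (my own illustration, guarded so it also runs on CPU-only jaxlib installs):

```python
import jax.numpy as jnp

x = jnp.arange(4)

# __cuda_array_interface__ is only meaningful for arrays living on an
# NVIDIA GPU; guard the access so the snippet is safe on CPU-only
# installs, where jaxlib has no CUDA buffer to expose.
if any(d.platform == "gpu" for d in x.devices()):
    iface = x.__cuda_array_interface__  # dict consumable by CuPy, Numba, etc.
    print(iface["shape"], iface["typestr"])
else:
    print("no NVIDIA GPU backend; interface not exercised")
```

Libraries such as CuPy and Numba read this dict to wrap the GPU buffer zero-copy, which is why the earlier breakage mattered for interop.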

jax 0.4.12 (June 8, 2023)

  • Changes

... (truncated)

Commits
  • c3e2427 Merge pull request #16527 from skye:version
  • 487b640 Jax 0.4.13 release.
  • 10424c5 Update JAX's XlaExecutable.cost_analysis and related plumbing so it works on ...
  • 9f4080a Silence pytype errors under an upcoming pytype change.
  • e123d1e Merge pull request #16508 from hawkinsp:metal
  • 677b0d9 Ignore JAX_USE_PJRT_C_API_ON_TPU=false user warning raised.
  • 85a84fd Add a link to the Apple Metal plugin to the JAX README.
  • b3527f3 Zlib compress kernel proto.
  • f238667 Make JAX-Triton calls serializable.
  • c5a47d1 [jax2tf] Refactor the backwards compatibility tests.
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [jaxlib](https://github.com/google/jax) from 0.4.8 to 0.4.13.
- [Release notes](https://github.com/google/jax/releases)
- [Changelog](https://github.com/google/jax/blob/main/CHANGELOG.md)
- [Commits](jax-ml/jax@jax-v0.4.8...jaxlib-v0.4.13)

---
updated-dependencies:
- dependency-name: jaxlib
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Jul 1, 2023