Merge pull request #19985 from jakevdp:doc-tagline
PiperOrigin-RevId: 615843373
jax authors committed Mar 14, 2024
2 parents f99284e + 72b2321 commit 993abb1
Showing 2 changed files with 4 additions and 6 deletions.
README.md (5 changes: 2 additions & 3 deletions)
@@ -17,9 +17,8 @@

 ## What is JAX?

-JAX is [Autograd](https://github.com/hips/autograd) and [XLA](https://www.tensorflow.org/xla),
-brought together for high-performance numerical computing, including
-large-scale machine learning research.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.

 With its updated version of [Autograd](https://github.com/hips/autograd),
 JAX can automatically differentiate native
docs/index.rst (5 changes: 2 additions & 3 deletions)
@@ -1,7 +1,8 @@
 JAX: High-Performance Array Computing
 =====================================

-JAX is Autograd_ and XLA_, brought together for high-performance numerical computing.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.

 If you're looking to train neural networks, use Flax_ and start with its documentation.
 Some associated tools are Optax_ and Orbax_.
@@ -93,8 +94,6 @@ For an end-to-end transformer library built on JAX, see MaxText_.
 glossary


-.. _Autograd: https://github.com/hips/autograd
-.. _XLA: https://openxla.org/xla
 .. _Flax: https://flax.readthedocs.io/
 .. _Orbax: https://orbax.readthedocs.io/
 .. _Optax: https://optax.readthedocs.io/
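As a minimal illustration of the "program transformation" named in the new tagline, here is a short sketch using jax.grad, JAX's automatic-differentiation transform (illustrative only, not part of this commit; the function f is a made-up example):

import jax
import jax.numpy as jnp

# A plain Python function written with jax.numpy (hypothetical example).
def f(x):
    return jnp.sin(x) * x ** 2

# jax.grad transforms f into a new function that evaluates df/dx.
df = jax.grad(f)

print(f(1.0))   # ~0.8415  (sin(1) * 1^2)
print(df(1.0))  # ~2.2232  (2x*sin(x) + x^2*cos(x) at x = 1)

The same transformation style extends to jax.jit for compilation via XLA and jax.vmap for automatic vectorization.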
