diff --git a/README.md b/README.md
index 522d20cea734..44a6d6e26415 100644
--- a/README.md
+++ b/README.md
@@ -17,9 +17,8 @@
 
 ## What is JAX?
 
-JAX is [Autograd](https://github.com/hips/autograd) and [XLA](https://www.tensorflow.org/xla),
-brought together for high-performance numerical computing, including
-large-scale machine learning research.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.
 
 With its updated version of [Autograd](https://github.com/hips/autograd), JAX can
 automatically differentiate native
diff --git a/docs/index.rst b/docs/index.rst
index bef957568f92..6214d2bb46eb 100644
--- a/docs/index.rst
+++ b/docs/index.rst
@@ -1,7 +1,8 @@
 JAX: High-Performance Array Computing
 =====================================
 
-JAX is Autograd_ and XLA_, brought together for high-performance numerical computing.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.
 
 If you're looking to train neural networks, use Flax_ and start with its
 documentation. Some associated tools are Optax_ and Orbax_.
@@ -93,8 +94,6 @@ For an end-to-end transformer library built on JAX, see MaxText_.
 
    glossary
 
-.. _Autograd: https://github.com/hips/autograd
-.. _XLA: https://openxla.org/xla
 .. _Flax: https://flax.readthedocs.io/
 .. _Orbax: https://orbax.readthedocs.io/
 .. _Optax: https://optax.readthedocs.io/
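For context on the rewording: a minimal sketch of the "array computation and program transformation" the new description refers to, using the standard `jax.grad` and `jax.jit` transformations. The function, names, and shapes below are illustrative only and are not part of this diff or the docs.

```python
# Illustrative only: JAX applies composable transformations to plain
# Python functions, e.g. automatic differentiation and JIT compilation.
import jax
import jax.numpy as jnp

def loss(w, x):
    # A simple scalar-valued function over arrays.
    return jnp.sum(jnp.tanh(x @ w) ** 2)

grad_loss = jax.grad(loss)      # transformation: differentiate w.r.t. first arg
fast_grad = jax.jit(grad_loss)  # transformation: compile for an accelerator

w = jnp.ones((3, 2))
x = jnp.ones((4, 3))
print(fast_grad(w, x).shape)    # (3, 2): the gradient has the shape of w
```

Both transformations act on an ordinary Python function, which is the capability the updated one-line description is meant to capture.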