Google JAX: Autograd and XLA

Posted: Fri Sep 27, 2019 12:00 pm
by Eli
Google has developed a new machine learning "monster" called JAX: Autograd and XLA, brought together for high-performance machine learning research.

With its updated version of Autograd, JAX can automatically differentiate native Python and NumPy functions. It can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation) via grad as well as forward-mode differentiation, and the two can be composed arbitrarily to any order. What's new is that JAX uses XLA to compile and run your NumPy programs on GPUs and TPUs. See more at the Google JAX repository: https://github.com/google/jax
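To make this concrete, here is a minimal sketch of those features, assuming JAX is installed (pip install jax). The tanh function is just an illustrative example; grad, jvp, and jit are the actual JAX entry points described above.

import jax.numpy as jnp
from jax import grad, jit, jvp

def tanh(x):
    # Ordinary NumPy-style code; JAX traces and differentiates it.
    return (jnp.exp(x) - jnp.exp(-x)) / (jnp.exp(x) + jnp.exp(-x))

# Reverse-mode differentiation (backpropagation) via grad.
dtanh = grad(tanh)
print(dtanh(1.0))  # ~0.4199743, i.e. 1 - tanh(1)^2

# Derivatives compose to any order: derivatives of derivatives of derivatives.
d3tanh = grad(grad(grad(tanh)))
print(d3tanh(1.0))

# Forward-mode differentiation via a Jacobian-vector product.
primal, tangent = jvp(tanh, (1.0,), (1.0,))
print(primal, tangent)

# XLA compilation: jit compiles the function and runs it on CPU, GPU, or TPU.
fast_tanh = jit(tanh)
print(fast_tanh(jnp.arange(3.0)))

The transformations also compose with each other, so something like jit(grad(tanh)) gives you a compiled derivative function in one line.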