Deep Learning with JAX
List Price:
$59.99
- Availability: Confirm prior to ordering
- Branding: minimum 50 pieces (add’l costs below)
- Check Freight Rates (branded products only)
Branding Options, Availability & Lead Times
- 1-Color Imprint: $2.00 ea.
- Promo-Page Insert: $2.50 ea. (full-color printed, single-sided page)
- Belly-Band Wrap: $2.50 ea. (full-color printed)
- Set-Up Charge: $45 per decoration
- Availability: Product availability changes daily, so please confirm your quantity is available prior to placing an order.
- Branded Products: allow 10 business days from proof approval for production. Branding options may be limited or unavailable based on product design or cover artwork.
- Unbranded Products: allow 3-5 business days for shipping. All Unbranded items receive FREE ground shipping in the US. Inquire for international shipping.
- RETURNS/CANCELLATIONS: All orders, branded or unbranded, are NON-CANCELLABLE and NON-RETURNABLE once a purchase order has been received.
Product Details
Author:
Grigory Sapunov
Format:
Paperback
Pages:
408
Publisher:
Manning (October 29, 2024)
Language:
English
ISBN-13:
9781633438880
ISBN-10:
1633438880
Weight:
24 oz
Dimensions:
7.375" x 9.25" x 0.9"
List Price:
$59.99
Pub Discount:
37
As low as:
$53.99
Publisher Identifier:
P-SS
Discount Code:
G
Case Pack:
18
Imprint:
Manning
Overview
Accelerate deep learning and other number-intensive tasks with JAX, Google’s awesome high-performance numerical computing library.
The JAX numerical computing library tackles the core performance challenges at the heart of deep learning and other scientific computing tasks. By combining Google’s Accelerated Linear Algebra platform (XLA) with a hyper-optimized version of NumPy and a variety of other high-performance features, JAX delivers a huge performance boost in low-level computations and transformations.
In Deep Learning with JAX you will learn how to:
• Use JAX for numerical calculations
• Build differentiable models with JAX primitives
• Run distributed and parallelized computations with JAX
• Use high-level neural network libraries such as Flax
• Leverage libraries and modules from the JAX ecosystem
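The bullets above center on JAX's composable function transformations. As a minimal illustrative sketch (not a listing from the book), the toy loss function below shows `jax.grad`, `jax.jit`, and `jax.vmap` in action:

```python
# Toy example of JAX's core transformations: grad, jit, and vmap.
import jax
import jax.numpy as jnp

def loss(w, x):
    # A small differentiable function: mean squared value of a linear map.
    return jnp.mean((x @ w) ** 2)

w = jnp.ones(3)
x = jnp.arange(6.0).reshape(2, 3)

grad_loss = jax.grad(loss)   # differentiate w.r.t. the first argument, w
fast_loss = jax.jit(loss)    # compile the whole function with XLA
per_row = jax.vmap(lambda xi: loss(w, xi[None, :]))  # vectorize over rows of x

print(grad_loss(w, x).shape)  # (3,) -- one gradient entry per weight
print(fast_loss(w, x))
print(per_row(x).shape)       # (2,) -- one loss value per row
```

Here `loss`, `w`, and `x` are invented for illustration; the book develops its own, more realistic examples.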
Deep Learning with JAX is a hands-on guide to using JAX for deep learning and other mathematically intensive applications. Google Developer Expert Grigory Sapunov steadily builds your understanding of JAX’s concepts. The engaging examples introduce the fundamental concepts on which JAX relies and then show you how to apply them to real-world tasks. You’ll learn how to use JAX’s ecosystem of high-level libraries and modules, and also how to combine TensorFlow and PyTorch with JAX for data loading and deployment.
Purchase of the print book includes a free eBook in PDF and ePub formats from Manning Publications.
About the technology
Google’s JAX offers a fresh vision for deep learning. This powerful library gives you fine control over low-level processes like gradient calculations, delivering fast and efficient model training and inference, especially on large datasets. JAX has transformed how research scientists approach deep learning. Now boasting a robust ecosystem of tools and libraries, JAX makes evolutionary computations, federated learning, and other performance-sensitive tasks approachable for all types of applications.
About the book
Deep Learning with JAX teaches you to build effective neural networks with JAX. In this example-rich book, you’ll discover how JAX’s unique features help you tackle important deep learning performance challenges, like distributing computations across a cluster of TPUs. You’ll put the library into action as you create an image classification tool, an image filter application, and other realistic projects. The nicely annotated code listings demonstrate how JAX’s functional programming mindset improves composability and parallelization.
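The composability mentioned above follows from the fact that JAX transformations are functions that take functions and return functions, so they stack freely. A small hedged sketch (with a made-up function `f`, not taken from the book):

```python
# Composing JAX transformations: differentiate, vectorize, then compile.
import jax
import jax.numpy as jnp

def f(x):
    # A simple scalar-to-scalar function.
    return jnp.sin(x) * x

# grad(f) is a new function; vmap batches it; jit compiles the result.
df_batched = jax.jit(jax.vmap(jax.grad(f)))

xs = jnp.linspace(0.0, 1.0, 5)
print(df_batched(xs))  # by the product rule: sin(x) + x*cos(x) at each point
```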
What's inside
• Use JAX for numerical calculations
• Build differentiable models with JAX primitives
• Run distributed and parallelized computations with JAX
• Use high-level neural network libraries such as Flax
About the reader
For intermediate Python programmers who are familiar with deep learning.
About the author
Grigory Sapunov holds a Ph.D. in artificial intelligence and is a Google Developer Expert in Machine Learning.
The technical editor on this book was Nicholas McGreivy.
Table of Contents
Part 1
1 When and why to use JAX
2 Your first program in JAX
Part 2
3 Working with arrays
4 Calculating gradients
5 Compiling your code
6 Vectorizing your code
7 Parallelizing your computations
8 Using tensor sharding
9 Random numbers in JAX
10 Working with pytrees
Part 3
11 Higher-level neural network libraries
12 Other members of the JAX ecosystem
A Installing JAX
B Using Google Colab
C Using Google Cloud TPUs
D Experimental parallelization