Vector Calculus - II

This appendix complements the vector calculus lesson from the linear algebra chapter with more detail.


Jacobian

The gradient is used for real-valued functions, $f: \mathbb{R}^n \to \mathbb{R}$.

The concept of the gradient can be extended to vector-valued functions, $f: \mathbb{R}^n \to \mathbb{R}^m$, via the Jacobian matrix.

The Jacobian matrix of $\mathbf{f}$ is defined as:

$$
J = \begin{bmatrix} \dfrac{\partial \mathbf{f}}{\partial x_1} & \cdots & \dfrac{\partial \mathbf{f}}{\partial x_n} \end{bmatrix} = \begin{bmatrix} \nabla^\mathsf{T} f_1 \\ \vdots \\ \nabla^\mathsf{T} f_m \end{bmatrix} = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{bmatrix}
$$
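
For example, for $\mathbf{f}(x, y) = (x^2 y,\; 5x + \sin y)$, a map from $\mathbb{R}^2$ to $\mathbb{R}^2$, the definition gives the $2 \times 2$ Jacobian:

$$
J = \begin{bmatrix} \dfrac{\partial f_1}{\partial x} & \dfrac{\partial f_1}{\partial y} \\ \dfrac{\partial f_2}{\partial x} & \dfrac{\partial f_2}{\partial y} \end{bmatrix} = \begin{bmatrix} 2xy & x^2 \\ 5 & \cos y \end{bmatrix}
$$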

JAX provides both forward- and reverse-mode automatic differentiation functions for calculating the Jacobian (see the sketch after the list below):

  1. jacfwd() - uses forward-mode autodiff and calculates the Jacobian column by column; typically more efficient for "tall" Jacobians (more outputs than inputs)
  2. jacrev() - uses reverse-mode autodiff and calculates the Jacobian row by row; typically more efficient for "wide" Jacobians (more inputs than outputs)
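
As a minimal sketch, the snippet below applies both functions to the example mapping from earlier; the function `f` and the evaluation point are chosen purely for illustration.

```python
import jax
import jax.numpy as jnp

# The example map f: R^2 -> R^2 from above, f(x, y) = (x^2 * y, 5x + sin y).
def f(v):
    x, y = v
    return jnp.array([x**2 * y, 5.0 * x + jnp.sin(y)])

v = jnp.array([1.0, 2.0])

# Forward mode: builds the Jacobian one column (one input direction) at a time.
J_fwd = jax.jacfwd(f)(v)

# Reverse mode: builds the Jacobian one row (one output component) at a time.
J_rev = jax.jacrev(f)(v)

print(J_fwd)  # matches the analytic Jacobian [[2xy, x^2], [5, cos y]] at (1, 2)
print(J_rev)  # identical values, computed row by row
```

Both calls return the same matrix; they differ only in how the computation is scheduled, which is what drives the tall-versus-wide efficiency trade-off noted above.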
