Merge branch 'autograd-improve' into 'main'
Improve autograd tools

See merge request omniverse/warp!900
mmacklin committed Jan 15, 2025
2 parents 7298be5 + 2e8407e commit cca9134
Showing 5 changed files with 528 additions and 203 deletions.
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,8 @@

### Added

- `warp.autograd.gradcheck`, `function_jacobian`, `function_jacobian_fd` now also accept arbitrary Python functions that have Warp arrays as inputs and outputs.
- `warp.autograd.gradcheck_tape` now has additional optional arguments `reverse_launches` and `skip_to_launch_index`.
- Added preview of Tile Cholesky factorization and solve APIs through `tile_cholesky` and `tile_cholesky_solve`, as well as the helpers `tile_tril` and `tile_add_diag`. These are preview APIs and subject to change.
- Support `assert` statements in kernels ([docs](https://nvidia.github.io/warp/debugging.html#assertions)).
Assertions can only be triggered in `"debug"` mode ([GH-366](https://github.com/NVIDIA/warp/issues/366)).
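As a language-agnostic illustration of the math behind the new `tile_cholesky` and `tile_cholesky_solve` preview APIs (this is a minimal pure-Python sketch, not Warp's tile API), a Cholesky factorization `A = L Lᵀ` followed by forward/backward substitution solves `A x = b` for a symmetric positive-definite `A`:

```python
import math

def cholesky(A):
    """Return lower-triangular L such that A = L @ L.T (A must be SPD)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

def cholesky_solve(L, b):
    """Solve A x = b given A = L @ L.T: forward solve L y = b, then back solve L.T x = y."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

A = [[4.0, 2.0], [2.0, 3.0]]
b = [2.0, 1.0]
x = cholesky_solve(cholesky(A), b)  # x satisfies A @ x == b
```

The factor-once, solve-many structure is what makes a dedicated solve helper worthwhile: `L` can be reused across many right-hand sides.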
@@ -23,6 +25,8 @@

### Fixed

- Fix autodiff Jacobian computation in `warp.autograd.jacobian_ad`, where gradients were not properly zeroed out in some cases.
- Fix plotting issues in `warp.autograd.jacobian_plot`.
- Fix errors during graph capture caused by module unloading ([GH-401](https://github.com/NVIDIA/warp/issues/401)).
- Fix allocating arrays with strides ([GH-404](https://github.com/NVIDIA/warp/issues/404)).
- Fix `ImportError` exception being thrown during `OpenGLRenderer` interpreter shutdown on Windows
27 changes: 18 additions & 9 deletions docs/modules/differentiability.rst
@@ -683,33 +683,42 @@ A native snippet may also include a return statement. If this is the case, you m
Debugging Gradients
###################

.. note::
   Upcoming Warp releases will continue to expand this debugging section with tools that help users debug gradient computations.

Measuring Gradient Accuracy
^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. currentmodule:: warp.autograd

Warp provides utility functions to evaluate the partial Jacobian matrices for input/output argument pairs given to kernel launches.
:func:`jacobian` computes the Jacobian matrix of a Warp kernel, or of any Python function that launches Warp kernels and takes Warp arrays as inputs and outputs, using Warp's automatic differentiation engine.
:func:`jacobian_fd` computes the Jacobian matrix of a kernel or function using finite differences.
:func:`gradcheck` compares the Jacobian matrices computed by the autodiff engine and finite differences to measure the accuracy of the gradients.
:func:`jacobian_plot` visualizes the Jacobian matrices returned by the :func:`jacobian` and :func:`jacobian_fd` functions.
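To make the comparison concrete, the core idea behind these utilities can be sketched in plain Python (this is a simplified illustration, not Warp's actual API; the helper names `jacobian_fd` and `gradcheck` here are local stand-ins): compute one Jacobian analytically, another by central finite differences, and check that they agree within a tolerance.

```python
def jacobian_fd(f, x, eps=1e-6):
    """Central finite-difference Jacobian: J[i][j] = d f_i / d x_j."""
    n = len(x)
    m = len(f(x))
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x); xp[j] += eps
        xm = list(x); xm[j] -= eps
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2.0 * eps)
    return J

def gradcheck(J_ad, J_fd, atol=1e-4):
    """True if every entry of the two Jacobians agrees within atol."""
    return all(abs(a - b) <= atol
               for row_ad, row_fd in zip(J_ad, J_fd)
               for a, b in zip(row_ad, row_fd))

# f(x, y) = (x * y, x + y) has analytic Jacobian [[y, x], [1, 1]].
f = lambda v: [v[0] * v[1], v[0] + v[1]]
x = [2.0, 3.0]
J_ad = [[x[1], x[0]], [1.0, 1.0]]  # standing in for the autodiff result
ok = gradcheck(J_ad, jacobian_fd(f, x))
```

Warp's versions operate per input/output array pair and on the device, but the accuracy criterion is the same: the autodiff Jacobian should match the finite-difference one up to truncation error in `eps`.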

``warp.autograd.gradcheck``
^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. autofunction:: gradcheck

``warp.autograd.gradcheck_tape``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. autofunction:: gradcheck_tape

``warp.autograd.jacobian``
^^^^^^^^^^^^^^^^^^^^^^^^^^

.. autofunction:: jacobian

``warp.autograd.jacobian_fd``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. autofunction:: jacobian_fd

``warp.autograd.jacobian_plot``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. autofunction:: jacobian_plot


Example usage
"""""""""""""
^^^^^^^^^^^^^

.. code-block:: python
