
[QUESTION] How to compute a Jacobian matrix efficiently using automatic differentiation #424

Open
Yidong-ZHAO opened this issue Jan 10, 2025 · 0 comments
Labels: question (The issue author requires information)

Comments

@Yidong-ZHAO

Hi!

I have a vector-valued function $\mathbf{f}: \mathbb{R}^n \to \mathbb{R}^n$, where $n$ is the dimension of the input/output vector. The naive way to get the Jacobian matrix $\mathbf{J}$ of $\mathbf{f}$ is to compute it row by row using automatic differentiation (following the official documentation).
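For concreteness, the row-by-row construction described above can be sketched in JAX (the function `f` below is an invented stand-in, not from this issue; the library actually in use here may differ):

```python
import jax
import jax.numpy as jnp

# Hypothetical example function f : R^n -> R^n (invented for illustration).
def f(x):
    return jnp.tanh(x) * jnp.sum(x)

def jacobian_row_by_row(f, x):
    """Naive Jacobian: one reverse-mode pass (VJP) per output row."""
    n = x.shape[0]
    _, vjp = jax.vjp(f, x)
    # Pull back each standard basis vector e_i: row i of J is e_i^T J.
    rows = [vjp(jnp.zeros(n).at[i].set(1.0))[0] for i in range(n)]
    return jnp.stack(rows)  # n VJP evaluations total

x = jnp.arange(1.0, 5.0)
J = jacobian_row_by_row(f, x)  # shape (n, n)
```

This is exactly what makes the naive approach cost $n$ differentiation passes: one per row.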

However, for large $n$, this row-by-row approach is very slow. I wonder whether there is a more efficient way, similar to what is suggested in this discussion.

In my case, each row $i \in [0, n)$ of $\mathbf{J}$ has at most 9 non-zero entries, at column indices $i-m-1$, $i-m$, $i-m+1$, $i-1$, $i$, $i+1$, $i+m-1$, $i+m$, $i+m+1$, where $m \in (0, n)$ is a constant.
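With a banded pattern like this, columns can be grouped ("colored") so that no two columns in a group touch the same row; one forward-mode JVP with a seed vector summing a whole group then recovers all of that group's entries at once (the classical Curtis-Powell-Reid compression idea). Since two columns here can only share a row when they are at most $2m+2$ apart, the coloring $j \mapsto j \bmod (2m+3)$ is safe, reducing the cost to $2m+3$ passes instead of $n$. A sketch in JAX, with an invented test function `f` exhibiting the stated stencil (all names below are illustrative assumptions, not from this issue):

```python
import jax
import jax.numpy as jnp
import numpy as np

def stencil_offsets(m):
    # Column offsets of the (at most) 9 nonzeros in each row, as stated above.
    return [-m - 1, -m, -m + 1, -1, 0, 1, m - 1, m, m + 1]

def banded_jacobian(f, x, m):
    """Jacobian via column coloring: columns j, j' never share a row when
    |j - j'| > 2m + 2, so color c = j mod (2m + 3) is a valid grouping.
    Cost: 2m + 3 forward-mode JVPs instead of n."""
    n = x.shape[0]
    ncolors = 2 * m + 3
    J = np.zeros((n, n))
    for c in range(ncolors):
        cols = np.arange(c, n, ncolors)            # all columns of this color
        seed = jnp.zeros(n).at[cols].set(1.0)      # sum of those basis vectors
        _, jv = jax.jvp(f, (x,), (seed,))          # jv = J @ seed
        for j in cols:                             # each touched row sees exactly
            for o in stencil_offsets(m):           # one column of this color
                if 0 <= j + o < n:
                    J[j + o, j] = jv[j + o]
    return J

# Invented test function with the stated sparsity pattern.
m, n = 4, 20
def f(x):
    idx = jnp.arange(n)
    out = jnp.zeros(n)
    for o in stencil_offsets(m):
        src = idx + o
        ok = (src >= 0) & (src < n)                # drop out-of-range neighbors
        out = out + jnp.where(ok, jnp.sin(x[jnp.clip(src, 0, n - 1)]), 0.0)
    return out

x = jnp.linspace(0.0, 1.0, n)
J = banded_jacobian(f, x, m)                       # 11 JVPs instead of 20
```

The same seeding trick works with reverse mode (group rows instead of columns), and the scalar Python loop over colors can additionally be batched with `jax.vmap` over the seed vectors.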

Thanks!

Yidong-ZHAO added the question label on Jan 10, 2025