Finite differences Jacobian #71
Replies: 10 comments
-
how shall I approach this?
-
This is perhaps one of the most complex issues in the list (if you're looking for a challenge, then issues 4, 6, and 8 would all be good options). Essentially, you iterate through each parameter (first construct the full flattened parameter tensor, then go one element at a time) and add a small epsilon to its value (say 1e-4), then re-run the model.full_sample method and take the difference between the original image and the re-run with param+epsilon. That difference divided by epsilon is the jacobian column for that parameter. Do this for each parameter and store the results in the same format as the other jacobian functions.
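A minimal sketch of the procedure described above, in PyTorch. The `get_parameter_vector` / `set_parameter_vector` accessors are hypothetical stand-ins for however the flattened parameter tensor is actually read and written; `model.full_sample` is the method named above:

```python
import torch

def finite_difference_jacobian(model, epsilon=1e-4):
    """Forward finite differences jacobian of the model image."""
    # Hypothetical accessors for the flattened parameter tensor.
    params = model.get_parameter_vector().clone()
    base = model.full_sample().detach()

    # One column per parameter: shape (*image_shape, n_params).
    jac = torch.zeros(*base.shape, params.numel(), dtype=base.dtype)

    for i in range(params.numel()):
        perturbed = params.clone()
        perturbed[i] += epsilon  # nudge one parameter
        model.set_parameter_vector(perturbed)
        shifted = model.full_sample().detach()
        # (f(p + eps) - f(p)) / eps approximates d(image)/d(param_i)
        jac[..., i] = (shifted - base) / epsilon

    model.set_parameter_vector(params)  # restore original values
    return jac
```

A central difference, `(f(p + eps) - f(p - eps)) / (2 * eps)`, would be more accurate for the same epsilon, at the cost of two model evaluations per parameter instead of one.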
-
Thinking about it further, it is perhaps not as difficult as I originally envisioned. The place to fill in the code is in the …
-
okay okay
-
I got it
-
Let me know if anything is unclear about the jacobian. As for running autoprof in the terminal, you have to give it a config file for it to run; see the getting started tutorial: https://connorstoneastro.github.io/AutoProf-2/getting_started.html

Ah, I see you have it working, nice! Let me know if you run into any issues. For sure I will update the code to give a more helpful error message so future users don't get confused by the AttributeError above!
-
hey
-
Ah, astropy is a package with lots of astronomy-related tools. In this case it is being used to write a …
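The sentence above is cut off; a common use of astropy in this kind of pipeline is FITS I/O, so here is a minimal sketch assuming the truncated reference is to writing a FITS file (the filename and array are placeholders):

```python
import numpy as np
from astropy.io import fits

# ASSUMPTION: the truncated comment above refers to writing a FITS file.
image = np.zeros((128, 128))  # stand-in for a model image
hdu = fits.PrimaryHDU(data=image)
hdu.header["COMMENT"] = "written with astropy.io.fits"
hdu.writeto("model_image.fits", overwrite=True)
```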
-
It may be possible to avoid the finite differences jacobian by creating a system to "chunk" the jacobian calculation into smaller sub-windows of the total image.
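The thread doesn't flesh out the chunking scheme; one way it could work, assuming a hypothetical `render_window(params, window)` that evaluates the model only inside a sub-window, so each autograd pass holds the graph for a small patch:

```python
import torch

def chunked_jacobian(render_window, params, image_shape, windows):
    """Assemble a full-image autograd jacobian from small sub-windows."""
    jac = torch.zeros(*image_shape, params.numel())
    for (y0, y1, x0, x1) in windows:
        # Jacobian of just this window w.r.t. all parameters:
        # shape (y1 - y0, x1 - x0, n_params). Peak memory scales with
        # the window size rather than the full image.
        jac_w = torch.autograd.functional.jacobian(
            lambda p: render_window(p, (y0, y1, x0, x1)), params
        )
        jac[y0:y1, x0:x1, :] = jac_w
    return jac
```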
-
After various recent optimizations to the autograd jacobian methods, the use of finite differences jacobians no longer seems so critical. For now I will close this discussion, but we can re-open it if there ends up being a special use-case. |
-
Automatic differentiation is nice for getting exact derivatives; however, it is very memory intensive for large images with many models. Finite differences uses considerably less memory and so should be included.
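For a sense of scale (illustrative numbers, not from the thread):

```python
# The explicit jacobian alone is n_pixels x n_params entries, before
# autograd adds the intermediate tensors it keeps for every model.
pixels = 2000 * 2000   # a 2000 x 2000 pixel image
n_params = 50          # total parameters across all models
bytes_per_value = 8    # float64

print(f"{pixels * n_params * bytes_per_value / 1e9:.1f} GB")  # 1.6 GB

# Forward finite differences only ever needs the base image plus one
# perturbed image in memory at a time (the jacobian can even be filled
# and consumed column by column).
```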