DiscoGrad
Automatically differentiate across conditional branches in C++ programs.
Trying to do gradient descent using automatic differentiation over branchy programs? Or to combine such programs with neural networks for end-to-end training? Then this might be interesting to you.
Automatic Differentiation (AD) is a popular method to obtain the gradients of computer programs, which are extremely useful for adjusting program parameters using gradient descent to solve optimization, control, and inference problems. Unfortunately, AD alone often yields unhelpful (zero-valued and/or discontinuous) derivatives for programs whose control flow branches on the parameters.
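To see why, consider a minimal sketch (the function `branchy` below is illustrative and not taken from the DiscoGrad repository): each branch of a conditional is constant in the parameter, so AD, which only ever differentiates the branch that was actually taken, reports a zero derivative on both sides of the branch point, while the program's output jumps discontinuously across it.

```cpp
#include <cstdio>

// A toy program with parameter-dependent branching. The output jumps from
// 0.0 to 1.0 at p = 0. Plain AD differentiates only the active branch,
// so d(output)/dp == 0 everywhere except at the discontinuity itself.
double branchy(double p) {
  if (p < 0.0)
    return 0.0;  // derivative w.r.t. p on this branch: 0
  else
    return 1.0;  // derivative w.r.t. p on this branch: 0
}

int main() {
  // Finite differences tell the same story: zero slope on either side
  // of the branch point, and an unbounded spike across it.
  double h = 1e-6;
  printf("slope near p=-1: %g\n", (branchy(-1.0 + h) - branchy(-1.0)) / h);
  printf("slope near p=+1: %g\n", (branchy(1.0 + h) - branchy(1.0)) / h);
  printf("slope across  0: %g\n", (branchy(h) - branchy(-h)) / (2 * h));
}
```

A gradient descent step on `p` would go nowhere here, since the gradient is zero almost everywhere. This is the failure mode DiscoGrad addresses by differentiating across the branches themselves.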