Stalingrad and DVL
Stalin∇ (Stalingrad) is a compiler specialized in automatic differentiation (AD) for scientific computing. Stalingrad compiles a Scheme-based source language that we call VLAD, a purely functional language with built-in AD operators. The compiler uses polyvariant union-free flow analysis, which allows all scaffolding and function manipulation to be migrated to compile time, providing performance advantages. Our tests indicate a one- to three-order-of-magnitude performance increase over overloading-based forward AD in functional and imperative languages, and performance comparable to transformation-based forward AD in imperative languages.
Source code for Stalingrad can be found here. You can also see example code that we use for benchmarking our implementation against major AD implementations in several languages. Results from the 2009 benchmarks can be found here. The newer DVL compiler by Alexey Radul is based on Stalingrad; it reimplements the non-AD portion of VLAD and combines it with a new AD implementation.
DiffSharp
DiffSharp is a functional automatic differentiation (AD) library implemented in the F# language. It supports C# and other CLI languages.
AD allows exact and efficient calculation of derivatives by systematically invoking the chain rule of calculus at the elementary-operator level during program execution. AD differs from numerical differentiation, which is prone to truncation and round-off errors, and from symbolic differentiation, which suffers from expression swell and cannot fully handle algorithmic control flow.
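To make the contrast concrete, here is a minimal forward-mode AD sketch written in Haskell using dual numbers; it illustrates the general technique rather than DiffSharp's F# API, and the names Dual, deriv, and numDeriv are introduced only for this example.

    -- Illustrative sketch of forward-mode AD via dual numbers (not DiffSharp's API):
    -- each value carries its derivative alongside it, so the chain rule is applied
    -- exactly at each elementary operation.
    data Dual = Dual Double Double   -- (value, derivative)

    instance Num Dual where
      Dual x x' + Dual y y' = Dual (x + y) (x' + y')
      Dual x x' - Dual y y' = Dual (x - y) (x' - y')
      Dual x x' * Dual y y' = Dual (x * y) (x * y' + x' * y)   -- product rule
      abs    (Dual x x')    = Dual (abs x) (x' * signum x)
      signum (Dual x _)     = Dual (signum x) 0
      fromInteger n         = Dual (fromInteger n) 0

    -- Exact derivative of f at x: seed the input with derivative 1.
    deriv :: (Dual -> Dual) -> Double -> Double
    deriv f x = let Dual _ d = f (Dual x 1) in d

    -- Central finite difference, subject to truncation and round-off error.
    numDeriv :: Double -> (Double -> Double) -> Double -> Double
    numDeriv h f x = (f (x + h) - f (x - h)) / (2 * h)

    f :: Num a => a -> a
    f x = x * x * x + 2 * x              -- f'(x) = 3 x^2 + 2

    main :: IO ()
    main = do
      print (deriv f 2.0)                -- 14.0, exact
      print (numDeriv 1e-5 f 2.0)        -- approximately 14.0, with small error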
Using the DiffSharp library, differentiation (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) is applied using higher-order functions, that is, functions that take other functions as arguments. Your functions can use the full expressive capability of the language, including control flow. DiffSharp allows differentiation to be composed by nesting forward and reverse AD to any level, which means you can compute exact higher-order derivatives or differentiate functions that internally make use of differentiation. Please see the API Overview page for a list of available operations.
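The idea of nesting differentiation operators can be sketched in Haskell with the ad package described later on this page (this is an illustration of the concept, not DiffSharp's F# API): higher-order derivatives are obtained simply by passing one differentiation operator to another.

    import Numeric.AD (diff, hessian)

    g :: Floating a => a -> a
    g x = sin (x * x)

    main :: IO ()
    main = do
      print (diff g 1.0)                                -- first derivative of g at 1
      print (diff (diff g) 1.0)                         -- second derivative, by nesting diff
      print (hessian (\[x, y] -> x * x * y) [1.0, 2.0]) -- 2x2 Hessian of x^2 * y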
Hype
Hype is an experimental library for deep learning and hyperparameter optimization. We are using it to develop and test our ideas of nested AD and compositional machine learning.
Haskell ad package
The ad package for Haskell, by Edward Kmett with contributions by Barak A. Pearlmutter and Jeffrey Mark Siskind, provides an API for AD in Haskell and includes forward- and reverse-mode implementations. To access the code and see usage examples, visit the package's GitHub page. This package subsumes the functionality of the fad package, which provided only forward-mode AD.
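A brief usage sketch with the package's top-level Numeric.AD module (the values in the comments follow from the function definitions; see the package documentation for the full API):

    import Numeric.AD (diff, grad, jacobian)

    main :: IO ()
    main = do
      -- derivative of a scalar function at a point
      print (diff (\x -> x ^ 2 + sin x) 0.0)            -- 1.0
      -- gradient of a scalar-valued function of several variables
      print (grad (\[x, y, z] -> x * y + z) [1.0, 2.0, 3.0])   -- [2.0,1.0,1.0]
      -- Jacobian of a vector-valued function
      print (jacobian (\[x, y] -> [y, x * y]) [2.0, 1.0])      -- [[0.0,1.0],[1.0,2.0]]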
Farfel Fortran
An implementation of the Farfel Fortran extension, which adds AD block constructs to Fortran 77, is in development.
Demos and other software
You can access an audio demonstration of the Contextual Independent Component Analysis (cICA) blind source separation algorithm; the demonstration also includes a comparison with the output of the Bell-Sejnowski ICA algorithm.
Other software produced through the collaboration of Barak A. Pearlmutter and Jeffrey Mark Siskind can be found here.
Public repositories of the Brain and Computation Lab on GitHub can be found here.