DeepLearning.scala is a DSL for creating complex neural networks.
With the help of DeepLearning.scala, regular programmers are able to build complex neural networks from simple code. You write code almost as usual, the only difference being that code based on DeepLearning.scala is differentiable, which enables such code to evolve by modifying its parameters continuously.
Like Theano and other deep learning toolkits, DeepLearning.scala allows you to build neural networks from mathematical formulas. It supports floats, doubles, GPU-accelerated N-dimensional arrays, and calculates derivatives of the weights in the formulas.
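For intuition, here is a plain-Scala sketch of what "differentiable code" means, using dual numbers (forward-mode automatic differentiation). It is a conceptual illustration only, not DeepLearning.scala's actual API:

```scala
// Conceptual sketch of differentiable code via forward-mode automatic
// differentiation. This is NOT DeepLearning.scala's API; it only
// illustrates how a formula can carry its own derivative.
final case class Dual(value: Double, derivative: Double) {
  def +(that: Dual): Dual =
    Dual(value + that.value, derivative + that.derivative)
  def *(that: Dual): Dual =
    Dual(value * that.value, derivative * that.value + value * that.derivative)
}

object FormulaExample extends App {
  // f(w) = w * w + w, evaluated at w = 3 with seed derivative dw/dw = 1
  val w = Dual(3.0, 1.0)
  val f = w * w + w
  println(f.value)      // 12.0
  println(f.derivative) // f'(w) = 2 * w + 1 = 7.0
}
```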
Neural networks created by DeepLearning.scala support ADT data structures (e.g. `HList` and `Coproduct`), and calculate derivatives for these data structures.
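If you have not used shapeless before, `HList` and `Coproduct` are statically typed product and sum types. The snippet below shows plain shapeless usage; DeepLearning.scala adds its derivative support on top of such structures:

```scala
import shapeless._

object AdtExample extends App {
  // An HList is a heterogeneous list: a product type that statically
  // tracks the type of every element.
  val record: Double :: Int :: HNil = 1.5 :: 42 :: HNil
  println(record.head) // 1.5

  // A Coproduct is a type-safe sum type: a value is exactly one of
  // the listed alternatives.
  type DoubleOrInt = Double :+: Int :+: CNil
  val d: DoubleOrInt = Coproduct[DoubleOrInt](2.0)
  println(d.select[Double]) // Some(2.0)
}
```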
Neural networks created by DeepLearning.scala may contain control flow constructs like `if`/`else` and `match`/`case`, just as in a regular language. Combined with ADT data structures, you can implement arbitrary algorithms inside neural networks, and still keep the variables within those algorithms differentiable and trainable.
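Again as a conceptual sketch rather than the library's API, the snippet below shows ordinary `if`/`else` and `match`/`case` steering a dual-number computation, so the branch taken at run time determines which derivative propagates:

```scala
// Conceptual sketch (not DeepLearning.scala's API) of ordinary control
// flow inside a differentiable computation, reusing the dual-number
// idea from the sketch above.
final case class Dual(value: Double, derivative: Double) {
  def +(that: Dual): Dual =
    Dual(value + that.value, derivative + that.derivative)
}

object ControlFlowExample extends App {
  // relu is an ordinary if/else; the branch taken at run time decides
  // whether the derivative flows through or is blocked.
  def relu(w: Dual): Dual = if (w.value > 0) w else Dual(0.0, 0.0)

  // An ordinary match/case recursion over a list of differentiable
  // values; the derivative is accumulated alongside the value.
  def sum(xs: List[Dual]): Dual = xs match {
    case Nil          => Dual(0.0, 0.0)
    case head :: tail => head + sum(tail)
  }

  println(relu(Dual(-2.0, 1.0)))                     // Dual(0.0,0.0)
  println(sum(List(Dual(1.0, 1.0), Dual(2.0, 0.0)))) // Dual(3.0,1.0)
}
```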
Neural networks created by DeepLearning.scala are composable: you can create large networks by combining smaller ones. If two larger networks share a sub-network, the sub-network's weights trained in one super-network will also affect the other super-network.
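This sharing behavior can be pictured with a simplified plain-Scala sketch (again not the library's API): two networks hold a reference to the same sub-network, so a weight update made while training one is immediately visible to the other:

```scala
// Conceptual sketch (not DeepLearning.scala's API) of weight sharing:
// two networks reference the same sub-network, so updating its weight
// through one network also changes the other's behavior.
class SharedLayer(var weight: Double) {
  def apply(x: Double): Double = weight * x

  // Gradient descent step for this layer: d(weight * x)/d(weight) = x.
  def update(x: Double, outputGradient: Double, learningRate: Double): Unit =
    weight -= learningRate * outputGradient * x
}

object SharingExample extends App {
  val shared = new SharedLayer(1.0)
  val networkA: Double => Double = x => shared(x) + 1.0
  val networkB: Double => Double = x => shared(x) * 2.0

  println(networkB(3.0)) // 6.0
  // A training step taken while optimizing networkA updates the
  // shared weight...
  shared.update(x = 3.0, outputGradient = 0.5, learningRate = 0.1)
  println(networkB(3.0)) // 5.1 -- ...and networkB's output changes too.
}
```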
All of the above features are statically type checked.
Version 1.0 is the current version with all of the above features. The final version will be released in March 2017.
- Support `for`/`while` and other higher-order functions on differentiable `Seq`s.
- Support `for`/`while` and other higher-order functions on GPU-accelerated differentiable N-dimensional arrays.
Version 2.0 will be released in Q2 2017.
- Support using custom `case class`es inside neural networks.
- Support distributed models and distributed training on Spark.
Version 3.0 will be released in late 2017.
DeepLearning.scala is heavily inspired by my colleague @MarisaKirisame. Originally, we worked together on a prototype of a deep learning framework, and eventually split our work into this project and DeepDarkFantasy.
@milessabin's shapeless provides a solid foundation for type-level programming as used in DeepLearning.scala.