
Releases: cslr/dinrhiw2

Version v0.90 release

22 Mar 15:26
b778271

Basic deep learning code now mostly works (residual neural network + leaky ReLU + Adam optimizer).
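For reference, a single Adam update step looks like the following. This is a minimal sketch of the standard Adam algorithm, not the dinrhiw2 implementation; the hyperparameter defaults (lr, b1, b2, eps) are the usual textbook values, assumed here.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal sketch of one Adam update step (standard algorithm, not the
// dinrhiw2 code): keeps running first/second moment estimates per weight.
struct Adam {
    double lr = 1e-3, b1 = 0.9, b2 = 0.999, eps = 1e-8;
    std::vector<double> m, v; // first and second moment estimates
    long t = 0;               // step counter for bias correction

    void step(std::vector<double> &w, const std::vector<double> &g) {
        if (m.empty()) { m.assign(w.size(), 0.0); v.assign(w.size(), 0.0); }
        ++t;
        for (std::size_t i = 0; i < w.size(); ++i) {
            m[i] = b1 * m[i] + (1.0 - b1) * g[i];
            v[i] = b2 * v[i] + (1.0 - b2) * g[i] * g[i];
            const double mhat = m[i] / (1.0 - std::pow(b1, t)); // bias-corrected
            const double vhat = v[i] / (1.0 - std::pow(b2, t));
            w[i] -= lr * mhat / (std::sqrt(vhat) + eps);
        }
    }
};
```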

Version 0.80 release

27 Feb 21:01


Residual neural network code now mostly works and allows the use of a 40-layer neural network, although results are not very good yet. Also includes small bugfixes here and there.

Residual neural network code implemented now

19 Feb 09:39

This is an important update from the private repo. It enables residual neural networks (enabled in nntool). The code automatically adds skip connections forward from even-numbered layers whenever both layers have the same number of neurons. It therefore makes mini-skips over two layers, from the input layer all the way to the output layer, implementing a residual neural network architecture.
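The skip-connection scheme described above can be sketched as follows. This is a minimal illustration, not the dinrhiw2 API: the activation is saved at every even-numbered layer and added back two layers later when the neuron counts match; a leaky ReLU with an assumed slope of 0.01 is used as the non-linearity, and bias terms are omitted for brevity.

```cpp
#include <cstddef>
#include <vector>

// Leaky rectifier non-linearity (slope 0.01 assumed for illustration).
static std::vector<double> leaky_relu(std::vector<double> v) {
    for (auto &x : v) if (x < 0.0) x *= 0.01;
    return v;
}

// Forward pass sketch: weights[l] is the l:th dense layer's weight matrix.
// The input of every even-numbered layer is remembered and added back after
// the next (odd-numbered) layer when both layers have the same width.
std::vector<double> forward_residual(
    const std::vector<std::vector<std::vector<double>>> &weights,
    std::vector<double> x)
{
    std::vector<double> skip; // activation saved at the last even layer
    for (std::size_t l = 0; l < weights.size(); ++l) {
        if (l % 2 == 0) skip = x; // remember input of every second layer
        // dense layer: y = W*x, then leaky ReLU
        std::vector<double> y(weights[l].size(), 0.0);
        for (std::size_t i = 0; i < y.size(); ++i)
            for (std::size_t j = 0; j < x.size(); ++j)
                y[i] += weights[l][i][j] * x[j];
        y = leaky_relu(y);
        // skip connection over two layers, only when dimensions match
        if (l % 2 == 1 && y.size() == skip.size())
            for (std::size_t i = 0; i < y.size(); ++i) y[i] += skip[i];
        x = y;
    }
    return x;
}
```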

Deep learning: for a simple test problem (test_data_residual.sh) the neural network can learn the problem with 40 layers in 10 minutes (dense residual neural network with leaky rectifier non-linearity); a 20-layer residual neural network gives perfect results.

dinrhiw2-private-repo-sync-gcc9

17 Feb 09:36
Pre-release

Long-awaited dinrhiw2-private repo sync that fixes bugs and makes TSNE work properly with a small number of data points.

The code requires a GCC 9.* compiler, as GCC 8.*'s std::random_device has bugs which break the class RNG random number generation code.