BetaML v0.11.0
Attention: many breaking changes in this version!!
- Experimental new `ConvLayer` and `PoolLayer` for convolutional networks (a hedged usage sketch follows this list). BetaML neural networks work only on CPU, and even on CPU the convolutional layers (but not the dense ones) are 2-3 times slower than Flux. Still, they have some quite unique characteristics, such as working in any number of dimensions and not requiring AD in most cases, so they may still be useful in some corner situations. And if you want to help in porting them to GPU... ;-)
- Isolated the MLJ interface models into their own `Bmlj` submodule (see the second sketch after this list)
- Renamed many models in a congruent way
- Shortened the hyper-parameters and learnable parameters struct names
- Corrected many doc bugs
- Several bugfixes
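
Below is a minimal sketch of a small convolutional network built with the new layers. It assumes the constructor forms used in the BetaML tutorials (input size, kernel size, output channels, and a `size(layer)` helper returning a layer's input/output dimensions); the exact signatures and defaults may differ, so check the `ConvLayer` and `PoolLayer` docstrings.

```julia
using BetaML

# Toy data: 60 grey-scale 16x16 "images" flattened one per row, and a scalar target
X = rand(60, 256)
y = rand(60)

# Chain the layer dimensions with size(layer), which in BetaML returns the
# (input, output) sizes of a layer (assumed, as in the package tutorials)
l1 = ReshaperLayer((256, 1), (16, 16, 1))          # flat row -> 16x16 image with 1 channel
l2 = ConvLayer(size(l1)[2], (3, 3), 4, f = relu)   # 3x3 kernels, 4 output channels (assumed constructor form)
l3 = PoolLayer(size(l2)[2], (2, 2))                # 2x2 pooling with the default pooling function
l4 = ReshaperLayer(size(l3)[2])                    # flatten back to a column vector
l5 = DenseLayer(prod(size(l4)[2]), 1)              # final dense layer with a scalar output

m  = NeuralNetworkEstimator(layers = [l1, l2, l3, l4, l5], epochs = 10, batch_size = 8)
ŷ  = fit!(m, X, y)      # fit! returns the in-sample predictions
ŷ2 = predict(m, X)      # predictions on (possibly new) data
```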
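And a hedged sketch of accessing one of the MLJ-interfaced models after the move to the `Bmlj` submodule. The specific model chosen (`DecisionTreeRegressor`) is only an illustration; loading through MLJ's `@load` should be unaffected by the reorganisation.

```julia
using MLJ

# Load a BetaML wrapper through MLJ. After this release the wrapper types live
# in the BetaML.Bmlj submodule, but the MLJ loading syntax is unchanged
DTRegressor = @load DecisionTreeRegressor pkg=BetaML verbosity=0

X = (a = rand(50), b = rand(50))
y = rand(50)

mach = machine(DTRegressor(), X, y)
fit!(mach)
ŷ = predict(mach, X)

# Equivalently, the wrapper can be instantiated directly from the new submodule:
# m = BetaML.Bmlj.DecisionTreeRegressor()
```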