Releases: sylvaticus/BetaML.jl
v0.8.0
BetaML v0.8.0
- support for all models of the new "V2" API that implements a "standard" `mod = Model([Options])`, `fit!(mod,X,[Y])`, `predict(mod,[X])` workflow (details here). The classic API is now deprecated: some of its functions will be removed in the next BetaML 0.9 versions and some will be unexported.
- standardised function names to follow the Julia style guidelines and the new [BetaML code style guidelines](https://sylvaticus.github.io/BetaML.jl/dev/StyleGuide_templates.html)
- new hyper-parameter autotuning method:

```julia
mod = ModelXX(autotune=true) # control autotune with the parameter `tunemethod`
fit!(mod,x,[y])              # autotune happens here, together with the final fitting
est = predict(mod,xnew)
```

Autotune is hyperthreaded with model-specific defaults. For example, for Random Forests the defaults are:

```julia
tunemethod = SuccessiveHalvingSearch(
    hpranges = Dict(
        "n_trees"      => [10, 20, 30, 40],
        "max_depth"    => [5, 10, nothing],
        "min_gain"     => [0.0, 0.1, 0.5],
        "min_records"  => [2, 3, 5],
        "max_features" => [nothing, 5, 10, 30],
        "beta"         => [0, 0.01, 0.1]),
    loss         = l2loss_by_cv, # works for both regression and classification
    res_shares   = [0.08, 0.1, 0.13, 0.15, 0.2, 0.3, 0.4],
    multithreads = false)        # RF are already multi-threaded
```

The number of evaluated models is reduced at each iteration in order to arrive at a single, winning model.
Only supervised model autotuning is currently implemented, but GMM-based clustering autotuning is planned using `BIC` or `AIC`.
- new functions `model_load` and `model_save` to load/save trained models from the filesystem
- new `MinMaxScaler` (`StandardScaler` was already available as the classical API functions `scale` and `getScalingFactors`)
- many bugfixes/improvements in corner situations
- new MLJ interface models to `NeuralNetworkEstimator`
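Putting the points above together, a minimal sketch of the V2 workflow with a saved/reloaded model might look as follows. This is illustrative only: the `RandomForestEstimator` model name, its `n_trees` option, and the exact `model_save`/`model_load` signatures are assumptions here; check the package documentation for the actual API.

```julia
using BetaML

# toy regression data
x = rand(100, 3)
y = 2 .* x[:,1] .+ x[:,2]

# V2 API: construct the model, then fit it in place
mod = RandomForestEstimator(n_trees=30)   # model name/option assumed
fit!(mod, x, y)
ŷ   = predict(mod, x)

# hypothetical save/load round-trip (signatures assumed)
model_save("mymodel.jld2"; mod)
mod2 = model_load("mymodel.jld2", "mod")
ŷ2   = predict(mod2, x)                   # should match ŷ
```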
Closed issues:
- Improve oneHotEncode stability for encoding integers embedding categories (#29)
- initVarainces! doesn't support mixed-type variances (#33)
- Error generating MLJ model registry (#37)
- WARNING: could not import Perceptron ... (#38)
- MLJ model `BetaMLGMMRegressor` predicting row vectors instead of column vectors (#40)
v0.7.1
BetaML v0.7.1
- solved issue #37
- initial attempt to provide plotting of a decision tree
Merged pull requests:
- 1st attempt to implement `AbstractTrees`-interface (#34) (@roland-KA)
- CompatHelper: add new compat entry for AbstractTrees at version 0.4, (keep existing compat) (#35) (@github-actions[bot])
- AbstractTrees-interface completed (#36) (@roland-KA)
v0.7.0
BetaML v0.7.0
- new experimental V2 API that implements a "standard" `mod = Model([Options])`, `train!(mod,X,[Y])`, `predict(mod,[X])` workflow. In BetaML v0.7 this new API is still experimental, as documentation and implementation are not complete (perceptrons and neural networks are still missing). We plan to make it the default API in BetaML 0.8, when the current API will be deemed deprecated.
- new `Imputation` module with several missing-value imputers: `MeanImputer`, `GMMImputer`, `RFImputer`, `GeneralImputer`, and relative MLJ interfaces. The last one, in particular, allows using any regressor/classifier (not necessarily of BetaML) for which the API described above is valid
- `Cluster` module reorganised with only hard clustering algorithms (K-Means and K-Medoids), while GMM clustering and the new `GMMRegressor1` and `GMMRegressor2` are in the new `GMM` module
- split large files into subfiles, like `Trees.jl`, where DT and RF are now in separate (included) files
- new `oneHotDecoder(x)` function in the `Utils` module
- new dependency on `DocStringExtensions.jl`
- several bugfixes
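As an illustration of the new `Imputation` module, here is a minimal sketch using `RFImputer` under the experimental v0.7 API. The `n_trees` keyword and the exact `train!`/`predict` usage for imputers are assumptions, not confirmed signatures; consult the module documentation.

```julia
using BetaML

# toy data with a missing entry
x = [1.0 10.5;
     1.5 missing;
     1.8 8.0;
     1.7 15.0;
     3.2 40.0]

mod = RFImputer(n_trees=30)   # random-forest based imputer (option assumed)
train!(mod, x)                # learn from the observed entries
x_full = predict(mod)         # matrix with the missing values imputed
```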
v0.6.1
BetaML v0.6.1
bugfix in Kernel Perceptron (binary and multi-class) when a single class is present in training (issue #32)
v0.6.0
BetaML v0.6.0
- bugfixes in the MLJ interface, GMM clustering and others
- API change for `print(confusionMatrix)` only
Merged pull requests:
- Indent `pegasos` a bit more in the docstring (#30) (@rikhuijzer)
v0.5.6
BetaML v0.5.6
- bugfixes in MLJ interface, documentation build and a rare case of segfault on Julia 1.5
v0.5.5
BetaML v0.5.5
- Added an optional "learnable" parameter to the activation function of VectorFunctionLayer
- Added similar ScalarFunctionLayer (useful for multiclass, multi-label classification, see the test added to Nn_test.jl in the previous commit)
v0.5.4
BetaML v0.5.4
Bugfix on `pca()` that was reporting the reprojected matrix (and the reprojection vectors) in the opposite order than announced (from the most explained variance to the least)
v0.5.3
BetaML v0.5.3
Bugfix on `findfirst()` that was making ambiguous some Base calls that use functions. The BetaML version is now restricted to arrays of `AbstractString`s and `Number`s.
Closed issues:
- Tag a new release to enable use with Distributions 0.25 (#25)
v0.5.2
BetaML v0.5.2
Started development on STATS (sub-)module ("classical" statistics)
Updated to Distributions 0.25
Merged pull requests:
- CompatHelper: bump compat for "Distributions" to "0.25" (#24) (@github-actions[bot])