Releases: sylvaticus/BetaML.jl

v0.8.0

02 Oct 10:45

BetaML v0.8.0

Diff since v0.7.1

  • support for all models of the new "V2" API, which implements a "standard" mod = Model([Options]), fit!(mod,X,[Y]), predict(mod,[X]) workflow (details here). The classic API is now deprecated: some of its functions will be removed in the next BetaML 0.9 versions and some will be unexported. A usage sketch combining the new workflow, autotuning and model saving follows this list.
  • standardised function names to follow the Julia style guidelines and the new BetaML code style guidelines (https://sylvaticus.github.io/BetaML.jl/dev/StyleGuide_templates.html)
  • new hyper-parameter autotuning method:
    mod = ModelXX(autotune=true)  # --> control autotune with the parameter `tunemethod`
    fit!(mod,x,[y])               # --> autotune happens here together with final tuning
    est = predict(mod,xnew)
    
    Autotuning can run multithreaded (see the multithreads option below) and comes with model-specific defaults. For example, for Random Forests the defaults are:
    tunemethod = SuccessiveHalvingSearch(
        hpranges     = Dict("n_trees"      => [10, 20, 30, 40],
                            "max_depth"    => [5, 10, nothing],
                            "min_gain"     => [0.0, 0.1, 0.5],
                            "min_records"  => [2, 3, 5],
                            "max_features" => [nothing, 5, 10, 30],
                            "beta"         => [0, 0.01, 0.1]),
        loss         = l2loss_by_cv, # works for both regression and classification
        res_shares   = [0.08, 0.1, 0.13, 0.15, 0.2, 0.3, 0.4], # data shares per round
        multithreads = false)        # random forests are already multithreaded
    
    At each round the number of candidate models is reduced while the share of data used grows, until a single model remains.
    Only supervised model autotuning is currently implemented; autotuning of GMM-based clustering, using BIC or AIC, is planned.
  • new functions model_save and model_load to save/load trained models to/from the filesystem
  • new MinMaxScaler (a StandardScaler was already available via the classical API functions scale and getScalingFactors)
  • many bugfixes and improvements in corner cases
  • new MLJ interface models for NeuralNetworkEstimator
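
A combined usage sketch of the new workflow, autotuning and model saving (RandomForestEstimator is used as an example model; the exact model_save/model_load signatures shown are an assumption, check the API documentation):

    using BetaML

    x = rand(100, 3)
    y = 2 .* x[:,1] .+ x[:,2] .+ rand(100)

    mod = RandomForestEstimator(autotune=true) # tunemethod left to its RF default
    fit!(mod, x, y)                            # autotuning and final training happen here
    ŷ   = predict(mod, x)

    # Save and reload the trained model (signatures assumed):
    model_save("mymodels.jld2"; mod)
    mod2 = model_load("mymodels.jld2", "mod")
    ŷ2   = predict(mod2, x)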

Closed issues:

  • Improve oneHotEncode stability for encoding integers embedding categories (#29)
  • initVarainces! doesn't support mixed-type variances (#33)
  • Error generating MLJ model registry (#37)
  • WARNING: could not import Perceptron ... (#38)
  • MLJ model BetaMLGMMRegressor predicting row vectors instead of column vectors (#40)

v0.7.1

11 Aug 14:28

BetaML v0.7.1

Diff since v0.7.0

  • solved issue #37
  • initial attempt to provide plotting of decision trees; see the AbstractTrees sketch after the pull-request list below

Merged pull requests:

  • 1st attempt to implement AbstractTrees-interface (#34) (@roland-KA)
  • CompatHelper: add new compat entry for AbstractTrees at version 0.4, (keep existing compat) (#35) (@github-actions[bot])
  • AbstractTrees-interface completed (#36) (@roland-KA)
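
For context, implementing the AbstractTrees.jl interface essentially amounts to defining children and printnode for the tree's node types, after which generic functions such as print_tree work out of the box. A self-contained sketch with a toy tree type (illustrative, not BetaML's actual node types):

    using AbstractTrees

    # Toy stand-ins for BetaML's decision and leaf nodes:
    struct ToyNode
        question::String
        left
        right
    end
    struct ToyLeaf
        prediction::Float64
    end

    AbstractTrees.children(n::ToyNode) = (n.left, n.right)
    AbstractTrees.children(::ToyLeaf)  = ()
    AbstractTrees.printnode(io::IO, n::ToyNode) = print(io, n.question)
    AbstractTrees.printnode(io::IO, l::ToyLeaf) = print(io, "ŷ = ", l.prediction)

    tree = ToyNode("x₁ >= 0.5 ?", ToyLeaf(1.0), ToyLeaf(2.0))
    print_tree(tree)   # textual rendering of the tree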

v0.7.0

02 Aug 14:18

BetaML v0.7.0

Diff since v0.6.1

  • new experimental V2 API that implements a "standard" mod = Model([Options]), train!(mod,X,[Y]), predict(mod,[X]) workflow. In BetaML v0.7 this new API is still experimental, as documentation and implementation are not complete (perceptrons and neural networks are still missing). We plan to make it the default API in BetaML 0.8, when the current API will be deemed deprecated.
  • new Imputation module with several missing-value imputers (MeanImputer, GMMImputer, RFImputer, GeneralImputer) and their MLJ interfaces; a minimal sketch follows this list. The last one, in particular, allows using any regressor/classifier (not necessarily from BetaML) for which the API described above is valid
  • Cluster module reorganised to keep only hard-clustering algorithms (K-Means and K-medoids), while GMM clustering and the new GMMRegressor1 and GMMRegressor2 moved to the new GMM module
  • Split large files into subfiles, e.g. Trees.jl, where DT and RF are now in separate (included) files
  • New oneHotDecoder(x) function in Utils module
  • New dependency on DocStringExtensions.jl
  • Several bugfixes
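
A minimal sketch of the new imputers under the experimental workflow described above (calling predict without new data to retrieve the imputed matrix is an assumption based on the predict(mod,[X]) signature):

    using BetaML.Imputation

    x = [1.0 10.5; 1.5 missing; 1.8 8.0; 1.7 15.0; 3.2 40.0]

    mod = MeanImputer()    # or GMMImputer(), RFImputer(), GeneralImputer(...)
    train!(mod, x)         # "learn" the imputation model
    x̂   = predict(mod)     # the matrix with the missing values imputed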

v0.6.1

10 Jun 13:56

BetaML v0.6.1

Diff since v0.6.0

bugfix in the Kernel Perceptron (binary and multi-class) when only a single class is present in the training data (issue #32)

v0.6.0

01 Jun 10:51

BetaML v0.6.0

Diff since v0.5.6

  • bugfixes in the MLJ interface, GMM clustering and others
  • API change for print(confusionMatrix) only

v0.5.6

08 Nov 16:41

BetaML v0.5.6

Diff since v0.5.5

  • bugfixes in the MLJ interface, in the documentation build, and in a rare case of segfault on Julia 1.5

Closed issues:

  • MLJ traits for GMMClusterer (#26)
  • The input scitypes for trees are incorrect (#28)

v0.5.5

15 Sep 13:11

BetaML v0.5.5

Diff since v0.5.4

  • Added an optional "learnable" parameter to the activation function of VectorFunctionLayer
  • Added a similar ScalarFunctionLayer (useful for multiclass, multi-label classification; see the test added to Nn_test.jl in the previous commit). A conceptual sketch of the two kinds of activation follows below.
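
Conceptually, a VectorFunctionLayer applies a function to the whole vector of its inputs (e.g. softmax, where each output depends on all inputs), while a ScalarFunctionLayer applies a function elementwise (e.g. one sigmoid per output, as needed for multi-label classification). A plain-Julia sketch of the two transformations (not the BetaML layer constructors):

    softmax(x) = exp.(x) ./ sum(exp.(x))   # vector activation: joint over all inputs
    sigmoid(x) = 1 / (1 + exp(-x))         # scalar activation: one element at a time

    z = [0.2, 1.5, -0.3]
    softmax(z)    # sums to 1: single-label class probabilities
    sigmoid.(z)   # each in (0,1): independent per-label probabilities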

v0.5.4

02 Sep 12:57

BetaML v0.5.4

Diff since v0.5.3

Bugfix on pca(), which was reporting the reprojected matrix (and the reprojection vectors) in the opposite order to the one announced (from the dimension with the most explained variance to the one with the least).
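
To illustrate the ordering convention, an eigendecomposition with the eigenvalues sorted in decreasing order (illustrative only, not BetaML's pca() implementation):

    using LinearAlgebra, Statistics

    X  = randn(100, 3) .* [3.0 2.0 1.0]             # columns with decreasing variance
    ev = eigen(Symmetric(cov(X)), sortby = λ -> -λ) # largest eigenvalue first
    explVar = ev.values ./ sum(ev.values)           # most explained variance first
    P  = ev.vectors                                 # reprojection vectors, same order
    X̂  = X * P                                      # reprojected matrix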

v0.5.3

20 Aug 09:10

BetaML v0.5.3

Diff since v0.5.2

Bugfix on findfirst(), whose BetaML method was making some Base calls that use functions ambiguous. The BetaML version is now restricted to arrays of AbstractStrings and Numbers.
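
A sketch of the kind of restriction involved (illustrative, not BetaML's actual code):

    import Base: findfirst

    # Restricting the first argument to strings and numbers leaves Base's
    # findfirst(predicate::Function, A) methods free of ambiguity:
    findfirst(el::Union{AbstractString,Number}, cont::AbstractArray) =
        findfirst(x -> isequal(x, el), cont)

    findfirst(2, [1, 2, 3])       # returns 2, via the restricted method
    findfirst(isodd, [2, 3, 4])   # returns 2, Base's function method still works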

Closed issues:

  • Tag a new release to enable use with Distributions 0.25 (#25)

v0.5.2

17 Aug 09:22

BetaML v0.5.2

Diff since v0.5.1

  • Started development of the STATS (sub-)module ("classical" statistics)
  • Updated to Distributions 0.25

Merged pull requests:

  • CompatHelper: bump compat for "Distributions" to "0.25" (#24) (@github-actions[bot])