
online_boosting

A suite of boosting algorithms and weak learners for the online learning setting.

Ensemblers

Implementations for the following online boosting algorithms are provided:

  1. Online AdaBoost (OzaBoost), from Oza & Russell.
  2. Online GradientBoost (OGBoost), from Leistner et al.
  3. Online SmoothBoost (OSBoost), from Chen et al.
  4. OSBoost with Expert Advice (EXPBoost), again from Chen et al.
  5. OSBoost with Online Convex Programming (OCPBoost), again from Chen et al.

The corresponding Python modules can be found in the `ensemblers` folder, named as above.
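To illustrate the idea behind the first of these, the OzaBoost update from Oza & Russell can be sketched roughly as follows: each weak learner trains on every incoming example k ~ Poisson(λ) times, and λ is scaled up or down depending on whether that learner gets the example right. This is a minimal self-contained sketch of the algorithm's core loop, not this package's actual API; the class names and the tiny perceptron weak learner are invented for the example.

```python
import math
import random


class OnlinePerceptron:
    """Toy online weak learner for the sketch; labels in {-1, +1}."""

    def __init__(self, dim):
        self.w = [0.0] * dim

    def predict(self, x):
        s = sum(wi * xi for wi, xi in zip(self.w, x))
        return 1 if s >= 0 else -1

    def partial_fit(self, x, y):
        if self.predict(x) != y:
            self.w = [wi + y * xi for wi, xi in zip(self.w, x)]


class OzaBoostSketch:
    """Rough sketch of Oza & Russell's online AdaBoost (OzaBoost)."""

    def __init__(self, make_learner, n_learners, seed=0):
        self.learners = [make_learner() for _ in range(n_learners)]
        self.sc = [1e-10] * n_learners  # lambda^sc: weight of correctly classified examples
        self.sw = [1e-10] * n_learners  # lambda^sw: weight of misclassified examples
        self.rng = random.Random(seed)

    def _poisson(self, lam):
        # Knuth's algorithm for sampling k ~ Poisson(lam)
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= L:
                return k
            k += 1

    def partial_fit(self, x, y):
        lam = 1.0
        for i, h in enumerate(self.learners):
            lam = min(lam, 50.0)  # practical cap (not in the original algorithm)
            for _ in range(self._poisson(lam)):
                h.partial_fit(x, y)
            if h.predict(x) == y:
                self.sc[i] += lam
                lam *= (self.sc[i] + self.sw[i]) / (2.0 * self.sc[i])
            else:
                self.sw[i] += lam
                lam *= (self.sc[i] + self.sw[i]) / (2.0 * self.sw[i])

    def predict(self, x):
        vote = 0.0
        for i, h in enumerate(self.learners):
            eps = self.sw[i] / (self.sc[i] + self.sw[i])
            if eps < 0.5:  # only learners better than chance get a vote
                vote += math.log((1.0 - eps) / eps) * h.predict(x)
        return 1 if vote >= 0 else -1
```

The λ updates mirror AdaBoost's reweighting: misclassified examples get heavier (larger λ) for downstream learners, and the final prediction is a weighted majority vote by each learner's empirical error.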

Weak Learners

The package also includes implementations of a number of online weak learners, all of which can be plugged into the above online boosting algorithms. Some of the key weak learners include:

  1. Perceptrons.
  2. Naive Bayes (Gaussian & Binary).
  3. Random Decision Stumps.
  4. Incremental Decision Trees, based on the DTree module.
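A weak learner in this setting just needs to support incremental updates and prediction. As a hypothetical sketch (the method names and class are illustrative, not necessarily this package's exact interface), an online Gaussian naive Bayes learner can maintain per-class running means and variances with Welford's algorithm:

```python
import math
from collections import defaultdict


class OnlineGaussianNB:
    """Sketch of an online Gaussian naive Bayes weak learner.

    Keeps per-class running means and variances (Welford's algorithm),
    so each example is processed once and then discarded.
    """

    def __init__(self, dim):
        self.dim = dim
        self.n = defaultdict(int)                    # examples seen per class
        self.mean = defaultdict(lambda: [0.0] * dim)
        self.m2 = defaultdict(lambda: [0.0] * dim)   # sum of squared deviations

    def partial_fit(self, x, y):
        self.n[y] += 1
        mean, m2 = self.mean[y], self.m2[y]
        for j in range(self.dim):
            delta = x[j] - mean[j]
            mean[j] += delta / self.n[y]
            m2[j] += delta * (x[j] - mean[j])

    def _log_joint(self, x, y):
        # log prior (unnormalized) + sum of per-feature Gaussian log-likelihoods
        ll = math.log(self.n[y])
        for j in range(self.dim):
            var = self.m2[y][j] / self.n[y] + 1e-9  # smoothed variance
            ll -= 0.5 * (math.log(2 * math.pi * var)
                         + (x[j] - self.mean[y][j]) ** 2 / var)
        return ll

    def predict(self, x):
        return max(self.n, key=lambda y: self._log_joint(x, y))
```

Because the sufficient statistics are updated in O(dim) per example, such a learner fits the one-pass constraint of the online boosting loop.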

Dependencies

The ensemblers and weak learners generally depend on NumPy and SciPy. Some of the weak learners (in particular, those prefixed with "sk") also require scikit-learn. File I/O is done through YAML using the PyYAML package.

A full list of dependencies is available in `requirements.txt`.

Usage

Run:

$ python main.py [-h] [--record]
               dataset ensembler weak_learner # weak_learners [trials]

This computes the test error for a given combination of ensembler and weak learner.

positional arguments:

  • dataset: dataset filename (e.g., australian, heart)
  • ensembler: chosen ensembler (e.g., OGBooster, OSBooster, EXPBooster, OCPBooster)
  • weak_learner: chosen weak learner (e.g., Perceptron, DecisionTree)
  • # weak_learners: number of weak learners (e.g., 10, 1000)
  • trials: number of trials (each with a different shuffling of the data); defaults to 1

optional arguments:

  • -h, --help: show this help message and exit
  • --record: export the results in YAML format

Datasets

Example datasets in LIBSVM format from various sources can be found here. Place the data files in a new folder called `data` at the repository root.
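For reference, LIBSVM files store one example per line as a label followed by sparse `index:value` pairs with 1-based indices. A minimal parser for this format might look like the following; it is purely illustrative (the function name is invented, and it produces dense vectors), not how this package necessarily reads its data:

```python
def parse_libsvm(lines, dim):
    """Parse LIBSVM-format lines into dense feature vectors and labels.

    Each line looks like: "<label> <idx>:<val> <idx>:<val> ...".
    Indices in the file are 1-based; the output vectors are 0-indexed.
    """
    X, y = [], []
    for line in lines:
        line = line.split('#', 1)[0].strip()  # drop trailing comments
        if not line:
            continue
        parts = line.split()
        y.append(float(parts[0]))
        x = [0.0] * dim
        for pair in parts[1:]:
            idx, val = pair.split(':')
            x[int(idx) - 1] = float(val)
        X.append(x)
    return X, y
```

In practice, scikit-learn's `sklearn.datasets.load_svmlight_file` handles this format (including sparse output) and is the more robust choice.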

License

MIT © Charles Marsh
