Learning Efficient Convolutional Networks through Network Slimming, In ICCV 2017.
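Network slimming trains with an L1 penalty on the batch-normalization scaling factors so that channels with small factors can be pruned afterwards. A minimal PyTorch sketch of that subgradient update (the function name and `sparsity_lambda` are illustrative, not the paper's reference code):

```python
import torch
import torch.nn as nn

def add_bn_l1_subgradient(model: nn.Module, sparsity_lambda: float = 1e-4):
    """Call after loss.backward(): add the L1 subgradient on every BatchNorm
    scaling factor (gamma), pushing unimportant channels toward zero."""
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d) and m.weight.grad is not None:
            m.weight.grad.add_(sparsity_lambda * torch.sign(m.weight.detach()))
```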
Functional models and algorithms for sparse signal processing
L1-regularized least squares with PyTorch
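A minimal sketch of solving an L1-regularized least-squares (lasso) problem in PyTorch with proximal gradient descent (ISTA); the function name and parameters are illustrative, not this repository's API:

```python
import torch

def lasso_ista(A, y, lam=0.1, step=None, n_iter=500):
    """Solve min_x 0.5*||A x - y||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    if step is None:
        # step = 1 / Lipschitz constant of the gradient, i.e. 1 / ||A||_2^2
        step = 1.0 / torch.linalg.matrix_norm(A, ord=2) ** 2
    x = torch.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)      # gradient of the smooth least-squares term
        z = x - step * grad           # gradient step
        x = torch.sign(z) * torch.clamp(z.abs() - step * lam, min=0.0)  # soft-threshold
    return x

# tiny usage example on random data
A = torch.randn(50, 20)
x_true = torch.zeros(20)
x_true[:3] = torch.tensor([1.5, -2.0, 0.7])
y = A @ x_true + 0.01 * torch.randn(50)
x_hat = lasso_ista(A, y, lam=0.5)
```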
Logistic regression in machine learning, with both theory and Python code. Topics include assumptions, multi-class classification, regularization (L1 and L2), weight of evidence, and information value
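For the regularization part, a short scikit-learn sketch contrasting L1 and L2 penalties in logistic regression (the dataset and hyperparameters are only illustrative):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# L1 penalty (needs a solver that supports it, e.g. liblinear or saga)
l1_clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X_tr, y_tr)
# L2 penalty (the default)
l2_clf = LogisticRegression(penalty="l2", C=1.0, max_iter=5000).fit(X_tr, y_tr)

print("L1 test accuracy:", l1_clf.score(X_te, y_te))
print("L2 test accuracy:", l2_clf.score(X_te, y_te))
print("zero coefficients under L1:", (l1_clf.coef_ == 0).sum())
```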
Given information about a network connection, the model predicts whether the connection contains an intrusion. Binary classification into good and bad connections is extended to multi-class classification; the most prominent part is the feature importance analysis.
Overparameterization and overfitting are common concerns when designing and training deep neural networks. Network pruning is an effective strategy for reducing or limiting network complexity, but it often suffers from time- and compute-intensive procedures to identify the most important connections and the best-performing hyperparameters. We s…
An Image Reconstructor that applies fast proximal gradient method (FISTA) to the wavelet transform of an image using L1 and Total Variation (TV) regularizations
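A generic FISTA sketch for the plain L1 part of such a problem in NumPy (illustrative names; the repository's wavelet and TV terms are not reproduced here):

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista_l1(A, y, lam=0.1, n_iter=200):
    """FISTA for min_x 0.5*||A x - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = soft_threshold(z - (A.T @ (A @ z - y)) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum/extrapolation step
        x, t = x_new, t_new
    return x
```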
High Dimensional Portfolio Selection with Cardinality Constraints
MNIST Digit Prediction using Batch Normalization, Group Normalization, Layer Normalization, and L1/L2 regularization
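Combined L1 and L2 penalties are typically added on top of the task loss; a minimal PyTorch sketch with hypothetical names, not this repository's code:

```python
import torch.nn as nn

def l1_l2_penalty(model: nn.Module, l1=1e-5, l2=1e-4):
    """Elastic-net style penalty summed over all model parameters."""
    l1_term = sum(p.abs().sum() for p in model.parameters())
    l2_term = sum((p ** 2).sum() for p in model.parameters())
    return l1 * l1_term + l2 * l2_term

# inside a training step (model, criterion, inputs, targets assumed to exist):
# loss = criterion(model(inputs), targets) + l1_l2_penalty(model)
```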
MITx - MicroMasters Program on Statistics and Data Science - Data Analysis: Statistical Modeling and Computation in Applications - Second Project
A wrapper for L1 trend filtering via primal-dual algorithm by Kwangmoo Koh, Seung-Jean Kim, and Stephen Boyd
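The underlying problem is minimize 0.5*||y - x||^2 + lambda*||D x||_1 with D the second-difference operator; a small cvxpy sketch of that formulation (not the wrapped primal-dual solver):

```python
import numpy as np
import cvxpy as cp

def l1_trend_filter(y, lam=10.0):
    """L1 trend filtering: minimize 0.5*||y - x||^2 + lam*||D x||_1,
    where D is the (n-2) x n second-difference operator."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)   # second-difference matrix
    x = cp.Variable(n)
    objective = cp.Minimize(0.5 * cp.sum_squares(y - x) + lam * cp.norm1(D @ x))
    cp.Problem(objective).solve()
    return x.value
```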
Forecasting on the UCI Air Quality dataset with a conjugate gradient artificial neural network, using L1-regularized feature selection and a genetic algorithm for parameter optimization
Regression algorithm implementations from scratch in Python (least squares, regularized LS, L1-regularized LS, robust regression)
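For reference, the two closed-form cases from such a list (ordinary and L2-regularized least squares) in NumPy; the L1-regularized and robust variants need iterative solvers and are not sketched here:

```python
import numpy as np

def least_squares(X, y):
    """Ordinary least squares: w = (X^T X)^{-1} X^T y, via lstsq for stability."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def ridge(X, y, lam=1.0):
    """Regularized (L2) least squares: w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```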
Implementation of optimization and regularization algorithms in deep neural networks from scratch
This study explores the different regularisation methods that can be used to address overfitting in a given neural network architecture, using the balanced EMNIST dataset.
Mathematical machine learning algorithm implementations
Comparing Three Penalized Least Squares Estimators: LASSO, SCAD, and MCP.
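The three penalties differ in how they grow with coefficient magnitude; a sketch of the standard penalty functions themselves (not the repository's code):

```python
import numpy as np

def lasso_penalty(t, lam):
    return lam * np.abs(t)

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty (Fan & Li, 2001), evaluated elementwise."""
    t = np.abs(t)
    small = lam * t
    mid = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))
    large = lam**2 * (a + 1) / 2
    return np.where(t <= lam, small, np.where(t <= a * lam, mid, large))

def mcp_penalty(t, lam, gamma=3.0):
    """MCP penalty (Zhang, 2010), evaluated elementwise."""
    t = np.abs(t)
    return np.where(t <= gamma * lam, lam * t - t**2 / (2 * gamma), gamma * lam**2 / 2)
```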
Comparison of Linear Regression, Ridge Regression, and Lasso Regression
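A tiny scikit-learn sketch of such a comparison on synthetic data (hyperparameters purely illustrative):

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso

X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=5.0, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("ridge", Ridge(alpha=1.0)),
                    ("lasso", Lasso(alpha=1.0))]:
    model.fit(X, y)
    n_zero = (model.coef_ == 0).sum()
    print(f"{name:>6}: R^2={model.score(X, y):.3f}, zero coefficients={n_zero}")
```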
2018-2019 Semester 1 at Soton: individual machine learning coursework