A reproduction of Learning Efficient Convolutional Networks through Network Slimming
Implementation of Convex Optimization algorithms
Subgradient methods for Multicommodity Network Design
Solving a quadratic programming problem with a subgradient optimizer
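As an illustration of the technique this repository names, here is a minimal sketch of the projected subgradient method on a toy one-variable QP (minimize x² subject to x ≥ 1); all names are illustrative, not taken from the repository:

```python
# Projected subgradient method on a toy QP:
# minimize f(x) = x^2 subject to x >= 1.

def project(x, lower=1.0):
    # Projection onto the feasible set {x : x >= lower}.
    return max(x, lower)

def subgradient_qp(x0=5.0, steps=200):
    x = x0
    best = float("inf")
    for k in range(1, steps + 1):
        g = 2.0 * x              # gradient of x^2 (a valid subgradient)
        x = project(x - g / k)   # diminishing step size 1/k
        best = min(best, x * x)  # track the best objective value seen
    return x, best
```

With the step sizes shrinking as 1/k, the iterates settle on the constrained minimizer x = 1.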
MCPy is a Python library for McCormick relaxations with subgradients. It is useful for prototyping and testing new convex relaxation and global optimization algorithms.
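To show the core idea behind McCormick relaxations (this is a generic textbook sketch, not MCPy's API), the bilinear term w = x·y over a box admits a piecewise-linear convex underestimator built from the box bounds:

```python
def mccormick_under(x, y, xL, xU, yL, yU):
    # Convex underestimator of the bilinear term w = x*y
    # on the box [xL, xU] x [yL, yU] (McCormick envelope, lower part).
    return max(xL * y + yL * x - xL * yL,
               xU * y + yU * x - xU * yU)
```

The function is a pointwise maximum of affine pieces, hence convex, and it never exceeds x·y on the box; for example, at x = 1, y = 2 on [0, 2] × [0, 3] it evaluates to 1, below the true product 2.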
Implementation and brief comparison of several first-order and proximal gradient methods, including their convergence rates
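As a concrete example of a proximal gradient method (a generic ISTA sketch under my own naming, not code from the repository), consider the scalar lasso problem minimize ½(x − b)² + λ|x|, whose proximal step is soft thresholding:

```python
# ISTA (proximal gradient) on the scalar lasso problem:
# minimize 0.5 * (x - b)^2 + lam * |x|

def soft_threshold(v, t):
    # Proximal operator of t * |.|
    if v > t:
        return v - t
    if v < -t:
        return v + t
    return 0.0

def ista(b, lam, step=1.0, iters=50):
    x = 0.0
    for _ in range(iters):
        grad = x - b                                  # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

For b = 2 and λ = 0.5 the iterates reach the closed-form solution soft_threshold(2, 0.5) = 1.5.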
R package for SGD inference
Convex Optimization Algorithms
Dual-Based Procedure and Subgradient method implementations
Minimax NMF
Non-linear topology identification using deep learning. Sparsity (lasso) is enforced on the sensor connections, and the non-convex, non-differentiable objective is minimized with a subgradient descent algorithm.
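The l1-plus-subgradient idea above can be sketched on a tiny linear least-squares fit (this is my own illustrative example, not the repository's code): the l1 penalty is non-differentiable at zero, so we use sign(w) as a subgradient and a diminishing step size.

```python
# Subgradient descent on 0.5 * sum_i (w @ x_i - y_i)^2 + lam * ||w||_1,
# enforcing sparsity in the weights via the l1 penalty.

def sign(v):
    return (v > 0) - (v < 0)  # subgradient of |v|; 0 is valid at v == 0

def l1_subgradient_fit(X, y, lam=0.1, iters=2000):
    n_feat = len(X[0])
    w = [0.0] * n_feat
    for k in range(1, iters + 1):
        # Gradient of the smooth least-squares part.
        grads = [0.0] * n_feat
        for xi, yi in zip(X, y):
            r = sum(wj * xj for wj, xj in zip(w, xi)) - yi
            for j in range(n_feat):
                grads[j] += r * xi[j]
        step = 0.1 / (k ** 0.5)  # diminishing step size
        w = [wj - step * (gj + lam * sign(wj))
             for wj, gj in zip(w, grads)]
    return w
```

On data where the second feature is always zero, its weight stays exactly zero while the active weight converges to the shrunken solution, illustrating the sparsity effect.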
Optimization comprises a class of methods for finding global or local optima of discrete or continuous objectives, from evolutionary algorithms to swarm-based ones.
In this work, we consider learning sparse models in a large-scale setting, where the number of samples and the feature dimension can grow as large as millions or billions. Two immediate issues occur under such challenging scenarios: (i) computational cost; (ii) memory overhead.