A repository for files related to my Honors Research Thesis at Westminster College.
- Thesis
- Implementations
  - Time Series Analysis
  - Introduction to Hierarchical Temporal Memory
  - Particle Swarm Optimization
- Sources
Slides from PhDEconomics: Introduction to ARMA Models from Wharton
Numenta Research: Key Discoveries in Understanding How the Brain Works
This video gives an excellent overview of the neocortex and an intuitive understanding of the theory.
Numenta's HTM School with Matt Taylor
HTM School provides a multipart overview of the various components of Hierarchical Temporal Memory.
Particle Swarm Optimization (PSO) Visualized
Particle swarm optimization (PSO) is a population-based stochastic optimization technique developed by Dr. Eberhart and Dr. Kennedy in 1995, inspired by the social behavior of bird flocking and fish schooling.
PSO shares many similarities with evolutionary computation techniques such as Genetic Algorithms (GA). The system is initialized with a population of random solutions and searches for optima by updating generations. However, unlike GA, PSO has no evolution operators such as crossover and mutation. In PSO, the potential solutions, called particles, fly through the problem space by following the current optimum particles.
Each particle keeps track of its coordinates in the problem space that are associated with the best solution (fitness) it has achieved so far; the fitness value is also stored. This value is called pbest. Another "best" value tracked by the particle swarm optimizer is the best value obtained so far by any particle in the neighborhood of the particle; this location is called lbest. When a particle takes the whole population as its topological neighbors, the best value is a global best and is called gbest.
At each time step, the particle swarm optimizer changes the velocity of (accelerates) each particle toward its pbest and lbest locations (in the local version of PSO). The acceleration is weighted by random terms, with separate random numbers generated for the acceleration toward the pbest and lbest locations.
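The update rule described above can be sketched as a minimal global-best PSO in Python. This is an illustrative sketch, not the thesis implementation; the function name, parameter values (inertia `w`, acceleration coefficients `c1`, `c2`), and the sphere test function are all assumptions chosen for clarity.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over the box `bounds = (lo, hi)` using gbest PSO."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Initialize the swarm with random positions and zero velocities.
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]            # each particle's best-known position
    pbest_val = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Accelerate toward pbest and gbest, each weighted
                # by its own random term.
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            val = f(X[i])
            if val < pbest_val[i]:       # update personal best
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:      # update global best
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val

# Example: minimize the sphere function, whose optimum is 0 at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```

The local (lbest) version differs only in that each particle compares itself against a fixed neighborhood of particles rather than the whole swarm.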
Over the past several years, PSO has been successfully applied in many research and application areas, and on a range of problems it has been shown to reach good results faster and at lower computational cost than many alternative methods.
Another reason PSO is attractive is that it has few parameters to adjust. One version, with slight variations, works well across a wide variety of applications. PSO has been used both as a general-purpose optimizer applicable across many domains and in specialized variants tailored to specific requirements.
A great resource for learning about Cortical Learning Algorithms is, of course, Numenta. They strongly advocate open science and post their research papers and conference posters on their website. Numenta also has a YouTube channel with many helpful resources, and for a gentler introduction, Numenta's Matt Taylor runs an excellent YouTube series called HTM School.
For a comprehensive list of papers and presentations, check the References section of my Honors Research draft, but here is a good list to get you started:
- Advanced NuPIC Programming
- Biological and Machine Intelligence
- Encoding Data for HTM Systems
- Enhancement of Classifiers in HTM-CLA Using Similarity Evaluation Methods
- Evaluation of Hierarchical Temporal Memory in algorithmic trading
- A Framework for Intelligence and Cortical Function Based on Grid Cells in the Neocortex
- Getting Predictions out of HTM (CLA Classifiers)
- Hierarchical Temporal Memory including HTM Cortical Learning Algorithms
- Have We Missed Half of What the Neocortex Does? A New Predictive Framework Based on Cortical Grid Cells
- HTM School: Scalar Encoding
- Intelligent Predictions: an Empirical Study of the Cortical Learning Algorithm
- Locations in the Neocortex: A Theory of Sensorimotor Object Recognition Using Cortical Grid Cells
- A Mathematical Formalization of Hierarchical Temporal Memory’s Spatial Pooler
- Principles of Hierarchical Temporal Memory (HTM): Foundations of Machine Intelligence
- Properties of Sparse Distributed Representations and their Application to Hierarchical Temporal Memory
- Quantum Computation via Sparse Distributed Representation
- Random Distributed Scalar Encoder
- Real Machine Intelligence with Clortex and NuPIC
- Semantic Folding: Theory and its Application in Semantic Fingerprinting
- SDR Classifier
- A Theory of How Columns in the Neocortex Enable Learning the Structure of the World
- Towards a Mathematical Theory of Cortical Micro-circuits