Mixtures of Experts, Using Gaussian Mixture Models for the Gate

This code implements a mixture of experts using a Gaussian mixture model for the gate.
Updated 11 Nov 2014


This code implements a mixture of experts using a Gaussian mixture model for the gate. The main advantage of this choice is that the gate can be trained with the standard expectation-maximization (EM) algorithm, i.e., a single-loop EM. Other methods use the softmax function for the gate, which has no analytically closed-form M-step and therefore requires generalized expectation maximization (GEM), a double-loop EM. The problem with GEM is that it requires extra computation, and the step size of the inner loop must be chosen carefully to guarantee convergence. I used k-means clustering for initialization, although I found only a small improvement from it. If you have any questions or recommendations, contact me.
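The single-loop EM described above can be sketched as follows. This is an illustrative Python/NumPy version, not the author's MATLAB code: it assumes 1-D inputs, linear-regression experts, and a split-on-sorted-inputs initialization standing in for k-means. In the E-step, each point's responsibility combines the GMM gate density over x with the expert's likelihood of y; in the M-step, every parameter (gate mixing weights, gate means/variances, expert weights, expert noise) has a closed-form update, so no inner GEM loop is needed.

```python
import numpy as np

def gauss(x, mu, var):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def fit_moe_gmm_gate(x, y, K=2, iters=50):
    """EM for a mixture of linear experts with a GMM gate over x (sketch)."""
    n = len(x)
    # Initialization: split points by sorted x, a crude stand-in for k-means.
    parts = np.array_split(np.argsort(x), K)
    pi = np.full(K, 1.0 / K)                      # gate mixing weights
    mu = np.array([x[p].mean() for p in parts])   # gate means
    var = np.full(K, x.var() + 1e-6)              # gate variances
    w = np.zeros(K)                               # expert slopes
    b = np.array([y[p].mean() for p in parts])    # expert intercepts
    s2 = np.full(K, y.var() + 1e-6)               # expert noise variances
    for _ in range(iters):
        # E-step: responsibility = gate density over x times expert likelihood of y.
        r = np.stack([pi[k] * gauss(x, mu[k], var[k]) *
                      gauss(y, w[k] * x + b[k], s2[k]) for k in range(K)], axis=1)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: all updates are closed-form (single-loop EM, no inner GEM loop).
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r * x[:, None]).sum(axis=0) / Nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk + 1e-9
        for k in range(K):
            # Responsibility-weighted least squares for each linear expert.
            sw = np.sqrt(r[:, k])
            A = np.stack([x, np.ones(n)], axis=1) * sw[:, None]
            coef, *_ = np.linalg.lstsq(A, y * sw, rcond=None)
            w[k], b[k] = coef
            s2[k] = (r[:, k] * (y - w[k] * x - b[k]) ** 2).sum() / Nk[k] + 1e-9
    return pi, mu, var, w, b, s2

def predict(x, pi, mu, var, w, b):
    """Gate-weighted average of the expert predictions."""
    g = np.stack([pi[k] * gauss(x, mu[k], var[k]) for k in range(len(pi))], axis=1)
    g /= g.sum(axis=1, keepdims=True)
    return (g * (np.outer(x, w) + b)).sum(axis=1)
```

For example, fitting data generated as y = 2x for x < 0 and y = -x for x >= 0 (plus small noise) with K = 2 recovers the two linear pieces, with the GMM gate carving the input space around x = 0.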

Cite As

Joseph Santarcangelo (2024). Mixtures of Experts, Using Gaussian Mixture Models for the Gate (https://www.mathworks.com/matlabcentral/fileexchange/48367-mixtures-of-experts-using-gaussian-mixture-models-for-the-gate), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2008a
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Statistics and Machine Learning Toolbox

Version  Published  Release Notes
1.2.0.0

Didn't upload last time.

1.1.0.0

There was an error in the first version; I also improved the documentation.

1.0.0.0