A Recurrent Log-Linearized Gaussian Mixture Network
Use this link to cite this item : https://ir.lib.hiroshima-u.ac.jp/00014212
Gaussian mixture model
hidden Markov model (HMM)
neural networks (NNs)
recurrent neural networks (RNNs)
Context in a time series is one of the most useful and interesting characteristics for machine learning. In some cases, the dynamic characteristic may be the only basis on which classification is possible. A novel neural network, named “a recurrent log-linearized Gaussian mixture network (R-LLGMN)," is proposed in this paper for the classification of time series. The structure of this network is based on a hidden Markov model (HMM), which has been well developed in the area of speech recognition. R-LLGMN can also be interpreted as an extension of a probabilistic neural network using a log-linearized Gaussian mixture model, into which recurrent connections are incorporated to make use of temporal information. Simulation experiments are carried out to compare R-LLGMN with the traditional HMM estimator as classifiers, and finally, pattern classification experiments on EEG signals are conducted. These experiments indicate that R-LLGMN can successfully classify not only artificial data but also real biological data such as EEG signals.
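As context for the HMM basis mentioned in the abstract, the sketch below shows the classic scaled forward recursion for an HMM with a single Gaussian emission per state, used to score a sequence under per-class models. This is only an illustration of the underlying idea; the paper's actual R-LLGMN uses log-linearized Gaussian mixtures with learned network weights, and all parameter values here are made up for the example.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, means, variances):
    # Scaled forward recursion: log p(obs | model) for an HMM whose
    # state s emits N(means[s], variances[s]). pi holds the initial
    # state probabilities and A the state-transition matrix.
    def emission(x):
        return np.exp(-0.5 * (x - means) ** 2 / variances) / np.sqrt(
            2.0 * np.pi * variances)

    alpha = pi * emission(obs[0])
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()
    for x in obs[1:]:
        alpha = (alpha @ A) * emission(x)   # propagate, then weight by emission
        c = alpha.sum()
        log_like += np.log(c)               # accumulate the scaling factors
        alpha /= c
    return log_like

# Two toy class models (hypothetical values): class 0 hovers near 0.0,
# class 1 near 3.0; both share the same transitions for simplicity.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],
              [0.1, 0.9]])
model0 = dict(means=np.array([0.0, 0.5]), variances=np.array([1.0, 1.0]))
model1 = dict(means=np.array([3.0, 3.5]), variances=np.array([1.0, 1.0]))

seq = np.array([0.1, -0.2, 0.4, 0.0, 0.3])  # a sequence that looks like class 0
ll0 = forward_log_likelihood(seq, pi, A, **model0)
ll1 = forward_log_likelihood(seq, pi, A, **model1)
print(ll0 > ll1)  # the class-0 model explains the sequence better
```

Classification then amounts to picking the class whose model gives the highest log-likelihood; R-LLGMN folds this forward-style recursion into a trainable recurrent network rather than estimating HMM parameters directly.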
IEEE Transactions on Neural Networks
Copyright (c) 2003 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Graduate School of Engineering