Quantization Error in Self-Organizing Maps (SOM)
"The SOM paradigm was originally motivated by an attempt to explain some functional structures of the brain. Quite surprisingly, however, the SOM turned out to be very useful in exploratory data analysis and started to live a life of its own: some 6000 scientific articles have been based on it." (Teuvo Kohonen, WSOM 2005, 5th Workshop on Self-Organizing Maps)

One of the main features of artificial neural networks is their ability to adapt to an environment by learning, in order to improve their performance on whatever task they are being trained for. The learning paradigm we will work with here is unsupervised learning, and we will use it with competitive learning rules. Unsupervised learning can also be thought of as self-organized learning [haykin94neuralnetworks], in which the purpose is to discover significant patterns or features in the input data, and to make that discovery without a teacher. To do so, a self-organizing learning algorithm is provided with a set of rules of a local nature, which enables it to learn to compute an input-output mapping with specific desirable properties (by "local" it is meant that the effect of changing a neuron's synaptic weights is confined to its immediate neighborhood).

Different structures of self-organizing system can be used. For example, a feedforward network consisting of an input layer and an output layer with lateral connections between the neurons in the output layer, or a multilayer feedforward network in which the self-organization proceeds on a layer-by-layer basis. Regardless of the structure, the learning process consists of repeatedly modifying the synaptic weights of all the connections in the system in response to some input, usually represented by a data vector, in accordance with the set of prescribed rules, until a final configuration develops. Of course, the key question is how a useful configuration can finally develop from self-organization.
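The competitive, locally-confined weight update described above can be sketched in code. This is a minimal illustration, not any particular toolbox's implementation: the function name `som_train_step`, the Euclidean distance, the Gaussian neighborhood, and the linearly decaying learning rate and radius are all assumptions made for the example, since the text does not fix these choices.

```python
import numpy as np

def som_train_step(weights, x, t, n_steps, sigma0=2.0, lr0=0.5):
    """One competitive-learning update on a 2-D SOM weight grid.

    weights : (rows, cols, dim) array of model vectors, updated in place
    x       : (dim,) input data vector
    t       : current iteration; n_steps : total iterations
    """
    rows, cols, _ = weights.shape

    # Competition: the winner (best-matching unit) is the node whose
    # model vector is closest to the input in Euclidean distance.
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(np.argmin(dists), (rows, cols))

    # Local rule: the update strength decays with grid distance from the
    # winner, so changes are confined to the winner's neighborhood.
    sigma = sigma0 * (1 - t / n_steps) + 1e-3   # shrinking radius (assumed schedule)
    lr = lr0 * (1 - t / n_steps)                # decaying learning rate
    gy, gx = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    grid_d2 = (gy - bmu[0]) ** 2 + (gx - bmu[1]) ** 2
    h = np.exp(-grid_d2 / (2 * sigma ** 2))     # Gaussian neighborhood function

    # Cooperation/adaptation: pull neighborhood models toward the input.
    weights += lr * h[:, :, None] * (x - weights)
    return weights
```

Repeating this step over many randomly drawn data vectors, while the radius and learning rate decay, is what lets an ordered configuration develop.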
The answer to this can be found in the following observation made by …
From a MATLAB Answers thread, "About Self Organizing Maps" (https://www.mathworks.com/matlabcentral/answers/51736-about-self-organizing-maps):

Question (Deniz, 24 Oct 2012): Hi, the question is about the number of training iterations for the Self-Organizing Map (SOM) functions in MATLAB, given that we need to minimize the error between the samples and the Best Matching Units (BMUs). In the Neural Network Toolbox we can observe the U-matrix and component planes. Maybe, when the patterns of the U-matrix or component planes are close to stable, we can stop training. Is this the solution? Are there any parameters to decide the iteration number? For example, in the SOM Toolbox (Laboratory of Computer and Information Science) for MATLAB 5, there are functions that calculate the quantization error (the average distance between each data vector and its BMU) and the topographic error (the proportion of all data vectors for which the first and second BMUs are not adjacent units). Is there any function to calculate those kinds of errors and SOM quality in MATLAB? Many thanks.

Comment (Peter, 11 Jan 2013): Did you find a simple solution to this problem? I've reached the same point as well. Thanks.

Answer (Shahrbanoo Hazratiyadkoori, 14 Jan 2015): Hi, I have pr
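The two quality measures named in the thread can be computed directly from their definitions given there. Below is a hypothetical Python sketch (the thread is about MATLAB, so this is not a toolbox function): `som_quality` is my own name, and the Euclidean metric and 8-neighbour grid adjacency are assumptions the definitions do not pin down.

```python
import numpy as np

def som_quality(weights, data):
    """Quantization and topographic error for a trained 2-D SOM.

    weights : (rows, cols, dim) grid of model vectors
    data    : (n, dim) data vectors
    Returns (quantization_error, topographic_error).
    """
    rows, cols, dim = weights.shape
    flat = weights.reshape(-1, dim)                      # (rows*cols, dim)

    # Distance from every data vector to every node's model vector.
    d = np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    bmu1, bmu2 = order[:, 0], order[:, 1]

    # Quantization error: average distance between each data vector
    # and its best-matching unit.
    qe = d[np.arange(len(data)), bmu1].mean()

    # Topographic error: proportion of data vectors whose first and
    # second BMUs are not adjacent grid units (8-neighbour adjacency
    # assumed here).
    r1, c1 = np.divmod(bmu1, cols)
    r2, c2 = np.divmod(bmu2, cols)
    adjacent = (np.abs(r1 - r2) <= 1) & (np.abs(c1 - c2) <= 1)
    te = 1.0 - adjacent.mean()
    return qe, te
```

A quantization error near zero means the models fit the data tightly; a topographic error near zero means the map is well ordered, with neighboring models responding to similar data. Tracking both across training iterations is one concrete way to decide when to stop.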
Figure 1: The array of nodes in a two-dimensional SOM grid. (From http://www.scholarpedia.org/article/Kohonen_network, by Teuvo Kohonen and Timo Honkela.)

The Self-Organizing Map (SOM), commonly also known as the Kohonen network (Kohonen 1982, Kohonen 2001), is a computational method for the visualization and analysis of high-dimensional data, especially experimentally acquired information.

Introduction

The Self-Organizing Map defines an ordered mapping, a kind of projection from a set of given data items onto a regular, usually two-dimensional grid. A model \(m_i\) is associated with each grid node (Figure 1). These models are computed by the SOM algorithm. A data item is mapped to the node whose model is most similar to the data item, e.g., has the smallest distance from the data item in some metric. Like a codebook vector in vector quantization, the model is usually a certain weighted local average of the given data items in the data space. But in addition, when the models are computed by the SOM algorithm, they are more similar at nearby nodes than at nodes located farther away from each other on the grid. In this way the set of models can be regarded as constituting a similarity graph, and a structured 'skeleton' of the distribution of the given data items.
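The mapping just described, each data item going to the node with the most similar model, can be written in a few lines. A small sketch under assumed conventions: the name `map_to_grid` is hypothetical, and Euclidean distance stands in for "some metric".

```python
import numpy as np

def map_to_grid(weights, data):
    """Project each data vector onto the SOM grid: return the (row, col)
    of the node whose model vector is nearest in Euclidean distance."""
    rows, cols, dim = weights.shape
    flat = weights.reshape(-1, dim)
    # Nearest model vector (BMU) for every data vector.
    idx = np.argmin(
        np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=2), axis=1
    )
    return [tuple(np.divmod(i, cols)) for i in idx]
```

Because neighboring nodes hold similar models, data items that land on nearby grid positions are themselves similar, which is what makes the grid usable as a similarity graph.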
The SOM was originally developed for the visualization of distributions of metric vectors, such as ordered sets of measurement values or statistical attributes, but it can be shown that a SOM-type mapping can be defined for any data items whose mutual pairwise distances can be defined. Examples of non-vectorial data that are feasible for this method are strings of symbols and sequences of segments in organic molecules (Kohonen and Somervuo 2002).

History

The SOM algorithm grew out of early neural network models, especially models of associative memory and adaptive learning (cf. Kohonen 1984). A new incentive was to explain the spatial organ