Example 1
Memory matrix (associative memory): Compute the response of the memory using the memory matrix given in 3.19. Then compute the Euclidean distances between the response and the original key vectors in 3.15.
Part I. The memory is trained with three key vectors: x1, x2 & x3.
x1=[-0.3333; 0.7778; 0.5329]; %compute the original Euclidean distance
Part II. Key vectors are more mutually orthogonal.
x1=[0.1309; -0.9779; -0.1629];
estimate_M = x1*x1' + x2*x2' + x3*x3'
x1m = [0; -0.9779; -0.1629];
estimate_x1m = estimate_M*x1m
%Euclidean distance between x1m and other key vectors
%compute the original Euclidean distance
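The same distance check can be sketched outside MATLAB; a minimal NumPy version of the outer-product memory follows. Here x2 and x3 are hypothetical near-orthogonal unit vectors, since only x1 appears in the notes above.

```python
import numpy as np

# x1 is taken from the notes; x2 and x3 are illustrative stand-ins
# (the actual assignment vectors are not shown above).
x1 = np.array([0.1309, -0.9779, -0.1629])
x2 = np.array([0.9856,  0.1628, -0.0450])
x3 = np.array([0.0705, -0.1546,  0.9855])

# Outer-product (correlation) memory matrix
M = np.outer(x1, x1) + np.outer(x2, x2) + np.outer(x3, x3)

# Degraded key: first component zeroed, as in the notes
x1m = np.array([0.0, -0.9779, -0.1629])
resp = M @ x1m

# Euclidean distances between the response and each original key
d = [np.linalg.norm(resp - x) for x in (x1, x2, x3)]
```

Because the keys are nearly orthogonal, the response to the degraded key still lies closest to x1, which is the point of Part II.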
Example 2
MLP NN trained with backpropagation. P154, 3.5.
function 1...
function [W1, W2, E] = sol_hw5_bp2(Ptrain, Ttrain, Ptest, Ttest, M, Tp, W1, W2)
[n, Q] = size(Ptrain);
if nargin<8
end
end
function 2...
function y=sol_hw5_bp2val(P,W1,W2,beta)
%BP2VAL - output of the MLP NN with one hidden layer
%y=bp2val(P,W1,W2,beta);
[n,Q]=size(P);
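A minimal sketch of what these two functions do, assuming a tanh hidden layer and a linear output layer; the transfer functions, learning rate, and weight initialization are assumptions, not values from the homework code.

```python
import numpy as np

def bp2_train(P, T, M, lr=0.05, epochs=2000, seed=0):
    """Batch backpropagation for a one-hidden-layer MLP
    (tanh hidden, linear output) -- analogue of sol_hw5_bp2."""
    rng = np.random.default_rng(seed)
    n, Q = P.shape
    m = T.shape[0]
    W1 = rng.normal(scale=0.5, size=(M, n + 1))   # hidden weights + bias column
    W2 = rng.normal(scale=0.5, size=(m, M + 1))   # output weights + bias column
    Pb = np.vstack([P, np.ones((1, Q))])          # inputs with appended bias
    E = []
    for _ in range(epochs):
        A1 = np.tanh(W1 @ Pb)                     # hidden activations
        A1b = np.vstack([A1, np.ones((1, Q))])
        Y = W2 @ A1b                              # linear output
        err = T - Y
        E.append(float(np.mean(err**2)))          # track MSE per epoch
        d2 = -2 * err / Q                         # output-layer sensitivity
        d1 = (W2[:, :M].T @ d2) * (1 - A1**2)     # backprop through tanh
        W2 -= lr * d2 @ A1b.T
        W1 -= lr * d1 @ Pb.T
    return W1, W2, E

def bp2_val(P, W1, W2):
    """Forward pass of the trained net -- analogue of sol_hw5_bp2val."""
    Q = P.shape[1]
    A1 = np.tanh(W1 @ np.vstack([P, np.ones((1, Q))]))
    return W2 @ np.vstack([A1, np.ones((1, Q))])
```

Training on a simple 1-D function (e.g. one period of a sine) and validating on a separate grid of sequential points mirrors the train/test/verification split used in Problems a-d below.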
Problem a, calling functions
clear all;
Qq = [200 100 50]; %train, test, verification case nos
%Generate the training data
Ptest = zeros(1, Qq(2));
%Generate the verification data. This time choose sequential points
Tp = [0.05 2000 1 0.1 1];
[W1,W2,E] = sol_hw5_bp2(Ptrain, Ttrain, Ptest, Ttest, M, Tp);
Yverif=sol_hw5_bp2val(Pverif, W1,W2,1);
figure;
figure;
Problem b, calling functions
clear all;
Qq = [200 100 50]; %train, test, verification case nos
%Generate the training data
Ptest = zeros(2, Qq(2));
%Generate the verification data. This time choose sequential points
%train the network
[W1,W2,E] = sol_hw5_bp2(Ptrain, Ttrain, Ptest, Ttest, M, Tp);
%generate plots
axis([0 2 0 2]);
figure;
Problem c, calling functions
% Generate the training data
% Generate the test data
% Generate the verification data. This time choose sequential points
% Train the network
Yverif=sol_hw5_bp2val(Pverif,W1,W2,1);
figure;
Problem d, calling functions
% Problem 3.5(d)
% Generate the training data
% Generate the test data
% Generate the verification data. This time choose sequential points
Pverif=[reshape(X,1,1331);reshape(Y,1,1331);reshape(Z,1,1331)];
% Train the network
Yverif=sol_hw5_bp2val(Pverif,W1,W2,1);
Example 3
A rather manual Kohonen SOM NN example.
W=[0.41 0.45 0.41 0 0 0 -0.41 -0.45 -0.41;
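One manual Kohonen update of the kind this example works through can be sketched as follows. Only the first row of W is shown above, so the 2-D weight values here are illustrative, and the learning rate and neighborhood radius are assumptions.

```python
import numpy as np

def kohonen_step(W, p, lr=0.5, radius=1):
    """One manual Kohonen SOM update on a 1-D chain of neurons.
    Winner = neuron whose weight vector is closest to input p;
    the winner and its neighbors within `radius` move toward p.
    lr and radius are assumptions, not values from the notes."""
    d = np.linalg.norm(W - p, axis=1)   # distance of each neuron to p
    i_win = int(np.argmin(d))           # competitive (winner-take-all) stage
    for i in range(W.shape[0]):
        if abs(i - i_win) <= radius:    # rectangular neighborhood
            W[i] += lr * (p - W[i])     # Kohonen learning rule
    return i_win, W

# Hypothetical 9-neuron map with 2-D weights (illustrative values)
W = np.array([[ 0.41,  0.41], [ 0.45,  0.30], [ 0.41,  0.20],
              [ 0.00,  0.10], [ 0.00,  0.00], [ 0.00, -0.10],
              [-0.41, -0.20], [-0.45, -0.30], [-0.41, -0.41]])
p = np.array([0.5, 0.4])
winner, W = kohonen_step(W, p)
```

With lr=0.5, the winner moves halfway toward the input; its neighbors move the same fraction, which is what gradually orders the map.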
Example 4
SOM: From higher-dimension to two-dimension representation
C1 = randn(3,1000); %3 x 1000 random matrix
C2 = randn(3,1000); %3 x 1000 random matrix
%Plot the data set
%Create SOM and Plot the weights
net.trainParam.epochs = 5;
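A from-scratch sketch of the same idea: 3-D data mapped onto a 5x5 grid of nodes. MATLAB's SOM functions handle the neighborhood and decay internally; the Gaussian neighborhood and the decay schedule below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two 3-D Gaussian clouds standing in for C1 and C2 above
X = np.hstack([rng.normal(0, 1, (3, 1000)),
               rng.normal(4, 1, (3, 1000))]).T   # 2000 samples x 3 dims

# 5x5 map: each node has a 2-D grid position and a 3-D weight vector
grid = np.array([(i, j) for i in range(5) for j in range(5)], float)
W = rng.normal(size=(25, 3))

def qerr(W, X):
    """Mean distance from each sample to its nearest map node."""
    return np.mean([np.min(np.linalg.norm(W - x, axis=1)) for x in X])

q_before = qerr(W, X)
for epoch in range(5):                              # 5 epochs, as in the notes
    sigma = 2.0 * 0.7**epoch                        # shrinking neighborhood (assumed)
    lr = 0.5 * 0.7**epoch                           # decaying learning rate (assumed)
    for x in X:
        w = np.argmin(np.linalg.norm(W - x, axis=1))   # winning node
        h = np.exp(-np.sum((grid - grid[w])**2, axis=1) / (2 * sigma**2))
        W += lr * h[:, None] * (x - W)              # neighborhood-weighted update
q_after = qerr(W, X)
```

After training, plotting each sample at the grid position of its winning node gives the two-dimensional representation of the 3-D clusters.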
Example 5
SOM: From higher-dimension to two-dimension representation
C1 = randn(3,1000); %3 x 1000 random matrix
C2 = randn(3,1000); %3 x 1000 random matrix
%Plot the data set
%Create SOM and Plot the weights
net.trainParam.epochs = 5;
Example 6
Self-Organizing Feature Map (SOFM)
%Generate 1,000 two-dimensional vectors in the unit square
%Create SOM onto a 5x5 rectangular topology
%Initial weights
%After 1 epoch
%After 2 epochs
%After 10 epochs
Example 7
NEWLVQ example from MATLAB
%NEWLVQ Create a learning vector quantization network.
P = [-3 -2 -2 0 0 0 0 +2 +2 +3;
T = ind2vec(Tc);
Y = sim(net,P)
function 2...
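The LVQ1 rule that newlvq/train apply can be sketched as follows. The dataset is a simplified 1-D stand-in (the second input row and Tc are not shown above), and the prototype positions and learning rate are assumptions.

```python
import numpy as np

def lvq1_train(P, Tc, W, Wc, lr=0.1, epochs=20):
    """LVQ1: move the winning prototype toward the input if its class
    matches the target class, away from it otherwise."""
    for _ in range(epochs):
        for x, c in zip(P.T, Tc):
            w = int(np.argmin(np.linalg.norm(W - x, axis=1)))  # winner
            sign = 1.0 if Wc[w] == c else -1.0                 # attract / repel
            W[w] += sign * lr * (x - W[w])
    return W

def lvq_classify(x, W, Wc):
    """Class of the nearest prototype."""
    return Wc[int(np.argmin(np.linalg.norm(W - x, axis=1)))]

# Hypothetical 1-D version of the two-class pattern above:
# class 1 at the extremes, class 2 in the middle.
P  = np.array([[-3, -2, -2, 0, 0, 0, 0, 2, 2, 3]], float)
Tc = np.array([1, 1, 1, 2, 2, 2, 2, 1, 1, 1])

# Four prototypes, two per class (initial positions are assumptions)
W  = np.array([[-2.5], [2.5], [-0.5], [0.5]])
Wc = np.array([1, 1, 2, 2])
W = lvq1_train(P, Tc, W, Wc)
```

Note that, unlike a SOM, LVQ is supervised: the class labels drive whether a prototype is attracted to or repelled from each input.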
Example 8
An LVQ function for my project,
showing how to store the weights (save them to a file) after training, and load them later to simulate the net.
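In MATLAB the usual pattern is save('weights.mat','W1','W2') after training and load('weights.mat') before simulation. An equivalent NumPy sketch (the weight values here are hypothetical):

```python
import os
import tempfile
import numpy as np

# Hypothetical trained weights (W) and their class labels (Wc)
W  = np.array([[-2.3], [2.3], [-0.1], [0.1]])
Wc = np.array([1, 1, 2, 2])

# Save after training ...
path = os.path.join(tempfile.gettempdir(), "lvq_weights.npz")
np.savez(path, W=W, Wc=Wc)

# ... and load later to simulate the net without retraining
loaded = np.load(path)
W2, Wc2 = loaded["W"], loaded["Wc"]
```

Persisting the weights separates the (slow) training run from the (fast) simulation runs, which is the point of the example.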
Example 9
Class example,
Train an ART1 network using 3 input vectors.
Example 10
Class example,
Train an ART1 network using 3 input vectors. Same as the example above, but with a different vigilance value (the p in the code), causing the network to behave differently.
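The vigilance parameter controls the match test in ART1's orienting subsystem, which is why changing it changes the clustering. A sketch of that test (the binary input and stored prototype here are hypothetical):

```python
import numpy as np

def art1_vigilance(p, w, rho):
    """ART1 orienting subsystem: the winning category is accepted only if
    |p AND w| / |p| >= rho; otherwise the network resets and searches on."""
    return bool(np.sum(np.logical_and(p, w)) / np.sum(p) >= rho)

p = np.array([1, 1, 0, 0, 1])   # hypothetical binary input
w = np.array([1, 0, 0, 0, 1])   # hypothetical stored prototype (match 2/3)
```

With low vigilance the 2/3 match is accepted and the input joins an existing coarse category; with high vigilance it is rejected, forcing a reset and eventually a new, finer category.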
Example 11
An example of plotting the layer-1 neurons of an ART1 network using an ODE function. The second file name must match the function name.
02_example11.txt
02_example11_function.txt
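A minimal sketch of the kind of integration those two files perform, assuming layer 1 follows the standard shunting form eps*dn/dt = -n + (b+ - n)*(excitation) - (n + b-)*(inhibition); a forward-Euler loop stands in for MATLAB's ODE solver, and all constants are assumptions.

```python
def simulate_shunting(e, i, eps=0.1, bp=1.0, bm=1.0, dt=1e-3, T=2.0):
    """Forward-Euler integration of one shunting neuron:
    eps*dn/dt = -n + (bp - n)*e - (n + bm)*i  (assumed layer-1 form).
    e = total excitatory input, i = total inhibitory input."""
    n = 0.0
    for _ in range(int(T / dt)):
        dn = (-n + (bp - n) * e - (n + bm) * i) / eps
        n += dt * dn
    return n

# With only excitation, n settles at bp*e/(1+e); with inhibition,
# the steady state is (bp*e - bm*i)/(1 + e + i), bounded in [-bm, bp].
n_ss = simulate_shunting(e=2.0, i=0.0)
```

The bounded steady state is the useful property here: no matter how large the inputs, the neuron output stays between -bm and bp.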
Example 12
An example of simulating layer 1 of an ART1 network. The second file name must match the function name.
02_example12.txt
02_example12_function.txt
Example 13
An example of simulating layer 2 of an ART1 network. The second file name must match the function name.
02_example13.txt
02_example13_function.txt