Artificial Neural Network
in Matlab
Hany Ferdinando
Architecture (single neuron)
w is the weight matrix, dimension 1xR
p is the input vector, dimension Rx1
b is the bias
a = f(Wp + b)
Neural Network in Matlab
2
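As a quick illustration, the single-neuron computation a = f(Wp + b) can be sketched in plain Python (Python is used here only for a self-contained example; the weight, bias, and input values are made up):

```python
def hardlim(n):
    # Hard-limit transfer function: 1 if n >= 0, else 0
    return 1 if n >= 0 else 0

def neuron(W, p, b, f):
    # a = f(Wp + b): W is a 1xR weight row, p an Rx1 input column, b a scalar bias
    n = sum(w_i * p_i for w_i, p_i in zip(W, p)) + b
    return f(n)

# Made-up example values: n = 2*1 + 1*1 - 3 = 0, so a = hardlim(0) = 1
a = neuron([2, 1], [1, 1], -3, hardlim)  # -> 1
```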
Transfer Function
a = hardlim(n): Hard-Limit Transfer Function
a = purelin(n): Linear Transfer Function
a = logsig(n): Log-Sigmoid Transfer Function
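The three transfer functions can be written out directly; a plain-Python sketch (shown instead of MATLAB for a self-contained example):

```python
import math

def hardlim(n):
    # Hard-limit: 0 for n < 0, 1 for n >= 0
    return 1 if n >= 0 else 0

def purelin(n):
    # Linear: output equals the net input
    return n

def logsig(n):
    # Log-sigmoid: squashes the net input into (0, 1)
    return 1.0 / (1.0 + math.exp(-n))
```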
Architecture (layer of neurons)
w is the weight matrix, dimension SxR
p is the input vector, dimension Rx1
b is the bias vector
a = f(Wp + b)
where R = number of elements in the input vector and S = number of neurons in the layer
Multiple layers
a1 = f1(IW1,1 p + b1)
a2 = f2(LW2,1 a1 + b2)
a3 = f3(LW3,2 a2 + b3)
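The multilayer computation is just repeated application of a = f(Wp + b), each layer feeding the next; a minimal plain-Python sketch with made-up two-layer weights:

```python
def purelin(n):
    # Linear transfer function
    return n

def layer(W, p, b, f):
    # One layer: a_i = f(sum_j W[i][j] * p[j] + b[i])
    return [f(sum(w * x for w, x in zip(row, p)) + bi) for row, bi in zip(W, b)]

# Made-up weights: layer 1 is 2x2 (identity), layer 2 is 1x2
p  = [2, 3]
a1 = layer([[1, 0], [0, 1]], p, [0, 0], purelin)   # a1 = f1(IW1,1 p + b1) -> [2, 3]
a2 = layer([[1, 1]], a1, [1], purelin)             # a2 = f2(LW2,1 a1 + b2) -> [6]
```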
Perceptrons in Matlab
Make the perceptron with net = newp(PR,S,TF,LF)
PR = Rx2 matrix of min and max values for the R input elements
S = number of neurons
TF = transfer function, default = ‘hardlim’, other option = ‘hardlims’
LF = learning function, default = ‘learnp’, other option = ‘learnpn’
hardlim = hard-limit function
hardlims = symmetric hard-limit function
learnp: dW = (t - a)p' = ep', where e = t - a
learnpn = normalized learnp
The update rule is Wnew = Wold + dW and bnew = bold + e
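The learnp rule can be sketched as a single update step in plain Python (illustrative only; the starting weights and input are made up):

```python
def hardlim(n):
    # Hard-limit transfer function
    return 1 if n >= 0 else 0

def learnp_step(W, b, p, t):
    # e = t - a; Wnew = Wold + e*p'; bnew = bold + e
    a = hardlim(sum(w * x for w, x in zip(W, p)) + b)
    e = t - a
    return [w + e * x for w, x in zip(W, p)], b + e

# Target t = 1 but the net outputs 0, so e = 1 and W, b move toward the input
W, b = learnp_step([0, 0], -1, [1, 1], 1)  # -> W = [1, 1], b = 0
```

When the output already matches the target, e = 0 and nothing changes.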
Compute manually…
This is an exercise in how to run the artificial neural network.
Starting with the next problem, we will compute the weights and biases manually.
AND Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [0 0 0 1];
net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}
net.trainParam.epochs = 20;
net = train(net,P,T);
weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

[Training plot: performance reaches the goal of 0 after 6 epochs]

weight_init = [0 0], bias_init = 0
weight_final = [2 1], bias_final = -3
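The same AND training can be reproduced by applying the learnp rule incrementally in plain Python (a sketch, not the toolbox's train routine; with this presentation order and zero initial weights it happens to reach the same solution as the slide):

```python
def hardlim(n):
    # Hard-limit transfer function
    return 1 if n >= 0 else 0

def train_perceptron(W, b, samples, epochs=20):
    # Incremental learnp: e = t - a; W = W + e*p'; b = b + e
    for _ in range(epochs):
        for p, t in samples:
            a = hardlim(sum(w * x for w, x in zip(W, p)) + b)
            e = t - a
            W = [w + e * x for w, x in zip(W, p)]
            b += e
    return W, b

# AND truth table, same input order as the P matrix on the slide
AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
W, b = train_perceptron([0, 0], 0, AND)  # converges to W = [2, 1], b = -3
```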
OR Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [0 1 1 1];
net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}
net.trainParam.epochs = 20;
net = train(net,P,T);
weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

[Training plot: performance reaches the goal of 0 after 4 epochs]

weight_init = [0 0], bias_init = 0
weight_final = [1 1], bias_final = -1
NAND Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [1 1 1 0];
net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}
net.trainParam.epochs = 20;
net = train(net,P,T);
weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

[Training plot: performance reaches the goal of 0 after 6 epochs]

weight_init = [0 0], bias_init = 0
weight_final = [-2 -1], bias_final = 2
NOR Gate in Perceptron

P = [0 0 1 1; 0 1 0 1];
T = [1 0 0 0];
net = newp([0 1; 0 1],1);
weight_init = net.IW{1,1}
bias_init = net.b{1}
net.trainParam.epochs = 20;
net = train(net,P,T);
weight_final = net.IW{1,1}
bias_final = net.b{1}
simulation = sim(net,P)

[Training plot: performance reaches the goal of 0 after 4 epochs]

weight_init = [0 0], bias_init = 0
weight_final = [-1 -1], bias_final = 0
Backpropagation in Matlab
Make the backpropagation network with
net = newff(PR,[S1 S2...SNl],{TF1 TF2...TFNl},BTF,BLF,PF)
PR = Rx2 matrix of min and max values for the R input elements
Si = number of neurons in layer i
TFi = transfer function of layer i (any transfer function may be used)
BTF = backpropagation training function
BLF = backpropagation learning function
PF = performance function
Weights and biases are updated by gradient descent: xk+1 = xk - αk·gk, where gk is the current gradient and αk the learning rate.
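The update rule xk+1 = xk - αk·gk is plain gradient descent; a minimal plain-Python sketch minimizing a made-up one-dimensional function f(x) = (x - 3)²:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    # xk+1 = xk - lr * g(xk)
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x - 3)^2, so grad f(x) = 2*(x - 3); the minimum is at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```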
Linear Filter (with ANN) in Matlab
Make the linear filter with newlin(PR,S,ID,LR)
PR = Rx2 matrix of min and max values for the R input elements
S = number of neurons in the output layer
ID = input delay vector
LR = learning rate
The transfer function for the linear filter is only the linear function (purelin)
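With input delays, the linear filter computes a(t) = purelin(Σi wi·p(t - i) + b), i.e. a finite impulse response; a plain-Python sketch with made-up coefficients:

```python
def linear_filter(w, b, x):
    # y(t) = sum_i w[i] * x(t - i) + b, zero initial conditions (tapped delay line)
    y = []
    for t in range(len(x)):
        n = b
        for i, wi in enumerate(w):
            if t - i >= 0:
                n += wi * x[t - i]
        y.append(n)
    return y

# Two-tap moving average as a made-up example
out = linear_filter([0.5, 0.5], 0.0, [1, 1, 1, 1])  # -> [0.5, 1.0, 1.0, 1.0]
```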