Hong Kong University of Science and Technology
COMP527: Pattern Recognition
Fall 2001
Solution for Midterm Examination
1. The statement is incorrect.
The objective of classifier training is to minimize the classification error of the test set, which is a sufficiently large random sample drawn from the feature space according to the underlying data distribution. Minimizing the classification error of the training set or the validation set is only an indirect way of trying to achieve this ultimate objective.
2. The statement is correct.
Bayesian estimation integrates the prior parameter distribution p(\theta) and the likelihood p(D|\theta) of data set D to obtain the posterior parameter distribution p(\theta|D), from which MAP estimation is performed to find a point estimate of the parameter \theta that maximizes p(\theta|D). ML estimation, on the other hand, finds a point estimate of the parameter \theta that maximizes the likelihood p(D|\theta). In other words, we have:

\hat{\theta}_{MAP} = \arg\max_{\theta} p(\theta|D) = \arg\max_{\theta} \frac{p(D|\theta)\,p(\theta)}{p(D)} = \arg\max_{\theta} p(D|\theta)\,p(\theta)

\hat{\theta}_{ML} = \arg\max_{\theta} p(D|\theta)
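To make the distinction concrete, here is a small worked example (my own illustration, not part of the exam): for a Bernoulli parameter with a Beta prior, both estimates have closed forms, so the difference between ML and MAP is easy to see in code.

```python
# Toy comparison of ML vs MAP estimation (illustrative example,
# not from the original exam): Bernoulli likelihood, Beta prior.

def ml_estimate(k, n):
    # theta_ML = argmax_theta p(D|theta) = k/n for k successes in n trials
    return k / n

def map_estimate(k, n, a, b):
    # With a Beta(a, b) prior, the posterior is Beta(k+a, n-k+b),
    # whose mode gives theta_MAP = (k+a-1) / (n+a+b-2).
    return (k + a - 1) / (n + a + b - 2)

k, n = 7, 10                      # 7 successes in 10 trials
print(ml_estimate(k, n))          # 0.7
print(map_estimate(k, n, 2, 2))   # (7+1)/(10+2) = 0.666...
```

Note that with a uniform prior Beta(1, 1), the MAP estimate coincides with the ML estimate, as expected since the prior then contributes nothing to the argmax.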
3. (a) The conditional risks are:

R(\alpha_1|x) = \lambda_{11}P(\omega_1|x) + \lambda_{12}P(\omega_2|x) + \lambda_{13}P(\omega_3|x) = P(\omega_2|x) + P(\omega_3|x)

R(\alpha_2|x) = \lambda_{21}P(\omega_1|x) + \lambda_{22}P(\omega_2|x) + \lambda_{23}P(\omega_3|x) = 2P(\omega_1|x) + 2P(\omega_3|x)

R(\alpha_3|x) = \lambda_{31}P(\omega_1|x) + \lambda_{32}P(\omega_2|x) + \lambda_{33}P(\omega_3|x) = 4P(\omega_1|x) + 4P(\omega_2|x)
(b) The minimum-risk classification rule is:

Decide \omega_1 if P(\omega_2|x) \le 2P(\omega_1|x) + P(\omega_3|x) and P(\omega_3|x) \le 4P(\omega_1|x) + 3P(\omega_2|x)

Decide \omega_2 if 2P(\omega_1|x) + P(\omega_3|x) \le P(\omega_2|x) and 2P(\omega_3|x) \le 2P(\omega_1|x) + 4P(\omega_2|x)

Decide \omega_3 if 4P(\omega_1|x) + 3P(\omega_2|x) \le P(\omega_3|x) and 2P(\omega_1|x) + 4P(\omega_2|x) \le 2P(\omega_3|x)

Reject otherwise
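The rule can be sketched in code by computing the conditional risks from part (a) directly and picking the minimum-risk action. This is only an illustration: the loss matrix below is read off the risk expressions in (a), and the fixed reject cost is an assumed value, since the exam statement defining the reject option is not reproduced here.

```python
# Minimum-risk decision sketch. Losses lambda_ij for action alpha_i,
# true class omega_j, inferred from the risk expressions in part (a).
LAMBDA = [[0, 1, 1],
          [2, 0, 2],
          [4, 4, 0]]

def conditional_risks(posteriors):
    # R(alpha_i|x) = sum_j lambda_ij * P(omega_j|x)
    return [sum(l * p for l, p in zip(row, posteriors)) for row in LAMBDA]

def decide(posteriors, reject_cost=0.5):  # reject_cost is a hypothetical value
    risks = conditional_risks(posteriors)
    best = min(range(3), key=lambda i: risks[i])
    return 'reject' if risks[best] > reject_cost else f'omega_{best + 1}'

print(decide([0.8, 0.1, 0.1]))  # lowest risk is R(alpha_1|x) = 0.2 -> omega_1
```

With a flat posterior every action is risky (the smallest risk is 2/3), so under the assumed reject cost the rule rejects, which matches the "reject otherwise" branch above.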
4. (a) See page 4.37 of the notes.
(b) Using typical nonparametric density estimation methods such as Parzen-window estimation, the likelihood of a test example is estimated based only on training examples in the vicinity of the test example in the feature space. When only limited data are available for training, the estimate obtained can be very inaccurate. On the other hand, parametric methods can estimate the density function well even with limited data, as long as the assumption about the parametric function form is valid.
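A minimal 1-D Parzen-window sketch (my own illustration, not course code) makes the locality point visible: the estimate at a query point is driven almost entirely by training examples near it, so it collapses toward zero wherever data are sparse.

```python
import math

# 1-D Parzen-window density estimate with a Gaussian kernel:
# p_hat(x) = (1/(n*h)) * sum_i K((x - x_i)/h), K the standard normal pdf.
def parzen_density(x, samples, h):
    n = len(samples)
    k = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
    return sum(k((x - xi) / h) for xi in samples) / (n * h)

data = [-0.2, 0.0, 0.1, 0.3, 2.0]        # a tiny training sample
print(parzen_density(0.0, data, h=0.5))  # high: inside the cluster
print(parzen_density(5.0, data, h=0.5))  # near zero: far from all samples
```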
5. Since this is a 2-dimensional problem, we can plot the data points and draw a straight line to separate the training examples into two classes.
Let the equation of the straight line be x_1 + a x_2 + b = 0. Since it passes through the points (1, 0) and (0, -2), the following two equations should hold:

1 + b = 0
-2a + b = 0

Solving these two equations, we can see that a = -1/2 and b = -1. Thus the equation of the straight line can be expressed as 2x_1 - x_2 - 2 = 0, which implies that the linear discriminant function g(x) = 2x_1 - x_2 - 2 can solve the classification task. To verify this, we substitute the five points into g(x):

g(2, 1) = 1 (> 0)
g(3, 2) = 2 (> 0)
g(4, 4) = 2 (> 0)
g(1, 1) = -1 (< 0)
g(3, 5) = -1 (< 0)
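The substitution above can be reproduced in a few lines (an illustrative script; the points and expected signs are taken straight from the solution):

```python
# Verify the linear discriminant g(x) = 2*x1 - x2 - 2 on the five
# training examples from the solution.
def g(x1, x2):
    return 2 * x1 - x2 - 2

positive = [(2, 1), (3, 2), (4, 4)]  # expected g > 0
negative = [(1, 1), (3, 5)]          # expected g < 0

for p in positive:
    print(p, g(*p))  # 1, 2, 2
for p in negative:
    print(p, g(*p))  # -1, -1
```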
6. The first part of the derivation is independent of the type of output units used. The total error is given by:

E = \sum_{p=1}^{n} E^{(p)}, \quad \text{where} \quad E^{(p)} = \frac{1}{2} \sum_{k=1}^{c} \left( t_k^{(p)} - y_k^{(p)} \right)^2

The gradient term with respect to a hidden-to-output weight is:

\frac{\partial E}{\partial w_{kj}} = \sum_{p=1}^{n} \frac{\partial E^{(p)}}{\partial w_{kj}} = \sum_{p=1}^{n} \frac{\partial E^{(p)}}{\partial y_k^{(p)}} \, \frac{\partial y_k^{(p)}}{\partial x_k^{(p)}} \, \frac{\partial x_k^{(p)}}{\partial w_{kj}} = -\sum_{p=1}^{n} \left( t_k^{(p)} - y_k^{(p)} \right) f'(x_k^{(p)}) \, y_j^{(p)}
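The gradient expression can be checked numerically for a single pattern and output unit. The logistic activation below is an assumed concrete choice of f (the formula at this stage holds for any differentiable activation); the finite-difference gradient should agree closely with the analytic one.

```python
import math

# Check dE/dw_j = -(t - y) * f'(x) * yh_j for one logistic output unit,
# where x = sum_j w_j * yh_j, y = f(x), and f'(x) = y * (1 - y).
def f(x):
    return 1.0 / (1.0 + math.exp(-x))

def grad(w, yh, t):
    y = f(sum(wj * yj for wj, yj in zip(w, yh)))
    return [-(t - y) * y * (1 - y) * yj for yj in yh]

def grad_numeric(w, yh, t, eps=1e-6):
    # Central differences on E(w) = 0.5 * (t - y)^2
    def E(wv):
        y = f(sum(wj * yj for wj, yj in zip(wv, yh)))
        return 0.5 * (t - y) ** 2
    g = []
    for j in range(len(w)):
        wp = list(w); wp[j] += eps
        wm = list(w); wm[j] -= eps
        g.append((E(wp) - E(wm)) / (2 * eps))
    return g

w, yh, t = [0.3, -0.5, 0.8], [1.0, 0.4, -0.7], 1.0
print(grad(w, yh, t))          # analytic gradient
print(grad_numeric(w, yh, t))  # should agree to high precision
```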
The activation function of the softmax output units is:

f_k(x_k) = y_k = \frac{\exp(x_k)}{\sum_{k'=1}^{c} \exp(x_{k'})}, \quad \text{where} \quad x_k = \sum_{j=0}^{h} w_{kj} y_j
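As a quick sanity check (an illustrative sketch, not course code), the softmax definition can be evaluated directly; subtracting the maximum input before exponentiating is a standard numerical-stability trick that does not change the result, and is not part of the formula above.

```python
import math

# Softmax over the net activations x_k, as defined above:
# y_k = exp(x_k) / sum_{k'} exp(x_{k'})
def softmax(x):
    m = max(x)                              # stability shift (exp(x-m))
    exps = [math.exp(xk - m) for xk in x]
    s = sum(exps)
    return [e / s for e in exps]

y = softmax([2.0, 1.0, 0.1])
print(y)        # largest input gets the largest output
print(sum(y))   # outputs sum to 1, so they can act as class posteriors
```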