
LLM W6

LLM Week 6 assignment

NPTEL » Introduction to Large Language Models (LLMs) » Week 6: Assignment 6

Your last recorded submission was on 2025-08-28, 21:06 IST. Due date: 2025-09-05, 23:59 IST.

1) RoPE uses additive embeddings like sinusoidal encoding. (1 point)
- True
- False

2) Which of the following is true about multi-head attention? (1 point)
- It increases model interpretability by using a single set of attention weights
- Each head operates on different parts of the input in parallel
- It reduces the number of parameters in the model
- Heads are averaged before applying the softmax function

3) ________________________________ (1 point)
- Improve gradient flow during backpropagation
- Normalize input embeddings
- Reduce computational complexity
- Prevent overfitting

4) The feed-forward network in a Transformer block introduces non-linearity between attention layers. (1 point)
- True
- False

5) The sinusoidal positional encoding uses sine for even dimensions and ____ for odd dimensions. (1 point)
- cosine
- none of these

6) Why is positional encoding added to input embeddings in Transformers? (1 point)
- To provide unique values for each word
- To indicate the position of tokens, since Transformers are non-sequential
- To scale embeddings
- To avoid vanishing gradients

7) You are given a self-attention layer with input dimension 512, using 8 heads. What is the output dimension per head? (1 point)

8) For a transformer with d_model = 512, calculate the positional encoding for position p = 14 and dimensions 6 and 7 using the sinusoidal formula (2 points):
PE(p, 2i) = sin(p / 10000^(2i / d_model))
PE(p, 2i+1) = cos(p / 10000^(2i / d_model))

You may submit any number of times before the due date. The final submission will be considered for grading.
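Question 1 turns on whether RoPE adds a positional vector the way sinusoidal encoding does. The sketch below is an assumed, illustrative PyTorch implementation (the helper name rope and the even/odd pairing of dimensions are choices made here, not the course's reference code); it shows that RoPE instead rotates each feature pair by a position-dependent angle.

```python
import torch

# Minimal RoPE sketch (illustrative, not the course's reference code).
# Unlike sinusoidal encoding, which ADDS a positional vector to the embedding,
# RoPE ROTATES each (even, odd) feature pair by a position-dependent angle.
def rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary position embedding to x of shape (seq_len, d), with d even."""
    seq_len, d = x.shape
    pos = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)          # (seq_len, 1)
    inv_freq = base ** (-torch.arange(0, d, 2, dtype=torch.float32) / d)   # (d/2,)
    angles = pos * inv_freq                                                # (seq_len, d/2)
    cos, sin = angles.cos(), angles.sin()
    x_even, x_odd = x[:, 0::2], x[:, 1::2]
    out = torch.empty_like(x)
    out[:, 0::2] = x_even * cos - x_odd * sin   # rotate each 2-D feature pair
    out[:, 1::2] = x_even * sin + x_odd * cos
    return out

q = torch.randn(10, 64)        # 10 positions, 64-dimensional query vectors
print(rope(q).shape)           # torch.Size([10, 64])
```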
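Questions 2 and 7 both rest on how multi-head attention splits the model dimension across heads. A short sketch using torch.nn.MultiheadAttention (the library choice and tensor sizes are assumptions for illustration) shows the per-head dimension for d_model = 512 and 8 heads.

```python
import torch
import torch.nn as nn

# Sketch: the 512-dimensional input is split across 8 heads, so each head
# attends over a 512 // 8 = 64-dimensional slice in parallel, and the head
# outputs are concatenated back to 512 dimensions.
d_model, num_heads = 512, 8
d_head = d_model // num_heads                      # 64 dimensions per head (Question 7)

mha = nn.MultiheadAttention(embed_dim=d_model, num_heads=num_heads, batch_first=True)
x = torch.randn(1, 10, d_model)                    # (batch, sequence length, d_model)
out, attn_weights = mha(x, x, x)                   # self-attention: query = key = value = x
print(d_head, out.shape)                           # 64 torch.Size([1, 10, 512])
```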
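Question 8 can be checked directly from the sinusoidal formula given in the question. The sketch below (the helper name positional_encoding is hypothetical) evaluates PE(14, 6) and PE(14, 7) for d_model = 512; dimensions 6 and 7 form the pair 2i and 2i+1 with i = 3, so both use the angle 14 / 10000^(6/512).

```python
import math

# Sinusoidal formula from Question 8:
#   PE(p, 2i)   = sin(p / 10000^(2i / d_model))
#   PE(p, 2i+1) = cos(p / 10000^(2i / d_model))
def positional_encoding(p: int, dim: int, d_model: int = 512) -> float:
    """Sinusoidal positional encoding for position p at a single dimension index."""
    i = dim // 2                                   # dims 6 and 7 share i = 3
    angle = p / (10000 ** (2 * i / d_model))
    return math.sin(angle) if dim % 2 == 0 else math.cos(angle)

# d_model = 512, position p = 14, dimensions 6 and 7:
print(positional_encoding(14, 6))                  # sin(14 / 10000^(6/512))
print(positional_encoding(14, 7))                  # cos(14 / 10000^(6/512))
```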
