From bbe8349341e9c264d104bce0127332014060a5cd Mon Sep 17 00:00:00 2001
From: Revanth <109272714+revanth1718@users.noreply.github.com>
Date: Sat, 22 Jun 2024 23:45:04 +0530
Subject: [PATCH 1/4] Add files via upload

---
 contrib/machine-learning/assets/XG_1.webp | Bin 0 -> 16294 bytes
 1 file changed, 0 insertions(+), 0 deletions(-)
 create mode 100644 contrib/machine-learning/assets/XG_1.webp

diff --git a/contrib/machine-learning/assets/XG_1.webp b/contrib/machine-learning/assets/XG_1.webp
new file mode 100644
index 0000000000000000000000000000000000000000..c693d3da7138fa30b344d82d640a233440c783e3
GIT binary patch
literal 16294
zfWfFOA~t0nQ+68g0;+785W_xk3O{b8qwqIc>7%%#cjLP(#YKF%dKc0E;9 zISXP!ZUF(G6jPaoORE^+e>DyE>;=rr4nbtM!m2s&z}mM;$g6HMh~wS{D41--DYNwHM62|u)06c>8rfzf3Dk-g=w zg~Ho}Iez3SLtd5+x$(!@TI%I;A`@1#km&0WMLNd!t^R}jf>;Q}Q@P7sh4alcWm+xF z3q{USAf7r)VO(?xBp|f5r>QZ3zU~9ina%`VwqNjom#Ur^WeJ&82nOu`sRApV1}Pt| zPK|=Fmp1bGs2N3#iBoRI%nwf94^Sz%158ip_hJ3!x77ZFAI4{w=v|)ufzeu{3$t~c zDoU>k58+fD#V8>AT7oCAJX-zUrIyMsBINF2b~*gxm8~yr!*t=43rsCc0Eukx7tBxY z4gs4)76;LNDR1|TmZ;&TtTf!TI#ixTA>{%NbDmR2uKul`v6`%XL^d|MS3)k;;^xJ6 zW^xgcW7w0h0 z>`Ap#CVgQWeJa7Y;QENajqr@}^B5ok(79`8+RZsL;CAO9n$*bDU(P>*NP%G^QK5&k z8Ta?v{($5WP7*hFhu?I*J9yC(c|B)LvTeK>u9S01gIA;oV1*WiASmH$0M;FP|LialS`3RyE=4!k~tjiBOT;OvbHp4r`omOZdx3-~B`sJ=j$~LEQV_r_H&Q@$ff16M4TFd5B)3;uJi3YARc?hlOgW zsPq8a+|rXtx505J_I@y`E9cS|z>KuU`CzZLF~08|h#0LVZ6e>yG*#DAsQ zKaBT3toIL7|IY^WA4UWH|LrZq!2fIikMO@u^56I$p8rom0s!$)0Puer<{vD~APN76 nLH;oS^8cr2VP*!&|35|`5C8zp|3Ai(e_Mh55BvRx@?ZOZ$>WS_ literal 0 HcmV?d00001 From 995c7c6581a7c44cc81c184876f8d55a901b63fa Mon Sep 17 00:00:00 2001 From: Revanth <109272714+revanth1718@users.noreply.github.com> Date: Sat, 22 Jun 2024 23:45:44 +0530 Subject: [PATCH 2/4] Add Xgboost --- contrib/machine-learning/xgboost.md | 96 +++++++++++++++++++++++++++++ 1 file changed, 96 insertions(+) create mode 100644 contrib/machine-learning/xgboost.md diff --git a/contrib/machine-learning/xgboost.md b/contrib/machine-learning/xgboost.md new file mode 100644 index 0000000..f704463 --- /dev/null +++ b/contrib/machine-learning/xgboost.md @@ -0,0 +1,96 @@ +# XGBoost +XGBoost is an implementation of gradient boosted decision trees designed for speed and performance. +## Introduction to Gradient Boosting +Gradient boosting is a powerful technique for building predictive models that has seen widespread success in various applications. + +- **Boosting Concept**: Boosting originated from the idea of modifying weak learners to improve their predictive capability. 
+- **AdaBoost**: The first successful boosting algorithm was Adaptive Boosting (AdaBoost), which utilizes decision stumps as weak learners.
+- **Gradient Boosting Machines (GBM)**: AdaBoost and related algorithms were later reformulated as Gradient Boosting Machines, casting boosting as a numerical optimization problem.
+- **Algorithm Elements**:
+  - _Loss function_: Determines the objective to minimize (e.g., cross-entropy for classification, mean squared error for regression).
+  - _Weak learner_: Typically, decision trees are used as weak learners.
+  - _Additive model_: New weak learners are added iteratively to minimize the loss function, correcting the errors of previous models.
+## Introduction to XGBoost
+- eXtreme Gradient Boosting (XGBoost): a more **regularized form** of Gradient Boosting, as it uses **advanced regularization (L1 & L2)**, improving the model’s **generalization capabilities**.
+- It’s suitable when there is **a large number of training samples and a small number of features**, or when there is **a mixture of categorical and numerical features**.
+
+
+- **Development**: Created by Tianqi Chen, XGBoost is designed for computational speed and model performance.
+- **Key Features**:
+  - _Speed_: Achieved through careful engineering, including parallelization of tree construction, distributed computing, and cache optimization.
+  - _Support for Variations_: XGBoost supports various techniques and optimizations.
+  - _Out-of-Core Computing_: Can handle very large datasets that don't fit into memory.
+- **Advantages**:
+  - _Sparse Optimization_: Suitable for datasets with many zero values.
+  - _Regularization_: Implements advanced regularization techniques (L1 and L2), enhancing generalization capabilities.
+  - _Parallel Training_: Utilizes all CPU cores during training for faster processing.
+  - _Multiple Loss Functions_: Supports different loss functions based on the problem type.
+  - _Bagging and Early Stopping_: Additional techniques for improving performance and efficiency.
+- **Pre-Sorted Decision Tree Algorithm**:
+  1. Features are pre-sorted by their values.
+  2. Finding the best split point on a feature requires traversing the sorted values, at a cost of O(#data).
+  3. Data is split into left and right child nodes after finding the split point.
+  4. Pre-sorting allows for accurate split point determination.
+  - **Limitations**:
+
+    1. Iterative Traversal: Each iteration requires traversing the entire training data multiple times.
+    2. Memory Consumption: Loading the entire training data into memory limits the dataset size, while not loading it leads to time-consuming read/write operations.
+    3. Space Consumption: Pre-sorting consumes extra space to store the feature sorting results and split gain calculations.
+
+  XGBoosting:![enter image description here](https://miro.medium.com/v2/resize:fit:1100/format:webp/1*8Y_e29rVdBZ4pC3DFDEZDQ.png)
+## Develop Your First XGBoost Model
+This code uses the XGBoost library to train a model on the Iris dataset: it splits the data, sets hyperparameters, trains the model, makes predictions, and evaluates accuracy, achieving an accuracy of 1.0 on the testing set.
+
+
+```python
+# XGBoost with Iris Dataset
+# Importing necessary libraries
+import numpy as np
+import xgboost as xgb
+from sklearn.datasets import load_iris
+from sklearn.model_selection import train_test_split
+from sklearn.metrics import accuracy_score
+
+# Loading a sample dataset (Iris dataset)
+data = load_iris()
+X = data.data
+y = data.target
+
+# Splitting the dataset into training and testing sets
+X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
+
+# Converting the dataset into DMatrix format
+dtrain = xgb.DMatrix(X_train, label=y_train)
+dtest = xgb.DMatrix(X_test, label=y_test)
+
+# Setting hyperparameters for XGBoost
+params = {
+    'max_depth': 3,
+    'eta': 0.1,
+    'objective': 'multi:softmax',
+    'num_class': 3
+}
+
+# Training the XGBoost model
+num_round = 50
+model = xgb.train(params, dtrain, num_round)
+
+# Making predictions on the testing set
+y_pred = model.predict(dtest)
+
+# Evaluating the model
+accuracy = accuracy_score(y_test, y_pred)
+print("Accuracy:", accuracy)
+```
+
+### Output
+
+    Accuracy: 1.0
+
+## **Conclusion**
+
+XGBoost's focus on speed, performance, and scalability has made it one of the most widely used and powerful predictive modeling algorithms available. Its ability to handle large datasets efficiently, along with its advanced features and optimizations, makes it a valuable tool in machine learning and data science.
+
+
+## Reference
+- [Machine Learning Prediction of Turning Precision Using Optimized XGBoost Model](https://www.mdpi.com/2076-3417/12/15/7739)
\ No newline at end of file

From bc20d105b779532363b05ce104c5a519742c0cd3 Mon Sep 17 00:00:00 2001
From: Revanth <109272714+revanth1718@users.noreply.github.com>
Date: Sat, 22 Jun 2024 23:49:10 +0530
Subject: [PATCH 3/4] Update xgboost.md

---
 contrib/machine-learning/xgboost.md | 18 +++++++-----------
 1 file changed, 7 insertions(+), 11 deletions(-)

diff --git a/contrib/machine-learning/xgboost.md b/contrib/machine-learning/xgboost.md
index f704463..1eb7f09 100644
--- a/contrib/machine-learning/xgboost.md
+++ b/contrib/machine-learning/xgboost.md
@@ -1,8 +1,8 @@
 # XGBoost
 XGBoost is an implementation of gradient boosted decision trees designed for speed and performance.
+
 ## Introduction to Gradient Boosting
 Gradient boosting is a powerful technique for building predictive models that has seen widespread success in various applications.
-
 - **Boosting Concept**: Boosting originated from the idea of modifying weak learners to improve their predictive capability.
 - **AdaBoost**: The first successful boosting algorithm was Adaptive Boosting (AdaBoost), which utilizes decision stumps as weak learners.
 - **Gradient Boosting Machines (GBM)**: AdaBoost and related algorithms were later reformulated as Gradient Boosting Machines, casting boosting as a numerical optimization problem.
 - **Algorithm Elements**:
@@ -10,11 +10,10 @@ Gradient boosting is a powerful technique for building predictive models that ha
   - _Loss function_: Determines the objective to minimize (e.g., cross-entropy for classification, mean squared error for regression).
   - _Weak learner_: Typically, decision trees are used as weak learners.
   - _Additive model_: New weak learners are added iteratively to minimize the loss function, correcting the errors of previous models.
+
 ## Introduction to XGBoost
 - eXtreme Gradient Boosting (XGBoost): a more **regularized form** of Gradient Boosting, as it uses **advanced regularization (L1 & L2)**, improving the model’s **generalization capabilities**.
 - It’s suitable when there is **a large number of training samples and a small number of features**, or when there is **a mixture of categorical and numerical features**.
-
-
 - **Development**: Created by Tianqi Chen, XGBoost is designed for computational speed and model performance.
 - **Key Features**:
   - _Speed_: Achieved through careful engineering, including parallelization of tree construction, distributed computing, and cache optimization.
@@ -32,16 +31,15 @@ Gradient boosting is a powerful technique for building predictive models that ha
   3. Data is split into left and right child nodes after finding the split point.
   4. Pre-sorting allows for accurate split point determination.
   - **Limitations**:
-
   1. Iterative Traversal: Each iteration requires traversing the entire training data multiple times.
   2. Memory Consumption: Loading the entire training data into memory limits the dataset size, while not loading it leads to time-consuming read/write operations.
-  3. Space Consumption: Pre-sorting consumes space, storing feature sorting results and split gain calculations.
-
-  XGBoosting:![enter image description here](https://miro.medium.com/v2/resize:fit:1100/format:webp/1*8Y_e29rVdBZ4pC3DFDEZDQ.png)
+  3. Space Consumption: Pre-sorting consumes space, storing feature sorting results and split gain calculations.
+  XGBoosting:
+  ![image](assets/XG_1.webp)
+
 ## Develop Your First XGBoost Model
 This code uses the XGBoost library to train a model on the Iris dataset: it splits the data, sets hyperparameters, trains the model, makes predictions, and evaluates accuracy, achieving an accuracy of 1.0 on the testing set.
-
 ```python
 # XGBoost with Iris Dataset
 # Importing necessary libraries
@@ -88,9 +86,7 @@ print("Accuracy:", accuracy)
 
     Accuracy: 1.0
 
 ## **Conclusion**
-
 XGBoost's focus on speed, performance, and scalability has made it one of the most widely used and powerful predictive modeling algorithms available. Its ability to handle large datasets efficiently, along with its advanced features and optimizations, makes it a valuable tool in machine learning and data science.
-
 ## Reference
-- [Machine Learning Prediction of Turning Precision Using Optimized XGBoost Model](https://www.mdpi.com/2076-3417/12/15/7739)
\ No newline at end of file
+- [Machine Learning Prediction of Turning Precision Using Optimized XGBoost Model](https://www.mdpi.com/2076-3417/12/15/7739)

From da0d7f9beb36e76f3e47ed4025f98d82b13381e0 Mon Sep 17 00:00:00 2001
From: Revanth <109272714+revanth1718@users.noreply.github.com>
Date: Sat, 22 Jun 2024 23:50:33 +0530
Subject: [PATCH 4/4] Update index.md

---
 contrib/machine-learning/index.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/contrib/machine-learning/index.md b/contrib/machine-learning/index.md
index df7a3e5..caea81e 100644
--- a/contrib/machine-learning/index.md
+++ b/contrib/machine-learning/index.md
@@ -25,3 +25,4 @@
 - [Naive Bayes](naive-bayes.md)
 - [Neural network regression](neural-network-regression.md)
 - [PyTorch Fundamentals](pytorch-fundamentals.md)
+- [Xgboost](xgboost.md)
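The three Algorithm Elements described in the tutorial above (loss function, weak learner, additive model) can also be sketched from scratch. The following is a toy regression illustration using only NumPy, with decision stumps as the weak learners and squared error as the loss; it is not how XGBoost is implemented internally.

```python
# Toy gradient boosting from scratch: squared-error loss, depth-1 stumps,
# and an additive model fitted to the residuals at each round.
import numpy as np

def fit_stump(x, residual):
    # Exhaustively search the best single-threshold split on one feature
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(x <= t, left.mean(), right.mean())
        err = ((residual - pred) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def gradient_boost(x, y, n_rounds=50, eta=0.1):
    base = y.mean()                      # initial constant prediction
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred              # negative gradient of squared-error loss
        stump = fit_stump(x, residual)   # weak learner fitted to the residuals
        pred += eta * stump(x)           # additive model update with shrinkage
        stumps.append(stump)
    def predict(z):
        return base + eta * sum(s(z) for s in stumps)
    return predict

# Fit a noisy sine curve
rng = np.random.default_rng(0)
x = rng.uniform(0, 6, 200)
y = np.sin(x) + rng.normal(0, 0.1, 200)
model = gradient_boost(x, y)
mse = ((y - model(x)) ** 2).mean()
print(f"Training MSE: {mse:.4f}")
```

Each round fits the next stump to what the current ensemble still gets wrong, so the training error shrinks steadily; XGBoost follows the same additive scheme but with regularized trees, second-order gradient information, and the engineering optimizations described earlier.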