Product Details
Book Introduction
This book covers optimization theory and how to train a variety of deep learning models using TensorFlow. Many branches of mathematics bear on deep learning, but this book approaches the subject through the most essential of them: the optimization problem. Deep learning books and lectures published to date either skip optimization theory or treat it so briefly that it cannot really be understood. This book therefore introduces optimization theory first and then solves optimization problems using TensorFlow. That approach stays the same when TensorFlow is used to train deep learning models. In fact, beyond TensorFlow there are many other deep learning packages, such as PyTorch and Keras; the ultimate goal of all of them is to solve an optimization problem.
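The book's central claim, that training a model is ultimately solving an optimization problem, can be sketched with a toy example (not taken from the book): minimizing f(w) = (w - 3)^2 by gradient descent. This is the same loop that TensorFlow, PyTorch, and Keras automate at scale; the function, learning rate, and iteration count here are illustrative choices only.

```python
# Toy gradient descent: minimize f(w) = (w - 3)^2.
# Deep learning frameworks automate exactly this loop, only with
# millions of parameters and automatically computed gradients.

def grad(w):
    # Analytic derivative of f: f'(w) = 2 * (w - 3)
    return 2.0 * (w - 3.0)

w = 0.0              # initial guess
learning_rate = 0.1
for _ in range(100):
    w -= learning_rate * grad(w)  # step against the gradient

print(round(w, 4))   # converges toward the minimizer w = 3
```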
About the Author
Çѱ¹°úÇбâ¼ú¿ø¿¡¼ Àü»êÇаú¸¦ Àü°øÇϰí, ¿¬¼¼´ëÇб³ °è»ê°úÇаøÇаú¿¡¼ ÀüÀÚ°øÇÐ ¼®»ç¸¦ Á¹¾÷Çß´Ù. »ï¼º¸Þµð½¼, ¿£ºñµð¾Æ¿¡¼ ±Ù¹«Çß°í, ÇöÀç´Â Áö¸à½º Çï½Ã´Ï¾î½º¿¡¼ ±Ù¹«Çϰí ÀÖ´Ù.
Table of Contents
Preface
Introduction

PART 1 Preparing to Program
Chapter 01 Setting Up the Development Environment
1.1 Installing Anaconda
1.1.1 Installing on Windows
1.1.2 Installing on macOS
1.1.3 How to Run the Terminal
1.1.4 Creating and Deleting Environments, and Installing Packages
1.1.5 Activating and Deactivating an Environment
1.1.6 Installing Packages Inside an Environment
1.1.7 Exporting and Importing an Environment
1.2 Installing TensorFlow and Related Packages
1.2.1 Importing from a yml File
1.2.2 Setting Up Directly Without yml
Chapter 02 Jupyter Notebook and Python Tutorial
2.1 Jupyter Notebook
2.1.1 Running Python Code
2.1.2 Markdown
2.1.3 Handy Features
2.2 Basic Python Syntax
2.2.1 Declaring Variables and Functions, and Anonymous Functions
2.2.2 Major Variable Types
2.2.3 for Statements (for loop)
2.2.4 if Statements
2.2.5 Generators
2.3 Frequently Used Python Syntax Patterns
2.3.1 for Loop Styles for Different Data Types
2.3.2 for Loops with zip
2.3.3 One-Line for Statements
2.3.4 Reading/Writing Files
2.4 numpy array
2.4.1 n-Dimensional Arrays
2.4.2 Array Shape
2.4.3 Transpose
2.4.4 Reshape
2.4.5 Array Indexing
2.5 Visualization Package (matplotlib) Tutorial
2.5.1 Drawing Scatter Plots
2.5.2 Drawing Pair Plots
2.5.3 Plotting Single-Variable Functions
2.5.4 Viewing Multiple Plots at a Glance
2.5.5 Styling Plots
2.5.6 Plotting Multivariable Functions
Chapter 03 TensorFlow Tutorial
3.1 Installing TensorFlow
3.2 Understanding the Structure of TensorFlow
3.2.1 Graph
3.2.2 Tensor
3.2.3 Operation
3.3 When Operations Start
3.4 Three Major Types
3.4.1 Constant
3.4.2 Placeholder
3.4.3 Variable
3.5 Basic Math Operations
3.5.1 Scalar Addition
3.5.2 Various Functions Provided by TensorFlow
3.5.3 Reduction

PART 2 Numerical Analysis Theory for Deep Learning
Chapter 04 Linear Algebra and Differentiation for Optimization Theory
4.1 Linear Algebra
4.1.1 Directions of Linear Algebra Across Curricula
4.1.2 Definitions and Notation
4.1.3 Vector/Vector Operations
4.1.4 Matrix/Vector Operations
4.1.5 Matrix/Matrix Operations
4.1.6 Solving Linear Systems
4.2 Linear Algebra Notation Frequently Used in Deep Learning
4.3 Differentiation and the Gradient
Chapter 05 Optimization Theory for Deep Learning
5.1 Optimization Problems in Deep Learning
5.2 Where Optimization Problems Come From
5.3 How to Read Optimization Problem Formulations
5.3.1 Linear Regression Using the Sum of Squares
5.3.2 Linear Regression Using the Sum of Absolute Values
5.4 A Preview of Various Deep Learning Models and Their Optimization Problems
Chapter 06 Classical Numerical Optimization Algorithms
6.1 Why Numerical Optimization Algorithms Are Needed
6.2 The Pattern of Numerical Optimization Algorithms
6.3 Gradient Descent
6.3.1 Gradient Descent by Example
6.3.2 Limitations of the Gradient Descent Method
6.4 Training a Linear Regression Model with Gradient Descent
6.4.1 The Linear Regression Problem Formulation
6.4.2 Applying the Gradient Descent Method
6.4.3 Limitations
Chapter 07 Numerical Optimization Algorithms for Deep Learning
7.1 Stochastic Methods
7.2 Code Implementation Patterns for Stochastic Methods
7.3 Search-Direction-Based Algorithms
7.3.1 Stochastic Gradient Descent
7.3.2 Momentum/Nesterov Methods
7.4 Learning-Rate-Based Algorithms
7.4.1 The Need for Adaptive Learning Rates
7.4.2 Adagrad
7.4.3 RMSProp (Root Mean Square Propagation)
7.4.4 Adam

PART 3 Training Basic Deep Learning Models with TensorFlow
Chapter 08 Linear Regression Models
8.1 Prediction Models and Loss Functions
8.2 Deterministic and Stochastic Methods
8.2.1 Deterministic Methods
8.2.2 Stochastic Methods
8.3 Nonlinear Regression Models
8.3.1 Quadratic Curve Data
8.3.2 Cubic Curve Data
8.3.3 Trigonometric Curve Data
8.4 Nonlinear Feature Estimation and Neural Network Models
Chapter 09 Linear Classification Models
9.1 Binary Classification Models
9.1.1 Continuous Probability Models
9.1.2 Maximum Likelihood and Cross Entropy
9.1.3 Training with Mini-Batches
9.1.4 Nonlinear Classification Models Using Features
9.2 Multiclass Classification Models
9.2.1 Softmax
9.2.2 One-Hot Encoding
9.2.3 Cross Entropy for Multiclass Models
9.2.4 Training with Mini-Batches
9.2.5 MNIST
Chapter 10 Neural Network Regression Models
10.1 Why Neural Network Models Are Needed
10.2 Neural Network Terminology
10.3 Implementing a Neural Network Model
10.4 Various Representations of Neural Network Models
10.5 How Automatic Feature Extraction Works
10.6 Drawbacks of Neural Network Models
Chapter 11 Neural Network Classification Models
11.1 Why Neural Network Classification Models Are Needed
11.2 Various Data Distributions and Neural Network Classifiers
11.2.1 Training a Neural Network Classifier
11.2.2 Checkerboard Example
11.2.3 Irregular Data Distribution Example
11.3 Various Representations of Neural Network Classifiers
11.4 The MNIST Classification Problem

PART 4 Training/Test Data and Underfitting/Overfitting
Chapter 12 Introduction to Underfitting/Overfitting
12.1 Deep Learning Models and Functions
12.2 Training Data and the True Function
12.3 The True Function and Test Data
12.4 Two Causes of Underfitting/Overfitting
Chapter 13 Diagnosing and Fixing Underfitting
13.1 Resetting the Number of Training Iterations
13.2 Resetting the Learning Rate
13.3 Increasing Model Complexity
13.4 An Underfit Neural Network Classifier
13.5 Underfitting Summary
Chapter 14 Diagnosing and Fixing Overfitting
14.1 Reducing the Number of Training Iterations
14.2 Adding a Regularization Function
14.2.1 L2 Regularization
14.2.2 L1 Regularization
14.3 Dropout
14.4 Classification Problems
14.5 Introducing Cross-Validation Data
Chapter 15 Using TensorBoard
15.1 Drawing Graphs
15.2 Drawing Histograms
15.3 Drawing Images
15.4 Applying TensorBoard to Neural Network Training
Chapter 16 Saving and Loading Models
16.1 Saving
16.2 Loading
16.3 An Applied Example of Fixing Overfitting
Chapter 17 Deep Learning Guidelines
17.1 How a Deep Learning Project Proceeds
17.1.1 Choosing a Model and Loss Function
17.1.2 Training the Model
17.1.3 Checking for Underfitting
17.1.4 Checking for Overfitting
17.1.5 Checking Final Performance
17.2 Fundamental Limits of Deep Learning
17.2.1 The Loss Function Sees Only the Training Data
17.2.2 Data Preprocessing Is Very Important
17.2.3 Loss and Accuracy Are Different
17.2.4 The Test Data Distribution Can Never Be Fully Known

PART 5 Deep Learning Models
Chapter 18 CNN Models
18.1 What Is Deep Learning?
18.2 Introduction to CNN Models
18.3 Convolution
18.3.1 Kernel/Filter
18.3.2 Strides
18.3.3 Padding
18.4 Max-Pooling
18.5 Dropout
18.6 The ReLU Activation Function
18.6.1 The Vanishing Gradient Problem
18.6.2 Understanding the Problem
18.6.3 The Cause of the Problem
18.6.4 The Solution
18.7 Automatic Feature Extraction
18.8 The MNIST Digit Classification Problem
18.8.1 Exploring the Data
18.8.2 One-Hot Encoding
18.8.3 Building the CNN Model
18.8.4 Setting Up the Optimization Problem
18.8.5 Setting the Hyperparameters
18.8.6 Starting Training
18.8.7 Checking Accuracy
18.8.8 Full Code
Chapter 19 GAN (Generative Adversarial Networks) Models
19.1 Introduction to the min-max Optimization Problem
19.2 Generator
19.2.1 Variable Scope
19.2.2 Leaky ReLU
19.2.3 Tanh Output
19.3 Discriminator
19.4 Building the GAN Network
19.4.1 Hyperparameters
19.5 Loss Functions
19.6 Training
19.6.1 Setting Training Details
19.6.2 Training Loss
19.6.3 Sample Images from the Generator
19.6.4 Generating New Images with the Generator
19.7 Useful Links and Full Code
19.7.1 Useful Links
19.7.2 Full Code

PART 6 Applied Problems
Chapter 20 Images
20.1 Introduction to Transfer Learning
20.2 Flower Photo Classification
20.2.1 Required Background
20.2.2 Preparing the Environment
20.2.3 The Problem
20.2.4 The VGG16 Model
20.2.5 Exploring the Data
20.2.6 Building the Model
20.2.7 Setting Up the Optimization Problem
20.2.8 Setting the Hyperparameters
20.2.9 Training
20.2.10 Accuracy
20.3 Bottleneck Feature Extraction
20.4 Full Transfer Learning Code
Chapter 21 Text Analysis: word2vec
21.1 Word Embeddings
21.2 One-hot encoding
21.3 The Word2Vec Model
21.3.1 Preparing the Environment
21.3.2 Preprocessing
21.3.3 SubSampling
21.3.4 Making Batches
21.3.5 Building the Graph
21.3.6 Embedding
21.3.7 Negative Sampling
21.3.8 Validation
21.3.9 Training
21.4 Visualization with T-SNE
21.5 Finding the Full Code