scikit-learn [Machine Learning]
- Course structure
- (20 lectures) Total: 13 hours 1 minute | Average: 39 minutes 2 seconds
- Tuition
- 450,000 KRW

- 225,000 KRW
scikit-learn [Machine Learning] online course - lifelong education center course guide
This is a machine learning course based on the open-source scikit-learn library. The Alzio scikit-learn machine learning course is built around hands-on practice using open-source machine learning tools such as the scikit-learn library.
-
01. 34 min
Data Preparation
Learn how to load, summarize, visualize, and split data (code sketch below). (Python libraries: os, tarfile, urllib, numpy, pandas, matplotlib.pyplot, etc.)
Bookmarks: [00:09] What this machine learning course covers / [01:20] Installing the required programs and libraries / [05:22] Data Preparation practice - downloading and extracting files / [14:03] Data Load / [16:25] Data Summarization / [20:11] Data Visualization / [23:46] Data Split / [28:02] Data Split - separating train and test sets
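The steps above can be sketched in a few lines of Python; the housing.csv file name is an assumption standing in for whatever dataset the lecture downloads.

    # Minimal data-preparation sketch (housing.csv is a placeholder file name).
    import pandas as pd
    import matplotlib.pyplot as plt
    from sklearn.model_selection import train_test_split

    housing = pd.read_csv("housing.csv")        # Data Load
    print(housing.info())                       # Data Summarization
    print(housing.describe())
    housing.hist(bins=50, figsize=(12, 8))      # Data Visualization
    plt.show()

    # Data Split: hold out 20% of the rows as a test set.
    train_set, test_set = train_test_split(housing, test_size=0.2, random_state=42)
    print(len(train_set), "train rows /", len(test_set), "test rows")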
-
02. 46 min
Data Understanding
Learn about (Expansion) Data Split, (Comparison) Data Distribution, Data Visualization - Scatter, and Correlation Analysis (code sketch below).
Bookmarks: [00:05] Intro / [02:17] Housing analysis / [06:48] Using train_test_split / [10:43] Using StratifiedShuffleSplit / [13:36] Error notes (regarding Income_cat) / [18:26] Error notes (regarding Income_cat) 2 / [24:50] Comparison (comparing data) / [30:29] Visualizing the data set / [33:38] bad_visualization / [38:28] Correlation analysis / [41:39] Correlation_matrix
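A minimal sketch of stratified splitting and correlation analysis; the housing.csv file and the median_income / median_house_value column names are assumptions.

    # Stratified split on an income category, then a correlation matrix.
    import numpy as np
    import pandas as pd
    from sklearn.model_selection import StratifiedShuffleSplit

    housing = pd.read_csv("housing.csv")
    # Bucket median_income so the split preserves its distribution (income_cat).
    housing["income_cat"] = pd.cut(housing["median_income"],
                                   bins=[0.0, 1.5, 3.0, 4.5, 6.0, np.inf],
                                   labels=[1, 2, 3, 4, 5])

    splitter = StratifiedShuffleSplit(n_splits=1, test_size=0.2, random_state=42)
    for train_idx, test_idx in splitter.split(housing, housing["income_cat"]):
        strat_train = housing.iloc[train_idx]
        strat_test = housing.iloc[test_idx]

    # Correlation analysis against the target column.
    corr_matrix = strat_train.corr(numeric_only=True)
    print(corr_matrix["median_house_value"].sort_values(ascending=False))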
-
03. 42 min
Data Preprocessing
Learn how to handle Data Cleansing, Categorical Data, and Pipeline processing (code sketch below).
Bookmarks: [02:17] Loading the data / [09:50] Using impute / [15:10] Verifying impute / [18:30] Label handling / [23:27] Using one-hot encoding / [28:02] Pipeline / [31:57] Creating Attribute_adder / [35:18] Exploring StandardScaler
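A sketch of the kind of preprocessing pipeline this lecture builds; the column names (ocean_proximity, median_house_value) are assumptions borrowed from a typical housing dataset.

    # Numeric imputation + scaling and categorical one-hot encoding in one pipeline.
    import pandas as pd
    from sklearn.pipeline import Pipeline
    from sklearn.impute import SimpleImputer
    from sklearn.preprocessing import OneHotEncoder, StandardScaler
    from sklearn.compose import ColumnTransformer

    housing = pd.read_csv("housing.csv")
    features = housing.drop(columns=["median_house_value"])
    cat_cols = ["ocean_proximity"]
    num_cols = features.drop(columns=cat_cols).columns.tolist()

    num_pipeline = Pipeline([
        ("imputer", SimpleImputer(strategy="median")),   # Data Cleansing: fill missing values
        ("scaler", StandardScaler()),                    # standardize numeric features
    ])
    full_pipeline = ColumnTransformer([
        ("num", num_pipeline, num_cols),
        ("cat", OneHotEncoder(handle_unknown="ignore"), cat_cols),  # Categorical Data
    ])

    housing_prepared = full_pipeline.fit_transform(features)
    print(housing_prepared.shape)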
-
04. 52 min
ML model (end-to-end)
Practice Model Selection, Model Training, and Model Tuning end-to-end to build an overall understanding of machine learning (code sketch below).
Bookmarks: [01:35] Loading the data / [03:02] Splitting the training and test sets / [06:18] Data cleaning [Data Preparation, Data Processing] / [11:30] Data cleaning 2 [Data Preparation, Data Processing] / [16:06] Machine Learning - Linear Regression / [22:25] Machine Learning - Mean Squared Error, Mean Absolute Error / [27:26] Machine Learning - Decision Tree / [33:50] Machine Learning - CV / [40:39] Machine Learning - CV 2 / [43:14] Machine Learning - Random Forest / [47:07] Machine Learning - SVM
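An end-to-end sketch under the assumption that scikit-learn's built-in California housing data can stand in for the course dataset: split, train several models, and compare them with cross-validation.

    from sklearn.datasets import fetch_california_housing
    from sklearn.model_selection import train_test_split, cross_val_score
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error

    X, y = fetch_california_housing(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    for model in (LinearRegression(),
                  DecisionTreeRegressor(random_state=42),
                  RandomForestRegressor(n_estimators=50, random_state=42)):
        # 5-fold cross-validation on the training set, then a test-set check.
        scores = cross_val_score(model, X_train, y_train,
                                 scoring="neg_mean_squared_error", cv=5)
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        print(type(model).__name__,
              "CV RMSE:", (-scores.mean()) ** 0.5,
              "test MAE:", mean_absolute_error(y_test, pred))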
-
05. 55 min
Classification Part 1
Learn about classification among machine learning algorithms (code sketch below). (Model Training: SGD Classifier, Model Evaluation: Cross Validation + Precision & Recall & F1-score + ROC Curve)
Bookmarks: [01:44] The MNIST data / [04:11] Checking the MNIST data / [07:19] Plotting the MNIST data / [15:55] Plotting the MNIST data 2 / [22:40] Making predictions with the clf / [25:02] Cross Validation / [31:48] Evaluating with the Confusion Matrix / [37:26] Precision & Recall Trade-off / [41:40] Building the Precision & Recall Curve / [47:06] Plotting precision against recall / [50:01] Drawing the ROC Curve
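A sketch of the binary-classification workflow, using scikit-learn's small digits dataset as a lightweight stand-in for MNIST.

    # Train an SGD classifier on a binary target and evaluate it several ways.
    from sklearn.datasets import load_digits
    from sklearn.linear_model import SGDClassifier
    from sklearn.model_selection import cross_val_score, cross_val_predict
    from sklearn.metrics import precision_score, recall_score, f1_score, roc_curve

    X, y = load_digits(return_X_y=True)
    y_is_5 = (y == 5)                                  # binary target: "is this digit a 5?"

    clf = SGDClassifier(random_state=42)
    print("CV accuracy:", cross_val_score(clf, X, y_is_5, cv=3, scoring="accuracy"))

    y_pred = cross_val_predict(clf, X, y_is_5, cv=3)
    print("precision:", precision_score(y_is_5, y_pred),
          "recall:", recall_score(y_is_5, y_pred),
          "F1:", f1_score(y_is_5, y_pred))

    # ROC curve from decision-function scores.
    y_scores = cross_val_predict(clf, X, y_is_5, cv=3, method="decision_function")
    fpr, tpr, thresholds = roc_curve(y_is_5, y_scores)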
-
06. 33 min
Classification Part 2
Learn about classification among machine learning algorithms (code sketch below). (Model Building: SGDClassifier, OneVsOneClassifier, RandomForestClassifier, Error Visualization: Confusion Matrix - matshow, Multiple-Label Classification)
Bookmarks: [00:57] Loading the MNIST data / [06:21] Creating the y label / [12:29] Checking the cause of the label issue / [17:19] Building a Random Forest Classifier / [23:53] Checking the Confusion Matrix / [27:55] Visualizing the Confusion Matrix
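A sketch of multiclass training, confusion-matrix visualization, and multiple-label classification, again on the small digits dataset rather than full MNIST.

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_digits
    from sklearn.multiclass import OneVsOneClassifier
    from sklearn.linear_model import SGDClassifier
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import confusion_matrix

    X, y = load_digits(return_X_y=True)

    ovo_clf = OneVsOneClassifier(SGDClassifier(random_state=42)).fit(X, y)
    forest_clf = RandomForestClassifier(random_state=42).fit(X, y)

    # Error visualization: draw the confusion matrix as an image with matshow.
    y_pred = cross_val_predict(forest_clf, X, y, cv=3)
    conf_mx = confusion_matrix(y, y_pred)
    plt.matshow(conf_mx, cmap="gray")
    plt.show()

    # Multiple-label classification: two binary labels per sample.
    y_multilabel = np.c_[(y >= 7), (y % 2 == 1)]
    knn_clf = KNeighborsClassifier().fit(X, y_multilabel)
    print(knn_clf.predict(X[:1]))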
-
07. 34 min
Regression Part 1
Learn about Linear Regression (code sketch below). (Normal Equation, Batch Gradient Descent, Stochastic Gradient Descent, Mini-batch Gradient Descent)
Bookmarks: [00:55] Normal Equation / [07:40] Plotting the Linear Regression / [12:22] Batch Gradient Descent / [19:38] Stochastic Gradient Descent / [25:21] Mini-batch Gradient Descent
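A sketch of the two core ideas on synthetic linear data: solving the normal equation directly and iterating batch gradient descent (the stochastic and mini-batch variants only change how many samples feed each gradient step).

    import numpy as np

    np.random.seed(42)
    m = 100
    X = 2 * np.random.rand(m, 1)
    y = 4 + 3 * X + np.random.randn(m, 1)
    X_b = np.c_[np.ones((m, 1)), X]                  # add the bias column x0 = 1

    # Normal Equation: theta = (X^T X)^-1 X^T y
    theta_ne = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y
    print("normal equation:", theta_ne.ravel())

    # Batch Gradient Descent: follow the full-batch gradient of the MSE.
    eta, n_epochs = 0.1, 1000
    theta = np.random.randn(2, 1)
    for _ in range(n_epochs):
        gradients = 2 / m * X_b.T @ (X_b @ theta - y)
        theta -= eta * gradients
    print("batch gradient descent:", theta.ravel())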
-
08. 34 min
Regression Part 2
Learn about other regression methods (code sketch below). (Polynomial Regression, Logistic Regression)
Bookmarks: [00:30] Polynomial Regression / [05:12] Polynomial Regression 2 / [09:11] Polynomial Regression 3 / [13:15] Comparing models / [20:03] Logistic Regression / [25:15] Logistic Regression 2 / [29:36] Logistic Regression 3
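A sketch of both models: polynomial regression on synthetic quadratic data, and logistic regression on the iris dataset (the petal-width feature choice is an illustrative assumption).

    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression, LogisticRegression
    from sklearn.datasets import load_iris

    # Polynomial Regression: expand features to degree 2, then fit a linear model.
    np.random.seed(42)
    X = 6 * np.random.rand(100, 1) - 3
    y = 0.5 * X ** 2 + X + 2 + np.random.randn(100, 1)
    X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
    lin_reg = LinearRegression().fit(X_poly, y)
    print(lin_reg.intercept_, lin_reg.coef_)

    # Logistic Regression: probability that an iris is virginica given petal width.
    iris = load_iris()
    X_pw = iris.data[:, 3:]
    y_virg = (iris.target == 2).astype(int)
    log_reg = LogisticRegression().fit(X_pw, y_virg)
    print(log_reg.predict_proba([[1.7]]))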
-
09. 37 min
SVM Part 1
Train SVM models (code sketch below). (Comparison between good & bad models, Large margin classification, Sensitivity to feature scales & outliers)
Bookmarks: [01:15] Building an SVM model / [05:00] Good and bad Linear SVM models / [10:20] Good and bad Linear SVM models 2 / [16:30] Large Margin Classification / [19:15] Large Margin Classification 2 / [26:27] Large Margin Classification 3 / [33:12] Large Margin Classification 4
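A sketch of a large-margin linear SVM; the StandardScaler step reflects the sensitivity to feature scales mentioned above, and the iris petal features are an illustrative assumption.

    from sklearn.datasets import load_iris
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    iris = load_iris()
    X = iris.data[:, 2:4]                      # petal length, petal width
    y = (iris.target == 2).astype(int)         # virginica vs. the rest

    svm_clf = Pipeline([
        ("scaler", StandardScaler()),          # SVMs are sensitive to feature scales
        ("linear_svc", LinearSVC(C=1, loss="hinge", random_state=42)),
    ])
    svm_clf.fit(X, y)
    print(svm_clf.predict([[5.5, 1.7]]))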
-
10. 58 min
SVM Part 2
Train SVM models on a nonlinear dataset (code sketch below). (Polynomial Kernel, Similarity Features, Gaussian RBF Kernel)
Bookmarks: [00:22] What does non-linear mean? / [04:59] Creating a nonlinear dataset / [10:39] Creating a nonlinear dataset 2 / [14:07] Polynomial Kernel - polynomial features / [18:01] Similarity Features / [23:48] Using the Kernel Trick / [29:46] Gaussian RBF Kernel / [34:08] Gaussian RBF Kernel - plotting / [41:20] Gaussian RBF Kernel - plotting 2 / [49:12] Building the SVM model
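A sketch of nonlinear SVM classification on the make_moons toy dataset (an assumption standing in for the lecture's dataset), comparing a polynomial kernel with a Gaussian RBF kernel.

    from sklearn.datasets import make_moons
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=200, noise=0.15, random_state=42)

    poly_kernel_svm = Pipeline([
        ("scaler", StandardScaler()),
        ("svc", SVC(kernel="poly", degree=3, coef0=1, C=5)),      # Polynomial Kernel
    ])
    rbf_kernel_svm = Pipeline([
        ("scaler", StandardScaler()),
        ("svc", SVC(kernel="rbf", gamma=5, C=1)),                 # Gaussian RBF Kernel
    ])
    for model in (poly_kernel_svm, rbf_kernel_svm):
        model.fit(X, y)
        print(model["svc"].kernel, "training accuracy:", model.score(X, y))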
-
11. 30 min
SVM Part 3
Train SVM Regression models (code sketch below). (Linear SVR, Polynomial Kernel Trick)
Bookmarks: [00:34] Imports and data generation / [02:08] Linear SVR / [10:19] Linear SVR 2 / [14:36] Linear SVR 3 / [17:48] Polynomial Kernel Trick / [22:17] Polynomial Kernel Trick 2
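A sketch of SVM regression on synthetic quadratic data: a linear SVR and a polynomial-kernel SVR.

    import numpy as np
    from sklearn.svm import LinearSVR, SVR

    np.random.seed(42)
    X = 2 * np.random.rand(100, 1) - 1
    y = (0.2 + 0.1 * X + 0.5 * X ** 2 + np.random.randn(100, 1) / 10).ravel()

    lin_svr = LinearSVR(epsilon=0.1, random_state=42).fit(X, y)             # Linear SVR
    poly_svr = SVR(kernel="poly", degree=2, C=100, epsilon=0.1).fit(X, y)   # kernel trick
    print("linear SVR at x=0.5:", lin_svr.predict([[0.5]]))
    print("poly-kernel SVR at x=0.5:", poly_svr.predict([[0.5]]))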
-
12. 49 min
SVM Part 4
Learn the background theory of SVMs. (Decision Function, Objective Function, Hinge Loss, Training Time)
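A minimal sketch of two of these ideas: the linear SVM decision function w·x + b and the per-sample hinge loss max(0, 1 - t(w·x + b)), computed with NumPy from a fitted LinearSVC (the iris petal features are an illustrative assumption).

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.svm import LinearSVC

    iris = load_iris()
    X = iris.data[:, 2:4]
    t = np.where(iris.target == 2, 1, -1)            # labels in {-1, +1}

    svm = LinearSVC(C=1, loss="hinge", random_state=42).fit(X, (t == 1).astype(int))
    w, b = svm.coef_[0], svm.intercept_[0]

    decision = X @ w + b                             # decision function per sample
    hinge = np.maximum(0, 1 - t * decision)          # hinge loss per sample
    print("mean hinge loss:", hinge.mean())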
-
13. 54 min
DecisionTree Part 1
Learn about Decision Trees. (Training, Visualization, Class Prediction, Sensitivity, Restriction)
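A sketch of training, visualizing, and restricting a decision tree classifier on the iris dataset (the max_depth value and feature choice are illustrative assumptions).

    import matplotlib.pyplot as plt
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, plot_tree

    iris = load_iris()
    X, y = iris.data[:, 2:4], iris.target            # petal length and width

    # Restriction: max_depth regularizes the tree and limits overfitting.
    tree_clf = DecisionTreeClassifier(max_depth=2, random_state=42).fit(X, y)

    plot_tree(tree_clf, feature_names=["petal length", "petal width"],
              class_names=list(iris.target_names), filled=True)   # Visualization
    plt.show()

    print(tree_clf.predict([[5.0, 1.5]]))            # predicted class
    print(tree_clf.predict_proba([[5.0, 1.5]]))      # class probabilities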
-
14. 26 min
DecisionTree Part 2
Learn about Regression Decision Trees. (TreeRegressor, Comparison (Decision Tree: Classification vs. Regression), Restriction)
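A sketch of decision-tree regression on synthetic quadratic data, comparing a depth-restricted tree with an unrestricted one.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    np.random.seed(42)
    X = np.random.rand(200, 1)
    y = (4 * (X - 0.5) ** 2 + np.random.randn(200, 1) / 10).ravel()

    shallow = DecisionTreeRegressor(max_depth=2, random_state=42).fit(X, y)   # restricted
    deep = DecisionTreeRegressor(random_state=42).fit(X, y)                   # unrestricted: overfits
    print("restricted tree R^2 on training data:", shallow.score(X, y))
    print("unrestricted tree R^2 on training data:", deep.score(X, y))        # close to 1.0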
-
15. 34 min
EnsembleLearning Part 1
Learn about the Voting and Bagging methods of Ensemble Learning. (Voting Classifier, Bagging Classifier)
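A sketch of both ensemble types on the make_moons toy dataset (an assumption standing in for the lecture's data): a soft-voting ensemble of different models and a bagging ensemble of decision trees.

    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import VotingClassifier, BaggingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    voting_clf = VotingClassifier(
        estimators=[("lr", LogisticRegression()),
                    ("svc", SVC(probability=True)),
                    ("dt", DecisionTreeClassifier(random_state=42))],
        voting="soft")
    bagging_clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                                    max_samples=100, random_state=42)

    for clf in (voting_clf, bagging_clf):
        clf.fit(X_train, y_train)
        print(type(clf).__name__, "test accuracy:", clf.score(X_test, y_test))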
-
16. 28 min
EnsembleLearning Part 2
Learn about Random Forest, the representative Bagging algorithm. (Random Forest, Out-of-bag Evaluation, Feature Importance)
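A sketch of a random forest with out-of-bag evaluation and feature importances, using the iris dataset as an illustrative stand-in.

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier

    iris = load_iris()
    rnd_clf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                     n_jobs=-1, random_state=42)
    rnd_clf.fit(iris.data, iris.target)

    print("out-of-bag accuracy:", rnd_clf.oob_score_)      # Out-of-bag Evaluation
    for name, score in zip(iris.feature_names, rnd_clf.feature_importances_):
        print(name, round(score, 3))                       # Feature Importance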
-
17. 59 min
EnsembleLearning Part 3
Learn about Boosting methods. (AdaBoost, Gradient Boosting)
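A sketch of both boosting flavors on toy data: AdaBoost with decision stumps for classification and gradient-boosted trees for regression (all hyperparameter values are illustrative assumptions).

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.ensemble import AdaBoostClassifier, GradientBoostingRegressor
    from sklearn.tree import DecisionTreeClassifier

    # AdaBoost: sequentially reweight samples the previous stumps got wrong.
    X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
    ada_clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                                 n_estimators=200, learning_rate=0.5, random_state=42)
    ada_clf.fit(X, y)
    print("AdaBoost training accuracy:", ada_clf.score(X, y))

    # Gradient Boosting: each tree fits the residual errors of the ensemble so far.
    np.random.seed(42)
    Xr = np.random.rand(100, 1) - 0.5
    yr = 3 * Xr[:, 0] ** 2 + 0.05 * np.random.randn(100)
    gbrt = GradientBoostingRegressor(max_depth=2, n_estimators=100,
                                     learning_rate=0.1, random_state=42)
    gbrt.fit(Xr, yr)
    print("GBRT prediction at x=0.2:", gbrt.predict([[0.2]]))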
-
18. 27 min
Dimension Reduction Part 1
Learn about dimensionality reduction methods. (Projection with PCA, Manifold Learning)
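A sketch of projection with PCA: project noisy, roughly planar 3-D points onto the 2-D plane that preserves the most variance (the synthetic data is an illustrative assumption).

    import numpy as np
    from sklearn.decomposition import PCA

    np.random.seed(42)
    # 3-D points that lie close to a 2-D plane, plus a little noise.
    X = np.random.randn(200, 2) @ np.array([[1.0, 0.5, 0.2], [0.3, 1.0, 0.1]])
    X += 0.05 * np.random.randn(200, 3)

    pca = PCA(n_components=2)
    X2d = pca.fit_transform(X)                     # projection onto the top-2 components
    print("explained variance ratio:", pca.explained_variance_ratio_)
    print("projected shape:", X2d.shape)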
-
19. 24 min
Dimension Reduction Part 2
Learn about dimensionality reduction methods. (PCA, MNIST compression, Incremental PCA)
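A sketch of PCA compression and Incremental PCA, using the small digits dataset as a lightweight stand-in for MNIST.

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA, IncrementalPCA

    X, _ = load_digits(return_X_y=True)

    # Keep enough components to preserve 95% of the variance, then reconstruct.
    pca = PCA(n_components=0.95)
    X_reduced = pca.fit_transform(X)
    X_recovered = pca.inverse_transform(X_reduced)
    print("components kept:", pca.n_components_, "of", X.shape[1])

    # Incremental PCA fits the data in mini-batches instead of all at once.
    inc_pca = IncrementalPCA(n_components=30)
    for batch in np.array_split(X, 10):
        inc_pca.partial_fit(batch)
    print("incremental PCA output shape:", inc_pca.transform(X).shape)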
-
20. 25 min
Dimension Reduction Part 3
Learn about dimensionality reduction methods. (Kernel PCA, LLE (Locally Linear Embedding))
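A sketch of Kernel PCA and Locally Linear Embedding on the Swiss-roll toy dataset (the gamma and n_neighbors values are illustrative assumptions).

    from sklearn.datasets import make_swiss_roll
    from sklearn.decomposition import KernelPCA
    from sklearn.manifold import LocallyLinearEmbedding

    X, t = make_swiss_roll(n_samples=1000, noise=0.2, random_state=42)

    rbf_pca = KernelPCA(n_components=2, kernel="rbf", gamma=0.04)   # Kernel PCA
    X_kpca = rbf_pca.fit_transform(X)

    lle = LocallyLinearEmbedding(n_components=2, n_neighbors=10, random_state=42)
    X_lle = lle.fit_transform(X)                                    # LLE unrolls the manifold

    print("Kernel PCA output:", X_kpca.shape, "LLE output:", X_lle.shape)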