2024 Spring

(Section 2)


Instruction

Course Staff
Time & Location
  • Mon/Thu 12:00 - 13:45, Engineering Building 6, Room 608
Office Hours
  • Time/Location: Tuesday 13:00 - 15:00, Engineering Building 6, Room 407
  • Notes
    • For questions about lectures and assignments, use the e루리 Q&A board instead of office-hour visits
    • To schedule a meeting, contact the instructor by email at least one day in advance
Textbook
  • Primary
    • [Ge23] Aurélien Géron. 2023. Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems, 3rd Ed. O'Reilly.
  • Secondary
    • [Oh21] 오일석 (Il-Seok Oh). 2021. 기계 학습 (Machine Learning). 한빛 아카데미 (Hanbit Academy).
    • [Ow22] Louis Owen. 2022. Hyperparameter Tuning with Python: Boost your machine learning model’s performance via hyperparameter tuning. Packt.
    • [Br20] Jason Brownlee. 2020. Data Preparation for Machine Learning, 1.1 Ed. Machine Learning Mastery.
    • [Br21] Jason Brownlee. 2021. Imbalanced Classification with Python, 1.3 Ed. Machine Learning Mastery.
    • [Mo24] Christoph Molnar. 2024. Interpretable Machine Learning.
    • [Ma23] Serg Masis. 2023. Interpretable Machine Learning with Python. Packt.
Prerequisite
  • (Required) Python Programming
    • The course assumes students are already familiar with Python and with the NumPy/Pandas libraries
  • (Optional) Data Structures, Linear Algebra, Data Analysis Programming
Grading Policy
Team Project (30%)
  • End-to-End Machine Learning Projects
    • #1 (10%): Data collection
    • #2 (20%): Model building and evaluation
Individual Assignments (40%)
  • Kaggle ML Competition
    • #1 (4%): Housing
    • #2 (9%): Wine Quality
    • #3 (9%): Student Dropout or Success
    • #4 (9%): Nurse Stress Prediction using Wearable Sensors
    • #5 (9%): Doom or Animal Crossing
Midterm Assignment (20%)
  • Kaggle ML Competition: Classifying Emotions during Debate using Physiological Responses
Attendance (10%)
  • Three late arrivals count as one absence
  • Each absence deducts 1% from the attendance score
  • Missing more than one third of all class sessions (10 sessions) results in an F
    • That is, 11 or more absences result in an F
  • For foreseeable absences (e.g., reserve forces training), email the instructor before the class session
    • For sudden circumstances (e.g., acute illness, death in the family), submit supporting documentation

Schedule

W01: Overview
March 04: Overview
March 07: Machine Learning Landscape
  • Lecture
  • Reference
    • [Ge23] Chap. 1
    • [Oh21] Chap. 1

W02: Machine Learning Pipeline
March 11: Machine Learning Pipeline - Lecture
  • Lecture
  • Reference
    • [Ge23] Chap. 2
    • D. Sculley et al. 2015. Hidden Technical Debt in Machine Learning Systems. In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2 (NIPS'15).
    • Soowon Kang et al. 2023. K-EmoPhone: A Mobile and Wearable Dataset with In-Situ Emotion, Stress, and Attention Labels. Sci Data 10, 351 (2023).
March 14: Machine Learning Pipeline - End-to-End Practice
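  • Preview sketch (illustrative only; not part of the course materials, and the dataset/model choices below are assumptions, not the lab assignment): a minimal end-to-end scikit-learn pipeline that loads a public dataset, chains preprocessing with a model, and evaluates on a held-out set.

    # Minimal end-to-end pipeline sketch (illustrative; dataset/model are assumptions)
    from sklearn.datasets import fetch_california_housing
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    # Load a public regression dataset as a stand-in for the lab data
    X, y = fetch_california_housing(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    # Chain preprocessing and the model so the same steps run at fit and predict time
    pipe = Pipeline([
        ("scaler", StandardScaler()),
        ("model", LinearRegression()),
    ])
    pipe.fit(X_train, y_train)

    # Evaluate on the held-out test set
    rmse = mean_squared_error(y_test, pipe.predict(X_test)) ** 0.5
    print(f"Test RMSE: {rmse:.3f}")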

W03: Linear Model
March 18: Linear Model - Theory
  • Lecture
  • Reference
    • [Ge23] Chap. 4
    • [Oh21] Chap. 2
March 21: Linear Model - Lab
  • Lab
  • Reference
    • [Ge23] Chap. 4
    • [Oh21] Chap. 2
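  • Preview sketch (illustrative only; the synthetic data below is an assumption, not the lab dataset): fitting the same linear model with a closed-form least-squares solution and with scikit-learn.

    # Linear regression two ways: least squares by hand vs. scikit-learn (illustrative)
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Synthetic data with a known intercept (4) and slope (3)
    rng = np.random.default_rng(42)
    X = 2 * rng.random((100, 1))
    y = 4 + 3 * X[:, 0] + rng.normal(0, 1, 100)

    # Least-squares solution via the pseudo-inverse (equivalent to the normal equation here)
    X_b = np.c_[np.ones((100, 1)), X]          # add a bias column of ones
    theta = np.linalg.pinv(X_b) @ y
    print("by hand     :", theta)

    # Same model fit with scikit-learn
    lin = LinearRegression().fit(X, y)
    print("scikit-learn:", lin.intercept_, lin.coef_)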

W04: Support Vector Machine and Decision Tree
March 25: Support Vector Machine
  • Lecture
  • Lab
  • Reference
    • [Ge23] Chap. 5
    • [Oh21] Chap. 11
March 28: Decision Tree
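  • Preview sketch (illustrative only; the iris dataset and hyper-parameters are assumptions): an RBF-kernel SVM and a depth-limited decision tree trained on the same data.

    # SVM vs. decision tree on a small dataset (illustrative sketch)
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # SVMs are scale-sensitive, so standardize features inside the pipeline
    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)).fit(X_train, y_train)

    # Decision trees need no scaling; limiting depth reduces overfitting
    tree = DecisionTreeClassifier(max_depth=3, random_state=42).fit(X_train, y_train)

    print("SVM accuracy :", svm.score(X_test, y_test))
    print("Tree accuracy:", tree.score(X_test, y_test))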

W05: Ensemble Learning
April 01: Basics & Random Forest
  • Lecture
  • Lab
  • Reference
    • [Ge23] Chap. 7
    • [Oh21] Chap. 12
April 04: Gradient Boosting
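  • Preview sketch (illustrative only; dataset and settings are assumptions): a bagging-style ensemble (random forest) and a boosting-style ensemble (gradient boosting) compared on the same data.

    # Random forest vs. gradient boosting (illustrative sketch)
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # Random forest: many trees grown on bootstrap samples, predictions averaged
    rf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)

    # Gradient boosting: shallow trees added sequentially to correct earlier errors
    gb = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1, max_depth=3,
                                    random_state=42).fit(X_train, y_train)

    print("Random forest    :", rf.score(X_test, y_test))
    print("Gradient boosting:", gb.score(X_test, y_test))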

W06: Feature Engineering #1
April 08: Feature Extraction
  • Lecture
  • Lab
  • Reference
    • Andreas Bulling et al. 2014. A tutorial on human activity recognition using body-worn inertial sensors. ACM Comput. Surv. 46, 3, Article 33.
    • Soujanya Poria et al. 2017. A review of affective computing: From unimodal analysis to multimodal fusion. Information Fusion, 37, 98–125.
April 11: Feature Selection
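  • Preview sketch (illustrative only; the synthetic signal and labels are assumptions): extracting simple window statistics from a raw sensor-like signal, then keeping only the most informative features.

    # Window-statistics feature extraction, then univariate feature selection (illustrative)
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif

    rng = np.random.default_rng(0)
    signal = rng.normal(size=1000)              # stand-in for one sensor channel
    labels = np.array([0] * 5 + [1] * 5)        # one (assumed) label per window

    # Split the signal into 10 non-overlapping windows and compute basic statistics
    windows = signal.reshape(10, 100)
    features = np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        windows.min(axis=1),
        windows.max(axis=1),
    ])

    # Keep the k features most associated with the label (ANOVA F-test)
    selector = SelectKBest(score_func=f_classif, k=2)
    selected = selector.fit_transform(features, labels)
    print("kept feature columns:", selector.get_support(indices=True))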

W07: Feature Engineering #2
April 15: Dimensionality Reduction
  • Lecture
  • Lab
  • Reference
    • [Ge23] Chap. 8
    • [Br20] Chap. 7
April 18: Balancing Label Distribution
  • Lecture
  • Lab
  • Reference
    • [Br21] Chap. 4
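  • Preview sketch (illustrative only; assumes the separate imbalanced-learn package and a synthetic dataset): PCA for dimensionality reduction followed by SMOTE oversampling to balance the label distribution.

    # PCA then SMOTE on a synthetic imbalanced dataset (illustrative sketch)
    from collections import Counter
    from imblearn.over_sampling import SMOTE
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA

    # Synthetic data: roughly 90% negative / 10% positive (an assumption)
    X, y = make_classification(n_samples=1000, n_features=20, weights=[0.9, 0.1],
                               random_state=42)

    # Project onto the first 5 principal components
    X_reduced = PCA(n_components=5).fit_transform(X)

    # Oversample the minority class with synthetic examples
    X_res, y_res = SMOTE(random_state=42).fit_resample(X_reduced, y)
    print("before:", Counter(y), "after:", Counter(y_res))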

W08: Midterm

W09: Cross-Validation and Performance Measures
April 29: Cross-Validation
May 02: Performance Measures
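  • Preview sketch (illustrative only; dataset and model are assumptions): stratified k-fold cross-validation reporting several performance measures at once.

    # Stratified k-fold cross-validation with multiple metrics (illustrative sketch)
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_validate
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

    # Stratified folds keep the class ratio roughly constant in every fold
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
    metrics = ["accuracy", "precision", "recall", "f1", "roc_auc"]
    scores = cross_validate(clf, X, y, cv=cv, scoring=metrics)

    for name in metrics:
        print(name, round(scores[f"test_{name}"].mean(), 3))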

W10: Hyper-parameter Tuning
May 06: Substitute Holiday for Children's Day
  • No class
May 09: Hyper-parameter Tuning #1
  • Lecture
  • Lab
  • Reference
    • [Ow22] Chap. 2, 3, 4, 7, 8
    • Tong Yu and Hong Zhu. 2020. Hyper-Parameter Optimization: A Review of Algorithms and Applications. arXiv preprint.
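  • Preview sketch (illustrative only; the search space and budget are assumptions): randomized search over SVM hyper-parameters with cross-validation.

    # Randomized hyper-parameter search with cross-validation (illustrative sketch)
    from scipy.stats import loguniform
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import RandomizedSearchCV
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    pipe = make_pipeline(StandardScaler(), SVC())

    # Sample C and gamma from log-uniform distributions instead of a fixed grid
    search = RandomizedSearchCV(
        pipe,
        param_distributions={"svc__C": loguniform(1e-2, 1e2),
                             "svc__gamma": loguniform(1e-4, 1e0)},
        n_iter=20, cv=5, random_state=42)
    search.fit(X, y)
    print("best params:", search.best_params_)
    print("best CV accuracy:", round(search.best_score_, 3))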

W11: Hyper-parameter Tuning & Clustering
May 13: Hyper-parameter Tuning #2
  • Lecture
  • Lab
  • Reference
    • [Ow22] Chap. 5, 6, 9, 10
    • Tong Yu and Hong Zhu. 2020. Hyper-Parameter Optimization: A Review of Algorithms and Applications. arXiv preprint.
May 16: Clustering
  • Lecture
  • Lab
  • References
    • [Ge23] Chap. 9
    • [Oh21] Chap. 6
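  • Preview sketch (illustrative only; the synthetic data and range of k are assumptions): k-means clustering with the silhouette score used to compare candidate numbers of clusters.

    # k-means with silhouette-based model selection (illustrative sketch)
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import silhouette_score

    # Synthetic data drawn from 4 Gaussian blobs
    X, _ = make_blobs(n_samples=500, centers=4, random_state=42)

    # A higher silhouette score indicates better-separated clusters
    for k in range(2, 7):
        labels = KMeans(n_clusters=k, n_init=10, random_state=42).fit_predict(X)
        print(f"k={k}: silhouette={silhouette_score(X, labels):.3f}")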

W12: Artificial & Deep Neural Networks
May 20: Artificial Neural Network
  • Lecture
  • Lab
  • References
    • [Ge23] Chap. 10
    • [Oh21] Chap. 3
May 23: Deep Neural Network
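  • Preview sketch (illustrative only; the architecture and training budget are assumptions): a small fully connected network in Keras trained on Fashion-MNIST.

    # A small dense network trained on Fashion-MNIST (illustrative sketch)
    import tensorflow as tf

    (X_train, y_train), (X_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
    X_train, X_test = X_train / 255.0, X_test / 255.0   # scale pixels to [0, 1]

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),   # one output per class
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X_train, y_train, epochs=5, validation_split=0.1)
    print("test accuracy:", model.evaluate(X_test, y_test, verbose=0)[1])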

W13: Convolutional & Recurrent Neural Networks
May 27: Convolutional Neural Network
  • Lecture
  • Lab
  • References
    • [Ge23] Chap. 14
    • [Oh21] Chap. 4
May 30: Recurrent Neural Network
  • Lecture
  • Lab
  • References
    • [Ge23] Chap. 15
    • [Oh21] Chap. 8
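  • Preview sketch for the May 27 session (illustrative only; layer sizes and training budget are assumptions): a small convolutional network on MNIST digits.

    # A small convolutional network on MNIST (illustrative sketch)
    import tensorflow as tf

    (X_train, y_train), (X_test, y_test) = tf.keras.datasets.mnist.load_data()
    X_train = X_train[..., None] / 255.0     # add a channel axis, scale to [0, 1]
    X_test = X_test[..., None] / 255.0

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),   # learn local filters
        tf.keras.layers.MaxPooling2D(),                     # downsample feature maps
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X_train, y_train, epochs=3, validation_split=0.1)
    print("test accuracy:", model.evaluate(X_test, y_test, verbose=0)[1])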

W14: Autoencoder
June 03: Autoencoder
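  • Preview sketch (illustrative only; layer sizes and loss choice are assumptions): a tiny fully connected autoencoder that compresses 784-pixel MNIST digits into a 32-dimensional code and reconstructs them.

    # A minimal dense autoencoder on MNIST (illustrative sketch)
    import tensorflow as tf

    (X_train, _), (X_test, _) = tf.keras.datasets.mnist.load_data()
    X_train = X_train.reshape(-1, 784) / 255.0
    X_test = X_test.reshape(-1, 784) / 255.0

    autoencoder = tf.keras.Sequential([
        tf.keras.Input(shape=(784,)),
        tf.keras.layers.Dense(32, activation="relu"),       # encoder: compress to 32 dims
        tf.keras.layers.Dense(784, activation="sigmoid"),   # decoder: reconstruct pixels
    ])
    autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

    # The input is also the target: the network learns to reproduce its input
    autoencoder.fit(X_train, X_train, epochs=5, validation_data=(X_test, X_test))
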
June 06: Memorial Day
  • No class

W15: Generative Models & Interpretable Machine Learning
June 10: Generative Models
June 13: Interpretable Machine Learning
  • Lecture
  • References
    • [Mo24] Chap. 5 - 10
    • [Ma23] Chap. 4 - 8
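  • Preview sketch (illustrative only; dataset and model are assumptions): permutation feature importance, one of the model-agnostic interpretation methods discussed in [Mo24] and [Ma23].

    # Permutation feature importance (illustrative sketch)
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    data = load_breast_cancer()
    X_train, X_test, y_train, y_test = train_test_split(data.data, data.target,
                                                        random_state=42)
    model = RandomForestClassifier(random_state=42).fit(X_train, y_train)

    # Shuffle one feature at a time and measure how much the test score drops
    result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=42)
    top = result.importances_mean.argsort()[::-1][:5]
    for i in top:
        print(data.feature_names[i], round(result.importances_mean[i], 4))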

W16: Final Term
June 17: Final Term Period
  • No class; focus on Team Project #2
June 20: Final Remarks