Texas A&M University

CSCE 625 Artificial Intelligence

Spring 2025


Course Overview

Description: This graduate-level course will focus on the fundamental concepts and modern methods of artificial intelligence, including search (uninformed, informed, iterative improvement, constraint satisfaction, space/time complexity), game playing (minimax, alpha-beta pruning), knowledge representation and reasoning (propositional logic, first-order logic, automated theorem proving), planning, uncertainty and probabilistic reasoning, and the basics of machine learning and deep learning. Selected topics include foundation models, generative AI, agentic AI, physical AI, robotics, natural language processing, computer vision, and AI ethics.

Course Information

Instructor: Cheng Zhang

chzhang at tamu dot edu

Office hour: 2-3 PM, Friday

Office: Peterson 321

TA: Fengzhi Guo

fengzh_g at tamu dot edu

Office hour: 1-2 PM, Friday

Office: Peterson 364

Logistics

Grading Policy

  • Quizzes (10%)
  • Homework assignments (50%)
  • Final project (40%)

Textbooks

This course does not require a specific textbook; the lecture slides, videos, and other materials provided by the instructor serve as the primary reference. In addition, students are encouraged to consult the following textbooks and materials:
  • Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach (3rd edition). Pearson, 2010.
  • Christopher M. Bishop, Pattern Recognition and Machine Learning. Springer, 2006.
  • Kevin P. Murphy, Machine Learning: A Probabilistic Perspective. The MIT Press, 2012.
  • Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.
  • Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning. MIT Press, 2016.
  • Ethem Alpaydin, Introduction to Machine Learning. The MIT Press.



Class Schedule


    ** class schedule is subject to change **

    AI: Artificial Intelligence: A Modern Approach (3rd edition), Stuart Russell and Peter Norvig

    PRML: Pattern Recognition and Machine Learning, Christopher M. Bishop. Springer, 2006

    Lecture Date Topic Additional Reading Note
    Week 1
    1 Tu 1/14 Logistics and course overview [slides] Review: linear algebra, probability, and Python
    2 Th 1/16 AI agent design [slides] AI: 1-2 Drop deadline: 1/17
    Week 2
    Tu 1/21 Lecture canceled due to inclement weather AI: 3.1-3.5
    AI: 5.1-5.3
    3 & 4 Th 1/23 Uninformed & Informed search [slides] [slides]
    + details BFS, DFS, UCS, Greedy, and A* search
    AI: 3.1-3.5
    AI: 5.1-5.3
    Week 3
    5 Tu 1/28 Graph search [slides] AI: 3.1-3.5
    AI: 5.1-5.3
    Assignment 1 out [link]
    6 Th 1/30 Adversarial search [slides] AI: 3.1-3.5
    AI: 5.1-5.3
    Week 4
    7 Tu 2/4 Learning agents and data [slides]
    + details Machine learning overview, application data
    PRML: 1.2.1, 1.2.2, 1.2.4
    PRML: 2.5
    8 Th 2/6 Data, feature, and representation [slides]
    + details Bag of words, histograms
    PRML: 1.2.1, 1.2.2, 1.2.4
    PRML: 2.5
    Week 5
    9 Tu 2/11 Correlation and normalization [slides]
    + details Non-parametric representations (Parzen), data correlation, Z-score
    PRML: 1.4
    PRML: 12.1, 12.4.3
    Assignment 1 due
    Assignment 2 out [link]
    10 Th 2/13 Dimensionality reduction [slides]
    + details PCA & embedding (t-SNE)
    PRML: 1.4
    PRML: 12.1, 12.4.3
    Week 6
    11 Tu 2/18 Linear regression [slides]
    + details General parameter estimation techniques
    PRML: 1.1, 1.2.5, 1.5.5
    PRML: 3.1
    12 Th 2/20 Course project topic discussion [slides] Refer to Slack for the recording
    Week 7
    13 Tu 2/25 Non-linear regression
    + details Gradient descent and Newton's method
    PRML: 1.1, 1.2.5, 1.5.5
    PRML: 3.1
    14 Th 2/27 Non-linear regression (cont.) PRML: 1.1, 1.2.5, 1.5.5
    PRML: 3.1
    Week 8
    15 Tu 3/4 Parameter estimation for probability models
    + details Probability basics, probability refresher, distribution modeling (parameter estimation, MAP, maximum likelihood), Bayes' rule
    PRML: 3.1.3, 5.2.4, 1.2.3, 1.2.4, 2.3, 8.1.1-8.1.3, 8.2, 8.4.1
    AI: 13.1-13.5
    Review: probability
    Assignment 2 due; Assignment 3 out
    Th 3/6 Project highlight presentation
    Week 9
    Tu 3/11 Spring break (no class)
    Th 3/13 Spring break (no class)
    Week 10
    16 Tu 3/18 Parameter estimation and Naïve Bayes PRML: 3.1.3, 5.2.4, 1.2.3, 1.2.4, 2.3, 8.1.1-8.1.3, 8.2, 8.4.1
    AI: 13.1-13.5
    Review: probability
    17 Th 3/20 Graphical model introduction
    + details Bayesian networks
    PRML: 1.2.3, 1.2.4, 2.3, 8.1.1-8.1.3, 8.2, 8.4.1
    AI: 13.1-13.5
    Week 11
    18 Tu 3/25 Graphical model introduction (cont.)
    + details Bayesian networks
    PRML: 1.2.3, 1.2.4, 2.3, 8.1.1-8.1.3, 8.2, 8.4.1
    AI: 13.1-13.5
    19 Th 3/27 Unsupervised learning, clustering
    + details Unsupervised learning overview, K-means/medoids, agglomerative
    PRML: 9.1
    Week 12
    20 Tu 4/1 Supervised learning
    + details Supervised learning overview, Train/Val/Test, cross-validation, overfitting, KNN, Decision Trees, Random Forest, Boosting/Bagging, Logistic Regression, SVM-lite, Bayesian Classifier
    PRML: 1.1, 1.3
    AI 18.1 - 18.3
    Assignment 3 due; Assignment 4 out
    21 Th 4/3 Neural networks and deep learning
    + details Deep learning introduction with CNN, RNN, GNN basics, perceptron, multi-layer perceptron, backpropagation, training particulars
    PRML: 4.1.7
    PRML: 5.1-5.3
    Week 13
    22 Tu 4/8 Transformer, foundation models, large language models
    23 Th 4/10 Frontiers: Generative AI, Agentic AI, Physical AI AI: 22-25
    Week 14
    24 Tu 4/15 Frontiers: Generative AI, Agentic AI, Physical AI AI: 22-25
    Th 4/17 Guest Lecture: Prof. Yapeng Tian
    Week 15
    Tu 4/22 Final presentation Assignment 4 due
    Th 4/24 Final presentation
    Week 16
    Tu 4/29 Final presentation (if needed)
    Th 5/1 Reading day (no class)