CODE | CCE5502
TITLE | Fundamentals of AI and ML
UM LEVEL | 05 - Postgraduate Modular Diploma or Degree Course
MQF LEVEL | 7
ECTS CREDITS | 5
DEPARTMENT | Communications and Computer Engineering
DESCRIPTION | This study-unit first introduces Artificial Intelligence (AI) and Machine Learning (ML) techniques that can be applied in diverse disciplines. The theory of state-of-the-art techniques in Unsupervised Learning, Supervised Learning, Reinforcement Learning and Anomaly Detection will be covered, together with the theory of Neural Networks and Deep Learning models and techniques that can implement both supervised and unsupervised learning. The study-unit then teaches students how to train ML models and infer from them, using Python tools and libraries (a minimal illustrative sketch of this workflow follows the reading list below).

Study-unit Aims:
The aim of the study-unit is to give the student the tools and knowledge necessary to build computational models out of data.

Learning Outcomes:

1. Knowledge & Understanding
By the end of the study-unit the student will be able to:
- Describe the field of Artificial Intelligence and the problems that it tackles;
- Describe Machine Learning as a sub-field of Artificial Intelligence;
- Distinguish between Unsupervised Learning, Supervised Learning, Reinforcement Learning and Anomaly Detection;
- Describe the applications of these techniques in Clustering, Dimensionality Reduction, Classification and Regression;
- Explain the theory of Unsupervised, Supervised and Reinforcement Learning and Anomaly Detection;
- Explain how to apply these techniques to solve scientific and engineering problems.

2. Skills
By the end of the study-unit the student will be able to:
- Train and infer from Unsupervised Learning models, including clustering algorithms (K-Means, GMM) and dimensionality reduction algorithms (PCA);
- Train and infer from Supervised Learning models, including classification algorithms (logistic regression, decision trees, SVM, k-NN, Naive Bayes) and regression algorithms (linear regression, decision trees, SVR, k-NN for regression), as well as ensemble methods and random forests;
- Select and deploy reinforcement learning algorithms;
- Train and/or infer from Artificial Neural Networks and Deep Neural Networks used for supervised learning (fully connected feed-forward networks, CNN, RNN, LSTM, GRU) and unsupervised learning (autoencoders), including those using attention mechanisms and transformers.

Main Text/s and any supplementary readings:

Main
- L. P. Coelho, W. Richert and M. Brucher, "Building Machine Learning Systems with Python", 3rd edition, Packt Publishing, ISBN 9781788623223.
- J. D. Kelleher, B. Mac Namee and A. D'Arcy, "Fundamentals of Machine Learning for Predictive Data Analytics", 2nd edition, MIT Press (main library – Q325.5 .K455).

Supplementary
- C. M. Bishop, "Pattern Recognition and Machine Learning", Springer, 2006.
- K. P. Murphy, "Machine Learning: A Probabilistic Perspective", MIT Press, 2012.
- I. Goodfellow, Y. Bengio and A. Courville, "Deep Learning", MIT Press, 2016.
- J. VanderPlas, "Python Data Science Handbook: Essential Tools for Working with Data", 1st edition, O'Reilly, available at https://jakevdp.github.io/PythonDataScienceHandbook/
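The sketch below illustrates the train-and-infer workflow referred to in the Skills outcomes, using scikit-learn as one possible Python library. The dataset, model choices and hyperparameters are illustrative assumptions, not prescribed course material.

# A minimal sketch, assuming scikit-learn and the Iris dataset purely for
# illustration; these choices are assumptions, not part of the syllabus.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)

# Unsupervised learning: fit a clustering model (labels unused) and
# reduce the data to two dimensions with PCA.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
cluster_ids = kmeans.predict(X)      # inference: assign each sample to a cluster
X_2d = PCA(n_components=2).fit_transform(X)

# Supervised learning: train a classifier on labelled data and
# infer (predict) on a held-out test set.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=200).fit(X_train, y_train)
y_pred = clf.predict(X_test)         # inference: predict class labels
print("Test accuracy:", accuracy_score(y_test, y_pred))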
STUDY-UNIT TYPE | Fieldwork, Lectures, Project and Tutorials
METHOD OF ASSESSMENT |
LECTURER/S | Leander Grech, Gianluca Valentino
The University makes every effort to ensure that the published Courses Plans, Programmes of Study and Study-Unit information are complete and up-to-date at the time of publication. The University reserves the right to make changes in case errors are detected after publication.
The availability of optional units may be subject to timetabling constraints. Units not attracting a sufficient number of registrations may be withdrawn without notice. It should be noted that all the information in the description above applies to study-units available during the academic year 2024/5. It may be subject to change in subsequent years.