ACS6427 Data Modelling and Machine Intelligence

Module Description (subject to change)

All of our lives are affected by "machine intelligence" and "data models" - Google is a very visible example. But if you are a victim of identity theft, if you want a loan to buy a house, or if you want to pass through immigration at an airport, a model derived from data using some form of machine learning technique will be involved.

Engineers increasingly look to machine intelligence techniques such as neural networks and other machine learning methods to solve problems that are not amenable to conventional analysis, e.g. by application of Newton's & Kirchhoff's laws and other physical principles. Instead, they use measurements of system variables to compute a model of the process that can then be used in design, analysis and forecasting. System identification is a specific example of data modelling.

We will look at the underlying principles of machine learning, the advantages and limitations of the various approaches and effective ways of applying them with the aim of making you a competent practitioner.

Credits: 15 (Autumn semester)

Restrictions:
A typical undergraduate “engineering mathematics” background is required: functions; multivariate calculus; linear algebra & matrix operations; basic optimization; probability theory; statistics; and a facility with Matlab.

Module Leader

Professor Zi-Qiang Lang
Email: z.lang@sheffield.ac.uk
Amy Johnson Building

If you have any questions about the module please talk to me during the lectures or the labs in the first instance. It is likely that other students will learn from any questions you ask as well, so don’t be afraid to ask.

Outside of lectures please contact me via email, or drop by.

Learning Outcomes

By the end of the module, students will be able to:

  1. Identify and classify archetypal data modelling paradigms and propose and select appropriate candidate solutions. [SM1m, EA2m]
  2. Understand the underlying, theoretical principles and limitations of machine learning techniques and use this knowledge in the solution of realistic problems. [SM3m, SM5m, EA1m, EA3m, EA4m]
  3. Design, create and apply machine intelligence software to a number of artificial and real problems. [EA1m, D2p, D3m, D4m]
  4. Evaluate the quality of such solutions, compare and contrast alternatives and defend/criticize the approach taken. [SM3m, EA1m, EA3m, SM6m]

This module satisfies the AHEP3 (Accreditation of Higher Education Programmes, Third Edition) Learning Outcomes that are listed in brackets after each learning outcome above. For further details on AHEP3 Learning Outcomes, see the downloads section of our accreditation webpage.

Syllabus

  1. Introduction: examples; data-to-knowledge; assumptions; the model; inference paradigm; inference engine; the data; the problem of induction
  2. The linear model & least-squares: some useful results; formulation; solution; operation; Matlab example
  3. Non-linearity & MIMO problems: generalization of the linear model; univariate non-linear functions; Matlab example; functions of more than one variable; collinearity; multi-response
  4. Non-linear MISO relationships: non-linear input/output functions of more than one variable; Matlab example; other choices of basis
  5. LIP modelling with basis functions: model building; radial basis functions; sigmoidal basis functions
  6. Controlling complexity: choices; regularization; Matlab example; parsimony
  7. Model selection: cross-validation; a strategy
  8. Gradient-based optimization: motivation; first-order approximation; stability; convexity
  9. Non-linear models: overview; neural networks; the multi-layer perceptron; Netlab
  10. The data: sample size & distribution; the “small-sample-size” problem; normalization; time signals
  11. Pattern recognition: overview; classification; logistic regression; decision making
  12. Non-linear classification: generalization; non-linear models; assessment of performance
  13. More recent approaches to modelling and inference
  14. Introduction to deep learning
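
To give a flavour of topic 2, the linear model & least-squares: the labs use Matlab, but as an illustrative sketch only (the synthetic data, variable names and noise level below are my own assumptions, not module material), the ordinary least-squares solution can be written in Python/NumPy as follows.

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus Gaussian noise (invented for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)

# Design matrix with a bias column; least-squares solves min ||X theta - y||^2,
# whose normal-equation solution is theta = (X^T X)^{-1} X^T y
X = np.column_stack([np.ones_like(x), x])
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(theta)  # close to the true parameters [1.0, 2.0]
```

In practice a numerically stable solver such as `lstsq` (or Matlab's backslash operator) is preferred over forming the normal equations explicitly.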
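
Topics 5-7 combine basis-function (LIP) models, regularization and model selection. The following is a minimal sketch under stated assumptions: a Gaussian radial-basis-function model, ridge-style regularization, and a simple hold-out split; the data, basis centres, width and candidate regularization values are all invented for illustration.

```python
import numpy as np

# Synthetic data: a noisy sine wave (invented for illustration)
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 80)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(x.size)

# Gaussian radial basis functions on a fixed grid of centres
centres = np.linspace(-1.0, 1.0, 20)
width = 0.2
Phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

def fit_ridge(Phi, y, lam):
    # Regularized least squares: (Phi^T Phi + lam I) w = Phi^T y
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

# Hold-out model selection over the regularization parameter:
# train on even-indexed samples, validate on the rest
train = np.arange(x.size) % 2 == 0
val = ~train
best_lam, best_err = None, np.inf
for lam in [1e-6, 1e-4, 1e-2, 1.0]:
    w = fit_ridge(Phi[train], y[train], lam)
    err = np.mean((Phi[val] @ w - y[val]) ** 2)
    if err < best_err:
        best_lam, best_err = lam, err

print(best_lam, best_err)
```

The same loop generalizes to k-fold cross-validation by averaging the validation error over several train/validation splits rather than one.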
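
Topics 8 and 11 meet in logistic regression trained by gradient descent. As a hedged sketch (the two-Gaussian synthetic data, learning rate and iteration count are assumptions chosen for illustration, not module material):

```python
import numpy as np

# Synthetic two-class data: two Gaussian clouds in 2-D (invented for illustration)
rng = np.random.default_rng(2)
n = 100
X0 = rng.standard_normal((n, 2)) + np.array([-1.5, 0.0])
X1 = rng.standard_normal((n, 2)) + np.array([+1.5, 0.0])
X = np.column_stack([np.ones(2 * n), np.vstack([X0, X1])])  # bias column
t = np.concatenate([np.zeros(n), np.ones(n)])               # class labels

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Batch gradient descent on the mean cross-entropy loss
w = np.zeros(3)
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w)
    grad = X.T @ (p - t) / t.size   # gradient of the mean cross-entropy
    w -= lr * grad

# Decision rule: predict class 1 where the model probability >= 0.5
acc = np.mean((sigmoid(X @ w) >= 0.5) == (t == 1))
print(acc)
```

The fixed step size here is the simplest first-order scheme; the stability and convexity issues listed under topic 8 govern how large `lr` can safely be.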

Learning and Teaching Methods

NOTE: This summary of teaching methods is representative of a normal Semester. Owing to the ongoing disruption from Covid-19, the exact method of delivery will be different in 2020/21.

Lectures: 18 hours
Labs: 12 hours
Independent Study: 118 hours

Learning and Teaching Materials

All lecture slides will be provided as handouts with space to take notes. Matlab “helper” software will be provided for the labs. All resources will be available on Blackboard (MOLE), as will vodcasts of the lectures.

Assessment

  • Formal exam (50%): 2 hours, in the Autumn exam period
  • Blackboard (MOLE) quizzes ×4 (6.25% each; 25% in total)
  • Coursework (25%)

Feedback

Examples will be presented in class that provide an opportunity for you to gauge your understanding and to request clarification from the lecturer. Questions during lectures are welcomed.

Laboratory sessions provide a good opportunity for feedback and guidance on progress – this will be face-to-face & oral. Four Blackboard (MOLE) “quizzes” will take place, and these will include feedback that becomes available immediately after the submission date has passed. The final assignment & examination will receive only summary feedback, which you can access after they have been marked.

Student Evaluation

Students are encouraged to provide feedback during the module direct to the lecturer. Students will also have the opportunity to provide formal feedback via the Faculty of Engineering Student Evaluation Survey at the end of the module.

You can view the latest Department response to the survey feedback here.

Recommended Reading

  • Bishop, C.M. Neural Networks for Pattern Recognition, Clarendon Press, Oxford, 1995. [Available in Information Commons, 006.4(B)]
  • Nabney, I.T. Netlab: Algorithms for Pattern Recognition, Springer, London, 2002. [Available in Information Commons, 006.4(N)]
  • Haykin, S. Neural Networks: A Comprehensive Foundation. Prentice Hall, Upper Saddle River, 1999. [Available in Information Commons, 006.32(H)]