Nonparametric Regression (MATH M6004)
Contents of this document:
- Administrative information
- Unit aims, General Description, and Relation to Other Units
- Teaching methods and Learning objectives
- Assessment methods and Award of credit points
- Transferable skills
- Texts and Syllabus
Administrative Information
- Unit number and title: MATH M6004 Nonparametric Regression
- Level: M/7
- Credit point value: 10 credit points
- Year: 12/13
- First Given in this form: 2007
- Unit Organiser: Arne Kovac
- Lecturer: Dr A Kovac
- Teaching block: 1
- Prerequisites: MATH20800 Statistics 2 and MATH35110 Linear Models
Unit aims
- Introduce the problem of nonparametric function estimation
- Present several methods, such as kernel estimators and spline techniques
- Show how to construct estimators of a signal from a sample of observations
- Provide practice with flexible and popular distribution-free techniques
General Description of the Unit
A regression function is an important tool for describing the relationship between two or more random variables. In real-life problems this function is usually unknown, but it can be estimated from a sample of observations. In the simplest cases we have enough information about the problem at hand to assume that the regression curve is known up to the value of some coefficients (for example, it is a straight line, but we need to estimate the coefficients of that line). Nonparametric methods are flexible techniques designed to treat more general cases: we construct a good estimator of the regression function without assuming that it has a specified shape. In this module we introduce popular nonparametric methods of regression estimation: local polynomial regression, spline regression and a short introduction to wavelet thresholding. The focus will be on local polynomial and spline regression, and we will see how these methods can be applied in practice.
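As a rough, hypothetical illustration of this contrast (not part of the unit materials), a parametric straight-line fit and two nonparametric fits can be compared in a few lines of base R on simulated data; the variable names and tuning values below are chosen for the example only:

  # Simulated data: an unknown nonlinear signal observed with noise
  set.seed(1)
  x <- seq(0, 1, length.out = 200)
  y <- sin(2 * pi * x) + rnorm(200, sd = 0.3)

  # Parametric approach: assume the regression curve is a straight line
  fit_line <- lm(y ~ x)

  # Nonparametric approaches: no shape assumption, only smoothness
  fit_local  <- loess(y ~ x, degree = 2, span = 0.3)  # local polynomial regression
  fit_spline <- smooth.spline(x, y)                   # smoothing spline (penalty chosen by GCV)

  # Compare the three fits against the observations
  plot(x, y, col = "grey")
  abline(fit_line, col = "red")
  lines(x, predict(fit_local), col = "blue")
  lines(fit_spline, col = "darkgreen")

The span argument of loess() and the penalty in smooth.spline() play the role of the smoothing parameters discussed in the syllabus below.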
Each module covers an area of statistics and applied probability relevant to the research and other interests of members of academic staff. Details are given in the Syllabus section below.
Teaching Methods
Lectures supported by exercise sheets, many of which involve computer practical work
Learning Objectives
The students will be able to:
- calculate nonparametric regression estimates in R (see the sketch after this list)
- decide on suitable techniques for choosing the smoothing parameters involved in nonparametric regression
- compute the various nonparametric regression estimators for given data
- describe the strengths and weaknesses of the various methods introduced
- decide on the most appropriate method for the data at hand
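As a hypothetical sketch of the first two objectives (reusing the simulated x and y from the earlier example; the candidate spans are illustrative, not prescribed by the unit), a smoothing parameter can be chosen by leave-one-out cross-validation:

  # Leave-one-out cross-validation over candidate spans for loess()
  # (simple but slow: one fit per left-out observation and per span)
  spans <- seq(0.1, 0.9, by = 0.1)
  cv_error <- sapply(spans, function(s) {
    squared_errors <- sapply(seq_along(x), function(i) {
      train <- data.frame(x = x[-i], y = y[-i])
      fit <- loess(y ~ x, data = train, degree = 1, span = s,
                   control = loess.control(surface = "direct"))
      (y[i] - predict(fit, newdata = data.frame(x = x[i])))^2
    })
    mean(squared_errors)
  })
  best_span <- spans[which.min(cv_error)]

  # For smoothing splines, smooth.spline() offers generalised cross-validation
  # (the default) or ordinary leave-one-out cross-validation via cv = TRUE
  fit_cv <- smooth.spline(x, y, cv = TRUE)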
Assessment Methods
Assessment will be by means of a project involving computer-based applications and the study of some theoretical properties of the estimators.
Award of Credit Points
Credit points will be awarded if a student passes the unit; i.e. attains a final mark of 50 or more.
Transferable Skills
In addition to the general skills associated with other mathematical units, students will also have the opportunity to gain practice in the following: report writing, use of information resources, use of initiative in learning material other than that provided in the lectures, time management, general IT skills and word-processing.
Texts
The main references are:
- Fan, J. and Gijbels, I. (1996). Local Polynomial Modelling and Its Applications. Chapman & Hall, London.
- Haerdle, W. (1992). Applied Nonparametric Regression. Econometric Society Monographs.
- Wand, M.P. and Jones, M.C. (1995). Kernel Smoothing.
but the following books can also be helpful:
- Bowman, A.W. and Azzalini, A. (1997). Applied Smoothing Techniques for Data Analysis.
- Haerdle, W. (1991). Smoothing Techniques: With Implementation in S.
Syllabus
- Introduction to nonparametric methods (explanation of the differences from parametric methods)
- Kernel method: local polynomial regression as a generalization of the usual linear regression; the necessity of a smoothing parameter
- Selection of the smoothing parameter in practice and some properties of local polynomial estimators
- Spline regression as a generalization of the usual linear regression: smoothing splines, the penalty, the smoothing parameter and properties of the spline estimators (the estimation criteria for both approaches are sketched after this list)
- Wavelet thresholding
- Smoothing under Shape Constraints
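For orientation only, and using standard textbook definitions with notation chosen here rather than taken from the unit materials, the two main estimation criteria in the syllabus can be written as follows. The local polynomial estimator of degree p at a point x solves a kernel-weighted least squares problem, and the smoothing spline minimises a penalised residual sum of squares:

  \hat{m}(x) = \hat{\beta}_0, \quad
  \hat{\beta} = \arg\min_{\beta_0,\ldots,\beta_p}
    \sum_{i=1}^{n} \Big( Y_i - \sum_{j=0}^{p} \beta_j (X_i - x)^j \Big)^2
    K\!\Big( \frac{X_i - x}{h} \Big)

  \hat{m} = \arg\min_{m}
    \sum_{i=1}^{n} \big( Y_i - m(X_i) \big)^2
    + \lambda \int \big( m''(t) \big)^2 \, dt

Here K is a kernel function; the bandwidth h and the penalty weight \lambda are the smoothing parameters whose selection is covered in the syllabus above.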
