# Polynomial regression
A polynomial regression is a special case of [[Multiple linear regression|multiple linear regression]] that includes additional predictors that are powers of another predictor. If we have just one predictor, then a polynomial regression model of degree $d$ is specified by:
$
E(Y_i \mid X_i) = \beta_0 + \beta_1 X_i + \beta_2 X_i^2 + \cdots + \beta_d X_i^d
$
For example, if $d = 2$, the model is a quadratic regression. Polynomial regression is useful for modeling relatively simple nonlinearities in the predictor-outcome relationship.
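As a minimal sketch (assuming NumPy is available; the simulated coefficients are illustrative, not from the source), a quadratic regression is just ordinary least squares on a design matrix with columns $1, X, X^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 50)
# Simulate a quadratic relationship with noise:
# E(Y | X) = 1 + 0.5 X - 2 X^2
y = 1 + 0.5 * x - 2 * x**2 + rng.normal(scale=0.1, size=x.size)

# Design matrix with columns 1, X, X^2. Polynomial regression
# is linear in the parameters, so OLS applies directly.
X = np.column_stack([np.ones_like(x), x, x**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimates of (beta_0, beta_1, beta_2)
```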
Despite how it looks, polynomial regression is still a linear regression: the model is linear *in the parameters*.
Since a predictor and its powers tend to be highly correlated (and large values of $X^d$ can cause numerical problems), it is often useful to center the predictor, i.e. replace $X$ with $X - \bar{X}$, before forming the polynomial terms.
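A quick illustration of why centering helps (the numbers are made up for the demonstration): for a predictor far from zero, $X$ and $X^2$ are nearly perfectly correlated, while the centered versions are not.

```python
import numpy as np

x = np.linspace(10, 20, 100)  # predictor far from zero
xc = x - x.mean()             # centered predictor

# Correlation between the predictor and its square,
# before and after centering.
corr_raw = np.corrcoef(x, x**2)[0, 1]
corr_centered = np.corrcoef(xc, xc**2)[0, 1]
print(corr_raw, corr_centered)
```

For a symmetric predictor the centered correlation is essentially zero, so the linear and quadratic terms are far less collinear.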
If a polynomial regression contains multiple covariates, then the full model contains all polynomial terms up to degree $d$ for each predictor *and* the interactions between the predictors. For example, the full second-order model with two predictors is
$
E(Y \mid X_1, X_2) = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_1^2 + \beta_4 X_2^2 + \beta_5 X_1 X_2
$
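A sketch of the corresponding design matrix for the full second-order model in two predictors (the simulated data is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=30)
x2 = rng.normal(size=30)

# Columns: 1, x1, x2, x1^2, x2^2, x1*x2 -- every term up to
# degree 2 plus the interaction between the predictors.
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
print(X.shape)  # (30, 6)
```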
## Interpretation
It is difficult to interpret the parameters associated with polynomial predictors in terms of changes, since $X$ and its powers cannot be varied independently. It is more convenient to interpret the parameters in terms of the rate of change of the mean response at a given predictor value; for the quadratic model this is $\beta_1 + 2\beta_2 x$.
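A sketch of this interpretation for a quadratic fit (the coefficient values are hypothetical):

```python
import numpy as np

# Hypothetical fitted coefficients (beta_0, beta_1, beta_2).
beta = np.array([1.0, 0.5, -2.0])

def rate_of_change(x, beta):
    """d E(Y|X=x) / dx for a quadratic model: beta_1 + 2*beta_2*x."""
    return beta[1] + 2 * beta[2] * x

# beta_1 alone is only the slope at x = 0; elsewhere the rate
# of change depends on where you evaluate it.
print(rate_of_change(0.0, beta))  # 0.5
print(rate_of_change(1.0, beta))  # -3.5
```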
---
# References
[[Applied Linear Regression#5. Complex Regressors]]