Cubic Regression Calculator
Use Omni's cubic regression calculator whenever you want to fit the cubic model of regression to a dataset. With its help, you'll be able to quickly determine the cubic polynomial that best models your data. If you need to learn more about this technique, scroll down to find an article where we give the cubic regression formula, explain how to calculate cubic regression by hand, and illustrate all this theory with an example of cubic regression!
Definition of cubic regression
In general, regression is a statistical technique that allows us to model the relationship between two variables by finding a curve that best fits the observed samples.
In the cubic regression model, we deal with cubic functions, that is, polynomials of degree 3. You can see an example in the picture below. The idea is the same as in other regression models, like linear regression, where we try to fit a straight line to data points, or quadratic regression, where we deal with parabolas. These three types of regression are examples of polynomial regression.
Now that we understand the cubic polynomial regression model, let's discuss the cubic regression formula.
The formula for cubic regression
To discuss the cubic regression formula in a more formal way, we need to introduce some notation. Let us, therefore, consider a set of data points:

(x_{1}, y_{1}), ..., (x_{n}, y_{n}).

The cubic regression function takes the form:

y = a + bx + cx² + dx³,
where a, b, c, d are real numbers, called the coefficients of the cubic regression model. As you can see, we model how a change in x affects the value of y. In other words, we assume here that x is the independent (explanatory) variable and y is the dependent (response) variable.

- If d = 0, we obtain quadratic regression; and
- If c = d = 0, then we get a simple linear regression model.
And that's it when it comes to the cubic regression equation! The main challenge now is to determine the actual values of the four coefficients. To find the coefficients of the cubic regression model, we usually resort to the least-squares method. That is, we look for the values of a, b, c, d that minimize the squared distance between each data point:

(x_{i}, y_{i}),

and the corresponding point predicted by the cubic regression equation:

(x_{i}, a + bx_{i} + cx_{i}² + dx_{i}³).
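To make the least-squares criterion concrete, here is a minimal sketch (assuming NumPy is available; the data points and candidate coefficients are hypothetical) of the sum of squared distances that cubic regression minimizes:

```python
import numpy as np

def sum_of_squares(coeffs, x, y):
    """Sum of squared residuals for the cubic model y ≈ a + bx + cx² + dx³."""
    a, b, c, d = coeffs
    y_pred = a + b * x + c * x**2 + d * x**3
    return float(np.sum((y - y_pred) ** 2))

# Hypothetical data generated by y = 1 + x³:
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 9.0, 28.0])

print(sum_of_squares((1, 0, 0, 1), x, y))  # exact fit, so 0.0
print(sum_of_squares((0, 1, 0, 0), x, y))  # the line y = x fits much worse
```

The fitting procedure picks, among all possible coefficient quadruples, the one for which this quantity is smallest.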
"OK, but this doesn't help that much in finding these values", you're probably thinking, and we completely agree. In what follows, we discuss how to determine the coefficients of the cubic regression function by hand. A quicker solution is, of course, to use Omni's cubic regression calculator 😉.
How to use this cubic regression calculator?
Here's a short instruction on how to use our cubic regression calculator:
- Input your sample (up to 30 points). Remember that the calculator needs at least 4 points to fit the cubic regression function to your data!
- The calculator will display the scatter plot of your data and the cubic curve fitted to these points.
- Below the scatter plot, you will find the cubic regression equation for your data.
- If you need the coefficients computed with a higher precision, click the advanced mode of our cubic regression calculator. A field will appear where you can change the number of significant figures.
How to find cubic regression by hand?
It's high time we discussed how to compute the coefficients of cubic regression by hand. We'll use the projection approach, which is a very quick method as it uses matrix operations.
Let us introduce some necessary notation:

- We let X be a matrix with four columns and n rows, where n is the number of data points. We fill the first column with ones, the second with the observed values x_{1}, ..., x_{n} of the explanatory (independent) variable, the third with the squares of these observed values, and the fourth with their cubes:

⌈ 1  x_{1}  x_{1}^{2}  x_{1}^{3} ⌉
| 1  x_{2}  x_{2}^{2}  x_{2}^{3} |
| ...  ...     ...        ...    |
⌊ 1  x_{n}  x_{n}^{2}  x_{n}^{3} ⌋

This matrix is often called the model matrix.

- We let y be a column vector containing the values y_{1}, ..., y_{n} of the response (dependent) variable:

⌈ y_{1} ⌉
| y_{2} |
|  ...  |
⌊ y_{n} ⌋

- We let β be the column vector of the coefficients of the cubic regression model that we're looking for:

⌈ a ⌉
| b |
| c |
⌊ d ⌋

Keep in mind that the order matters: start with a at the top and finish with d at the bottom!
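As a sketch of this setup (assuming NumPy; the observed values are hypothetical), the model matrix can be built with `np.vander`, which stacks increasing powers of x column by column:

```python
import numpy as np

# Hypothetical observed values of the independent variable
x = np.array([0.0, 2.0, 3.0, 4.0, 5.0])

# Model matrix with columns 1, x, x², x³ — matching the order a, b, c, d in β
X = np.vander(x, N=4, increasing=True)
print(X)  # one row per data point, four columns
```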
Now, to determine the actual values of the coefficients, we just use the so-called normal equation:

β = (X^{T}X)^{-1}X^{T}y,

where:

- X^{T} is the transpose of X;
- (X^{T}X)^{-1} is the inverse of X^{T}X; and
- The operation between every two matrices is matrix multiplication.
⚠ Keep in mind that for some very peculiar datasets, the inverse of X^{T}X might not exist. If this happens, you cannot fit the cubic polynomial regression to this data.
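Putting the pieces together, here is a minimal sketch of the normal equation in NumPy (the data is hypothetical; `np.linalg.solve` is used instead of forming the inverse explicitly, and it raises `LinAlgError` exactly in the singular case mentioned above):

```python
import numpy as np

# Hypothetical data points
x = np.array([0.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 0.0, 3.0, 5.0, 4.0])

X = np.vander(x, N=4, increasing=True)  # columns: 1, x, x², x³

# Normal equation β = (XᵀX)⁻¹ Xᵀy, solved as the linear system (XᵀX)β = Xᵀy
beta = np.linalg.solve(X.T @ X, X.T @ y)
a, b, c, d = beta
```

Solving the linear system rather than inverting X^{T}X is both faster and numerically safer, but the result is the same β.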
As you can see, it's not that hard to find cubic regression by hand, yet there are some challenges along the way. To get a better grasp of how to do all these computations in practice, let's solve an example of cubic regression together.
Cubic regression example
Let us find the cubic regression function for the following dataset:

(0, 1), (2, 0), (3, 3), (4, 5), (5, 4).
Here are our matrices:

- The matrix X:

⌈ 1  0   0    0 ⌉
| 1  2   4    8 |
| 1  3   9   27 |
| 1  4  16   64 |
⌊ 1  5  25  125 ⌋

- The vector y:

⌈ 1 ⌉
| 0 |
| 3 |
| 5 |
⌊ 4 ⌋
We apply the formula step by step:

- First, we determine X^{T}:

⌈ 1  1   1   1    1 ⌉
| 0  2   3   4    5 |
| 0  4   9  16   25 |
⌊ 0  8  27  64  125 ⌋

- Next, we compute X^{T}X:

⌈   5   14    54   224 ⌉
|  14   54   224   978 |
|  54  224   978  4424 |
⌊ 224  978  4424 20514 ⌋

- Then, we find (X^{T}X)^{-1}:

⌈  0.9987  -0.9544   0.2844  -0.0267 ⌉
| -0.9544   5.5128  -2.7877   0.3488 |
|  0.2844  -2.7877   1.4987  -0.1934 |
⌊ -0.0267   0.3488  -0.1934   0.0254 ⌋
- Finally, we perform the matrix multiplication (X^{T}X)^{-1}X^{T}y. The cubic regression coefficients we wanted to find are:

⌈  0.9973 ⌉
| -5.0755 |
|  3.0687 |
⌊ -0.3868 ⌋

Therefore, the cubic regression function that best fits our data is:

y = 0.9973 - 5.0755x + 3.0687x² - 0.3868x³.
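As a sanity check on the arithmetic above, the same coefficients can be recovered with NumPy's built-in least-squares fit (`np.polyfit` returns them from the highest power down):

```python
import numpy as np

# The dataset from the worked example
x = np.array([0.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.0, 0.0, 3.0, 5.0, 4.0])

# Degree-3 least-squares fit; coefficients come back in the order d, c, b, a
d, c, b, a = np.polyfit(x, y, deg=3)
print(a, b, c, d)  # ≈ 0.9973, -5.0755, 3.0687, -0.3868
```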
As you can see, finding the cubic regression formula by hand requires a lot of calculations. Thankfully, there's Omni's cubic regression calculator 😊!
FAQ
What is cubic regression?
Cubic regression is a statistical technique that finds the cubic polynomial (a polynomial of degree 3) that best fits a dataset. It is a special case of polynomial regression, other examples of which include simple linear regression and quadratic regression.
How to find cubic regression?
To calculate cubic regression, we use the method of least squares. In practice, we apply the normal equation, which involves the model matrix X, built from the values of the independent variable, and the vector y, which contains the values of the dependent variable. Through a series of matrix operations, this equation allows us to find the coefficients of cubic regression.
When to use cubic regression?
Use a cubic equation when you see in the scatter plot (or you have some prior theory that leads you to believe) that your data follows a cubic curve. Remember, though, that we want our models to be as simple as possible, so, whenever possible, try to fit a simpler model, like simple linear or quadratic regression.
Can I fit cubic regression to 3 data points?
You can fit many (infinitely many, in fact) cubic curves to 3 data points. You need 4 data points (with distinct x-values) to find a unique cubic model. Note that with 4 points the fit will be perfect, i.e., all the points will lie on the curve!
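This can be checked numerically: with any 4 points that have distinct x-values (hypothetical data below), the fitted cubic passes through every point exactly, up to floating-point error:

```python
import numpy as np

# Four hypothetical points with distinct x-values
x = np.array([0.0, 1.0, 2.0, 4.0])
y = np.array([1.0, 3.0, 2.0, 5.0])

d, c, b, a = np.polyfit(x, y, deg=3)
residuals = y - (a + b * x + c * x**2 + d * x**3)
# Every residual is numerically zero: the fit is perfect
```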