Newton-Cotes in practice

Newton-Cotes rules, especially Simpson’s rules, are the go-to for integration when the function is smooth*-ish* (i.e. locally well approximated by a quadratic).
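
As a quick sanity check, here is a minimal sketch (assuming SciPy’s `trapezoid` and `simpson` are available) comparing the composite trapezoid and Simpson’s 1/3 rules on a smooth integrand, \(e^x\) on \([0, 1]\); Simpson’s higher-order cancellation shows up as a much faster drop in error as the nodes are refined:

```python
import numpy as np
from scipy.integrate import trapezoid, simpson

exact = np.e - 1.0                          # exact value of the integral of e^x on [0, 1]
for n in (8, 16, 32):                       # number of (even) subintervals
    x = np.linspace(0.0, 1.0, n + 1)        # evenly spaced nodes
    y = np.exp(x)
    print(f"n={n:3d}  trapezoid error={abs(trapezoid(y, x=x) - exact):.2e}  "
          f"Simpson error={abs(simpson(y, x=x) - exact):.2e}")
```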

Uneven data

Having the same step size within a subinterval has clear benefits for convergence due to the cancellation of higher-order terms. However, the error of the total integral is determined by the largest of the subdomains. Therefore, it is optimal to have equal spacing both within and between the subdomains.

For uneven data, we could define subdomains based on where the step size happens to be equal, thereby making the best use of our data. In the case where the step size is completely irregular, we can resort to the trapezoid rule, which places no requirement on the spacing, as in the sketch below.
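
A minimal sketch of that fallback, on hypothetical irregularly spaced samples of \(\sin x\): the composite trapezoid rule is just a sum of \(\frac{h_i}{2}[f(x_{i-1}) + f(x_i)]\) terms over each (unequal) gap:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, np.pi, 50))    # irregularly spaced sample points
x[0], x[-1] = 0.0, np.pi                    # pin the endpoints of the domain
y = np.sin(x)                               # sampled values; the exact integral is 2

# composite trapezoid rule: h_i/2 * (y_{i-1} + y_i) summed over every uneven gap
I = np.sum(0.5 * np.diff(x) * (y[:-1] + y[1:]))
print(I)                                    # close to 2, accuracy limited by the largest gap
```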

Multiple dimensions

Multiple integrals can be decomposed into a series of nested 1D integrals (Fubini’s theorem):

\[\iint_R f(x, y)\, dA = \int \bigg[\int f(x, y)\, dy \bigg]\, dx \]

This defines a recursive algorithm: evaluate the inner 1D integral at every node of the outer variable, then integrate those values with the same 1D rule.
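
A minimal sketch of that recursion (assuming a rectangular region \([a_x, b_x] \times [a_y, b_y]\) and SciPy’s 1D `simpson` as the workhorse; the helper name `integrate_2d` and the node counts are illustrative choices, not a fixed API):

```python
import numpy as np
from scipy.integrate import simpson

def integrate_2d(f, ax, bx, ay, by, nx=64, ny=64):
    """Integrate f(x, y) over [ax, bx] x [ay, by] by nesting 1D Simpson rules."""
    x = np.linspace(ax, bx, nx + 1)
    y = np.linspace(ay, by, ny + 1)
    # inner integrals: for each outer node x_i, integrate f(x_i, y) over y
    inner = np.array([simpson(f(xi, y), x=y) for xi in x])
    # outer integral: integrate the inner results over x
    return simpson(inner, x=x)

# example: the integral of x*y over the unit square is exactly 1/4
print(integrate_2d(lambda x, y: x * y, 0.0, 1.0, 0.0, 1.0))
```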

Summary

| Rule | Subdomain formula | Subdomain error | Total integral formula | Total error |
|------|-------------------|-----------------|------------------------|-------------|
| Midpoint | \(I_i = hf(x_i)\) | \(O(h^3)\) | \(I = h\sum_{i=1}^n f(x_i)\) | \(O(h^2)\) |
| Trapezoid | \(I_i = \frac{h}{2}[f(x_{i-1}) + f(x_i)]\) | \(O(h^3)\) | \(I = \frac{h}{2}[f(x_0) + 2\sum_{i=1}^{n-1} f(x_i) + f(x_n)]\) | \(O(h^2)\) |
| Simpson’s 1/3 | \(I_i = \frac{h}{3}[f(x_{i-1}) + 4f(x_i) + f(x_{i+1})]\) | \(O(h^5)\) | \(I = \frac{h}{3}[f(x_0) + 4\sum_{i=1,3,5}^{n-1} f(x_i) + 2\sum_{i=2,4,6}^{n-2} f(x_i) + f(x_n)]\) | \(O(h^4)\) |
| Simpson’s 3/8 | \(I_i = \frac{3h}{8}[f(x_{i-1}) + 3f(x_i) + 3f(x_{i+1}) + f(x_{i+2})]\) | \(O(h^5)\) | \(I = \frac{3h}{8}[f(x_0) + 3\sum_{i=1,4,7}^{n-2} f(x_i) + 3\sum_{i=2,5,8}^{n-1} f(x_i) + 2\sum_{i=3,6,9}^{n-3} f(x_i) + f(x_n)]\) | \(O(h^4)\) |
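
As a worked example, the composite Simpson’s 1/3 formula from the table translates almost line-for-line into code (a sketch assuming an even number of equal subintervals \(n\); the helper name `composite_simpson` is just illustrative):

```python
import numpy as np

def composite_simpson(f, a, b, n):
    """Composite Simpson's 1/3 rule on [a, b] with n equal subintervals (n even)."""
    if n % 2:
        raise ValueError("Simpson's 1/3 rule needs an even number of subintervals")
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    # f(x_0) + 4*(odd-index terms) + 2*(even interior terms) + f(x_n), times h/3
    return h / 3.0 * (y[0] + 4.0 * y[1:-1:2].sum() + 2.0 * y[2:-1:2].sum() + y[-1])

# example: the integral of sin(x) on [0, pi] is exactly 2
print(composite_simpson(np.sin, 0.0, np.pi, 10))   # close to 2, error is O(h^4)
```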