# Romberg’s rule
An alternative to the Newton-Cotes formulae uses successive sampling step sizes to estimate and cancel error. We begin with Richardson extrapolation and then generalize to Romberg’s rule.
## Richardson extrapolation
Richardson extrapolation is an algorithm that uses successive applications of the trapezoid rule with differing step size to achieve superior results with less effort.
The exact integral can always be expressed as

\[
I = I'(h) + E(h),
\]

where \(I'(h)\) is the approximation with step \(h\) and \(E(h)\) is the associated error. We know that \(E(h) \propto h^2\) for the trapezoid rule; more precisely, \(E(h) \propto f'' h^2\).
Let’s sample the interval twice with step sizes \(h_1\) and \(h_2\). If we assume \(f''\) doesn’t change much we can say,

\[
\frac{E(h_1)}{E(h_2)} \approx \frac{h_1^2}{h_2^2}
\quad\Longrightarrow\quad
E(h_1) \approx E(h_2)\left(\frac{h_1}{h_2}\right)^2.
\]
Now, since the exact integral is the same in both cases,

\[
I'(h_1) + E(h_2)\left(\frac{h_1}{h_2}\right)^2 \approx I'(h_2) + E(h_2),
\]

which can be solved for the error,

\[
E(h_2) \approx \frac{I'(h_1) - I'(h_2)}{1 - (h_1/h_2)^2},
\]

and inserted into the formula for \(h_2\),

\[
I \approx I'(h_2) + \frac{I'(h_2) - I'(h_1)}{(h_1/h_2)^2 - 1},
\]
which can be shown to be accurate to \(O(h^4)\)!
For the special case where \(h_1 = 2 h_2\) (which has advantages for overlapping point evaluations),

\[
I \approx \frac{4}{3} I'(h_2) - \frac{1}{3} I'(h_1).
\]
This is an interesting result! Effectively what we have done is use a second estimate to estimate the next power in our expansion, leading to a higher order estimate!
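To make this concrete, here is a minimal sketch of Richardson extrapolation applied to two trapezoid estimates with \(h_1 = 2 h_2\). The helper `trapezoid` is a hypothetical name introduced here for illustration; it is just the standard composite trapezoid rule.

```python
import numpy as np

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n equal intervals."""
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    return h * (f(x[0]) / 2 + f(x[1:-1]).sum() + f(x[-1]) / 2)

# Two trapezoid estimates of the same integral, with h1 = 2 * h2
I_h1 = trapezoid(np.sin, 0, np.pi, 4)   # coarse: O(h^2) error
I_h2 = trapezoid(np.sin, 0, np.pi, 8)   # fine:   O(h^2) error

# Richardson extrapolation cancels the leading h^2 error term
I_rich = (4 * I_h2 - I_h1) / 3          # O(h^4) estimate
```

Comparing the three errors against the exact value of 2 shows the extrapolated estimate is far more accurate than either trapezoid estimate alone, at no extra function evaluations.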
## Romberg Integration Algorithm
In fact we can repeat this procedure arbitrarily! Above we combined two \(O(h^2)\) estimates to make an \(O(h^4)\) estimate. We can take this result, combine it with another sampling at \(h_3 < h_2\), and obtain an \(O(h^6)\) estimate, and so on! If we successively halve the step size, we can get:

\[
\begin{aligned}
I_{j,1} &= \frac{4 I_{j,0} - I_{j-1,0}}{4 - 1} && O(h^4)\\
I_{j,2} &= \frac{16 I_{j,1} - I_{j-1,1}}{16 - 1} && O(h^6)\\
&\;\;\vdots\\
I_{j,k} &= \frac{4^{k} I_{j,k-1} - I_{j-1,k-1}}{4^{k} - 1},
\end{aligned}
\]

where \(I_{j,0}\) is the trapezoid estimate with step \(h_j = (b-a)/2^j\), and the last line is the Romberg Integration Algorithm, which has some very nice properties:
lends itself to recursive programming (elegant)
parallelizes nicely since each interval is independent.
can integrate to arbitrary accuracy (if a callable function is provided)
can reuse function evaluations if the step size is chosen well.
However, it requires either:
the ability to sample the function at will
a structured sampling grid
Remember, however, that we assumed the second derivative doesn’t change much between levels (i.e., that it stays the same at different length scales). If there is high-frequency noise on top of a low-frequency trend, the cancellation of the errors isn’t so robust. For these reasons, heuristics based on Newton-Cotes (decision trees applying Simpson’s rules and the trapezoid rule depending on the sampling structure) remain a common robust choice.
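The recurrence above can be sketched as a short table-building routine. This is a minimal illustration, not a production implementation; `romberg` and `R` are names introduced here. Note how each halving reuses the previous row’s trapezoid sum and only evaluates the new midpoints.

```python
import numpy as np

def romberg(f, a, b, levels):
    """Romberg integration of f on [a, b] with `levels` rows of halving.

    R[j][0] is the trapezoid estimate with 2**j intervals; each column
    applies R[j][k] = (4**k * R[j][k-1] - R[j-1][k-1]) / (4**k - 1).
    """
    R = [[0.0] * levels for _ in range(levels)]
    h = b - a
    R[0][0] = h * (f(a) + f(b)) / 2
    for j in range(1, levels):
        h /= 2
        # Reuse previous evaluations: only the new odd midpoints are sampled
        new_x = a + h * np.arange(1, 2**j, 2)
        R[j][0] = R[j - 1][0] / 2 + h * f(new_x).sum()
        for k in range(1, j + 1):
            R[j][k] = (4**k * R[j][k - 1] - R[j - 1][k - 1]) / (4**k - 1)
    return R[levels - 1][levels - 1]
```

For example, `romberg(np.sin, 0, np.pi, 5)` uses only 17 function evaluations (16 intervals) yet reproduces the exact value 2 to roughly eight decimal places, consistent with the tables produced by `scipy` below.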
## Example
Approximate \(\int_0^\pi \sin(x)\, dx\) using Romberg integration and compare with Simpson’s 1/3 rule.
```python
import numpy as np
import scipy as sp

def f(x):
    return np.sin(x)

for n in [4, 8, 16, 32, 64, 128]:
    x = np.linspace(0, np.pi, n + 1)
    f_x = f(x)
    # romb requires 2**k + 1 equally spaced samples
    romberg = sp.integrate.romb(f_x, dx=np.pi / n, show=True)
    print(f"Romberg with {n} intervals: {abs(romberg - 2)}")
    simpson = sp.integrate.simpson(f_x, x=x)
    print(f"Simpson with {n} intervals: {abs(simpson - 2)}")
```
Richardson Extrapolation Table for Romberg Integration
======================================================
0.00000
1.57080 2.09440
1.89612 2.00456 1.99857
======================================================
Romberg with 4 intervals: 0.001429268176164289
Simpson with 4 intervals: 0.0045597549844207386
Richardson Extrapolation Table for Romberg Integration
======================================================
0.00000
1.57080 2.09440
1.89612 2.00456 1.99857
1.97423 2.00027 1.99998 2.00001
======================================================
Romberg with 8 intervals: 5.549979670949057e-06
Simpson with 8 intervals: 0.00026916994838765973
Richardson Extrapolation Table for Romberg Integration
======================================================
0.00000
1.57080 2.09440
1.89612 2.00456 1.99857
1.97423 2.00027 1.99998 2.00001
1.99357 2.00002 2.00000 2.00000 2.00000
======================================================
Romberg with 16 intervals: 5.412709835894702e-09
Simpson with 16 intervals: 1.6591047935499148e-05
Richardson Extrapolation Table for Romberg Integration
======================================================
0.00000
1.57080 2.09440
1.89612 2.00456 1.99857
1.97423 2.00027 1.99998 2.00001
1.99357 2.00002 2.00000 2.00000 2.00000
1.99839 2.00000 2.00000 2.00000 2.00000 2.00000
======================================================
Romberg with 32 intervals: 1.3216094885137863e-12
Simpson with 32 intervals: 1.0333694131503535e-06
Richardson Extrapolation Table for Romberg Integration
======================================================
0.00000
1.57080 2.09440
1.89612 2.00456 1.99857
1.97423 2.00027 1.99998 2.00001
1.99357 2.00002 2.00000 2.00000 2.00000
1.99839 2.00000 2.00000 2.00000 2.00000 2.00000
1.99960 2.00000 2.00000 2.00000 2.00000 2.00000 2.00000
======================================================
Romberg with 64 intervals: 4.440892098500626e-16
Simpson with 64 intervals: 6.453000178652246e-08
Richardson Extrapolation Table for Romberg Integration
======================================================
0.00000
1.57080 2.09440
1.89612 2.00456 1.99857
1.97423 2.00027 1.99998 2.00001
1.99357 2.00002 2.00000 2.00000 2.00000
1.99839 2.00000 2.00000 2.00000 2.00000 2.00000
1.99960 2.00000 2.00000 2.00000 2.00000 2.00000 2.00000
1.99990 2.00000 2.00000 2.00000 2.00000 2.00000 2.00000 2.00000
======================================================
Romberg with 128 intervals: 0.0
Simpson with 128 intervals: 4.032257194808153e-09
Note the error became zero! What does that mean?