The Joy of Bisection
By Jason Lenthe | March 15, 2025

We spend so much of our mathematical education learning how to solve equations using algebra that it came as quite a surprise when I learned many years ago that some types of equations cannot be solved with algebra. And yet these are not special cases where great mathematical minds have devised clever and obscure solutions, but entire classes of relatively simple equations that can be solved systematically. What's even more amazing is that there is a simple algorithm that can approximately but rigorously solve such equations.

Solving Equations without Algebra?

Let's take a very simple equation: $$ 5 = 2^x $$

“What's so hard about solving that?” you say. Remembering your rules of logarithms, you quickly write something down like $$ x = { \log(5) \over \log(2) } $$

Then pop open your favorite REPL interpreter and import the math library. Or maybe you grab your scientific calculator. Or, if you're really old school, you get out your battered copy of the Mathematical Handbook for Scientists and Engineers [1] and turn to appendix F for the table of logarithms. One way or another, you're able to get 2.321928094887362 as an approximate but pretty darn accurate solution. But did you ever stop and think...can you solve this equation using only pencil, paper, and algebra?

It turns out that the logarithm function is quite a pain to compute by hand. No amount of moving terms around with addition, subtraction, multiplication, or division is going to get x by itself on one side with an expression you can evaluate using standard arithmetic on the other. You'll need to leverage the mathematical properties of the equation in conjunction with an iterative method. And this is exactly what your math library's log() function is doing for you.

Mathematical expressions like \(2^x\) (where \(x\) is a real number) belong to the category of transcendental functions. Informally, a transcendental function is one that cannot be written solely with the basic arithmetic operations of addition, subtraction, multiplication, and division. Since transcendental functions cannot be written with basic arithmetic alone, equations containing them cannot be solved using algebra alone.

Equations, Function Roots, and Bracketing

Solving an equation can be reduced to finding a zero, or root, of a function. To show how this is the case, consider any equation: $$ y = f(x) $$ Solving for \(x\) given \(y\) is tantamount to finding an \(x\) that satisfies: $$ 0 = g(x) $$ where \( g(x) = f(x) - y \). In other words, finding the point where \(g(x)\) is zero is the same thing as solving \(y = f(x)\) for \(x\). Thus solving an equation reduces to finding the zero or root of a function.
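For the running example, solving \(5 = 2^x\) means finding the root of \(g(x) = 2^x - 5\). Here is a quick sketch of that reduction (the class and method names are my own, chosen for illustration):

```java
// Root form of the example equation 5 = 2^x:
// g(x) = 2^x - 5, whose zero is x = log(5)/log(2).
public class RootForm {
    static double g(double x) {
        return Math.pow(2.0, x) - 5.0;
    }

    public static void main(String[] args) {
        // g changes sign between 2 and 3, so a root lies in [2, 3].
        System.out.println(g(2.0)); // -1.0
        System.out.println(g(3.0)); //  3.0
    }
}
```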

Now suppose you have some function \(f(x)\) and an interval \([a, b]\) over which \(f(x)\) is continuous and that \( \text{sgn}(f(a)) \ne \text{sgn}(f(b)) \). According to the intermediate value theorem, you know that there exists some \(x\) in the interval \([a, b]\) such that \(f(x) = 0\). The values \(a\) and \(b\) are said to bracket the root of the function \(f\).

Generally, the most reliable methods for finding the root of a function require that you first bracket the root before you can find it precisely. While there are methods that do not require bracketing, they do not guarantee convergence to a solution. Often a bracket is known in advance based on the nature of the function you're dealing with. In more complicated scenarios, a bracket can be found by sampling the function over some interval of interest.
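Sampling for a bracket can be as simple as walking the interval in equal steps and watching for a sign change. A minimal sketch, assuming a uniform grid is good enough for the function at hand (names and the sampling strategy are my own):

```java
import java.util.function.DoubleUnaryOperator;

// Scan [lo, hi] in n equal steps and return the first subinterval
// whose endpoints have oppositely signed function values.
public class BracketScan {
    static double[] findBracket(DoubleUnaryOperator f, double lo, double hi, int n) {
        double step = (hi - lo) / n;
        double a = lo;
        double fa = f.applyAsDouble(a);
        for (int i = 1; i <= n; i++) {
            double b = lo + i * step;
            double fb = f.applyAsDouble(b);
            if (Math.signum(fa) != Math.signum(fb)) {
                return new double[] { a, b };  // bracket found
            }
            a = b;
            fa = fb;
        }
        return null;  // no sign change detected at this resolution
    }

    public static void main(String[] args) {
        double[] bracket = findBracket(x -> Math.pow(2.0, x) - 5.0, 0.0, 10.0, 10);
        System.out.println(bracket[0] + ", " + bracket[1]);  // 2.0, 3.0
    }
}
```

Note that a scan can miss a root if the function dips across zero and back between two samples, so the step size has to respect what you know about the function.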

Bisection

One of the most reliable methods of finding the root of a function is the bisection method. It works by taking the bracketing interval and repeatedly cutting it in half until the bracket has shrunk to a specified tolerance. At each step, the function is evaluated at the midpoint, and the half whose endpoints still have oppositely signed function values is kept for the next iteration.

While bisection is not the fastest method, it's still pretty fast. Since it halves the bracket on each iteration, it gains one binary digit of accuracy per step and converges to the specified tolerance quickly. Faster methods are either less reliable in convergence or more complicated (or both).
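To put a number on that: a bracket of width \(w\) shrinks below a tolerance \(t\) after about \(\lceil \log_2(w/t) \rceil\) halvings. A one-line check of the arithmetic (my own illustration):

```java
// Bisection halves the bracket each step, so reaching tolerance t
// from a bracket of width w takes about ceil(log2(w / t)) iterations.
public class IterationCount {
    public static void main(String[] args) {
        double width = 1.0;      // e.g. the bracket [2, 3]
        double tolerance = 1e-6;
        long iterations = (long) Math.ceil(Math.log(width / tolerance) / Math.log(2.0));
        System.out.println(iterations);  // 20
    }
}
```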

And now we present the laser-coder version of a bisection root finder in Java. First we need a function interface:


package net.lasercoder;

@FunctionalInterface
interface Function {
    double apply(double x);
}

That was easy. Now for a bisection root finder class:


package net.lasercoder;

public class BisectionRootFinder {
    private int maxIterations = 100;
    private double tolerance = 1e-6;

    public BisectionRootFinder() {
    }

    public BisectionRootFinder(int maxIterations, double tolerance) {
        this.maxIterations = maxIterations;
        this.tolerance = tolerance;
    }

    public double findRoot(Function f, double a, double b) {
        double fa = f.apply(a);
        double fb = f.apply(b);

        // A valid bracket must have oppositely signed function values.
        if (Math.signum(fa) == Math.signum(fb)) {
            String msg = String.format(
                    "Root is not bracketed: f(%f) = %f, f(%f) = %f",
                    a, fa, b, fb);
            throw new IllegalArgumentException(msg);
        }

        int iteration = 0;
        while (Math.abs(a - b) > tolerance && iteration < maxIterations) {
            // Evaluate the midpoint and keep the half that still brackets the root.
            double mid = 0.5 * (a + b);
            double fMid = f.apply(mid);
            if (Math.signum(fMid) == Math.signum(fa)) {
                a = mid;
                fa = fMid;
            } else {
                b = mid;
                fb = fMid;
            }
            iteration++;
        }

        // Fail only if the loop stopped before the bracket shrank to tolerance.
        if (Math.abs(a - b) > tolerance) {
            String msg = "Max number of iterations hit without convergence.";
            throw new RuntimeException(msg);
        }

        // Return whichever endpoint is closer to the true root.
        if (Math.abs(fa) < Math.abs(fb)) {
            return a;
        } else {
            return b;
        }
    }
}

The class is configured with a root-finding tolerance and a maximum number of iterations (to avoid an infinite loop if a function is badly behaved). You can accept the defaults by using the no-argument constructor, or use the second constructor to supply your own tolerance and maximum number of iterations.

The bisection algorithm is implemented in the findRoot() method. It starts by evaluating the function on the bracket endpoints a and b and confirms that you actually have a bracket. If you don't, an IllegalArgumentException is thrown. Then we begin the halving iterations and continue doing so as long as our bracket is bigger than the tolerance and we have not yet hit the maximum number of iterations.

On each iteration, we determine the midpoint of the bracket and evaluate the function there. Then we decide which half to keep: \([a, \text{mid}] \) or \([\text{mid}, b]\). If we hit the maximum number of iterations without converging, a RuntimeException is thrown. If we make it to the end of the method, we have converged on a solution! The final step is to return a value. Most bisection implementations that I've seen arbitrarily return a or the midpoint of \([a, b]\). My version goes the extra mile: it compares the function values at \(a\) and \(b\) to determine which is actually closer to the true root, and returns that value.
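Putting it to work on the example equation, you would call findRoot() with a lambda for the Function interface, e.g. finder.findRoot(x -> Math.pow(2, x) - 5, 2, 3). So that the snippet below runs on its own, it inlines a condensed stand-in for findRoot() rather than depending on the class above:

```java
import java.util.function.DoubleUnaryOperator;

// Solve 2^x = 5 by bisection. The bisect helper is a condensed,
// self-contained stand-in for BisectionRootFinder.findRoot above.
public class BisectionDemo {
    static double bisect(DoubleUnaryOperator f, double a, double b, double tol) {
        double fa = f.applyAsDouble(a);
        while (Math.abs(b - a) > tol) {
            double mid = 0.5 * (a + b);
            double fMid = f.applyAsDouble(mid);
            if (Math.signum(fMid) == Math.signum(fa)) {
                a = mid;    // root is in the right half
                fa = fMid;
            } else {
                b = mid;    // root is in the left half
            }
        }
        return 0.5 * (a + b);
    }

    public static void main(String[] args) {
        double root = bisect(x -> Math.pow(2.0, x) - 5.0, 2.0, 3.0, 1e-9);
        System.out.println(root);  // approximately 2.321928095
    }
}
```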

Newton, Secant, and Brent

Bisection is a wonderful method because it is simple, reliable, and fast. There are, however, other methods that are faster at the expense of lower reliability and/or greater complexity. Newton's method uses the derivative of the function to repeatedly compute tangent lines that estimate the root. For well-behaved functions like \(x^2\), this method converges very quickly and reliably. But in more complicated situations, convergence is not guaranteed. And of course, a derivative is not always easy or even possible to evaluate. Newton's method is great for specific functions like \(x^2\) (e.g. computing a square root) where the derivative is trivially easy and the function predictably well behaved, but for functions with complications such as inflection points or multiple roots, more reliable methods are needed.
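The square-root case makes a nice illustration of Newton's method. For \(f(x) = x^2 - c\), the update \(x \leftarrow x - f(x)/f'(x)\) simplifies to the classic averaging rule \(x \leftarrow (x + c/x)/2\). A minimal sketch (names and stopping criteria are my own):

```java
// Newton's method for f(x) = x^2 - c: iterate x <- x - f(x)/f'(x),
// which simplifies to x <- (x + c/x) / 2.
public class NewtonSqrt {
    static double sqrt(double c) {
        double x = c;  // crude but workable starting guess for c >= 1
        for (int i = 0; i < 50; i++) {
            double next = 0.5 * (x + c / x);
            if (Math.abs(next - x) < 1e-12) {
                return next;  // successive iterates agree; converged
            }
            x = next;
        }
        return x;
    }

    public static void main(String[] args) {
        System.out.println(sqrt(2.0));  // 1.4142135623730951
    }
}
```

Notice there is no bracket here: the method leaps toward the root from a single starting guess, which is exactly why it can wander off for less cooperative functions.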

The secant method is similar, except that instead of tangent lines, it uses secant lines and does not require the calculation of derivatives. The secant method can have very good performance but in some cases like periodic functions with multiple peaks and valleys, it can easily fail to converge. Bisection does not fall prey to such poorly behaved functions. If there is a root in the interval, bisection will find it.
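The secant step replaces the derivative with a finite-difference slope through the last two iterates. A minimal sketch under the assumption of a well-behaved function (it does not guard against a flat secant, where the denominator would vanish; names are my own):

```java
import java.util.function.DoubleUnaryOperator;

// Secant method: like Newton's method, but the tangent line is
// replaced by the secant line through the last two iterates,
// so no derivative is required.
public class SecantDemo {
    static double secant(DoubleUnaryOperator f, double x0, double x1, double tol) {
        double f0 = f.applyAsDouble(x0);
        double f1 = f.applyAsDouble(x1);
        for (int i = 0; i < 100 && Math.abs(x1 - x0) > tol; i++) {
            // Intersection of the secant line with the x-axis.
            double x2 = x1 - f1 * (x1 - x0) / (f1 - f0);
            x0 = x1; f0 = f1;
            x1 = x2; f1 = f.applyAsDouble(x2);
        }
        return x1;
    }

    public static void main(String[] args) {
        double root = secant(x -> Math.pow(2.0, x) - 5.0, 2.0, 3.0, 1e-12);
        System.out.println(root);  // approximately 2.321928094887
    }
}
```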

Another interesting root-finding method is Brent's method [3]. Brent seems to have set out to create the ultimate root-finding method, and he succeeded. His method combines inverse quadratic interpolation for speed with bisection steps for reliability, converging on a solution rapidly without the risk of being derailed by poorly behaved functions. It usually converges significantly faster than bisection. The only downside of Brent's method is that it is significantly more complicated than bisection; however, it's certainly not too complicated for a motivated math library developer.

Well-known math libraries typically offer both bisection and Brent root finders: SciPy has brentq() and bisect(), and Apache Commons Math has BisectionSolver and BrentSolver.

Conclusions

It's really neat to be able to find a very accurate solution to an equation without using algebra. This is not a new discovery, but it's definitely one worth geeking out over, and the applications of such methods are vast. Great algorithms like bisection and Brent's method have been around for a long time, and common libraries make these and other root-finding methods readily available in mainstream programming languages. What equations will you be solving?

References

  1. Korn, Granino A., and Korn, Theresa M. (2000) Mathematical Handbook for Scientists and Engineers. Dover. (Original work published 1968)
  2. Press, William H., et al. (1992) Numerical Recipes in C. Cambridge University Press.
  3. Brent, Richard P. (2002) Algorithms for Minimization Without Derivatives. Dover. (Original work published 1973)

Acknowledgements

ChatGPT was used in the preparation of this article for light proofreading and stylistic suggestions.