Exponential Shrinking

In computer science, we classify problems based on the relationship between input size and program runtime.

  • Maybe doubling the input size doubles the runtime. That's a linear problem and considered easy.

  • But maybe adding just a single item to the input already doubles the runtime. That's an exponential problem and considered hard.

A quick example

Just so we have something concrete to talk about, imagine a problem where the input is a bunch of items, and you need to find the best combination of items (with some measure of what "good" and "best" mean).

  • For one item, we either pick it or we don't, so that's two choices to evaluate.

  • For two items, we can pick neither, both, one, or the other for four choices.

  • Each additional item doubles the number of choices.

  • For 10 items, we're already at 1024 choices.

  • For 20 items, we're at over a million.

  • For 100 items, we're at a number with more than thirty digits.
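The doubling above is easy to verify by brute force. Here's a minimal sketch (the function name `all_subsets` is just for illustration) that enumerates every possible combination of n items and confirms the count is 2 to the power of n:

```python
from itertools import combinations

def all_subsets(items):
    """Enumerate every possible combination: each item is either in or out."""
    items = list(items)
    subsets = []
    for k in range(len(items) + 1):
        subsets.extend(combinations(items, k))
    return subsets

for n in [1, 2, 10, 20]:
    print(n, len(all_subsets(range(n))))  # 2, 4, 1024, 1048576
```

Note that the loop stops at 20 items: enumerating the subsets of 100 items would never finish, which is exactly the point.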

Look at the reverse!

The standard way of looking at exponential problems is pessimistic: Woe to us! Any little addition to the input makes the problem so much harder! How can we possibly cope?

But look at it the other way round: With exponential problems, each time you take an item off the table, you cut the problem in half. In other words, it's the complex problems where simplifying things gives you the most bang for your buck. Whereas with a linear problem, you only get a 1:1 return on your simplification efforts, with exponential problems, you get an exponential return.
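The arithmetic behind that claim is worth spelling out. For the combination problem above, the search space has size 2 to the power of n, so a quick sketch shows how much each removed item buys you:

```python
def search_space(n_items):
    # Each item is independently in or out, so n items give 2**n combinations.
    return 2 ** n_items

full = search_space(100)
print(search_space(99) / full)  # removing 1 item halves the space: 0.5
print(search_space(90) / full)  # removing 10 items shrinks it 1024x
```

Compare that to a linear problem, where dropping 10% of the input saves you exactly 10% of the work.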

Concrete Applications

  • Code Review – Performing ten reviews of a hundred lines of code each is drastically easier than performing one review of a thousand lines of code.

  • Machine Learning – Reducing a dataset by filtering out low-impact features can make training a machine learning model exponentially faster.

  • Team Size – Splitting a large team into smaller, autonomous units cuts coordination overhead dramatically.

  • Company Growth – Empowering smaller decision-making groups prevents bureaucracy from scaling out of control.

  • Processes – Removing a single key bottleneck can halve the complexity of an entire workflow.

  • Product Focus – Trimming unnecessary offerings simplifies operations and reduces decision fatigue for customers.

I'd love to hear if you've seen this principle in action in your line of work! Reply to this email and let me know.
