Keeping up via known unknowns

Happy Monday!

There's never a shortage of new things we have to learn about. New model, new framework, new benchmark, new tool. It's a delicate balance. If we never bother to keep up, we'll get left behind. If all we do is try to keep up, we'll never get anything done.

Here's something I've stumbled on that works for me. It's based on Donald Rumsfeld's much-ridiculed classification of things we know (or don't know).

What we (don't) know

  • Known knowns: That's just the stuff you know.

  • Known unknowns: Stuff you don't know. But at least you know that you don't know.

  • Unknown unknowns: Stuff you don't know. And you don't even know that you don't know.

With unknown unknowns, you don't even know what the question is and that you could be asking it.

Making the unknown known

Loosely keeping up with recent developments means turning the unknown unknowns into known unknowns. The first step here is just to pay attention and keep your eyes open. You're most likely doing that already! At this level, the question is relatively shallow: "What even is XYZ?" What might prevent us from digging further, though, is the sense that we'd have to spend a lot of time to gain a deep understanding and that this isn't sustainable given the number of new things to explore.

But what if all we do is push a little bit deeper so that our questions are a bit more nuanced? This can easily be achieved by a bit of browsing and skimming:

  • For a new tool, idly browse the documentation.

  • For a framework or package, skim quickly through the tutorial.

  • For a research paper, skim the abstract and call it a day.

It's enough to come out of this experience with many new things you don't know. But at least you'll know that! It'll give your mind something to latch onto and, in the future, notice when it becomes relevant to what you're doing. Then, when you have validation that diving deeper will be helpful, you can spend the time and feel good about it.
