Search, not AI, is still king

Search, not AI, is still the foundation of knowledge work. 

The Google search box was not just a necessary precursor to Artificial Intelligence; good search already looks a lot like it. Take the question “what is the boiling point of alcohol?”. AI can work through the physics and derive the answer, or it can just…look it up.

In a perfect world, where every question has a ready answer, you search. In the real world, some answers exist and others have to be worked out. That working out is what you let AI do.

Thinking, rather than searching, is what causes bad AI outputs (or hallucinations). A published source is usually more reliable than an answer an LLM derives from scratch. At worst, retrieval might miss a relevant document, but derivation can fabricate one.

This is clearest in serious knowledge work, where correctness is critical and answers take multiple reasoning steps. Small mistakes compound. The more “thinking” AI does, the higher the risk of hallucination.

The best way to lower this risk is to lean harder on search and lighter on derivation. Reasoning was never the problem. That’s the fun part of the job.

Yet in most AI chatbots, reasoning and search are fused into one black box that answers questions instead of helping users do sensemaking. Wouldn’t it make more sense for search and AI generation to be separate?

The search for reasoning “context” can’t look like a Google search box, where you type in an intent and then comb through results. It should look more like a loop: search, review, synthesize, search again. It is more of a recipe than a search goal. It’s search that lets you specify how to search, not what to search.
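That loop can be made concrete. Here is a minimal sketch, assuming a toy in-memory corpus and plain keyword matching: the user supplies the recipe (an ordered list of queries, i.e. *how* to search), and the loop deterministically accumulates evidence. The corpus, the matching rule, and the function names are all illustrative stand-ins, not any particular product’s API.

```python
# Illustrative corpus: doc id -> text. A real system would search an index.
CORPUS = {
    "ethanol-properties": "Ethanol boils at 78.37 C at standard pressure.",
    "methanol-properties": "Methanol boils at 64.7 C at standard pressure.",
    "distillation-notes": "Distillation separates liquids by boiling point.",
}

def search(query: str) -> list[str]:
    """Deterministic keyword retrieval: return ids of docs whose text
    contains every query term. No model involved."""
    terms = query.lower().split()
    return [doc_id for doc_id, text in CORPUS.items()
            if all(t in text.lower() for t in terms)]

def run_strategy(queries: list[str]) -> list[str]:
    """The 'recipe': the user specifies HOW to search (an ordered list of
    queries); the loop searches, reviews results, and collects evidence."""
    evidence = []
    for q in queries:                 # search
        for doc_id in search(q):      # review each hit
            if doc_id not in evidence:
                evidence.append(doc_id)  # accumulate for later synthesis
    return evidence

facts = run_strategy(["boils", "boiling point"])
```

The point of the sketch is the shape, not the matching logic: each pass through the loop is repeatable, and the evidence list, not a model’s memory, is what gets handed on for synthesis.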

Search, taken to an extreme, can look like an ETL loop that mines data for signal. Letting users define the search strategy in detail creates a much better leverage point for human-machine interaction. It keeps the fact foraging deterministic, and lets AI reasoning do what it does best. 
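A hedged sketch of that ETL framing, with invented records and an arbitrary score threshold: the foraging stages (filter, rank, format) are plain deterministic code, and their output is a context block that a separate AI reasoning step would consume. Only the synthesis is left to the model; the facts themselves never pass through one.

```python
# Illustrative retrieved records; "score" stands in for any signal metric.
RECORDS = [
    {"source": "paper-a", "text": "Ethanol boiling point: 78.37 C", "score": 0.92},
    {"source": "forum-post", "text": "I think it's around 80 C?", "score": 0.31},
    {"source": "handbook", "text": "Ethanol bp 78.4 C at 1 atm", "score": 0.88},
]

def forage(records, min_score=0.5):
    """Deterministic ETL: filter out low-signal records, rank the rest,
    and load them into a context block. Same input, same output, every run."""
    kept = [r for r in records if r["score"] >= min_score]   # transform: filter
    kept.sort(key=lambda r: r["score"], reverse=True)        # transform: rank
    return "\n".join(f"[{r['source']}] {r['text']}" for r in kept)  # load

context = forage(RECORDS)
# An AI reasoning step would receive `context` and only synthesize from it,
# rather than deriving (or fabricating) the underlying facts.
```

Because every stage before the model is deterministic, a bad answer can be debugged the way a bad pipeline is: inspect the context block, adjust the recipe, rerun.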

Bigger models help, but they’re not the whole story. Reliability in AI will come from boring UX changes like these, which make everything more deterministic.