Earlier this year Courtland was reading Rainbows End, Vernor Vinge’s seminal augmented reality novel, when he came across the term “Local Honcho”¹:

We simply put our own agent nearby, in a well-planned position with essentially zero latencies. What the Americans call a Local Honcho.

The near future Vinge constructs is one of outrageous data abundance, where every experience is riddled with information and overlaid realities, and each person must maintain multiple identities against this data and relative to those contexts.

It’s such an intense landscape that the entire educational system has undergone wholesale renovation to address the new normal, and older people must routinely return to school to learn the latest skills. It also complicates economic life, resulting in intricate networks of nested agents that can be hard for any one individual to tease apart.

Highlighting this, a major narrative arc in the novel involves intelligence agencies running operations of pretty unfathomable global sophistication. Since (in the world of the novel) artificial intelligence has more or less failed as a research direction, this requires ultra-competent human operators able to parse and leverage high-velocity information. For field operations, it requires a “Local Honcho” on the ground to act as an adaptable central nervous system for the mission and its agents:

Altogether it was not as secure as Vaz’s milnet, but it would suffice for most regions of the contingency tree. Alfred tweaked the box, and now he was getting Parker’s video direct. At last, he was truly a Local Honcho.

For months before that, Plastic had been deep in the weeds around harvesting, retrieving, and leveraging user context with LLMs: first to enhance the UX of our AI tutor (Bloom), then to solve this horizontally for all vertical-specific AI applications. It struck us that we faced similar challenges to the characters in Rainbows End and were converging on a similar solution.

As you interface with the entire constellation of AI applications, you shouldn’t have to redundantly provide context and oversight for every interaction. You need a single source of truth that can do this for you. You need a Local Honcho.

But as we’ve discovered, LLMs are remarkably good at theory-of-mind tasks, and thus at reasoning about user needs. So unlike in the book, this administration can be offloaded to an AI. And your Honcho can orchestrate the relevant context and identities on your behalf, whatever the operation.
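To make the idea concrete, here is a deliberately tiny sketch in plain Python of what a user-owned context layer looks like: applications hand their prompts to it, and it injects what it already knows about you before anything reaches a model. This is an illustration of the concept only, not Honcho’s actual interface; every class and method name here is hypothetical.

```python
from dataclasses import dataclass, field


@dataclass
class LocalHoncho:
    """Toy illustration of a single, user-owned store of context.

    The user states facts and preferences once; the honcho layer then
    composes them into whatever prompt a downstream application sends,
    so no app needs to re-collect that context on its own.
    """
    facts: list[str] = field(default_factory=list)

    def observe(self, fact: str) -> None:
        # Context accumulates in one central place instead of being
        # re-entered in every application.
        self.facts.append(fact)

    def contextualize(self, app_prompt: str) -> str:
        # Any vertical AI app hands its prompt to the honcho, which
        # prepends what it knows about the user before the LLM call.
        context = "\n".join(f"- {fact}" for fact in self.facts)
        return (
            f"What you know about this user:\n{context}\n\n"
            f"Task:\n{app_prompt}"
        )


honcho = LocalHoncho()
honcho.observe("Prefers worked examples over abstract definitions")
honcho.observe("Is currently studying linear algebra")

# A tutoring app and a scheduling app both draw on the same shared context.
print(honcho.contextualize("Explain eigenvectors."))
```

In practice the interesting work lives inside that `contextualize` step, where an LLM's theory-of-mind reasoning decides which pieces of context and which identity are actually relevant to the request; the sketch above just hard-codes the simplest possible version.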

Footnotes

  1. “American English, from Japanese 班長 (hanchō, “squad leader”)…probably entered English during World War II: many apocryphal stories describe American soldiers hearing Japanese prisoners-of-war refer to their lieutenants as hanchō” (Wiktionary)