
‘Lobster Father’ Says Human Internet Is Unfriendly to Agents—Startup Builds Agent‑Friendly Infra

The article examines why current web infrastructure, designed for human users, hampers AI agents, citing low task success rates and wasted tokens. It then details AgentEarth's approach: an agent-centric internet layer built on proprietary protocols, a unified gateway, and a first-party tool store that together improve reliability, speed, and cost efficiency.

Machine Heart

In March last year, AI pioneer Andrej Karpathy tweeted that while most content is still written for humans, AI would soon become its primary consumer, and urged developers to make their documentation AI-friendly. Many dismissed this as premature, yet within a year the rise of "Lobster" agents has made human-centric web tooling feel like a horse-drawn carriage on a stone road.

Agents now face a hostile internet: verification walls, login hurdles, scarce CLI/API access, and mounting token costs. The success rate of a single external tool call is only about 60%, and multi-step workflows drop below 30%. Peter Steinberger, dubbed the "father of Lobster," has voiced the same frustration, calling current internet infrastructure extremely unfriendly to agents.
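The compounding effect behind these figures is simple probability. If each external tool call succeeds independently about 60% of the time, a workflow of n sequential calls succeeds at roughly 0.6^n, which falls below 30% by the third step. A quick sketch (the 60% figure comes from the article's cited estimate, and the independence of calls is an assumption for illustration):

```python
def workflow_success_rate(p: float, steps: int) -> float:
    """Probability that all `steps` sequential tool calls succeed,
    assuming each call succeeds independently with probability p."""
    return p ** steps

p = 0.60  # ~60% per-call success rate cited in the article
for n in (1, 2, 3, 5):
    print(f"{n}-step workflow: {workflow_success_rate(p, n):.1%}")
# 1 step -> 60.0%, 2 steps -> 36.0%, 3 steps -> 21.6%, 5 steps -> 7.8%
```

Even a 3-step workflow lands at about 22%, consistent with the "below 30%" claim, and reliability decays exponentially from there.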

To address this, former cloud-operations executive Liu Hongtao founded AgentEarth with CTO Dan Minghui (formerly of Didi's real-time matching system) and chief scientist Prof. Xue, an expert in low-level network protocols. Their goal is not another agent tool but foundational infrastructure: a high-speed, reliable logistics layer for agents, paired with a curated "store" where agents can instantly invoke vetted, high-quality tools, saving both time and token spend.

The solution is organized into three technical layers. The top layer is a first-party tool store: initially it offers only internally vetted tools, ensuring early quality much like a self-operated e-commerce platform, before gradually opening to third-party tools ranked by a large-model-driven recommendation engine. The middle layer is a unified gateway that shifts responsibility for tool quality from the agent to the platform, handling tool selection, failover, and transparent billing so agents no longer burn tokens on trial and error. The bottom layer is a proprietary scheduling protocol integrating transmission, storage, and compute, which the company claims runs 2-10× faster than Google's QUIC in real-world tests, with the largest gains on transfers of generated media.
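To make the middle layer concrete, here is a minimal sketch of what gateway-side failover and billing could look like. Everything here is a hypothetical illustration under assumptions from the article's description, not AgentEarth's actual API; the names `ToolProvider`, `UnifiedGateway`, and `call_with_failover` are invented for this example.

```python
import random
from dataclasses import dataclass, field

@dataclass
class ToolProvider:
    """A hypothetical registered tool backend with a reliability score
    that the gateway tracks from historical call outcomes."""
    name: str
    reliability: float

    def call(self, request: str) -> str:
        # Stand-in for a real network call: fails with probability 1 - reliability.
        if random.random() < self.reliability:
            return f"{self.name} handled: {request}"
        raise RuntimeError(f"{self.name} failed")

@dataclass
class UnifiedGateway:
    """Illustrative gateway: ranks providers, fails over on error,
    and bills only the call that actually succeeded."""
    providers: list
    billing_log: list = field(default_factory=list)

    def call_with_failover(self, request: str) -> str:
        # Try the historically most reliable provider first, falling back
        # on failure, so the agent itself never spends tokens retrying.
        ranked = sorted(self.providers, key=lambda p: p.reliability, reverse=True)
        for provider in ranked:
            try:
                result = provider.call(request)
                # Transparent billing: record exactly which provider was charged.
                self.billing_log.append((provider.name, request))
                return result
            except RuntimeError:
                continue  # failover is invisible to the calling agent
        raise RuntimeError("all providers failed")
```

Usage: an agent issues one logical call, e.g. `gateway.call_with_failover("resize image")`, and the gateway absorbs provider failures internally. The design point is the responsibility shift the article describes: retry logic, provider ranking, and cost accounting live in the platform, not in each agent's prompt loop.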

Developing such a protocol is a long-term effort; Liu likens it to breeding a new species, where each stage must mature before the next can begin and the necessary knowledge of real network behavior accumulates only over years. He notes that the protocol's roots trace back to earlier TCP/IP optimization work, which now forms the company's core moat.

Beyond technical details, the article highlights broader market dynamics: while Cloudflare’s “Markdown for Agents” and Google’s WebMCP hint at early interest, the Agent Internet Infra space remains nascent with few dedicated service providers. Treating agents as primary end‑users shifts infrastructure goals from human experience to task‑completion efficiency, demanding high reliability, low latency, and transparent cost accounting.

Finally, the piece argues that Agent Internet Infra could rewrite the growth model of the internet: a single entity can deploy thousands of agents that run continuously, suggesting the traffic and value ceiling is currently undefined and may spawn the next generation of large‑scale infrastructure companies.

Tags: AI agents, protocol optimization, Internet infrastructure, agent-friendly, AgentEarth
Written by Machine Heart

Professional AI media and industry service platform
