In the Agent Era, Who Is Your Infra Built For?

MiniMax surpassing Baidu in valuation is inevitable—Baidu doesn't even have a decent API. Traffic is shifting from humans to Agents. Infrastructure not built for Agents will eventually be used by neither humans nor machines.

Jiawei Guan · 4 min read

MiniMax has surpassed Baidu in market cap. I think this was inevitable.

Baidu has done something devastating to the entire industry: it has blocked the flow of knowledge. From both a personal and an industry perspective, this company is a massive stumbling block.

What Search Engines Should Do

Search engines don't produce knowledge. People write things on the internet; search engines send crawlers to gather this content, aggregate it, and make it easy for others to find. It's simply a pipe.

In the PC era, this pipe worked reasonably well. People opened browsers, typed keywords, and viewed web pages. During that phase, Baidu did indeed monopolize Chinese-language search.

Baidu Fell Behind in the Mobile Era

In the mobile internet era, users were no longer just opening web pages. Countless apps needed information and search capabilities, but these apps weren't humans reading web pages—they were programs calling APIs. Essentially, the center of gravity shifted from humans to devices and apps. What you needed were APIs that allowed machines to easily access your services.

Look at what Google did. Google is also a search engine, and it also bore the cost of aggregating knowledge. But it gives you a free tier and charges by usage beyond that. Any developer can integrate search capabilities into their own app on demand.
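To make the contrast concrete, here is a minimal sketch of what "on demand" looks like from a developer's seat, using Google's Custom Search JSON API. The API key and engine ID are placeholders you'd obtain from your own Google Cloud project; error handling is omitted:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def build_search_url(query: str, api_key: str, engine_id: str, num: int = 5) -> str:
    """Build a request URL for Google's Custom Search JSON API."""
    params = urlencode({"key": api_key, "cx": engine_id, "q": query, "num": num})
    return f"{API_ENDPOINT}?{params}"

def search(query: str, api_key: str, engine_id: str) -> list[dict]:
    """Fetch results; each item carries 'title', 'link', and 'snippet' fields."""
    with urlopen(build_search_url(query, api_key, engine_id)) as resp:
        data = json.load(resp)
    return data.get("items", [])
```

That's the whole integration: one HTTPS call, structured JSON back. The point isn't this particular API, it's that an equivalent few lines against Baidu simply don't exist.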

Baidu? It doesn't even have a decent API.

How absurd is this? I previously worked on a small news bot project and wanted to use Chinese search. After trying everything, I found that using Google to search Chinese content was more convenient than using Baidu. Even using Russia's Yandex to search for Chinese information worked better than Baidu. The hardest way to get Chinese information while physically in China turned out to be using China's largest search engine. I still find this bizarre.

You might say, if they won't open APIs, can't I just simulate a human browsing web pages? That doesn't work either. Their anti-scraping measures are airtight—they block you after just two or three attempts. Eventually, every developer arrives at the same choice: forget it, use Bing, use Google, use whatever—just don't touch Baidu.

Precisely because Baidu isn't open, oddball search services like Sogou and 360 Search have managed to survive. Compare them to Google's API pricing, and these alternatives are actually more expensive. The ecosystem is distorted.

Everyone criticizes Baidu's auction ads, but I think the real pain isn't felt by consumer users; it's felt by developers. You can tolerate ads, but having no interface completely blocks your path.

The Agent Era Is Here, and the Path Will Only Get Narrower

Now that the Agent era has arrived, this problem will only get worse.

Our team already operates with Agents at the core of daily work. We write code with Claude Code, search for information with various research agents, and use OpenClaw to operate computers. Fewer and fewer people are directly operating apps or opening web pages themselves.

Humans will only get lazier. When you want to know something, you won't search for it yourself—you'll tell your Agent: "Look this up for me." Intelligence is getting cheaper and cheaper, and you'll take it for granted. Whether on your phone or computer, your primary interaction will be speaking or typing to make it do work for you.

So how will traffic composition change? I'll make a rough estimate: it used to be roughly 50% human traffic and 50% machine traffic. Going forward, it will become 50% machine, 40% Agent, and 10% human.

In this environment, if your infrastructure isn't Agent-friendly—no APIs, no MCP, no interfaces that help Agents understand and use your services—where will your traffic come from?

Baidu wasn't open during the machine era. In the Agent era, it has no MCP or anything else that makes it easy for Agents to call. Machines can't use it, and Agents can't use it either. So who will?

Business Models Need Rethinking

I came across something quite interesting. There was an open-source project whose business model relied on embedding soft ads in the code; you paid to upgrade the service. Later, AI coding agents became popular, and the project's page views exploded—massive, massive traffic—but revenue dropped to zero. Why? When Agents grabbed the code, they directly filtered out the ads. There was no need to tell their human masters about the promotional content.

The business model became instantly invalid.

But you can't just close the door because of this; once the door is closed, you lose even the traffic. So what do you do? Set thresholds. Give users a free tier, charge when they exceed it, and bill by volume if usage gets excessive. It's not complicated.
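The threshold model described above really isn't complicated. Here is a sketch of the billing logic; the quota sizes and prices are made-up placeholders, and a real system would meter calls per API key:

```python
FREE_QUOTA = 1_000         # calls per month included at no charge (placeholder)
UNIT_PRICE = 0.002         # price per call beyond the free tier (placeholder)
BULK_THRESHOLD = 100_000   # above this, a discounted bulk rate applies
BULK_PRICE = 0.001         # discounted per-call rate for heavy users

def monthly_bill(calls: int) -> float:
    """Free tier, then per-call billing, then a bulk rate for heavy usage."""
    if calls <= FREE_QUOTA:
        return 0.0
    if calls <= BULK_THRESHOLD:
        return (calls - FREE_QUOTA) * UNIT_PRICE
    # Calls between the free quota and the bulk threshold at the normal rate,
    # everything above the bulk threshold at the discounted rate.
    normal_tier = (BULK_THRESHOLD - FREE_QUOTA) * UNIT_PRICE
    return normal_tier + (calls - BULK_THRESHOLD) * BULK_PRICE
```

Hobbyists and small Agents ride the free tier and keep your traffic flowing; the heavy users who extract the most value are the ones who pay.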

At the end of the day, Agents will act as proxies for humans, participating more and more in activities across the digital world. They need new infrastructure.

Separate Cars from Pedestrians; Separate Humans from Machines Too

I think a better architecture is "human-machine separation." Just as we separate cars from pedestrians, the three types of traffic—humans, traditional machines, and AI Agents—should be processed separately, each with its own dedicated channel.

There are many benefits to separation. Efficiency is one aspect, but security, auditability, and monitoring also become much easier. Interfaces for humans should prioritize experience; interfaces for Agents should prioritize structure and callability. Each goes its own way.
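A toy sketch of what the entry point to such a three-lane system might look like. The header names and classification rules here are my illustration, not an established standard; a real deployment would rely on authentication and behavioral signals rather than self-declared headers:

```python
def classify_traffic(headers: dict[str, str]) -> str:
    """Route a request to one of three lanes: 'human', 'machine', or 'agent'."""
    if headers.get("X-Agent-Protocol"):     # e.g. an MCP-style handshake header
        return "agent"
    user_agent = headers.get("User-Agent", "").lower()
    if "bot" in user_agent or headers.get("Authorization", "").startswith("Bearer "):
        return "machine"                    # classic API / crawler traffic
    return "human"                          # default: browser traffic

# Each lane gets its own priorities, as described above.
LANES = {
    "human":   "rendered pages, optimized for experience",
    "machine": "versioned JSON API, rate-limited per key",
    "agent":   "structured tool descriptions, usage-based billing",
}
```

Once traffic is split this way, per-lane rate limits, audit logs, and pricing all become straightforward to bolt on.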

There's enormous room for innovation here.

Start Thinking Now

If you want to start a business, or participate in this wave of building, ask yourself one question: how do you design infrastructure for AI Agents? This kind of thing is currently extremely scarce, but demand is growing rapidly.

There are already some projects doing this overseas, such as various skills and tools to help Agents do research better. Recently I've been using something called MiroThinker, recommended by a colleague. It's essentially a research agent. You give it an idea, and it browses a large number of web pages, gathers information, and summarizes viewpoints on its own. The results are often better than some paid products. Moreover, it automatically adjusts the depth of research based on the complexity of your question, deciding how long to spend searching.

Most of my searches now come from Claude Code and research agents like this. Maybe only 5% of the time—when it's so simple I can just search casually—do I use a traditional search engine. Baidu? I barely use it anymore.

Think Value First, Competition Second

I think before thinking about competition, you should think about value. Are you actually creating value?

Baidu does search, which should have helped humanity better aggregate and share knowledge. But in the mobile era it blocked interfaces, and in the Agent era it still hasn't made any preparations for new usage patterns. This isn't called creating value; it's called destroying value.

If you are destroying value, the market will become increasingly cautious in its valuation of you. This isn't a competitiveness problem; it's a direction problem. Who exactly is your Infra built for? Figure that out before talking about anything else.
