Author: Haotian
Watching the AI industry recently, I have noticed an increasingly "downward" shift: from the original mainstream consensus of competing on concentrated computing power and "large" models, a branch has evolved that favors small local models and edge computing.
This can be seen in Apple Intelligence covering 500 million devices, Microsoft launching the 330-million-parameter small model Mu for Windows 11, and Google DeepMind's robots operating "offline".
What is the difference? Cloud AI competes on parameter scale and training data, with spending power as its core competitiveness; local AI competes on engineering optimization and scene adaptation, and goes a step further on privacy protection, reliability, and practicality. (The hallucination problem of general-purpose flagship models seriously limits their penetration into vertical scenarios.)
This actually creates a greater opportunity for Web3 AI. In the past, when everyone competed on "generalization" capabilities (compute, data, algorithms), the field was naturally monopolized by the traditional giants. Hoping to challenge Google, AWS, OpenAI, and the like simply by applying the concept of decentralization was a pipe dream: there was no resource advantage, no technical advantage, and no user base.
But in a world of localized models plus edge computing, the situation facing blockchain-based services is very different.
When an AI model runs on a user's device, how can we prove that its output has not been tampered with? How can models collaborate while preserving privacy? These questions are exactly where blockchain technology is strong.
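To make the tamper-evidence question concrete, here is a minimal sketch of how a device might attest to a local model's output so that any later modification is detectable. Everything here is illustrative and not drawn from any real project: the key name, the record fields, and the use of an HMAC (standing in for a hardware-backed signature whose public key could be anchored on-chain) are all assumptions.

```python
import hashlib
import hmac
import json

# Hypothetical device key, imagined as provisioned in secure hardware.
DEVICE_KEY = b"device-secret-key-held-in-secure-enclave"

def attest_output(model_id: str, prompt: str, output: str) -> dict:
    """Bundle a local model's output with a tamper-evident attestation.

    The record commits to the model, the input, and the output via hashes;
    the MAC over the canonical record lets any verifier holding the key
    detect modification after the fact.
    """
    record = {
        "model_id": model_id,
        "prompt_hash": hashlib.sha256(prompt.encode()).hexdigest(),
        "output_hash": hashlib.sha256(output.encode()).hexdigest(),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["attestation"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify(record: dict) -> bool:
    """Recompute the MAC over the committed fields and compare."""
    payload = json.dumps(
        {k: v for k, v in record.items() if k != "attestation"},
        sort_keys=True,
    ).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record["attestation"], expected)

rec = attest_output("local-7b", "summarize my notes", "Here is the summary...")
assert verify(rec)           # untampered record passes
rec["output_hash"] = "0" * 64
assert not verify(rec)       # any change to the record is detected
```

A real deployment would replace the shared secret with an asymmetric signature from a secure enclave, so verification needs only a public key, but the shape of the problem, committing to input and output and making the commitment checkable by a third party, is the same.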
I have noticed some new Web3 AI projects along these lines. The data communication protocol Lattica, recently launched by @Gradient_HQ with a 10M investment from Pantera, aims to solve the data-monopoly and black-box problems of centralized AI platforms; @PublicAI_'s brain-wave device HeadCap collects real human data to build an "artificial verification layer", and has already generated 14M in revenue. In essence, they are all trying to solve the "credibility" problem of local AI.
In a word: only when AI has truly "sunk" into every device will decentralized collaboration change from a concept into a necessity.
Instead of continuing to grind it out on the generalization track, why not seriously consider how to provide infrastructure support for the localized AI wave?