Physical AI: What It Is, What's Been Built, and Five Startups That Will Define It
AI is leaving the screen. A $383 billion market is forming around machines that perceive, reason, and act in the physical world.
- Physical AI — machines that perceive, reason, and act in the physical world — is a $383 billion market in 2026, projected to reach $3.25 trillion by 2040.
- The first wave of consumer physical AI (Humane AI Pin, Rabbit R1, Limitless Pendant) failed — but industrial physical AI is succeeding in factories, farms, and warehouses.
- Robert Brunner, who founded Apple's design group, argues the defining variable is trust: the next great companies will be trusted with lives, not just data.
- Five startups to watch: Figure AI, Haply Robotics, Physical Intelligence, Omi, and WIRobotics.
What is physical AI?
Physical AI is the category term for machines that perceive, reason, and act in the physical world — not through a screen, but through sensors, actuators, and bodies that interact with real environments. NVIDIA CEO Jensen Huang declared it the "next frontier" at CES 2026. Bessemer Venture Partners mapped it across 50 startups. A 2026 market research report values the market at $383 billion today, projecting $3.25 trillion by 2040.
The distinction that matters: physical AI is not a chatbot running on a device. It is intelligence embedded in hardware that takes actions in the world — moving objects, navigating spaces, monitoring environments, augmenting human bodies. The hardware is not the interface to the AI. The hardware is the AI.
This is a fundamentally different product design challenge from software AI. Robert Brunner, who founded Apple's Industrial Design Group and hired Jony Ive, explained on the Product Impact Podcast why this distinction determines which products survive:
"Human beings have this unique relationship with objects. We'll use physical artifacts to define who we are — through the car we drive, the shoes we wear, the furniture we buy. People develop this emotional connection to things they can't literally speak to."
When you put intelligence inside an object, you are inserting yourself into one of the deepest emotional relationships humans have with the made world. A chat interface is something you use. A wearable device is something you live with. The trust standard is dramatically higher.
What has been built — and why the first wave failed
The first wave of consumer physical AI has been brutal.
Humane's AI Pin was effectively dead by February 2025 after burning through $230 million in venture capital. Rabbit R1 followed a similar trajectory. Meta acquired Limitless in December 2025 and immediately stopped selling the Pendant.
Brunner, whose studio Ammunition designed the Limitless Pendant, was unusually direct about what went wrong:
"The fundamental issue is nobody wants to be recorded. Nobody. Even in meetings. And knowing that you're being recorded — even though it's got a little light that tells you that it's on — you're still like, okay, how is this information being used against me?"
The form factor was right. The trust model was wrong. No amount of design polish could make an always-on recording device comfortable.
The industrial wave tells a different story. Boston Dynamics' Atlas is entering Hyundai factories. Agility Robotics' Digit is deployed at Toyota's Canadian manufacturing plants. Chinese firms Unitree and Agibot account for 85-90% of global humanoid shipments. Warehouse robots, surgical systems, agricultural drones, and industrial exoskeletons are shipping in volume — not as demos, but as production tools.
The pattern: physical AI succeeds in constrained environments with specific tasks and measurable outcomes. It fails in open-ended consumer contexts where the device needs to earn trust before it can be useful.
What new use cases will physical AI enable?
The market is organizing around a three-wave adoption framework:
Wave 1 (now): Industrial automation. Factories, warehouses, farms, hospitals. AWS, NVIDIA, and MassRobotics named 9 startups for a 2026 Physical AI Fellowship targeting these sectors. Use cases: autonomous material transport, precision agriculture (laser weeding without chemicals), warehouse picking and sorting, surgical assistance. The ROI is measurable and the environments are bounded.
Wave 2 (2027-2028): Semi-structured environments. Assisted living, retail, construction. More variability but still physically constrained. Use cases: shelf restocking robots (Telexistence), wearable exoskeletons that reduce worker injury by 15% (WIRobotics), real-time environmental monitoring in construction. The trust context expands but remains task-specific.
Wave 3 (2029+): Consumer and personal. Wearables, home robots, personal AI devices. This is where OpenAI's device (designed with Jony Ive, delayed beyond 2026), Apple's AI glasses, and Meta's Ray-Ban Display are aimed. This wave requires the trust breakthrough Brunner describes — and that no consumer product has yet achieved.
What needs to change about AI models for physical AI?
Current foundation models are trained on text and images. Physical AI requires models that understand physics. Three shifts are underway:
Simulation-to-reality transfer. NVIDIA's Cosmos platform generates synthetic training data by simulating real-world environments. You can't train a robot on a million physical interactions the way you train an LLM on a million documents. You simulate them. This is why NVIDIA — a chip company — is becoming a physical AI infrastructure company.
Multimodal sensor fusion. Physical AI models must process vision, lidar, force feedback, audio, and proprioception simultaneously and in real time. The latency tolerance is milliseconds, not seconds. Current vision-language models don't meet this bar for safety-critical applications. This is an active research frontier at Google DeepMind, Anthropic, and specialized labs like Physical Intelligence.
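The alignment problem behind sensor fusion can be made concrete with a toy sketch. This is a hypothetical illustration, not code from any of the labs named above: the sensor names, the 5 ms window, and the `fuse` function are all assumptions for the example. The core idea is that a fused frame is only safe to act on if every stream's reading falls within a tight shared time window.

```python
from dataclasses import dataclass

# Assumed alignment tolerance; real systems tune this per sensor and task.
ALIGN_WINDOW_MS = 5.0

@dataclass
class Reading:
    sensor: str
    timestamp_ms: float
    value: float

def fuse(readings: list[Reading], window_ms: float = ALIGN_WINDOW_MS):
    """Return a fused frame if all readings fall within the window, else None."""
    times = [r.timestamp_ms for r in readings]
    if max(times) - min(times) > window_ms:
        # Streams drifted apart: drop the frame rather than act on stale data.
        return None
    return {r.sensor: r.value for r in readings}

frame = fuse([
    Reading("camera", 1000.0, 0.42),
    Reading("lidar",  1002.5, 3.10),
    Reading("force",  1001.0, 0.08),
])
```

Here the three readings span 2.5 ms, inside the window, so a fused frame comes back; widen that spread past the tolerance and the frame is discarded — the "milliseconds, not seconds" constraint expressed as a drop policy.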
Edge deployment. Physical AI can't depend on cloud inference. A robot arm can't wait 200ms for a response while holding a component. Models must run on-device. This is driving the edge AI chip market and favoring smaller, specialized models over frontier-scale ones — a structural advantage for companies building purpose-built models rather than adapting general-purpose ones.
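The edge argument reduces to simple arithmetic. The sketch below uses illustrative numbers (the 100 Hz loop and the inference times are assumptions, with only the ~200 ms round-trip taken from the paragraph above) to show why a cloud hop blows a robot control loop's budget while a smaller on-device model fits:

```python
# Illustrative latency budget for a robot control loop.
CONTROL_LOOP_BUDGET_MS = 10.0  # e.g. a 100 Hz control loop

def fits_budget(inference_ms: float, network_rtt_ms: float = 0.0) -> bool:
    """True if inference plus any network round-trip fits the loop budget."""
    return inference_ms + network_rtt_ms <= CONTROL_LOOP_BUDGET_MS

# Cloud inference: even a fast model loses to a ~200 ms round-trip.
cloud_ok = fits_budget(inference_ms=5.0, network_rtt_ms=200.0)   # False

# On-device inference: a smaller, specialized model with no network hop.
edge_ok = fits_budget(inference_ms=8.0)                          # True
```

This is the structural advantage the paragraph describes: once the network term appears in the sum, no amount of model speed recovers the budget, which is why purpose-built on-device models win in control loops.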
Five startups to watch
Figure AI — General-purpose humanoid robots for warehouse and manufacturing. Raised $675 million in 2024 with commercial pilots at BMW. The bet: the humanoid form factor is the right abstraction because factories were designed for human-shaped workers. Figure represents Wave 1 at scale.
Haply Robotics — Haptic interfaces for human-robot interaction. Selected for the 2026 AWS/NVIDIA Physical AI Fellowship. Physical AI systems that humans need to guide or correct require tactile feedback, not just visual interfaces. Haptics is the trust layer between human and machine — the physical equivalent of the graduated autonomy pattern we've documented in software agents.
Physical Intelligence — Foundation models for physical interaction — models that understand how objects behave when pushed, pulled, stacked, and manipulated. If Claude and ChatGPT are foundation models for text, Physical Intelligence is building the equivalent for the physical world.
Omi — An $89 wearable AI assistant worn on the temple for real-time conversation summarization. Where Limitless failed at $99 with always-on recording, Omi tests whether a bounded use case (meeting summarization, not life logging) and a lower price point can cross the consumer trust barrier. This is the Brunner test in action: does it remove steps, or does it add complexity?
WIRobotics — Wearable robotic devices that reduce physical strain in industrial and healthcare settings. Selected for the 2026 Physical AI Fellowship. WIRobotics may reach mass adoption first because the value is immediate (reduced injury, reduced fatigue) and the trust requirement is lower — the device assists your body rather than replacing your judgment.
How to evaluate physical AI products
Brunner's framework, articulated on the Product Impact Podcast, provides the clearest test for physical AI product teams:
"Does AI remove steps? Will the product require fewer actions to accomplish something meaningful — or more? If it adds menus and features and prompts, it's probably not good. But if AI quietly removes complexity and lets you do something faster, better, it's real."
This is also the lens PH1 Research applies when evaluating AI product impact: the measure of success is not what the technology can do, but what it removes from the user's experience. AI Value Acceleration works with enterprises facing this exact question — where is the physical AI deployment creating measurable value, and where is it adding complexity that wasn't there before?
The signal I'm tracking in my ongoing research into AI value in enterprise deployments: the physical AI companies that will survive are not the ones with the most capable hardware. They are the ones that Brunner describes — the ones whose customers trust the device enough to live with it. That trust gap is where the $383 billion market will be won or lost.
Listen: Product Impact Podcast S02E06 — Robert Brunner on Physical AI
Related:
- The Man Who Hired Jony Ive Has a Warning for the Physical AI Boom
- The Year AI Leaves the Text Box
- Robert Brunner — Person page
Sources:
- Physical AI Market: $383B in 2026, $3.25T by 2040 (GlobeNewsWire)
- 50 Startups Transforming Industries with Physical AI (Bessemer)
- CES 2026: Rise of Physical AI (SVE)
- AWS/NVIDIA Physical AI Fellowship 2026 (PYMNTS)
- Physical AI Is Here: Leaving the Lab (HumAI)
- 2026 Outlook: When AI Gets Physical (Institutional Investor)
- NVIDIA Cosmos / National Robotics Week 2026
- TechCrunch: Meta acquires Limitless
- Ammunition Group — Robert Brunner
Hosted by Arpy Dragffy and Brittany Hobbs. Arpy runs PH1 Research, a product adoption research firm, and leads AI Value Acceleration, enterprise AI consulting.