The Neuron — with 675,000+ subscribers — just launched a separate robotics newsletter for dedicated coverage of physical AI, factory automation, and humanoid robots. The move signals that robotics has grown from an AI subcategory into its own media vertical.
Why Robotics Split Off
The Neuron’s editorial team split off robotics because those stories consistently drew the highest engagement in the main newsletter, yet covering them adequately required more depth than a general AI newsletter could provide. Physical AI spans hardware manufacturing, regulatory compliance, mechanical engineering, and industrial deployment: domains that don’t fit naturally alongside LLM benchmarks and prompt-engineering tips.
The split mirrors what happened with AI itself: five years ago, AI news was a subsection of tech coverage. Now every major outlet has dedicated AI reporters and sections. Robotics is following the same trajectory.
The Companies Driving Physical AI in 2026
The robotics newsletter will cover an ecosystem that has grown dramatically:
- Boston Dynamics + DeepMind: Google’s acquisition of Boston Dynamics paired DeepMind’s software stack with Boston Dynamics hardware, an integration that’s accelerating humanoid robot capabilities
- Tesla Optimus: Elon Musk’s humanoid robot program is now deploying units in Tesla factories and has announced external sales targets for 2027
- Figure AI: Raised $675 million at a $2.6 billion valuation to build general-purpose humanoid robots, with BMW and Amazon as early deployment partners
- Hyundai Robotics: Leveraging Boston Dynamics technology for commercial and industrial applications
- Agility Robotics (Digit): Deploying bipedal robots in Amazon warehouses
Investment Numbers
Physical AI investment has accelerated:
- 2024: $4.2 billion in robotics/physical AI venture funding
- 2025: $9.8 billion — a 133% increase
- 2026 (Q1): $4.1 billion — on pace for $16+ billion annually
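The growth rate and run-rate claims above follow directly from the listed figures; a throwaway arithmetic check (not part of any published methodology) confirms them:

```python
# Venture funding figures from the list above, in billions of USD.
funding_2024 = 4.2
funding_2025 = 9.8
funding_2026_q1 = 4.1

# Year-over-year growth: (9.8 - 4.2) / 4.2 ≈ 1.33, i.e. a ~133% increase.
yoy_growth = (funding_2025 - funding_2024) / funding_2024
print(f"2024 to 2025 growth: {yoy_growth:.0%}")

# Naive annualization of Q1 2026: 4.1 x 4 = 16.4, the "$16+ billion" pace.
annualized_2026 = funding_2026_q1 * 4
print(f"2026 run rate: ${annualized_2026:.1f}B")
```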
The funding trajectory suggests investors believe the software AI playbook (rapid capability improvement → commercial deployment → market dominance) applies to physical AI with a 2-3 year lag.
What “Physical AI” Actually Means
The term distinguishes robots that use AI for perception, planning, and adaptation from traditional industrial robots that follow fixed programs. A welding robot on a car assembly line follows exact coordinates. A physical AI robot sees a messy warehouse shelf, identifies objects, plans how to pick and place them, and adapts when something unexpected happens.
The enabling technology is the same foundation models powering chatbots — vision-language models and reinforcement learning — applied to physical manipulation tasks. Gig workers are now filming household tasks to create the training data these systems need.
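The fixed-program versus adaptive distinction can be sketched as a minimal sense-plan-act loop. Everything here (the object names, the stubbed `perceive`/`plan`/`act` functions, the grasp-failure rate) is hypothetical illustration, not any vendor’s actual control stack:

```python
import random

random.seed(7)  # deterministic run for the example

def perceive(shelf):
    """Stand-in for a vision model: report the objects currently visible."""
    return sorted(shelf)

def plan(objects, target):
    """Stand-in for a planner: choose an action for the target object."""
    return ("pick", target) if target in objects else ("search", target)

def act(shelf, action):
    """Attempt the action; grasps can fail, which forces a re-plan."""
    verb, obj = action
    if verb == "pick" and random.random() > 0.3:  # 70% grasp success
        shelf.remove(obj)
        return True
    return False

# A traditional welding robot replays fixed coordinates with no feedback.
# A physical AI robot loops perception -> planning -> action until done,
# adapting when a grasp fails or the scene changes.
shelf = {"box", "bottle", "wrench"}
for target in ("bottle", "wrench"):
    while target in shelf:  # adapt: retry after failed grasps
        objects = perceive(shelf)
        act(shelf, plan(objects, target))
print(shelf)  # only the un-targeted object remains
```

The point of the loop is the feedback: the robot re-perceives and re-plans after every attempt instead of assuming the last action succeeded.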
What This Means
When a 675,000-subscriber newsletter creates a dedicated publication for a topic, it reflects audience demand, not editorial whim. Physical AI has graduated from speculative future tech to current commercial deployment. The Neuron’s robotics edition is live now, with dedicated coverage of the companies, research, and policy shaping the physical AI market.
