    What are physical AI and embodied AI? The robots know

By Techurz | July 19, 2025


Amazon recently announced that it had deployed its one-millionth robot across its workforce since rolling out its first bot in 2012. The figure is astounding on its own, especially considering that we’re talking about just one company. It is all the more striking because Amazon reached the milestone in roughly a dozen years; building its current workforce of 1.5 million humans took the company nearly 30 years.

    At this rate, Amazon could soon “employ” more bots than people. Other companies are likely to follow suit, and not just in factories. Robots will be increasingly deployed in a wide range of traditional blue-collar roles, including delivery, construction, and agriculture, as well as in white-collar spaces like retail and food services. 

    This occupational versatility will not only stem from their physical designs—joints, gyroscopes, and motors—but also from the two burgeoning fields of artificial intelligence that power their “brains”: Physical AI and Embodied AI. Here’s what you need to understand about each and how they differ from the generative AI that powers chatbots like ChatGPT. 

    [Photo: Amazon]

    What is Physical AI?

    Physical AI refers to artificial intelligence that understands the physical properties of the real world and how these properties interact. As artificial intelligence leader Nvidia explains it, Physical AI is also known as “generative physical AI” because it can analyze data about physical processes and generate insights or recommendations for actions that a person, government, or machine should take.

    In other words, Physical AI can reason about the physical world. This real-world reasoning ability has numerous applications. A Physical AI system receiving data from a rain sensor may be able to predict if a certain location will flood. It can make these predictions by reasoning about real-time weather data using its understanding of the physical properties of fluid dynamics, such as how water is absorbed or repelled by specific landscape features.

    Physical AI can also be used to build digital twins of environments and spaces, from an individual factory to an entire city. It can help determine the optimal floor placement for heavy manufacturing equipment, for example, by understanding the building’s physical characteristics, such as the weight capacity of each floor based on its material composition. Or it can improve urban planning by analyzing things like traffic flows, how trees impact heat retention on streets, and how building heights affect sunlight distribution in neighborhoods.
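The digital-twin idea above can be sketched in a few lines. This is a deliberately toy model, not any real Physical AI system: the material names, capacities, and the `can_place` helper are all hypothetical, invented to illustrate how a floor-placement check might use a building's physical characteristics.

```python
# Hypothetical load capacities in kg per square meter, keyed by floor material.
CAPACITY_BY_MATERIAL = {
    "reinforced_concrete": 1200,
    "steel_deck": 800,
    "timber": 400,
}

def can_place(equipment_weight_kg: float, footprint_m2: float, floor_material: str) -> bool:
    """Return True if the equipment's load stays within the floor's rated capacity."""
    capacity = CAPACITY_BY_MATERIAL[floor_material]
    load_per_m2 = equipment_weight_kg / footprint_m2
    return load_per_m2 <= capacity

# A 3,000 kg press over a 4 m² footprint exerts 750 kg/m²:
# acceptable on a steel deck, too heavy for a timber floor.
print(can_place(3000, 4.0, "steel_deck"))  # True
print(can_place(3000, 4.0, "timber"))      # False
```

A real system would derive capacities from structural models rather than a lookup table, but the shape of the reasoning, physical properties in, placement decision out, is the same.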

    [Photo: Amazon]

    What is Embodied AI?

    Embodied AI refers to artificial intelligence that “lives” inside (“embodies”) a physical vessel that can move around and physically interact with the real world. Embodied AI can inhabit various objects, including smart vacuum cleaners, humanoid robots, and self-driving cars.

    Like Physical AI, Embodied AI can reason about physics, as well as how one object affects another. However, since Embodied AI literally “embodies” a physical entity, such as a robot, it can also alter the real world around it, whether that be a robotic arm performing surgery, a humanoid bot working construction, or a self-driving truck transporting supplies from one location to another.

    Embodied AI has advanced capabilities due to the mobility of its physical body and, as Nvidia explains, additional sensors, which can include cameras or LiDAR, that enable it to perceive its surroundings.
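The loop that ties those sensors to a moving body is often described as sense-plan-act. The sketch below is a schematic of that pattern only; the world state, distances, and action names are placeholders, not any real robot's API.

```python
def sense(world):
    """Read the environment; here, just the distance to the nearest obstacle."""
    return world["obstacle_distance_m"]

def plan(distance_m):
    """Decide an action from the perception: stop when too close, else move."""
    return "stop" if distance_m < 0.5 else "advance"

def act(world, action):
    """Apply the chosen action to the (toy) world and report it."""
    if action == "advance":
        world["obstacle_distance_m"] -= 0.3  # each step closes 0.3 m
    return action

world = {"obstacle_distance_m": 1.0}
actions = [act(world, plan(sense(world))) for _ in range(4)]
print(actions)  # ['advance', 'advance', 'stop', 'stop']
```

Real Embodied AI replaces each stage with something far richer (camera or LiDAR perception, learned policies, motor control), but the cycle of perceiving, deciding, and physically acting is the defining structure.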

    A real-time distinction

    It is worth noting that the terms “Physical AI” and “Embodied AI” are increasingly being used interchangeably to describe any AI that understands the physics and spatial relationships of the real world and uses that understanding to power the brains behind bots. 

    However, most experts agree that Physical AI and Embodied AI are interrelated but distinct varieties of artificial intelligence.

    Henrik I. Christensen, an expert on robotics and AI and a professor of computer science at the University of California, San Diego, says that one distinguishing factor between the two is their real-time operational capabilities. “Physical AI denotes systems that [infer things] related to the physical world, such as friction, elasticity,” Christensen told me via email. This kind of system “may not operate in real time but has a detailed model of interaction in the physical world.”

    Embodied AI, on the other hand, “denotes systems that operate in the physical world [and also] interact with objects in the real world, [so] they must operate in real-time,” Christensen says.

    This real-time requirement is essential for robots working in the real world. If a robot doesn’t grab something as fast as it should, disaster can strike on the factory floor. He notes that Embodied AI systems often need to use simplified models to ensure they can “provide an answer fast enough.”
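That trade-off, a cruder model when the clock is tight, can be illustrated with a toy planner. The timings, model names, and the two grasp strategies here are invented for illustration and stand in for the simplified models Christensen describes.

```python
import time

def detailed_grasp_plan():
    """Stand-in for an expensive, high-fidelity physics simulation."""
    time.sleep(0.05)
    return "precise_grasp"

def simple_grasp_plan():
    """Cheap heuristic that always answers quickly."""
    return "approximate_grasp"

def plan_with_deadline(deadline_s: float) -> str:
    """Run the detailed model only if its (assumed known) cost fits the deadline."""
    estimated_cost_s = 0.05
    if estimated_cost_s <= deadline_s:
        return detailed_grasp_plan()
    return simple_grasp_plan()

print(plan_with_deadline(0.2))   # precise_grasp: plenty of time
print(plan_with_deadline(0.01))  # approximate_grasp: deadline forces the simpler model
```

The point is the structure, not the numbers: an embodied system that must answer in milliseconds will accept a less accurate model rather than miss its deadline.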

    Will robots take all the jobs?

The large language models (LLMs) that power ChatGPT, Claude, Llama, Grok, and others have long been seen as a threat to white-collar jobs, since they can reason about information and generate answers based on it, much like a human can. However, because LLMs lack both a physical presence and an understanding of how physics affects objects in the real world, they have generally been seen as less of a threat to blue-collar jobs, which typically involve physical labor and an understanding of how objects interact.

    But Physical AI and Embodied AI systems change the blue-collar risk assessment. Physical AI systems now possess reasoning capabilities regarding physical interactions, and Embodied AI enables robots to apply that understanding in the real world.

    Yet, for now, at least, LLMs still pose a greater threat to white-collar jobs than Physical AI and Embodied AI do to blue-collar ones. This is because LLM technology is readily available and easily deployable across organizations at scale. While Physical AI systems could see nearly as speedy a rollout in the years ahead, Embodied AI systems face more hurdles due to the need to manufacture legions of robots capable of operating in real-world environments.

    However, as Amazon’s one millionth robot rollout demonstrates, companies are increasingly interested in integrating more bots into the workforce, whether that’s in the factory or in the kitchen flipping burgers. As for why? Well, to take a line from my own novel, Beautiful Shining People, “bots never accidentally drop or damage things—not to mention they never get sick, or need days off, or give away free burgers to their friends.”

    In other words, Physical AI and Embodied AI-powered robots have the potential to save companies a significant amount on their biggest expense: labor. And they are sure to take advantage of it. The only question for me, then, is: When AI takes all our jobs, who will be left to buy the things these companies sell?

