
Ace Robotics unveils open-source world model and embodied AI framework

Writer: Windy Shao  |  Editor: Lin Qiuying  |  From: Original  |  Updated: 2025-12-19

Chinese robotics firm Ace Robotics, backed by artificial intelligence company SenseTime, on Dec. 18 unveiled a series of landmark innovations aimed at accelerating the large-scale commercialization of embodied intelligence.

At the launch event, Ace Robotics introduced its ACE (Ambient Capture Engine) embodied intelligence R&D paradigm, alongside the Kairos 3.0 world model — described as the first open-source world model capable of commercial deployment — and the A1 embodied super-brain module. Together, the releases are intended to establish a fully domestic, open and collaborative ecosystem spanning data collection, model training and real-world robotic applications.

Wang Xiaogang, chairman of Ace Robotics, during a presentation. Courtesy of SenseTime

SenseTime co-founder and executive director Wang Xiaogang, who also serves as chairman of Ace Robotics, said the company aims to give robots a true “intelligent brain” capable of understanding and interacting with the physical world. “By working closely with partners across the industrial chain, Ace Robotics seeks to promote the scalable deployment of embodied intelligence and build an independent, controllable development path that positions China at the forefront of next-generation intelligent technologies,” Wang said.


A human-centric paradigm for embodied intelligence

Ace Robotics said traditional AI development approaches can no longer meet the data demands of embodied intelligence, which requires machines to learn complex interactions between humans, objects and environments. To address this bottleneck, the company proposed a human-centric ACE paradigm that fundamentally reshapes the R&D pathway.

The logo of Ace Robotics.

The ACE framework places environmental data capture at its core, enabling the collection of tens of millions of hours of multimodal interaction data annually. Combined with Kairos 3.0, the approach amplifies the value of real-world data, which the company says achieves the training equivalent of hundreds of millions of hours and addresses the severe data shortages facing the embodied AI sector.

The system integrates visual, tactile, voice and force-related data to build physically accurate, full-scene models. Through temporal alignment, interaction trajectory prediction and physics-consistent simulation, raw data is converted into dynamic scene data directly usable for model training. It has already been applied in real-world scenarios such as retail warehousing, home environments and fine-grained object manipulation.


Kairos 3.0: an open-source world model

Ace Robotics released Kairos 3.0, a world model that provides a unified world-understanding framework spanning multiple robotic embodiments. The model integrates physical laws, human behavior and real-machine actions, enabling robots not only to understand cause-and-effect relationships in the physical world, but also to generate long-duration interactive scenarios and predict multiple future outcomes.

As a multimodal world model capable of understanding, generation and prediction, Kairos 3.0 processes inputs such as vision, 3D trajectories, tactile feedback and friction data to construct a chain-of-thought-style understanding of real-world interactions.

A poster of Ace Robotics.

Ace Robotics also launched its embodied intelligence world model platform, which integrates text-to-world, image-driven world generation and trajectory-based scene construction. Supporting 11 major categories, 54 subcategories and 115 vertical application scenarios, the platform significantly lowers development barriers by allowing developers to generate and share task simulations through simple commands. The platform was open-sourced to the entire industry on Dec. 18, with APIs made available to enterprises across sectors.

The Kairos model has already been adapted to domestically developed chips from companies including MetaX, Biren Technology and Sugon, further strengthening China’s full-stack capabilities from computing power to intelligent applications.

Ace Robotics also unveiled its A1 embodied super-brain module, marking a major step toward real-world deployment. Powered by the company's vision-based, map-free, end-to-end vision-language-action (VLA) model, robotic dogs equipped with the A1 module can operate in complex, dynamic and unfamiliar environments without relying on high-precision prebuilt maps.

Combined with 360-degree perception solutions from Insta360 and SenseTime’s general-purpose vision platform, the system supports more than 150 intelligent application scenarios across over 10 industries, including security, energy, transportation and cultural tourism, where long-term stability and reliability are critical.


Building an open industry ecosystem

Ace Robotics said ecosystem collaboration remains central to its strategy. The company has formed strategic partnerships with robotics manufacturers, hardware suppliers, chipmakers, cloud service providers and data companies to build a fully autonomous, end-to-end embodied intelligence ecosystem.

In the embodied robotics segment, Ace Robotics is working with leading firms to integrate the ACE paradigm, world models and hardware platforms, jointly delivering solutions tailored to diverse industrial scenarios.

With the release of its open-source world model and embodied AI framework, Ace Robotics and SenseTime are positioning themselves at the forefront of a global shift toward spatial intelligence and embodied systems, as the AI industry moves beyond purely language-based models toward machines that can perceive, reason and act in the physical world.
