Qinglong V3.0
OpenLoong

Qinglong V3.0 is a research humanoid robot by OpenLoong designed for locomotion, manipulation, and whole-body control studies.

Description

Qinglong V3.0, developed by OpenLoong (Humanoid Robots Shanghai Co., Ltd.), is an open-source, full-size humanoid research platform launched in August 2025 for embodied AI, locomotion, manipulation, and whole-body control. Building on the foundational Loong platform, it targets practical real-world scenarios including industrial tasks, public service, disaster response, and domestic applications.

The architecture is modular and biomimetic, with bionic joint configurations that mimic human kinematics for mobility and dexterity. Key structural modules include the head perception system (vision, audition, and tactile sensing), a chest computing unit, a waist for torso dynamics, upper limbs with 7 DoF per arm, lower limbs with leg-foot assemblies for terrain adaptation, and dexterous hands with 12-19 DoF in total.

The control system employs a multi-level hierarchical architecture that separates proprioceptive stabilization (a fast loop at 2 kHz over the EtherCAT bus) from perceptual decision-making (a slower loop). Dual EtherCAT masters manage upper- and lower-body slaves, keeping end-to-end latency at 150-250 ms. Computing power reaches 400 TOPS via dual 2.2 GHz CPUs and dual 930 MHz GPU coprocessors, supporting real-time sensor fusion, AI inference, and motion planning on a Linux/ROS stack.

AI integration leverages the 'Gewu Zhizhi' embodied intelligence system for multimodal perception and autonomous operation, with open-source support for LLM integration and reinforcement learning via the GymLoong simulator. Sensors include 3D LiDAR, RGB-D cameras (~1080p), microphone arrays, fingertip and palm tactile arrays, joint torque/position encoders, and IMUs for balance.

Actuators are high-torque electric servos with harmonic drives (upper body) and planetary gears/QDD units (lower body), delivering peak torques up to 396 Nm, walking speeds of 4.5-5 km/h, a 20 kg payload, and terrain handling (20° slopes, 13-15 cm steps, and unstructured surfaces such as gravel and sand).

Deployment remains primarily in research and demonstration. The base model debuted at WAIC 2024, and V3.0 showcased precise manipulation, agile navigation, and multimodal grasping at 2025 events. OpenLoong's GitHub repositories (Hardware, Dyn-Control, GymLoong) foster community development, with early production units in academic labs for R&D. There are no widespread commercial deployments yet, but the open-source ethos accelerates benchmarking and customization, positioning Qinglong V3.0 as a benchmark for China's humanoid ecosystem on the path toward practical autonomy.
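The fast/slow split described above can be sketched as a two-rate loop. The simulation below is purely illustrative (the rates match the stated 2 kHz / EtherCAT figures, but the 100 Hz perception rate, gain, and ramp target are assumptions, not OpenLoong's actual control stack):

```python
# Illustrative two-rate hierarchical control split: a fast proprioceptive
# loop runs every tick (2 kHz), while a slower perception/planning loop
# updates the setpoint every 20th tick (100 Hz, an assumed rate).

FAST_HZ = 2000          # proprioceptive stabilization rate (2 kHz)
SLOW_DIV = 20           # perception loop runs every 20th fast tick

def run(ticks):
    setpoint = 0.0
    state = 0.0
    fast_updates = slow_updates = 0
    for t in range(ticks):
        if t % SLOW_DIV == 0:
            # slow loop: pretend perception produced a new target (a ramp)
            setpoint = t / FAST_HZ
            slow_updates += 1
        # fast loop: simple proportional step toward the current setpoint
        state += 0.5 * (setpoint - state)
        fast_updates += 1
    return fast_updates, slow_updates, state

fast, slow, _ = run(2000)   # one simulated second
print(fast, slow)           # 2000 fast ticks, 100 slow updates
```

The point of the split is that the stabilization loop never waits on perception: it always tracks the most recent setpoint, while the slower loop refreshes that setpoint when new sensor data arrives.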

Key Features

Modular Bionic Design

Highly replaceable modules for head, arms, hands, legs, and chassis, enabling easy upgrades and adaptation to diverse tasks with biomimetic joint ranges exceeding human limits in some axes.

High-Performance Computing

400 TOPS heterogeneous compute with dual CPUs/GPUs for real-time AI processing, sensor fusion, and control at 2 kHz frequency via EtherCAT.

Dexterous Manipulation

Five-fingered hands with 12+ DoF and tactile sensors, supporting a 4-6 kg payload per arm, precise grasping, and handling of objects up to 5 kg.

Advanced Locomotion

Stable walking at 5 km/h, running up to 2 m/s, slope traversal (20°), stair climbing (15 cm), and unstructured terrain navigation with compliant ankle/hip control.

Open-Source Ecosystem

Full hardware schematics, dynamics control (MPC/WBC), simulation (GymLoong), and software stack available on GitHub for global collaboration.

Multimodal Perception

3D LiDAR, depth/RGB cameras, tactile arrays, and audio for comprehensive environmental understanding and low-latency action.

Specifications

Availability: Early production / limited release
Nationality: China
Website: https://www.openloong.net/
Manufacturer: OpenLoong
Height: 165-185 cm
Weight: 75-85 kg
Degrees of freedom (overall): 40-43
Degrees of freedom (hands): 12-19
Number of fingers: 5 per hand
Max speed: 5 km/h
Walking speed: 4.5 km/h
Payload: 20 kg total; ~4-6 kg per arm (assumed)
Runtime per charge: ≥3 hours; chest-mounted battery with fixing plates (kWh unspecified)
Compute: Dual 2.2 GHz CPUs + dual 930 MHz GPUs, 400 TOPS; industrial CPU + AI accelerator (assumed)
Camera resolution: RGB + depth cameras, ~1080p
Sensors: 3D LiDAR, RGB-D cameras (~1080p), fingertip/palm tactile arrays, joint torque sensors, encoders, microphone array, IMUs
Motor tech: Electric servo motors
Gear tech: Harmonic drives (upper body), planetary/QDD gears (lower body); max torque 396 Nm
Communication: EtherCAT bus (2 kHz), Ethernet, USB, Wi-Fi
Operating system: Linux-based / ROS
LLM integration: Possible via external system integration
Latency (glass to action): 150-250 ms
Manipulation performance: 3
Navigation performance: 4
Safe with humans: Yes
Main market: Academic research, humanoid R&D
Materials: Engineering metals (aluminum/steel alloys for frames and brackets), lightweight bionic structures
Thermal management: Not specified; modular design aids heat dissipation
Shipping size: ~185 × 85 × 65 cm
Color: White
Verified: Not verified


Frequently Asked Questions

What are the degrees of freedom for Qinglong V3.0?

Qinglong V3.0 features approximately 40-43 degrees of freedom overall, including 12-19 in the hands, with 7 DoF per arm and lower limbs optimized for mobility. This enables human-like whole-body coordination for complex tasks.

What is the battery life and power system?

Runtime per charge is around 3 hours of continuous operation, powered by high-capacity batteries mounted in the chest module with fixing plates for stability. Voltage and kWh details are in the open-source hardware docs; the power system supports activities with payloads up to 20 kg.

What sensors does it use?

Equipped with 3D LiDAR, RGB + depth cameras (~1080p resolution), fingertip and palm tactile sensor arrays, joint torque/encoder feedback, microphone arrays, and IMUs for full multimodal perception and balance control.
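To illustrate how IMU data typically feeds balance control, the sketch below uses a complementary filter, a standard gyro/accelerometer fusion technique. This is a generic example, not a confirmed detail of Qinglong's stack; the gain and sample data are arbitrary:

```python
# Illustrative complementary filter: blends integrated gyro rate with an
# accelerometer-derived tilt estimate to track a pitch angle. The blend
# factor alpha and the 2 kHz sample period are assumptions for the demo.

def complementary_filter(samples, dt, alpha=0.98):
    angle = 0.0
    for gyro_rate, accel_angle in samples:
        # trust the (drift-prone) gyro short-term, the (noisy) accel long-term
        angle = alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
    return angle

# constant accel tilt reading of 0.1 rad and zero gyro rate:
# the estimate converges toward 0.1 rad
samples = [(0.0, 0.1)] * 500
est = complementary_filter(samples, dt=1 / 2000)
print(round(est, 3))
```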

Is it open-source?

Yes, fully open-source via OpenLoong community on GitHub, including hardware CAD/PDFs, dynamics control frameworks (MPC/WBC), simulators, and software, promoting academic and developer contributions.

What is the computing hardware?

Dual 2.2 GHz industrial CPUs paired with dual 930 MHz GPU accelerators delivering 400 TOPS, running Linux/ROS for AI workloads, perception, and real-time control with EtherCAT bus communication.
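As a back-of-envelope illustration (our arithmetic, not a published figure), spreading the 400 TOPS budget across the 2 kHz control rate gives roughly 2×10^11 operations per 0.5 ms control tick:

```python
# Illustrative budget arithmetic; assumes the full 400 TOPS were available
# to the control path, which real mixed workloads would not allow.
TOPS = 400                      # 400e12 ops/s total compute
CONTROL_HZ = 2000               # 2 kHz EtherCAT control rate
ops_per_tick = TOPS * 1e12 / CONTROL_HZ
print(f"{ops_per_tick:.1e} ops per 0.5 ms tick")  # 2.0e+11
```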

What are the dimensions and weight?

Height is around 165-185 cm (sources vary; shipping size is ~185 × 85 × 65 cm) and weight 75-85 kg, with a maximum extended height of up to 219 cm. The form factor is optimized for human-scale interaction and transport.

Can it integrate with LLMs?

Yes, supports external LLM integration via ROS interfaces and low-latency pipelines (150-250 ms), enabling advanced reasoning for manipulation, navigation, and task planning in research setups.
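A minimal sketch of such an integration is shown below. Everything here is hypothetical: `llm_stub`, the action vocabulary, and the plan format are illustrative placeholders, and a real setup would call a hosted model and dispatch the validated actions over ROS interfaces:

```python
# Hypothetical LLM-to-task-plan pipeline (names are illustrative, not an
# OpenLoong API): an external model proposes a coarse action plan, which is
# validated against the robot's action vocabulary before dispatch.

def llm_stub(prompt):
    # stand-in for an external LLM call: maps a request to a coarse plan
    table = {
        "fetch the cup": ["navigate(kitchen)", "grasp(cup)", "navigate(user)"],
    }
    return table.get(prompt, [])

def plan_with_llm(command):
    actions = llm_stub(command)
    # only pass through actions the robot actually supports
    allowed = {"navigate", "grasp", "place"}
    return [a for a in actions if a.split("(")[0] in allowed]

print(plan_with_llm("fetch the cup"))
```

Validating model output against a fixed action vocabulary is the key design choice here: it keeps the low-latency control path insulated from arbitrary LLM text.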
