Kinisi KR1
Kinisi Robotics

The Kinisi KR1 is a wheeled humanoid robot from US-based Kinisi Robotics, designed for practical warehouse tasks such as pick-and-place and sorting. Combining LLM intelligence with high-speed mobility (reported top speeds of 8.64-14.4 km/h), this 162 cm tall prototype offers autonomous operation, human-level precision, and safe operation in shared spaces, addressing labor shortages efficiently.

Description

The Kinisi KR1, also known as Kinisi 01, is a wheeled-base humanoid robot developed by Kinisi Robotics, a US-UK startup founded in January 2024 by robotics expert Brennand Pierce. Standing 162 cm tall and weighing approximately 100 kg, the KR1 is engineered for practical, real-world applications in warehouses, logistics, manufacturing, retail, and maintenance, addressing labor shortages through high-speed mobility, human-level strength, and precision manipulation.

Unlike legged humanoids, its omnidirectional wheeled base with active suspension and zero-turn radius enables top speeds of up to 2.4 m/s (8.64-14.4 km/h reported variably), superior stability on flat industrial floors, and efficient navigation in dynamic environments without the energy inefficiency of bipedal locomotion. Architecturally, the KR1 features a fully vertically integrated stack encompassing custom motion hardware, perception AI, and control systems. The mechanical design includes aluminum structural framing, BLDC motors paired with strain wave (harmonic) gears for high torque and precision, 21 degrees of freedom overall (2 in the hands), and modular quick-swap grippers supporting vacuum end-effectors for versatile manipulation. Payload capacity reaches 25 kg dynamic and 40 kg static, with 20 kg lifting strength, allowing the robot to handle fragile items, heavy boxes, high-shelf picks, sorting, assembly, and even towel folding learned by imitation.

The AI architecture is edge-deployed for sub-100ms perception-to-action latency, powered by NVIDIA Jetson processors running Linux with Wi-Fi connectivity and IP65 ingress protection. Perception fuses multi-modal sensors: stereo depth cameras with ±2 mm accuracy at 2 m, a 180° LiDAR array with SLAM for spatial awareness, and inertial inputs for dynamic balance. Real-time transformer models enable environment mapping, motion prediction, and adaptation. Intelligence leverages imitation learning: the robot observes human demonstrations (no coding required) and generalizes tasks across variations, supported by a data flywheel in which fleet data improves collective performance. Agentic AI handles autonomous navigation, obstacle avoidance, and decision-making in a continuous sensing-reasoning-actuation loop, with optional semi-autonomous or tele-operated modes.

Safety is prioritized via dual redundant controllers, hardware and software e-stops, thermal and current diagnostics, and fail-safe behaviors such as posture locking. Power comes from hot-swappable 48 V, 20 Ah Li-ion batteries with a BMS, offering 6-8 hours of runtime and a 90-minute rapid charge to 80%, plus auto-docking. No specific LLM is named, but the platform supports LLM intelligence for advanced reasoning, as noted in early specifications.

Deployment progressed quickly: the KR1 prototype was operational with agentic AI by May 2024, the first customer visit took place in October 2024, a European on-site demo followed in May 2025 (winning an NVIDIA startup challenge), and V2/V3 hardware iterations arrived by 2025. Real-world pilots include warehouse pick-and-place, glass sorting (deployed November 2025), and "infinite pick" demos, while over 10,000 prior deployments from the founder's past ventures inform scalability. Priced at roughly $75k as a prototype, the KR1 emphasizes low-cost, cloud-independent operation for immediate ROI in shared human spaces, favoring proven reliability over speculative designs.
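
The description above outlines a continuous sensing-reasoning-actuation loop running on the edge within a sub-100 ms budget. The sketch below illustrates that pattern in Python; the function names (read_sensors, plan_action, send_to_actuators) and the budget-handling logic are hypothetical stand-ins, not Kinisi APIs.

```python
# Minimal sketch of a sensing-reasoning-actuation loop with a latency budget.
# All function bodies are placeholders; the 0.1 s budget mirrors the quoted
# sub-100 ms perception-to-action target.
import time

LATENCY_BUDGET_S = 0.100  # sub-100 ms perception-to-action target


def read_sensors() -> dict:
    # Placeholder for fused stereo depth, LiDAR, and inertial inputs.
    return {"depth": [], "lidar": [], "imu": (0.0, 0.0, 0.0)}


def plan_action(observation: dict) -> dict:
    # Placeholder for the edge-deployed policy (e.g. a transformer model).
    return {"base_velocity": (0.0, 0.0, 0.0), "arm_targets": []}


def send_to_actuators(command: dict) -> None:
    # Placeholder for joint and base motor controllers.
    pass


def control_loop(cycles: int = 10) -> None:
    for _ in range(cycles):
        start = time.monotonic()
        command = plan_action(read_sensors())
        send_to_actuators(command)
        elapsed = time.monotonic() - start
        if elapsed > LATENCY_BUDGET_S:
            # A real system would degrade gracefully; here we just report it.
            print(f"cycle overran budget: {elapsed * 1000:.1f} ms")
        time.sleep(max(0.0, LATENCY_BUDGET_S - elapsed))


if __name__ == "__main__":
    control_loop()
```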

Key Features

Onboard Edge Intelligence

Fully local AI processing with NVIDIA Jetson, real-time transformers, and imitation learning; sub-100ms latency with no cloud dependency, for privacy and reliability.

High-Precision Manipulation

Human-level dexterity with 21 DoF, strain wave gears, modular grippers; handles 25 kg payloads, fragile/heavy objects, pick-and-place, sorting, assembly.

Advanced Mobility

Omnidirectional wheeled base, 2.4 m/s speed, active suspension, zero-turn radius for agile navigation in warehouses.
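
The KR1's exact wheel configuration is not published; purely as an illustration of how an omnidirectional base with a zero-turn radius maps body motion to wheel commands, here is a minimal inverse-kinematics sketch assuming a generic four-wheel mecanum layout (wheel radius and geometry are placeholder values).

```python
# Illustrative inverse kinematics for an assumed four-wheel mecanum base.
import math

WHEEL_RADIUS_M = 0.08   # assumed wheel radius
HALF_LENGTH_M = 0.25    # assumed half wheelbase (front-back)
HALF_WIDTH_M = 0.20     # assumed half track width (left-right)


def body_twist_to_wheel_speeds(vx: float, vy: float, wz: float) -> list[float]:
    """Map a body twist (m/s, m/s, rad/s) to wheel angular speeds (rad/s).

    Standard mecanum inverse kinematics; wheel order is
    front-left, front-right, rear-left, rear-right.
    """
    k = HALF_LENGTH_M + HALF_WIDTH_M
    return [
        (vx - vy - k * wz) / WHEEL_RADIUS_M,  # front-left
        (vx + vy + k * wz) / WHEEL_RADIUS_M,  # front-right
        (vx + vy - k * wz) / WHEEL_RADIUS_M,  # rear-left
        (vx - vy + k * wz) / WHEEL_RADIUS_M,  # rear-right
    ]


if __name__ == "__main__":
    # Drive straight ahead at the quoted 2.4 m/s top speed.
    print(body_twist_to_wheel_speeds(2.4, 0.0, 0.0))
    # Spin in place (zero-turn radius) at 1 rad/s.
    print(body_twist_to_wheel_speeds(0.0, 0.0, 1.0))
```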

Multi-Modal Perception

Stereo depth cameras (±2 mm accuracy), 180° LiDAR SLAM, inertial sensors for real-time 3D mapping and obstacle avoidance.
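
As a rough illustration of how a 180° LiDAR scan feeds spatial mapping and obstacle avoidance, the sketch below rasterizes one scan into a 2D occupancy grid; grid size, resolution, and the scan data are illustrative assumptions, not KR1 parameters.

```python
# Rasterize a 180-degree LiDAR scan into a simple 2D occupancy grid.
import math

RESOLUTION_M = 0.05          # 5 cm cells (assumed)
GRID_SIZE = 200              # 10 m x 10 m grid (assumed)
ORIGIN = GRID_SIZE // 2      # robot at the grid centre


def scan_to_grid(ranges_m: list[float], fov_deg: float = 180.0) -> list[list[int]]:
    """Mark cells hit by LiDAR returns as occupied (1); everything else stays 0."""
    grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
    n = len(ranges_m)
    for i, r in enumerate(ranges_m):
        if not math.isfinite(r) or r <= 0.0:
            continue  # no return for this beam
        angle = math.radians(-fov_deg / 2 + fov_deg * i / max(n - 1, 1))
        x = int(ORIGIN + (r * math.cos(angle)) / RESOLUTION_M)
        y = int(ORIGIN + (r * math.sin(angle)) / RESOLUTION_M)
        if 0 <= x < GRID_SIZE and 0 <= y < GRID_SIZE:
            grid[y][x] = 1
    return grid


if __name__ == "__main__":
    fake_scan = [2.0] * 181  # a wall 2 m away across the whole field of view
    occupied = sum(map(sum, scan_to_grid(fake_scan)))
    print(f"occupied cells: {occupied}")
```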

Safety and Redundancy

Dual controllers, e-stops, health monitoring, fail-safes; IP65 rated, safe in human-shared spaces.
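
A minimal sketch of how dual redundant safety channels might gate motion, falling back to a posture lock when either channel reports a fault; the thresholds and data fields are assumptions, not Kinisi specifications.

```python
# Two independent safety channels must both be healthy for motion to proceed.
from dataclasses import dataclass

MAX_MOTOR_TEMP_C = 80.0     # assumed thermal limit
MAX_MOTOR_CURRENT_A = 30.0  # assumed current limit


@dataclass
class SafetyChannel:
    estop_pressed: bool
    motor_temp_c: float
    motor_current_a: float

    def healthy(self) -> bool:
        return (not self.estop_pressed
                and self.motor_temp_c < MAX_MOTOR_TEMP_C
                and self.motor_current_a < MAX_MOTOR_CURRENT_A)


def motion_permitted(channel_a: SafetyChannel, channel_b: SafetyChannel) -> bool:
    # Any fault on either channel triggers the fail-safe (posture lock).
    return channel_a.healthy() and channel_b.healthy()


if __name__ == "__main__":
    a = SafetyChannel(estop_pressed=False, motor_temp_c=45.0, motor_current_a=12.0)
    b = SafetyChannel(estop_pressed=False, motor_temp_c=46.0, motor_current_a=11.5)
    print("motion permitted" if motion_permitted(a, b) else "fail-safe: posture lock")
```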

Easy Deployment and Learning

Train via human demos in hours; data flywheel for fleet learning; hot-swap batteries, OTA updates.
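
As a toy illustration of the imitation-learning workflow (fit a policy to demonstrated observation-action pairs), here is a short behavior-cloning sketch; the network, dimensions, synthetic demonstration data, and the choice of PyTorch are illustrative assumptions, not the KR1's actual training stack.

```python
# Toy behaviour cloning: regress demonstrated actions from observations.
import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 16, 7  # assumed observation/action sizes

# Fake "demonstrations": random observations paired with a fixed linear policy.
torch.manual_seed(0)
obs = torch.randn(256, OBS_DIM)
true_weights = torch.randn(OBS_DIM, ACT_DIM)
acts = obs @ true_weights

policy = nn.Sequential(nn.Linear(OBS_DIM, 64), nn.ReLU(), nn.Linear(64, ACT_DIM))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

for step in range(200):
    pred = policy(obs)
    loss = nn.functional.mse_loss(pred, acts)  # match demonstrated actions
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final imitation loss: {loss.item():.4f}")
```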

Specifications

Availability: Prototype
Manufacturer: Kinisi Robotics
Nationality: US
Website: https://www.kinisi.com/
Main Market: Logistics, warehouse
Height: 162 cm
Weight: 100 kg
Degrees of Freedom (overall): 21
Degrees of Freedom (hands): 2
Max Speed: 8.64 km/h (2.4 m/s); up to 14.4 km/h also reported
Payload (dynamic): 25 kg
Payload (static): 40 kg
Strength: 20 kg
Runtime per Charge: 6-8 hours
Battery: 48 V Li-ion, 20 Ah, hot-swappable
CPU/GPU: NVIDIA Jetson
Operating System: Linux
Sensors: Stereo depth cameras (±2 mm @ 2 m), 180° LiDAR (SLAM), inertial sensors
Motor Tech: BLDC
Gear Tech: Strain wave (harmonic)
Main Structural Material: Aluminium
Ingress Protection: IP65
Connectivity: Wi-Fi, optional encrypted cloud telemetry
Latency (glass to action): Sub-100 ms
Manipulation Performance: 2
Navigation Performance: 2
Safe With Humans: Yes
Verified: Not verified

Frequently Asked Questions

What is the battery life and charging method for the Kinisi KR1?

The KR1 features hot-swappable 48 V 20 Ah Li-ion batteries providing 6-8 hours runtime under typical warehouse duty cycles. Rapid charging reaches 80% in 90 minutes via CC/CV smart charger with auto-docking capability and integrated BMS for thermal/current protection.
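
For context, the quoted battery figures imply the following back-of-the-envelope numbers; these are derived estimates, not published specifications.

```python
# Rough figures implied by the quoted spec: 48 V, 20 Ah, 6-8 h runtime,
# 80% charge in 90 minutes. Derived estimates only.
pack_energy_wh = 48 * 20                                 # 960 Wh nominal pack energy
avg_draw_w = (pack_energy_wh / 8, pack_energy_wh / 6)    # ~120-160 W average draw
charge_power_w = 0.8 * pack_energy_wh / 1.5              # ~512 W average charge power

print(f"pack energy: {pack_energy_wh} Wh")
print(f"implied average draw: {avg_draw_w[0]:.0f}-{avg_draw_w[1]:.0f} W")
print(f"implied average charge power: {charge_power_w:.0f} W")
```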

What sensors does the KR1 use for perception?

It integrates stereo depth cameras for ±2 mm accuracy at 2 m, a 180° LiDAR array with SLAM for spatial mapping, and inertial sensors, fusing multi-modal data for real-time 3D environment understanding and dynamic adaptation.

How does the KR1 learn new tasks?

Using imitation learning, operators demonstrate tasks intuitively without coding; the robot generalizes from single demos to variations, improving via a data flywheel where fleet experiences enhance collective intelligence rapidly.

What is the payload and speed capability?

The KR1 offers a dynamic payload of 25 kg (40 kg static) and lifts up to 20 kg with precision. Top speed is 2.4 m/s (8.64 km/h) with adaptive control for confined spaces, enabled by omnidirectional wheels and active suspension.

Is the KR1 safe for human collaboration?

Yes, with dual redundant safety controllers, multiple e-stops, continuous diagnostics, fail-safe posture locking, and IP65 rating, ensuring safe operation alongside humans in shared industrial environments.

What processor powers the AI?

NVIDIA Jetson series handles edge-deployed real-time transformer models for perception, planning, and control, achieving sub-100ms glass-to-gripper latency on Linux OS with Wi-Fi connectivity.
