Eggie
Tangible Robots

Eggie is a friendly mobile humanoid robot designed for interaction, education, and light service tasks. It combines expressive arms with smooth wheeled mobility, offering an approachable and flexible platform for AI-driven communication.

Description

Eggie is a wheeled humanoid robot developed by Tangible Robots, a California-based startup founded to make robots commonplace in everyday environments, particularly homes. Announced in November 2025, Eggie is a prototype platform built around three core pillars: dexterity, compliance, and whole-body control. This architecture lets Eggie operate in chaotic, contact-rich real-world settings such as cluttered kitchens and living rooms, where traditional robots falter because their rigid designs require pristine conditions.

Architecturally, Eggie pairs a mobile wheeled base, capable of smooth navigation at up to 3 km/h, with an expressive upper body that includes anthropomorphic five-fingered bionic hands for precise manipulation (1-2 kg payload per arm). Standing 160 cm tall and weighing 45 kg, it has 20 degrees of freedom overall, driven by electric servo motors with plastic/composite gear trains housed in an aluminum frame with composite panels. The sensory suite includes dual 1080p depth + RGB cameras, supporting low-latency (<200 ms end-to-end) vision-language-action (VLA) models. The robot runs a Linux-based ROS environment, likely on an NVIDIA Jetson or similar embedded AI board, with Bluetooth and Wi-Fi connectivity, IP20 ingress protection, and a 7-hour runtime per charge.

AI capabilities are central to Eggie's design, leveraging full-stack vertical integration of hardware and software. Training pipelines combine imitation learning from teleoperated demonstrations, reinforcement learning for robustness, dexterous manipulation policies, VLA models for multimodal understanding (3D vision, touch, scale), and co-training at scale. Current demonstrations, such as wiping coffee spills, picking up mugs, removing lint from clothes, and other household chores, are teleoperated (shown at 2x speed in some clips), but the focus is rapid iteration toward full autonomy. Real-world deployments have already captured data from hundreds of unique environments and thousands of 'in-the-wild' interactions, enabling sim-to-real transfer that controlled simulations cannot match. This data moat supports general-purpose manipulation for tasks like cleaning, object delivery, and interactive service.

Deployment history, though early-stage, underscores Tangible's pragmatic approach of prioritizing real-world learning over hype. Videos from November 2025 show Eggie working in messy scenes, with full shots of cluttered manipulation that highlight its compliance for safe coexistence with humans. Targeted markets include home assistance, exhibitions, research, and education, with a charming white design fostering approachability. As a US-built prototype priced around $32,000, Eggie positions Tangible against giants like Tesla Optimus by focusing on scalable, manufacturable hardware-software synergy. The roadmap emphasizes scaling toward autonomy, with ongoing improvements in manipulation precision and environmental adaptability. Eggie's blend of approachable form, robust AI, and real-data training points toward practical home robotics, with the potential to take over daily chores in the 2026-2030 timeframe.
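
The spec sheet lists a Linux-based ROS environment and a sub-200 ms glass-to-action budget. As a minimal sketch, assuming ROS 2 with Python (rclpy), the perception-to-command loop could be organized roughly as below; the topic names (/camera/rgb/image_raw, /base/cmd_vel) and the stop-in-place policy are hypothetical placeholders, not interfaces published by Tangible Robots.

import time
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist

class EggiePipelineSketch(Node):
    def __init__(self):
        super().__init__('eggie_pipeline_sketch')
        # Hypothetical topic names; Tangible Robots has not published its interfaces.
        self.create_subscription(Image, '/camera/rgb/image_raw', self.on_image, 10)
        self.cmd_pub = self.create_publisher(Twist, '/base/cmd_vel', 10)
        self.latency_budget_s = 0.200  # spec sheet lists <200 ms glass-to-action

    def on_image(self, msg: Image):
        start = time.monotonic()
        cmd = self.policy(msg)          # placeholder for the VLA / manipulation policy
        self.cmd_pub.publish(cmd)
        elapsed = time.monotonic() - start
        if elapsed > self.latency_budget_s:
            self.get_logger().warn(f'frame took {elapsed * 1000:.0f} ms, over the 200 ms budget')

    def policy(self, image: Image) -> Twist:
        # Stand-in policy: stop the base. A real policy would map pixels to whole-body commands.
        return Twist()

def main():
    rclpy.init()
    rclpy.spin(EggiePipelineSketch())

if __name__ == '__main__':
    main()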

Key Features

Dexterity and Anthropomorphic Hands

Five-fingered bionic hands enable precise manipulation of everyday objects such as mugs and cloths, supporting 1-2 kg payloads per arm.

Compliance and Whole-Body Control

Designed for contact-rich interactions in cluttered spaces, using compliant control to safely navigate and manipulate in unpredictable home environments.

Wheeled Mobility

Smooth indoor navigation at up to 3 km/h avoids the complexities of bipedal locomotion while maintaining humanoid expressiveness for natural interaction.

Advanced AI Integration

Combines imitation learning, RL, VLA models, and multimodal training on real-world data for low-latency (<200 ms) autonomous behaviors.

Real-World Data Learning

Trained on thousands of interactions from hundreds of diverse environments, enabling robust performance beyond simulations.

Specifications

Availability: Prototype
Nationality: US
Website: https://tangiblerobots.ai/
Manufacturer: Tangible Robots
Price (USD): ~32,000
Height (cm): 160
Weight (kg): 45
Degrees of freedom, overall: 20
Degrees of freedom, hands: 3
Fingers per hand: 5
Max speed (km/h): 3
Payload per arm (kg): 1-2
Manipulation performance: 2
Navigation performance: 3
Runtime per charge (hours): 7
Safe with humans: Yes
Ingress protection: IP20
CPU/GPU: Likely NVIDIA Jetson or similar embedded AI board
Operating system: Linux-based ROS environment
LLM integration: Likely via cloud API (OpenAI, Alibaba Qwen, Baidu, etc.)
Latency, glass to action: <200 ms (end-to-end)
Cameras: Dual 1080p depth + RGB
Connectivity: Bluetooth, Wi-Fi
Motor tech: Electric servo motors
Gear tech: Plastic/composite gear trains
Main structural material: Aluminum frame + composite panels
Color: White
Main market: Home assistance, exhibitions, research & education
Verified: Not verified

Frequently Asked Questions

What is the current availability of Eggie?

Eggie is in the prototype stage as of December 2025 and is not yet commercially available. Tangible Robots is focusing on rapid development through real-world data collection, with potential early-access or research units to follow.

What AI technologies power Eggie?

Eggie uses imitation learning, reinforcement learning, vision-language-action models, and multimodal co-training. It runs on ROS, likely with NVIDIA Jetson compute and cloud LLM APIs for perception and decision-making.
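
Tangible Robots has not published its model interfaces, but the answer above describes the common VLA pattern: a multimodal observation plus a language instruction goes in, and a short chunk of low-level actions comes out. The Python sketch below only illustrates that interface shape; the Observation and ActionChunk types and the encode/decode stubs are hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    rgb: bytes                 # latest frame from one of the dual 1080p RGB cameras
    depth: bytes               # aligned depth frame
    joint_pos: List[float]     # proprioception across the 20 DoF

@dataclass
class ActionChunk:
    joint_targets: List[List[float]]   # short horizon of joint targets, executed at a fixed rate

def encode(obs: Observation, instruction: str) -> list:
    # Stub multimodal encoder; a real VLA model would jointly tokenize images, text, and state.
    return [len(obs.joint_pos), len(instruction)]

def decode_actions(tokens: list) -> ActionChunk:
    # Stub action decoder; here it simply holds a neutral pose for an 8-step chunk.
    return ActionChunk(joint_targets=[[0.0] * 20 for _ in range(8)])

def vla_step(obs: Observation, instruction: str) -> ActionChunk:
    """One hypothetical inference step: observation + instruction in, action chunk out."""
    return decode_actions(encode(obs, instruction))

# Example call with dummy data:
chunk = vla_step(Observation(rgb=b"", depth=b"", joint_pos=[0.0] * 20), "wipe the coffee spill")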

How does Eggie navigate and move?

Eggie pairs a wheeled base, providing reliable indoor mobility at up to 3 km/h, with an expressive upper body and 20 degrees of freedom overall for whole-body control, allowing smooth navigation and dexterous tasks without the challenges of legged locomotion.
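
For reference, 3 km/h is roughly 0.83 m/s. Assuming the base is driven with standard metric velocity commands, which the manufacturer has not documented, a hypothetical clamp like the one below keeps commanded speeds within that limit.

MAX_SPEED_KMH = 3.0
MAX_SPEED_MS = MAX_SPEED_KMH / 3.6   # ~0.83 m/s

def clamp_linear_speed(vx_ms: float) -> float:
    # Clamp a commanded forward velocity (m/s) to the published 3 km/h limit.
    return max(-MAX_SPEED_MS, min(MAX_SPEED_MS, vx_ms))

print(clamp_linear_speed(1.5))   # -> 0.833... m/s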

Is Eggie safe for home use with humans?

Yes. Eggie is designed with compliant control for safe, contact-rich interactions, a relatively light 45 kg build, and human-safe speed and strength limits, and its approachable design is intended to foster comfortable cohabitation.

What is the battery life and charging?

Eggie offers up to 7 hours of runtime per charge, suitable for extended home-assistance sessions. Specific battery details such as capacity (kWh) and voltage have not been publicly disclosed.

What tasks can Eggie perform?

Demonstrated tasks include cleaning spills, lint removal, object pickup, and other household chores. Eggie is aimed at general-purpose manipulation in chaotic environments, with a roadmap toward full autonomy.
