Interbotix / WidowX

Trossen Robotics research robot arm series - Standard hardware for VLA and Embodied AI research



Overview

Interbotix is a research robot arm brand developed by Trossen Robotics, providing a series of high-precision manipulators based on ROBOTIS Dynamixel servo motors. The WidowX and ViperX series are robot arm platforms widely used for VLA (Vision-Language-Action) model training, teleoperation data collection, and Embodied AI research. They have been adopted by major robot learning projects including ALOHA, BridgeData V2, and Open X-Embodiment.

Item | Details | Source
Manufacturer | Trossen Robotics | Official site
Founded | 2005, by Matt Trossen | Tracxn
Headquarters | Downers Grove, Illinois, USA | CBInsights
Motors | ROBOTIS Dynamixel X-Series | Official docs
Price Range | $2,000 - $6,500 (as of 2024, subject to change) | Product page
Main Applications | Research, Education, ML/AI data collection | -

Company Introduction: Trossen Robotics

Trossen Robotics is a robotics company founded in 2005 by Matt Trossen that has supplied robot hardware to research labs and educational institutions for two decades[1]. It manufactures and distributes research manipulators, unmanned ground vehicles (UGVs), and ML/AI-integrated research kits, and is best known for the Interbotix brand built on ROBOTIS Dynamixel servos.

By supplying hardware for the ALOHA project and Open X-Embodiment dataset, they have established a central position in the Embodied AI research community.


Product Lineup

X-Series Arms (Basic Lineup)

Specifications below are based on official Interbotix documentation, and prices are approximate reference prices as of 2024.

Model | DoF | Reach | Payload | Servo Configuration | Price (ref) | Features
PincherX-100 | 4 | 335mm | 50g | XL430 | ~$500 | Entry-level, compact
PincherX-150 | 4 | 450mm | 50g | XL430 | ~$600 | Entry-level, extended reach
ReactorX-150 | 5 | 450mm | 100g | XM430/XL430 | ~$1,200 | Intermediate, wrist rotation
ReactorX-200 | 5 | 550mm | 150g | XM430/XL430 | ~$1,500 | Intermediate, extended reach
WidowX-200 | 5 | 550mm | 200g | XM430-W350, XL430-W250 | ~$2,500 | Research standard
WidowX-250 | 5 | 650mm | 250g | XM430-W350, XL430-W250 | ~$3,000 | Research extended
WidowX-250 6DoF | 6 | 650mm | 250g | XM430-W350, XL430-W250 | ~$3,550 | ALOHA Leader arm
ViperX-250 | 5 | 650mm | 450g | XM540-W270, XM430-W350 | ~$4,500 | High payload
ViperX-300 | 5 | 750mm | 750g | XM540-W270, XM430-W350 | ~$5,500 | High performance
ViperX-300 6DoF | 6 | 750mm | 750g | XM540-W270, XM430-W350 | ~$6,130 | ALOHA Follower arm

Note: Prices may vary; check the official site for the latest pricing.

Detailed Spec Comparison

Specifications below are excerpted from official documentation (WidowX-200, WidowX-250, ViperX-300 6DoF).

Item | WidowX-200 | WidowX-250 | ViperX-300 6DoF
Degrees of Freedom | 5 DoF | 5 DoF | 6 DoF
Max Reach | 550mm | 650mm | 750mm
Total Span | 1100mm | 1300mm | 1500mm
Payload | 200g | 250g | 750g
Repeatability | 1mm | 1mm | 1mm
Accuracy | 5-8mm | 5-8mm | 5-8mm
Gripper Opening | 30-74mm | 30-74mm | 42-116mm
Servo Count | 7 | 8 | 9
Wrist Rotation | Supported | Supported | Supported

Payload Note: Official documentation recommends keeping the arm at 50% or less of its maximum reach when carrying the maximum payload.

AI Series (Released 2025)

Trossen Robotics announced a new AI hardware lineup specialized for ML/VLA research in 2025. The information below references the official WidowX AI and Trossen AI pages.

Model | Features | Main Applications
WidowX AI | 6DoF, 700mm reach, 1.5kg payload, 1mm accuracy, iNerve controller | ML/VLA research base platform
Solo AI | Leader-Follower configuration, specialized for teleoperation | Optimized for data collection
Mobile AI | AgileX Tracer mobile base integration | Mobile Manipulation research
Stationary AI | 4-arm compound workstation | Large-scale multi-arm experiments

WidowX AI is available in three configurations (Base, Leader, and Follower); the Follower version is equipped with an Intel RealSense D405 depth camera. According to the official site, shipping began in mid-April 2025.


Dynamixel Servo Technology

The core of Interbotix robot arms is ROBOTIS Dynamixel X-Series smart servo motors. Specifications below reference ROBOTIS e-Manual and Interbotix official documentation.

Key Features

Feature | Description
Position Resolution | 4096 positions per revolution (approximately 0.088° per step)
PID Control | User-definable PID parameters
Feedback | Real-time monitoring of position, velocity, current, temperature, and voltage
Communication | TTL or RS-485 (varies by model), 1Mbps default baudrate
Compliance | Software-based compliance settings

Servo Models Used

  • XL430-W250: Small, lightweight, for gripper and wrist joints
  • XM430-W350: Medium, for intermediate joints, high torque-to-weight ratio
  • XM540-W270: Large, for base and shoulder joints, maximum torque

U2D2 Controller

All Interbotix arms connect to a PC through the ROBOTIS U2D2 interface, a USB-to-TTL converter that gives the Dynamixel Wizard software and the ROS/ROS2 drivers direct access to the servo bus.
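
As a hedged illustration of this low-level access, the sketch below reads one servo's present position over the U2D2 with the ROBOTIS DynamixelSDK Python bindings and converts ticks to degrees using the 4096-step resolution noted above. The serial port, servo ID, and control-table address are assumptions for a typical Interbotix setup; verify them against your own configuration.

```python
# Minimal sketch: read a Dynamixel X-Series servo through the U2D2 adapter
# using the ROBOTIS DynamixelSDK Python bindings (pip install dynamixel-sdk).
# Port, servo ID, and baudrate are assumptions for a typical Interbotix setup.
from dynamixel_sdk import PortHandler, PacketHandler

PORT = '/dev/ttyUSB0'          # U2D2 usually enumerates as a USB serial device
BAUDRATE = 1_000_000           # Interbotix arms ship with 1 Mbps on the bus
PROTOCOL_VERSION = 2.0         # X-Series servos speak Protocol 2.0
DXL_ID = 1                     # example servo ID (waist joint on many arms)
ADDR_PRESENT_POSITION = 132    # X-Series control table address

port = PortHandler(PORT)
packet = PacketHandler(PROTOCOL_VERSION)

if not (port.openPort() and port.setBaudRate(BAUDRATE)):
    raise RuntimeError('failed to open the U2D2 port')

# Read the 4-byte present position register and convert ticks to degrees
# (4096 ticks per revolution, i.e. ~0.088 degrees per tick).
position, comm_result, dxl_error = packet.read4ByteTxRx(port, DXL_ID, ADDR_PRESENT_POSITION)
if comm_result == 0 and dxl_error == 0:
    print(f'servo {DXL_ID}: {position} ticks = {position * 360.0 / 4096:.2f} deg')

port.closePort()
```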


Software Ecosystem

ROS/ROS2 Support

Support status below is based on Interbotix official documentation. ROS distribution EOL (End of Life) status may change over time.

Version | Status | Notes
ROS Melodic | Supported (Legacy) | Ubuntu 18.04, EOL 2023
ROS Noetic | Supported | Ubuntu 20.04, final ROS1 LTS
ROS2 Galactic | Supported (Legacy) | EOL November 2022
ROS2 Humble | Supported (Recommended) | Ubuntu 22.04 LTS, supported until 2027
ROS2 Rolling | Supported | Development rolling release

Recommended: ROS2 Humble for new projects.
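
To show what driving these arms from ROS2 looks like in practice, here is a minimal sketch using the Interbotix Python interface to move a WidowX-250 6DoF. Module paths and the startup helpers have changed between Interbotix releases, so treat them as assumptions and compare against the official python demos before running on hardware.

```python
# A minimal sketch of commanding a WidowX-250 6DoF ('wx250s') through the
# Interbotix Python-ROS interface on ROS2 Humble. Module paths and the startup
# helpers below are assumptions that have changed between Interbotix releases.
#
# Assumes the driver stack is already running, e.g. (per the Interbotix docs):
#   ros2 launch interbotix_xsarm_control xsarm_control.launch.py robot_model:=wx250s
from interbotix_common_modules.common_robot.robot import robot_shutdown, robot_startup
from interbotix_xs_modules.xs_robot.arm import InterbotixManipulatorXS


def main() -> None:
    bot = InterbotixManipulatorXS(
        robot_model='wx250s',
        group_name='arm',
        gripper_name='gripper',
    )
    robot_startup()  # spins up the underlying ROS2 node on recent releases

    bot.arm.go_to_home_pose()
    # Move the end effector 30 cm forward and 20 cm up in the base frame.
    bot.arm.set_ee_pose_components(x=0.3, z=0.2)
    bot.gripper.grasp()
    bot.gripper.release()
    bot.arm.go_to_sleep_pose()

    robot_shutdown()


if __name__ == '__main__':
    main()
```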

Provided Packages

  • URDF/Meshes: Accurate inertia models included
  • Driver Nodes: Physical robot control and joint state publishing
  • MoveIt Integration: Motion planning support
  • Gazebo Simulation: Simulation environment provided
  • MuJoCo Models: Physics simulation (including ALOHA 2)

AI/ML Framework Integration (AI Series)

  • Hugging Face LeRobot: Data pipelines and model training (see the dataset-loading sketch after this list)
  • OpenPI (Physical Intelligence): Pi0, Pi0.5 policy training and inference
  • NVIDIA Isaac: Simulation and deployment
  • Pre-trained Model Support: ALOHA, Bi-ACT, Octo, CrossFormer, etc.
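
As a rough illustration of the LeRobot integration mentioned above, the sketch below loads a LeRobot-format dataset and iterates mini-batches for policy training. The import path and the dataset repo id are assumptions that have shifted across LeRobot versions; check the current LeRobot documentation before relying on them.

```python
# A rough sketch of loading a LeRobot-format dataset for policy training.
# Import path and repo id are assumptions; they differ across LeRobot versions.
import torch
from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

dataset = LeRobotDataset('lerobot/aloha_sim_insertion_human')  # example repo id
loader = torch.utils.data.DataLoader(dataset, batch_size=8, shuffle=True)

# Batches are dictionaries keyed by feature name (camera images, states, actions, ...).
batch = next(iter(loader))
for key, value in batch.items():
    if isinstance(value, torch.Tensor):
        print(key, tuple(value.shape))
```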

Key Significance

1. Low-Cost High-Performance Research Platform

Interbotix robot arms provide research-grade precision (1mm repeatability) and reliability in the $2,000-$6,500 price range, a fraction of the cost of traditional industrial robot arms. This enables academic labs and startups to run large-scale data collection and VLA research.

2. Open Source Ecosystem

Hardware designs, drivers, and URDF models are open-sourced; the code is available in the Interbotix repositories on GitHub and is continuously improved through community contributions.

3. Standardized Hardware

Major robot learning datasets and projects, including ALOHA, BridgeData V2, and the Bridge portion of Open X-Embodiment, were collected with Interbotix arms. This allows researchers to directly test and fine-tune pre-trained models on the same hardware.

4. Democratization of Embodied AI Research

Options exist across a range of budgets, from single arms up to full Mobile ALOHA systems (approximately $32,000, per the ALOHA 2 paper), enabling more researchers to participate in Embodied AI research.


VLA Research Applications

ALOHA / Mobile ALOHA

ALOHA (A Low-cost Open-source Hardware System for Bimanual Teleoperation), developed by the Stanford research team of Tony Z. Zhao, Zipeng Fu, and Chelsea Finn, is built around Interbotix arms[2].

Component | Hardware | Role
Leader Arms | WidowX-250 6DoF x 2 | Human teleoperator input
Follower Arms | ViperX-300 6DoF x 2 | Actual task execution
Mobile Base | AgileX Tracer | Movement (Mobile ALOHA)
Cameras | 2 wrist + 1 top | Visual input

ALOHA 2 provides improved performance, ergonomics, and robustness, with all hardware designs and MuJoCo models open-sourced.
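
Conceptually, the leader-follower setup reduces to a joint-mirroring loop: read the leader's joint angles and command the follower to track them at a fixed rate. The sketch below illustrates that idea only; it is not the ALOHA code, the Interbotix attribute names it uses (e.g. core.joint_states) are assumptions, and the real system adds torque handling, gripper mapping, and safety limits.

```python
# A conceptual sketch of ALOHA-style joint-space teleoperation: the follower arm
# mirrors the leader arm's joint angles at a fixed rate. Illustrative only;
# attribute names such as core.joint_states are assumptions about the
# Interbotix Python interface.
import time

from interbotix_xs_modules.xs_robot.arm import InterbotixManipulatorXS

# Namespaced robot_name arguments let two arms run on the same ROS graph.
leader = InterbotixManipulatorXS(robot_model='wx250s', robot_name='leader')
follower = InterbotixManipulatorXS(robot_model='vx300s', robot_name='follower')
# (On recent ROS2 releases you may also need robot_startup(), as in the earlier sketch.)

RATE_HZ = 50  # teleoperation loop rate

while True:
    # Read the leader's current arm joint angles (first six entries of /joint_states).
    joint_positions = list(leader.core.joint_states.position[:6])
    # Command the follower to track them without blocking; timing values are illustrative.
    follower.arm.set_joint_positions(joint_positions, moving_time=0.05, blocking=False)
    time.sleep(1.0 / RATE_HZ)
```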

Key Papers:

  • Zhao et al., “Learning Fine-Grained Bimanual Manipulation with Low-Cost Hardware” (RSS 2023)
  • Fu et al., “Mobile ALOHA: Learning Bimanual Mobile Manipulation with Low-Cost Whole-Body Teleoperation” (2024)

BridgeData V2

A large-scale robot manipulation dataset collected by the UC Berkeley RAIL lab[3].

Item | Details
Robot | WidowX-250 6DoF
Trajectories | 60,096
Environments | 24
Skills | 13
Control Frequency | 5Hz
Average Trajectory Length | 38 timesteps

It includes a variety of basic manipulation skills such as pick-and-place, pushing, sweeping, drawer/door manipulation, block stacking, and clothes folding. The data were collected via VR-controller teleoperation, and the dataset is a core component of Open X-Embodiment.

Key Papers:

  • Walke et al., “BridgeData V2: A Dataset for Robot Learning at Scale” (CoRL 2023)

Open X-Embodiment

The largest open-source robot learning dataset to date, led by Google DeepMind and collected from 34 research labs[4].

Item | Details
Total Trajectories | 1M+
Robot Types | 22 embodiments
Skills | 500+
Tasks | 150,000+
Data Format | RLDS (TFRecord)

The Bridge dataset collected with WidowX arms is a core component of Open X-Embodiment and plays an important role in cross-robot transfer for RT-X model training; the RT-X experiments showed that skills learned from WidowX data transfer to the Google Robot.
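
For reference, RLDS episodes can be iterated with tensorflow_datasets roughly as sketched below; the GCS path follows the layout used by the Open X-Embodiment release but should be treated as an assumption, and field names inside observations and actions vary per dataset.

```python
# A hedged sketch of iterating WidowX Bridge data in its RLDS (TFRecord) format
# with tensorflow_datasets. The GCS path is an assumption based on the
# Open X-Embodiment release layout; substitute the location you have access to.
import tensorflow_datasets as tfds

builder = tfds.builder_from_directory('gs://gresearch/robotics/bridge/0.1.0')
dataset = builder.as_dataset(split='train[:1]')

# RLDS stores one episode per record; each episode holds a nested dataset of steps.
for episode in dataset:
    for step in episode['steps']:
        print('observation keys:', list(step['observation'].keys()))
        print('action:', step['action'])
        break
    break
```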

Key Papers:

  • Open X-Embodiment Collaboration, “Open X-Embodiment: Robotic Learning Datasets and RT-X Models” (2023)

OpenVLA

An open-source VLA model developed by Stanford and UC Berkeley research teams[5].

Item | Details
Parameters | 7B
Training Data | Open X-Embodiment 970k trajectories
Base Models | Llama 2 + DINOv2 + SigLIP
Training Infrastructure | 64x A100 GPU, 15 days

It achieved a 16.5% higher absolute success rate than RT-2-X (55B) across 29 evaluation tasks on WidowX and Google Robot embodiments, with particularly strong performance on BridgeData V2 WidowX tasks.

It supports efficient fine-tuning through LoRA and lightweight deployment through quantization, enabling operation on consumer-grade GPUs.
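
A hedged sketch of querying the released OpenVLA checkpoint for a WidowX (BridgeData V2) action is shown below, following the Hugging Face Transformers loading pattern from the public release; the prompt template and the 'bridge_orig' un-normalization key are assumptions that may differ between versions, and the image path is a placeholder.

```python
# A hedged sketch of querying the released OpenVLA checkpoint for a WidowX
# (BridgeData V2) action. Prompt template and un-normalization key are
# assumptions taken from the public release and may change between versions.
import torch
from PIL import Image
from transformers import AutoModelForVision2Seq, AutoProcessor

processor = AutoProcessor.from_pretrained('openvla/openvla-7b', trust_remote_code=True)
vla = AutoModelForVision2Seq.from_pretrained(
    'openvla/openvla-7b',
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).to('cuda:0')

image = Image.open('wrist_camera_frame.png')  # placeholder camera frame
instruction = 'put the carrot on the plate'
prompt = f'In: What action should the robot take to {instruction}?\nOut:'

inputs = processor(prompt, image).to('cuda:0', dtype=torch.bfloat16)
# Returns a 7-DoF end-effector action (xyz delta, rotation delta, gripper)
# un-normalized with BridgeData V2 statistics.
action = vla.predict_action(**inputs, unnorm_key='bridge_orig', do_sample=False)
print(action)
```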

Key Papers:

  • Kim et al., “OpenVLA: An Open-Source Vision-Language-Action Model” (2024)

Pi0 (Physical Intelligence)

A VLA flow model for general robot control developed by Physical Intelligence[6].

Item | Details
Base Model | PaliGemma VLM
Training Data | 7 robot platforms, 68 tasks
Control Frequency | Up to 50Hz
Action Generation | Flow matching (diffusion-related generative modeling)

It demonstrated zero-shot and fine-tuned performance on complex real-world tasks such as laundry folding, table cleaning, grocery bagging, and box assembly, and it supports cross-embodiment learning across single-arm, dual-arm, and mobile-manipulator platforms.
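
To make the flow-matching action head in the table above concrete, the toy sketch below integrates a stand-in velocity field from Gaussian noise to an action chunk with a few Euler steps; it is illustrative only and is not the Pi0 implementation, which conditions the field on vision and language features.

```python
# Toy illustration of flow-matching action generation at inference time: start
# from Gaussian noise and integrate a learned velocity field toward an action
# chunk with a few Euler steps. The velocity network is a stand-in, not Pi0.
import torch
import torch.nn as nn

ACTION_DIM = 7      # e.g. end-effector deltas + gripper
CHUNK_LEN = 50      # actions predicted per inference call (50 Hz chunk)


class VelocityField(nn.Module):
    """Stand-in network predicting d(action)/dt given the current sample and time."""

    def __init__(self) -> None:
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(CHUNK_LEN * ACTION_DIM + 1, 256),
            nn.ReLU(),
            nn.Linear(256, CHUNK_LEN * ACTION_DIM),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        flat = torch.cat([x.flatten(1), t], dim=-1)
        return self.net(flat).view_as(x)


@torch.no_grad()
def sample_actions(model: VelocityField, steps: int = 10) -> torch.Tensor:
    """Euler-integrate the velocity field from t=0 (noise) to t=1 (action chunk)."""
    x = torch.randn(1, CHUNK_LEN, ACTION_DIM)
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((1, 1), i * dt)
        x = x + dt * model(x, t)
    return x


actions = sample_actions(VelocityField())
print(actions.shape)  # torch.Size([1, 50, 7])
```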

Physical Intelligence released the OpenPI framework as open source in February 2025, and it is integrated with Trossen AI hardware.

Key Papers:

  • Black et al., “Pi0: A Vision-Language-Action Flow Model for General Robot Control” (2024)


Footnotes

  1. Trossen Robotics company profile referenced from business databases including Tracxn, CBInsights, Crunchbase. Detailed metrics like employee count and revenue may vary by database.

  2. Fu et al., “ALOHA 2: An Enhanced Low-Cost Hardware for Bimanual Teleoperation”, 2024. https://aloha-2.github.io/

  3. Walke et al., “BridgeData V2: A Dataset for Robot Learning at Scale”, CoRL 2023. https://rail-berkeley.github.io/bridgedata/

  4. Open X-Embodiment Collaboration, “Open X-Embodiment: Robotic Learning Datasets and RT-X Models”, arXiv:2310.08864, 2023.

  5. Kim et al., “OpenVLA: An Open-Source Vision-Language-Action Model”, arXiv:2406.09246, 2024.

  6. Black et al., “Pi0: A Vision-Language-Action Flow Model for General Robot Control”, arXiv:2410.24164, 2024.