An Open-Source Robotics Research Platform for Autonomous Laparoscopic Surgery

Authors redacted for blind review
Affiliations redacted for blind review

System overview: the teleoperation interface consists of a haptic input device and a clutch pedal board, combined with a Meta Quest headset providing immersive 3D perception. On the patient side, one UR5e robotic arm controls the surgical instrument while a second carries a stereoscopic endoscope that captures the surgical workspace.

Abstract

Autonomous robot-assisted surgery demands reliable, high-precision platforms that strictly adhere to the safety and kinematic constraints of minimally invasive procedures. Existing research platforms, primarily based on the da Vinci Research Kit, suffer from cable-driven mechanical limitations that degrade state-space consistency and hinder the downstream training of reliable autonomous policies.

We present an open-source, robot-agnostic Remote Center of Motion (RCM) controller based on a closed-form analytical velocity solver that enforces the trocar constraint deterministically without iterative optimization. The controller operates in Cartesian space, enabling any industrial manipulator to function as a surgical robot. We provide implementations for the UR5e and Franka Emika Panda manipulators, and integrate stereoscopic 3D perception. We integrate the robot control into a full-stack ROS-based surgical robotics platform supporting teleoperation, demonstration recording, and deployment of learned policies via a decoupled server–client architecture.

We validate the system on a bowel grasping and retraction task across phantom, ex vivo, and in vivo porcine laparoscopic procedures. RCM deviations remain sub-millimeter across all conditions, and trajectory smoothness metrics (SPARC, LDLJ) are comparable to expert demonstrations from the JIGSAWS benchmark recorded on the da Vinci system. These results demonstrate that the platform provides the precision and robustness required for teleoperation, data collection, and autonomous policy deployment in realistic surgical scenarios.

System Architecture

System architecture overview showing ROS-based modular platform

The platform is built with modularity at its core, with inter-component communication abstracted through ROS topics. In teleoperation mode, the operator's input device supplies target commands to the RCM controller, while stereoscopic visualization is rendered through the Meta Quest headset. During policy deployment, the control signal is generated by the learned model via a ZMQ-based server–client architecture, keeping model inference cleanly decoupled from the system.

The RCM-constrained robot controller operates in Cartesian space and is therefore hardware-agnostic: any robot arm capable of executing Cartesian velocity commands can be used. On the inference side, a model server aggregates observations, forwards them to the policy, and relays predicted actions back through a safety controller. This design enables rollouts across hardware and simulation environments without modifying the infrastructure.
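The request–reply loop behind this decoupling could look roughly like the following sketch, assuming pyzmq with pickled observation and action dictionaries as the wire format (the actual message schema, ports, and function names are illustrative, not the platform's API):

```python
import pickle

import zmq


def serve_policy(policy, port=5555):
    """Model server: receive aggregated observations, reply with actions.

    `policy` is any callable mapping an observation dict to an action dict;
    no ROS dependency is needed on this side."""
    sock = zmq.Context.instance().socket(zmq.REP)
    sock.bind(f"tcp://*:{port}")
    while True:
        obs = pickle.loads(sock.recv())    # observation dict from the robot side
        action = policy(obs)               # model inference, decoupled from ROS
        sock.send(pickle.dumps(action))    # relayed back through the client


def query_policy(obs, host="localhost", port=5555):
    """Client on the ROS side: one request-reply round trip per control step."""
    sock = zmq.Context.instance().socket(zmq.REQ)
    sock.connect(f"tcp://{host}:{port}")
    sock.send(pickle.dumps(obs))
    return pickle.loads(sock.recv())
```

Because the REQ/REP pair enforces strict alternation, each control step blocks until an action arrives, which keeps the robot-side safety controller in the loop for every predicted action.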

Analytical RCM Controller

The Remote Center of Motion (RCM) constraint requires that the instrument shaft always pivots about a fixed point at the trocar insertion site. Unlike optimization-based approaches (e.g., Quadratic Programming), our controller enforces this constraint analytically via closed-form vector algebra, yielding deterministic, smooth velocity profiles without iterative solvers.

We provide two complementary control strategies: Cartesian Tip Velocity Control for intuitive teleoperation and policy deployment, and Spherical Coordinate Control that directly maps to the surgical degrees of freedom.

RCM geometry showing instrument vectors and control variables

RCM geometry: spherical (pitch, yaw, roll, translation) and Cartesian tip velocity control, with all vectors in the robot base frame.

RCM Constraint Maintenance
The RCM point remains fixed while the controller freely moves the instrument tip.

Control Modalities

Cartesian Tip Control

Input: desired tip velocity v_tip + shaft roll ω_roll. Moves the instrument tip in 3D Cartesian space inside the patient body while maintaining the RCM constraint.
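The closed-form solve behind this mode reduces to a few lines of vector algebra: the desired tip velocity is split into an axial component, realized by inserting or retracting along the shaft, and a lateral component, realized by pivoting about the trocar. A minimal sketch, assuming tip position, trocar point, and desired velocity are all expressed in the robot base frame (function and variable names are illustrative, not the controller's actual API):

```python
import numpy as np


def rcm_tip_velocity(p_tip, p_rcm, v_des, w_roll=0.0):
    """Closed-form RCM solve: split the desired tip velocity into an axial
    translation along the shaft and a pivot about the trocar point."""
    r = p_tip - p_rcm                   # shaft vector, trocar -> tip
    d = np.linalg.norm(r)               # insertion depth
    u = r / d                           # unit shaft axis
    v_axial = np.dot(v_des, u) * u      # component realizable by insertion
    v_lat = v_des - v_axial             # component realizable by pivoting
    w_pivot = np.cross(u, v_lat) / d    # angular velocity of the pivot
    w = w_pivot + w_roll * u            # add roll about the shaft axis
    return v_axial, w                   # rigid-body twist anchored at the RCM


def shaft_point_velocity(p, p_rcm, v_axial, w):
    """Velocity of any point rigidly attached to the instrument shaft."""
    return v_axial + np.cross(w, p - p_rcm)
```

By construction the tip velocity is recovered exactly, while the velocity at the trocar point is purely axial (along the shaft), so the RCM stays fixed without any iterative solver.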

Spherical Coordinate Control

Input: ω_pitch, ω_yaw, ω_roll, v_trans. Directly maps to the surgical degrees of freedom: pivoting, roll, and insertion depth.
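Under the same conventions, the spherical rates map directly to a trocar-anchored rigid-body twist. A minimal sketch, assuming unit pitch and yaw pivot axes of a trocar frame are available in the robot base frame (all names hypothetical):

```python
import numpy as np


def spherical_to_twist(w_pitch, w_yaw, w_roll, v_trans, u, e_pitch, e_yaw):
    """Map spherical surgical DOF rates to a rigid-body twist at the trocar.
    u: unit shaft axis; e_pitch, e_yaw: unit pivot axes of the trocar frame."""
    w = w_pitch * e_pitch + w_yaw * e_yaw + w_roll * u   # pivoting + roll
    v = v_trans * u                                      # insertion/retraction
    return v, w


def tip_velocity(p_tip, p_rcm, v, w):
    """Resulting Cartesian tip velocity under the trocar-anchored twist."""
    return v + np.cross(w, p_tip - p_rcm)
```

Since both pivot axes pass through the trocar point and the translation is along the shaft, any combination of the four rates leaves the RCM constraint satisfied by construction.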

Instrument Roll
The controller rotates the instrument about its shaft axis while maintaining the RCM constraint.

Immersive 3D Perception

The system employs a TIPCAM1 S 3D stereo endoscope mounted on a second robotic arm to provide stereoscopic visualization. The stereo video feed is displayed on either a 3D monitor or a Meta Quest headset via Endomersion, enabling depth perception during task execution and demonstration recording.

The camera frame is determined using the forward kinematics of the camera-holding arm combined with a measured offset from its end-effector to the endoscope tip. Since the transformation between the two robot bases is known, instrument commands are transformed from the camera frame to the instrument robot frame, ensuring natural and intuitive teleoperation regardless of viewing angle.
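This frame chain can be sketched with homogeneous transforms. A minimal sketch, assuming 4x4 NumPy matrices for the calibrated base-to-base transform, the camera arm's forward kinematics, and the measured flange-to-endoscope-tip offset (transform and function names are illustrative):

```python
import numpy as np


def camera_to_instrument_cmd(v_cam, T_b1_b2, T_b2_ee, T_ee_cam):
    """Rotate a velocity command from the camera frame into the instrument
    robot's base frame.

    T_b1_b2:  camera-robot base expressed in the instrument-robot base
    T_b2_ee:  forward kinematics of the camera-holding arm
    T_ee_cam: measured end-effector-to-endoscope-tip offset"""
    T_b1_cam = T_b1_b2 @ T_b2_ee @ T_ee_cam
    R = T_b1_cam[:3, :3]   # a free velocity vector only needs the rotation
    return R @ v_cam
```

Recomputing `T_b1_cam` every control cycle from the camera arm's joint state is what keeps teleoperation intuitive as the endoscope moves: "forward" on the input device always means "away from the camera".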

Endomersion immersive 3D visualization through Meta Quest headset

Immersive stereoscopic visualization through the Meta Quest headset running Endomersion, showing the surgical workspace captured by the endoscope.

Hardware Setup

Full testbed setup with labeled components

The platform consists of two UR5e robotic arms—one for controlling the surgical instrument and the other for a stereoscopic endoscope—combined with:

  • Lambda.7 haptic device (Force Dimension) for teleoperation input
  • Meta Quest headset for immersive 3D visualization
  • TIPCAM1 S 3D stereo endoscope (Karl Storz)
  • Clutch pedal board for gripper and clutch control

Input devices and endoscope nodes may run on separate machines connected via ROS, allowing operators to control the robot from a remote workstation. Clocks are synchronized using Precision Time Protocol (PTP).

Experimental Environments

We validate the platform across three increasingly realistic laparoscopic environments on a bowel grasping and retraction task.

Phantom

Phantom Tissue

Controlled bench-top environment with synthetic tissue models for repeatable data collection and initial policy training.

Ex Vivo

Ex Vivo Porcine

Cadaveric porcine tissue providing realistic tissue mechanics and visual appearance without the complexity of a live procedure.

In Vivo

In Vivo Porcine

Live porcine laparoscopic procedures with full physiological conditions including breathing motion and tissue compliance.

Collected Datasets

The platform enables collection of diverse surgical task datasets. Below are examples of tasks recorded using our system.

  • Needle Driving
  • Pick and Place
  • Thread in Hole
  • Endoscope Guidance (phantom): camera controlled directly via the RCM controller
  • Endoscope Guidance (in vivo): camera controlled directly via the RCM controller

Key Results

  • < 0.08 mm median RCM deviation: sub-millimeter precision across all conditions
  • 85% autonomous policy success rate: 17/20 rollouts on the bowel grasping task
  • Smoothness comparable to the dVRK: SPARC and LDLJ metrics on par with the JIGSAWS da Vinci benchmark
RCM deviation across experimental conditions
RCM Deviation. Mean deviation (left) and worst-case maximum deviation (right) remain sub-millimeter across all conditions, with phantom (x̃ = 0.035 mm), ex vivo (x̃ = 0.052 mm), and in vivo (x̃ = 0.079 mm) showing a progressive increase consistent with trocar pressure on the abdominal wall during in vivo procedures.
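One natural way to compute this deviation is the perpendicular distance from the nominal trocar point to the current instrument shaft line at each timestep; a sketch, assuming recorded tip positions and unit shaft directions (names hypothetical, not necessarily the evaluation code used for the figure):

```python
import numpy as np


def rcm_deviation(p_rcm, p_tip, u_shaft):
    """Perpendicular distance from the nominal trocar point to the shaft
    line defined by the tip position and unit shaft direction; broadcasts
    over leading (timestep) axes."""
    r = p_rcm - p_tip
    return np.linalg.norm(np.cross(r, u_shaft), axis=-1)
```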
Trajectory smoothness comparison
Trajectory Smoothness. SPARC (left) and Log Dimensionless Jerk (right) metrics, where values closer to zero indicate smoother motion. Our platform produces progressively less smooth trajectories from phantom to in vivo, consistent with increasing task difficulty. All three conditions yield smoother trajectories than expert demonstrations from the JIGSAWS Knot Tying benchmark recorded on the da Vinci system. All signals downsampled to 5 Hz for fair comparison.
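For reference, one common formulation of the log dimensionless jerk on a speed profile (following Balasubramanian et al.; the exact variant and preprocessing used for the figure may differ) is:

```python
import numpy as np


def ldlj(speed, dt):
    """Log dimensionless jerk of a 1-D speed profile; values closer to
    zero indicate smoother motion."""
    T = dt * (len(speed) - 1)                          # movement duration
    jerk = np.gradient(np.gradient(speed, dt), dt)     # 2nd derivative of speed
    dlj = (T**3 / np.max(speed)**2) * np.sum(jerk**2) * dt
    return -np.log(dlj)
```

The duration and peak-speed normalization makes the metric dimensionless, which is what allows trajectories recorded on different platforms, and at a common 5 Hz rate, to be compared directly.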

Contributions

  • An open-source, robot-agnostic RCM controller based on a closed-form analytical solver that enforces kinematic constraints via Cartesian velocities, ensuring deterministic execution without iterative optimization.
  • A full-stack ROS-based software architecture for teleoperation, data recording, and deployment of autonomous policies with a decoupled server–client inference architecture.
  • Quantitative validation across phantom, ex vivo, and in vivo porcine laparoscopic procedures, demonstrating sub-millimeter RCM precision and trajectory smoothness comparable to established surgical robotics benchmarks.

BibTeX

@inproceedings{anonymous2026surgical,
  author    = {Anonymous},
  title     = {An Open-Source Robotics Research Platform
               for Autonomous Laparoscopic Surgery},
  booktitle = {Under Review},
  year      = {2026},
}