Researchers have developed a compact surgical robot with an integrated camera that autonomously monitors and adjusts its movements in real time.
It provides high-precision control – essential for delicate procedures – to help overcome the limitations of traditional systems that often rely on bulky external feedback mechanisms.
By harnessing internal visual tracking, the robot can self-correct during operation, potentially enhancing accuracy and safety in minimally invasive surgeries.
Using onboard closed-loop control, the origami-inspired robot attained micrometre-level accuracy and stability, even when exposed to external forces.
The breakthrough is the first example of internal visual feedback in microrobotic systems.
Experts suggest it paves the way for compact, autonomous surgical tools that can operate deep inside the human body.
In microsurgery, environmental forces, user tremors and the limitations of conventional actuators often hinder precise movement.
Piezoelectric beams offer excellent force and responsiveness but are prone to drift and hysteresis unless paired with real-time feedback.
Most existing systems rely on external cameras or strain sensors for correction, which add bulk and complicated wiring – particularly problematic for minimally invasive procedures.
At the same time, compliant mechanisms provide compact, backlash-free motion but still require precise sensing to function effectively in clinical settings.
Given these challenges, there is an urgent need for a lightweight, high-resolution internal feedback system that enables stable and autonomous control of microrobots.
Researchers from Imperial College London and the University of Glasgow have now created the first microrobot that manages its movements using fully onboard visual feedback.
Published in Microsystems & Nanoengineering, the study features a piezoelectric-driven delta robot that uses an embedded endoscope camera and AprilTag markers for internal visual tracking.
This approach eliminates the need for external sensing devices, enabling closed-loop motion correction within a self-contained system. Its compact design and precise control open new horizons for future microsurgical tools.
Inspired by delta mechanisms and origami structures, the robot is actuated through piezoelectric beams integrated into a 3D-printed, compliant framework.
Traditional joints are replaced with flexure-based elements, enabling precise, backlash-free movement across three degrees of freedom.
For feedback, a tiny borescope camera is embedded beneath the robot's platform to monitor AprilTag fiducials in real time.
Using this visual data, a PID-based control system continuously adjusts the robot’s motion to follow programmed paths and counter external disturbances, such as gravity.
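The paper itself does not include code, but the idea of onboard closed-loop correction can be illustrated with a short, hypothetical sketch: a PID controller steers a three-axis position toward a commanded waypoint using noisy position estimates that stand in for the camera-and-fiducial tracker, while a constant pull plays the role of gravity. The gains, time step, simulated plant and the `observe_position` helper are illustrative assumptions, not details from the study.

```python
# Minimal sketch of PID-based closed-loop position correction driven by a
# visual position estimate. Everything here is a simulated stand-in, not the
# authors' implementation.
import numpy as np

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros(3)
        self.prev_error = np.zeros(3)

    def update(self, error):
        # Standard PID law on the 3D position error (micrometres).
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def observe_position(true_pos, noise_um=1.0):
    # Stand-in for the camera/fiducial tracker: a noisy position estimate.
    return true_pos + np.random.normal(0.0, noise_um, size=3)

dt = 1.0 / 30.0                            # assume a 30 fps feedback camera
pid = PID(kp=0.6, ki=2.0, kd=0.01, dt=dt)
position = np.zeros(3)                     # current platform position (um)
target = np.array([100.0, 50.0, -30.0])    # commanded waypoint (um)
disturbance = np.array([0.0, 0.0, -5.0])   # constant pull along z (um/s), e.g. gravity

for step in range(300):
    measured = observe_position(position)          # "see" where the platform is
    command = pid.update(target - measured)        # correct toward the target
    position += command * dt + disturbance * dt    # simple simulated actuator

print("final error (um):", np.round(target - position, 2))
```

In a real system of this kind, the measured position would come from detecting the fiducial markers in each camera frame rather than from a simulated observer, but the correction loop has the same structure.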
The robot successfully traced complex 3D trajectories with high repeatability, achieving a root-mean-square motion accuracy of 7.5 μm, a precision of 8.1 μm, and a resolution of 10 μm.
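For context, root-mean-square (RMS) trajectory accuracy of this kind is typically calculated from the point-by-point deviations between the commanded path and the tracked positions; the sketch below shows one common way to compute it, using made-up sample values rather than data from the study.

```python
# Illustrative RMS trajectory-error calculation from commanded vs. tracked
# positions (micrometres). The arrays are invented samples, not study data.
import numpy as np

commanded = np.array([[0.0, 0.0, 0.0],
                      [10.0, 5.0, -2.0],
                      [20.0, 10.0, -4.0]])
tracked = np.array([[0.5, -0.3, 0.1],
                    [10.4, 5.2, -2.3],
                    [19.7, 9.6, -4.4]])

deviations = np.linalg.norm(tracked - commanded, axis=1)  # per-point 3D error
rms_error = np.sqrt(np.mean(deviations ** 2))
print(f"RMS error: {rms_error:.2f} um")
```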
In tests, the closed-loop system outperformed open-loop control and maintained trajectory stability even under external disturbances.
Compared to existing micromanipulators, this solution combines onboard sensing, simple fabrication, and adaptability for surgery.
It is the first system of its kind to utilise internal visual feedback for autonomous motion correction, providing unprecedented independence and control at the microscale.
Dr Xu Chen, lead author, said: ‘This development represents a paradigm shift in micro-robotics. Our approach enables a surgical microrobot to track and adjust its motion without external infrastructure. By integrating vision directly into the robot, we enhance reliability, portability, and precision – crucial features for real-world medical applications. We believe this technology sets a new standard for future surgical tools that operate independently within the human body.’
The robot’s compact, self-regulating design makes it ideal for minimally invasive procedures, such as navigating catheters or performing laser tissue resections.
Future enhancements, such as higher frame-rate cameras and advanced depth sensing, could improve responsiveness and z-axis resolution.
If scaled down to under a centimetre, this platform could support tools for endomicroscopy, neurosurgery and other applications.


