Did you just see that? Arbitrary view synthesis for egocentric replay of operating room workflows from ambient sensors
By: Han Zhang, Lalithkumar Seenivasan, Jose L. Porras, and more
Potential Business Impact:
Lets surgeons see what others see during operations.
Observing surgical practice has historically relied on fixed vantage points or recollections, leaving the egocentric visual perspectives that guide clinical decisions undocumented. Fixed-camera video can capture surgical workflows at room scale, but cannot reconstruct what each team member actually saw. As a result, such videos offer only limited insight into how decisions affecting surgical safety, training, and workflow optimization are made. Here we introduce EgoSurg, the first framework to reconstruct dynamic egocentric replays for any operating room (OR) staff member directly from wall-mounted fixed-camera video, and thus without disrupting the clinical workflow. EgoSurg couples geometry-driven neural rendering with diffusion-based view enhancement, enabling high-fidelity synthesis of arbitrary and egocentric viewpoints at any moment. In evaluations across multi-site surgical cases and controlled studies, EgoSurg reconstructs person-specific visual fields and arbitrary viewpoints with high visual quality and fidelity. By transforming existing OR camera infrastructure into a navigable dynamic 3D record, EgoSurg establishes a new foundation for immersive surgical data science, enabling surgical practice to be visualized, experienced, and analyzed from every angle.
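The two-stage idea described above can be sketched in miniature. This is a hedged, illustrative toy, not the paper's implementation: `render_coarse_view` stands in for the geometry-driven neural renderer (here, a simple point projection into a target camera pose), and `enhance_view` stands in for the diffusion-based view enhancement (here, simple hole-filling smoothing). All function names and parameters are assumptions for illustration only.

```python
# Illustrative two-stage novel-view pipeline (stand-ins, not EgoSurg's API):
# stage 1 renders a coarse view from geometry, stage 2 refines it.
import numpy as np

def render_coarse_view(points, colors, pose, size=(32, 32)):
    """Project 3D points into a target camera pose (stand-in for neural rendering)."""
    h, w = size
    img = np.zeros((h, w, 3))
    R, t = pose
    cam = points @ R.T + t              # world -> camera coordinates
    z = np.clip(cam[:, 2], 1e-3, None)  # avoid division by zero for near points
    u = ((cam[:, 0] / z) * 0.5 + 0.5) * (w - 1)
    v = ((cam[:, 1] / z) * 0.5 + 0.5) * (h - 1)
    for ui, vi, c in zip(u.astype(int), v.astype(int), colors):
        if 0 <= ui < w and 0 <= vi < h:
            img[vi, ui] = c
    return img

def enhance_view(img, iters=2):
    """Fill holes left by sparse projection (stand-in for diffusion enhancement)."""
    out = img.copy()
    for _ in range(iters):
        out = 0.25 * (np.roll(out, 1, 0) + np.roll(out, -1, 0)
                      + np.roll(out, 1, 1) + np.roll(out, -1, 1))
    return out

# Toy scene: random colored points in front of an identity-pose camera.
rng = np.random.default_rng(0)
pts = rng.uniform([-1, -1, 2], [1, 1, 4], size=(500, 3))
cols = rng.uniform(0, 1, size=(500, 3))
pose = (np.eye(3), np.zeros(3))
coarse = render_coarse_view(pts, cols, pose)
final = enhance_view(coarse)
print(coarse.shape, final.shape)  # (32, 32, 3) (32, 32, 3)
```

The design point the sketch captures is the division of labor: the geometry stage guarantees the rendered view is consistent with the reconstructed 3D scene and the requested camera pose, while the enhancement stage repairs the visual artifacts (holes, blur) that a purely geometric render leaves behind.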
Similar Papers
High-Quality Virtual Single-Viewpoint Surgical Video: Geometric Autocalibration of Multiple Cameras in Surgical Lights
CV and Pattern Recognition
Clears blocked views in surgery videos automatically.
Open-Source Multi-Viewpoint Surgical Telerobotics
Robotics
Gives robot surgeons extra eyes for better control.
Efficient 3D Scene Reconstruction and Simulation from Sparse Endoscopic Views
CV and Pattern Recognition
Makes surgical simulation more realistic and efficient.