RemoteFusion: Real Time Depth Camera Fusion for Remote Collaboration on Physical Tasks

Matt Adcock, Stuart Anderson, Bruce Thomas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Remote guidance systems allow humans to collaborate on physical tasks across large distances and have applications in fields such as medicine, maintenance, and working with hazardous substances. Existing systems typically provide two-dimensional video streams to remote participants, and these are restricted to viewpoint locations based on the placement of physical cameras. Recent systems have incorporated the ability of a remote expert to annotate their 2D view and for these annotations to be displayed in the physical workspace to the local worker. We present a prototype remote guidance system, called RemoteFusion, which is based on the volumetric fusion of commodity depth cameras. The system incorporates real-time 3D fusion with color, the ability to distinguish and render dynamic elements of a scene, whether human or non-human, a multi-touch-driven free 3D viewpoint, and a Spatial Augmented Reality (SAR) light annotation mechanism. We provide a physical overview of the system, including hardware and software configuration, and detail the implementation of each of the key features.
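The real-time volumetric fusion the abstract refers to is commonly implemented as a truncated signed distance function (TSDF) volume that is updated with each incoming depth frame, in the style popularized by KinectFusion. The paper itself does not reproduce code here, so the following is only an illustrative single-camera sketch of that general technique; the function name, the camera-at-origin setup, and all parameters are assumptions for exposition, not the authors' implementation:

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, K, voxel_size, origin, trunc):
    """Fuse one depth frame into a TSDF volume.

    Assumes a single pinhole camera at the world origin looking along +z,
    with intrinsics K. `tsdf` and `weights` are (nx, ny, nz) arrays; `origin`
    is the world position of voxel (0, 0, 0); `trunc` is the truncation band.
    """
    nx, ny, nz = tsdf.shape
    # World coordinates of every voxel centre.
    ii, jj, kk = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    pts = np.stack([ii, jj, kk], axis=-1) * voxel_size + origin
    z = pts[..., 2]
    # Project voxel centres into the depth image.
    u = np.round(K[0, 0] * pts[..., 0] / np.maximum(z, 1e-9) + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * pts[..., 1] / np.maximum(z, 1e-9) + K[1, 2]).astype(int)
    h, w = depth.shape
    inside = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.zeros_like(z)
    d[inside] = depth[v[inside], u[inside]]
    # Signed distance along the viewing ray; only update voxels within the
    # truncation band in front of (or just behind) the observed surface.
    sdf = d - z
    update = inside & (d > 0) & (sdf > -trunc)
    sdf = np.clip(sdf / trunc, -1.0, 1.0)
    # Weighted running average over frames (constant per-frame weight of 1).
    w_new = weights + update
    tsdf[update] = (tsdf[update] * weights[update] + sdf[update]) / w_new[update]
    weights[:] = w_new
    return tsdf, weights

# Illustrative usage: a flat wall at depth 0.3 m in front of the camera.
tsdf = np.zeros((8, 8, 8))
weights = np.zeros((8, 8, 8))
K = np.array([[200.0, 0.0, 100.0], [0.0, 200.0, 100.0], [0.0, 0.0, 1.0]])
depth = np.full((200, 200), 0.3)
tsdf, weights = integrate_depth(tsdf, weights, depth, K,
                                voxel_size=0.02,
                                origin=np.array([-0.05, -0.05, 0.2]),
                                trunc=0.05)
```

After integration, the TSDF crosses zero at the observed surface: voxels in front of the wall hold positive values and voxels just behind it negative ones, which is what a renderer or marching-cubes pass would extract. RemoteFusion extends this idea with colour, multiple cameras, and separate handling of dynamic scene elements, which this sketch omits.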
Original language: English
Title of host publication: Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry
Place of Publication: New York, NY, USA
Number of pages: 8
ISBN (Print): 978-1-4503-2590-5
Publication status: Published - 2013

Publication series

Name: VRCAI '13

Keywords / Materials (for Non-textual outputs)

  • fusion
  • remote guidance
  • spatial augmented reality
  • spatial user interfaces
  • visualization


