TY - CONF
T1 - RemoteFusion: Real Time Depth Camera Fusion for Remote Collaboration on Physical Tasks
AU - Adcock, Matt
AU - Anderson, Stuart
AU - Thomas, Bruce
PY - 2013
AB - Remote guidance systems allow humans to collaborate on physical tasks across large distances and have applications in fields such as medicine, maintenance, and working with hazardous substances. Existing systems typically provide two-dimensional video streams to remote participants, and these are restricted to viewpoint locations based on the placement of physical cameras. Recent systems have incorporated the ability of a remote expert to annotate their 2D view and for these annotations to be displayed in the physical workspace to the local worker. We present a prototype remote guidance system, called RemoteFusion, which is based on the volumetric fusion of commodity depth cameras. The system incorporates real-time 3D fusion with color, the ability to distinguish and render dynamic elements of a scene, whether human or non-human, a multi-touch-driven free 3D viewpoint, and a Spatial Augmented Reality (SAR) light annotation mechanism. We provide a physical overview of the system, including hardware and software configuration, and detail the implementation of each of the key features.
KW - fusion
KW - remote guidance
KW - spatial augmented reality
KW - spatial user interfaces
KW - visualization
DO - 10.1145/2534329.2534331
M3 - Conference contribution
SN - 978-1-4503-2590-5
T3 - VRCAI '13
SP - 235
EP - 242
BT - Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry
PB - ACM
CY - New York, NY, USA
ER -