AR Sandbox
The goal of this installation was to develop an integrated augmented reality system in which users physically create topography models that are scanned into a computer in real time and used as the background for a variety of graphics effects and simulations. The final product is intended to be self-contained enough to serve as a hands-on exhibit with little supervision.
This Augmented Reality Sandbox was designed and built by Maulesh Trivedi and Russell Oliver in partnership with Oliver Kreylos, Ph.D., of UC Davis.
Background
Raw depth frames arrive from the Kinect camera at 30 frames per second and are fed into a statistical evaluation filter with a fixed, configurable per-pixel buffer size (currently defaulting to 30 frames, corresponding to a 1-second delay). The filter serves the triple purpose of filtering out moving objects such as hands or tools, reducing the noise inherent in the Kinect's depth data stream, and filling in missing data in the depth stream. The resulting topographic surface is then rendered from the point of view of the data projector suspended above the sandbox, so that the projected topography exactly matches the real sand topography.
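The per-pixel filter can be sketched roughly as follows. This is a simplified, purely illustrative Python sketch, not the project's code; the buffer layout, variance threshold, and invalid-depth sentinel are assumptions rather than the project's exact parameters.

```python
# Hypothetical sketch of the per-pixel statistical filter described above.
import numpy as np

FRAMES = 30          # per-pixel buffer size (~1 second at 30 fps)
MAX_VARIANCE = 4.0   # stability threshold in depth units (assumed value)
INVALID = 2047       # sentinel for "no depth reading" (sensor-dependent)

class DepthFilter:
    def __init__(self, height, width):
        # Ring buffer holding the last FRAMES raw depth frames for every pixel.
        self.buffer = np.zeros((FRAMES, height, width), dtype=np.float32)
        self.index = 0
        self.filled = 0
        # Last stable surface; reused for pixels that are currently unstable or invalid,
        # which is how missing data gets filled in.
        self.surface = np.zeros((height, width), dtype=np.float32)

    def add_frame(self, depth):
        self.buffer[self.index] = depth
        self.index = (self.index + 1) % FRAMES
        self.filled = min(self.filled + 1, FRAMES)

        frames = self.buffer[:self.filled]
        valid = frames != INVALID                       # mask out missing samples
        count = valid.sum(axis=0)
        safe = np.maximum(count, 1)
        mean = np.where(valid, frames, 0).sum(axis=0) / safe
        var = np.where(valid, (frames - mean) ** 2, 0).sum(axis=0) / safe

        # A pixel is "stable" if it has enough valid samples and low variance;
        # hands and tools moving through the view fail this test and are ignored.
        stable = (count > self.filled // 2) & (var < MAX_VARIANCE)
        self.surface = np.where(stable, mean, self.surface)
        return self.surface
```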
The sandbox itself has a 4:3 aspect ratio to match the fields of view of the Kinect camera and the projector. Its size is limited by the camera's minimum and maximum sensing distances and by the desired sandbox resolution. Because of the camera's limited field of view, it has to be mounted about as high above the sand surface as the sandbox is wide. The camera should sit directly above the sandbox's center point, looking straight down. (See figure above)
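As a rough illustration of that geometric constraint, the sketch below estimates the required mounting height from the sandbox dimensions and the camera's field of view. The field-of-view numbers are assumed nominal values for a first-generation Kinect depth camera, not measured figures, and should be checked against the actual hardware.

```python
# Rough mounting-height estimate from the camera's field of view.
import math

def mounting_height(extent, fov_degrees):
    """Height at which a field of view of fov_degrees spans the given extent."""
    return (extent / 2.0) / math.tan(math.radians(fov_degrees) / 2.0)

sandbox_width, sandbox_depth = 1.0, 0.75   # metres, 4:3 aspect ratio (example values)
h_fov, v_fov = 57.0, 43.0                  # assumed nominal depth-camera FOV

# The camera must be at least as high as the larger of the two requirements.
height = max(mounting_height(sandbox_width, h_fov),
             mounting_height(sandbox_depth, v_fov))
print(f"Mount the camera roughly {height:.2f} m above the sand surface")
```

With these example numbers the result comes out at roughly 0.95 m for a 1 m wide sandbox, which is why the camera ends up about as high above the sand as the box is wide.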
In Conclusion
With our lives becoming more and more digital every day, there is an increasing need to bridge the gap between our physical and digital worlds and to interact seamlessly with both at the same time. Augmented Reality (AR) blurs the line between what is real and what is computer generated by enhancing what we see, hear, and feel. The AR sandbox allows users to create topography models by shaping real sand, which is then augmented in real time with an elevation color map, topographic contour lines, and simulated water. The system teaches geographic, geologic, and hydrologic concepts such as how to read a topographic map, the meaning of contour lines, watersheds, catchment areas, and levees.
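The real system performs this augmentation on the GPU in real time; the offline sketch below only illustrates what an elevation color map with overlaid contour lines looks like, using a synthetic height field in place of scanned sand.

```python
# Illustrative sketch (not the project's renderer): colorize a height field with an
# elevation color map and overlay topographic contour lines.
import numpy as np
import matplotlib.pyplot as plt

# A synthetic "sand surface": two hills and a valley, purely for illustration.
x, y = np.meshgrid(np.linspace(-2, 2, 400), np.linspace(-1.5, 1.5, 300))
height = np.exp(-((x + 0.8) ** 2 + y ** 2)) + 0.7 * np.exp(-((x - 0.9) ** 2 + (y - 0.3) ** 2))

fig, ax = plt.subplots(figsize=(8, 6))
# Elevation color map: low elevations blue/green, high elevations brown/white.
ax.imshow(height, cmap="terrain", origin="lower", extent=(-2, 2, -1.5, 1.5))
# Contour lines at evenly spaced elevations, as they would be projected onto the sand.
ax.contour(x, y, height, levels=10, colors="black", linewidths=0.8)
ax.set_axis_off()
plt.show()
```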
Simply put, the system consists of a PC running Linux, an Xbox Kinect camera, a short-throw projector, and a box to hold sand. When interacting with the sandbox, you create forms within the box by moving the sand. The Kinect's sensors read the changes in the environment, and that data is fed into the software. The resulting topographic surface is then rendered from the point of view of the projector suspended above the sandbox, so the projected topography exactly matches the real sand topography.
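Rendering from the projector's point of view can be sketched as applying a calibrated camera-to-projector transform to each surface point. The sketch below is a minimal illustration of that idea only: the matrix is a placeholder, and the names and resolution are assumptions, whereas the actual software obtains the transform from a calibration procedure.

```python
# Simplified sketch of the camera-to-projector mapping; cam_to_proj is a placeholder
# identity matrix standing in for a measured calibration transform.
import numpy as np

cam_to_proj = np.eye(4)   # placeholder: identity instead of a calibrated 4x4 matrix

def project_to_projector(points_cam, width=1280, height=800):
    """Map 3D points in Kinect camera space to projector pixel coordinates."""
    n = points_cam.shape[0]
    homo = np.hstack([points_cam, np.ones((n, 1))])   # to homogeneous coordinates
    clip = homo @ cam_to_proj.T                        # apply the calibrated transform
    ndc = clip[:, :3] / clip[:, 3:4]                   # perspective divide
    px = (ndc[:, 0] * 0.5 + 0.5) * width               # normalized coords -> pixel column
    py = (ndc[:, 1] * 0.5 + 0.5) * height              # normalized coords -> pixel row
    return np.stack([px, py], axis=1)

# Each filtered depth pixel becomes a 3D surface point; projecting it through the
# calibrated transform determines where to draw it so the image lands on the sand.
```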