K. C. Dohse 1, Thomas Dohse 2, Jeremiah D. Still 1, Derrick J. Parkhurst 1,3
1 Human Computer Interaction Program, 2 Computer Science Department, 3 Psychology Department
Iowa State University
{ kcd | dohse | derrick | jeremiah }

Abstract

A rear-projection multi-touch tabletop display was augmented with hand tracking utilizing computer vision techniques. While both touch detection and hand tracking can be independently useful for achieving interaction with tabletop displays, these techniques are not reliable when multiple users in close proximity simultaneously interact with the display. To solve this problem, we combine touch detection and hand tracking techniques in order to allow multiple users to simultaneously interact with the display without interference. Our hope is that by considering activities occurring on and above a tabletop display, multi-user interaction will become more natural and useful, which should ultimately support collaborative work.

1. Introduction

Large displays are useful for information visualization when multiple people must jointly use the information to work together and accomplish a single goal. The social interactions that result from using a shared display can be highly valuable [1]. However, these displays can fail to allow multiple users to simultaneously interact with the information. Tabletop interfaces can provide a large shared display while simultaneously accepting natural and direct interaction from multiple users through touch detection. For example, multi-touch surfaces that utilize the phenomenon of frustrated total internal reflection (FTIR) have received widespread attention recently [2]. FTIR detection techniques allow the system to track a large number of simultaneous touch points with very high spatial and temporal frequency. FTIR has several advantages over other multi-touch detection technologies, such as being both low cost and scalable [3]. Other computer vision based tracking systems have a limited ability to distinguish touches from near touches, which is an important element of interacting with the table surface.

FTIR tracking alone has two shortcomings compared to other methods of touch tracking. First, each touch in FTIR appears as an independent event. Although inferences based on the distance between touch points can be leveraged to guess which touches are part of the same event, each touch ultimately remains a standalone piece of data. As the number of users and the complexity of their actions increase, so does the probability of incorrectly grouping touch points with a single user. Second, the system is inherently susceptible to spurious IR noise (e.g., poor lighting conditions or flash photography).

To solve the lighting and touch differentiation problems, we augmented an FTIR tabletop display with an overhead camera. Using the camera, hands on or over the table can be tracked using skin color segmentation techniques. With hand coordinates available, touch points can be assigned the ownership necessary to support multiple users and to correctly identify events comprised of multiple touches. This technique works well even when gestures are made by multiple users in close proximity because it does not need to differentiate touches based on closeness. The fusion of hand and touch point locations also increases the robustness of touch sensing in the presence of unwanted IR light because of the redundancy of the point's location. Additionally, tracking hands allows users to generate interactions without touching the surface, instead making movements above the table. This creates a hybridization of the two interaction techniques that is still being explored.
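For concreteness, the sketch below illustrates how FTIR touch points are typically extracted from the rear IR camera's frames. It is a minimal Python/OpenCV illustration; the threshold value, the blob-area bounds, and the function name detect_touches are assumptions made for exposition, not details of the system described here.

```python
import cv2
import numpy as np

def detect_touches(ir_frame, thresh=200, min_area=20, max_area=400):
    """Extract candidate FTIR touch points from one IR camera frame.

    ir_frame: single-channel 8-bit image from the camera behind the surface.
    Returns a list of (x, y) blob centroids. The threshold and area bounds
    are illustrative and would need tuning on a real table.
    """
    # Fingertips frustrating total internal reflection appear as bright blobs.
    _, binary = cv2.threshold(ir_frame, thresh, 255, cv2.THRESH_BINARY)
    # Suppress single-pixel IR noise before connected-component analysis.
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        area = cv2.contourArea(c)
        if min_area <= area <= max_area:  # reject blobs too small/large for a fingertip
            m = cv2.moments(c)
            touches.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return touches
```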
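Skin color segmentation of the overhead camera image can be sketched in the same style. The HSV bounds below are a generic rule-of-thumb range rather than calibrated values, and track_hands is a hypothetical name; a deployed system would tune both to its lighting conditions and user population.

```python
import cv2
import numpy as np

# Rough HSV skin-color bounds; assumed values, not the paper's calibration.
SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)

def track_hands(bgr_frame, min_area=2000):
    """Return centroids of skin-colored regions seen by the overhead camera."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
    # Close small holes and drop speckle so each hand is one component.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    hands = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:  # ignore regions too small to be a hand
            m = cv2.moments(c)
            hands.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return hands
```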
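The fusion step then reduces to assigning each touch point to the nearest tracked hand, once both sets of coordinates have been mapped into a common table frame, and rejecting touches with no hand nearby as spurious IR events. The sketch below assumes such a shared coordinate frame already exists; the distance threshold and the name assign_touches are illustrative.

```python
import math

def assign_touches(touches, hands, max_dist=120.0):
    """Pair each touch point with the nearest hand centroid.

    touches, hands: lists of (x, y) in a shared table coordinate frame
    (i.e., after mapping both camera views onto the surface).
    Returns (owned, spurious): owned maps touch index -> hand index;
    touches with no hand within max_dist are treated as IR noise.
    """
    owned, spurious = {}, []
    for ti, (tx, ty) in enumerate(touches):
        best, best_d = None, max_dist
        for hi, (hx, hy) in enumerate(hands):
            d = math.hypot(tx - hx, ty - hy)
            if d < best_d:
                best, best_d = hi, d
        if best is None:
            spurious.append(ti)  # no hand nearby: likely flash or ambient IR
        else:
            owned[ti] = best     # several touches may share one hand (a gesture)
    return owned, spurious
```

Because each touch is grouped by the hand above it rather than by inter-touch distance, adjacent fingers belonging to two different users are not merged into a single gesture, and a touch with no plausible owner can be discarded as noise.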
Figure 1: Three users working together using a rear-projection multi-touch tabletop display augmented with hand tracking using an overhead camera.

2. Related Work

Techniques and technologies used for interaction detection on tabletop displays are rapidly maturing, but many researchers are still seeking better methods for capturing natural interactions made by multiple users within the context of real-world applications. There are a number of approaches to tracking user interactions with a tabletop display. One successful method is to use a surface material laden with sensors, such as the commercially available DiamondTouch system. This system uses a technique in which a circuit is capacitively closed when the user touches the table [4]. Interfaces like this one use front projection due to the opaque surface needed for the sensors. Other systems, such as the metaDESK, also use sensors, but integrate them in physical objects that can be manipulated [5].

Another common approach is to use video cameras to track interactions. For example, the HoloWall uses a semi-opaque diffuser that allows infrared (IR) light projected from behind the screen to reflect off of objects at a certain distance from the surface [6]. The TouchLight interface uses two IR cameras to determine when contact with the screen has occurred [7]. Other projection-based systems, like the ViCat, use overhead cameras to track hand gestures [8]. This table does not sense physical touches on the surface, but instead uses an overhead camera to track hand gestures in order to interact with the display.

Work is also being done to improve the nature of multi-touch interaction itself. These areas of research are as vital to the field as designing new systems to support multi-touch, such as designing cooperative gestures to f