The Microsoft Surface has not yet been released to the public, and already developers are working on the next step for its multi-touch technology. Andy Wilson, one of the Microsoft researchers who helped create the Surface, has thought of a way to use physics engines, like those found in 3D games, to make touch screens behave more realistically.

The Surface uses an IR camera positioned under the screen to detect hands or fingers as they are pressed against it. The camera relays this information to a computer, which matches the hand placements to the display and updates it accordingly. The computer then sends the result to a projector, also mounted under the screen, which displays the images. The major problem with this system is that the IR camera reads every placement the same way: a fingertip and a full hand both register simply as a "touch." Wilson says that with physics engines, the Surface, and other touch screens, will be able to recognize different hand gestures, including light touches, pushing, grabbing, and many more.

This new technology, if developed, would open up many new real-world applications for the touch-screen industry. Wilson plans to present his ideas at the User Interface Software and Technology Conference this week.
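To illustrate the core idea, here is a minimal sketch, not Microsoft's actual implementation, of how a physics-style approach could treat a camera-detected contact region as a physical input: the area and position of the contact determine the force applied to a virtual on-screen object, so a full hand pushes harder than a fingertip. All names (`contact_properties`, `VirtualObject`) and the specific force model are hypothetical, chosen only to make the concept concrete.

```python
def contact_properties(mask):
    """Return (area, centroid) of a binary contact mask, where mask is a
    list of rows of 0/1 pixels, as an IR camera might report them."""
    area = 0
    sx = sy = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                area += 1
                sx += x
                sy += y
    if area == 0:
        return 0, None
    return area, (sx / area, sy / area)

class VirtualObject:
    """A movable on-screen object updated with simple Euler integration."""
    def __init__(self, x, y, mass=1.0):
        self.x, self.y = x, y
        self.vx = self.vy = 0.0
        self.mass = mass

    def apply_contact(self, mask, dt=1 / 60, strength=0.05):
        """Push the object away from a contact region for one time step."""
        area, centroid = contact_properties(mask)
        if centroid is None:
            return
        # Force points from the contact centroid toward the object and is
        # scaled by contact area: a palm pushes harder than a fingertip.
        cx, cy = centroid
        fx = (self.x - cx) * strength * area
        fy = (self.y - cy) * strength * area
        self.vx += fx / self.mass * dt
        self.vy += fy / self.mass * dt
        self.x += self.vx * dt
        self.y += self.vy * dt

# A 1-pixel fingertip and a 9-pixel palm, pressed near the same object:
fingertip = [[1]]
palm = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
obj = VirtualObject(5, 0)
obj.apply_contact(palm)  # the palm's larger area produces a larger push
```

The point of the sketch is the contrast with a plain "touch / no touch" reading: once contacts carry area and position into a physics simulation, a light fingertip nudge and a full-hand shove naturally produce different motion without any explicit gesture recognition.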