Multitouch display technology has been gaining a lot of visibility recently, both with Microsoft's Surface interactive tabletop displays and the iPhone's slick gesture-based interface. Ignoring some of the particulars, the guts of this technology are relatively simple, and you can make your own multitouch interactive display for little more than the cost of a projector (the most expensive part of this setup).
An acrylic panel is edge-lit with infrared LEDs. As long as nothing is touching the acrylic, very little of that light escapes; it just reflects around inside the panel. When your finger presses against the acrylic, it scatters infrared light out the back, where it's visible to an infrared camera. Image processing takes care of detecting the fingertips and relaying their locations to the application software. Since the camera "reads" the whole display in parallel, it's easy to detect multiple fingertips at once, even those belonging to multiple users. All of this sensing goes on in the infrared spectrum, leaving the visible spectrum free for displaying interactive software.
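If you're curious what that image processing boils down to, here's a rough C++ sketch using OpenCV (this isn't Touchlib's code; the camera index and threshold/area values are placeholders you'd tune for your own rig). It thresholds each IR frame and reports the centroid of every bright blob, which is essentially the first step of fingertip tracking:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>
#include <iostream>

int main() {
    // Open the IR camera. Device index 0 is an assumption; an IR-filtered
    // webcam usually shows up as an ordinary capture device.
    cv::VideoCapture cam(0);
    if (!cam.isOpened()) return 1;

    cv::Mat frame, gray, mask;
    while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        // Fingertips scatter IR out the back of the acrylic, so they appear
        // as bright blobs against a mostly dark background. The threshold
        // value 200 is a guess; tune it for your LEDs and camera exposure.
        cv::threshold(gray, mask, 200, 255, cv::THRESH_BINARY);

        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL,
                         cv::CHAIN_APPROX_SIMPLE);

        for (const auto& c : contours) {
            double area = cv::contourArea(c);
            if (area < 20 || area > 2000) continue;  // skip noise and glare

            // Centroid of the blob = approximate fingertip position
            cv::Moments m = cv::moments(c);
            std::cout << "touch at (" << m.m10 / m.m00 << ", "
                      << m.m01 / m.m00 << ")\n";
        }
    }
    return 0;
}
```

A real setup also does background subtraction and maps camera coordinates onto the projected image, but the threshold-and-find-blobs loop above is the core idea.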
As you might imagine, there's a lot of software involved in translating the infrared finger presses the webcam sees into a usable operating system or application interface. You're not entirely on your own with this. An image processing library called Touchlib handles passing screen touch events to your C++ application. The community of Touchlib developers seems pretty active and has put together a number of open source sample applications that can help you get started.
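Conceptually, your application just implements a listener that gets callbacks when a finger goes down, moves, or lifts off. Here's a hypothetical sketch of that pattern; the names below are illustrative, not Touchlib's actual API, so check the Touchlib headers for the real interface:

```cpp
// Illustrative touch data: an ID that stays stable while a finger is down,
// plus position and blob size (names are placeholders, not Touchlib's).
struct TouchPoint {
    int   id;
    float x, y;    // normalized screen coordinates (0..1)
    float area;    // blob size, a rough proxy for pressure
};

// Your app implements callbacks like these; the library calls them for
// every tracked finger, so handling several fingers at once is automatic.
class MyMultitouchApp {
public:
    void fingerDown(const TouchPoint& p)   { /* start a stroke or grab a widget */ }
    void fingerUpdate(const TouchPoint& p) { /* drag, rotate, zoom */ }
    void fingerUp(const TouchPoint& p)     { /* finish the gesture */ }
};
```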
Full Article - Link
DIY Multitouch Display @ Instructables - Link
Touchlib - Link
Natural User Interface Group (multitouch developer community) - Link