Posts under Multitouch
Recently the Virtual Graffiti Wall application (formerly Pixel Perfect) I built for Classy Event Group was featured on the US morning show Today. Check out the two-minute segment in the video below.
The app is built on Adobe AIR and is multi-user/multi-touch enabled. Features include:
- Stencils (with rotate, move and scale)
- Spray paint simulation with drips
- ‘Clip art’ stamps (with rotate, move and scale)
- Image gallery hot folder for live photobooth
- Built in green screen removal with user fine adjustments
- Customisable sounds for each brush
- Saving and printing of artwork
The system is available for rental worldwide at www.PhotoGraffitiWall.com.
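The green screen removal mentioned above is essentially chroma keying. The post doesn’t describe the actual implementation, so here’s a minimal sketch of one common approach, with the tolerance/softness parameters standing in for the "user fine adjustments" (all names and thresholds are illustrative assumptions):

```python
# Minimal chroma-key sketch: pixels close to the key colour become transparent.
def chroma_key(pixels, key=(0, 255, 0), tolerance=120, softness=60):
    """pixels: list of (r, g, b) tuples; returns list of (r, g, b, a).

    Colours within `tolerance` of the key colour are fully removed;
    colours within `tolerance + softness` fade out gradually, which
    gives softer edges around hair and shadows.
    """
    out = []
    for (r, g, b) in pixels:
        # Euclidean distance from the key colour in RGB space.
        dist = ((r - key[0]) ** 2 + (g - key[1]) ** 2 + (b - key[2]) ** 2) ** 0.5
        if dist <= tolerance:
            a = 0          # inside the key range: fully transparent
        elif dist >= tolerance + softness:
            a = 255        # far from the key: fully opaque
        else:
            # linear ramp between the two thresholds
            a = int(255 * (dist - tolerance) / softness)
        out.append((r, g, b, a))
    return out
```

Exposing `tolerance` and `softness` as on-screen sliders is one plausible way to let users fine-tune the keying per venue lighting.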
In the never-ending quest to get better blob tracking on my touch screen, I decided to add more cameras. A few months ago I added a second camera, but the two didn’t quite cover the whole screen – there was about an inch gap in the centre. I didn’t want to use wider lenses because of the distortion, so I decided to add two more cameras to bring the total to four, with each camera tracking a quarter of the screen.
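With four cameras, each camera’s blobs have to be mapped into one screen-wide coordinate space. A rough sketch of that mapping, assuming each camera reports normalized 0–1 coordinates within its own quarter (the quadrant layout here is an illustrative assumption):

```python
# (x_offset, y_offset) of each camera's quarter, in screen-normalized units.
QUADRANTS = {
    0: (0.0, 0.0),  # top-left camera
    1: (0.5, 0.0),  # top-right camera
    2: (0.0, 0.5),  # bottom-left camera
    3: (0.5, 0.5),  # bottom-right camera
}

def to_screen(camera_id, x, y):
    """Convert a blob's per-camera (x, y) to full-screen coordinates."""
    ox, oy = QUADRANTS[camera_id]
    # Each camera only sees half the screen's width and height,
    # so scale its coordinates by 0.5 before adding the offset.
    return (ox + x * 0.5, oy + y * 0.5)
```

A real multi-camera tracker also has to merge duplicate blobs where the camera views overlap at the seams, which is the harder part of the problem.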
I was trying to think of a good way to mount the cameras so that they were evenly spaced and I came up with the idea of using a piece of wood with a routed cavity to sit each camera in. The problem with this was that the wood would have to sit above the backlight of the LCD and so it would block some light. Then I remembered I had a spare piece of 10mm acrylic from my old 19″ FTIR touch screen. So instead of using wood I cut and routed the piece of acrylic so that the cameras had transparent mounting plates.
Here are some photos of my process…
Left: Acrylic measured so that the lens of each camera is perfectly spaced; Right: Test-routing the acrylic – I had to use the slowest speed and keep moving so the router bit didn’t gum up with melted plastic. Super messy!
This post has been sitting around in draft for a while now, so I thought I’d better publish it. I finally got around to recording a video demo of an app I’ve been working on for a client. I needed to combine my PixelBender paint mixing with blob size so that the brush is the size of the object touching the screen. The video below demonstrates the app using TUIO and CCV. The final version will most likely use GestureWorks and an Ideum screen.
One thing I have been wanting to do for quite a while now with my touch screen is use touch-size data to determine the size of lines drawn on the screen. The other day I finally got around to researching and experimenting with CCV. It turns out that CCV can send the width and height of blobs along with coordinates and other data, but it’s not turned on by default and there is no option in the GUI to enable it.
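Once the blob width/height arrives, turning it into a brush size is mostly scaling and clamping. A sketch of that mapping (screen dimensions and radius limits are illustrative assumptions; CCV reports blob size normalized to the camera view):

```python
SCREEN_W, SCREEN_H = 1280, 800
MIN_BRUSH, MAX_BRUSH = 4, 120   # brush radius limits, in pixels

def brush_radius(blob_w, blob_h):
    """blob_w/blob_h: normalized (0..1) blob size from the tracker."""
    # Use the larger dimension so an elongated touch still paints wide,
    # then halve it to get a radius from a diameter.
    size_px = max(blob_w * SCREEN_W, blob_h * SCREEN_H) / 2.0
    # Clamp so camera noise can't produce invisible or screen-filling brushes.
    return max(MIN_BRUSH, min(MAX_BRUSH, size_px))
```

The clamping matters in practice: a momentary tracking glitch can report a huge blob, and without an upper bound that becomes a screen-wide splat.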
Here’s a quick demo using a paint brush:
I’ve been trying to find ways to improve the performance of paint mixing in Flash, and I thought I could try Stage3D for hardware-accelerated graphics. But then I realised that Stage3D is optimised for polygons and 3D models, so it was probably not the best solution. I wanted to see if it was possible anyway, and thanks to the Starling Framework and a bit of help from Thibault Imbert’s book, it turns out it is possible using a RenderTexture on an Image object, but it’s not really possible to do any complex drawing. Starling’s Image object is the equivalent of the Bitmap class, which is used to display BitmapData. I made an example which you can see below.
By creating a RenderTexture for an Image object, you can use the RenderTexture’s draw() method to draw another Image’s texture onto it. This is similar to using BitmapData’s draw() to draw one bitmap onto another. But Stage3D doesn’t work with pixels, it works with textures mapped to triangles, so at this stage it’s not possible to use something like getPixel() to get colour data from the ‘canvas’.
Well, it’s been a while since I posted an update on my DSI coffee table. It is pretty much complete now. The last thing I have to do is get a piece of scratch-resistant plexiglas for the table top to create a single seamless top. I would also like to get black perspex for the sides so the table has a piano-black finish, but that’s a maybe for the future. The reason I have been holding off on the final top piece is that I am still experimenting with different lighting so blobs can be detected through the extra layer of plastic. Currently I am waiting on a second PS3 Eye camera so I can try a two-camera setup and hopefully get clearer, less distorted images for tracking.
PixelPerfect is a digital art wall app I built for Classy Event Group using Adobe AIR. After seeing a video I posted over at the NUI Group forums, they asked me to build a customised version of my stencilling and painting app for their digital art wall. PixelPerfect uses a new stencilling method I developed which is not only heaps faster, it also leaves paint on the stencils – something we hadn’t seen in previous stencil applications. The app can load background images and transparent foreground images, and supports saving and printing of creations. The service provided by Classy Event Group also includes a photobooth function: the app watches a folder for photos, and a DSLR saves images into that folder, letting the user instantly pull up their photo and draw on it.
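The post doesn’t explain how the paint-on-stencils effect works, so the following is only a generic sketch of one way to get that behaviour: each spray dab is split by the stencil mask, so pixels that land on the solid part of the stencil stick to a separate stencil layer instead of being discarded (all names here are hypothetical):

```python
def spray_dab(canvas, stencil_layer, mask, cx, cy, radius, colour):
    """Apply a square dab centred at (cx, cy).

    canvas / stencil_layer: dict mapping (x, y) -> colour.
    mask: set of (x, y) cells where the stencil material is solid.
    """
    for dx in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            p = (cx + dx, cy + dy)
            if p in mask:
                stencil_layer[p] = colour   # paint sticks to the stencil
            else:
                canvas[p] = colour          # paint passes through the cut-out
```

Because the stencil layer is drawn on top of the canvas, lifting the stencil away then reveals the clean-edged artwork underneath while the stencil itself stays covered in overspray.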
Here is a small update on the current state of my coffee table. It’s been nearly a year now since I first started and I’m still slowly working on it. Recently I decided to add some touch-sensitive hardware buttons at the edge of the screen. I researched a few different ways of doing this and decided to use Phidgets, since they seem to be the easiest to use with Flash (and other languages) and aren’t as low-level as something like Arduino. So I bought a Phidgets board and four touch sensors and installed two on each side of the screen to act as hardware buttons. I have adjusted my multitouch painting application so that these buttons hide/show tools, layers and other panels. Here are some photos of the installation in the table top:
Routing out the recesses in the table top to sit the sensors in:
Mounting the Phidgets controller:
Two of the Phidgets touch sensors sitting in place. They will be covered by 3mm of perspex which will eventually be the top layer of the table:
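On the software side, the interesting part of hooking sensors up as buttons is edge detection: you want the panel to toggle once per press, not continuously while a finger rests on the sensor. A sketch of that logic, with the sensor reading and threshold as hypothetical stand-ins for whatever the Phidgets API actually delivers:

```python
THRESHOLD = 0.5   # assumed normalized sensor reading above which = touched

class TouchButton:
    def __init__(self, on_press):
        self.was_touched = False
        self.on_press = on_press

    def update(self, reading):
        touched = reading > THRESHOLD
        # Fire only on the rising edge, so holding a finger on the
        # sensor doesn't toggle the panel over and over.
        if touched and not self.was_touched:
            self.on_press()
        self.was_touched = touched

# Example: one button toggles the tools panel.
panel_visible = {"tools": False}

def toggle_tools():
    panel_visible["tools"] = not panel_visible["tools"]
```

Through the extra perspex layer the readings will be weaker, so the threshold would likely need tuning per sensor.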
Here are some more updated photos of the coffee table. I wanted to post some shots before I head over to Japan in a few days. I managed to get it out of my bedroom and into the lounge for some better shots. Note that the top and sides are not final. Currently they are MDF painted matte black, but when I get back from overseas in a month I will replace the top and sides with gloss black perspex so it’s all shiny! Check out the new photos below.
As some may know, Microsoft has released info and a new website for Microsoft Surface 2.0. The new Surface looks great, and at just 4″ thick it is now a lot more adaptable than the table style of version 1. Customised legs can be attached to the unit, or it can be mounted to a wall. Surface 2 uses some new technologies that Microsoft has developed. The first is per-pixel IR detection: basically, each pixel on the screen has its own tiny ‘IR camera’, which enables the screen to see anything placed on top. Surface 2 runs on Windows 7 this time around instead of the custom Surface software of version 1, and now has integrated support for Windows Phone 7. The top surface is covered with a huge sheet of Gorilla Glass, the largest ever to be bonded to an LCD. And finally, the unit is being manufactured by Samsung. Check out the demo video below and the link to the new website.