Monday 23 March 2009

Moving on...

Well it looks like my efforts to fix the problem haven't worked. I've re-installed the software and basically done everything I can think of, bar re-installing my operating system, to fix the problem. Mysteriously this bug has arisen with no major alterations on my part, but I do think a few MS updates may be the culprit.

Thankfully I found a set of Flash files that do the job I'm trying to do, and I know they already work because I did a test run on them before using tbeta. So this will be my method of delivery for the time being, though I'm hoping that when I migrate to a laptop it will fix my issues, as the code I had looked pretty neat.

So now I have to set up the final bit of this project, with a projector and laptop free-standing so it can be taken apart and moved around easily. I also need to test the software on some level, so that will be undertaken as well.

Ultimately the conclusions and outcomes are purely qualitative and very subjective, but one thing I have seen so far with this software is a massive movement towards a new type of electronic art, which is quite amazing and has endless possibilities. I see the medium moving less towards technical areas like games design or production and more towards this free-flowing creative area, though I think both areas would come to fruition if the right people were interested in using this hardware/software. However, Autodesk are working with multi-touch screens, so we will see!

Thursday 19 March 2009

Touch Sensing Woes

For the past few days I have been trying to get one of three examples running effectively.

I'm having big problems getting my touch sensing software (tbeta) to work in harmony with the application. When I build and run the executable I get an error where it won't even load the OpenGL window; it just stops in the DOS window, stating the same error code.

I'm switching to touchlib now and praying that works. I've also added some forum posts to the NUI Group and openFrameworks forums, but most of this software is developed in Xcode (the C++ coding application on a Mac), not on a Win32 machine. That isn't the culprit, though.

I've made extensive changes to the code, and got a working re-install of Code::Blocks going tonight to try and rectify a litany of issues. Other major problems were with VideoGrabber.cpp and QuickTime, but these were localised to a set of functions which had not been initialised. Commenting them out seemed to do the trick, but to be honest I think it only broke the file.

The annoying thing is that this has all started to happen even though the workflow worked a month ago; now it seems to have stopped working altogether. My deduction is that the FLOSC Java application that comes with tbeta (it takes the touch coordinates tbeta sends out, holds them on a localhost server and then fires them back out to any program that asks for them) is the culprit, causing a networking crash. A thread on the openFrameworks forum appears to back all this up.
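Since the suspicion is a networking failure between FLOSC and my application, one quick sanity check is whether anything is actually listening on the local port at all. Below is a minimal Win32 diagnostic sketch, not part of my project; the port number 3000 is only the usual FLOSC default for its Flash-facing socket as far as I know, so treat it as an assumption and swap in whatever the setup really uses.

```cpp
// Diagnostic sketch: test whether a local TCP port (e.g. FLOSC's) accepts connections.
// Win32 / Winsock. The port number is an assumption; change it to match the setup.
#include <winsock2.h>
#include <cstdio>
#pragma comment(lib, "ws2_32.lib")

int main() {
    WSADATA wsa;
    WSAStartup(MAKEWORD(2, 2), &wsa);

    SOCKET s = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr = {};
    addr.sin_family      = AF_INET;
    addr.sin_port        = htons(3000);                 // assumed FLOSC port
    addr.sin_addr.s_addr = inet_addr("127.0.0.1");      // localhost only

    if (connect(s, (sockaddr*)&addr, sizeof(addr)) == 0) {
        printf("Something is listening on the port.\n");
    } else {
        printf("Nothing answered - the relay probably isn't running.\n");
    }

    closesocket(s);
    WSACleanup();
    return 0;
}
```

If nothing answers, the relay itself isn't up, which would point the finger away from my openFrameworks code.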

More to come...

Tuesday 17 March 2009

More to come...

Well it's nearly time to start completing the hardware setup and getting the programming to work together in effective synergy. As posted so far, this has been difficult, but attempts will be made to adapt open-source examples and see if an effective application can be created.

Wednesday 11 February 2009

openframeworks

I've managed to get tbeta to work with openFrameworks, the C++ framework I'm using for touch interfacing. But interactivity is an issue. I learnt OpenGL 2.0 a few years ago before I went to university, but my understanding is a little too basic.

I have to find a way of allowing the user to switch modes (to do different things), though I could place this interactivity on a key on the keyboard. But even if I did, I wouldn't be sure how to change modes, as this involves looping and the use of boolean (true/false) arguments... which I used to get wrong.
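As a reminder to myself of how simple it could be, here's a minimal sketch of a keyboard-driven mode switch in an openFrameworks-style app. This is not my project code; the class and member names (testApp, paintMode) are placeholders borrowed from the standard openFrameworks examples.

```cpp
// Minimal sketch of keyboard mode switching (placeholder names, not project code).
#include "ofMain.h"

class testApp : public ofBaseApp {
public:
    bool paintMode;                       // true = paint mode, false = image mode

    void setup()  { paintMode = true; }   // start in paint mode

    void keyPressed(int key) {
        if (key == 'm') {
            paintMode = !paintMode;       // flip the mode each time 'm' is pressed
        }
    }

    void draw() {
        if (paintMode) {
            ofDrawBitmapString("paint mode", 20, 20);   // painting behaviour goes here
        } else {
            ofDrawBitmapString("image mode", 20, 20);   // image behaviour goes here
        }
    }
};
```

The same idea extends to more than two modes by swapping the boolean for an integer or an enum.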

Hopefully I'll find some working code for an image loader, or something similar soon. I will post more when that time comes.

Tuesday 10 February 2009

Surface...more important than I thought

Program tests were good today, albeit limited. I now see why the surface (material) is so important. The surface material has to allow a finger to slide over it with ease (acrylic provides too much friction), and currently I am interfacing directly onto the acrylic. I even tried gloves to get around that, but they didn't displace the IR field.

So I need to come up with a few project flows and then try working on an application to bring together a few features. As always the community is fantastic. I'm now using openFrameworks with TUIO, which is part of tbeta's blob tracking element. This is what allowed me to interface with the software running on the PC.

Testing all this will be difficult in the time I have available, but I hope to get a program up and running soon for show.

It works!...again

Right, so I've fixed the first programming problem. First of all I should say that the screen is now working and I can configure it properly as well. I am still missing a projector, which isn't great, and I am also missing a laptop for portability, but that can wait. One less thing to worry about, though.

The hardware interfaces with the infrared camera, and its feed is interpreted by the software on the computer. I had already been pointed towards an API/framework (call it what you will) called openFrameworks by developers at Autodesk, so I went to check it out. To cut a long story short, I chose this as my end solution. I used the tbeta and MTmini software to configure the hardware and test the principles of the infrared hardware before progressing.

Yesterday I found out that I couldn't just interface directly with openFrameworks (Which is built in C++) without having something else handle the multi touch interaction, which was being picked up by the infra red camera. Thankfully the community is fantastic and someone had already had this problem.

The tbeta software handles multi-touch movements on the screen and configures them, and it has a facility to send these movements out through a localhost service on a specific port (this basically takes the movement information sent out by tbeta and waits for another application to ask for it). So this was the fix to my problem, because it would do all the hard work for me... famous last words! haha
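For anyone curious what "asking for it" looks like from the openFrameworks side, here's a rough sketch of polling that local port with the ofxOsc addon. The port 3333 and the /tuio/2Dcur address are the usual TUIO defaults rather than settings pulled from my own project, so treat them as assumptions.

```cpp
// Sketch: polling tbeta's TUIO/OSC output on localhost with the ofxOsc addon.
// Port 3333 and the /tuio/2Dcur address are the common TUIO defaults (assumed).
#include "ofxOsc.h"

class TouchListener {
public:
    void setup() {
        receiver.setup(3333);                 // listen on the port tbeta sends to
    }

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(&m);      // pull the next queued message
            if (m.getAddress() == "/tuio/2Dcur") {
                // finger/cursor data (set, alive, fseq bundles) lands here
            }
        }
    }

private:
    ofxOscReceiver receiver;
};
```

In practice an addon can wrap this parsing up so your app only sees touch events, which is the "hard work" being done for me.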

Well, it wasn't so bad in the end. After a full day of tweaking and trying to figure out why the libraries weren't linking properly in Code::Blocks (the C++ IDE I am using), I finally got it to work. But I had to update the framework, the addons I needed, and my project links.

So now I have a lovely blank window, but I hope to have paint or something much more interesting to show for it very soon!

Monday 9 February 2009

The acrylic frame

Well this weekend I finished the frame that will hold the acrylic screen (See picture). It cost me about £5 from the timber yard + alterations and cutting. If I had more time I would build a table like so many others have, but I don't.



I would have got more done, but I was working on a novel this weekend, which I'm glad to say is finished, though I am currently re-drafting it (see Dominic Took.com).

Today I am going to re-calibrate the screen using the software provided with tbeta, which interfaces with a Flash application to provide interactivity in the form of a multi-touch screen (it's pretty neat!). I hope to get a half-decent video up on YouTube sometime soon.

I will let you know how I get on...

Thursday 5 February 2009

Optimised and Working!

The electrical tape worked!

After numerous changes today, I've managed to optimise all the blobs so they work effectively. I can drag a finger over the surface and the blobs get picked up reliably. I can also resize images and rotate the 2D global desk using my fingers. So all the basic functionality I need is working (yippee!)
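For reference, the resize-and-rotate behaviour boils down to some simple two-finger geometry. This is only a sketch of the idea, not the code I'm actually running: given a pair of touch points before and after a movement, the change in distance between them gives the scale and the change in angle gives the rotation.

```cpp
// Sketch of the two-finger gesture maths (illustrative only, not my project code).
#include <cmath>

struct Point { float x, y; };

float distance(Point a, Point b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y));
}

// Returns the scale factor; writes the rotation (radians) into 'rotation'.
float twoFingerTransform(Point oldA, Point oldB, Point newA, Point newB, float &rotation) {
    float scale    = distance(newA, newB) / distance(oldA, oldB);
    float oldAngle = std::atan2(oldB.y - oldA.y, oldB.x - oldA.x);
    float newAngle = std::atan2(newB.y - newA.y, newB.x - newA.x);
    rotation = newAngle - oldAngle;
    return scale;
}
```

Apply the returned scale and rotation about the midpoint of the two fingers and the image follows your hands.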

I do have some videos, but they're very large, so I'm attempting to re-encode them via After Effects; if it works I will post them to YouTube.

I intend to build a contraption to keep the surface still sometime soon; it's rather heavy at 2-3 kg, so it does need something fairly robust. I also need to get a projector to make interaction with the blobs complete and fully intuitive.

More updates will follow I bet...

Blob tests...

The setup works, but its configuration is a little unstable, though there are a few fixable reasons for this. I tested the blobs first of all (blobs are the impressions made on the acrylic touch surface that disrupt the infrared field). The first test was successful: I managed to create blobs just by using my fingers on the surface, although I had to use the opposite side to the one I expected, which still has me a little confused.







That was last night; today I've been trying to re-configure the hardware. I set everything up on a more stable surface, though I have nothing built to hold the screen in place... which is going to become annoying. I've found out today that the LED strip at the edge is interfering with the software picking up the blobs effectively, and without fixing that I can't really go forward.

So now I am going to try taping off the LED bleed with electrical tape.

Wednesday 4 February 2009

Moment of truth!

The surface is now complete. The first image shows the fabric stabiliser stretched over the surface of the acrylic, with the projection fabric stretched on top of that layer. Now I'm ready to add the LEDs, and tonight I will know if all the parts (webcam, surface, software) actually work properly.








It's the moment of truth...

Tuesday 3 February 2009

LED Test = Green Light, Surface too

Well I've blacked out the room I'm making this in (So that the camera only picks up the infra red light from the touch screen and nothing else). I've started to make the surface, which I'm glad to say is coming along better than expected.

So now I'm about to put another layer of fabric onto the acrylic board which makes up the touch interface. It's pretty heavy at 10 mm thick and only about 25 inches across (if you think of it like a monitor screen). I mention this because the weight isn't good for portability, but these touch screens should really be mounted inside something; this one is merely for test purposes.

Once I've added the first and second layers of fabric I'll be ready to add the LEDs, which I'm glad to say work!

It Worked!

I really didn't expect the infrared fix to work, but I kept chiselling with the Stanley knife... ye of little faith, I know!

Photos:

My phone (Both my monitor and my phone were turned on)





More updates to come...

Infra red camera!!

I've finally got all the parts: LED strip, camera, acrylic, fabric for the compliant surface and other little bits. I found out yesterday that the camera drivers were very poor (I'm guessing a cheap Japanese or Chinese make), and the infrared feature was actually a graphical effect, not embedded in the camera as I had first expected.

But by a stroke of luck I have just prised out the infrared-blocking filter over the lens (see image). In the end I had to crack it, as it was very brittle, to get it out. So, third time lucky in the end! (See pic.)

Now I just need to add an infrared-pass filter, which I believe can be made from the magnetic film inside a floppy disk... or from exposed negatives with no picture on them. I did have some negatives, but they aren't quite the right type, so I am going to use the floppy disk film.

With that cut and placed into the camera, I should finally have an infrared camera. I will update soon, if I manage to get the infrared LEDs working.

Friday 30 January 2009

Photos!

I'll be uploading photos to a new flickr page here

Beats reading about it all!

Waiting on the hardware...

After buying slightly the wrong kit (not the wrong parts, but the wrong specification), I'm now waiting on an infrared 8-megapixel web camera to pick up the infrared distortions and interface with the software, a piece of acrylic 450 mm by 500 mm, and the 850 nm infrared LEDs, which are being shipped from America.

Fingers crossed there are no new problems until I try and get the software working again! That's not to say I haven't compiled openFrameworks in Code::Blocks, but I need to see if the software interfaces with the webcam properly. So that is the next test.

I'll keep you updated on the parts and I should have some picture progress to follow quite soon.

The Intuitive Interface

So what is this all about then? Well, this blog charts the progress of my efforts to find a solution to the problem of interface design in computer game studio development pipelines. What are those? Well, computer games, or video games, are made up of a series of creative steps, or building blocks. These make up the pipeline. To summarise them broadly, you could say they are: pre-alpha or concept design, alpha development, beta development, and release. But this summary doesn't really point out all the key areas.

What I am trying to do is build a device modelled on multi-touch screens like this one (Multi Touch), and then attempt to prove the hypothesis that an organic (off the cuff, with no rigid structure or thought process), intuitive interface such as this would give developers enhanced ways of working when designing games: doodling in real time and playing back quick animated scenes, or adding video, pictures and sound from a hard drive or external storage location to prototype ideas. There are, in short, many things you can do with the software. I have to thank the people at openFrameworks, because they developed the software under the LGPL 3.0 license, which means it's free to use, though there are catches further down the line; I think these only apply to rigid commercial bodies. Everything runs in a very simple C++ IDE called Code::Blocks, though it will run in others as well, and so the solution is cost-effective (even if the hardware might not be!)

So I will be updating this blog hopefully every day with updates about what's going on, until then, happy reading!

Dominic Took