Friday, June 15, 2012

Contextual Statement

Here is my contextual statement plus some pictures from the development of my project:


Contextual Statement

Marty Martin

        We are allowing a human to understand the behaviours of a worker ant and to compare them to those of a human, showing that a worker ant is set with specific instructions that serve a greater cause outside of the ant's understanding of its colony. The human will be immersed in their perception of what a worker ant's life is like by altering their visual senses into a virtual world. From the simulated ant's perception, the only choices are to scout for food and bring it back to the base, or to follow the paths of others to already-found food using visual 3D pheromone trails. An ant's strongest sense is smell and a human's is sight, so for the human to have a perception of what an ant's life is like, the human has to see what an ant smells: the ground, the food and any pheromone tracks.

Video Glasses
        The individual only understands what it is doing and will never be able to see the full extent of the system it is part of. The importance of the one ant can only be strengthened by other ants, and the one ant cannot sustain life by itself. That said, the colony could still last without the individual because of the number of ants working towards one unified goal, removing any individuality between the ants. The impact of the individual ant can only be seen from an outside perspective: a communication network seen through pheromones shows the impact of the one ant.

My attempt at an ultrasonic PICAXE transmitter
        For a human to be immersed in this ant world, they can see only their path and the direction of exploration they have chosen. This choice of path needs to be recognised instantly for the human to understand where they have chosen to go. This has been done in other installations, such as the work of Andrew Burrell, which gives the user choice in exploration: moving the head to look at certain areas shows them only the area they have chosen to look at.

The grid area of the virtual reality
        The speeds at which ants move are far greater than a human's; they are able to make their decisions faster, choosing which paths and food sources to take. As our installation is about understanding the ways of the worker ant, the speed of our simulation needs to represent that of the ants, which calls for the humans to be given basic, preset decisions, much like code.

MaxMSP patch final
        The humans are to be immersed in a world that alters their senses into their perceived virtual reality; it is about having the human act as a worker ant would. They will have the same goal in mind as an ant, with only the choice of following other 3D pheromone paths to virtual food or creating their own path to find new food and bring it back to the nest. Even an ant following these simple instructions is part of a larger plan of keeping the community supported and sustainable. One ant is all it takes to search for that food and allow the colony to continue. A simulated version of this allows the audience to observe and understand how these insects have lasted.


References


Andrew Burrell. Unity/Max Motion Tracking. MP4, 2010. http://miscellanea.com/unitymax-motion-tracking/.

“Ants More Rational Than Humans?” Journal. ScienceDaily, July 24, 2009. http://www.sciencedaily.com/releases/2009/07/090724144524.htm.

Roger Highfield. “Like Ants, Humans Are Easily Led.” Telegraph.co.uk, December 12, 2007, sec. Science News. http://www.telegraph.co.uk/science/science-news/3318321/Like-ants-humans-are-easily-led.html.

Jorina Fontelera. “How Fast Can an Ant Run?” Article. eHow, n.d. http://www.ehow.com/about_5365350_fast-can-ant-run.html.

Luke Houghton. “The Hall of Awesomeness» Why Are People Like Ants?” Article. The Hall of Awesomeness, January 3, 2008. http://lukehoughton.com/2008/01/03/why-are-people-like-ants/.

Simone Cacace and Emiliano Cristiani. Myrmedrome - A Real Ant Colony Simulator, n.d. http://www.not-equal.eu/myrmedrome/main_en.html.

“Virtual Reality and Visualization.” Stanford School of Medicine, n.d. http://cisl.stanford.edu/what_is/sim_modalities/virtual_reality.html.


Tuesday, June 12, 2012

Position Tracking

I have now finished the position tracking, with minimal testing. The spacing I am using is at a size I find reasonable for only two cameras. I ended up just having an x camera and a y camera. This turned out to be the easiest way, as I knew it would become a simple grid. The biggest problem I had was the distortion of view a camera has as the distance to the object increases: moving a set distance across the x axis close to the camera appears greater than moving the same distance further away. This would have ruined the experience for our virtual user, as walking the same distance would take longer in some places and shorter in others.
The distortion of x value in a camera
Fortunately, from my college days in maths, I was able to come up with an equation using the shortest and longest physical lengths of the x value:

correctedXvalue = Xvalue * (maxXvalue / (maxXvalue + maxYvalue - Yvalue))

This is not at all the right solution, but it will serve its purpose for our simulation. It was good to see my maths from college being put to use. This has allowed me to progress onto other important and urgent matters for our group, which is facing a daunting deadline.
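As a rough Python sketch of the correction (an illustration of the equation above, not the code running in the installation; the variable names follow the equation):

```python
def correct_x(x_value, y_value, max_x_value, max_y_value):
    """Scale a raw x reading by how far the tracked point is
    from the x camera (approximated here by the y reading)."""
    return x_value * (max_x_value / (max_x_value + max_y_value - y_value))

# At the far edge (y_value == max_y_value) the x reading is unchanged;
# closer to the camera it is scaled down.
print(correct_x(100, 50, 200, 50))  # factor 200/200 -> 100.0
print(correct_x(100, 0, 200, 50))   # factor 200/250 -> 80.0
```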

References

“Educate Yourself on Security Cameras & Surveillance Equipment”, n.d. http://www.wecusurveillance.com/cctveducate.

Friday, June 08, 2012

Room Evaluation

I have spent time evaluating our area for the simulation. It was a good chance to picture where everything goes and any interference we might get from the ant simulators. I was able to take measurements and create a 2D floor plan with dimensions so we can start to map out its virtual size.
Getting the dimensions
I have made progress with the connection from MaxMSP to Unity3D. I can now use colour detection in MaxMSP to get the x and y coordinates on screen and have them sent to Unity as the x and y coordinates in an old version of the ant simulation. I was able to do this by understanding what was happening in the C# code in Unity and then modifying it to my liking.
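The idea of sending tracked coordinates over the network can be sketched in Python (the host, port and plain-text message format here are my own assumptions for illustration, not the actual protocol used between MaxMSP and Unity):

```python
import socket

# Hypothetical host/port and message format: the real Max-to-Unity
# link defines its own UDP protocol, not this plain-text one.
UNITY_HOST, UNITY_PORT = "127.0.0.1", 8000

def send_position(x, y):
    """Send a colour-tracked (x, y) position to Unity over UDP."""
    message = f"position {x} {y}".encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(message, (UNITY_HOST, UNITY_PORT))
    sock.close()
    return message  # returned so the payload can be inspected

send_position(120, 45)
```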

Sending colour tracking data to Unity
While on a break I got a better idea about how the colour tracking will work, and made some sketches of lights attached to the backpacks, all at different lengths so they won't interfere with each other. The colour detection will work by detecting three colours at once. The part I am most concerned about is the USB extension cable distance and having one computer handle input from five webcams at once.

Friday, June 01, 2012

Max with Unity

There has been a great change in development in the last few days, with a few problems solved. All it took was finding and talking to the right people. After spending a decent amount of time seeing if it was possible to send data straight from Arduino to Unity3D, I started to think about using a medium that could solve two problems at once. As we will be using Max/MSP to track our users' positions, I thought it would be easier to look at sending Arduino data from Max into Unity. I already knew Arduino could send data to Max; now it was just a matter of testing how hard it was to get working with Unity.

I was able to find some videos of others doing the same thing. It eventually led me to a post made by Anneke Crouse, a current BCT member. She and her team partner Lidy Van Duerson were able to tell me how they used UDP as a way to send data, using the myu Max-Unity toolkit. After looking at the Unity forums about UDP, I found a user (Bjerre) with a reasonable amount of knowledge of UDP between Max and Unity. I was able to simply download his patch and package and follow his instructions. I can now use a colour detection patch and send the height and width of a colour seen on my Mac camera to Unity to change the size of a rectangle. This is seen in the video below.
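On the receiving side, the data arrives as UDP packets; a minimal Python sketch of what a receiver does (the port and the plain-text "width height" payload are assumptions for illustration, not the myu toolkit's actual protocol):

```python
import socket

# Assumed port and payload format for this sketch only.
LISTEN_PORT = 8001

def parse_size(message: bytes):
    """Parse a 'width height' UDP payload into a (width, height) tuple."""
    width, height = message.decode("utf-8").split()
    return float(width), float(height)

def receive_one_size():
    """Block until one size message arrives, then return it parsed."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", LISTEN_PORT))
    data, _addr = sock.recvfrom(1024)
    sock.close()
    return parse_size(data)

print(parse_size(b"160 90"))  # -> (160.0, 90.0)
```

In the real installation this parsed size would then be applied to the rectangle's scale inside Unity.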

 

My way of planning and researching looks to be developing as I start to understand alternative approaches to tackling a problem. This way of thinking will be important in the coming week if we hope to fully understand how our project will fit together.

  References


“[ANN] Myu Max-Unity Interoperability Toolkit V.1.0 Released « Cycling  ’74 Forums”, n.d. http://cycling74.com/forums/topic.php?id=18726.
“Max/MSP Communication?”, n.d. http://forum.unity3d.com/threads/4853-Max-MSP-communication/page2.
“Unity/Max Motion Track Test.” Vimeo, n.d. http://vimeo.com/13671529.
“Using Motion Tracking to Scale Unity Objects « Cycling  ’74 Forums”, n.d. http://cycling74.com/forums/topic.php?id=26981.