
coding for emotional impact


setting targets left, then right; both in one line:


setting targets in a triangle:

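One way the triangle targets might be placed (a guess at what the clips showed, not the sketch's actual code): three points on a circle, 120° apart, starting from the top vertex.

```java
// Hypothetical helper: targets at the corners of an equilateral triangle,
// centered at (cx, cy) with "radius" r, first vertex at the top.
public class TriangleTargets {
    static double[][] triangle(double cx, double cy, double r) {
        double[][] pts = new double[3][2];
        for (int i = 0; i < 3; i++) {
            double a = -Math.PI / 2 + i * 2 * Math.PI / 3; // top, then every 120 degrees
            pts[i][0] = cx + r * Math.cos(a);
            pts[i][1] = cy + r * Math.sin(a);
        }
        return pts;
    }

    public static void main(String[] args) {
        for (double[] p : triangle(0, 0, 100))
            System.out.printf("(%.1f, %.1f)%n", p[0], p[1]);
    }
}
```

The flock's target would then cycle through these three points on a timer.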

setting targets in triangle_2:


setting targets in a triangle and bouncing curve:


bouncing curve:

 


controlling the sketch with Kinect:



Adjusting two variables of the flock: the magnitude and the speed limit of the vector. The speed limit gives different results when negative versus positive, while the magnitude is changed by hooking up curves (exponential, logarithmic, bouncing, sigmoid and linear).
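Those curve hookups might look something like this – a minimal plain-Java sketch, where the names match the curves above but the exact formulas are my assumptions:

```java
// Curve shapers: each maps a normalized input t in [0, 1] to a value
// in [0, 1], so any of them can drive the flock vector's magnitude.
public class CurveShapers {
    static double linear(double t)      { return t; }
    static double exponential(double t) { return t * t; }        // slow start, fast finish
    static double logarithmic(double t) { return Math.sqrt(t); } // fast start, slow finish
    static double sigmoid(double t)     { return 1.0 / (1.0 + Math.exp(-12 * (t - 0.5))); }
    static double bounce(double t)      { return Math.abs(Math.sin(3 * Math.PI * t)) * (1 - t); } // decaying bounce

    public static void main(String[] args) {
        for (double t = 0; t <= 1.001; t += 0.25)
            System.out.printf("t=%.2f lin=%.2f exp=%.2f log=%.2f sig=%.2f bnc=%.2f%n",
                t, linear(t), exponential(t), logarithmic(t), sigmoid(t), bounce(t));
    }
}
```

Because all five share the same [0, 1] → [0, 1] contract, they can be swapped freely without retuning the flock's other parameters.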

 


This is how it started, with the most simplified flocking algorithm; it had only attraction towards the target. Here, the flock is conducted with a Leap Motion, using hand tracking.
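A minimal sketch of that attraction-only step, written out in plain Java instead of Processing's PVector (the force and speed limits are illustrative, not the sketch's actual values):

```java
// One boid with a single behavior: steer toward the target.
public class Seek {
    double x, y, vx, vy;                       // position and velocity, start at rest
    static final double MAX_FORCE = 0.1;       // steering limit per frame (assumed)
    static final double MAX_SPEED = 4.0;       // velocity limit (assumed)

    void seek(double tx, double ty) {
        double dx = tx - x, dy = ty - y;
        double d = Math.hypot(dx, dy);
        if (d == 0) return;
        // desired velocity: straight at the target, at max speed
        double dvx = dx / d * MAX_SPEED, dvy = dy / d * MAX_SPEED;
        // steering = desired - current, clipped to the max force
        double sx = dvx - vx, sy = dvy - vy;
        double s = Math.hypot(sx, sy);
        if (s > MAX_FORCE) { sx = sx / s * MAX_FORCE; sy = sy / s * MAX_FORCE; }
        vx += sx; vy += sy;
        // clip speed, then move
        double sp = Math.hypot(vx, vy);
        if (sp > MAX_SPEED) { vx = vx / sp * MAX_SPEED; vy = vy / sp * MAX_SPEED; }
        x += vx; y += vy;
    }

    public static void main(String[] args) {
        Seek b = new Seek();
        for (int i = 0; i < 25; i++) b.seek(100, 50); // target could come from hand tracking
        System.out.printf("pos=(%.1f, %.1f)%n", b.x, b.y);
    }
}
```

In the real sketch the target coordinates come from the Leap's hand position each frame; everything else stays the same.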


Processing sketch: here.

However, after the user testing, I’ve decided to use the Kinect for motion tracking and conducting, to utilize more body parts for more complex motions, instead of just one hand.


Field report:

1. Testing a responsive Processing sketch

Hoping to see how people respond to what they see. Before that, how noticeable it is: will they see it or pass by? If they do, how interesting is it – will they stop, will they try to interact? How many times do they need to come by it before they get intrigued and interact?

How long do they interact? What is the interaction like? Will they come back, and how do they act the next time, once they are already familiar with it?

Note:

Testing was planned to be executed on one of the ITP screens: run the Processing sketch on the screen and attach the Leap Motion next to it. Ideally, I would get the screen on the 1st floor of the Tisch building, the one that is inside but faces the street, so that random passersby would be the users. However, that was impossible to do, as someone else’s project is running there until May 1st.

After that, I got permission to use one of the screens on the floor, in the hallway, which was satisfactory enough. Again, I encountered difficulties. I spent more than 2 hours trying to get the sketch running, ending with the conclusion that I’d need to ask for software updates on the screen computers – and for permission to do that, too. As that also would not be done in time, I used a projector for testing. This major change affected the results of the test (as I knew it would), but I was still hoping to get at least some feedback.

2. Interviewed persons’ info:

ITPers:

Random ITPers passing by, as they would every day. Testing day was a Tuesday, when mostly 2nd years are on the floor, working on their theses. Not as many students as on other weekdays. (So many factors.)

– Most of them just passed by with a glimpse, but walked on.

– 20% actually took a look for a few seconds to see what was going on – not only at the projection, but at everything around it: the projector, the Leap, the cables. They were more likely interested in what was going on in general than in what was on the screen. Of course, since the projector, cables and Leap were improvised and so obvious.

– And those 10% who were walking close enough to trigger the sketch, and were intrigued enough to try to interact, spent no more than 20-30 seconds. One reason is that the projector was projecting onto their backs. (Only one crouched down to try to play a bit more.) That, again, lasted no more than 30 seconds, with a hand right in front of the Leap.

3. Overall interpretation

Screen, screen, screen – it has to be on the screen. Even with a short-throw projector, facing a wall with something projected onto it is different from facing a screen with something ‘in it’. *Unless this is to be installed in a subway, projecting onto the opposite platform, where we want people to look across to the other side.

Replace the Leap with a Kinect. The Leap has too short a range, so a significant number of potential interactions is lost – much more than I thought. Also, when placing the sensor, Leap or Kinect, I should make sure it is well hidden, or at least placed so that it looks like part of the screen. The aim is to make people interact with the thing on the screen, not so obviously with the sensor itself.

More importantly, the Kinect will enable more interesting (possible) movements/actions than just hand movements, which stop being interesting in an amazingly short time.

The sketch itself should also be changed to possess more varied behaviors, so it can build more complex interactions – counting on more complex meaning more engaging.

The states of the sketch – when idle and when it ‘is after someone’ – should be more distinctive.

Stormy day: it starts with a few seconds of night, to emphasize and show that it is daylight. Later in the composition, I was trying to mimic lightning with fast darkening instead of flashing (because I had no other option; still, I think a sudden change is good enough to depict what is going on).

A nice day has started, and the birds are coming out. As time passes, more birds are flying and singing.. until the storm starts and the birds have to hide. Rather impatiently, they fly out again as soon as the rain eases.. The day goes on, interlacing those two.

The tan function seemed to be the most effective for dramatic effects – the pouring rain and the bunch of birds coming out later. Sine and sawtooth worked for ‘the rest of the day’.
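A sketch of what those three drivers could look like in plain Java – the period and the clamping value are my assumptions, not the composition's actual settings:

```java
// Periodic signal generators for the day cycle: sine and sawtooth for
// the calm stretches, a clamped tan spike for the dramatic moments.
public class DayCycle {
    static double sine(double t, double period) {
        return 0.5 + 0.5 * Math.sin(2 * Math.PI * t / period); // smooth 0..1 wave
    }
    static double sawtooth(double t, double period) {
        double p = t / period;
        return p - Math.floor(p); // ramps 0 -> 1, then snaps back
    }
    static double tanSpike(double t, double period) {
        // tan blows up near the quarter period; clamp it so the "storm"
        // intensity spikes hard but stays usable as a parameter
        double v = Math.abs(Math.tan(Math.PI * t / period));
        return Math.min(v, 10.0);
    }

    public static void main(String[] args) {
        for (int t = 0; t <= 8; t++)
            System.out.printf("t=%d sine=%.2f saw=%.2f tan=%.2f%n",
                t, sine(t, 8), sawtooth(t, 8), tanSpike(t, 8));
    }
}
```

The appeal of tan for drama is exactly its blow-up: intensity creeps, then explodes, which reads as a storm far better than a sine's gentle peak.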

inspired by Cats Video:


timing and pacing here:


a digression on the subject of cats videos, another one:

timing and pacing concept for the creature:

hurry ‘with purpose’; short breaks to ‘look around and change’;

finding itself ‘in a desert’ at one point – the slowest pacing, but the most ‘critical’ one.

Still working on this. I started at 2am last night, and thanks to my ‘efficiency rule’ I don’t carry my computer charger home : ) So far, this is the 1st iteration of the Creature sketch, with 1 simple recursion.

I wanted to affect the pattern and change in movement; I aimed for organic growth, to make it look like a living creature, not a machine (not sure why this matters).
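A guess at what “one simple recursion” might look like: a branching growth rule, where each segment spawns two shorter children until the length falls below a threshold. This version only counts segments instead of drawing them, so the structure is easy to inspect; the 0.66 shrink factor and the threshold are made up.

```java
// Hypothetical recursive growth rule for the creature sketch.
public class Creature {
    static int grow(double len, double minLen) {
        if (len < minLen) return 0;         // too short to keep growing
        // this segment, plus two children at reduced length
        return 1 + grow(len * 0.66, minLen) + grow(len * 0.66, minLen);
    }

    public static void main(String[] args) {
        System.out.println(grow(100, 10));  // segments in the grown structure
    }
}
```

In a drawing version, each call would also rotate and translate before recursing, which is where the organic, non-machine look comes from.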

An interesting thing happens after some time, as the creature starts eating itself. Like that homework task from a few weeks ago – to watch what happens after some time – I put constraints on speeding up/slowing down the sketch (slower is actually easier to keep watching). So how this recursion unfolds over time is:


It gets really slow afterwards – in fact, it goes slower and slower..


– Should I go to the kitchen and come back, how different would it be? No dynamic shifts.

– I’m saving frames to see how it changes, but I don’t even care about frame 211,000 any longer.. (still saving)

– Waiting for a change, not even sure if there will be one. Hope it’s not like the “machine in concrete”.

– Waiting for it to die. Tiresome.

– Like this is a pet.

– Finally it crashed at 266,000 frames (those are not actual frames, but the variable used to speed up/slow down the frame rate).


 

Looks like nothing much has happened.

Sounds like it’s time for a more engaging one.