Archive

Physical Computing

The first challenge was how to make the cubes stay in place after being hit. Somehow they always knew how to hop or roll away, and since we were projecting onto them, we needed them to be sturdy and fixed in place. After hours of sewing, this is how we did it:

(this is a .GIF, so be sure to click it if it doesn’t play right away)

karate cube gif

The body weight keeps the cube in place during hits, and we sewed a robe for two cubes so that they can be rearranged: stacked on top of each other (for adults) or placed next to each other (so the height suits kids).

IMG_8687 copy

IMG_8701 copy
IMG_8700 copy

We also got rid of the messy wires and sensors; in the coat that we made, we added a pocket for conductive foam. Using that and conductive thread (the sewing never ends), I think we did a good job making our own pressure sensor (thanks Antonius for teaching us how to do it).

IMG_8674 - Copy - Copy copy IMG_8679 copy IMG_8680 copy

The fabric is sewn with conductive thread on opposite sides of the foam, and the whole thing works exactly like an FSR. Look inside our cube:

IMG_8737 copy
IMG_8741 copy IMG_8739 copy IMG_8733 copy

We put an FSR underneath the board so that it changes the background music and starts the game (it works like the game’s start button; a rough sketch of that logic is below the photo).

IMG_8714 copy
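To give an idea of how that start button is handled on the Processing side, here is a minimal sketch of the logic. The names (startSensor, startThreshold) and the threshold value are made up for illustration, and it assumes the Arduino prints just this one value per line; our real sketch reads several sensors at once. The idea is the same: once the FSR under the board is pressed hard enough, flip a flag, and that is where the music and the game scene change:

import processing.serial.*;

Serial myPort;                 // the serial port
int startSensor = 0;           // last value read from the FSR under the board
int startThreshold = 100;      // assumed threshold; tune it to your own foam sensor
boolean gameStarted = false;

void setup() {
  size(800, 400);
  myPort = new Serial(this, Serial.list()[0], 9600);
  myPort.bufferUntil('\n');
}

void draw() {
  // latch the game on once the board is pressed hard enough
  if (!gameStarted && startSensor > startThreshold) {
    gameStarted = true;
    println("game started!");   // this is where the background track would switch
  }
  background(gameStarted ? 0 : 255);
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) {
    startSensor = int(trim(line));   // assuming the Arduino prints one value per line
  }
}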

Woon loves our red reset button (circuit inside) so there’s no doubt in whose room it will end up:

IMG_8727 copy IMG_8728 copy

sensor test and some sound effects:

.

.

WINTER SHOW:

 

 

.

logo

 

 


After making a circuit with multiple pressure sensors, the technical challenge was to write code in Processing and turn the serial data from the sensors into a meaningful sketch on the screen, since the project was about both a physical and a graphical interface. The goal was to make an interactive space, a game scene: a player on the stage and an audience ready to play next. The system is meant to be interactive: a physical object on the scene equipped with sensors, sound responding to it, and graphics projected onto the scene and the player. Conceptually the project was elaborate, and the code required the same approach. Mapping the sensors so their values become meaningful in the context of the game was not the hard part, but linking them to the sketch turned out to be; trying to use only the maximum values from sensors that constantly stream data demanded more variables. We also ran into the noisiness of the analog readings, and averaging those values turned out to be the solution. These were very new concepts in computing for us.
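Here is a minimal sketch of the kind of bookkeeping that took; the names, the smoothing factor, and the one-value-per-line serial format are assumptions for illustration, not our final code. A running average calms the noisy analog readings, and a separate variable remembers the peak between frames so a quick hit is not lost:

import processing.serial.*;

Serial myPort;            // the serial port
float smoothed = 0;       // running average of the raw sensor reading
float peak = 0;           // biggest smoothed value seen since the last frame
float smoothing = 0.9;    // assumed: closer to 1 means smoother but slower

void setup() {
  size(800, 400);
  myPort = new Serial(this, Serial.list()[0], 9600);
  myPort.bufferUntil('\n');
}

void draw() {
  background(255);
  float hit = peak;       // use the strongest value since the last frame...
  peak = 0;               // ...then start collecting again
  ellipse(width/2, height/2, hit, hit);   // stand-in for the real game graphics
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  int raw = int(trim(line));                                // assuming one value per line
  smoothed = smoothing * smoothed + (1 - smoothing) * raw;  // running average
  peak = max(peak, smoothed);                               // keep only the maximum
}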

.

first sketch: white background, 1 pressure sensor:

.

.

more sensors and a switch added:

.


.

Woonyung (lying down) testing sensor numbers:

.

.

and final testing!

.

We started with a cube: one of the many wonderful black firm-foam cubes on the ITP floor. Most of them usually serve as seats, but I have no idea where they came from or what they were actually made for. Some crazy project, maybe? Who knows, but they are so tactile, thanks to the material they are made of, that they can’t stay just for sitting…

Woonyung and me:

002 copy

.001 copy

.

Scan 2 copy

.Scan 1 copy

Truth be told, I’ve always been the one who doesn’t even press a button without reading the instructions first. And it is boring, but hey, how else am I supposed to figure out how a microwave works? Or a door? I had never thought that a door could be poorly designed, yet how illegible it becomes when it needs a ‘pull’ or ‘push’ label on it. Obviously, it can be. ‘Labels are never an excuse for bad design’, and even a label can be put in the wrong place. Plenty of room for misleading people, and plenty of things to think through; but seriously think through.

It looks like a job that takes real thinking, but I feel rather relieved that all those labels are not supposed to be there and that things can be legible just from their appearance.

Look: the toaster, bread on the table, a knife, maybe some butter? I think you can already see the movie rolling in your head of how this goes. Self-explanatory.

Installations should be ‘toaster, bread and butter’. And please, no messing around when someone is making their own lunch. Make it and leave it. Do the good thinking beforehand and let the interaction do the ‘talking’.

teapot

.

Love this drawing. It says everything about the design process and the mental image you get when you just glance at an object and take in all the information available from its appearance. Conceptual objects and the visibility of things; it seems like the most important design lesson ever.

Teapot, thanks a million! Now I’ve got to go make my design worth a hundred pictures.

first the famous ball as always:

then the Processing sketch with some tweaks:

after mapping:

the code:
import processing.serial.*;

Serial myPort; // The serial port
float bgcolor; // Background color
float fgcolor; // Fill color
float xpos = 130; // X-coordinate
float ypos = 210; // Y-coordinate
int radius = 20; // wheel Radius

float noteX;
float noteY;

float carAngle = 0;
float angleChange = 0.19;
float angleLimit = -6;
float angleBall;
void setup() {
// List all the available serial ports
println(Serial.list());

// I know that the first port in the serial list on my mac
// is always my Arduino module, so I open Serial.list()[0].
// Change the 0 to the appropriate number of the serial port
// that your microcontroller is attached to.
String portName = Serial.list()[0];
myPort = new Serial(this, portName, 9600);

// read bytes into a buffer until you get a linefeed (ASCII 10):
myPort.bufferUntil('\n');
size(800, 400);
}

void draw() {
background(255);
//
// if (button2 == false) {
// // angleChange = 0;
// carAngle = 0;

// else if (keyCode == UP){
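// rock the car: nudge carAngle every frame and bounce it back between 0 and angleLimit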
carAngle -= angleChange;

if (carAngle < angleLimit || carAngle > 0)
{
angleChange =-angleChange;
carAngle -= angleChange;
}

//wheels
//rear wheel
stroke(2);
//noStroke();
fill(#36322E); //biggest
ellipse(xpos, ypos, radius, radius);

fill(255); //white
stroke(255);
strokeWeight(1);
fill(#36322E);
ellipse(xpos, ypos, radius-7, radius-7);
ellipse(xpos, ypos, radius-11, radius-11);

noStroke();
fill(#36322E); //thiniest
ellipse(xpos, ypos, radius-18, radius-18); //inside

//FRONT wheel
noStroke();
fill(#36322E); //biggest
ellipse(xpos+330, ypos, radius, radius);

fill(255); //white
stroke(255);
strokeWeight(1);
fill(#36322E);
ellipse(xpos+330, ypos, radius-7, radius-7);
ellipse(xpos+330, ypos, radius-11, radius-11);

noStroke();
fill(#36322E);
ellipse(xpos+330, ypos, radius-18, radius-18); //inside if ever BLIN-BLIN

//body
pushMatrix();

translate(xpos, 210);

// float xpos=0;
// float y=0;
//

stroke(2);
fill(255);

rotate(radians(carAngle));

beginShape();
vertex(0, ypos-210-radius * 1.7); //1
vertex(0+radius*2.2/2, ypos-210-radius * 1.7*0.8); //2
vertex(130 + 242, ypos-210-radius * 1.7*1.1); //3
vertex(130 + 242+radius*0.6, ypos-210-radius * 1.7*0.7);//4
vertex(130 + 242, ypos-210+ radius/2);//5
vertex(130 + 242-radius*0.8, ypos-210+ radius/2);//6
vertex(130 + 242-radius*1.1, ypos-210- radius);//7
vertex(130 + 242-radius*0.8-radius*2.2, ypos-210- radius);//8
vertex(130 + 242-radius*2.2*1.7, ypos-210+ radius*0.6);//9
vertex(130+10, ypos-210+ radius*0.6);//10
vertex(0+radius*2.2/2, ypos-210- radius*0.6);//11
vertex(0-radius*2.2/2, ypos-210- radius*0.6);//12
vertex(0-radius*2.2*0.6, ypos-210+ radius*0.6);//13
vertex(0-radius*2.2*1.7, ypos-210+ radius*0.6);//14
vertex(0-radius*2.2*1.9, ypos-210);//15
vertex(0-radius*2.2*1.5, ypos-210 - radius * 1.4*0.9);//16
endShape(CLOSE);

//HOOD
beginShape();
vertex(0, ypos-210-radius * 1.7); //1
vertex(0+radius*2.2*0.9, ypos-210-radius * 1.7*1.6); //2
vertex(0+radius*6.6*0.9, ypos-210-radius * 1.7*1.9); //3
vertex(0+radius*6.6+radius*2.2*0.8, ypos-210-radius * 1.7); //4
vertex(0+radius*2.2/2, ypos-210-radius * 1.7*0.8); //2
endShape(CLOSE);

popMatrix();

// Draw the shape
//ellipse(xpos, ypos, 20, 20);
}

void serialEvent(Serial myPort) {
// read the serial buffer:
String myString = myPort.readStringUntil('\n');
if (myString != null) {
// println(myString);
myString = trim(myString);

// split the string at the commas
// and convert the sections into integers:
int sensors[] = int(split(myString, ','));

// xpos = map(sensors[0], 430, 580, 0, width);
// ypos = map(sensors[1], 30, 200, 0, height);

for (int sensorNum = 0; sensorNum < sensors.length; sensorNum++) {
print("Sensor " + sensorNum + ": " + sensors[sensorNum] + "\t");
}
// add a linefeed after all the sensor values are printed:
println();

// make sure there are three values before you use them:
if (sensors.length > 2) {
xpos = sensors[0];
carAngle= sensors[1];
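// map the raw readings into screen space: a harder press pushes the car toward the left edge and tilts it further (up to angleLimit)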
xpos = map(sensors[0],35, 200, width +130, 0);
carAngle = map(sensors[1],44,200,0, -6);
// the switch values are 0 and 1. This makes them 0 and 255:
fgcolor = sensors[2] * 255;
}
}
}

 

iOS 7 on the iPhone is a mixed experience according to users who made the jump as soon as Apple released the latest software update.

 

The biggest iOS 7 complaints are lag and performance, but battery life is also an issue. The video highlights some of the problems with using iOS 7 on the iPhone 4.

For daily tasks, each user has developed their own system for how to use the phone. Consequently, the applications used are different and things are done differently.

The phone is used probably more than any other device, so it is important that it is very efficient to use.

When a phone is used daily, motor skills and logic are invested into routine. Tasks become automatic. Automation saves time. So any change in the interface, even a minor one, means that extra time will be spent on re-learning and breaking in the ‘new stuff’. An update that changes the interface in that way is inefficient.

With iOS 7, such changes apply to applications that are used very frequently every day (such as the calendar or messages). It is all the more frustrating to disrupt that level of automation. What appear to be the main difficulties are very simple tasks, such as deleting messages (deleting one by dragging is no longer available) or navigating Google Calendar (as it is different now).

On the other hand, each update also comes with better and upgraded things, and the new iPhone interface has made some things easier than before. All the users listed adjusting the brightness as the easiest thing ever. It is essential that they do not waste time on such basic tasks; they are visible and simple.

All these wonderful, commonly used applications are supposed to help us save time and produce faster, superior results. Mapping is the relationship between what you want to do and what appears to be possible; in this case, what used to be possible no longer seems to do its job. When an action has no apparent result, you may conclude that the action was ineffective. So you repeat it. You repeat it again. Wasting time leads to frustration.

Only poor design allows frustration, and here the frustration comes from changing adopted habits. But this is mostly my own conclusion, and I am not an iPhone user. In the video you’ll see users barely complaining about that. They easily got used to the new interface, the new photo arranging and the extra options! They had no hard time adapting to the new stuff. Apparently changing habits is not such a problem if you are offering an improved thing. The problem stayed with functions that are no longer available and that were simpler than the new ones. Deleting messages with a swipe, cool! But now it’s gone, noo. That is a big deal.

There are also camera changes that some people just do not like, while others are enthusiastic about them. Although both sides are Instagram users, their opinions differ. Obviously, emotions are on trial here, not functionality. Yeah, “beautiful things do work better”.

I LOVE that in the middle of recording I asked for the brightness to be lowered, and it took like a millisecond to get done! Simple commands, don’t change them.

.