Challenging Interfaces. July 5, 2006. Posted by reto wettach in biofeedback, innovative interfaces, physical interaction design.
1 comment so far
During the two teaching projects on alarm clocks, which I taught in Ivrea and Potsdam, we came across various “challenging interfaces”: interfaces that force the user into a state of heightened concentration and attention.
A well-known example is CLOCKY, a robotic alarm clock that runs away and hides, so that the user is forced out of bed to find it.
During the classes I was teaching, my students also came up with some interesting “challenging interfaces”:
Hayat Benchenaa developed SEFRA, an alarm clock hanging from the ceiling that is switched off by hitting it. Each time you hit the clock to snooze, it rises toward the ceiling, eventually forcing you out of bed to reach it.
Blanc-o-matic by Eva Burneleit and Katrin Lütkemöller is a blanket that is pulled 20 cm toward your feet each time you switch off the snooze function. It works only in winter.
Another commercial product with a “challenging interface” is the “Pattern Clock”, which forces the user to play a round of Simon Says.
Similar to the examples above is the Dead Man’s Switch used in trains, which guarantees that the operator is not incapacitated (or asleep). This function comes in different levels of complexity, such as requiring the operator to react to a beep within a short time.
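The core of the beep-and-acknowledge variant is just a timeout check. Here is a minimal sketch of my own (a simplification; real railway systems use several escalation stages, warning lights, and partial brake applications before a full stop):

```python
# Hypothetical dead man's switch vigilance check: the operator must
# acknowledge within `timeout_s` seconds of the previous acknowledgement,
# otherwise the emergency brake is triggered.

def vigilance_check(ack_times, timeout_s=5.0):
    """Return the time at which the brake triggers, or None.

    ack_times: sorted timestamps (in seconds) of operator
    acknowledgements, with monitoring starting at t = 0.
    """
    last_ack = 0.0
    for t in ack_times:
        if t - last_ack > timeout_s:
            # The gap was too long: the brake fired before this ack.
            return last_ack + timeout_s
        last_ack = t
    return None  # operator stayed responsive over the observed window
```

So `vigilance_check([2, 4, 6])` returns `None` (the operator kept up), while `vigilance_check([2, 9])` returns `7.0`, the moment the brake would fire.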
Another kind of “challenging interface” is the part of online registration processes where you need to prove that you are human by recognizing some strangely distorted words (images taken from the Yahoo Mail registration).
Are there more challenging interfaces? What can we learn from them? And why are we not using physiological sensors for this task?
To be continued…
Braille for Non-Blinds… June 14, 2006. Posted by reto wettach in biofeedback, innovative interfaces, physical interaction design.
add a comment
Oren Horev, one of the last generation of students at IDII, is exploring the potential of ShapeShifters in his MA thesis.
Inspired by the concept of the toy PinPression (here a digital version from 2003), Oren developed the idea of a tactile representation of information on a mobile phone. He suggests having a tactile display area on the back of the phone, which – from my point of view – makes more sense than the visual display, at least while talking on the phone. Oren calls his phone the TactoPhone.
In a video prototype, Oren explores different shapes for tactile icons, or Tactons, as he calls them.
Oren also developed application scenarios, which he shows in short videos. I especially like the idea of using the Tactons as input as well, by pressing them like buttons.
Watching one of the video scenarios, I was surprised to see how the user’s hand interacts with the device: the fingertips wait in one row until a tactile sensation arrives, and only then start exploring the surrounding area. I would have thought that the user would always try to grasp the entire area by putting as much of his hand on it as possible.
Oren's project is an interesting extension of Ivan Poupyrev's LUMEN, a working prototype of such a display combined with the idea of making each tactile pixel also a visual pixel.
Erfassen your world! June 6, 2006. Posted by reto wettach in biofeedback, innovative interfaces, mobile, physical interaction design, rfid.
1 comment so far
I am using the word "Erfassen", because in German it means "to comprehend", but literally it translates into "touching".
A lot of interaction research is trying to enhance the experience of blind people. The digital cane is – of course – one of the favorite places to start such a new experience:
The UltraCane is a cane that informs the user of obstacles ahead through vibrating buttons in the handle; different buttons indicate the direction of the obstacle. Here is a nice animation of how the UltraCane works.
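As a toy sketch of this feedback idea (my own illustration, not the actual product logic – the button layout, angle thresholds, and range are invented), the mapping from obstacle to vibration could look like this:

```python
# Invented sketch: choose which vibrating button to drive from the
# obstacle's bearing, and scale vibration intensity by proximity.

def cane_feedback(bearing_deg, distance_m, max_range_m=4.0):
    """Return (button, intensity) for an obstacle, or None if out of range.

    bearing_deg: obstacle direction relative to the cane (negative = left).
    intensity is in [0, 1]; closer obstacles vibrate more strongly.
    """
    if distance_m > max_range_m:
        return None  # nothing within sensing range
    if bearing_deg < -15:
        button = "left"
    elif bearing_deg > 15:
        button = "right"
    else:
        button = "ahead"
    intensity = 1.0 - distance_m / max_range_m
    return button, intensity
```

For example, an obstacle dead ahead at 2 m gives `("ahead", 0.5)`, while one far to the left at 1 m gives `("left", 0.75)`.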
Another project, presented at the conference “International Symposium on Intelligent Environments”, uses RFID technology for a similar purpose: SPOT-IT, by the university of Bratislava, Slovakia, proposes a system that provides blind people with contextual information about their surroundings. However, I am not so sure whether this is the right approach, especially as RFID tags have a rather short read range. Furthermore, I think such a project would require so many international agreements on standards that it will be difficult to make it come true (the fact that there are more than three sign languages for German alone shows how difficult agreement is in this field).
Beyond tactile feedback, sound is also an interesting medium for experiencing the world non-visually and navigating it. This reminds me of a project by Haraldur Unnarsson, a student in Ivrea, who used music to give directions in a car navigation system: by emphasizing the left or right channel of the car’s stereo system, the user is told which way to turn. Unfortunately this project is not findable online, only here.
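The channel-emphasis trick boils down to panning. A minimal sketch of how I imagine it (assuming constant-power panning, the standard way to shift a stereo image without changing perceived loudness; the mapping from turn direction to pan position is my assumption):

```python
import math

# Sketch: map a turn direction in [-1, 1] (-1 = hard left, +1 = hard
# right) to left/right channel gains using constant-power panning, so
# the music itself "points" the driver toward the turn.

def pan_gains(direction):
    """Return (left_gain, right_gain) for direction in [-1, 1]."""
    direction = max(-1.0, min(1.0, direction))
    angle = (direction + 1.0) * math.pi / 4.0  # 0 .. pi/2
    return math.cos(angle), math.sin(angle)
```

Going straight (`pan_gains(0)`) gives equal gains of about 0.707 per channel; a hard left turn (`pan_gains(-1)`) puts all the music in the left speaker.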
1 comment so far
A new technology has been introduced by Nike: the Nike+ shoe, which communicates wirelessly with your iPod nano. An accelerometer in the shoe sends data to the iPod, which then tells you how fast you are running or how far you still have to run.
This project reminds me of a research project done by the German research organization Fraunhofer in 2003: StepMan is a system that monitors the runner’s heart rate and then adjusts the tempo of the music (without changing the pitch) to either slow the runner down or make him run faster.
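The control loop behind StepMan can be sketched in a few lines. This is my own simplification – the target heart rate, the clamping range, and the linear mapping are invented, and the actual time-stretching (tempo without pitch change) is assumed to happen elsewhere:

```python
# Invented sketch of a heart-rate-driven tempo controller: if the
# runner's heart rate is above target, slow the music down (factor < 1);
# if below, speed it up (factor > 1), within a safe clamp.

def tempo_factor(heart_rate, target_hr=150.0, max_change=0.15):
    """Return a playback-tempo multiplier in [1 - max_change, 1 + max_change]."""
    factor = target_hr / heart_rate  # HR too high -> slower music
    lo, hi = 1.0 - max_change, 1.0 + max_change
    return max(lo, min(hi, factor))
```

At the target heart rate the factor is exactly 1.0; a runner pushing to 180 bpm would hear the music slowed to the clamp floor of 0.85.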
add a comment
“Whether speeding down a virtual street in Sony’s Gran Turismo or slaying Spyro the Dragon, researchers hope games such as these will improve the lives of those with attention-deficit hyperactivity disorder, commonly known as ADHD, or cognitive-processing difficulties.”
The idea is based on monitoring the brain activity of the child with a special helmet: once they zone out, the game stops responding.
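The gating mechanism itself is simple to picture. A hypothetical sketch (the attention scale, window size, and threshold are all invented for illustration; the article does not describe the researchers’ actual algorithm):

```python
# Invented sketch of attention-gated input: the game only accepts
# controller input while a smoothed attention measure from the EEG
# helmet stays above a threshold.

def gate_input(attention_samples, threshold=0.4):
    """Return True if the latest smoothed attention allows game input.

    attention_samples: recent attention readings in [0, 1], newest last.
    """
    if not attention_samples:
        return False  # no signal -> treat as zoned out
    window = attention_samples[-5:]  # simple moving average over 5 samples
    return sum(window) / len(window) >= threshold
```

A focused stream like `[0.8, 0.7, 0.6]` keeps the game responsive; once the readings drop toward zero, input is ignored until the child re-engages.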
Mitfühlender Handschuh (Empathic Glove). January 12, 2006. Posted by reto wettach in biofeedback, innovative interfaces.
1 comment so far
CHIP reports on a glove (developed somewhere within Fraunhofer) that measures skin resistance, temperature, and pulse, and can thus detect the user’s stress level.
More information on the topic can be found here.
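How might such a glove fuse its three signals into one stress estimate? Here is a purely illustrative sketch – the baselines, weights, and the assumption that stress raises skin conductance and pulse while cooling the skin are mine, not Fraunhofer’s published algorithm:

```python
# Invented sketch: combine skin conductance, skin temperature, and pulse
# into a single non-negative stress score (higher = more stressed),
# measured as weighted deviations from a resting baseline.

def stress_score(conductance_uS, temp_c, pulse_bpm,
                 baseline=(5.0, 33.0, 70.0), weights=(0.5, 0.2, 0.3)):
    """Return a stress score >= 0 for one sensor reading."""
    b_c, b_t, b_p = baseline
    w_c, w_t, w_p = weights
    return (w_c * max(0.0, (conductance_uS - b_c) / b_c)  # sweating raises conductance
            + w_t * max(0.0, (b_t - temp_c) / b_t)        # stress cools the skin
            + w_p * max(0.0, (pulse_bpm - b_p) / b_p))    # pulse goes up
```

A reading exactly at baseline scores 0.0, and a clearly stressed reading (high conductance, cool skin, fast pulse) scores higher than a mildly elevated one.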