Robotic Gadgets
January 26, 2007. Posted by reto wettach in gadgets, innovative interfaces, mobile.
Finally, researchers at the Soon Chung Yang University (Korea) have implemented a long-standing dream of ours: the merger of a mobile phone with a robot. The robot finds its power station by itself and automatically drives to the call receiver. It can also indicate whether a nice or a mean person is calling – through “pleasant and unpleasant motion patterns”. (Video here. Thanks, Andre)
The Nokia 888 is a shape-changing concept phone: it can communicate its status by changing shape, and two owners of the 888 can also send shapes, such as hearts, to each other.
Mobile Touch at Microsoft Research
October 31, 2006. Posted by reto wettach in innovative interfaces, mobile, physical interaction design.
Last week I had the chance to listen to Patrick Baudisch from Microsoft Research. At the Technical University in Berlin he gave a talk on “making sense on small screens” (slides). In his talk he introduced his “summary thumbnail”, something like semantic zooming: Patrick developed an algorithm that scales websites down to the screen size of mobile phones while still keeping the most important fragments of the text readable.
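The core trade-off is easy to sketch: if a text block is scaled so far down that its font becomes illegible, render a shorter fragment of it at a minimum readable size instead. The following is my own minimal illustration of that idea, not Baudisch's actual algorithm; the constants are assumptions.

```python
MIN_READABLE_PX = 9  # assumed minimum legible font size on the target device


def summarize_block(text, orig_font_px, scale):
    """Return the fragment of `text` that still fits the block's scaled
    width when rendered at MIN_READABLE_PX instead of the scaled-down size.

    If plain scaling already leaves the text readable, keep it all.
    """
    scaled_font = orig_font_px * scale
    if scaled_font >= MIN_READABLE_PX:
        return text  # already readable after scaling
    # The readable font is wider than the scaled one, so fewer characters
    # fit into the same block width; keep a proportional prefix.
    keep_ratio = scaled_font / MIN_READABLE_PX
    n = max(1, int(len(text) * keep_ratio))
    if n >= len(text):
        return text
    return text[:n].rsplit(" ", 1)[0] + "…"  # cut at a word boundary
```

A thumbnail renderer would apply this per paragraph, so the page layout stays recognizable while the surviving text remains legible.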
Patrick also talked about SOAP, a – as he said – very personal research project. He developed “a mouse-like pointing device that works in mid-air”. The project is based on the observation that, when buying a mouse, we actually only purchase half of a pointing device: the other half is the surface needed to operate the mouse.
SOAP is basically the technology of a wireless infrared mouse put into a sock: the user can move this sock, or skin, over the object containing the technology, which has the shape of a bar of soap. In the discussion afterwards somebody said that it reminded him of playing with a foreskin – an odd, but true observation.
Touch for Mobiles – Update
October 31, 2006. Posted by reto wettach in innovative interfaces, mobile, physical interaction design.
Touch-sensitive areas and screens are one of the hot directions for mobile interaction.
Not too new, but still interesting, is Sony’s concept of Pre-Sense (Rekimoto, Ishizawa, Schwesig, et al., 2003), which basically combines a touch-pad with keys. Amongst other features, this technology allows interfaces that show the user what will happen if he presses a key.
Quite similar is the phone PG2800 by the Korean company PanTech: this phone can recognize finger writing on top of the keys.
Another quite new patent looks into using the touch area surrounding the screen: Apple just filed this patent, which describes an “Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control”.
With these interaction principles, the screen area is not obscured by the interacting fingers. Furthermore, the screen does not get dirty or scratched from being used as an interface. (via hrmpf)
String-based Interfaces
July 14, 2006. Posted by reto wettach in innovative interfaces, making the invisible visible, mobile, physical interaction design.
A string is a nice way to represent a 1-dimensional set of data through length.
This alarm clock by Duck Young Kong is set by pulling a string; the length of the string represents the time remaining until the alarm goes off. This is interesting because input and feedback happen through the same medium – the string. I wonder why the designer did not add some kind of scale to the string so that reading it becomes easier (in an earlier post I describe a system which does this).
The string has also been one of the earliest examples of Ambient Interfaces: the LiveWire by Natalie Jeremijenko. In her case, she is not using the length to communicate information, but the activity of the wire: the more the wire dangles, the more network traffic is happening in the building at that moment. (image by Marek Plichta)
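The mapping behind LiveWire is simple enough to sketch. The following is my own reconstruction of the idea, not Jeremijenko's code, and the rate constant is an assumption: poll a packet counter and drive the wire's motor harder the busier the network is.

```python
class LiveWire:
    """Map network activity onto the duty cycle of a dangling wire's motor."""

    def __init__(self, max_rate=5000.0, smoothing=0.3):
        self.max_rate = max_rate    # assumed packets/sec for maximum twitching
        self.smoothing = smoothing  # exponential smoothing factor, 0..1
        self.rate = 0.0

    def update(self, packets_per_sec):
        """Smooth the incoming traffic rate and return a 0..1 motor duty cycle."""
        self.rate += self.smoothing * (packets_per_sec - self.rate)
        return min(1.0, self.rate / self.max_rate)
```

The smoothing keeps the wire from jittering on every traffic spike, so its motion reads as an ambient trend rather than a counter.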
An interface based on a retractable string was recently presented by Gabor Blasko et al.: in this case not only the length of the pulled string but also its angle in a polar coordinate system is used as input. Additionally, Blasko adds LEDs to the string to create a “1-D display”. Scenarios for working with this interface include reading mail and checking the calendar.
(images are courtesy of Gabor Blasko)
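To make the polar-coordinate idea concrete, here is a hypothetical sketch of how such a string input could be decoded, purely my own illustration rather than Blasko's implementation: the pull angle selects one of a few functions (sectors), and the pulled length selects a value within that function.

```python
import math


def string_input(dx, dy, n_sectors=4, max_length=300.0):
    """Map the string endpoint (dx, dy in mm, relative to the reel)
    to (sector index, normalized pull length 0..1).

    The angle picks one of `n_sectors` functions; the length picks a
    value within that function (e.g. which mail or calendar entry).
    """
    length = math.hypot(dx, dy)
    angle = math.atan2(dy, dx) % (2 * math.pi)      # 0..2*pi
    sector = int(angle / (2 * math.pi / n_sectors))  # 0..n_sectors-1
    return sector, min(1.0, length / max_length)
```

With LEDs along the string acting as a 1-D display, the normalized length could directly index the item currently highlighted.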
Sound Feedback Interfaces
July 14, 2006. Posted by reto wettach in innovative interfaces, mobile.
Due to the danger of interacting with devices “on-the-go” (a car might run you over, or you might miss saying hello to your boss), there is a trend towards interfaces that give feedback through sound.
An interesting one is SonicTexting, developed in Ivrea by Michal Rinott: joystick-based text input enhanced by sound. I like the quality of the sound feedback, especially as Michal developed different variations according to the expertise of the user.
Recently Apple filed a patent for a “talking iPod”:
“The new iPod will tell you what it is about to play, removing the need for users to look at the screen while selecting music, and making the device safer and easier to use while driving, cycling or in badly-lit locations.” (source)
New Mobile Services
July 5, 2006. Posted by reto wettach in innovative interfaces, mobile, physical interaction design, rfid.
In Vancouver one can finally pay parking fees by phone – powered by Verrus. Call a number and type in the number of minutes you want to park. A nice additional feature: if you leave early, you can call the number again and cancel the remaining minutes…
I wonder whether it would be nice to have gestures for these kinds of services – gestures as in table whacking.
Buscom from Finland is running a trial with Nokia on this issue: one can pay the bus fare with a simple gesture of the mobile phone.
Erfassen your world!
June 6, 2006. Posted by reto wettach in biofeedback, innovative interfaces, mobile, physical interaction design, rfid.
I am using the word "Erfassen", because in German it means "to comprehend", but literally it translates into "touching".
A lot of interaction research is trying to enhance the experience of blind people. The digital cane is – of course – one of the favorite places to start such a new experience:
The UltraCane is a cane which informs the user of obstacles ahead through vibrating buttons in the handle. Different buttons indicate the direction of the obstacle. Here is a nice animation of how the UltraCane works.
Another project was presented at the conference “International Symposium on Intelligent Environments” and uses RFID technology for a similar purpose: SPOT-IT, by the University of Bratislava, Slovakia, proposes a system for blind people which provides contextual information about their surroundings. However, I am not so sure whether this is the right approach, especially as RFID tags have a rather short range. Furthermore, I think such a project would require so much international agreement on standards that it will be difficult to realize (the fact that there are more than 3 sign languages for German alone shows how difficult agreement is in that field).
Beyond tactile feedback, sound is also an interesting channel for experiencing the world non-visually and navigating it. This reminds me of a project by Haraldur Unnarsson, a student in Ivrea, who used music to give directions in a car navigation system: through emphasis on the left or right channel of the car's stereo system, the user is told where to navigate. Unfortunately this project is hard to find online, only here.
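The stereo-emphasis trick can be sketched with a standard constant-power pan law. This is my own illustration of the concept, not Unnarsson's implementation, and the bearing range is an assumption: a bearing towards the next turn (negative = left, positive = right) is mapped to the gains of the two channels.

```python
import math


def pan_gains(bearing_deg):
    """Return (left_gain, right_gain) for a bearing in degrees.

    -90 puts the music fully in the left speaker, +90 fully in the
    right, 0 plays it centred. Constant-power panning keeps the
    perceived loudness roughly equal across the sweep.
    """
    b = max(-90.0, min(90.0, bearing_deg))       # clamp to the pan range
    theta = (b + 90.0) / 180.0 * (math.pi / 2)   # map to 0..pi/2
    return math.cos(theta), math.sin(theta)
```

As the driver approaches a turn, the navigation system would simply feed the current bearing into this function and scale the stereo channels accordingly, so the music itself drifts towards the correct side.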
Interaction with mobile devices and the real world
April 26, 2006. Posted by reto wettach in mobile, physical interaction design, rfid.
NFC (Near Field Communication) is what the mobile phone industry calls RFID for phones: a reader is built into the mobile phone and acts as a tag at the same time. This leads to a number of use cases (Nokia), such as: the hip teen can download music (Nokia) related to a movie by reading the tag on the poster, or the craftsman who needs to read meters (Nokia). French Telecom also has a nice movie on application scenarios such as paying with the phone or Location-Based Services.
Interesting is the feature that allows phones to read tags and act as a tag at the same time. This reminds me of an interesting interface solution by Rekimoto at Sony's CSL: he suggests FEEL, a gesture-based interaction to establish communication between two networked devices, so that pairing doesn't need to be performed manually.
Another interesting interaction method was also developed by Rekimoto: Pick-and-Drop. With the quite natural gesture of picking up screen-based objects (data) with a pen and dropping them onto another computer's screen, data can be transferred in an easy way.
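Conceptually, the pen only appears to carry the data: a shared server keeps a per-pen clipboard, and the drop on the second screen retrieves whatever that pen picked up. Here is a minimal sketch of that idea, assuming a shared server and pen IDs; it is my illustration of the concept, not Rekimoto's implementation.

```python
class PickAndDropServer:
    """Shared clipboard keyed by pen ID, as in the Pick-and-Drop concept."""

    def __init__(self):
        self._held = {}  # pen_id -> object the pen virtually "holds"

    def pick(self, pen_id, obj):
        """Called by the source screen when the pen lifts an object."""
        self._held[pen_id] = obj

    def drop(self, pen_id):
        """Called by the target screen when the pen touches down;
        returns the held object, or None if the pen holds nothing."""
        return self._held.pop(pen_id, None)
```

Because the transfer is keyed by the pen and not by the machines, the two screens never need to know about each other – the gesture itself does the addressing.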
Mathias Dahlström developed in his thesis at IDII a gesture-based language for sharing data (in his case: music). I particularly like his idea of sending data by making a throwing gesture towards the receiver.