Introduction
With the arrival on the market of products like the Nintendo Wii, the Apple iPhone and Microsoft Kinect, developers have finally started realising that there are several ways a person can control a computer besides keyboards, mice and touch screens. These days there are many alternatives, all of them based on hardware sensors; the main difference between them is how much they depend on software computation. Solutions based on computer vision (like Microsoft Kinect), for instance, rely on state-of-the-art software to analyse the pictures captured by one or more cameras.
If you are interested in the technical side of this, I recommend having a look at the following projects: Arduino, Processing and OpenFrameworks.
Usage with Ubuntu
During a small exploration we did internally a few months ago, we thought about how Ubuntu could behave if it were more aware of its physical context: not only detecting the tilt of the device (as iPhone apps do), but also analysing the user’s presence.
This wasn’t really a new concept for me; in 2006 I experimented with the idea of a billboard sensitive to the user’s proximity. I reckon there is value in adapting the content of a screen based on the distance of whoever is watching it.
We came up with a few scenarios. They are far from being fully developed and specified; hopefully they will open some discussions or, even better, help to start some initiatives.
Lean back fullscreen
If the user moves further from the screen while a video is playing in the focused window, the video automatically goes fullscreen.
Fullscreen notifications
If the user is not in front of the screen, notifications could be shown fullscreen so the user can still read them from a different location.
Windows parallax
Since this is the year of 3D screens, we couldn’t omit a parallax effect on the windows. A gesture from the user could also trigger the appearance of the launcher (see the prototype below).
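To make the parallax idea a bit more concrete, the head’s horizontal position could drive a per-window offset proportional to how deep each window sits in the stack. The snippet below is only a sketch in Processing syntax; the function name, the 0..1 ranges and the 60 px constant are illustrative placeholders, not values from our exploration.

// Hypothetical mapping from head position to a per-window parallax shift.
// headX: horizontal head position, normalised to 0..1 across the camera frame.
// depth: how far back the window sits in the stack (0 = frontmost, 1 = rearmost).
float parallaxShift(float headX, float depth) {
  // Windows further back shift more as the head moves, creating an illusion of depth.
  return (headX - 0.5) * depth * 60;  // up to +/-30 px of shift; the range is arbitrary
}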
Prototype
With only a few hours available, I mocked up something very quickly in Processing using a face-detection library (computer vision).
Although it would be hard to detect the horizontal position of the user’s head without a camera, we are in no way defining the technology required: the proximity could in fact be detected with infrared or ultrasound sensors.
Parallax and fullscreen interaction via webcam from Canonical Design on Vimeo.
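If you want to experiment yourself, the sketch below shows roughly how such a prototype can be put together. It is not the code used in the video (the prototype’s source is linked further down in the comments); it assumes the OpenCV for Processing library (gab.opencv) together with Processing’s video library, and the face-width threshold, the mapping range and the mock window are placeholders.

import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;

void setup() {
  size(640, 480);
  cam = new Capture(this, 320, 240);               // low resolution keeps detection cheap
  opencv = new OpenCV(this, 320, 240);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);  // Haar cascade for frontal faces
  cam.start();
}

void draw() {
  background(40);
  if (cam.available()) {
    cam.read();
  }
  opencv.loadImage(cam);
  Rectangle[] faces = opencv.detect();
  if (faces.length == 0) {
    return;  // no face this frame; a real implementation would keep the previous state
  }

  Rectangle face = faces[0];
  float headX = map(face.x + face.width / 2, 0, 320, 1, 0);  // mirrored 0..1 head position
  boolean leanedBack = face.width < 60;            // smaller apparent face = user further away

  fill(200);
  if (leanedBack) {
    rect(0, 0, width, height);                     // lean back: the mock window goes fullscreen
  } else {
    float shift = (headX - 0.5) * 60;              // simplified single-window parallax shift
    rect(120 + shift, 90, 400, 300);               // the mock window follows the head sideways
  }
}

The same face rectangle gives you both signals: its centre drives the parallax, and its width acts as a rough proximity estimate.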


Wow! This stuff looks amazing! Good to see the design team are thinking towards the future :)
This looks awesome! Sounds like a lot of work, but it’s something not found in any other system either. Very innovative!
AWESOME!!!
Another use for this could be to change window focus based on what the user is looking at. In dual-screen mode, that could be very useful.
Simply awesome. Ubuntu is made to be the perfect OS of the future. We need patience and money!
So is this something that Canonical actively plans to develop and put in Ubuntu in the near future, or just a “It would be cool if…” kinda thing?
@dieki, at the moment we have just done this little exploration, and it’s not planned to be developed in Ubuntu any time soon. As I mentioned in the post, it would be great to see what the community thinks and whether they can take this further. That said, it doesn’t mean the Canonical Design team won’t come back to it at some point in the future; we’ll try to keep you posted if that happens! ;)
Interesting but it would be a lot cooler if you could use the software to unlock your computer by looking at the webcam and stuff. Moving the windows by moving yourself is great though.
This is clever and exciting. On some systems, the overhead from running the camera constantly might be an issue.
Another feature might be to pause a video if you are distracted (the tracker loses your face for 10-15 seconds).
well done, ciaparat!
I would encourage you to seek out accessibility experts who are working on computer systems for individuals who have limited mobility and cannot work with traditional input devices. Head-movement and eye-tracking input hooks need to be well integrated into the existing accessibility frameworks. It would be best if you treated this exploration as a special case of accessibility more generally.
-jef
I’ve always wanted this. Looks brilliant. :)
I would probably have lean-back to Exposé if you’re not watching a video. Though I guess you don’t want to be rocking back and forth all the time.
@kennet, if you lean back, the scaled-down windows would be even less recognizable ;)
That makes me think of other uses I would see for it (besides the obvious accessibility ones, as Jef points out):
– Switching from headphones to speakers
– Lowering volume if I leave the room
– Pausing the movie if I raise my hand
– replaying the last 10 seconds if I raise my eyebrows :)
– forcing me to stretch if I haven’t moved much in 2 hrs :)
tons of fun coming up !
Clippy 2.0, eh? Generally I’ve had only bad experiences with computers guessing what I want to do. Explicit hand/motion gestures are one thing, but interpreting passive actions as commands is a recipe for disaster.
This is like the face-gestures April Fools’ joke that Opera did.
great one, bro’ ;)
Nice scenarios!
Thanks for sharing.
Head tracking is a well known topic for flight simulator players.
Track IR is the best known of the bunch:
http://www.naturalpoint.com/trackir/
Compiz already has a head-tracking framework to allow a lot of this…
What I (and others) have always wanted is focus follows eyes. Now, I think we are still a way away from getting eye tracking. But, in a multimonitor situation, I’d love to have my mouse pointer jump to the screen that I’m looking at, and the focus go to the control that last had focus on that screen.
So often I find myself typing on a window on the wrong screen and wondering why it isn’t appearing.
I posted this on Vimeo, but I’m posting it here too as it has a wider audience.
There are more ideas here:
http://bit.ly/d5GlVI
Hi guys, I really like your idea. I’m not a fluent programmer, but I want to know if there is a mailing list for sharing ideas. I want to share my point of view about this great idea.
Thanks for sharing
With Adobe’s ability to turn on your PC’s webcam and record you via Flash Player, I will never use anything like this.
Check out Adobe’s web site at:
http://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager02.html
@cyrildz, the best way to discuss things is joining Ayatana Mailing list: https://launchpad.net/~ayatana
@laxminarayan, thanks again for the link! ;)
I am really excited about the ideas posted by Anon on 14.09.10.
Any source code available? Would be pretty neat to try out myself :)
wow, this is awesome!
Will it come preinstalled with Unity on supported netbooks in the future?
this could be an awesome revolution ^^
@Daan, I hope you know how to use Processing and third party libraries with it: http://nuthinking.com/goodies/FaceDetection_02_blog.zip ;)
Why don’t you fix what you already have before people start to figure out the Emperor has no clothes?
Thanks Christian.
I know little about Processing, but the real expert in that area is a friend of mine, Meredith L Patterson; she would love to experiment with this :)
Ubuntu is famous for making Linux accessible. What I would really like to see is Ubuntu bringing hardware into the picture that benefits community organisations. For example, I’m president of several amateur sporting clubs; we would love to make use of things like RFID and/or swipe cards, but the hardware and implementation costs are so high that only commercial clubs can afford that technology.
A cheap, off-the-shelf solution for managing memberships and access to venues, based on low-cost, easily implemented hardware, could have thousands of potential users if packaged correctly.
Really nice stuff, guys! These are great ideas and I think that’s where the future is going. You should also have a look at our work and maybe get some inspiration.
http://www.youtube.com/watch?v=OHm9teVoNE8
(This is a prototype of a video player that reacts to the proxemics of people and devices in your living room.)