A Computer Learning How To Snowboard
Animated data visualisation by Andreas Koller showing how a computer explores and learns athletic movements, simply and beautifully presented in interactive WebGL - video example embedded below:
This screencast shows the process with random values for velocity, take-off strength, and X, Y, and Z rotations. After eight attempts, the best result is kept and replayed repeatedly …
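The loop described above is essentially a random search: sample random motion parameters, score each attempt, and keep the best one to replay. A minimal sketch of that idea, with a placeholder scoring function (`score_jump` and the parameter ranges are assumptions for illustration, not the project's actual physics):

```python
import random

def score_jump(params):
    # Placeholder scoring: reward take-off strength, penalise extreme
    # rotations. A stand-in for the project's real physics evaluation.
    return params["takeoff"] - 0.1 * sum(abs(params[a]) for a in ("rx", "ry", "rz"))

def random_attempt(rng):
    # Sample one set of random motion parameters (ranges are illustrative).
    return {
        "velocity": rng.uniform(5.0, 15.0),   # forward speed
        "takeoff":  rng.uniform(0.0, 10.0),   # take-off strength
        "rx": rng.uniform(-180.0, 180.0),     # X rotation (degrees)
        "ry": rng.uniform(-180.0, 180.0),     # Y rotation
        "rz": rng.uniform(-180.0, 180.0),     # Z rotation
    }

def best_of(n_attempts=8, seed=0):
    # Try n_attempts random jumps and keep the highest-scoring one,
    # which would then be replayed repeatedly, as in the screencast.
    rng = random.Random(seed)
    attempts = [random_attempt(rng) for _ in range(n_attempts)]
    return max(attempts, key=score_jump)

best = best_of()
print(best)
```

With a real physics simulation plugged into `score_jump`, the same structure produces the explore-then-replay behaviour shown in the video.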
This project also explores the human body as a visual instrument. How does the body, with its many inputs and outputs, perform as a tool for visual expression? What data does an athletic movement produce and how could it be made visible? How do we understand and remember complex movement patterns?
Google unveils a next-generation smartphone featuring motion and depth sensors. This is really exciting, as it brings computational photography to the masses and enables far more sophisticated Augmented Reality experiences. The prototype device is available now for developers to create something special - video embedded below:
As we walk through our daily lives, we use visual cues to navigate and understand the world around us. We observe the size and shape of objects and rooms, and we learn their position and layout almost effortlessly over time. This awareness of space and motion is fundamental to the way we interact with our environment and each other. We are physical beings that live in a 3D world. Yet our mobile devices assume that the physical world ends at the boundaries of the screen.
The goal of Project Tango is to give mobile devices a human-scale understanding of space and motion.
You can find out more at the Project Tango website here.