Microsoft HoloLens: Partner Spotlight with Volvo Cars [Video]

Microsoft HoloLens: Partner Spotlight with Volvo Cars
From giving customers a sensor’s vantage point to configuring cars in entirely new ways, Microsoft HoloLens is helping to bring Volvo’s cutting-edge car features to life in ways never before possible.
via YouTube

Connecterra: Intro Video [Video]

Connecterra: Intro Video
Connecterra is a sensor hardware and machine learning company based in Amsterdam, the Netherlands.

This video introduces our first service for dairy farmers, which helps improve farm productivity. Contact us for more details at
via YouTube

It’s Not a Phone, It’s a Galaxy: Gear VR [Video]

It’s Not a Phone, It’s a Galaxy: Gear VR
Virtual reality just got real. Powered by Oculus, the Samsung Gear VR puts you right into the action. From games to movies to your favorite shows, Gear VR gives you a fully immersive experience.
via YouTube

BitDrones: Interactive Flying Microbots Show Future of Virtual Reality is Physical [Video]

BitDrones: Interactive Flying Microbots Show Future of Virtual Reality is Physical
Queen’s University’s Roel Vertegaal says self-levitating displays are a breakthrough in programmable matter, allowing physical interactions with mid-air virtual objects.

High-resolution photographs of BitDrones are available at

Press Contact:
Chris Armes
Communications Officer, Media Relations
613-533-6000 ext. 77513

KINGSTON, ON – An interactive swarm of flying 3D pixels (voxels) developed at Queen’s University’s Human Media Lab is set to revolutionize the way people interact with virtual reality. The system, called BitDrones, allows users to explore virtual 3D information by interacting with physical self-levitating building blocks.

Queen’s professor Roel Vertegaal and his students are unveiling the BitDrones system on Monday, Nov. 9 at the ACM Symposium on User Interface Software and Technology in Charlotte, North Carolina. BitDrones is the first step towards creating interactive self-levitating programmable matter – materials capable of changing their 3D shape in a programmable fashion – using swarms of nano quadcopters. The work highlights many possible applications for the new technology, including real-reality 3D modeling, gaming, molecular modeling, medical imaging, robotics and online information visualization.

“BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality,” says Dr. Vertegaal. “It is a first step towards allowing people to interact with virtual 3D objects as real physical objects.”

Dr. Vertegaal and his team at the Human Media Lab created three types of BitDrones, each representing a self-levitating display of a distinct resolution. “PixelDrones” are equipped with one LED and a small dot-matrix display. “ShapeDrones” are augmented with a lightweight mesh and a 3D-printed geometric frame, and serve as building blocks for complex 3D models. “DisplayDrones” are fitted with a curved, flexible, high-resolution touchscreen, a forward-facing video camera and an Android smartphone board. All three BitDrone types are equipped with reflective markers, allowing them to be individually tracked and positioned in real time via motion-capture technology. The system also tracks the user’s hand motion and touch, allowing users to manipulate the voxels in space.
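
The press release includes no source code, but the tracking-and-positioning loop it describes is easy to sketch. The Python below is a minimal, hypothetical illustration of how marker-tracked drone positions might be steered toward the targets the interface assigns; the class names, fields and proportional gain are assumptions for the example, not part of the published system.

```python
from dataclasses import dataclass
from enum import Enum


class DroneType(Enum):
    PIXEL = "PixelDrone"      # one LED plus a small dot-matrix display
    SHAPE = "ShapeDrone"      # lightweight mesh over a 3D-printed frame
    DISPLAY = "DisplayDrone"  # curved touchscreen, camera, Android board


@dataclass
class Drone:
    drone_id: int
    kind: DroneType
    position: tuple  # (x, y, z) reported by the motion-capture system
    target: tuple    # (x, y, z) where the interface wants this voxel


def velocity_command(drone, gain=0.8):
    """Proportional step toward the target; a stand-in for real flight control,
    which would need a full PID loop and a quadcopter dynamics model."""
    return tuple(gain * (t - p) for p, t in zip(drone.position, drone.target))
```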

“We call this a Real Reality interface rather than a Virtual Reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset,” says Dr. Vertegaal.

Dr. Vertegaal and his team demonstrate a number of applications for this technology. In one scenario, users physically explore a file folder by touching the folder’s associated PixelDrone. When the folder opens, its contents are shown by other PixelDrones flying in a horizontal wheel below it. Files in this wheel are browsed by physically swiping drones to the left or right.
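
As a rough sketch of that horizontal-wheel layout, the snippet below spaces a folder’s child drones evenly on a circle beneath the parent PixelDrone and rotates the wheel by a phase offset, which a left or right swipe would adjust. The radius, drop distance and swipe-to-phase mapping are invented for illustration; the release does not specify them.

```python
import math


def wheel_positions(parent, n_children, radius=0.4, drop=0.3, phase=0.0):
    """Place a folder's children on a horizontal circle below its PixelDrone.

    A left/right swipe would advance `phase` by 2*pi/n_children,
    rotating the next file drone to the front of the wheel.
    """
    px, py, pz = parent
    return [
        (px + radius * math.cos(phase + 2 * math.pi * i / n_children),
         py - drop,  # the wheel hovers below the folder's drone
         pz + radius * math.sin(phase + 2 * math.pi * i / n_children))
        for i in range(n_children)
    ]
```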

Users are also able to manipulate ShapeDrones to serve as building blocks for a real-time 3D model. Finally, the BitDrone system supports telepresence: a remote user joins over Skype through a DisplayDrone and can move it around the local space. The DisplayDrone automatically tracks and replicates the remote user’s head movements, allowing the remote user to virtually inspect a location and making it easier for the local user to follow the remote user’s actions.
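
One plausible way to realize that head-movement mirroring is to map the remote user’s tracked head pose directly onto the DisplayDrone’s position and heading targets. The function and coordinate conventions below are assumptions for illustration only, not the published implementation.

```python
import math


def displaydrone_target(head_pos, head_yaw, room_origin=(0.0, 0.0, 0.0)):
    """Mirror a remote user's head pose onto a DisplayDrone.

    head_pos: (x, y, z) of the remote head in the remote room's frame.
    head_yaw: gaze heading in radians about the vertical axis.
    Returns a position target and a normalized screen heading.
    """
    ox, oy, oz = room_origin
    hx, hy, hz = head_pos
    position = (ox + hx, oy + hy, oz + hz)    # translate into the local room
    heading = math.atan2(math.sin(head_yaw),  # wrap yaw into (-pi, pi]
                         math.cos(head_yaw))
    return position, heading
```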

While the system currently supports only about a dozen comparatively large drones, 2.5 to 5 inches in size, the team at the Human Media Lab is working to scale it up to thousands of drones. These future drones would measure no more than half an inch across, allowing users to render more seamless, high-resolution programmable matter.

About Human Media Lab
The Human Media Lab (HML) at Queen’s University is one of Canada’s premier Human-Computer Interaction (HCI) laboratories. Inventions include ubiquitous eye-tracking sensors; eye-tracking TVs and cellphones; PaperPhone, the world’s first flexible phone; PaperTab, the world’s first flexible tablet; and TeleHuman, the world’s first pseudo-holographic teleconferencing system. HML is directed by Dr. Roel Vertegaal, Professor of HCI at Queen’s University’s School of Computing. Working with him are a number of graduate and undergraduate students with computing, design, psychology and engineering backgrounds.
via YouTube

Stanford Engineers Test Tricorder-Like Detector [Video]

Stanford Engineers Test Tricorder-Like Detector
Science fiction popularized the tricorder, a handheld detector that can gather readings from the environment or diagnose disease. Stanford engineers have taken a big step toward developing such a device by combining mild microwaves with sensitive ultrasound to create a safe and portable way to detect hidden objects.
via YouTube

EM-Sense: Touch Recognition of Uninstrumented Electrical and Electromechanical Objects [Video]

EM-Sense: Touch Recognition of Uninstrumented Electrical and Electromechanical Objects
Most everyday electrical and electromechanical objects emit small amounts of electromagnetic (EM) noise during regular operation. When a user makes physical contact with such an object, this EM signal propagates through the user, owing to the conductivity of the human body. By modifying a small, low-cost, software-defined radio, we can detect and classify these signals in real-time, enabling robust on-touch object detection. Unlike prior work, our approach requires no instrumentation of objects or the environment; our sensor is self-contained and can be worn unobtrusively on the body. We call our technique EM-Sense and built a proof-of-concept smartwatch implementation. Our studies show that discrimination between dozens of objects is feasible, independent of wearer, time and local environment.
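
The abstract describes a modified software-defined radio feeding a real-time classifier, but not the pipeline itself, so the sketch below is a hypothetical stand-in: it reduces a window of raw EM samples to a normalized magnitude spectrum and matches it against stored per-object profiles with a nearest-neighbor rule. The window length, bin count and distance metric are assumptions, not EM-Sense’s published design.

```python
import numpy as np


def em_features(samples, n_bins=256):
    """Reduce a window of raw EM samples to a normalized magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(samples))[:n_bins]
    norm = np.linalg.norm(spectrum)
    return spectrum / norm if norm > 0 else spectrum


def classify(samples, profiles):
    """Nearest-neighbor match of a live spectrum against stored object profiles.

    `profiles` maps object names to feature vectors built from recordings
    of the same window length, e.g.:
        profiles["refrigerator"] = em_features(recorded_fridge_window)
    """
    live = em_features(samples)
    return min(profiles, key=lambda name: np.linalg.norm(live - profiles[name]))
```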
via YouTube

NEC Turns a Person’s Arm Into a Keyboard [Video]

NEC Turns a Person’s Arm Into a Keyboard
Japan’s NEC Corp. has created a user interface which can display an augmented-reality keyboard on a person’s forearm, using eyeglasses and a smartwatch. Photo: NEC Corp.

via YouTube