Intel's New Laptop Can Read Your Emotions, Make Everything 3D

The laptops and tablets can replicate objects, augment reality, and will be on sale within the year.

The 3D cameras recognize your hand movements, facial expressions, and much more. (Photo by Jack Smith IV)

The tech world has been buzzing about Amazon’s new Fire Phone, which has a small array of simple 3D cameras on the front. But today, Intel showed off their new 3D camera tech, and it already makes Amazon’s attempt seem quaint.

Intel has spent years and “hundreds of millions of dollars” developing the depth-sensing cameras. They call the technology RealSense because of the lifelike way the cameras take the world in.

“We want to add human-like sensing to computing devices,” Dr. Achin Bhowmik, Intel’s lively CTO of Perceptual Computing, told Betabeat, “devices that can see and hear like us, and software that can understand human activities.”

Intel gave the first fully featured, hands-on demonstration of their RealSense cameras to a small press gathering in New York City this morning. They had a few apps available on a collection of laptops and a single tablet.

The RealSense devices use a few cameras that work together to mimic how human eyes capture depth. The cameras on the front of the laptop can capture objects within a meter’s range. Intel had a group of RealSense-equipped devices on display, including a laptop that will be on sale by the end of the year.
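Intel hasn’t published how RealSense computes depth, but the principle the demo described — multiple offset cameras triangulating distance, the way our two eyes do — can be sketched with off-the-shelf tools. Here’s a minimal illustration using OpenCV’s stereo block matcher; the focal length and baseline are assumed values for the example, not RealSense’s actual specs.

```python
import cv2
import numpy as np

# Hypothetical grayscale frames from a left and a right camera (placeholder paths).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching finds how far each small patch shifted between the two views.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth is inversely proportional to disparity: Z = f * B / d, where f is the
# focal length in pixels and B is the baseline between the cameras in meters.
f, B = 600.0, 0.06  # assumed values, for illustration only
depth_m = np.where(disparity > 0, f * B / disparity, 0.0)
```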

In the first part of the demonstration, Intel showed us how Skype could cut out the background and just bring up a 3D rendering of whoever you’re talking to on screen. 3D-enabled Skype will come installed on laptops that ship with RealSense cameras.

The demonstrator appears on screen, the hall behind him cut away completely. (Photo by Jack Smith IV)
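The cutout itself is conceptually simple once you have a depth map aligned to the color image: keep the pixels within arm’s reach and discard the rest. A minimal sketch of that idea, with the thresholds and array names as illustrative assumptions rather than anything from Intel’s software:

```python
import numpy as np

def cut_background(color, depth, near_m=0.2, far_m=1.0):
    """color: HxWx3 image; depth: HxW map in meters, aligned to the color frame."""
    mask = (depth > near_m) & (depth < far_m)  # True where the subject sits
    out = np.zeros_like(color)
    out[mask] = color[mask]                    # everything beyond a meter drops away
    return out
```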

Next, we were shown how the cameras can read the human face, creating a live, detailed rendering of a person’s facial structure and movement. Almost as soon as you come within range of the cameras, they find your facial features, identifying your forehead, eyes, cheeks, and mouth.
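RealSense’s facial models aren’t public, and they work in 3D, but the plain-2D version of “find the face, then find the features” is a standard pipeline. A rough stand-in using OpenCV’s bundled Haar cascades, offered only to illustrate the kind of detection involved:

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

frame = cv2.imread("webcam_frame.png")  # placeholder for a live camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Find faces first, then search for eyes only inside each face region.
for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    roi = gray[y:y + h, x:x + w]
    for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
        cv2.rectangle(frame, (x + ex, y + ey),
                      (x + ex + ew, y + ey + eh), (255, 0, 0), 2)
```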

The cameras already have the ability to discern basic emotions — happiness, sadness, anger, frustration, exhaustion — though Intel didn’t have those features on display this morning.

The camera mapped the demonstrator’s expression live onto a 3D model of a pug. (Photo by Jack Smith IV)

After faces, we were shown how the cameras can identify the human hand. Like any good tech gimmick, it came with a small game, called Hoplites.

The basics of the game are familiar to anyone who’s played Lemmings, except you use your hand as a living bridge, guiding little legionnaires to a portal across deadly obstacles. The controls were incredibly responsive, and after two levels we were bridging chasms and guiding our troops seamlessly across the pitfalls.

(Photo via Jack Smith IV)

Sure, the front-facing 3D camera was cool, but the most impressive hardware on display was the more sophisticated cameras on the back of the tablet.

The cameras can scan an environment and quickly build 3D models. To demonstrate, Mr. Bhowmik held up the tablet and circled around the objects on the desk, building a live 3D rendering:

(Photo by Jack Smith IV)

When the model was complete, he used his finger to drag it around on-screen, zooming in and looking from every angle. The tablets, which will be in stores in the first half of 2015, will have an indoor depth perception of about four meters, but can scan for dozens of meters outside.

The 3D model of the desktop, in full color, after about 30 seconds of scanning. (Photo by Jack Smith IV)
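Under the hood, a scan like this starts by turning each depth frame into 3D points: every pixel’s depth reading is back-projected through the camera’s pinhole model. A minimal sketch of that step; the intrinsic parameters here are placeholder assumptions, and a real scanner like the one demonstrated also fuses many such frames as the tablet moves:

```python
import numpy as np

def depth_to_points(depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """depth: HxW map in meters. Returns an Nx3 array of XYZ points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx  # pixel column -> lateral offset
    y = (v - cy) * z / fy  # pixel row -> vertical offset
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no depth reading
```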

The 3D models aren’t just good for building digital versions of real spaces — they can also be used as blueprints for 3D printing.

“The industrial community is interested in this for rapid prototyping,” Mr. Bhowmik said. “You can make quick 3D scans for replicating and modeling.”
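Getting from a scan to a printable file can be as simple as writing the points out in a standard interchange format. A toy sketch, assuming a point array like the one produced above, that writes ASCII PLY, a format meshing and print-prep tools commonly accept:

```python
def write_ply(path, points):
    """points: Nx3 array of XYZ coordinates in meters."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")
```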

Mr. Bhowmik didn’t have much to say about the implications of being able to build a 3D model of an entire room from your tablet in a matter of minutes, but we imagine the market will figure that one out pretty quickly.

Before the show, the team used the tablet to scan the Intel Bunny doll on the left, which built the 3D rendering in the center. The white model on the right is a 3D-printed replica made from the rendering. (Photo by Jack Smith IV)

But what Mr. Bhowmik was most excited about wasn’t replicating real objects, but “augmented reality”: putting digital, imaginary objects in real space. With augmented reality, you can look through a tablet to see the room in front of you, but also see digital objects that you can interact with.

For this demonstration, he held the tablet up to a table that had a group of toys, putting together a brief scan through the camera:

(Photo via Jack Smith IV)

Once the scan was complete, he rendered a little 3D robot, visible only on the tablet screen, onto the table. When Mr. Bhowmik gave the robot instructions to move, it flew across the surface, dodging real-life objects and hopping over the blocks on the table. When he moved the display farther or closer, the robot stayed in the same place, as if it were any other real-life object.

Looking back at the table, it was jarring to remember that the robot wasn’t really there.

When Mr. Bhowmik moved the tablet to put an object between the camera and the robot, the robot disappeared behind the obstruction. (Photo by Jack Smith IV)
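That disappearing act is the classic occlusion test in augmented reality: the virtual object is drawn only where it is closer to the camera than the real scene, which the depth cameras measure directly. A minimal per-pixel sketch under that assumption, with all array names illustrative:

```python
import numpy as np

def composite(color, scene_depth, robot_rgb, robot_depth):
    """All arrays HxW(x3), aligned; depths in meters, np.inf where no virtual object."""
    visible = robot_depth < scene_depth  # the robot wins wherever it is nearer
    out = color.copy()
    out[visible] = robot_rgb[visible]
    return out
```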

This could be done with an infinite variety of imaginary objects, putting them in real space and interacting with them through the tablets.

“If I’m a teacher, and I want to show my students the solar system, I could put one right in the middle of the room,” Mr. Bhowmik said, dancing around the room to demonstrate a vision that, for now, existed only in his imagination. “Looking through the tablet, you could see the planets revolving around the sun. You reach out to touch Saturn, and it brings up all of the information you need.”

Other examples Mr. Bhowmik gave were augmented reality games, like racing digital cars around a real room, and concerts where a band appears through the tablet to be performing right in your living room.

Later in the morning, Intel’s “Futurist” came out and gave a demo of his pet robot, which introduced itself as Jimmy and told everyone that it was 3D printed and made with open-source software.

Jimmy can’t recognize you and greet you by name. Yet. (Photo by Jack Smith IV)

But wait, what’s the deal with the robot? Intel says that Jimmy is the next potential host for 3D sensing cameras. Intel wants to make robots that can scan their environment, know how to move around, walk down stairs, and interact with objects.

“People have 3D sensing, so the robot should have 3D sensing like us,” Mr. Bhowmik said. “It will recognize you, read your emotions. ‘Why are you sad today? Should I sing you a song?’ The future is crazy.”

Crazy. That’s one way of putting it, definitely.