See ‘The X-Files’ Gillian Anderson Translated via Motion Capture to a Video Game

We connected with the animator behind it all, who explains what's happening as the actress performs for Cloud Imperium's 'Squadron 42'

The first episode of the new season of The X-Files just dropped, but fans of space, aliens and Gillian Anderson have something else to look forward to from the star: she’s making an appearance in Squadron 42, the single-player version of Cloud Imperium Games’ huge new sandbox video game, Star Citizen. Ms. Anderson plays Captain McLaren, a non-player character who happens to be the daughter of Admiral Ernst Bishop, the character Gary Oldman plays in the game.

Ms. Anderson appeared in a new behind-the-scenes video, posted above, showing how real-life actors are recorded when their performances are destined to appear inside a video game. Steve Bender, the animation director, broke down the gear seen in the video in a phone call with the Observer.

Star Citizen does not have an official release date yet.

One critical part of the process isn’t shown in the video. Mr. Bender explained that each real-life actor first has to be scanned to make a computerized puppet for the game. Then a virtual skeleton has to be built to go inside the virtual puppet. Computers use video to infer the movements of the actual person’s skeleton and translate those into movements of the virtual puppet’s skeleton. The skeleton is what makes character avatars lifelike.
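The core idea, reading skeletal motion out of tracked points and driving virtual bones with it, can be sketched in a few lines. This is a hypothetical, greatly simplified illustration and not Cloud Imperium's actual pipeline: given three tracked 2D marker positions (say shoulder, elbow, wrist), compute the angle at the middle joint, the kind of value a solver would then apply to the corresponding bone of the virtual skeleton.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b formed by tracked points a-b-c, in degrees.
    This is the kind of quantity a mo-cap solver extracts from
    marker positions before driving a virtual skeleton's bone."""
    v1 = (a[0] - b[0], a[1] - b[1])   # vector from joint to point a
    v2 = (c[0] - b[0], c[1] - b[1])   # vector from joint to point c
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# A right-angle elbow: shoulder at (0, 1), elbow at origin, wrist at (1, 0)
print(joint_angle((0, 1), (0, 0), (1, 0)))  # 90.0
```

In a real system this happens in 3D, per joint, per frame, with filtering and constraints, but the principle is the same: marker geometry in, bone transforms out.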

Gillian Anderson on set for ‘Squadron 42.’ (Photo: Cloud Imperium Games)

Here’s our scene-by-scene annotation of the video with Mr. Bender, with timestamps indicating where astute viewers can see the specific observations we’re discussing (these answers have been edited for length and clarity):

[0:12] One of the first strange things you notice is the markings on Ms. Anderson’s face. Lines have been drawn all over it. What are those markings doing for the production?

The primary thing they do is, when you’re using face tracking software, they allow us to pinpoint specific locations on the face. 

We start with a scanned 3D model of Gillian’s head, and then what we do is we record her performance, and then we put bones into that 3D model, and the motion—the video that is recorded on set—that motion is solved onto those bones.

Cloud Imperium Games works with an outside company called Cubic Motion, and it is Cubic Motion’s solver that determines what happens when those lines move. The solver analyzes the facial position on each frame and feeds back data that is used to drive bone positions in the face.

[0:14] You can also see that she has this big cap on with lights shining directly on her face all the time. Those must be cameras capturing her face. How many cameras are there, constantly looking straight at her?

There are three. They look like black lipstick tubes. The lights are just so the cameras can see her better. [1:29 shows the same moment from multiple face cameras]


[0:25] Hers has some pink tape on it. What’s that doing?

That just tells the mo-cap [motion capture] studio which helmet goes with which body outfit.

[0:30] On the body outfit, there are these colored patches all over her. What are those?

Each of the actors has a different color patch. It enables us to be consistent on set. At the end of every motion capture session, they do a range of motion. It’s used later on in the solving process. It’s easier to do that range of motion because they just have to adjust the positions a little bit every day for her.

In the center of those colored patches is what looks like a white or a silver ball. The cameras see the reflectivity on the balls. What happens in motion capture is, if you want to plot a point in space, you need four points. The patches help to keep that marker on the body. Those markers are placed on points on the body that are going to bend or that are rigid so that the system can see how the body moves. We go through a process of targeting and solving that motion capture data, and there’s a process that applies them to our character skeleton.
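Plotting a marker's point in space comes down to triangulation: several cameras each see the reflective ball along a ray, and the 3D position is the point closest to all of those rays. This is an illustrative sketch only (the real Vicon solve also handles calibration, labeling and occlusion); it finds that closest point with a small linear system.

```python
import numpy as np

def triangulate(origins, dirs):
    """Least-squares 3D point nearest to a set of camera rays,
    each given by an origin o and a direction d."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, dirs):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector onto plane ⊥ to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Two cameras whose rays both pass through the marker at (1, 1, 0):
origins = [np.array([0.0, 1.0, 0.0]), np.array([1.0, 0.0, 0.0])]
dirs    = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
print(triangulate(origins, dirs))  # ≈ [1. 1. 0.]
```

With dozens of cameras the system stays well conditioned even when some views are blocked, which is one reason a stage like this carries so many of them.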

[0:44] Is there tech inside the gun that she’s seen holding?

No. On some of the weapons, if you look at the weapon and there’s a shiny ball on it, that ball is there to mark the gun’s position through space. But there’s nothing being transmitted.

[0:53] Is the backpack she has on doing something?

It’s multiple things. One is batteries for the cameras. The actors have lavalier microphones and their transmitters, and we’re also transmitting the feeds from the cameras live. So there’s a whole bunch of stuff attached to their backs: battery packs and data transmitters of different sorts.

Crew watches recordings of actors live, on set. (Photo: Courtesy of Cloud Imperium)

[1:17] At various times, you can see cameras in the background. How many cameras does a set like this require?

There are cameras that look like they have red rings on them; those are the motion capture cameras [Cloud Imperium is using Vicon cameras and software]. There are probably somewhere around 64. You want a reasonable number of cameras to be able to capture the data.

The other cameras, the video cameras, are there for reference footage. So what we do is, after the day of the shoot, we sit down, go through each take and say we want this in point and that out point. While we are finalizing our animation, we can use those reference cameras to refine the motion capture. Such as: she touches the table in a certain way. We can say “Our solve here is a little loose. She really presses down here…”

[Photo: Cloud Imperium Games]

[2:35 and 3:28] Can the directors and crew see the characters running around in the actual scene in the game on a monitor, live, a rough draft of the final solve, while recording the scene?

Yeah, this is something I pushed for and got developed when I was at Crytek. If what you’re seeing there is only grey, that’s the preview of the solve in MotionBuilder [which is Autodesk software]. If it looks like it has textures, that’s the real-time preview in CryEngine [Star Citizen and Squadron 42’s game engine].

What we do is teach the system that, when Ball A moves around, it has a certain relationship to Ball B. So the system, in a way, understands what’s a shoulder, what’s an elbow and what’s a wrist. We are able to apply that roughly to our skeleton in real time.
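The "teaching" step he describes can be pictured as calibrating a simple geometric relationship during the range-of-motion pass and then reusing it live. This is a hypothetical toy version, not Crytek's or Cloud Imperium's actual system: assume the elbow joint center sits at a fixed fraction along the line between an upper-arm marker (Ball A) and a forearm marker (Ball B); we learn that fraction from one frame where the joint position is known, then apply it to new frames in real time.

```python
import numpy as np

def calibrate(a, b, joint):
    """Learn the fraction t such that joint ≈ a + t * (b - a),
    from one calibration frame with a known joint position."""
    ab = b - a
    return float(np.dot(joint - a, ab) / np.dot(ab, ab))

def solve_joint(a, b, t):
    """Apply the learned relationship to new marker positions."""
    return a + t * (b - a)

# Range-of-motion frame: two markers and a known elbow position.
t = calibrate(np.array([0.0, 0.0, 0.0]),
              np.array([2.0, 0.0, 0.0]),
              np.array([0.5, 0.0, 0.0]))

# Live frame: the same relationship drives the skeleton from new marker data.
print(solve_joint(np.array([1.0, 1.0, 0.0]),
                  np.array([3.0, 1.0, 0.0]), t))  # [1.5 1.  0. ]
```

A real solver learns far richer relationships (rigid-body offsets, rotations, per-actor proportions), but this is why the range of motion is recorded at every session: it is the data the relationships are fit to.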

[3:56] How many members of crew does it take?

I think there were probably 15 to 20 people on set, not counting the actors, both on the Cloud Imperium side and the motion capture side.
