Buried inside an industrial office park roughly 40 miles north of San Francisco, there’s a half-court gym with no hardwood, no stands and just one basketball hoop that routinely attracts the best basketball players in the world.
The Warriors’ Steph Curry was there last summer. So was Clippers point guard Chris Paul and NBA slam dunk champion Zach LaVine. On a recent Tuesday in late August, two of the NBA’s up-and-coming superstars, Karl-Anthony Towns and D’Angelo Russell, were using the court to play some one-on-one, Towns nearly ripping the rim off with one-handed jams while Russell popped three-pointers with ease.
Unlike the duo’s usual basketball contests, though, this game was fully scripted, each jumper, drop-step and crossover orchestrated, not by a coach but by Mike Wang, a video game executive with 2K, the company behind games like BioShock and NBA 2K.
Welcome to 2K’s motion capture facility, a nondescript warehouse in Petaluma, Calif., where everyone from NBA stars to WWE wrestlers is scanned into the video games you can get at your local Best Buy. Towns and Russell, donning skin-tight black Velcro suits and helmets covered in tape balls, were in town to add their signature moves to the upcoming NBA 2K17 title, to be released to the masses next Monday.
“I’ve played this game since it was first established,” the 20-year-old Towns said after the dunkfest. “Just years ago I was in high school playing 2K.”
“This is mind-boggling.”
I know the feeling.
So how does one go from real-life NBA star to video game NBA star? I’ve been playing basketball — and video games — since I was probably 6 years old, so when 2K called to see if I wanted to experience what it’s like to be scanned into a game for real, it was an absolute no-brainer.
Becoming a video game character is not hard work — at least not for the subject. It can be sweaty and embarrassing work, though, as you can see from the video above. After throwing on a pair of sneakers from the facility’s expansive shoe closet, I put on a skin-tight jumpsuit that was, as they say in the fashion world, form (un)flattering.
It became more (un)flattering after Jeremy Schichtel, a motion capture stage technician at the facility, stuck more than 60 rubber balls, called markers, on every joint and limb of my body. The balls are covered in retroreflective tape, which means they reflect light back toward its initial source. In this instance, the initial source is a camera emitting infrared light, and the tape allows the camera to track the balls as the body moves around in space.
Once suited up, I got to do the fun stuff that Towns and Russell do for a living. I shot jumpers, dribbled around the court, made a few three-pointers (that were somehow left off the video above) and performed my signature move: A behind-the-back layup that’s responsible for more than a few broken ankles at the Bothell, Wash., YMCA.
Mike Wang wasn’t there directing me — the players weren’t due to show up for another hour — so I took my cues from Emma Castles, another mocap stage technician who had the unfortunate task of making sure my errant jumpers didn’t knock any of the floor cameras over. (There were 140 cameras in total, most in the rafters, but a few of them on the ground.) Those cameras were there to pick up that retroreflective tape on my suit, capturing every flick of the wrist, head fake and missed jumper.
Motion capture technology is not new. While the cameras have certainly improved over the years, the concept and most of the underlying technology haven’t changed much since the mid-’90s.
Even the black suits have remained pretty similar; white clothing reflects too much light and creates “false markers.” And that’s too bad, because the suit was hot! It got even hotter as I dribbled around the court, and when coupled with a silly little plastic helmet — also covered in rubber balls — and Velcro finger gloves, I was grateful no one was recording this. (Oh, wait.)
It turns out the suit is tight for a reason: A loose suit would let the markers shift around relative to the joints they’re supposed to track. The markers are ultimately tracked by the cameras and fed into a software program where the rest of the image disappears, leaving only the markers behind. This creates what is called a marker cloud, which, when done right, should take on the general outline of the subject’s body.
The data from all 140 cameras is then combined to determine movement and depth, resulting in a 3-D version of my body on a computer screen. This marker cloud is then given a skeleton — straight bone-like lines that identify the arms and legs — which can then be covered with an actual avatar. Thus, a video game star is born.
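The per-marker math behind that combining step is classic stereo triangulation: each camera that sees a marker defines a ray in space, and the marker’s 3-D position is the point where those rays (nearly) meet. Here’s a minimal two-camera sketch in Python — the function name and the two-ray simplification are mine, not 2K’s; their pipeline is proprietary and fuses all 140 views:

```python
def triangulate(c1, d1, c2, d2):
    """Estimate a marker's 3-D position from two camera rays.

    Each ray is camera_center + t * direction. Real rays rarely intersect
    exactly, so we return the midpoint of the shortest segment between them
    (the least-squares estimate). Assumes the rays are not parallel.
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w = [b - a for a, b in zip(c1, c2)]          # vector from camera 1 to camera 2
    a11, a12, a22 = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    b1, b2 = dot(d1, w), dot(d2, w)
    det = a12 * a12 - a11 * a22                  # zero only for parallel rays
    t1 = (a12 * b2 - a22 * b1) / det             # parameter along ray 1
    t2 = (a11 * b2 - a12 * b1) / det             # parameter along ray 2
    p1 = [c + t1 * d for c, d in zip(c1, d1)]    # closest point on ray 1
    p2 = [c + t2 * d for c, d in zip(c2, d2)]    # closest point on ray 2
    return [(x + y) / 2 for x, y in zip(p1, p2)]
```

With more than two cameras seeing the same marker, the same idea extends to a least-squares fit over all the rays, which is how a dense rig like this one stays accurate even when some cameras lose sight of a marker.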
With all this data, 2K could use my movements and shooting form for any athlete they wanted — they could put Michael Jordan’s or LeBron James’s likeness on my body. The more likely scenario: My character and movements will be used as a halftime show contestant, a fan in the stands or left out of the game altogether.
Or posted to Recode.net. Talk about a childhood dream come true.
This article originally appeared on Recode.net.