How Apple Takes Experimental Tech From Other Companies and Makes It Mainstream

The new iPhone 8, iPhone 8 Plus and iPhone X are displayed during an Apple special event at the Steve Jobs Theater on the Apple Park campus on September 12, 2017 in Cupertino, California. Justin Sullivan/Getty Images

I am a tech journalist and editor, and so it was only a matter of time before I wrote one of the clichés of the genre: the “Apple doesn’t invent technologies so much as it refines the technology experience and trains users to expect that as the standard” piece. It delights me when I see stories, like the recent one from Bloomberg, pointing out that technologies like fingerprint sensors and facial recognition have been around for decades, but that Apple (AAPL) is the company incorporating them into consumer hardware and training mainstream users on how useful they actually are.


Part of the fun of technology is seeing who has got a really interesting new technology and use case for it; another part is seeing who or what popularizes the technology so it’s no longer “tech,” but merely part of everyday life.

Right now, the business press is beginning to place its bets on facial recognition—which, by the by, Microsoft began experimenting with and was using for secure sign-on in Windows 10 two years ago—all because Apple’s putting it in a thousand-dollar phone. It will be interesting to see how the fate of this technology unfolds, linked as it is to a phone that is landing in a country where wages have remained stagnant for fifty years and the gap between rich and poor citizens is widening.

Then again, this product is also landing in a market where smartphones are increasingly users’ primary machines, and where consumers have become used to taking out loans for pricey purchases.

So what? How the iPhone X and its biometric security measures perform will definitely say a lot about how we buy phones; the jury is out on whether we’ll learn anything about how much people like facial-recognition technology.

I think the thing to watch here is actually Apple’s play for dominance in augmented reality with its new developer kit. As ArchDaily, an architecture website, explains:

The ARKit is a developer tool for simplifying the creation of AR app experiences. It gives any iOS 11 device with Apple’s A9 processor or better (meaning the iPhone 6s or later, a fifth-generation iPad, or an iPad Pro) the ability to recognize flat surfaces and objects and attach virtual objects or graphics to them.
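To make that description concrete, here is a minimal sketch (my own illustration, not code from the article or from Apple's documentation) of how an iOS 11 app can ask ARKit to detect a flat surface and pin a simple virtual object to it; the view controller and outlet names are assumptions for the example.

```swift
import UIKit
import SceneKit
import ARKit

// A minimal sketch of ARKit plane detection, assuming a storyboard-backed
// view controller with an ARSCNView outlet named sceneView.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // World tracking with horizontal plane detection: ARKit looks for
        // flat surfaces such as tables and floors (requires an A9 chip or later).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // Called when ARKit recognizes a new surface; attach a small virtual
    // box to the detected plane so it appears anchored to the real world.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0) // rest the box on top of the plane
        node.addChildNode(boxNode)
    }
}
```

That is roughly the whole surface-recognition workflow the ArchDaily description refers to: the framework handles the tracking, and the developer only decides what to attach to the surfaces it finds.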

The pool of people who already own an iPhone 6s or later or a fifth-generation iPad (i.e., devices already in people’s hands, not just the newest hardware) is in the hundreds of millions. All it takes is one augmented reality app that people find vital to daily life, and the user experience becomes the norm by consensus.

Who cares? Microsoft (MSFT), among others. The company has a very strong augmented reality strategy: It has used its HoloLens headset in partnership with NASA to experiment with remote-learning technology—the idea being that the ability to remotely train personnel opens up space (or undersea) exploration to a wider pool of people with varied skill sets. Microsoft has also been working for years on immersive education using augmented reality, and the company has been working with Lowe’s on using augmented reality to walk DIYers through how to remodel a space. Microsoft’s strategy of identifying industries that can benefit from augmented reality has been extensively tested and thoughtfully executed—but it’s not clear whether it can withstand the “Look what I can do with my iPhone!” behavior from consumers.

And when I say among others, I do include Google (GOOGL) and its augmented reality initiatives in that calculus. The company has an advantage that neither Apple nor Microsoft has: access to a phenomenal amount of data, plus the mindshare of people who are habituated to asking Google to answer questions for them. Google Glass did not work because the company could not figure out how to crack the mainstream consumer experience, but Google is still trying in augmented reality: two significant initiatives in this space were announced at the company’s developer event keynote this year.

But here’s what was most notable about one of those announcements: Google Assistant, which (through the new Google Lens feature) can use your smartphone camera to analyze your surroundings and provide contextually relevant information, would be coming to the iPhone. Apple habituated people to the experience of turning to their phone whenever they wanted something. There are plenty of other smartphones on the market, but Apple was the one that defined the category, user experience and expectations.

The thing to watch out for now is where and how computing will continue to break out of the old-school model where we interfaced with the machine via a monitor and keyboard. Tablets and smartphones were the first step—they taught us how to incorporate spatial relations and tactile experiences into our data interactions. (And, calling back to the Bloomberg story, they trained civilians without security clearances to dig biometrics.)

Smart watches are further habituating people to the idea that computers don’t have to be powerful, merely pervasive and ever-connected. Voice-activated household robots and personal assistants are pushing the idea that computing is ambient, contextual and communal. What will we do with a computing interface paradigm that’s broken free of the typing-pool metaphor? And who will be the one who defines the next metaphor?

Lisa Schmeiser has been reporting on tech, business and culture since the dot-com days. Find her on Twitter at @lschmeiser or subscribe to So What, Who Cares.