We’re living at the cusp of a new industrial revolution, with progress supposedly moving at breakneck speed. New innovations are introduced every day. Yet wages and productivity remain stagnant, and advances in robotics and artificial intelligence (AI) have yet to make a significant economic impact. Despite this, anxiety about automation keeps growing.
But it doesn’t have to be this way. We have an opportunity to shape these new innovations, so that they deliver on their economic potential and do so in an inclusive and equitable way. That begins with investing in the right managerial talent.
During the 1970s and ’80s, the American economy was in a spot similar to today’s: personal computers were advancing rapidly while productivity barely grew. At the time, economist Paul A. David observed that this problem would resolve itself as managers learned what exactly a computer could do; it wasn’t readily apparent how much the new technology was capable of. The same is true today.
Even with the hype surrounding advanced robotics and AI, many industries simply aren’t prepared to adopt these technologies. Across traditional, data-rich sectors that could benefit significantly from technology investments, such as mining and manufacturing, poor data management practices are all too common. These practices prevent companies from capitalizing on technological opportunity.
But this isn’t the only barrier.
In the health care sector, where the opportunity for AI to improve diagnostics and drug discovery is enormous, byzantine data regulations prevent researchers from accessing the data they need. Even in the tech sector itself, where data is almost completely digitized and the regulatory environment is more favorable, understanding of how to leverage new technologies hasn’t fully matured. Consider two striking examples: Tesla’s failed attempt to fully automate Model 3 production, and Amazon’s scrapped automated hiring algorithm. Even the most innovative companies on the planet haven’t figured out how to integrate new innovations well.
Clearly, leveraging this technological potential won’t be possible without first closing the gap in technical understanding of what robotics and AI can do. And that requires a keen business sense of which problems need solving. As it stands, these innovations remain solutions in search of problems.
Successfully commercializing robotics and AI, then, requires investments in complements to these technologies that allow for the opportunities they create to be better understood and captured. Clearing up how data can be used with clear and business-friendly privacy guidelines, for example, would provide enormous benefit.
Then, there must be a focus on developing the managerial talent pipelines and ecosystems that will actually foster innovation. For their part, universities are becoming increasingly critical to that end. We’ve started to see the rise of business analytics programs that sit between data science programs and pure management programs, allowing students to graduate with the ability to assess problems from both technical and managerial perspectives.
In addition, programs like the Creative Destruction Lab, founded at the University of Toronto in 2012 and since expanded internationally, build ecosystems that pair deep-science researchers with management experts to increase the odds of commercial success. With an explicit focus on commercialization, the program has helped over 500 research-based companies in areas such as AI, quantum computing, and space exploration raise over CAD $3.1 billion. These models aren’t difficult to replicate, which means new technology can reach the market much sooner.
Policy approaches that incentivize universities to expand accelerator programs and develop more “spin-out” companies (companies founded at universities, which generally allow the universities to retain IP rights to technology developed in their laboratories) would further advance tech development. While universities like Stanford and MIT are world-renowned for supporting the commercialization of research, more active effort could make this the norm. Consider the UK, which has made commercializing innovation the hallmark of its industrial strategy. That strategy goes beyond simply funding R&D; it also actively ensures the right talent exists at every stage of the research-to-implementation pipeline.
Education has always played a pivotal role in managing and shaping technological transition, and this time is no different. It’s vital to ensure that universities and companies play this role well and that they’re encouraged to invest in future managers. After all, that’s the key to finally enabling innovation to flourish.