The sense of fear and loathing is rising as workers — especially once-secure-feeling knowledge workers — begin to see the threat that artificial intelligence poses to their livelihoods. Increasingly capable algorithms and robots are starting to perform tasks that previously required college degrees to tackle, like determining how an insurance policy should be priced, what medical diagnosis best matches an unusual set of symptoms, or which marketing messages will generate the most sales. Automation, which used to steal the jobs of those who did dangerous, dirty and dull work, is now threatening decision-makers and experts.
As workers try to get a bead on this threat, however, most tend to look upward, toward the managers and IT departments they suspect will wheel in the machines to outperform them. You might do better to glance to your side. The cognitive technology that puts pressure on your job is more likely to be your cubemate’s.
Here’s how it might well go down. Your colleague is just as stressed-out as you are at the rising imperative to do more with less and the shrinking prospects of jobs, promotions and raises. He is also equally frustrated by the amount of time he is spending on routine, codifiable parts of the job he mastered long ago. But being slightly more tech-savvy than you, he also spots a solution. By opening his own wallet and making a software purchase — taking advantage of the ever-plummeting cost of processing power — he shows up to work one day with an AI assistant in tow. Just as he made it to work faster than you because he used Waze to navigate through traffic, at work he’s able to make better investments, call on better customers, or make more accurate production forecasts than you.
The idea of "bring your own robot" is not as farfetched as it may seem. There are already automated tools that support a wide variety of specific tasks. And robots are getting easier to interact with. The new thing in manufacturing is "collaborative robots," which are much easier (and less dangerous) to train and work alongside. MIT professor Cynthia Breazeal has been working on "social robots" in the lab for a couple of decades, and is now an executive in a startup, Jibo, that will soon introduce a social robot for the home. One for the office can’t be far behind.
When it arrives, your colleague may become twice as productive and effective as you, and motivated enough to spend his found time cooking up innovations and other clever ways to serve the company’s customers better. He’s a superstar. Can you follow suit fast enough by bringing in your own robot helper? Maybe — but it’s doubtful that everyone in your department can. Will there even still be enough work to go around? If not, the most technically astute are the most likely to keep their jobs.
There is a clear precedent for what we are describing. Starting a decade ago, IT departments started raising red flags over what they called the BYOD (Bring Your Own Device) movement. Suddenly, we were living in a world where the laptops, tablets and smartphones people bought for themselves were more powerful and flexible — and way more cool — than their employers’ standard issue.
Meanwhile, with boundaries between work and home life eroding, workers saw no sense in switching over to different devices at, say, 9 am and 5 pm. They started using their own tech in the office. At first, this was the scourge of corporate security, to be discouraged at all costs. But in much the same way that companies have embraced work-at-home arrangements, it was soon enough seen as a way to get higher performance without budgeting more for property, plant and equipment.
But while BYOD displaced few workers that anyone can point to, things may be different with the AI-enabled devices of the very near future. Dramatically more capable, they may be all the more welcome in workplaces, and could create enormous gulfs between the performance of those who invest in them and those who do not. How many managers and professionals have daydreamed of hiring assistants, even at their own expense, to enable them to outperform the rest of the pack — but could never follow through on the wish in their strictly controlled HR environments? Now they can do it, the expense will be manageable, and the arrangement will only get cheaper with time. Intelligent tools like IBM Watson once cost millions to buy, but they are rapidly evolving into smart, inexpensive "bots" or APIs that undertake specific cognitive tasks.
Does this all sound hideously Darwinian? Like the old joke about one hiker saying to another, "I don’t have to outrun the bear — I only have to outrun you"? There is a more optimistic way to look at things. When your colleague invests her own money in an AI solution, her motivation will be to arrange the partnership of human and machine such that she keeps the most fulfilling and engaging parts of the job. She will, in other words, aim for mutual augmentation, allowing herself (and the machine) to take on challenges that would have been insurmountable acting alone. She won’t render herself redundant with a fully automated solution.
Peer-introduced AI in workplaces might therefore demonstrate the possible, and set the tone for the employer-sponsored implementations of smart machines that will eventually follow. We suspect that henceforth, at least as many process improvements will percolate up from the people doing work as will be imposed by the managers sitting layers above them. It usually takes someone who really knows both the job and the capabilities of technology to specify the right division of labor.
So if you’re worried about the rise of the robots and what it might mean for your employment situation, don’t expect the invasion to come from above. Look for it on all sides. And maybe stop assuming you can’t be friends with those smart machines.
Thomas H. Davenport and Julia Kirby are the co-authors of the new book, "Only Humans Need Apply: Winners and Losers in the Age of Smart Machines" (Harper Business).
Thomas H. Davenport is the President's Distinguished Professor in Management and Information Technology at Babson College, the co-founder of the International Institute for Analytics, a fellow of the MIT Initiative on the Digital Economy and a senior adviser to Deloitte Analytics. He teaches analytics and big data in executive programs at Babson, Harvard Business School, MIT Sloan School and Boston University, and he is the author or co-author of 17 books. Reach him @tdav.
Julia Kirby is a senior editor at Harvard University Press and a contributing editor for Harvard Business Review. She is the co-author of "Standing on the Sun: How the Explosion of Capitalism Abroad Will Change Business Everywhere." Reach her @JuliaKirby.
This article originally appeared on Recode.net.