All of last year, as I was writing Silicon Collar, I saw two wildly different perspectives. Automation practitioners – I profiled them in over 50 work settings, from accounting to wineries – were pragmatic about automation: about the maturity of the technology, its economics, and the fact that it leads to smarter, speedier, safer workers. In contrast, most analysts and academics, and many Silicon Valley types, had a very pessimistic POV on catastrophic job losses from automation. To reconcile the extreme viewpoints, I looked at a century of automation – UPC scanners, ATMs, technology in cars, etc. – and found that even after decades we still have tens of millions of grocery, banking, and other jobs. I also found the economy has been generating what I call Alt-jobs – in franchises, on platforms, in alternative healthcare, in ethnic groceries, and elsewhere. The job economy is healthier than even the official stats show.
So, it is good to see some pragmatic opinions come out about the impact of automation.
No, I am not talking about US Treasury Secretary Steve Mnuchin, who has said "It's not even on our radar screen ... 50-100 more years." Mnuchin is what even I would call an optimist. I am talking about pragmatists.
Michael Milken, writing in the WSJ, says:
“Go back 40 years, when powerful financial technology first started being used on Wall Street. The combination of mainframe computers with new types of securities and trading processes increased access to capital, especially for small and medium companies. Pioneers in the cellular telephone industry, for example, previously had a hard time convincing lenders that they could revolutionize how people communicate. There were only a handful of capital providers—primarily banks and insurers—that most companies could turn to. This changed beginning in the 1970s, when capital markets began a long process of displacing the established financial institutions as the leading sources of funding for corporate growth. Innovative fixed-income and equity-linked instruments helped create more than 60 million net new jobs in the U.S. over the last third of the 20th century.”
James Bessen at Boston University writes in a working paper (italics mine):
“Technology rarely automates major occupations completely. Consider what happened to the 271 detailed occupations used in the 1950 Census by 2010. Many occupations were eliminated for a variety of reasons. In many cases, demand for the occupational services declined (e.g., boardinghouse keepers); in some cases, demand declined because of technological obsolescence (e.g., telegraph operators). This, however, is not the same as automation. In only one case—elevator operators—can the decline and disappearance of an occupation be largely attributed to automation. Nevertheless, this 60- year period witnessed extensive automation, but it was almost entirely partial automation.”
If anything, Prof. Bessen could have pointed out that the Bureau of Labor Statistics is now tracking over 800 occupations, and that next year's update will include even more, as we enjoy far more career choices than our parents did.
The pushback I get (and I am sure folks like Milken and Bessen get) is that we should not use historical trends to predict a future being shaped by rapid advances in computing, sensory, and other technologies. As Bessen himself acknowledges, "In the future, of course, new artificial intelligence technologies might be capable of fully automating jobs. However, that is not what has been happening so far nor would it seem likely to happen in more than a few occupations in the near future."
That’s exactly the point. Most jobs – even white-collar jobs these days – need more than cognitive skills. They need visual acumen, social skills, finger dexterity, and other physical capabilities. To completely replace them you need a “frankensoft” of technologies – AI, sensors, drones, and others – integrated together. Can it be done? Sure. But who has done that? Which regulators have certified it? And how many customers have economically justified it? None so far, and it will take decades for the maturity and economics of such machines to displace the majority of humans in those jobs.
In the meantime, let’s question the academic and analyst research as their predictions age.
Gartner has several such predictions for 2018:
- 20 percent of business content will be authored by machines.
- more than 3 million workers globally will be supervised by a “robo-boss.”
- 45 percent of the fastest-growing companies will have fewer employees than instances of smart machines.
- digital businesses will require 50 percent fewer business process workers.
Well, we are just a year away. How about an audit – and, ideally, an update from Gartner saying "we were too pessimistic"?
In 2013, two Oxford University researchers made an even scarier projection about automation and jobs: “47% of total US employment is at risk.”
It has been four years since they made that projection, and not even a fraction of 1% of US employment has been automated away.
Yet every year hundreds of academic papers and news stories continue to parrot the Oxford 2013 projections. Should they not ask for an update? Should not a respected institution like Oxford come out and say, “Yes, we got lots of publicity from that report. The honest truth is the authors should have called employers across the occupations they surveyed and asked when, in fact, they would stop hiring humans. That’s not likely any time soon”?
I doubt Gartner or Oxford will offer mea culpas. That’s OK. It behooves analysts like me to point out that their assumptions were faulty (as I did in Silicon Collar).
More importantly, we should all talk to the practitioners of automation – the plant managers and chief medical information officers I interviewed for the book – who are actually implementing it. They will calmly tell you what is real and what is not.
It’s time for automation pragmatism.
Cross-posted at Medium