Artificial Intelligence and Jobs by the Year 2025
One of my biggest takeaways from the recent (May 2017) NVIDIA GTC (GPU Technology Conference) was less about the technology, and more about the near-term jobs impact of artificial intelligence (AI) and robotics. Making smart education and career decisions is crucial, as the emerging combination of AI and robotics will have a huge impact on jobs. Those of you studying artificial intelligence, deep learning, and neural networks will have stronger career prospects. Routine jobs will be largely replaced by AI systems, sometimes using robotics and AI-infused edge devices. Certain other jobs will involve greater partnership between humans and AIs. The jobs that will remain most untouched and unchanged will be those depending on thoughtful, creative, unique human problem-solving, together with emotional and social interactions.
The Changing Jobs Landscape
I’ve just had one of those conversations with a friend; the sort that upsets one’s world. She was describing yet another friend who had lost her job. This was not due to AI, per se, but to the overall change in how we are creating and sourcing jobs. In this case, her friend – who was a social worker with a major hospital – had a job that was outsourced from the hospital to a contracting company. The company then mandated that people in this job role work from home. The next step was to make them part-time employees. The combination of income loss and social and emotional isolation led to a downward spiral. We’ll skip the rest; it isn’t pretty.
This was jarring for me. It revealed yet another industry that had hugely changed how jobs were filled. Over the past few years, I’ve seen that universities have transformed how they do business. A decade or two ago, if I was in transition between corporate jobs, I could easily find a one-year appointment as a Visiting Associate Professor. Now, in universities and colleges throughout the country, about two-thirds of the classes that used to be taught by full-time faculty members (tenured, tenure-track, lecturers, and visiting) are now being taught by adjunct professors. Adjuncts are part-time; they don’t receive benefits, and their employment is on a semester-to-semester (or quarter-to-quarter) basis.
The cost savings to the universities are enormous. So is the accompanying stress on full-time faculty, who must not only teach their regular load but also handle all committee work, all certification duties and course planning, all advising, and then also “evaluate” their adjunct colleagues. Many adjunct faculty members resort to teaching multiple gigs at multiple universities (because teaching more than two courses at a given university would make them full-time), and so are also hugely over-worked, often living at a near-poverty level.
I thought this pattern was limited to colleges and universities. I was wrong; this is endemic. It’s a huge shift in how things are being done.
This is also just the tip of the iceberg.
So here’s the message: the iceberg is the shift in jobs. We’re only seeing the emerging tip right now; much more is coming our way. The RMS Titanic is you and your job, and the safety and security of yourself and your family, and the likelihood that you can afford to send your children to college some day.
Some of you (we’ll get to this) have already seen the danger coming and jumped into a lifeboat. From here, you are actively building a new boat, with all kinds of iceberg-detection radar and satellite imagery. This new boat will get you sanely and safely on a new course, and will give you all-round situational awareness. The only problem is, you’re building this new boat while either (1) getting off your personal Titanic, or (2) already adrift in a lifeboat at sea.
Let’s assume that you are looking at AI as a career path. This blogpost will help you get from point A (early beginner) to point B (fairly competent) as quickly and as smoothly as possible.
Before we look at what you need to do in the AI arena, though, let’s develop a bit more awareness of the current and emerging situation.
How AI and Robotics Will Impact Jobs by 2025
My usual focus is on AI theory and developing new architectures, and less on the practical implications of emerging AI. Thus, the coming jobs-impact of AI and robotics has taken me by surprise.
Here’s the short list:
- Autonomous vehicles: driverless cars, autonomous truck platoons, and self-piloting drones will certainly emerge. All (not just a few, but ALL) major auto manufacturers are targeting releases of partially-assisted driving and/or AI-in-control (with the human able to step in) by 2025. (Click here for a good illustration of the various SAE levels of automation.) (I’m writing another article for PDD (Product Design and Development); this will have a deeper report on autonomous vehicles, and I’ll come back and insert the link when it’s published.)
- Smart manufacturing: Also real. Big technology challenges, such as giving robots sensitive haptic (touch) response, along with sensor fusion, are being worked out.
- Routine (even very skilled) knowledge-based work: From reading medical images to various kinds of searches (e.g., legal precedents) to translation; a significant number of jobs have been and will continue to be replaced by technology.
The really important thing is that once the technology bugs are worked out, the impact on jobs is practically overnight. There’s very little cushion in which to adapt. (That is, if you’re not using your personal over-the-horizon event detection radar.) I really like the charts in the Market Realist article “Tech Adoption Rates Have Reached Dizzying Heights.” They show that the timeframe for technology adoption is shortening drastically. It used to be that it took 1-2 decades to go from 20% to 80% complete adoption of a new technology. Now, it’s more like 1-2 years. In short, as fast as corporations can revise their businesses and make whatever changes are needed (install a new fleet of driverless trucks or cars, build a new factory floor, or even just integrate a new knowledge-based AI), jobs will be eliminated.
I really like how Perry Marshall interprets the 80/20 rule; and it’s a useful framework for getting a rough handle on things. Thus, I’m estimating that about 20% of American jobs will be lost, never to appear again. Of the rest, my estimate is that about 60% of American jobs can and will be substantially assisted by emerging technologies. The article that I wrote for Product Design and Development last month, Designing like it’s 2025: next-gen CAD technologies give designers “awesome superpowers”, shows how virtual reality (VR) and AI-assisted design tools will impact how product designers do their work. And of course, some jobs (estimating about 20%) will continue more-or-less the same.
Of that roughly 20% of American jobs that I estimate will be permanently lost by the 2025-2030 timeframe, the losses will come mostly from transportation, from manufacturing, and from service (e.g., checkouts at the grocery store). A sizeable fraction will also come from knowledge work.
The point is not to fixate on the exact numbers. Rather, the point is that we are definitively looking at a seismic shift in the nature of available jobs, with 2025 as roughly the tipping point for the permanent disappearance of certain job categories. This shift is at least as significant as the industrial revolution. I personally think it’s more akin to the discovery of fire.
Learning AI Is a Long Runway Task
For some things in life, you just really need a long runway.
If you’re reading this blog, you’re among the very elite of the elite. You’re not just reworking your life to take advantage of AI technologies; you’ve determined to become an AI specialist. This is a task that takes a very long runway.
Whether you’re building your new boat while bailing out of your current lifeboat (or getting yourself and your family off the Titanic), or whether you’re building your aircraft as you hurtle down the runway, there are three big challenges that you face:
- Missing a good map: too much information, and it’s not well-organized; we’ll be talking about this a LOT in the following posts,
- Missing the flight navigation tools: very likely, you’re a very decent problem-solver or engineer, but you don’t naturally think in terms of mathematics and abstract formalisms – yet you need the tools requiring that level of understanding,
- Missing the weather forecast: you not only need a look at the current weather (where the jobs are today), but what’s coming up over the horizon.
Let me focus on that last point, because it’s the most essential. Lots of things are “hot” right now. Artificial intelligence is hot; so are neural networks and deep learning. Becoming a data scientist is “hot.”
Yes, AI is here to stay – not trying to dissuade you in the slightest if you’re moving into this field. But deep learning? Already passé.
At the NVIDIA conference, I was shocked to realize that the algorithms everyone was touting were the same ones that were common and popular some thirty years ago, at the dawn of neural networks. They were simply piled higher and deeper. Restricted Boltzmann machines? Stacked up, they function much like multilayer Perceptrons, and in practice the stack is most often fine-tuned with backpropagation. (Not with the simulated annealing algorithm originally introduced to train the Boltzmann machine.) Autoencoders? Convolutional neural networks (CNNs)? They’ve been around forever. (Meaning: the last thirty-some years.)
The hottest thing to emerge in the machine learning / neural networks / AI area over the past couple of years has been GANs, or generative adversarial networks. These are basically a system of two neural networks: one “discriminative” (trained, via a classification method such as back-propagation, to tell real data from generated data), and one “generative” (trained, without labeled targets, to produce data that fools the discriminator). Each of these building blocks is, again, some thirty or more years old. Pitting them against each other … yes, that’s new (and it’s a really good idea). But it doesn’t really create a new neural network algorithm.
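To make the two-network pairing concrete, here is a deliberately tiny sketch of the adversarial idea in plain NumPy. Everything here – the one-parameter “generator,” the logistic-regression “discriminator,” the learning rates, and the toy data – is an illustrative choice of mine, not a standard GAN recipe:

```python
import numpy as np

# Toy adversarial pair: a one-parameter "generator" that shifts noise
# toward the real data, and a logistic-regression "discriminator".
rng = np.random.default_rng(1)
real = rng.normal(loc=4.0, scale=1.0, size=256)   # real data centred at 4

theta = 0.0        # generator parameter: fake = noise + theta
w, b = 0.0, 0.0    # discriminator parameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(500):
    noise = rng.normal(size=256)
    fake = noise + theta

    # Discriminator step: push D(real) toward 1, D(fake) toward 0
    # (gradients of the binary cross-entropy loss)
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    grad_w = np.mean((d_real - 1) * real) + np.mean(d_fake * fake)
    grad_b = np.mean(d_real - 1) + np.mean(d_fake)
    w -= 0.05 * grad_w
    b -= 0.05 * grad_b

    # Generator step: push D(fake) toward 1 (fool the discriminator)
    d_fake = sigmoid(w * fake + b)
    grad_theta = np.mean((d_fake - 1) * w)
    theta -= 0.05 * grad_theta
```

After a few hundred alternating steps, the generator’s shift `theta` drifts toward the real data’s mean: neither network is novel on its own, but the competition between them is what does the work.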
What we have, with the current deep learning technology, is a culmination and refinement of ideas that were introduced thirty years ago, or even earlier.
The point is: we’re nearly at the end of what we can do with the neural network architectures that came out of the 1980s and early 1990s. The advent of GPUs has let us push them to their limits; hence, the breakthroughs in numerous technical challenges.
But in terms of earth-shattering newness? Not really.
I’m making this point because, if you’re self-educating in AI – or even enrolled in a solid AI program – you need to think beyond what’s happening right now. Beyond today’s hot story.
You’ve got to have your own, private, over-the-horizon event detection system.
What a Long Runway Task Means to You
Taking on a long runway task means that you can launch your own (metaphorical) aircraft that can get to a high-enough altitude, and stay up long enough, to handle the challenges of the future. These are challenges that require at least two years’ prep time; not just a single semester course. This means thinking towards the future. It means mastering the fundamentals, even if those fundamentals are really hard.
Here’s the bottom line: you are very likely caught in a situation where you need not just certain skills, but a certain level of understanding. The toughest thing is to decide where to put your time-dollar. Every time you scrape out some hours to study something, it comes at a cost.
Some aspects of AI you can learn pretty fast. These include the backpropagation algorithm (which is a series of chain-rule derivative applications) and all the standard architectures, including basic stacked-up RBMs (restricted Boltzmann machines), CNNs (convolutional neural networks), RNNs (recurrent neural networks), and the like. Anything that you can learn within one semester or one quarter of a given class is going to be deep learning – light.
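To see what “a series of chain-rule derivative applications” means in practice, here is a minimal sketch of one training step for a two-layer network in plain NumPy (the shapes, seed, and learning rate are arbitrary illustrative choices):

```python
import numpy as np

# One backpropagation step for a tiny two-layer network (illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # 4 samples, 3 features
y = rng.normal(size=(4, 1))          # regression targets

W1 = rng.normal(size=(3, 5)) * 0.1   # first-layer weights
W2 = rng.normal(size=(5, 1)) * 0.1   # second-layer weights

# Forward pass
h = np.tanh(x @ W1)                  # hidden activations
y_hat = h @ W2                       # network output
loss = np.mean((y_hat - y) ** 2)     # mean-squared error

# Backward pass: each line is one application of the chain rule
d_yhat = 2 * (y_hat - y) / len(y)    # dL/dy_hat
dW2 = h.T @ d_yhat                   # dL/dW2 = (dy_hat/dW2)^T dL/dy_hat
d_h = d_yhat @ W2.T                  # propagate the error to the hidden layer
d_pre = d_h * (1 - h ** 2)           # through tanh: d tanh(z)/dz = 1 - tanh^2(z)
dW1 = x.T @ d_pre                    # dL/dW1

# One gradient-descent step
lr = 0.1
W1 -= lr * dW1
W2 -= lr * dW2
```

That really is the whole algorithm; the “deep” versions just chain more of these steps together, which is why the mechanics can be learned in a semester even though mastering the field cannot.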
In particular, anything that you can do using pre-made building blocks, such as those available in TensorFlow and Keras, is a form of DL-light. In fact, there’s a not-too-subtle danger in using these tools, because you can start building five-story office buildings – even skyscrapers – all using pre-fab parts, without ever having first built a garden shed from scratch. There’s a whole lotta “gotchas” that most certainly will get you if you haven’t learned neural network fundamentals from the inside out.
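To illustrate what “pre-fab parts” look like, here is a hand-rolled analogue of the layer-stacking style that libraries like Keras offer. The class names and shapes below are my own plain-NumPy stand-ins, not the actual Keras API:

```python
import numpy as np

# Plain-NumPy stand-ins for "pre-fab" layers. Each layer exposes only its
# forward pass -- exactly the level at which a framework user operates,
# with all of the backpropagation machinery hidden from view.
class Dense:
    def __init__(self, n_in, n_out, rng):
        self.W = rng.normal(size=(n_in, n_out)) * 0.1
        self.b = np.zeros(n_out)
    def __call__(self, x):
        return x @ self.W + self.b

class Tanh:
    def __call__(self, x):
        return np.tanh(x)

rng = np.random.default_rng(0)
model = [Dense(3, 5, rng), Tanh(), Dense(5, 1, rng)]   # snap the parts together

def forward(x):
    for layer in model:
        x = layer(x)
    return x

y_hat = forward(rng.normal(size=(4, 3)))   # predictions, shape (4, 1)
```

Three lines assemble the whole building; nothing in this usage forces you to understand what each part does internally, which is precisely the danger.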
A long runway task is something that takes one to two years to master, and that’s when you’re being diligent for sustained periods of time. The machine learning algorithms that depend on combinations of statistical mechanics and Bayesian logic are definitely one of those long runway items.
During his keynote speech at the May 2017 NVIDIA GTC, Jensen Huang referred to a certain machine learning course at Stanford University as the most popular course on campus. This course is still being taught, and Andrew Ng, who first taught it, is still co-teaching it. (You can check out the syllabus and course content.)
This is a pretty hefty course, and it gets you one shade heavier than DL-light, largely due to the material in the last four weeks.
However, if you’re planning on an AI-based career, you need more. Some of that more requires statistical mechanics.
Getting through stat mech – not everything, just focusing on what you really need to know – is a bit of a challenge.
My goal is to make that journey as safe and as straightforward as possible. Nothing will make it easy, but I can help you learn just what you need, and avoid misadventures.
More will follow. Stay tuned.
Live free or die, my friend –
Live free or die: Death is not the worst of evils.
Attr. to Gen. John Stark, American Revolutionary War
- Kosner, A.W. (2013), “Why Is Machine Learning (CS229) the Most Popular Course at Stanford?”, Forbes (Dec. 29, 2013).
- Maren, A.J. (2017), “Designing like it’s 2025: next-gen CAD technologies give designers ‘awesome superpowers’,” Product Design and Development (July-August 2017), 14-18.
- “Automation and Anxiety,” Special Report: Artificial Intelligence: The Impact on Jobs, The Economist (June 25, 2016).
- Stark, H. (2017), “As Robots Rise, How Artificial Intelligence Will Impact Jobs,” Forbes (Apr. 28, 2017).