AI and the Loss of Job Descriptions
“AI is more likely to lead to loss of _job descriptions_ than loss of jobs”
There is a lot of speculation about what (gen) AI, and especially the growth of LLMs, means for labor markets. I heard the above statement on a podcast, and it most concisely captures how I think about it.
“Data science” was a job title that came to the forefront of our attention recently and almost as quickly faded back into the background. What I think happened there is that we tried hard to jam a bunch of mostly different things that involve analyzing data under one job title. I am sure people are still sticking to the title, since it remains relatively “cool”, although it doesn’t quite work. The reality is that neither those who are data scientists nor those who hire them know exactly what that role means, and even their understanding and expectations of it are constantly shifting.
I’m bringing up this example because it is one of the latest illustrations of how jobs, especially newer ones, can’t really be defined by a generic title and are more realistically a subset of a broader set of tasks chosen by a particular team at a particular time. I experienced that closely as a data scientist, and now I’m going through it again as a founder.
Now, if we think about “jobs” as a flexible and fluid set of tasks, it is kind of obvious that many factors can shape the exact details of that set: business context and priorities, availability of resources, and the introduction of new technologies and processes. In other words, jobs are really open, dynamic systems of tasks within the context of their environments, and they evolve to maximize some sort of system-level metric.
Now, the nature of fluid and flexible things is that they adapt to changes, and it is non-trivial to wipe them out completely (e.g. it is an incredibly difficult and expensive task to suck all the air out of a chamber). So I’m a bit confused why people so easily talk about “loss of jobs” as if jobs are monolithic things that could be here or not in a binary way. It is completely reasonable to talk about “loss of job description” in the sense that one particular subset of tasks is no longer needed and another subset is necessary instead, but thinking that the whole set of tasks would become redundant so quickly that we won’t even have time to react is a bit dramatic. And if that does happen, it is most probably because we were in denial despite obvious leading indicators.
That said, the rate at which this replacement is happening is much faster than anything we have experienced, and there’s no clarity around what that might mean at a larger scale. But I think instead of focusing on “what can go wrong”, we need to think about “what can go right”!
For example, given that job descriptions are evolving faster than ever, what kind of infrastructure (e.g. education system) is necessary? What kind of workforce management and corporate structure is best suited to a rapidly shifting landscape of job descriptions? And, importantly, at a psychological and social level, what kind of support is necessary to help people find consistent and ongoing “meaning” and “purpose” even as their job description is constantly changing?
I think the first step in responsible AI should really be focused on drafting the right set of questions to ask. Without that, we might be barking up the wrong tree!