Social media erupted recently over a meeting between Vice President Kamala Harris and civic leaders, where she attempted to explain AI. “I think the first part of this issue that should be articulated is AI is kind of a fancy thing,” Harris said. Speaking at a roundtable event at the Eisenhower Executive Office Building in Washington, she continued, “First of all, it’s two letters. It means artificial intelligence, but ultimately what it is, is it’s about machine learning.” A definition without a difference, and one that led keyboard warriors to take up arms against the VP.
Meanwhile, a new survey from TalentLMS reports that 58% of HR managers believe older generations feel less confident at work than their younger peers because of AI. In light of AI adoption in the workplace, L&D (learning and development) professionals are weighing in. With AI coming on faster than some would like, six out of ten HR managers are calling for upskilling and reskilling of employees. But is new training really enough, when AI moves faster than most humans can think?
“AI is quickly leading to an existential identity crisis for employees everywhere,” according to Christopher Lind, a top leadership voice on LinkedIn and the Chief Learning Officer at ChenMed, a strategic healthcare organization. “AI completely disrupts the way that work is accomplished, leaving those who are more familiar with how things have been done in a state of risk.” From his home office in a Milwaukee suburb, Lind explains that historical and contextual knowledge is extremely valuable. But how, exactly, do you make that experience relevant? When the Vice President of the United States struggles to understand or explain the concepts behind AI, what does that mean for corporate leaders in industry?
“Companies are in a challenging position, since few really understand the complexities and nuances of how knowledge work gets done,” Lind acknowledges. Seems that the Vice President isn’t the only one struggling to understand the impact of AI.
“Unlike an assembly line, where it’s clear how parts move through an operation, knowledge work is far less obvious. In many ways, it’s like a black box.” How work gets done is already a little mysterious – and how AI works is a whole new level of complexity.
In fact, researchers at the University of Michigan have described the “black box problem” of AI: we are unable to see how deep learning systems make their decisions. Associate Professor Samir Rawashdeh says we can either pump the brakes on AI adoption (as if there were brakes to pump when there are profits to be made) or find a way to understand the decision-making that happens inside machine learning systems. Rawashdeh says so-called “explainable AI” is still very much an emerging field, but computer scientists have some interesting ideas about how to make deep learning more transparent, and thus fixable and accountable. Because when AI produces unwanted or unexpected outcomes, it helps to know where those risks (and even life-threatening choices) are coming from. Unfortunately, those answers remain elusive.
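To make the "explainable AI" idea concrete: one common model-agnostic technique is permutation importance, which probes a black-box model from the outside by shuffling one input feature at a time and measuring how much the predictions change. The article doesn't specify any particular method, so this is just a minimal illustrative sketch with a toy stand-in model, not the researchers' own approach.

```python
import random

# Toy "black box": a model whose internals we pretend we cannot inspect.
# Secretly, it weights feature 0 heavily and ignores feature 2 entirely.
def black_box(x):
    return 5.0 * x[0] + 1.0 * x[1] + 0.0 * x[2]

random.seed(0)
data = [[random.random() for _ in range(3)] for _ in range(200)]

def permutation_importance(model, rows, feature):
    """Score a feature by how much predictions change when its values
    are shuffled across rows -- if the model ignores the feature,
    shuffling it changes nothing."""
    baseline = [model(r) for r in rows]
    shuffled_col = [r[feature] for r in rows]
    random.shuffle(shuffled_col)
    perturbed = [r[:feature] + [v] + r[feature + 1:]
                 for r, v in zip(rows, shuffled_col)]
    changed = [model(r) for r in perturbed]
    return sum(abs(a - b) for a, b in zip(baseline, changed)) / len(rows)

scores = [permutation_importance(black_box, data, f) for f in range(3)]
print(scores)  # feature 0 scores highest; feature 2 scores zero
```

Techniques like this don't open the box, but they let you hold it accountable: you learn which inputs actually drive the output, which is a start toward the transparency Rawashdeh describes.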
“This means companies are going to have to take time to fundamentally deconstruct work so they’re able to reconstruct it in a way that optimizes the best of what AI has to offer while retaining the best of what people bring to the table,” Lind shares.
Lind, who designed learning programs at GE Healthcare, where he also served as Chief Learning Officer, says we must redesign the world of work to welcome AI, so that every generation can operate without fear or the threat of obsolescence. So, is upskilling the answer?
“To the point about upskilling/reskilling, it’s less about teaching an old dog new tricks. It’s about teaching an old dog how to play fetch in new and different ways.” No word on how old a dog has to be, in order to be considered old, but Lind’s point is clear: change is everywhere, driven by AI, and companies have to go beyond simple definitions to really explore the meaning of this new technology.
“Ultimately, reskilling or upskilling still follows the same set of rules, but the game looks very different. This should encourage employees since, in this reimagined way of working, there will be greater opportunity to leverage more of the unique human skills that bring people to life while delegating the activities resulting in people feeling underutilized because of the largely mechanical way work gets done now.”
At the end of the day, the question of what role AI should play in our lives may not be fundamentally different from the conversations we have anytime a potentially transformative technology emerges, according to Rawashdeh. Typically, that conversation involves a calculation of risks and benefits. “Without question, there is a huge potential for AI, but it gets scary when you get into areas like autonomy or health care or national defense. You realize we have to get this right.”
Maybe start with deeper understanding, at the leadership level. That way, AI can become a useful tool instead of a threat.