How an Executive Uses AI to Forecast Conversations with Co-Workers
Moderna's Tracey Franklin has found a way to use AI to better prepare for leadership interactions and workplace conflict
The uses for artificial intelligence (AI) in the workplace are expanding, and one executive has put it to work to help her navigate communication and conflict.
Moderna chief human-resources officer Tracey Franklin spoke about it recently at the Fortune Workplace Innovation Summit, saying she uses AI to predict how scenarios with co-workers will unfold before they happen.
She described her thinking and how she goes about it.
“We have a lot of personality tests that we use in the organization, so I've created profiles in a GPT of our executive committee,” Franklin said. “I have scenarios of when two people are maybe at conflict or when I have to go in with an opinion or recommendation and how might the group react to my recommendation.”
She also uses AI to help her emotionally and psychologically.
“Or if I'm having a really bad day and I need to understand myself and why I'm triggering, I actually have an interactive coach, therapist and teammate for me that I use all the time. It's been my favorite thing.”
Franklin prepares for uncertainty by using the technology as a trusted assistant.
“And I've said, here's the situation, how are these two people going to react? Or this is what happened, why did these two people react this way? And how best can I coach the reconciliation,” she explained.
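For readers curious about the mechanics, the following is a minimal sketch of the prompting pattern Franklin describes, assuming a chat-completion API such as the OpenAI Python SDK. The model name, profiles and scenario are hypothetical placeholders, not her actual setup.

    # Minimal sketch: personality profiles in the system prompt,
    # each new situation posed as a question against them.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Hypothetical profiles, e.g. distilled from workplace personality tests.
    profiles = {
        "Exec A": "Direct, data-driven, impatient with open-ended discussion.",
        "Exec B": "Consensus-seeking, risk-averse, wants to be consulted early.",
    }

    system_prompt = (
        "You are a leadership-coaching assistant. Using these personality "
        "profiles, predict how each person is likely to react and suggest "
        "how to coach a reconciliation.\n\n"
        + "\n".join(f"{name}: {desc}" for name, desc in profiles.items())
    )

    scenario = (
        "Here's the situation: I plan to recommend delaying a product launch. "
        "How are these two people going to react, and how should I frame it?"
    )

    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical model choice
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": scenario},
        ],
    )
    print(response.choices[0].message.content)

The structure is the point: the stable profiles live in the system prompt, and each new conflict or recommendation is posed as a fresh question against them, which is what lets the same setup handle the varied scenarios Franklin mentions.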
She has been impressed by how much this preparation has helped her.
“I will tell you, I think I am pretty good with people but it gives me an advantage that I didn't have before because I don't fully understand someone's innate, human personality response like the GPT allows me to do,” Franklin said.

“Using AI like a smart sounding board that analyzes people’s personalities and helps guide difficult conversations adds a whole new layer to how leadership and conflict resolution can work,” says Roman Eloshvili, founder of ComplyControl.
“Artificial intelligence can offer a fresh perspective, helping leaders consider angles they might have otherwise missed and cutting down on what is usually a time-consuming and mentally draining process. As a result, problems can be addressed faster and solved in a better manner.”
He believes the technology can help processes across the board.
“If leaders are thoughtful about how they use AI, it can genuinely improve how teams work together,” says Eloshvili.
“I do think this kind of use will become more common, as AI itself grows more user-friendly. Right now, most companies view it as a tool for operational efficiency first and foremost, but when this tech gains broader trust and acceptance, its use in people-related situations will also follow.”

“Practicing difficult conversations by role-playing is a time-tested means of preparing for them,” says Ann Gregg Skeet, senior director of leadership ethics at the Markkula Center for Applied Ethics at Santa Clara University.
“Using a chatbot instead of a person to prepare might reduce an executive’s self-consciousness about role-playing and free her from having to find someone willing and available to participate.”
She is quick, however, to urge caution.
“But turning to AI to replace the things that make us uniquely human, like our emotions, our unpredictability, our feelings, is concerning, as it can create a cycle where humans become increasingly isolated from one another, seeking solutions to challenges from machines rather than other people,” Gregg Skeet says.
She elaborates on why this can be problematic.
“The human brain is wired to connect with others and practices that reduce that connection run the risk of changing human brains,” Gregg Skeet says.
"There are demonstrated, researched risks and harms of using conversational AI systems to simulate human relationships,” she adds. “Such AI systems can lead to the development of emotional dependence on machines or manipulation by them, and can introduce privacy risks and biases.”
Gregg Skeet points to Franklin’s use of the technology to assist her in communication and conflict preparation.
“In this example, the executive is feeding sensitive information into an unpredictable AI system, which exposes that data to risk of future access or of being used to train AI systems further,” she argues.
“How will she feel if the profiles and personality descriptions she has provided about her team become public?” Gregg Skeet asks rhetorically. “How would her team feel if they knew their responses to conversations were being shared with an AI?”
She does believe that Franklin’s desire to be better informed is noble, yet warns that there are dangers involved.
“Attempting to improve one’s conversational abilities is virtuous, but executives should consider the potential ethical risks of doing so using AI, including how it impacts the rights of others,” Gregg Skeet says.

“While companies like Microsoft have been working on using AI to analyze team performance through applications such as Microsoft Teams to give leaders insight into what hours work best for who, call times, chat etiquette, etc., what you’ve described focuses more on developing deep, interpersonal relationships,” says Nizel Adams, CEO and chief engineer at Nizel Co.
“I’ve always advocated that psychology be a part of the regular curriculum starting in grade school since understanding each other and how we all function is a crucial survival skill,” he adds.
“Unfortunately, most people aren’t psychologists, therapists, behavior and body language experts, counselors, etc. so they tend to have a poor skillset in reading others and responding appropriately.”
He is an advocate for technology that can improve how we interact, problem-solve and work.
“This tool bridges that gap and hopefully with enough use, the user of it will start to naturally develop these skills themselves,” Adams says.
“Understanding each other leads to less conflict, higher morale and increased efficiency. AI could also provide insight for leaders that don’t realize an employee may have a learning disability or mental illness and guide them through the appropriate way to interact with them in order to help them be as successful as possible.”
He anticipates greater adoption for interpersonal use.
“As the results from use start pouring in, more companies will definitely start incorporating it,” Adams says.
AI for Self-Understanding and Self-Care: Yes or No?
Franklin values AI’s help when her emotional well-being has been shaken and she is feeling off balance. She has found the technology can be used for self-care as well as conflict repair and relationship healing.
If it catches on, this could be useful to executives and employees alike as an aid for stress relief. Yet it might not be the best form of assistance for everyone.
"There is a robust executive coaching industry available to executives wanting to improve self-care and their conflict resolutions skills and heal professional relationships,” Gregg Skeet says.
“AI offers perhaps a more cost-effective and ever-ready remedy but does not provide executives with the opportunity to connect with other humans to think and practice together strategies for improvement.”
She further explains the concerns.
“There are well-documented harms associated with using AI to simulate human relationships,” Gregg Skeet says. “Executives will have to be comfortable with those potential harms and the privacy risks associated with putting sensitive information into AI systems for the use of AI as executive coaches to become more widespread.”
She favors traditional professionals and approaches for risk management and ethical purposes.
“Effective coaches and therapists teach people how to understand themselves, what is triggering them and how to appropriately respond,” Gregg Skeet says. “People relying on instant-access AI to address such skill gaps may never fully develop those skills.
“We have come to understand a host of negative, unintended consequences associated with social media,” she points out. “We should apply some of that learning to AI and be prudent in how we choose to use it.”
This Type of Use Will Require More Time
"The AI technology itself needs to gain more trust,” Eloshvili asserts.
“One of the core obstacles that currently stands in the way of broader adoption, on any level, is the fact that many people are simply not comfortable with it.
“I don’t just mean the perspective of an average worker who may feel like this tech might diminish their value and make them obsolete. Leaders, to my mind, can also be hesitant to adopt AI.”
He points to a natural human instinct as he elaborates.
“They fear making decisions based on something they don’t fully understand,” he says. “That’s not an unreasonable concern.”
“A person has to feel comfortable and safe in order to say something like (what Franklin said in talking about having a bad day). And you can’t be emotionally open when talking to an AI if you feel it’s just a cold, unfeeling calculator,” Eloshvili explains.
“So that perception needs to change first. And honestly, I feel that if more leaders openly talk about using AI in this way, the more others will follow. It will simply take time for us to get there.”
Maybe AI for Low-Level Therapy Is Helpful
“Someone once said that a highly successful person goes home and breaks themselves down at the end of the day,” Adams recalls. “They analyze what they did wrong and figure out areas of improvement.
“This critical and often brutal tearing down of oneself is important to not only keep improving but to stay grounded. Essentially this particular AI is giving the user (Franklin) the ability to understand others while also taking a step back and analyzing themselves.”
It won’t work as needed for everyone.
“To use such a thing would require someone with a temperament that grants them the ability to handle feedback,” Adams says.
“Someone with a narcissistic or selfish mindset would most likely reject it wholeheartedly as they wouldn’t care about others and can’t fathom looking inward.”
AI is not a tool for everything human.
“It’s important to note that AI is not a replacement for emotional intelligence, so it can’t be relied on to provide all the answers you need,” Eloshvili says.
“Leaders will still be the ones driving interactions with their teams but AI can offer greater clarity on how to approach situations in a way that de-escalates tension.”
He offers a bit of advice.
“The key, as always, is to approach this matter with care and thoughtfulness,” Eloshvili says. “Remember: human feelings are not something you can automate.”