
Since its founding five years ago, Crisis Text Line has received more than 62 million messages from people experiencing intense psychological or emotional distress. They reach out for help and empathy, and the counselors who respond are trained to help defuse the situation.

It turns out those text conversations have a purpose beyond saving lives; with the help of artificial intelligence, all of that data can be analyzed to reveal important clues about what makes a difficult exchange between two people go well -- or not well at all. That's the basis for Loris, a new for-profit company that's being spun off from Crisis Text Line.

Loris, which has raised $2 million from Kapor Capital, Omidyar Network, and other investment firms, will use its proprietary software to train a company's employees in the skill of having a challenging conversation. That discussion could range from reporting sexual harassment to asking for a raise to handling an irate customer.

"Nobody teaches us to communicate clearly and lovingly and effectively," said Nancy Lublin, founder and CEO of Loris and Crisis Text Line.

"Nobody teaches us to communicate clearly and lovingly and effectively."

Lublin believes that Loris can improve the way people interact with each other by putting the insights from Crisis Text Line to work.

For instance, Lublin said that analyses of Crisis Text Line messages suggest there are "magic words" that can help calm a tense or emotional moment. Those include "smart," "proud," "brave," and increasingly over the past few months, "impressive." If someone says they're overwhelmed, the most effective word to use in response is "strong."

"These are important words to know if you're a manager," Lublin said.

The type of question asked can also improve or worsen your chances of having a conversation that ends on a positive note. Questions that begin with "why" often lead to a dead end. Prompts built around "how" or "when," on the other hand, invite open-ended conversation.


What might be less clear is how those insights can help us better talk about the full spectrum of sensitive issues we encounter every day, including stereotypes or beliefs about sex, gender, race, class, and ability. Loris' software will impart its wisdom primarily through role-playing scenarios.

Lublin said she first thought Loris would need to develop specialized training for different issues. But data analysis actually indicated that good conversation, no matter the subject, follows the same sentence structure and syntax. Such conversations hinge on empathy, compassion, and authenticity.

Glen Coppersmith, founder and CEO of the startup mental health analytics company Qntfy, said that Loris has a lot of potential. (He is not a part of the company or its efforts.) If the data-driven insights about word choice, for example, lead to practical management and communication strategies, it could provide employees with a powerful tool.

So instead of just focusing on using the word "strong" when someone feels overwhelmed, the respondent would be trained to recognize the emotional undercurrent of the situation and have techniques for empowering the person.

"The idea of how do we provide some better structure ... for people to have these difficult conversations is very timely," Coppersmith said.

Lublin understands that firsthand: The idea for Loris came from companies that reached out to her staff asking for training built on the help line's best practices for de-escalating highly charged emotions. The company is named after a primate that looks cuddly but has a venomous bite, an apt metaphor: we tend to treat communicating well as a "soft" skill, yet a conversation gone wrong can destroy professional and personal relationships.

The stakes of those misguided conversations aren't lost on Lublin.

"When people avoid hard conversations, think about who loses," she said. "It's super important to us that people learn how to have hard conversations so women, people of color, and people who are marginalized can have a seat at the table."

Loris has yet to launch, let alone tackle that systemic problem, but Lublin said the process will begin by inviting a handful of companies to participate in its beta phase. Employees trained on Loris will ideally develop essential communication skills, which Lublin likens to exercising and strengthening a muscle over time.

"Our goal is to make humans be better humans," she said.

