Can AI step in to offer help where humans cannot?

If applied inappropriately, artificial intelligence (AI) can do more harm than good. But it can offer a much-needed helping hand when humans are unable to find solace among their own.

AI has not always had a good reputation. It has been accused of replacing human roles, taking away people's livelihoods, and threatening human rights.

With the right checks and balances in place, however, few can deny the potential of AI to improve business operations and improve lives.

Some have even harnessed AI to help save lives.

The Chopra Foundation in September 2020 introduced a chatbot, dubbed Piwi, to provide a “community driven solution” that aims to prevent suicide. The AI-powered platform is trained by “experts” and, based on online interactions, will connect users with 5,000 advisors on standby.

Foundation CEO Poonacha Machaiah said, “With Piwi, we’re giving people access to emotional AI to learn, interpret, and respond to human emotions. By recognizing signs of anxiety and changes in mood, we can improve self-awareness and increase coping skills, including measures to reduce stress and prevent suicide through rapid, real-time assistance and intervention.”

Piwi has defused more than 6,000 suicide attempts and processed 11 million text message conversations, according to Chopra Foundation founder Deepak Chopra, an Indian-American author famous for his advocacy of alternative medicine. He described Piwi as an “ethical AI” platform built with safeguards in the system, adding that there were always humans in the backend to provide support if needed.
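The foundation has not published how Piwi works under the hood, but a human-in-the-backend safeguard of this kind can be sketched in a few lines. Everything below, the phrases, weights, and threshold, is a hypothetical stand-in for a trained emotion-AI model, not the actual system:

```python
# Hypothetical escalation rule, not the Chopra Foundation's system.
# A crude keyword scorer stands in for a trained emotion-AI model.

RISK_PHRASES = {"hopeless": 2, "can't cope": 3, "alone": 1}  # illustrative weights

def risk_score(message: str) -> int:
    """Score a message for risk signals (placeholder for a real model)."""
    text = message.lower()
    return sum(w for phrase, w in RISK_PHRASES.items() if phrase in text)

def route_message(message: str, threshold: int = 3) -> str:
    """Let the chatbot respond, but hand off to a human advisor on standby
    once the risk score crosses the threshold."""
    return "human_advisor" if risk_score(message) >= threshold else "chatbot"
```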

Young people, in particular, were drawn to the chatbot, Chopra said. Noting that suicide was the second leading cause of death among teenagers, he said young people liked talking to Piwi because they didn’t feel judged. “They are more comfortable talking to a machine than to humans,” he said in a March 2022 interview on The Daily Show.

In Singapore, suicide is the leading cause of death for people aged 10 to 29. It also claimed five times more lives than road accidents in 2020, when the city-state recorded its highest number of suicide cases since 2012. The suicide rate was 8.88 per 100,000 population that year, up from 8 in 2019.

Increases were seen across all age groups, particularly those aged 60 and over, where the number of people who died by suicide reached a new high of 154, up 26% from 2019. Industry watchers attributed the spike to the COVID-19 pandemic, during which this group was more likely to have faced social isolation and financial hardship.

It is estimated that every suicide in Singapore affects at least six loved ones.

I too have lost loved ones to mental illness. In the years that followed, I often wondered what else could have been done to prevent their loss. They all had access to medical professionals, but this was obviously insufficient or ineffective.

Did they fail to get help when they needed it most, in their final hours, because, unlike chatbots, human healthcare professionals weren’t available 24/7? Or were they unable to fully express how they felt to another human because they felt judged?

Would an AI-powered platform like Piwi have convinced them to reconsider their options during that fateful moment before making their final decision?

I have strong reservations about the use of AI in certain fields, especially law enforcement and autonomous vehicles, but I think its application in solutions like Piwi is promising.

While it certainly cannot replace human health specialists, it can prove vital when humans are not seen as viable options. Just look at the 6,000 suicide attempts Piwi allegedly defused. How many of these lives might otherwise have been lost?

And there is so much more room to leverage AI innovation to improve healthcare delivery.

Almost a decade ago, I floated the possibility of a web-connected pill dispenser that could automatically dispense prescribed medication to a patient. This would be especially useful for older people who have difficulty keeping track of the many pills and supplements they need daily or weekly. It could also mitigate the risk of accidental overdose or misuse.

There have been significant technological advances since I wrote that post that can further improve the accuracy and safety of the pill dispenser. AI-powered visual recognition tools can be integrated to identify each pill and ensure the correct drug is dispensed. The machine can also hold an updated profile of each medication, such as the weight of each pill and its unique characteristics, to further verify that the correct medications have been dispensed.
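As a rough illustration of such a safety check, the sketch below pairs a visual identification with a weight-tolerance test. The classifier, profile fields, and tolerance values are assumptions for illustration, not a real device API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class PillProfile:
    name: str                  # medication name
    expected_weight_mg: float  # pill weight from the medication profile
    tolerance_mg: float        # acceptable deviation from expected weight

def safe_to_dispense(image: bytes, measured_mg: float, profile: PillProfile,
                     identify_pill: Callable[[bytes], str]) -> bool:
    """Dispense only if both the visual ID and the weight match the profile.
    `identify_pill` stands in for an AI visual-recognition model."""
    if identify_pill(image) != profile.name:
        return False
    return abs(measured_mg - profile.expected_weight_mg) <= profile.tolerance_mg

# Example with a mocked classifier:
profile = PillProfile("metformin_500", expected_weight_mg=550.0, tolerance_mg=25.0)
print(safe_to_dispense(b"<image>", 548.0, profile,
                       identify_pill=lambda _: "metformin_500"))  # True
```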

Clinics and pharmacies can dispense the prescribed drugs to each patient in a cartridge, refillable every few months and protected by the necessary safeguards. Relevant medical data is stored in the cartridge, including dispensing instructions that become accessible when it is inserted into the machine at home. The cartridge can also trigger an alert when a refill is needed and automatically send an order to the clinic for a new cartridge to be delivered, if the patient is unable to make the trip.
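The cartridge logic might be sketched along these lines, with the cartridge carrying its own metadata and the machine triggering an order when stock runs low. The field names and ordering hook are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Cartridge:
    patient_id: str
    medication: str
    dispensing_instructions: str  # read by the machine at home
    pills_remaining: int
    refill_threshold: int = 10

def after_dispense(cartridge: Cartridge, order_refill) -> None:
    """Decrement stock and auto-order a new cartridge when running low."""
    cartridge.pills_remaining -= 1
    if cartridge.pills_remaining <= cartridge.refill_threshold:
        order_refill(cartridge.patient_id, cartridge.medication)

# Example: order_refill could notify the clinic's ordering system.
cart = Cartridge("patient-042", "metformin_500", "1 pill after breakfast",
                 pills_remaining=11)
after_dispense(cart, order_refill=lambda pid, med: print(f"Refill {med} for {pid}"))
```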

The pill dispenser can be further integrated with other healthcare functions, such as the ability to test diabetic patients’ blood, as well as telemedicine capabilities so doctors can log in to check on patients if the data sent indicates an anomaly.
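The telemedicine hook could be as simple as an out-of-range test, as in this minimal sketch; the glucose thresholds and notification callback are illustrative assumptions, not clinical guidance:

```python
def review_glucose(reading_mmol_l: float, notify_doctor,
                   low: float = 4.0, high: float = 10.0) -> bool:
    """Alert the doctor only when a reading falls outside the normal range,
    so they can log in via telemedicine and check on the patient."""
    if not low <= reading_mmol_l <= high:
        notify_doctor(f"Glucose anomaly: {reading_mmol_l:.1f} mmol/L")
        return True
    return False

# Example: review_glucose(12.4, notify_doctor=print) prints an alert.
```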

AI-powered solutions like the pill dispenser will be essential in countries with aging populations, such as Singapore and Japan. They can support a more distributed healthcare system, one that does not overtax the central network of hospitals and clinics.

With the right innovation and implementation, AI can surely help where humans cannot.

In fact, 66% of respondents in Asia-Pacific believe bots will succeed where humans have failed when it comes to sustainability and social progress, according to a research study published by Oracle, which surveyed 4,000 respondents across the region, including Singapore, China, India, Japan, and Australia.

Additionally, 89% believe AI will help companies make further progress towards sustainability and social goals. Some 75% expressed frustration with companies’ lack of progress to date, and 91% want organizations to take concrete action on how they prioritize ESG (environmental, social, and governance) issues, rather than offer mere words of support.

Like the Chopra Foundation, CallCabinet believes AI can help customer service agents deal with the mental stress of handling cases. The UK-based speech analytics software provider says AI-powered tools with advanced acoustic algorithms can process key phrases and assess voice rhythm, volume, and pitch. These allow organizations to determine the emotions behind the words and gauge the sentiment of each interaction.

CallCabinet suggests these tools can allow managers to monitor customer service calls and identify patterns that signal potential mental health issues, such as negative customer interactions, raised voices, and profanity directed at agents.
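CallCabinet has not published its algorithms, but the acoustic signals it describes, loudness and pitch, can be approximated with standard signal processing. The sketch below uses RMS energy and an autocorrelation pitch estimate, with purely illustrative thresholds:

```python
import numpy as np

def rms_volume(samples: np.ndarray) -> float:
    """Root-mean-square energy as a proxy for loudness.
    `samples` is a float array normalized to [-1, 1]."""
    return float(np.sqrt(np.mean(samples ** 2)))

def estimate_pitch(samples: np.ndarray, sample_rate: int) -> float:
    """Rough fundamental-frequency estimate via autocorrelation."""
    corr = np.correlate(samples, samples, mode="full")[len(samples):]
    min_lag = sample_rate // 400  # ignore pitches above ~400 Hz
    max_lag = sample_rate // 50   # ignore pitches below ~50 Hz
    lag = min_lag + int(np.argmax(corr[min_lag:max_lag]))
    return sample_rate / lag

def flag_heated_segment(samples: np.ndarray, sample_rate: int) -> bool:
    """Illustrative rule: flag segments that are both loud and high-pitched."""
    return rms_volume(samples) > 0.3 and estimate_pitch(samples, sample_rate) > 250.0
```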

Because when humans cannot bring comfort to those in need, perhaps AI can.
