How AI Could Help Victims of Domestic Violence, Other Crimes

With more than 10 million people physically abused by an intimate partner each year, domestic violence experts and software developers say artificial intelligence can help.

(TNS) — With more than 10 million people physically abused by an intimate partner each year, according to the National Coalition Against Domestic Violence, domestic violence experts and software developers say artificial intelligence can help combat the issue.

"There are a handful of buoys out there that are trying to keep you from drowning," said Anne Wintemute, the CEO of web-based AI chatbot Aimee Says. "And we want to be that safety net between those buoys. We want to be there before the person is ready to reach out to a victim service organization — hopefully, early enough to prevent a future of violence."

With the recent rise of artificial intelligence chatbots such as ChatGPT, Wintemute and Steven Nichols co-founded the app in Denver, Colorado, with the aim of helping victims of domestic violence. Around 300 Alabamians have taken advantage of the resource so far, they said.

"I do think that it (AI technology) can be a benefit," Cherrelle Locke, program director for Crisis Services of North Alabama said. "It is a very new thing. I have not actually worked with it a lot. We are just now preparing to actually see more of it and see how it works."

Nichols said AI, a "phase shift in technology," allows for problem-solving on a large scale.

"And what better problem to work on than domestic violence?" he said.

A Department of Justice-funded study, published in Police Chief Online in July, found that the technology can also be used to enhance police departments' ability to respond to certain needs and assist in recovery for victims of other crimes. The DOJ collaborated with the Greensboro Police Department in North Carolina and others to test-run an Enhanced Virtual Victim Assistant (EVVA) developed by the nonprofit Research Triangle Institute.

"Chatbots offer the opportunity to improve the police's ability to respond to the informational needs of victims and refer them to community-based support and services without an additional drain on limited resources," the study found.

Aimee Says can also refer users to local domestic violence support services, among other things. If you visit the web-based application and ask for help in filing a protection from abuse order in Morgan County, for example, it responds with a detailed, step-by-step process for petitioning Morgan County Circuit Court. It even tells you what legal steps to expect next.

Court records showed there were five petitions for protection orders in Morgan County last week alone.

The people who run Aimee Says can't see what users say to Aimee and vice versa, according to Wintemute. She said it's totally confidential, and guest chats disappear once you navigate away from the page via a red "X" in the bottom right corner of the app.

Locke said people who visit Crisis Services of North Alabama's website can click on a "safe exit" link to quickly navigate away from the page to a blank search browser.
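For readers curious how these privacy features work in a browser, here is a minimal, hypothetical sketch of a "safe exit" control that clears a guest chat and sends the visitor to a neutral page. The element ID, destination URL and use of sessionStorage are illustrative assumptions, not details of Aimee Says' or Crisis Services of North Alabama's actual websites.

```typescript
// Hypothetical "safe exit" handler, sketched from the behavior described
// above. The element ID, destination URL and storage choice are assumptions,
// not the actual code behind either organization's website.

function safeExit(): void {
  // Discard any session-scoped guest chat so nothing remains on the device.
  sessionStorage.clear();

  // Replace the current history entry rather than pushing a new one,
  // so the back button does not return to the support page.
  window.location.replace("https://www.google.com");
}

// Wire the handler to a prominent exit control, such as a red "X" button.
document.getElementById("safe-exit")?.addEventListener("click", safeExit);
```

Replacing the history entry, rather than simply linking away, is what would keep the support page out of the browser's back-button trail.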

"I'm interested to see how they will do this with this new AI tool, whether it's on a phone or a computer, to still give people that level of confidentiality to feel comfortable," she said. "And that it's not going to be accessible by anyone else, specifically their abuser."

Aimee Says can also help users identify different forms and patterns of abuse, according to Wintemute. The Decatur Daily typed a scenario of a partner throwing an object, for example; Aimee responded that she was "sorry to hear that," noted that "throwing objects is a form of emotional abuse" and asked relevant follow-up questions.

"You can document incidents by writing down details like date, time, what happened, and how it made you feel. Include any witnesses or evidence like photos of damage. Keep this in a safe place, perhaps digitally with a secure password. Would you like more suggestions on keeping this documentation secure?" Aimee offered.

The EVVA study found that not all topics or conversations are appropriate for a chatbot to engage in with victims, such as "sensitive topics that require emotional support." It suggested that having live victim services personnel available to continue the conversation could enhance the user experience, but recognized that this could considerably increase the cost of providing a chatbot, especially if the goal is to have the resource available 24/7.

Aimee Says is available 24/7, and Wintemute said it can help bridge gaps in traditional support systems.

"The family justice centers, the shelters, the legal advocacy groups that are federally and state-funded are really targeted at reducing domestic violence homicide. That's where governments tend to pursue and push money," she said. "The thing is, there's a huge pre-crisis space where prevention is possible, and there's a huge post-crisis space."

Wintemute said Aimee Says can help users identify signs and signals of potential future abuse very early in a relationship.

"And then when people do manage to extricate themselves from these relationships, they are often court-ordered to co-parent with their abusers, which means they have a whole future of abuse ahead of them, and there are no services available to them, especially unpaid or low-cost services.

"Those are the gaps we're trying to bridge, the things we don't consider publicly fundable because they're not as tightly associated with homicide, but they're extremely tightly correlated with mental and physical health."

Some victim service organizations have begun using Aimee Says, too, according to Wintemute. The app can help less-experienced advocates navigate cases with victims.

"They'll also use it in a kind of three-way conversation with victims and survivors," she said. "Sometimes that's due to a language barrier, so the advocate can talk in English through Aimee and then ask for conversion into the other language, ask for specific cultural aspects that the VSO might not be aware of, to kind of facilitate the conversation and identifying and responding to the needs that the victim has."

Victim service organizations can offer Aimee Says through their websites at no cost.

Most Aimee Says users so far, 70,000 and counting, use the free, unlimited chat function. Users can also pay for a subscription service that keeps records and can analyze and discuss uploaded documents, such as legal forms.

"If you have access to the people that do this work, use them," Wintemute said. "And use Aimee as a supplement. But what we know is most people actually don't have access."

The National Coalition Against Domestic Violence estimates nearly 38% of Alabama women will experience domestic violence in their lifetime. According to the most recent, incomplete data compiled and published on the state's crime dashboard website, there were 171 domestic violence crimes in Morgan County in 2023, for a rate of 137 per 100,000 residents. In 2022, there were 370 domestic violence crimes in the county, for a rate of 309 per 100,000 residents.

What about AI skeptics?

"Go ahead and try Aimee," Wintemute said. "Make it a hypothetical. Talk about your friend's life, not yours. Our experience is that people who are anxious about trying it ... are quickly surprised."

© 2024 The Decatur Daily (Decatur, Ala.). Distributed by Tribune Content Agency, LLC.