Botler AI co-founder Ritika Dutt at the Nahum Gelber Law Library (Photo: Christinne Muschi)

People

Applying AI to the #MeToo landscape

Ritika Dutt, BA’13, and her company Botler AI use artificial intelligence techniques to help victims of sexual misconduct gain a clearer sense of their legal rights.

Story by David Silverberg

January 2018

Several years ago, Ritika Dutt, BA’13, found herself in the frightening position of having to deal with a stalker. She didn’t know what her rights were, what she should do, or if the man’s actions were illegal.

“The more I researched sexual harassment, the more I realized there were many women, and men, like me, who didn’t know their legal rights in these situations,” says Dutt.

Dutt is the co-founder and COO of Botler AI, a Montreal startup that harnesses artificial intelligence approaches to make the legal landscape a little easier to navigate. One of the first areas that Dutt and the Botler AI team decided to focus on stems, in part, from Dutt’s experience with the stalker.

The company launched a new system in December that uses AI technology to pore through more than 300,000 U.S. and Canadian criminal court documents related to what Botler AI refers to as sexual misconduct, including sexual harassment and sexual assault.

Anyone can access the tool free of charge on the company’s site and can interact anonymously with the bot to provide details of their sexual harassment or assault experiences. Once the necessary information is entered, the system generates an incident report that users can hand over to authorities if they wish.
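The article does not describe Botler AI’s system in technical detail, but the workflow it outlines (take a free-text description, compare it against patterns learned from court documents, and produce an incident report) resembles a standard text-classification pipeline. The sketch below is purely illustrative: the categories, training snippets, and scikit-learn components are assumptions for the sake of example, not Botler AI’s actual model or data.

```python
# Hypothetical sketch: classify a free-text incident description against a
# few illustrative Criminal Code categories, then assemble a simple report.
# The labels and training snippets are invented; Botler AI's real models
# and training data are not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy examples standing in for annotated court-document excerpts.
TRAINING_TEXTS = [
    "repeated unwanted messages and following the complainant home",
    "unwelcome touching of the complainant without consent",
    "persistent watching of the complainant's residence at night",
    "grabbed the complainant despite being told to stop",
]
TRAINING_LABELS = [
    "criminal harassment",
    "sexual assault",
    "criminal harassment",
    "sexual assault",
]

# A small TF-IDF + logistic-regression pipeline as a stand-in for a
# classifier trained on a much larger corpus of court documents.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(TRAINING_TEXTS, TRAINING_LABELS)

def build_incident_report(user_description: str) -> str:
    """Classify the user's description and format a plain-text report."""
    predicted_category = classifier.predict([user_description])[0]
    return (
        "INCIDENT REPORT (draft)\n"
        f"Description: {user_description}\n"
        f"Closest category identified: {predicted_category}\n"
        "Note: informational only, not legal advice."
    )

if __name__ == "__main__":
    print(build_incident_report(
        "He kept showing up outside my apartment and sending messages "
        "after I asked him to stop."
    ))
```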

Within the first two days of the product launch, 800 users had tried Botler AI’s new tool. “The great thing about AI tech is that it doesn’t take breaks, doesn’t go on vacations, so it’s available 24/7,” says Dutt, who co-founded Botler AI with Amir Moravej.

Dutt stresses the bot doesn’t dispense actionable legal advice, but offers the user “confidence grounded in legal doctrine.”

Botler AI knew it was the right time to release the bot once the #MeToo campaign caught fire in the wake of the Harvey Weinstein scandal. “People are talking about sexual assault now more than ever, and victims are looking for ways to empower themselves,” Dutt says.

She says sexually inappropriate conduct can be difficult to assess for those who experience it. For example, can a lingering hug constitute sexual assault? What about inappropriate comments about someone’s body? Those kinds of questions may not be explicitly answered within the bot, but the system can flag potential violations of the Criminal Code.

Many victims of sexual misconduct don’t come forward with their allegations, in part because they might not understand what their rights are. “Botler AI is a starting point for many victims,” says Dutt.

She anticipates that her company will be using AI techniques to address other areas of law in the days ahead. “The motivation behind becoming a company was to create a general artificial intelligence that would help the average person with any legal issue.” Machine learning pioneer Yoshua Bengio, BEng’86, MSc’88, PhD’91, the director of the Montreal Institute for Learning Algorithms, is serving as a strategic advisor for the company.

Botler AI is looking to expand its use of AI and responsive bots – immigration law is one area of interest. The company also plans to build a network of trusted lawyers who can work with users who want to explore their legal options.

According to Dutt, many people don’t fully understand the law or their rights in difficult situations. “Our aim is to make the system accessible to everyone, and simple to navigate,” she says. “Once someone is better versed on their situation, it’s up to them to make an educated decision on how they’d like to pursue it.”
