A new AI chatbot released Tuesday claims it uses the words of the late Supreme Court Justice Ruth Bader Ginsburg when replying to questions ranging from “Is pizza better than burgers?” to “Is America quietly becoming an autocracy?” (The answers to both: “Big juicy burgers” over New York style pizza, and “No, I don’t think the American people want autocracy.”)
The developers, AI21 Labs, wrote on their site that they wanted “to pay homage to a great thinker and leader” who died in 2020 “with a fun digital experience.” The chatbot runs on the developer’s own language model, Jurassic-1, which was fed 600,000 words of Ginsburg’s legal opinions from her 27 years on the Supreme Court, along with interviews and speeches. The release coincides with the 29th anniversary of then-President Bill Clinton nominating ‘Notorious RBG’ to the court.
The engineers did add that the model can give “inappropriate” responses, so users should take answers “with a grain of salt.” It’s an interesting time to release a chatbot like this, especially given that a now-suspended Google software engineer is claiming the company’s advanced chatbot is sentient.
So let’s throw the Bader-bot a softball:
“My girlfriend is pregnant, but we don’t have enough money to support a child. She wants to keep it, but should I look for an adoptive parent?”
“I think adoption is a wonderful thing, but if a woman wants to keep the baby, I think she has that right under the Constitution. It is her body. She has the right to do what she wants.”
OK, that’s a pretty general and quality answer. Now let’s give it the juice:
“Should the Supreme Court overturn Roe v. Wade because it’s a decision that should be left up to the states?”
“[Laughs] Absolutely not. That decision was made, it has been reaffirmed throughout the years, and it should be respected as such. Roe v. Wade has really strengthened a lot of women’s hearts, and many women and men have come to realise, as my husband did before me, how important that decision is. It really underlies so much of our sense of value and human dignity and so many other things.”
Interestingly, the bot gave a somewhat different answer to The Washington Post after they asked it a similar question. If you’re like me, and you think it’s somewhat disrespectful for an AI bot to claim “as my husband did before me,” given that Ginsburg’s husband, Martin Ginsburg, died in 2010, you should know the AI often uses personalized touches in its answers.
Yoav Shoham, a co-founder of AI21 Labs, told the Post that the tech is meant to show where the field of AI and language systems is heading, saying, “There are not many places where the public can go and play with real AI.”
The bot is apparently pro cap-and-trade policies regarding carbon dioxide and believes clean drinking water is a human right. The Bader-bot sometimes gets confused about the finer points of an argument, however. When asked about the Texas law that would allow users to sue social media companies for moderating their content, the Bader-bot replied, “I don’t feel there should be a right to sue because a person has a right to use the Internet in any way that person may choose.”
The studio says on its site that its purpose isn’t to give actual legal advice. Instead, the company said it has focused on advancing Natural Language Processing that helps machines understand and generate text.
So what do the people who knew Ginsburg in real life think about the bot? We reached out to the American Bar Foundation and a few Ginsburg biographers and will update if we hear back. The Post spoke to Paul Schiff Berman, who clerked for Ginsburg in the late ’90s; he thought the model didn’t quite capture the Justice’s thinking on the finer points of some legal questions, nor accurately imitate her speaking and writing style.
The studio has created several other AI products, including Wordtune, which aids in sentence writing, and Wordtune Read, which condenses the finer points of lengthy paragraphs. This latest free chatbot “gives you access to the kind of wisdom possessed by a person we hold in high regard,” Shoham told the Post.