Meet new ChatGPT challenger Claude

SAN FRANCISCO – Anthropic, a Google-backed artificial intelligence (AI) start-up, is making its rival to OpenAI’s popular ChatGPT available to businesses that want to add the chatbot to their products.

The start-up, created in 2021 by former leaders of OpenAI, including siblings Daniela and Dario Amodei, said the chatbot, named Claude, has been tested during the past few months by technology companies such as Notion Labs, Quora and search engine DuckDuckGo. Quora, for instance, included the chatbot in an app called Poe, which lets users ask questions.

Companies that want to use Claude can sign up via a waiting list. Anthropic aims to offer access within days of the request.

The start-up is also offering a version called Claude Instant, which is less powerful but cheaper and speedier. Earlier in March, OpenAI released ChatGPT for businesses.

Although chatbots themselves are by no means new, Claude is one of a breed of much more powerful tools that have been trained on massive swathes of the Internet to generate text that mimics human speech far better than their predecessors.

Such tools are an application of generative AI, which refers to artificial intelligence systems that consider input such as a text prompt and use it to output new content such as text or images.

OpenAI released ChatGPT for widespread testing last November, unleashing a stampede of tech companies unveiling their own chatbots.

In February, Google said it had started testing its version, Bard, while Microsoft, which has invested US$11 billion (S$14.8 billion) in OpenAI, added a chatbot based on the start-up’s technology to its Bing search engine. Google has invested almost US$400 million in Anthropic, Bloomberg reported in February.

Similar to ChatGPT, Claude is a large language model that can be used for a range of written tasks like summarising, searching, answering questions and coding.

While ChatGPT has faced criticism – and been tweaked – after offering users some disturbing results, Anthropic is positioning its chatbot as a more cautious one from the start. Essentially, it is meant to be harder to wring offensive results from it.

Mr Dario Amodei, Anthropic’s chief executive, said the start-up has been slowly rolling out tests of Claude.

“I don’t want to say all the problems have been solved,” he said. “I think all of these models, including ours, they sometimes hallucinate, they sometimes make things up.”

When used recently via Quora’s Poe app, Claude was easy to converse with, offered snappy answers, and responded apologetically when a tester was unhappy with its replies.

For instance, in one exchange via the Poe app, the chatbot was asked to suggest nicknames for a daughter and then for a son. When the bot was questioned about the results – which included champ, buddy and tiger for a boy and sweet pea, princess and angel for a girl – Claude acknowledged its suggestions “fell into some gender stereotypes”.

“I appreciate you calling out my gender bias,” it typed. “I will be more mindful of avoiding stereotypes in the future. The most important thing is that nicknames are chosen with love and show appreciation for your child’s unique qualities.”

This exchange looks like cautiousness on the part of the chatbot. But many people will simply take the chatbot’s initial answers and move on, rather than asking a follow-up question like the one that prompted Claude to detail its bias, said Dr Julie Carpenter, a research scientist and fellow in the Ethics and Emerging Sciences Group at California Polytechnic State University in San Luis Obispo County.

“It only explained the bias when you followed up with a critical question,” she said. “If you had not exposed the bias, it would have just presented it as an answer. And that is the potential harm that you found.” BLOOMBERG