MongoDB Enables AI-Powered Legal Searches with Qura

Oliver Tree

#genAI

The launch of ChatGPT in November 2022 caught the world by surprise.

But while the rest of us marveled at the novelty of its human-like responses, the founders of Qura immediately saw another, more focused use case.

“Legal data is a mess,” said Kevin Kastberg, CTO of Qura. “The average lawyer spends tens of hours each month on manual research. We thought to ourselves, ‘What impact would this new LLM technology have on the way lawyers search for information?’”

And with that, Qura was born.


Gaining trust

From its base in Stockholm, Sweden, Qura set about building an AI-powered legal search engine.

The team trained custom models and did continual pre-training on millions of pages of publicly available legal texts, looking to bring the comprehensive power of LLMs to the complex and intricate language of the law.

“Legal searches have typically been done via keyword search,” said Kastberg. “We wanted to bring the power of LLMs to this field. ChatGPT created hype around the ability of LLMs to write. Qura is one of the first startups to showcase their far more impressive ability to read. LLMs can read and analyze, on a logical and semantic level, millions of pages of textual data in seconds. This is a game changer for legal search.”

Unlike other AI-powered applications, Qura is not interested in generating summaries or “answers” to the questions posed by lawyers or researchers. Instead, Qura aims to provide customers with the best sources and information.

“We deliberately wanted to stay away from generative AI. Our customers can be sure that with Qura there is no risk of hallucinations or bad interpretation. Put another way, we will not put an answer in your mouth; rather, we give you the best possible information to create that answer yourselves,” said Kastberg.

“Our users are looking for hard-to-find sources, not a gen AI summary of the basic sources,” he added.

With this mantra, the company claims to have reduced research times by 78% while surfacing double the number of relevant sources when compared to similar legal search products.

MongoDB in the mix

Qura has worked with MongoDB since the beginning.

“We needed a document database for flexibility. MongoDB was really convenient as we had a lot of unstructured data with many different characteristics,” said Kastberg.

In addition to the flexibility to adapt to different data types, MongoDB also offered the Qura team lightning-fast search capabilities.

“MongoDB Atlas Search is a crucial tool for our search algorithm agents to navigate our huge datasets. This is especially true of the speed at which we can run efficient text searches on huge corpuses of text, an important part of navigating documents,” said Kastberg.
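
To give a sense of what such a query can look like, here is a minimal sketch using PyMongo's aggregation pipeline with the $search stage. The connection string, the database and collection names (qura, legal_documents), the body field, and the index name default are placeholders for illustration, not details of Qura's implementation.

```python
from pymongo import MongoClient

# Hypothetical connection string and collection names for illustration.
client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
collection = client["qura"]["legal_documents"]

# Full-text query against an Atlas Search index (assumed to be named "default"
# and to index a "body" field containing the document text).
pipeline = [
    {
        "$search": {
            "index": "default",
            "text": {
                "query": "force majeure contractual liability",
                "path": "body",
            },
        }
    },
    {"$limit": 10},
    {"$project": {"title": 1, "score": {"$meta": "searchScore"}}},
]

for doc in collection.aggregate(pipeline):
    print(doc["title"], doc["score"])
```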

And when it came to AI, having a vector database to store and retrieve embeddings was also a real benefit.

“Having vector search built into Atlas was convenient and offered an efficient way to work with embeddings and vectorized data,” said Kastberg.
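
As an illustration of that pattern, the sketch below runs a $vectorSearch aggregation stage against the same hypothetical collection. The index name vector_index, the embedding field, and the literal query vector are assumptions made for the example; in practice the query vector would come from the same embedding model used to embed the documents.

```python
from pymongo import MongoClient

# Placeholder connection and collection names, as in the earlier sketch.
client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
collection = client["qura"]["legal_documents"]

# A literal list stands in for the embedding of the user's query.
query_vector = [0.12, -0.03, 0.57]

pipeline = [
    {
        "$vectorSearch": {
            "index": "vector_index",   # assumed Atlas Vector Search index name
            "path": "embedding",       # assumed field holding document embeddings
            "queryVector": query_vector,
            "numCandidates": 200,      # candidates scanned before final ranking
            "limit": 5,                # top matches returned
        }
    },
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
]

for doc in collection.aggregate(pipeline):
    print(doc["title"], doc["score"])
```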

What's next?

Qura's larger goal is to bring about the next generation of intelligent search. The legal space is only the start, and the company has ambitions to expand beyond Sweden and into other industries too.

“We are live with Qura in the legal space in Sweden and will be onboarding EU customers in the coming month. What we are building towards is a new way of navigating huge text databases, and that could be applied to any type of text data, in any industry,” said Kastberg.

Are you building AI apps? Join the MongoDB AI Innovators Program today! Successful participants gain access to free Atlas credits, technical enablement, and invaluable connections within the broader AI ecosystem. If your company is interested in being featured, we’d love to hear from you. Connect with us at ai_adopters@mongodb.com.

Head over to our quick-start guide to get started with Atlas Vector Search today.