Sentiment Chef Agent App with Google Cloud and MongoDB Atlas

Stanimira Vlaeva, Abirami Sukumaran • 16 min read • Published Jun 24, 2024 • Updated Jun 24, 2024
Google Cloud • AI • Atlas • JavaScript
In food and dining, customer reviews hold valuable insights. However, extracting meaning from large volumes of text and media data is challenging. Our team tackled this problem by developing Sentiment Chef, a smart restaurant agent that goes beyond simple review analysis. Here's how we did it.

The problem: Making sense of customer reviews

Customer reviews are everywhere — on social media, review sites, and even directly on business websites. These reviews offer a wealth of opinions, but they're often buried in paragraphs of text or hidden within videos and photos. We wanted to make sense of the sentiment in those reviews, use the multimodal nature of the data to our advantage, and let customers talk to an agent (chat assistant) that helps them make decisions based on summarized feedback.
We chose to tackle this problem in the context of the dining industry — the project we built is a restaurant reviews platform.

The initial solution: Sentiment analysis and summarization

Our first step was to build Sentiment Chef, a web app that used MongoDB Atlas Triggers to capture new restaurant reviews and send them to Google Cloud Functions. These functions leveraged the power of Gemini, a cutting-edge large language model (LLM), to analyze sentiment (positive, neutral, negative) and generate concise summaries of the text. We also used Gemini to extract sentiment and tags from images and videos attached to reviews.
The result? A powerful tool that could quickly assess customer sentiment and provide a snapshot of the key takeaways from reviews. This was a great starting point, but we realized we could do much more.

Evolution: From analysis to actionable insights

We wanted to turn Sentiment Chef into a true conversational agent that could actively help users find their next great meal. To do this, we needed to:
  1. Expand our knowledge base: We used the MongoDB Atlas Data API to integrate our restaurant review data with other relevant information, such as restaurant details, cuisine, and location data.
  2. Build an analytics engine: We leveraged BigQuery and its machine learning capabilities to create an engine that could process this expanded dataset, create embeddings for semantic search, and generate nuanced responses to user queries.
  3. Create a conversational agent: We turned to the Vertex AI Agent Builder console to create a chat interface that could interact with our analytics engine, providing users with personalized restaurant recommendations based on their preferences and search criteria.

The result: A restaurant agent that gets it

The evolved Sentiment Chef is now a sophisticated chatbot that understands natural language and can answer questions like:
  • What are the best Italian restaurants near me with outdoor seating?
  • I'm looking for a vegan-friendly restaurant with a lively atmosphere.
  • Can you recommend a quiet place for a romantic dinner?
The agent uses the rich data stored in MongoDB Atlas and the insights generated by our BigQuery analytics engine to deliver accurate and helpful responses.

Summary of the solution: Sentiment analysis web application and smart chatbot agent

The final project has two main components: a web application for submitting and analyzing restaurant reviews, and an intelligent chatbot agent for interpreting the reviews.

Sentiment analysis and summarization with Gemini and MongoDB Atlas Triggers

Sentiment Chef is a web application integrating MongoDB Atlas and Google Cloud to demonstrate generative AI features. The goal of this step is to set up a serverless pipeline that takes the written review about a restaurant via the web app, processes it, and outputs the sentiment (positive, neutral, or negative) along with a ranking (1 to 5) for food, service, and atmosphere — all from the text in the review.
The app supports uploading media (images and videos) to be attached to reviews. The files are uploaded to a Google Cloud Storage bucket and analyzed by Gemini to extract sentiment, tags, and descriptions. The media sentiment contributes to the overall sentiment of the review. The review text, together with the image tags and descriptions, is indexed with an Atlas Search index, which allows users to easily find review images with text search.
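As an illustration, an Atlas Search index covering those fields might be defined like this (the field names `text`, `media.tags`, and `media.description` are assumptions for the sketch, not the exact schema from the repo):

```json
{
  "mappings": {
    "dynamic": false,
    "fields": {
      "text": { "type": "string" },
      "media": {
        "type": "document",
        "fields": {
          "tags": { "type": "string" },
          "description": { "type": "string" }
        }
      }
    }
  }
}
```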

Restaurant agent with Vertex AI agents console

The restaurant agent is a conversational assistant that helps users find restaurants that match their needs. The goal of this app is to showcase a restaurant agent powered by the user reviews and restaurant information created in Part 1. The MongoDB Atlas Data API feeds the data about restaurants, reviews, locations, and ratings into the BigQuery analytics engine that powers the Agent Builder application. The analytics engine comprises tables, models (Gemini 1.5 Pro and textembedding-gecko embeddings), and queries (vector search and remote Gemini invocation with ML.GENERATE_TEXT). A serverless Cloud Function invokes the analytics engine on the user's chat input, and the Agent Builder console app calls this Cloud Functions endpoint (described by an OpenAPI spec in YAML format) as a tool to respond to the user's chat messages. This whole experience can be surfaced from within the Part 1 web app using the agent's Dialogflow CX API endpoint.

The hands-on deep-dive

In the following section, you will discover the specific tech stack we used to implement the project, the data flow, and links to the code repository.

Building the Sentiment Chef web application

The first part of the project is the Sentiment Chef web application. The application allows users to submit restaurant reviews and analyzes their sentiments, scoring them in three categories: food, service, and atmosphere.
Customer review with AI-generated sentiment analysis
Finally, once three or more reviews are submitted, the application generates a summary. The summarization function will only run if the existing summary is more than seven days old.
Sentiment Chef app showing an AI-generated summary of restaurant reviews

Tech stack

The front end of the application is built with Angular. The app is hosted on Firebase Hosting.
The rest of the technologies are described below:
  • MongoDB Atlas — a fully managed cloud database service that handles deployment, scaling, and operations for MongoDB
    • MongoDB Atlas Data API — a REST-like interface to interact with MongoDB data, enabling easy integration with various applications and services
    • MongoDB Atlas Functions — serverless JavaScript functions running in MongoDB Atlas
    • MongoDB Atlas Triggers — event-driven triggers within MongoDB Atlas that invoke Atlas Functions in response to database changes
  • Google Cloud
    • Vertex AI Gemini API — the gemini-1.0-pro and gemini-1.0-pro-vision models
    • Google Cloud Storage
    • Firebase Hosting
The app leverages Atlas Triggers and Atlas Functions to capture event-driven data interactions within database collections. This data is then enriched via Vertex AI Gemini, enabling advanced text sentiment analysis and comprehensive text summarization.
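To make this concrete, here is a minimal sketch of what a database trigger function for this flow can look like. The Cloud Function URL, database, and collection names below are placeholders, not the exact ones from the project repo.

```javascript
// Sketch of an Atlas Function attached to a database trigger on raw_reviews.
exports = async function (changeEvent) {
  const review = changeEvent.fullDocument;

  // Forward the review text to the Cloud Function that calls Gemini.
  const response = await context.http.post({
    url: "https://<region>-<project>.cloudfunctions.net/analyzeReview", // placeholder
    body: { text: review.text },
    encodeBodyAsJSON: true
  });
  const sentiment = EJSON.parse(response.body.text());

  // Store the enriched review where the web app can pick it up.
  await context.services
    .get("mongodb-atlas")
    .db("restaurant_reviews")          // placeholder database name
    .collection("processed_reviews")
    .insertOne({ ...review, sentiment });
};
```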

The media upload flow

Media files (images and videos) are uploaded to a Google Cloud Storage bucket leveraging signed URLs. Once the files are uploaded, the app calls a Google Cloud function to extract metadata using Gemini. The output is a JSON array representing the extracted tags, sentiment, and description. The enriched metadata is sent back to the app. Once a review is submitted, the image URLs and metadata are stored in Atlas.
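For example, generating a V4 signed upload URL for a media file might look like the sketch below (the bucket name and expiry are illustrative assumptions):

```javascript
// Sketch: issue a short-lived signed URL the browser can PUT the file to.
const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

async function getUploadUrl(fileName, contentType) {
  const [url] = await storage
    .bucket("sentiment-chef-media")              // placeholder bucket name
    .file(fileName)
    .getSignedUrl({
      version: "v4",
      action: "write",                           // allows a single PUT upload
      expires: Date.now() + 15 * 60 * 1000,      // valid for 15 minutes
      contentType
    });
  return url;
}
```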

The review submission flow

The submitted review is stored in a MongoDB Atlas collection named raw_reviews. An Atlas trigger then forwards this data to a Google Cloud function. The Google Cloud function enhances the review text by sending it to Vertex AI Gemini using the Node.js SDK for Vertex AI. After Gemini performs inference on the processed review, the sentiment analysis results are sent back to the Google Cloud function. This function then returns the sentiment data to the Atlas function, which stores the enriched review in the processed_reviews collection. The web app has an active watcher that is notified of any new processed reviews, allowing it to immediately display them in the UI.
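A minimal sketch of the Gemini call inside that Cloud Function, using the Node.js SDK for Vertex AI, could look like this (the project ID, prompt wording, and output shape are assumptions for illustration):

```javascript
const { VertexAI } = require("@google-cloud/vertexai");

const vertexAI = new VertexAI({ project: "my-project", location: "us-central1" });
const gemini = vertexAI.getGenerativeModel({ model: "gemini-1.0-pro" });

async function analyzeReview(reviewText) {
  const prompt = `Analyze this restaurant review. Respond with JSON containing
"sentiment" (positive, neutral, or negative) and 1-5 rankings for "food",
"service", and "atmosphere". Review: ${reviewText}`;

  const result = await gemini.generateContent(prompt);
  const reply = result.response.candidates[0].content.parts[0].text;
  return JSON.parse(reply); // e.g., { sentiment: "positive", food: 5, service: 4, atmosphere: 4 }
}
```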
You can explore the live Sentiment Chef application and watch the demo video. The complete code for the project is available in the dedicated GitHub repository.

Building the Restaurant Agent chat assistant

The second part of the project is the Restaurant Agent. The Agent is a chat assistant that can answer questions based on the review data.
The agent taps into the detailed data in MongoDB Atlas and the insights from a BigQuery analytics engine to give you accurate and useful responses.

Tech stack

  • MongoDB Atlas and the MongoDB Atlas Data API
  • Google Cloud
    • Vertex AI generative AI models — gemini-1.0-pro, textembedding-gecko
    • Google Cloud Functions
    • Vertex AI Agent Builder
    • BigQuery, BQML (Remote Models, Generate_Text, Vector Search)

Data ingestion

There are multiple ways to interact with data from MongoDB Atlas.
  • MongoDB Atlas Data API
  • MongoDB to BigQuery Dataflow templates: Dataflow provides MongoDB-to-BigQuery template jobs for both streaming and batch data. You can schedule them as needed to bring the operational data from MongoDB into BigQuery for the analytics engine.
In this scenario, we need to pre-process the data. That's why we converted the data to JSONL format, corrected some records, and made it available for you to ingest into your BigQuery tables.
Follow the steps in this GitHub file — BigQuery Data Ingestion — to set up the BigQuery dataset and tables with the data necessary for the analytics engine.
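For illustration only, loading newline-delimited JSON from a Cloud Storage bucket into a BigQuery table can be done with a LOAD DATA statement like the one below; the bucket path is a placeholder, so follow the exact commands from the GitHub file above for the real setup.

```sql
-- Illustrative sketch: load JSONL exported from MongoDB Atlas into BigQuery.
LOAD DATA INTO restaurant_review.raw_reviews
FROM FILES (
  format = 'JSON',                              -- newline-delimited JSON
  uris = ['gs://your-bucket/raw_reviews.jsonl'] -- placeholder bucket path
);
```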

Remote models

In this part, we will interact with Vertex AI models from BigQuery to run generative analytics right where the data resides. This requires us to create an external connection so that BigQuery can interact with Vertex AI. Go to BigQuery from the Google Cloud console. You should now see the dataset and tables created under your project ID in the BigQuery Explorer pane.

Creating an external connection

  • Click the +ADD button next to the explorer and select “Connections to external data sources” from the list of options.
  • Select the connection type as “Vertex AI remote models, remote functions and BigLake (Cloud Resource).”
  • Enter a connection name such as bq_llm_connection.
  • Select the location type as “Region” and “us-central1.”
  • Click the Create connection button and your external connection should be created immediately.
  • Go to external connections under your project ID in the explorer pane and click on the connection we just created — it should give you the connection configuration details.
  • Copy the Service Account ID.
  • Go to Identity and Access Management and click the GRANT ACCESS button.
  • Paste the service account ID of the external connection you just copied into the “New principals” field, assign the “Vertex AI User” role, and save.

Creating remote models

Next, you will create remote models to access the Gemini and textembedding-gecko models from BigQuery.
  • Expand your dataset “restaurant_review” and navigate to the tables you just created.
  • On the right, you should be able to see the SQL editor. Click the + sign to open a new editor tab.
  • Create the remote large language generative model and embeddings model in BigQuery by running the following queries in the BigQuery SQL Editor:
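The original DDLs aren't reproduced here, but a minimal sketch of what they can look like follows, assuming the connection created earlier is named bq_llm_connection in us-central1 and the dataset is restaurant_review (adjust the model and endpoint names to the ones you use):

```sql
-- Sketch: a remote Gemini model for text generation...
CREATE OR REPLACE MODEL `restaurant_review.gemini_remote_model`
  REMOTE WITH CONNECTION `us-central1.bq_llm_connection`
  OPTIONS (ENDPOINT = 'gemini-1.5-pro');

-- ...and a remote embedding model backed by textembedding-gecko.
CREATE OR REPLACE MODEL `restaurant_review.embeddings_model`
  REMOTE WITH CONNECTION `us-central1.bq_llm_connection`
  OPTIONS (ENDPOINT = 'textembedding-gecko');
```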
These two DDLs should create the remote models you can use to implement the analytics engine in the next step.

Analytics engine

This custom-built engine will perform the heavy lifting of transforming raw data into actionable insights. It extracts context, creates embeddings, and drives the search for relevant information. We will build this engine in BigQuery.
Summarizing the parameters
To create our “Restaurant Review” analytics engine, we need to build a summary story from all the restaurant and review attributes we brought in through the Data API.
To do that, run the following query.
This query processes and summarizes the data we ingested into the BigQuery tables restaurants and raw_reviews.
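The exact query from the project isn't reproduced here; the sketch below illustrates the idea with assumed column names (adapt them to the schema you ingested):

```sql
-- Sketch: build one natural-language summary per restaurant from the
-- restaurants and raw_reviews tables.
CREATE OR REPLACE TABLE restaurant_review.restaurant_summaries AS
SELECT
  r.restaurant_id,
  ANY_VALUE(r.name) AS name,
  CONCAT(
    'Restaurant: ', ANY_VALUE(r.name),
    '. Cuisine: ', ANY_VALUE(r.cuisine),
    '. Location: ', ANY_VALUE(r.location),
    '. Average rating: ', CAST(ROUND(AVG(v.rating), 1) AS STRING),
    '. Reviews: ', STRING_AGG(v.text, ' | ' LIMIT 20)
  ) AS summary
FROM restaurant_review.restaurants AS r
JOIN restaurant_review.raw_reviews AS v
  ON v.restaurant_id = r.restaurant_id
GROUP BY r.restaurant_id;
```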
Converting the summary to vector embeddings and writing the result into a table
We will convert the summary that we have created in the above query into embeddings using the remote embeddings model we created in one of the earlier steps and store the result in a separate table along with other key information.
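A sketch of that step, using the remote embedding model defined earlier (table and column names follow the previous sketch and are illustrative):

```sql
-- Sketch: embed each summary with the remote textembedding-gecko model
-- and persist the vectors alongside key fields.
CREATE OR REPLACE TABLE restaurant_review.summary_embeddings AS
SELECT
  restaurant_id,
  name,
  content AS summary,
  ml_generate_embedding_result AS embedding
FROM ML.GENERATE_EMBEDDING(
  MODEL `restaurant_review.embeddings_model`,
  (
    SELECT restaurant_id, name, summary AS content
    FROM restaurant_review.restaurant_summaries
  )
);
```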
Similarity search for the user text and LLM validation of the match
We will match the user's search text coming from the agent (which we will build in an upcoming step) against the embeddings in the table we created in the last step.
Let's assume the user's search text is: “I am looking for Indian cuisine restaurants with good ratings.”
Now, we will convert this into embeddings and match it against the embeddings in the summary table we created in the previous step. This is done using BigQuery vector search with the cosine distance method. Then, we will use Gemini 1.5 Pro to validate whether the identified matches are relevant to the user's request by assigning a percentage or score for each match's relevance. We will return the five most relevant matches to the user as the chat response.
The following query executes just what we explained here:
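The original query isn't included here, but a simplified sketch of the pattern — embed the search text, run VECTOR_SEARCH over the summary embeddings, and ask the remote Gemini model to score each match — looks like this (model, table, and column names follow the earlier sketches):

```sql
SELECT
  name,
  summary,
  ml_generate_text_llm_result AS relevance
FROM ML.GENERATE_TEXT(
  MODEL `restaurant_review.gemini_remote_model`,
  (
    SELECT
      base.name AS name,
      base.summary AS summary,
      CONCAT(
        'The user asked: "I am looking for Indian cuisine restaurants with good ratings". ',
        'Candidate: ', base.summary,
        ' Rate how relevant this candidate is to the request as a percentage, and answer briefly.'
      ) AS prompt
    FROM VECTOR_SEARCH(
      TABLE `restaurant_review.summary_embeddings`,
      'embedding',
      (
        SELECT ml_generate_embedding_result AS embedding
        FROM ML.GENERATE_EMBEDDING(
          MODEL `restaurant_review.embeddings_model`,
          (SELECT 'I am looking for Indian cuisine restaurants with good ratings' AS content)
        )
      ),
      'embedding',
      top_k => 5,
      distance_type => 'COSINE'
    )
  ),
  STRUCT(TRUE AS flatten_json_output)
);
```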
The hard-coded user query will be replaced by the request parameter when we integrate this into a Cloud Function.
Cloud functions to process user input
To orchestrate between the chat agent and the analytics engine, we will create a Cloud function that passes the user search text as a JSON request, invokes the BigQuery SQL we discussed in the last step, and responds with the top five relevant matches as a JSON array string. The function will be exposed as a REST API endpoint.
You can use the source code for the Cloud Function from the repository. Navigate to Cloud Functions in the Google Cloud console, create a new Gen2 Cloud function with Java 11 as the runtime environment, and replace the contents of the Java and pom.xml files with the corresponding files from the repository.
You can directly deploy the Cloud function from the Cloud Functions console. Your endpoint should look like this: https://us-central1-*****.cloudfunctions.net/restaurant_agent_from_bq.
Now that we have the analytics engine ready and accessible as a REST endpoint, it is time for us to create the Agent Builder Agent app for our users to interact with the system about restaurants.

Vertex AI agents

Vertex AI Agent Builder offers a no-code agents console where developers can build generative AI agents using natural language instructions and conversational examples for their agent to follow. The agents console features tooling that makes agent building and maintenance easier: developers can turn a prototype into a production-grade, high-quality agent, monitor production traffic, and improve agent responses over time.
In our case, we’ll use Agent Builder to seamlessly connect with the Cloud Functions endpoint we built in the previous section, enabling our restaurant agent to access the review knowledge base and respond to customer queries intelligently.
Building the agent
To build a new Agent Builder agent app, refer to the screenshots and steps in "Building a Smart Retail Shopping Assistant (Part 2)" or the Agent product documentation. Here, we will only cover the configuration items that are unique to our app:
  • Agent Name: Smart Restaurant Review
  • Goal: You are a friendly Restaurant Review Summary and Search agent. Your job is to answer customer questions about the restaurant. Also, help customers find the best restaurant matches for their search from our database.
Before you can enter the complete instructions in the next section, you need to create the tool, since the instructions reference it. So, save the agent at this point and create the tool. To do this, click Tools and + Create.
  • Tool Name: Restaurant Review Tool
  • Type: OpenAPI
  • Description: This tool refers to the back-end dataset as the context for restaurant information. It takes the user's search text, as summarized by the agent, matches it against the most appropriate items, and returns them as an array of items.
  • Schema (YAML):
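The exact schema isn't reproduced here; a minimal sketch of an OpenAPI spec for the Cloud Functions endpoint might look like the following (the request and response field names are assumptions — adapt them to your function):

```yaml
openapi: 3.0.0
info:
  title: Restaurant Review Tool API
  version: 1.0.0
servers:
  - url: https://us-central1-YOUR_PROJECT.cloudfunctions.net  # placeholder project
paths:
  /restaurant_agent_from_bq:
    post:
      summary: Find restaurants matching the user's search text
      operationId: searchRestaurants
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                search:
                  type: string
                  description: The user's summarized search request
      responses:
        "200":
          description: The most relevant restaurant matches
          content:
            application/json:
              schema:
                type: array
                items:
                  type: string
```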
Save the tool.
The tool's schema is an OpenAPI spec, in YAML format, describing the Cloud Functions endpoint we created in the previous section. You can create an OpenAPI schema for any endpoint of your choice that is relevant to your use case.
Navigate back to the agent and edit the instructions section to include the following:
  • Instructions:
  • Greet the user and ask how you can help them today.
  • If they have introduced themselves with their name or other personal details, feel free to use that information to address them.
  • Summarize the user's request and ask them to confirm that you understood correctly.
    • Check if the user search request has more restaurant preferences and details already.
    • If not, seek clarifying details.
    • If the search request is very broad, then ask the user to narrow down the request with specific details that you believe could be personal preferences.
    • Once you have all the necessary details, summarize your final understanding of the request with the user.
  • Use ${TOOL Review Tool} to help the user find the restaurant of their interest.
  • If the request is not related to restaurant details, gracefully convey that you don't have information on that topic.
  • Do not give any information outside the source that is provided to you in ${TOOL Review Tool}.
  • Do not assist with any information unless you are certain that you know the answer.
  • Ask if the user has any other query.
    • If yes, repeat the above step (step 2) again.
    • If no, then move on to the next step.
  • Thank the user for their business and say goodbye.
Save the agent for the latest updates. Test your agent using the preview interface on the right side of the Agent Console.
Agent Console
Here is a rundown of the sample chat I had with the latest agent we built:
Sample conversation
Also, if you’d like to incorporate this agent in your own web application, you can use the Dialogflow CX library to interact with the Dialogflow CX API, specifically the detectIntent method. Refer to “Building a Smart Retail Shopping Assistant (Part 3)” to create an endpoint to access your restaurant review agent from your web app!
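For reference, a minimal sketch of such a call with the Node.js Dialogflow CX client library could look like this (the project, location, and agent IDs are placeholders):

```javascript
// Sketch: send a chat message to the agent and collect its text replies.
const { SessionsClient } = require("@google-cloud/dialogflow-cx");

// Regional agents require the matching regional API endpoint.
const client = new SessionsClient({ apiEndpoint: "us-central1-dialogflow.googleapis.com" });

async function askAgent(message, sessionId) {
  const session = client.projectLocationAgentSessionPath(
    "my-project", "us-central1", "my-agent-id", sessionId // placeholders
  );

  const [response] = await client.detectIntent({
    session,
    queryInput: {
      text: { text: message },
      languageCode: "en"
    }
  });

  // Concatenate the agent's text responses into a single reply.
  return response.queryResult.responseMessages
    .filter((m) => m.text)
    .map((m) => m.text.text.join(" "))
    .join("\n");
}
```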

What's next?

We're excited about the potential of Sentiment Chef and conversational AI! As you may have guessed, this use case applies to any application that collects customer feedback — whether it's for restaurants, online services, or retail stores. Understanding and responding to sentiment opens up new possibilities for enhancing customer satisfaction and refining user experiences across various industries.
Future improvements may include:
  • Multimodal prompts: Allowing users to search with text and images (e.g., "Find me a restaurant that looks like this picture")
  • Enhanced personalization: Tailoring recommendations based on individual user preferences and past dining experiences
  • Integration with reservation systems: Enabling users to book tables directly through the chat interface
Let us know your thoughts and how you're using AI to enhance your projects!

Conclusion

In conclusion, building a smart restaurant review summary and search agent with MongoDB, BigQuery, Gemini 1.5 Pro, and Agent Builder was a challenging yet satisfying project. By leveraging the power of the MongoDB Atlas Developer Data Platform, BigQuery's advanced analytics capabilities, the Gemini model’s advanced reasoning and state-of-the-art generation capabilities, and Agent Console's intuitive conversational experience, we created a conversational agent that can provide users with comprehensive information about restaurants. This agent not only enhances the customer experience but also showcases the potential of AI in revolutionizing the hospitality industry.
We hope this article has inspired you to explore the possibilities of AI and cloud technologies for your own projects. Let us know what you build with it!
If you have questions or want to connect with other developers building something great with MongoDB, head to the Developer Community next. To stay ahead of the curve on Google Cloud news with updates sent right to your inbox, join live discussions, AMAs, and roadmap sessions to learn the latest directly from Googlers, or get free Qwiklab credits, sign up for Google Cloud Innovators.
