Ainhoa Múgica


Better Digital Banking Experiences with AI and MongoDB

Interactive banking represents a new era in financial services where customers engage with digital platforms that anticipate, understand, and meet their needs in real time. This approach encompasses AI-driven technologies such as chatbots, virtual assistants, and predictive analytics that allow banks to enhance digital self-service while delivering personalized, context-aware interactions. According to Accenture’s 2023 consumer banking study, 44% of consumers aged 18-44 reported difficulty accessing human support when needed, underscoring the demand for more responsive digital solutions that help bridge the gap between customers and financial services. Generative AI technologies like chatbots and virtual assistants can fill this need by instantly addressing inquiries, providing tailored financial advice, and anticipating future needs. This shift has tremendous growth potential: the global chatbot market is expected to grow at a CAGR of 23.3% from 2023 to 2030, with the financial sector experiencing the fastest growth rate at 24.0%. This shift is more than a convenience; it aims to create a smarter, more engaging, and more intuitive banking journey for every user.

Simplifying self-service banking with AI

Navigating daily banking activities like transfers, payments, and withdrawals can often raise immediate questions for customers: “Can I overdraft my account?” “What will the penalties be?” or “How can I avoid these fees?” While the answers usually lie within the bank’s terms and conditions, these documents are often dense, complex, and overwhelming for the average user. At the same time, customers value their independence and want to handle their banking needs through self-service channels, but wading through extensive fine print isn't what they signed up for. By integrating AI-driven advisors into the digital banking experience, banks can provide a seamless, in-app solution that delivers instant, relevant answers.
This removes the need for customers to leave the app to sift through pages of bank documentation in search of answers, or worse, endure the inconvenience of calling customer service. The result is a smoother, more user-friendly interaction in which customers feel supported in their self-service journey, free from the frustration of navigating traditional, cumbersome information sources. The entire experience remains within the application, enhancing convenience and efficiency.

Solution overview

This AI-driven solution enhances the self-service experience in digital banking by applying retrieval-augmented generation (RAG) principles, which combine the power of generative AI with reliable information retrieval, ensuring that the chatbot provides accurate, contextually relevant responses. The approach begins by processing dense, text-heavy documents, like terms and conditions, that are often the source of customer inquiries. These documents are divided into smaller, manageable chunks that are vectorized to create searchable data representations. Storing these vectorized chunks in MongoDB Atlas allows for efficient querying using MongoDB Atlas Vector Search , making it possible to instantly retrieve relevant information based on the customer’s question.

Figure 1: Detailed solution architecture

When a customer inputs a question in the banking app, the system quickly identifies and retrieves the most relevant chunks using semantic search. The AI then uses this information to generate clear, contextually relevant answers within the app, enabling a smooth, frustration-free experience without requiring customers to sift through dense documents or contact support.

Figure 2: Leafy Bank mock-up chatbot in action

How MongoDB supports AI-driven banking solutions

MongoDB offers unique capabilities that empower financial institutions to build and scale AI-driven applications.
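To make the ingestion step above concrete, here is a minimal sketch of the chunk-and-embed flow in Python. The chunk sizes, field names, and the `embed` callable are illustrative assumptions; in a real deployment the embeddings would come from an embedding model and the documents would be written to a MongoDB Atlas collection with a driver such as PyMongo.

```python
# Sketch of the RAG ingestion flow: split a dense document (e.g., terms and
# conditions) into overlapping chunks, embed each chunk, and prepare the
# documents that would be inserted into a MongoDB Atlas collection.

def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows so each chunk keeps
    some context from its neighbor."""
    chunks = []
    step = size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break
    return chunks

def build_chunk_docs(text: str, embed) -> list[dict]:
    """Pair each chunk with its vector embedding, ready for insertion
    (e.g., collection.insert_many(docs) with PyMongo)."""
    return [
        {"text": chunk, "embedding": embed(chunk)}
        for chunk in chunk_text(text)
    ]
```

In practice the chunking strategy (size, overlap, splitting on sentences vs. characters) is tuned to the document type; terms and conditions often chunk well on clause boundaries.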
Unified data model for flexibility: MongoDB’s flexible document model unifies structured and unstructured data, creating a consistent dataset that enhances the AI’s ability to understand and respond to complex queries. This model enables financial institutions to store and manage customer data, transaction history, and document content within a single system, streamlining interactions and making AI responses more contextually relevant.

Vector search for enhanced querying: MongoDB Atlas Vector Search makes it easy to perform semantic searches on vectorized document chunks, quickly retrieving the most relevant information to answer user questions. This capability allows the AI to find precise answers within dense documents, enhancing the self-service experience for customers.

Scalable integration with AI models: MongoDB is designed to work seamlessly with leading AI frameworks, allowing banks to integrate and scale AI applications quickly and efficiently. By aligning MongoDB Atlas with cloud-based LLM providers, banks can use the best tools available to interpret and respond to customer queries accurately, meeting demand with responsive, real-time answers.

High performance and cost efficiency: MongoDB’s multi-cloud, developer-friendly platform allows financial institutions to innovate without costly infrastructure changes. It’s built to scale as data and AI needs grow, ensuring banks can continually improve the customer experience with minimal disruption. MongoDB’s built-in scalability allows banks to expand their AI capabilities effortlessly, offering a future-proof foundation for digital banking.
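As a sketch of how the semantic retrieval described above is expressed in practice, the helper below builds an aggregation pipeline around Atlas Vector Search's `$vectorSearch` stage. The index name `vector_index` and the field path `embedding` are assumptions and must match the search index actually defined on your collection.

```python
# Build an aggregation pipeline that asks Atlas Vector Search for the k
# chunks most similar to a query embedding. Run it with
# collection.aggregate(pipeline) against an indexed collection.

def semantic_search_pipeline(query_vector, k=5, num_candidates=100,
                             index="vector_index", path="embedding"):
    return [
        {
            "$vectorSearch": {
                "index": index,            # name of the vector search index
                "path": path,              # field holding the embeddings
                "queryVector": query_vector,
                "numCandidates": num_candidates,  # ANN candidates to consider
                "limit": k,                # results returned
            }
        },
        # Keep the chunk text and expose the similarity score.
        {"$project": {"_id": 0, "text": 1,
                      "score": {"$meta": "vectorSearchScore"}}},
    ]
```

The retrieved chunks are then passed to the LLM as context for answer generation.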
Building future-proof applications

Implementing generative AI presents several advantages, not only for end users of interactive banking applications but also for financial institutions. An enhanced user experience drives customer satisfaction, retention, and reputation while reducing customer turnover, and it unlocks new opportunities for cross-selling and up-selling to increase revenue, drive growth, and elevate customer value. Moreover, adopting AI-driven initiatives lays the groundwork for businesses to develop innovative, creative, and future-proof applications that address customer needs and upgrade business applications with features that are shaping the industry and will continue to do so. Here are some examples:

Summarize and categorize transactional information by powering applications with MongoDB’s Real-Time Analytics .

Understand and detect trends in customer behavior that can strengthen fraud prevention , anti-money laundering (AML) , and credit card application assessment, to name a few.

Offer investing, budgeting, and loan assessments through AI-powered conversational banking experiences.

In today’s data-driven world, companies face increasing pressure to stay ahead of rapid technological advancements and ever-evolving customer demands. Now more than ever, businesses must deliver intuitive, robust, and high-performing services through their applications to remain competitive and meet user expectations. Luckily, MongoDB provides businesses with comprehensive reference architectures for building generative AI applications: an end-to-end technology stack that includes integrations with leading technology providers, professional services, and a coordinated support system through the MongoDB AI Applications Program (MAAP) .
By building AI-enriched applications with the leading multi-cloud developer data platform, companies can leverage low-cost, efficient solutions through MongoDB’s flexible and scalable document model, which empowers businesses to unify real-time, operational, unstructured, and AI-related data, extending and customizing their applications to seize upcoming technological opportunities. Check out these additional resources to get started on your AI journey with MongoDB:

How Leading Industries are Transforming with AI and MongoDB Atlas - E-book

Our Solutions Library , where you can learn about different use cases for gen AI and other topics applied to financial services and many other industries.

November 26, 2024

Anti-Money Laundering and Fraud Prevention With MongoDB Vector Search and OpenAI

Fraud and anti-money laundering (AML) are major concerns for both businesses and consumers, affecting sectors like financial services and e-commerce. Traditional methods of tackling these issues, including static, rule-based systems and predictive artificial intelligence (AI) methods, work but have limitations, such as a lack of context and the feature engineering overhead of keeping models relevant, which can be time-consuming and costly. Vector search can significantly improve fraud detection and AML efforts by addressing these limitations, representing the next step in the evolution of machine learning for combating fraud. Any organization that is already benefiting from real-time analytics will find that this breakthrough in anomaly detection takes fraud and AML detection accuracy to the next level. In this post, we examine how real-time analytics powered by Atlas Vector Search enables organizations to uncover deeply hidden insights before fraud occurs. Check out our AI resource page to learn more about building AI-powered apps with MongoDB.

The evolution of fraud and risk technology

Over the past few decades, fraud and risk technology has evolved in stages, with each stage building on the strengths of previous approaches while addressing their weaknesses:

Risk 1.0: In the early stages (the late 1990s to 2010), risk management relied heavily on manual processes and human judgment, with decision-making based on intuition, past experience, and limited data analysis. Rule-based systems emerged during this time, using predefined rules to flag suspicious activities. These rules were often static and lacked adaptability to changing fraud patterns .

Risk 2.0: With the evolution of machine learning and advanced analytics (from 2010 onwards), risk management entered a new era. Predictive modeling techniques were employed to forecast future risks and detect fraudulent behavior.
Systems were trained on historical data and became more integrated, allowing for real-time data processing and the automation of decision-making processes. However, these systems faced limitations:

Feature engineering overhead: Risk 2.0 systems often require manual feature engineering.

Lack of context: Risk 1.0 and Risk 2.0 may not incorporate a wide range of variables and contextual information.

Risk 2.0 solutions are often used in combination with rule-based approaches because rules cannot be avoided: companies have business- and domain-specific heuristics and other rules that must be applied. Here is an example fraud detection solution based on Risk 1.0 and Risk 2.0 with a rules-based and traditional AI/ML approach.

Risk 3.0: The latest stage (2023 and beyond) in fraud and risk technology evolution is driven by vector search. This advancement leverages real-time data feeds and continuous monitoring to detect emerging threats and adapt to changing risk landscapes, addressing the limitations of data imbalance, manual feature engineering, and the need for extensive human oversight while incorporating a wider range of variables and contextual information. Depending on the particular use case, organizations can combine or use these solutions to effectively manage and mitigate risks associated with fraud and AML. Now, let us look at how MongoDB Atlas Vector Search (Risk 3.0) can help enhance existing fraud detection methods.

How Atlas Vector Search can help

A vector database is an organized collection of information that makes it easier to find similarities and relationships between different pieces of data. This makes MongoDB particularly effective: rather than requiring a standalone, bolt-on vector database, vector search is built directly into the platform.
The versatility of MongoDB’s developer data platform empowers users to store their operational data, metadata, and vector embeddings on MongoDB Atlas and seamlessly use Atlas Vector Search to index, retrieve, and build performant gen AI applications. Watch how you can revolutionize fraud detection with MongoDB Atlas Vector Search. The combination of real-time analytics and vector search offers a powerful synergy that enables organizations to discover insights that are otherwise elusive with traditional methods. MongoDB facilitates this through Atlas Vector Search integrated with OpenAI embeddings, as illustrated in Figure 1 below.

Figure 1: Atlas Vector Search in action for fraud detection and AML

Business perspective: Fraud detection vs. AML

Understanding the distinct business objectives and operational processes driving fraud detection and AML is crucial before diving into the use of vector embeddings.

Fraud detection is centered on identifying unauthorized activities aimed at immediate financial gain through deceptive practices. The detection models, therefore, look for specific patterns in transactional data that indicate such activities. For instance, they might focus on high-frequency, low-value transactions, which are common indicators of fraudulent behavior.

AML, on the other hand, targets the complex process of disguising the origins of illicitly gained funds. The models here analyze broader and more intricate transaction networks and behaviors to identify potential laundering activities. For instance, AML could look at the relationships between transactions and entities over a longer period.

Creation of vector embeddings for fraud and AML

Fraud and AML models require different approaches because they target distinct types of criminal activities. To accurately identify these activities, machine learning models use vector embeddings tailored to the features of each type of detection.
In the solution highlighted in Figure 1, vector embeddings for fraud detection are created using a combination of text, transaction, and counterparty data. Conversely, the embeddings for AML are generated from data on transactions, relationships between counterparties, and their risk profiles. The selection of data sources, including the use of unstructured data and the creation of one or more vector embeddings, can be customized to meet specific needs. This particular solution utilizes OpenAI for generating vector embeddings, though other software options can also be employed. Historical vector embeddings are representations of past transaction data and customer profiles encoded into a vector format. The demo database is prepopulated with synthetically generated test data for both fraud and AML embeddings. In real-world scenarios, you can create embeddings by encoding historical transaction data and customer profiles as vectors.

In the fraud and AML detection workflow shown in Figure 1, incoming transaction fraud and AML aggregated text is used to generate embeddings using OpenAI. These embeddings are then analyzed using Atlas Vector Search based on the percentage of previous transactions with similar characteristics that were flagged for suspicious activity. In Figure 1, the term "Classified Transaction" indicates a transaction that has been processed and categorized by the detection system. This classification helps determine whether the transaction is considered normal, potentially fraudulent, or indicative of money laundering, thus guiding further actions.

If flagged for fraud: The transaction request is declined.

If not flagged: The transaction is completed successfully, and a confirmation message is shown.

For rejected transactions, users can contact case management services with the transaction reference number for details. No action is needed for successful transactions.
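The "percentage of similar flagged transactions" decision described above can be sketched as a small post-processing step over the documents returned by Atlas Vector Search. The `is_flagged` field name and the 50% threshold are illustrative assumptions, not the demo's exact values.

```python
def classify_transaction(similar_txns, threshold=0.5):
    """Classify an incoming transaction from its nearest historical neighbors.

    similar_txns: documents returned by vector search, each assumed to carry
    a boolean 'is_flagged' marking past suspicious activity.
    """
    if not similar_txns:
        return "normal"  # no comparable history to judge against
    flagged = sum(1 for t in similar_txns if t.get("is_flagged"))
    ratio = flagged / len(similar_txns)
    return "suspicious" if ratio >= threshold else "normal"
```

A "suspicious" result would decline the transaction request; a "normal" result lets it complete with a confirmation message.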
Combining Atlas Vector Search and OpenAI embeddings for fraud detection

By using Atlas Vector Search with OpenAI embeddings, organizations can:

Eliminate the need for the batch and manual feature engineering required by predictive (Risk 2.0) methods.

Dynamically incorporate new data sources to perform more accurate semantic searches, addressing emerging fraud trends.

Adopt this method for mobile solutions, where traditional methods are often costly and performance-intensive.

Why MongoDB for AML and fraud prevention

Fraud and AML detection require a holistic platform approach because they involve diverse data sets that are constantly evolving. Customers choose MongoDB because it is a unified data platform (as shown in Figure 2 below) that eliminates the need for niche technologies, such as a dedicated vector database. What’s more, MongoDB’s document data model incorporates any kind of data, in any structure (structured, semi-structured, or unstructured), any format, and from any source, no matter how often it changes, allowing you to create a holistic picture of customers to better predict transaction anomalies in real time. By incorporating Atlas Vector Search, institutions can:

Build intelligent applications powered by semantic search and generative AI over any type of data.

Store vector embeddings right next to your source data and metadata. Vectors inserted or updated in the database are automatically synchronized to the vector index.

Optimize resource consumption, improve performance, and enhance availability with Search Nodes .

Remove operational heavy lifting with the battle-tested, fully managed MongoDB Atlas developer data platform.

Figure 2: Unified risk management and fraud detection data platform

Given the broad and evolving nature of fraud detection and AML, these areas typically require multiple methods and a multimodal approach. Therefore, a unified risk data platform offers several advantages for organizations aiming to build effective solutions.
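Storing embeddings next to source data, as described above, requires a vector search index on the embedding field. Below is a minimal sketch of such an index definition; the field name, dimensionality, and similarity metric are assumptions that must match your embedding model (e.g., 1536 dimensions for OpenAI's text-embedding models).

```python
# Vector search index definition of the kind created on an Atlas collection
# (e.g., via the Atlas UI or a driver's search-index helper). The embeddings
# live in the same documents as the transaction data they describe.

def vector_index_definition(path="embedding", dims=1536, similarity="cosine"):
    return {
        "fields": [
            {
                "type": "vector",       # marks this field for vector search
                "path": path,           # document field holding the embedding
                "numDimensions": dims,  # must equal the embedding model's size
                "similarity": similarity,  # "cosine", "euclidean", or "dotProduct"
            }
        ]
    }
```

Once the index exists, vectors inserted or updated in the collection are synchronized to it automatically, as noted above.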
Using MongoDB, you can develop solutions for Risk 1.0, Risk 2.0, and Risk 3.0, either separately or in combination, tailored to meet your specific business needs. The concepts are demonstrated with two examples: a card fraud solution accelerator for Risk 1.0 and Risk 2.0, and a new vector search solution for Risk 3.0, as discussed in this blog. It's important to note that the vector search-based Risk 3.0 solution can be implemented on top of Risk 1.0 and Risk 2.0 to enhance detection accuracy and reduce false positives. If you would like to discover more about how MongoDB can help you supercharge your fraud detection systems, take a look at the following resources:

Revolutionizing Fraud Detection with Atlas Vector Search

Card Fraud solution accelerator (Risk 1.0 and Risk 2.0)

Risk, AML, and Fraud detection solution GitHub repository

Add vector search to your arsenal for more accurate and cost-efficient RAG applications by enrolling in the DeepLearning.AI course " Prompt Compression and Query Optimization " for free today.

July 17, 2024

Transforming Predictive Maintenance with AI: Real-Time Audio-Based Diagnostics with Atlas Vector Search

Wind turbines are a critical component in the shift away from fossil fuels toward more sustainable, green sources of energy. According to the International Energy Agency (IEA), the global capacity of wind energy has been growing rapidly, reaching over 743 gigawatts by 2023. Wind energy, in particular, has some of the greatest potential to increase countries' renewable capacity: solar PV and wind additions are forecast to more than double by 2028 compared with 2022, continuously breaking records over the forecast period. This growth highlights the increasing reliance on wind power and, consequently, the need for effective maintenance strategies.

Keeping wind turbines operating at maximum capacity is essential to ensuring their continued contribution to the energy grid. Like any mechanical device, wind turbines must undergo periodic maintenance to keep them operating at optimal levels. In recent years, advancements in technology, particularly in AI and machine learning, have played a significant role by introducing predictive maintenance breakthroughs to industrial processes. By integrating AI into renewable energy systems, organizations of all sizes can reduce costs and gain efficiencies. In this post, we will dig into an AI application use case for real-time anomaly detection through sound input, showcasing the impact of AI and MongoDB Atlas Vector Search for predictive maintenance of wind turbines. Check out our AI resource page to learn more about building AI-powered apps with MongoDB.

Predictive Maintenance in Modern Industries

Companies increasingly invest in predictive maintenance to optimize their operations and drive efficiency. Research from Deloitte indicates that predictive maintenance can reduce equipment downtime by 5–15 percent, increase labor productivity by 5–20 percent, and reduce overall new equipment costs by 3–5 percent. This helps organizations maximize their investment in equipment and infrastructure.
By implementing predictive maintenance strategies, companies can anticipate equipment failures before they occur, ultimately resulting in longer equipment lifetimes, tighter budget control, and higher overall throughput. More concretely, businesses aim to reduce mean time to repair, optimize the ordering of replacement parts, manage people efficiently, and lower overall maintenance costs. Leveraging data interoperability, real-time analysis, modeling and simulation, and machine learning techniques, predictive maintenance enables companies to thrive in today's competitive landscape.

However, despite its immense potential, predictive maintenance also presents significant challenges. One major hurdle is the consolidation of heterogeneous data, as predictive maintenance systems often need to integrate data from various formats and sources that can be difficult to combine. Scalability also becomes a concern when dealing with the high volumes of IoT signals generated by numerous devices and sensors. And lastly, managing and analyzing this vast amount of data in real time poses challenges that must be overcome to realize the full benefits of predictive maintenance initiatives. At its core, predictive maintenance begins with real-time diagnostics, enabling proactive identification and mitigation of potential equipment failures.

Figure 1: Predictive Maintenance starts with real-time diagnostics

However, while AI has been employed for real-time diagnostics for some time, the main challenge has been acquiring and utilizing the necessary data for training AI models. Traditional methods have struggled to incorporate unstructured data into these models effectively. Enter gen AI and vector search technologies, positioned to revolutionize this landscape.
Flexible data platforms working together with AI algorithms can help generate insights from diverse data types, including images, video, audio, geospatial data, and more, paving the way for more robust and efficient maintenance strategies. In this context, MongoDB Atlas Vector Search stands out as a foundational element for effective and efficient gen AI-powered predictive maintenance models.

Why MongoDB and Atlas Vector Search?

MongoDB stands out as a preferred database solution for modern applications for several reasons.

Figure 2: MongoDB Atlas Developer Data Platform

Document data model

One of the reasons why the document model is well suited to the needs of modern applications is its ability to store diverse data types in BSON (Binary JSON) format, ranging from structured to unstructured. This flexibility essentially eliminates the middle layer necessary to convert to a SQL-like format, resulting in easier-to-maintain applications, lower development times, and faster response to changes.

Time series collections

MongoDB excels in handling time series data generated by edge devices, IoT sensors, PLCs, SCADA systems, and more. With dedicated time series collections, MongoDB provides efficient storage and retrieval of time-stamped data, enabling real-time monitoring and analysis.

Real-time data processing and aggregation

MongoDB's adeptness in real-time data processing is crucial for immediate diagnostics and responses, ensuring timely interventions to prevent costly repairs and downtime. Its powerful aggregation capabilities facilitate the synthesis of data from multiple sources, providing comprehensive insights into fleet-wide performance trends.

Developer data platform

Beyond just storing data, MongoDB Atlas is a multi-cloud developer data platform, providing the flexibility required to build a diverse range of applications.
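The time series collections mentioned above are created by declaring time and metadata fields at creation time. A minimal sketch follows; the database, field names, granularity, and retention window are illustrative assumptions for a turbine sensor feed.

```python
# Options for a MongoDB time series collection holding turbine sensor
# readings, as passed to db.create_collection("sensor_readings", **options)
# with a driver such as PyMongo.

def timeseries_options(time_field="ts", meta_field="turbine",
                       granularity="seconds"):
    return {
        "timeseries": {
            "timeField": time_field,     # timestamp of each measurement
            "metaField": meta_field,     # identifies the source device
            "granularity": granularity,  # expected interval between readings
        },
        "expireAfterSeconds": 60 * 60 * 24 * 30,  # drop raw data after ~30 days
    }
```

Declaring the metadata field lets MongoDB bucket readings per turbine internally, which is what makes storage and time-range queries efficient.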
Atlas includes features like transactional processing, text-based search, vector search, in-app analytics, and more through an elegant and integrated suite of data services. It offers developers a top-tier experience through a unified query interface, all while meeting the most demanding requirements for resilience, scalability, and cybersecurity.

Atlas Vector Search

Among the out-of-the-box features offered by MongoDB Atlas, Atlas Vector Search stands out, enabling the search of unstructured data effortlessly. You can generate vector embeddings with machine learning models like the ones found in OpenAI or Hugging Face, and store and index them in Atlas. This feature facilitates the indexing of vector representations of objects and retrieves those that are semantically most similar to your query. Explore the capabilities of Atlas Vector Search . This functionality is especially interesting for unstructured data that was previously hard to leverage, such as text, images, and audio, allowing searches that combine audio, video, metadata, production equipment data, or sensor measurements to provide an answer to a query. Let's delve into how simple it is to leverage AI to significantly enhance the sophistication of predictive maintenance models with MongoDB Atlas.

Real-time audio-based diagnostics with Atlas Vector Search

In our demonstration, we'll showcase real-time audio-based diagnostics applied to a wind turbine. It's important to note that while we focus on wind turbines here, the concept can be extrapolated to any machine, vehicle, or device emitting sound. To illustrate this concept, we'll utilize a handheld fan as our makeshift wind turbine. Wind turbines emit different sounds depending on their operational status. By continuously monitoring the turbine’s audio, our system can accurately specify the current operational status of the equipment and reduce the risk of unexpected breakdowns.
Early detection of potential issues allows for enhanced operational efficiency, minimizing the time and resources required for manual inspections. Additionally, timely identification can prevent costly repairs and reduce overall turbine downtime, thus enhancing cost-effectiveness. Now, let’s have a look at how this demo works!

Figure 3: Application Architecture

Audio preparation

We begin by capturing the audio from the equipment in different situations (normal operation, high vs. low load, equipment obstructed, not operating, etc.). Once each sound is collected, we use an embedding model to process the audio data and convert it to a vector. This step is crucial because by generating embeddings for each audio track, which are high-dimensional vector representations, we are essentially capturing the unique characteristics of the sound. We then upload these vector embeddings to MongoDB Atlas. By adding just a few examples of sounds to our database, they are ready to be searched (and essentially compared) with the sound emitted by our equipment during its operation in real time.

Audio-based diagnosis

Now, we put our equipment into normal operation and start capturing the sound it makes in real time. In this demonstration, we capture one-second clips of audio. Then, with the same embedding model used before, we convert our audio clips to vector embeddings in real time. This process happens in milliseconds, allowing us to monitor the audio in real time. The one-second audio clips, now converted to vector embeddings, are then sent to MongoDB Atlas Vector Search, which can search for and find the most similar vectors among the ones we previously recorded in our audio preparation phase. The result is returned with a percentage of similarity, enabling a very accurate prediction of the current operational status of the wind turbine.
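The similarity-percentage step above can be sketched as a nearest-neighbor comparison between the live clip's embedding and the reference embeddings captured during audio preparation. In the demo this comparison is what Atlas Vector Search performs at scale; the status labels and toy embeddings below are illustrative.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def diagnose(clip_embedding, reference_embeddings):
    """Return the operational status whose reference sound is most similar
    to the live one-second clip, with the similarity as a percentage."""
    best_status, best_sim = None, -1.0
    for status, ref in reference_embeddings.items():
        sim = cosine_similarity(clip_embedding, ref)
        if sim > best_sim:
            best_status, best_sim = status, sim
    return best_status, round(best_sim * 100, 1)
```

Repeating this every second over fresh clips yields the continuous, sound-based health monitor described above.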
These steps are performed repeatedly every second, leveraging fast embedding of vectors and quick searches, allowing for real-time monitoring based on sound. Check out the video below to see it in action!

Transforming Predictive Maintenance with AI and MongoDB

Predictive maintenance offers substantial benefits but poses challenges like data integration and scalability. MongoDB stands out as a preferred database solution, offering scalability, flexibility, and real-time data processing. As technology advances, AI integration promises to further revolutionize the industry. Thank you to Ralph Johnson and Han Heloir for their valuable contributions to this demo!

Head over to our quick-start guide to get started with Atlas Vector Search today. Explore how MongoDB empowers manufacturing operations by visiting these resources:

Generative AI in Predictive Maintenance Applications

Transforming Industries with MongoDB and AI: Manufacturing and Motion

MongoDB for Automotive: Driving Innovation from Factory to Finish Line

May 28, 2024

Retrieval Augmented Generation for Claim Processing: Combining MongoDB Atlas Vector Search and Large Language Models

Following up on our previous blog, AI, Vectors, and the Future of Claims Processing: Why Insurance Needs to Understand The Power of Vector Databases , we’ll pick up the conversation right where we left it. We discussed extensively how Atlas Vector Search can benefit the claim process in insurance and briefly covered Retrieval-Augmented Generation (RAG) and Large Language Models (LLMs). Check out our AI resource page to learn more about building AI-powered apps with MongoDB.

One of the biggest challenges for claim adjusters is pulling and aggregating information from disparate systems and diverse data formats. PDFs of policy guidelines might be stored in a content-sharing platform, customer information locked in a legacy CRM, and claim-related pictures and voice reports in yet another tool. All of this data is not just fragmented across siloed sources and hard to find but also in formats that have been historically nearly impossible to index with traditional methods. Over the years, insurance companies have accumulated terabytes of unstructured data in their data stores but have failed to capitalize on the possibility of accessing and leveraging it to uncover business insights, deliver better customer experiences, and streamline operations. Some of our customers even admit they’re not fully aware of all the data in their archives. There’s a tremendous opportunity to leverage this unstructured data to benefit the insurer and its customers.

Our image search post covered part of the solution to these challenges, opening the door to working more easily with unstructured data. RAG takes it a step further, integrating Atlas Vector Search and LLMs, thus allowing insurers to go beyond the limitations of baseline foundational models, making them context-aware by feeding them proprietary data.
Figure 1 shows how the interaction works in practice: through a chat prompt, we can ask questions to the system, and the LLM returns answers to the user and shows which references it used to retrieve the information contained in the response. Great! We’ve got a nice UI, but how can we build a RAG application? Let’s open the hood and see what’s in it!

Figure 1: UI of the claim adjuster RAG-powered chatbot

Architecture and flow

Before we start building our application, we need to ensure that our data is easily accessible and in one secure place. Operational Data Layers (ODLs) are the recommended pattern for wrangling data to create single views. This post walks the reader through the process of modernizing insurance data models with Relational Migrator, helping insurers migrate off legacy systems to create ODLs. Once the data is organized in our MongoDB collections and ready to be consumed, we can start architecting our solution.

Building upon the schema developed in the image search post , we augment our documents by adding a few fields that will allow adjusters to ask more complex questions about the data and solve harder business challenges, such as resolving a claim in a fraction of the time with increased accuracy. Figure 2 shows the resulting document with two highlighted fields, “claimDescription” and its vector representation, “claimDescriptionEmbedding” . We can now create a Vector Search index on this array, a key step to facilitate retrieving the information fed to the LLM.

Figure 2: Document schema of the claim collection; the highlighted fields are used to retrieve the data that will be passed as context to the LLM

Having prepared our data, building the RAG interaction is straightforward; refer to this GitHub repository for the implementation details. Here, we’ll just discuss the high-level architecture and the data flow, as shown in Figure 3 below: The user enters the prompt, a question in natural language.
The prompt is vectorized and sent to Atlas Vector Search, and similar documents are retrieved. The prompt and the retrieved documents are passed to the LLM as context. The LLM produces an answer for the user (in natural language), considering both the context and the prompt.

Figure 3: RAG architecture and interaction flow

It is important to note how the semantics of the question are preserved throughout the different steps. The reference to "adverse weather"-related accidents in the prompt is captured and passed to Atlas Vector Search, which surfaces claim documents whose claim descriptions relate to similar concepts (e.g., rain) without needing to mention them explicitly. Finally, the LLM consumes the relevant documents to produce a context-aware answer referencing rain, hail, and fire, as we'd expect based on the user's initial question.

So what? To sum it all up, what's the benefit of combining Atlas Vector Search and LLMs in a claim processing RAG application? Speed and accuracy: With the data centrally organized and ready to be consumed by LLMs, adjusters can find all the necessary information in a fraction of the time. Flexibility: LLMs can answer a wide spectrum of questions, meaning applications require less upfront system design. There is no need to build custom APIs for each piece of information you're trying to retrieve; just ask the LLM to do it for you. Natural interaction: Applications can be interrogated in plain English without programming skills or system training. Data accessibility: Insurers can finally leverage and explore unstructured data that was previously hard to access.

Not just claim processing

The same data model and architecture can serve additional personas and use cases within the organization: Customer service: Operators can quickly pull customer data and answer complex questions without navigating different systems.
For example, "Summarize this customer's past interactions," "What coverages does this customer have?" or "What coverages can I recommend to this customer?" Customer self-service: Simplify your members' experience by enabling them to ask questions themselves. For example, "My apartment is flooded. Am I covered?" or "How long do windshield repairs take on average?" Underwriting: Underwriters can quickly aggregate and summarize information, providing quotes in a fraction of the time. For example, "Summarize this customer's claim history" or "I am renewing a customer's policy. What are their current coverages? Pull everything related to this policy and customer so I can get baseline information, and find the relevant underwriting guidelines."

If you would like to discover more about Converged AI and Application Data Stores with MongoDB, take a look at the following resources: RAG for claim processing GitHub repository From Relational Databases to AI: An Insurance Data Modernization Journey Modernize your insurance data models with MongoDB and Relational Migrator Head over to our quick-start guide to get started with Atlas Vector Search today.
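The four-step retrieval flow described above can be sketched in a few lines of Python. This is a minimal illustration rather than the repository's actual implementation: the index name ("vector_index") is an assumption, the field names ("claimDescription", "claimDescriptionEmbedding") follow the schema in Figure 2, and embed() and ask_llm() stand in for whichever embedding model and LLM you use.

```python
# Minimal RAG sketch: vectorize the prompt, retrieve similar claims with
# Atlas Vector Search, then pass the prompt plus documents to an LLM.

def build_rag_pipeline(prompt_embedding, k=5):
    """Build the $vectorSearch aggregation pipeline for the claim collection."""
    return [
        {
            "$vectorSearch": {
                "index": "vector_index",              # assumed index name
                "path": "claimDescriptionEmbedding",  # embedding field, Figure 2
                "queryVector": prompt_embedding,
                "numCandidates": 20 * k,              # candidates to scan
                "limit": k,                           # documents to return
            }
        },
        {"$project": {"_id": 0, "claimDescription": 1}},
    ]

def answer(prompt, collection, embed, ask_llm):
    """Retrieve similar claims and ask the LLM with them as context."""
    docs = list(collection.aggregate(build_rag_pipeline(embed(prompt))))
    context = "\n".join(d["claimDescription"] for d in docs)
    return ask_llm(f"Context:\n{context}\n\nQuestion: {prompt}")
```

Running the pipeline against the claim collection returns the k most semantically similar claims, which are then handed to the LLM as context alongside the original question.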

April 18, 2024

Three Major IoT Data-Related Challenges and How to Address Them

IoT (Internet of Things) has established itself as a crucial component of future-oriented solutions and holds massive economic potential. McKinsey & Company forecasts that by 2030, IoT will create $5.5 trillion to $12.6 trillion in value worldwide, including the value captured by consumers and customers. For proof of IoT's ever-growing popularity and consumers' dependency on it, you likely need look no further than your own wrist. From fitness bands to connected cars, smart homes, and fleet-management solutions in manufacturing and retail, IoT already connects billions of devices worldwide, and that number will only grow. As more and more IoT-based devices come online, equipped with increasingly sophisticated sensors, it is essential to choose the right underlying technology to make IoT solutions easier to implement and help companies seize new opportunities for innovation. In this blog, we will show how MongoDB successfully addresses three major IoT data-related challenges across a range of industries, including manufacturing, retail, telecommunications, and healthcare. The three challenges are: data management, real-time analytics, and supply chain optimization.

Figure 1: MongoDB Atlas for IoT

Now, let's take a closer look!

Data management

Storing, transmitting, and processing the large volumes of data generated by IoT devices presents significant difficulties. Moreover, the data generated by IoT devices often has a variable structure. To create the context needed for effective decision-making, this data must be carefully timestamped, indexed, and correlated with other data sources. This combination of data volume and complexity makes it difficult to process data from IoT devices effectively and efficiently.

Bosch

Consider the Bosch IoT Suite, Bosch Digital's family of products and services for IoT device management, IoT data management, and the IoT edge. These products and services support more than 250 international IoT projects and over 10 million connected devices. Bosch implemented MongoDB to store, manage, and analyze data in real time. Because MongoDB can handle structured, semi-structured, and unstructured data and enables efficient data modeling with JSON, the information model of each device can easily be mapped to its associated document in the database. In addition, dynamic schemas support agile development methodologies and simplify the development of applications and software. New devices, sensors, and assets can be added easily, so development teams can focus on building better software.

ThingSpace

Another example is ThingSpace, Verizon's market-leading IoT connectivity management platform, which provides the network access required to deliver a variety of IoT products and services. Verizon works with companies that purchase network access from it to connect their devices, which they bundle with their own solutions and sell to end users. Each ThingSpace customer sells an IoT product that needs reliable connectivity to ensure its devices always work, something WiFi cannot provide. Because Verizon's monolithic RDBMS-based system could not scale to handle both transactional and time series workloads, Verizon concluded that it needed a distributed database architecture going forward. MongoDB proved to be the only solution that scaled to meet Verizon's requirements across its varied use cases and workload types. The immense processing demands created by the sheer number of devices and the high velocity of incoming messages could only be met by MongoDB's highly available, scalable architecture. Native MongoDB Time Series improves performance through storage optimized with clustered indexes and optimized time series query operators. MongoDB's advanced capabilities, such as flexible data modeling, powerful indexing, and Time Series, are an effective solution for managing the complex and diverse data generated by IoT devices.

Real-time analytics

Real-time data analytics, one of the most essential parts of big data analytics today, is critical for businesses because it enables more data-driven, real-time decision-making. Despite its importance, however, only a small minority can respond to changes in their data minute by minute or second by second. Many challenges arise when implementing real-time analytics at enterprise scale. Storing such enormous volumes of data and analyzing them in real time is an entirely different story.

Thermo Fisher Cloud

Let's look at the Thermo Fisher Cloud, one of the largest cloud platforms for the scientific community on AWS. MS Instrument Connect allows Thermo Fisher customers to see live experiment results from any mobile device or browser. Each experiment produced millions of "rows" of data, which prevented existing databases from performing optimally. Internal developers needed a database that could easily handle a wide variety of fast-changing data. MongoDB's expressive query language and rich secondary indexes flexibly support both the ad-hoc and predefined queries customers need for their scientific experiments. "Anytime I can use a service like MongoDB Atlas, I'm going to take it, so that we at Thermo Fisher can focus on our core work, which is being the leader in serving science." Joseph Fluckiger, Sr. Software Architect, Thermo Fisher

MongoDB Atlas scales seamlessly and can ingest enormous amounts of sensor and event data to support real-time analysis, catching every critical event. This gives organizations new capabilities, including: capturing streaming or batch data of all types without excessive data mapping; analyzing data easily and intuitively with the built-in aggregation framework; and delivering data insights quickly and easily at scale. With MongoDB, organizations can optimize their queries to deliver results rapidly, improving operations and accelerating business growth.

Supply chain optimization

Items move through many points in the supply chain, making it hard to maintain complete end-to-end visibility throughout their journey. Losing control at any stage can reduce planning efficiency, slow down the entire supply chain, and ultimately result in a lower return on investment (ROI). From optimizing warehouse space by sourcing raw materials as needed to delivering real-time supply chain insights, IoT-based supply chains can significantly optimize these processes by eliminating blind spots and inefficiencies.

Longbow Advantage

Longbow Advantage delivers substantial business results by helping its customers optimize their supply chains. Millions of shipments move through multiple warehouses every day, generating massive quantities of data that must be analyzed throughout the day for real-time visibility and reporting. Rebus, Longbow's flagship warehouse visibility platform, combines real-time performance reporting with end-to-end warehouse visibility and intelligent labor management. Longbow needed a database solution that could process data at this scale and deliver the real-time warehouse visibility and reporting at the heart of Rebus, and it realized that time-consuming, monolithic spreadsheets were not up to the task. Longbow became convinced that MongoDB's document database model was the best fit and would help Rebus collect, store, and gain visibility into disparate data in near real time.

Another key component of smart supply chain solutions is IoT-based mobile apps that provide real-time visibility and support on-the-spot, data-driven decision-making. In these situations, an offline-first paradigm becomes crucial, since staff need access to data even in areas where connectivity is weak or nonexistent. Realm by MongoDB is a lightweight, object-oriented embedded database technology for resource-constrained environments, making it an ideal solution for storing data on mobile devices. Together with MongoDB's Realm SDKs, which wrap the Realm database, businesses can leverage Atlas Device Sync, which lets developers seamlessly synchronize data between MongoDB and Realm on mobile devices with minimal effort, to rapidly develop mobile applications and drive innovation. MongoDB provides a powerful solution for IoT-based supply chains that optimizes processes and eliminates inefficiencies, so organizations can make data-driven decisions and improve supply chain efficiency.

Conclusion

The IoT industry is evolving rapidly, and as the number of connected devices grows, so do the challenges facing the businesses that leverage these solutions. Through a variety of real-world use cases, we have seen how MongoDB helps businesses handle IoT data management, perform real-time analytics, and optimize their supply chains, driving innovation across industry sectors. With unique features and capabilities designed to manage the heavy lifting for its customers, MongoDB is well positioned to play an important role in the ongoing digital transformation of the IoT landscape.

Want to learn more, or ready to get started with MongoDB? Check out MongoDB's IoT resources: MongoDB IoT Reference Architecture Migrating existing applications with Relational Migrator MongoDB & IIoT ebook IoT webpage

December 29, 2023

Modernize Your Factory Operations: Build a Virtual Factory with MongoDB Atlas in 5 Simple Steps

Virtual factories are revolutionizing the manufacturing landscape. Hailed as a "revolution in factory planning" by BMW Group at NVIDIA's GTC, this cutting-edge approach is transforming the way companies like BMW and Hyundai operate, thanks to groundbreaking partnerships with technology companies such as NVIDIA and Unity. At the heart of this revolution lies the concept of virtual factories: computer-based replicas of real-world manufacturing facilities. These virtual factories accurately mimic the characteristics and intricacies of physical factories, making them a powerful tool for manufacturers to optimize their operations. By leveraging AI, they unlock a whole new world of possibilities, paving the way for improved productivity, cost savings, and innovation. In this blog, we will explore the benefits of virtual factories and guide you through the process of building your own virtual factory using MongoDB Atlas. Let's dive in!

Unlocking digital transformation

The digitalization of the manufacturing industry has given rise to the development of smart factories. These advanced factories incorporate IoT sensors into their machinery and equipment, allowing workers to gather data-driven insights on their manufacturing processes. However, the evolution does not stop at smart factories automating and optimizing physical production. The emergence of virtual factories introduces simulation capabilities and remote monitoring, leading to the creation of factory digital twins, as depicted in Figure 1. By bridging the concepts of smart and virtual factories, manufacturers can unlock greater levels of efficiency, productivity, flexibility, and innovation.

Figure 1: From smart factory to virtual factory

Leveraging virtual factories in manufacturing organizations provides many benefits, including: Optimization of production processes and identification of inefficiencies.
This can lead to increased efficiency, reduced waste, and improved quality. Aiding quality control by contextualizing sensor data with the manufacturing process, which allows analysis of quality issues and implementation of the necessary control measures even in complex production processes. Simulating manufacturing processes and testing new products or ideas without the need for physical prototypes or real-world production facilities, significantly reducing research and development costs and minimizing the risk of product failure.

However, setting up a virtual factory for complex manufacturing is difficult. Challenges include managing system overload, handling vast amounts of data from physical factories, and creating accurate visualizations. The virtual factory must also adapt to changes in the physical factory over time. Given these challenges, a data platform that can contextualize all the data coming in from the physical factory and then feed it to the virtual factory, and vice versa, is crucial. That is where MongoDB Atlas, our developer data platform, comes in: providing synchronization capabilities between the physical and virtual worlds, enabling flexible data modeling, and providing access to the data via a unified query interface, as seen in Figure 2.

Figure 2: MongoDB Atlas as the data platform between physical and virtual factories

Now that we've discussed the benefits and the challenges of building virtual factories, let's unpack how simple it is to build a virtual factory with MongoDB Atlas.

How to build a virtual factory with MongoDB Atlas

1. Define the business requirements

The first step of the process is to define the business requirements for the virtual factory. Our team at MongoDB uses a smart factory model from Fischertechnik to demonstrate how easily MongoDB can be integrated to solve the digital transformation challenges of IIoT in manufacturing.
This testbed serves as our foundational physical factory and the starting point of this project.

Figure 3: The smart factory testbed

We defined our set of business requirements as the following: Implement a virtual run of the physical factory to identify layout and process optimizations. Provide real-time visibility of physical factory conditions, such as inventory, for process improvements. This last requirement is critical; while standalone simulation models of factories can be useful, they typically do not take into account the real-time data from the physical factory. By connecting the physical and virtual factories, a digital twin can be created that takes into account the actual performance of the physical factory in real time. This enables more accurate predictions of the factory's performance, which improves decision-making and process optimization and enables remote monitoring and control, reducing downtime and improving response times.

2. Create a 3D model

Based on the previous business requirements, we created a 3D model of the factory in a widely used game engine, Unity. This virtual model can be visualized using a computer, a tablet, or any virtual reality headset.

Figure 4: 3D model of the smart factory in Unity

Additionally, we added four buttons (red, white, blue, and "stop") that enable users to submit production orders to the physical factory or stop the process altogether.

3. Connect the physical and virtual factories

Once we created the 3D model, we connected the physical and virtual factories via MongoDB Atlas. Let's start with our virtual factory software application. Regardless of where you deploy it, be it a headset or a tablet, you can use Realm by MongoDB to present data locally inside Unity and then synchronize it with MongoDB Atlas as the central data layer. This gives us embedded databases where resources are constrained, with MongoDB Atlas as a powerful and scalable cloud backend.
Lastly, to ensure data synchronization and communication between these two components, we leveraged MongoDB Atlas Device Sync, which provides a bi-directional synchronization mechanism and handles networking. Now that we have our virtual factory set up, let's have a look at our physical one. In a real manufacturing environment, many shopfloor connectivity systems can connect to MongoDB Atlas, and for those that don't natively, it is very straightforward to build a connector. At the shopfloor layer, you can set up MongoDB so that you can analyze and visualize your data locally and create materialized views. On the cloud layer, you can push data directly to MongoDB Atlas or use our Cluster-to-Cluster Sync functionality.

A single IoT device, by itself, does not generate much data. But as the number of devices grows, so does the volume of machine-generated data, and therefore the complexity of the data storage architecture required to support it. The data storage layer is often one of the primary causes of performance problems as an application scales, so a well-designed data storage architecture is a crucial component of any IoT platform. In our project, we have integrated AWS IoT Core to subscribe to MQTT messages from the physical factory. Once these messages are received and filtered, they are transmitted to MongoDB Atlas via an HTTP endpoint. The HTTP endpoint then triggers a function that stores the messages in the corresponding collection based on their source (e.g., messages from the camera are stored in the camera collection). With MongoDB Atlas, as your data grows, you can archive it using our Atlas Online Archive functionality.

Figure 5: Virtual and physical factories data flow

In Figure 5, we can see everything we've put together so far: on the left-hand side, we have our virtual factory, where users can place an order.
The order information is stored inside Realm, synced with MongoDB Atlas using Atlas Device Sync, and sent to the physical factory using Atlas Triggers. The physical factory, in turn, sends out sensor data and event information about the physical movement of items within the factory. MongoDB Atlas provides the full data platform experience for connecting both the physical and virtual worlds!

4. Data modeling

Now that connectivity has been established, let's look at modeling the data that is coming in. As you may know, any piece of data that can be represented in JSON can be natively stored in, and easily retrieved from, MongoDB. The MongoDB drivers take care of converting the data to BSON (binary JSON) and back when querying the database. Furthermore, you can use documents to model data in any way you need, whether it is key-value pairs, time series data, or event data. On the topic of time series data, MongoDB Time Series allows you to automatically store time series data in a highly optimized and compressed format, reducing your storage footprint while achieving greater query performance at scale.

Figure 6: Virtual and physical factories sample data

It really is as simple as it looks, and the best part is that we are doing all of this inside MongoDB Atlas, making a direct impact on developer productivity.

5. Enable computer vision for real-time inventory

With the data modeled and connectivity established, our last step is to run event-driven analytics on top of our developer data platform. We used computer vision and AI to analyze the inventory status in the physical factory and then pushed notifications to the virtual one. If the user tries to order a piece in the virtual factory that is not in stock, they will immediately get a notification from the physical factory.
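A minimal sketch of that stock check, assuming a hypothetical inventory document per piece color (in the real project, the inventory state comes from the computer-vision pipeline, and the notification travels back through Atlas Device Sync):

```python
# Hypothetical stock check behind the virtual factory's order buttons.
# Inventory documents are assumed to look like {"color": "red", "count": 2};
# none of these names come from the actual project code.

def check_order(inventory, color):
    """Return an order acknowledgement or an out-of-stock notification."""
    doc = next((d for d in inventory if d["color"] == color), None)
    if doc is None or doc["count"] == 0:
        return {"accepted": False, "notify": f"'{color}' is out of stock"}
    return {"accepted": True, "remaining": doc["count"] - 1}
```

When the check fails, the resulting notification document is what the virtual factory would surface to the user instead of confirming the order.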
All this is made possible using MongoDB Atlas and its connectors to various AI platforms. If you want to learn more, stay tuned for part 2 of this blog series, where we'll dive deep into the technical considerations of this last step.

Conclusion

By investing in a virtual factory, companies can optimize production processes, strengthen quality control, and perform cost-effective testing, ultimately improving efficiency and innovation in manufacturing operations. MongoDB, with comprehensive features and functionality that cover the entire lifecycle of manufacturing data, is well positioned to implement virtual factory capabilities for the manufacturing industry. These capabilities put MongoDB in a unique position to fast-track manufacturers' digital transformation journeys.

Learn more: MongoDB & IIoT: A 4-Step Data Integration Manufacturing at Scale: MongoDB & IIoT Manufacturing with MongoDB

Thank you to Karolina Ruiz Rojelj for her contributions to this post.

June 20, 2023

Three Major IoT Data-Related Challenges and How to Address Them

IoT (Internet of Things) has become a crucial component of future-oriented solutions and holds massive economic potential. McKinsey & Company estimates that by 2030, IoT will enable $5.5 trillion to $12.6 trillion in value worldwide, including the value captured by consumers and customers. For proof of its growing popularity and consumers' dependency on it, you likely need look no further than your own wrist. From fitness bands to connected vehicles, smart homes, and fleet-management solutions in manufacturing and retail, IoT already connects billions of devices worldwide, with many more to come. As more IoT-enabled devices come online with increasingly sophisticated sensors, choosing the right underlying technology to make IoT solutions easier to implement and help companies seize new opportunities for innovation is essential. In this blog, we will discuss how MongoDB has successfully addressed three major IoT data-related challenges across various industries, including manufacturing, retail, telecommunications, and healthcare. The challenges are the following: data management, real-time analytics, and supply chain optimization.

Figure 1: MongoDB Atlas for IoT

Let's dive right in!

Data management

Storing, transmitting, and processing the large amounts of data that IoT devices produce is a significant challenge. Additionally, the data produced by IoT devices often comes in variable structures. This data must be carefully timestamped, indexed, and correlated with other data sources to create the context required for effective decision-making. This combination of data volume and complexity makes it difficult to process data from IoT devices effectively and efficiently.

Bosch

Consider Bosch IoT Suite, a family of products and services in IoT device management, IoT data management, and IoT edge by Bosch Digital. These products and services support over 250 international IoT projects and over 10 million connected devices.
Bosch implemented MongoDB to store, manage, and analyze data in real time. MongoDB's ability to handle structured, semi-structured, and unstructured data, along with efficient data modeling in JSON, makes it easy to map the information model of each device to its associated document in the database. In addition, dynamic schemas support agile development methodologies and make it simple to develop apps and software. Adding new devices, sensors, and assets is easy, which means the team can focus on creating better software.

ThingSpace

Another example is ThingSpace, Verizon's market-leading IoT connectivity management platform, which provides the network access required to deliver various IoT products and services. Verizon works with companies that purchase network access from it to connect their devices, bundled together with their own solutions, which they sell to end users. ThingSpace's customers each sell an IoT product that needs reliable connectivity to ensure the devices always work, which WiFi cannot offer. Verizon's monolithic RDBMS-based system would not be able to scale to handle both transactional and time series workloads, so Verizon decided it needed a distributed database architecture going forward. MongoDB proved to be the only solution that scaled to meet Verizon's requirements across different use cases and combinations of workload types. The immense processing needs resulting from the high number of devices and the high velocity of incoming messages could only be addressed by MongoDB's highly available, scalable architecture. Native MongoDB Time Series allows for improved performance through storage optimized with clustered indexes and optimized time series query operators. MongoDB's advanced capabilities, such as flexible data modeling, powerful indexing, and Time Series, provide an effective solution for managing the complex and diverse data generated by IoT devices.
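As a concrete illustration of the Time Series capability mentioned above, a collection is declared as time series when it is created. The sketch below builds the options for PyMongo's create_collection call; the field names and the retention window are illustrative assumptions, not details from any customer deployment, and the call itself is shown as a comment since it needs a live cluster.

```python
# Illustrative MongoDB time series collection options for IoT sensor data.
# "ts" and "device" are assumed field names, not from the blog.

def timeseries_options(time_field="ts", meta_field="device",
                       granularity="seconds"):
    """Options dict for pymongo's db.create_collection(name, **options)."""
    return {
        "timeseries": {
            "timeField": time_field,    # required: the timestamp field
            "metaField": meta_field,    # per-device metadata, used for grouping
            "granularity": granularity, # hints the expected ingest cadence
        },
        "expireAfterSeconds": 30 * 24 * 3600,  # optional retention: 30 days
    }

# Against a live cluster this would be used as:
# db.create_collection("sensor_readings", **timeseries_options())
```

Documents inserted into such a collection are stored in the optimized, compressed time series format automatically; no application-side changes are needed beyond writing the timestamp into the declared time field.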
Real-time analytics

Real-time data analytics, one of the most crucial parts of big data analytics today, brings value to businesses by enabling more data-driven, real-time decisions. Despite its importance, however, very few organizations can respond to changes in data minute by minute or second by second. Many challenges arise in implementing real-time analytics for enterprises: storing such a huge volume of data is one thing, and analyzing it in real time is an entirely different story.

Thermo Fisher Cloud

Let's consider the Thermo Fisher Cloud, one of the largest cloud platforms for the scientific community on AWS. MS Instrument Connect allows Thermo Fisher customers to see live experiment results from any mobile device or browser. Each experiment produced millions of "rows" of data, which led to suboptimal performance with existing databases. Internal developers needed a database that could easily handle a wide variety of fast-changing data. MongoDB's expressive query language and rich secondary indexes provided the flexibility to support both the ad-hoc and predefined queries customers needed for their scientific experiments.

"Anytime I can use a service like MongoDB Atlas, I'm going to take that so that we at Thermo Fisher can focus on what we're good at, which is being the leader in serving science." Joseph Fluckiger, Sr. Software Architect @Thermo Fisher

MongoDB Atlas scales seamlessly and is capable of ingesting enormous amounts of sensor and event data to support real-time analysis, catching critical events or changes as they happen. That gives organizations new capabilities, including: Capturing streaming or batch data of all types without excessive data mapping Analyzing data easily and intuitively with a built-in aggregation framework Delivering data insights rapidly and at scale with ease With MongoDB, organizations can optimize queries to quickly deliver results, improving operations and driving business growth.
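To make the built-in aggregation framework tangible, here is a hypothetical pipeline that rolls sensor readings up into per-device, per-minute averages, a typical real-time IoT query. The collection and field names are invented for illustration, and the $dateTrunc operator requires MongoDB 5.0 or later.

```python
# Hypothetical aggregation pipeline: average sensor value per device
# per one-minute window, newest windows first.
ROLLUP_PIPELINE = [
    {"$group": {
        "_id": {
            "device": "$device.id",  # assumed metadata field
            "minute": {"$dateTrunc": {"date": "$ts", "unit": "minute"}},
        },
        "avgValue": {"$avg": "$value"},  # mean reading in the window
        "samples": {"$sum": 1},          # readings contributing to it
    }},
    {"$sort": {"_id.minute": -1}},
]

# With PyMongo, the rollup would run server-side as:
# results = db.sensor_readings.aggregate(ROLLUP_PIPELINE)
```

Because the pipeline executes inside the database, only the aggregated windows cross the network, which is what makes this pattern practical at IoT ingest rates.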
Supply chain optimization

Items move through different locations in the supply chain, making it hard to maintain end-to-end visibility throughout their journey. A lack of control at any stage can dramatically harm planning efficiency, slow down the entire supply chain, and ultimately result in a lower return on investment. From optimizing warehouse space by sourcing raw materials as needed to delivering real-time supply chain insights, IoT-enabled supply chains can significantly optimize these processes by eliminating blind spots and inefficiencies.

Longbow Advantage

Longbow Advantage delivers substantial business results by enabling clients to optimize their supply chains. Millions of shipments move through multiple warehouses every day, generating massive quantities of data throughout the day that must be analyzed for real-time visibility and reporting. Its flagship warehouse visibility platform, Rebus, combines real-time performance reporting with end-to-end warehouse visibility and intelligent labor management. Longbow needed a database solution that could process data at that scale and deliver the real-time warehouse visibility and reporting at the heart of Rebus, and it knew it could not rely on monolithic, time-consuming spreadsheets to do so. It became clear that MongoDB's document database model was a good match that would allow Rebus to gather, store, and build visibility into disparate data in near real time.

Another key component of smart supply chain solutions is IoT-enabled mobile apps that provide real-time visibility and facilitate on-the-spot, data-driven decisions. In such situations, the offline-first paradigm becomes crucial, since staff need access to data in areas where connectivity is poor or nonexistent. Realm by MongoDB is a lightweight, object-oriented embedded database technology for resource-constrained environments. It is an ideal solution for storing data on mobile devices.
By utilizing MongoDB's Realm SDKs, which wrap the Realm database, and Atlas Device Sync, which enables seamless data synchronization between MongoDB and Realm on mobile devices with minimal developer effort, businesses can rapidly develop mobile applications and drive innovation. MongoDB provides a powerful solution for IoT-enabled supply chains that can optimize processes and eliminate inefficiencies, enabling organizations to make data-driven decisions and improve supply chain efficiency.

Conclusion

The IoT industry is rapidly evolving, and as the number of connected devices grows, so do the challenges faced by businesses leveraging these solutions. Through a range of real-world use cases, we have seen how MongoDB has helped businesses manage IoT data, perform real-time analytics, and optimize their supply chains, driving innovation in a variety of industries. With its unique features and capabilities, designed to manage the heavy lifting for you, MongoDB is well positioned to continue playing a crucial role in the ongoing digital transformation of the IoT landscape.

Want to learn more or get started with MongoDB? Check out our IoT resources: MongoDB IoT Reference Architecture Migrate existing applications with Relational Migrator MongoDB & IIoT ebook IoT webpage

April 24, 2023