Technology


Infosys Media Platform & MongoDB: Metadata Management and Workflow Orchestration Across the Media Supply Chain

Capitalize on current and innovative technologies in the media supply chain with the Infosys Media Platform (IMP). As part of the cloud-based Infosys Cobalt™ portfolio, Infosys' unifying framework, built on MongoDB Atlas and MongoDB Enterprise Advanced, helps you facilitate creative collaboration, enable productions on an industrial scale, and monetize customer relationships. How? By integrating the various ecosystems involved and providing a common platform that connects you to services and technology solutions for the media content value chain.

Infosys' intelligently woven media and metadata management framework, leveraging MongoDB's document model, enables smart workflows and incorporates ML/AI to create, manage, and moderate content metadata. This allows the orchestration of workflows across different business functions. Additionally, the platform delivers the productivity, scalability, and agility benefits of the cloud and streamlines collaboration among the ecosystem of partners and technology solutions.

Why Infosys Media Platform (IMP)?

The Infosys Media Platform consists of various modules that serve critical business functions:

The Curation & Digitization Module -- provides the master workflow and ingests content from internal archives and multiple sources, using AI/ML to create a composite index of frame-level, time-coded metadata of recognized elements (such as celebrities/known personalities, objects, brands, text, and images). It also enables intelligent ad spot identification and includes functions like automated QC, editing, review and approval, and censor editing.

The Custom AI Module -- ensures that newly introduced elements such as celebrities, brands, etc., can be continuously trained and recognized. It can also custom-train the AI models to recognize specific content per the customer's needs.

The Localization Module -- enables collaboration across multiple locations and global vendors through automatic generation of closed captions and subtitles in multiple languages.

The Metadata Management and Distribution Module -- enables global distribution to digital platforms at scale through standard workflow models and a state-of-the-art dashboard, orchestrating the accumulation of asset-level and descriptive time-based metadata from production to delivery.

With these modules, the Infosys Media Platform provides the following capabilities:

Content Enrichment -- leverages AI models to process video files and generate time-coded metadata for the post-production and distribution process

Closed Captioning, Subtitling and Localization -- processes audio (dialogs from a video, lyrics from a song, speech from a podcast) and converts it into closed captions and subtitles

Content Moderation -- recognizes the presence of mature content (profanity, violence, gore, etc.) using the platform's video content and speech detection capabilities

Image Processing -- identifies various attributes of an image file, similar to the capabilities for video/music content

Metadata Packaging & Distribution -- manages the end-to-end supply chain of digital metadata creation, updates, packaging, and distribution

NLP-Based Analytics -- using the platform's natural language processing capabilities, users can review any string of text (dialogs, lyrics, conversations) to determine the context of the conversation as well as the sentiment
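To make the time-coded metadata idea concrete, here is a minimal sketch of how a frame-level metadata document might be stored and queried in MongoDB with PyMongo. The connection string, collection name, field names, and recognized elements are illustrative assumptions, not the actual IMP schema.

```python
from pymongo import MongoClient

# Illustrative only: the URI and schema below are assumptions,
# not the actual IMP data model.
client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
metadata = client["media"]["frame_metadata"]

# One time-coded metadata document per recognized segment of an asset.
metadata.insert_one({
    "asset_id": "EP-2041",
    "timecode": {"start": "00:12:31:05", "end": "00:12:38:12"},
    "recognized": {
        "celebrities": ["Jane Doe"],
        "brands": ["Acme Cola"],
        "text": ["SALE 50% OFF"],
    },
    "ad_spot_candidate": True,
    "moderation": {"profanity": False, "violence": False},
})

# Find all ad-spot candidates for an asset that feature a given brand.
for doc in metadata.find(
    {"asset_id": "EP-2041", "ad_spot_candidate": True,
     "recognized.brands": "Acme Cola"}
):
    print(doc["timecode"])
```

Because each segment is a self-contained document, new recognized element types can be added later without schema migrations.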
Why MongoDB for Infosys Media Platform?

What company wouldn't want a database platform that increases developer productivity and data-driven operational efficiency? MongoDB offers both. With MongoDB Atlas, you can reduce the time developers spend managing data and databases so they can focus on value-added tasks like developing new apps. MongoDB's document model and query language provide easier access to data, allowing developers to work quickly and efficiently to support new data structures and data types, as well as leverage database-supported roll-ups for analysis. Additionally, MongoDB Atlas provides 100+ metrics and monitoring capabilities within a complete data platform built to improve operational productivity, so you can work smarter, not harder.

A main feature of the Infosys Media Platform is its cloud-agnostic nature, and as a cloud-agnostic, multi-cloud data platform that runs seamlessly across the globe, MongoDB satisfies that requirement. IMP also has a roadmap for analytics, text search, and data visualization across a mixed workload of real-time and transactional analytics -- and MongoDB provides all of these features.

How MongoDB powers the data platform for Infosys Media Platform (IMP)

Data for all the modules previously described is powered by the MongoDB data platform
Details like profile data, accounts, ratings, translations, and country/region details are stored in MongoDB
Audit transactional data currently runs on SQL, with a roadmap for moving to the MongoDB data platform

In the future, MongoDB's core capabilities will further enhance the Infosys Media Platform and customer experience. Our roadmap includes utilizing MongoDB's ACID transaction capabilities to store audit details, as well as using MongoDB functions and triggers to create cloud-agnostic serverless functions. Additionally, MongoDB Atlas may be leveraged for full-text search applied to media data, and to create charts for dashboarding and real-time analytics of subscriber data.

Download the Modernization Guide

May 19, 2021

How DataSwitch And MongoDB Atlas Can Help Modernize Your Legacy Workloads

Data modernization is here to stay, and DataSwitch and MongoDB are leading the way forward. Research strongly indicates that the future of the database management system (DBMS) market is in the cloud, and the ideal way to shift from an outdated, legacy DBMS to a modern, cloud-friendly data warehouse is through data modernization.

There are a few key factors driving this shift. Increasingly, companies need to store and manage unstructured data in a cloud-enabled system, as opposed to a legacy DBMS, which is designed only for structured data. Moreover, the amount of data generated by a business is increasing at a rate of 55% to 65% every year, and the majority of it is unstructured. A modernized database that can improve data quality and availability provides tremendous benefits in performance, scalability, and cost optimization. It also provides a foundation for improving business value through informed decision-making. Additionally, cloud-enabled databases support greater agility, so you can upgrade current applications and build new ones faster to meet customer demand.

Gartner predicts that by 2022, 75% of all databases will be on the cloud -- either by direct deployment or through data migration and modernization. But research shows that over 40% of migration projects fail, due to challenges such as:

Inadequate knowledge of legacy applications and their data design
Complexity of code and design from different legacy applications
Lack of automation tools for transforming legacy data processing into cloud-friendly data and processes

It is essential to take a strategic approach and choose the right partner for your data modernization journey. We're here to help you do just that.

Why MongoDB?

MongoDB is built for modern application developers and for the cloud era. As a general-purpose, document-based, distributed database, it facilitates high productivity and can handle huge volumes of data. The document database stores data in JSON-like documents and is built on a scale-out architecture that is optimal for any developer building scalable applications through agile methodologies. Ultimately, MongoDB fosters business agility, scalability, and innovation.

Key MongoDB advantages include:

Rich JSON documents
Powerful query language
Multi-cloud data distribution
Security for sensitive data
Quick storage and retrieval of data
Capacity for huge volumes of data and traffic
Design that supports greater developer productivity
Extreme reliability for mission-critical workloads
Architecture built for optimal performance and efficiency

Key advantages of MongoDB Atlas, MongoDB's hosted database as a service, include:

Multi-cloud data distribution
Security for sensitive data
Design for developer productivity
Reliability for mission-critical workloads
Optimal performance
Managed operational efficiency

To be clear, JSON documents are the most productive way to work with data: they support nested objects and arrays as values, and they allow schemas that are flexible and dynamic. MongoDB's powerful query language enables sorting and filtering on any field, regardless of how deeply nested it is in a document. It also provides support for aggregations as well as modern use cases such as graph search, geo-based search, and text search. Queries are themselves expressed in JSON and are easy to compose. MongoDB supports joins in queries, and it supports two types of relationships: referencing and embedding. It has all the power of a relational database and much, much more.
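As a brief illustration of that query language, here is a hedged PyMongo sketch of embedding, filtering on a nested field, and running a simple aggregation; the collection and field names are invented for the example, not part of DataSwitch's tooling.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
orders = client["shop"]["orders"]

# Embedding: the customer's address lives inside the order document.
orders.insert_one({
    "order_id": 1001,
    "status": "shipped",
    "customer": {"name": "A. Smith", "address": {"city": "Berlin", "zip": "10115"}},
    "items": [{"sku": "X-1", "qty": 2, "price": 9.99}],
})

# Filter and sort on a nested field, however deeply it is nested.
shipped_in_berlin = orders.find(
    {"status": "shipped", "customer.address.city": "Berlin"}
).sort("order_id", 1)

# Aggregation: total revenue per city, unwinding the embedded item array.
pipeline = [
    {"$unwind": "$items"},
    {"$group": {
        "_id": "$customer.address.city",
        "revenue": {"$sum": {"$multiply": ["$items.qty", "$items.price"]}},
    }},
]
for row in orders.aggregate(pipeline):
    print(row)
```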
Companies of all sizes can use MongoDB, as it operates on a large and mature platform ecosystem. Developers enjoy a great user experience, with the ability to provision MongoDB Atlas clusters and commence coding instantly. A global community of developers and consultants makes it easy to get the help you need, if and when you need it. In addition, MongoDB supports all major languages and provides enterprise-grade support.

Why DataSwitch as a partner for MongoDB? Automated schema redesign, data migration & code conversion

DataSwitch is a trusted partner for cost-effective, accelerated solutions for digital data transformation, migration, and modernization through a modern database platform. Its no-code and low-code solutions, along with cloud data expertise and unique, automated schema generation, accelerate time to market. DataSwitch provides end-to-end data, schema, and process migration with automated replatforming and refactoring, thereby delivering:

50% faster time to market
60% reduction in total cost of delivery
Assured quality with built-in best practices, guidelines, and accuracy

Data modernization: How "DataSwitch Migrate" helps you migrate from an RDBMS to MongoDB

DataSwitch Migrate ("DS Migrate") is a no-code and low-code toolkit that leverages advanced automation to provide intuitive, predictive, and self-serviceable schema redesign from a traditional RDBMS model to MongoDB's document model, with built-in best practices. Based on data volume, performance, and criticality, DS Migrate automatically recommends the appropriate ETTL (Extract, Transfer, Transform & Load) data migration process. DataSwitch delivers data engineering solutions and transformations in half the timeframe of typical existing data modernization solutions. Consider these key areas:

Schema redesign -- construct a new framework for data management. DS Migrate provides automated data migration and transformation based on your redesigned schema, as well as no-touch code conversion from legacy data scripts to MongoDB Atlas APIs. Users simply drag and drop the schema for redesign, and the platform converts it to a document-based JSON structure by applying MongoDB modeling best practices. The platform then automatically migrates data to the new, redesigned JSON structure. It also converts the legacy database script for MongoDB. This automated, user-friendly data migration is faster than anything you've ever seen.

Refactoring -- change the data structure to match the new schema. DS Migrate handles this through auto code generation for migrating the data. This goes far beyond a mere lift and shift: DataSwitch takes care of refactoring and replatforming (moving from the legacy platform to MongoDB) automatically. Performing all of these tasks within a single platform is a game-changing capability.

Security -- mask and tokenize data while moving it from on-premises to the cloud. As the data may be moving to a public cloud, you must keep it secure. DataSwitch's tool can configure and apply security measures automatically while migrating the data.

Data quality -- ensure that data is clean, complete, trustworthy, and consistent. DataSwitch allows you to configure your own quality rules and automatically apply them during data migration.

In summary: first, the DataSwitch tool automatically extracts the data from an existing database, like Oracle. It then exports the data and stores it locally before zipping and transferring it to the cloud. Next, DataSwitch transforms the data by altering the data structure to match the redesigned schema, applying data security measures during the transform step. Lastly, DS Migrate loads the data into MongoDB in its entirety.
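To ground that extract-transform-load flow, here is a minimal, hedged sketch of the pattern in Python, using sqlite3 as a stand-in for the legacy RDBMS and PyMongo for the target. DS Migrate generates and automates these steps; every table, field, and connection name below is an illustrative assumption.

```python
import sqlite3
from pymongo import MongoClient

# Extract: read rows from a legacy relational table (sqlite3 stands in
# for Oracle/DB2 here; in practice DS Migrate generates these scripts).
src = sqlite3.connect("legacy.db")
rows = src.execute(
    "SELECT id, name, ssn, street, city FROM customers"
).fetchall()

def mask(ssn: str) -> str:
    # Transform: mask/tokenize sensitive fields before they leave premises.
    return "***-**-" + ssn[-4:]

docs = [
    {
        "_id": cid,
        "name": name,
        "ssn_masked": mask(ssn),
        # Redesigned schema: address columns become an embedded document.
        "address": {"street": street, "city": city},
    }
    for cid, name, ssn, street, city in rows
]

# Load: write the reshaped documents into MongoDB.
target = MongoClient("mongodb://localhost:27017")  # assumed target cluster
target["modernized"]["customers"].insert_many(docs)
```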
Process Conversion

Process conversion -- where scripts and process logic are migrated from a legacy DBMS to a modern DBMS -- is made easier thanks to a high degree of automation. Minimal coding and manual intervention are required, and the journey is accelerated. It involves:

DML -- Data Manipulation Language
CRUD -- typical application functionality (Create, Read, Update & Delete)
Converting to the equivalent MongoDB Atlas API calls

Degree of automation DataSwitch provides during migration:

Schema Migration Activities | DS Automation Capabilities
Application Data Usage Analysis | 70%
3NF to NoSQL Schema Recommendation | 60%
Schema Re-Design Self-Service | 50%
Predictive Data Mapping | 60%

Process Migration Activities | DS Automation Capabilities
CRUD-based SQL conversion (Oracle, MySQL, SQL Server, Teradata, DB2) to MongoDB API | 70%

Data Migration Activities | DS Automation Capabilities
Migration Script Creation | 90%
Historical Data Migration | 90%
Catch-up Load | 90%

DataSwitch Legacy Modernization as a Service (LMaaS)

Our consulting expertise combined with the DS Migrate tool allows us to harness the power of the cloud for data transformation of legacy RDBMS data systems to MongoDB. Our solution delivers legacy transformation in half the time frame through pay-per-usage. Key strengths include:

● Data Architecture Consulting
● Data Modernization Assessment and Migration Strategy
● Specialized Modernization Services

DS Migrate Architecture Diagram

Contact us to learn more.

May 13, 2021

Announcing the MongoDB SI Architect Certification Program for Modernization to the Cloud

The product names referenced in this blog are outdated. Realm Sync refers to what is currently known as Atlas Device Sync.

You know the value of modernization as a strategic initiative. It's not only about refreshing your portfolio of legacy applications with the latest innovations simply for the sake of moving to the cloud. This is much more than just "lift and shift". True modernization is about realizing your company's full potential and gaining a competitive edge through development methodologies, architectural patterns, and technologies. And by modernizing with MongoDB, you can build new business functionality 3-5x faster, scale to millions of users wherever they are on the planet, and cut costs by 70% or more.

If you're familiar with our technology and our Modernization Program, you already understand the benefits. But do your customers? And, if not, how do you tell them? To help you get started, the MongoDB Partner team has created the MongoDB SI Architect Certification, a full-scale kit of assets related to modernization. This free, self-paced certification helps you improve the modernization experience for a variety of customer types, as well as drive conversations with customers around data center exit plans and application qualification for assessing cloud data platforms. Consider this certification the next step in deepening your expertise so you can expand your business opportunities and help customers modernize to the cloud.

Customized for System Integrator partners, our certification teaches you how to discuss the benefits of modernization with various customers on a cloud journey. It enables architects to have deep discussions on vertical-based stories, migration tools, best practices, and architecture guidelines. System Integrator partners will also learn the fundamental value of offerings, messaging, objection handling, and more. Most importantly, this certification program equips SI architects with the ability to communicate key takeaways to the customer in a language they understand.

Program Structure

The free SI Architect certification program is self-paced, takes approximately 20 hours, and is divided into six key sections, complete with a final certification exam.

Introduction allows partners to access the modernization webinars and modernization program offerings.
Top use cases focus on how MongoDB is used in business-wide strategic initiatives, like legacy modernization, cloud data strategy, microservices, and more vertical-based stories.
Customer case studies highlight how MongoDB is deployed and leveraged through real-life customer case studies and proof points.
University classes allow participants to leverage MongoDB University online and on-demand courses relevant to architects.
Competitive edge helps architects understand the true value of MongoDB in comparison to the competition.
Final certification culminates the program with a "Talk to the experts" session and a final certification exam, in which participants take a real-world industry use case or customer project and assess how to migrate it to the cloud.

Our "Talk to the experts" session provides users with the opportunity to ask experts questions about the final certification exam. It also introduces the messaging around "MongoDB: The Intelligent Operational Data Platform" and details an Atlas TCO and sizing exercise. In addition to these assets, partners also have access to self-paced developer training and database administrator training here.
Note: Download the enhanced Modernization Guide to refresh your knowledge on MongoDB modernization.

Dive Deeper into MongoDB Cloud Technology

What's one key lesson we know for certain? The data management platform you choose is a key factor in successfully migrating legacy applications to the cloud. The MongoDB Cloud section of our Architecture Guide discusses the unique value MongoDB can bring to organizations making the transition to the cloud.

Note: Download the Architecture Guide to refresh your knowledge on MongoDB Cloud.

The key components of the MongoDB cloud platform are:

At its core is MongoDB, the general-purpose operational database for modern applications. Nearly every application needs a fast database that can deliver single-digit-millisecond response times, and when it comes to speed, MongoDB delivers. With our flexible document data model, transactional guarantees, rich and expressive query language, and native support for both vertical and horizontal scaling, MongoDB can be used for practically any use case, reducing the need for specialized databases even as your requirements change.

With multi-cloud clusters on MongoDB Atlas, customers can realize the benefits of a multi-cloud strategy with true data portability and a simplified management experience. Multi-cloud clusters provide best-in-class technology across multiple clouds in parallel, migrate workloads across cloud providers seamlessly, and improve high availability with cross-cloud redundancy.

Realm Mobile Database extends this data foundation to the edge. Realm is a lightweight database embedded on the client side. It helps solve the unique challenges of building for mobile, making it simple to store data on-device while also enabling data access when offline. Realm Sync is seamlessly integrated and keeps data up to date across devices and users by automatically syncing data between the client and a backend Atlas cluster.

Ready to boost your knowledge and expertise? The Modernization Guide, Architecture Guide, and SI Architect certification program are waiting for you. Get started today. Start the free MongoDB SI Architect certification program today!

March 24, 2021

Capgemini Solutions That Help Customers Modernize Applications to MongoDB

Companies across every industry vertical continue to face the challenge of how to effectively migrate and quickly access massive amounts of enterprise data -- all while keeping system performance up to par throughout the obstacle-ridden process. The complexities involved with ubiquitous, traditional Relational Database Management Systems (RDBMS) are many. RDBMS systems can inhibit performance, falter under heavy volumes, and slow down deployment. With MongoDB's document-based, distributed database, performance and volume issues are easily addressed. But when it comes to speeding up time to market, the right auxiliary tools are still needed.

Capgemini, a MongoDB partner and global leader in digital transformation, provides the final piece of the puzzle with a new tool rooted in automated intelligence. In this blog, we'll explore three key ways Capgemini solutions help customers modernize to MongoDB:

Tools that expedite time to market
Migration from legacy systems to MongoDB
New development using MongoDB as a backend database

Whether your company is developing a new database or migrating from legacy systems to MongoDB, Capgemini's new Database Convert & Compare (DCC) tool can help. Below, we detail how DCC works, then walk through a few recent client examples and the benefits they reaped.

Tool: Database Convert & Compare (DCC)

A powerful tool developed by the Capgemini team, DCC optimizes activities like database migration, data comparison, validation, and much more. The tool can perform data transformations with specific customization based on the source and target databases in scope. When migrating from an RDBMS to MongoDB, DCC achieves 70% automation and 30% manual retrofit at the database level.

How does DCC work?

In the context of RDBMS-to-NoSQL migration, DCC performs the migration in three stages.

1) Assessment

Source database schema assessment -- DCC extracts source schema information and performs an assessment to generate a detailed inventory of data objects such as tables, views, stored procedures, and indexes. It also generates a detailed report on the data volume of each table, which helps in estimating data migration time from source to target.

Apply analytics to prepare recommendations for the target database structure -- the target structure varies based on parameters such as:
Table relationships (one to many, many to many, one to one)
Indexes applied on tables for performance requirements
Column data types

2) Schema Migration

Customize the tool to apply the recommendations from the assessment, generating the script for the target database.
Target schema script preparation -- DCC generates a complete database schema script, except for a few object types such as stored procedures and views.
Produce a detailed report of the schema migration, inclusive of objects that couldn't be migrated. Manual intervention is required to apply the business logic implemented in the source database's stored procedures and views to the target environment application.

3) Data Migration

Column mapping -- the assessment report generates an inventory of source database table fields as well as the recommended post-migration schema structure; the report also provides recommended field mapping from source to target based on the adopted recommendations and DCC customization.
Post-migration data validation script -- DCC generates a data validation script after data migration is complete, which takes the field mapping from the related assessment and recommendation reports into consideration.
Data migration script for execution -- DCC allows for the setup and configuration of different scripts for data migration, such as: one-time data migration from source to target; a daily batch run to sync up source and target database data; and intermittent data validation during the data migration process (if any discrepancies are found in validation, the job stops and generates a report with the potential root cause of the issue).
Standalone data comparison -- DCC allows data validation between the source and target databases to be run in isolation. In this case, DCC generates source database table inventory details and extracts target database collection inventory details. Minimal manual intervention is required to perform the field mapping and set the configuration in the tool for data migration execution. Other features, such as one-time migrations or daily batch migrations, can be configured as well.

The Capgemini team has successfully implemented and deployed the DCC tool for various banking customers for RDBMS-to-NoSQL end-to-end migration, including application retrofit and rewiring using other capable tools such as CAP360.
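As a rough illustration of what post-migration validation involves, here is a hedged Python sketch that compares record counts and spot-checks field values between a legacy table and the migrated MongoDB collection. DCC generates this kind of script automatically; sqlite3 stands in for the legacy RDBMS, and every name below is an assumption for the example.

```python
import sqlite3
from pymongo import MongoClient

# Assumed source and target; names are illustrative.
legacy = sqlite3.connect("legacy.db")
collection = MongoClient("mongodb://localhost:27017")["bank"]["accounts"]

# 1) Row count vs. document count.
(src_count,) = legacy.execute("SELECT COUNT(*) FROM accounts").fetchone()
tgt_count = collection.count_documents({})
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# 2) Spot-check mapped fields on a sample of rows.
mismatches = []
for acc_id, balance in legacy.execute(
    "SELECT account_id, balance FROM accounts LIMIT 1000"
):
    doc = collection.find_one({"_id": acc_id}, {"balance": 1})
    if doc is None or doc["balance"] != balance:
        mismatches.append(acc_id)

# Input for a root-cause report if anything diverged.
print(f"{len(mismatches)} mismatching accounts")
```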
Case study 1: Migration from Mainframe to MongoDB for a Large European Investment Bank

A large banking client encountered significant challenges in terms of growth and scale-up, low resilience and increased risk, and rising costs associated with the advent of mobile banking and a related significant increase in volume. To help the client evolve more quickly, Capgemini built an operational data platform to offload expensive mainframe operations, as well as store and process customer transactions for business operations, analysis, and reporting.

The Challenge:

Inefficient and slow to meet customer demand for new digital banking services due to heavy reliance on legacy infrastructure and apps
Continued growth in traffic and the launch of new digital services led to increased cost of operations and decreased performance
The mainframe was the single point of failure for many applications; outages resulted in poor customer service, brand erosion, and regulatory concerns

The Approach:

An analysis of digital channels revealed that 92% of traffic was generated by 25 interaction types, 85% of which were read-only. To offload these operations from the mainframe, an operational data lake (ODL) was created. The MongoDB-based ODL was updated in near real time via change data capture and a messaging queue to power existing apps, new digital services, and other APIs.
Outcome and Benefits:

Accelerated time to market for new digital services, including personalization
Improved stand-in capability to support resiliency during planned and unplanned mainframe outages
Reduced the number of read-only transactions hitting the mainframe (MIPS cost), freeing up resources for additional growth
Saved the customer over 80% in year-on-year post-migration costs; the new MongoDB database seamlessly handled 25M+ transactions per day, as well as a data volume spanning over 30 months of history, with ~13B transactions held in 114M documents

Case study 2: Migration of a Large-scale Database from Legacy to MongoDB for a US-based Insurance Customer

A US-based insurance client faced disparate data spread across 100+ systems, making data aggregation a cumbersome process. The client wanted to access the many data points around a single customer without hindering performance of the entire system.

The Challenge:

Reconciling different data schemas from multiple systems into a single schema is problematic and, in many cases, impossible
When adding new data sources, it is difficult to iterate on the schema quickly
Providing access to the data within the "Single View" requires ad hoc queries as well as multi-layer indexing and aggregation, which become complicated for relational databases to provide
Lack of personalization and the ability to provide context-based experiences in real time results in lost business opportunities

The Approach:

To assist customer service reps in real time, we built "The Wall," a single-view application that pulls disparate data from legacy systems for analytics. Additionally, we designed a flexible data model to aggregate disparate data into a single data store. MongoDB's expressive query language and secondary indexes can reach any field in real time, making data access faster and easier. Our approach was designed around four key foundations:

Document model -- a rich and flexible data store; a single document can store up to 16 MB of data, and 20+ data types provide flexibility in managing data
Versatility -- a variety of structured and non-structured data models defined
Analytics -- a strong aggregation framework to aggregate data related to a single customer
Workload isolation -- parallel runs of operational and analytical workloads on the same cluster

Outcome and Benefits:

Our largest insurance customer was able to attain a single view of the customer within a 90-day timespan. A different insurance customer achieved a 360-degree view of 13 million customers on MongoDB Enterprise Advanced. And yet another esteemed healthcare customer achieved as much as a 300% reduction in processing times and increased processing throughput with 50% less hardware.

Ready to accelerate your digital transformation? Capgemini and MongoDB can help you re-envision your data and advance your business processes so you can focus on innovation. Reach out today to get started.

Download the Modernization Guide

February 10, 2021

Appy Pie & MongoDB’s Seamless, No-Code Business Solutions for Mobile & Web Apps

>> Announcement: Some features mentioned below will be deprecated on Sep. 30, 2025. Learn more.

The tech industry's ceaseless, exponential growth is no longer a surprise. As long as clients and end users remain interested in faster, more efficient services, tech companies will continue to improve business processes to meet the demand. Simultaneously, these improvements reduce costs and maximize revenue. It's a win-win -- if, of course, it's done correctly.

So, what's behind most success stories? How do some companies launch and maintain applications at such rapid, expansive scale? Often, the key to success lies in fostering core business processes that are driven by automation. For many tech-based organizations, Appy Pie Connect has been the go-to seamless integration platform that helps them get started. And now, with MongoDB Realm, it's about to get even easier. Users build best-in-class apps across Android, iOS, and the web with MongoDB Realm's mobile database, sync solution, and application development services.

Why use Appy Pie & MongoDB Realm?

Together, Appy Pie and MongoDB are driving seismic operational change. Originally, the Appy Pie AppMakr product moved to MongoDB Realm for local storage. But after experiencing the immense ease and advantages offered by Realm -- specifically, its offline-first database that supports cross-platform app development -- we decided to extend its benefits to the customers of Appy Pie Connect.

As an automation platform, Appy Pie Connect helps businesses automate manual tasks through smart integrations, allowing for intuitive, instant sharing between apps that are less commonly connected, like MailChimp and LinkedIn, or Stripe and Gmail. By integrating MongoDB and MongoDB Realm with Appy Pie Connect, customers can easily store or retrieve data across multiple database sources. This enables the storage of flexible schemas while maintaining consistency and integration. This unique "no code" technology allows organizations to extract and work with data from MongoDB and then apply that data to desired software through trigger-based actions. For example, users can set up a trigger for every event on their Google Calendar so that their Slack status corresponds and is updated at the start and end of each meeting. This way, data concurrency is maintained without any manual effort. Realm is a particularly great choice for the customers of Appy Pie Connect because of its effortless data syncing. View some of the common use cases below.

Example 1: In the meter billing example, cost is calculated in real time based on usage (e.g., video viewing time), resulting in a transparent "pay as you go" model.

Example 2: With real-time data sync, any updates or changes to the application are immediately reflected without requiring users to update or reinstall.

Example 3: All API failure logs are conveniently displayed to the admin on the Appy Pie dashboard so that immediate troubleshooting actions can be taken.

Benefits of MongoDB Realm and MongoDB Atlas

Appy Pie Connect already uses MongoDB Atlas, so moving to Realm -- a MongoDB product offered through Atlas -- was a natural choice. Realm allows mobile users to sync data quickly and seamlessly between mobile devices and backend systems, even if they go offline (sync occurs when they reconnect) -- and Atlas enables it all.
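To illustrate the trigger-action pattern that powers this kind of automation, here is a minimal, hedged sketch using a MongoDB change stream in Python. Appy Pie Connect provides this without code; the collection names and the downstream action are invented for the example.

```python
from pymongo import MongoClient

# Assumed cluster; change streams require a replica set (e.g., Atlas).
client = MongoClient("mongodb+srv://<user>:<password>@cluster0.example.mongodb.net")
subscribers = client["crm"]["subscribers"]

def add_to_mailing_list(email: str) -> None:
    # Stand-in for the action step (e.g., a Mailchimp API call).
    print(f"Would subscribe {email} to the mailing list")

# Trigger: react to every newly inserted document in real time.
# Runs until interrupted.
with subscribers.watch([{"$match": {"operationType": "insert"}}]) as stream:
    for change in stream:
        add_to_mailing_list(change["fullDocument"]["email"])
```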
Some benefits of using Atlas are as follows:

Scalability and a flexible data schema
Document-oriented storage
Ad hoc queries, indexing, and real-time aggregation
Powerful tools for data analysis
Serverless functions

Ultimately, Appy Pie Connect helps businesses convert MongoDB into a central data store by pulling in and replicating data from all its sources. This allows customers to create new MongoDB documents automatically from new Typeform entries, new files on Dropbox, new posts on WordPress, or other resources. Similarly, Appy Pie Connect can also send MongoDB data to other third-party apps, including WordPress, Salesforce, Slack, Mailchimp, Google Drive, and many more. This makes enterprise-wide communication and collaboration much more efficient.

When data is pulled from MongoDB through automation, it can help streamline other areas of the business. For example, when you pull MongoDB data into Mailchimp, you can automatically add new subscribers on Mailchimp. This ensures that your lists grow automatically, as fast as your business does.

Example: Appy Pie Connect seamlessly sends MongoDB data to third-party apps

Use cases:

Send data from MongoDB to LinkedIn (without any code!) to quickly post accurate job-related content
Extract retail product data into Google Sheets to record valuable data in an organized manner in one place
Post enterprise-related content from MongoDB to Twitter to streamline social media presence

How it works

Appy Pie Connect employs a trigger-action-based function that allows you, as a platform user, to choose the two apps you want to connect. The process is straightforward. Once you choose the apps you wish to integrate, you will be presented with multiple options to connect them. Simply click on the "Connect" button. To integrate with the selected applications' accounts, allow API access for Appy Pie Connect. Next, design the workflows by mapping the data synced from the applications you are connecting. Once complete, you are ready to test your brand new Connect with your Trigger and Action apps. And that's it! It is time to experience the magic of Appy Pie Connect at work. Let the automation workflows take over the mundane, repetitive tasks, and move on to more innovative, exciting work.

As the efficiency of an organization improves through automation, one of the most direct advantages is a marked reduction in cost. These integrations help save hundreds of hours of manual effort, freeing up talented people to focus their energy and intellect on more critical, innovative issues. With Appy Pie Connect and MongoDB Realm, businesses can ensure that their workforce is not only optimized but also inspired -- a key factor in employee satisfaction and overall company success.

Watch this demo to learn how to integrate MongoDB with Google Sheets using Appy Pie Connect to help you automate data exchange between MongoDB and Google Sheets with ease.

Click here to learn more about MongoDB Realm

February 2, 2021

Build Better Mobile Apps -- Running MongoDB Realm and Google Cloud

>> Announcement: Some features mentioned below will be deprecated on Sep. 30, 2025. Learn more.

We're partnering with Google Cloud to offer MongoDB Realm as part of the MongoDB Cloud stack with Google Cloud, serving users globally whether you're building a new mobile app or modernizing an existing one. Realm's integrated application development services make it easy for developers to build industry-leading apps on mobile devices and the web. With MongoDB Atlas running as a service on Google Cloud, it's easy to connect your mobile database to Google services.

Customers choose Google Cloud to:

avoid vendor lock-in by running multi-cloud and hybrid cloud deployments
take advantage of Google Cloud's machine learning and advanced analytics capabilities
stay secure with the same protections Google Cloud itself uses to guard its data, applications, and infrastructure

Why MongoDB Realm for Mobile?

Realm comes with three key features:

Cross-platform mobile database
Cross-platform mobile sync solution
Time-saving application development services

Mobile Database

Realm's mobile database is an open source, developer-friendly alternative to CoreData and SQLite. With Realm's open source database, mobile developers can build offline-first apps in a fraction of the time. Supported languages include Swift, C#, Xamarin, JavaScript, Java, React Native, Kotlin, and Objective-C. Realm's database was built with a flexible, object-oriented data model, so it's simple to learn and mirrors the way developers already code. Because it was built for mobile, applications built on Realm are reliable, highly performant, and work across platforms.

Sync Solution

Realm Sync is an out-of-the-box synchronization service that keeps data up to date between devices, end users, and your backend systems, all in real time. It eliminates the need to work with REST, simplifying your offline-first app architecture. Use Sync to back up user data, build collaborative features, and keep data up to date whenever devices are online -- without worrying about conflict resolution or networking code. Powered by the Realm Mobile Database on the client side and MongoDB Atlas on the backend, Realm is optimized for offline use and scales with you. Building a first-rate app has never been easier.

Application Development Services

With Realm app development services, your team can spend less time integrating backend data for your web apps, and more time building the innovative features that push your business initiatives forward. Services include:

GraphQL
Functions
Triggers
Data access controls
User authentication

Use these products from Google to accelerate the development and deployment of backend services:

Google Kubernetes Engine (GKE)
Google Cloud Functions (FaaS)
Google App Engine (PaaS)

Realm and MongoDB Atlas with Google Cloud and Android

As Realm is a MongoDB product offered through Atlas, and Atlas is used by Realm to sync data between the database and clients, Google Cloud and Atlas capabilities are key to the Realm user experience.

Figure 1: Screenshot of Realm offered through the MongoDB Cloud UI

MongoDB Atlas and Google Cloud

MongoDB Atlas delivers a fully managed service on Google Cloud's globally scalable and reliable infrastructure. Atlas allows users to manage their MongoDB databases easily through the UI or an API call. It's simple to migrate to, and it offers sophisticated features such as Global Clusters that provide low-latency read and write access anywhere across the globe.
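As a small, hedged illustration of the low-latency access pattern Global Clusters enable, the sketch below connects to an assumed Atlas cluster and routes reads to the geographically nearest node; the URI, database, and collection names are placeholders.

```python
from pymongo import MongoClient, ReadPreference

# Placeholder URI for an assumed Atlas Global Cluster on Google Cloud.
client = MongoClient("mongodb+srv://<user>:<password>@global0.example.mongodb.net")

# Route reads to the nearest replica for low latency; writes still go
# to the primary in the appropriate zone.
profiles = client.get_database(
    "app", read_preference=ReadPreference.NEAREST
)["profiles"]

print(profiles.find_one({"user_id": "u-123"}))
```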
3 Key Abilities with MongoDB Atlas and Google Cloud

Geographic Presence: All Google Cloud regions have at least three availability zones, providing higher availability, resiliency, and geographic coverage. Other public clouds do not offer the same reliability guarantees.

Network Offering -- Cost and Customer Benefits:
Global VPC -- global resources that reduce complexity in networking implementation
Performance -- the premium tier leverages the performance of the Google Cloud network, improving application performance and latency across tiers
Price -- a better pricing ratio for network egress costs

Native Integrations:
Security -- Atlas offers native integrations with Google Auth through Realm, support for Google Cloud KMS for additional encryption at rest or MongoDB Client-Side Field Level Encryption, and OAuth-flow-based console integration
Billing -- pay-as-you-go billing on the Google Cloud Marketplace (Realm is purchased through Atlas credits, similarly on Marketplace)

Realm and Android

With Realm, you can create mobile applications for Android devices. Realm supports all versions of the Android API after level 9 (Android 2.3 Gingerbread). Below is a sample reference architecture showing how to leverage MongoDB Atlas with Google Cloud as an Operational Data Layer (ODL) / Operational Data Store (ODS) and build mobile applications using MongoDB Mobile and Realm Sync.

Figure 2: Reference Architecture for an ODL on MongoDB Atlas and Realm with Google Cloud

Realm Customer Story -- A Leading New York Healthcare Payer

MongoDB has partnered with Exafluence to deliver a COVID employee self-assessment health checker app for a leading healthcare payer in New York. Since the onset of the pandemic, the organization has needed to adapt quickly to new operational standards as the situation with COVID evolves. MongoDB Atlas, Realm, Google Cloud, and Exafluence have all been key to keeping their onsite operations running.

The CDC and New York State require organizations to keep track of which of their employees report to a physical office for work. As a result, the organization must monitor its New York-based employees who still come onsite to support its members. It needed an app that would capture its employees' health status and ask a series of questions to determine whether an associate could enter the facility. Exafluence -- a MongoDB Global Strategic Partner working with the healthcare payer's HR and business teams -- delivered a complete solution in only three weeks from start to go-live. This rapid deployment was made possible using MongoDB Atlas, Realm, and Google Cloud.

The completed app includes:

support for mobile devices
a web portal to aggregate information
use of QR scans to confirm access on iPads deployed at facility entrances
integration with Active Directory and alerts to the fund's email system

The organization and Exafluence chose Realm because its application development services make it easy to work with data across both web and mobile applications. Realm works with React JS, provides offline sync, and is Atlas cloud ready. MongoDB Atlas and Realm also make it easy to rapidly develop new features when the next stage of the pandemic changes app requirements. Exafluence will be able to quickly add app features tied to vaccination, like the ability for employees to disclose and share immunization certification via MongoDB's FHIR API.
Prior to the COVID app, this healthcare payer chose to use Atlas on Google Cloud because the fully managed, global DBaaS accelerates development and allows them to manage both structured and unstructured data. They also needed a solution for analytics involving geocoding, machine learning, and dashboarding. With Atlas and Google Cloud, their teams gain agility, with elastic scaling and on-demand resource provisioning.

Additional differentiators that drove the organization to select Google Cloud include:

Maps API
Airflow for scheduling
Cloud Identity
Kubernetes deployment and seamless integration with MongoDB and Realm for mobile development
Scalable VM environments
Meeting CISO requirements

They were able to automate and offload operational tasks while taking advantage of built-in security best practices, which in turn reduced regulatory risk. With Atlas and Google Cloud, their teams can also elastically scale and provision on-demand resources to build more microservices, in line with their agile development requirements.

Click here to learn more about MongoDB Realm

February 2, 2021

4 Steps to Success: From Surviving with Legacy Systems to Thriving with MongoDB

Legacy data migrations imply a change in the status quo. More often than not, when an organization finally undertakes a thorough analysis of its technology landscape, it arrives at the same decision: to do nothing. It is an understandably daunting task to upgrade or replace 20+ year-old applications and their database counterparts. But there are good reasons, beyond the tri-annual hardware upgrade, to propel those legacy monoliths of the 1990s into the 21st century. Companies that prevailed -- and even triumphed -- in the volatile spring of 2020 were those that transitioned to a more flexible usage model and were therefore able to adjust their business models more rapidly and reliably. MongoDB's client Sanoma was one of the winners: it was able to scale from 3,000 to 150,000 users within 24 hours, without any service interruption. Innovation and modernization go hand in hand. However, while modernization can sadly occur without innovation, the opposite is simply not possible.

A bit of history

The concept of bringing data together through online data layers (ODL) or operational data stores (ODS) isn't new or specific to MongoDB. Accessing legacy systems, bringing data together, and making it all more easily accessible was a common goal even 20 years ago, and it led to the search for the golden source of truth (i.e., the definitive master source for any given entity). This search proved elusive early on due to the hurdles involved in bringing data from diverse, over-structured relational constructs into a single target, called an Operational Data Store (ODS) or Online Data Layer (ODL). The industry's first attempts began with object-oriented databases, then hit the dead end of XML data stores. (In my personal opinion, XQuery and XPath were never meant for real developers.) After both endeavors failed, there came the wave of Apache efforts I like to call "Hadoop Solves the Planet," in which companies dumped all their structured data onto a big-data treasure trove. Unfortunately, this resulted in a data desert rather than the data lake everybody was hoping for, since organizations then had to scramble to build concepts for secondary indexing, data dictionaries, and more, on top of having to rebuild the sensible structures they had lost.

In the 2010s, the document model, in conjunction with JSON notation, emerged as the new de facto standard. MongoDB release 3.x introduced the combination of ACID (atomicity, consistency, isolation, durability) compliance with a broad range of data types (in BSON, for those in the know). Soon, the MongoDB team started implementing additional features of relational heritage: secondary indexing, ACID transactions, aggregations and manipulations of data in situ, materialized views, joins, unions... the list goes on.

Where we are now

MongoDB documents can be enriched through different means and channels without touching existing content -- the consistency of all data and data lineage is implicitly guaranteed. A typical example is the extraction of a delivery address through a supply chain application and a billing address through an enterprise resource planning system. In many cases, those two systems have different requirements. MongoDB documents simply keep both instantiations intact and can even hold multiples of each, attached to one single client profile, without the need for complete loads and transformations, foreign keys, and all the other ingredients of the relational past. MongoDB simply adds and leverages other sources without destroying their context.
MongoDB delivers an ODS and ODL experience while streamlining the time-consuming journey of replacing legacy application code. The data platform of true modernization and innovation has arrived!

How your company can get here

The entire journey can be summarized in four simple steps:

Analysis: Where do I start my data journey to drive the fastest value?
Scaffolding: How do I get my data out of the existing platform and bridge it to the new platform?
Coding: How do I enter the world of adjusting and adapting my application landscape?
Innovation: Which are the easiest targets for my company to start achieving true innovation?

The following sections answer these four questions and provide you with a starting point for your journey toward a new and improved solution landscape.

Step 1: Analysis of your existing solution landscape

Data provisioning

Data provisioning -- the act of bringing data from source system(s) to a target system -- is actually the easy part of this step. Opinions may vary as to the very best approach, but most existing models for streaming data in real time make the process elegant and allow for a business-driven decision, from real-time replication on one end to a batch of .CSV files on the other.

Application onboarding

More exciting is the application onboarding phase, inclusive of the selection and design of initial data domains. Here, simple mechanisms derived from classic priority concepts can assist -- and yes, they existed long before computers. Data domains already exist in the business logic, represented through objects in the various programming languages. But even the most talented application developer deals with constant changes, which lead to compromises in those objects and can obfuscate the original clarity of their design, so the objects may hide in plain sight. Unearthing those gems and aligning them to the ODS is the most important step toward true legacy modernization.

The simplest solution is actually the most practical one: load an object with the existing software and persist it into a MongoDB collection. Persisting the object takes two lines of code that can be easily added: the first opens a connection to the database; the second persists the object. Where those two lines live does not matter, as long as they run after the object is fully built. This is the first time you will see the beauty of MongoDB and MQL at work. You have to do nothing for the object itself -- e.g., no decomposition or abstraction layer; MongoDB takes care of it for you. When looking at the object in the MongoDB database, e.g., using MongoDB Compass, you will realize that it already looks a lot like the domain object you wanted. The actual task of mapping objects to domains, or subsets of domains, is then mostly driven by the application use case.
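As a hedged sketch of those "two lines" in Python (the article does not prescribe a language, and the domain class and all names here are invented for illustration):

```python
from dataclasses import dataclass, asdict
from pymongo import MongoClient

@dataclass
class ClientProfile:
    # Illustrative domain object taken from existing business logic.
    client_id: str
    delivery_address: dict
    billing_address: dict

profile = ClientProfile(
    client_id="C-1001",
    delivery_address={"city": "Helsinki", "street": "Satamakatu 1"},
    billing_address={"city": "Helsinki", "po_box": "220"},
)

# The two added lines: open a connection, persist the object as-is.
db = MongoClient("mongodb://localhost:27017")["ods"]
db["client_profiles"].insert_one(asdict(profile))
```

Inspecting the stored document in Compass shows the same nested structure as the in-memory object, with no mapping layer in between.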
Tip: How to leverage application mapping to accelerate onboarding

In the model below, which was taken from the financial industry but can easily be adopted across industries, we identify the data domains in various applications and map their behavior to the effort it takes to locate them, as well as their importance to the app. First, each domain gets a rating for its object complexity, where "complexity" is defined by the implementation team. This is similar to the concept of "poker" in a development sprint. Second, each data domain must be located in the application content. Then, it's tally time.

As we can see in the example above, the concept of schedules looks quite easy but is superseded by the client profiles, which have a touch more application context (spoiler: those always come out on top). Based on the combination of complexity and the number of data domains affecting an application, we can now easily arrive at the model below. Agile is your friend and, assuming a certain "point capacity," the applications fall into place in their conversion schedule in a quite neutral fashion. The development team will then start with the low-hanging fruit. As soon as applications 1, 6, and 7 are ported, we're in business in a new, modern landscape. Along the journey, the domains will get cleaned up naturally, as we no longer have the static corset of RDBMS table designs.

Step 2: Scaffolding

Scaffolding is the art of building a bridge that can hold people as they cross it, then immediately dissipate once they step off. But for that critical time, it needs to hold. The same is true for the connectivity between a legacy system and a new data platform. Starting with the first sprint, we have data residing in the MongoDB data platform. If the data is limited to new applications and resides exclusively in MongoDB, nothing needs to be done. However, as shown in the client profiles example above, there may be dependencies to consider. The synchronization between the legacy database and the new MongoDB platform can be easily arranged using microservices and the same concepts used for the initial loading of data. Synchronization can also be achieved "through the gate" if only READ data is needed during the first sprint, or if you're already dealing with WRITEs and the requirement to synchronize those writes back to a legacy system.

Streaming: A streaming-based solution is a great option for uni-directional, read-only operations, handled in the simplest way.

Service: A simple, tiny microservice is a good option for the use case where data needs to be selectively written. It works using the document model on the MongoDB side, but can still push necessary updates back to the legacy system, and vice versa. The great news is that this service potentially exists already, as it requires nothing more than using the old database interface from the legacy application on one side and the new, easy-to-digest JSON document format on the MongoDB side. If both databases are ACID-compliant, any transaction is automatically treated as a normal application interaction on both sides.

"Y-Loader": Another option is a true "Y-loader," where all transactions are written in sync to both databases in parallel, and a transaction is only considered committed when both systems report their commit and completion. Simple two-phase protocols (write to both, wait five seconds, read both to validate and, if in sync, commit to the application) are available as ready-made services through various distributed transaction coordinators, but it's often easier to use the existing data access in the application. In that case, the new data path to MongoDB runs in parallel, and a simple redundant checkpoint (which the application logic would have had for the legacy path anyway) is expanded for this purpose.
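A minimal, hedged sketch of that Y-loader checkpoint in Python, assuming a legacy SQL database (sqlite3 as stand-in) and MongoDB as the parallel target; the dual-write logic and all names are illustrative, not a ready-made coordinator:

```python
import sqlite3
from pymongo import MongoClient

legacy = sqlite3.connect("legacy.db")  # stand-in for the legacy RDBMS
legacy.execute("CREATE TABLE IF NOT EXISTS transactions (txn_id TEXT, amount REAL)")
mongo = MongoClient("mongodb://localhost:27017")["ods"]["transactions"]

def y_write(txn: dict) -> None:
    """Write to both systems; only report success when both committed."""
    legacy.execute(
        "INSERT INTO transactions (txn_id, amount) VALUES (?, ?)",
        (txn["txn_id"], txn["amount"]),
    )
    mongo.insert_one(dict(txn))

    # Redundant checkpoint: validate both sides before committing.
    (count,) = legacy.execute(
        "SELECT COUNT(*) FROM transactions WHERE txn_id = ?", (txn["txn_id"],)
    ).fetchone()
    if count == 1 and mongo.find_one({"txn_id": txn["txn_id"]}):
        legacy.commit()
    else:
        legacy.rollback()
        mongo.delete_one({"txn_id": txn["txn_id"]})
        raise RuntimeError(f"Y-write failed for {txn['txn_id']}")

y_write({"txn_id": "T-42", "amount": 99.95})
```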
Step 3: Coding

Coding against the new domain data model, with MongoDB's flexible document model as the underlying base, will immediately impact coding for the business logic and application development. The operative word is immediately. As the data gets unlocked with the initial persistence of the code object to the MongoDB collection, the developer is simultaneously able to code based on business requirements. Developers will no longer be hindered by the references and requirements of object mappers. As objects are represented through MongoDB's idiomatic drivers, each programming object resides directly in the data collection; in reverse, any change to a business logic object is naturally represented -- code-free -- in the MongoDB collection.

A single blog post can't resolve all open questions and edge cases. Each application, client, and data interface is unique. Databases carry historic technical debt and implicit assumptions that become lost across generations of developers. "Do not touch this section -- not sure what it does, but last time we tried, all hell broke loose..." is often-heard advice around the organizational water cooler. But the key lesson? There are many templates available and very simple methods for quickly turning the first steps into significant success. For example, a German client who was stuck in a combination of IBM DB2 (mainframe and distributed) with a significant Hadoop footprint was amazed to realize they could "lift" their data one microservice at a time. Within a single week of a proof of concept, business requirements shifted from "impossible to do" for some requested queries to "completed in under one second." This is no exception. Cases and changes like these happen daily, reinforcing Mark Twain's sage advice that "the secret of getting ahead is getting started."

Step 4: Innovation

As the migration from the legacy environment continues, innovation becomes the new focus. Unlocking previously siloed data allows immediate coupling of real-time data with machine learning platforms for various purposes: e.g., scoring for financial decision-making, personalization for retail, or optimization of production processes in an IoT context. New applications and solutions can easily be created on top of the unleashed data, even in various programming languages, with direct real-time dashboards created with MongoDB Charts, and different paradigms (again, MongoDB's idiomatic drivers work magic!). At this point, the discussion with the product owners in your squads and tribes (trying to be real modern here) begins with the questions "What is the highest-priority component to change?" and "What function is required to enable this change?"

Is it worth waiting much longer? The real question is: why did we all not start sooner? It's time to begin integrating the list of features you always dreamed of having but never dared to pursue. The MongoDB team is here to help you get started. Reach out today and let's discuss the best path forward. To learn more about modernizing to MongoDB, click here.

January 27, 2021

How Accenture’s Data Modernizer Tool Accelerates the Modernization of Legacy and Mainframe Apps to MongoDB

As companies scale their MongoDB estates and apply them to more and more use cases, opportunities for deeper insights arise—but first come issues around data modeling.

The Challenge

It’s one thing to start modeling for a greenfield application, where you have the freedom to apply modeling patterns. It’s a different matter entirely to start from an existing relational estate and take into account multiple data sources and their business requirements in an effort to create the ideal MongoDB data model, based on the explicit schema (tables and relations) as well as the implicit schema (access patterns and queries). Unfortunately, this is a fairly common obstacle. Perhaps an application has been around for a while and documentation hasn’t kept up with changes. Or maybe a company has lost talent and expertise in some areas of a monolithic app. These are just a couple of examples that demonstrate how time-consuming and error-prone this effort can be.

The Solution

MongoDB and Accenture are excited to announce a modernization tool which enables faster MongoDB adoption by accelerating data modeling. As part of its mainframe offload and cloud modernization initiatives, Accenture’s Java-based data modernizer helps companies arrive at a recommended MongoDB document data model more quickly and efficiently. Using this tool results in a faster operational data layer for offloading mainframes and a faster implementation of MongoDB Atlas for migrating applications to the cloud while also transforming them. Accenture and MongoDB are long-standing strategic partners, and this investment is another addition to our joint customers’ toolkits. “This modernizer is the right tool for teams who are focused on changing business requirements as they rethink their applications for the cloud. With, on average, 50% faster data modeling and 80% data model accuracy straight out of the tool, developers can work on modernizing instead of spending weeks figuring out the right data model simply to meet existing business requirements that were never made explicit,” says Francesco De Cassai.

How it Works

The modernizer analyzes access logs and relational schemas from Oracle and DB2 databases, whether they’re powered by mainframes or not. This allows teams to obtain critical insights before kicking off development activities, thereby better informing the process. Additionally, the modernizer combines the implementation of modeling patterns and development best practices, such as document sizing (best, average, and worst case) and naming conventions, while also providing confidence levels to predict the accuracy of the model it suggests. The tool embodies Accenture’s deep expertise and lessons learned from numerous successful projects. Our customers are investing in a new approach to managing data: Data as a Service and data decoupling. This strategic initiative focuses on consolidating and organizing enterprise data in one place, most often in the cloud, and then making it available to serve digital projects across the enterprise. MongoDB Atlas unlocks data from legacy systems to drive new applications and digital systems, without the need to disrupt existing backends as companies modernize and migrate to the cloud. With this tool and MongoDB’s capabilities, MongoDB and Accenture let organizations get the most out of their data. Find out more about the MongoDB & Accenture partnership here. Learn more about MongoDB’s Modernization Initiative.

January 21, 2021

Modernize Data Between Siloed Data Warehouses with Infosys Data Mesh and MongoDB

The Data Challenge in Digital Transformation

Enterprises that embark on a digital transformation often face significant challenges with accessing data in a timely manner—an issue that can quickly impede customer satisfaction. To deliver the best digital experience for customers, companies must create the right engagement strategy. This requires that all relevant data available in the enterprise be readily accessible. For example, when a customer contacts an insurance company, it is important that the company has a comprehensive understanding of the customer’s background as well as any prior interactions, so it can orchestrate the best possible experience. Data is available in both BI (Business Intelligence) systems, like enterprise data warehouses, and OI (Operational Intelligence) systems, like policy and claims systems. These BI and OI systems need to be brought together without synchronization delays that disrupt digital functions. Data removed from an operational system loses context. Re-establishing this domain context and providing persona-based access to the data requires domain-oriented, decentralized data ownership, as well as a matching architecture. Ultimately, organizations seek to use data as the key fuel for the products and services they provide their customers. This data should minimize the cost of customer research—but it needs to be trusted and of high quality. Companies need access to these siloed sources of data in a seamless, self-service approach across various product life cycles.

The Challenge of Centralized Data

Historically, businesses have handled large amounts of data from various sources by ingesting it all into a centralized database (a data warehouse, a data lake, or a data lake in the cloud). They would then feed insight drivers, like reporting tools and dashboards as well as online transaction processing applications, from that central repository. The challenge with this approach is the broken link between analytical systems and transactional systems, which impedes the digital experience. Centralized systems, like data warehouses, introduce latency and fail to meet the real-time response and performance levels needed to build next-generation digital experiences.

What is Infosys Data Mesh?

Data Mesh helps organizations bridge the chasm between analytics and application development teams within large enterprises. Data Mesh is an architecture pattern that takes a new approach to domain-driven distributed architecture and the decentralization of data. Its basic philosophy is to encapsulate the data, its relationships, its context, and its access functionality into a data product with guaranteed quality, trust, and ease of use for business consumption. Data Mesh is best suited for low-latency access to data assets used in digital transformations that are intended to improve the experience through rich insights. With its richer domain flavor—distributed ownership, manageability, and low-latency access—Data Mesh is best positioned as a bridge between transactional (consuming applications) and analytical systems.

[Diagram: high-level solution view of Data Mesh]

Data Mesh Key Design Principles

Domain-first approach. Data as a product.
Data Mesh products share the following attributes, which maximize usability and minimize friction:

Self-described: metadata is precise and accurate
Discoverable and addressable: products are uniquely identifiable and easy to find
Secure and well-governed: only those who are granted access have it
Trustworthy: proper data quality controls are applied, and SLAs/SLOs are maintained
Open-standard and interoperable: data formats such as XBRL and JSON

Build new products easily. Any cross-functional team can build a new, enterprise-level product in an existing domain and/or from existing products.

Simplified access for multiple technology stacks. Polyglot data and ports, cloud and non-cloud.

Common infrastructure and services for all data pipelines and catalogs.

Platform Requirements for a Data Mesh

To build a data mesh, companies need a database platform that can create domain-driven data products that meet various enterprise needs. This includes:

Flexible data structures—to accommodate new behaviors
An API-driven construct—to access current data products and build new domain-data ones
Support for high-performance queries on large-scale data structures
A shared, scalable infrastructure

Why MongoDB is the Optimal Platform for Infosys Data Mesh

MongoDB is the best platform for realizing the Infosys Data Mesh architecture and powering analytics-driven enterprises because it provides:

A flexible document model and poly-cloud infrastructure availability, so teams can easily modify and enrich flat or hierarchical data models
MongoDB Realm Webhooks to create service APIs which connect data across products and enable consumption based on business context
A scalable, shared infrastructure and support for high-performance querying of large-scale data
Service APIs for constructing Infosys Data Mesh

Two use cases:

Case 1: A wealth management firm offers a variety of products to its customers—things like checking and savings accounts, trading, credit and debit cards, insurance, and investment vehicles.
Challenges: Each product is serviced by a different system and technology infrastructure. Internal consumers of this data have different needs: product managers analyze product performance, wealth managers and financial advisors rely on customer-centric analytics, and financial control teams track the firm’s revenue performance.
Solution: Using the Infosys Data Mesh model, the firm’s data owners create domain-data products categorized by customer and product, and then curate and publish them through a technology-agnostic, API-driven service layer. Consumers can then use this service layer to build the data products they need to carry out their business functions.

Case 2: The Risk and Finance unit of a large global bank has multiple regional data lakes catering to each region’s management information system and analytical needs. This poses multiple challenges for creating global data products.
Challenges: Technology varies across regions. ETL becomes less advantageous depending on circumstances. Regulations govern cross-regional data transfer policies.
Solution: To address these challenges, the bank creates an architecture of regional data hubs for region-specific products and, as with Case 1, makes those products available to authorized consumers through a technology-agnostic, API-driven service layer. Next, it implements an enterprise data catalog with an easy-to-use search interface on top of the API layer.
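To picture the technology-agnostic, API-driven service layer used in both cases, here is a deliberately small sketch (our own illustration, not Infosys’ implementation) of a domain-data product exposed over HTTP with Flask and pymongo; every route, database, and field name is a hypothetical stand-in.

```python
# A thin data-product service: one endpoint exposing a hypothetical
# "customer-360" domain-data product stored in MongoDB. All names are
# illustrative; real products would add auth, governance, and metadata.
from flask import Flask, jsonify
from pymongo import MongoClient

app = Flask(__name__)
product = MongoClient("mongodb+srv://<cluster-uri>")["mesh"]["customer_360"]

@app.route("/data-products/customer-360/<customer_id>")
def get_customer_view(customer_id: str):
    doc = product.find_one({"customer_id": customer_id}, {"_id": 0})
    if doc is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(doc)

if __name__ == "__main__":
    app.run(port=8080)
```

An enterprise data catalog can then sit on top of endpoints like this one.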
The catalog’s query engine executes cross-hub queries, creating a self-service model for users to seamlessly discover and consume data products and to align newer ones with their evolving business needs. Enterprise security platform integration ensures that all regulatory and compliance requirements are fully met.

How Businesses Can Benefit Overall

Data and insights become pervasive and consumable across applications and personas
Speed-to-insight (including in real time) enables new digital experiences and better engagement, leading to superior business results
Self-service through trusted data products is enabled

Infosys DNA Assets on MongoDB Accelerate the Creation of Industry-Specific Domain Data Products

Infosys Genome: Creates the foundation for Data Mesh by unifying semantics across industries
Infosys Data Prep: Guides consumers through the product creation process with a scalable data preparation framework
Infosys Marketplace: Enables discovery and consumption of domain-data products via an enterprise data catalog

Download our Modernization Guide for information about which applications are best suited for modernization and tools to get started.

January 14, 2021

Simplifying Data Science with Iguazio and MongoDB: Modernization with Machine Learning

For the most innovative, forward-thinking companies, “data” has become synonymous with “big data”—and “big data” has become synonymous with machine learning and AI. The amount of data you have is raw knowledge. The ability to connect the dots into a cohesive picture that lets you see major projections, personalizations, and security breaches in real time—that’s wisdom. Or, as we like to call it, data science. MongoDB Cloud is the leading cloud data platform for modern applications. Iguazio, initially inspired by the powerful Iguazu Falls in South America, is the leading data science platform built for production and real-time use cases. We’re both disrupting and leading various industries through innovation and carefully considered intelligence. It makes perfect sense for us to work together to create a powerful, data-driven solution.

Iguazio’s Data Science & MLOps platform optimizes data science for your use cases

Iguazio enables enterprises to develop, deploy, and manage their AI applications, drastically shortening the time required to create real business value with AI. Using Iguazio, organizations can build and run AI models at scale and in real time, deploy them anywhere (multi-cloud, on-premises, or edge), and bring to life their most ambitious AI-driven strategies. Enterprises spanning a wide range of verticals use Iguazio to solve the complexities of machine learning operations (MLOps) and accelerate the machine learning workflow by automating the following end-to-end processes:

Data collection—ingested from any diverse source, whether structured, unstructured, raw, or real-time
Data preparation—through exploration and manipulation at scale (go big!)
Continuous model training—through acceleration and automation
Rapid model and API deployment
Monitoring and management of AI applications in production

As a serverless, cloud-native data science platform, Iguazio reduces the overhead and complexity of developing, deploying, and monitoring AI models; guarantees consistent and reproducible results; and allows mobilizing, scaling, and duplicating functions to multiple enforcement points.

MongoDB delivers unprecedented flexibility for real-time data science integration

With its scalable, adaptable data processing model, its ability to build rich data pipelines, and its capacity to scale out while doing both in parallel, MongoDB is a foundational persistence layer for data science. It allows you to use your data intelligently in complex analytics, drawing new conclusions and identifying actions to take. Data science and data analytics go hand in hand, fueled by big data. The MongoDB data platform handles data analytics by:

Enabling scalability and distributed processing—processing data with a query framework that reduces data movement by knowing where the data is and optimizing in-place computation
Accelerating insights—delivering real-time insights and actions
Supporting a full data lifecycle—intelligent data tiering from ingestion to transactions to retirement
Leveraging a rich ecosystem of tools and machine learning for data science

MongoDB and Iguazio: from research to production in weeks

Here’s a look at how Iguazio and MongoDB partner to create a seamless production environment: Iguazio fuses with MongoDB to allow intelligent, complex data compilations that lead to real-world ML/AI results like streaming and analytics, IoT applications, conversational interfaces, and image recognition.
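As one concrete illustration of that fusion, consider the online feature-store pattern with features kept in MongoDB. The sketch below uses the plain Python driver (it is not Iguazio’s actual API), and every collection, field, and feature name is a hypothetical assumption.

```python
# Generic online feature-store pattern on MongoDB (an illustration, not
# Iguazio's API). Feature vectors are upserted per entity for low-latency
# reads at model-serving time. All names are hypothetical.
from datetime import datetime, timezone
from typing import Optional
from pymongo import MongoClient

features = MongoClient("mongodb+srv://<cluster-uri>")["ml"]["user_features"]

def write_features(user_id: str, values: dict) -> None:
    # Keep exactly one current feature vector per user.
    features.update_one(
        {"user_id": user_id},
        {"$set": {**values, "updated_at": datetime.now(timezone.utc)}},
        upsert=True,
    )

def read_features(user_id: str) -> Optional[dict]:
    # The low-latency lookup a model server would make per prediction.
    return features.find_one({"user_id": user_id}, {"_id": 0})

write_features("u-42", {"txn_count_7d": 18, "avg_basket_eur": 63.5})
print(read_features("u-42"))
```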
Data science is opening opportunities for businesses in all areas, from financial services to retail, marketing, telco, and IoT, and those opportunities create demands on data that continue to grow. Iguazio reduces the development and deployment time of data science projects from months to weeks, transforming how businesses, developers, and product owners use their data and imagine new use cases for it. Together, MongoDB and Iguazio establish a joint hybrid/multi-cloud data science platform. MongoDB’s unique features create the perfect seeding ground for Iguazio’s data science platform. They include:

MongoDB’s high-performing, highly ranked data platform experience
No data duplication
Optimization for real time, an essential factor for data science
An elastic, flexible model that adjusts to ever-changing load requirements
Production readiness in minutes

Meanwhile, Iguazio’s powerful ML pipeline automation simplifies the complex data science layer by creating a production-ready environment with an end-to-end MLOps solution, including:

A feature store for managing features (online and offline) that resides in MongoDB
Data exploration and training at scale, using built-in distribution engines such as Dask and Horovod
Real-time feature engineering using Iguazio’s Nuclio-supported serverless functions
Model management and model monitoring, including drift detection
An open, integrated Python environment, including built-in libraries and Jupyter Notebook-as-a-Service

Data and data science in the real world

When we think of data, static databases may come to mind. But data in action is live, quick, and moves in real time. Data science is no different—and it has quickly incorporated itself into every sector of virtually every industry:

Fraud prevention—distinguishing legitimate from fraudulent behavior and learning to prevent new tactics over time
Predictive maintenance—finding patterns to predict and prevent failures
Real-time recommendation engines—processing consumer data for immediate feedback
Process optimization—minimizing costs and improving processes and targets
Remote monitoring—quickly detecting anomalies, threats, or failures
Autonomous vehicles—continuously learning new processes and landscapes to optimize safety, performance, and maintenance
Smart scheduling—increasing coordination among nearly infinite variables
Smart mobility systems—using predictive optimization to maintain efficiency, safety, and accuracy
IoT & IIoT—generating insights to identify patterns and predict behavior

Data science today

MongoDB enables a more intuitive process for data management and exploration by simplifying and enriching data. Iguazio helps turn data into smarter insights by simplifying organizations’ modernization into machine learning, AI, and the broader spectrum of data science—and we’ve only just scratched the surface. To learn more about how Iguazio and MongoDB together can transform your data processes into intelligent data science, check out our joint webinar discussing multiple client use cases.

MongoDB and modernization

To learn more about MongoDB’s overall modernization strategy for moving from a legacy RDBMS to MongoDB Atlas, read here.

December 2, 2020

MongoDB Atlas Arrives in Italy

We’re delighted to announce our first foray into Italy with the launch of MongoDB Atlas on the AWS Europe (Milan) region. MongoDB Atlas is now available in 20 AWS regions around the world, including 6 European regions. Milan is a Recommended Region, meaning it has three Availability Zones (AZs). When you deploy a cluster in Milan, Atlas automatically distributes replicas across the different AZs for higher availability—if there’s an outage in one zone, the Atlas cluster will automatically fail over and keep running in the other two. You can also deploy multi-region clusters with the same automatic failover built in. We’re excited that, like customers in France, Germany, the UK, and beyond, Italian organizations will now be able to keep data in-country, delivering low-latency performance and ensuring confidence in data locality. We’re confident our Italian customers in government, financial services, and utilities in particular will appreciate this capability as they build tools to improve citizens’ lives and better serve their local users. Explore Atlas on AWS Today

November 4, 2020

Accelerating Mainframe Offload to MongoDB with TCS MasterCraft™

Tata Consultancy Services (TCS), a leading multinational information technology services and consulting company, leverages its IP-based solutions to accelerate and optimize service delivery. TCS MasterCraft™ TransformPlus uses intelligent automation to modernize and migrate enterprise-level mainframe applications to new, leading-edge architectures and databases like MongoDB. In this blog, we’ll review the reasons why organizations choose to modernize and how TCS has made the process easy and relatively risk-free.

Background: Legacy Modernization

Legacy modernization is a strategic initiative that enables you to refresh your existing database and application portfolio by applying the latest innovations in development methodologies, architectural patterns, and technologies.

At the current churn rate, about half of today’s S&P 500 firms will be replaced over the next 10 years
$100T of economic value is ready to be unlocked over the next decade via digital transformation
(Source)

Legacy System Challenges

Legacy technology platforms of the past, particularly monolithic mainframe systems, have always been challenged by the pace of disruptive digitalization. Neither the storage nor the accessibility of these rigid systems is agile enough to meet the increasing demands of volume, speed, and data diversity generated by modern digital applications. The result is noise between the legacy system of record and the digital systems of engagement. This noise puts companies at a competitive disadvantage. It often manifests as a gap between customer service and user experience, impeding the delivery of new features and offerings and constraining the business from responding nimbly to changing trends.

Operational costs of mainframe and other legacy systems have also skyrocketed. With each million instructions per second (MIPS) costing up to $4,000 per year, these older systems can create the equivalent of nearly 40% of an organization’s IT budget in technical debt, significantly increasing the overall annual run cost. And as qualified staff age and retire over the years, it’s becoming harder to find and hire people with the required mainframe skills. To manage MIPS consumption, a large number of our customers are offloading commonly accessed mainframe data to an independent operational data layer (ODL), to which queries are redirected from consuming applications (a minimal sketch of this read path follows below). IT experts understand both the risk and the critical need to explore modernization options like encapsulation, rehosting, replatforming, refactoring, re-architecting, or rebuilding to replace these legacy systems. The key considerations when choosing an approach are familiar: risk of business disruption, cost, timelines, productivity, and the availability of the necessary skills.

MongoDB + TCS MasterCraft™ TransformPlus = Transformation Catalyst

To stay competitive, businesses need their engineering and IT teams to do these three things, among others:

Build innovative digital apps fast
Use data as a competitive moat to protect and grow the business
Lower cost and risk while improving customer experience

Some customers use a “lift and shift” approach to move workloads off the mainframe to the cloud for immediate savings, but that process can’t unlock the value that comes with microservice architectures and document databases. Others gain that value by re-architecting and rewriting their applications, but this approach can be time-consuming, expensive, and risky.
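Here is the minimal sketch of the ODL read path promised above. It is a generic illustration in Python, not TCS MasterCraft™ output; the mainframe accessor and every database, collection, and field name are hypothetical stand-ins.

```python
# Illustrative ODL read path: serve reads from MongoDB first and touch the
# mainframe only on a miss, then cache the result. All names hypothetical.
from pymongo import MongoClient

odl = MongoClient("mongodb+srv://<cluster-uri>")["odl"]["accounts"]

def fetch_from_mainframe(account_id: str) -> dict:
    # Stand-in for the existing mainframe access (e.g., via CICS or MQ).
    return {"account_id": account_id, "balance": 0.0}

def get_account(account_id: str) -> dict:
    doc = odl.find_one({"account_id": account_id}, {"_id": 0})
    if doc is not None:
        return doc  # served from the ODL: no MIPS consumed on the mainframe
    record = fetch_from_mainframe(account_id)
    odl.update_one({"account_id": account_id}, {"$set": record}, upsert=True)
    return record
```

Every read served from the ODL is a query the mainframe never sees, which is where the MIPS savings come from. Building and feeding such a layer by hand is still substantial work, which is where tooling comes in.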
More and more, customers are using a tools-driven refactoring approach to intelligently automate code conversion.

What TCS MasterCraft™ TransformPlus Brings to the Table

TCS MasterCraft™ TransformPlus automates the migration of legacy applications and databases to modern architectures like MongoDB. It extracts business logic from decades-old legacy mainframe systems into a convertible NoSQL document data model ready for deployment. This makes extraction faster, easier, and more economical, and reduces the risk that comes with rewriting legacy applications. With more than 25 years of experience, TCS’s track record includes:

60+ modernization projects successfully delivered
500M+ lines of COBOL code analyzed
25M+ lines of COBOL code converted to Java
50M+ new lines of Java code auto-generated

What MongoDB Brings to the Table

MongoDB’s document data model platform can help make development cycles up to five times faster. Businesses can drive innovation faster, cut costs by 70% or more, and reduce their risk at the same time. As a developer, MongoDB gives you:

The best way to work with data
The ability to put data where you need it
The freedom to run anywhere

Why is TCS Collaborating with MongoDB for Mainframe Offload?

Cost. Redirecting queries away from the mainframe to the ODL significantly reduces costs. Even cutting just 20%-30% of MIPS consumption can save millions of dollars in mainframe operating costs.

Agility. As an ODL built on a modern data platform, MongoDB helps developers build new apps and digital experiences 3-5 times faster than is possible on a mainframe.

User Experience. MongoDB meets demands for exploding data volumes and user populations by scaling out on commodity hardware, with self-healing replicas that maintain 24x7 service.

More details can be found here.

How TCS MasterCraft™ Accelerates Mainframe Offload to MongoDB

Data Migration

Configures the target document schema corresponding to the relational schema
Automatically transforms relational data from mainframe sources into MongoDB documents
Loads data to MongoDB Atlas with the latest connector support

Application Migration

Facilitates a cognitive, code-analysis-based application knowledge repository
Ensures complete, comprehensive application knowledge extraction
Automates conversion of application logic from COBOL to Java, with the data access layer reading data from MongoDB
Splits monolithic code into multiple microservices
Automates migration of mainframe screens to an AngularJS-based UI

Together, TCS MasterCraft™ TransformPlus and MongoDB can simplify and accelerate your journey to the cloud, streamlining and protecting your data while laying the foundation for digital success. Download the Modernization Guide to learn more.

October 28, 2020