How to Migrate Your Node.js App from SQL to MongoDB
With the rapid adoption of agile practices in software development, many teams are looking to move from SQL to NoSQL databases for greater schema flexibility and for horizontal scaling, which is cheaper and more efficient.
Migrating from a SQL database to a NoSQL database can be challenging because of differences in data models, query languages, access patterns, and query optimization.
In this article, we'll explore key MongoDB concepts like collections, documents, and BSON and the differences in data modeling between SQL and NoSQL databases. We'll also migrate from a relational database (Postgres) to MongoDB.
Below are a few prerequisites that you'll need before moving forward with this article:
- Knowledge of JavaScript
- Experience building Node.js applications
- Experience with SQL and PostgreSQL databases
- Node.js ≥v20 installed
In this section, we'll migrate the database and queries of a Node.js backend application from Postgres to MongoDB. The application will be a public phonebook app with the following features:
- User registration and login: Passwords are hashed using bcrypt, and JWT is used for authentication.
- CRUD operations:
  - Create contact: Authenticated users can create contacts.
  - Get contacts: Get contacts with call rate count and user details, filtered by date range and ordered by call rate count.
  - Get contact by ID: Retrieve a single contact by ID for the authenticated user.
  - Update contact: Authenticated users can update their contacts.
  - Delete contact: Authenticated users can delete their contacts.
Open `psql` and run the following command to create the required tables for the application:

```sql
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  username VARCHAR(100) UNIQUE NOT NULL,
  password VARCHAR(100) NOT NULL,
  email VARCHAR(100) UNIQUE NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE contacts (
  id SERIAL PRIMARY KEY,
  name VARCHAR(100) NOT NULL,
  phone VARCHAR(20) NOT NULL,
  email VARCHAR(100),
  user_id INTEGER REFERENCES users(id) ON DELETE CASCADE,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE rates (
  id SERIAL PRIMARY KEY,
  description TEXT NOT NULL,
  contact_id INTEGER REFERENCES contacts(id) ON DELETE CASCADE,
  user_id INTEGER REFERENCES users(id) ON DELETE CASCADE,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```
Create a Node.js project and install the following dependencies:
```json
"dependencies": {
  "bcryptjs": "^2.4.3",
  "express": "^4.19.2",
  "jsonwebtoken": "^9.0.2",
  "pg": "^8.11.5"
}
```
You can install all the required packages with the following command:
```shell
npm install bcryptjs express jsonwebtoken pg
```
Create a `.env.postgres` file and add the following environment variables. You will need to change the values to the credentials for your own database.

```shell
DB_USER=your_username
DB_HOST=localhost
DATABASE=your_DB_name
DB_PASSWORD=your_password
DB_PORT=5432
SECRET_KEY=your_jwt_secret_key
```
Create a `postgres.js` file in the project root directory and add the following:

```javascript
import express from 'express';
import bcrypt from 'bcryptjs';
import jwt from 'jsonwebtoken';
import pg from 'pg';

const { Pool } = pg;

const app = express();
const port = 3000;

app.use(express.json());

const pool = new Pool({
  user: process.env.DB_USER,
  host: process.env.DB_HOST,
  database: process.env.DATABASE,
  password: process.env.DB_PASSWORD,
  port: process.env.DB_PORT,
});

// JWT secret key
const SECRET_KEY = process.env.SECRET_KEY;

/* START FEATURE IMPLEMENTATION */

/* END FEATURE IMPLEMENTATION */

// Start server
app.listen(port, () => {
  console.log(`Server running on port ${port}`);
});
```
Ensure your package.json includes `"type": "module"`.

We've successfully set up a Node.js backend application and configured the Postgres database connection using the Express framework.
You can start your server by running:
```shell
node --env-file=.env.postgres postgres.js
```
Using the `--env-file` flag injects the environment variables, so you'll be able to use them through `process.env`.

Let's proceed with implementing the features of the application.
User registration:
```javascript
app.post('/register', async (req, res) => {
  const { username, email, password } = req.body;
  const salt = await bcrypt.genSalt(10);
  const hashedPassword = await bcrypt.hash(password, salt);
  const result = await pool.query(
    'INSERT INTO users (username, email, password) VALUES ($1, $2, $3) RETURNING *',
    [username, email, hashedPassword]
  );
  const user = result.rows[0];

  res.status(201).json({ ...user, password: null });
});
```
The user registration endpoint handles passwords securely, using `bcrypt` to hash the password before executing a SQL query that inserts the new user into the `users` table via the connection pool.

User login:
```javascript
app.post('/login', async (req, res) => {
  const { email, password } = req.body;

  const user = await pool.query('SELECT * FROM users WHERE email = $1', [email]);

  let validPass = false;

  if (user.rows[0]) validPass = await bcrypt.compare(password, user.rows[0].password);

  if (!validPass) return res.status(400).json({ message: 'Invalid credentials' });

  const token = jwt.sign({ id: user.rows[0].id, email: user.rows[0].email }, SECRET_KEY, { expiresIn: '1h' });

  res.status(200).json({ token });
});
```
The user login endpoint ensures secure handling of user authentication using bcrypt for password validation and JWT for session management.
Middleware to authenticate users:
```javascript
const authenticateToken = (req, res, next) => {
  const token = req.header('Authorization');
  if (!token) return res.status(401).json({ message: 'Access Denied' });
  try {
    const verified = jwt.verify(token.split(' ')[1], SECRET_KEY);
    req.user = verified;
    next();
  } catch {
    // jwt.verify throws on an invalid or expired token
    res.status(401).json({ message: 'Invalid Token' });
  }
};
```
This middleware function ensures that only requests with valid JWT tokens can access protected routes, enhancing the application's security.
Create contact:
```javascript
app.post('/contacts', authenticateToken, async (req, res) => {
  const { name, phone, email } = req.body;
  const result = await pool.query(
    'INSERT INTO contacts (name, phone, email, user_id) VALUES ($1, $2, $3, $4) RETURNING *',
    [name, phone, email, req.user.id]
  );
  res.status(201).json(result.rows[0]);
});
```
The create contact endpoint creates new contacts while ensuring only authenticated users can perform this action.
Get contact by ID:
```javascript
app.get('/contacts/:id', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const result = await pool.query('SELECT * FROM contacts WHERE id = $1 AND user_id = $2', [id, req.user.id]);
  if (result.rows.length === 0) return res.status(404).json({ message: 'Contact not found' });
  res.status(200).json(result.rows[0]);
});
```
The get contact by ID endpoint allows only authenticated users to fetch a specific contact by its ID and checks that it belongs to the authenticated user.
Update contact:
```javascript
app.put('/contacts/:id', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const { name, phone, email } = req.body;
  const result = await pool.query(
    'UPDATE contacts SET name = $1, phone = $2, email = $3 WHERE id = $4 AND user_id = $5 RETURNING *',
    [name, phone, email, id, req.user.id]
  );
  if (result.rows.length === 0) return res.status(404).json({ message: 'Contact not found or not authorized' });
  res.status(200).json(result.rows[0]);
});
```
The update contact endpoint allows only authenticated users to access and update their contacts.
Delete contact:
```javascript
app.delete('/contacts/:id', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const result = await pool.query('DELETE FROM contacts WHERE id = $1 AND user_id = $2 RETURNING *', [id, req.user.id]);
  if (result.rows.length === 0) return res.status(404).json({ message: 'Contact not found or not authorized' });
  res.status(200).json(result.rows[0]);
});
```
The delete contact endpoint allows only authenticated users to delete their contacts.
Add call rate:
```javascript
app.post('/contacts/:id/rates', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const { description } = req.body;
  // Note: the column is named "description", matching the CREATE TABLE rates statement above.
  const result = await pool.query(
    'INSERT INTO rates (description, contact_id, user_id) VALUES ($1, $2, $3) RETURNING *',
    [description, id, req.user.id]
  );
  res.status(201).json(result.rows[0]);
});
```
The add call rate endpoint allows only authenticated users to rate a specific contact by its ID.
The rates are associated with contacts in a one-to-many relationship: the `contact_id` column in the `rates` table is a foreign key that references the `id` column in the `contacts` table, linking each rate to a specific contact.
Having a defined relationship allows for efficient queries that can join two or more tables.
For example, you can quickly fetch a contact along with all its associated rates or aggregate data, like the number of rates per contact.
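As a rough in-memory illustration of what such a join computes, here is the rates-per-contact aggregation sketched with plain JavaScript arrays standing in for the two tables (the sample rows are made up for illustration):

```javascript
// In-memory stand-ins for the contacts and rates tables (sample data only).
const contacts = [
  { id: 1, name: 'Ada' },
  { id: 2, name: 'Grace' },
];
const rates = [
  { id: 10, contact_id: 1 },
  { id: 11, contact_id: 1 },
  { id: 12, contact_id: 2 },
];

// Roughly equivalent to:
//   SELECT contacts.*, COUNT(rates.id) AS call_rate
//   FROM contacts LEFT JOIN rates ON contacts.id = rates.contact_id
//   GROUP BY contacts.id
const withCallRate = contacts.map((c) => ({
  ...c,
  call_rate: rates.filter((r) => r.contact_id === c.id).length,
}));

console.log(withCallRate);
// Ada has a call_rate of 2, Grace has a call_rate of 1
```

The database does this far more efficiently than application code can, which is exactly why the foreign key relationship is worth defining.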
Get contacts:
```javascript
app.get('/contacts', async (req, res) => {
  let { startDate, endDate } = req.query;
  if (!endDate) {
    endDate = new Date().toISOString();
  } else {
    endDate = new Date(endDate).toISOString();
  }

  if (!startDate) {
    const now = new Date();
    const pastDate = new Date(now);
    pastDate.setDate(now.getDate() - 7);
    startDate = pastDate.toISOString();
  } else {
    startDate = new Date(startDate).toISOString();
  }

  const result = await pool.query(`
    SELECT
      contacts.id, contacts.name, contacts.phone, contacts.created_at,
      users.username,
      COUNT(rates.id) AS call_rate
    FROM contacts
    LEFT JOIN users ON contacts.user_id = users.id
    LEFT JOIN rates ON contacts.id = rates.contact_id
    WHERE contacts.created_at BETWEEN $1 AND $2
    GROUP BY contacts.id, users.username
    ORDER BY call_rate DESC
  `, [startDate, endDate]);
  res.status(200).json(result.rows);
});
```
The get contacts endpoint retrieves and aggregates contact data, ensuring that only contacts within the specified date range are included in the results. It also provides useful information about the number of calls associated with each contact, ordered by the number of call ratings.
- `SELECT` clause: specifies the columns to include in the result set.
- `FROM` clause: specifies the table from which to select data.
- `LEFT JOIN users ON contacts.user_id = users.id`: joins the `contacts` table with the `users` table where the `user_id` in the contacts table matches the `id` in the users table.
- `LEFT JOIN rates ON contacts.id = rates.contact_id`: joins the `contacts` table with the `rates` table where the `id` in the contacts table matches the `contact_id` in the rates table.
- `WHERE contacts.created_at BETWEEN $1 AND $2`: filters the contacts to only those created between the dates specified by `$1` (startDate) and `$2` (endDate).
- `GROUP BY contacts.id, users.username`: groups the results by `contacts.id` and `users.username`. This is necessary because we are using an aggregate function (`COUNT(rates.id)`), and SQL requires all non-aggregated columns in the `SELECT` clause to appear in the `GROUP BY` clause.
- `ORDER BY call_rate DESC`: orders the results by `call_rate` in descending order, so contacts with the most rates (or calls) appear first.
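The date-window defaulting at the top of this handler can be factored into a small helper. Here is a sketch (the `resolveDateRange` name is ours, not from the app; we default a missing start to seven days before the end of the window, a slight simplification of the handler's now-based default):

```javascript
// Resolve the [startDate, endDate] window used when listing contacts.
// Missing endDate defaults to now; missing startDate defaults to seven
// days before the end of the window.
function resolveDateRange(startDate, endDate) {
  const end = endDate ? new Date(endDate) : new Date();
  let start;
  if (startDate) {
    start = new Date(startDate);
  } else {
    start = new Date(end);
    start.setDate(start.getDate() - 7);
  }
  return { startDate: start.toISOString(), endDate: end.toISOString() };
}
```

Extracting this logic also makes it reusable in the MongoDB version of the endpoint later, since the date handling is identical on both sides of the migration.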
To test out your implementation, don't forget to stop and restart your server.
As we can see, PostgreSQL follows a relational data model, while MongoDB is a NoSQL, document-oriented database, so we need to redesign our schema to fit MongoDB's document-based model. Next, we'll identify the entities and relationships in our PostgreSQL schema and map them to MongoDB collections and documents.
To set up MongoDB for use with our Node.js application, we'll use MongoDB Atlas, a multi-cloud database service that simplifies deploying and managing your databases with just a few clicks.
It also offers a free-forever tier with no credit card requirements. How cool is that?
To create a database cluster, follow the official docs.
Choose the shared cluster option for a free-forever tier. The default configuration is sufficient for our demo application.
When you create your database cluster, you'll be prompted to add a database user and password. Save the username and password somewhere you can reference them later, as you'll need them to connect to the cluster.
Create a Node.js project and install the following dependencies:
```json
"dependencies": {
  "express": "^4.19.2",
  "mongodb": "^6.5.0",
  "bcryptjs": "^2.4.3",
  "jsonwebtoken": "^9.0.2"
}
```
You can install the libraries with:
```shell
npm install mongodb express bcryptjs jsonwebtoken
```
Create a `.env.mongodb` file and add the following environment variables:

```shell
SECRET_KEY=your_jwt_secret_key
DB_USER=your_cluster_username
DB_PASSWORD=your_cluster_password
DATABASE=users
CLUSTER_URI=your_cluster_uri
```
Create a `mongodb.js` file in the project root directory and add the following:

```javascript
import { MongoClient, ObjectId } from 'mongodb';
import express from 'express';
import bcrypt from 'bcryptjs';
import jwt from 'jsonwebtoken';

const app = express();
app.use(express.json());

const url = `mongodb+srv://${process.env.DB_USER}:${process.env.DB_PASSWORD}@${process.env.CLUSTER_URI}`;
const client = new MongoClient(url);
const dbName = process.env.DATABASE;
const SECRET_KEY = process.env.SECRET_KEY;

await client.connect();
const db = client.db(dbName);
const usersCollection = db.collection('users');
const contactsCollection = db.collection('contacts');

/* START FEATURE IMPLEMENTATION */

/* END FEATURE IMPLEMENTATION */

app.listen(3000, () => {
  console.log('Server is running on port 3000');
});
```
To use top-level await, you'll need to enable ES module support, so ensure that your package.json has `"type": "module"`.

The above code allows your Node.js app to connect to your database cluster.
Run the following command to start your Node server:
```shell
node --env-file=.env.mongodb mongodb.js
```
There are two approaches here:
- Translate the SQL queries to the MongoDB Query API by hand.
- Use the MongoDB Query Converter to migrate your SQL queries to the MongoDB Query API.
The MongoDB Node.js driver is a package that allows you to connect to MongoDB, execute database operations, and manage collections and documents. It provides a straightforward, schemaless solution to model your application data in a flexible manner.
The first thing to consider for a database migration is the data model. Remember how we created tables and relationships for the Postgres database by running SQL like the following in the `psql` terminal:

```sql
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  username VARCHAR(100) UNIQUE NOT NULL,
  password VARCHAR(100) NOT NULL,
  email VARCHAR(100) UNIQUE NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```
Here is the good part: Using the MongoDB Node.js driver, you do not have to define schemas. When you insert a document into a collection, it is created with the fields you provided with the insert method.
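To make that concrete, nothing stops two documents in the same collection from having different fields. The sketch below uses a plain array as a stand-in for a collection (the sample users are illustrative, not from the app):

```javascript
// A plain array standing in for a MongoDB collection: no schema is declared
// up front, and each "document" carries exactly the fields it was inserted with.
const usersCollection = [];
usersCollection.push({ username: 'ada', email: 'ada@example.com' });
usersCollection.push({ username: 'grace', email: 'grace@example.com', nickname: 'Amazing Grace' });

// Both documents live in the same collection even though their shapes differ.
console.log(usersCollection.map((u) => Object.keys(u).length)); // [ 2, 3 ]
```

This flexibility is convenient during a migration, but it also means your application code (or optional schema validation in MongoDB) becomes responsible for keeping document shapes consistent.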
Register:
```javascript
app.post('/register', async (req, res) => {
  const { username, email, password } = req.body;

  const salt = await bcrypt.genSalt(10);
  const hashedPassword = await bcrypt.hash(password, salt);
  const user = { username, email, password: hashedPassword };
  await usersCollection.insertOne(user);
  res.status(201).json({ ...user, password: null });
});
```
This endpoint is the MongoDB equivalent of the previous `register` endpoint designed for PostgreSQL. While the PostgreSQL implementation uses a SQL `INSERT` statement to add a new row to the table, the MongoDB implementation uses the driver's `insertOne` method to add a new document to the collection.

Login:
```javascript
app.post('/login', async (req, res) => {
  const { email, password } = req.body;
  const user = await usersCollection.findOne({ email });

  let validPass = false;

  if (user) validPass = await bcrypt.compare(password, user.password);

  if (!validPass) return res.status(400).json({ message: 'Invalid credentials' });

  const token = jwt.sign({ sub: user._id, email: user.email }, SECRET_KEY, { expiresIn: '1h' });
  res.status(200).json({ token });
});
```
This endpoint is the MongoDB equivalent of the previous `/login` endpoint designed for PostgreSQL. While the PostgreSQL implementation uses `pool.query` to execute a SQL `SELECT` statement to find a user row in the table, the MongoDB implementation uses `usersCollection.findOne` to find a user document in the collection.

In MongoDB, the `findOne` method takes a JSON-like object that specifies the query condition, analogous to the `WHERE` clause in SQL. Likewise, the `users` table being queried in SQL corresponds to the `users` collection in MongoDB.
Middleware to authenticate users:
```javascript
const authenticateToken = (req, res, next) => {
  const token = req.header('Authorization');
  if (!token) return res.status(401).json({ message: 'Access Denied' });
  try {
    const verified = jwt.verify(token.split(' ')[1], SECRET_KEY);
    req.user = verified;
    next();
  } catch {
    // jwt.verify throws on an invalid or expired token
    res.status(401).json({ message: 'Invalid Token' });
  }
};
```
Here, we're using the same function as we did in the previous example.
Create contact:
```javascript
app.post('/contacts', authenticateToken, async (req, res) => {
  const { name, phone, email } = req.body;
  // Store user_id as an ObjectId so it matches the _id field of the users collection.
  const contact = { name, phone, email, user_id: new ObjectId(req.user.sub), createdAt: new Date().toISOString() };
  await contactsCollection.insertOne(contact);
  res.status(201).json(contact);
});
```
This endpoint is the MongoDB version of the previous POST `/contacts` endpoint we designed for PostgreSQL. Instead of using `pool.query` to add a contact row to a table, it uses MongoDB's `contactsCollection.insertOne` method to add a contact document to a collection.

Converting `req.user.sub` to an `ObjectId` ensures that the user ID is stored in MongoDB's native ID format, so it can be efficiently queried, indexed, and matched against the `_id` field of the `users` collection. This matters because we'll later need to perform operations that match or reference this ID within the database.

Get contact by ID:
```javascript
app.get('/contacts/:id', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const contact = await contactsCollection.findOne({ _id: new ObjectId(id), user_id: new ObjectId(req.user.sub) });
  if (!contact) return res.status(404).json({ message: 'Contact not found' });
  res.status(200).json(contact);
});
```
This endpoint is the MongoDB equivalent of the previous GET `/contacts/:id` endpoint designed for PostgreSQL. While the PostgreSQL implementation uses `pool.query` to execute a SQL `SELECT` statement with a `WHERE` clause to find a contact row in the table, the MongoDB implementation uses `contactsCollection.findOne` to find and return the matching contact document.

PostgreSQL uses the `id` and `user_id` parameters directly in the SQL query, while MongoDB converts them to `ObjectId` values to match MongoDB's document ID format.

Update contact:
```javascript
app.put('/contacts/:id', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const { name, phone, email } = req.body;
  const contact = await contactsCollection.findOneAndUpdate(
    { _id: new ObjectId(id), user_id: new ObjectId(req.user.sub) },
    // Only set the fields we allow clients to update.
    { $set: { name, phone, email } },
    // Return the updated document rather than the original.
    { returnDocument: 'after' }
  );
  if (!contact) return res.status(404).json({ message: 'Contact not found or not authorized' });
  res.status(200).json(contact);
});
```
This endpoint is the MongoDB equivalent of the previous PUT `/contacts/:id` endpoint designed for PostgreSQL. While the PostgreSQL implementation uses `pool.query` to execute a SQL `UPDATE` statement with a `WHERE` clause to update the contact in the `contacts` table, the MongoDB implementation uses the `findOneAndUpdate` method with `$set` to update the document in the `contacts` collection.

Delete contact:
```javascript
app.delete('/contacts/:id', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const result = await contactsCollection.deleteOne({ _id: new ObjectId(id), user_id: new ObjectId(req.user.sub) });
  // deleteOne reports how many documents were removed via deletedCount.
  if (!result.deletedCount) return res.status(404).json({ message: 'Contact not found or not authorized' });
  res.status(200).json({ message: 'Contact deleted' });
});
```
This endpoint is the MongoDB equivalent of the previous DELETE `/contacts/:id` endpoint designed for PostgreSQL. While the PostgreSQL implementation uses `pool.query` to execute a SQL `DELETE` statement with a `WHERE` clause to delete the contact from the `contacts` table, the MongoDB implementation uses the `deleteOne` method to delete the document from the `contacts` collection.

In MongoDB, querying is done using JSON-like objects: the equivalent of the `WHERE` clause is the query object passed to methods like `findOne` and `deleteOne`.
Add call rate:
```javascript
app.post('/contacts/:id/rates', authenticateToken, async (req, res) => {
  const { id } = req.params;
  const { description } = req.body;
  const rate = { description, createdAt: new Date().toISOString(), user_id: new ObjectId(req.user.sub) };
  const result = await contactsCollection.updateOne(
    { _id: new ObjectId(id), user_id: new ObjectId(req.user.sub) },
    // Append the new rate to the contact's embedded rates array.
    { $push: { rates: rate } }
  );
  if (result.matchedCount === 0) return res.status(404).json({ message: 'Contact not found' });
  res.status(201).json(rate);
});
```
This endpoint is the MongoDB equivalent of the previous POST `/contacts/:id/rates` endpoint designed for PostgreSQL. While the PostgreSQL implementation uses `pool.query` to execute a SQL `INSERT INTO` statement to insert a rate row in the table, the MongoDB implementation uses `contactsCollection.updateOne` to update the contact document, pushing a new rate into an embedded `rates` array.

By embedding rates within contacts, you avoid the need for joins (like `$lookup`), which can improve query performance, especially for read-heavy workloads. All relevant information about a contact, including its rates, is now stored together in a single document.

Get contacts:
```javascript
app.get("/contacts", authenticateToken, async (req, res) => {
  let { startDate, endDate } = req.query;

  if (!endDate) {
    endDate = new Date().toISOString();
  } else {
    endDate = new Date(endDate).toISOString();
  }

  if (!startDate) {
    const now = new Date();
    const pastDate = new Date(now);
    pastDate.setDate(now.getDate() - 7);

    startDate = pastDate.toISOString();
  } else {
    startDate = new Date(startDate).toISOString();
  }

  const contacts = await contactsCollection
    .aggregate([
      {
        $match: {
          createdAt: {
            $gte: startDate,
            $lte: endDate,
          },
          user_id: new ObjectId(req.user.sub),
        },
      },
      {
        $lookup: {
          from: "users",
          localField: "user_id",
          foreignField: "_id",
          as: "user",
        },
      },
      {
        $addFields: {
          call_rate: { $size: { $ifNull: ["$rates", []] } },
        },
      },
      { $sort: { call_rate: -1 } },
    ])
    .toArray();

  res.status(200).json(contacts);
});
```
This endpoint is the MongoDB equivalent of the previous GET `/contacts` endpoint designed for PostgreSQL. While the PostgreSQL implementation uses a SQL query with `JOIN` statements, a `WHERE` clause, `GROUP BY`, and `ORDER BY` to match, join, aggregate, and sort rows from the `contacts` table, the MongoDB implementation uses the `aggregate` method to perform similar operations on the `contacts` collection.

An aggregation pipeline in MongoDB is a framework that passes documents through a multi-stage pipeline that can transform them and output aggregated results. Each stage performs an operation on its input documents and passes the result to the next stage.
This aggregation pipeline filters contacts by creation date and the authenticated user's ID, joins the `contacts` collection with the `users` collection to fetch user details, adds a new `call_rate` field that counts the embedded rates on each contact, and sorts the contacts by `call_rate` in descending order.
Compared to the SQL version, only one `$lookup` is needed because the rates are embedded in each contact document, which reduces the work the query has to do.
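To build intuition for how the stages compose, each one maps cleanly onto a familiar array operation. This in-memory sketch (sample documents, not real data from the cluster) mirrors the `$match` → `$addFields` → `$sort` flow of the pipeline:

```javascript
// Sample documents standing in for the contacts collection (illustrative only).
const docs = [
  { name: 'Ada', createdAt: '2024-05-02T00:00:00.000Z', rates: [{}, {}] },
  { name: 'Grace', createdAt: '2024-05-03T00:00:00.000Z', rates: [{}, {}, {}] },
  { name: 'Old', createdAt: '2023-01-01T00:00:00.000Z' },
];
const startDate = '2024-05-01T00:00:00.000Z';
const endDate = '2024-05-31T00:00:00.000Z';

const result = docs
  // $match: keep documents whose createdAt falls inside the window
  // (ISO 8601 strings compare correctly as plain strings)
  .filter((d) => d.createdAt >= startDate && d.createdAt <= endDate)
  // $addFields with $size / $ifNull: count embedded rates, defaulting to []
  .map((d) => ({ ...d, call_rate: (d.rates ?? []).length }))
  // $sort: { call_rate: -1 }
  .sort((a, b) => b.call_rate - a.call_rate);

console.log(result.map((d) => d.name)); // [ 'Grace', 'Ada' ]
```

The real pipeline runs server-side, can use indexes for `$match`, and streams documents between stages, but the stage-by-stage logic is the same.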
Now, let's talk about using the MongoDB Query Converter to migrate your SQL queries to the MongoDB Query API.
MongoDB provides a tool called Relational Migrator which allows you to migrate your SQL queries to the MongoDB Query API with ease. It also addresses common migration and data modeling challenges to provide a smooth transition to MongoDB.
Download Relational Migrator to try it out yourself.
Before using the query converter, you must sign in to your Atlas account in Relational Migrator. For details, see Log In with Atlas.
- Open the Relational Migrator app on your PC and log in with Atlas.
- Connect to your Postgres database.
- Select the tables that you want to migrate.
- Define your schema.
Then, name your project.
Generate the MongoDB schema from the imported Postgres database tables.
From the Code Generation tab, click the Query Converter pane.
Click the Paste SQL query button to access the query converter editor.
Here, you can enter the SQL queries you want to convert and click Convert to convert them to MongoDB queries.
MongoDB Compass is a graphical user interface for MongoDB and it's completely free to install on Mac, Windows, and Linux. You can import JSON or CSV data to MongoDB via MongoDB Compass.
To import data to a collection, connect to your database cluster using the connection string from Atlas, create a database and collection, and navigate to the detailed collection view by selecting the collection from the Databases tab or clicking the collection in the left-side navigation.
Next, click the Add Data dropdown, select Import JSON or CSV file, select the appropriate file type, and click Import. After successful import, the dialog closes and Compass displays the collection page containing the newly imported documents.
To import data using `mongoimport`, install the MongoDB Database Tools, a suite of command-line utilities for working with MongoDB, and then run `mongoimport` from the system command line.

Here is an example of importing data via `mongoimport` (the connection string is quoted so the shell doesn't interpret the `&` characters):

```shell
mongoimport "mongodb+srv://<cluster-username>:<cluster-password>@cluster0.33t5arb.mongodb.net/Node-API?retryWrites=true&w=majority&appName=Cluster0" --collection=users --file=users.json
```
Here, we use `mongoimport` to import the JSON data from the users.json file into the `users` collection in the Node-API database.

To import data using the VS Code extension, go to the Extensions tab in VS Code and search for MongoDB. You should see the MongoDB for VS Code extension.
Once you successfully install it, you'll see a leaf icon in your VS Code sidebar. Click on it and under connections, add a new connection. Then, click Connect with connection string and pass in your database cluster connection string.
You should see your cluster. Hover over your cluster and click the plus icon to create a new database. This will generate a playground file where you can add the following script to create a database and collection, and import data:
```javascript
// Select the database to use.
use('database-name');

// Insert documents into the users collection.
db.getCollection('users').insertMany([/* paste the contents of your JSON file here */]);
```
Use Postman or any API testing tool to test all the updated endpoints with the migrated queries to ensure the migration works as expected.
Migrating a Node.js application from a SQL database to MongoDB can be seamless and efficient when approached with a well-structured resource. In this tutorial, we took a look at setting up MongoDB, migrating the SQL queries, and performing data migration with MongoDB Compass, MongoImport, and the MongoDB for VS Code extension.
By following the structured approach in the tutorial, you can successfully migrate your Node.js application from a SQL database to MongoDB.