
Migrate to Your Own S3 Bucket

On this page

  • Migrate to S3
  • Create an S3 Bucket
  • Migrate Existing Data
  • Update Application Code
  • Shut Down Atlas App Services Hosting
  • Access S3 Bucket from Atlas Functions
  • Integrate with Atlas Functions
  • Write Functions to Access an S3 Bucket

Important

Always refer to the official documentation of both MongoDB Atlas and AWS for the most up-to-date and accurate information. The specific steps may vary depending on the details of your project and the technologies used.

Migrate to S3

If you are solely using the MongoDB Atlas static hosting service as a blob store for static content, and are not hosting a client application, follow the steps below to migrate from Atlas Hosting to your own S3 bucket.

1

Create an S3 Bucket

Set up your own S3 bucket in your AWS account if you haven't already done so. Configure the bucket settings according to your requirements, including access permissions and encryption options.
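For example, here is a minimal sketch using the AWS CLI; the bucket name and region are placeholders, and your access and encryption settings will depend on how the assets are consumed:

# Create the bucket (regions other than us-east-1 also need --create-bucket-configuration)
aws s3api create-bucket --bucket your-bucket-name --region us-east-1

# Enable default server-side encryption (SSE-S3)
aws s3api put-bucket-encryption --bucket your-bucket-name \
  --server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'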

2

Migrate Existing Data

Migrate your existing data from MongoDB Atlas Hosting to your S3 bucket. Depending on the amount of data and your specific requirements, this could involve exporting your hosted files with the App Services CLI (appservices pull --include-hosting) and then uploading them to S3 using tools like the AWS CLI or an AWS SDK.
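A hedged sketch of that flow with the App Services CLI and the AWS CLI; the --remote flag, the hosting/files directory layout of the pulled app, and the bucket name are assumptions to adapt to your app:

# Pull your app, including hosted files, into a local directory
appservices pull --remote=Your-App-ID --include-hosting

# Sync the pulled assets (typically under hosting/files) to your bucket
aws s3 sync ./Your-App-Name/hosting/files s3://your-bucket-name/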

3

Update Application Code

Modify your application code to interact with the S3 bucket instead of MongoDB Atlas Hosting. This includes updating any code that handles file uploads or downloads, for example if you currently manage those assets with App Services CLI commands.
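For instance, if your client previously requested assets from your App Services hosting domain, you might centralize URL construction; a hypothetical helper, with the bucket URL as a placeholder:

// Hypothetical helper: build the S3 URL for an asset formerly served by Atlas Hosting
const ASSET_BASE_URL = "https://your-bucket-name.s3.us-east-1.amazonaws.com";

function assetUrl(path) {
  // assetUrl("/images/logo.png") ->
  // "https://your-bucket-name.s3.us-east-1.amazonaws.com/images/logo.png"
  return `${ASSET_BASE_URL}/${path.replace(/^\//, "")}`;
}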

4

Shut Down Atlas App Services Hosting

Once you have verified that your files deploy successfully to your S3 bucket, delete your hosted files from your Atlas App Services app. As a reminder, hosting domains on Atlas App Services will no longer run starting on March 12, 2025.

Access S3 Bucket from Atlas Functions

You can access static content stored in your S3 bucket from Atlas Functions. To do so, you need to:

  1. Integrate with Atlas Functions

  2. Write Functions to access an S3 bucket

Important

Test and Validate

Test your Atlas Functions to ensure they can successfully access and manipulate objects in the S3 bucket. Validate that the functions behave as expected and handle errors gracefully.
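For example, a minimal smoke-test sketch using the version 2 SDK (see the note below); it assumes your credentials are stored as the Values awsAccessKeyId and awsSecretAccessKey, as in the examples that follow, and uses a placeholder bucket name:

exports = async function() {
  const AWS = require("aws-sdk"); // version 2 SDK
  const s3 = new AWS.S3({
    region: "us-east-1", // replace with your AWS region
    accessKeyId: context.values.get("awsAccessKeyId"),
    secretAccessKey: context.values.get("awsSecretAccessKey"),
  });
  try {
    // Succeeds only if the bucket exists and the credentials can reach it
    await s3.headBucket({ Bucket: "bucketName" }).promise();
    return { ok: true };
  } catch (err) {
    console.error("S3 access check failed:", err);
    return { ok: false, error: err.message };
  }
};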

Integrate with Atlas Functions

Upload to S3 using the @aws-sdk external dependency. Atlas App Services automatically transpiles dependencies and also supports most built-in Node.js modules.

To import and use an external dependency, you first need to add the dependency to your application. You can either add packages by name or upload a directory of dependencies.
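For example, to prepare a dependency directory locally for upload (a sketch; the upload itself happens in the App Services UI):

npm install aws-sdk   # version 2 SDK, per the note below
tar -czf node_modules.tar.gz node_modules/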

Important

AWS SDK Support

App Services does not yet support version 3 of the AWS SDK. Use the version 2 SDK (aws-sdk) when specifying the npm module.

Write Functions to Access an S3 Bucket

Write MongoDB Atlas Functions that interact with your S3 bucket. These functions can perform various operations such as uploading files, downloading files, listing objects, and deleting objects from the S3 bucket.

We'll cover some basic examples in this guide. You may want to explore other ways to interact with your S3 bucket by viewing the full list of client commands for the @aws-sdk/client-s3.

To authenticate AWS requests, store your Access Key ID and Secret Access Key as Values. You can then access them within functions and pass them to the SDK.

exports = async function() {
  // require calls must be inside the exported function
  const { S3Client, PutObjectCommand, GetObjectCommand } = require("@aws-sdk/client-s3");

  const s3Client = new S3Client({
    region: "us-east-1", // replace with your AWS region
    credentials: {
      accessKeyId: context.values.get("awsAccessKeyId"),
      secretAccessKey: context.values.get("awsSecretAccessKey"),
    },
  });

  // Write a small JSON document to the bucket
  const putCommand = new PutObjectCommand({
    Bucket: "bucketName",
    Key: "keyName",
    Body: EJSON.stringify({ hello: "world" }),
  });
  const putResult = await s3Client.send(putCommand);

  // Read the same object back
  const getCommand = new GetObjectCommand({
    Bucket: "bucketName",
    Key: "keyName",
  });
  const getResult = await s3Client.send(getCommand);
};
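Note that GetObjectCommand returns Body as a stream. In recent versions of the v3 SDK you can read it with transformToString; a short sketch continuing the example above:

// Read the object body back as a UTF-8 string
const text = await getResult.Body.transformToString();
console.log(text); // {"hello":"world"}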

Fetch static assets and upload them to S3 either by downloading them from a URL or by uploading them as local files.

To upload downloaded content to S3, use an HTTP library or built-in Node.js modules like http or https to download the static asset from its URL. You can then upload the downloaded content to S3.

Here's an example that downloads an asset with the axios library and streams the download to S3:

const axios = require('axios');
const AWS = require('aws-sdk'); // version 2 SDK (see the note above)

// Create the S3 client; credentials can also come from App Services Values
const s3 = new AWS.S3({ region: 'us-east-1' }); // replace with your AWS region

async function uploadAssetToS3() {
  try {
    // Download the asset as a stream so large files are not buffered in memory
    const response = await axios.get('URL_OF_YOUR_STATIC_ASSET', { responseType: 'stream' });
    const uploadParams = {
      Bucket: 'YOUR_BUCKET_NAME',
      Key: 'YOUR_OBJECT_KEY',
      Body: response.data,
    };
    // Upload the static asset to S3
    await s3.upload(uploadParams).promise();
    console.log('Static asset uploaded successfully');
  } catch (error) {
    console.error('Error uploading static asset:', error);
  }
}

uploadAssetToS3();

To upload a local asset to S3, use the following code snippet:

// `s3` is the AWS SDK v2 client created in the previous example
const uploadParams = {
  Bucket: 'YOUR_BUCKET_NAME',
  // Specify the name/key for the object in the bucket (usually the file name)
  Key: 'YOUR_OBJECT_KEY',
  // Provide the content of the local file, e.g. a Buffer from fs.readFileSync
  Body: 'STATIC_ASSET_CONTENT',
};

// Upload the static asset to S3
s3.upload(uploadParams, (err, data) => {
  if (err) {
    console.error('Error uploading static asset:', err);
  } else {
    console.log('Static asset uploaded successfully:', data.Location);
  }
});
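Listing and deleting objects follow the same pattern; a minimal sketch reusing the v2 s3 client from the examples above, with placeholder names:

async function listAndDelete() {
  // List up to 1,000 objects in the bucket
  const listed = await s3.listObjectsV2({ Bucket: 'YOUR_BUCKET_NAME' }).promise();
  listed.Contents.forEach((obj) => console.log(obj.Key, obj.Size));

  // Delete a single object by key
  await s3.deleteObject({ Bucket: 'YOUR_BUCKET_NAME', Key: 'YOUR_OBJECT_KEY' }).promise();
}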
