TL;DR
By 2040, storing digital data is projected to account for 14% of the world's greenhouse emissions. I show how to automatically archive cold MongoDB data to an AWS S3 bucket with serverless triggers and keep querying it through MongoDB Atlas Data Lake.
Thanks for coming to my talk, Save The World And Money With MongoDB Atlas Data Lake.
About
Data centers are expensive, and it turns out they are not great for the environment either. By 2040, storing digital data is set to create 14% of the world's greenhouse emissions. As a developer, you probably work with a lot of data, and your clusters balloon and become more expensive every day. Now is the time to be a hero: save the world and your wallet.
In this live-coding session, I show you how to automatically archive your cold MongoDB data to an AWS S3 bucket using serverless Atlas Triggers. I also demonstrate how to keep querying the archived data, with zero downtime, using MongoDB Atlas Data Lake.
You will walk away from this session with a clear understanding of data lakes, their features, and their capabilities. Join this session and be equipped to save the world.
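The core of the archiving step is deciding which documents are "cold" and serializing them into a format Data Lake can read straight from S3. Here is a minimal Python sketch of that selection logic; the `ts` field name, the 90-day window, and the newline-delimited JSON layout are my assumptions for illustration, not details from the talk (the session itself implements this inside an Atlas Trigger function):

```python
from datetime import datetime, timedelta, timezone
import json

ARCHIVE_AFTER_DAYS = 90  # hypothetical retention window


def split_cold_documents(documents, now, max_age_days=ARCHIVE_AFTER_DAYS):
    """Partition documents into (hot, cold) by their 'ts' timestamp.

    'cold' documents are older than max_age_days and become candidates
    for archiving to S3; 'hot' documents stay in the live cluster.
    """
    cutoff = now - timedelta(days=max_age_days)
    hot = [d for d in documents if d["ts"] >= cutoff]
    cold = [d for d in documents if d["ts"] < cutoff]
    return hot, cold


def to_s3_body(cold_docs):
    """Serialize cold documents as newline-delimited JSON, a format
    that Atlas Data Lake can query directly from an S3 bucket."""
    return "\n".join(
        json.dumps({**d, "ts": d["ts"].isoformat()}) for d in cold_docs
    )


# Example: two sensor readings, one recent and one from last year.
now = datetime(2021, 6, 1, tzinfo=timezone.utc)
docs = [
    {"sensor": "a", "temp": 21.5, "ts": now - timedelta(days=1)},
    {"sensor": "b", "temp": 19.0, "ts": now - timedelta(days=365)},
]
hot, cold = split_cold_documents(docs, now)
print(len(hot), len(cold))  # 1 1
```

In the real setup, a scheduled trigger would run this kind of query against the cluster, upload the serialized body to S3, and delete the archived documents, after which Data Lake federates queries across both the cluster and the bucket.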
Source Code
- GitHub - Save The World And Money With MongoDB Data Lake
- GitHub - MongoDB IoT Sample Data Generator
Talk Slides
Video
Related Links
- Archiving a MongoDB Cluster
- MongoDB Atlas Data Lake Landing Page
- Want to learn more about MongoDB? Check out MongoDB University.
- Have a question or issue with MongoDB? Join the MongoDB Community Forums and ask a question.
- GitHub - Save The World And Money With MongoDB Data Lake
- GitHub - MongoDB IoT Sample Data Generator