Install mongosh on your computer to follow along as you read this blog. If you’d prefer a pre-configured sandbox to experiment with, you can try the Docker image I put together while writing the blog.
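For reference, one documented way to install mongosh on macOS is through Homebrew; the connection string below is an assumption (a default local deployment), not the sandbox from the blog:

```
# macOS, via Homebrew (installers for Windows and Linux are also documented)
brew install mongosh

# connect to a local deployment
mongosh "mongodb://localhost:27017"
```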
I like colorful and informational prompts, which is why I am a happy oh-my-zsh user. When I use mongosh to work with my data, I want a quick overview of my connection status and information about the cluster I am connected to.
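One way to get that kind of at-a-glance status is mongosh's custom prompt support: assigning a function to `prompt` in `~/.mongoshrc.js` replaces the default prompt. The sketch below is my own minimal illustration, not the prompt from the blog; `db.getName()` and `db.hello()` are standard mongosh APIs, while the formatting is arbitrary:

```js
// ~/.mongoshrc.js — a minimal sketch of a status-aware mongosh prompt
prompt = () => {
  const dbName = db.getName();
  // db.hello() reports connection details; isWritablePrimary tells us
  // whether we are talking to a primary or a secondary
  const role = db.hello().isWritablePrimary ? 'primary' : 'secondary';
  return `${dbName} [${role}]> `;
};
```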
And if you like pretty icons as much as I do, configure your terminal application to use Nerd Fonts.
It’s even easier to do this if you rely on one of the npm packages built to make HTTP requests more convenient. In the example below, I am using node-fetch. To follow this example, you will need Node.js and npm installed on your computer.
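A minimal sketch of the idea: fetch JSON over HTTP and shape it for `insertMany()`. The URL and collection name are illustrative. This uses the Fetch API, which is global in Node 18+; on older Node versions, the node-fetch package provides the same interface (`const fetch = require('node-fetch')`):

```javascript
// Fetch a JSON array from an HTTP endpoint and return it as an
// array of documents ready for insertMany().
async function loadDocs(url) {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return res.json(); // assumes the endpoint returns a JSON array
}

// In mongosh, `db` refers to the current database, so you could then run:
// loadDocs('https://example.com/api/users').then(docs => db.users.insertMany(docs));
```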
Sometimes, you just want to generate large volumes of synthetic data for development and testing. One great module that helps with this is Falso.
This is something you could probably write a script for. The good news is that we have already scripted it for you, in the form of a shell snippet. Snippets are scripts that are packaged and stored in a registry to facilitate sharing and reuse. We have one dedicated to schema analysis.
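Installing and running it looks roughly like this inside mongosh (the collection name is illustrative; the `snippet install` command and the `schema()` helper come from mongosh's snippet registry):

```
// inside mongosh
snippet install analyze-schema
schema(db.users)   // summarizes field names and types across the collection
```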