
This project simplifies remote access to a MongoDB database. Instead of dealing with firewalls, port forwarding, and static IPs, you can deploy a Flask REST API to any cloud service and connect it to your MongoDB instance, enabling secure CRUD operations without worrying about mongodb:// connection limitations.


Flask MongoDB API & Client Libraries

A simple Flask API that acts as a secure middleman for MongoDB operations, paired with client libraries for Python and Node.js.

The primary goal of this project is to simplify remote access to a MongoDB database. Hosting a local MongoDB instance and making it accessible over the internet can be complicated, involving firewall rules, port forwarding, and static IP addresses. Additionally, using such an instance with the official client packages can be cumbersome, as they only support mongodb:// connections, not https://. This project provides an easy solution: deploy the Flask API to any cloud service, connect it to your MongoDB instance, and interact with your database via a simple REST API, giving you secure cloud access without the networking headaches. It also provides client-side libraries that talk to the API for you, so you don't have to write the HTTP logic yourself.

DISCLAIMER

This is a prototype project I created (I was bored) to solve a personal challenge: making a private MongoDB instance easily accessible from anywhere. While it's not designed for production environments, it serves as a practical demonstration of how an API can act as a gateway to a database, simplifying development and testing workflows.


Features

  • Full CRUD Operations: Create, Read, Update, and Delete for both single and multiple documents.
  • Database Management: Create and drop databases.
  • Collection Management: Drop collections and list all collections in a database.
  • Index Management: Create, list, and drop indexes on collections.
  • Size Calculation: Get the storage size of databases, collections, or individual documents.
  • Intuitive Clients: Object-oriented client libraries for Python and Node.js that mirror the syntax of popular database drivers.

Use Cases

  1. Simplify Remote Access to a Private MongoDB Instance: The primary use case. Run MongoDB on your local machine or a private server and use this API as a public-facing gateway. This avoids complex network configurations and provides a stable endpoint for your applications to connect to.

  2. Safe Development & Testing Environment: Test applications locally or remotely without risking your production database. The client libraries are API-compatible with MongoDB syntax, so switching between this backend and a real one requires minimal code changes.

  3. Learning & Prototyping Tool: Understand how database APIs, REST communication, and client abstractions work internally by studying or extending this project. Perfect for students or developers learning backend architecture.

  4. Mock Backend for Demos or CI: Use it to simulate database operations in continuous integration pipelines, demos, or proof-of-concept projects especially when you need a disposable, controlled environment.

How It Works

This project's architecture consists of three distinct components working in concert to provide secure and simple access to your database.

[Your Application (using a client library)] <---> [Flask API Backend] <---> [Your MongoDB Instance]
  1. The Flask API Backend (server.py): This is the core of the project. It's a Python Flask server that acts as a secure "middleman." It receives simple HTTP requests from a client, translates them into PyMongo database commands, executes them against your MongoDB instance, and returns the results as JSON. It is designed to be deployed to any cloud hosting service (e.g., Render, Heroku, etc.).

  2. The Client Libraries (Python & Node.js): These libraries are what you use in your own application code. They provide a clean, object-oriented interface (Client, Database, Collection) that mimics official MongoDB drivers. Instead of writing manual fetch or requests calls, you can write idiomatic database code, and the library handles all the backend API communication for you.

  3. Your MongoDB Instance: This is your actual database. It can be running on your local machine, a private server, or a cloud service like MongoDB Atlas. The Flask API is the only component that needs direct network access to it, shielding your database from the public internet.
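The request-translation step can be sketched in miniature. The following is an illustrative stand-in, not the project's actual server code: a plain dict plays the role of MongoDB, and the endpoint name plays the role of a Flask route, but the translation from a JSON payload to a database operation works the same way in server.py.

```python
# A minimal sketch of the gateway idea. A plain dict stands in for MongoDB,
# and the "endpoint" string stands in for a Flask route; the real server.py
# performs the same translation using Flask routes and PyMongo calls.
store = {}  # {collection_name: [documents]}

def handle(endpoint, payload):
    """Translate an API-style request into a database operation."""
    coll = store.setdefault(payload["collection"], [])
    if endpoint == "insert_one":
        coll.append(payload["document"])
        return {"inserted_id": len(coll) - 1}
    if endpoint == "find":
        flt = payload.get("filter") or {}
        docs = [d for d in coll if all(d.get(k) == v for k, v in flt.items())]
        return {"documents": docs}
    return {"error": "Not Found", "status": 404}

handle("insert_one", {"collection": "employees", "document": {"name": "Alice"}})
print(handle("find", {"collection": "employees", "filter": {"name": "Alice"}}))
# → {'documents': [{'name': 'Alice'}]}
```

Because every operation travels as JSON over HTTP, any client that can make HTTP requests can use the database, which is what makes the https:// gateway possible where mongodb:// connections are not.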

Setup Guide

Part 1: Deploy the Flask API Server

  1. Navigate to Firebase Studio.
  2. Create a new workspace. When prompted for a template, select MongoDB + Flask.
  3. Wait for the environment to be provisioned. This may take a few minutes.
  4. Once the workspace is ready, find the main.py file in the file explorer.
  5. Delete all the existing code in main.py and replace it with the entire contents of the server.py file provided in this project. The server should restart automatically with the new code.
  6. The Flask server needs to know where your MongoDB database is. You will need to set an environment variable named MONGO_URI with your MongoDB connection string. In Firebase Studio or other cloud environments, this can typically be set in a secrets manager or environment variable configuration panel. Your URI will look something like mongodb://user:password@host:port/.
  7. In the right sidebar, under the pre-installed "Firebase" extension, you will find the backend ports listed.
  8. Port 5000 should be listed. Click the lock icon and switch it to public.
  9. Click the copy icon to copy the public backend web address.
  10. Open the address in a browser to check that it is actually working. If you see a JSON response on the screen, you are good to go.
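The final check can also be scripted. Here is a small sketch using only the Python standard library; the URL is a placeholder that you would replace with the public address copied in the previous step:

```python
import json
import urllib.request

def is_healthy(body: bytes) -> bool:
    """Return True if the response body is JSON containing a 'message' key."""
    try:
        return "message" in json.loads(body.decode())
    except (ValueError, UnicodeDecodeError):
        return False

def check_server(url: str, timeout: float = 5.0) -> bool:
    """Fetch the server root and report whether it looks healthy."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return is_healthy(resp.read())

# Example (replace with the public URL you copied above):
# print(check_server("https://your-workspace-name-5000.app.googlestudio.dev"))
```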

Part 2: Use a Client Library

Choose the client library that matches your project's language.

Option A: Python Client

  1. On your local machine, create a new project directory.
  2. Inside your project, create a file (e.g., db_client.py) and paste the entire contents of the __init__.py file into it.
  3. Make sure you have the requests library installed. If not, run:
    pip install requests

Option B: NodeJS Client

  1. On your local machine, make sure you have axios installed in your project:
    npm install axios
  2. From the NodeJS folder in this repository, choose the file that matches your project's module system and copy it into your project:
    • db-client.ts: For TypeScript projects. This is the recommended version for its type safety.
    • module.js: For modern JavaScript projects using ES Modules (import/export syntax).
    • common.js: For traditional Node.js projects using CommonJS (require/module.exports syntax).

Comprehensive Python Usage Guide

This guide provides a detailed walkthrough of every feature in the Python client library, organized by the object you'll be working with: Client, Database, Collection, and the SizeResult helper.

1. The Client Object: Your Connection Gateway

The Client is the entry point for all interactions. It manages the connection to your Flask API and handles requests and errors.

1.1. Initialization and Connection Test

First, create a client instance with your server's full URL. The URL must start with http:// or https://. It's best practice to immediately ping the server to ensure it's reachable.

# Assuming the client code is in a file named db_client.py
from db_client import Client

# Replace with your server's public URL
SERVER_URL = "https://your-workspace-name-5000.app.googlestudio.dev"
client = Client(SERVER_URL)

try:
    # ping() sends a GET request to the server's root.
    # Raises RuntimeError on connection failure.
    status = client.ping()
    print("Server is online:", status)
    # Expected output: Server is online: {'message': 'MongoDB is running'}
except RuntimeError as e:
    print(f"Connection failed: {e}")

1.2. Error Handling

The client translates server-side HTTP errors into a RuntimeError, providing a clear message.

try:
    # This will fail as the endpoint is invalid.
    client._post("invalid_endpoint", {}) 
except RuntimeError as e:
    print(f"Caught an expected error: {e}")
    # Expected output: Caught an expected error: API Error: Not Found (Status: 404)

1.3. Accessing a Database (Pythonic Syntax)

Use dictionary-style syntax (client['db_name']) to get a Database object. This action is lightweight and does not create the database on the server until data is written.

# Get a Database object for 'company_db'.
db = client['company_db']
print(f"Working with database: '{db.name}'")
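Under the hood, this dictionary-style access is typically implemented with Python's `__getitem__` protocol. A minimal sketch of the pattern (an assumed illustration, not the library's exact code):

```python
# Sketch of how client['db_name'] access can be implemented via __getitem__.
# This is an assumed pattern for illustration, not the library's actual code.
class Database:
    def __init__(self, client, name):
        self.client = client
        self.name = name

class Client:
    def __init__(self, url):
        self.url = url
    def __getitem__(self, name):
        # No server call happens here; the Database object is just a
        # lightweight handle that remembers the client and the name.
        return Database(self, name)

db = Client("https://example.invalid")["company_db"]
print(db.name)  # company_db
```

This is why getting a Database object is cheap: nothing is sent to the server until you actually read or write data.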

1.4. Dropping a Database (Client-level)

The Client object provides a direct way to drop a database by its name. This is a destructive action, useful for cleanup scripts.

# Create a temporary DB to demonstrate.
client['temp_db'].create() 
print("Dropping 'temp_db' directly via the client...")
result = client.drop_database('temp_db')
print(result)
# Expected output: {'message': "Database 'temp_db' dropped successfully"}

2. The Database Object: The Collection Manager

The Database object is a container for collections and provides methods to manage them as a group.

2.1. Explicitly Creating a Database

While MongoDB creates databases implicitly, you can create one explicitly to ensure it exists.

# Sends a request to the /create_database endpoint.
creation_result = db.create()
print(creation_result)
# Expected output: {'message': "Database 'company_db' created successfully"}

2.2. Accessing a Collection

Similar to accessing a database, use dictionary-style syntax (db['collection_name']) to get a Collection object.

# Get a Collection object for 'employees'.
employees = db['employees']
print(f"Working with collection: '{employees.name}' in database '{employees.database.name}'")

2.3. Listing and Dropping Collections

Manage collections within a database directly from the Database object.

# Create sample data so a collection exists.
employees.insert_one({"name": "temp_user"})

# list_collection_names() retrieves all collection names.
collections_list = db.list_collection_names()
print("Collections in database:", collections_list['collections'])
# Expected output: Collections in database: ['employees']

# drop_collection() removes a collection by name.
result = db.drop_collection("employees")
print("Dropped 'employees' collection from the database object.", result)

2.4. Getting Database Size

Retrieve the total data size of the database. This returns a SizeResult object for easy unit conversions.

# Add some data to measure.
db['test_coll_1'].insert_many([{'a': i} for i in range(10)])
db['test_coll_2'].insert_many([{'b': i} for i in range(10)])

db_size = db.get_size()
print(f"Database size (user-friendly): {db_size}")
print(f"Database size (raw bytes): {db_size.bytes()}")

2.5. Dropping an Entire Database (Database-level)

This drops the database this object represents. It is a destructive and irreversible action.

print(f"Dropping the entire '{db.name}' database...")
db.drop()
print(f"Database '{db.name}' has been dropped.")

3. The Collection Object: The Data Workhorse

This is where you'll perform most of your work: creating, reading, updating, and deleting documents (CRUD).

# Set up a fresh collection for these examples.
db = client['company_db']
employees = db['employees']
employees.drop() # Clean up from previous runs

3.1. Create: insert_one & insert_many

# Insert a single document.
alice_res = employees.insert_one({
    "name": "Alice", "role": "Engineer", "level": 4, "tags": ["python", "api"]
})
print("Inserted Alice with ID:", alice_res['inserted_id'])

# Insert multiple documents. More efficient than looping.
multi_res = employees.insert_many([
    {"name": "Bob", "role": "Engineer", "level": 5, "tags": ["python", "devops"]},
    {"name": "Charlie", "role": "Manager", "level": 7, "tags": ["management"]}
])
print(f"Inserted {len(multi_res['inserted_ids'])} new documents.")

3.2. Read: find with Filters

The find method accepts any standard MongoDB query filter. If no filter is provided, it returns all documents.

# Find all documents.
all_docs = employees.find()
print("All employees:", all_docs['documents'])

# Find with a simple key-value filter.
engineers = employees.find({"role": "Engineer"})
print("Found engineers:", engineers['documents'])

# Find with an operator ($gt means "greater than").
seniors = employees.find({"level": {"$gt": 4}})
print("Found senior staff:", seniors['documents'])

3.3. Update: update_one & update_many

Updates require a filter to find documents and an update expression (e.g., $set) to modify them.

# Update the first document matching the filter.
update_res = employees.update_one({"name": "Alice"}, {"$set": {"level": 5}})
print(f"Promoted Alice. Matched: {update_res['matched_count']}, Modified: {update_res['modified_count']}")

# Update all documents matching the filter.
update_many_res = employees.update_many(
    {"role": "Engineer"}, 
    {"$set": {"department": "Technology"}}
)
print(f"Assigned engineers to Technology. Matched: {update_many_res['matched_count']}, Modified: {update_many_res['modified_count']}")

3.4. Delete: delete_one & delete_many

# Delete the first document matching the filter.
del_one_res = employees.delete_one({"name": "Charlie"})
print(f"Deleted Charlie. Count: {del_one_res['deleted_count']}")

# Delete all documents matching a filter.
del_many_res = employees.delete_many({"department": "Technology"})
print(f"Deleted {del_many_res['deleted_count']} documents from Technology department.")

3.5. Index Management: create_index, list_indexes, drop_index

Indexes improve query speed and can enforce uniqueness. An index key is a list of [field_name, direction] pairs, where direction is 1 (ascending) or -1 (descending).

# Re-insert data for index examples.
employees.insert_many([
    {"name": "Eve", "email": "eve@company.com", "role": "SRE"},
    {"name": "Frank", "email": "frank@company.com", "role": "SRE"}
])

# Create an index on 'role' to speed up queries.
role_index = employees.create_index([["role", 1]])
print("Created index:", role_index)

# Create a unique index to prevent duplicate emails.
email_index = employees.create_index([["email", 1]], unique=True)
print("Created unique index:", email_index)

# List all indexes on the collection.
indexes = employees.list_indexes()
print("Current indexes:", indexes['indexes'])

# Drop an index by its name.
employees.drop_index(role_index['index_name'])
print(f"Dropped index: '{role_index['index_name']}'")

3.6. Getting Collection & Document Size

Use get_size() on a collection. Provide a filter to get the BSON size of a single matching document.

# Get the size of the entire collection.
coll_size = employees.get_size()
print(f"Total collection size: {coll_size}")

# Get the BSON size of a single document.
frank_doc = employees.find({"name": "Frank"})
frank_id = frank_doc['documents'][0]['_id']
frank_size = employees.get_size(filter={"_id": frank_id})
print(f"Frank's document size: {frank_size.bytes()} bytes")

3.7. Dropping a Collection (Collection-level)

This is the most common way to drop a collection you are working with.

employees.drop()
print("Dropped 'employees' collection via its own object.")

4. The SizeResult Helper: Understanding Sizes

The get_size() method returns a SizeResult object, which simplifies unit conversions.

# Create a collection with data for a clear example.
size_demo_coll = db['size_demo']
size_demo_coll.drop()
size_demo_coll.insert_many([{"x": "a" * 50}] * 2048)

# Get the size object.
size_obj = size_demo_coll.get_size()

# User-friendly string representation.
print(f"User-friendly format: {size_obj}")

# Access specific units.
print(f"Bytes:     {size_obj.bytes()}")
print(f"Kilobytes: {size_obj.kb():.2f} KB")
print(f"Megabytes: {size_obj.mb():.4f} MB")
print(f"Gigabytes: {size_obj.gb():.6f} GB")

# Cast the object to an int to get the raw byte value.
byte_value = int(size_obj)
print(f"Value from int() cast: {byte_value}")

size_demo_coll.drop()

Comprehensive TypeScript/JavaScript Usage Guide

The Node.js client mirrors the Python client's object-oriented structure. All methods are async and return Promises. Instead of dictionary-style access (client['db']), you use method calls (client.database('db')).

The following examples are written in TypeScript. We use TypeScript because its type annotations provide clarity and safety, making it easier to understand what kind of data each method expects and returns. This code can be easily adapted for plain JavaScript (ES Modules or CommonJS) by removing the type declarations.

// Wrapper for running async examples
async function main() {
    // ... all example code goes here ...
}

main().catch(console.error);

1. The Client Object: Your Connection Gateway

The Client is the entry point for all interactions. It manages the connection to your API.

1.1. Initialization and Connection Test

First, create a client instance with your server's URL. It's best practice to immediately ping the server to ensure it's reachable.

// Assuming the client code is in a file named db_client.ts/js
import { Client } from './db_client';

// Replace with your server's public URL
const SERVER_URL = "https://your-workspace-name-5000.app.googlestudio.dev";
const client = new Client(SERVER_URL);

try {
    // ping() sends a GET request to the server's root.
    // Throws an Error on connection failure.
    const status = await client.ping();
    console.log("Server is online:", status);
    // Expected output: Server is online: { message: 'MongoDB is running' }
} catch (e) {
    console.log(`Connection failed: ${e.message}`);
}

1.2. Error Handling

The client translates server-side HTTP errors into a standard Error object with a descriptive message.

try {
    // This will fail as the endpoint is invalid.
    await client._post("invalid_endpoint", {}); 
} catch (e) {
    console.log(`Caught an expected error: ${e.message}`);
    // Expected output: Caught an expected error: API Error: Not Found (Status: 404)
}

1.3. Accessing a Database

Use the database() method to get a Database object. This action is lightweight and does not create the database on the server until data is written.

// Get a Database object for 'company_db'.
const db = client.database('company_db');
console.log(`Working with database: '${db.name}'`);

1.4. Dropping a Database (Client-level)

The Client object provides a direct way to drop a database by name. This is a destructive action, useful for cleanup scripts.

// Create a temporary DB to demonstrate.
await client.database('temp_db').create(); 
console.log("Dropping 'temp_db' directly via the client...");
const result = await client.dropDatabase('temp_db');
console.log(result);
// Expected output: { message: "Database 'temp_db' dropped successfully" }

2. The Database Object: The Collection Manager

The Database object is a container for collections and provides methods to manage them.

2.1. Explicitly Creating a Database

While MongoDB creates databases implicitly, you can create one explicitly to ensure it exists.

// Sends a request to the /create_database endpoint.
const creationResult = await db.create();
console.log(creationResult);
// Expected output: { message: "Database 'company_db' created successfully" }

2.2. Accessing a Collection

Use the collection() method to get a Collection object.

// Get a Collection object for 'employees'.
const employees = db.collection('employees');
console.log(`Working with collection: '${employees.name}' in database '${employees.database.name}'`);

2.3. Listing and Dropping Collections

Manage collections within a database directly from the Database object.

// Create sample data so a collection exists.
await employees.insertOne({ name: "temp_user" });

// listCollectionNames() retrieves all collection names.
const collectionsList = await db.listCollectionNames();
console.log("Collections in database:", collectionsList.collections);
// Expected output: Collections in database: ['employees']

// dropCollection() removes a collection by name.
const result = await db.dropCollection("employees");
console.log("Dropped 'employees' collection from the database object.", result);

2.4. Getting Database Size

Retrieve the total data size of the database. This returns a SizeResult object for easy unit conversions.

// Add some data to measure.
// Add some data to measure.
await db.collection('test_coll_1').insertMany(Array.from({ length: 10 }, (_, i) => ({ a: i })));
await db.collection('test_coll_2').insertMany(Array.from({ length: 10 }, (_, i) => ({ b: i })));

const dbSize = await db.getSize();
console.log(`Database size (user-friendly): ${dbSize}`);
console.log(`Database size (raw bytes): ${dbSize.bytes()}`);

2.5. Dropping an Entire Database (Database-level)

This drops the database this object represents. It is a destructive and irreversible action.

console.log(`Dropping the entire '${db.name}' database...`);
await db.drop();
console.log(`Database '${db.name}' has been dropped.`);

3. The Collection Object: The Data Workhorse

This is where you'll perform most of your work: creating, reading, updating, and deleting documents (CRUD).

// Set up a fresh collection for these examples.
const db = client.database('company_db');
const employees = db.collection('employees');
await employees.drop(); // Clean up from previous runs

3.1. Create: insertOne & insertMany

// Insert a single document.
const aliceRes = await employees.insertOne({
    name: "Alice", role: "Engineer", level: 4, tags: ["python", "api"]
});
console.log("Inserted Alice with ID:", aliceRes.inserted_id);

// Insert multiple documents. More efficient than looping.
const multiRes = await employees.insertMany([
    { name: "Bob", role: "Engineer", level: 5, tags: ["python", "devops"] },
    { name: "Charlie", role: "Manager", level: 7, tags: ["management"] }
]);
console.log(`Inserted ${multiRes.inserted_ids.length} new documents.`);

3.2. Read: find with Filters

The find method accepts any standard MongoDB query filter. If no filter is provided, it returns all documents.

// Find all documents.
const allDocs = await employees.find();
console.log("All employees:", allDocs.documents);

// Find with a simple key-value filter.
const engineers = await employees.find({ role: "Engineer" });
console.log("Found engineers:", engineers.documents);

// Find with an operator ($gt means "greater than").
const seniors = await employees.find({ level: { "$gt": 4 } });
console.log("Found senior staff:", seniors.documents);

3.3. Update: updateOne & updateMany

Updates require a filter to find documents and an update expression (e.g., $set) to modify them.

// Update the first document matching the filter.
const updateRes = await employees.updateOne({ name: "Alice" }, { "$set": { level: 5 } });
console.log(`Promoted Alice. Matched: ${updateRes.matched_count}, Modified: ${updateRes.modified_count}`);

// Update all documents matching the filter.
const updateManyRes = await employees.updateMany(
    { role: "Engineer" }, 
    { "$set": { department: "Technology" } }
);
console.log(`Assigned engineers to Technology. Matched: ${updateManyRes.matched_count}, Modified: ${updateManyRes.modified_count}`);

3.4. Delete: deleteOne & deleteMany

// Delete the first document matching the filter.
const delOneRes = await employees.deleteOne({ name: "Charlie" });
console.log(`Deleted Charlie. Count: ${delOneRes.deleted_count}`);

// Delete all documents matching a filter.
const delManyRes = await employees.deleteMany({ department: "Technology" });
console.log(`Deleted ${delManyRes.deleted_count} documents from Technology department.`);

3.5. Index Management: createIndex, listIndexes, dropIndex

Indexes improve query speed and can enforce uniqueness. An index key is an array of [fieldName, direction] pairs, where direction is 1 (ascending) or -1 (descending).

// Re-insert data for index examples.
await employees.insertMany([
    { name: "Eve", email: "eve@company.com", role: "SRE" },
    { name: "Frank", email: "frank@company.com", role: "SRE" }
]);

// Create an index on 'role' to speed up queries.
const roleIndex = await employees.createIndex([["role", 1]]);
console.log("Created index:", roleIndex);

// Create a unique index to prevent duplicate emails (second argument: unique).
const emailIndex = await employees.createIndex([["email", 1]], true);
console.log("Created unique index:", emailIndex);

// List all indexes on the collection.
const indexes = await employees.listIndexes();
console.log("Current indexes:", indexes.indexes);

// Drop an index by its name.
await employees.dropIndex(roleIndex.index_name);
console.log(`Dropped index: '${roleIndex.index_name}'`);

3.6. Getting Collection & Document Size

Use getSize() on a collection. Provide a filter to get the BSON size of a single matching document.

// Get the size of the entire collection.
const collSize = await employees.getSize();
console.log(`Total collection size: ${collSize}`);

// Get the BSON size of a single document.
const frankDoc = await employees.find({ name: "Frank" });
const frankId = frankDoc.documents[0]._id;
const frankSize = await employees.getSize({ _id: frankId });
console.log(`Frank's document size: ${frankSize.bytes()} bytes`);

3.7. Dropping a Collection (Collection-level)

This is the most common way to drop a collection you are working with.

await employees.drop();
console.log("Dropped 'employees' collection via its own object.");

4. The SizeResult Helper: Understanding Sizes

The getSize() method returns a SizeResult object, which simplifies unit conversions.

// Create a collection with data for a clear example.
const sizeDemoColl = db.collection('size_demo');
await sizeDemoColl.drop();
const documents = Array(2048).fill({ x: "a".repeat(50) });
await sizeDemoColl.insertMany(documents);

// Get the size object.
const sizeObj = await sizeDemoColl.getSize();

// User-friendly string representation.
console.log(`User-friendly format: ${sizeObj}`); // or sizeObj.toString()

// Access specific units.
console.log(`Bytes:     ${sizeObj.bytes()}`);
console.log(`Kilobytes: ${sizeObj.kb().toFixed(2)} KB`);
console.log(`Megabytes: ${sizeObj.mb().toFixed(4)} MB`);
console.log(`Gigabytes: ${sizeObj.gb().toFixed(6)} GB`);

// Use in numeric contexts to get the raw byte value.
const byteValue = Number(sizeObj);
console.log(`Value from Number() cast: ${byteValue}`);

await sizeDemoColl.drop();

References

MongoDB & Atlas

  1. MongoDB Official Documentation
    Comprehensive guide covering MongoDB installation, CRUD operations, and architecture.
    https://www.mongodb.com/docs/

  2. MongoDB CRUD Operations
    Explains how to create, read, update, and delete documents in MongoDB.
    https://www.mongodb.com/docs/manual/crud/

  3. MongoDB Documents — Core Concepts
    Learn about BSON structure, document schema, and data types in MongoDB.
    https://www.mongodb.com/docs/manual/core/document/

  4. MongoDB Indexes Guide
    Details on creating, managing, and optimizing indexes for performance.
    https://www.mongodb.com/docs/manual/indexes/

  5. MongoDB Deployment & Administration
    Guides for database setup, sharding, backups, and maintenance.
    https://www.mongodb.com/docs/management/

  6. MongoDB Atlas Documentation
    Full documentation for MongoDB’s managed cloud service, Atlas.
    https://www.mongodb.com/docs/atlas/

  7. MongoDB Atlas Getting Started
    Step-by-step tutorial for setting up a MongoDB cluster on Atlas.
    https://www.mongodb.com/docs/atlas/getting-started/

  8. MongoDB Atlas Security Overview
    Learn how Atlas handles authentication, encryption, and network security.
    https://www.mongodb.com/docs/atlas/security/

  9. MongoDB Atlas Data Federation
    Explore how to query data across multiple sources using Atlas Federation.
    https://www.mongodb.com/docs/atlas/data-federation/overview/

  10. MongoDB Drivers Documentation
    Reference for official MongoDB drivers (Python, Node.js, Java, etc.).
    https://www.mongodb.com/docs/drivers/

  11. MongoDB Shell (mongosh)
    Learn how to use the modern MongoDB shell for database operations.
    https://www.mongodb.com/docs/mongosh/

Firebase / Firestore / Cloud Databases

  1. Firebase Official Documentation
    Centralized documentation for all Firebase services and SDKs.
    https://firebase.google.com/docs

  2. Firebase Realtime Database
    Learn about Firebase’s NoSQL JSON-based real-time database.
    https://firebase.google.com/docs/database

  3. Firebase Realtime Database — Web Guide
    How to read and write data to Realtime Database from web apps.
    https://firebase.google.com/docs/database/web/start

  4. Firebase Cloud Firestore
    Guide for scalable NoSQL document storage with real-time sync.
    https://firebase.google.com/docs/firestore

  5. Cloud Firestore (Google Cloud Docs)
    Technical documentation for Firestore in native mode on Google Cloud.
    https://cloud.google.com/firestore/docs

  6. Firebase Data Connect
    Integrate Firebase with relational databases like PostgreSQL.
    https://firebase.google.com/docs/data-connect

  7. Firebase Storage (Optional for DB Storage)
    Used for storing and retrieving user-uploaded data like images or JSON files.
    https://firebase.google.com/docs/storage

Additional Cloud DB Resources

  1. MongoDB University (Free Courses)
    Free online MongoDB learning courses by MongoDB Inc.
    https://learn.mongodb.com/

  2. Firebase Codelabs (Step-by-step Projects)
    Hands-on tutorials for Firebase and Firestore.
    https://firebase.google.com/codelabs

  3. Google Cloud Databases Overview
    Overview of all Google Cloud-managed database solutions.
    https://cloud.google.com/products/databases
