70 changes: 23 additions & 47 deletions amrit-local-setup/README.md
@@ -1,32 +1,31 @@
# AMRIT Local Environment Setup Guide

## System Requirements
## Overview

Setting up the AMRIT local database environment involves three activities:

1) Start the MySQL, MongoDB and Redis databases as Docker Containers
2) Create the Database Schema for AMRIT
3) Load the Database with Sample Data

### Mandatory Dependencies

- Docker Engine + Docker Compose
- Maven 3.6+
- Git version control
- OpenJDK 17+
- MySQL Client

## Architecture Overview

The setup leverages containerization for consistent development environments across the team. Core services are orchestrated via Docker, with MySQL, Mongo and Redis instances running in isolated containers.

## Deployment Steps

First, clone the DevOps repository and navigate to the local setup directory:
### 1. Start the Databases

First, run the commands below to clone the DevOps repository, navigate to the local setup directory, and initialize the container services:

```bash
git clone https://github.com/PSMRI/AMRIT-DevOps.git
cd AMRIT-DevOps/amrit-local-setup
```

### 1. Container Orchestration

Initialize the containerized services:

```bash
docker-compose up
```

@@ -38,46 +37,25 @@ docker-compose up

**Important:** If these services are already running on your host machine, stop the local MySQL, Mongo and Redis instances before proceeding.

**Note:** Before proceeding:
**Note:** Before proceeding, verify that the Docker containers for MySQL, Mongo and Redis are running.

- Verify that the Docker container is running
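A quick way to confirm that the three containers are up (a minimal sketch; the actual service and container names depend on the compose file):

```bash
# List the compose-managed services and their state
docker-compose ps

# Or print name, image and status for all running containers
docker ps --format "table {{.Names}}\t{{.Image}}\t{{.Status}}"
```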
### 2. Database Schema Management Service Deployment
### 2. Create the Database Schema

Use the commands below to run the Database Schema Management Service, a Java service that creates the database schema in the MySQL instance.

#### Repository Configuration

```bash
git clone https://github.com/PSMRI/AMRIT-DB.git
cd AMRIT-DB
cp src/main/environment/common_example.properties src/main/environment/common_local.properties
mvn clean install -DENV_VAR=local
mvn spring-boot:run -DENV_VAR=local
```
1. **Setup Local Properties**:
- Copy `common_example.properties` to `common_local.properties`.
- File location: `src/main/environment`

2. **Create Build Configuration through CLI**:
```
mvn clean install -DENV_VAR=local
```
---

## Run Configuration

1. **Setup Spring Boot through CLI**:
```
mvn spring-boot:run -DENV_VAR=local
```
---

### 3. Load Sample Data

#### Data Package Setup
### 3. Load Sample Data in the Database Tables

1. Download the Data zip folder (`Amrit_MastersData.zip`) from the [official documentation](https://piramal-swasthya.gitbook.io/amrit/data-management/database-schema)
2. Extract the archive contents
3. Update the data path in the appropriate script:
- Line 10 in `loaddummydata.sh` (Linux/MacOS)
- Line 10 in `loaddummydata.bat` (Windows)

#### Execute Data Load
Run the script below to load sample data into the database tables. The data will be loaded and persistently stored in the MySQL instance.

For Linux/Unix systems:

@@ -91,10 +69,8 @@ For Windows (PowerShell):
.\loaddummydata.bat
```

The data will be loaded and persistently stored in the containerized MySQL instance.

## Troubleshooting
## Troubleshooting Tips

- Ensure all ports (3306, 6379, 27017) are available before starting the containers
- Verify Docker daemon is running before executing docker-compose
- Check container logs if services fail to start
- Ensure all ports (3306, 6379, 27017) are available before starting the containers
- Check the container logs if any service fails to start
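A quick way to check that those ports are actually free before starting the containers (a sketch for Linux/macOS hosts with `lsof` installed):

```bash
# No listener on a port means it is free for the container to bind
for port in 3306 6379 27017; do
  lsof -nP -iTCP:"$port" -sTCP:LISTEN >/dev/null \
    && echo "Port $port is in use" \
    || echo "Port $port is free"
done
```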
15 changes: 14 additions & 1 deletion amrit-local-setup/loaddummydata.bat
@@ -6,8 +6,16 @@ set PORT=3306
set USER=root
set PASSWORD=1234

:: Download the file using PowerShell's Invoke-WebRequest (equivalent to wget)
echo Downloading AmritMasterData.zip...
powershell -Command "Invoke-WebRequest -Uri 'https://1865391384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FYfDZFIsUuulWkRHaq4c1%2Fuploads%2F1WdSAf0fQBeJOea70EXE%2FAmritMasterData.zip?alt=media&token=18e0b6d6-487c-4c0c-967a-02cdd94d61ad' -OutFile 'AmritMasterData.zip'"
Member

My worry is that this DB dump can change in the documentation (with the same file name). Also, I am not sure if the token will expire any time.

Author

Valid Point. I was thinking about it.

The file URL will change when a new data dump is added to the documentation, so we need to update the script whenever we update the dump file in GitBook. The token doesn't seem to expire as long as the file name stays the same.

The other option is to maintain the dump file in GitHub rather than GitBook. As the file size is large, we would need to use Git LFS.

Do you have any other ideas?
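For reference, keeping the dump in the repository via Git LFS would look roughly like this (a sketch; the `amrit-local-setup/data/` path is hypothetical):

```bash
git lfs install
git lfs track "amrit-local-setup/data/AmritMasterData.zip"
git add .gitattributes amrit-local-setup/data/AmritMasterData.zip
git commit -m "Track the master data dump with Git LFS"
```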

@ramnar (Author), Jan 4, 2026

@drtechie - The end goal I am trying to achieve in the future is to have a single script that starts the database containers, starts the schema management service, and loads dummy data. I believe we can keep this single script in the schema management service git repository itself. Loading dummy data can be integrated with the schema management service based on some flag.


:: Extract the file using PowerShell's Expand-Archive (equivalent to unzip)
echo Extracting AmritMasterData.zip...
powershell -Command "Expand-Archive -Path 'AmritMasterData.zip' -DestinationPath 'AmritMasterDataFiles'"

Comment on lines +9 to +16

⚠️ Potential issue | 🟠 Major

External download URL embeds a token; consider treating it as a secret and hardening failure handling.

The GitBook URL used in Invoke-WebRequest contains a token= query parameter, which static analysis correctly flags as a potential generic API key. If this token is sensitive or revocable, committing it to the repo risks accidental exposure and future breakage when it’s rotated. Also, neither the download nor the extraction has error checks, so the script will continue even if the file is missing or the network fails.

Consider:

  • Moving the URL (or just the token) into a config/env variable, or hosting the ZIP at a stable, unauthenticated location.
  • Adding basic checks on %ERRORLEVEL% after Invoke-WebRequest/Expand-Archive and aborting with a clear message on failure.
🧰 Tools
🪛 Gitleaks (8.30.0)

[high] 11-11: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.

(generic-api-key)

🤖 Prompt for AI Agents
In amrit-local-setup/loaddummydata.bat around lines 9 to 16, the script embeds a
tokenized GitBook URL and lacks error handling; remove the hardcoded token by
reading the full URL or only the token from an environment variable or a config
file (do not commit the secret), then after each PowerShell call check
%ERRORLEVEL% (or inspect PowerShell's exit code) and if non-zero echo a clear
error message and exit /b with a non‑zero code so the script stops on download
or extraction failures.

:: Path to the extracted SQL files
set SQL_DIR=<PATH to the extracted files directory>
set SQL_DIR="AmritMasterDataFiles/AmritMasterData"
Comment on lines 17 to +18

⚠️ Potential issue | 🟠 Major

Fix SQL_DIR quoting to avoid malformed paths.

set SQL_DIR="AmritMasterDataFiles/AmritMasterData" stores the quotes inside the variable, and later you wrap %SQL_DIR% in another pair of quotes: "%SQL_DIR%\%%A". This yields a doubly quoted, slightly malformed path which can cause file resolution issues.

A safer pattern is:

- set SQL_DIR="AmritMasterDataFiles/AmritMasterData"
+ set "SQL_DIR=AmritMasterDataFiles\AmritMasterData"

Then keep the existing usage "%SQL_DIR%\%%A".

📝 Committable suggestion


Suggested change
:: Path to the extracted SQL files
set SQL_DIR=<PATH to the extracted files directory>
set SQL_DIR="AmritMasterDataFiles/AmritMasterData"
:: Path to the extracted SQL files
set "SQL_DIR=AmritMasterDataFiles\AmritMasterData"
🤖 Prompt for AI Agents
In amrit-local-setup/loaddummydata.bat around lines 17-18, the SQL_DIR variable
is assigned with embedded quotes which become part of the variable value and
lead to double-quoting later; change the assignment to remove the quotes (e.g.,
set SQL_DIR=AmritMasterDataFiles\AmritMasterData) so the variable contains a
plain path, then keep using it as "%SQL_DIR%\%%A" elsewhere.


:: Associating files with databases
set DATABASES="AmritMasterData.sql=db_iemr" "m_beneficiaryregidmapping_dump_1097.sql=db_1097_identity" "m_beneficiaryregidmapping_dump.sql=db_identity"
@@ -30,3 +38,8 @@ for %%F in (%DATABASES%) do (
)

pause

# Clean up the extracted files
del AmritMasterData.zip
rmdir /s /q AmritMasterDataFiles
echo Cleaned up AmritMasterDataFiles folder.
Comment on lines +42 to +45

⚠️ Potential issue | 🟡 Minor

Batch comment syntax: use REM or :: instead of #.

# Clean up the extracted files in a .bat file is executed as a command (and will print an error), not treated as a comment.

Change it to something like:

-# Clean up the extracted files
+:: Clean up the extracted files
 del AmritMasterData.zip
 rmdir /s /q AmritMasterDataFiles
 echo Cleaned up AmritMasterDataFiles folder.
📝 Committable suggestion


Suggested change
# Clean up the extracted files
del AmritMasterData.zip
rmdir /s /q AmritMasterDataFiles
echo Cleaned up AmritMasterDataFiles folder.
:: Clean up the extracted files
del AmritMasterData.zip
rmdir /s /q AmritMasterDataFiles
echo Cleaned up AmritMasterDataFiles folder.
🤖 Prompt for AI Agents
In amrit-local-setup/loaddummydata.bat around lines 42 to 45, the line starting
with "#" is being treated as a command rather than a comment in a .bat file;
replace the "#" comment with a valid batch comment token (for example prefix the
line with REM or ::) so the cleanup note is a true comment, leaving the del and
rmdir commands unchanged and preserving the final echo.

22 changes: 18 additions & 4 deletions amrit-local-setup/loaddummydata.sh
@@ -6,7 +6,12 @@ PORT="3306"
USER="root"
PASSWORD="1234"
# Path to the extracted SQL files
SQL_DIR=<PATH to the extracted files directory>

# download and extract the SQL files from the given url https://1865391384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FYfDZFIsUuulWkRHaq4c1%2Fuploads%2F1WdSAf0fQBeJOea70EXE%2FAmritMasterData.zip?alt=media&token=18e0b6d6-487c-4c0c-967a-02cdd94d61ad
wget -O AmritMasterData.zip "https://1865391384-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FYfDZFIsUuulWkRHaq4c1%2Fuploads%2F1WdSAf0fQBeJOea70EXE%2FAmritMasterData.zip?alt=media&token=18e0b6d6-487c-4c0c-967a-02cdd94d61ad"
unzip AmritMasterData.zip -d AmritMasterDataFiles
rm AmritMasterData.zip
SQL_DIR="AmritMasterDataFiles/AmritMasterData"

Comment on lines +10 to 15

⚠️ Potential issue | 🟠 Major

Download URL with embedded token may be sensitive and is a single point of failure.

The hard-coded GitBook URL (with token=) for AmritMasterData.zip is flagged by Gitleaks as a generic API key. If the token is private or short‑lived, committing it to the repo both exposes it and makes the script fragile when it’s rotated or expires. Additionally, if wget/unzip fail, the script will still proceed to the MySQL step with an invalid SQL_DIR.

Recommendations:

  • Move the URL (or token) into configuration (env var, .env file ignored by Git, or a central config script) or host the ZIP at a stable, unauthenticated location.
  • After wget and unzip, check the exit status and/or the presence of $SQL_DIR and abort early with a clear message if the data isn’t available.
🧰 Tools
🪛 Gitleaks (8.30.0)

[high] 10-10: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.

(generic-api-key)


[high] 11-11: Detected a Generic API Key, potentially exposing access to various services and sensitive operations.

(generic-api-key)

🤖 Prompt for AI Agents
In amrit-local-setup/loaddummydata.sh around lines 10 to 15, the script embeds a
GitBook URL with a token and does not validate download/unzip success; move the
URL or token into a configuration source (eg. an environment variable or .env
entry that is gitignored) and reference that variable in the script instead of
the hard-coded string, and after running wget and unzip check their exit codes
and verify the expected SQL_DIR exists (abort with a clear error and non-zero
exit status if any step fails) so the script never proceeds to the MySQL step
with missing data.
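A minimal sketch of that hardening for the shell script, assuming the URL is supplied through a hypothetical `AMRIT_DATA_URL` environment variable:

```bash
# Abort early if the download URL is not configured
: "${AMRIT_DATA_URL:?Set AMRIT_DATA_URL to the AmritMasterData.zip download URL}"

wget -O AmritMasterData.zip "$AMRIT_DATA_URL" || { echo "Download failed"; exit 1; }
unzip AmritMasterData.zip -d AmritMasterDataFiles || { echo "Extraction failed"; exit 1; }

SQL_DIR="AmritMasterDataFiles/AmritMasterData"
[ -d "$SQL_DIR" ] || { echo "Expected SQL directory $SQL_DIR not found"; exit 1; }
```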

# Files and their respective databases
FILES="AmritMasterData.sql m_beneficiaryregidmapping_dump_1097.sql m_beneficiaryregidmapping_dump.sql"
@@ -22,12 +27,21 @@ for FILE in $FILES_ARRAY; do
DATABASE=$(echo $DATABASES_ARRAY | cut -d ' ' -f $(($i+1))) # get corresponding database
echo "Running $FILE on $DATABASE..."

mysql -h "$HOST" -P "$PORT" -u "$USER" -p"$PASSWORD" "$DATABASE" < "$SQL_DIR/$FILE"
# Execute SQL and capture output, filtering out duplicate key errors
OUTPUT=$(mysql -h "$HOST" -P "$PORT" -u "$USER" -p"$PASSWORD" "$DATABASE" < "$SQL_DIR/$FILE" 2>&1)
EXIT_CODE=$?

if [ $? -eq 0 ]; then
# Check if error is only due to duplicate entries (error code 1062)
if [ $EXIT_CODE -eq 0 ]; then
echo "Successfully executed $FILE on $DATABASE."
elif echo "$OUTPUT" | grep -q "ERROR 1062"; then
echo "Completed $FILE on $DATABASE (duplicate entries skipped)."
else
echo "Error executing $FILE on $DATABASE."
echo "Error executing $FILE on $DATABASE: $OUTPUT"
fi
i=$(($i+1)) # Increment index for the next database
done

# Clean up the extracted files
rm -rf AmritMasterDataFiles
echo "Cleaned up AmritMasterDataFiles folder."