[ENH] Add additional information to ezBIDS documentation #122

Merged · 2 commits · Feb 8, 2024
41 changes: 40 additions & 1 deletion docs/technical/ezBIDS.md
@@ -340,4 +340,43 @@ The following code block provides information pertaining to the contents of the `e
"section_id":
"type": int
"description": Which section the individual data is in, starts at 1 and progresses up when new localizers are detected
```

## Backend Workflow

### 1. Uploading Data
When a user begins uploading data, ezBIDS creates a new session via the (post)/session API. A session organizes each ezBIDS upload/conversion process. For each session, a new DB/session collection entry is created with the mongo ID as the session ID, and a unique working directory named after the session ID is created on the backend server, where all uploaded data is stored. Once all files are successfully uploaded, the client makes a (patch)/session/uploaded/:session_id API call and sets the session state to "uploaded" to let the ezBIDS handler know that the session is ready to begin preprocessing.
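
The exchange can be summarized with a minimal TypeScript sketch. The (post)/session and (patch)/session/uploaded/:session_id routes come from the description above; the API base URL, the file-upload route, and the response shape are assumptions for illustration only.

```
// Hypothetical sketch of the upload handshake; not the actual ezBIDS client code.
const API = "http://localhost:8082"; // assumed API base URL

async function uploadDataset(files: File[]): Promise<string> {
  // 1. Create a new session; the returned mongo ID serves as the session ID.
  const session = await fetch(`${API}/session`, { method: "POST" }).then(r => r.json());
  const sessionId: string = session._id; // field name assumed

  // 2. Upload each file into the session's working directory on the backend.
  for (const file of files) {
    const body = new FormData();
    body.append("file", file, file.name);
    await fetch(`${API}/upload/${sessionId}`, { method: "POST", body }); // hypothetical route
  }

  // 3. Mark the upload as complete so the handler can start preprocessing.
  await fetch(`${API}/session/uploaded/${sessionId}`, { method: "PATCH" });
  return sessionId;
}
```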

### 2. Preprocessing Data
The backend server polls for uploaded sessions; when it finds an "uploaded" session, it launches the preprocessing script and sets the session state to "preprocessing". Preprocessing consists of several steps, including uncompressing all compressed folders/files, running *dcm2niix*, and creating a list file containing all the NIfTI and sidecar JSON files. *ezBIDS_core.py* uses this list to analyze the files and, at the end, creates *ezBIDS_core.json*. When preprocessing is complete, the session state is set to "analyzed". The preprocessing step then loads *ezBIDS_core.json* and copies its contents to the DB/ezBIDS collection (not the session collection) under an original key.
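
The polling logic amounts to a small state machine. The sketch below illustrates one polling cycle in TypeScript; the SessionStore interface and the preprocess callback are hypothetical stand-ins for the real handler's database queries and preprocessing script.

```
// Illustrative only: one polling cycle of the preprocessing handler.
type SessionState = "uploaded" | "preprocessing" | "analyzed";

interface SessionStore {
  findByState(state: SessionState): Promise<string | null>; // returns a session ID, or null
  setState(sessionId: string, state: SessionState): Promise<void>;
}

async function pollOnce(
  store: SessionStore,
  preprocess: (sessionId: string) => Promise<void>, // uncompress, dcm2niix, ezBIDS_core.py
): Promise<void> {
  const sessionId = await store.findByState("uploaded");
  if (!sessionId) return;                       // nothing to do this cycle
  await store.setState(sessionId, "preprocessing");
  await preprocess(sessionId);                  // writes ezBIDS_core.json to the workdir
  await store.setState(sessionId, "analyzed");  // UI polls for this state
}
```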

### 3. User interacts with the session via the web UI
The web user interface (UI) detects that preprocessing has completed by polling the session state, then loads the contents of *ezBIDS_core.json* via the (get)/download/:session_id/ezBIDS_core.json API. The user then views and corrects the content of *ezBIDS_core.json*.
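
A minimal sketch of this polling-and-download step, assuming a (get)/session/:session_id route for reading the session state (an assumption; only the download route is documented above):

```
// Wait for the "analyzed" state, then fetch the preprocessing output.
const API = "http://localhost:8082"; // assumed API base URL

async function loadEzbidsCore(sessionId: string): Promise<unknown> {
  while (true) {
    const session = await fetch(`${API}/session/${sessionId}`).then(r => r.json()); // assumed route
    if (session.status === "analyzed") break;                 // preprocessing is done
    await new Promise(resolve => setTimeout(resolve, 3000));  // wait a few seconds between polls
  }
  // Download ezBIDS_core.json so the user can review and correct it.
  return fetch(`${API}/download/${sessionId}/ezBIDS_core.json`).then(r => r.json());
}
```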

### 4. User requests defacing (optional)
Before the user finalizes editing the information, they are given the chance to deface anatomical images. When requested, the UI makes a (post)/session/:session_id/deface API call with a list of images to deface. The backend stores this information as *deface.json* in the working directory (workdir) and sets the session state to "deface". The backend defacing handler looks for sessions in the "deface" state, sets them to "defacing", and launches the *deface.sh* script. Once defacing is complete, it sets the session state to "defaced".

*deface.sh* creates the following files under the workdir:

1. *deface.finished* (lists the anatomical files that were successfully defaced)
2. *deface.failed* (lists the anatomical files that failed defacing)

The UI polls these files to determine which files were (or were not) successfully defaced. The user can then choose whether or not to use each defaced anatomical file (each object has a field named defaceSelection set to either "original" or "defaced" to indicate which image to use).
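
The request/poll exchange could look like the following sketch. The (post)/session/:session_id/deface route and the *deface.finished*/*deface.failed* files come from the description above; the payload key and the use of the download API to read those files are assumptions.

```
// Illustrative sketch of the optional defacing step.
const API = "http://localhost:8082"; // assumed API base URL

async function requestDefacing(sessionId: string, images: string[]): Promise<void> {
  // The backend writes this list to deface.json in the workdir and sets the state to "deface".
  await fetch(`${API}/session/${sessionId}/deface`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ list: images }), // payload shape assumed
  });
}

async function getDefacingResults(sessionId: string): Promise<{ finished: string[]; failed: string[] }> {
  // Read the result files that deface.sh leaves in the workdir (fetched here via the download API).
  const finished = await fetch(`${API}/download/${sessionId}/deface.finished`).then(r => r.text());
  const failed = await fetch(`${API}/download/${sessionId}/deface.failed`).then(r => r.text());
  return {
    finished: finished.split("\n").filter(Boolean),
    failed: failed.split("\n").filter(Boolean),
  };
}
```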

### 5. Finalize
When the user clicks the "Finalize" button, the UI makes a (post)/session/:session_id/finalize API call, passing the following content from memory ($root):
```
{
datasetDescription: this.datasetDescription,
readme: this.readme,
participantsColumn: this.participantsColumn,
subjects: this.subjects, //for phenotype
objects: this.objects,
entityMappings,
}
```

The API then stores this information as *finalized.json* in the workdir and copies the content to the DB/ezBIDS collection under an updated key (to contrast with the original key). The session status is set to "finalized". This kicks off the finalize handler on the server side, which resets the status to "bidsing" and runs the *bids.sh* script to generate the BIDS structure according to the information stored at the object level. Once finished, the session status is set to "finished".
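
Put together, the client side of this step reduces to one POST followed by polling for the terminal state. A sketch, with the same assumed base URL and session GET route as earlier; only the finalize route and the state names are documented above:

```
// Finalize the session and wait for BIDS conversion to complete.
const API = "http://localhost:8082"; // assumed API base URL

async function finalizeSession(sessionId: string, payload: object): Promise<void> {
  // payload mirrors the object shown above: datasetDescription, readme,
  // participantsColumn, subjects, objects, entityMappings.
  await fetch(`${API}/session/${sessionId}/finalize`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });

  // The handler walks the session through "finalized" -> "bidsing" -> "finished".
  let status = "";
  while (status !== "finished") {
    await new Promise(resolve => setTimeout(resolve, 3000));
    const session = await fetch(`${API}/session/${sessionId}`).then(r => r.json()); // assumed route
    status = session.status;
  }
}
```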

### 6. Access BIDS
Once the session status becomes "finished", the user is allowed to download the final BIDS directory via the download API or send it to other cloud resources.
43 changes: 38 additions & 5 deletions docs/using_ezBIDS.md
@@ -196,17 +196,50 @@ ezBIDS provides BIDS conversion support for the following data modalities:
---
## Installing ezBIDS locally

Although ezBIDS is a web-based service that does not require installation, it is possible to install it on your local machine or server. The main advantage here is that data is not uploaded to the ezBIDS server (i.e. data remains on-site). Instead, data is copied to one of the ezBIDS containers (brainlife_ezbids-handler) at */tmp/ezbids-workdir*. However, with this approach users cannot upload their finalized BIDS-compliant data to brainlife.io via ezBIDS, as this requires authentication that isn't available with a local installation. Furthermore, setting up ezBIDS locally may require some technical know-how. The following steps should enable you to successfully install ezBIDS.

#### Step 1: Software prerequisites

Make sure the following software packages are installed on your computer:
1. [Docker](https://www.docker.com/)
2. [Docker Compose](https://docs.docker.com/compose/)
3. [Node(.js) & npm](https://nodejs.org/en/download/)

!!! warning Node and npm installation
Download the LTS version (20.11.0), as ezBIDS does not work well with newer Node versions. If working on macOS, avoid installing Node via Homebrew, as Homebrew either provides the most recent version or puts an older version in an atypical directory (e.g. *node@20*).

!!! warning Software requirements
Users must have both [Docker](https://www.docker.com/) and [Docker Compose](https://docs.docker.com/compose/) installed on their machine.
To check that they are in your $PATH, type *<software> --version* (e.g. *docker --version*) into your terminal. If successfully installed, a version number will be displayed.

#### Step 2: Setup

Type the following commands into your terminal:
1. *git clone https://github.com/brainlife/ezbids.git*
2. *cd ezbids && ./dev.sh -d*

!!! note ezBIDS on HPCs
Users wishing to install ezBIDS on their institution's HPC will not have access to Docker or Docker Compose. ezBIDS does not currently support Singularity, so users should reach out to their system administrators to see whether installation of ezBIDS can be supported.

The second command runs *docker-compose up*, which builds and starts the 4 containers that make up ezBIDS. This will take several minutes to complete, at which point logging information will appear in the terminal. To ensure that the containers are up and running, open a new terminal and type *docker ps*, which should output something like:
```
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
f5ecdfa84704 ezbids_handler "pm2 start handler.j…" 18 hours ago Up 18 hours brainlife_ezbids-handler
bae50478c784 ezbids_api "docker-entrypoint.s…" 18 hours ago Up 18 hours (healthy) 0.0.0.0:8082->8082/tcp, :::8082->8082/tcp brainlife_ezbids-api
374590c576b9 ezbids_ui "docker-entrypoint.s…" 18 hours ago Up 18 hours (healthy) 0.0.0.0:3000->3000/tcp, :::3000->3000/tcp brainlife_ezbids-ui
7a45d0f24b31 mongo:4.4.15 "docker-entrypoint.s…" 18 hours ago Up 18 hours (healthy) 0.0.0.0:27417->27017/tcp, :::27417->27017/tcp brainlife_ezbids-mongodb
```

!!! warning Port communication
ezBIDS needs access to port 3000; ensure that no other software is using that port.

#### Step 3: Accessing ezBIDS
Once the containers are up and running, open a web browser (ideally Chrome or Firefox) and type *localhost:3000* into the URL bar. After several seconds, the ezBIDS homepage should appear, and you are set to go.

#### Step 4: Shutting down ezBIDS
Once finished with ezBIDS, type *docker-compose down* into the terminal to bring the containers down. To restart, simply re-run *./dev.sh -d*.

!!! note
These commands must be executed from within the ezbids directory.

---
## FAQ
