Peizhao Hu edited this page May 26, 2019 · 76 revisions

Installation

Option 1: Run with Docker

We assume you have the Docker engine installed. Run the following commands, replacing [tagName] with the latest version listed at https://hub.docker.com/r/sparkfhe/sparkfhe-dist/tags

docker pull sparkfhe/sparkfhe-dist:[tagName]
docker run -it sparkfhe/sparkfhe-dist:[tagName]

Option 2: Direct installation on your system

Prerequisites (macOS): Homebrew (see https://brew.sh) and wget (brew install wget)

Run the following commands to obtain a distribution version of the SparkFHE project. Once finished, you should see a folder named "spark-3.0.0-SNAPSHOT-bin-SparkFHE".

wget https://github.com/SpiRITlab/SparkFHE-Maven-Repo/raw/master/TestDrive.bash
bash TestDrive.bash all

Next, run the following commands to install the shared-library dependencies, such as Boost, GoogleTest, GMP, NTL, HElib, etc. Note that running "install_shared_libraries.bash" will take a while.

cd /spark-3.0.0-SNAPSHOT-bin-SparkFHE/SparkFHE-Addon/scripts/setup
bash install_shared_libraries.bash

Note: if you have access to our C++ shared library, you can instead symlink the "libSparkFHE" folder here and skip this step.
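For example, the symlink could be created like this. The source path /path/to/libSparkFHE is a hypothetical placeholder; substitute the actual location of your prebuilt C++ shared-library folder and run the command from inside the distribution folder.

```shell
# Hypothetical path: replace /path/to/libSparkFHE with the location of your
# prebuilt C++ shared-library folder. Run from inside
# spark-3.0.0-SNAPSHOT-bin-SparkFHE. -sfn replaces any existing link.
ln -sfn /path/to/libSparkFHE libSparkFHE
```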

Run Demo code

If you have followed the installation instructions above, you are ready to run our demo code. There are two ways to do this: locally or in a clustered environment. If a job is split into many slices, they run with multi-threading (on a local machine) or across distributed systems (in a cluster environment).

Run Demo in Local Environment

Running the demo locally is straightforward; just run the following commands.

cd /spark-3.0.0-SNAPSHOT-bin-SparkFHE/SparkFHE-Addon/scripts/spark
bash mySparkSubmitLocal.bash

This demo will perform the following operations:

  • Basic arithmetic operations over plaintexts (testing whether Spark is able to use our library)
  • Key generation (produce a key pair in /spark-3.0.0-SNAPSHOT-bin-SparkFHE/gen/keys/)
  • Encrypt and decrypt (produce encryption of 0 and 1, and two vectors of encrypted numbers; generated ciphertexts are in /spark-3.0.0-SNAPSHOT-bin-SparkFHE/gen/records/)
  • Basic arithmetic operations over ciphertexts (1+0, 1*0, 1-0)
  • Compute dot-product (or inner-product) over the two vectors of encrypted numbers

Run Demo in Clustered Environment

If you would like to run the above test examples in a clustered environment, additional steps are required. Please go to the following page for more instructions.

Test Outputs

By default, the demo output (see Our Example Run) contains both the SparkFHE output and the Spark output. Some may find it hard to spot the SparkFHE output "sandwiched" between the Spark logs. If this is the case for you, append "2>/dev/null" to redirect the Spark logs (written to stderr) to the null device. In the SparkFHE output, you should expect the following examples:
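The snippet below demonstrates the redirection itself with a stand-in command; for the real demo you would append the same suffix, e.g. "bash mySparkSubmitLocal.bash 2>/dev/null".

```shell
# stderr (the noisy log line) is discarded; stdout (the result) is kept.
{ echo "SparkFHE result"; echo "noisy Spark log" >&2; } 2>/dev/null
```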

Basic FHE arithmetic over two encrypted numbers

Test case(s):

enc(1) + enc(0)
enc(1) * enc(0)
enc(1) - enc(0)

SparkFHE outputs:

Homomorphic Addition:1
Homomorphic Multiplication:0
Homomorphic Subtraction:1

Dot-Product of two vectors of encrypted numbers

Test case(s):

vec_a: (0,1,2,3,4) 
vec_b: (4,3,2,1,0)

SparkFHE outputs:

Dot product: 10
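As a sanity check, the expected values above match plain arithmetic on the underlying plaintexts. This sketch simply recomputes them without any encryption; it is not part of the SparkFHE demo itself.

```shell
# Plaintext equivalents of the demo's homomorphic test cases.
echo "Homomorphic Addition:$((1 + 0))"
echo "Homomorphic Multiplication:$((1 * 0))"
echo "Homomorphic Subtraction:$((1 - 0))"

# Dot product of vec_a = (0,1,2,3,4) and vec_b = (4,3,2,1,0):
# 0*4 + 1*3 + 2*2 + 3*1 + 4*0 = 10
vec_a=(0 1 2 3 4)
vec_b=(4 3 2 1 0)
sum=0
for i in "${!vec_a[@]}"; do
  sum=$(( sum + vec_a[i] * vec_b[i] ))
done
echo "Dot product: $sum"
```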

Upgrade the installed distribution

If you want to upgrade your SparkFHE distribution, download the latest TestDrive.bash script.

wget https://github.com/SpiRITlab/SparkFHE-Maven-Repo/raw/master/TestDrive.bash

Then, run the corresponding commands:

Upgrade SparkFHE-API, SparkFHE-Examples, and SparkFHE-Plugin

bash TestDrive.bash dependencies

Upgrade libSparkFHE shared library

bash TestDrive.bash lib

Upgrade SparkFHE-Addon

This can be done with git:

cd SparkFHE-Addon
git pull

Thank you for taking an interest in the SparkFHE project.

You may check the Errors&Fixes page if you run into any problems.

Feel free to contribute and leave a comment if you have any questions or suggestions.
