The Comparative Argument Machine (CAM) project is developed by the Language Technology Group at the University of Hamburg. As a starting point for a larger scientific project, the current version compares two objects using a large database. The main goal of the CAM project is to base the comparison on an understanding of natural language and to output natural language sentences as the result.
If you want to learn more about the project or help to develop it, feel free to contact us. A live demo is available online.
Currently, there are two ways to deploy the CAM project to your own machine: with or without Docker. You will find instructions for both here.
- Install Docker and Docker Compose
- Clone the CAM repository from GitHub:
  ```
  git clone https://github.com/uhh-lt/cam.git
  cd cam
  ```
- If you're using Docker Toolbox, you need to change the `HOSTNAME` constants in `url-builder.service.ts` to match your Docker machine IP (instead of `localhost`). You can check the Docker machine IP via `docker-machine ip`.
- Start Docker containers:
  ```
  docker-compose up -d
  ```
Now CAM is up and running.
You should be able to access the frontend app in your browser:
http://localhost:10101
Or directly receive search results from the backend (as JSON objects):
http://localhost:10100/cam?model=default&fs=false&objectA=dog&objectB=cat&aspect1=size&weight1=3&aspect2=food&weight2=1
(The parameters of this URL are described below.)
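The same query can also be issued programmatically. Below is a minimal sketch (not part of the repository) that sends the request above with Python's `requests` package and pretty-prints the returned JSON; it assumes the Docker deployment with the backend listening on `localhost:10100`.

```python
import json

import requests

# Same query as the example URL above: compare "dog" and "cat"
# with the aspects "size" (weight 3) and "food" (weight 1).
params = {
    "model": "default",
    "fs": "false",
    "objectA": "dog",
    "objectB": "cat",
    "aspect1": "size",
    "weight1": 3,
    "aspect2": "food",
    "weight2": 1,
}

response = requests.get("http://localhost:10100/cam", params=params, timeout=300)
response.raise_for_status()
print(json.dumps(response.json(), indent=2))  # the result is a JSON object
```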
Preferably, Elasticsearch should also get its own Dockerfile or should be built from a Docker image with Docker Compose.
To use the suggestions feature, cross-origin resource sharing must be enabled for all origins in the `elasticsearch.yml` config:

```
http.cors.enabled: true
http.cors.allow-origin: "*"
```
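To verify that the CORS settings are active, you can send a request that carries an `Origin` header, as a browser would, and inspect the response headers. The following is only a sketch under the assumption that Elasticsearch is reachable on `localhost:9200` (the default HTTP port); adjust the URL if your cluster runs elsewhere.

```python
import requests

# Request with an Origin header, as the frontend (browser) would send it.
resp = requests.get(
    "http://localhost:9200/",
    headers={"Origin": "http://localhost:10101"},
    timeout=10,
)
allow_origin = resp.headers.get("Access-Control-Allow-Origin")
if allow_origin:
    print(f"CORS is enabled (Access-Control-Allow-Origin: {allow_origin})")
else:
    print("CORS headers are missing - check elasticsearch.yml")
```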
With Elasticsearch set up, the suggestions index can be created:
```
cd cam/src/Backend/create_suggestoins_index/
python create_es-index_from_suggestions.py
```
Alternatively, extract `es-nodes.tar.gz` to Elasticsearch's default nodes location (`/var/lib/elasticsearch/`).
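To check that the index exists afterwards, you can list the cluster's indices, e.g., with the following sketch (again assuming Elasticsearch on `localhost:9200` without authentication):

```python
import requests

# List all indices of the cluster; the suggestions index should appear here.
resp = requests.get("http://localhost:9200/_cat/indices?v", timeout=10)
resp.raise_for_status()
print(resp.text)
```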
- Clone the CAM repository from GitHub:
  ```
  git clone https://github.com/uhh-lt/cam.git
  cd cam
  ```
- Go to the backend folder:
  ```
  cd src/Backend
  ```
- Download Python and install Pipenv.
- Install requirements:
  ```
  pipenv install
  pipenv run python -m nltk.downloader stopwords
  pipenv run python -m nltk.downloader punkt
  pipenv run python -m nltk.downloader averaged_perceptron_tagger
  ```
- Download the following files and place them in `src/Backend/data` (needed for the InferSent model):
- Change default hostnames and search type:
  - By default, Elasticsearch is expected to run on https://ltdemos.informatik.uni-hamburg.de/depcc-index/ as specified in `config.json`. If you host the index on a different cluster, change `elasticsearch.url` in that file.
  - The default search index is `depcc`. If you want to change this, change `elasticsearch.index` in `config.json`.
- Start the backend API:
  ```
  pipenv run python main.py
  ```
  (If Elasticsearch needs authentication, specify the `ES_USERNAME` and `ES_PASSWORD` environment variables.)
Now the backend is up and running.
You should be able to receive search results from the backend (as JSON objects):
http://localhost:5000/cam?model=default&fs=false&objectA=dog&objectB=cat&aspect1=size&weight1=3&aspect2=food&weight2=1
(The parameters of this URL are described below.)
- Download Node.js
- Enter the frontend working directory:
  ```
  cd src/Frontend/camFrontend
  ```
- Install Angular dependencies:
  ```
  npm install
  ```
- Change default hostnames:
  By default, the backend runs on `localhost`. If you want to change this, e.g., because you deployed the project to another server, change all `HOSTNAME_` constants, e.g., `HOSTNAME_DEFAULT`, in `url-builder.service.ts`.
- Start the frontend app:
  ```
  ng serve -o
  ```
The frontend app will automatically open in your web browser.
With Docker:

```
docker-compose down
docker rmi cam-frontend cam-backend
git pull
docker-compose up -d
```
Without Docker:

```
git pull
```

Then start the program as described above.
To access the API, URL parameters are used. The structure is as follows:

```
BASE_ADDRESS/cam?model=MODEL&fs=FS&objectA=OBJA&objectB=OBJB&aspect1=ASP1&weight1=WEIGHT1
```
`BASE_ADDRESS` depends on where the backend is deployed (e.g., http://localhost:10100 for the Docker setup or http://localhost:5000 for the local setup). A small usage sketch follows the parameter list below.
- Replace `MODEL` with either `default`, `bow`, or `infersent` to select one of the three included models. This parameter is optional.
- Replace `FS` with `false` if you want to do the default search, or with `true` to enable fast search.
- Replace `OBJA` and `OBJB` with the objects you want to compare, e.g., `dog` and `cat`. Both parameters are mandatory.
- Replace `ASP1` and `WEIGHT1` with an aspect you want to include and its weight, e.g., `price` and `5`.
- Add as many aspects/weights as you want, but follow these rules:
  - You must always enter both an aspect and its weight.
  - Aspect and weight parameters must be numbered consecutively. That is, if you include `aspect2` and `weight2`, you have to include `aspect1` and `weight1` as well. Numbers start at `1`. The order of the URL parameters does not matter.
  - Aspects/weights are optional. You can skip the aspect/weight parameters entirely to compare two objects without any aspects.
- The frontend limits weights to integers from 1 to 5. If you want results equivalent to those of the frontend, use weights in that range as well. Arbitrary integer values are accepted, however; be careful with negative values or values close to an integer overflow.
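As an illustration, the following sketch (not part of the repository) assembles such a request in Python. `BASE_ADDRESS`, the objects, and the aspect/weight pairs are placeholders; aspects and weights are numbered consecutively starting at 1, as required above.

```python
import requests

BASE_ADDRESS = "http://localhost:10100"  # backend of the Docker deployment


def compare(object_a, object_b, aspects=(), model="default", fast_search=False):
    """Query the CAM backend; `aspects` is a sequence of (aspect, weight) pairs."""
    params = {
        "model": model,
        "fs": "true" if fast_search else "false",
        "objectA": object_a,
        "objectB": object_b,
    }
    # Number aspect/weight parameters consecutively, starting at 1.
    for i, (aspect, weight) in enumerate(aspects, start=1):
        params[f"aspect{i}"] = aspect
        params[f"weight{i}"] = weight
    response = requests.get(f"{BASE_ADDRESS}/cam", params=params, timeout=300)
    response.raise_for_status()
    return response.json()


# Example: compare "dog" and "cat" with the aspects "size" (3) and "food" (1).
result = compare("dog", "cat", aspects=[("size", 3), ("food", 1)])
print(result)
```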