This package contains the modality merger (MM) algorithm and an NL processor for the common Imitrob setup; it works jointly with the crow-base modules. See the related paper.
```mermaid
flowchart LR;
    crow_detector -- "scene_objects" --> MM
    INPUT -- "HRICommand<br>Natural<br>language" --> MM("Modality<br>Merger")
    INPUT -- "HRICommand<br>Gesture<br>language" --> MM("Modality<br>Merger") -- "HRICommand<br>Merged<br>language" --> OUTPUT
```
Install dependencies:
```
pip install owlready nltk gTTS playsound SpeechRecognition rdflib knowl python-dotenv
```
Dataset generation:
- Generate all datasets:
  ```
  python data_creator2.py all
  ```
  OR download the datasets from the link and copy them to `imitrob_hri/data/saves`.
- Generate a single dataset:
  ```
  python data_creator2.py cX_nY_DZ
  ```
  (e.g. `python data_creator2.py c1_n2_D3`), where:
  - X is the configuration choice, an int from 1 to 3 (C1-C3)
  - Y is the noise level, an int from 0 to 5 (n0: no noise, n1: $N(0,0.2)$, n2: $N(0,0.4)$, n3: real noise model, n4: real noise model amplified 2x, n5: $N(0,0.6)$)
  - Z is the dataset number, from 1 to 4 (D1: aligned; D2: unaligned, arity-decidable; D3: unaligned, property-decidable; D4: unaligned)
- This creates the new datasets in the `data/saves` folder.
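As a quick aid, the `cX_nY_DZ` naming scheme above can be enumerated in a few lines of Python (the ranges follow the description above; the helper name is ours, not part of the repo):

```python
# Enumerate every dataset name of the form cX_nY_DZ described above.
# Ranges follow the README: X in 1..3, Y in 0..5, Z in 1..4.
from itertools import product

def dataset_names(configs=range(1, 4), noises=range(0, 6), datasets=range(1, 5)):
    """Yield identifiers such as 'c1_n2_D3', suitable for data_creator2.py."""
    for c, n, d in product(configs, noises, datasets):
        yield f"c{c}_n{n}_D{d}"

names = list(dataset_names())
print(len(names))  # 3 configurations x 6 noise levels x 4 datasets = 72
```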
Dataset tests:
- Test all configurations & all merge functions (this might take a few hours, merging across 6 input parameters):
  ```
  python tester_on_data2.py all
  ```
- Test all configurations with a single merge function:
  ```
  python tester_on_data2.py all MERGEFUN
  ```
  where `MERGEFUN` is a string from `['mul', 'add_2', 'entropy', 'entropy_add_2', 'baseline']`:
  - `MERGEFUN='mul'` is fixed thresholding with the mul merge function
  - `MERGEFUN='add_2'` is fixed thresholding with the add merge function
  - `MERGEFUN='entropy'` is entropy thresholding with the mul merge function
  - `MERGEFUN='entropy_add_2'` is entropy thresholding with the add merge function
- Test a single configuration dataset:
  ```
  python tester_on_data2.py cX_nY_DZ MERGEFUN MODEL
  ```
  where `MODEL` is one of M1, M2, M3 (e.g. `python tester_on_data2.py c1_n2_D3 mul M3`).
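To make the `MERGEFUN` options concrete, here is an illustrative sketch of the two merge families, not the package's actual implementation: `mul` multiplies per-modality likelihood vectors, `add_2` averages them, and the entropy variants first check whether a single confident (low-entropy) modality can be used on its own. The threshold value and function names are our assumptions.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a normalized distribution."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def merge(lang, gest, fun="mul", entropy_threshold=1.0):
    """Merge two likelihood vectors; hypothetical sketch of mul/add_2/entropy."""
    lang = np.asarray(lang, float); lang = lang / lang.sum()
    gest = np.asarray(gest, float); gest = gest / gest.sum()
    if fun.startswith("entropy"):
        # If exactly one modality is confident (low entropy), use it alone.
        if entropy(lang) < entropy_threshold <= entropy(gest):
            return lang
        if entropy(gest) < entropy_threshold <= entropy(lang):
            return gest
        fun = "mul" if fun == "entropy" else fun.replace("entropy_", "")
    merged = lang * gest if fun == "mul" else (lang + gest) / 2
    return merged / merged.sum()

print(merge([0.7, 0.2, 0.1], [0.6, 0.3, 0.1], "mul"))  # approx [0.857, 0.122, 0.020]
```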
Further tests:
- Check each sample in a dataset:
  ```
  python tester_on_single_data.py
  ```
- Check consistency (results are time-invariant):
  ```
  python check_consistency_on_data2.py cX_nY_DZ MERGEFUN MODEL
  ```
- Check a dataset (e.g. how many objects with given properties it contains, ...):
  ```
  python data_checker.py cX_nY_DZ
  ```
- Generate experiment plots (from the npy arrays in the results folder):
  ```
  python results_comparer2.py
  ```
  - Needs results generated in the `data/results2` folder; generate them using `python tester_on_data2.py all` OR use the results from the link.
  - Toggle the experiment number.
Dependencies on packages:
- `crow-base` - ontology, cameras, filters
- `teleop_gesture_toolbox` - hand gesture recognition
- `imitrob_templates`
- `imitrob_robot_server`, `imitrob_robot_client` - Panda robot control

Usage:
1. Term:
   ```
   cd ~/crow-base && source start_404.sh
   python user_scripts/tmux_all.py --config user_scripts/run_tmux_404.yaml && tmux attach-session -t crow_run_all
   ```
2. Term, run NLP node:
   ```
   cd ~/crow-base && source start_404.sh && ros2 run imitrob_hri nlp_node
   ```
3. Term, run MM node:
   ```
   cd ~/crow-base && source start_404.sh && ros2 run imitrob_hri mm_node
   ```
   - Use argument `-e` for the `'arity'` or `'property'` experiment
   - Use argument `-a` for performing aligned (`'a'`), not-aligned action from language (`'nal'`), not-aligned action from gestures (`'nag'`), not-aligned object from language (`'nol'`), not-aligned object from gestures (`'nog'`)
   - Use argument `-r` for the run number
   - Use argument `-s` for the user
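For reference, the `mm_node` flags listed above could be parsed like this (an illustrative `argparse` sketch; the node's real parser and option names may differ):

```python
# Hypothetical argument parser mirroring the mm_node flags described above.
import argparse

def build_parser():
    p = argparse.ArgumentParser(description="Modality Merger node options")
    p.add_argument("-e", "--experiment", choices=["arity", "property"],
                   help="which experiment to run")
    p.add_argument("-a", "--alignment",
                   choices=["a", "nal", "nag", "nol", "nog"], default="a",
                   help="aligned, or which modality/slot is misaligned")
    p.add_argument("-r", "--run", type=int, help="run number")
    p.add_argument("-s", "--user", help="user id")
    return p

args = build_parser().parse_args(["-e", "arity", "-a", "nog"])
print(args.experiment, args.alignment)  # arity nog
```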
4. Term, run gesture detectors:
   ```
   cd ~/crow-base && source start_404.sh && ros2 launch teleop_gesture_toolbox crow_gestures.launch
   ```
5. Term, run CoppeliaSim to see the selected (pointed-at) object:
   ```
   cd ~/crow-base && source start_404.sh && ros2 launch teleop_gesture_toolbox crow_gestures.launch
   ```
6. Term, run the LeapMotion Controller backend:
   ```
   sudo leapd
   ```
7. Term, run the speech-to-text module. We don't provide a Google API key for STT; here is a script that publishes text input instead:
   ```
   ros2 run imitrob_hri nl_input <your NLP sentence>
   ```
8. Term, run the node that executes merged actions on the robot:
   ```
   ros2 run imitrob_template template_execution_node
   ```
   - This needs a custom `ros2 run imitrob_robot_server robot_server` running, which prepares the robot for executing mode.

Results are saved to `imitrob_hri/data_real`. Examine the captured real data, with the possibility to recompute merging with different settings, using:
```
python tester_on_realdata.py
```