<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"><title>All Day I Dream About Science</title><link href="http://siddhantsci.org/" rel="alternate"></link><link href="/atom.xml" rel="self"></link><id>http://siddhantsci.org/</id><updated>2015-08-22T19:53:52+00:00</updated><entry><title>Telerobotics - Final Report</title><link href="http://siddhantsci.org/blog/2015/08/22/telerobotics-final-report/" rel="alternate"></link><updated>2015-08-22T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-08-22:blog/2015/08/22/telerobotics-final-report/</id><summary type="html"><p>Hi all! Yesterday was the firm-pencils-down deadline for the Coding Period and the past week was one of the best weeks of the <em>Google Summer of Code 2015</em> program. I went all-guns-blazing with the <strong>documentation</strong> and <em>Virtual Machine distribution efforts</em> of my work on Telerobotics. I also added some significant features to Telerobotics such as <em>ROS Integration with the EUROPA Scheduler</em> which <a href="http://shrigsoc.blogspot.in/2015/08/finals.html">Shridhar</a> worked on this summer with the Italian Mars Society.</p>
<h1>Project Report</h1>
<p>I completed the main aspects of the Telerobotics interface with strong results -</p>
<ul>
<li>Introduced the Robot Operating System (ROS) to ERAS</li>
<li>Developed a Telerobotics Interface to Bodytracking and EUROPA</li>
<li>Implemented Stereoscopic Streaming of 3-D video to the Blender Game Engine V-ERAS application</li>
</ul>
<p>I explain each of these points and summarize my experience in the following paragraphs. In the last week, I got a chance to pursue a collective effort in all the areas of my project -</p>
<h2>Replication Experiments</h2>
<p>The final week started with attempts to ensure that my mentors could replicate my machine setup in order to test and comment on the performance of Telerobotics. To that end, I added detailed instructions describing my machine and network configuration, which can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/machine-configurations.rst?at=default">found here</a>.</p>
<h2>Docker Working!</h2>
<p>I explained the importance of Docker in this project in a <a href="http://siddhantsci.org/blog/2015/06/12/all-for-docker-docker-for-all/">previous post</a>. Franco started the ball rolling by telling me how the <a href="http://www.itsprite.com/openstack-docker-for-gui-based-environments/">ssh-to-image</a> method could be used for running Qt applications in Docker. ROS and Gazebo employ Qt extensively for their visualization and simulation applications, so Qt support was a non-functional requirement of Telerobotics. With that, the long-standing Docker issue was solved. The final Docker image, with everything packaged, can be used to test Telerobotics. The image can be pulled <a href="https://hub.docker.com/r/sidcode/ros-eras/">from here</a>. The instructions to use the image are in the <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/docker-instructions.rst?at=default">Telerobotics Documentation pages</a>.</p>
<p>A walkthrough with the Docker image can be found in this YouTube video that I created -</p>
<p><div class="videobox">
<iframe width="640" height="390"
src='https://www.youtube.com/embed/cyGshc9RLoQ'
frameborder='0' webkitAllowFullScreen mozallowfullscreen
allowFullScreen>
</iframe>
</div></p>
<h2>Fallback Keyboard Teleoperation</h2>
<p>Telerobotics works out of the box with the Bodytracking module that Vito has developed. But in the unfortunate case that the Tango-Controls server fails, a <strong>fallback interface</strong> becomes a functional requirement. Seeking inspiration from the teleoperation tools for ROS, I added the Fallback Keyboard Teleoperation interface, so the Rover can now also be controlled with the keyboard if need be. The controls are currently inclined towards right-handed astronauts; I hope to add a left-handed version soon as a minor extension of the interface.
The code for this can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/src/teleoperation-keyboard.py?at=default">found here</a>.</p>
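<p>To give an idea of how such an interface works, here is a minimal, hypothetical sketch in the spirit of the ROS teleoperation tools - the key bindings, topic name, and speeds are illustrative assumptions, not the actual contents of <code>teleoperation-keyboard.py</code> -</p>
<div class="highlight"><pre>import sys, termios, tty

import rospy
from geometry_msgs.msg import Twist

# Right-handed layout (illustrative): i/k drive forward/back, j/l turn left/right.
BINDINGS = {'i': (0.5, 0.0), 'k': (-0.5, 0.0), 'j': (0.0, 0.5), 'l': (0.0, -0.5)}

def get_key():
    # Read a single raw keypress from the terminal.
    fd = sys.stdin.fileno()
    old_settings = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)

rospy.init_node('keyboard_teleop_sketch')
pub = rospy.Publisher('/husky/cmd_vel', Twist, queue_size=1)  # topic name assumed

while not rospy.is_shutdown():
    key = get_key()
    if key == 'q':   # quit
        break
    linear, angular = BINDINGS.get(key, (0.0, 0.0))
    cmd = Twist()
    cmd.linear.x, cmd.angular.z = linear, angular
    pub.publish(cmd)
</pre></div>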
<h2>EUROPA and Navigation Interfaces</h2>
<p>Shridhar's work on the <a href="https://code.google.com/p/europa-pso/">EUROPA platform</a> needed access to the Telerobotics interface for the following tasks -</p>
<ul>
<li>Getting Robot Diagnostic Information</li>
<li>Navigating the Robot to certain points</li>
</ul>
<p>I achieved the initial goal before midsems. The second goal was achieved this week after the EUROPA Planner was complete. The workflow to this end was to receive coordinates from the EUROPA Tango Server and send them to the ROS Node corresponding to the Husky.</p>
<p>Navigating between two points on an incompletely-known map first requires solving the localization problem, which is done using <a href="http://wiki.ros.org/amcl">Adaptive Monte Carlo Localization (AMCL)</a>.</p>
<p>It is necessary to localize the rover with respect to its environment based on the inputs of its multiple sensors. The following diagram from the ROS website explains the concept -
<img alt="ROS Localization" src="http://siddhantsci.org/images/ros_localization.png" /></p>
<p>I used the Husky frame coordinates and added the code using the <a href="http://wiki.ros.org/actionlib">ROS Action Server and Action Client</a> and Tango Event Listeners to create the appropriate <strong>Telerobotics-EUROPA interfaces</strong>. It can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/src/europa-navigation.py?at=default">found here</a>.</p>
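<p>For a sense of what the Action Client side of this looks like, here is a minimal sketch of sending a navigation goal with <code>actionlib</code> - the frame name and the example coordinates are assumptions for illustration, not the actual ERAS code -</p>
<div class="highlight"><pre>import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def navigate_to(x, y):
    # Send a navigation goal (e.g. coordinates received from EUROPA) to the rover.
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'   # assumed to be the AMCL map frame
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # no particular heading

    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

rospy.init_node('europa_navigation_sketch')
navigate_to(2.0, 3.5)
</pre></div>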
<h2>Minoru Camera Tools</h2>
<p>The Minoru 3-D Camera that I used to prototype <a href="">streaming applications</a> for ERAS has obscure documentation for Linux platforms. I was able to set up the Minoru calibration tools from a <a href="https://github.com/bashrc/libv4l2cam">Git clone</a> of the <a href="https://code.google.com/p/sentience/wiki/MinoruWebcam">original</a> <code>v4l2stereo</code> package. I added them to the <code>streams</code> tree of the Telerobotics source code. They can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/streams/Minoru3D/v4l2stereo-calibrate-minoru/?at=default">accessed here</a>.</p>
<h2>Documentation!</h2>
<p>The documentation underwent a major overhaul this week. In addition to <strong>commenting the code</strong> since the beginning, I made sure to update or add the following documentation pages -</p>
<ul>
<li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/sad.rst?at=default">Software Architecture Document for Telerobotics</a></li>
<li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/docker-instructions.rst?at=default">Docker Image Setup Instructions</a></li>
<li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/telerobotics-guide.rst?at=default">Telerobotics Walkthrough</a></li>
<li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/setup-minoru.rst?at=default">Minoru Camera Calibration and Instructions</a></li>
<li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/erasvr/doc/setup.rst?at=default">Oculus Rift Troubleshooting and Installation</a></li>
<li><a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/setup-ffmpeg.rst?at=default">Video Streaming FFmpeg Manual</a></li>
</ul>
<p>The latest version of the documentation can be <a href="https://bitbucket.org/italianmarssociety/eras/src/402e47bccb3e329c95596a06f8cc66cafcaa0658/servers/telerobotics/doc/?at=default">found here</a>.</p>
<p>The excitement of the final moments can be ascertained from my commit patterns on the last day -</p>
<p><img alt="Archive Tagging" src="http://siddhantsci.org/images/gsoc_ends.png" /></p>
<blockquote>
<p>Learning Experience</p>
</blockquote>
<p>The past 12 weeks (and an almost equivalent period before that, during the application phase) have been transformative.</p>
<p>Just to get an idea of the different tools and concepts that I've been exposed to, here's a list -</p>
<ul>
<li>Tango Controls</li>
<li>Robot Operating System</li>
<li>Blender Game Engine</li>
<li>Oculus Rift</li>
<li>FFmpeg</li>
<li>Stereoscopic Cameras</li>
<li>Video4Linux2</li>
<li>Python</li>
<li>OpenVPN</li>
<li>Docker</li>
</ul>
<p>That indicates a great deal of experience in terms of tools alone.</p>
<blockquote>
<p>I learned how to create software architecture documents, how to work in tandem with other developers, how to communicate in the Open Source Community, when to seek help, how to seek help, how to help others, how to document my work, how to blog, and much more.</p>
</blockquote>
<p>With so many things to say, here's what I must definitely acknowledge -</p>
<blockquote>
<p>Thank you Python Software Foundation, Italian Mars Society, and Google Open Source Programs Office for this opportunity!</p>
</blockquote>
<p>I seriously can't imagine a better way in which I could have spent the past summer. I got a chance to pursue what I wanted to do, got an amazing mentoring and umbrella organization, a fascinating group of peers to work with, and arguably the best launchpad for Open Source contributions - the Google Summer of Code.</p>
<p>Time for evaluations now! Fingers crossed :-)</p>
<p>I have maintained a weekly-updated blog since the beginning of this summer of code. My organization required the blog frequency to be one post every two weeks. I loved blogging about my progress throughout. The eighteen posts so far can be found in the <a href="http://siddhantsci.org/category/gsoc.html">GSoC category of my website</a>.
In case you are interested in this project with the Italian Mars Society, you can follow the <a href="http://siddhantsci.org/tag/ims.html">IMS tag page of my blog</a>.</p>
<p>Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Telerobotics - The Penultimate Crescendo</title><link href="http://siddhantsci.org/blog/2015/08/14/telerobotics-the-penultimate-crescendo/" rel="alternate"></link><updated>2015-08-14T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-08-14:blog/2015/08/14/telerobotics-the-penultimate-crescendo/</id><summary type="html"><p>Hi! As the hard-deadline date for the <em>Google Summer of Code</em> program draws to a close, I can feel the palpable tension that is shared by my mentors and fellow students at the <strong>Italian Mars Society</strong> and the <strong>Python Software Foundation</strong>.</p>
<h1>All-Hands Meeting</h1>
<p>We at the <em>Italian Mars Society</em> had the third all-hands meeting last evening (13th August). The almost two-hour Skype conference call covered a gamut of topics in depth. Some of these were -</p>
<h2><strong>Software Testing</strong> guidelines</h2>
<p>Ezio described the various ways of <strong>Unit Testing</strong> in different applications like rover movements, bodytracking, etc. In my case, I had been checking for setup prerequisites and establishing the serializability of the ROS system before other modules could start up. That way, it is ensured that all the required distributed systems are up and running before they are used. <strong>Integration Testing</strong> is crucial in the ERAS application, where things like Telerobotics, Bodytracking, and the EUROPA Planner all blend together seamlessly. I've integrated Telerobotics and Bodytracking, which can be observed in <a href="https://bitbucket.org/italianmarssociety/eras/commits/e87a0c1bfb46e0b8ba4b684b51060f8527aa1d6b">this commit</a>.</p>
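<p>A minimal sketch of such a prerequisite check - assuming the <code>rosgraph</code> package is available on the machine - could look like this:</p>
<div class="highlight"><pre>import rosgraph

def ros_is_ready():
    # True only if a ROS master is reachable at the configured ROS_MASTER_URI.
    return rosgraph.is_master_online()

if not ros_is_ready():
    raise RuntimeError('ROS master is not running; start roscore first.')
</pre></div>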
<h2>Telerobotics</h2>
<p>Telerobotics in its current state is more precise than ever. This video demonstrates this fact -</p>
<p><div class="videobox">
<iframe width="640" height="390"
src='https://www.youtube.com/embed/94vfIr1cu7k'
frameborder='0' webkitAllowFullScreen mozallowfullscreen
allowFullScreen>
</iframe>
</div></p>
<p>The YouTube link for the video is <a href="https://www.youtube.com/watch?v=94vfIr1cu7k">this</a>.</p>
<p>I improved upon the previous integration with Bodytracking and handled the possible exceptions that may occur. The results have been stunning. I used the updated version of <a href="https://vigentile.wordpress.com/2015/07/31/enhancement-of-kinect-integration-in-v-eras-fifth-report/">Vito's bodytracker</a> which can detect closed hands. Since the sensor refresh-rate has been reduced to 30 times per second, the Telerobotics module has much smoother movements. Here is a snapshot of the Bodytracking application running in a Windows Virtual Machine -</p>
<p><img alt="Visual Tracker" src="http://siddhantsci.org/images/visual-3.png" /></p>
<h2>EUROPA Planner and Navigation Integration</h2>
<p>Shridhar has been working on the Planner, which outputs Cartesian coordinates in the format <code>(x,y)</code> to which the rover must navigate. I am using the <code>AMCL</code> localization algorithm for known maps in addition to the <code>actionlib</code> server of ROS to facilitate this integration. The challenge here is to resolve between the Cartesian coordinates of <code>EUROPA</code> and those of <code>ROS</code>. This should hopefully be complete in the next couple of days.</p>
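<p>A minimal sketch of that coordinate hand-off - the frame name and the offset between the EUROPA and ROS origins are assumptions for illustration -</p>
<div class="highlight"><pre>import rospy
from geometry_msgs.msg import PoseStamped

# Hypothetical translation between the EUROPA origin and the ROS map origin.
EUROPA_ORIGIN_OFFSET = (0.0, 0.0)

def europa_to_ros(x, y):
    # Convert EUROPA planner coordinates into a ROS pose in the map frame.
    pose = PoseStamped()
    pose.header.frame_id = 'map'
    pose.header.stamp = rospy.Time.now()
    pose.pose.position.x = x + EUROPA_ORIGIN_OFFSET[0]
    pose.pose.position.y = y + EUROPA_ORIGIN_OFFSET[1]
    pose.pose.orientation.w = 1.0
    return pose
</pre></div>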
<h2>AMADEE15 mission</h2>
<p>Yuval described how the recently concluded mission was a huge success, focusing on the following frontiers -</p>
<ul>
<li>GPS integration with Blender</li>
<li>Photogrammetry to reproduce Blender scenes for Virtual EVAs.</li>
<li>Unity3D and Oculus Integration</li>
<li>AoudaX realtime software</li>
<li>Generic ERAS Data Logger</li>
<li>Husky navigation</li>
</ul>
<p>Franco explained in brief about the <em>Neuro-vestibular</em> and <em>Husky</em> Scientific experiments.</p>
<h2>Other things</h2>
<p>Final efforts with Docker - after a lot of success, I have just one gripe with <strong>Docker</strong>. Running the Gazebo simulator, <code>rviz</code> (the ROS visualizer), and the Telerobotics module requires THREE terminals. Working with ROS as a master inherently requires access to a lot of terminals for logging, echoing topic messages, starting programs, etc. The current ways to achieve multiple terminals and Qt applications in Docker are at best makeshift workarounds. To handle a graphics-heavy application like Telerobotics, we require a graphical environment. Docker is great for providing a common service framework but not so good for graphical applications like ROS. That's why I have been unable to get Docker working with the graphical aspects of ROS.</p>
<h1>Documentation</h1>
<p>In the final leg of the program, it is vital to go all-guns-blazing with the documentation of the software work that the students do. This is to ensure future development, maintainability, and clarity of thought. I recently added instructions in the Documentation directory - <code>telerobotics/doc/</code> to <strong>replicate my setup</strong>. This can be found in my <a href="https://bitbucket.org/italianmarssociety/eras/commits/dcd59b09d09f2e782e8c596f9f76b529eafd2151">current commit</a>.</p>
<p>I am ensuring that my mentors will be able to replicate my setup and give feedback very soon. The last week of GSoC is quite frenzied, with everyone racing to produce a consistent wrap-up of the project. The next post will officially be the last post of my GSoC 2015 experience. In reality, of course, I will keep working on the project and keep blogging :)</p>
<p>Till then, ciao.</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Fine-tuning Telerobotics</title><link href="http://siddhantsci.org/blog/2015/08/07/fine-tuning-telerobotics/" rel="alternate"></link><updated>2015-08-07T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-08-07:blog/2015/08/07/fine-tuning-telerobotics/</id><summary type="html"><p>Hi! As discussed in the previous week, I have been <strong>able</strong> to get the <strong>integration of Telerobotics and Bodytracking</strong> up and running. Huge Victory :) Let me say the same thing in a much bolder typeface -</p>
<h2>Integration Successful!</h2>
<p>The following screenshot demonstrates what I'm talking about -</p>
<p><img alt="Telerobotics Integration" src="http://siddhantsci.org/images/telerobotics-0.png" /></p>
<h2>Screen recording - YouTube Video</h2>
<p>I used the same tool for screen capturing this integration that I used for real-time streaming from a 3-D camera. The output is as follows -</p>
<p><div class="videobox">
<iframe width="640" height="390"
src='https://www.youtube.com/embed/T3qbZaGvYao'
frameborder='0' webkitAllowFullScreen mozallowfullscreen
allowFullScreen>
</iframe>
</div></p>
<p>If my blogging platform is unable to embed the video on the page, you could <a href="https://youtu.be/T3qbZaGvYao">use this link</a> to watch the first version of Telerobotics and Bodytracking integration. The Visual Tracker designed by Vito looks like this -</p>
<p><img alt="Visual Tracker" src="http://siddhantsci.org/images/visual-2.png" /></p>
<h2>Current Status</h2>
<p>It is evident from the video that the setup is functional but not efficient. Moreover, it is buggy. The velocity values are well outside the range that ROS can accept, which results in jerks in Husky's motion. There is also a disparity between the refresh rates of ROS and Tango-Controls, which shows up as the Device intermittently becoming unavailable.</p>
<p>I strongly hope to report these issues as solved in the next post. Of all the <strong>aha</strong> moments that I have been privy to, watching the integration work was probably the biggest one of them all. It looks futuristic to me. With the Internet of Everything, a lot of things are going to use teleoperation. I am so glad that we at the Italian Mars Society are gauging future trends and experimenting with them in the present, and I am honored to be facilitating that experiment.</p>
<p>My next post is surely going to be a much more exciting run-down on how Telerobotics progresses :)</p>
<p>Stay Tuned. Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Telerobotics and Bodytracking - The Rendezvous</title><link href="http://siddhantsci.org/blog/2015/07/31/telerobotics-and-bodytracking-the-rendezvous/" rel="alternate"></link><updated>2015-07-31T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-07-31:blog/2015/07/31/telerobotics-and-bodytracking-the-rendezvous/</id><summary type="html"><p>Hi! The past week was a refreshingly positive one. I was able to solve some of the insidious issues that were plaguing the efforts that I was putting in last week.</p>
<h2>Virtual Machine Networking issues Solved!</h2>
<p>I was able to use the Tango server across the Windows 7 Virtual Machine and the Tango host on my Ubuntu 14.04 host machine. The proper networking mode for this turns out to be <strong>Bridged Networking mode</strong>, which essentially puts the Virtual Machine on the same network as the host.</p>
<p>In bridged mode, the Virtual Machine exposes a virtual network interface with its own IP address and networking stack. In my case it was <code>vmnet8</code>, with an IP address outside the ranges used by the <em>real</em> Ethernet and WiFi network interface cards. Using bridged mode, I was able to keep the Tango Device Database server on Ubuntu and use Vito's Bodytracking device on Windows. The Virtual Machine didn't noticeably slow down communication across the Tango devices.</p>
<p>This image explains what I'm talking about -</p>
<p><img alt="Jive on Windows and Ubuntu machines" src="http://siddhantsci.org/images/jive_windows_ubuntu.png" /></p>
<p>In bridged mode, I chose the IP Address on the host which corresponds to the Virtual Machine interface - <code>vmnet8</code> in my case. I used the <code>vmnet8</code> interface on Ubuntu and a similar interface on the Windows Virtual Machine. I read quite a bit about how Networking works in Virtual Machines and was fascinated by the Virtualization in place.</p>
<h2>Bodytracking meets Telerobotics</h2>
<p>With Tango up and running, I had to ensure that <a href="https://vigentile.wordpress.com/2015/07/31/enhancement-of-kinect-integration-in-v-eras-fifth-report/">Vito's Bodytracking application</a> works on the Virtual Machine. To that end, I installed <em>Kinect for Windows SDK</em>, <em>Kinect Developer Tools</em>, <em>Visual Python</em>, <em>Tango-Controls</em>, and <em>PyTango</em>. Setting a new <em>virtual</em> machine up mildly slowed me down but was a necessary step in the development.</p>
<p>Once I had that bit running, I was able to visualize the <strong>simulated Martian Motivity walk done in Innsbruck</strong> in a training station. The Bodytracking server created by Vito <em>published</em> events corresponding to the <code>moves</code> attribute, which is a list of the following two metrics -</p>
<ul>
<li>Position</li>
<li>Orientation</li>
</ul>
<p>I was able to read the attributes that the Bodytracking device was publishing by <strong>subscribing</strong> to Event Changes to that attribute. This is done in the following way -</p>
<div class="highlight"><pre> <span class="k">while</span> <span class="n">TRIGGER</span><span class="o">:</span>
<span class="err">#</span> <span class="n">Subscribe</span> <span class="n">to</span> <span class="n">the</span> <span class="err">&#39;</span><span class="n">moves</span><span class="err">&#39;</span> <span class="n">event</span> <span class="n">from</span> <span class="n">the</span> <span class="n">Bodytracking</span> <span class="n">interface</span>
<span class="n">moves_event</span> <span class="o">=</span> <span class="n">device_proxy</span><span class="p">.</span><span class="n">subscribe_event</span><span class="p">(</span>
<span class="err">&#39;</span><span class="n">moves</span><span class="err">&#39;</span><span class="p">,</span>
<span class="n">PyTango</span><span class="p">.</span><span class="n">EventType</span><span class="p">.</span><span class="n">CHANGE_EVENT</span><span class="p">,</span>
<span class="n">cb</span><span class="p">,</span> <span class="p">[])</span>
<span class="err">#</span> <span class="n">Wait</span> <span class="k">for</span> <span class="n">at</span> <span class="n">least</span> <span class="n">REFRESH_RATE</span> <span class="n">Seconds</span> <span class="k">for</span> <span class="n">the</span> <span class="n">next</span> <span class="n">callback</span><span class="p">.</span>
<span class="n">time</span><span class="p">.</span><span class="n">sleep</span><span class="p">(</span><span class="n">REFRESH_RATE</span><span class="p">)</span>
</pre></div>
<p>This ensures that the subscriber doesn't exhaust the polled attributes at a rate faster than they are published. If it does, an <code>EventManagerException</code> occurs, which must be handled properly.</p>
<p>Note the <code>cb</code> argument: it refers to the callback function that is triggered when an event change occurs. The callback function is responsible for reading and processing the attributes.</p>
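<p>A minimal sketch of such a callback - the processing hook is hypothetical, not the actual ERAS implementation -</p>
<div class="highlight"><pre>def cb(event):
    # PyTango hands change events to the callback as an EventData object.
    if event.err:
        # Event errors (e.g. from an EventManagerException) surface here; log, don't crash.
        print('Event error: %s' % str(event.errors))
        return
    position, orientation = event.attr_value.value
    process_movement(position, orientation)   # hypothetical processing hook
</pre></div>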
<p>The processing part in our case is the core of the <strong>Telerobotics-Bodytracking interface</strong>. It acts as the intermediary between Telerobotics and Bodytracking, converting the <em>position</em> and <em>orientation</em> values into the <strong>linear and angular velocities</strong> that Husky can understand. I use a high-performance container from the <code>collections</code> module known as <code>deque</code>. It can act both as a stack and a queue via <code>deque.append</code>, <code>deque.appendleft</code>, <code>deque.pop</code>, and <code>deque.popleft</code>.</p>
<blockquote>
<p>To calculate velocity, I compute the differences between consecutive events and their corresponding timestamps. The events are stored in a <code>deque</code>, popped when necessary, and subtracted from the current event values.</p>
</blockquote>
<p>For instance, this is how <strong>linear velocity</strong> processing takes place -</p>
<div class="highlight"><pre># Position and Linear Velocity Processing
position_previous = position_events.pop()
position_current = position
linear_displacement = position_current - position_previous
linear_speed = linear_displacement / time_delta
</pre></div>
<h2>ROS-Telerobotics Interface</h2>
<p>We are halfway through the Telerobotics-Bodytracking architecture. Once the velocities are obtained, we have everything we need to send to ROS. The challenge here is to use velocities which ROS and the Husky UGV can understand. The messages are published to ROS <em>only</em> when there is some change in the velocity. This has the added advantage of minimizing communication between ROS and Tango - when working with multiple distributed systems, it is always wise to keep the communication between them minimal, and that's what I've aimed to do. I'll be enhancing the interface even further by adding trigger overrides for emergency situations. The speeds currently are not ROS-friendly, so I am writing a high-pass and low-pass filter to limit the velocities to what Husky can sustain. Vito and I will be refining the user step estimation and the corresponding robot movements, respectively.</p>
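<p>A minimal sketch of the limiting and publish-on-change filter described above - the limits are illustrative assumptions, not Husky's documented maxima -</p>
<div class="highlight"><pre>MAX_LINEAR = 1.0     # m/s, assumed safe limit
MAX_ANGULAR = 2.0    # rad/s, assumed safe limit
MIN_CHANGE = 0.01    # publish only when the command actually changes

def clamp(value, limit):
    return max(-limit, min(limit, value))

last_cmd = (0.0, 0.0)

def filter_and_publish(linear, angular, publish):
    # Clamp raw Bodytracking velocities into a ROS-friendly range, and keep
    # ROS/Tango chatter minimal by skipping unchanged commands.
    global last_cmd
    cmd = (clamp(linear, MAX_LINEAR), clamp(angular, MAX_ANGULAR))
    if (abs(cmd[0] - last_cmd[0]) &lt; MIN_CHANGE and
            abs(cmd[1] - last_cmd[1]) &lt; MIN_CHANGE):
        return
    last_cmd = cmd
    publish(cmd)
</pre></div>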
<p>GSoC is only becoming more exciting. I'm certain that I will be contributing to this project after GSoC as well. The Telerobotics scenario is full of possibilities, most of which I've tried to cover in my GSoC proposal.</p>
<p>I'm back to my university now and it has become hectic but enjoyably challenging to complete this project. My next post will hopefully be a culmination of the Telerobotics/Bodytracking interface and the integration of 3D streaming with Oculus Rift Virtual Reality.</p>
<p>Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Virtual Machines + Virtual Reality = Real Challenges!</title><link href="http://siddhantsci.org/blog/2015/07/24/virtual-machines-virtual-reality-real-challenges/" rel="alternate"></link><updated>2015-07-24T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-07-24:blog/2015/07/24/virtual-machines-virtual-reality-real-challenges/</id><summary type="html"><p>Hi! For the past couple of weeks, I've been trying to get a lot of things to work. Linux and Computer Networks seem to like me so much that they demand my attention throughout the course of this program. This time it was dynamic libraries, Virtual Machine networking, Docker containers, head-mounted display errors, and so on.</p>
<p>A brief discussion about these:</p>
<h2>Dynamic Libraries, Oculus Rift, and Python Bindings</h2>
<p>Using the open-source Python bindings for the <strong>Oculus SDK</strong> available <a href="https://github.com/jherico/python-ovrsdk">here</a>, Franco and I ran into a problem -</p>
<div class="highlight"><pre><span class="nx">ImportError</span><span class="p">:</span> <span class="o">&lt;</span><span class="nb">root</span><span class="o">&gt;/</span><span class="nx">oculusvr</span><span class="p">/</span><span class="nx">linux</span><span class="na">-x86</span><span class="o">-</span><span class="mi">64</span><span class="p">/</span><span class="nx">libOculusVR.so</span><span class="p">:</span> <span class="nx">undefined</span> <span class="nx">symbol</span><span class="p">:</span> <span class="nx">glXMakeCurrent</span>
</pre></div>
<p>To get to the root of the problem, I tried to list all dependencies of the <strong>shared object file</strong> -</p>
<div class="highlight"><pre> <span class="n">linux</span><span class="o">-</span><span class="n">vdso</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="o">=&gt;</span> <span class="p">(</span><span class="mh">0x00007ffddb388000</span><span class="p">)</span>
<span class="n">librt</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">librt</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="p">(</span><span class="mh">0x00007f6205e1d000</span><span class="p">)</span>
<span class="n">libpthread</span><span class="p">.</span><span class="n">so</span><span class="mf">.0</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libpthread</span><span class="p">.</span><span class="n">so</span><span class="mf">.0</span> <span class="p">(</span><span class="mh">0x00007f6205bff000</span><span class="p">)</span>
<span class="n">libX11</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libX11</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="p">(</span><span class="mh">0x00007f62058ca000</span><span class="p">)</span>
<span class="n">libXrandr</span><span class="p">.</span><span class="n">so</span><span class="mf">.2</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libXrandr</span><span class="p">.</span><span class="n">so</span><span class="mf">.2</span> <span class="p">(</span><span class="mh">0x00007f62056c0000</span><span class="p">)</span>
<span class="n">libstdc</span><span class="o">++</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libstdc</span><span class="o">++</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="p">(</span><span class="mh">0x00007f62053bc000</span><span class="p">)</span>
<span class="n">libm</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libm</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="p">(</span><span class="mh">0x00007f62050b6000</span><span class="p">)</span>
<span class="n">libgcc_s</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libgcc_s</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="p">(</span><span class="mh">0x00007f6204ea0000</span><span class="p">)</span>
<span class="n">libc</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libc</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="p">(</span><span class="mh">0x00007f6204adb000</span><span class="p">)</span>
<span class="o">/</span><span class="n">lib64</span><span class="o">/</span><span class="n">ld</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">x86</span><span class="o">-</span><span class="mf">64.</span><span class="n">so</span><span class="mf">.2</span> <span class="p">(</span><span class="mh">0x00007f6206337000</span><span class="p">)</span>
<span class="n">libxcb</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libxcb</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="p">(</span><span class="mh">0x00007f62048bc000</span><span class="p">)</span>
<span class="n">libdl</span><span class="p">.</span><span class="n">so</span><span class="mf">.2</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libdl</span><span class="p">.</span><span class="n">so</span><span class="mf">.2</span> <span class="p">(</span><span class="mh">0x00007f62046b8000</span><span class="p">)</span>
<span class="n">libXext</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libXext</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="p">(</span><span class="mh">0x00007f62044a6000</span><span class="p">)</span>
<span class="n">libXrender</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libXrender</span><span class="p">.</span><span class="n">so</span><span class="mf">.1</span> <span class="p">(</span><span class="mh">0x00007f620429c000</span><span class="p">)</span>
<span class="n">libXau</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libXau</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="p">(</span><span class="mh">0x00007f6204098000</span><span class="p">)</span>
<span class="n">libXdmcp</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="o">=&gt;</span> <span class="o">/</span><span class="n">usr</span><span class="o">/</span><span class="n">lib</span><span class="o">/</span><span class="n">x86_64</span><span class="o">-</span><span class="n">linux</span><span class="o">-</span><span class="n">gnu</span><span class="o">/</span><span class="n">libXdmcp</span><span class="p">.</span><span class="n">so</span><span class="mf">.6</span> <span class="p">(</span><span class="mh">0x00007f6203e92000</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXMakeCurrent</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glEnable</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glFrontFace</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glDisable</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glClear</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glGetError</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXDestroyContext</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXCreateContext</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glClearColor</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXGetCurrentContext</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXSwapBuffers</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glColorMask</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glBlendFunc</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glBindTexture</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glDepthMask</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glDeleteTextures</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glGetIntegerv</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXGetCurrentDrawable</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glDrawElements</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glTexImage2D</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXGetClientString</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glDrawArrays</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glGetString</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glXGetProcAddress</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glViewport</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glTexParameteri</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glGenTextures</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
<span class="n">undefined</span> <span class="n">symbol</span><span class="o">:</span> <span class="n">glFinish</span> <span class="p">(.</span><span class="o">/</span><span class="n">libOculusVR</span><span class="p">.</span><span class="n">so</span><span class="p">)</span>
</pre></div>
<p>This clearly implied one thing - <strong>libGL</strong> was not being linked. My task then was to <em>somehow</em> link libGL to the SO file that came with the Python Bindings. I tried out the following two options -</p>
<ul>
<li><strong>Creating my own bindings</strong>: Tried to regenerate the SO file from the Oculus C SDK by using the amazing <a href="https://github.com/davidjamesca/ctypesgen">Python Ctypesgen</a>. This method didn't work out, as I couldn't resolve the <em>header</em> files that are required by <em>Ctypesgen</em>. Nevertheless, I learned how to create Python bindings, and that is a huge take-away from the exercise. I had always wondered how Python interfaces are created out of programs written in other languages.</li>
<li><strong>Making the existing shared object file believe that it is linked to libGL</strong>: So here's what I did - after a lot of searching, I found the nifty little environment variable that worked wonders for our Oculus development - <strong>LD_PRELOAD</strong></li>
</ul>
<p>As <a href="https://rafalcieslak.wordpress.com/2013/04/02/dynamic-linker-tricks-using-ld_preload-to-cheat-inject-features-and-investigate-programs/">this</a> and <a href="http://blog.chaselambda.com/2014/11/28/how-tmux-starts-up-an-adventure-with-linux-tools-updated.html">this</a> article delineate, LD_PRELOAD makes it possible to force-load a dynamically linked shared object into memory.
If you set LD_PRELOAD to the path of a shared object, that file will be loaded before any other library (including the C runtime, libc.so). For example, to run <code>ls</code> with your special malloc() implementation, do this:</p>
<p><code>$ LD_PRELOAD=/path/to/my/malloc.so /bin/ls</code></p>
<p>Thus, the solution to my problem was to place this in the <code>.bashrc</code> file -</p>
<p><code>export LD_PRELOAD="/usr/lib/x86_64-linux-gnu/libGL.so"</code></p>
<p>This allowed Franco to create the Oculus Test Tango server and ensured that our Oculus Rift development efforts continue with gusto.</p>
<h2>ROS and Autonomous Navigation</h2>
<p>On the programming side, I've been playing around with <code>actionlib</code> to interface Bodytracking with Telerobotics. I have created a simple walker script which provides a certain degree of autonomy to the robot, overriding human teleoperation commands to avoid collisions with objects. An obstacle could be a Martian rock in a simulated environment or uneven terrain with a possible ditch ahead. To achieve this, I use the <code>LaserScan</code> message and check the range readings at frequent intervals. The <em>LIDAR</em> readings establish that the robot is in one of the following states -</p>
<ul>
<li>Approaching an obstacle</li>
<li>Going away from an obstacle</li>
<li>Hitting an obstacle</li>
</ul>
<p>The state can be inferred from the LaserScan messages. A ROS Action Server then waits for one of these events to happen and triggers the callback, which tells the robot to stop, turn, and continue.</p>
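<p>A minimal sketch of this kind of walker - a plain subscriber stands in for the Action Server of the real script, and the topic names and distance threshold are illustrative assumptions -</p>
<div class="highlight"><pre>import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

SAFE_DISTANCE = 1.0   # metres; hypothetical threshold for "approaching an obstacle"

def scan_callback(scan):
    # Keep only readings inside the sensor's valid range.
    valid = [r for r in scan.ranges if scan.range_min &lt; r &lt; scan.range_max]
    if not valid:
        return
    cmd = Twist()
    if min(valid) &lt; SAFE_DISTANCE:
        cmd.angular.z = 0.5    # obstacle ahead: stop and turn
    else:
        cmd.linear.x = 0.3     # path clear: keep moving forward
    cmd_pub.publish(cmd)

rospy.init_node('simple_walker_sketch')
cmd_pub = rospy.Publisher('/husky/cmd_vel', Twist, queue_size=1)  # topic assumed
rospy.Subscriber('/husky/scan', LaserScan, scan_callback)
rospy.spin()
</pre></div>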
<h2>Windows and PyKinect</h2>
<p>In order to run Vito's bodytracking code, I needed a Windows installation. After running into problems with a 32-bit Windows 7 Virtual Machine image I had, I had to reinstall and use a 64-bit Virtual Machine image. I installed all the dependencies to run the bodytracking code. I am still stuck on networking modes between the Virtual Machine and the host machine. The Tango host needs to be configured correctly so that TANGO_MASTER points to the host and TANGO_HOST to the virtual machine.</p>
<h2>Docker and Qt Apps</h2>
<p>Qt applications don't seem to work with display sharing in a Docker container. The way out is to create users in the Docker container, which I'm currently doing. I'll enable VNC and X-forwarding to allow the ROS Qt applications to work, so that the other members of the Italian Mars Society can use the Docker container directly.</p>
<h2>Gazebo Mars model</h2>
<p>I took a brief look at the 3D models of Martian terrain available for free use on the Internet. I'll be trying to obtain the Gale Crater region and represent it in Gazebo to drive the Husky across Martian terrain.</p>
<h2>Documentation week!</h2>
<p>In addition to strong-arming my CS concepts against the Networking and Linux issues that loom over the project currently, I updated and added documentation for the modules developed so far.</p>
<p>Hope the next post explains how I solved the problems described in this post. Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Streamed away (in Real-Time)!</title><link href="http://siddhantsci.org/blog/2015/07/16/streamed-away-in-real-time/" rel="alternate"></link><updated>2015-07-16T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-07-16:blog/2015/07/16/streamed-away-in-real-time/</id><summary type="html"><p>Hi! This post is <em>all</em> about <strong>Video Streaming and Cameras</strong> :-) If you've wondered how services like YouTube Live or twitch.tv work, then this post is for you. After the <em>Innsbruck experiments</em> and <a href="http://siddhantsci.org/blog/2015/07/08/remote-tests-in-telerobotics/">Remote tests in Telerobotics</a>, it was time for me to create a full-fledged Real Time Video Streaming solution for the ERAS project. After a lot of frustration and learning, I've been able to achieve the following milestones - </p>
<ol>
<li>Stream losslessly from a single camera in real-time to a Blender Game Engine instance.</li>
<li>Create example Blender projects to test <em>multiple video sources</em> streaming over a network.</li>
<li>Record a <strong>live stream</strong> from a <strong>stereoscopic camera</strong> into a side-by-side video encoded on the fly. </li>
</ol>
<p>It's going to be a very long post as I've been playing around with lots of video streaming stuff. All this experience has turned me into a confident Multimedia streamer.</p>
<h2>Why am I doing this?</h2>
<p>Integrating <em>Augmented</em> and <em>Virtual Reality</em> requires one to know the nitty-gritty of <strong>Multimedia Streaming</strong>. This week was spent learning and tinkering with the various options provided by <a href="https://ffmpeg.org/">FFmpeg</a> and <a href="linuxtv.org/downloads/v4l-dvb-apis/">Video4Linux2</a>. One of the aims of the Telerobotics project is to allow streaming of Rover Camera input to the Astronaut's Head-Mounted Device (<strong>Minoru 3D</strong> camera and <strong>Oculus Rift</strong> in my case). The streamed video has multiple uses -</p>
<ol>
<li>It is used by the various Tango servers (Planning, Vision, Telerobotics, etc) and processed to obtain Semantic relationships between objects in the Martian environment.</li>
<li>The video, along with the LIDAR and other sensing devices, is the interface to the human world from the ERAS habitat on Mars. The video stream provides a window to Mars.</li>
<li>The real-time stream helps the astronaut and the simulated astronaut to guide the rover and the simulated rover around on Mars.</li>
<li>Streaming is an integral component of both ERAS and V-ERAS which we at the Italian Mars Society are currently working on. </li>
</ol>
<h2>Initial Impressions</h2>
<p>When I started with 3D streaming, it <em>appeared</em> easy. "I did it with a single camera, two cameras can't be a huge deal, right?" <em>I had never been so wrong.</em> I found myself stuck in the usual struggle between an embedded device and the Linux kernel interface -</p>
<ul>
<li>The hardware of a desktop machine is often unsuitable for streaming applications. </li>
<li>The kernel is not configured out of the box to use multiple webcams</li>
<li>This results in lots of <strong>memory-related</strong> errors - <code>insufficient memory</code>, <code>rt_underflow</code></li>
</ul>
<p>To tweak the Minoru and strike an optimal settings agreement with this cute little stereo camera, I began to dig into the core software components involved -</p>
<h2>Video4Linux2 saves the day!</h2>
<p>Video4Linux is the driver framework that makes it possible for Linux users to use video capture devices (webcams and streaming equipment). It supports many features; the ones this project is concerned with are -</p>
<ul>
<li>Video Capture/Output and Tuning (<code>/dev/videoX</code>, streaming and control)</li>
<li>Video Capture and Output overlay (<code>/dev/videoX</code>, control)</li>
<li>Memory-to-Memory (Codec) devices (<code>/dev/videoX</code>)</li>
</ul>
<p><a href="https://archive.fosdem.org/2014/schedule/event/v4l_intro/">These slides</a> by Hans Verkuil (Cisco Systems) are and informative entry point for understanding how Video4Linux works.</p>
<p>The different Streaming Modes supported by Video4Linux are -</p>
<ul>
<li>Read/Write (<strong>Supported by Minoru</strong>) </li>
<li>Memory Mapped Streaming I/O (<strong>Supported by Minoru</strong>)</li>
<li>User Pointer Streaming I/O</li>
<li>DMA (Direct Memory Access) Buffer Streaming I/O</li>
</ul>
<p>The take-away from Video4Linux is an understanding of how streaming works. A stream requires the following - queue setup, buffer preparation, starting and stopping the stream, waiting to prepare and waiting to finish, compression and encoding of the input stream, transmission/feeding over a channel, decompression and decoding of the received stream, and facilities for playback and time-seek.</p>
<p>The Qt frontend to <code>v4l2</code> made me realize where the problem with the camera lay -</p>
<p><img alt="Qv4l2 Minoru" src="http://siddhantsci.org/images/minoru-qv4l2.jpg" /></p>
<p>The <code>video4linux2</code> specification allows for querying and configuring <strong>everything</strong> about Video Capture Cards. The nifty command-line utility <code>v4l2-ctl</code> is a lifesaver while debugging cameras.</p>
<p>For instance, with the stereo camera connected, <code>v4l2-ctl --list-devices</code> gives -</p>
<div class="highlight"><pre><span class="n">Vimicro</span> <span class="n">USB2</span><span class="mf">.0</span> <span class="n">PC</span> <span class="n">Camera</span> <span class="p">(</span><span class="n">usb</span><span class="o">-</span><span class="mo">0000</span><span class="o">:</span><span class="mo">00</span><span class="o">:</span><span class="mf">14.0</span><span class="o">-</span><span class="mf">1.1</span><span class="p">)</span><span class="o">:</span>
<span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">video1</span>
<span class="n">Vimicro</span> <span class="n">USB2</span><span class="mf">.0</span> <span class="n">PC</span> <span class="n">Camera</span> <span class="p">(</span><span class="n">usb</span><span class="o">-</span><span class="mo">0000</span><span class="o">:</span><span class="mo">00</span><span class="o">:</span><span class="mf">14.0</span><span class="o">-</span><span class="mf">1.4</span><span class="p">)</span><span class="o">:</span>
<span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">video2</span>
<span class="n">WebCam</span> <span class="n">SC</span><span class="o">-</span><span class="mi">13</span><span class="n">HDL11939N</span> <span class="p">(</span><span class="n">usb</span><span class="o">-</span><span class="mo">0000</span><span class="o">:</span><span class="mo">00</span><span class="o">:</span><span class="mi">1</span><span class="n">a</span><span class="mf">.0</span><span class="o">-</span><span class="mf">1.4</span><span class="p">)</span><span class="o">:</span>
<span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">video0</span>
</pre></div>
<div class="highlight"><pre><span class="n">v4l2</span><span class="o">-</span><span class="n">ctl</span> <span class="o">--</span><span class="n">list</span><span class="o">-</span><span class="n">frameintervals</span><span class="o">=</span><span class="n">width</span><span class="o">=</span><span class="mi">640</span><span class="p">,</span><span class="n">height</span><span class="o">=</span><span class="mi">480</span><span class="p">,</span><span class="n">pixelformat</span><span class="o">=</span><span class="err">&#39;</span><span class="n">YUYV</span><span class="err">&#39;</span>
</pre></div>
<p>gives</p>
<div class="highlight"><pre><span class="n">ioctl</span><span class="o">:</span> <span class="n">VIDIOC_ENUM_FRAMEINTERVALS</span>
<span class="n">Interval</span><span class="o">:</span> <span class="n">Discrete</span> <span class="mf">0.033</span><span class="n">s</span> <span class="o">(</span><span class="mf">30.000</span> <span class="n">fps</span><span class="o">)</span>
<span class="n">Interval</span><span class="o">:</span> <span class="n">Discrete</span> <span class="mf">0.067</span><span class="n">s</span> <span class="o">(</span><span class="mf">15.000</span> <span class="n">fps</span><span class="o">)</span>
</pre></div>
<p>This means that I have to use one of these settings to capture input from the camera, and then transcode it into the desired stream characteristics.</p>
<h2>Knowing your stereoscopic Camera</h2>
<p><img alt="Stereo" src="http://siddhantsci.org/images/stereo-1.png" /></p>
<p>VLC carefully configured to stream the left and right Minoru cameras.</p>
<p><a href="http://www.minoru3d.com/">Minoru 3D</a> webcam uses the following <em>Color Spaces</em> -</p>
<ol>
<li>RGB3</li>
<li>YU12</li>
<li>YV12</li>
<li>YUYV</li>
<li>BGR3</li>
</ol>
<p>Explanations ahead...</p>
<blockquote>
<p>When colors meet computers and humans</p>
</blockquote>
<p>Color spaces are models of 'color organization' that enable reproducible representations of color in different media (analog, digital). Color itself is a physical (observable and measurable) property, but the way we humans perceive it is not the same as the way color-sensing photodiodes capture it or computer monitors reproduce it. (Recursing through these definitions on Wikipedia took me back to middle school.) Translating color from one basis to another requires a data structure known as the <strong>color space</strong>, and the signals from the webcam are encoded into one of them. In case you're wondering - the YUV model describes colors in terms of a <strong>luma (luminance)</strong> component and two chrominance components (U and V); the 2-D UV plane can describe all colors, and YUV can be converted into RGB and vice versa. The YUV422 data format shares U and V values between two pixels. As a result, these values are transmitted to the PC image buffer only once for every two pixels, giving an average transmission rate of 16 bits per pixel.
Capture in the YUV 4:2:2 format is more efficient than in RGB formats, whereas color reproduction on a pixel array is more convenient via RGB.
For the purposes of video streaming from a stereo camera system like the Minoru, an RGB color space is the best option because it performs faster with a codec like MJPEG (Motion JPEG), which is the final requirement for the Blender Game Engine stream. I hope this theoretical explanation adequately describes the challenge I've been trying to crack.</p>
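<p>For the curious, the YUV-to-RGB translation mentioned above boils down to a little arithmetic. A toy sketch using the standard BT.601 coefficients (illustrative, not project code) -</p>
<div class="highlight"><pre>def clamp(c):
    """Keep a channel value inside the displayable 0-255 range."""
    return max(0, min(255, int(round(c))))

def yuv_to_rgb(y, u, v):
    """Convert one pixel from YUV (0-255 per channel) to RGB."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344 * (u - 128) - 0.714 * (v - 128)
    b = y + 1.772 * (u - 128)
    return clamp(r), clamp(g), clamp(b)

# In YUV 4:2:2, two adjacent pixels share one U and one V sample, so the
# four bytes [Y0 U Y1 V] describe two pixels - 16 bits per pixel on average.
y0, u, y1, v = 180, 90, 60, 200
print(yuv_to_rgb(y0, u, v), yuv_to_rgb(y1, u, v))
</pre></div>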
<p>FFmpeg built with <code>v4l2-utils</code> support is used for the Stereo Streaming.</p>
<h2>Experiments with Blender</h2>
<p>I tried capturing the two video devices directly from the Blender Game Engine application. It was a good experience learning about creating basic Blender Games.</p>
<p><img alt="Blender Game" src="http://siddhantsci.org/images/blender-try-two-sources.jpg" /></p>
<p>The workflow to this end was -</p>
<ul>
<li>Create two Cube Meshes</li>
<li>Enable GLSL shading mode</li>
<li>Set Object Shading to <code>Shadeless</code> to enhance brightness</li>
<li>Add Image Textures to both meshes</li>
<li>Add a <code>sensor</code> that <strong>always</strong> triggers <code>True</code>.</li>
<li>Add a Python script controller corresponding to each sensor.</li>
<li>The script to control the right camera of the stereo system is -</li>
</ul>
<div class="highlight"><pre><span class="n">import</span> <span class="n">VideoTexture</span>
<span class="n">import</span> <span class="n">bge</span>
<span class="n">contr</span> <span class="o">=</span> <span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">getCurrentController</span><span class="p">()</span>
<span class="n">obj</span> <span class="o">=</span> <span class="n">contr</span><span class="p">.</span><span class="n">owner</span>
<span class="k">if</span> <span class="n">not</span> <span class="n">hasattr</span><span class="p">(</span><span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">,</span> <span class="err">&#39;</span><span class="n">video</span><span class="err">&#39;</span><span class="p">)</span><span class="o">:</span>
<span class="n">matID</span> <span class="o">=</span> <span class="n">VideoTexture</span><span class="p">.</span><span class="n">materialID</span><span class="p">(</span><span class="n">obj</span><span class="p">,</span> <span class="err">&#39;</span><span class="n">IMimage</span><span class="p">.</span><span class="n">png</span><span class="err">&#39;</span><span class="p">)</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span> <span class="o">=</span> <span class="n">VideoTexture</span><span class="p">.</span><span class="n">Texture</span><span class="p">(</span><span class="n">obj</span><span class="p">,</span> <span class="n">matID</span><span class="p">)</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span> <span class="o">=</span> <span class="n">VideoTexture</span><span class="p">.</span><span class="n">VideoFFmpeg</span><span class="p">(</span><span class="s">&quot;/dev/video2&quot;</span><span class="p">,</span><span class="mi">0</span><span class="p">)</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span><span class="p">.</span><span class="n">scale</span> <span class="o">=</span> <span class="n">True</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span><span class="p">.</span><span class="n">flip</span> <span class="o">=</span> <span class="n">True</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span><span class="p">.</span><span class="n">framerate</span> <span class="o">=</span> <span class="mf">0.2</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span><span class="p">.</span><span class="n">repeat</span> <span class="o">=</span> <span class="o">-</span><span class="mi">1</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span><span class="p">.</span><span class="n">play</span><span class="p">()</span>
<span class="n">print</span><span class="p">(</span><span class="s">&quot;In Video 2 fps: &quot;</span><span class="p">,</span> <span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span><span class="p">.</span><span class="n">framerate</span><span class="p">)</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">refresh</span><span class="p">(</span><span class="n">True</span><span class="p">)</span>
</pre></div>
<p>But it turns out the Blender Game Engine does not provide extensive video device control; it relies on the default settings provided by Video4Linux. Since the Minoru camera is unable to stream both camera outputs at 30 frames per second, Blender simply gives in and compromises by playing the first camera output it receives, while Video4Linux reports <code>Insufficient Memory</code> for the other stream.</p>
<p>The output could only support one camera at a time -
<img alt="Blender cameras" src="http://siddhantsci.org/images/blender-try-two-cameras.jpg" /></p>
<p>The BGE documentation is ambiguous about the use of the VideoTexture module for controlling webcam devices.</p>
<p>It was nevertheless an exciting learning experience about contemporary game design. The take-away was that the Blender Game Engine is unable to handle cameras at the hardware level; network streaming with FFmpeg was the only option.</p>
<h2>FFmpeg - the one-stop-shop for Multimedia</h2>
<p>My search for the perfect tool for streaming ended with FFmpeg. It amazes me how versatile this software is. Some people even call it the <a href="https://sonnati.wordpress.com/2011/08/08/ffmpeg-%E2%80%93-the-swiss-army-knife-of-internet-streaming-%E2%80%93-part-ii/">Swiss-army knife of Internet streaming</a>. So I basically had to work with streams.
Streams are essentially multimedia resources which are identified with the help of a <em>Media Resource Locator</em> (<strong>MRL</strong>). A combination of <code>ffmpeg</code> and <code>ffserver</code> is what I used to achieve the desired results. The stereoscopic stream produced will be used by multiple applications -</p>
<ol>
<li>Streaming to the Head-Mounted Device (currently Oculus Rift)</li>
<li>Processing video of the Martian environment.</li>
<li>Viewing in the ERAS application from ground control.</li>
</ol>
<blockquote>
<p>Why FFmpeg?</p>
</blockquote>
<ul>
<li>It is fast, reliable, and free.</li>
<li>It provides a complete solution from streaming and transcoding to media playback, conversion, and probe analysis.</li>
</ul>
<p>Quoting from its <a href="http://ffmpeg.org/ffmpeg.html">documentation</a> -</p>
<blockquote>
<p>ffmpeg reads from an arbitrary number of input "files" (which can be regular files, pipes, network streams, grabbing devices, etc.), specified by the -i option, and writes to an arbitrary number of output "files", which are specified by a plain output filename. Anything found on the command line which cannot be interpreted as an option is considered to be an output filename. </p>
</blockquote>
<p>I tinkered with loads of <code>ffmpeg</code> options and created a lot of useful junkcode. The good thing about GSoC is that it makes you aware of the open-source influences out there. Throughout this work on streaming, I was motivated by the philosophy of <strong>Andrew Tridgell</strong> who says that <a href="http://samba.org/ftp/tridge/talks/junkcode.pdf">"junkcode can be an important learning tool"</a>.</p>
<div class="highlight"><pre><span class="n">ffmpeg</span> <span class="o">-</span><span class="n">f</span> <span class="n">v4l2</span> <span class="o">-</span><span class="n">framerate</span> <span class="mi">15</span> <span class="o">-</span><span class="n">video_size</span> <span class="mi">640</span><span class="n">x480</span> <span class="o">-</span><span class="n">i</span> <span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">video1</span> <span class="n">outp1</span><span class="p">.</span><span class="n">mp4</span> <span class="o">-</span><span class="n">framerate</span> <span class="mi">15</span> <span class="o">-</span><span class="n">i</span> <span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">video2</span> <span class="n">outp2</span><span class="p">.</span><span class="n">mp4</span>
</pre></div>
<p>This resulted in a steady video stream -</p>
<p>A sample of the encoder's progress at three different frames -</p>
<div class="highlight"><pre><span class="n">frame</span><span class="o">=</span> <span class="mi">1064</span> <span class="n">fps</span><span class="o">=</span> <span class="mi">16</span> <span class="n">q</span><span class="o">=</span><span class="mf">27.0</span> <span class="n">q</span><span class="o">=</span><span class="mf">27.0</span> <span class="n">size</span><span class="o">=</span><span class="mi">631</span><span class="n">kB</span> <span class="n">time</span><span class="o">=</span><span class="mo">00</span><span class="o">:</span><span class="mo">01</span><span class="o">:</span><span class="mf">07.06</span>
<span class="n">frame</span><span class="o">=</span> <span class="mi">1072</span> <span class="n">fps</span><span class="o">=</span> <span class="mi">16</span> <span class="n">q</span><span class="o">=</span><span class="mf">27.0</span> <span class="n">q</span><span class="o">=</span><span class="mf">27.0</span> <span class="n">size</span><span class="o">=</span><span class="mi">723</span><span class="n">kB</span> <span class="n">time</span><span class="o">=</span><span class="mo">00</span><span class="o">:</span><span class="mo">01</span><span class="o">:</span><span class="mf">07.60</span>
<span class="n">frame</span><span class="o">=</span> <span class="mi">1079</span> <span class="n">fps</span><span class="o">=</span> <span class="mi">16</span> <span class="n">q</span><span class="o">=</span><span class="mf">27.0</span> <span class="n">q</span><span class="o">=</span><span class="mf">27.0</span> <span class="n">size</span><span class="o">=</span><span class="mi">750</span><span class="n">kB</span> <span class="n">time</span><span class="o">=</span><span class="mo">00</span><span class="o">:</span><span class="mo">01</span><span class="o">:</span><span class="mf">08.06</span>
</pre></div>
<p>Learning about the <code>ffmpeg-filters</code> made this experience worthwhile. I was now able to overlay videos side-by-side and combine them in real-time. This is the script that I used -</p>
<div class="highlight"><pre><span class="n">ffmpeg</span> <span class="o">-</span><span class="n">s</span> <span class="mi">320</span><span class="n">x240</span> <span class="o">-</span><span class="n">r</span> <span class="mi">24</span> <span class="o">-</span><span class="n">f</span> <span class="n">video4linux2</span> <span class="o">-</span><span class="n">i</span> <span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">video1</span> <span class="o">-</span><span class="n">s</span> <span class="mi">320</span><span class="n">x240</span> <span class="o">-</span><span class="n">r</span> <span class="mi">24</span> <span class="o">-</span><span class="n">f</span> <span class="n">video4linux2</span> <span class="o">-</span><span class="n">i</span> <span class="o">/</span><span class="n">dev</span><span class="o">/</span><span class="n">video2</span> <span class="o">-</span><span class="n">filter_complex</span> <span class="s">&quot;[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg];[1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w&quot;</span> <span class="o">-</span><span class="n">c</span><span class="o">:</span><span class="n">v</span> <span class="n">libx264</span> <span class="o">-</span><span class="n">crf</span> <span class="mi">23</span> <span class="o">-</span><span class="n">preset</span> <span class="n">medium</span> <span class="o">-</span><span class="n">movflags</span> <span class="n">faststart</span> <span class="n">nerf</span><span class="p">.</span><span class="n">mp4</span>
</pre></div>
<p>It basically tells ffmpeg to use a resolution of 320x240 and 24 fps for each of the camera devices, and to apply an overlay filter that places the two videos side-by-side. <code>PTS-STARTPTS</code> time-synchronizes the two streams, and the presets enable efficient encoding.</p>
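<p>For repeatable experiments, I find it convenient to drive such commands from Python. A small wrapper sketch - it assumes <code>ffmpeg</code> is on the PATH and that the two devices passed in are the Minoru's cameras -</p>
<div class="highlight"><pre>import subprocess

def side_by_side(left_dev, right_dev, out, size='320x240', rate='24'):
    """Record two v4l2 devices into one side-by-side video file."""
    cam = lambda dev: ['-s', size, '-r', rate, '-f', 'video4linux2', '-i', dev]
    overlay = ('[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg];'
               '[1:v]setpts=PTS-STARTPTS[fg];[bg][fg]overlay=w')
    cmd = (['ffmpeg'] + cam(left_dev) + cam(right_dev) +
           ['-filter_complex', overlay,
            '-c:v', 'libx264', '-crf', '23', '-preset', 'medium',
            '-movflags', 'faststart', out])
    subprocess.check_call(cmd)

side_by_side('/dev/video1', '/dev/video2', 'stereo.mp4')
</pre></div>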
<p>I shot a video using the Minoru video camera. After applying the Overlay filter, I got a nice video with the Left and Right video streams arranged side-by-side. In this screenshot, I am pointing my little brother's Nerf guns towards each of the Minoru's two cameras -</p>
<p><img alt="Minoru Nerf Gun" src="http://siddhantsci.org/images/minoru-nerf.png" /></p>
<p>I can experiment with the <strong>Stereoscopic anaglyph filters</strong> to extend it to a single-screen 3D live stream. But the present task involves streaming to the Oculus Rift which is what I'll be working on next. In addition to <code>ffmpeg</code>, I also made use of <code>ffserver</code> and <code>ffplay</code> in my Streaming workflow. These have been explained in a <a href="http://siddhantsci.org/blog/2015/07/01/mid-term-report-gsoc-15/">previous post</a>.</p>
<h2>Experiments with <code>v4l2stereo</code></h2>
<p>Working with stereoscopic cameras is atypical of a traditional computer vision workflow. Each of the cameras requires calibration in order for range-imaging applications like depth maps and point clouds to work. I calibrated my camera using the excellent <a href="https://github.com/bashrc/v4l2stereo">v4l2stereo</a> tool.</p>
<p>Here are some screenshots -</p>
<p><img alt="Minoru Calibration" src="http://siddhantsci.org/images/minoru-calibration.jpg" /></p>
<p>Basic Feature detection -</p>
<p><img alt="Minoru Calibration" src="http://siddhantsci.org/images/minoru-features.jpg" /></p>
<h2>Closing remarks</h2>
<p>This was a very hectic couple of weeks. The output I produced pales in comparison to the tinkering behind it. I'll include in the documentation all the important scripts that did not make it to the final repository, so that future students won't have to climb the steep learning curve of multimedia streaming alone. All the work regarding this can be found <a href="https://bitbucket.org/italianmarssociety/eras/src/a31a7a135eb0315c4d3aa4d968e0832666af14eb/servers/telerobotics/streams/?at=default">here</a>. I realized the overwhelming importance of IRC channels when I got help from the #ffmpeg and #v4l2 channels while stuck with no end in sight. I gathered a GREAT DEAL of experience in video streaming which I hope will go a long way.</p>
<p>This has been one giant bi-weekly report. Thank you for reading. <em>Ciao!</em></p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Remote tests in Telerobotics</title><link href="http://siddhantsci.org/blog/2015/07/08/remote-tests-in-telerobotics/" rel="alternate"></link><updated>2015-07-08T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-07-08:blog/2015/07/08/remote-tests-in-telerobotics/</id><summary type="html"><p>Ciao :) The <em>sixth week</em> of GSoC 2015 is over. According to the <a href="http://erasproject.org/2015-gsoc/#2">Telerobotics project timeline</a>, this week was supposed to be the <strong>Buffer Week</strong> to account for any unforeseen work that may pop up. We at the <strong>Italian Mars Society</strong> were trying to make ROS communication possible over a <em>large</em> network. After effective discussion via mail and prioritizing on Trello, the <strong>first Husky test</strong> was scheduled for July 1, the <strong>second</strong> for July 7, and the <strong>third</strong> for July 8. It was an international effort spanning timezones in UTC-5:00, UTC+2:00, and UTC+5:30 regions. So zeroing in on a common time was an interesting sub-challenge in itself.</p>
<blockquote>
<p>By a <strong>large</strong> network, I mean this -</p>
</blockquote>
<p><img alt="Remote Testing" src="http://siddhantsci.org/images/remote-problem.png" /></p>
<p>At first glance, the problem statement looks quite tractable and practical. But like all problems in computer networks, this one looked easy in theory yet frustrated the budding computer scientist in me as the proposed solutions didn't work out.</p>
<blockquote>
<p>Husky Test 1</p>
</blockquote>
<p>Matt (from the Space Research Systems Group, Carleton University), Franco, and I were trying to get the Husky UGV in Canada to respond to the commands sent from the three parts of the world involved (Canada, India, Italy). A few problems we came across -</p>
<ol>
<li>
<p>ROS version issues caused a minor problem. The Husky robot was running an older version of ROS (Hydro) while Franco and I were using the newer version (Indigo). This caused problems in reading certain Husky messages. Solution - Upgrade ROS version on the Husky robot OR downgrade our version to ROS Hydro and Ubuntu 12.04.</p>
</li>
<li>
<p>Network Issues - Unable to communicate with all three computers in all cases. There was no bidirectional communication between the ROS computers and ports were blocked.</p>
</li>
<li>
<p><strong>Success</strong> - GPS Messages and status messages were received from the Husky robot laptop set as the ROS Master. But the Husky laptop was unable to receive Teleoperation messages from Franco's computer and my computer (even though it detected that we were publishing messages). Again a Network problem.</p>
</li>
</ol>
<blockquote>
<p>Solution - <strong>Virtual Private Networks</strong>, well almost...</p>
</blockquote>
<p>At first, I had to ensure that the TP-Link WiFi Router at home was not creating problems. To rule this out, I added my laptop's interface to the <strong>Demilitarized Zone (DMZ)</strong> and enabled <strong>Port Forwarding</strong> for all the ports of interest.</p>
<blockquote>
<p><em>Success</em> with Blender Game Engine Streaming</p>
</blockquote>
<p>Now, this solved quite a few problems - my public IP could now behave like one. To prove this, Franco and I held a web-stream session in which his laptop in Italy behaved as the Blender Game Engine client while I provided a live video feed from the Minoru camera through an <strong>FFmpeg server</strong>. His words - "You are live. I can see the stream." - provided the much-needed boost I required to tackle the pending computer networks problems I had to solve in the following couple of days.</p>
<p>Coming to the VPN problem, I first read about the various VPN Server solutions available, like -</p>
<ul>
<li>OpenVPN</li>
<li>PPTP (Point-to-Point Tunneling Protocol)</li>
<li>IPSec</li>
<li>SSH Tunneling</li>
</ul>
<p>The second Husky test was done with a PPTP VPN setup, which wasn't quite successful. The reason: ROS requires bidirectional communication between the peers, and I couldn't act as a peer while I was the VPN server. This caused a slew of other pesky problems - <code>REQ TIMEOUTS</code>, disconnected ROS nodes, disabled Internet on the VPN server, etc. But as a start, it was reassuring that the problem could be solved. I realized that working with computers at the scale of the Internet is no child's play. There was another takeaway from the second Husky test: Andrea (from the Husky team) could work with my remote node as the ROS master and still get the Husky up and running. This means that all the Husky traffic and node maintenance could be relayed through my PC to the Husky. <em>Most reassuring.</em></p>
<p>Armed with the computer networks concepts I learnt at my college, I set out to configure the slightly tougher OpenVPN server. This is a snapshot of the OpenVPN access server that I set up -</p>
<p><img alt="OpenVPN users" src="http://siddhantsci.org/images/openvpn-users.png" /></p>
<p>I was not only able to set up a world-wide VPN, but also able to set up communication among the peers. But the firewalls on the Husky computer network were too strong for it and sent Andrea's laptop into a continuous <em>Trying to Reconnect</em> loop. There went our hopes with OpenVPN. I am still looking into this issue. The main problem was that the UDP channel of OpenVPN was accessible from the Husky network but the TCP channels were not. This caused intermittent connection losses and the OpenVPN client couldn't figure out what to do. There must be a solution to this and I'll find it.</p>
<p>Throughout this experience, I learnt a lot of new things about practical computer networks. Once I'm able to crack the VPN problem, I could put it to use in diverse scenarios (remote robotics testing, as a road warrior, Internet of Things applications, creating a network of friends, etc.). VPN brings everyone on the same page (or logical subnet). I also did quite a bit of work on stereo video streaming, which will be the theme of my next post. Stay tuned.</p>
<p><em>Ciao!</em></p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Mid-term Report - GSoC '15</title><link href="http://siddhantsci.org/blog/2015/07/01/mid-term-report-gsoc-15/" rel="alternate"></link><updated>2015-07-01T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-07-01:blog/2015/07/01/mid-term-report-gsoc-15/</id><summary type="html"><p>Hi all! I made it through the first half of the <a href="http://siddhantsci.org/category/gsoc.html">GSoC 2015 program</a>. This is the <strong>evaluation week</strong> of the <a href="http://www.google-melange.com/gsoc/homepage/google/gsoc2015">Google Summer of Code 2015 program</a> with the <a href="https://www.python.org/psf/">Python Software Foundation</a> and the <a href="http://erasproject.org/">Italian Mars Society ERAS Project</a>. Mentors and students evaluate the journey so far in the program by answering some questions about their students and mentors respectively. On comparing with the timeline, I reckoned that I am on track with the project so far.</p>
<blockquote>
<p>The entire <strong>Telerobotics with Virtual Reality</strong> project can be visualized in the following diagram -</p>
</blockquote>
<p><img alt="Project Architecture" src="http://siddhantsci.org/images/telerobotics-diagram.png" /></p>
<h2>Achievements</h2>
<h3>Husky-ROS-Tango Interface</h3>
<ul>
<li><strong>ROS-Tango interfaces</strong> to connect the <strong>Telerobotics</strong> module with the <strong>rest of ERAS</strong>.</li>
<li>
<p>ROS Interfaces for Navigation and Control of Husky
<img alt="Husky Navigation" src="http://siddhantsci.org/images/navigate-ros.png" /></p>
</li>
<li>
<p><a href="http://siddhantsci.org/blog/2015/06/24/the-half-life-of-telerobotics/">Logging Diagnostics</a> of the robot to the Tango Bus</p>
</li>
<li>Driving the Husky around using human commands
<img alt="Husky Commands" src="http://siddhantsci.org/images/husky-command.png" /></li>
</ul>
<h3>Video Streaming</h3>
<ul>
<li>Single Camera Video streaming to Blender Game Engine</li>
</ul>
<p>This is how it works. <strong>ffmpeg</strong> is used as the streaming server to which Blender Game Engine subscribes.</p>
<p>The <code>ffserver.conf</code> file, which describes the characteristics of the stream, is configured as follows:</p>
<div class="highlight"><pre><span class="nb">Port</span> <span class="mi">8190</span>
<span class="nx">BindAddress</span> <span class="mf">0.0.0.0</span>
<span class="nx">MaxClients</span> <span class="mi">10</span>
<span class="nx">MaxBandwidth</span> <span class="mi">50000</span>
<span class="nx">NoDaemon</span>
<span class="o">&lt;</span><span class="nx">Feed</span> <span class="nx">webcam.ffm</span><span class="o">&gt;</span>
<span class="nb">file</span> <span class="p">/</span><span class="nx">tmp</span><span class="p">/</span><span class="nx">webcam.ffm</span>
<span class="nx">FileMaxSize</span> <span class="mi">2000</span><span class="nx">M</span>
<span class="o">&lt;/</span><span class="nx">Feed</span><span class="o">&gt;</span>
<span class="o">&lt;</span><span class="nx">Stream</span> <span class="nx">webcam.mjpeg</span><span class="o">&gt;</span>
<span class="nx">Feed</span> <span class="nx">webcam.ffm</span>
<span class="nb">Format</span> <span class="nx">mjpeg</span>
<span class="nx">VideoSize</span> <span class="mi">640</span><span class="nx">x480</span>
<span class="nx">VideoFrameRate</span> <span class="mi">30</span>
<span class="nx">VideoBitRate</span> <span class="mi">24300</span>
<span class="nx">VideoQMin</span> <span class="mi">1</span>
<span class="nx">VideoQMax</span> <span class="mi">5</span>
<span class="o">&lt;/</span><span class="nx">Stream</span><span class="o">&gt;</span>
</pre></div>
<p>Then the Blender Game Engine and its associated Python library <code>bge</code> kick in to display the stream on the <strong>Video Texture</strong>:</p>
<div class="highlight"><pre><span class="cp"># Get an instance of the video texture</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span> <span class="o">=</span> <span class="n">bge</span><span class="p">.</span><span class="n">texture</span><span class="p">.</span><span class="n">Texture</span><span class="p">(</span><span class="n">obj</span><span class="p">,</span> <span class="n">ID</span><span class="p">)</span>
<span class="cp"># a ffmpeg server is streaming the feed on the IP:PORT/FILE</span>
<span class="cp"># specified in FFMPEG_PARAM,</span>
<span class="cp"># BGE reads the stream from the mjpeg file.</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span> <span class="o">=</span> <span class="n">bge</span><span class="p">.</span><span class="n">texture</span><span class="p">.</span><span class="n">VideoFFmpeg</span><span class="p">(</span><span class="n">FFMPEG_PARAM</span><span class="p">)</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">source</span><span class="p">.</span><span class="n">play</span><span class="p">()</span>
<span class="n">bge</span><span class="p">.</span><span class="n">logic</span><span class="p">.</span><span class="n">video</span><span class="p">.</span><span class="n">refresh</span><span class="p">(</span><span class="n">True</span><span class="p">)</span>
</pre></div>
<blockquote>
<p>The entire source code for single camera streaming can be found <a href="https://bitbucket.org/italianmarssociety/v-eras-blender/src/42063c0b489152a9f124f80824ad095a752c29ff/scripts/webstream/single%20camera/?at=default">in this repository</a>.</p>
</blockquote>
<ul>
<li>Setting up the <strong>Minoru Camera</strong> for stereo vision
<img alt="Minoru Camera" src="http://siddhantsci.org/images/minoru.jpg" /></li>
</ul>
<p>It turns out this camera can, in principle, stream at <strong>30 frames per second</strong> from both cameras. The last week has been particularly challenging in figuring out the optimal settings for the Minoru webcam. Performance depends on the video buffer memory allocated by the <strong>Linux Kernel</strong> for <code>libuvc</code> and <code>v4l2</code> compatible webcams, so different kernel versions perform differently. With the kernel version that I am using, it is impractical to stream the left and right cameras at a frame rate greater than 15 fps.</p>
<ul>
<li>Setting up the Oculus Rift DK1 for the <strong>Virtual Reality</strong> work in the upcoming second term
<img alt="Oculus Rift" src="http://siddhantsci.org/images/oculus-rift.jpg" /></li>
</ul>
<h2>Crash-testing and Roadblocks</h2>
<p>This project was not without its share of obstacles. A few memorable roadblocks come to mind-</p>
<ol>
<li>
<p><strong>Remote Husky testing</strong> - Matt (from <strong>Canada</strong>), Franco (from <strong>Italy</strong>), and I (from <strong>India</strong>) tested whether we could remotely control the Husky. The main issue we faced was <strong>Network Connectivity</strong>. We were all on geographically different networks, whose hosts the ROS instances on our machines could not resolve. Thus some messages (like GPS) were accessible whereas others (like Husky status messages) were not. The solution we settled on is to create a <strong>Virtual Private Network</strong> for our computers for future testing.</p>
</li>
<li>
<p><strong>Minoru Camera Performance differences</strong> - Since the Minoru's performance varies with the Kernel version, I had to bump down the frames per second to <em>15 fps</em> for both cameras and stream them in the Blender Game Engine. This temporary hack should be resolved as ERAS moves to newer Linux versions.</p>
</li>
<li>
<p><strong>Tango related</strong> - Tango-Controls is a sophisticated SCADA library with a server database for maintaining device server lists. It was painful to use the provided GUI, Jive, to configure the device servers. To bring the process in line with other development activities, I wrote a little CLI-based interactive script for device server registration and de-registration. A <a href="http://siddhantsci.org/blog/2015/06/18/when-two-distributed-systems-meet/">blog post</a> explains this in detail.</p>
</li>
<li>
<p><strong>Common testing platform</strong> - I needed to use ROS Indigo, which is supported only on Ubuntu 14.04. ERAS is currently using Ubuntu 14.10. In order to enable the Italian Mars Society members to execute my scripts, they needed my version of Ubuntu. <strong>Solution</strong> - virtual Linux containers. We are using a <strong>Docker Image</strong> which my mentors can run on their machines regardless of their native OS. <a href="http://siddhantsci.org/blog/2015/06/12/all-for-docker-docker-for-all/">This post</a> explains this point.</p>
</li>
</ol>
<h2>Expectations from the second term</h2>
<p>This is a huge project in that I have to deal with <em>many different technologies</em> like -</p>
<ol>
<li>Robot Operating System</li>
<li>FFmpeg</li>
<li>Blender Game Engine</li>
<li>Oculus VR SDK</li>
<li>Tango-Controls</li>
</ol>
<p>So far, the journey has been exciting and there has been a lot of learning and development. The second term will be intense, challenging, and above all, fun.</p>
<p>To-do list -</p>
<ol>
<li>Get Minoru webcam to work with ffmpeg streaming</li>
<li>
<p>Use Oculus for an Augmented Reality application
<img alt="Oculus Rift" src="http://siddhantsci.org/images/oculus-mars.jpg" />
<a href="https://vimeo.com/111243246">Source</a></p>
</li>
<li>
<p>Integrate Bodytracking with Telerobotics</p>
</li>
<li>Automation in Husky movement and using a UR5 manipulator</li>
<li>Set up a <a href="http://pptpclient.sourceforge.net/">PPTP</a> or <a href="https://openvpn.net/">OpenVPN</a> for ERAS</li>
</ol>
<p>Time really flies when I am learning new things. GSoC so far has taught me not only how not to be a <a href="https://www.quora.com/What-are-the-characteristics-of-a-bad-software-engineer">bad software engineer</a>, but also how to be a good open-source community contributor. That is the spirit of Google Summer of Code, and I have imbibed a lot of it. Besides, working with the Italian Mars Society has also motivated me to learn the Italian language. So Python is not the only language that I'm practicing this summer ;)</p>
<blockquote>
<p>Here's to the second term of Google Summer of Code 2015!
<img alt="GSoC Banner" src="http://siddhantsci.org/images/gsoc-banner.png" /></p>
</blockquote>
<p>Ciao :)</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>The Half-Life of Telerobotics</title><link href="http://siddhantsci.org/blog/2015/06/24/the-half-life-of-telerobotics/" rel="alternate"></link><updated>2015-06-24T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-06-24:blog/2015/06/24/the-half-life-of-telerobotics/</id><summary type="html"><p>Hi all! If you've been following my <a href="http://siddhantsci.org/category/gsoc.html">previous posts</a>, you'd have known that the Telerobotics module has been simmering for a couple of weeks. I'm happy to announce that it is almost complete and would hopefully be integrated with Vito's Bodytracking module.</p>
<p>The last two weeks (weeks four and five) were the busiest of GSoC for me.</p>
<h2>Learning Experience</h2>
<ul>
<li>I learnt A LOT about Python Software Development</li>
<li>Different types of <a href="http://www.oreilly.com/programming/free/software-architecture-patterns.csp">software architectures</a>,</li>
<li><a href="http://pyvideo.org/video/1093/the-development-process-of-python">The development process of Python</a> by one of the members of the Italian Mars Society who has been the reason I'm able to write more Pythonic code - <a href="http://wolfprojects.altervista.org/">Ezio Melotti</a></li>
<li><a href="http://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/quicktour.html#pytango-quick-tour">PyTango</a> Development</li>
<li>ipython and how helpful it can be for Tango applications</li>
<li>Message queues - Both ROS and Tango utilize ZeroMQ, which makes the integration of ROS and Tango much more scalable</li>
<li><a href="http://www.vlfeat.org/overview/sift.html">SIFT</a> in Python - I will be working with my mentor Fabio Nigi on this very soon</li>
<li>Making my own stereo camera</li>
</ul>
<h2>Deliverables</h2>
<ul>
<li>A <strong>ROS node</strong> which collects information from all interesting topics from the Husky robot. This can be found <a href="https://bitbucket.org/italianmarssociety/eras/src/db8c7061f4768534ebb2621296a20a016bd240ad/servers/telerobotics/src/robot-info-collector.py?at=default">here</a></li>
<li>A <strong>Tango Server</strong> which integrates with ROS to provide diagnostic information from the robot (<em>Battery Status, Temperature Levels, Current Draw, Voltage, Error Conditions</em>)</li>
<li>A simulated version of the Tango server for the Planning and Scheduling application that Shridhar is working on. These can be accessed <a href="https://bitbucket.org/italianmarssociety/eras/src/db8c7061f4768534ebb2621296a20a016bd240ad/servers/telerobotics/src/robot-diagnostics-server.py?at=default">here</a></li>
<li><strong>Soft real-time network streaming</strong> - an FFmpeg server and Blender client for a single-camera video stream. This can be found <a href="https://bitbucket.org/italianmarssociety/v-eras-blender/src/42063c0b489152a9f124f80824ad095a752c29ff/scripts/webstream/single%20camera/?at=default">here</a></li>
</ul>
<h2>Under <strong>heavy</strong> Development</h2>
<ul>
<li>Integration of Bodytracking with Telerobotics. The following message format has been decided upon by the mentors and students:</li>
</ul>
<div class="highlight"><pre><span class="cp"># Attribute definitions for various diagnostic messages</span>
<span class="n">moves</span> <span class="o">=</span> <span class="n">attribute</span><span class="p">(</span><span class="n">label</span><span class="o">=</span><span class="s">&quot;Linear and angular displacement&quot;</span><span class="p">,</span> <span class="n">dtype</span> <span class="o">=</span> <span class="p">(</span><span class="kt">float</span><span class="p">,),</span>
<span class="n">display_level</span> <span class="o">=</span> <span class="n">DispLevel</span><span class="p">.</span><span class="n">EXPERT</span><span class="p">,</span>
<span class="n">access</span> <span class="o">=</span> <span class="n">AttrWriteType</span><span class="p">.</span><span class="n">READ</span><span class="p">,</span>
<span class="n">unit</span> <span class="o">=</span> <span class="s">&quot;(meters, radians)&quot;</span><span class="p">,</span>
<span class="n">fget</span><span class="o">=</span><span class="s">&quot;getMoves&quot;</span><span class="p">,</span> <span class="n">polling_period</span> <span class="o">=</span> <span class="n">POLLING</span><span class="p">,</span>
<span class="n">max_dim_x</span> <span class="o">=</span> <span class="mi">2</span><span class="p">,</span> <span class="n">max_dim_y</span> <span class="o">=</span> <span class="mi">1</span><span class="p">,</span>
<span class="n">doc</span><span class="o">=</span><span class="s">&quot;An attribute for Linear and angular displacements&quot;</span><span class="p">)</span>
</pre></div>
<p>Vito's Bodytracker would <strong>publish events</strong> in the form of Tango events. The associated data would be a float tuple of dimensions <strong>2,1</strong> (2 columns, 1 row). Such a tuple, like (3.4, 1.2), would specify a relative linear and angular displacement of the astronaut. My Telerobotics module would <strong>subscribe to this Tango event</strong> and <em>transform</em> this data into a <strong>Twist</strong> message that the Husky can understand.</p>
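<p>A condensed sketch of what this subscription-and-transform bridge looks like - the device name, topic name, and the naive displacement-to-velocity mapping are placeholders for illustration -</p>
<div class="highlight"><pre>import PyTango
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('telerobotics_bridge')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

def on_moves(event):
    """Translate a (linear, angular) displacement event into a Twist."""
    if event.err:
        return
    linear, angular = event.attr_value.value
    cmd = Twist()
    # Naive mapping: treat the displacement over one polling period
    # as the commanded velocity for that period.
    cmd.linear.x = linear    # metres
    cmd.angular.z = angular  # radians
    pub.publish(cmd)

tracker = PyTango.DeviceProxy('c3/bodytracker/tracker1')
tracker.subscribe_event('moves', PyTango.EventType.CHANGE_EVENT, on_moves)
rospy.spin()
</pre></div>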
<ul>
<li>Extension of camera streaming to a dual-camera setup. I am extending the streaming capability to the stereo camera.</li>
</ul>
<p>Mid-term evaluations start tomorrow! Eagerly looking forward to them. It has been an eventful and productive half summer of code. I hope the next half is even more exciting and challenging than the one that passed.</p>
<p><em>Ciao</em></p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>When two Distributed Systems meet!</title><link href="http://siddhantsci.org/blog/2015/06/18/when-two-distributed-systems-meet/" rel="alternate"></link><updated>2015-06-18T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-06-18:blog/2015/06/18/when-two-distributed-systems-meet/</id><summary type="html"><p>Hi! This post is meant to be an insight into the experience and progress of the third and fourth weeks of my (a)vocation with the Google Summer of Code Program. Things got much pacier and smooth in the past two weeks. I've been able to get a stable codebase up and running with respect to the aims discussed in the timeline.</p>
<p><img alt="Sublime Text Workspace" src="http://siddhantsci.org/images/workspace2.png" />
<the usual rant> I had to totally restructure my programming workspace for the second time to support intelligent IDE-like features, since the Python packages I am working with (ROS and Tango) have a fair number of modules whose documentation I need to read on the fly while coding away. Thus I set up both my <strong>Vim and Sublime Text</strong> environments to support <em>intelli-sense</em>, <em>code completion</em>, <em>block syntax completion</em>, etc. I also added a dual-monitor setup with the unused LCD television at my home to make for an efficient programming ambience.
</the usual rant></p>
<h2>Telerobotics Code Pushed</h2>
<p>As I mentioned in my <a href="http://siddhantsci.org/blog/2015/04/29/gsoc-2015-with-the-italian-mars-society/">first post</a>, the contributors of the <strong>Italian Mars Society</strong> are given <em>write access</em> to the online Bitbucket repository. This is a tremendous responsibility to ensure that the updates don't disturb the stability of the project. To work with this, I follow the simple and effective advice of my mentors -</p>
<div class="highlight"><pre><span class="n">hg</span> <span class="n">pull</span>
<span class="n">hg</span> <span class="n">update</span>
<span class="n">hg</span> <span class="n">add</span> <span class="p">.</span>
<span class="n">hg</span> <span class="n">commit</span> <span class="o">-</span><span class="n">m</span> <span class="s">&quot;My awesome Commit Message&quot;</span>
<span class="n">hg</span> <span class="n">push</span>
</pre></div>
<p>This simple algorithm ensures that all students can work at their pace without breaking the system. <a href="http://hginit.com/">This simple tutorial</a> can help the uninitiated to understand what I just said.</p>
<p>So while working with Tango servers for my project, I had to constantly use the bundled GUI - <strong>Jive</strong> - which works as a one-stop solution for <a href="http://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/quicktour.html">Device Servers</a>. But my primordial hacker instincts prompted me to write a <a href="https://en.wikipedia.org/wiki/Command-line_interface">CLI</a> solution to add and remove device servers using the amazing <a href="http://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/#">PyTango API</a>. Thanks to Ezio's excellent comments on my commits, I've been able to contribute a Pythonic solution for working with Device Servers in a jiffy. The script can be found <a href="https://bitbucket.org/italianmarssociety/eras/src/2da8222593354228a1eb426bef556654e794365c/servers/telerobotics/utility/setup-device.py?at=default">here</a>. It has a nice UI to help the user figure out what he/she needs to enter. I have yet to correct some formatting errors to make it more consistent with PEP8 and the <a href="http://docs.python.org//glossary.html#term-eafp">EAFP</a> idiom. The current stage of argument validation is more like LBYL (Look Before You Leap), which is slow for the script's use-case.</p>
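<p>For the record, the gist of registering and removing a device server through the PyTango database API can be sketched like this (a minimal sketch; the server, class, and device names are made-up examples, not ones the script enforces) -</p>
<div class="highlight"><pre>import PyTango

def register(server, device_class, device):
    """Register `device` under `server` in the Tango database."""
    info = PyTango.DbDevInfo()
    info.server = server         # e.g. "Telerobotics/husky"
    info._class = device_class   # e.g. "Telerobotics"
    info.name = device           # e.g. "c3/robot/husky" (domain/family/member)
    PyTango.Database().add_device(info)

def unregister(server):
    """Remove the server entry and all of its devices."""
    PyTango.Database().delete_server(server)

register("Telerobotics/husky", "Telerobotics", "c3/robot/husky")
</pre></div>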
<p><strong>The second module</strong> I pushed is the <strong>Husky Test</strong> script, which checks whether the Husky installation works on a particular setup. The <a href="https://bitbucket.org/italianmarssociety/eras/src/2da8222593354228a1eb426bef556654e794365c/servers/telerobotics/utility/test_husky.py?at=default">test script</a> makes the Husky move with a particular linear and angular velocity. The <a href="https://bitbucket.org/italianmarssociety/eras/src/2da8222593354228a1eb426bef556654e794365c/servers/telerobotics/doc/sad.rst?at=default">Software Architecture Document</a> was also updated to account for the changes in the ROS-Tango interface architecture. A better understanding of the SAD can be had from <a href="http://siddhantsci.org/blog/2015/05/29/software-architecture-document-for-telerobotics/">an earlier post</a>.</p>
<h2>Docker</h2>
<p>I explained the Docker setup and distribution in a <a href="http://siddhantsci.org/blog/2015/06/12/all-for-docker-docker-for-all/">quick mini-post</a>. I tested that the X-errors don't interfere with the scripts that I have been developing, since ROS topics can be accessed from the command line as well. This is a good thing. The Docker repository for my workspace can be found <a href="https://registry.hub.docker.com/u/sidcode/ros-eras/">here</a>.</p>
<h2>Python Reading</h2>
<p>I have been voraciously consulting the following sources for getting the knack of Python and PyTango programming -</p>
<ul>
<li>Python Docs for <a href="http://docs.python.org/2/">Python 2</a> and <a href="http://docs.python.org/3/">Python 3</a></li>
<li><a href="http://shop.oreilly.com/product/0636920027072.do">Python Cookbook</a> by O'Reilly Publishers</li>
<li><a href="http://shop.oreilly.com/product/0636920032519.do">Fluent Python</a> (early access) again by O'Reilly Publishers</li>
<li><a href="http://www.esrf.eu/computing/cs/tango/tango_doc/kernel_doc/pytango/latest/index.html">PyTango documentation</a></li>
</ul>
<p>The happiest moment of all this reading came when I could help Vito reduce <strong>fifty lines of code to just two</strong> with the use of the <code>exec</code> construct in Python. In case you're wondering, this is the <a href="https://bitbucket.org/italianmarssociety/eras/commits/2da8222593354228a1eb426bef556654e794365c#Lservers/body_tracker/tracker/tracker.pyT40">code written by Vito</a> -</p>
<div class="highlight"><pre> <span class="nt">joints</span> <span class="o">=</span> <span class="cp">[</span>
<span class="s1">&#39;skeleton_head&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_neck&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_left_shoulder&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_right_shoulder&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_left_elbow&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_right_elbow&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_left_hand&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_right_hand&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_torso&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_left_hip&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_right_hip&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_left_knee&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_right_knee&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_left_foot&#39;</span><span class="p">,</span>
<span class="s1">&#39;skeleton_right_foot&#39;</span>
<span class="cp">]</span>
<span class="nt">attr_init_params</span> <span class="o">=</span> <span class="nt">dict</span><span class="o">(</span>
<span class="nt">dtype</span><span class="o">=(</span><span class="s1">&#39;float32&#39;</span><span class="o">,),</span>
<span class="nt">unit</span><span class="o">=</span><span class="s1">&#39;m&#39;</span><span class="o">,</span>
<span class="nt">max_dim_x</span><span class="o">=</span><span class="nt">3</span><span class="o">,</span>
<span class="nt">polling_period</span><span class="o">=</span><span class="nt">POLLING</span>
<span class="o">)</span>
<span class="nt">for</span> <span class="nt">joint</span> <span class="nt">in</span> <span class="nt">joints</span><span class="o">:</span>
<span class="nt">exec</span> <span class="s2">&quot;%s = attribute(**attr_init_params)&quot;</span> <span class="o">%</span> <span class="nt">joint</span>
</pre></div>
<p>Note that without <code>exec</code>, an attribute declaration would have to be written out manually for each joint in the <code>joints</code> list.</p>
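<p>Incidentally, the <code>exec</code> statement above is Python 2 syntax. Since we also care about Python 2/3 compatibility (in Python 3, <code>exec</code> is a built-in function), a minimal sketch of the same trick in function form would look like this - the joint list is shortened for brevity, and I'm assuming the <code>attribute</code> factory comes from PyTango's high-level server API:</p>
<div class="highlight"><pre># A minimal sketch, not the ERAS code: the same dynamic-attribute idea
# using the exec() function, which parses on both Python 2 and Python 3.
from PyTango.server import attribute

attr_init_params = dict(dtype=('float32',), unit='m', max_dim_x=3)

for joint in ('skeleton_head', 'skeleton_neck'):  # shortened list
    exec("%s = attribute(**attr_init_params)" % joint)
</pre></div>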
<h2>Ongoing Stuff</h2>
<p>There are certain deliverables in the pipeline currently waiting to be pushed to the online repository over the course of the next week. I have been working on -</p>
<ul>
<li>ROS-feedback Aggregator Device Server for Tango</li>
<li>ROS Commander Node for the Husky</li>
<li>Tango Client to understand Husky status (battery levels, sensor monitor, etc.)</li>
<li>Mathematical Transformations and Named Tuples for different structures that Telerobotics requires.</li>
</ul>
<p>GSoC with PSF and the Italian Mars Society is turning out to be fun and challenging. Mid-term evaluations start in a week. Lots of work to do. I strongly hope my next post will be a celebratory one highlighting the pushed code I described in <em>Ongoing Stuff</em>.</p>
<p>Until then, Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>All for Docker; Docker for all!</title><link href="http://siddhantsci.org/blog/2015/06/12/all-for-docker-docker-for-all/" rel="alternate"></link><updated>2015-06-12T19:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-06-12:blog/2015/06/12/all-for-docker-docker-for-all/</id><summary type="html"><p>Hi! This is going to be a short post about my developments in Week 3 of my GSoC project. Since my <a href="http://siddhantsci.org/blog/2015/06/08/tango-ing-with-ros-week-2/">last post</a>, I have had the chance to work with some exciting state-of-the-art technologies which allow easy distribution and scalability. These are -</p>
<ol>
<li>Docker
<img alt="Docker Logo" src="http://siddhantsci.org/images/docker-logo.png" /></li>
<li>Tango-Controls
<img alt="Tango Controls logo" src="http://siddhantsci.org/images/tangologo.png" /></li>
</ol>
<p>I used the <a href="https://registry.hub.docker.com/_/ubuntu/">Ubuntu 14.04</a> <em>Docker container</em> to set up my system, which anyone in the world can use as a common platform to test the applications that I am working on. This has multiple advantages -</p>
<ul>
<li>Setup time for collaborators is virtually zero. The developer sets up the Docker container and the community members can use it directly.</li>
<li>Host platform-independent. It doesn't matter whether the collaborator's host system is Arch Linux, Windows 8, or a specific version of Ubuntu. Docker uses <a href="http://www.toptal.com/linux/separation-anxiety-isolating-your-system-with-linux-namespaces">Linux namespaces</a> and ensures a separation of concerns.</li>
<li>Revision-control mechanism. The developer plays around with a Docker image just as he/she would with any other <strong>Distributed Revision Control system</strong>. I <strong>push</strong> my changes to the repository (Docker image) and my mentors can simply <strong>pull the updates</strong> to get the new system configuration.</li>
</ul>
<p>So far, I have set up Tango-Controls, ROS Indigo, and the Husky libraries for my Docker image. These can be found on the <a href="https://registry.hub.docker.com/u/sidcode/ros-eras/">Docker Registry Hub</a>.</p>
<p>The issues that I am currently facing are -</p>
<ul>
<li>Graphics problems: X-server bad-drawing errors. A way around this will be to better understand how ROS applications use the X server and then grant Docker the appropriate graphics capabilities. But this does not affect the command-line applications of ROS and Tango which I have been working on.</li>
<li>MySQL connection problems. The current workaround is to use the host OS's <code>TANGO_HOST</code>. I observed that it works fine that way.</li>
</ul>
<p>This is it for this post. I mainly discussed Docker here, which was an important topic at the <strong>All-hands meeting on 8th June</strong>. I'll go into much more detail on Tango-Controls in the upcoming blog posts and the biweekly reports.</p>
<p>Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Tango-ing with ROS- Week 2!</title><link href="http://siddhantsci.org/blog/2015/06/08/tango-ing-with-ros-week-2/" rel="alternate"></link><updated>2015-06-08T00:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-06-08:blog/2015/06/08/tango-ing-with-ros-week-2/</id><summary type="html"><p>Hi! This one is about my <strong>second week of the Google Summer of Code 2015 program</strong>. It was a busy long <strong>week two</strong> with some crucial design decisions to be implemented and new things to learn. It was also a hectic week of reading about how to write better Python code (<code>Fluent Python - O'Reilly Publishers</code>, maintaining Python 2 and Python 3 compatibility, etc.). After finalizing the architecture last week (shown below), it was time to work on implementing it -</p>
<p><img alt="ROS and Tango" src="http://siddhantsci.org/images/rostango.png" /></p>
<p>As is evident from the diagram, there are <strong>two distributed systems</strong> involved - both significantly complicated. These are -</p>
<ul>
<li>Tango Controls</li>
<li>ROS (Robot Operating System)</li>
</ul>
<p>The challenge here is to create an <strong>event-triggered Tango Device</strong> which serves <strong>as both a client and a server</strong>. This Tango device listens for new events on the Tango bus and sends data to it when needed. In addition, it is interfaced with ROS: the Tango events relevant to ROS are processed by the device and published to the appropriate <code>TangoROS</code> topic. It also subscribes to the <code>ROSTango</code> topic to listen for incoming updates from the robot.</p>
<p>Some use-cases for this are as follows -</p>
<ul>
<li>The Bodytracking server pushes the location/orientation data on the bus.</li>
<li>The TangoROS Device subscribes to the events of the Bodytracking data on the Tango bus.</li>
<li>When an event is triggered, the device processes the data into ROS-compatible messages (<code>location</code> and <code>orientation</code> are <strong>transformed</strong> into <code>linear velocity</code> and <code>angular velocity</code>)</li>
<li>The <em>ROS Commander</em> node (which is subscribed to the <code>ROSTango</code> topic) receives these messages.</li>
<li>The ROS Commander node continuously monitors the robot for different measurements (<strong>sensor readings, battery status, navigation feedback, etc</strong>). The important signals are published to the <code>ROSTango bus</code>.</li>
</ul>
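<p>To make the bridging concrete, here is a minimal sketch (not the final device) of the Tango-to-ROS direction: a PyTango client subscribes to change events from the body-tracking device and republishes the processed result as a Twist message. The device name, attribute name, and the value-to-velocity mapping below are placeholders, not the real ERAS ones:</p>
<div class="highlight"><pre># A minimal sketch of the Tango-to-ROS direction. Device name, attribute
# name, and the value-to-velocity mapping are placeholders.
import PyTango
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('tango_ros_bridge')
pub = rospy.Publisher('husky_velocity_controller/cmd_vel', Twist, queue_size=100)

def on_change(event):
    # Called by PyTango whenever the subscribed attribute changes.
    if event.err:
        return
    twist = Twist()
    twist.linear.x = float(event.attr_value.value)  # placeholder mapping
    pub.publish(twist)

tracker = PyTango.DeviceProxy('c3/bodytracking/tracker1')  # placeholder name
tracker.subscribe_event('moves', PyTango.EventType.CHANGE_EVENT, on_change)
rospy.spin()
</pre></div>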
<p>This is my first time working with the powerful Tango-Controls system. It is used by -</p>
<ol>
<li>Italian Mars Society</li>
<li>The Square Kilometre Array (SKA) radio telescope</li>
<li>Synchrotrons and particle accelerators around Europe</li>
</ol>
<p>I'll discuss how I work with Tango and ROS in my next blog post.</p>
<p>The Italian Mars Society had an All-hands Skype meeting on 8th June, 2015 where all the GSoC students and mentors discussed project status, software architecture document feedback, roadblocks, hardware needs, collaboration, field tests etc.</p>
<p>Things that were discussed and are to be done -</p>
<ul>
<li>Docker Image for ROS setup (<strong>very important</strong>)</li>
<li>Battery status Tango server</li>
<li>ROS Tango Client</li>
<li>ROS Tango server for certain use cases</li>
<li>Tango events</li>
<li>Timestamp-based transformation of parameters in time-series data</li>
<li>Set up the Minoru 3D camera and the Oculus Rift device</li>
</ul>
<p>This is the week where I'd like most of these things to fall into place. GSoC is turning out to be exciting and challenging! Till the next post. Over to week three.</p>
<p>Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Programming a Mars rover - Week 1!</title><link href="http://siddhantsci.org/blog/2015/06/03/programming-a-mars-rover-week-1/" rel="alternate"></link><updated>2015-06-03T00:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-06-03:blog/2015/06/03/programming-a-mars-rover-week-1/</id><summary type="html"><p>Hi! This is the sixth post in my <a href="http://siddhantsci.org/category/gsoc.html">GSoC '15 blog series</a>.</p>
<p>So the much awaited coding period began on 25th May, 2015. After a refreshing <a href="http://siddhantsci.org/blog/2015/05/23/gsoc-15-community-bonding/">Community Bonding</a> experience, <a href="http://siddhantsci.org/blog/2015/05/26/workspace-setup-for-telerobotics/">setting up my workspace</a>, and <a href="http://siddhantsci.org/blog/2015/05/29/software-architecture-document-for-telerobotics/">creating a Software Architecture Document</a> - I was in a position to start coding.</p>
<h2>Aims and Milestones</h2>
<p>This week, according to the <a href="http://siddhantsci.org/blog/2015/05/07/gsoc-15-about-my-project/">timeline</a>, my aims were -</p>
<ul>
<li>Creating the initial set of ROS nodes for the Husky model for linear and angular motion </li>
<li>Zeroing in on the basic interface for mapping the Kinect bodytracking information and the Motivity interface (being concurrently developed by Vito) to teleoperation commands that the Husky can understand</li>
<li>Figuring out a way to integrate ROS and Tango into ERAS</li>
</ul>
<p>So far it has been a good week and I am on schedule. I am able to manipulate the motion of the simulated Husky via an external stimulus.</p>
<h2>Architecture</h2>
<p>Before I describe my programs, let me first describe the high-level architecture with help of a simple diagram -</p>
<p><img alt="Telerobotics Architecture" src="http://siddhantsci.org/images/arch2.png" /></p>
<p>As is evident from the diagram, there are <strong>two distributed systems</strong> involved - both fairly complicated. These are -</p>
<ul>
<li>Tango Controls</li>
<li>ROS (Robot Operating System)</li>
</ul>
<p>This was by far the <strong>biggest challenge</strong> of the project: interfacing data from one distributed system to the other while maintaining low latency and ensuring high performance.</p>
<p>Another challenge was handling real-time streaming data.
I banged my head against Python streams and message brokers like <a href="https://www.rabbitmq.com/">RabbitMQ</a> and <a href="http://zeromq.org/">ZeroMQ</a>. But as <strong>Albert Einstein</strong> said -</p>
<blockquote>
<p>“If you can't explain it to a six year old, you don't understand it yourself.” </p>
</blockquote>
<p>All this while, I was stuck on transferring data over an <strong>additional</strong> inter-process communication layer between two distributed systems. Meh. Sounds complicated. It actually is. And that is why I chucked that idea out. After spending three full days on this, I arrived at a <strong>much simpler architecture</strong> -</p>
<p><img alt="ROS and Tango" src="http://siddhantsci.org/images/rostango.png" /></p>
<p><em>Voila!</em></p>
<p>The good thing about this diagram is that it works at scale with as many ROS nodes as one may like to add for the rover (Husky), without compromising the data coming from the Tango bus. The <strong>missing piece</strong> of the <em>two distributed systems</em> puzzle is solved by a Tango ROS Node. Now I have a plan to work on in the second week of coding.</p>
<p>These requirements had to be reflected in the Software Architecture Document as well. To this end, I set up the excellent <a href="https://github.com/timonwong/OmniMarkupPreviewer">OmniMarkupPreviewer</a> for <em>Sublime Text</em> to preview the <strong>reStructuredText</strong> (<strong>.rst</strong>) documents that I created.</p>
<h2>Tryst with ROS and Husky</h2>
<p>I had never worked with an Unmanned Ground Vehicle before. I did use ROS for robotics experiments at my university lab, but I needed to quickly jog my memory about ROS programming with <strong>rospy</strong>. The excellent <a href="http://wiki.ros.org/ROS/Tutorials">ROS wiki</a> and the book <strong>ROS By Example</strong> came to the rescue - </p>
<p><img alt="ROS By Example" src="http://siddhantsci.org/images/rbxlogo.png" /></p>
<p>It is a haven for robot hobbyists like me, and I'll continue to refer to it for a long time to come.</p>
<p>Alright, I started my week with ROS programming. My first job was to bring up the simulator and make sure that Husky model responds to commands -</p>
<p>The Husky (like other ROS robots) describes movements in the form of <a href="http://docs.ros.org/api/geometry_msgs/html/msg/Twist.html">Twist</a> messages -</p>
<div class="highlight"><pre><span class="p">[(</span><span class="n">x</span><span class="p">,</span> <span class="n">y</span><span class="p">,</span> <span class="n">z</span><span class="p">),</span> <span class="p">(</span><span class="n">a</span><span class="p">,</span><span class="n">b</span><span class="p">,</span><span class="n">c</span><span class="p">)</span> <span class="p">]</span> <span class="n">where</span> <span class="n">x</span><span class="p">,</span><span class="n">y</span><span class="p">,</span><span class="n">z</span> <span class="n">is</span> <span class="n">linear</span> <span class="n">velocity</span> <span class="n">along</span> <span class="n">the</span> <span class="n">x</span><span class="p">,</span><span class="n">y</span><span class="p">,</span><span class="n">z</span> <span class="n">axes</span><span class="p">.</span> <span class="n">And</span> <span class="p">(</span><span class="n">a</span><span class="p">,</span><span class="n">b</span><span class="p">,</span><span class="n">c</span><span class="p">)</span> <span class="n">is</span> <span class="n">the</span> <span class="n">angular</span> <span class="n">velocity</span> <span class="n">about</span> <span class="n">the</span> <span class="n">x</span><span class="p">,</span><span class="n">y</span><span class="p">,</span><span class="n">z</span> <span class="n">axes</span><span class="p">.</span>
</pre></div>
<p>So to move in a circle, we issue [ (5,0,0) , (0,0,2) ]. This gives a linear speed of 5 in the x direction and an angular speed of 2 about the z axis, resulting in circular motion.</p>
<p>A simple way to explain the working is to use this command -</p>
<div class="highlight"><pre><span class="n">rostopic</span> <span class="n">pub</span> <span class="o">/</span><span class="n">husky_velocity_controller</span><span class="o">/</span><span class="n">cmd_vel</span> <span class="n">geometry_msgs</span><span class="o">/</span><span class="n">Twist</span> <span class="o">-</span><span class="n">r</span> <span class="mi">100</span> <span class="err">&#39;</span><span class="p">[</span><span class="mf">0.5</span><span class="p">,</span><span class="mi">0</span><span class="p">,</span><span class="mi">0</span><span class="p">]</span><span class="sc">&#39; &#39;</span><span class="p">[</span><span class="mi">0</span><span class="p">,</span><span class="mi">0</span><span class="p">,</span><span class="mi">0</span><span class="p">]</span><span class="err">&#39;</span>
</pre></div>
<p>This publishes a <a href="http://docs.ros.org/api/geometry_msgs/html/msg/Twist.html">Twist</a> message from the terminal to the <strong>/husky_velocity_controller/cmd_vel</strong> <em>ROS topic</em>, denoting a linear motion of 0.5 m/s along the x direction.</p>
<p>This is Husky in action -</p>
<p><img alt="Husky in action" src="http://siddhantsci.org/images/husky_in_action.png" /></p>
<p>To do the same using rospy, the procedure is simple -</p>
<ul>
<li>Import the required libraries (to support <em>rospy</em>, logging, and <em>Twist</em> messages)</li>
</ul>
<div class="highlight"><pre><span class="n">import</span> <span class="n">roslib</span>
<span class="n">import</span> <span class="n">rospy</span>
<span class="n">from</span> <span class="n">geometry_msgs</span><span class="p">.</span><span class="n">msg</span> <span class="n">import</span> <span class="n">Twist</span>
</pre></div>
<ul>
<li>Set up a ROS node - in this case <strong>move</strong></li>
</ul>
<div class="highlight"><pre><span class="n">rospy</span><span class="p">.</span><span class="n">init_node</span><span class="p">(</span><span class="err">&#39;</span><span class="n">move</span><span class="err">&#39;</span><span class="p">)</span>
</pre></div>
<p>ROS nodes act as identifiers (sources and destinations of messages) in the ROS distributed system, which is modeled as a graph.</p>
<p>For instance, this is the ROS graph while the Husky is moving about -</p>
<p><img alt="ROS Graph" src="http://siddhantsci.org/images/rosgraph.png" />
This is why ROS scales so well. Any number of publisher and subscriber nodes can be added to extend different applications.</p>
<ul>
<li>Set up a publisher to the appropriate ROS topic with the ROS message type </li>
</ul>
<div class="highlight"><pre><span class="n">p</span> <span class="o">=</span> <span class="n">rospy</span><span class="p">.</span><span class="n">Publisher</span><span class="p">(</span><span class="err">&#39;</span><span class="n">husky_velocity_controller</span><span class="o">/</span><span class="n">cmd_vel</span><span class="err">&#39;</span><span class="p">,</span> <span class="n">Twist</span><span class="p">,</span> <span class="n">queue_size</span> <span class="o">=</span> <span class="mi">100</span><span class="p">)</span>
</pre></div>
<p>The <code>queue_size</code> argument specifies the message buffer length and allows for asynchronous transfer of messages on the ROS message queue.</p>
<ul>
<li>Construct a Twist Message</li>
</ul>
<div class="highlight"><pre><span class="n">twist</span> <span class="o">=</span> <span class="n">Twist</span><span class="p">()</span>
<span class="n">twist</span><span class="p">.</span><span class="n">linear</span><span class="p">.</span><span class="n">x</span> <span class="o">=</span> <span class="mf">0.5</span><span class="p">;</span>
<span class="n">twist</span><span class="p">.</span><span class="n">linear</span><span class="p">.</span><span class="n">y</span> <span class="o">=</span> <span class="mi">0</span><span class="p">;</span> <span class="n">twist</span><span class="p">.</span><span class="n">linear</span><span class="p">.</span><span class="n">z</span> <span class="o">=</span> <span class="mi">0</span><span class="p">;</span>
<span class="n">twist</span><span class="p">.</span><span class="n">angular</span><span class="p">.</span><span class="n">x</span> <span class="o">=</span> <span class="mi">0</span><span class="p">;</span> <span class="n">twist</span><span class="p">.</span><span class="n">angular</span><span class="p">.</span><span class="n">y</span> <span class="o">=</span> <span class="mi">0</span><span class="p">;</span>
<span class="n">twist</span><span class="p">.</span><span class="n">angular</span><span class="p">.</span><span class="n">z</span> <span class="o">=</span> <span class="mi">0</span><span class="p">;</span>
</pre></div>
<ul>
<li>Publish the message</li>
</ul>
<div class="highlight"><pre><span class="n">p</span><span class="p">.</span><span class="n">publish</span><span class="p">(</span><span class="n">twist</span><span class="p">)</span>
</pre></div>
<p>That was easy, wasn't it?</p>
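<p>For completeness, here's a minimal sketch putting those steps together into a single node that drives the Husky in a circle, reusing the values from the <code>rostopic</code> example earlier (a sketch, not the exact ERAS code):</p>
<div class="highlight"><pre># A minimal sketch: drive the simulated Husky in a circle by publishing
# Twist messages at 100 Hz, mimicking `rostopic pub -r 100`.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('move')
p = rospy.Publisher('husky_velocity_controller/cmd_vel', Twist, queue_size=100)

twist = Twist()
twist.linear.x = 5.0   # linear speed along the x axis
twist.angular.z = 2.0  # angular speed about the z axis

rate = rospy.Rate(100)  # publish at 100 Hz
while not rospy.is_shutdown():
    p.publish(twist)
    rate.sleep()
</pre></div>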
<p>Changing the attributes allows the Husky to move in a circle or a nautilus shape -</p>
<p><img alt="Husky Circle" src="http://siddhantsci.org/images/husky_circle.png" /></p>
<p>In this way, I proceeded to create ROS nodes that accept Twist messages from any application, and made a small teleoperation program along the lines of the <strong>Arrow</strong> server in ERAS. With Franco's help, I set up the Arrow Tango server and obtained the attributes for distance and orientation.</p>
<p>The next aim is to use the distance and orientation information on the Tango bus and map it to Husky commands so that it can move around appropriately on the ground, like this -</p>
<p><img alt="Husky Nautilus" src="http://siddhantsci.org/images/husky_nautilus.png" /></p>
<p><em>Random ROS Tidbit</em> - While working with ROS, I came across this interesting command <code>source</code></p>
<p>Why do I call it interesting?</p>
<p>It does not have a <strong>man-page</strong>; it does not have a <strong>--help</strong> or <strong>-h</strong> argument. It has one simple purpose -</p>
<blockquote>
<p>Execute the content of the file passed as argument <strong>in the current shell</strong></p>
</blockquote>
<p>Note that it is not the same as <strong>./</strong> which creates a new shell to run the command. Shells are nifty processes which allow other program processes to run. I wrote a shell from scratch for a Network Programming course assignment. You may find it <a href="https://github.com/sidcode/sigshell">here</a>.</p>
<h2>Skype Meeting for Bodytracking</h2>
<p>Franco, Yuval, Fabio, Ezio, Vito and I had an important meeting on 2nd June (a couple of hours before writing this post). The purpose of the meeting was <strong>Mapping Bodytracking to Telerobotics</strong>. The whole point of the project is to allow complete virtual- and augmented-reality immersion between the astronaut and the rover. This is what that means: the robot (a humanoid or a rover) should be able to mimic human action as closely as possible. How? If the astronaut runs fast on the Motivity treadmill at a particular angle, the robot should move faster at that angle relative to the moving base position. This would make use of Vito's Kinect-based bodytracking module for determining incremental distance and orientation.</p>
<p>Since the Husky understands velocities in the Twist message, the distance/orientation information must be transformed into linear/angular velocity. I'll be working on this during the week.</p>
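<p>I haven't written this transformation yet, but a minimal sketch of the idea might look like the following, assuming the body-tracker hands us an incremental displacement and heading change together with the elapsed time (all hypothetical inputs):</p>
<div class="highlight"><pre># A minimal sketch, assuming incremental tracker data (hypothetical inputs):
# delta_distance in metres, delta_heading in radians, delta_t in seconds.
from geometry_msgs.msg import Twist

def deltas_to_twist(delta_distance, delta_heading, delta_t):
    """Map body-tracking increments to the velocities the Husky expects."""
    twist = Twist()
    twist.linear.x = delta_distance / delta_t   # linear velocity (m/s)
    twist.angular.z = delta_heading / delta_t   # angular velocity (rad/s)
    return twist
</pre></div>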
<p>Fabio brought up the important aspect of autonomy control in the robotic system. He stressed the need for three different stimuli to the robot -</p>
<ul>
<li>From the <strong>Bodytracking application</strong> (external)</li>
<li>From the <strong>robot's onboard sensors</strong> (internal)</li>
<li>From an external source</li>
</ul>
<p>This suggestion definitely adds robustness to the entire design; it will help the robot avoid hitting a rock and override an astronaut's command in case of danger. I will look into it this week and keep semi-autonomy in mind for Telerobotics.</p>
<p>Yuval talked about contacting the team in Canada that provided the Husky during V-ERAS 14. The work that I do will eventually be tested on a real Husky. </p>
<p>Adding a UR10 robotic arm to the Husky to facilitate manipulation and imitation of the human hand was also proposed. I'll look into that after the work on steering is complete.</p>
<p>In this way, the meeting was <strong>quite important</strong> and a bunch of <strong>crucial decisions</strong> regarding <strong>Telerobotics and Bodytracking</strong> were taken.</p>
<h2>The Week ahead</h2>
<p>The following week, we'll have another meeting with all the students and possibly a joint code review session. I will be integrating ROS and Tango and adding support for different levels of Robot control through additional ROS nodes.</p>
<h2>Summary</h2>
<p>In hindsight, before the start of the coding period I was scared my GSoC coding experience would turn out to be like this -</p>
<p><img alt="Coding By the Sill" src="http://siddhantsci.org/images/codingbythesill.jpg" />
Source - <a href="https://www.facebook.com/cluecomics">CLUE</a></p>
<p>:) In fact, I faced nothing like that (though the headphones and the loneliness are true :D ). There were minor setbacks - I had to reinstall ROS after purging my MySQL configuration for Tango - but these were the usual frustrations that crop up with computer programming and Linux, nothing humongous.
But this is where the <strong>Zen of Python</strong> kicks in! With top-notch resources like the <code>logging</code> module, <code>rqt-graph</code>, and the inbuilt ROS logger, programming was a breeze. Add to that the awesomeness of the Italian Mars Society: I hit a doubt in bodytracking, and six people got on a Skype call to resolve the issues being faced, and resolve them we did, with gusto.</p>
<p>The first week was super-hectic. Left alone with a computer and a programming problem, all-nighters were inevitable. It is proving to be a challenging and fun summer. </p>
<p>Watch out for my next post in the GSoC 2015 series!</p>
<p>Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Software Architecture Document for Telerobotics</title><link href="http://siddhantsci.org/blog/2015/05/29/software-architecture-document-for-telerobotics/" rel="alternate"></link><updated>2015-05-29T00:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-05-29:blog/2015/05/29/software-architecture-document-for-telerobotics/</id><summary type="html"><h2>The First Three Days</h2>
<p>Hi! The last couple of days were quite hectic. I am still getting used to the <em>7-hours-a-day</em> schedule of GSoC. The good thing about GSoC is that you can adjust the programming schedule to your own convenience, which is one more reason it is a notch above other summer coding programs. Still, I am devoting about 10 hours every day in these initial days to ensure I am well-positioned with respect to my timeline, and to keep learning on the go.</p>
<p>Rants aside, I recently completed the first draft of the Software Architecture Document for my project.</p>
<h2>Software Architecture Document</h2>
<p>A <strong>Software Architecture Document</strong> (quite funnily abbreviated as SAD) is an important (read: very important) document which spells out what a software project is going to look like when it is built and shipped. I must thank the Italian Mars Society for giving me the much-needed push into the world of Open Source Software Engineering.</p>
<p>The architecture of any program, especially an open-source one, as described in the excellent book <a href="http://aosabook.org/en/index.html">Architecture of Open Source Applications</a>, describes software in terms of <strong>different</strong> layers of abstraction, depending on who wants to study and improve upon it. Open-source applications are the product of the efforts of multiple people working on different aspects of a project together. To facilitate effective and non-redundant collaboration with proper version control, a software architecture document comes in handy.</p>
<p>To put it in one line - </p>
<blockquote>
<p><strong>SAD ensures all developers, testers, and users are on the same page.</strong></p>
</blockquote>
<h2>SAD for Telerobotics Application</h2>
<p>Take my <strong>Telerobotics application</strong> for instance. It is made up of three <em>distinct</em> <strong>features</strong> or <strong>functional requirements</strong> - </p>
<ul>
<li>Mapping Human body-tracking information to rover motion instructions</li>
<li>Allowing real-time streaming of the rover's stereo camera-feed to the ERAS application</li>
<li>Providing an Augmented Reality interface obtained from processing the rover's sensor data</li>
</ul>
<p>Although I am the only developer currently working on these aspects, I must ensure that the application remains in a <strong>well-maintained state</strong> throughout the life of the project. I must also ensure that a developer with skills in Robotics gets information relevant to the Robotics subsystem of the application (ROS knowledge). And I must separate the concerns of a network-communications developer from those of the user (the astronaut) while working on real-time streaming from the rover to the head-mounted virtual reality device.</p>
<p>While the features describe the expected behaviour of the software system, they require a lot of background machinery which is essential for operation but not relevant to the end-user. These are <strong>non-functional requirements</strong>. To give an example, the <strong>Robot Operating System</strong> is used to maneuver the Husky around. But the astronaut or the software system need not be concerned that robot command, control, and communication (the C3 architecture) takes place using ROS or other robot platforms like YARP or Player/Stage. </p>
<p>Non-functional requirements are, in turn, quite important for satisfying the performance requirements of the software system. For instance, the Real-Time Streaming Protocol (RTSP) that I'll be working with soon directly impacts the performance requirement of <strong>hard real-time streaming support</strong>.</p>
<p>The Software Architecture Document is generic in that it keeps in mind the evolving technology that may be used to cater to the application in focus. For instance, the <strong>Unmanned Ground Vehicle</strong> currently being considered is the <strong>Husky rover</strong>. It is my responsibility to ensure that the logical layers are independent of the robot being used. The software should be easily <strong>extensible</strong> to a future ground vehicle that may use an altogether different control architecture than <strong>ROS</strong>.</p>
<p>Finally, SAD is practical. It describes the timeline of development of the features.</p>
<h2>My experience with SADs</h2>
<p>Working on the SAD has been an immensely edifying experience for me for several reasons -</p>
<ol>
<li>My first foray into Software Engineering literature.</li>
<li>Learning <em>reStructuredText</em> as the documentation tool for SAD.</li>
<li>Appreciating how finely ingrained software-engineering principles are in programming-language design. For instance, the sections of a SAD directly mirror the features of Object-Oriented Programming (abstraction, encapsulation, separation of concerns) and Functional Programming (side effects, higher-order functions).</li>
</ol>
<h2>Links to the document</h2>
<p>If you are interested, the link to my <a href="https://bitbucket.org/italianmarssociety/eras/src/132fff239c3ff892f7cfc8836d3a2921244e444e/servers/telerobotics/doc/?at=default">software architecture document source is this</a>.</p>
<p>The documentation on readthedocs can be found <a href="http://eras.readthedocs.org/en/latest/servers/telerobotics/doc/sad.html">here</a>.</p>
<p>Until my next post on my first week of coding.</p>
<p>Ciao!</p></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Workspace Setup for Telerobotics</title><link href="http://siddhantsci.org/blog/2015/05/26/workspace-setup-for-telerobotics/" rel="alternate"></link><updated>2015-05-26T00:53:52+00:00</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-05-26:blog/2015/05/26/workspace-setup-for-telerobotics/</id><summary type="html"><p>Hi! Yesterday was the start of the <strong>coding period</strong> which will continue for another 12 weeks. The <a href="http://siddhantsci.org/blog/2015/05/23/gsoc-15-community-bonding/">Community Bonding period</a> gave me enough time to install the required packages. This post explains those packages in minimal detail.</p>
<h2>Project Components</h2>
<p>My work would heavily require the use of -</p>
<ol>
<li><strong>ROS (Robot Operating System)</strong> to work with the <a href="http://www.clearpathrobotics.com/husky/">Husky Rover</a></li>
</ol>
<p><img alt="ROS Logo" src="http://siddhantsci.org/images/roslogo.png" /></p>
<p>ROS is a meta-operating system which is very popular with roboticists. My future posts will describe my work with ROS and the concepts that I am using in detail.</p>
<p>More specifically, I am working with ROS Indigo Igloo, which is an LTS (long-term support) release.</p>
<p><img alt="Indigo Logo" src="http://siddhantsci.org/images/indigologo.png" /></p>
<ol start="2">
<li><strong>Gazebo Simulation environment</strong> to test the programs written to drive the Husky around</li>
</ol>
<p><img alt="Gazebo Logo" src="http://siddhantsci.org/images/gazebologo.png" /></p>
<p>I am working with Gazebo version 2.2.3.</p>
<ol start="3">
<li><strong>Tango-Controls</strong> Supervisory Control and Data Acquisition system</li>
</ol>
<p>If data from different devices is the blood of ERAS, then Tango is the circulatory system. It does an excellent job of handling multiple devices (the Motivity treadmill, Kinect sensors, Blender Game Engine instances, and in my case a ROS machine with Husky interfaces).</p>
<p><img alt="Tango Logo" src="http://siddhantsci.org/images/tangologo.png" /></p>
<ol start="4">
<li><strong>Blender Game Engine</strong> to model the standalone V-ERAS application.</li>
</ol>
<p><img alt="Blender Logo" src="http://siddhantsci.org/images/blenderlogo.png" /></p>
<p>The V-ERAS simulation of the spacecraft looks like this -</p>
<p><img alt="V-ERAS simulation" src="http://siddhantsci.org/images/verassim.png" /></p>
<p>In the second phase of the project, I will be involved in real-time streaming of rover stereo camera feed to the displays in the V-ERAS simulation.</p>
<ol start="5">
<li><strong>Python</strong> (of course :D )</li>
</ol>
<p><img alt="Python Logo" src="http://siddhantsci.org/blog/2015/05/26/workspace-setup-for-telerobotics/images/python-logo.png" /></p>
<ol start="6">
<li><strong>Ubuntu 14.04 (Trusty Tahr)</strong></li>
</ol>
<p>ROS Indigo offers complete support for this version of Ubuntu.</p>
<p><img alt="Ubuntu Logo" src="http://siddhantsci.org/blog/2015/05/26/workspace-setup-for-telerobotics/images/ubuntulogo.png" /></p>
<h2>Screenshots</h2>
<p>To Python-ify my experience even further, I installed <strong>Terminator</strong>, a Python-based program which makes terminal arrangement as flexible as humanly possible on Linux.</p>
<p>Working with ROS requires opening up a lot of terminals, and Terminator makes this hassle-free.</p>
<p>Take a look for yourselves -</p>
<p><img alt="Terminator" src="http://siddhantsci.org/blog/2015/05/26/workspace-setup-for-telerobotics/images/terminator.png" /></p>
<p>I am using different text editors for different purposes.</p>
<p>While working with <strong>Markdown</strong> and <strong>reStructuredText</strong>, I use Sublime Text.</p>
<p><img alt="Sublime Text Logo" src="http://siddhantsci.org/blog/2015/05/26/workspace-setup-for-telerobotics/images/sublimelogo.png" /></p>
<p>Vim is my editor of choice for all things Python. I have been using it for open-source development since last year.</p>
<p>So, with this I wrap up this setup post.</p>
<p>Just for kicks, this is what my desktop looks like -</p>
<p><img alt="Desktop IMS" src="http://siddhantsci.org/blog/2015/05/26/workspace-setup-for-telerobotics/images/desktop.png" /></p>
<p>I must admit it keeps me motivated to design software for Mars missions. Just in case you're wondering, the theme I use is the MacBuntu theme. It is pretty distraction-free.</p>
<h2>To Coding and beyond!</h2></summary><category term="GSoC"></category><category term="Python"></category><category term="PSF"></category><category term="computers"></category><category term="science"></category><category term="exploration"></category><category term="space"></category><category term="mars"></category><category term="IMS"></category><category term="Italian Mars Society"></category></entry><entry><title>Room 276, Gandhi Bhawan, BITS Pilani</title><link href="http://siddhantsci.org/blog/2015/05/24/room-276-gandhi-bhawan-bits-pilani/" rel="alternate"></link><updated>2015-05-24T00:53:52+05:30</updated><author><name>Siddhant Shrivastava</name></author><id>tag:siddhantsci.org,2015-05-24:blog/2015/05/24/room-276-gandhi-bhawan-bits-pilani/</id><summary type="html"><h2>Why write a post about an indistinguishable dorm room?</h2>
<p><strong>Gandhi 276</strong> (Coordinates - 28.360874 N, 75.588507 E) ushers in a profuse stream of consciousness (and subconsciousness) for me.</p>
<p>I made it my home for <strong>two</strong> long academic years, that is, four semesters (2013-2015) and a summer (2015).</p>
<p>Today (24 May, 2015), I am leaving it as I complete the final phase of packing up the lightweight items. Incidentally, today is also the date on which I took my BITSAT exam in 2012. It also happily coincides with the end of the community bonding period and the start of the coding period for GSoC, which I explained in <a href="http://siddhantsci.org/blog/2015/05/23/gsoc-15-community-bonding/">this post</a>.</p>
<h2>How did I get this room?</h2>
<p>Out of pure whim. BITS Pilani allows students to choose their wings and rooms. The final wings are decided by a lottery system in case of collisions. So yes, our wing (the <em>'ghot'</em> wing) got the upper back wing, and consequently I got this room.</p>
<blockquote>
<p>About Gandhi 276</p>
</blockquote>
<h2>Experiences with GN-276</h2>
<p>I set foot in this room on <strong>July 31, 2013</strong>. The previous room occupant, <a href="https://in.linkedin.com/pub/lohi-uppalapati/2a/934/27a">U.R. Lohi</a>, was the president of the BITS Pilani Student Union for the 2012-2013 session. So in a way, I got the president's room, out of pure whim. <em>A cool bragging right to start with.</em></p>
<blockquote>
<p><strong>First Semester, 2013</strong></p>
</blockquote>
<ul>
<li>Best time of my CS program</li>
<li>Gel in with the new wingmates (called <em>wingies</em> at BITS)</li>
<li>The distraction-free semester, even with the new laptop</li>
<li>Programmed extensively in Java, Prolog, C</li>
<li>The last semester with an advanced Mathematics course, which I enjoyed (<strong>Differential Equations</strong>)</li>
<li>Worked hard as the <strong>Technical Team</strong> member of <a href="http://embryo.bits-pilani.ac.in"><strong>BITSEmbryo</strong></a></li>
<li>Still a Windows user</li>
<li>The start of my Robotics career</li>
</ul>
<blockquote>
<p><strong>Second Semester, 2014</strong></p>
</blockquote>
<ul>
<li>The lowest point of my CS program</li>
<li>Watched more than 500 films</li>