This repository was archived by the owner on Jan 24, 2018. It is now read-only.

Commit 8da1a89: Update to v0.6.5
Parent: 57b70e5

464 files changed, 760,144 insertions(+), 3,607 deletions(-)

CMakeLists.txt

Lines changed: 4 additions & 3 deletions

```diff
@@ -1,9 +1,10 @@
 cmake_minimum_required(VERSION 2.8.9)
 project(realsense_samples)
 
-# If not set on commandline, use Yocto cross-compiler SDK environment as root
-if(NOT SYSROOT)
-set(SYSROOT "$ENV{SDKTARGETSYSROOT}")
+if(NOT SAMPLE_PREFIX)
+set(SAMPLE_PREFIX "sample_")
 endif()
 
+add_definitions( -DINSTALL_PREFIX="${CMAKE_INSTALL_PREFIX}" )
+install(FILES README.md DESTINATION share/doc/librealsense-samples)
 add_subdirectory(samples)
```
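The per-sample CMakeLists files are not part of this excerpt, but the new `SAMPLE_PREFIX` variable (defaulting to `sample_` when not set on the command line) is presumably prepended to each sample's target name. A hypothetical fragment of one sample's CMakeLists.txt, assuming that convention:

```cmake
# Hypothetical sample CMakeLists.txt fragment -- not part of this commit's diff.
# SAMPLE_PREFIX is inherited from the top-level CMakeLists.txt, where it
# defaults to "sample_"; the Debian packaging overrides it to "rs_".
set(TARGET_NAME "${SAMPLE_PREFIX}or_tutorial_1")
add_executable(${TARGET_NAME} main.cpp)
install(TARGETS ${TARGET_NAME} DESTINATION bin)
```

With the default prefix this would install `sample_or_tutorial_1`; configuring with `-DSAMPLE_PREFIX=rs_` would install `rs_or_tutorial_1` instead.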

README.md

Lines changed: 31 additions & 56 deletions

````diff
@@ -3,77 +3,52 @@
 [ ![Release] [release-image] ] [releases]
 [ ![License] [license-image] ] [license]
 
-[release-image]: http://img.shields.io/badge/release-0.2.10-blue.svg?style=flat
+[release-image]: http://img.shields.io/badge/release-0.6.5-blue.svg?style=flat
 [releases]: https://github.com/IntelRealSense/realsense_sdk
 
 [license-image]: http://img.shields.io/badge/license-Apache--2-blue.svg?style=flat
 [license]: LICENSE
 
 ## Features
-These samples illustrate how to develop applications using Intel® RealSense™ cameras for Objection Recognition, Person Tracking, and SLAM.
-
-## Description
-
-## What's New In This Release
-Initial Release
-
-
-## Compatible Devices
-Intel® Linux RealSense™ 3D Camera ZR300
-
-
-## Compatible Platforms
-The library is written in standards-conforming C++11. It is developed and tested on the following platform:
-1. Reference Linux* OS for IoT, build 181 or newer
-
-## Supported Languages and Frameworks
-C++
+These samples illustrate how to develop applications using Intel® RealSense™ cameras for Object Library (OR), Person Library (PT), and Simultaneous Localization And Mapping (SLAM).
 
 ## Functionality
-**API is experimental and not an official Intel product. It is subject to incompatible API changes in future updates. Breaking API changes are noted through release numbers**
+**API is experimental and not an official Intel product. It is subject to incompatible API changes in future updates. Breaking API changes are noted through major release numbers**
 
-The following sample functionality is provided in this release:
-- **or_tutorial_1**: This console app illustrates the use of libRealSense, libOR, and the Linux SDK Framework to use the RealSense camera's depth and color sensors to identify objects in the scene. Each object identified will be printed on the command line along with the confidence value for the object, for objects which take up ~50% of the screen for recognition.
+The following sample projects are provided in this release:
+- **or_tutorial_1**: This console app illustrates the use of librealsense, libOR, and the Linux SDK Framework to use the ZR300 camera's depth and color sensors to identify objects in the scene. Each object identified will be printed on the command line along with the confidence value for the object, for objects which take up ~50% of the screen for recognition.
 - **or_tutorial_2**: This console app builds on top of or_tutorial_1, and illustrates how to utilize the libOR object localization and 3d localization functionality. All items with >= 90% confidence will identified along with the bounding box coordinates being displayed on the console.
 - **or_tutorial_3**: This console app builds on top of or_tutorial_2, and illustrates how to add the libOR object tracking functionality with localization and 3d localization. Objects for tracking can be as small as 1% of the screen.
-- **pt_tutorial_1**: This console app illustrates the use of libRealsense, realsense_persontracking, and the Linux SDK Framework to use the RealSense camera's depth and color sensors to detect people in the scene. The number of people detected in the current frame as well as cumulative total number of people will be displayed as a quantity on the console.
-- **pt_tutorial_2**: This sample app illustrates how to analyze someone’s posture. When a person is in the FOV, the app should display the following information: his head pose info (yaw, pitch, roll values) and his body orientation (front/side/back values).
-- **pt_tutorial_3**: This console app provides pointing gesture info for the gestures detected in the scene. This provides Pointing detected alert and also includes origin x,y coordinates as well as direction x,y coordinates.
-- **slam_tutorial_1**: This console app illustrates the use of libRealsense and libSLam libraries to use the ZR300 camera's depth, fish eye, and IMU sensors to print out the current camera module position and pose
+- **or_tutorial_1_web**: This GUI app builds on top of or_tutorial_1 and displays the live color preview from the camera within a browser and shows a table with a label for the object along with its confidence value.
+- **or_tutorial_2_web**: This GUI app builds on top of or_tutorial_2 and displays the live color preview from the camera within a browser and draws rectangles on the color images in the region where objects are recognized. This GUI also includes a table that shows a label for the object along with its confidence value and the 3D(x,y,z) coordinates of the object.
+- **or_tutorial_3_web**: This GUI app builds on top of or_tutorial_3 and displays the live color preview from the camera within a browser and draws rectangles on the color images in the region where objects are recognized and keeps track of the objects. This GUI also includes a table that shows a label for the object along with its confidence value and the 2D(x,y) coordinates of the object.
+- **or_tutorial_1_gui**: This GUI/console app builds on top of or_tutorial_1, and illustrates the use of librealsense, librealsense_object_recognition, and the Linux SDK Framework along with the RealSense camera's depth and color sensors to identify objects in the scene. Each object identified will be listed on the command line and on a pop-up window along with the confidence value for the object.
+- **or_tutorial_2_gui**: This GUI/console app builds on top of or_tutorial_2, and illustrates how to utilize object localization and 3D localization. A GUI is displayed in a pop-up window along with output on the console. Recognized objects will be highlighted with a bounding box, label, and confidence level drawn in a live video stream. Other information is displayed for each recognized object in a table in the pop-up window and on the console.
+- **or_tutorial_3_gui**: This GUI/console app builds on top of or_tutorial_3, and illustrates how to add object tracking functionality with localization and 3D localization. Tracked objects can be as small as 1% of the screen. The coordinates of the tracked object(s) are displayed in a pop-up window and the console. Tracked objects are shown by a labeled bounding rectangle drawn over a live video stream. The user can trigger the localization function using the mouse.
+- **or_tutorial_4_gui**: This GUI/console app builds on top of or_tutorial_3, and illustrates how to add object tracking functionality. The coordinates of the tracked object(s) are displayed in pop-up window and the console. Tracked objects are shown the pop-up window by a labeled bounding rectangle drawn over a live video stream. The user will choose the tracked object using the mouse.
+- **pt_tutorial_1**: This sample app illustrates the use of libRealsense, libPT, and the Linux SDK Framework to use the ZR300 camera's depth and color sensors to detect people in the scene. The number of people detected in the scene will be displayed as a quantity on the console.
+- **pt_tutorial_2**: This sample app illustrates how to analyze someone’s posture. When a person is in the FOV, the app should display the following information once every second: his head direction (yaw, pitch, roll values).
+- **pt_tutorial_3**: This sample app illustrates how to get the body tracking points and detect a pointing gesture. The app will display 6 body points, a “Pointing Detected” alert when the gesture is performed, and the pointing vector.
+- **pt_tutorial_1_web**: This GUI app builds on top of pt_tutorial_1 and displays the live color preview from the camera within a browser and draw rectangles around the person(s) detected in the camera frame and draws a color dot indicating the center of mass for the detected person in the image. There is also a table as part of the GUI that shows the person id, center of mass world coordinates(x,y,z) and the cumulative count of persons detected.
+- **pt_tutorial_2_web**: This GUI app builds on top of pt_tutorial_2 and displays the live color preview from the camera within a browser and draws rectangle around the head of a person detected in the camera frame. There is also a table as part of the GUI that shows the person id, and head pose information.
+- **pt_tutorial_3_web**: This GUI app builds on top of pt_tutorial_3 and displays the live color preview from the camera within a browser and draws rectangle around the person detected in the camera frame. Also draws an arrow to indicate the direction of the pointing gesture. There is also a table as part of the GUI that shows the person id, gesture origin(x,y,z) and gesture direction.
+- **pt_tutorial_4_web**: This GUI sample app illustrates how to register new users to the database, uploade the database to identify them when they appear in the scene.
+- **pt_tutorial_5_web**: This GUI app builds on top of pt_tutorial_1, and displays live color preview of the camera, and the bounding box of all detected people in the scene. When selecting one person out of the detected, the application start tracking this person and also show his center of mass (COM) as he’s being tracked in the scene.
+- **or_pt_tutorial_1**: This console app illustrates the use of librealsense, OR and PT libraries using the ZR300 camera identify objects and persons. The name of the object along with the confidence value will be printed on the console. Person information like person id and the 2D box coordinates will be printed on the console.
+- **or_pt_tutorial_1_web**: This GUI app builds on top of or_pt_tutorial_1 and displays the live color preview from the camera within a browser and draw rectangles around the person(s) and object(s) detected in the camera frame. There is also a table as part of the GUI that shows the person id, center of mass world coordinates(x,y,z) along with object label, object center world coordinates and confidence score.
+- **slam_tutorial_1_gui**: This app illustrates the use of the librealsense and librealsense_slam libraries. The application prints the camera pose translation to the console and draws the occupancy map in a separate window. You can pass the filename of a RealSense SDK recording as a command line argument, and it will play back the recording instead of using live camera data (see SLAM dev guide). It also illustrates how to save the occupancy map as a PPM file. This sample is the recommended starting point for learning the SLAM API.
+- **slam_tutorial_1_web**: This app builds on top of slam_tutorial_1_gui and displays live fisheye preview, occupancy map, input and tracking FPS for fisheye, depth, gyro and accelerometer frames, within a browser. This application can be used for viewing the SLAM output on a remote machine, which is useful for robots and other headless systems.
+- **slam_or_pt_tutorial_1**: This console app illustrates the use of librealsense, SLAM, OR and PT libraries using the ZR300 camera to print out the current camera module position, identified objects and identified persons. The name of the object along with the confidence value will be printed on the console for identified objects. Person information like person id and the 2D box coordinates will be printed on the console.
+- **slam_or_pt_tutorial_1_web**: This GUI app builds on top of slam_or_pt_tutorial_1 and displays live fisheye and color preview, occupancy map, input and tracking fps for fisheye, depth, gyro and accelerometer frames, within a browser. Draws rectangles around recognized objects and persons in the color preview.
 
-## Installation Guide
-## Dependency list
-The samples in this repository require librealsense, RealSense SDK framework, and RealSense Person Tracking, Object Recognition, and SLAM middleware.
-
-For your own reference build of Yocto, these libraries are provided in the Yocto Build layers at https://github.com/IntelRealSense/meta-intel-realsense
-
-For a pre-built OS image for the Intel Joule module including these libraries, the instructions at https://software.intel.com/en-us/flashing-ostro-on-joule document how to ensure that your Intel Joule module is updated with the latest version of OS. Build #181 or newer is required for these samples
-
-## Reference Linux* OS for IoT, using Command Line
-Support for building samples at the command line assume you have completed installation of the cross-compiler SDK, at the default location at /usr/local/ostroxt-x86_64.
-
-To build all samples:
-```bash
-$ source /usr/local/ostroxt-x86-64/environment-setup-corei7-64-ostro-linux
-$ mkdir workspace
-$ cd ~/workspace
-$ git clone http://github.com/IntelRealSense/realsense_samples
-$ mkdir build
-$ mkdir install
-$ cd ~/workspace/build
-$ cmake ../realsense_samples
-$ make –j
-$ make DESTDIR=../install install
-```
-This will compile all the samples using the cross-compiler, and place the sample binaries in the 'install' folder. You can then deploy these binaries to your module using ssh.
-
-## Reference Linux* OS for IoT, using Intel® System Studio IoT Edition
-These samples are fully integrated as part of the Intel System Studio IoT Edition. See the instructions at https://software.intel.com/en-us/node/672439 for downloading and installation steps.
+## Supported Languages and Frameworks
+C++
 
-## Known Issues
-n/a
+## Building The Samples
+Samples are provided for your convenience pre-built and validated by Intel (see: <https://software.intel.com/sites/products/realsense/intro/getting_started.html>). In addition, samples are provided as open source, enabling you to further customize them for your specific needs. If you would like to rebuild the samples from source, follow the "Installing development Kit" section of the link above.
 
 ## License
-Copyright 2016 Intel Corporation
+Copyright 2017 Intel Corporation
 
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this project except in compliance with the License.
````

debian/changelog

Lines changed: 5 additions & 0 deletions

```diff
@@ -0,0 +1,5 @@
+librealsense-samples (0.6.5-1) unstable; urgency=medium
+
+  * Beta4 Release
+
+ -- RealSense <realsense-sw@intel.com>  Thu, 26 Jan 2017 09:41:36 +0800
```

debian/compat

Lines changed: 1 addition & 0 deletions

```diff
@@ -0,0 +1 @@
+9
```

debian/control

Lines changed: 12 additions & 0 deletions

```diff
@@ -0,0 +1,12 @@
+Source: librealsense-samples
+Priority: optional
+Maintainer: Intel RealSense <realsense-sw@intel.com>
+Build-Depends: debhelper (>=9), cmake (>= 2.8), libopencv-dev (>=3.1), libjpeg-dev, librealsense-dev (>=1.12.1), librealsense-sdk-dev (>=0.7.1), librealsense-persontracking-dev (>=0.5.10), librealsense-object-recognition-dev, librealsense-slam-dev (>=2.0.4)
+Standards-Version: 3.9.6
+Section: libs
+Homepage: https://github.com/IntelRealSense/realsense_samples
+
+Package: librealsense-samples
+Architecture: any
+Depends: ${shlibs:Depends}, ${misc:Depends}
+Description: RealSense Samples
```

debian/copyright

Lines changed: 3 additions & 0 deletions

```diff
@@ -0,0 +1,3 @@
+Files: *
+Copyright: Copyright 2016 Intel Corporation
+License: Apache
```

debian/rules

Lines changed: 12 additions & 0 deletions

```diff
@@ -0,0 +1,12 @@
+#!/usr/bin/make -f
+%:
+	dh $@ --buildsystem=cmake
+
+override_dh_auto_make:
+	make
+
+override_dh_shlibdeps:
+	dh_shlibdeps --dpkg-shlibdeps-params=--ignore-missing-info
+
+override_dh_auto_configure:
+	dh_auto_configure -- -DSAMPLE_PREFIX=rs_
```
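The `dh_auto_configure` override passes `-DSAMPLE_PREFIX=rs_`, so binaries built through the Debian packaging are prefixed `rs_` rather than the default `sample_`. A sketch of the standard debhelper build that exercises this rules file (package name and version come from debian/changelog; the exact output filename and architecture depend on your build host):

```shell
# From the repository root, with the Build-Depends from debian/control installed.
# debhelper drives cmake via debian/rules, passing -DSAMPLE_PREFIX=rs_.
dpkg-buildpackage -us -uc -b
# The binary package (e.g. librealsense-samples_0.6.5-1_<arch>.deb)
# is written to the parent directory.
```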

samples/CMakeLists.txt

Lines changed: 19 additions & 2 deletions

```diff
@@ -1,9 +1,26 @@
 cmake_minimum_required(VERSION 2.8.9)
+add_subdirectory(common)
+add_subdirectory(or_pt_tutorial_1)
+add_subdirectory(or_pt_tutorial_1_web)
 add_subdirectory(or_tutorial_1)
+add_subdirectory(or_tutorial_1_gui)
+add_subdirectory(or_tutorial_1_web)
 add_subdirectory(or_tutorial_2)
+add_subdirectory(or_tutorial_2_gui)
+add_subdirectory(or_tutorial_2_web)
 add_subdirectory(or_tutorial_3)
+add_subdirectory(or_tutorial_3_gui)
+add_subdirectory(or_tutorial_3_web)
+add_subdirectory(or_tutorial_4_gui)
 add_subdirectory(pt_tutorial_1)
+add_subdirectory(pt_tutorial_1_web)
 add_subdirectory(pt_tutorial_2)
+add_subdirectory(pt_tutorial_2_web)
 add_subdirectory(pt_tutorial_3)
-add_subdirectory(slam_tutorial_1)
-
+add_subdirectory(pt_tutorial_3_web)
+add_subdirectory(pt_tutorial_4_web)
+add_subdirectory(pt_tutorial_5_web)
+add_subdirectory(slam_tutorial_1_gui)
+add_subdirectory(slam_tutorial_1_web)
+add_subdirectory(slam_or_pt_tutorial_1)
+add_subdirectory(slam_or_pt_tutorial_1_web)
```

samples/common/CMakeLists.txt

Lines changed: 2 additions & 0 deletions

```diff
@@ -0,0 +1,2 @@
+cmake_minimum_required(VERSION 2.8.9)
+add_subdirectory(web_display)
```
