```mermaid
graph LR;
argus_node("ArgusStereoNode (Raw Image)") --> left_encoder_node("EncoderNode (Compressed Image)");
argus_node --> right_encoder_node("EncoderNode (Compressed Image)");
```
In this tutorial, we'll demonstrate how to perform H.264 encoding using an Argus-compatible camera and `isaac_ros_h264_encoder`, and save the compressed images into a rosbag.
Note: `isaac_ros_h264_encoder` must run on a Jetson platform.
- Follow the Quickstart section up to step 6 in the main README.
- Outside the container, clone an additional repository required to run an Argus-compatible camera under `~/workspaces/isaac_ros-dev/src`:

  ```bash
  cd ~/workspaces/isaac_ros-dev/src
  git clone https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_argus_camera
  ```
- Inside the container, build and source the workspace:

  ```bash
  cd /workspaces/isaac_ros-dev && \
    colcon build && \
    source install/setup.bash
  ```
- (Optional) Run tests to verify complete and correct installation:

  ```bash
  colcon test --executor sequential
  ```
- Run the launch file. It launches the example and records the `CompressedImage` and `CameraInfo` topic data into a rosbag in your current folder:

  ```bash
  ros2 launch isaac_ros_h264_encoder isaac_ros_h264_encoder_argus.launch.py
  ```
- (Optional) If you want to decode and visualize the images from the rosbag, copy the recorded rosbag to an x86 machine equipped with an NVIDIA GPU, then follow steps 7 & 8 in the Quickstart section (change the rosbag path and input dimensions accordingly in step 7):

  ```bash
  ros2 launch isaac_ros_h264_decoder isaac_ros_h264_decoder_rosbag.launch.py rosbag_path:=<path to your rosbag folder>
  ```
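If you want a quick sanity check of the recording without moving it to a GPU machine, you can inspect the bag file directly. The sketch below is a hypothetical helper (not part of Isaac ROS) that counts messages per topic; it assumes rosbag2's default sqlite3 storage, which keeps a `topics` table and a `messages` table keyed by `topic_id`. The topic name used in the usage note is illustrative only.

```python
import sqlite3


def bag_topic_counts(db3_path: str) -> dict:
    """Count recorded messages per topic in a rosbag2 sqlite3 (.db3) file.

    Assumes rosbag2's default sqlite3 storage schema: a `topics` table
    with (id, name, ...) and a `messages` table with a `topic_id` column.
    """
    conn = sqlite3.connect(db3_path)
    try:
        # LEFT JOIN so topics with zero messages still appear with count 0.
        rows = conn.execute(
            "SELECT topics.name, COUNT(messages.id) "
            "FROM topics LEFT JOIN messages ON messages.topic_id = topics.id "
            "GROUP BY topics.id"
        ).fetchall()
    finally:
        conn.close()
    return dict(rows)
```

For example, `bag_topic_counts("rosbag2_.../rosbag2_..._0.db3")` would return a mapping like `{'/left/image_compressed': 300, ...}` (topic names depend on your launch configuration). `ros2 bag info <bag folder>` reports the same summary from the command line.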
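The `data` field of each recorded `CompressedImage` message carries the raw H.264 bitstream rather than decodable pixels, which is why a decoder step is needed for visualization. Assuming the bitstream uses standard Annex-B framing (3- or 4-byte start codes; an assumption about the encoder's output format, not something stated in this tutorial), a minimal sketch of splitting it into NAL units looks like this; `split_nal_units` is a hypothetical helper:

```python
def split_nal_units(stream: bytes) -> list:
    """Split an H.264 Annex-B byte stream into NAL units.

    Units are delimited by 0x000001 start codes; a 4-byte 0x00000001
    start code shows up as a trailing zero on the previous unit, which
    rstrip() removes along with any zero padding.
    """
    units, start, i = [], None, 0
    while i < len(stream) - 2:
        if stream[i:i + 3] == b"\x00\x00\x01":
            if start is not None:
                units.append(stream[start:i].rstrip(b"\x00"))
            i += 3
            start = i
        else:
            i += 1
    if start is not None:
        units.append(stream[start:])
    return units
```

The low 5 bits of a unit's first byte (`unit[0] & 0x1F`) give its NAL type in H.264, e.g. 7 for SPS, 8 for PPS, 5 for an IDR slice, so a helper like this can show whether each recorded message starts with a keyframe.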
Here is a screenshot of the example result: