diff --git a/docs/CHANGELOG.md b/docs/CHANGELOG.md index 8fac3b04..0338b58e 100644 --- a/docs/CHANGELOG.md +++ b/docs/CHANGELOG.md @@ -1,14 +1,16 @@ -# What's new +# Change Log -Below is summarized list of important changes. This does not include minor/less important changes or bug fixes or documentation update. This list updated every few months. For complete detailed changes, please review [commit history](https://github.com/nervosys/AutonomySim/commits/main). +Below is a summarized list of important changes. It does not include minor changes, bug fixes, or documentation updates. This list is updated every few months. For complete, detailed changes, please review the project [commit history](https://github.com/nervosys/AutonomySim/commits/main). ### February 2024 * AutonomySim relaunched as clean-slate repository. * Major documentation update. +* Trimmed the fat to go light and fast. ### October 2023 * AutonomySim fork created. * Migration from Batch/Command to PowerShell. +* Support dropped for Unity, Gazebo, and ROS1 to better focus on Unreal Engine 5. * Major project reorganization begun. ### Jan 2022 @@ -188,7 +190,7 @@ Below is summarized list of important changes. This does not include minor/less ### November, 2018 * Added Weather Effects and [APIs](apis.md#weather-apis) * Added [Time of Day API](apis.md#time-of-day-api) -* An experimental integration of [AutonomySim on Unity](https://github.com/nervosys/AutonomySim/tree/main/Unity) is now available. Learn more in [Unity blog post](https://blogs.unity3d.com/2018/11/14/AutonomySim-on-unity-experiment-with-autonomous-vehicle-simulation). +* An experimental integration of [AutonomySim on Unity](https://github.com/nervosys/AutonomySim/tree/master/Unity) is now available. Learn more in the [Unity blog post](https://blogs.unity3d.com/2018/11/14/AutonomySim-on-unity-experiment-with-autonomous-vehicle-simulation). 
* [New environments](https://github.com/nervosys/AutonomySim/releases/tag/v1.2.1): Forest, Plains (windmill farm), TalkingHeads (human head simulation), TrapCam (animal detection via camera) * Highly efficient [NoDisplay view mode](settings.md#viewmode) to turn off main screen rendering so you can capture images at high rate * [Enable/disable sensors](https://github.com/nervosys/AutonomySim/pull/1479) via settings diff --git a/docs/SUPPORT.md b/docs/SUPPORT.md index 9c57ec5b..28a3b17e 100644 --- a/docs/SUPPORT.md +++ b/docs/SUPPORT.md @@ -1,7 +1,7 @@ # Support -We highly recommend to take a look at source code and contribute to the project. Due to large number of incoming feature request we may not be able to get to your request in your desired timeframe. So please [contribute](CONTRIBUTING.md) :). +We highly recommend that users look at the source code and contribute to the project. Due to the large number of incoming feature requests, we may not be able to get to your request in your desired timeframe. So, please [contribute](CONTRIBUTING.md): -* [Ask in Discussions](https://github.com/nervosys/AutonomySim/discussions) -* [File GitHub Issue](https://github.com/nervosys/AutonomySim/issues) -* [Join AutonomySim Facebook Group](https://www.facebook.com/groups/1225832467530667/) +* [File GitHub Issues](https://github.com/nervosys/AutonomySim/issues) +* [Join the Discussions](https://github.com/nervosys/AutonomySim/discussions) +* [Join the Discord](https://discord.gg/x84JXYje) diff --git a/docs/adding_new_apis.md b/docs/adding_apis.md similarity index 83% rename from docs/adding_new_apis.md rename to docs/adding_apis.md index 62a98ab6..e6a0e65d 100644 --- a/docs/adding_new_apis.md +++ b/docs/adding_apis.md @@ -1,6 +1,8 @@ # Adding New APIs -Adding new APIs requires modifying the source code. Much of the changes are mechanical and required for various levels of abstractions that AutonomySim supports. 
The main files required to be modified are described below along with some commits and PRs for demonstration. Specific sections of the PRs or commits might be linked in some places, but it'll be helpful to have a look at the entire diff to get a better sense of the workflow. Also, don't hesitate in opening an issue or a draft PR also if unsure about how to go about making changes or to get feedback. +Adding new `AutonomySim` APIs requires modifying the source code. These changes are required for the various levels of abstractions that `AutonomySim` supports. The primary files that need to be modified are described below, along with demonstration commits and pull requests (PRs). Specific sections of PRs or commits might be linked in some places, but it'll be helpful to have a look at the entire `diff` to get a sense of the workflow. + +Do not hesitate to open an issue or a draft PR if you are unsure about how to make changes or to get feedback. ## Implementing the API @@ -33,19 +35,14 @@ The APIs use [msgpack-rpc protocol](https://github.com/msgpack-rpc/msgpack-rpc) To add the RPC code to call the new API, follow the steps below. Follow the implementation of other APIs defined in the files. 1. Add an RPC handler in the server which calls your implemented method in [RpcLibServerBase.cpp](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/src/api/RpcLibServerBase.cpp). Vehicle-specific APIs are in their respective vehicle subfolder. - 2. Add the C++ client API method in [RpcClientBase.cpp](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/src/api/RpcLibClientBase.cpp) - 3. Add the Python client API method in [client.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/AutonomySim/client.py). If needed, add or modify a structure definition in [types.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/AutonomySim/types.py) ## Testing Testing is required to ensure that the API is working as expected. 
For this, as expected, you'll have to use the source-built AutonomySim and Blocks environment. Apart from this, if using the Python APIs, you'll have to use the `AutonomySim` package from source rather than the PyPI package. Below are 2 ways described to go about using the package from source - -1. Use [setup_path.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/setup_path.py). It will setup the path such that the local AutonomySim module is used instead of the pip installed package. This is the method used in many of the scripts since the user doesn't need to do anything other than run the script. - Place your example script in one of the folders inside `PythonClient` like `multirotor`, `car`, etc. You can also create one to keep things separate, and copy the `setup_path.py` file from another folder. - Add `import setup_path` before `import AutonomySim` in your files. Now the latest main API (or any branch currently checked out) will be used. - +1. Use [setup_path.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/setup_path.py). It will set up the path such that the local AutonomySim module is used instead of the pip-installed package. This is the method used in many of the scripts, since the user doesn't need to do anything other than run the script. Place your example script in one of the folders inside `PythonClient`, like `multirotor`, `car`, etc. You can also create one to keep things separate, and copy the `setup_path.py` file from another folder. Add `import setup_path` before `import AutonomySim` in your files. Now the latest main API (or any branch currently checked out) will be used. 2. Use a [local project pip install](https://pip.pypa.io/en/stable/cli/pip_install/#local-project-installs). 
A regular install would create a copy of the current source and use it, whereas an editable install (`pip install -e .` from inside the `PythonClient` folder) picks up changes whenever the Python API files are modified. An editable install is beneficial when working on several branches or when the API is not finalized. It is recommended to use a virtual environment for Python packaging so as not to break any existing setup. diff --git a/docs/autonomysim_apis.md b/docs/apis.md similarity index 94% rename from docs/autonomysim_apis.md rename to docs/apis.md index 6f0dcbf4..b7dc157a 100644 --- a/docs/autonomysim_apis.md +++ b/docs/apis.md @@ -1,15 +1,8 @@ ---- -title: "AutonomySim APIs" -keywords: introduction faq -tags: [getting_started, introduction] -sidebar: autonomysim_sidebar -permalink: autonomysim_getting_started.html -summary: These brief instructions will help you get started with the simulator. The other topics in this help provide additional information and detail about working with other aspects of this platform. ---- +# APIs ## Introduction -AutonomySim exposes APIs so you can interact with vehicle in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle and so on. +`AutonomySim` exposes application programming interfaces (APIs) that enable you to interact with vehicles in the simulation programmatically. You can use these APIs to retrieve images, get state, control the vehicle, and so on. ## Python Quickstart @@ -17,13 +10,13 @@ If you want to use Python to call AutonomySim APIs, we recommend using Anaconda First install this package: -``` +```shell pip install msgpack-rpc-python ``` You can either get AutonomySim binaries from [releases](https://github.com/nervosys/AutonomySim/releases) or compile from the source ([Windows](build_windows.md), [Linux](build_linux.md)). 
Once you can run AutonomySim, choose Car as vehicle and then navigate to `PythonClient\car\` folder and run: -``` +```shell python hello_car.py ``` @@ -33,7 +26,7 @@ If you are using Visual Studio 2019 then just open AutonomySim.sln, set PythonCl You can also install `AutonomySim` package simply by, -``` +```shell pip install AutonomySim ``` @@ -137,29 +130,28 @@ for response in responses: * `simPrintLogMessage`: Prints the specified message in the simulator's window. If message_param is also supplied then its printed next to the message and in that case if this API is called with same message value but different message_param again then previous line is overwritten with new line (instead of API creating new line on display). For example, `simPrintLogMessage("Iteration: ", to_string(i))` keeps updating same line on display when API is called with different values of i. The valid values of severity parameter is 0 to 3 inclusive that corresponds to different colors. * `simGetObjectPose`, `simSetObjectPose`: Gets and sets the pose of specified object in Unreal environment. Here the object means "actor" in Unreal terminology. They are searched by tag as well as name. Please note that the names shown in UE Editor are *auto-generated* in each run and are not permanent. So if you want to refer to actor by name, you must change its auto-generated name in UE Editor. Alternatively you can add a tag to actor which can be done by clicking on that actor in Unreal Editor and then going to [Tags property](https://answers.unrealengine.com/questions/543807/whats-the-difference-between-tag-and-tag.html), click "+" sign and add some string value. If multiple actors have same tag then the first match is returned. If no matches are found then NaN pose is returned. The returned pose is in NED coordinates in SI units in the world frame. 
For `simSetObjectPose`, the specified actor must have [Mobility](https://docs.unrealengine.com/en-us/Engine/Actors/Mobility) set to Movable, or you will get undefined behavior. `simSetObjectPose` has a `teleport` parameter, which means the object is [moved through other objects](https://www.unrealengine.com/en-US/blog/moving-physical-objects) in its way; it returns true if the move was successful -### Image / Computer Vision APIs +### Image/Computer Vision APIs -AutonomySim offers comprehensive images APIs to retrieve synchronized images from multiple cameras along with ground truth including depth, disparity, surface normals and vision. You can set the resolution, FOV, motion blur etc parameters in [settings.json](settings.md). There is also API for detecting collision state. See also [complete code](https://github.com/nervosys/AutonomySim/tree/main/Examples/DataCollection/StereoImageGenerator.hpp) that generates specified number of stereo images and ground truth depth with normalization to camera plane, computation of disparity image and saving it to [pfm format](pfm.md). +AutonomySim offers comprehensive image APIs to retrieve synchronized images from multiple cameras, along with ground truth including depth, disparity, surface normals, and vision. You can set the resolution, FOV, motion blur, and other parameters in [settings.json](settings.md). There is also an API for detecting collision state. See also the [complete code](https://github.com/nervosys/AutonomySim/tree/master/Examples/DataCollection/StereoImageGenerator.hpp) that generates a specified number of stereo images and ground-truth depth with normalization to the camera plane, computes the disparity image, and saves it to [pfm format](pfm.md). More on [image APIs and Computer Vision mode](image_apis.md). For vision problems that can benefit from domain randomization, there is also an [object retexturing API](retexturing.md), which can be used in supported scenes. 
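The NaN-pose convention of `simGetObjectPose` described above can be checked on the client side. Below is a minimal, self-contained sketch; the `Pose`/`Vector3r` stand-in classes and their `x_val`/`y_val`/`z_val` attribute names are assumptions standing in for the real client types:

```python
import math

# Stand-ins for the pose types returned by simGetObjectPose; the attribute
# names mimic the client types but are assumptions, not the real classes.
class Vector3r:
    def __init__(self, x, y, z):
        self.x_val, self.y_val, self.z_val = x, y, z

class Pose:
    def __init__(self, position):
        self.position = position

def object_found(pose):
    """simGetObjectPose returns a NaN pose when no actor matches the given
    name or tag; treat any NaN position component as 'not found'."""
    p = pose.position
    return not any(math.isnan(v) for v in (p.x_val, p.y_val, p.z_val))
```

With a real client, the same check would be applied to the result of `client.simGetObjectPose(object_name)` before using the returned pose.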
### Pause and Continue APIs -AutonomySim allows to pause and continue the simulation through `pause(is_paused)` API. To pause the simulation call `pause(True)` and to continue the simulation call `pause(False)`. You may have scenario, especially while using reinforcement learning, to run the simulation for specified amount of time and then automatically pause. While simulation is paused, you may then do some expensive computation, send a new command and then again run the simulation for specified amount of time. This can be achieved by API `continueForTime(seconds)`. This API runs the simulation for the specified number of seconds and then pauses the simulation. For example usage, please see [pause_continue_car.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//car/pause_continue_car.py) and [pause_continue_drone.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//multirotor/pause_continue_drone.py). - +AutonomySim allows you to pause and continue the simulation through the `pause(is_paused)` API. To pause the simulation, call `pause(True)`; to continue, call `pause(False)`. You may have a scenario, especially when using reinforcement learning, where you want to run the simulation for a specified amount of time and then automatically pause. While the simulation is paused, you can do some expensive computation, send a new command, and then run the simulation again for a specified amount of time. This can be achieved with the `continueForTime(seconds)` API, which runs the simulation for the specified number of seconds and then pauses it. For example usage, please see [pause_continue_car.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//car/pause_continue_car.py) and [pause_continue_drone.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//multirotor/pause_continue_drone.py). ### Collision API The collision information can be obtained using the `simGetCollisionInfo` API. 
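A collision check is typically wrapped in a small polling loop. The sketch below uses a stand-in client so it is self-contained and runnable without the simulator; the field names on the returned struct (`has_collided`, `penetration_depth`) are assumptions based on the description here, and a real script would call `simGetCollisionInfo()` on a connected AutonomySim client instead:

```python
# Stand-in for the struct returned by simGetCollisionInfo (field names assumed).
class CollisionInfo:
    def __init__(self, has_collided=False, penetration_depth=0.0):
        self.has_collided = has_collided
        self.penetration_depth = penetration_depth

# Stand-in client that replays scripted collision states, so the polling
# loop below can be demonstrated without a running simulator.
class StubClient:
    def __init__(self, events):
        self._events = iter(events)

    def simGetCollisionInfo(self):
        return next(self._events)

def wait_for_collision(client, max_polls=100):
    """Poll simGetCollisionInfo until a collision is reported; return the
    collision info, or None if no collision occurs within max_polls polls."""
    for _ in range(max_polls):
        info = client.simGetCollisionInfo()
        if info.has_collided:
            return info
    return None
```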
This call returns a struct that has information not only whether collision occurred but also collision position, surface normal, penetration depth and so on. -### Time of Day API +### Time-of-day API AutonomySim assumes there exist sky sphere of class `EngineSky/BP_Sky_Sphere` in your environment with [ADirectionalLight actor](https://github.com/nervosys/AutonomySim/blob/v1.4.0-linux/Unreal/Plugins/AutonomySim/Source/SimMode/SimModeBase.cpp#L224). By default, the position of the sun in the scene doesn't move with time. You can use [settings](settings.md#timeofday) to set up latitude, longitude, date and time which AutonomySim uses to compute the position of sun in the scene. You can also use following API call to set the sun position according to given date time: -``` +```python simSetTimeOfDay(self, is_enabled, start_datetime = "", is_start_datetime_dst = False, celestial_clock_speed = 1, update_interval_secs = 60, move_sun = True) ``` @@ -176,18 +168,18 @@ Sim world extent, in the form of a vector of two GeoPoints, can be retrieved usi By default all weather effects are disabled. To enable weather effect, first call: -``` +```python simEnableWeather(True) ``` Various weather effects can be enabled by using `simSetWeatherParameter` method which takes `WeatherParameter`, for example, -``` +```python client.simSetWeatherParameter(AutonomySim.WeatherParameter.Rain, 0.25); ``` The second parameter value is from 0 to 1. The first parameter provides following options: -``` +```python class WeatherParameter: Rain = 0 Roadwetness = 1 @@ -199,7 +191,7 @@ class WeatherParameter: Fog = 7 ``` -Please note that `Roadwetness`, `RoadSnow` and `RoadLeaf` effects requires adding [materials](https://github.com/nervosys/AutonomySim/tree/main/Unreal/Plugins/AutonomySim/Content/Weather/WeatherFX) to your scene. 
+Please note that `Roadwetness`, `RoadSnow` and `RoadLeaf` effects require adding [materials](https://github.com/nervosys/AutonomySim/tree/master/Unreal/Plugins/AutonomySim/Content/Weather/WeatherFX) to your scene. Please see [example code](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/environment/weather.py) for more details. @@ -207,7 +199,7 @@ Please see [example code](https://github.com/nervosys/AutonomySim/blob/main/Pyth Recording APIs can be used to start recording data through APIs. Data to be recorded can be specified using [settings](settings.md#recording). To start recording, use - -``` +```python client.startRecording() ``` @@ -221,7 +213,7 @@ Note that this will only save the data as specfied in the settings. For full fre Wind can be changed during simulation using `simSetWind()`. Wind is specified in World frame, NED direction and m/s values -E.g. To set 20m/s wind in North (forward) direction - +For example, to set a 20 m/s wind in the north (forward) direction: ```python # Set wind to (20,0,0) in NED (forward direction) @@ -325,9 +317,9 @@ See the [Adding New APIs](adding_new_apis.md) page ## References and Examples * [C++ API Examples](apis_cpp.md) -* [Car Examples](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//car) -* [Multirotor Examples](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//multirotor) -* [Computer Vision Examples](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision) +* [Car Examples](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//car) +* [Multirotor Examples](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//multirotor) +* [Computer Vision Examples](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision) * [Move on Path](https://github.com/nervosys/AutonomySim/wiki/moveOnPath-demo) demo showing video of fast multirotor flight through Modular Neighborhood environment * [Building a 
Hexacopter](https://github.com/nervosys/AutonomySim/wiki/hexacopter) * [Building Point Clouds](https://github.com/nervosys/AutonomySim/wiki/Point-Clouds) @@ -350,7 +342,7 @@ We recommend [Anaconda](https://www.anaconda.com/download/) to get Python tools You can install OpenCV using: -``` +```shell conda install opencv pip install opencv-python ``` diff --git a/docs/apis_cpp.md b/docs/apis_cpp.md index 00795024..1881e671 100644 --- a/docs/apis_cpp.md +++ b/docs/apis_cpp.md @@ -1,10 +1,10 @@ # Using the C++ APIs -Please read [general API doc](apis.md) first if you haven't already. This document describes C++ examples and other C++ specific details. +Please read the [general API documentation](apis.md) first if you haven't already. This document describes C++ API examples and other C++ specific details. ## Quick Start -Fastest way to get started is to open AutonomySim.sln in Visual Studio 2019. You will see [Hello Car](https://github.com/nervosys/AutonomySim/tree/main/HelloCar/) and [Hello Drone](https://github.com/nervosys/AutonomySim/tree/main/HelloDrone/) examples in the solution. These examples will show you the include paths and lib paths you will need to setup in your VC++ projects. If you are using Linux then you will specify these paths either in your [cmake file](https://github.com/nervosys/AutonomySim/tree/main/cmake//HelloCar/CMakeLists.txt) or on compiler command line. +The fastest way to get started is to open `AutonomySim.sln` in Visual Studio 2022. You will see [Hello Car](https://github.com/nervosys/AutonomySim/tree/master/HelloCar/) and [Hello Drone](https://github.com/nervosys/AutonomySim/tree/master/HelloDrone/) examples in the solution. These examples will show you the include paths and lib paths you will need to setup in your VC++ projects. If you are using Linux then you will specify these paths either in your [cmake file](https://github.com/nervosys/AutonomySim/tree/master/cmake//HelloCar/CMakeLists.txt) or on compiler command line. 
#### Include and Lib Folders @@ -18,14 +18,13 @@ Fastest way to get started is to open AutonomySim.sln in Visual Studio 2019. You Here's how to use AutonomySim APIs using C++ to control simulated car (see also [Python example](apis.md#hello_car)): ```cpp - // ready to run example: https://github.com/nervosys/AutonomySim/blob/main/HelloCar/main.cpp #include #include "vehicles/car/api/CarRpcLibClient.hpp" -int main() -{ +int main() { + nervosys::autonomylib::CarRpcLibClient client; client.enableApiControl(true); //this disables manual control CarControllerBase::CarControls controls; @@ -56,14 +55,13 @@ int main() Here's how to use AutonomySim APIs using C++ to control simulated quadrotor (see also [Python example](apis.md#hello_drone)): ```cpp - // ready to run example: https://github.com/nervosys/AutonomySim/blob/main/HelloDrone/main.cpp #include #include "vehicles/multirotor/api/MultirotorRpcLibClient.hpp" -int main() -{ +int main() { + nervosys::autonomylib::MultirotorRpcLibClient client; std::cout << "Press Enter to enable API control\n"; std::cin.get(); @@ -88,7 +86,7 @@ int main() ## See Also -* [Examples](https://github.com/nervosys/AutonomySim/tree/main/Examples) of how to use internal infrastructure in AutonomySim in your other projects -* [DroneShell](https://github.com/nervosys/AutonomySim/tree/main/DroneShell) app shows how to make simple interface using C++ APIs to control drones -* [HelloSpawnedDrones](https://github.com/nervosys/AutonomySim/tree/main/HelloSpawnedDrones) app shows how to make additional vehicles on the fly +* [Examples](https://github.com/nervosys/AutonomySim/tree/master/Examples) of how to use internal infrastructure in AutonomySim in your other projects +* [DroneShell](https://github.com/nervosys/AutonomySim/tree/master/DroneShell) app shows how to make simple interface using C++ APIs to control drones +* [HelloSpawnedDrones](https://github.com/nervosys/AutonomySim/tree/master/HelloSpawnedDrones) app shows how to make additional 
vehicles on the fly * [Python APIs](apis.md) diff --git a/docs/upgrade_apis.md b/docs/apis_upgrading.md similarity index 94% rename from docs/upgrade_apis.md rename to docs/apis_upgrading.md index 142c792b..facbe1b5 100644 --- a/docs/upgrade_apis.md +++ b/docs/apis_upgrading.md @@ -1,11 +1,12 @@ -# Upgrading API Client Code +# Upgrading APIs + There have been several API changes in AutonomySim v1.2 that we hope remove inconsistency, add future extensibility, and present a cleaner interface. Many of these changes are, however, *breaking changes*, which means you will need to modify your client code that talks to AutonomySim. -## Quicker Way +## A Quicker Way -While most changes you need to do in your client code are fairly easy, a quicker way is simply to take a look at the example code such as [Hello Drone](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//multirotor/hello_drone.py)or [Hello Car](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//car/hello_car.py) to get gist of changes. +While most changes you need to make in your client code are fairly easy, a quicker way is simply to take a look at example code such as [Hello Drone](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//multirotor/hello_drone.py) or [Hello Car](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//car/hello_car.py) to get the gist of the changes. -## Importing AutonomySim +## Importing `AutonomySim` Instead of this: diff --git a/docs/custom_drone.md b/docs/autonomylib_realworld.md similarity index 85% rename from docs/custom_drone.md rename to docs/autonomylib_realworld.md index 11209ef8..5f63ba2d 100644 --- a/docs/custom_drone.md +++ b/docs/autonomylib_realworld.md @@ -1,4 +1,4 @@ -# Running AutonomyLib on Real-world Autonomous Systems +# Running AutonomyLib on Real-world Vehicles The `AutonomyLib` library can be compiled and deployed on the companion computer on a real robotic system. 
For our testing, we connected a `Gigabyte Brix BXi7-5500` companion computer to a Pixhawk/PX4 flight controller over USB on a drone. The Gigabyte PC runs Ubuntu, so we are able to SSH into it over Wi-Fi: @@ -6,11 +6,11 @@ The `AutonomyLib` library can be compiled and deployed on the companion computer Once connected, you can run `MavLinkTest` with this command line: -```bash +```shell MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:. ``` -And this will produce a log file of the flight which can then be used for [playback in the simulator](playback.md). +This will produce a log file of the flight which can then be used for [playback in the simulator](playback.md). You can also add `-proxy:192.168.1.100:14550` to connect `MavLinkTest` to a remote computer where you can run QGroundControl or our [PX4 Log Viewer](log_viewer.md), which is another handy way to see what is going on with your drone. @@ -68,4 +68,4 @@ You can run the `MavlinkCom` library and MavLinkTest app to test the connection If you want to use QGC and AutonomySim together than you will need QGC to let own the serial port. QGC opens up TCP connection that acts as a proxy so any other component can connect to QGC and send `MavLinkMessage` to QGC and then QGC forwards that message to PX4. So you tell AutonomySim to connect to QGC and let QGC own serial port. -For companion board, the way we did it earlier was to have Gigabyte Brix on the drone. This x86 full-fledged computer that will connect to PX4 through USB. We had Ubuntu on Brix and ran [DroneServer](https://github.com/nervosys/AutonomySim/tree/main/DroneServer). The `DroneServer` created an API endpoint that we can talk to via C++ client code (or Python code) and it translated API calls to MavLink messages. That way you can write your code against the same API, test it in the simulator and then run the same code on an actual vehicle. So the companion computer has DroneServer running along with client code. 
+For the companion board, the way we did it earlier was to have a Gigabyte Brix on the drone. This full-fledged x86 computer connects to PX4 through USB. We had Ubuntu on the Brix and ran [DroneServer](https://github.com/nervosys/AutonomySim/tree/master/DroneServer). `DroneServer` created an API endpoint that we could talk to via C++ (or Python) client code, and it translated API calls to MavLink messages. That way, you can write your code against the same API, test it in the simulator, and then run the same code on an actual vehicle. The companion computer thus runs DroneServer along with the client code. diff --git a/docs/autonomysim_getting_started.md b/docs/autonomysim_getting_started.md deleted file mode 100644 index 3748f74e..00000000 --- a/docs/autonomysim_getting_started.md +++ /dev/null @@ -1,8 +0,0 @@ ---- -title: "Getting started with AutonomySim" -keywords: introduction faq -tags: [getting_started, introduction] -sidebar: autonomysim_sidebar -permalink: autonomysim_getting_started.html -summary: These brief instructions will help you get started with the simulator. The other topics in this help provide additional information and detail about working with other aspects of this platform. ---- \ No newline at end of file diff --git a/docs/autonomysim_tutorial_pkgs.md b/docs/autonomysim_tutorial_pkgs.md deleted file mode 100644 index f68b2438..00000000 --- a/docs/autonomysim_tutorial_pkgs.md +++ /dev/null @@ -1,77 +0,0 @@ -# AutonomySim ROS Tutorials - -This is a set of sample AutonomySim `settings.json`s, roslaunch and rviz files to give a starting point for using AutonomySim with ROS. See [AutonomySim_ros_pkgs](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/README.md) for the ROS API. - -## Setup - -Make sure that the [AutonomySim_ros_pkgs Setup](AutonomySim_ros_pkgs.md) has been completed and the prerequisites installed. 
- -```shell -$ cd PATH_TO/AutonomySim/ros -$ catkin build AutonomySim_tutorial_pkgs -``` - -If your default GCC isn't 8 or greater (check using `gcc --version`), then compilation will fail. In that case, use `gcc-8` explicitly as follows- - -```shell -catkin build AutonomySim_tutorial_pkgs -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8 -``` - -!!! note - - For running examples, and also whenever a new terminal is opened, sourcing the `setup.bash` file is necessary. If you're using the ROS wrapper frequently, it might be helpful to add the `source PATH_TO/AutonomySim/ros/devel/setup.bash` to your `~/.profile` or `~/.bashrc` to avoid the need to run the command every time a new terminal is opened - -## Examples - -### Single drone with monocular and depth cameras, and lidar - - - Settings.json - [front_stereo_and_center_mono.json](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/settings/front_stereo_and_center_mono.json) - ```shell - $ source PATH_TO/AutonomySim/ros/devel/setup.bash - $ roscd AutonomySim_tutorial_pkgs - $ cp settings/front_stereo_and_center_mono.json ~/Documents/AutonomySim/settings.json - - ## Start your unreal package or binary here - - $ roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch; - - # in a new pane / terminal - $ source PATH_TO/AutonomySim/ros/devel/setup.bash - $ roslaunch AutonomySim_tutorial_pkgs front_stereo_and_center_mono.launch - ``` - - The above would start rviz with tf's, registered RGBD cloud using [depth_image_proc](https://wiki.ros.org/depth_image_proc) using the [`depth_to_pointcloud` launch file](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/launch/front_stereo_and_center_mono/depth_to_pointcloud.launch), and the lidar point cloud. 
- -### Two drones, with cameras, lidar, IMU each - -- Settings.json - [two_drones_camera_lidar_imu.json](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/settings/two_drones_camera_lidar_imu.json) - - ```shell - $ source PATH_TO/AutonomySim/ros/devel/setup.bash - $ roscd AutonomySim_tutorial_pkgs - $ cp settings/two_drones_camera_lidar_imu.json ~/Documents/AutonomySim/settings.json - - ## Start your unreal package or binary here - - $ roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch; - $ roslaunch AutonomySim_ros_pkgs rviz.launch - ``` - -You can view the tfs in rviz. And do a `rostopic list` and `rosservice list` to inspect the services avaiable. - -### Twenty-five drones in a square pattern - -- Settings.json - [twenty_five_drones.json](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/settings/twenty_five_drones.json) - - ```shell - $ source PATH_TO/AutonomySim/ros/devel/setup.bash - $ roscd AutonomySim_tutorial_pkgs - $ cp settings/twenty_five_drones.json ~/Documents/AutonomySim/settings.json - - ## Start your unreal package or binary here - - $ roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch; - $ roslaunch AutonomySim_ros_pkgs rviz.launch - ``` - -You can view the tfs in rviz. And do a `rostopic list` and `rosservice list` to inspect the services avaiable. diff --git a/docs/azure.md b/docs/azure.md index 98efe909..61ab37cf 100644 --- a/docs/azure.md +++ b/docs/azure.md @@ -1,59 +1,57 @@ -# Azure Development Environment +# Microsoft Azure Cloud Development Environment -This document explains how to automate the creation of a development environment on Azure and code and debug a Python application connected to AutonomySim using Visual Studio Code +This document explains how to automate the creation of an Azure cloud development environment to code and debug a Python application connected to `AutonomySim` using Visual Studio Code. 
## Automatically Deploy the Azure VM

Click the blue button to start the Azure deployment
(The template is pre-filled with the recommended virtual machine size for the use cases of the following two tutorials)
-
-*Note: the VM deployment and configuration process may take 20+ minutes to complete*
+!!! note
+    The VM deployment and configuration process may take 20+ minutes to complete.

### Regarding the deployment of the Azure VM

-- When using an Azure Trial account, the default vCPU quota is not enough to allocate the required VM to run AutonomySim. If that's the case, you will see an error when trying to create the VM and will have to submit a request for Quota increase. **Be sure to understand how and how much you are going to be charged for the use of the VM**
+* When using an Azure Trial account, the default vCPU quota is not enough to allocate the required VM to run AutonomySim. If that's the case, you will see an error when trying to create the VM and will have to submit a request for a quota increase. **Be sure to understand how and how much you are going to be charged for the use of the VM**
+
+* To avoid charges for the Virtual Machine usage while not in use, remember to deallocate its resources from the [Azure Portal](https://portal.azure.com) or use the following command from the Azure CLI:

-- To avoid charges for the Virtual Machine usage while not in use, remember to deallocate its resources from the [Azure Portal](https://portal.azure.com) or use the following command from the Azure CLI:
--
-```bash
+```shell
az vm deallocate --resource-group MyResourceGroup --name MyVMName
```

## Code and debug from Visual Studio Code and Remote SSH

-- Install Visual Studio Code
-- Install the *Remote - SSH* extension
-- Press `F1` and run the `Remote - SSH: Connect to host...` command
-- Add the recently create VM details. For instance, `AzureUser@11.22.33.44`
-- Run the `Remote - SSH: Connect to host...` command again, and now select the newly added connection.
-- Once connected, click on the `Clone Repository` button in Visual Studio Code, and either clone this repository in the remote VM and open *just the `azure` folder*, or create a brand new repository, clone it and copy the contents of the `azure` folder from this repository in it. It is important to open that directory so Visual Studio Code can use the specific `.vscode` directory for the scenario and not the general AutonomySim `.vscode` directory. It contains the recommended extensions to install, the task to start AutonomySim remotely and the launch configuration for the Python application. -- Install all the recommended extensions -- Press `F1` and select the `Tasks: Run Task` option. Then, select the `Start AutonomySim` task from Visual Studio Code to execute the `start-AutonomySim.ps1` script from Visual Studio Code. -- Open the `multirotor.py` file inside the `app` directory -- Start debugging with Python -- When finished, remember to stop an deallocate the Azure VM to avoid extra charges - -## Code and debug from a local Visual Studio Code and connect to AutonomySim via forwarded ports - -*Note: this scenario, will be using two Visual Studio Code instances. -The first one will be used as a bridge to forward ports via SSH to the Azure VM and execute remote processes, and the second one will -be used for local Python development. 
-To be able to reach the VM from the local Python code, it is required to keep the `Remote - SSH` instance of Visual Studio Code opened, while working with the local Python environment on the second instance*
-
-- Open the first Visual Studio Code instance
-- Follow the steps in the previous section to connect via `Remote - SSH`
-- In the *Remote Explorer*, add the port `41451` as a forwarded port to localhost
-- Either run the `Start AutonomySim` task on the Visual Studio Code with the remote session as explained in the previous scenario or manually start the AutonomySim binary in the VM
-- Open a second Visual Studio Code instance, without disconnecting or closing the first one
-- Either clone this repository locally and open *just the `azure` folder* in Visual Studio Code, or create a brand new repository, clone it and copy the contents of the `azure` folder from this repository in it.
-- Run `pip install -r requirements.txt` inside the `app` directory
-- Open the `multirotor.py` file inside the `app` directory
-- Start debugging with Python
-- When finished, remember to stop an deallocate the Azure VM to avoid extra charges
+* Install Visual Studio Code
+* Install the *Remote - SSH* extension
+* Press `F1` and run the `Remote - SSH: Connect to host...` command
+* Add the recently created VM details. For instance, `AzureUser@11.22.33.44`
+* Run the `Remote - SSH: Connect to host...` command again, and now select the newly added connection.
+* Once connected, click on the `Clone Repository` button in Visual Studio Code, and either clone this repository in the remote VM and open *just the `azure` folder*, or create a brand new repository, clone it and copy the contents of the `azure` folder from this repository in it. It is important to open that directory so Visual Studio Code can use the specific `.vscode` directory for the scenario and not the general AutonomySim `.vscode` directory. 
It contains the recommended extensions to install, the task to start AutonomySim remotely and the launch configuration for the Python application.
+* Install all the recommended extensions
+* Press `F1` and select the `Tasks: Run Task` option. Then, select the `Start AutonomySim` task to execute the `start-AutonomySim.ps1` script.
+* Open the `multirotor.py` file inside the `app` directory
+* Start debugging with Python
+* When finished, remember to stop and deallocate the Azure VM to avoid extra charges
+
+## Code and debug locally and connect to the remote VM via forwarded ports
+
+!!! note
+    In this scenario, we use two Visual Studio Code instances. The first will be used as a bridge to forward ports via SSH to the Azure VM and execute remote processes, and the second will be used for local Python development. To be able to reach the VM from the local Python code, it is required to keep the `Remote - SSH` instance of Visual Studio Code opened while working with the local Python environment on the second instance.
+
+* Open the first Visual Studio Code instance
+* Follow the steps in the previous section to connect via `Remote - SSH`
+* In the *Remote Explorer*, add the port `41451` as a forwarded port to localhost
+* Either run the `Start AutonomySim` task in the Visual Studio Code instance with the remote session as explained in the previous scenario or manually start the AutonomySim binary in the VM
+* Open a second Visual Studio Code instance, without disconnecting or closing the first one
+* Either clone this repository locally and open *just the `azure` folder* in Visual Studio Code, or create a brand new repository, clone it and copy the contents of the `azure` folder from this repository in it. 
+* Run `pip install -r requirements.txt` inside the `app` directory
+* Open the `multirotor.py` file inside the `app` directory
+* Start debugging with Python
+* When finished, remember to stop and deallocate the Azure VM to avoid extra charges

## Running with Docker

@@ -69,24 +67,24 @@ It can use either public docker images from DockerHub or images deployed to a pr

### Building the docker image

-```bash
+```shell
docker build -t / -f ./docker/Dockerfile .
```

## Using a different AutonomySim binary

-To use a different AutonomySim binary, first check the official documentation on [How to Build AutonomySim on Windows](build_windows.md) and [How to Build AutonomySim on Linux](build_linux.md) if you also want to run it with Docker
+To use a different `AutonomySim` binary, first check the official documentation on [How to Build AutonomySim on Windows](build_windows.md) and [How to Build AutonomySim on Linux](build_linux.md) if you also want to run it with Docker.

Once you have a zip file with the new AutonomySim environment (or prefer to use one from the [Official Releases](https://github.com/nervosys/AutonomySim/releases)), you need to modify some of the scripts in the `azure` directory of the repository to point to the new environment:

-- In [`azure/azure-env-creation/configure-vm.ps1`](https://github.com/nervosys/AutonomySim/blob/main/azure/azure-env-creation/configure-vm.ps1), modify all the parameters starting with `$AutonomySimBinary` with the new values
-- In [`azure/start-AutonomySim.ps1`](https://github.com/nervosys/AutonomySim/blob/main/azure/start-AutonomySim.ps1), modify `$AutonomySimExecutable` and `$AutonomySimProcessName` with the new values
+* In [`azure/azure-env-creation/configure-vm.ps1`](https://github.com/nervosys/AutonomySim/blob/main/azure/azure-env-creation/configure-vm.ps1), modify all the parameters starting with `$AutonomySimBinary` with the new values
+* In 
[`azure/start-AutonomySim.ps1`](https://github.com/nervosys/AutonomySim/blob/main/azure/start-AutonomySim.ps1), modify `$AutonomySimExecutable` and `$AutonomySimProcessName` with the new values

-If you are using the docker image, you also need a linux binary zip file and modify the following Docker-related files:
+If you are using the docker image, you will also need a Linux binary zip file and to modify the following Docker-related files:

-- In [`azure/docker/Dockerfile`](https://github.com/nervosys/AutonomySim/blob/main/azure/docker/Dockerfile), modify the `AutonomySim_BINARY_ZIP_URL` and `AutonomySim_BINARY_ZIP_FILENAME` ENV declarations with the new values
-- In [`azure/docker/docker-entrypoint.sh`](https://github.com/nervosys/AutonomySim/blob/main/azure/docker/docker-entrypoint.sh), modify `AutonomySim_EXECUTABLE` with the new value
+* In [`azure/docker/Dockerfile`](https://github.com/nervosys/AutonomySim/blob/main/azure/docker/Dockerfile), modify the `AutonomySim_BINARY_ZIP_URL` and `AutonomySim_BINARY_ZIP_FILENAME` ENV declarations with the new values
+* In [`azure/docker/docker-entrypoint.sh`](https://github.com/nervosys/AutonomySim/blob/main/azure/docker/docker-entrypoint.sh), modify `AutonomySim_EXECUTABLE` with the new value

-## Maintaining this development environment
+## Maintaining the development environment

Several components of this development environment (ARM templates, initialization scripts and VSCode tasks) directly depend on the current directory structure, file names, and repository locations. When planning to modify/fork any of those, make sure to check every script and template to make any required adjustment.
diff --git a/docs/build_docs.md b/docs/build_docs.md
index 5ea45f46..860acf9f 100644
--- a/docs/build_docs.md
+++ b/docs/build_docs.md
@@ -1,10 +1,12 @@
# Generating the Documentation

-> [!NOTE]
-> This is only useful if you want to host your own `AutonomySim` documentation site (e.g., in a secure enclave).
+!!! 
note + This is only useful if you want to host your own `AutonomySim` documentation site (e.g., in a secure enclave). The `AutonomySim` documentation [website](https://nervosys.github.io/AutonomySim/) HTML and CSS files are automatically generated from Markdown files and deployed to GitHub Pages using the `mkdocs` package for Python. You can also self-host the documentation or redirect GitHub Pages to your own domain. The choice is yours. +Compared to `AirSim` and its forks, we offer simpler and cleaner documentation in a modern style. + ## Configure GitHub Repository Configure the GitHub repository to automatically deploy documentation to a GitHub project site from the `gh-pages` branch: diff --git a/docs/build_faq.md b/docs/build_faq.md index 75ae56cd..bf8814b2 100644 --- a/docs/build_faq.md +++ b/docs/build_faq.md @@ -1,234 +1,157 @@ -# FAQ - ---- - -## Windows build - -- [FAQ](#faq) - - [Windows build](#windows-build) - - [Linux build](#linux-build) +# Build FAQ + +## Table of Contents + +- [Build FAQ](#build-faq) + - [Table of Contents](#table-of-contents) + - [Windows Builds](#windows-builds) + - [How to force Unreal to use Visual Studio 2019?](#how-to-force-unreal-to-use-visual-studio-2019) + - [I get error: 'where' is not recognized as an internal or external command](#i-get-error-where-is-not-recognized-as-an-internal-or-external-command) + - [I'm getting error ` could not be compiled. 
Try rebuilding from source manually`](#im-getting-error-myproject-could-not-be-compiled-try-rebuilding-from-source-manually) + - [I get `error C100 : An internal error has occurred in the compiler` when running build.cmd](#i-get-error-c100--an-internal-error-has-occurred-in-the-compiler-when-running-buildcmd) + - [I get error "'corecrt.h': No such file or directory" or "Windows SDK version 8.1 not found"](#i-get-error-corecrth-no-such-file-or-directory-or-windows-sdk-version-81-not-found) + - [How do I use PX4 firmware with AutonomySim?](#how-do-i-use-px4-firmware-with-autonomysim) + - [I made changes in Visual Studio but there is no effect](#i-made-changes-in-visual-studio-but-there-is-no-effect) + - [Unreal still uses VS2015 or I'm getting some link error](#unreal-still-uses-vs2015-or-im-getting-some-link-error) + - [Linux Builds](#linux-builds) + - [I'm getting error ` could not be compiled. Try rebuilding from source manually`.](#im-getting-error-myproject-could-not-be-compiled-try-rebuilding-from-source-manually-1) + - [Unreal crashed! 
How do I know what went wrong?](#unreal-crashed-how-do-i-know-what-went-wrong) + - [How do I use an IDE on Linux?](#how-do-i-use-an-ide-on-linux) + - [Can I cross compile for Linux from a Windows machine?](#can-i-cross-compile-for-linux-from-a-windows-machine) + - [What compiler and stdlib does AutonomySim use?](#what-compiler-and-stdlib-does-autonomysim-use) + - [What version of CMake does the AutonomySim build use?](#what-version-of-cmake-does-the-autonomysim-build-use) + - [Can I compile AutonomySim in BashOnWindows?](#can-i-compile-autonomysim-in-bashonwindows) + - [Where can I find more info on running Unreal on Linux?](#where-can-i-find-more-info-on-running-unreal-on-linux) - [Other](#other) - - [Windows build](#windows-build-1) - - [How to force Unreal to use Visual Studio 2019?](#how-to-force-unreal-to-use-visual-studio-2019) - - [I get error: 'where' is not recognized as an internal or external command](#i-get-error-where-is-not-recognized-as-an-internal-or-external-command) - - [I'm getting error ` could not be compiled. Try rebuilding from source manually`](#im-getting-error-myproject-could-not-be-compiled-try-rebuilding-from-source-manually) - - [I get `error C100 : An internal error has occurred in the compiler` when running build.cmd](#i-get-error-c100--an-internal-error-has-occurred-in-the-compiler-when-running-buildcmd) - - [I get error "'corecrt.h': No such file or directory" or "Windows SDK version 8.1 not found"](#i-get-error-corecrth-no-such-file-or-directory-or-windows-sdk-version-81-not-found) - - [How do I use PX4 firmware with AutonomySim?](#how-do-i-use-px4-firmware-with-autonomysim) - - [I made changes in Visual Studio but there is no effect](#i-made-changes-in-visual-studio-but-there-is-no-effect) - - [Unreal still uses VS2015 or I'm getting some link error](#unreal-still-uses-vs2015-or-im-getting-some-link-error) - - [Linux build](#linux-build-1) - - [I'm getting error ` could not be compiled. 
Try rebuilding from source manually`.](#im-getting-error-myproject-could-not-be-compiled-try-rebuilding-from-source-manually-1) - - [Unreal crashed! How do I know what went wrong?](#unreal-crashed-how-do-i-know-what-went-wrong) - - [How do I use an IDE on Linux?](#how-do-i-use-an-ide-on-linux) - - [Can I cross compile for Linux from a Windows machine?](#can-i-cross-compile-for-linux-from-a-windows-machine) - - [What compiler and stdlib does AutonomySim use?](#what-compiler-and-stdlib-does-autonomysim-use) - - [What version of CMake does the AutonomySim build use?](#what-version-of-cmake-does-the-autonomysim-build-use) - - [Can I compile AutonomySim in BashOnWindows?](#can-i-compile-autonomysim-in-bashonwindows) - - [Where can I find more info on running Unreal on Linux?](#where-can-i-find-more-info-on-running-unreal-on-linux) - - [Other](#other-1) - - [Packaging a binary including the AutonomySim plugin](#packaging-a-binary-including-the-autonomysim-plugin) - ---- - -## Linux build - -- [FAQ](#faq) - - [Windows build](#windows-build) - - [Linux build](#linux-build) - - [Other](#other) - - [Windows build](#windows-build-1) - - [How to force Unreal to use Visual Studio 2019?](#how-to-force-unreal-to-use-visual-studio-2019) - - [I get error: 'where' is not recognized as an internal or external command](#i-get-error-where-is-not-recognized-as-an-internal-or-external-command) - - [I'm getting error ` could not be compiled. 
Try rebuilding from source manually`](#im-getting-error-myproject-could-not-be-compiled-try-rebuilding-from-source-manually) - - [I get `error C100 : An internal error has occurred in the compiler` when running build.cmd](#i-get-error-c100--an-internal-error-has-occurred-in-the-compiler-when-running-buildcmd) - - [I get error "'corecrt.h': No such file or directory" or "Windows SDK version 8.1 not found"](#i-get-error-corecrth-no-such-file-or-directory-or-windows-sdk-version-81-not-found) - - [How do I use PX4 firmware with AutonomySim?](#how-do-i-use-px4-firmware-with-autonomysim) - - [I made changes in Visual Studio but there is no effect](#i-made-changes-in-visual-studio-but-there-is-no-effect) - - [Unreal still uses VS2015 or I'm getting some link error](#unreal-still-uses-vs2015-or-im-getting-some-link-error) - - [Linux build](#linux-build-1) - - [I'm getting error ` could not be compiled. Try rebuilding from source manually`.](#im-getting-error-myproject-could-not-be-compiled-try-rebuilding-from-source-manually-1) - - [Unreal crashed! 
How do I know what went wrong?](#unreal-crashed-how-do-i-know-what-went-wrong) - - [How do I use an IDE on Linux?](#how-do-i-use-an-ide-on-linux) - - [Can I cross compile for Linux from a Windows machine?](#can-i-cross-compile-for-linux-from-a-windows-machine) - - [What compiler and stdlib does AutonomySim use?](#what-compiler-and-stdlib-does-autonomysim-use) - - [What version of CMake does the AutonomySim build use?](#what-version-of-cmake-does-the-autonomysim-build-use) - - [Can I compile AutonomySim in BashOnWindows?](#can-i-compile-autonomysim-in-bashonwindows) - - [Where can I find more info on running Unreal on Linux?](#where-can-i-find-more-info-on-running-unreal-on-linux) - - [Other](#other-1) - - [Packaging a binary including the AutonomySim plugin](#packaging-a-binary-including-the-autonomysim-plugin) - ---- - -## Other + - [Packaging a binary including the AutonomySim plugin](#packaging-a-binary-including-the-autonomysim-plugin) -* [Packaging AutonomySim](#packaging-a-binary-including-the-AutonomySim-plugin) +## Windows Builds ---- +### How to force Unreal to use Visual Studio 2019? - -## Windows build - - -###### How to force Unreal to use Visual Studio 2019? - ->If the default `update_from_git.cmd` file results in VS 2017 project, then you may need to run the `C:\Program Files\Epic Games\UE_4.25\Engine\Binaries\DotNET\UnrealBuildTool.exe` tool manually, with the command line options `-projectfiles -project= -game -rocket -progress -2019`. +> If the default `update_from_git.cmd` file results in VS 2017 project, then you may need to run the `C:\Program Files\Epic Games\UE_4.25\Engine\Binaries\DotNET\UnrealBuildTool.exe` tool manually, with the command line options `-projectfiles -project= -game -rocket -progress -2019`. 
> 
->If you are upgrading from 4.18 to 4.25 you may also need to add `BuildSettingsVersion.V2` to your `*.Target.cs` and `*Editor.Target.cs` build files, like this:
+> If you are upgrading from 4.18 to 4.25 you may also need to add `BuildSettingsVersion.V2` to your `*.Target.cs` and `*Editor.Target.cs` build files, like this:
> 
->```c#
-> public AutonomySimNHTestTarget(TargetInfo Target) : base(Target)
-> {
-> Type = TargetType.Game;
-> DefaultBuildSettings = BuildSettingsVersion.V2;
-> ExtraModuleNames.AddRange(new string[] { "AutonomySimNHTest" });
-> }
->```
+> ```csharp
+> public AutonomySimNHTestTarget(TargetInfo Target) : base(Target)
+> {
+>     Type = TargetType.Game;
+>     DefaultBuildSettings = BuildSettingsVersion.V2;
+>     ExtraModuleNames.AddRange(new string[] { "AutonomySimNHTest" });
+> }
+> ```
> 
->You may also need to edit this file:
+> You may also need to edit this file:
> 
->```
->"%APPDATA%\Unreal Engine\UnrealBuildTool\BuildConfiguration.xml
->```
+> ```shell
+> %APPDATA%\Unreal Engine\UnrealBuildTool\BuildConfiguration.xml
+> ```
> 
->And add this Compiler version setting:
+> And add this Compiler version setting:
> 
->```xml
->
->
-> VisualStudio2019
->
->
->```
-
-
+> ```xml
+> <Configuration xmlns="https://www.unrealengine.com/BuildConfiguration">
+>   <WindowsPlatform>
+>     <Compiler>VisualStudio2019</Compiler>
+>   </WindowsPlatform>
+> </Configuration>
+> ```
-###### I get error: 'where' is not recognized as an internal or external command
+### I get error: 'where' is not recognized as an internal or external command
->You have to add `C:\WINDOWS\SYSTEM32` to your PATH enviroment variable.
+> You have to add `C:\WINDOWS\SYSTEM32` to your PATH environment variable.
-
+### I'm getting error `<MyProject> could not be compiled. Try rebuilding from source manually`
-###### I'm getting error ` could not be compiled. Try rebuilding from source manually`
-
->This will occur when there are compilation errors. Logs are stored in `\Saved\Logs` which can be used to figure out the problem.
+> This will occur when there are compilation errors. Logs are stored in `\Saved\Logs` which can be used to figure out the problem. 
> ->A common problem could be Visual Studio version conflict, AutonomySim uses VS 2019 while UE is using VS 2017, this can be found by searching for `2017` in the Log file. In that case, see the answer above. +> A common problem could be Visual Studio version conflict, AutonomySim uses VS 2019 while UE is using VS 2017, this can be found by searching for `2017` in the Log file. In that case, see the answer above. > ->If you have modified the AutonomySim plugin files, then you can right-click the `.uproject` file, select `Generate Visual Studio solution file` and then open the `.sln` file in VS to fix the errors and build again. - - - -###### I get `error C100 : An internal error has occurred in the compiler` when running build.cmd +> If you have modified the AutonomySim plugin files, then you can right-click the `.uproject` file, select `Generate Visual Studio solution file` and then open the `.sln` file in VS to fix the errors and build again. ->We have noticed this happening with VS version `15.9.0` and have checked-in a workaround in AutonomySim code. If you have this VS version, please make sure to pull the latest AutonomySim code. +### I get `error C100 : An internal error has occurred in the compiler` when running build.cmd - +> We have noticed this happening with VS version `15.9.0` and have checked-in a workaround in AutonomySim code. If you have this VS version, please make sure to pull the latest AutonomySim code. -###### I get error "'corecrt.h': No such file or directory" or "Windows SDK version 8.1 not found" +### I get error "'corecrt.h': No such file or directory" or "Windows SDK version 8.1 not found" ->Very likely you don't have [Windows SDK](https://developercommunity.visualstudio.com/content/problem/3754/cant-compile-c-program-because-of-sdk-81cant-add-a.html) installed with Visual Studio. 
+> Very likely you don't have [Windows SDK](https://developercommunity.visualstudio.com/content/problem/3754/cant-compile-c-program-because-of-sdk-81cant-add-a.html) installed with Visual Studio. - +### How do I use PX4 firmware with AutonomySim? -###### How do I use PX4 firmware with AutonomySim? +> By default, AutonomySim uses its own built-in firmware called [simple_flight](simple_flight.md). There is no additional setup if you just want to go with it. If you want to switch to using PX4 instead then please see [this guide](px4_setup.md). ->By default, AutonomySim uses its own built-in firmware called [simple_flight](simple_flight.md). There is no additional setup if you just want to go with it. If you want to switch to using PX4 instead then please see [this guide](px4_setup.md). +### I made changes in Visual Studio but there is no effect - +> Sometimes the Unreal + VS build system doesn't recompile if you make changes to only header files. To ensure a recompile, make some Unreal based cpp file "dirty" like AutonomySimGameMode.cpp. -###### I made changes in Visual Studio but there is no effect +### Unreal still uses VS2015 or I'm getting some link error ->Sometimes the Unreal + VS build system doesn't recompile if you make changes to only header files. To ensure a recompile, make some Unreal based cpp file "dirty" like AutonomySimGameMode.cpp. - - - -###### Unreal still uses VS2015 or I'm getting some link error - ->Running several versions of VS can lead to issues when compiling UE projects. One problem that may arise is that UE will try to compile with an older version of VS which may or may not work. There are two settings in Unreal, one for for the engine and one for the project, to adjust the version of VS to be used. +> Running several versions of VS can lead to issues when compiling UE projects. One problem that may arise is that UE will try to compile with an older version of VS which may or may not work. 
There are two settings in Unreal, one for for the engine and one for the project, to adjust the version of VS to be used. > ->1. Edit -> Editor preferences -> General -> Source code -> Source Code Editor ->2. Edit -> Project Settings -> Platforms -> Windows -> Toolchain ->CompilerVersion +> 1. Edit -> Editor preferences -> General -> Source code -> Source Code Editor +> 2. Edit -> Project Settings -> Platforms -> Windows -> Toolchain ->CompilerVersion > ->In some cases, these settings will still not lead to the desired result and errors such as the following might be produced: LINK : fatal error LNK1181: cannot open input file 'ws2_32.lib' +> In some cases, these settings will still not lead to the desired result and errors such as the following might be produced: LINK : fatal error LNK1181: cannot open input file 'ws2_32.lib' > ->To resolve such issues the following procedure can be applied: +> To resolve such issues the following procedure can be applied: > ->1. Uninstall all old versions of VS using the [VisualStudioUninstaller](https://github.com/nervosys/VisualStudioUninstaller/releases) ->2. Repair/Install VS 2019 ->3. Restart machine and install Epic launcher and desired version of the engine +> 1. Uninstall all old versions of VS using the [VisualStudioUninstaller](https://github.com/nervosys/VisualStudioUninstaller/releases) +> 2. Repair/Install VS 2019 +> 3. Restart machine and install Epic launcher and desired version of the engine ---- +## Linux Builds -## Linux build - +### I'm getting error ` could not be compiled. Try rebuilding from source manually`. -###### I'm getting error ` could not be compiled. Try rebuilding from source manually`. - ->This could either happen because of compile error or the fact that your gch files are outdated. Look in to your console window. Do you see something like below? +> This could either happen because of compile error or the fact that your gch files are outdated. Look in to your console window. 
Do you see something like below? > ->`fatal error: file '/usr/include/linux/version.h''/usr/include/linux/version.h' has been modified since the precompiled header` +> `fatal error: file '/usr/include/linux/version.h''/usr/include/linux/version.h' has been modified since the precompiled header` > ->If this is the case then look for *.gch file(s) that follows after that message, delete them and try again. Here's [relevant thread](https://answers.unrealengine.com/questions/412349/linux-ue4-build-precompiled-header-fatal-error.html) on Unreal Engine forums. +> If this is the case then look for *.gch file(s) that follows after that message, delete them and try again. Here's [relevant thread](https://answers.unrealengine.com/questions/412349/linux-ue4-build-precompiled-header-fatal-error.html) on Unreal Engine forums. > ->If you see other compile errors in console then open up those source files and see if it is due to changes you made. If not, then report it as issue on GitHub. - - - -###### Unreal crashed! How do I know what went wrong? - ->Go to the `MyUnrealProject/Saved/Crashes` folder and search for the file `MyProject.log` within its subdirectories. At the end of this file you will see the stack trace and messages. ->You can also take a look at the `Diagnostics.txt` file. - - - -###### How do I use an IDE on Linux? - ->You can use Qt Creator or CodeLite. Instructions for Qt Creator are available [here](https://docs.unrealengine.com/en-US/SharingAndReleasing/Linux/BeginnerLinuxDeveloper/SettingUpQtCreator/index.html). - - - -###### Can I cross compile for Linux from a Windows machine? +> If you see other compile errors in console then open up those source files and see if it is due to changes you made. If not, then report it as issue on GitHub. ->Yes, you can, but we haven't tested it. You can find the instructions [here](https://docs.unrealengine.com/latest/INT/Platforms/Linux/GettingStarted/index.html). +### Unreal crashed! How do I know what went wrong? 
- +> Go to the `MyUnrealProject/Saved/Crashes` folder and search for the file `MyProject.log` within its subdirectories. At the end of this file you will see the stack trace and messages. +> You can also take a look at the `Diagnostics.txt` file. -###### What compiler and stdlib does AutonomySim use? +### How do I use an IDE on Linux? ->We use the same compiler that Unreal Engine uses, **Clang 8**, and stdlib, **libc++**. AutonomySim's `setup.sh` will automatically download them. +> You can use Qt Creator or CodeLite. Instructions for Qt Creator are available [here](https://docs.unrealengine.com/en-US/SharingAndReleasing/Linux/BeginnerLinuxDeveloper/SettingUpQtCreator/index.html). - +### Can I cross compile for Linux from a Windows machine? -###### What version of CMake does the AutonomySim build use? +> Yes, you can, but we haven't tested it. You can find the instructions [here](https://docs.unrealengine.com/latest/INT/Platforms/Linux/GettingStarted/index.html). ->3.10.0 or higher. This is *not* the default in Ubuntu 16.04 so setup.sh installs it for you. You can check your CMake version using `cmake --version`. If you have an older version, follow [these instructions](cmake_linux.md) or see the [CMake website](https://cmake.org/install/). +### What compiler and stdlib does AutonomySim use? - +> We use the same compiler that Unreal Engine uses, **Clang 8**, and stdlib, **libc++**. AutonomySim's `setup.sh` will automatically download them. -###### Can I compile AutonomySim in BashOnWindows? +### What version of CMake does the AutonomySim build use? ->Yes, however, you can't run Unreal from BashOnWindows. So this is kind of useful to check a Linux compile, but not for an end-to-end run. ->See the [BashOnWindows install guide](https://msdn.microsoft.com/en-us/commandline/wsl/install_guide). ->Make sure to have the latest version (Windows 10 Creators Edition) as previous versions had various issues. 
->Also, don't invoke `bash` from `Visual Studio Command Prompt`, otherwise CMake might find VC++ and try and use that! +> 3.10.0 or higher. This is *not* the default in Ubuntu 16.04 so setup.sh installs it for you. You can check your CMake version using `cmake --version`. If you have an older version, follow [these instructions](cmake_linux.md) or see the [CMake website](https://cmake.org/install/). - +### Can I compile AutonomySim in BashOnWindows? -###### Where can I find more info on running Unreal on Linux? ->Start here: [Unreal on Linux](https://docs.unrealengine.com/latest/INT/Platforms/Linux/index.html) ->[Building Unreal on Linux](https://wiki.unrealengine.com/Building_On_Linux#Clang) ->[Unreal Linux Support](https://wiki.unrealengine.com/Linux_Support) ->[Unreal Cross Compilation](https://wiki.unrealengine.com/Compiling_For_Linux) +> Yes, however, you can't run Unreal from BashOnWindows. So this is kind of useful to check a Linux compile, but not for an end-to-end run. +> See the [BashOnWindows install guide](https://msdn.microsoft.com/en-us/commandline/wsl/install_guide). +> Make sure to have the latest version (Windows 10 Creators Edition) as previous versions had various issues. +> Also, don't invoke `bash` from `Visual Studio Command Prompt`, otherwise CMake might find VC++ and try and use that! ---- +### Where can I find more info on running Unreal on Linux? 
+> Start here: [Unreal on Linux](https://docs.unrealengine.com/latest/INT/Platforms/Linux/index.html) +> [Building Unreal on Linux](https://wiki.unrealengine.com/Building_On_Linux#Clang) +> [Unreal Linux Support](https://wiki.unrealengine.com/Linux_Support) +> [Unreal Cross Compilation](https://wiki.unrealengine.com/Compiling_For_Linux) ## Other - -###### Packaging a binary including the AutonomySim plugin +### Packaging a binary including the AutonomySim plugin ->In order to package a custom environment with the AutonomySim plugin, there are a few project settings that are necessary for ensuring all required assets needed for AutonomySim are included inside the package. Under `Edit -> Project Settings... -> Project -> Packaging`, please ensure the following settings are configured properly: +> To package a custom environment with the AutonomySim plugin, a few project settings are necessary to ensure that all assets required by AutonomySim are included in the package. Under `Edit -> Project Settings... -> Project -> Packaging`, please ensure the following settings are configured properly: > ->- `List of maps to include in a packaged build`: ensure one entry exists for `/AutonomySim/AutonomySimAssets` ->- `Additional Asset Directories to Cook`: ensure one entry exists for `/AutonomySim/HUDAssets` +> * `List of maps to include in a packaged build`: ensure one entry exists for `/AutonomySim/AutonomySimAssets` +> * `Additional Asset Directories to Cook`: ensure one entry exists for `/AutonomySim/HUDAssets` diff --git a/docs/build_linux.md b/docs/build_linux.md index ce33ba69..fa736eb4 100644 --- a/docs/build_linux.md +++ b/docs/build_linux.md @@ -2,7 +2,10 @@ The current recommended and tested environment is **Ubuntu 18.04 LTS**. Theoretically, you can build on other distros as well, but we haven't tested it. -We've two options - you can either build inside docker containers or your host machine. +There are two options: + +1. 
Build inside Docker containers +2. Build on your host machine ## Docker @@ -14,9 +17,8 @@ Please see instructions [here](docker_ubuntu.md) #### Build Unreal Engine -- Make sure you are [registered with Epic Games](https://docs.unrealengine.com/en-US/SharingAndReleasing/Linux/BeginnerLinuxDeveloper/SettingUpAnUnrealWorkflow/index.html). This is required to get source code access for Unreal Engine. - -- Clone Unreal in your favorite folder and build it (this may take a while!). **Note**: We only support Unreal >= 4.27 at present. We recommend using 4.27. +* Make sure you are [registered with Epic Games](https://docs.unrealengine.com/en-US/SharingAndReleasing/Linux/BeginnerLinuxDeveloper/SettingUpAnUnrealWorkflow/index.html). This is required to get source code access for Unreal Engine. +* Clone Unreal in your favorite folder and build it (this may take a while!). **Note**: We only support Unreal >= 4.27 at present. We recommend using 4.27. ```bash # go to the folder where you clone GitHub projects @@ -29,7 +31,7 @@ make ### Build AutonomySim -- Clone AutonomySim and build it: +* Clone AutonomySim and build it: ```bash # go to the folder where you clone GitHub projects @@ -53,16 +55,16 @@ Finally, you will need an Unreal project that hosts the environment for your veh Once AutonomySim is setup: -- Go to `UnrealEngine` installation folder and start Unreal by running `./Engine/Binaries/Linux/UE4Editor`. -- When Unreal Engine prompts for opening or creating project, select Browse and choose `AutonomySim/Unreal/Environments/Blocks` (or your [custom](unreal_custenv.md) Unreal project). -- Alternatively, the project file can be passed as a commandline argument. For Blocks: `./Engine/Binaries/Linux/UE4Editor /Unreal/Environments/Blocks/Blocks.uproject` -- If you get prompts to convert project, look for More Options or Convert-In-Place option. If you get prompted to build, choose Yes. If you get prompted to disable AutonomySim plugin, choose No. 
-- After Unreal Editor loads, press Play button. +* Go to the `UnrealEngine` installation folder and start Unreal by running `./Engine/Binaries/Linux/UE4Editor`. +* When Unreal Engine prompts you to open or create a project, select Browse and choose `AutonomySim/Unreal/Environments/Blocks` (or your [custom](unreal_custenv.md) Unreal project). +* Alternatively, the project file can be passed as a command-line argument. For Blocks: `./Engine/Binaries/Linux/UE4Editor /Unreal/Environments/Blocks/Blocks.uproject` +* If you are prompted to convert the project, look for the More Options or Convert-In-Place option. If you are prompted to build, choose Yes. If you are prompted to disable the AutonomySim plugin, choose No. +* After Unreal Editor loads, press the Play button. See [Using APIs](apis.md) and [settings.json](settings.md) for various options available for AutonomySim usage. !!! tip -Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. + Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. ### [Optional] Setup Remote Control (Multirotor Only) diff --git a/docs/build_macos.md b/docs/build_macos.md index 3a467662..bbcbdc0d 100644 --- a/docs/build_macos.md +++ b/docs/build_macos.md @@ -2,7 +2,10 @@ Only macOS **Catalina (10.15)** has currently been tested. Theoretically, AutonomySim should work on higher macOS versions and Apple Silicon hardware, but this path is not offically supported. -We've two options - you can either build inside docker containers or your host machine. +There are two options: + +1. Build inside Docker containers +2. 
Build on your host machine ## Docker @@ -23,7 +26,7 @@ Please see instructions [here](docker_ubuntu.md) ### Build AutonomySim -- Clone AutonomySim and build it: +* Clone AutonomySim and build it: ```bash # go to the folder where you clone GitHub projects git clone https://github.com/nervosys/AutonomySim.git cd AutonomySim ``` -By default AutonomySim uses clang 8 to build for compatibility with UE 4.25. The setup script will install the right version of cmake, llvm, and eigen. +By default, AutonomySim uses `clang-8` to build for compatibility with UE 4.25. The setup script will install the right versions of `cmake`, `llvm`, and `eigen`. -CMake 3.19.2 is required for building on Apple Silicon. +CMake 3.19.2 is required for building on Apple silicon. ```bash ./setup.sh ``` @@ -47,15 +50,15 @@ Finally, you will need an Unreal project that hosts the environment for your veh ## How to Use AutonomySim -- Browse to `AutonomySim/Unreal/Environments/Blocks`. -- Run `./GenerateProjectFiles.sh ` from the terminal, where `UE_PATH` is the path to the Unreal installation folder. (By default, this is `/Users/Shared/Epic\ Games/UE_4.27/`) The script creates an XCode workspace by the name Blocks.xcworkspace. -- Open the XCode workspace, and press the Build and run button in the top left. -- After Unreal Editor loads, press Play button. +* Browse to `AutonomySim/Unreal/Environments/Blocks`. +* Run `./GenerateProjectFiles.sh <UE_PATH>` from the terminal, where `UE_PATH` is the path to the Unreal installation folder. (By default, this is `/Users/Shared/Epic\ Games/UE_4.27/`.) The script creates an Xcode workspace named `Blocks.xcworkspace`. +* Open the Xcode workspace and press the Build and run button in the top left. +* After Unreal Editor loads, press the Play button. See [Using APIs](apis.md) and [settings.json](settings.md) for various options available for AutonomySim usage. !!! 
tip -Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. + Go to 'Edit->Editor Preferences', in the 'Search' box type 'CPU' and ensure that the 'Use Less CPU when in Background' is unchecked. ### [Optional] Setup Remote Control (Multirotor Only) diff --git a/docs/build_windows.md b/docs/build_windows.md index 0c7f0db1..cc201254 100644 --- a/docs/build_windows.md +++ b/docs/build_windows.md @@ -14,7 +14,8 @@ Congratulations! Unreal Engine is now installed and ready to use. -NOTE: If you created projects with UE 4.16 or older, see the [upgrade guide](unreal_upgrade.md) to upgrade your projects. +!!! note + If you created projects with UE 4.16 or older, see the [upgrade guide](unreal_upgrade.md) to upgrade your projects. ![Unreal Engine Tab UI Screenshot](images/ue_install.png) @@ -52,9 +53,11 @@ NOTE: If you created projects with UE 4.16 or older, see the [upgrade guide](unr * `.\scripts\build.ps1` * `./scripts/build.sh` -NOTE: We are actively porting the DOS-era Windows batch (.bat) and command (.cmd) scripts to PowerShell (.ps1), as it offers modern features such as cross-platform support, unicode text encoding, and system object piping. Linux and MacOS benefit from supporting a common language, BASH. While MacOS now uses Zsh for its default shell, it is backwards compatible with BASH. Eventually, we may only support PowerShell or BASH (or maybe [Batsh](https://github.com/batsh-dev-team/Batsh)) on all platforms. +!!! note + We are actively porting the DOS-era Windows batch (.bat) and command (.cmd) scripts to PowerShell (.ps1), as it offers modern features such as cross-platform support, Unicode text encoding, and system object piping. Linux and macOS benefit from supporting a common language, Bash. While macOS now uses Zsh for its default shell, it is backwards compatible with Bash. 
Eventually, we may only support PowerShell or Bash (or maybe [Batsh](https://github.com/batsh-dev-team/Batsh)) on all platforms. -NOTE: Installing AutonomySim on the `C:\` drive may cause scripts to fail and may also require running VS in Admin mode. If possible, clone the project into a directory on a different drive. If not, ensure correct behaviour. +!!! note + Installing AutonomySim on the `C:\` drive may cause scripts to fail and may also require running VS in Admin mode. If possible, clone the project into a directory on a different drive. If you cannot, verify that the scripts behave correctly. ## Build an Unreal Project Next, you will need an Unreal project to host an environment for your vehicles. ## Setup a Remote Control -NOTE: The below only applies to multi-rotor drones. +!!! note + The following applies only to multirotor drones. To fly drones manually, a physical (or software-emulated) controller is required. For more information, see the [remote control setup guide](remote_control.md). Alternatively, you may (a) wrap [application programming interfaces (APIs)](apis.md) calls for software control or (b) use the [computer vision mode](image_apis.md) for manual keyboard control. @@ -72,7 +76,8 @@ Once AutonomySim is set up by following above steps, you can, 1. Double click on .sln file to load the Blocks project in `Unreal\Environments\Blocks` (or .sln file in your own [custom](unreal_custenv.md) Unreal project). If you don't see .sln file then you probably haven't completed steps in Build Unreal Project section above. - **Note**: Unreal 4.27 will auto-generate the .sln file targetting Visual Studio 2019. Visual Studio 2022 will be able to load and run this .sln, but if you want full Visual Studio 2022 support, you will need to explicitly enable support by going to 'Edit->Editor Preferences->Source Code' and selecting 'Visual Studio 2022' for the 'Source Code Editor' setting. +!!! 
note + Unreal 4.27 will auto-generate the .sln file targeting Visual Studio 2019. Visual Studio 2022 will be able to load and run this .sln, but if you want full Visual Studio 2022 support, you will need to explicitly enable support by going to 'Edit->Editor Preferences->Source Code' and selecting 'Visual Studio 2022' for the 'Source Code Editor' setting. 2. Select your Unreal project as Start Up project (for example, Blocks project) and make sure Build config is set to "Develop Editor" and x64. 3. After Unreal Editor loads, press Play button. diff --git a/docs/camera_views.md b/docs/camera_views.md index 0a1a62b5..a63c65b1 100644 --- a/docs/camera_views.md +++ b/docs/camera_views.md @@ -1,6 +1,6 @@ # Camera Views -The camera views that are shown on screen are the camera views you can fetch via the [simGetImages API](image_apis.md). +The camera views shown on-screen are the image streams that can be fetched via the [Image APIs](image_apis.md). ![Cameras](images/cameras.png) @@ -26,7 +26,7 @@ You can switch to manual camera control by pressing the M key. While manual came Now you can select what is shown by each of above sub windows. For instance, you can chose to show surface normals in first window (instead of depth) and disparity in second window (instead of segmentation). Below is the settings value you can use in [settings.json](settings.md): -``` +```json { "SubWindows": [ {"WindowID": 1, "CameraName": "0", "ImageType": 5, "VehicleName": "", "Visible": false}, @@ -37,21 +37,18 @@ Now you can select what is shown by each of above sub windows. For instance, you ## Performance Impact -*Note*: This section is outdated and has not been updated for new performance enhancement changes. +!!! note + This section is outdated and has not been updated for new performance enhancement changes. Now rendering these views does impact the FPS performance of the game, since this is additional work for the GPU. The following shows the impact on FPS when you open these views. 
![fps](images/fps_views.png) -This is measured on Intel core i7 computer with 32 gb RAM and a GeForce GTX 1080 -graphics card running the Modular Neighborhood map, using cooked debug bits, no debugger or GameEditor open. The normal state with no subviews open is measuring around 16 ms per frame, which means it is keeping a nice steady 60 FPS (which is the target FPS). As it climbs up to 35ms the FPS drops to around 28 frames per second, spiking to 40ms means a few drops to 25 fps. +This is measured on an `Intel Core i7` computer with 32 GB of RAM and a `GeForce GTX 1080` graphics card running the Modular Neighborhood map, using cooked debug bits, with no debugger or GameEditor open. The normal state with no subviews open measures around 16 ms per frame, which means it is keeping a steady 60 FPS (the target FPS). As frame time climbs to 35 ms, the FPS drops to around 28 frames per second; spikes to 40 ms mean a few drops to 25 FPS. The simulator can still function and fly correctly when all this is going on even in the worse case because the physics is decoupled from the rendering. However if the delay gets too high such that the communication with PX4 hardware is interrupted due to overly busy CPU then the flight can stall due to timeout in the offboard control messages. -On the computer where this was measured the drone could fly the path.py program -without any problems with all views open, and with 3 python scripts running -to capture each view type. But there was one stall during this flight, but it -recovered gracefully and completed the path. So it was right on the limit. +On the computer where this was measured, the drone could fly the `path.py` program without any problems with all views open and with 3 Python scripts running to capture each view type. There was one stall during this flight, but it recovered gracefully and completed the path. So it was right on the limit. 
The following shows the impact on CPU, perhaps a bit surprisingly, the CPU impact is also non trivial. diff --git a/docs/cmake_linux.md b/docs/cmake_linux.md index 305bf48c..5ec13d75 100644 --- a/docs/cmake_linux.md +++ b/docs/cmake_linux.md @@ -1,6 +1,6 @@ # Installing CMake on Linux -If you don't have cmake version 3.10 (for example, 3.2.2 is the default on Ubuntu 14) you can run the following: +If you don't have CMake version 3.10 or greater, you can run the following to install it: ```bash mkdir ~/cmake-3.10.2 @@ -8,16 +8,16 @@ cd ~/cmake-3.10.2 wget https://cmake.org/files/v3.10/cmake-3.10.2-Linux-x86_64.sh ``` -Now you have to run this command by itself (it is interactive) +Now, you have to run this command by itself (it is interactive): ```bash sh cmake-3.10.2-Linux-x86_64.sh --prefix ~/cmake-3.10.2 ``` -Answer 'n' to the question about creating another cmake-3.10.2-Linux-x86_64 folder and then +Answer `n` to the question about creating another `cmake-3.10.2-Linux-x86_64` folder and then this: ```bash sudo update-alternatives --install /usr/bin/cmake cmake ~/cmake-3.10.2/bin/cmake 60 ``` -Now type `cmake --version` to make sure your cmake version is 3.10.2. +Now type `cmake --version` to ensure your CMake version is equal to `3.10.2`. diff --git a/docs/code_structure.md b/docs/code_structure.md index cf7afd96..b1d2a057 100644 --- a/docs/code_structure.md +++ b/docs/code_structure.md @@ -2,51 +2,49 @@ ## AutonomyLib -Majority of the code is located in AutonomyLib. This is a self-contained library that you should be able to compile with any C++11 compiler. +The majority of the code is located in `AutonomyLib`, a self-contained library that can be compiled with any popular `C++11` compiler. -AutonomyLib consists of the following components: +`AutonomyLib` consists of the following components: -1. [*Physics engine:*](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib/include/physics) This is header-only physics engine. 
It is designed to be fast and extensible to implement different vehicles. -2. [*Sensor models:*](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib/include/sensors) This is header-only models for Barometer, IMU, GPS and Magnetometer. -3. [*Vehicle models:*](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib/include/vehiclesr) This is header-only models for vehicle configurations and models. Currently we have implemented model for a Multirotor and a configuration for PX4 QuadRotor in the X config. There are several different Multirotor models defined in MultirotorParams.hpp including a hexacopter as well. -4. [*API-related files:*](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib/include/api) This part of AutonomyLib provides abstract base class for our APIs and concrete implementation for specific vehicle platforms such as MavLink. It also has classes for the RPC client and server. +1. [*Physics engine:*](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib/include/physics) This is a header-only physics engine. It is designed to be fast and extensible to implement different vehicles. +2. [*Sensor models:*](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib/include/sensors) These are header-only models for the barometer, IMU, GPS, and magnetometer. +3. [*Vehicle models:*](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib/include/vehiclesr) These are header-only models for vehicle configurations and models. Currently, we have implemented a model for a multirotor and a configuration for a PX4 quadrotor in the `X config`. There are several different multirotor models defined in `MultirotorParams.hpp`, including a hexacopter. +4. [*API-related files:*](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib/include/api) This part of `AutonomyLib` provides abstract base classes for our APIs and concrete implementations for specific vehicle platforms such as `MavLink`. 
It also contains classes for the RPC client and server. -Apart from these, all common utilities are defined in [`common/`](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib/include/common) subfolder. One important file here is [AutonomySimSettings.hpp](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/include/common/AutonomySimSettings.hpp) which should be modified if any new fields are to be added in `settings.json`. +Apart from these, all common utilities are defined in the [`common/`](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib/include/common) subfolder. One important file here is [`AutonomySimSettings.hpp`](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/include/common/AutonomySimSettings.hpp), which should be modified if any new fields are to be added in `settings.json`. -AutonomySim supports different firmwares for Multirotor such as its own SimpleFlight, PX4 and ArduPilot, files for communicating with each firmware are placed in their respective subfolders in [`multirotor/firmwares`](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib/include/vehicles/multirotor/firmwares). +`AutonomySim` supports different firmwares for multirotors such as its own `SimpleFlight`, `PX4`, and `ArduPilot`. Files for communicating with each firmware are placed in their respective subfolders in [`multirotor/firmwares`](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib/include/vehicles/multirotor/firmwares). -The vehicle-specific APIs are defined in the `api/` subfolder, along-with required structs. The [`AutonomyLib/src/`](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib/src) contains .cpp files with implementations of various mehtods defined in the .hpp files. For e.g. 
[MultirotorApiBase.cpp](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/src/vehicles/multirotor/api/MultirotorApiBase.cpp) contains the base implementation of the multirotor APIs, which can also be overridden in the specific firmware files if required. +Vehicle-specific APIs are defined in the `api/` subfolder along with the required data structures. The [`AutonomyLib/src/`](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib/src) directory contains `.cpp` files with implementations of various methods defined in the `.hpp` files. For example, [`MultirotorApiBase.cpp`](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/src/vehicles/multirotor/api/MultirotorApiBase.cpp) contains the base implementation of the multirotor APIs, which can be overridden in the specific firmware files. ## Unreal/Plugins/AutonomySim This is the only portion of project which is dependent on Unreal engine. We have kept it isolated so we can implement simulator for other platforms as well, as has been done for [Unity](https://microsoft.github.io/AutonomySim/Unity.html). The Unreal code takes advantage of its UObject based classes including Blueprints. The `Source/` folder contains the C++ files, while the `Content/` folder has the blueprints and assets. Some main components are described below: -1. *SimMode_ classes*: The SimMode classes help implement many different modes, such as pure Computer Vision mode, where there is no vehicle or simulation for a specific vehicle (currently car and multirotor). 
The vehicle classes are located in [`Vehicles/`](https://github.com/nervosys/AutonomySim/tree/master/Unreal/Plugins/AutonomySim/Source/Vehicles) 2. *PawnSimApi*: This is the [base class](https://github.com/nervosys/AutonomySim/blob/main/Unreal/Plugins/AutonomySim/Source/PawnSimApi.cpp) for all vehicle pawn visualizations. Each vehicle has their own child (Multirotor|Car|ComputerVision)Pawn class. -3. [UnrealSensors](https://github.com/nervosys/AutonomySim/tree/main/Unreal/Plugins/AutonomySim/Source/UnrealSensors): Contains implementation of Distance and Lidar sensors. +3. [UnrealSensors](https://github.com/nervosys/AutonomySim/tree/master/Unreal/Plugins/AutonomySim/Source/UnrealSensors): Contains implementation of Distance and Lidar sensors. 4. *WorldSimApi*: Implements most of the environment and vehicle-agnostic APIs Apart from these, [`PIPCamera`](https://github.com/nervosys/AutonomySim/blob/main/Unreal/Plugins/AutonomySim/Source/PIPCamera.cpp) contains the camera initialization, and [`UnrealImageCapture`](https://github.com/nervosys/AutonomySim/blob/main/Unreal/Plugins/AutonomySim/Source/UnrealImageCapture.cpp) & [`RenderRequest`](https://github.com/nervosys/AutonomySim/blob/main/Unreal/Plugins/AutonomySim/Source/RenderRequest.cpp) the image rendering code. [`AutonomyBlueprintLib`](https://github.com/nervosys/AutonomySim/blob/main/Unreal/Plugins/AutonomySim/Source/AutonomyBlueprintLib.cpp) has a lot of utility and wrapper methods used to interface with the UE4 engine. ## MavLinkCom -This is the library developed by our own team member [Chris Lovett](https://github.com/lovettchris) that provides C++ classes to talk to the MavLink devices. This library is stand alone and can be used in any project. -See [MavLinkCom](mavlinkcom.md) for more info. +This is the library developed by our own team member [Chris Lovett](https://github.com/lovettchris) that provides C++ classes to talk to the MavLink devices. This library is stand alone and can be used in any project. 
See [MavLinkCom](mavlinkcom.md) for more info. ## Sample Programs -We have created a few sample programs to demonstrate how to use the API. See HelloDrone and DroneShell. -DroneShell demonstrates how to connect to the simulator using UDP. The simulator is running a server (similar to DroneServer). +We have created a few sample programs to demonstrate how to use the API. See HelloDrone and DroneShell. DroneShell demonstrates how to connect to the simulator using UDP. The simulator is running a server (similar to DroneServer). -## PythonClient +## Python Client -[PythonClient](https://github.com/nervosys/AutonomySim/tree/main/PythonClient) contains Python API wrapper files and sample programs demonstrating their uses. +[PythonClient](https://github.com/nervosys/AutonomySim/tree/master/PythonClient) contains Python API wrapper files and sample programs demonstrating their uses. -## Unreal Framework +## Unreal Engine -The following picture illustrates how AutonomySim is loaded and invoked by the Unreal Game Engine: +The figure below illustrates how `AutonomySim` is loaded and invoked by the Unreal Engine: -![AutonomySimConstruction](images/AutonomySim_startup.png) +![AutonomySimConstruction](media/images/AutonomySim_startup.png) ## Contributing diff --git a/docs/coding_guidelines.md b/docs/coding_guidelines.md index 7812914a..16e04403 100644 --- a/docs/coding_guidelines.md +++ b/docs/coding_guidelines.md @@ -2,12 +2,11 @@ We adopt the modern C++[11..23] standards. Smart pointers, lambdas, and multithreading primitives are your friend. 
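As a quick illustration of the modern-C++ features these guidelines lean on — smart pointers, lambdas, and multithreading primitives — here is a minimal, hedged sketch (invented names, not AutonomySim code):

```cpp
#include <cassert>
#include <memory>
#include <mutex>
#include <thread>
#include <vector>

// Increment a heap-allocated counter from several threads.
// std::unique_ptr owns the counter (no manual delete); a lambda captures
// locals by reference; std::mutex/std::lock_guard serialize the updates.
int parallel_count(int n_threads) {
    auto counter = std::make_unique<int>(0);
    std::mutex m;

    auto increment = [&] {
        std::lock_guard<std::mutex> lock(m);
        ++*counter;
    };

    std::vector<std::thread> workers;
    for (int i = 0; i < n_threads; ++i)
        workers.emplace_back(increment);
    for (auto& t : workers)
        t.join();

    return *counter;
}
```

Here `parallel_count(4)` returns 4: every increment happens under the lock, and ownership of the counter is released automatically when the `unique_ptr` goes out of scope.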
-## Quick Note +## A Note on Standards The great thing about 'standards' is that there are many to chose from: [ISO](https://isocpp.org/wiki/faq/coding-standards), [Sutter & Stroustrup](https://github.com/isocpp/CppCoreGuidelines/blob/master/CppCoreGuidelines.md), [ROS](http://wiki.ros.org/CppStyleGuide), [Linux](https://www.kernel.org/doc/Documentation/process/coding-style.rst), [Google](https://google.github.io/styleguide/cppguide.html), [Microsoft](https://msdn.microsoft.com/en-us/library/888a6zcz.aspx), [CERN](http://atlas-computing.web.cern.ch/atlas-computing/projects/qa/draft_guidelines.html), [GCC](https://gcc.gnu.org/wiki/CppConventions), [ARM](http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.dui0475c/CJAJAJCJ.html), [LLVM](http://llvm.org/docs/CodingStandards.html), [Epic Games](https://docs.unrealengine.com/5.3/en-US/epic-cplusplus-coding-standard-for-unreal-engine/) and probably thousands of others. Unfortunately, most of these disagree about basic things, such as how to name a class or a constant. This is due to the fact that the standards often inherit legacy issues in order to support existing codebases. The intention behind this document is to provide guidance that remains as close to ISO, Sutter & Stroustrup and ROS while resolving as many conflicts, disadvantages and inconsistencies as possible among them. !!! note - Since we have dropped support for all other game engines, we will be refactoring our C++ code to better comply with the Epic Games standard for Unreal Engine version 5.3 or greater. ## clang-format @@ -24,7 +23,7 @@ If you find a bug in clang-format you can disable clang formatting of a specific ## Naming Conventions -Avoid using any sort of Hungarian notation on names and "_ptr" on pointers. +Avoid using any sort of Hungarian notation on names and `_ptr` on pointers. 
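To make the naming rule concrete, here is a small hypothetical example — descriptive names with no Hungarian prefixes (e.g. `szName`, `pVehicle`) and no `_ptr` suffixes. The class, method, and member names are invented for illustration; the trailing-underscore member style is one common convention:

```cpp
#include <string>

// Hypothetical class: names describe intent, not type.
class VehicleState {
  public:
    void setName(const std::string& name) { name_ = name; }   // read-only arg as const T&
    const std::string& name() const { return name_; }

  private:
    std::string name_;  // descriptive member name, no type prefix
};
```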
| **Code Element** | **Style** | **Comment** | | --------------------- | -------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------- | @@ -58,8 +57,7 @@ The reason we don't use #pragma once is because it's not supported if same heade Inside function or method body place curly bracket on same line. Outside that the Namespace, Class and methods levels use separate line.This is called [K&R style](https://en.wikipedia.org/wiki/Indent_style#K.26R_style) and its variants are widely used in C++ vs other styles which are more popular in other languages. Notice that curlies are not required if you have single statement, but complex statements are easier to keep correct with the braces. ```cpp -int main(int argc, char* argv[]) -{ +int main(int argc, char* argv[]) { while (x == y) { f0(); if (cont()) { @@ -79,17 +77,18 @@ int main(int argc, char* argv[]) Religiously review all non-scalar parameters you declare to be candidate for const and references. If you are coming from languages such as C#/Java/Python, the most often mistake you would make is to pass parameters by value instead of `const T&;` Especially most of the strings, vectors and maps you want to pass as `const T&;` (if they are readonly) or `T&` (if they are writable). Also add `const` suffix to methods as much as possible. ## Overriding + When overriding virtual method, use override suffix. ## Pointers -This is really about memory management. A simulator has much performance critical code, so we try and avoid overloading the memory manager with lots of calls to new/delete. We also want to avoid too much copying of things on the stack, so we pass things by reference when ever possible. But, when the object really needs to live longer than the call stack you often need to allocate that object on the heap, and so you have a pointer. 
Now, if management of the lifetime of that object is going to be tricky we recommend using [C++ 11 smart pointers](https://cppstyle.wordpress.com/c11-smart-pointers/). But smart pointers do have a cost, so don’t use them blindly everywhere. For private code where performance is paramount, raw pointers can be used. Raw pointers are also often needed when interfacing with legacy systems that only accept pointer types, for example, sockets API. But we try to wrap those legacy interfaces as much as possible and avoid that style of programming from leaking into the larger code base. +This is really about memory management. A simulator has much performance critical code, so we try and avoid overloading the memory manager with lots of calls to new/delete. We also want to avoid too much copying of things on the stack, so we pass things by reference when ever possible. But, when the object really needs to live longer than the call stack you often need to allocate that object on the heap, and so you have a pointer. Now, if management of the lifetime of that object is going to be tricky we recommend using [C++ 11 smart pointers](https://cppstyle.wordpress.com/c11-smart-pointers/). But smart pointers do have a cost, so don’t use them blindly everywhere. For private code where performance is paramount, raw pointers can be used. Raw pointers are also often needed when interfacing with legacy systems that only accept pointer types, for example, sockets API. But we try to wrap those legacy interfaces as much as possible and avoid that style of programming from leaking into the larger code base. -Religiously check if you can use const everywhere, for example, `const float * const xP`. Avoid using prefix or suffix to indicate pointer types in variable names, i.e. 
use `my_obj` instead of `myobj_ptr` except in cases where it might make sense to differentiate variables better, for example, `int mynum = 5; int* mynum_ptr = mynum;`
+Religiously check if you can use const everywhere, for example, `const float * const xP`. Avoid using a prefix or suffix to indicate pointer types in variable names, i.e., use `my_obj` instead of `myobj_ptr`, except in cases where it might make sense to differentiate variables better, for example, `int mynum = 5; int* mynum_ptr = &mynum;`

 ## Null Checking

-In Unreal C++ code, when checking if a pointer is null, it is preferable to use `IsValid(ptr)`. In addition to checking for a null pointer, this function will also return whether a UObject is properly initialized. This is useful in situations where a UObject is in the process of being garbage collected but still set to a non-null value.
+In Unreal C++ code, when checking if a pointer is null, it is preferable to use `IsValid(ptr)`. In addition to checking for a null pointer, this function will also return whether a `UObject` is properly initialized. This is useful in situations where a `UObject` is in the process of being garbage collected but still set to a non-null value.

 ## Indentation

@@ -111,6 +110,6 @@ git config --global core.autocrlf input

 For more details on this setting, see [this documentation](https://docs.github.com/en/get-started/getting-started-with-git/configuring-git-to-handle-line-endings).

-## This is Too Short, ye?
+## On Brevity

-Yes, and it's on purpose because no one likes to read 200 page coding guidelines. The goal here is to cover only most significant things which are already not covered by [strict mode compilation in GCC](http://shitalshah.com/p/how-to-enable-and-use-gcc-strict-mode-compilation/) and Level 4 warnings-as-errors in VC++. If you had like to know about how to write better code in C++, please see [GotW](https://herbsutter.com/gotw/) and [Effective Modern C++](http://shop.oreilly.com/product/0636920033707.do) book.
+This document is intentionally brief, as nobody likes to read a 200-page set of code guidelines. The goal here is to cover only the most significant items not already covered by [strict mode compilation in GCC](http://shitalshah.com/p/how-to-enable-and-use-gcc-strict-mode-compilation/) and Level 4 warnings-as-errors in Visual C++. If you would like to learn how to write better C++ code, please read the [GotW](https://herbsutter.com/gotw/) and [Effective Modern C++](http://shop.oreilly.com/product/0636920033707.do) books.
diff --git a/docs/remote_control.md b/docs/controller_remote.md
similarity index 67%
rename from docs/remote_control.md
rename to docs/controller_remote.md
index 95abcde9..1d705add 100644
--- a/docs/remote_control.md
+++ b/docs/controller_remote.md
@@ -12,7 +12,8 @@ To date, XBox and [FrSky Taranis X9D Plus](https://hobbyking.com/en_us/frsky-2-4

 AutonomySim can detect large variety of devices. However, devices other than those listed above may require extra configuration. In the future, we may add relared configuration options in the `settings.json` file. If your controller does not work, we recommend trying workarounds such as [x360ce](http://www.x360ce.com/) or modifying the [SimJoystick.cpp file](https://github.com/nervosys/AutonomySim/blob/main/Unreal/Plugins/AutonomySim/Source/SimJoyStick/SimJoyStick.cpp#L50).

-NOTE: If a realistic experience is desired, the XBox 360 controller is not recommended as it has insufficient potentiometer encoding precision. For more information, see the FAQ below.
+!!! note
+    If a realistic experience is desired, the XBox 360 controller is not recommended as it has insufficient potentiometer encoding precision. For more information, see the FAQ below.

 ### FrSky Taranis X9D Plus

@@ -34,8 +35,7 @@ Please see the [PX4 controller configuration](https://docs.px4.io/en/getting_sta

 ### XBox 360 Controller

-You can also use an xbox controller in SITL mode, it just won't be as precise as a real RC controller.
-See [xbox controller](xbox_controller.md) for details on how to set that up.
+You can also use an XBox controller in SITL mode; it just won't be as precise as a real RC controller. See [XBox controller](controller_xbox.md) for details on how to set it up.

 ### Playstation 3 Controller

@@ -47,33 +47,33 @@ Nils Tijtgat wrote an excellent blog on how to get the [DJI controller working w

 ## FAQ

-1. **AutonomySim says my USB controller is not detected.**
+### AutonomySim says my USB controller is not detected

-    This typically happens if you have multiple RCs and or XBox/Playstation gamepads etc connected. In Windows, hit Windows+S key and search for "Set up USB Game controllers" (in older versions of Windows try "joystick"). This will show you all game controllers connected to your PC. If you don't see yours than Windows haven't detected it and so you need to first solve that issue. If you do see yours but not at the top of the list (i.e. index 0) than you need to tell AutonomySim because AutonomySim by default tries to use RC at index 0. To do this, navigate to your `~/Documents/AutonomySim` folder, open up `settings.json` and add/modify following setting. Below tells AutonomySim to use RC at `index = 2`.
-
-    ```json
-    {
-        "SettingsVersion": 1.2,
-        "SimMode": "Multirotor",
-        "Vehicles": {
-            "SimpleFlight": {
-                "VehicleType": "SimpleFlight",
-                "RC": {
-                    "RemoteControlID": 2
-                }
+This typically happens if you have multiple RCs and/or XBox/Playstation gamepads etc. connected. In Windows, hit the Windows+S key and search for "Set up USB Game controllers" (in older versions of Windows, try "joystick"). This will show you all game controllers connected to your PC. If you don't see yours, then Windows hasn't detected it, and you need to solve that issue first. If you do see yours, but not at the top of the list (i.e., index 0), then you need to tell AutonomySim, because by default AutonomySim tries to use the RC at index 0.
To do this, navigate to your `~/Documents/AutonomySim` folder, open `settings.json`, and add or modify the following setting. The example below tells AutonomySim to use the RC at `index = 2`.
+
+```json
+{
+    "SettingsVersion": 1.2,
+    "SimMode": "Multirotor",
+    "Vehicles": {
+        "SimpleFlight": {
+            "VehicleType": "SimpleFlight",
+            "RC": {
+                "RemoteControlID": 2
+            }
             }
         }
     }
-    ```
+}
+```

-2. **The vehicle is unstable when using XBox/PS3 controller**
+### The vehicle is unstable when using XBox/PS3 controller

-    Regular gamepads are not very precise and have lot of random noise. Most of the times you may see significant offsets as well (i.e. output is not zero when sticks are at zero). So this behavior is expected.
+Regular gamepads are not very precise and have a lot of random noise. Much of the time, you may also see significant offsets (i.e., the output is not zero when the sticks are at zero), so this behavior is expected.

-3. **Where is the RC controller calibration utility in AutonomySim?**
+### Where is the RC controller calibration utility in AutonomySim?

-    We haven't implemented it yet. This means your RC firmware will need to have a capability to do calibration for now.
+We haven't implemented it yet. For now, your RC firmware will need to support calibration itself.

-4. **The RC controller is not working with PX4**
+### The RC controller is not working with PX4

-    First, ensure your RC controller is working in [QGroundControl](https://docs.qgroundcontrol.com/en/SetupView/Radio.html). If it doesn't then it will sure not work in AutonomySim. The PX4 mode is suitable for folks who have at least intermediate level of experience to deal with various issues related to PX4 and we would generally refer you to get help from PX4 forums.
+First, ensure your RC controller is working in [QGroundControl](https://docs.qgroundcontrol.com/en/SetupView/Radio.html). If it doesn't work there, it will surely not work in AutonomySim.
The PX4 mode is suitable for users with at least an intermediate level of experience in dealing with PX4-related issues; we would generally refer you to the PX4 forums for help.
diff --git a/docs/robot_controller.md b/docs/controller_robot.md
similarity index 100%
rename from docs/robot_controller.md
rename to docs/controller_robot.md
diff --git a/docs/xbox_controller.md b/docs/controller_xbox.md
similarity index 56%
rename from docs/xbox_controller.md
rename to docs/controller_xbox.md
index c085ffec..b8b558e2 100644
--- a/docs/xbox_controller.md
+++ b/docs/controller_xbox.md
@@ -1,6 +1,6 @@
 # XBox Controller

-To use an XBox controller with AutonomySim follow these steps:
+To use an `XBox` controller with `AutonomySim`, follow the steps below:

 1. Connect XBox controller so it shows up in your PC Game Controllers:

@@ -10,16 +10,10 @@ To use an XBox controller with AutonomySim follow these steps:

 ![Gamecontrollers](images/qgc_joystick.png)

-Now calibrate the radio, and setup some handy button actions. For example, I set mine so that
-the 'A' button arms the drone, 'B' put it in manual flight mode, 'X' puts it in altitude hold mode
-and 'Y' puts it in position hold mode. I also prefer the feel of the controller when I check the
-box labelled "Use exponential curve on roll,pitch, yaw" because this gives me more sensitivity for
-small movements.
-
-QGroundControl will find your Pixhawk via the UDP proxy port 14550 setup by MavLinkTest above.
-AutonomySim will find your Pixhawk via the other UDP server port 14570 also setup by MavLinkTest above.
-You can also use all the QGroundControl controls for autonomous flying at this point too.
+Now calibrate the radio and set up some handy button actions. For example, I set mine so that
+the `A` button arms the drone, `B` puts it in manual flight mode, `X` puts it in altitude hold mode, and `Y` puts it in position hold mode.
I also prefer the feel of the controller when I check the box labelled `Use exponential curve on roll, pitch, yaw` because this gives me more sensitivity for small movements.
+`QGroundControl` will find your Pixhawk via the UDP proxy port 14550 set up by `MavLinkTest` above. `AutonomySim` will find your Pixhawk via the other UDP server port 14570, also set up by `MavLinkTest` above. You can also use all the QGroundControl controls for autonomous flying at this point.

 3. Connect to Pixhawk serial port using MavLinkTest.exe like this:

@@ -27,7 +21,7 @@ You can also use all the QGroundControl controls for autonomous flying at this p
 MavLinkTest.exe -serial:*,115200 -proxy:127.0.0.1:14550 -server:127.0.0.1:14570
 ```

-1. Run AutonomySim Unreal simulator with these `~/Documents/AutonomySim/settings.json` settings:
+1. Run `AutonomySim` with the following settings in `~/Documents/AutonomySim/settings.json`:

 ```json
 "Vehicles": {
diff --git a/docs/create_issue.md b/docs/create_issues.md
similarity index 77%
rename from docs/create_issue.md
rename to docs/create_issues.md
index d4608bde..07ec31e7 100644
--- a/docs/create_issue.md
+++ b/docs/create_issues.md
@@ -1,6 +1,6 @@
 # How to Create Issue or Ask Questions Effectively

-AutonomySim is open source project and contributors like you keeps it going. It is important to respect contributors time and effort when you are asking a question or filing an issue. Your chances of receiving helpful response would increase if you follow below guidelines:
+`AutonomySim` is an open-source project. Contributors like you keep it going. It is important to respect contributors' time and effort when asking a question or filing an issue. Your chances of receiving a helpful response will improve if you follow the guidelines below.
## Good Practices
diff --git a/docs/dev_workflow.md b/docs/dev_workflow.md
index 45e780cd..8d4a33a3 100644
--- a/docs/dev_workflow.md
+++ b/docs/dev_workflow.md
@@ -50,32 +50,32 @@ build.cmd

 Above command will transfer your code changes from Unreal project folder back to `Unreal\Plugins` folder. Now your changes are ready to be pushed to AutonomySim repo or your own fork. You can also copy `Unreal\Plugins` to your custom Unreal engine project and see if everything works in your custom project as well.

-### Take Away
+### Takeaway

-Once you understand how Unreal Build system and plugin model works as well as why we are doing above steps, you should feel quite comfortable in following this workflow. Don't be afraid of opening up .cmd files to peek inside and see what its doing. They are quite minimal and straightforward (except, of course, build.cmd - don't look in to that one).
+Once you understand how the Unreal Build system and plugin model work, as well as why we are doing the above steps, you should feel comfortable following this workflow. Don't be afraid to open the `.cmd` files and peek inside to see what they are doing. These files are mostly minimal and straightforward (except, of course, `build.cmd`).

 ## FAQ

-#### I made changes in code in Blocks project but its not working.
+### I made code changes in the Blocks project but it's not working

 When you press F5 or F6 in Visual Studio to start build, the Unreal Build system kicks in and it tries to find out if any files are dirty and what it needs to build. Unfortunately, it often fails to recognize dirty files that is not the code that uses Unreal headers and object hierarchy. So, the trick is to just make some file dirty that Unreal Build system always recognizes. My favorite one is AutonomySimGameMode.cpp. Just insert a line, delete it and save the file.

-#### I made changes in the code outside of Visual Studio but its not working.
+### I made changes in the code outside of Visual Studio but it's not working

 Don't do that! Unreal Build system *assumes* that you are using Visual Studio and it does bunch of things to integrate with Visual Studio. If you do insist on using other editors then look up how to do command line builds in Unreal projects OR see docs on your editor on how it can integrate with Unreal build system OR run `clean.cmd` + `GenerateProjectFiles.cmd` to make sure VS solution is in sync.

-#### I'm trying to add new file in the Unreal Project and its not working.
+### I'm trying to add a new file in the Unreal Project and it's not working

 It won't! While you are indeed using Visual Studio solution, remember that this solution was actually generated by Unreal Build system. If you want to add new files in your project, first shut down Visual Studio, add an empty file at desired location and then run `GenerateProjectFiles.cmd` which will scan all files in your project and then re-create the .sln file. Now open this new .sln file and you are in business.

-#### I copied Unreal\Plugins folder but nothing happens in Unreal Project.
+### I copied the Unreal\Plugins folder but nothing happens in the Unreal Project

 First make sure your project's .uproject file is referencing the plugin. Then make sure you have run `clean.cmd` and then `GenerateProjectFiles.cmd` as described in Overview above.

-#### I have multiple Unreal projects with AutonomySim plugin. How do I update them easily?
+### I have multiple Unreal projects with the AutonomySim plugin. How do I update them easily?

 You are in luck! We have `build_all_ue_projects.cmd` which exactly does that. Don't treat it as black box (at least not yet), open it up and see what it does. It has 4 variables that are being set from command line args. If these args is not supplied they are set to default values in next set of statements. You might want to change default values for the paths.
This batch file builds AutonomySim plugin, deploys it to all listed projects (see CALL statements later in the batch file), runs packaging for those projects and puts final binaries in specified folder - all in one step! This is what we use to create our own binary releases.

-#### How do I contribute back to AutonomySim?
+### How do I contribute back to AutonomySim?

 Before making any changes make sure you have created your feature branch. After you test your code changes in Blocks environment, follow the [usual steps](https://akrabat.com/the-beginners-guide-to-contributing-to-a-github-project/) to make contributions just like any other GitHub projects. Please use rebase and squash merge, for more information see [An introduction to Git merge and rebase: what they are, and how to use them](https://www.freecodecamp.org/news/an-introduction-to-git-merge-and-rebase-what-they-are-and-how-to-use-them-131b863785f/).
diff --git a/docs/distance_sensor.md b/docs/distance_sensor.md
index 4317de13..464c97f4 100644
--- a/docs/distance_sensor.md
+++ b/docs/distance_sensor.md
@@ -1,6 +1,6 @@
 # Distance Sensor

-By default, Distance Sensor points to the front of the vehicle. It can be pointed in any direction by modifying the settings
+By default, the `Distance Sensor` points to the front of the vehicle. It can be pointed in any direction by modifying the settings.

 ## Configurable Parameters

@@ -25,5 +25,4 @@ For example, to make the sensor point towards the ground (for altitude measureme
 ```

 !!! note
     For Cars, the sensor is placed 1 meter above the vehicle center by default. This is required since otherwise the sensor gives strange data due it being inside the vehicle. This doesn't affect the sensor values say when measuring the distance between 2 cars. See [`PythonClient/car/distance_sensor_multi.py`](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/car/distance_sensor_multi.py) for an example usage.
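As an aside to the distance-sensor settings discussed above, the sensor block of a `settings.json` file can also be generated programmatically. The helper below is hypothetical (it is not an AutonomySim API); the key names and the `SensorType` id of `5` follow the AirSim-style settings schema these docs inherit, and offsets use the NED convention (negative Z is up) — verify both against `settings.md` before relying on them.

```python
import json

def distance_sensor_settings(name="Distance", pitch_deg=-90.0, z_offset=0.0):
    """Build a settings.json sensor block for one distance sensor.

    Hypothetical helper: key names and the SensorType id follow the
    AirSim-style schema; verify them against settings.md before use.
    """
    return {
        name: {
            "SensorType": 5,   # distance sensor id in the AirSim-style schema (assumption)
            "Enabled": True,
            "X": 0.0, "Y": 0.0, "Z": z_offset,  # offset from vehicle center, NED (negative Z is up)
            "Yaw": 0.0, "Pitch": pitch_deg, "Roll": 0.0,  # Pitch -90 points the sensor at the ground
        }
    }

# The note above suggests mounting the sensor ~1 m above a car's center, i.e. Z = -1 in NED.
cfg = distance_sensor_settings(z_offset=-1.0)
print(json.dumps(cfg, indent=2))
```

The resulting dictionary would then be merged under the vehicle's `"Sensors"` key in `~/Documents/AutonomySim/settings.json`.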
diff --git a/docs/docker_ubuntu.md b/docs/docker_ubuntu.md index ed9220eb..fca6e6e5 100644 --- a/docs/docker_ubuntu.md +++ b/docs/docker_ubuntu.md @@ -1,14 +1,17 @@ -# AutonomySim on Docker in Linux +# Docker on Linux -We've two options for docker. You can either build an image for running [AutonomySim linux binaries](#binaries), or for compiling Unreal Engine + AutonomySim [from source](#source) +There are two options for Docker: + +1. Build an image for running [AutonomySim linux binaries](#binaries) +2. Build an image for compiling Unreal Engine and AutonomySim [from source](#source) ## Binaries -#### Requirements +### Requirements * Install [nvidia-docker2](https://github.com/NVIDIA/nvidia-docker#quickstart) -#### Build the docker image +### Build the docker image * Below are the default arguments. `--base_image`: This is image over which we'll install AutonomySim. We've tested on Ubuntu 18.04 with CUDA 10.0. @@ -25,7 +28,7 @@ We've two options for docker. You can either build an image for running [Autonom * Verify you have an image by: `$ docker images | grep AutonomySim` -#### Running an unreal binary inside a docker container +### Running an unreal binary inside a docker container * Get [a Linux binary](https://github.com/nervosys/AutonomySim/releases) or package your own project in Ubuntu. Let's take the Blocks binary as an example. You can download it by running the folowing: @@ -60,12 +63,12 @@ For AutonomySim, most relevant would be `-windowed`, `-ResX`, `-ResY`. 
Click on ## Source -#### Requirements +### Requirements * Install [nvidia-docker2](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html#docker) * Install [ue4-docker](https://docs.adamrehn.com/ue4-docker/configuration/configuring-linux) -#### Build Unreal Engine inside docker +### Build Unreal Engine inside docker * To get access to Unreal Engine's source code, register on Epic Games' website and link it to your github account, as explained in the `Required Steps` section [here](https://docs.unrealengine.com/en-us/Platforms/Linux/BeginnerLinuxDeveloper/SettingUpAnUnrealWorkflow). @@ -86,7 +89,7 @@ For AutonomySim, most relevant would be `-windowed`, `-ResX`, `-ResY`. Click on * [`docker image prune`](https://docs.docker.com/engine/reference/commandline/image_prune/) * [`docker system prune`](https://docs.docker.com/engine/reference/commandline/system_prune/) -#### Building AutonomySim inside UE4 docker container +### Building AutonomySim inside UE4 docker container * Build AutonomySim docker image (which lays over the unreal image we just built) Below are the default arguments. @@ -103,7 +106,7 @@ python build_AutonomySim_image.py \ --target_image=AutonomySim_source:4.19.2-cudagl10.0 ``` -#### Running AutonomySim container +### Running AutonomySim container * Run the AutonomySim source image we built by: @@ -120,7 +123,7 @@ python build_AutonomySim_image.py \ * [Specifying an AutonomySim settings.json](#specifying-settingsjson) * Continue with [AutonomySim's Linux docs](build_linux.md#build-unreal-environment). -#### [Misc] Packaging Unreal Environments in `AutonomySim_source` containers +### [Misc] Packaging Unreal Environments in `AutonomySim_source` containers * Let's take the Blocks environment as an example. 
In the following script, specify the full path to your unreal uproject file by `project` and the directory where you want the binaries to be placed by `archivedirectory` @@ -136,7 +139,7 @@ This would create a Blocks binary in `/home/ue4/Binaries/Blocks/`. You can test ### Specifying settings.json -#### `AutonomySim_binary` docker image +#### `AutonomySim_binary` Docker image * We're mapping the host machine's `PATH/TO/AutonomySim/docker/settings.json` to the docker container's `/home/AutonomySim_user/Documents/AutonomySim/settings.json`. * Hence, we can load any settings file by simply modifying `PATH_TO_YOUR/settings.json` by modifying the following snippets in [`run_AutonomySim_image_binary.sh`](https://github.com/nervosys/AutonomySim/blob/main/docker/run_AutonomySim_image_binary.sh) @@ -158,7 +161,6 @@ nvidia-docker run --runtime=nvidia -it \ ``` !!! note - Docker version >=19.03 (check using `docker -v`), natively supports Nvidia GPUs, so run using `--gpus all` flag as given - ```bash diff --git a/docs/faq.md b/docs/faq.md index d4715737..9e8ca5d2 100644 --- a/docs/faq.md +++ b/docs/faq.md @@ -1,148 +1,113 @@ # FAQ ---- - -## General +## Table of Contents - [FAQ](#faq) + - [Table of Contents](#table-of-contents) - [General](#general) - - [General](#general-1) - - [Unreal editor is slow when it is not the active window](#unreal-editor-is-slow-when-it-is-not-the-active-window) - - [My mouse disappears in Unreal](#my-mouse-disappears-in-unreal) - - [Where is the setting file and how do I modify it?](#where-is-the-setting-file-and-how-do-i-modify-it) - - [How do I arm my drone?](#how-do-i-arm-my-drone) - - [When making API call I get error](#when-making-api-call-i-get-error) - - [I'm getting Eigen not found error when compiling Unreal project](#im-getting-eigen-not-found-error-when-compiling-unreal-project) - - [Something went wrong. 
How do I debug?](#something-went-wrong-how-do-i-debug) - - [What do the colors mean in the Segmentation View?](#what-do-the-colors-mean-in-the-segmentation-view) - - [Unreal 4.xx doesn't look as good as 4.yy](#unreal-4xx-doesnt-look-as-good-as-4yy) - - [Can I use an XBox controller to fly?](#can-i-use-an-xbox-controller-to-fly) - - [Can I build a hexacopter with AutonomySim?](#can-i-build-a-hexacopter-with-autonomysim) - - [How do I use AutonomySim with multiple vehicles?](#how-do-i-use-autonomysim-with-multiple-vehicles) - - [What computer do you need?](#what-computer-do-you-need) - - [How do I report issues?](#how-do-i-report-issues) + - [Unreal editor is slow when it is not the active window](#unreal-editor-is-slow-when-it-is-not-the-active-window) + - [My mouse disappears in Unreal](#my-mouse-disappears-in-unreal) + - [Where is the setting file and how do I modify it?](#where-is-the-setting-file-and-how-do-i-modify-it) + - [How do I arm my drone?](#how-do-i-arm-my-drone) + - [When making API call I get error](#when-making-api-call-i-get-error) + - [I'm getting Eigen not found error when compiling Unreal project](#im-getting-eigen-not-found-error-when-compiling-unreal-project) + - [Something went wrong. 
How do I debug?](#something-went-wrong-how-do-i-debug) + - [What do the colors mean in the Segmentation View?](#what-do-the-colors-mean-in-the-segmentation-view) + - [Unreal 4.xx doesn't look as good as 4.yy](#unreal-4xx-doesnt-look-as-good-as-4yy) + - [Can I use an XBox controller to fly?](#can-i-use-an-xbox-controller-to-fly) + - [Can I build a hexacopter with AutonomySim?](#can-i-build-a-hexacopter-with-autonomysim) + - [How do I use AutonomySim with multiple vehicles?](#how-do-i-use-autonomysim-with-multiple-vehicles) + - [What computer do you need?](#what-computer-do-you-need) + - [How do I report issues?](#how-do-i-report-issues) - [Others](#others) ---- - - ## General - - -###### Unreal editor is slow when it is not the active window - ->Go to Edit/Editor Preferences, select "All Settings" and type "CPU" in the search box. ->It should find the setting titled "Use Less CPU when in Background", and you want to uncheck this checkbox. - - - -###### My mouse disappears in Unreal - ->Yes, Unreal steals the mouse, and we don't draw one. So to get your mouse back just use Alt+TAB to switch to a different window. To avoid this entirely, go to Project settings >in Unreal Editor, go to Input tab and disable all settings for mouse capture. - +### Unreal editor is slow when it is not the active window -###### Where is the setting file and how do I modify it? +> Go to Edit/Editor Preferences, select "All Settings" and type "CPU" in the search box. +> It should find the setting titled "Use Less CPU when in Background", and you want to uncheck this checkbox. ->AutonomySim will create empty settings file at `~/Documents/AutonomySim/settings.json`. You can view the available [settings options](settings.md). +### My mouse disappears in Unreal - +> Yes, Unreal steals the mouse, and we don't draw one. So to get your mouse back just use Alt+TAB to switch to a different window. 
To avoid this entirely, go to Project settings >in Unreal Editor, go to Input tab and disable all settings for mouse capture. -###### How do I arm my drone? +### Where is the setting file and how do I modify it? ->If you're using simple_flight, your vehicle is already armed and ready to fly. For PX4 you can arm by holding both sticks on remote control down and to the center. +> AutonomySim will create empty settings file at `~/Documents/AutonomySim/settings.json`. You can view the available [settings options](settings.md). - +### How do I arm my drone? -###### When making API call I get error +> If you're using simple_flight, your vehicle is already armed and ready to fly. For PX4 you can arm by holding both sticks on remote control down and to the center. ->If you are getting this error, ->``` ->TypeError: unsupported operand type(s) for *: 'AsyncIOLoop' and 'float' ->``` ->its probably due to upgraded version of tornado package with version > 5.0 in Python that conflicts with `msgpack-rpc-python` which requires tornado package < 5.0. To fix this >you can update the package like this: ->``` ->pip install --upgrade msgpack-rpc-python ->``` ->But this might break something (for example, PyTorch 0.4+) because it will uninstall newer tornado and re-install older one. To avoid this you should create new [conda >environment](https://conda.io/docs/user-guide/tasks/manage-environments.html). +### When making API call I get error - +> If you are getting this error, +> ``` +> TypeError: unsupported operand type(s) for *: 'AsyncIOLoop' and 'float' +> ``` +> It is probably due to upgraded version of tornado package with version > 5.0 in Python that conflicts with `msgpack-rpc-python` which requires tornado package < 5.0. To fix this >you can update the package like this: +> ``` +> pip install --upgrade msgpack-rpc-python +> ``` +> But this might break something (for example, PyTorch 0.4+) because it will uninstall newer tornado and re-install older one. 
To avoid this you should create new [conda >environment](https://conda.io/docs/user-guide/tasks/manage-environments.html). -###### I'm getting Eigen not found error when compiling Unreal project +### I'm getting Eigen not found error when compiling Unreal project >This is most likely because AutonomySim wasn't built and Plugin folder was copied in Unreal project folder. To fix this make sure you [build AutonomySim](build_windows.md) first (run >`build.cmd` in Windows). - - -###### Something went wrong. How do I debug? - ->First turn on C++ exceptions from the Exceptions window: - ->![exceptions](images/exceptions.png) - ->and copy the stack trace of all exceptions you see there during execution that look relevant (for example, there might be an initial exception from VSPerf140 that you can >ignore) then paste these call stacks into a new AutonomySim GitHub issue, thanks. - - - -###### What do the colors mean in the Segmentation View? - ->See [Camera Views](camera_views.md) for information on the camera views and how to change them. - - +### Something went wrong. How do I debug? -###### Unreal 4.xx doesn't look as good as 4.yy +> First turn on C++ exceptions from the Exceptions window: ->Unreal 4.15 added the ability for Foliage LOD dithering to be disabled on a case-by-case basis by unchecking the `Dithered LOD Transition` checkbox in the foliage materials. >Note that all materials used on all LODs need to have the checkbox checked in order for dithered LOD transitions to work. When checked the transition of generated foliage will >be a lot smoother and will look better than 4.14. +> ![exceptions](images/exceptions.png) - +> and copy the stack trace of all exceptions you see there during execution that look relevant (for example, there might be an initial exception from VSPerf140 that you can >ignore) then paste these call stacks into a new AutonomySim GitHub issue, thanks. -###### Can I use an XBox controller to fly? 
+### What do the colors mean in the Segmentation View?

->See [XBox controller](xbox_controller.md) for details.

> See [Camera Views](camera_views.md) for information on the camera views and how to change them.

- 

### Unreal 4.xx doesn't look as good as 4.yy

-###### Can I build a hexacopter with AutonomySim?

> Unreal 4.15 added the ability for Foliage LOD dithering to be disabled on a case-by-case basis by unchecking the `Dithered LOD Transition` checkbox in the foliage materials. Note that all materials used on all LODs need to have the checkbox checked in order for dithered LOD transitions to work. When checked, the transition of generated foliage will be a lot smoother and will look better than 4.14.

->See [how to build a hexacopter](https://github.com/nervosys/AutonomySim/wiki/hexacopter).

### Can I use an XBox controller to fly?

- 

> See [XBox controller](controller_xbox.md) for details.

-###### How do I use AutonomySim with multiple vehicles?

### Can I build a hexacopter with AutonomySim?

->Here is [multi-vehicle setup guide](multi_vehicle.md).

> See [how to build a hexacopter](https://github.com/nervosys/AutonomySim/wiki/hexacopter).

- 

### How do I use AutonomySim with multiple vehicles?

-###### What computer do you need?

> Here is the [multi-vehicle setup guide](multi_vehicle.md).

->It depends on how big your Unreal Environment is. The Blocks environment that comes with AutonomySim is very basic and works on typical laptops. The [Modular Neighborhood Pack](https://www.unrealengine.com/marketplace/modular-neighborhood-pack) that we use ourselves for research requires GPUs with at least 4GB of RAM. The [Open World environment](https://www.unrealengine.com/marketplace/open-world-demo-collection) needs GPU with 8GB RAM. Our typical development machines have 32GB of RAM and NVIDIA TitanX and a [fast hard drive](hard_drive.md).

### What computer do you need?

- 

> It depends on how big your Unreal Environment is.
The Blocks environment that comes with AutonomySim is very basic and works on typical laptops. The [Modular Neighborhood Pack](https://www.unrealengine.com/marketplace/modular-neighborhood-pack) that we use ourselves for research requires GPUs with at least 4GB of RAM. The [Open World environment](https://www.unrealengine.com/marketplace/open-world-demo-collection) needs GPU with 8GB RAM. Our typical development machines have 32GB of RAM and NVIDIA TitanX and a [fast hard drive](hard_drive.md). -###### How do I report issues? +### How do I report issues? ->It's a good idea to include your configuration like below. If you can also include logs, that could also expedite the investigation. +> It's a good idea to include your configuration like below. If you can also include logs, that could also expedite the investigation. ->``` ->Operating System: Windows 10 64bit ->CPU: Intel Core i7 ->GPU: Nvidia GTX 1080 ->RAM: 32 GB ->Flight Controller: Pixhawk v2 ->Remote Control: Futaba ->``` +> ``` +> Operating System: Windows 10 64bit +> CPU: Intel Core i7 +> GPU: Nvidia GTX 1080 +> RAM: 32 GB +> Flight Controller: Pixhawk v2 +> Remote Control: Futaba +> ``` ->If you have modified the default `~/Document/AutonomySim/settings.json`, please include your ->settings also. +> If you have modified the default `~/Document/AutonomySim/settings.json`, please include your settings also. ->If you are using PX4 then try to [capture log from MavLink or PX4](px4_logging.md). +> If you are using PX4 then try to [capture log from MavLink or PX4](px4_logging.md). ->File an issue through [GitHub Issues](https://github.com/nervosys/AutonomySim/issues). +> File an issue through [GitHub Issues](https://github.com/nervosys/AutonomySim/issues). 
- ## Others - * [Linux Build FAQ](build_linux.md#faq) * [Windows Build FAQ](build_windows.md#faq) diff --git a/docs/getting_started.md b/docs/getting_started.md new file mode 100644 index 00000000..bad55622 --- /dev/null +++ b/docs/getting_started.md @@ -0,0 +1 @@ +# Getting Started diff --git a/docs/hard_drive.md b/docs/hard_drive.md index 9298f084..fe324ac7 100644 --- a/docs/hard_drive.md +++ b/docs/hard_drive.md @@ -1,4 +1,4 @@ -# Busy Hard Drive +# Hard Drives It is not required, but we recommend running your Unreal Environment on a Solid State Drive (SSD). Between debugging, logging, and Unreal asset loading, the hard drive can become your bottleneck. It is normal for your hard drive to be slammed while Unreal is loading the environment, but if your hard drive performance looks like this while the Unreal game is running, then you will probably not get a good flying experience. @@ -6,7 +6,7 @@ It is not required, but we recommend running your Unreal Environment on a Solid In fact, if the hard drive is this busy, chances are the drone will not fly properly at all. For some unknown reason this I/O bottleneck also interferes with the drone control loop, and if that loop doesn't run at a high rate (300-500 Hz) then the drone will not fly. Not surprisingly, the control loop inside the PX4 firmware that runs on a Pixhawk flight controller runs at 1000 Hz. -### Reducing I/O +## Reducing I/O If you can't whip off to Fry's Electronics and pick up an overpriced super fast SSD this weekend, then the following steps can be taken to reduce the hard drive I/O: @@ -18,13 +18,13 @@ If you can't whip off to Fry's Electronics and pick up an overpriced super fast 3. If you must debug the app, and you are using the Visual Studio debugger, then stop Visual Studio from logging Intellitrace information. Go to Tools/Options/Debugging/Intellitrace, and turn off the main checkbox. 4.
Turn off any [Unreal Analytics](https://docs.unrealengine.com/latest/INT/Gameplay/Analytics/index.html) that your environment may have enabled, especially any file logging. -### I/O from Page Faults +## I/O from Page Faults If your system is running out of RAM, it may start paging memory to disk. If your operating system has enabled paging to disk, make sure it is paging to your fastest SSD. Or, if you have enough RAM, disable paging altogether. In fact, if you disable paging and the game stops working, you will know for sure you are running out of RAM. Obviously, shutting down any other unnecessary apps should also free up memory so you don't run out. -### Ideal Runtime performance +## Ideal Runtime Performance This is what my slow hard drive looks like when flying from the UE editor. You can see it's very busy, but the drone still flies ok: diff --git a/docs/hello_drone.md b/docs/hello_drone.md index 7f62de8e..58e06ec2 100644 --- a/docs/hello_drone.md +++ b/docs/hello_drone.md @@ -2,4 +2,4 @@ ## How does Hello Drone work? -Hello Drone uses the RPC client to connect to the RPC server that is automatically started by the AutonomySim. The RPC server routes all the commands to a class that implements [MultirotorApiBase](https://github.com/nervosys/AutonomySim/tree/main/AutonomyLib//include/vehicles/multirotor/api/MultirotorApiBase.hpp). In essence, MultirotorApiBase defines our abstract interface for getting data from the quadrotor and sending back commands. We currently have concrete implementation for MultirotorApiBase for MavLink based vehicles. The implementation for DJI drone platforms, specifically Matrice, is in works. +`Hello Drone` uses the RPC client to connect to the RPC server that is automatically started by `AutonomySim`. The RPC server routes all the commands to a class that implements [`MultirotorApiBase`](https://github.com/nervosys/AutonomySim/tree/master/AutonomyLib//include/vehicles/multirotor/api/MultirotorApiBase.hpp).
In essence, `MultirotorApiBase` defines our abstract interface for getting data from the quadrotor and sending back commands. We currently have a concrete implementation of `MultirotorApiBase` for MavLink-based vehicles. An implementation for DJI drone platforms, specifically the Matrice, is in the works. diff --git a/docs/image_apis.md b/docs/image_apis.md index aa7d246a..7562f3ff 100644 --- a/docs/image_apis.md +++ b/docs/image_apis.md @@ -34,7 +34,7 @@ int getOneImage() { } ``` -## Getting Images with More Flexibility +## Getting Images with Greater Flexibility The `simGetImages` API is slightly more complex to use than the `simGetImage` API; for example, you can get the left camera view, right camera view, and depth image from the left camera in a single API call. The `simGetImages` API also allows you to get uncompressed images as well as floating point single channel images (instead of 3 channel (RGB), each 8 bit). @@ -78,17 +78,19 @@ img_rgb = np.flipud(img_rgb) AutonomySim.write_png(os.path.normpath(filename + '.png'), img_rgb) ``` -#### Quick Tips +#### Tips -- The API `simGetImage` returns `binary string literal` which means you can simply dump it in binary file to create a .png file. However if you want to process it in any other way than you can handy function `AutonomySim.string_to_uint8_array`. This converts binary string literal to NumPy uint8 array. +* The API `simGetImage` returns a `binary string literal`, which means you can simply dump it to a binary file to create a `.png` file. However, if you want to process it in any other way, you can use the handy function `AutonomySim.string_to_uint8_array`, which converts the binary string literal to a NumPy uint8 array. + +* The API `simGetImages` can accept requests for multiple image types from any camera in a single call. You can specify whether the image is PNG compressed, RGB uncompressed, or a float array. For PNG compressed images, you get a `binary string literal`. For a float array, you get a Python list of float64 values.
You can convert this float array to NumPy 2D array using: -- The API `simGetImages` can accept request for multiple image types from any cameras in single call. You can specify if image is png compressed, RGB uncompressed or float array. For png compressed images, you get `binary string literal`. For float array you get Python list of float64. You can convert this float array to NumPy 2D array using ``` AutonomySim.list_to_2d_float_array(response.image_data_float, response.width, response.height) ``` + You can also save float array to .pfm file (Portable Float Map format) using `AutonomySim.write_pfm()` function. -- If you are looking to query position and orientation information in sync with a call to one of the image APIs, you can use `client.simPause(True)` and `client.simPause(False)` to pause the simulation while calling the image API and querying the desired physics state, ensuring that the physics state remains the same immediately after the image API call. +* If you are looking to query position and orientation information in sync with a call to one of the image APIs, you can use `client.simPause(True)` and `client.simPause(False)` to pause the simulation while calling the image API and querying the desired physics state, ensuring that the physics state remains the same immediately after the image API call. ### C++ @@ -119,17 +121,17 @@ int getStereoAndDepthImages() { } ``` -## Ready to Run Complete Examples +## Complete Examples ### Python -For a more complete ready to run sample code please see [sample code in AutonomySimClient project](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//multirotor/hello_drone.py) for multirotors or [HelloCar sample](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//car/hello_car.py). This code also demonstrates simple activities such as saving images in files or using `numpy` to manipulate images. 
+For more complete, ready-to-run sample code, please see the [sample code in the AutonomySimClient project](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//multirotor/hello_drone.py) for multirotors or the [HelloCar sample](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//car/hello_car.py). This code also demonstrates simple activities such as saving images to files or using `numpy` to manipulate images. ### C++ -For a more complete ready to run sample code please see [sample code in HelloDrone project](https://github.com/nervosys/AutonomySim/tree/main/HelloDrone//main.cpp) for multirotors or [HelloCar project](https://github.com/nervosys/AutonomySim/tree/main/HelloCar//main.cpp). +For more complete, ready-to-run sample code, please see the [sample code in the HelloDrone project](https://github.com/nervosys/AutonomySim/tree/master/HelloDrone//main.cpp) for multirotors or the [HelloCar project](https://github.com/nervosys/AutonomySim/tree/master/HelloCar//main.cpp). -See also [other example code](https://github.com/nervosys/AutonomySim/tree/main/Examples/DataCollection/StereoImageGenerator.hpp) that generates specified number of stereo images along with ground truth depth and disparity and saving it to [pfm format](pfm.md). +See also the [other example code](https://github.com/nervosys/AutonomySim/tree/master/Examples/DataCollection/StereoImageGenerator.hpp) that generates a specified number of stereo images along with ground-truth depth and disparity, saving them to [pfm format](pfm.md). ## Available Cameras @@ -138,6 +140,7 @@ These are the default cameras already available in each vehicle. Apart from thes ### Car The cameras on the car can be accessed by the following names in API calls: `front_center`, `front_right`, `front_left`, `fpv` and `back_center`. Here, the FPV camera is at the driver's head position in the car.
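As a supplement to the image tips above: the conversion that `AutonomySim.list_to_2d_float_array` presumably performs on the flat float64 list returned by `simGetImages` can be sketched with NumPy. This is a sketch of assumed behavior, not the library's actual implementation:

```python
import numpy as np

def list_to_2d_float_array(flat_data, width, height):
    # Reshape the flat list of float64 pixel values into a
    # (height, width) array matching the requested image dimensions.
    return np.reshape(np.asarray(flat_data, dtype=np.float64), (height, width))

# Dummy 2x3 "depth image" returned as a flat list of 6 floats
flat = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
img = list_to_2d_float_array(flat, width=3, height=2)
print(img.shape)  # (2, 3)
```

The reshaped array can then be written out with `AutonomySim.write_pfm()` or fed into any NumPy-based processing.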
+ ### Multirotor The cameras on the drone can be accessed by following names in API calls: `front_center`, `front_right`, `front_left`, `bottom_center` and `back_center`. @@ -163,7 +166,7 @@ To active this mode, edit [settings.json](settings.md) that you can find in your } ``` -[Here's the Python code example](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/cv_mode.py) to move camera around and capture images. +[Here's the Python code example](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/cv_mode.py) to move camera around and capture images. This mode was inspired from [UnrealCV project](http://unrealcv.org/). @@ -173,7 +176,7 @@ To move around the environment using APIs you can use `simSetVehiclePose` API. T ## Camera APIs -The `simGetCameraInfo` returns the pose (in world frame, NED coordinates, SI units) and FOV (in degrees) for the specified camera. Please see [example usage](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/cv_mode.py). +The `simGetCameraInfo` returns the pose (in world frame, NED coordinates, SI units) and FOV (in degrees) for the specified camera. Please see [example usage](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/cv_mode.py). The `simSetCameraPose` sets the pose for the specified camera while taking an input pose as a combination of relative position and a quaternion in NED frame. The handy `AutonomySim.to_quaternion()` function allows to convert pitch, roll, yaw to quaternion. For example, to set camera-0 to 15-degree pitch while maintaining the same position, you can use: @@ -191,9 +194,9 @@ All Camera APIs take in 3 common parameters apart from the API-specific ones, `c You can set stabilization for pitch, roll or yaw for any camera [using settings](settings.md#gimbal). -Please see [example usage](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/cv_mode.py). 
+Please see [example usage](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/cv_mode.py). -## Changing Resolution and Camera Parameters +## Changing Camera Resolution and Parameters To change resolution, FOV etc, you can use [settings.json](settings.md). For example, below addition in settings.json sets parameters for scene capture and uses "Computer Vision" mode described above. If you omit any setting then below default values will be used. For more information see [settings doc](settings.md). If you are using stereo camera, currently the distance between left and right is fixed at 25 cm. @@ -216,7 +219,7 @@ To change resolution, FOV etc, you can use [settings.json](settings.md). For exa } ``` -## What Does Pixel Values Mean in Different Image Types? +## What Pixel Values Mean in Different Image Types ### Available ImageType Values @@ -279,9 +282,9 @@ print(np.unique(img_rgb[:,:,1], return_counts=True)) #green print(np.unique(img_rgb[:,:,2], return_counts=True)) #blue ``` -A complete ready-to-run example can be found in [segmentation.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/segmentation.py). +A complete ready-to-run example can be found in [segmentation.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/segmentation.py). -#### Unsetting object ID +#### Unsetting an Object ID An object's ID can be set to -1 to make it not show up on the segmentation image. @@ -299,7 +302,7 @@ Once you decide on the meshes you are interested, note down their names and use At present the color for each object ID is fixed as in [this pallet](https://github.com/nervosys/AutonomySim/blob/main/Unreal/Plugins/AutonomySim/Content/HUDAssets/seg_color_palette.png). We will be adding ability to change colors for object IDs to desired values shortly. In the meantime you can open the segmentation image in your favorite image editor and get the RGB values you are interested in. 
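The `AutonomySim.to_quaternion()` helper mentioned with the camera APIs above presumably performs a standard Euler-angles-to-quaternion conversion. The sketch below uses the common roll-pitch-yaw convention and is an assumption, not code taken from the AutonomySim source:

```python
import math

def to_quaternion(pitch, roll, yaw):
    # Standard roll-pitch-yaw (body-frame Euler angles, in radians)
    # to quaternion (w, x, y, z) conversion.
    t0 = math.cos(yaw * 0.5)
    t1 = math.sin(yaw * 0.5)
    t2 = math.cos(roll * 0.5)
    t3 = math.sin(roll * 0.5)
    t4 = math.cos(pitch * 0.5)
    t5 = math.sin(pitch * 0.5)
    w = t0 * t2 * t4 + t1 * t3 * t5
    x = t0 * t3 * t4 - t1 * t2 * t5
    y = t0 * t2 * t5 + t1 * t3 * t4
    z = t1 * t2 * t4 - t0 * t3 * t5
    return (w, x, y, z)

# The 15-degree-pitch camera example: a rotation purely about the y axis
q = to_quaternion(math.radians(15), 0, 0)
```

With zero roll and yaw, the result reduces to `(cos(pitch/2), 0, sin(pitch/2), 0)`, which is a quick sanity check on the convention.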
-#### Startup Object IDs +#### Initial Object IDs At the start, AutonomySim assigns an object ID to each object found in the environment of type `UStaticMeshComponent` or `ALandscapeProxy`. It then takes either the mesh name or the owner name (depending on settings), lower-cases it, removes any characters below ASCII 97 to strip numbers and some punctuation, sums the integer values of the remaining characters, and takes the result modulo 255 to generate the object ID. In other words, all objects with the same alphabetic characters get the same object ID. This heuristic is simple and effective for many Unreal environments but may not be what you want. In that case, please use the above APIs to change object IDs to your desired values. There are a [few settings](settings.md#segmentation-settings) available to change this behavior. @@ -317,4 +320,4 @@ These image types return information about motion perceived by the point of view ## Example Code -A complete example of setting vehicle positions at random locations and orientations and then taking images can be found in [GenerateImageGenerator.hpp](https://github.com/nervosys/AutonomySim/tree/main/Examples/DataCollection/StereoImageGenerator.hpp). This example generates specified number of stereo images and ground truth disparity image and saving it to [pfm format](pfm.md). +A complete example of setting vehicle positions at random locations and orientations and then taking images can be found in [GenerateImageGenerator.hpp](https://github.com/nervosys/AutonomySim/tree/master/Examples/DataCollection/StereoImageGenerator.hpp). This example generates a specified number of stereo images and a ground-truth disparity image, saving them to [pfm format](pfm.md).
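The ID-generation heuristic described above can be sketched in a few lines. This mirrors the prose description; the exact implementation inside AutonomySim may differ:

```python
def initial_object_id(mesh_name):
    # Lower-case the name, drop characters with codes below ASCII 97 ('a')
    # (digits and most punctuation), then sum the remaining character
    # codes modulo 255 -- mirroring the heuristic described above.
    name = mesh_name.lower()
    return sum(ord(c) for c in name if ord(c) >= 97) % 255

# Names that differ only in digits or punctuation collapse to the same ID
print(initial_object_id("Cylinder9_2") == initial_object_id("Cylinder3"))  # True
```

This also illustrates why unrelated meshes whose names share the same letters can collide on an ID, and why the segmentation APIs exist to override the defaults.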
diff --git a/docs/image_sensors.md b/docs/image_sensors.md index b5b1ecb3..fc7b07a5 100644 --- a/docs/image_sensors.md +++ b/docs/image_sensors.md @@ -28,10 +28,10 @@ This is a tutorial for generating simulated thermal infrared (IR) images using A The pre-compiled Africa Environment can be downloaded from the Releases tab of this Github repo: [Windows Pre-compiled binary](https://github.com/nervosys/AutonomySim/releases/tag/v1.2.1) -To generate data, you may use two python files: [create_ir_segmentation_map.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/create_ir_segmentation_map.py) and [capture_ir_segmentation.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/capture_ir_segmentation.py). +To generate data, you may use two python files: [create_ir_segmentation_map.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/create_ir_segmentation_map.py) and [capture_ir_segmentation.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/capture_ir_segmentation.py). -* [create_ir_segmentation_map.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/create_ir_segmentation_map.py) uses temperature, emissivity, and camera response information to estimate the thermal digital count that could be expected for the objects in the environment, and then reassigns the segmentation IDs in AutonomySim to match these digital counts. It should be run before starting to capture thermal IR data. Otherwise, digital counts in the IR images will be incorrect. The camera response, temperature, and emissivity data are all included for the Africa environment. -* [capture_ir_segmentation.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//computer_vision/capture_ir_segmentation.py) is run after the segmentation IDs have been reassigned. 
It tracks objects of interest and records the infrared and scene images from the multirotor. It uses Computer Vision mode. +* [create_ir_segmentation_map.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/create_ir_segmentation_map.py) uses temperature, emissivity, and camera response information to estimate the thermal digital count that could be expected for the objects in the environment, and then reassigns the segmentation IDs in AutonomySim to match these digital counts. It should be run before starting to capture thermal IR data. Otherwise, digital counts in the IR images will be incorrect. The camera response, temperature, and emissivity data are all included for the Africa environment. +* [capture_ir_segmentation.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//computer_vision/capture_ir_segmentation.py) is run after the segmentation IDs have been reassigned. It tracks objects of interest and records the infrared and scene images from the multirotor. It uses Computer Vision mode. Details on how temperatures were estimated for plants and animals in the Africa environment, _et cetera_, can be found in [Bondi et al. (2018)](https://teamcore.seas.harvard.edu/publications/airsim-w-simulation-environment-wildlife-conservation-uavs-0). diff --git a/docs/index.md b/docs/index.md index 84e420ae..0cdb8384 100644 --- a/docs/index.md +++ b/docs/index.md @@ -1,20 +1,20 @@

- AutonomySim logo + AutonomySim logo

-# The simulation engine for autonomous systems +

The simulation engine for autonomous systems

## Tutorial -Visit [Getting Started](autonomysim_getting_started.md) to view a short tutorial on getting started with the platform and submitting your first simulation. +Visit [Getting Started](getting_started.md) to view a short tutorial on getting started with the platform and submitting your first simulation. ## Infrastructure -Visit [Infrastructure](autonomysim_infrastructure.md) to learn more about how the AutonomySim system runs your algorithm in the cloud. +Visit [Infrastructure](code_structure.md) to learn more about how the AutonomySim system runs your algorithm in the cloud. ## Features and Releases -Visit [Features and Releases](autonomysim_features.md) to view the features and release notes for this platform. +Visit [Features and Releases](features.md) to view the features and release notes for this platform. ## Repository @@ -22,7 +22,7 @@ Visit [Features and Releases](autonomysim_features.md) to view the features and


- xwerx logo + xwerx logo
TM 2024 © Nervosys, LLC

diff --git a/docs/lidar.md b/docs/lidar.md index f30881c3..759fd66e 100644 --- a/docs/lidar.md +++ b/docs/lidar.md @@ -96,7 +96,6 @@ By default, the LiDAR points are not drawn on the viewport. To enable the drawin ``` !!! note - Enabling `DrawDebugPoints` can cause excessive memory usage and crash in releases `v1.3.1`, `v1.3.0`. This has been fixed in the main branch and should work in later releases ## Client API @@ -116,10 +115,10 @@ Use `getLidarData()` API to retrieve the Lidar data. ### Python Examples -- [drone_lidar.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/drone_lidar.py) -- [car_lidar.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/car/car_lidar.py) -- [sensorframe_lidar_pointcloud.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/sensorframe_lidar_pointcloud.py) -- [vehicleframe_lidar_pointcloud.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/vehicleframe_lidar_pointcloud.py) +* [drone_lidar.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/drone_lidar.py) +* [car_lidar.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/car/car_lidar.py) +* [sensorframe_lidar_pointcloud.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/sensorframe_lidar_pointcloud.py) +* [vehicleframe_lidar_pointcloud.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/vehicleframe_lidar_pointcloud.py) ## Coming soon diff --git a/docs/log_viewer.md b/docs/log_viewer.md index f1256a40..0a3f9d50 100644 --- a/docs/log_viewer.md +++ b/docs/log_viewer.md @@ -1,6 +1,6 @@ # Log Viewer -The LogViewer is a Windows WPF app that presents the MavLink streams that it is getting from the Unreal Simulator. You can use this to monitor what is happening on the drone while it is flying. 
For example, the picture below shows a real time graph of the x, y an z gyro sensor information being generated by the simulator. +The `LogViewer` is a Windows WPF app that presents the MavLink streams it receives from the Unreal simulator. You can use it to monitor what is happening on the drone while it is flying. For example, the picture below shows a real-time graph of the x, y, and z gyro sensor information being generated by the simulator. ## Usage @@ -33,7 +33,6 @@ For this to work you need to configure the `settings.json` with the following se ``` !!! note - Do not use the "Logs" setting when you want realtime LogViewer logging. Logging to a file using "Logs" is mutually exclusive with LogViewer logging. Simply press the blue connector button on the top right corner of the window, select the Socket `tab`, enter the port number `14388`, and your `localhost` network. If you are using WSL 2 on Windows then select `vEthernet (WSL)`. diff --git a/docs/design.md b/docs/manuscripts.md similarity index 100% rename from docs/design.md rename to docs/manuscripts.md diff --git a/docs/mavlinkcom.md b/docs/mavlinkcom.md index 517d62e8..0c374164 100644 --- a/docs/mavlinkcom.md +++ b/docs/mavlinkcom.md @@ -1,6 +1,6 @@ -# Welcome to MavLinkCom +# MavLinkCom -MavLinkCom is a cross-platform C++ library that helps connect to and communicate with [MavLink](https://github.com/mavlink/mavlink) based vehicles. Specifically this library is designed to work well with [PX4](https://github.com/PX4/Firmware) based drones. +`MavLinkCom` is a cross-platform C++ library that helps connect to and communicate with [MavLink](https://github.com/mavlink/mavlink)-based vehicles. Specifically, this library is designed to work well with [PX4](https://github.com/PX4/Firmware)-based drones.
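For reference, the realtime-logging settings referred to above might look like the sketch below. The key names `LogViewerHostIp` and `LogViewerPort` are assumptions carried over from the AirSim-style settings schema; verify them against [settings.md](settings.md):

```json
{
  "SettingsVersion": 1.2,
  "LogViewerHostIp": "127.0.0.1",
  "LogViewerPort": 14388
}
```

The port here matches the `14388` socket port entered in the LogViewer connection dialog.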
## Design diff --git a/docs/mavlinkcom_mocap.md b/docs/mavlinkcom_mocap.md deleted file mode 100644 index 504c7e7c..00000000 --- a/docs/mavlinkcom_mocap.md +++ /dev/null @@ -1,24 +0,0 @@ -# Welcome to MavLinkMoCap - -This folder contains the MavLinkMoCap library which connects to a OptiTrack camera system for accurate indoor location. - -## Dependencies: - -* [OptiTrack Motive](http://www.optitrack.com/products/motive/). -* [MavLinkCom](mavlinkcom.md). - -### Setup RigidBody - -First you need to define a RigidBody named 'Quadrocopter' using Motive. See [Rigid_Body_Tracking](http://wiki.optitrack.com/index.php?title=Rigid_Body_Tracking). - -### MavLinkTest - -Use MavLinkTest to talk to your PX4 drone, with "-server:addr:port", for example, when connected to drone wifi use: - -* MavLinkMoCap -server:10.42.0.228:14590 "-project:D:\OptiTrack\Motive Project 2016-12-19 04.09.42 PM.ttp" - -This publishes the ATT_POS_MOCAP messages and you can proxy those through to the PX4 by running MavLinkTest on the dronebrain using: - -* MavLinkTest -serial:/dev/ttyACM0,115200 -proxy:10.42.0.228:14590 - -Now the drone will get the ATT_POS_MOCAP and you should see the light turn green meaning it is now has a home position and is ready to fly. diff --git a/docs/mavlinkmocap.md b/docs/mavlinkmocap.md new file mode 100644 index 00000000..21ce6c7b --- /dev/null +++ b/docs/mavlinkmocap.md @@ -0,0 +1,24 @@ +# MavLinkMoCap + +This folder contains the `MavLinkMoCap` library which connects to a OptiTrack camera system for accurate indoor location. + +## Dependencies: + +* [OptiTrack Motive](http://www.optitrack.com/products/motive/). +* [MavLinkCom](mavlinkcom.md). + +### Setup RigidBody + +First, you need to define a RigidBody named 'Quadrocopter' using `Motive`. See [Rigid_Body_Tracking](http://wiki.optitrack.com/index.php?title=Rigid_Body_Tracking). 
+ +### MavLinkTest + +Use MavLinkTest to talk to your PX4 drone, with `-server:addr:port`, for example, when connected to the drone's wifi use: + +* `MavLinkMoCap -server:10.42.0.228:14590 "-project:D:\OptiTrack\Motive Project 2016-12-19 04.09.42 PM.ttp"` + +This publishes the `ATT_POS_MOCAP` messages, and you can proxy those through to the PX4 by running MavLinkTest on the dronebrain using: + +* `MavLinkTest -serial:/dev/ttyACM0,115200 -proxy:10.42.0.228:14590` + +Now the drone will get the `ATT_POS_MOCAP` messages and you should see the light turn green, meaning it now has a home position and is ready to fly. diff --git a/docs/meshes.md b/docs/meshes.md index 5f8dbcfb..decf02ba 100644 --- a/docs/meshes.md +++ b/docs/meshes.md @@ -1,9 +1,10 @@ -# How to Access Meshes in AutonomySim +# Accessing Meshes `AutonomySim` supports the ability to access the static meshes that make up the scene. -## Mesh structure -Each mesh is represented with the below struct. +## Mesh Data Structure + +Each mesh is represented with the struct below: ```cpp struct MeshPositionVertexBuffersResponse { @@ -21,7 +22,7 @@ struct MeshPositionVertexBuffersResponse { * The x,y,z coordinates of the vertices are all stored in a single vector. This means the vertices vector is Nx3, where N is the number of vertices. * The positions of the vertices are the global positions in the Unreal coordinate system. This means they have already been transformed by the position and orientation. -## How to use +## Methods The API to get the meshes in the scene is quite simple. However, one should note that the function call is very expensive and should very rarely be called. In general this is fine, because this function only accesses the static meshes, which for most applications are not changing during the duration of your program.
diff --git a/docs/multi_vehicle.md b/docs/multi_vehicle.md index 1850bf0e..2d055bd4 100644 --- a/docs/multi_vehicle.md +++ b/docs/multi_vehicle.md @@ -1,6 +1,6 @@ # Multiple Vehicles -Since release 1.2, `AutonomySim` is fully enabled for multiple vehicles. This capability allows you to create multiple vehicles easily and use APIs to control them. +Since release 1.2, `AutonomySim` supports multiple vehicles. This capability allows you to create multiple vehicles easily and use APIs to control them. ## Creating Multiple Vehicles @@ -74,11 +74,11 @@ In the latest main branch of AutonomySim, the `simAddVehicle` API can be used to `simAddVehicle` takes in the following arguments: -- `vehicle_name`: Name of the vehicle to be created, this should be unique for each vehicle including any exisiting ones defined in the settings.json -- `vehicle_type`: Type of vehicle, e.g. "simpleflight". Currently only SimpleFlight, PhysXCar, ComputerVision are supported, in their respective SimModes. +* `vehicle_name`: Name of the vehicle to be created; this should be unique for each vehicle, including any existing ones defined in settings.json +* `vehicle_type`: Type of vehicle, e.g. "simpleflight". Currently only SimpleFlight, PhysXCar, and ComputerVision are supported, in their respective SimModes.
Other vehicle types, including PX4 and ArduPilot-related ones, aren't supported -- `pose`: Initial pose of the vehicle -- `pawn_path`: Vehicle blueprint path, default empty wbich uses the default blueprint for the vehicle type +* `pose`: Initial pose of the vehicle +* `pawn_path`: Vehicle blueprint path; default is empty, which uses the default blueprint for the vehicle type Returns: `bool` Whether the vehicle was created @@ -88,6 +88,6 @@ For some examples, check out [HelloSpawnedDrones.cpp](https://github.com/nervosy ![HelloSpawnedDrones](images/HelloSpawnedDrones.gif) -And [runtime_car.py](https://github.com/nervosys/AutonomySim/tree/main/PythonClient/car/runtime_car.py) - +And [runtime_car.py](https://github.com/nervosys/AutonomySim/tree/master/PythonClient/car/runtime_car.py) - ![runtime_car](images/simAddVehicle_Car.gif)
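Alongside the runtime `simAddVehicle` API, vehicles can also be declared up front in `settings.json`. The sketch below assumes AirSim-style keys (`SimMode`, `Vehicles`, `VehicleType`, and NED offsets `X`/`Y`/`Z`); check [settings.md](settings.md) for the authoritative schema:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Multirotor",
  "Vehicles": {
    "Drone1": { "VehicleType": "SimpleFlight", "X": 0, "Y": 0, "Z": -2 },
    "Drone2": { "VehicleType": "SimpleFlight", "X": 4, "Y": 0, "Z": -2 }
  }
}
```

Each key under `Vehicles` becomes a `vehicle_name` you can pass to the client APIs.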
## API -- Set mesh name to detect in wildcard format -```simAddDetectionFilterMeshName(camera_name, image_type, mesh_name, vehicle_name = '')``` -- Clear all mesh names previously added -```simClearDetectionMeshNames(camera_name, image_type, vehicle_name = '')``` -- Set detection radius in cm -```simSetDetectionFilterRadius(camera_name, image_type, radius_cm, vehicle_name = '')``` -- Get detections -```simGetDetections(camera_name, image_type, vehicle_name = '')``` +* Set the mesh name to detect in wildcard format: + + ```simAddDetectionFilterMeshName(camera_name, image_type, mesh_name, vehicle_name = '')``` + +* Clear all mesh names previously added: + + ```simClearDetectionMeshNames(camera_name, image_type, vehicle_name = '')``` + +* Set the detection radius in centimeters: + + ```simSetDetectionFilterRadius(camera_name, image_type, radius_cm, vehicle_name = '')``` + +* Get the detections: + + ```simGetDetections(camera_name, image_type, vehicle_name = '')``` The return value of `simGetDetections` is a `DetectionInfo` array: @@ -28,11 +33,11 @@ DetectionInfo relative_pose = Pose() ``` -## Usage example +## Usage -Python script [detection.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/detection/detection.py) shows how to set detection parameters and shows the result in OpenCV capture. +The Python script [`detection.py`](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/detection/detection.py) shows how to set detection parameters and shows the result in OpenCV capture. 
-A minimal example using API with Blocks environment to detect Cylinder objects: +A minimal example using API with Blocks environment to detect Cylinder objects is below: ```python camera_name = "0" @@ -50,28 +55,50 @@ detections = client.simClearDetectionMeshNames(camera_name, image_type) Output result: ```python -Cylinder: { 'box2D': { 'max': { 'x_val': 617.025634765625, - 'y_val': 583.5487060546875}, - 'min': { 'x_val': 485.74359130859375, - 'y_val': 438.33465576171875}}, - 'box3D': { 'max': { 'x_val': 4.900000095367432, - 'y_val': 0.7999999523162842, - 'z_val': 0.5199999809265137}, - 'min': { 'x_val': 3.8999998569488525, - 'y_val': -0.19999998807907104, - 'z_val': 1.5199999809265137}}, - 'geo_point': { 'altitude': 16.979999542236328, - 'latitude': 32.28772183970703, - 'longitude': 34.864785008379876}, +Cylinder: { 'name': 'Cylinder9_2', - 'relative_pose': { 'orientation': { 'w_val': 0.9929741621017456, - 'x_val': 0.0038591264747083187, - 'y_val': -0.11333247274160385, - 'z_val': 0.03381215035915375}, - 'position': { 'x_val': 4.400000095367432, - 'y_val': 0.29999998211860657, - 'z_val': 1.0199999809265137}}} + 'geo_point': { + 'altitude': 16.979999542236328, + 'latitude': 32.28772183970703, + 'longitude': 34.864785008379876 + }, + 'box2D': { + 'max': { + 'x_val': 617.025634765625, + 'y_val': 583.5487060546875 + }, + 'min': { + 'x_val': 485.74359130859375, + 'y_val': 438.33465576171875 + } + }, + 'box3D': { + 'max': { + 'x_val': 4.900000095367432, + 'y_val': 0.7999999523162842, + 'z_val': 0.5199999809265137 + }, + 'min': { + 'x_val': 3.8999998569488525, + 'y_val': -0.19999998807907104, + 'z_val': 1.5199999809265137 + } + }, + 'relative_pose': { + 'orientation': { + 'w_val': 0.9929741621017456, + 'x_val': 0.0038591264747083187, + 'y_val': -0.11333247274160385, + 'z_val': 0.03381215035915375 + }, + 'position': { + 'x_val': 4.400000095367432, + 'y_val': 0.29999998211860657, + 'z_val': 1.0199999809265137 + } + } +} ``` ![image](images/detection_ue4.png) 
-![image](images/detection_python.png) \ No newline at end of file +![image](images/detection_python.png) diff --git a/docs/orbit.md b/docs/orbit.md index cd71ecda..8aac3dee 100644 --- a/docs/orbit.md +++ b/docs/orbit.md @@ -1,10 +1,10 @@ -# An Orbit Trajectory +# Orbital Trajectories Moved here from [https://github.com/nervosys/AutonomySim/wiki/An-Orbit-Trajectory](https://github.com/nervosys/AutonomySim/wiki/An-Orbit-Trajectory) -Have you ever wanted to fly a nice smooth circular orbit? This can be handy for capturing 3D objects from all sides especially if you get multiple orbits at different altitudes. +Have you ever wanted to fly a nice smooth circular orbit? This can be handy for capturing 3-D objects from all sides, especially if you get multiple orbits at different altitudes. -So the `PythonClient/multirotor` folder contains a script named [Orbit](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/orbit.py) that will do precisely that. +The `PythonClient/multirotor` folder contains a script named [Orbit](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/orbit.py) that will do precisely that. See [demo video](https://youtu.be/RFG5CTQi3Us) @@ -14,8 +14,8 @@ The demo video was created by running this command line: python orbit.py --radius 10 --altitude 5 --speed 1 --center "0,1" --iterations 1 ``` -This flies a 10 meter radius orbit around the center location at (startpos + radius * [0,1]), in other words, the center is located `radius` meters away in the direction of the provided center vector. It also keeps the front-facing camera on the drone always pointing at the center of the circle. If you watch the flight using LogViewer you will see a nice circular pattern get traced out on the GPS map: +This flies a 10-meter radius orbit around the center location at `(startpos + radius * [0,1])`. In other words, the center is located `radius` meters away in the direction of the provided center vector. 
This also keeps the front-facing camera on the drone always pointing at the center of the circle. If you watch the flight using `LogViewer`, you will see a nice circular pattern get traced out on the GPS map:

![image](images/orbit.png)

-The core of the algorithm is not that complicated. At each point on the circle, we look ahead by a small delta in degrees, called the `lookahead_angle`, where that angle is computed based on our desired velocity. We then find that lookahead point on the circle using sin/cosine and make that our "target point". Calculating the velocity then is easy, just subtract our current position from that point and feed that into the AutonomySim method `moveByVelocityZ`.
+The core of the algorithm is uncomplicated. At each point on the circle, we look ahead by a small delta in degrees, called the `lookahead_angle`, with the angle computed from our desired velocity. We then find the lookahead point on the circle using sine/cosine and make that our target point. Calculating the velocity is then easy: subtract the current position from that point and feed the result into the AutonomySim method `moveByVelocityZ()`.
diff --git a/docs/pfm.md b/docs/pfm.md
index a7f6cf2c..3e5cbd64 100644
--- a/docs/pfm.md
+++ b/docs/pfm.md
@@ -1,7 +1,7 @@
-# pfm Format
+# PFM Image File Format

-Pfm (or Portable FloatMap) image format stores image as floating point pixels and hence are not restricted to usual 0-255 pixel value range. This is useful for HDR images or images that describes something other than colors like depth.
+The Portable FloatMap (PFM) file format stores images with floating-point pixels and hence is not restricted to the 8-bit unsigned integer value range of 0-255. This is useful for HDR images or images that describe something other than colors, such as depth.

-One of the good viewer to view this file format is [PfmPad](https://sourceforge.net/projects/pfmpad/). 
We don't recommend Maverick photo viewer because it doesn't seem to show depth images properly.
+A good viewer for this file format is [PfmPad](https://sourceforge.net/projects/pfmpad/). We do not recommend the `Maverick` photo viewer because it doesn't display depth images properly.

-AutonomySim has code to write pfm file for [C++](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/include/common/common_utils/Utils.hpp#L637) and read as well as write for [Python](https://github.com/nervosys/AutonomySim/tree/main/PythonClient//AutonomySim/utils.py#L122).
+`AutonomySim` provides code to write `pfm` files in [C++](https://github.com/nervosys/AutonomySim/blob/main/AutonomyLib/include/common/common_utils/Utils.hpp#L637) and to read and write `pfm` files in [Python](https://github.com/nervosys/AutonomySim/tree/master/PythonClient//AutonomySim/utils.py#L122).
diff --git a/docs/playback.md b/docs/playback.md
index 4b185582..2f6fd721 100644
--- a/docs/playback.md
+++ b/docs/playback.md
@@ -1,51 +1,39 @@
 # Playback

-AutonomySim supports playing back the high level commands in a *.mavlink log file that were recorded using the MavLinkTest app
-for the purpose of comparing real and simulated flight.
-The [recording.mavlink](logs/recording.mavlink) is an example of a log file captured using a real drone using the following
-command line:
+`AutonomySim` supports playing back the high-level commands in a `.mavlink` log file that were recorded using the `MavLinkTest` application, for the purpose of comparing real and simulated flight. The log [recording.mavlink](logs/recording.mavlink) is an example captured from a real drone with the following command:

```shell
MavLinkTest -serial:/dev/ttyACM0,115200 -logdir:.
```

-Then the log file contains the commands performed, which included several "orbit" commands, the resulting GPS map of the flight
-looks like this:
+The log file contains the commands performed, which included several "orbit" commands. 
The resulting GPS map of the flight looks like this:

![real flight](images/RealFlight.png)

## Side-by-side comparison

-Now we can copy the *.mavlink log file recorded by MavLinkTest to the PC running the Unreal simulator with AutonomySim plugin.
-When the Simulator is running and the drone is parked in a place in a map that has room to do the same maneuvers we can run this
-MavLinkTest command line:
+Now we can copy the *.mavlink log file recorded by MavLinkTest to the PC running the Unreal simulator with the AutonomySim plugin. When the simulator is running and the drone is parked in a place on the map that has room for the same maneuvers, we can run this MavLinkTest command line:

```shell
MavLinkTest -server:127.0.0.1:14550
```

-This should connect to the simulator.  Now you can enter this command:
+This should connect to the simulator. Now you can enter this command:

```shell
PlayLog recording.mavlink
```

-The same commands you performed on the real drone will now play again in the simulator. You can then press 't' to see
-the trace, and it will show you the trace of the real drone and the simulated drone. Every time you press 't' again
-you can reset the lines so they are sync'd to the current position, this way I was able to capture a side-by-side trace of the
-"orbit" command performed in this recording, which generates the picture below. The pink line is the simulated
-flight and the red line is the real flight:
+The same commands you performed on the real drone will now play again in the simulator. You can then press `t` to see the trace, and it will show you the trace of the real drone and the simulated drone. Every time you press `t` again, you can reset the lines so they are synched to the current position. This way, we captured a side-by-side trace of the "orbit" command performed in this recording, which generates the picture below. 
The pink line is the simulated flight and the red line is the real flight:

![playback](images/Playback.png)

-Note: I'm using the ';' key in the simulator to take control of camera position using keyboard to get this shot.
+!!! note
+    We used the `;` key in the simulator to take keyboard control of the camera position for this shot.

## Parameters

-It may help to set the simulator up with some of the same flight parameters that your real drone is using, for example,
-in my case I was using a lower than normal cruise speed, slow takeoff speed, and it helps to tell the simulator to
-wait a long time before disarming (COM_DISARM_LAND) and to turn off the safety switches NAV_RCL_ACT and NAV_DLL_ACT
-(`don't` do that on a real drone).
+It may help to set the simulator up with some of the same flight parameters that your real drone is using. For example, in our case we used a lower-than-normal cruise speed and a slow takeoff speed. It also helps to tell the simulator to wait a long time before disarming (`COM_DISARM_LAND`) and to turn off the safety switches `NAV_RCL_ACT` and `NAV_DLL_ACT` (do not do that on a real drone).

```shell
param MPC_XY_CRUISE 2
@@ -55,4 +43,3 @@ param COM_DISARM_LAND 60
 param NAV_RCL_ACT 0
 param NAV_DLL_ACT 0
 ```
-
diff --git a/docs/plugin_contents.md b/docs/plugin_contents.md
new file mode 100644
index 00000000..66cb96c5
--- /dev/null
+++ b/docs/plugin_contents.md
@@ -0,0 +1,8 @@
+# Plugin Contents
+
+Plugin contents are not shown in `Unreal` projects by default. To view plugin content, you need to click on a few semi-hidden buttons:
+
+![plugin contents screenshot](images/plugin_contents.png)
+
+!!! caution
+    Changes you make in the content folder are changes to binary files, so be careful.
diff --git a/docs/point_clouds.md b/docs/point_clouds.md index c9ce283c..4d2f51e8 100644 --- a/docs/point_clouds.md +++ b/docs/point_clouds.md @@ -2,16 +2,17 @@ Moved here from [https://github.com/nervosys/AutonomySim/wiki/Point-Clouds](https://github.com/nervosys/AutonomySim/wiki/Point-Clouds) -A Python script [point_cloud.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/point_cloud.py) shows how to convert the depth image returned from AutonomySim into a point cloud. +A Python script [point_cloud.py](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/point_cloud.py) shows how to convert the depth image returned from `AutonomySim` into a point cloud. -The following depth image was captured using the Modular Neighborhood environment: +The following depth image was captured using the `Modular Neighborhood` environment: ![depth](images/depth.png) -And with the appropriate projection matrix, the OpenCV `reprojectImageTo3D` function can turn this into a point cloud. The following is the result, which is also available here: [https://skfb.ly/68r7y](https://skfb.ly/68r7y). +And with the appropriate projection matrix, the OpenCV `reprojectImageTo3D` function can turn this into a point cloud. The following is the result, which is also available here: [https://skfb.ly/68r7y](https://skfb.ly/68r7y). ![depth](images/point_cloud.png) [SketchFab](https://sketchfab.com) can upload the resulting file `cloud.asc` and render it for you. -PS: you may notice the scene is reflected on the Y axis, so I may have a sign wrong in the projection matrix. An exercise for the reader :-) +!!! warning + You may notice the scene is reflected on the y-axis, so we may have a sign wrong in the projection matrix. 
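If you prefer to see the math that `reprojectImageTo3D` hides, here is a small NumPy sketch of the equivalent pinhole back-projection. The intrinsics used here (`f`, `cx`, `cy`) are made-up placeholder values, not AutonomySim camera defaults:

```python
import numpy as np

def depth_to_points(depth, f, cx, cy):
    """Back-project a depth image (meters) into an N x 3 array of camera-frame points.

    This is the pinhole model that reprojectImageTo3D applies via its Q matrix:
    X = (u - cx) * Z / f, Y = (v - cy) * Z / f, Z = depth.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinate grids
    x = (u - cx) * depth / f
    y = (v - cy) * depth / f
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Toy 2x2 depth image, 5 m everywhere, with assumed (placeholder) intrinsics.
points = depth_to_points(np.full((2, 2), 5.0), f=100.0, cx=0.5, cy=0.5)
```

Saving `points` with `numpy.savetxt("cloud.asc", points)` gives the same kind of plain-text `.asc` file mentioned above.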
diff --git a/docs/use_precompiled.md b/docs/project_binaries.md similarity index 97% rename from docs/use_precompiled.md rename to docs/project_binaries.md index 14ebb2a1..dc156694 100644 --- a/docs/use_precompiled.md +++ b/docs/project_binaries.md @@ -1,8 +1,8 @@ -# Download Binaries +# Project Binaries You can simply download precompiled binaries and run to get started immediately. If you want to set up your own Unreal environment then please see [these instructions](https://github.com/nervosys/AutonomySim/#how-to-get-it). -### Unreal Engine +## Unreal Engine **Windows, Linux**: Download the binaries for the environment of your choice from the [latest release](https://github.com/nervosys/AutonomySim/releases). @@ -11,7 +11,7 @@ Use [7zip](https://www.7-zip.org/download.html) to unzip these files. On Linux, **macOS**: You will need to [build it yourself](build_linux.md) -### Unity (Experimental) +## Unity (Experimental) A free environment called Windridge City is available at [Unity Asset Store](https://assetstore.unity.com/) as an experimental release of AutonomySim on Unity. **Note**: This is an old release, and many of the features and APIs might not work. diff --git a/docs/px4_build.md b/docs/px4_build.md index 58230e8d..fce79d3c 100644 --- a/docs/px4_build.md +++ b/docs/px4_build.md @@ -11,28 +11,23 @@ bash ./PX4-Autopilot/Tools/setup/ubuntu.sh --no-sim-tools cd PX4-Autopilot ``` -Now to build it you will need the right tools. +To build it, you will need the right tools. ## PX4 Build tools -The full instructions are available on the [dev.px4.io](https://docs.px4.io/master/en/dev_setup/building_px4.html) website, -but we've copied the relevant subset of those instructions here for your convenience. +The full instructions are available on the [dev.px4.io](https://docs.px4.io/master/en/dev_setup/building_px4.html) website, but we've copied the relevant subset of those instructions here for your convenience. 
-(Note that [BashOnWindows](https://msdn.microsoft.com/en-us/commandline/wsl/install_guide)) can be used to build
-the PX4 firmware, just follow the BashOnWindows instructions at the bottom of this page) then proceed with the
-Ubuntu setup for PX4.
+Note that [BashOnWindows](https://msdn.microsoft.com/en-us/commandline/wsl/install_guide) can be used to build the PX4 firmware; just follow the BashOnWindows instructions at the bottom of this page, then proceed with the Ubuntu setup for PX4.

## Build SITL version

-Now you can make the SITL version that runs in posix, from the Firmware folder you created above:
+Now you can build the SITL version, which runs on POSIX, from the Firmware folder you created above:

```shell
make px4_sitl_default none_iris
```

-Note: this build system is quite special, it knows how to update git submodules (and there's a lot
-of them), then it runs cmake (if necessary), then it runs the build itself. So in a way the root
-Makefile is a meta-meta makefile :-) You might see prompts like this:
+This build system is quite special: it knows how to update git submodules (and there are a lot of them), then it runs `cmake` (if necessary), then it runs the build itself. In a way, the root Makefile is a meta-meta Makefile. You might see prompts like this:

```shell
*******************************************************************************
@@ -42,16 +37,14 @@ Makefile is a meta-meta makefile :-) You might see prompts like this:
 * and git submodule update --init --recursive ) *
 *******************************************************************************
 ```
-Every time you see this prompt type 'u' on your keyboard.
-It shouldn't take long, about 2 minutes. If all succeeds, the last line will link the `px4` app,
-which you can then run using the following:
+Every time you see this prompt, press `u` on your keyboard. The build should take only about 2 minutes. 
If everything succeeds, the last line will link the `px4` application, which you can then run using the following command:

```shell
make px4_sitl_default none_iris
```

-And you should see output that looks like this:
+Now, you should see output that looks like this:

```shell
creating new parameters file
@@ -79,16 +72,11 @@ INFO [dataman] Unkown restart, data manager file 'rootfs/fs/microsd/dataman' si
 CAL_MAG0_ID: curr: 0 -> new: 196616
```

-so this is good, first run sets up the px4 parameters for SITL mode. Second run has less output.
-This app is also an interactive console where you can type commands. Type 'help' to see what they
-are and just type ctrl-C to kill it. You can do that and restart it any time, that's a great way to
-reset any wonky state if you need to (it's equivalent to a Pixhawk hardware reboot).
+The first run sets up the PX4 parameters for SITL mode. The second run has less output. This application is also an interactive console where you can type commands. Type `help` to see what they are, and press `Ctrl+C` to kill it. You can stop and restart it at any time; that is a great way to reset any wonky state (it is equivalent to a Pixhawk hardware reboot).

## ARM embedded tools

-If you plan to build the PX4 firmware for real Pixhawk hardware then you will need the gcc
-cross-compiler for ARM Cortex-M4 chipset. You can get this compiler by PX4 DevGuide, specifically
-this is in their `ubuntu_sim_nuttx.sh` setup script.
+If you plan to build the PX4 firmware for real Pixhawk hardware, then you will need the `gcc` cross-compiler for the ARM Cortex-M4 chipset. You can get this compiler from the PX4 DevGuide; specifically, it is in their `ubuntu_sim_nuttx.sh` setup script.

After following those setup instructions you can verify the install by entering this command `arm-none-eabi-gcc --version`. You should see the following output:

@@ -101,23 +89,19 @@ warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
## Build PX4 for ARM hardware

-Now you can build the PX4 firmware for running on real pixhawk hardware:
+Now, you can build the PX4 firmware to run on real Pixhawk hardware:

```shell
make px4_fmu-v4
```

-This build will take a little longer because it is building a lot more including the NuttX real time OS,
-all the drivers for the sensors in the Pixhawk flight controller, and more. It is also running the compiler
-in super size-squeezing mode so it can fit all that in a 1 megabyte ROM !!
+This build will take a little longer because it is building a lot more, including the NuttX real-time OS, all the drivers for the sensors in the Pixhawk flight controller, and more. It is also running the compiler in super size-squeezing mode so it can fit all that in a 1-megabyte ROM!

-One nice tid bit is you can plug in your pixhawk USB, and type `make px4fmu-v2_default upload` to flash the
-hardware with these brand new bits, so you don't need to use QGroundControl for that.
+One nice tidbit: you can plug in your Pixhawk over USB and type `make px4fmu-v2_default upload` to flash the hardware with these brand-new bits, so you don't need to use QGroundControl for that.

## Some Useful Parameters

-PX4 has many customizable parameters (over 700 of them, in fact) and to get best results with AutonomySim we have
-found the following parameters are handy:
+PX4 has many customizable parameters (over 700 of them, in fact), and to get the best results with AutonomySim, we have found the following parameters handy:

```cpp
// be sure to enable the new position estimator module:
diff --git a/docs/px4_lockstep.md b/docs/px4_lockstep.md
index f9004bca..cd998f6c 100644
--- a/docs/px4_lockstep.md
+++ b/docs/px4_lockstep.md
@@ -1,29 +1,29 @@
-# LockStep
+# PX4 Lockstep

-The latest version of PX4 supports a new [lockstep feature](https://docs.px4.io/master/en/simulation/#lockstep-simulation) when communicating with the simulator over TCP. 
Lockstep is an important feature because it synchronizes PX4 and the simulator so they essentially use the same clock time. This makes PX4 behave normally even during unusually long delays in Simulator performance.
+The latest version of PX4 supports a new [lockstep feature](https://docs.px4.io/master/en/simulation/#lockstep-simulation) when communicating with the simulator over TCP. Lockstep is an important feature because it synchronizes PX4 and the simulator so they essentially use the same clock time. This makes PX4 behave normally even during unusually long delays in simulator performance.

It is recommended that when you are running a lockstep enabled version of PX4 in SITL mode that you tell AutonomySim to use a `SteppableClock`, and set `UseTcp` to `true` and `LockStep` to `true`.

```json
-    {
-        "SettingsVersion": 1.2,
-        "SimMode": "Multirotor",
-        "ClockType": "SteppableClock",
-        "Vehicles": {
-            "PX4": {
-                "VehicleType": "PX4Multirotor",
-                "UseTcp": true,
-                "LockStep": true,
-                ...
+{
+    "SettingsVersion": 1.2,
+    "SimMode": "Multirotor",
+    "ClockType": "SteppableClock",
+    "Vehicles": {
+        "PX4": {
+            "VehicleType": "PX4Multirotor",
+            "UseTcp": true,
+            "LockStep": true,
+            ...
```

-This causes AutonomySim to not use a "realtime" clock, but instead it advances the clock in step which each sensor update sent to PX4. This way PX4 thinks time is progressing smoothly no matter how long it takes AutonomySim to really process that update loop.
+This causes AutonomySim to not use a real-time clock; instead, it advances the clock in step with each sensor update sent to PX4. This way, PX4 thinks time is progressing smoothly no matter how long it takes AutonomySim to actually process that update loop.

This has the following advantages:

-- AutonomySim can be used on slow machines that cannot process updates quickly.
-- You can debug AutonomySim and hit a breakpoint, and when you resume PX4 will behave normally. 
-- You can enable very slow sensors like the Lidar with large number of simulated points, and PX4 +* AutonomySim can be used on slow machines that cannot process updates quickly. +* You can debug AutonomySim and hit a breakpoint, and when you resume PX4 will behave normally. +* You can enable very slow sensors like the Lidar with large number of simulated points, and PX4 will still behave normally. There will be some side effects to `lockstep`, namely, slower update loops caused by running AutonomySim on an underpowered machine or from expensive sensors (like Lidar) will create some visible jerkiness in the simulated flight if you look at the updates on screen in realtime. @@ -43,15 +43,14 @@ If you are running PX4 in cygwin, there is an [open issue with lockstep](https:/ ``` 4. Disable it in AutonomySim by setting `LockStep` to `false` and either removing any `"ClockType": "SteppableClock"` setting or resetting `ClockType` back to default: - ```json - { - ... - "ClockType": "", - "Vehicles": { - "PX4": { - "VehicleType": "PX4Multirotor", - "LockStep": false, - ... - ``` -5. Now you can run PX4 SITL as you normally would (`make px4_sitl_default none_iris`) and it will use -the host system time without waiting on AutonomySim. +```json + { + ... + "ClockType": "", + "Vehicles": { + "PX4": { + "VehicleType": "PX4Multirotor", + "LockStep": false, + ... +``` +5. Now you can run PX4 SITL as you normally would (`make px4_sitl_default none_iris`) and it will use the host system time without waiting on AutonomySim. diff --git a/docs/px4_multi_vehicle.md b/docs/px4_multi_vehicle.md index b409620e..9e9ee94d 100644 --- a/docs/px4_multi_vehicle.md +++ b/docs/px4_multi_vehicle.md @@ -63,7 +63,7 @@ You should also be able to use QGroundControl with SITL mode. 
Make sure there is

## Starting SITL instances with PX4 console

-If you want to start your SITL instances while being able to view the PX4 console, you will need to run the shell scripts found [here](https://github.com/nervosys/AutonomySim/tree/main/PX4Scripts) rather than `sitl_multiple_run.sh`.
+If you want to start your SITL instances while being able to view the PX4 console, you will need to run the shell scripts found [here](https://github.com/nervosys/AutonomySim/tree/master/PX4Scripts) rather than `sitl_multiple_run.sh`.

Here is how you would do so:
diff --git a/docs/px4_setup.md b/docs/px4_setup.md
index 8af116be..0c313568 100644
--- a/docs/px4_setup.md
+++ b/docs/px4_setup.md
@@ -1,8 +1,9 @@
-# PX4 Setup for AutonomySim
+# PX4 Setup

-The [PX4 software stack](http://github.com/px4/firmware) is an open source very popular flight controller with support for wide variety of boards and sensors as well as built-in capability for higher level tasks such as mission planning. Please visit [px4.io](http://px4.io) for more information.
+The [PX4 software stack](http://github.com/px4/firmware) is a popular open-source flight controller with support for a wide variety of boards and sensors, as well as built-in capability for higher-level tasks such as mission planning. Please visit [px4.io](http://px4.io) for more information.

-**Warning**: While all releases of AutonomySim are always tested with PX4 to ensure the support, setting up PX4 is not a trivial task. Unless you have at least intermediate level of experience with PX4 stack, we recommend you use [simple_flight](simple_flight.md), which is now a default in AutonomySim.
+!!! warning
+    While all releases of AutonomySim are tested with PX4 to ensure support, setting up PX4 is not a trivial task. Unless you have at least an intermediate level of experience with the PX4 stack, we recommend you use [simple_flight](simple_flight.md), which is now the default in AutonomySim.
## Supported Hardware

@@ -60,10 +61,7 @@ See also [initial firmware setup video](https://docs.px4.io/master/en/config/).
 }
 ```

-Notice the PX4 `[simulator]` is using TCP, which is why we need to add: `"UseTcp": true,`. Notice we
-are also enabling `LockStep`, see [PX4 LockStep](px4_lockstep.md) for more information. The
-`Barometer` setting keeps PX4 happy because the default AutonomySim barometer has a bit too much noise
-generation. This setting clamps that down a bit which allows PX4 to achieve GPS lock more quickly.
+Notice the PX4 `[simulator]` is using TCP, which is why we need to add `"UseTcp": true`. Notice we are also enabling `LockStep`; see [PX4 LockStep](px4_lockstep.md) for more information. The `Barometer` setting keeps PX4 happy because the default AutonomySim barometer generates a bit too much noise. This setting clamps that down a bit, which allows PX4 to achieve GPS lock more quickly.

After above setup you should be able to use a remote control (RC) to fly with AutonomySim. You can usually arm the vehicle by lowering and bringing two sticks of RC together down and in-wards. You don't need QGroundControl after the initial setup. Typically the Stabilized (instead of Manual) mode gives better experience for beginners. See [PX4 Basic Flying Guide](https://docs.px4.io/master/en/flying/basic_flying.html).

@@ -77,7 +75,7 @@ The PX4 SITL mode doesn't require you to have separate device such as a Pixhawk

## FAQ

-#### Drone doesn't fly properly, it just goes "crazy".
+### Drone doesn't fly properly, it just goes "crazy"

There are a few reasons that can cause this. First, make sure your drone doesn't fall down large distance when starting the simulator. This might happen if you have created a custom Unreal environment and Player Start is placed too high above the ground. It seems that when this happens internal calibration in PX4 gets confused. 
@@ -85,20 +83,19 @@ You should [also use QGroundControl](#setting-up-px4-hardware-in-loop) and make Finally, this also can be a machine performance issue in some rare cases, check your [hard drive performance](hard_drive.md). -#### Can I use Arducopter or other MavLink implementations? +### Can I use Arducopter or other MavLink implementations? Our code is tested with the [PX4 firmware](https://dev.px4.io/). We have not tested Arducopter or other mavlink implementations. Some of the flight API's do use the PX4 custom modes in the MAV_CMD_DO_SET_MODE messages (like PX4_CUSTOM_MAIN_MODE_AUTO) -#### It is not finding my Pixhawk hardware +### It is not finding my Pixhawk hardware Check your settings.json file for this line "SerialPort":"*,115200". The asterisk here means "find any serial port that looks like a Pixhawk device, but this doesn't always work for all types of Pixhawk hardware. So on Windows you can find the actual COM port using Device Manager, look under "Ports (COM & LPT), plug the device in and see what new COM port shows up. Let's say you see a new port named "USB Serial Port (COM5)". Well, then change the SerialPort setting to this: "SerialPort":"COM5,115200". On Linux, the device can be found by running "ls /dev/serial/by-id" if you see a device name listed that looks like this `usb-3D_Robotics_PX4_FMU_v2.x_0-if00` then you can use that name to connect, like this: `"SerialPort":"/dev/serial/by-id/usb-3D_Robotics_PX4_FMU_v2.x_0-if00"`. Note that this long name is actually a symbolic link to the real name, if you use `"ls -l ..."` you can find that symbolic link, it is usually something like `"/dev/ttyACM0"`, so this will also work `"SerialPort":"/dev/ttyACM0,115200"`. But that mapping is similar to windows, it is automatically assigned and can change, whereas the long name will work even if the actual TTY serial device mapping changes. 
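The Linux lookup described above is easy to script. The helper below is hypothetical (not part of AutonomySim); it globs the stable `by-id` names and resolves each symlink to the short tty name the kernel assigned this boot:

```python
import glob
import os

def find_px4_serial(by_id_dir="/dev/serial/by-id", pattern="usb-*PX4*"):
    """Return (stable_name, resolved_tty) pairs for PX4-like serial devices.

    Each by-id entry is a symlink to the short kernel name (e.g. /dev/ttyACM0),
    so resolving it shows which port was assigned on this boot.
    """
    matches = sorted(glob.glob(os.path.join(by_id_dir, pattern)))
    return [(path, os.path.realpath(path)) for path in matches]

for stable, tty in find_px4_serial():
    # The stable name is the safer value to put in settings.json.
    print(f'"SerialPort": "{stable},115200"  # currently {tty}')
```

The `usb-*PX4*` glob is only a guess at how your device advertises itself; adjust it to whatever `ls /dev/serial/by-id` actually shows.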
-#### WARN [commander] Takeoff denied, disarm and re-try
+### WARN [commander] Takeoff denied, disarm and re-try

-This happens if you try and take off when PX4 still has not computed the home position. PX4 will report the home
-position once it is happy with the GPS signal, and you will see these messages:
+This happens if you try to take off when PX4 has not yet computed the home position. PX4 will report the home position once it is happy with the GPS signal, and you will see these messages:

```shell
INFO [commander] home: 47.6414680, -122.1401672, 119.99
@@ -107,16 +104,14 @@ INFO [tone_alarm] home_set

Up until this point in time, however, the PX4 will reject takeoff commands.

-#### When I tell the drone to do something it always lands
+### When I tell the drone to do something it always lands

-For example, you use DroneShell `moveToPosition -z -20 -x 50 -y 0` which it does, but when it gets to the target location the
-drone starts to land. This is the default behavior of PX4 when offboard mode completes. To set the drone to hover instead
-set this PX4 parameter:
+For example, you use the DroneShell command `moveToPosition -z -20 -x 50 -y 0`, which it does, but when it gets to the target location, the drone starts to land. This is the default behavior of PX4 when offboard mode completes. To set the drone to hover instead, set this PX4 parameter:

```shell
param set COM_OBL_ACT 1
```

-#### I get message length mismatches errors
+### I get message length mismatch errors

You might need to set MAV_PROTO_VER parameter in QGC to "Always use version 1". Please see [this issue](https://github.com/nervosys/AutonomySim/issues/546) more details.
diff --git a/docs/px4_sitl.md b/docs/px4_sitl.md
index 24e63da7..1b75658a 100644
--- a/docs/px4_sitl.md
+++ b/docs/px4_sitl.md
@@ -1,10 +1,11 @@
 # Setting up PX4 Software-in-Loop

-The [PX4](http://dev.px4.io) software provides a "software-in-loop" simulation (SITL) version of their stack that runs in Linux. 
If you are on Windows then you can use the [Cygwin Toolchain](https://dev.px4.io/master/en/setup/dev_env_windows_cygwin.html) or you can use the [Windows subsystem for Linux](https://docs.microsoft.com/en-us/windows/wsl/install-win10) and follow the PX4 Linux toolchain setup.
+The [PX4](http://dev.px4.io) software provides a software-in-the-loop (SITL) simulation mode of their stack that runs on Linux. If you are on Windows, you can use the [Cygwin Toolchain](https://dev.px4.io/master/en/setup/dev_env_windows_cygwin.html) or the [Windows Subsystem for Linux](https://docs.microsoft.com/en-us/windows/wsl/install-win10) and follow the PX4 Linux toolchain setup.

-If you are using WSL2 please read these [additional instructions](px4_sitl_wsl2.md).
+If you are using WSL2, please read these [additional instructions](px4_sitl_wsl2.md).

-**Note** that every time you stop the unreal app you have to restart the `px4` app.
+!!! note
+    Every time you stop Unreal, you have to restart `px4`.

1. From your bash terminal follow [these steps for Linux](https://docs.px4.io/master/en/dev_setup/dev_env_linux.html) and follow **all** the instructions under `NuttX based hardware` to install prerequisites. We've also included our own copy of the [PX4 build instructions](px4_build.md) which is a bit more concise about what we need exactly.

@@ -34,10 +35,9 @@ If you are using WSL2 please read these [additional instructions](px4_sitl_wsl2.
    INFO  [mavlink] mode: Normal, data rate: 4000000 B/s on udp port 14570 remote port 14550
    INFO  [mavlink] mode: Onboard, data rate: 4000000 B/s on udp port 14580 remote port 14540
    ```
+    This is an interactive PX4 console; type `help` to see the list of commands you can enter here. They are mostly low-level PX4 commands, but some of them can be useful for debugging.

-    Note: this is also an interactive PX4 console, type `help` to see the list of commands you can enter here. 
They are mostly low level PX4 commands, but some of them can be useful for debugging. - -5. Now edit [AutonomySim settings](settings.md) file to make sure you have matching UDP and TCP port settings: +5. Edit the [AutonomySim settings](settings.md) file to make sure you have matching UDP and TCP port settings: ```json { "SettingsVersion": 1.2, @@ -70,7 +70,7 @@ If you are using WSL2 please read these [additional instructions](px4_sitl_wsl2. } } ``` - Notice the PX4 `[simulator]` is using TCP, which is why we need to add: `"UseTcp": true,`. Notice we are also enabling `LockStep`, see [PX4 LockStep](px4_lockstep.md) for more information. The `Barometer` setting keeps PX4 happy because the default AutonomySim barometer has a bit too much noise generation. This setting clamps that down a bit which allows PX4 to achieve GPS lock more quickly. + Notice the PX4 `[simulator]` is using TCP, which is why we need to add: `"UseTcp": true,`. Notice we are also enabling `LockStep`, see [PX4 LockStep](px4_lockstep.md) for more information. The `Barometer` setting keeps PX4 happy because the default AutonomySim barometer has a bit too much noise generation. This setting clamps that down a bit which allows PX4 to achieve GPS lock more quickly. 6. Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration. @@ -95,9 +95,7 @@ Notice the above settings are provided in the `params` section of the `settings. "LPE_LON": -122.140165, ``` -PX4 SITL mode needs to be configured to get the home location correct. The home location needs to be set to the same coordinates defined in [OriginGeopoint](settings.md#origingeopoint). - -You can also run the following in the SITL PX4 console window to check that these values are set correctly. +PX4 SITL mode needs to be configured to get the home location correct. The home location needs to be set to the same coordinates defined in [OriginGeopoint](settings.md#origingeopoint). 
You can also run the following in the SITL PX4 console window to check that these values are set correctly:

```shell
param show LPE_LAT
@@ -109,7 +107,7 @@ param show LPE_LON
```

Notice the above setting is provided in the `params` section of the `settings.json` file:

```json
-    "COM_OBL_ACT": 1
+"COM_OBL_ACT": 1
```

This tells the drone to automatically hover after each offboard control command finishes (the default setting is to land). Hovering is a smoother transition between multiple offboard commands. You can check this setting by running the following PX4 console command:

@@ -141,17 +139,17 @@ Local position: x=-0.0326988, y=0.00656854, z=5.48506

If the z coordinate is large like this then takeoff might not work as expected. Resetting the SITL and simulation should fix that problem.

-## WSL 2
+## WSL2

-Windows Subsystem for Linux version 2 operates in a Virtual Machine. This requires additional setup - see [additional instructions](px4_sitl_wsl2.md).
+Windows Subsystem for Linux (WSL) version 2 operates in a virtual machine. This requires additional setup; see [additional instructions](px4_sitl_wsl2.md).

## No Remote Control

Notice the above setting is provided in the `params` section of the `settings.json` file:

```json
-    "NAV_RCL_ACT": 0,
-    "NAV_DLL_ACT": 0,
+"NAV_RCL_ACT": 0,
+"NAV_DLL_ACT": 0,
```

This is required if you plan to fly the SITL mode PX4 with no remote control, just using python scripts, for example. These parameters stop the PX4 from triggering "failsafe mode on" every time a move command is finished. You can use the following PX4 command to check these values are set correctly:

@@ -161,7 +159,8 @@ param show NAV_RCL_ACT
param show NAV_DLL_ACT
```

-NOTE: Do `NOT` do this on a real drone as it is too dangerous to fly without these failsafe measures.
+!!! note
+    Do not do this on a real drone as it is too dangerous to fly without these failsafe measures.
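Taken together, the PX4 parameters discussed on this page all belong in the `params` section of the vehicle entry in `settings.json`. A combined sketch, using only values shown above (adjust `LPE_LAT`/`LPE_LON` to match your own `OriginGeopoint`):

```json
"params": {
    "NAV_RCL_ACT": 0,
    "NAV_DLL_ACT": 0,
    "COM_OBL_ACT": 1,
    "LPE_LAT": 47.641468,
    "LPE_LON": -122.140165
}
```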
## Manually set parameters

@@ -182,4 +181,4 @@ If you want to run the above posix_sitl in a `VirtualBox Ubuntu` machine then it

## Remote Controller

-There are several options for flying the simulated drone using a remote control or joystick like xbox gamepad. See [remote controllers](remote_control.md#rc-setup-for-px4)
+There are several options for flying the simulated drone using a remote control or a joystick, such as an Xbox gamepad. See [remote controllers](remote_control.md#rc-setup-for-px4).
diff --git a/docs/px4_sitl_wsl2.md b/docs/px4_sitl_wsl2.md
index 03f24259..b4b6eb84 100644
--- a/docs/px4_sitl_wsl2.md
+++ b/docs/px4_sitl_wsl2.md
@@ -22,7 +22,8 @@ Starting with this [PX4 Change Request](https://github.com/PX4/PX4-Autopilot/com

export PX4_SIM_HOST_ADDR=172.31.64.1

-**Note:** Be sure to update the above address `172.31.64.1` to match what you see from your `ipconfig` command.
+!!! note
+    Be sure to update the above address `172.31.64.1` to match what you see from your `ipconfig` command.

Open incoming TCP port 4560 and incoming UDP port 14540 using your firewall configuration.

@@ -80,9 +81,10 @@ else
fi

-**Note**: this code might already be there depending on the version of PX4 you are using.
+!!! note
+    This code might already be there depending on the version of PX4 you are using.

-**Note**: please be patient when waiting for the message:
+Please be patient when waiting for the message:

```text
INFO [simulator] Simulator connected on TCP port 4560.

diff --git a/docs/modify_recording_data.md b/docs/recording_data.md
similarity index 86%
rename from docs/modify_recording_data.md
rename to docs/recording_data.md
index f5b87cf4..6c610682 100644
--- a/docs/modify_recording_data.md
+++ b/docs/recording_data.md
@@ -1,4 +1,4 @@
-# Modifying Recording Data
+# Recording Data

`AutonomySim` has a [Recording feature](settings.md#recording) to easily collect data and images.
The [Recording APIs](apis.md#recording-apis) also allows starting and stopping the recording using API. @@ -6,13 +6,13 @@ However, the data recorded by default might not be sufficient for your use cases The recorded data is written in a `AutonomySim_rec.txt` file in a tab-separated format, with images in an `images/` folder. The entire folder is by default present in the `Documents` folder (or specified in settings) with the timestamp of when the recording started in `%Y-%M-%D-%H-%M-%S` format. -Car vehicle records the following fields: +The `Car` vehicle records the following fields: ```text VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z Throttle Steering Brake Gear Handbrake RPM Speed ImageFile ``` -For Multirotor: +The `Multirotor` fields: ```text VehicleName TimeStamp POS_X POS_Y POS_Z Q_W Q_X Q_Y Q_Z ImageFile @@ -24,18 +24,17 @@ Note that this requires building and using AutonomySim from source. You can comp The primary method which fills the data to be stored is [`PawnSimApi::getRecordFileLine`](https://github.com/nervosys/AutonomySim/blob/880c5541fd4824ee2cd9bb82ca5f611eb1ab236a/Unreal/Plugins/AutonomySim/Source/PawnSimApi.cpp#L544), it's the base method for all the vehicles, and Car overrides it to log additional data, as can be seen in [`CarPawnSimApi::getRecordFileLine`](https://github.com/nervosys/AutonomySim/blob/880c5541fd4824ee2cd9bb82ca5f611eb1ab236a/Unreal/Plugins/AutonomySim/Source/Vehicles/Car/CarPawnSimApi.cpp#L34). -To record additional data for multirotor, you can add a similar method in [MultirotorPawnSimApi.cpp/h](https://github.com/nervosys/AutonomySim/tree/main/Unreal/Plugins/AutonomySim/Source/Vehicles/Multirotor) files which overrides the base class implementation and append other data. The currently logged data can also be modified and removed as needed. 
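Because `AutonomySim_rec.txt` is plain tab-separated text with a header row, it is also easy to post-process outside the simulator. A minimal sketch using only the Python standard library (the column names follow the fields listed above; the example path is hypothetical and depends on your recording timestamp):

```python
import csv
from pathlib import Path

def load_recording(path):
    """Parse a tab-separated AutonomySim_rec.txt into a list of row dicts keyed by header names."""
    with Path(path).expanduser().open(newline="") as f:
        return list(csv.DictReader(f, delimiter="\t"))

# Hypothetical usage, assuming a recording folder under Documents:
# rows = load_recording("~/Documents/AutonomySim/2024-02-01-12-00-00/AutonomySim_rec.txt")
# first = rows[0]
# print(first["POS_X"], first["ImageFile"])
```

Note that `csv.DictReader` returns all values as strings, so numeric fields such as `POS_X` need an explicit `float(...)` conversion before use.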
+To record additional data for the multirotor, you can add a similar method in the [MultirotorPawnSimApi.cpp/h](https://github.com/nervosys/AutonomySim/tree/master/Unreal/Plugins/AutonomySim/Source/Vehicles/Multirotor) files that overrides the base class implementation and appends other data. The currently logged data can also be modified and removed as needed.

-E.g. recording GPS, IMU and Barometer data also for multirotor
-
+For example, recording GPS, IMU, and Barometer data for multirotor:

```cpp
// MultirotorPawnSimApi.cpp

std::string MultirotorPawnSimApi::getRecordFileLine(bool is_header_line) const
{
    std::string common_line = PawnSimApi::getRecordFileLine(is_header_line);
+
    if (is_header_line) {
-        return common_line +
-               "Latitude\tLongitude\tAltitude\tPressure\tAccX\tAccY\tAccZ\t";
+        return common_line + "Latitude\tLongitude\tAltitude\tPressure\tAccX\tAccY\tAccZ\t";
    }

    const auto& state = vehicle_api_->getMultirotorState();

diff --git a/docs/reinforcement_learning.md b/docs/reinforcement_learning.md
index 931291e1..3f173803 100644
--- a/docs/reinforcement_learning.md
+++ b/docs/reinforcement_learning.md
@@ -1,18 +1,17 @@
-# Reinforcement Learning in AutonomySim
+# Reinforcement Learning

We below describe how we can implement DQN in AutonomySim using an OpenAI gym wrapper around AutonomySim API, and using stable baselines implementations of standard RL algorithms. We recommend installing stable-baselines3 in order to run these examples (please see https://github.com/DLR-RM/stable-baselines3)

-#### Disclaimer
+!!! warning
+    This is still in active development. What we share below is a framework that can be extended and tweaked to obtain better performance.

-This is still in active development. What we share below is a framework that can be extended and tweaked to obtain better performance.
- -#### Gym wrapper +## Gym Wrapper In order to use AutonomySim as a gym environment, we extend and reimplement the base methods such as `step`, `_get_obs`, `_compute_reward` and `reset` specific to AutonomySim and the task of interest. The sample environments used in these examples for car and drone can be seen in `PythonClient/reinforcement_learning/*_env.py` -## RL with Car +## Car RL -[Source code](https://github.com/nervosys/AutonomySim/tree/main/PythonClient/reinforcement_learning) +[Source code](https://github.com/nervosys/AutonomySim/tree/master/PythonClient/reinforcement_learning) This example works with AutonomySimNeighborhood environment available in [releases](https://github.com/nervosys/AutonomySim/releases). @@ -97,7 +96,7 @@ client.setCarControls(car_control) time.sleep(1) ``` -Once the gym-styled environment wrapper is defined as in `car_env.py`, we then make use of stable-baselines3 to run a DQN training loop. The DQN training can be configured as follows, seen in `dqn_car.py`. +Once the gym-styled environment wrapper is defined as in `car_env.py`, we then make use of `stable-baselines3` to run a DQN training loop. The DQN training can be configured as follows, seen in `dqn_car.py`. ```python model = DQN( @@ -126,7 +125,7 @@ Note that the simulation needs to be up and running before you execute `dqn_car. ## RL with Quadrotor -[Source code](https://github.com/nervosys/AutonomySim/tree/main/PythonClient/reinforcement_learning) +[Source code](https://github.com/nervosys/AutonomySim/tree/master/PythonClient/reinforcement_learning) This example works with AutonomySimMountainLandscape environment available in [releases](https://github.com/nervosys/AutonomySim/releases). 
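The gym-style contract described above (`reset` → `step`, with `_get_obs` and `_compute_reward` helpers) can be illustrated with a dependency-free skeleton. This is a toy stand-in, not the actual `car_env.py` or `drone_env.py`: the 1-D state, action, goal, and reward below are invented for illustration, and a real environment would call the AutonomySim client inside these methods.

```python
class ToyAutonomySimEnv:
    """Toy gym-style environment skeleton (illustrative only, not the real car/drone envs)."""

    GOAL = 10.0  # invented 1-D goal position

    def __init__(self):
        self.position = 0.0

    def reset(self):
        # A real env would reset the simulation and re-enable API control here.
        self.position = 0.0
        return self._get_obs()

    def _get_obs(self):
        # A real env would pull images or kinematics from the simulator.
        return {"position": self.position}

    def _compute_reward(self):
        reward = -abs(self.GOAL - self.position)  # negative distance to goal
        done = self.position >= self.GOAL
        return reward, done

    def step(self, action):
        # A real env would send car/drone controls via the AutonomySim API.
        self.position += action
        reward, done = self._compute_reward()
        return self._get_obs(), reward, done, {}
```

Once the real methods are filled in with AutonomySim API calls, an environment with this interface can be handed directly to `stable-baselines3` algorithms such as DQN, as the examples below do.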
diff --git a/docs/autonomysim_ros_pkgs.md b/docs/ros_pkgs.md similarity index 67% rename from docs/autonomysim_ros_pkgs.md rename to docs/ros_pkgs.md index 78fb38dc..79d8fb2b 100644 --- a/docs/autonomysim_ros_pkgs.md +++ b/docs/ros_pkgs.md @@ -1,4 +1,4 @@ -# AutonomySim ROS Packages +# ROS Packages `AutonomySim_ros_pkgs`: a `ROS` wrapper over the `AutonomySim` C++ client library. @@ -6,29 +6,25 @@ The below steps are meant for Linux. If you're running AutonomySim on Windows, you can use Windows Subsystem for Linux (WSL) to run the ROS wrapper, see the instructions [below](#setting-up-the-build-environment-on-windows10-using-wsl1-or-wsl2). If you're unable or don't prefer to install ROS and related tools on your host Linux due to some issues, you can also try it using Docker, see the steps in [Using Docker for ROS wrapper](#using-docker-for-ros) -- If your default GCC version is not 8 or above (check using `gcc --version`) - - - Install gcc >= 8.0.0: `sudo apt-get install gcc-8 g++-8` - - Verify installation by `gcc-8 --version` - -- Ubuntu 16.04 +* If your default GCC version is not 8 or above (check using `gcc --version`) + * Install gcc >= 8.0.0: `sudo apt-get install gcc-8 g++-8` + * Verify installation by `gcc-8 --version` +* Ubuntu 16.04 * Install [ROS kinetic](https://wiki.ros.org/kinetic/Installation/Ubuntu) * Install tf2 sensor and mavros packages: `sudo apt-get install ros-kinetic-tf2-sensor-msgs ros-kinetic-tf2-geometry-msgs ros-kinetic-mavros*` - -- Ubuntu 18.04 +* Ubuntu 18.04 * Install [ROS melodic](https://wiki.ros.org/melodic/Installation/Ubuntu) * Install tf2 sensor and mavros packages: `sudo apt-get install ros-melodic-tf2-sensor-msgs ros-melodic-tf2-geometry-msgs ros-melodic-mavros*` -- Ubuntu 20.04 +* Ubuntu 20.04 * Install [ROS noetic](https://wiki.ros.org/noetic/Installation/Ubuntu) * Install tf2 sensor and mavros packages: `sudo apt-get install ros-noetic-tf2-sensor-msgs ros-noetic-tf2-geometry-msgs ros-noetic-mavros*` -- Install 
[catkin_tools](https://catkin-tools.readthedocs.io/en/latest/installing.html)
-  `sudo apt-get install python-catkin-tools` or
-  `pip install catkin_tools`. If using Ubuntu 20.04 use `pip install "git+https://github.com/catkin/catkin_tools.git#egg=catkin_tools"`
+* Install [catkin_tools](https://catkin-tools.readthedocs.io/en/latest/installing.html)
+  `sudo apt-get install python-catkin-tools` or `pip install catkin_tools`. If using Ubuntu 20.04, use `pip install "git+https://github.com/catkin/catkin_tools.git#egg=catkin_tools"`

## Build

-- Build AutonomySim
+* Build `AutonomySim`:

```shell
git clone https://github.com/nervosys/AutonomySim.git;
cd AutonomySim;
./build.sh;
```

-- Make sure that you have setup the environment variables for ROS as mentioned in the installation pages above. Add the `source` command to your `.bashrc` for convenience (replace `melodic` with specfic version name) -
+* Ensure that you have set up the environment variables for ROS as mentioned in the installation pages above. Add the `source` command to your `.bashrc` for convenience (replace `melodic` with the specific version name):

```shell
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
source ~/.bashrc
```

-- Build ROS package
+* Build the ROS package:

```shell
cd ros;
catkin build; # or catkin_make
```

-If your default GCC isn't 8 or greater (check using `gcc --version`), then compilation will fail. In that case, use `gcc-8` explicitly as follows-
+If your default GCC isn't 8 or greater (check using `gcc --version`), then compilation will fail. In that case, use `gcc-8` explicitly as follows:

```shell
catkin build -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8
@@ -65,115 +61,114 @@ roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch;
roslaunch AutonomySim_ros_pkgs rviz.launch;
```

- **Note**: If you get an error running `roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch`, run `catkin clean` and try again
+!!! note
+    If you get an error running `roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch`, run `catkin clean` and try again.

## Using AutonomySim ROS wrapper

-The ROS wrapper is composed of two ROS nodes - the first is a wrapper over AutonomySim's multirotor C++ client library, and the second is a simple PD position controller.
-Let's look at the ROS API for both nodes:
+The ROS wrapper is composed of two ROS nodes: the first is a wrapper over AutonomySim's multirotor C++ client library, and the second is a simple PD position controller. Let's look at the ROS API for both nodes below.

### AutonomySim ROS Wrapper Node

#### Publishers:

-- `/AutonomySim_node/origin_geo_point` [AutonomySim_ros_pkgs/GPSYaw](https://github.com/nervosys/AutonomySim/tree/main/ros/src/AutonomySim_ros_pkgs/msg/GPSYaw.msg)
+* `/AutonomySim_node/origin_geo_point` [AutonomySim_ros_pkgs/GPSYaw](https://github.com/nervosys/AutonomySim/tree/master/ros/src/AutonomySim_ros_pkgs/msg/GPSYaw.msg)

GPS coordinates corresponding to global NED frame. This is set in the AutonomySim [settings.json](https://microsoft.github.io/AutonomySim/settings/) file under the `OriginGeopoint` key.

-- `/AutonomySim_node/VEHICLE_NAME/global_gps` [sensor_msgs/NavSatFix](https://docs.ros.org/api/sensor_msgs/html/msg/NavSatFix.html)
+* `/AutonomySim_node/VEHICLE_NAME/global_gps` [sensor_msgs/NavSatFix](https://docs.ros.org/api/sensor_msgs/html/msg/NavSatFix.html)

These are the current GPS coordinates of the drone in AutonomySim.

-- `/AutonomySim_node/VEHICLE_NAME/odom_local_ned` [nav_msgs/Odometry](https://docs.ros.org/api/nav_msgs/html/msg/Odometry.html)
+* `/AutonomySim_node/VEHICLE_NAME/odom_local_ned` [nav_msgs/Odometry](https://docs.ros.org/api/nav_msgs/html/msg/Odometry.html)

Odometry in NED frame (default name: odom_local_ned, launch name and frame type are configurable) wrt take-off point.
-- `/AutonomySim_node/VEHICLE_NAME/CAMERA_NAME/IMAGE_TYPE/camera_info` [sensor_msgs/CameraInfo](https://docs.ros.org/api/sensor_msgs/html/msg/CameraInfo.html)
+* `/AutonomySim_node/VEHICLE_NAME/CAMERA_NAME/IMAGE_TYPE/camera_info` [sensor_msgs/CameraInfo](https://docs.ros.org/api/sensor_msgs/html/msg/CameraInfo.html)

-- `/AutonomySim_node/VEHICLE_NAME/CAMERA_NAME/IMAGE_TYPE` [sensor_msgs/Image](https://docs.ros.org/api/sensor_msgs/html/msg/Image.html)
+* `/AutonomySim_node/VEHICLE_NAME/CAMERA_NAME/IMAGE_TYPE` [sensor_msgs/Image](https://docs.ros.org/api/sensor_msgs/html/msg/Image.html)

RGB or float image depending on the image type requested in settings.json.

-- `/tf` [tf2_msgs/TFMessage](https://docs.ros.org/api/tf2_msgs/html/msg/TFMessage.html)
+* `/tf` [tf2_msgs/TFMessage](https://docs.ros.org/api/tf2_msgs/html/msg/TFMessage.html)

-- `/AutonomySim_node/VEHICLE_NAME/altimeter/SENSOR_NAME` [AutonomySim_ros_pkgs/Altimeter](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/msg/Altimeter.msg)
+* `/AutonomySim_node/VEHICLE_NAME/altimeter/SENSOR_NAME` [AutonomySim_ros_pkgs/Altimeter](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/msg/Altimeter.msg)

This is the current altimeter reading for altitude, pressure, and [QNH](https://en.wikipedia.org/wiki/QNH)

-- `/AutonomySim_node/VEHICLE_NAME/imu/SENSOR_NAME` [sensor_msgs::Imu](http://docs.ros.org/api/sensor_msgs/html/msg/Imu.html)
+* `/AutonomySim_node/VEHICLE_NAME/imu/SENSOR_NAME` [sensor_msgs::Imu](http://docs.ros.org/api/sensor_msgs/html/msg/Imu.html)

IMU sensor data

-- `/AutonomySim_node/VEHICLE_NAME/magnetometer/SENSOR_NAME` [sensor_msgs::MagneticField](http://docs.ros.org/api/sensor_msgs/html/msg/MagneticField.html)
+* `/AutonomySim_node/VEHICLE_NAME/magnetometer/SENSOR_NAME` [sensor_msgs::MagneticField](http://docs.ros.org/api/sensor_msgs/html/msg/MagneticField.html)

Measurement of the magnetic field vector (compass)

-- `/AutonomySim_node/VEHICLE_NAME/distance/SENSOR_NAME` [sensor_msgs::Range](http://docs.ros.org/api/sensor_msgs/html/msg/Range.html)
+* `/AutonomySim_node/VEHICLE_NAME/distance/SENSOR_NAME` [sensor_msgs::Range](http://docs.ros.org/api/sensor_msgs/html/msg/Range.html)

Measurement of distance from an active ranger, such as an infrared sensor

-- `/AutonomySim_node/VEHICLE_NAME/lidar/SENSOR_NAME` [sensor_msgs::PointCloud2](http://docs.ros.org/api/sensor_msgs/html/msg/PointCloud2.html)
+* `/AutonomySim_node/VEHICLE_NAME/lidar/SENSOR_NAME` [sensor_msgs::PointCloud2](http://docs.ros.org/api/sensor_msgs/html/msg/PointCloud2.html)

LIDAR point cloud

-#### Subscribers:
+#### Subscribers

-- `/AutonomySim_node/vel_cmd_body_frame` [AutonomySim_ros_pkgs/VelCmd](https://github.com/nervosys/AutonomySim/tree/main/ros/src/AutonomySim_ros_pkgs/msg/VelCmd.msg)
+* `/AutonomySim_node/vel_cmd_body_frame` [AutonomySim_ros_pkgs/VelCmd](https://github.com/nervosys/AutonomySim/tree/master/ros/src/AutonomySim_ros_pkgs/msg/VelCmd.msg)

Ignore the `vehicle_name` field; leave it blank. We will use `vehicle_name` in the future for multiple drones.

-- `/AutonomySim_node/vel_cmd_world_frame` [AutonomySim_ros_pkgs/VelCmd](https://github.com/nervosys/AutonomySim/tree/main/ros/src/AutonomySim_ros_pkgs/msg/VelCmd.msg)
+* `/AutonomySim_node/vel_cmd_world_frame` [AutonomySim_ros_pkgs/VelCmd](https://github.com/nervosys/AutonomySim/tree/master/ros/src/AutonomySim_ros_pkgs/msg/VelCmd.msg)

Ignore the `vehicle_name` field; leave it blank. We will use `vehicle_name` in the future for multiple drones.

-- `/gimbal_angle_euler_cmd` [AutonomySim_ros_pkgs/GimbalAngleEulerCmd](https://github.com/nervosys/AutonomySim/tree/main/ros/src/AutonomySim_ros_pkgs/msg/GimbalAngleEulerCmd.msg)
+* `/gimbal_angle_euler_cmd` [AutonomySim_ros_pkgs/GimbalAngleEulerCmd](https://github.com/nervosys/AutonomySim/tree/master/ros/src/AutonomySim_ros_pkgs/msg/GimbalAngleEulerCmd.msg)

Gimbal set point in Euler angles.
-- `/gimbal_angle_quat_cmd` [AutonomySim_ros_pkgs/GimbalAngleQuatCmd](https://github.com/nervosys/AutonomySim/tree/main/ros/src/AutonomySim_ros_pkgs/msg/GimbalAngleQuatCmd.msg) +* `/gimbal_angle_quat_cmd` [AutonomySim_ros_pkgs/GimbalAngleQuatCmd](https://github.com/nervosys/AutonomySim/tree/master/ros/src/AutonomySim_ros_pkgs/msg/GimbalAngleQuatCmd.msg) Gimbal set point in quaternion. -- `/AutonomySim_node/VEHICLE_NAME/car_cmd` [AutonomySim_ros_pkgs/CarControls](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/msg/CarControls.msg) +* `/AutonomySim_node/VEHICLE_NAME/car_cmd` [AutonomySim_ros_pkgs/CarControls](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/msg/CarControls.msg) Throttle, brake, steering and gear selections for control. Both automatic and manual transmission control possible, see the [`car_joy.py`](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/scripts/car_joy) script for use. -#### Services: +#### Services -- `/AutonomySim_node/VEHICLE_NAME/land` [AutonomySim_ros_pkgs/Takeoff](https://docs.ros.org/api/std_srvs/html/srv/Empty.html) +* `/AutonomySim_node/VEHICLE_NAME/land` [AutonomySim_ros_pkgs/Takeoff](https://docs.ros.org/api/std_srvs/html/srv/Empty.html) -- `/AutonomySim_node/takeoff` [AutonomySim_ros_pkgs/Takeoff](https://docs.ros.org/api/std_srvs/html/srv/Empty.html) +* `/AutonomySim_node/takeoff` [AutonomySim_ros_pkgs/Takeoff](https://docs.ros.org/api/std_srvs/html/srv/Empty.html) -- `/AutonomySim_node/reset` [AutonomySim_ros_pkgs/Reset](https://docs.ros.org/api/std_srvs/html/srv/Empty.html) - Resets *all* drones +* `/AutonomySim_node/reset` [AutonomySim_ros_pkgs/Reset](https://docs.ros.org/api/std_srvs/html/srv/Empty.html) resets *all* drones -#### Parameters: +#### Parameters -- `/AutonomySim_node/world_frame_id` [string] +* `/AutonomySim_node/world_frame_id` [string] Set in: `$(AutonomySim_ros_pkgs)/launch/AutonomySim_node.launch` Default: world_ned 
Set to "world_enu" to switch to ENU frames automatically -- `/AutonomySim_node/odom_frame_id` [string] +* `/AutonomySim_node/odom_frame_id` [string] Set in: `$(AutonomySim_ros_pkgs)/launch/AutonomySim_node.launch` Default: odom_local_ned If you set world_frame_id to "world_enu", the default odom name will instead default to "odom_local_enu" -- `/AutonomySim_node/coordinate_system_enu` [boolean] +* `/AutonomySim_node/coordinate_system_enu` [boolean] Set in: `$(AutonomySim_ros_pkgs)/launch/AutonomySim_node.launch` Default: false If you set world_frame_id to "world_enu", this setting will instead default to true -- `/AutonomySim_node/update_AutonomySim_control_every_n_sec` [double] +* `/AutonomySim_node/update_AutonomySim_control_every_n_sec` [double] Set in: `$(AutonomySim_ros_pkgs)/launch/AutonomySim_node.launch` Default: 0.01 seconds. Timer callback frequency for updating drone odom and state from AutonomySim, and sending in control commands. The current RPClib interface to unreal engine maxes out at 50 Hz. Timer callbacks in ROS run at maximum rate possible, so it's best to not touch this parameter. -- `/AutonomySim_node/update_AutonomySim_img_response_every_n_sec` [double] +* `/AutonomySim_node/update_AutonomySim_img_response_every_n_sec` [double] Set in: `$(AutonomySim_ros_pkgs)/launch/AutonomySim_node.launch` Default: 0.01 seconds. Timer callback frequency for receiving images from all cameras in AutonomySim. The speed will depend on number of images requested and their resolution. Timer callbacks in ROS run at maximum rate possible, so it's best to not touch this parameter. -- `/AutonomySim_node/publish_clock` [double] +* `/AutonomySim_node/publish_clock` [double] Set in: `$(AutonomySim_ros_pkgs)/launch/AutonomySim_node.launch` Default: false Will publish the ros /clock topic if set to true. 
### Simple PID Position Controller Node -#### Parameters: +#### Parameters -- PD controller parameters: +* PD controller parameters: * `/pd_position_node/kd_x` [double], `/pd_position_node/kp_y` [double], `/pd_position_node/kp_z` [double], @@ -192,34 +187,34 @@ Throttle, brake, steering and gear selections for control. Both automatic and ma * `/pd_position_node/reached_yaw_degrees` [double] Threshold yaw distance (degrees) from current position to setpoint position -- `/pd_position_node/update_control_every_n_sec` [double] +* `/pd_position_node/update_control_every_n_sec` [double] Default: 0.01 seconds -#### Services: +#### Services -- `/AutonomySim_node/VEHICLE_NAME/gps_goal` [Request: [srv/SetGPSPosition](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/srv/SetGPSPosition.srv)] +* `/AutonomySim_node/VEHICLE_NAME/gps_goal` [Request: [srv/SetGPSPosition](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/srv/SetGPSPosition.srv)] Target gps position + yaw. In **absolute** altitude. -- `/AutonomySim_node/VEHICLE_NAME/local_position_goal` [Request: [srv/SetLocalPosition](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/srv/SetLocalPosition.srv)] +* `/AutonomySim_node/VEHICLE_NAME/local_position_goal` [Request: [srv/SetLocalPosition](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/srv/SetLocalPosition.srv)] Target local position + yaw in global NED frame. -#### Subscribers: +#### Subscribers -- `/AutonomySim_node/origin_geo_point` [AutonomySim_ros_pkgs/GPSYaw](https://github.com/nervosys/AutonomySim/tree/main/ros/src/AutonomySim_ros_pkgs/msg/GPSYaw.msg) +* `/AutonomySim_node/origin_geo_point` [AutonomySim_ros_pkgs/GPSYaw](https://github.com/nervosys/AutonomySim/tree/master/ros/src/AutonomySim_ros_pkgs/msg/GPSYaw.msg) Listens to home geo coordinates published by `AutonomySim_node`. 
-- `/AutonomySim_node/VEHICLE_NAME/odom_local_ned` [nav_msgs/Odometry](https://docs.ros.org/api/nav_msgs/html/msg/Odometry.html) +* `/AutonomySim_node/VEHICLE_NAME/odom_local_ned` [nav_msgs/Odometry](https://docs.ros.org/api/nav_msgs/html/msg/Odometry.html) Listens to odometry published by `AutonomySim_node` -#### Publishers: +#### Publishers -- `/vel_cmd_world_frame` [AutonomySim_ros_pkgs/VelCmd](https://github.com/nervosys/AutonomySim/tree/main/ros/src/AutonomySim_ros_pkgs/msg/VelCmd.msg) +* `/vel_cmd_world_frame` [AutonomySim_ros_pkgs/VelCmd](https://github.com/nervosys/AutonomySim/tree/master/ros/src/AutonomySim_ros_pkgs/msg/VelCmd.msg) Sends velocity command to `AutonomySim_node` -#### Global params +#### Global parameters -- Dynamic constraints. These can be changed in `dynamic_constraints.launch`: +* Dynamic constraints. These can be changed in `dynamic_constraints.launch`: * `/max_vel_horz_abs` [double] Maximum horizontal velocity of the drone (meters/second) @@ -239,34 +234,29 @@ It involves enabling the built-in Windows Linux environment (WSL) in Windows, in Upon completion, you will be able to build and run the ros wrapper as in a native Linux machine. -##### WSL1 versus WSL2 +#### WSL1 versus WSL2 WSL2 is the latest version of the Windows Subsystem for Linux (WSL). It is many times faster than WSL1 (if you use the native file system in `/home/...` rather than Windows mounted folders under `/mnt/...`) and is therefore preferred for building the code in terms of speed. Once installed, you can switch between WSL1 or WSL2 versions as you prefer. -##### WSL Setup steps +#### WSL Setup steps 1. Follow the instructions [here](https://docs.microsoft.com/en-us/windows/wsl/install-win10). Check that the ROS version you want to use is supported by the Ubuntu version you want to install. 2. 
Congratulations, you now have a working Ubuntu subsystem under Windows, you can now go to [Ubuntu 16 / 18 instructions](#setup) and then [How to run AutonomySim on Windows and ROS wrapper on WSL](#how-to-run-AutonomySim-on-windows-and-ros-wrapper-on-wsl)! !!! note - - You can run XWindows applications (including SITL) by installing [VcXsrv](https://sourceforge.net/projects/vcxsrv/) on Windows. - To use it find and run `XLaunch` from the Windows start menu. - Select `Multiple Windows` in first popup, `Start no client` in second popup, **only** `Clipboard` in third popup. Do **not** select `Native Opengl` (and if you are not able to connect select `Disable access control`). - You will need to set the DISPLAY variable to point to your display: in WSL it is `127.0.0.1:0`, in WSL2 it will be the ip address of the PC's network port and can be set by using the code below. Also in WSL2 you may have to disable the firewall for public networks, or create an exception in order for VcXsrv to communicate with WSL2: + You can run XWindows applications (including SITL) by installing [VcXsrv](https://sourceforge.net/projects/vcxsrv/) on Windows. To use it find and run `XLaunch` from the Windows start menu. Select `Multiple Windows` in first popup, `Start no client` in second popup, **only** `Clipboard` in third popup. Do **not** select `Native Opengl` (and if you are not able to connect select `Disable access control`). You will need to set the DISPLAY variable to point to your display: in WSL it is `127.0.0.1:0`, in WSL2 it will be the ip address of the PC's network port and can be set by using the code below. Also in WSL2 you may have to disable the firewall for public networks, or create an exception in order for VcXsrv to communicate with WSL2: `export DISPLAY=$(cat /etc/resolv.conf | grep nameserver | awk '{print $2}'):0` !!! tip + * If you add this line to your ~/.bashrc file you won't need to run this command again + * For code editing you can install VSCode inside WSL. 
+ * Windows 10 includes "Windows Defender" virus scanner. It will slow down WSL quite a bit. Disabling it greatly improves disk performance but increases your risk to viruses so disable at your own risk. Here is one of many resources/videos that show you how to disable it: [How to Disable or Enable Windows Defender on Windows 10](https://youtu.be/FmjblGay3AM) - - If you add this line to your ~/.bashrc file you won't need to run this command again - - For code editing you can install VSCode inside WSL. - - Windows 10 includes "Windows Defender" virus scanner. It will slow down WSL quite a bit. Disabling it greatly improves disk performance but increases your risk to viruses so disable at your own risk. Here is one of many resources/videos that show you how to disable it: [How to Disable or Enable Windows Defender on Windows 10](https://youtu.be/FmjblGay3AM) - -##### File System Access between WSL and Windows 10/11 +#### File System Access between WSL and Windows 10/11 Within WSL, the Windows drives are referenced in the `/mnt` directory. For example, in order to list documents within your () documents folder: @@ -274,14 +264,11 @@ Within WSL, the Windows drives are referenced in the `/mnt` directory. For examp or `ls /mnt/c/Users//Documents` - From within Windows, the WSL distribution's files are located at (type in windows Explorer address bar): -`\\wsl$\` -e.g., -`\\wsl$\Ubuntu-18.04` +`\\wsl$\` e.g., `\\wsl$\Ubuntu-18.04` -##### How to run AutonomySim on Windows and ROS wrapper on WSL +#### How to run AutonomySim on Windows and ROS wrapper on WSL For WSL 1 execute: `export WSL_HOST_IP=127.0.0.1` @@ -297,7 +284,7 @@ roslaunch AutonomySim_ros_pkgs rviz.launch ### Using Docker for ROS -A Dockerfile is present in the [`tools`](https://github.com/nervosys/AutonomySim/tree/main/tools/Dockerfile-ROS) directory. 
To build the `AutonomySim-ros` image - +A Dockerfile is present in the [`tools`](https://github.com/nervosys/AutonomySim/tree/master/tools/Dockerfile-ROS) directory. To build the `AutonomySim-ros` image: ```shell cd tools diff --git a/docs/ros_pkgs_tutorial.md b/docs/ros_pkgs_tutorial.md new file mode 100644 index 00000000..6b6b757c --- /dev/null +++ b/docs/ros_pkgs_tutorial.md @@ -0,0 +1,74 @@ +# ROS Tutorials + +This is a set of sample AutonomySim `settings.json`s, roslaunch and rviz files to give a starting point for using AutonomySim with ROS. See [AutonomySim_ros_pkgs](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_ros_pkgs/README.md) for the ROS API. + +## Setup + +Make sure that the [AutonomySim_ros_pkgs Setup](AutonomySim_ros_pkgs.md) has been completed and the prerequisites installed. + +```shell +cd PATH_TO/AutonomySim/ros +catkin build AutonomySim_tutorial_pkgs +``` + +If your default GCC isn't 8 or greater (check using `gcc --version`), then compilation will fail. In that case, use `gcc-8` explicitly as follows: + +```shell +catkin build AutonomySim_tutorial_pkgs -DCMAKE_C_COMPILER=gcc-8 -DCMAKE_CXX_COMPILER=g++-8 +``` + +!!! note + For running examples, and also whenever a new terminal is opened, sourcing the `setup.bash` file is necessary.
If you're using the ROS wrapper frequently, it might be helpful to add `source PATH_TO/AutonomySim/ros/devel/setup.bash` to your `~/.profile` or `~/.bashrc` to avoid running the command every time a new terminal is opened. + +## Examples + +### Single drone with monocular and depth cameras and LiDAR + +* `settings.json` [front_stereo_and_center_mono.json](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/settings/front_stereo_and_center_mono.json) + + ```shell + source PATH_TO/AutonomySim/ros/devel/setup.bash + roscd AutonomySim_tutorial_pkgs + cp settings/front_stereo_and_center_mono.json ~/Documents/AutonomySim/settings.json + + # Start your unreal package or binary here + roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch; + + # in a new pane / terminal + source PATH_TO/AutonomySim/ros/devel/setup.bash + roslaunch AutonomySim_tutorial_pkgs front_stereo_and_center_mono.launch + ``` + + The above starts `rviz` with the tf tree, a registered RGBD cloud built by [depth_image_proc](https://wiki.ros.org/depth_image_proc) via the [`depth_to_pointcloud` launch file](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/launch/front_stereo_and_center_mono/depth_to_pointcloud.launch), and the LiDAR point cloud. + +### Two drones, with cameras, lidar, IMU each + +* `settings.json`: [two_drones_camera_lidar_imu.json](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/settings/two_drones_camera_lidar_imu.json) + + ```shell + source PATH_TO/AutonomySim/ros/devel/setup.bash + roscd AutonomySim_tutorial_pkgs + cp settings/two_drones_camera_lidar_imu.json ~/Documents/AutonomySim/settings.json + + # Start your unreal package or binary here + roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch; + roslaunch AutonomySim_ros_pkgs rviz.launch + ``` + +You can view the tfs in `rviz`, and run `rostopic list` and `rosservice list` to inspect the available topics and services.
+ +### Twenty-five drones in a square pattern + +* `settings.json`: [twenty_five_drones.json](https://github.com/nervosys/AutonomySim/blob/main/ros/src/AutonomySim_tutorial_pkgs/settings/twenty_five_drones.json) + + ```shell + source PATH_TO/AutonomySim/ros/devel/setup.bash + roscd AutonomySim_tutorial_pkgs + cp settings/twenty_five_drones.json ~/Documents/AutonomySim/settings.json + + # Start your unreal package or binary here + roslaunch AutonomySim_ros_pkgs AutonomySim_node.launch; + roslaunch AutonomySim_ros_pkgs rviz.launch + ``` + +You can view the tfs in `rviz`, and run `rostopic list` and `rosservice list` to inspect the available topics and services. diff --git a/docs/sensors.md b/docs/sensors.md index 9cb25283..103f6725 100644 --- a/docs/sensors.md +++ b/docs/sensors.md @@ -11,7 +11,6 @@ * Lidar = 6 !!! note - Cameras are configured differently than the other sensors and do not have an enum associated with them. Look at [general settings](settings.md) and [image API](image_apis.md) for camera config and API. ## Default sensors @@ -226,6 +225,6 @@ nervosys::autonomylib::DistanceSensorData getDistanceSensorData(const std::strin distance_sensor_data = client.getDistanceSensorData(distance_sensor_name = "", vehicle_name = "") ``` -### Lidar +### LiDAR -See the [lidar page](lidar.md) for Lidar API. +See the [LiDAR page](lidar.md) for the LiDAR API. diff --git a/docs/settings.md b/docs/settings.md index bf90c38a..b07a59aa 100644 --- a/docs/settings.md +++ b/docs/settings.md @@ -1,8 +1,8 @@ -# AutonomySim Settings +# Settings ## Where are Settings Stored? -AutonomySim is searching for the settings definition in the following order. The first match will be used: +`AutonomySim` searches for the settings definition in the following order. The first match will be used: 1. Looking at the (absolute) path specified by the `-settings` command line argument.
For example, in Windows: `AutonomySim.exe -settings="C:\path\to\settings.json"` @@ -215,10 +215,10 @@ Below are complete list of settings available along with their default values. I ## SimMode SimMode determines which simulation mode will be used. Below are currently supported values: -- `""`: prompt user to select vehicle type multirotor or car -- `"Multirotor"`: Use multirotor simulation -- `"Car"`: Use car simulation -- `"ComputerVision"`: Use only camera, no vehicle or physics +* `""`: Prompt the user to select the vehicle type, multirotor or car +* `"Multirotor"`: Use the multirotor simulation +* `"Car"`: Use the car simulation +* `"ComputerVision"`: Use only the camera; no vehicle or physics ## ViewMode @@ -392,8 +392,8 @@ Each simulation mode will go through the list of vehicles specified in this sett ### Common Vehicle Setting -- `VehicleType`: This could be any one of the following - `PhysXCar`, `SimpleFlight`, `PX4Multirotor`, `ComputerVision`, `ArduCopter` & `ArduRover`. There is no default value therefore this element must be specified. -- `PawnPath`: This allows to override the pawn blueprint to use for the vehicle. For example, you may create new pawn blueprint derived from ACarPawn for a warehouse robot in your own project outside the AutonomySim code and then specify its path here. See also [PawnPaths](settings.md#PawnPaths). Note that you have to specify your custom pawn blueprint class path inside the global `PawnPaths` object using your proprietarily defined object name, and quote that name inside the `Vehicles` setting. For example, +* `VehicleType`: This can be any one of the following: `PhysXCar`, `SimpleFlight`, `PX4Multirotor`, `ComputerVision`, `ArduCopter`, or `ArduRover`. There is no default value; this element must be specified. +* `PawnPath`: This allows overriding the pawn blueprint used for the vehicle.
For example, you may create a new pawn blueprint derived from `ACarPawn` for a warehouse robot in your own project outside the AutonomySim code and then specify its path here. See also [PawnPaths](settings.md#PawnPaths). Note that you have to specify your custom pawn blueprint class path inside the global `PawnPaths` object using your proprietarily defined object name, and quote that name inside the `Vehicles` setting. For example, ```json { @@ -411,13 +411,13 @@ Each simulation mode will go through the list of vehicles specified in this sett } ``` -- `DefaultVehicleState`: Possible value for multirotors is `Armed` or `Disarmed`. -- `AutoCreate`: If true then this vehicle would be spawned (if supported by selected sim mode). -- `RC`: This sub-element allows to specify which remote controller to use for vehicle using `RemoteControlID`. The value of -1 means use keyboard (not supported yet for multirotors). The value >= 0 specifies one of many remote controllers connected to the system. The list of available RCs can be seen in Game Controllers panel in Windows, for example. -- `X, Y, Z, Yaw, Roll, Pitch`: These elements allows you to specify the initial position and orientation of the vehicle. Position is in NED coordinates in SI units with origin set to Player Start location in Unreal environment. The orientation is specified in degrees. -- `IsFpvVehicle`: This setting allows to specify which vehicle camera will follow and the view that will be shown when ViewMode is set to Fpv. By default, AutonomySim selects the first vehicle in settings as FPV vehicle. -- `Sensors`: This element specifies the sensors associated with the vehicle, see [Sensors page](sensors.md) for details. -- `Cameras`: This element specifies camera settings for vehicle. The key in this element is name of the [available camera](image_apis.md#available_cameras) and the value is same as `CameraDefaults` as described above.
For example, to change FOV for the front center camera to 120 degrees, you can use this for `Vehicles` setting: +* `DefaultVehicleState`: Possible values for multirotors are `Armed` and `Disarmed`. +* `AutoCreate`: If true, this vehicle will be spawned (if supported by the selected sim mode). +* `RC`: This sub-element allows you to specify which remote controller to use for the vehicle via `RemoteControlID`. A value of -1 means use the keyboard (not yet supported for multirotors). A value >= 0 specifies one of many remote controllers connected to the system. The list of available RCs can be seen in the Game Controllers panel in Windows, for example. +* `X, Y, Z, Yaw, Roll, Pitch`: These elements allow you to specify the initial position and orientation of the vehicle. Position is in NED coordinates in SI units, with the origin set to the Player Start location in the Unreal environment. Orientation is specified in degrees. +* `IsFpvVehicle`: This setting specifies which vehicle the camera will follow and the view shown when `ViewMode` is set to `Fpv`. By default, AutonomySim selects the first vehicle in the settings as the FPV vehicle. +* `Sensors`: This element specifies the sensors associated with the vehicle; see the [Sensors page](sensors.md) for details. +* `Cameras`: This element specifies the camera settings for the vehicle. The key is the name of the [available camera](image_apis.md#available_cameras) and the value is the same as `CameraDefaults`, described above. For example, to change the FOV of the front center camera to 120 degrees, you can use this `Vehicles` setting: ```json "Vehicles": { diff --git a/docs/simple_flight.md b/docs/simple_flight.md index ac4c4e72..be7e574f 100644 --- a/docs/simple_flight.md +++ b/docs/simple_flight.md @@ -1,22 +1,22 @@ -# simple_flight +# Simple Flight Controller -AutonomySim has a built-in flight controller called `simple_flight` that is used by default. You do not need to do anything to use or configure it.
AutonomySim also supports [PX4](px4_setup.md) as another flight controller for advanced users. In the future, we also plan to support [ROSFlight](https://rosflight.org/) and [Hackflight](https://github.com/simondlevy/hackflight). +`AutonomySim` has a built-in flight controller called `simple_flight` that is used by default. You do not need to do anything to use or configure it. AutonomySim also supports [PX4](px4_setup.md) as another flight controller for advanced users. In the future, we also plan to support [ROSFlight](https://rosflight.org/) and [Hackflight](https://github.com/simondlevy/hackflight). For background information, see [what is a flight controller?](robot_controller.md). ## Advantages -The advantage of using simple_flight is zero additional setup you need to do and it "just works". Also, simple_flight uses a steppable clock which means you can pause the simulation and things are not at mercy of a high variance low precision clock that the operating system provides. Furthermore, simple_flight is simple, cross platform and consists of 100% header-only dependency-free C++ code which means you can literally switch between the simulator and the flight controller code within same code base! +The advantage of using `simple_flight` is that it requires zero additional setup and "just works". Also, `simple_flight` uses a steppable clock, which means you can pause the simulation without being at the mercy of the high-variance, low-precision clock that the operating system provides. Furthermore, `simple_flight` is simple, cross-platform, and consists of 100% header-only, dependency-free C++ code, which means you can switch between the simulator and the flight controller code within the same codebase. ## Design Normally flight controllers are designed to run on actual hardware of vehicles and their support for running in simulator varies widely.
They are often fairly difficult to configure for non-expert users and typically have a complex build, usually lacking cross platform support. All these problems have played a significant part in the design of simple_flight. -simple_flight is designed from ground up as library with clean a interface that can work onboard the vehicle as well as in the simulator. The core principle is that the flight controller has no way to specify a special simulation mode and therefore it has no way to know if it is running as a simulation or as a real vehicle. We thus view flight controllers simply as a collection of algorithms packaged in a library. Another key emphasis is to develop this code as dependency-free header-only pure standard C++11 code. This means there is no special build required to compile simple_flight. You just copy its source code to any project you wish and it just works. +`simple_flight` is designed from the ground up as a library with a clean interface that can work onboard the vehicle as well as in the simulator. The core principle is that the flight controller has no way to specify a special simulation mode, and therefore it has no way to know whether it is running as a simulation or as a real vehicle. We thus view flight controllers simply as a collection of algorithms packaged in a library. Another key emphasis is to develop this code as dependency-free, header-only, pure standard C++11 code. This means there is no special build required to compile `simple_flight`: you just copy its source code into any project you wish and it just works. ## Control -simple_flight can control vehicles by taking in the desired input as angle rate, angle level, velocity or position. Each axis of control can be specified with one of these modes. Internally, simple_flight uses a cascade of PID controllers to finally generate actuator signals. This means that the position PID drives the velocity PID, which in turn drives the angle level PID which finally drives the angle rate PID.
+`simple_flight` can control vehicles by taking the desired input as an angle rate, angle level, velocity, or position. Each axis of control can be specified with one of these modes. Internally, `simple_flight` uses a cascade of PID controllers to generate the actuator signals: the position PID drives the velocity PID, which in turn drives the angle level PID, which finally drives the angle rate PID. ## State Estimation @@ -24,11 +24,14 @@ In the current release, we are using the ground truth from the simulator for our ## Supported Boards -Currently, we have implemented simple_flight interfaces for the simulated board. We plan to implement it for the Pixhawk V2 board and possibly the Naze32 board. We expect all our code to remain unchanged and the implementation would mainly involve adding drivers for various sensors, handling ISRs and managing other board specific details. If you have experience in this area, we encourage you to engage with us and contribute! +Currently, we have implemented `simple_flight` interfaces for the simulated board. We plan to implement them for the Pixhawk V2 board and possibly the Naze32 board. We expect all our code to remain unchanged; the implementation would mainly involve adding drivers for various sensors, handling ISRs, and managing other board-specific details. + +!!! note + If you have experience in this area, we encourage you to engage with us and contribute. ## Configuration -To have AutonomySim use simple_flight, you can specify it in [settings.json](settings.md) as shown below. Note that this is default, so you don't have to do it explicitly. +To have `AutonomySim` use `simple_flight`, you can specify it in [settings.json](settings.md) as shown below. Note that this is the default, so you don't have to do it explicitly.
```json "Vehicles": { @@ -67,8 +70,8 @@ For safety reasons, flight controllers disallow API control unless a human operator } ``` -Finally, simple_flight uses a steppable clock by default which means that the clock advances when the simulator tells it to advance (unlike the wall clock which advances strictly according to the passage of time). This means the clock can be paused, for example, if code hits a breakpoint and there is zero variance in the clock (clock APIs provided by operating systems might have significant variance unless it is a "real time" OS). If you want simple_flight to use a wall clock instead then use following settings: +Finally, `simple_flight` uses a steppable clock by default, which means that the clock advances when the simulator tells it to advance (unlike the wall clock, which advances strictly according to the passage of time). This means the clock can be paused, for example, if code hits a breakpoint, and there is zero variance in the clock (clock APIs provided by operating systems might have significant variance unless it is a "real-time" OS). If you want `simple_flight` to use a wall clock instead, use the following setting: ```json - "ClockType": "ScalableClock" +"ClockType": "ScalableClock" ``` diff --git a/docs/steering_wheel.md b/docs/steering_wheel.md new file mode 100644 index 00000000..397fe371 --- /dev/null +++ b/docs/steering_wheel.md @@ -0,0 +1,43 @@ +# Steering Wheels + +To use the `Logitech G920` steering wheel with `AutonomySim`, follow these steps: + +1. Connect the steering wheel to the computer and wait until driver installation completes. + +2. Install the Logitech Gaming Software from [here](http://support.logitech.com/en_us/software/lgs). + +3. Before debugging, you'll have to normalize the values in the `AutonomySim` code. Make these changes in `CarPawn.cpp` (line numbers refer to the current version in git): + + 1. In line 382, change `Val` to `1 - Val` (the complementary value in the range [0.0,1.0]). + 2.
In line 388, change `Val` to `5 * Val - 2.5` (changes the range of the given input from [0.0,1.0] to [-1.0,1.0]). + 3. In line 404, change `Val` to `4 * (1 - Val)` (the complementary value, scaled by 4). + +4. Debug the `AutonomySim` project (while the steering wheel is connected; this is important). + +5. In the Unreal Editor, go to `Edit->Plugins->Input Devices` and enable `Windows RawInput`. + +6. Go to `Edit->Project Settings->Raw Input`, and add a new device configuration: + Vendor ID: 0x046d (for the Logitech G920; otherwise you might need to check it). + Product ID: 0xc261 (for the Logitech G920; otherwise you might need to check it). + Under `Axis Properties`, make sure that `GenericUSBController Axis 2`, `GenericUSBController Axis 4` and `GenericUSBController Axis 5` are all enabled with an offset of 1.0. + Explanation: axis 2 is responsible for the steering movement, axis 4 is for the brake, and axis 5 is for the gas. If you need to configure the clutch, it's on axis 3. + + ![steering_wheel](images/steering_wheel_instructions_1.png) + +7. Go to `Edit->Project Settings->Input`. Under `Bindings` in `Axis Mappings`: + + 1. Remove the existing mappings from the groups `MoveRight` and `MoveForward`. + 2. Add a new axis mapping to the group `MoveRight`; use `GenericUSBController Axis 2` with a scale of 1.0. + 3. Add a new axis mapping to the group `MoveForward`; use `GenericUSBController Axis 5` with a scale of 1.0. + 4. Add a new group of axis mappings, name it `FootBrake`, and add a new axis mapping to this group; use `GenericUSBController Axis 4` with a scale of 1.0. + + ![steering_wheel](images/steering_wheel_instructions_2.png) + +8. Play and drive! + +### Pay Attention + +Note that the first time we 'play' after debugging, we need to touch the wheel to 'reset' the values. + +!!! tip + In the Logitech Gaming Software, you can configure buttons as keyboard shortcuts; we used this to create shortcuts for recording a dataset and playing in full screen.
diff --git a/docs/steering_wheel_installation.md b/docs/steering_wheel_installation.md deleted file mode 100644 index 6536907f..00000000 --- a/docs/steering_wheel_installation.md +++ /dev/null @@ -1,42 +0,0 @@ -# Logitech G920 Steering Wheel Installation - -To use Logitech G920 steering wheel with AutonomySim follow these steps: - -1. Connect the steering wheel to the computer and wait until drivers installation complete. - -2. Install Logitech Gaming Software from [here](http://support.logitech.com/en_us/software/lgs) - -3. Before debug, you’ll have to normalize the values in AutonomySim code. Perform this changes in CarPawn.cpp (according to the current update in the git): - In line 382, change “Val” to “1 – Val”. (the complementary value in the range [0.0,1.0]). - In line 388, change “Val” to “5Val - 2.5” (Change the range of the given input from [0.0,1.0] to [-1.0,1.0]). - In line 404, change “Val” to “4(1 – Val)”. (the complementary value in the range [0.0,1.0]). - -4. Debug AutonomySim project (while the steering wheel is connected – it’s important). - -5. On Unreal Editor, go to Edit->plugins->input devices and enable “Windows RawInput”. - -6. Go to Edit->Project Settings->Raw Input, and add new device configuration: - Vendor ID: 0x046d (In case of Logitech G920, otherwise you might need to check it). - Product ID: 0xc261 (In case of Logitech G920, otherwise you might need to check it). - Under “Axis Properties”, make sure that “GenericUSBController Axis 2”, “GenericUSBController Axis 4” and “GenericUSBController Axis 5” are all enabled with an offset of 1.0. - Explanation: axis 2 is responsible for steering movement, axis 4 is for brake and axis 5 is for gas. If you need to configure the clutch, it’s on axis 3. - - ![steering_wheel](images/steering_wheel_instructions_1.png) - -7. Go to Edit->Project Settings->Input, Under Bindings in “Axis Mappings”: - Remove existing mappings from the groups “MoveRight” and “MoveForward”. 
- Add new axis mapping to the group “MoveRight”, use GenericUSBController axis 2 with a scale of 1.0. - Add new axis mapping to the group “MoveForward”, use GenericUSBController axis 5 with a scale of 1.0. - Add a new group of axis mappings, name it “FootBrake” and add new axis mapping to this group, use GenericUSBController axis 4 with a scale of 1.0. - - ![steering_wheel](images/steering_wheel_instructions_2.png) - -8. Play and drive ! - -### Pay Attention - -Notice that in the first time we "play" after debug, we need to touch the wheel to “reset” the values. - -### Tip - -In the gaming software, you can configure buttons as keyboard shortcuts, we used it to configure a shortcut to record dataset or to play in full screen. diff --git a/docs/drone_survey.md b/docs/surveying.md similarity index 73% rename from docs/drone_survey.md rename to docs/surveying.md index a5046709..8ee454c6 100644 --- a/docs/drone_survey.md +++ b/docs/surveying.md @@ -1,8 +1,8 @@ -# Implementing a Drone Survey script +# Surveying Moved here from [https://github.com/nervosys/AutonomySim/wiki/Implementing-a-Drone-Survey-script](https://github.com/nervosys/AutonomySim/wiki/Implementing-a-Drone-Survey-script) -Ever wanted to capture a bunch of top-down pictures of a certain location? Well, the Python API makes this really simple. See the [code available here](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/survey.py). +Ever wanted to capture a bunch of top-down pictures of a certain location? Well, the Python API makes this really simple. See the [code available here](https://github.com/nervosys/AutonomySim/blob/main/PythonClient/multirotor/survey.py). ![survey](images/survey.png) @@ -34,9 +34,9 @@ while x < self.boxsize: distance += self.boxsize ``` -Assuming we start in the corner of the box, increment x by the stripe width, then fly the full y-dimension of `-boxsize` to `+boxsize`, so in this case, `boxsize` is half the size of the actual box we will be covering. 
+Assuming we start in the corner of the box, increment `x` by the stripe width, then fly the full `y`-dimension of `-boxsize` to `+boxsize`, so in this case, `boxsize` is half the size of the actual box we will be covering. -Once we have this list of Vector3r objects, we can fly this path very simply with the following call: +Once we have this list of `Vector3r` objects, we can fly this path very simply with the following call: ```python result = self.client.moveOnPath( @@ -47,6 +47,6 @@ We can compute an appropriate `trip_time` timeout by dividing the distance of th The `lookahead` needed here for smooth path interpolation can be computed from the velocity using `self.velocity + (self.velocity/2)`. The more lookahead, the smoother the turns. This is why you see in the screenshot that the ends of each swimlane are smooth turns rather than a square box pattern. This can result in a smoother video from your camera also. -That's it, pretty simple, eh? +That's it. Pretty simple, huh? -Now, of course you can add a lot more intelligence to this, make it avoid known obstacles on your map, make it climb up and down a hillside so you can survey a slope, etc. Lots of fun to be had. +Of course, we can add a lot more intelligence to this, make it avoid known obstacles on the map, make it climb up and down a hillside so you can survey a slope, etc. This is only the tip of the iceberg in terms of what is possible. diff --git a/docs/retexturing.md b/docs/texture_swapping.md similarity index 59% rename from docs/retexturing.md rename to docs/texture_swapping.md index 1163e5aa..e3edf88c 100644 --- a/docs/retexturing.md +++ b/docs/texture_swapping.md @@ -2,24 +2,20 @@ ## How to Make An Actor Retexturable -To be made texture-swappable, an actor must derive from the parent class TextureShuffleActor. -The parent class can be set via the settings tab in the actor's blueprint. +To be made texture-swappable, an actor must derive from the parent class `TextureShuffleActor`.
The parent class can be set via the settings tab in the actor's blueprint. ![Parent Class](images/tex_shuffle_actor.png) -After setting the parent class to TextureShuffActor, the object gains the member DynamicMaterial. -DynamicMaterial needs to be set--on all actor instances in the scene--to TextureSwappableMaterial. -Warning: Statically setting the Dynamic Material in the blueprint class may cause rendering errors. It seems to work better to set it on all the actor instances in the scene, using the details panel. +After setting the parent class to `TextureShuffleActor`, the object gains the member `DynamicMaterial`. `DynamicMaterial` needs to be set, on all actor instances in the scene, to `TextureSwappableMaterial`. + +!!! warning + Statically setting the Dynamic Material in the blueprint class may cause rendering errors. It seems to work better to set it on all the actor instances in the scene, using the details panel. ![TextureSwappableMaterial](images/tex_swap_material.png) ## How to Define the Set(s) of Textures to Choose From -Typically, certain subsets of actors will share a set of texture options with each other. (e.g. walls that are part of the same building) - -It's easy to set up these groupings by using Unreal Engine's group editing functionality. -Select all the instances that should have the same texture selection, and add the textures to all of them simultaneously via the Details panel. -Use the same technique to add descriptive tags to groups of actors, which will be used to address them in the API. +Typically, certain subsets of actors will share a set of texture options with each other (e.g., walls that are part of the same building). It's easy to set up these groupings by using Unreal Engine's group editing functionality. Select all the instances that should have the same texture selection, and add the textures to all of them simultaneously via the Details panel.
Use the same technique to add descriptive tags to groups of actors, which will be used to address them in the API. ![Group Editing](images/tex_swap_group_editing.png) @@ -35,10 +31,7 @@ The following API is available in C++ and python. (C++ shown) std::vector simSwapTextures(const std::string& tags, int tex_id); ``` -The string of "," or ", " delimited tags identifies on which actors to perform the swap. -The tex_id indexes the array of textures assigned to each actor undergoing a swap. -The function will return the list of objects which matched the provided tags and had the texture swap perfomed. -If tex_id is out-of-bounds for some object's texture set, it will be taken modulo the number of textures that were available. +The string of "," or ", " delimited tags identifies which actors to perform the swap on. The `tex_id` indexes the array of textures assigned to each actor undergoing a swap. The function returns the list of objects that matched the provided tags and had the texture swap performed. If `tex_id` is out-of-bounds for some object's texture set, it is taken modulo the number of textures available. Demo (Python): diff --git a/docs/unreal_blocks.md b/docs/unreal_blocks.md index 7236d867..65e45861 100644 --- a/docs/unreal_blocks.md +++ b/docs/unreal_blocks.md @@ -1,13 +1,10 @@ +# Blocks Environment -# Setup Blocks Environment for AutonomySim - -Blocks environment is available in repo in folder `Unreal/Environments/Blocks` and is designed to be lightweight in size. That means its very basic but fast. - -Here are quick steps to get Blocks environment up and running: +The `Blocks` environment is available in the folder `Unreal/Environments/Blocks` and is designed to be lightweight in size. That means it's very basic but fast. Below are quick steps to get the Blocks environment up and running. ## Windows -1. Make sure you have [installed Unreal and built AutonomySim](build_windows.md). +1.
Ensure you have [installed Unreal and built AutonomySim](build_windows.md). 2. Navigate to the folder `AutonomySim\Unreal\Environments\Blocks` and double-click the `Blocks.sln` file to open it in Visual Studio. By default, this project is configured for Visual Studio 2019. However, if you want to generate this project for Visual Studio 2022, go to 'Edit->Editor Preferences->Source Code' inside the Unreal Editor and select 'Visual Studio 2022' for the 'Source Code Editor' setting. 3. Make sure the `Blocks` project is the startup project and the build configuration is set to `DebugGame_Editor` and `Win64`. Hit F5 to run. 4. Press the Play button in the Unreal Editor and you will see something like the video below. Also see [how to use AutonomySim](https://github.com/nervosys/AutonomySim/#how-to-use-it). @@ -37,6 +34,6 @@ By default AutonomySim spawns multirotor. You can easily change this to car and ## FAQ -#### I see warnings about like "_BuitData" file is missing +#### I see warnings like a `_BuiltData` file is missing These are intermediate files and you can safely ignore them. diff --git a/docs/unreal_custenv.md b/docs/unreal_custenv.md index d38d8939..267973d8 100644 --- a/docs/unreal_custenv.md +++ b/docs/unreal_custenv.md @@ -1,6 +1,6 @@ -# Creating and Setting Up Unreal Environment +# Unreal Environments -This page contains the complete instructions start to finish for setting up Unreal environment with AutonomySim. The Unreal Marketplace has [several environment](https://www.unrealengine.com/marketplace/content-cat/assets/environments) available that you can start using in just few minutes. It is also possible to use environments available on websites such as [turbosquid.com](https://www.turbosquid.com/) or [cgitrader.com](https://www.cgtrader.com/) with bit more effort (here's [tutorial video](https://www.youtube.com/watch?v=y09VbdQWvQY&feature)). In addition there also several [free environments](https://github.com/nervosys/AutonomySim/issues/424) available.
+This page contains the complete instructions, start to finish, for setting up an Unreal environment with `AutonomySim`. The `Unreal Marketplace` has [several environments](https://www.unrealengine.com/marketplace/content-cat/assets/environments) available that you can start using in just a few minutes. It is also possible to use environments available on websites such as [turbosquid.com](https://www.turbosquid.com/) or [cgtrader.com](https://www.cgtrader.com/) with a bit more effort (here's a [tutorial video](https://www.youtube.com/watch?v=y09VbdQWvQY&feature)). In addition, there are also several [free environments](https://github.com/nervosys/AutonomySim/issues/424) available.
Below we will use a freely downloadable environment from Unreal Marketplace called Landscape Mountain but the steps are same for any other environments.
@@ -10,7 +10,7 @@ There is no `Epic Games Launcher` for Linux which means that if you need to crea
## Step by Step Instructions
-1. Make sure AutonomySim is built and Unreal 4.27 is installed as described in [build instructions](build_windows.md).
+1. Ensure `AutonomySim` is built and Unreal 4.27 is installed as described in [build instructions](build_windows.md).
2. In `Epic Games Launcher` click the Learn tab then scroll down and find `Landscape Mountains`. Click the `Create Project` and download this content (~2GB download).
![current version](images/landscape_mountains.png)
@@ -20,7 +20,6 @@ There is no `Epic Games Launcher` for Linux which means that if you need to crea
![unreal editor](images/unreal_editor.png)
!!!note
-
    The Landscape Mountains project is supported up to Unreal Engine version 4.24. If you do not have 4.24 installed, you should see a dialog titled `Select Unreal Engine Version` with a dropdown to select from installed versions. Select 4.27 or greater to migrate the project to a supported engine version. If you have 4.24 installed, you can manually migrate the project by navigating to the corresponding .uproject file in Windows Explorer, right-clicking it, and selecting the `Switch Unreal Engine version...` option.
4. From the `File menu` select `New C++ class`, leave default `None` on the type of class, click `Next`, leave default name `MyClass`, and click `Create Class`. We need to do this because Unreal requires at least one source file in project. It should trigger compile and open up Visual Studio solution `LandscapeMountains.sln`.
@@ -28,7 +27,6 @@ There is no `Epic Games Launcher` for Linux which means that if you need to crea
5. Go to your folder for AutonomySim repo and copy `Unreal\Plugins` folder in to your `LandscapeMountains` folder. This way now your own Unreal project has AutonomySim plugin.
!!!note
-
    If the AutonomySim installation is fresh, i.e, hasn't been built before, make sure that you run `build.cmd` from the root directory once before copying `Unreal\Plugins` folder so that `AutonomyLib` files are also included. If you have made some changes in the Blocks environment, make sure to run `update_to_git.cmd` from `Unreal\Environments\Blocks` to update the files in `Unreal\Plugins`.
6. Edit the `LandscapeMountains.uproject` so that it looks like this
@@ -75,7 +73,6 @@ There is no `Epic Games Launcher` for Linux which means that if you need to crea
![regen](images/regen_sln.png)
!!!tip
-
    If the `Generate Visual Studio Project Files` option is missing you may need to reboot your machine for the Unreal Shell extensions to take effect. If it is still missing then open the LandscapeMountains.uproject in the Unreal Editor and select `Refresh Visual Studio Project` from the `File` menu.
9. Reopen `LandscapeMountains.sln` in Visual Studio, and make sure "DebugGame Editor" and "Win64" build configuration is the active build configuration.
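Step 6 above asks you to edit `LandscapeMountains.uproject`, but the hunk elides the file's target contents. As a rough, hypothetical sketch only (the real file's `FileVersion`, `EngineAssociation`, and `Category` values depend on your engine and project, and the module name must match your project), the key change is adding the plugin module under `AdditionalDependencies` in the `Modules` object:

```json
{
    "FileVersion": 3,
    "EngineAssociation": "4.27",
    "Category": "Samples",
    "Description": "",
    "Modules": [
        {
            "Name": "LandscapeMountains",
            "Type": "Runtime",
            "LoadingPhase": "Default",
            "AdditionalDependencies": [
                "AutonomySim"
            ]
        }
    ]
}
```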
@@ -111,28 +108,28 @@ Once you have your environment using above instructions, you should frequently u
## FAQ
-#### What are other cool environments?
+### What are other cool environments?
[Unreal Marketplace](https://www.unrealengine.com/marketplace) has dozens of prebuilt extra-ordinarily detailed [environments](https://www.unrealengine.com/marketplace/content-cat/assets/environments) ranging from Moon to Mars and everything in between. The one we have used for testing is called [Modular Neighborhood Pack](https://www.unrealengine.com/marketplace/modular-neighborhood-pack) but you can use any environment. Another free environment is [Infinity Blade series](https://www.unrealengine.com/marketplace/infinity-blade-plain-lands). Alternatively, if you look under the Learn tab in Epic Game Launcher, you will find many free samples that you can use. One of our favorites is "A Boy and His Kite" which is a 100 square miles of highly detailed environment (caution: you will need *very* beefy PC to run it!).
-#### When I press Play button some kind of video starts instead of my vehicle.
+### When I press the Play button, some kind of video starts instead of my vehicle.
If the environment comes with MatineeActor, delete it to avoid any startup demo sequences. There might be other ways to remove it as well, for example, click on Blueprints button, then Level Blueprint and then look at Begin Play event in Event Graph. You might want to disconnect any connections that may be starting "matinee".
-#### Is there easy way to sync code in my Unreal project with code in AutonomySim repo?
+### Is there an easy way to sync code in my Unreal project with code in the AutonomySim repo?
Sure, there is! You can find bunch of `.cmd` files (for linux, `.sh`) in `AutonomySim\Unreal\Environments\Blocks`. Just copy them over to your own Unreal project. Most of these are quite simple and self explanatory.
-#### I get some error about map.
+### I get some error about a map.
You might have to set default map for your project. For example, if you are using Modular Neighborhood Pack, set the Editor Starter Map as well as Game Default Map to Demo_Map in Project Settings > Maps & Modes.
-#### I see "Add to project" option for environment but not "Create project" option.
+### I see an "Add to project" option for the environment but not a "Create project" option.
In this case, create a new blank C++ project with no Starter Content and add your environment in to it.
-#### I already have my own Unreal project. How do I use AutonomySim with it?
+### I already have my own Unreal project. How do I use AutonomySim with it?
Copy the `Unreal\Plugins` folder from the build you did in the above section into the root of your Unreal project's folder. In your Unreal project's .uproject file, add the key `AdditionalDependencies` to the "Modules" object as we showed in the `LandscapeMountains.uproject` above.
diff --git a/docs/unreal_proj.md b/docs/unreal_projects.md
similarity index 50%
rename from docs/unreal_proj.md
rename to docs/unreal_projects.md
index 11bfae61..80cb29a2 100644
--- a/docs/unreal_proj.md
+++ b/docs/unreal_projects.md
@@ -1,10 +1,10 @@
-# Unreal Environment
+# Unreal Projects
## Setting Up the Unreal Project
### Option 1: Built-in Blocks Environment
-To get up and running fast, you can use the Blocks project that already comes with AutonomySim. This is not very highly detailed environment to keep the repo size reasonable but we use it for various testing all the times and it is the easiest way to get your feet wet in this strange land.
+To get up and running fast, you can use the `Blocks` project that already comes with `AutonomySim`. This is not a very highly detailed environment, in order to keep the repo size reasonable, but we use it for various tests all the time and it is the easiest way to get your feet wet in this strange land.
Follow these [quick steps](unreal_blocks.md).
@@ -16,6 +16,4 @@ Follow this [step-by-step guide](unreal_custenv.md).
## Changing Code and Development Workflow
-To see how you can change and test AutonomySim code, please read our [recommended development workflow](dev_workflow.md).
-
-
+To see how you can change and test `AutonomySim` code, please read our [recommended development workflow](dev_workflow.md).
diff --git a/docs/unreal_upgrade.md b/docs/unreal_upgrading.md
similarity index 75%
rename from docs/unreal_upgrade.md
rename to docs/unreal_upgrading.md
index 3b1695e2..8d2fb00d 100644
--- a/docs/unreal_upgrade.md
+++ b/docs/unreal_upgrading.md
@@ -1,12 +1,13 @@
-# Upgrading to Unreal Engine 4.27
+# Upgrading Unreal Engine
-These instructions apply if you are already using AutonomySim on Unreal Engine 4.25. If you have never installed AutonomySim, please see [How to get it](https://github.com/nervosys/AutonomySim#how-to-get-it).
+These instructions apply if you are already using `AutonomySim` on `Unreal Engine` 4.25. If you have never installed `AutonomySim`, please see [How to get it](https://github.com/nervosys/AutonomySim#how-to-get-it).
-**Caution:** The below steps will delete any of your unsaved work in AutonomySim or Unreal folder.
+!!! caution
+    The below steps will delete any of your unsaved work in the AutonomySim or Unreal folders.
-## Do this first
+## First Steps
-### For Windows Users
+### Windows
1. Install Visual Studio 2022 with VC++, Python and C#.
2. Install UE 4.27 through Epic Games Launcher.
@@ -14,7 +15,7 @@ These instructions apply if you are already using AutonomySim on Unreal Engine 4
4. Run `clean_rebuild.cmd` to remove all unchecked/extra stuff and rebuild everything.
5. See also [Build AutonomySim on Windows](build_windows.md) for more information.
-### For Linux Users
+### Linux
1. From your AutonomySim repo folder, run 'clean_rebuild.sh`.
2. Rename or delete your existing folder for Unreal Engine.
@@ -33,26 +34,29 @@ If you have your own Unreal project created in an older version of Unreal Engine
## FAQ
-### I have an Unreal project that is older than 4.16. How do I upgrade it?
+### I have an `Unreal` project that is older than 4.16. How do I upgrade it?
-#### Option 1: Just Recreate Project
+#### Option 1: Recreate the Project
If your project doesn't have any code or assets other than environment you downloaded then you can also simply [recreate the project in Unreal 4.27 Editor](unreal_custenv.md) and then copy Plugins folder from `AutonomySim/Unreal/Plugins`.
-#### Option 2: Modify Few Files
+#### Option 2: Modify the Files
Unreal versions newer than Unreal 4.15 has breaking changes. So you need to modify your *.Build.cs and *.Target.cs which you can find in the `Source` folder of your Unreal project. So what are those changes? Below is the gist of it but you should really refer to [Unreal's official 4.16 transition post](https://forums.unrealengine.com/showthread.php?145757-C-4-16-Transition-Guide).
-##### In your project's *.Target.cs
+##### Update the project `*.Target.cs` file
1. Change the contructor from, `public MyProjectTarget(TargetInfo Target)` to `public MyProjectTarget(TargetInfo Target) : base(Target)`
2. Remove `SetupBinaries` method if you have one and instead add following line in contructor above: `ExtraModuleNames.AddRange(new string[] { "MyProject" });`
-##### In your project's *.Build.cs
+##### Update the project `*.Build.cs` file
Change the constructor from `public MyProject(TargetInfo Target)` to `public MyProject(ReadOnlyTargetRules Target) : base(Target)`.
-##### And finally...
+##### Final Steps
-Follow above steps to continue the upgrade. The warning box might show only "Open Copy" button. Don't click that. Instead, click on More Options which will reveal more buttons. Choose `Convert-In-Place option`. *Caution:* Always keep backup of your project first! If you don't have anything nasty, in place conversion should go through and you are now on the new version of Unreal.
+Follow the above steps to continue the upgrade. The warning box might show only an "Open Copy" button. Don't click that. Instead, click on More Options, which will reveal more buttons. Choose the `Convert-In-Place` option.
+
+!!! caution
+    Always keep a backup of your project first! If you don't have anything unusual, the in-place conversion should go through and you will then be on the new version of Unreal.
diff --git a/docs/upgrade_settings.md b/docs/upgrade_settings.md
index 19002673..0b820a22 100644
--- a/docs/upgrade_settings.md
+++ b/docs/upgrade_settings.md
@@ -1,6 +1,6 @@
-# Upgrading Settings
+# Upgrading the Settings
-The settings schema in AutonomySim 1.2 is changed for more flexibility and cleaner interface. If you have older [settings.json](settings.md) file then you can either delete it and restart AutonomySim or use this guide to make manual upgrade.
+The settings schema in `AutonomySim` 1.2 changed for more flexibility and a cleaner interface. If you have an older [settings.json](settings.md) file, you can either delete it and restart AutonomySim or use this guide to upgrade it manually.
## Quicker Way
diff --git a/docs/using_car.md b/docs/usage_rover.md
similarity index 72%
rename from docs/using_car.md
rename to docs/usage_rover.md
index 7477c2de..abfca6eb 100644
--- a/docs/using_car.md
+++ b/docs/usage_rover.md
@@ -1,6 +1,6 @@
-# How to Use Car in AutonomySim
+# Rover Usage
-By default AutonomySim prompts user for which vehicle to use. You can easily change this by setting [SimMode](settings.md#SimMode). For example, if you want to use car instead then just set the SimMode in your [settings.json](settings.md) which you can find in your `~/Documents/AutonomySim` folder, like this:
+By default, `AutonomySim` prompts the user for the vehicle to use. You can easily change this by setting [SimMode](settings.md#SimMode). For example, if you want to use the car instead, just set `SimMode` in your [settings.json](settings.md), which you can find in your `~/Documents/AutonomySim` folder, like this:
```json
{
diff --git a/docs/who_is_using.md b/docs/userbase.md
similarity index 94%
rename from docs/who_is_using.md
rename to docs/userbase.md
index e3ec840b..90a94058 100644
--- a/docs/who_is_using.md
+++ b/docs/userbase.md
@@ -1,6 +1,6 @@
-# Who is Using AutonomySim?
+# Past and Current Users
-#### Would you like to see your own group or project here?
+## Would you like to see your own group or project here?
Just add a [GitHub issue](https://github.com/nervosys/AutonomySim/issues) with quick details and link to your website.
@@ -33,4 +33,4 @@ Just add a [GitHub issue](https://github.com/nervosys/AutonomySim/issues) with q
* [STPLS3D - University of Southern California Institute for Creative Technologies](http://www.stpls3d.com/)
* [Central Michigan University](http://www.waynenterprises.com/research)
* [Scaled Foundations](#)
-* [Codex Labs](#)
\ No newline at end of file
+* [Codex Labs](#)
diff --git a/docs/voxel_grid.md b/docs/voxel_grid.md
index 06740e51..77d3f7cd 100644
--- a/docs/voxel_grid.md
+++ b/docs/voxel_grid.md
@@ -1,8 +1,10 @@
# Voxel Grid
-AutonomySim provides a feature that constructs ground truth voxel grids of the world directly from Unreal Engine. A voxel grid is a representation of the occupancy of a given world/map, by discretizing into cells of a certain size; and recording a voxel if that particular location is occupied.
+`AutonomySim` provides a feature that constructs ground-truth voxel grids of the world directly from `Unreal Engine`. A voxel grid represents the occupancy of a given world/map by discretizing it into cells of a certain size and recording a voxel wherever that location is occupied.
-The logic for constructing the voxel grid is in WorldSimApi.cpp->createVoxelGrid(). For now, the assumption is that the voxel grid is a cube - and the API call from Python is of the structure:
+## Algorithm and Usage
+
+The logic for constructing the voxel grid is in `WorldSimApi.cpp->createVoxelGrid()`. For now, the assumption is that the voxel grid is a cube, and the API call from Python is of the structure:
```python
simCreateVoxelGrid(self, position, x, y, z, res, of)
```
@@ -37,11 +39,11 @@ The occupancy of the map is calculated iteratively over all discretized cells, w
The voxel grids are stored in the binvox format which can then be converted by the user into an octomap .bt or any other relevant, desired format. Subsequently, these voxel grids/octomaps can be used within mapping/planning. One nifty little utility to visualize a created binvox files is [viewvox](https://www.patrickmin.com/viewvox/). Similarly, `binvox2bt` can convert the binvox to an octomap file.
-##### Example voxel grid in Blocks:
+## Example in Blocks
![image](images/voxel_grid.png)
-##### Blocks voxel grid converted to Octomap format (visualized in rviz):
+## Conversion to Octomap Format (visualized in rviz)
![image](images/octomap.png)
diff --git a/docs/working_with_plugin_contents.md b/docs/working_with_plugin_contents.md
deleted file mode 100644
index 90553352..00000000
--- a/docs/working_with_plugin_contents.md
+++ /dev/null
@@ -1,7 +0,0 @@
-# How to use plugin contents
-
-Plugin contents are not shown in Unreal projects by default. To view plugin content, you need to click on few semi-hidden buttons:
-
-![plugin contents screenshot](images/plugin_contents.png)
-
-**Causion**: Changes you make in content folder are changes to binary files so be careful.
\ No newline at end of file
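To make the cube discretization described above concrete, here is a small pure-Python sketch of the cell-indexing idea. It is illustrative only: the function and variable names are assumptions for this example, not the actual `WorldSimApi.cpp` code, which runs engine-side and writes a binvox file via `simCreateVoxelGrid` rather than a Python list.

```python
def voxel_index(point, center, size, res):
    """Map a world-space point to (i, j, k) cell indices inside a cube of
    side `size` centered at `center`, with cubic cells of edge `res`.
    Returns None when the point falls outside the cube."""
    n = int(size / res)  # number of cells per axis
    idx = []
    for p, c in zip(point, center):
        i = int((p - c + size / 2) / res)  # shift so the cube spans [0, size)
        if not 0 <= i < n:
            return None
        idx.append(i)
    return tuple(idx)

# Mark occupancy for a few sample points in a 4 m cube at 0.5 m resolution.
n = int(4.0 / 0.5)
grid = [[[False] * n for _ in range(n)] for _ in range(n)]
for pt in [(0.0, 0.0, 0.0), (1.9, -1.9, 0.1), (5.0, 0.0, 0.0)]:  # last point lies outside
    cell = voxel_index(pt, (0.0, 0.0, 0.0), 4.0, 0.5)
    if cell is not None:
        i, j, k = cell
        grid[i][j][k] = True
```

A real grid produced by the simulator can then be inspected with `viewvox` or converted with `binvox2bt` as noted above.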