Augmented Unreality is a plugin for Unreal Engine 4 that enables the creation of augmented reality applications by displaying a video stream from a camera inside the game and tracking the camera's position using fiducial markers.
It was created by Krzysztof Lis (Adynathos) as part of a project for ETH Zürich.
Version 1.2.05 - built for UE 4.19, OpenCV 3.4.1:
- Video displayed in-game
- Camera position tracked using fiducial markers; multiple independent sets of markers can be tracked at once
- Editable spatial configurations of markers
- Camera calibration
- Multiple video sources: cameras, video files, network streams. Source can be switched using in-game UI
- Shadow simulation (assuming the scene is on a plane)
- Windows
- Linux
- Android
- Augmented Unreality Plugin - the plugin files only
- Augmented Unreality Example Project - an example project using the plugin
- Download the example project
- Decompress the archive and move AugmentedUnrealityEx to the location where you store your Unreal projects.
- Launch Unreal Engine and open AugmentedUnrealityEx/AugmentedUnrealityEx.uproject.
- Print the following boards: Chessboard 1, Chessboard 2, Square B
- Connect a camera and launch the game.
- If the virtual objects are not well aligned with the markers, perform camera calibration.
- Download the plugin
- Decompress the archive and move the AugmentedUnreality directory to YourProject/Plugins
- Reopen your project
- Add to your level: AURCameraActor to show the video and one of the fiducial patterns: PatternChessboard_A, PatternChessboard_B, PatternCube, PatternSquare_A, PatternSquare_B
- Add a shadow plane actor if you want shadows cast on the surface under the markers
- Run the game to generate pattern images. Then print the patterns from YourProject/Saved/AugmentedUnreality/Patterns
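The steps above set everything up in the editor. If you would rather spawn the camera actor from C++, a standard UE4 pattern along the following lines could be used. This is only a sketch: the blueprint asset path is a hypothetical placeholder (the plugin's actual content path is not documented here), and the class is loaded generically as an AActor.

```cpp
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "GameFramework/Actor.h"
#include "UObject/UObjectGlobals.h"

// Sketch only: spawning the plugin's camera actor at runtime instead of placing it
// in the level editor. The asset path below is a hypothetical placeholder - point it
// at wherever the AURCameraActor blueprint lives in your copy of the plugin.
void SpawnAugmentedRealityCamera(UWorld* World)
{
    UClass* CameraActorClass = StaticLoadClass(AActor::StaticClass(), nullptr,
        TEXT("/AugmentedUnreality/AURCameraActor.AURCameraActor_C"));

    if (World && CameraActorClass)
    {
        World->SpawnActor<AActor>(CameraActorClass, FTransform::Identity);
    }
}
```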
The plugin tries to detect what video sources are available depending on the platform:
- Android - the device camera will be used, available resolutions determined using the camera API
- Windows, Linux - video acquisition uses OpenCV's VideoCapture. Standard resolutions are offered, but there is no guarantee that the camera can output at every listed resolution (see the OpenCV sketch after this list).
- Video files: AURVideoSourceVideoFile. VideoFile should be the path to the file, relative to FPaths::GameDir(). GStreamer needs to be installed to play videos.
- AURVideoSourceStream - video streamed through network. Set only one of the following:
- ConnectionString - a GStreamer pipeline ending with appsink.
- StreamFile - path to a .sdp file relative to FPaths::GameDir().
- Test video - changes color every second
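For reference, the desktop video paths above correspond roughly to the following standalone OpenCV usage. This is not the plugin's own code; the device index, file name, and GStreamer pipeline string are only example placeholders.

```cpp
#include <opencv2/core.hpp>
#include <opencv2/videoio.hpp>

int main()
{
    cv::VideoCapture capture;

    // Camera device (Windows/Linux): open by index and request a resolution.
    // The driver may silently fall back to another resolution.
    capture.open(0);
    capture.set(cv::CAP_PROP_FRAME_WIDTH, 1280);
    capture.set(cv::CAP_PROP_FRAME_HEIGHT, 720);

    // Video file (placeholder path); GStreamer or FFmpeg support is needed
    // for the container/codec in question:
    // capture.open("my_video.mp4");

    // Network stream: a GStreamer pipeline ending with appsink, for example:
    // capture.open("udpsrc port=5000 caps=\"application/x-rtp, media=video, "
    //              "encoding-name=H264, payload=96\" ! rtph264depay ! avdec_h264 "
    //              "! videoconvert ! appsink", cv::CAP_GSTREAMER);

    cv::Mat frame;
    while (capture.read(frame))
    {
        // Each frame would be handed over to rendering and marker tracking here.
    }
    return 0;
}
```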
Select the video source from the menu on the right.
The CalibrationFileName is the location of the file storing calibration for this video source, relative to FPaths::GameSavedDir()/AugmentedUnreality/Calibration. If two sources use the same camera, they should have the same calibration file.
Best quality is obtained if the camera is calibrated. In particular, it is important to find the camera's field of view: if it differs from the rendering engine's field of view, the virtual objects will not be properly aligned with the real world. If you notice that the virtual objects move relative to the real world when you move the camera, the camera is not correctly calibrated.
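The connection between calibration and rendering is the standard pinhole relation: for an image of width w pixels and a calibrated focal length f_x in pixels, the horizontal field of view is 2·atan(w / (2·f_x)), and the renderer must use the same value. A minimal, self-contained illustration:

```cpp
#include <cmath>
#include <cstdio>

// Horizontal field of view (in degrees) of a pinhole camera, computed from the
// focal length f_x reported by calibration (in pixels) and the image width (in pixels).
double HorizontalFovDegrees(double focal_length_x, double image_width)
{
    return 2.0 * std::atan(image_width / (2.0 * focal_length_x)) * 180.0 / 3.14159265358979323846;
}

int main()
{
    // Example numbers only: a 1280 px wide image with f_x of about 1000 px
    // gives a horizontal field of view of roughly 65 degrees.
    std::printf("FOV = %.1f deg\n", HorizontalFovDegrees(1000.0, 1280.0));
    return 0;
}
```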
Each VideoSource can have different camera parameters, therefore each has its own calibration file located at FPaths::GameSavedDir()/AugmentedUnreality/VideoSource.CalibrationFilePath. The driver will attempt to load this file and display in the UI whether the camera is calibrated.
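As an illustration of the CalibrationFileName convention described above, the full path could be assembled as follows. This is only a sketch using the UE 4.19 path API, not the plugin's actual code.

```cpp
#include "Misc/Paths.h"

// Sketch: resolving the full calibration file path for a video source, following
// the CalibrationFileName convention described above.
FString GetCalibrationFilePath(const FString& CalibrationFileName)
{
    return FPaths::Combine(FPaths::GameSavedDir(),
                           TEXT("AugmentedUnreality"),
                           TEXT("Calibration"),
                           CalibrationFileName);
}
```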
To perform calibration of your camera:
- Print or display on an additional screen the calibration pattern found in AugmentedUnreality/Content/Calibration/calibration_pattern_asymmetric_circles.png
- Open the example project and start the game
- In the menu in the top-right corner of the screen, choose the right video source and click Calibrate
- Point the camera at the calibration pattern from different directions - the pattern is detected when a colorful overlay is drawn over it
- Wait until the progress bar is full
- The camera properties are now saved to the calibration file and will be loaded whenever you use this video source again
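Under the hood, this corresponds to the standard OpenCV calibration flow for an asymmetric circle grid. The following standalone sketch illustrates that flow; the camera index, the 4x11 grid size, the 2 cm spacing, and the 25-view count are assumptions, not the plugin's actual settings.

```cpp
#include <opencv2/calib3d.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/videoio.hpp>
#include <vector>

int main()
{
    // Layout of the printed asymmetric circles pattern and the spacing between circle
    // centers - assumptions here, they must match the pattern you actually printed.
    const cv::Size pattern_size(4, 11);
    const float spacing_cm = 2.0f;

    // Reference 3D positions of the circle centers for an asymmetric grid
    // (standard OpenCV convention), reused for every accepted view.
    std::vector<cv::Point3f> reference_points;
    for (int i = 0; i < pattern_size.height; ++i)
        for (int j = 0; j < pattern_size.width; ++j)
            reference_points.emplace_back((2 * j + i % 2) * spacing_cm, i * spacing_cm, 0.0f);

    std::vector<std::vector<cv::Point3f>> object_points;
    std::vector<std::vector<cv::Point2f>> image_points;

    cv::VideoCapture capture(0);
    cv::Mat frame, gray;
    while (image_points.size() < 25 && capture.read(frame)) // 25 views is an arbitrary choice
    {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);

        std::vector<cv::Point2f> centers;
        if (cv::findCirclesGrid(gray, pattern_size, centers, cv::CALIB_CB_ASYMMETRIC_GRID))
        {
            image_points.push_back(centers);
            object_points.push_back(reference_points);
        }
    }
    if (image_points.empty())
        return 1;

    // Recover the camera matrix (focal lengths, principal point) and lens distortion.
    cv::Mat camera_matrix, dist_coeffs;
    std::vector<cv::Mat> rvecs, tvecs;
    double rms = cv::calibrateCamera(object_points, image_points, frame.size(),
                                     camera_matrix, dist_coeffs, rvecs, tvecs);
    (void)rms; // reprojection error in pixels - useful as a sanity check
    return 0;
}
```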
This plugin uses ArUco boards for camera pose estimation, specifically the implementation of ArUco in OpenCV contrib.
Boards are used for two purposes:
- Positioning the camera in the game world - this aligns the real and virtual worlds. The board's position in the real world corresponds to the point (0, 0, 0) in the game world. Boards used for camera positioning are set in the PlayerController's MarkerBoardDefinitions property (if you are extending the example player controller) or in AURCameraActor's BoardDefinitions if you are spawning the camera actor directly.
- Positioning independent actors - to bind an actor's pose to an AR board, add an AURTrackingComponent to the actor and set its ChildActorClass to one of the board blueprints
An ArUco board is a set of square markers, together with their positions and orientations in space. When a board is visible in the video, its pose relative to the camera can be calculated. In Augmented Unreality, we use boards for finding the pose of the camera in game world and for positioning independent actors with their own markers.
Augmented Unreality allows the user to create their own custom spatial configurations of markers in Unreal Editor. Please see the example boards in AugmentedUnreality/Content/Patterns and AugmentedUnrealityEx/Content/AugmentedUnrealityExample/Patterns.
- ArUco - markers can be arranged into any spatial configuration. Use PatternCube, PatternSquare_A, PatternSquare_B or subclass AURFiducialPatternSpatialBP.
- ChArUco boards - markers are combined with a chessboard grid. More accurate tracking but must be on a plane. Use PatternChessboard_A, PatternChessboard_B or subclass AURFiducialPatternFlatBoardBP.
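For background, the underlying OpenCV contrib calls look roughly as follows. The sketch uses a simple planar GridBoard and an assumed dictionary for brevity; the plugin's own dictionary and board layouts may differ, and this is not the plugin's code.

```cpp
#include <opencv2/aruco.hpp>
#include <opencv2/core.hpp>
#include <vector>

// Sketch of ArUco board pose estimation with OpenCV contrib (not the plugin's code).
// camera_matrix and dist_coeffs come from camera calibration; the dictionary and the
// board layout are assumptions and must match the printed markers.
void EstimateBoardPose(const cv::Mat& image,
                       const cv::Mat& camera_matrix, const cv::Mat& dist_coeffs)
{
    cv::Ptr<cv::aruco::Dictionary> dictionary =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_100);

    // A simple planar 2x2 grid of 4 cm markers with 1 cm gaps; Augmented Unreality
    // boards can instead be arbitrary spatial arrangements defined in the editor.
    cv::Ptr<cv::aruco::GridBoard> board =
        cv::aruco::GridBoard::create(2, 2, 0.04f, 0.01f, dictionary);

    std::vector<int> marker_ids;
    std::vector<std::vector<cv::Point2f>> marker_corners;
    cv::aruco::detectMarkers(image, dictionary, marker_corners, marker_ids);

    if (!marker_ids.empty())
    {
        // Pose of the board relative to the camera (rotation as a Rodrigues vector).
        cv::Vec3d rvec, tvec;
        int used_markers = cv::aruco::estimatePoseBoard(marker_corners, marker_ids, board,
                                                        camera_matrix, dist_coeffs, rvec, tvec);
        (void)used_markers; // number of markers that contributed to the pose

        // For a ChArUco board one would instead refine the detection with
        // cv::aruco::interpolateCornersCharuco(...) and then call
        // cv::aruco::estimatePoseCharucoBoard(...).
    }
}
```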
To design a new board, create a child blueprint of AURFiducialPatternSpatialBP and edit it by adding AURMarkerComponents inside it. Each AURMarkerComponent represents one square on the board.
- Location, Rotation - pose of the square in space. You can use SceneComponents to organize the board hierarchically.
- Id - identifier of the pattern shown in this square. Each square should have a different Id.
- BoardSizeCm - length of the square's side. This will automatically set the scale. When printing the boards, please ensure the squares match this size.
- MarginCm - margin inside the square, does not affect the total size.
After you create or edit the board blueprint, launch the game to generate the marker images. Then open the directory YourProject/Saved/AugmentedUnreality/Patterns/YourBoardName, print the images, and arrange them in space to match your designed configuration. The IDs of the markers in the editor need to match the numbers present in the printed images.
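If you are curious how such images can be produced outside the engine, the OpenCV contrib module can render individual markers by Id. The sketch below is a standalone example with an assumed dictionary and image size, not the plugin's generator.

```cpp
#include <opencv2/aruco.hpp>
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <string>

int main()
{
    // Assumption: the dictionary must match whatever the detector expects; the
    // plugin's own choice of dictionary is not documented here.
    cv::Ptr<cv::aruco::Dictionary> dictionary =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_100);

    // Render markers with Ids 0..3 as 600x600 px images, one file per Id.
    for (int id = 0; id < 4; ++id)
    {
        cv::Mat marker_image;
        cv::aruco::drawMarker(dictionary, id, 600, marker_image);
        cv::imwrite("marker_" + std::to_string(id) + ".png", marker_image);
    }
    return 0;
}
```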
The following problems have been solved in this plugin; if you want to learn about these topics, please see:
- Accessing Android's camera video in UE
- Including external libraries in UE4
- Multi-threading in UE4 (1) (2) (3) (4)
- Performing OpenCV camera calibration - OpenCV tutorial, adaptation for the plugin
- Drawing on dynamic textures - UE tutorial (a bit old), my adaptation
- Conversion between OpenCV's and Unreal's coordinate systems
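To illustrate the last point: OpenCV's camera frame is right-handed with X to the right, Y down and Z forward, while Unreal is left-handed with X forward, Y to the right and Z up, measured in centimeters. One consistent axis mapping, shown here only as a sketch of the idea rather than the plugin's actual transform, is:

```cpp
struct Vec3 { double X, Y, Z; };

// Sketch only: one consistent axis mapping from an OpenCV camera-space point
// (X right, Y down, Z forward, right-handed) to Unreal's convention
// (X forward, Y right, Z up, left-handed, centimeters). The plugin's actual
// transform, including its unit scaling, may differ.
Vec3 OpenCvToUnreal(const Vec3& cv_point, double cv_units_to_cm)
{
    Vec3 ue_point;
    ue_point.X =  cv_point.Z * cv_units_to_cm; // forward <- OpenCV Z
    ue_point.Y =  cv_point.X * cv_units_to_cm; // right   <- OpenCV X
    ue_point.Z = -cv_point.Y * cv_units_to_cm; // up      <- minus OpenCV Y (Y points down)
    return ue_point;
}
```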