Fusion IK

Overview

Fusion IK: Solving Inverse Kinematics using a Hybridized Deep Learning and Evolutionary Approach

Setup

  1. Install Unity.
  2. Clone or download and extract this repository.
  3. Open the root of this repository with Unity.
  4. Install Python and the required Python packages. If you are unfamiliar with Python, "pip" comes with standard Python installations, and you can run "pip install <package>" to install a package. It is recommended you create a virtual environment for your packages.
    1. PyTorch.
      1. It is recommended you visit PyTorch's get started page, which lets you select CUDA support if you have an Nvidia GPU and gives you a command to install PyTorch, TorchVision, and TorchAudio. TorchVision and TorchAudio are not needed and can be removed from the command. Check which CUDA version your Nvidia GPU supports by running "nvidia-smi" from the command line.
      2. When running the training script, a message will be output to the console if it is using CUDA. A quick way to check CUDA availability beforehand is shown in the sketch after this list.
    2. The remaining packages can be installed via "pip install <package>" in the console.
      1. pandas
      2. tqdm
      3. onnx
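
The snippet below is a minimal sketch, assuming only a working PyTorch install, for confirming that CUDA is available before you run the training script:

```python
# Minimal check that PyTorch can see a CUDA-capable GPU.
import torch

print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    # Name of the first CUDA device PyTorch will use.
    print(f"Device: {torch.cuda.get_device_name(0)}")
```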

Robot Modeling

An ABB IRB 7600 and a sample 20 degrees of freedom robot have been prepared already. To create a new robot, attach a "Robot" component to the root of the chain. Create and set up a "Robot Properties" component and assign it to the "Robot" component. Joints are set up by attaching an "Articulation Body" and a "Robot Joint" to every joint, and every joint must have limits defined. Only single kinematic chains are supported; robots with multiple arms are not. Single-axis "Revolute" joints have been tested; initial work has been done to support "Prismatic" and "Spherical" joints, but they were not used in testing. Save the robot as a prefab for use in later steps.

Training

The scenes named "Generate" in the folders "ABB IRB 7600 > Scenes" and "20 DOF > Scenes" are set up to do this. To do this with a new robot, a "Generator" component must exist in the scene and assign your previously created robot prefab. Set the maximum milliseconds the Bio IK attempts for generating will run for. Run the scene and training data will be generated. Visuals will be disabled in the scene as they are not needed. Once complete, run "train.py". Once complete, inside the "Networks" folder, copy the ONNX files for your robot's joints into your Unity project. Assign these ONNX models in order to the "Network Models" field of your "Robot Properties".

Testing

The scenes named "Test" in the folders "ABB IRB 7600 > Scenes" and "20 DOF > Scenes" are set up to do this. To do this with a new robot, a "Tester" component must exist in the scene and assign your previously created robot prefab. Run the scene and testing data will be generated. Visuals will be disabled in the scene as they are not needed. Once complete, run "evaluate.py". Once complete, you can view results in the "Results" folder.

Visualizing

The scenes named "Visualize" in the folders "ABB IRB 7600 > Scenes" and "20 DOF > Scenes" are set up to do this. To do this with a new robot, a "Visualizer" component must exist in the scene and assign your previously created robot prefab. Run the scene to have a GUI as well as keyboard and mouse controls to visualize the various inverse kinematics models.

This repository uses a version of Bio IK designed specifically for single-chain robot arms; it does not fully support the other multi-objective solutions described in the original Bio IK paper. This version of Bio IK was intended for use with this research only and is not designed for general-purpose game development. For a full version of the original Bio IK with all features, see the official Bio IK Unity asset.