
Vision and Sensing Application SDK
Application Development
Functional Specifications

1. Change history

Date What/Why

2022/11/16

Initial draft.

2023/01/30

Added post-processing debugging. Changed the directory structure. Updated the PDF build environment.

2023/05/26

Fixed the notation of tool names and parentheses.
Made fixes associated with specification changes in the post-processing code.
Because some environments do not render AsciiDoc Mermaid diagrams, changed the diagrams to rendered image references.
Added alternate text to images.

2. Terms/Abbreviations

Terms/Abbreviations Meaning

"Vision and Sensing Application"

Post-processing (processing of Output Tensor, which is the output of the AI model)

"PPL"

A module that processes the output of the AI model (Output Tensor) on edge AI devices

Wasm

WebAssembly. Binary instruction format for virtual machines

FlatBuffers

Serialization library

WAMR-IDE

An integrated development environment that supports running and debugging WebAssembly applications

PPL parameter

Parameters used in the processing of the "Vision and Sensing Application"

LLDB

Software debugger

3. Reference materials

4. Expected use case

  • Design and implement post-processing

  • Convert post-processing code into a form that can be deployed to edge AI devices

  • Debug post-processing code before deploying it to edge AI devices

5. Functional overview/Algorithm

Functional overview

  • Users can design and implement a "Vision and Sensing Application" in C or C++ (a minimal post-processing sketch follows this overview)

  • Users can serialize "Vision and Sensing Application" output with "FlatBuffers"

    • Users can generate class definition files from "FlatBuffers" schema files

  • Users can build a "Vision and Sensing Application" implemented in C or C++ to a Wasm

  • The following "Vision and Sensing Application" sample code is provided

    • Object Detection

    • Image Classification

  • Users can build a Wasm with "Vision and Sensing Application" sample code

  • Users can build a "Vision and Sensing Application" to a Wasm for debugging, and debug on SDK environment using test app

"Vision and Sensing Application" creation flow

flowchart TD;
    %% definition
    classDef object fill:#FFE699, stroke:#FFD700
    classDef external_service fill:#BFBFBF, stroke:#6b8e23, stroke-dasharray: 10 2
    style legend fill:#FFFFFF,stroke:#000000

    %% impl
    subgraph legend["Legend"]
        process(Processing/User behavior)
        object[Input/output data]:::object
        extern[External services]:::external_service
    end
Flow
flowchart TD
    %% definition
    classDef object fill:#FFE699, stroke:#FFD700
    style console fill:#BFBFBF, stroke:#6b8e23, stroke-dasharray: 10 2

    start((Start))
    id1(Define &quot;FlatBuffers&quot; schema for Vision and Sensing Application output)
    id2(Generate class definition file)
    id3(Implement Vision and Sensing Application)
    id3-1("Prepare input data for debugging (Optional)")
    id3-2("Build a Wasm for debugging (Optional)")
    id3-3("Debug a Wasm (Optional)")
    id4(Build a Wasm for release)
    subgraph console["Console for AITRIOS"]
    id5(AOT compile)
    end
    data1[&quot;FlatBuffers&quot; schema]:::object
    data2[Class definition file]:::object
    data3[Vision and Sensing Application code]:::object
    data3-1["Output Tensor, PPL parameter for debugging (Optional)"]:::object
    data3-2[".wasm for debugging (Optional)"]:::object
    data4[.wasm for release]:::object
    data5[.aot]:::object
    finish(((Finish)))

    %% impl
    start --> id1
    id1 --- data1
    data1 --> id2
    id2 --- data2
    data2 --> id3
    id3 --- data3
    data3 --> id3-1
    id3-1 --- data3-1
    data3-1 --> id3-2
    id3-2 --- data3-2
    data3-2 --> id3-3
    id3-3 --> id4
    id4 --- data4
    data4 --> id5
    id5 --- data5
    data5 --> finish
Note
Wasm files created in the SDK environment are AOT compiled in "Console for AITRIOS" and converted into a form that can be deployed to edge AI devices. (This is not possible with a Wasm built for debugging.)

Build features

Provides the following build features:

  • Builds a Wasm for release
    Generates a Wasm file (.wasm) for deployment to edge AI devices

    • Generates a Wasm file (.wasm) from "Vision and Sensing Application" code (.c or .cpp)

      • Object files (.o) are generated as intermediate files during the Wasm build process

  • Builds a Wasm for debugging
    Generates a Wasm file (.wasm) to debug code before deploying to edge AI devices

    • Generates a Wasm file (.wasm) from "Vision and Sensing Application" code (.c or .cpp)

      • Object files (.o) are generated as intermediate files during the Wasm build process

Debugging features

Debugging feature using test app

  • The following Wasm debugging features are available through the LLDB and WAMR-IDE libraries and VS Code UI:

    • Specify breakpoint

    • Step execution (Step In, Step Out, Step Over)

    • Specify watch expression

    • Check variable

    • Check call stack

    • Check logs on terminal

  • Provides a test app as a driver to invoke the processing of Wasm files

    • You can specify parameters to input into a Wasm, such as the Output Tensor and PPL parameter, when running the test app

Note
Does not support the project management feature of WAMR-IDE
Note
To achieve Wasm debugging, the following libraries are mocked:
* Data Pipeline API
* EVP SDK API
* SensCord SDK API
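
Conceptually, "mocked" means that when a Wasm runs under the test app, these APIs are replaced by stubs that hand back the prepared test input (Output Tensor, PPL parameter) instead of talking to a real device. The sketch below only illustrates that pattern with hypothetical names; the actual mock implementations of the Data Pipeline, EVP SDK, and SensCord SDK APIs are provided by the SDK.

#include <cstddef>
#include <vector>

// Hypothetical stub illustrating the mocking pattern: instead of
// receiving the Output Tensor from an edge AI device, the mock returns
// test data that was supplied when the test app was started.
// (Names here are illustrative, not the SDK's actual mock API.)
static std::vector<float> g_test_output_tensor;

const float* MockGetOutputTensor(size_t* out_size) {
    *out_size = g_test_output_tensor.size();
    return g_test_output_tensor.data();
}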

6. User interface specifications

How to start each function

  1. Launch the SDK environment and preview the README.md in the top directory

  2. Jump to the README.md in the tutorials directory from the hyperlink in the SDK environment top directory

  3. Jump to the 4_prepare_application directory from the hyperlink in the README.md in the tutorials directory

  4. Jump to the 1_develop directory from the hyperlink in the README.md in the 4_prepare_application directory

  5. Jump to each feature from each file in the 1_develop directory

Design and implement a "Vision and Sensing Application"

  1. Follow the procedures in the README.md to create the "FlatBuffers" schema file for "Vision and Sensing Application" output

  2. Follow the procedures in the README.md to open a terminal from the VS Code UI and run the command to generate a header file of class definitions from a schema file

    • The class definition header file is generated at the same level as the schema file

  3. Implement a "Vision and Sensing Application"

    • Implement in C or C++

    • Implement source files either by creating new ones or by modifying the provided sample code for the "Vision and Sensing Application"

    • Implement using the class definition header file generated in step 2 (see the sketch after the note below)

    • Implement "Vision and Sensing Application" interface using the "Vision and Sensing Application"'s sample code

    • You can optionally install the OSS and external libraries needed to design your "Vision and Sensing Application" and incorporate them into your "Vision and Sensing Application"

Note
This SDK does not guarantee the installation or use of OSS or external libraries, which users may use at their discretion.
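
For reference, the sketch below shows how the class definition header generated in step 2 might be used to serialize post-processing results with the "FlatBuffers" C++ API. The schema-derived names (objectdetection_generated.h, the SmartCamera namespace, CreateObjectDetectionTop) are assumptions for illustration; the names flatc actually generates depend on your schema.

#include <cstdint>
#include <vector>
#include "flatbuffers/flatbuffers.h"
// Header generated from your schema in step 2 (file name is an assumption).
#include "objectdetection_generated.h"

// Serialize hypothetical detection scores into a FlatBuffers buffer.
// CreateObjectDetectionTop stands in for whatever Create* function
// flatc generates for your root table.
std::vector<uint8_t> SerializeResults(flatbuffers::FlatBufferBuilder& builder,
                                      const std::vector<float>& scores) {
    auto scores_offset = builder.CreateVector(scores);
    auto root = SmartCamera::CreateObjectDetectionTop(builder, scores_offset);
    builder.Finish(root);
    // The finished buffer is the serialized "Vision and Sensing Application" output.
    return {builder.GetBufferPointer(),
            builder.GetBufferPointer() + builder.GetSize()};
}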

Generate a Wasm file for debugging from "Vision and Sensing Application" code

Note
Follow this procedure only when using the debugging feature.
  1. Follow the procedures in the README.md to modify the Makefile for the file location and filename of the "Vision and Sensing Application" code

  2. Follow the procedures in the README.md to open a terminal from the VS Code UI and run the command to build a Wasm for debugging

    • A Docker image for the debugging environment, including the Wasm build for debugging, is created on the Dev Container. A debug directory is created in the directory on the Dev Container described in the README.md, and the .wasm file is stored in that directory

Edit input parameters to debug a Wasm file

Note
Follow this procedure only when using the debugging feature.
  1. Follow the procedures in the README.md to modify the input parameters, such as the Output Tensor and PPL parameter, for the test

Debug a Wasm file

Note
Follow this procedure only when using the debugging feature.
  1. Follow the procedures in the README.md to debug: check the logs in the terminal of the VS Code UI, or open the Wasm source code in the VS Code UI, specify breakpoints, and check the call stack, variables, and so on

Generate a Wasm file from "Vision and Sensing Application" code

  1. Follow the procedures in the README.md to modify the Makefile for the file location and filename of the "Vision and Sensing Application" code

  2. Follow the procedures in the README.md to open a terminal from the VS Code UI and run the command to build a Wasm

    • A Docker image for the environment to build a Wasm is created on the Dev Container. A release directory is created in the directory on the Dev Container described in the README.md, and the .wasm file is stored in that directory

Remove build generation files

  1. Follow the procedures in the README.md to open a terminal from the VS Code UI and run the command to remove build generation files

Remove build generation files and the Docker image for environment to build a Wasm

  1. Follow the procedures in the README.md to open a terminal from the VS Code UI, and run the command to remove build generation files and the Docker image for environment to build a Wasm

When you run the command to build a Wasm, or a command to remove build generation files or the Docker image for the build environment, with an option other than those listed in the README.md, the command prints its usage information to the terminal and interrupts processing.

7. "Vision and Sensing Application" interface

When you design a "Vision and Sensing Application", you need to implement it using a set of functions that interface with the "Vision and Sensing Application". Sample code includes examples of their use. See Data Pipeline API Specifications, EVP SDK API Specifications, SensCord SDK API Specifications in the separate document for details. The relationship between each API and the SDK is described in README.md.

8. Target performances/Impact on performances

  • Usability

    • When the SDK environment is built, users can generate class definition files for "FlatBuffers", build a Wasm, and debug a Wasm without any additional installation steps

    • UI response time of 1.2 seconds or less

    • If processing takes more than 5 seconds, the SDK indicates that processing is in progress with successive updates

9. Assumption/Restriction

  • Supports only "Vision and Sensing Application" code implemented in C or C++ for Wasm builds

10. Remarks

  • Check the following locations for the version information of the tools that come with the SDK for developing a "Vision and Sensing Application"

    • "FlatBuffers": Described in the README.md in the 1_develop directory

    • Other tools: Described in the Dockerfile in the 1_develop/sdk directory

11. Unconfirmed items

None