TexasInstruments-Sandbox/edgeai-osrt-libs-build


Open-Source Runtime Library Build for Target Ubuntu/Debian Container

This build system covers building ONNX-RT, TFLite-RT, NEO-AI-DLR, and TIDL runtime modules from source for Ubuntu/Debian systems running on TI's EdgeAI processors (TDA4VM, AM62A, AM67A, AM68A, and AM69A). Tested for aarch64 Ubuntu 22.04 and aarch64 Debian 12.5.

Supported use cases include:

  • Case 1: Compiling with the native GCC in an arm64v8 Ubuntu Docker container directly on an aarch64 build machine
  • Case 2: Compiling with the native GCC in an arm64v8 Ubuntu Docker container on an x86_64 machine using QEMU
Figure 1. Building in aarch64 Ubuntu/Debian Container

Prerequisites

Docker-pull the base Docker images

Pull the required base Docker images. Assuming you are outside a proxy network:

docker pull arm64v8/ubuntu:22.04
docker pull arm64v8/ubuntu:20.04
docker pull arm64v8/debian:12.5
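Optionally, confirm that a pulled image really is an arm64 image. This is a sketch that assumes Docker is on PATH and falls back to a message when it is not:

```shell
# Optional sanity check: the arm64v8/* images should report the arm64 architecture.
# Degrades to a message when Docker is unavailable or the image is not pulled yet.
if command -v docker >/dev/null 2>&1; then
    docker image inspect arm64v8/ubuntu:22.04 \
        --format '{{.Architecture}}' 2>/dev/null || echo "image not pulled yet"
else
    echo "docker not found on this machine"
fi
```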

edgeai-ti-proxy (required only when building inside the TI proxy network)

Set up edgeai-ti-proxy (repo link)

Before running docker-build or docker-run, make sure to source edgeai-ti-proxy/setup_proxy.sh, which defines the USE_PROXY environment variable and all the proxy settings for the TI network.
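For example, a shell session could guard the sourcing step as below. The checkout path is an assumption; only the USE_PROXY variable is named by the build system:

```shell
# Source the TI proxy settings if the helper repo is checked out under $HOME.
# The path below is an assumption; adjust to where you cloned edgeai-ti-proxy.
PROXY_SETUP="${HOME}/edgeai-ti-proxy/setup_proxy.sh"
if [ -f "$PROXY_SETUP" ]; then
    . "$PROXY_SETUP"
fi
# docker_build.sh/docker_run.sh read USE_PROXY; it stays unset outside the TI network.
echo "USE_PROXY=${USE_PROXY:-<unset>}"
```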

(Only for Case 2) Initialize QEMU to Emulate ARM Architecture on x86 Ubuntu PC

If QEMU is not installed on the x86_64 Ubuntu build PC:

sudo apt-get install -y qemu-user-static
# to initialize QEMU
./qemu_init.sh
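qemu_init.sh registers the aarch64 interpreter with the kernel's binfmt_misc handler; you can check for that registration directly. A sketch, assuming the standard binfmt_misc mount point:

```shell
# Check whether the aarch64 interpreter is registered with binfmt_misc.
# The first line of the entry reports "enabled" or "disabled".
if [ -e /proc/sys/fs/binfmt_misc/qemu-aarch64 ]; then
    head -1 /proc/sys/fs/binfmt_misc/qemu-aarch64
else
    echo "qemu-aarch64 binfmt entry not found; run qemu_init.sh first"
fi
```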

Docker Environment for Building

Docker-build

BASE_IMAGE=ubuntu:22.04 ./docker_build.sh
BASE_IMAGE=ubuntu:20.04 ./docker_build.sh
BASE_IMAGE=debian:12.5  ./docker_build.sh

Docker-run

BASE_IMAGE=ubuntu:22.04 ./docker_run.sh
BASE_IMAGE=ubuntu:20.04 ./docker_run.sh
BASE_IMAGE=debian:12.5  ./docker_run.sh

Build ONNX-RT from Source

All the commands below should be run in the Docker container.

Prepare the source and update the build config

Update PROTOBUF_VER in onnxrt_prepare.sh, e.g., by checking "git log" in onnxruntime/cmake/external/protobuf. It is currently set to PROTOBUF_VER=3.20.2.
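For reference, the pinned protobuf commit can be read from the submodule once the sources have been fetched. A sketch that falls back to a message when the checkout is not present yet:

```shell
# Inspect the protobuf commit pinned by onnxruntime (submodule path from the text above).
PB_DIR="onnxruntime/cmake/external/protobuf"
if [ -e "$PB_DIR/.git" ]; then
    git -C "$PB_DIR" log -1 --oneline
else
    echo "$PB_DIR not checked out yet; run ./onnxrt_prepare.sh first"
fi
```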

Run the following in the Docker container to download the source from the git repo, apply the patches, and download the pre-built protobuf:

./onnxrt_prepare.sh

Build

Update PROTOBUF_VER to match the setting in onnxrt_prepare.sh. The following should be run in the Docker container with QEMU.

(Optional) To build protobuf from source, run the following inside the container.

./onnxrt_protobuf_build.sh

Update "--path_to_protoc_exe" in onnxrt_build.sh accordingly. To build ONNX-RT, run the following inside the container:

./onnxrt_build.sh

Outputs:

  • Shared lib: $WORK_DIR/workarea/onnxruntime/build/Linux/Release/libonnxruntime.so.1.14.0+${TIDL_VER}
  • Wheel file: $WORK_DIR/workarea/onnxruntime/build/Linux/Release/dist/onnxruntime_tidl-1.14.0+${TIDL_VER}-cp310-cp310-linux_aarch64.whl
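After the build, a quick existence check on the two artifacts can catch a silent failure early. The paths are copied from the list above; WORK_DIR and TIDL_VER come from the build environment:

```shell
# Verify the ONNX-RT build artifacts exist before packaging.
REL="${WORK_DIR:-.}/workarea/onnxruntime/build/Linux/Release"
for f in "$REL"/libonnxruntime.so.1.14.0* "$REL"/dist/onnxruntime_tidl-*.whl; do
    if [ -e "$f" ]; then echo "found:   $f"; else echo "missing: $f"; fi
done
```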

Package

./onnxrt_package.sh

Output tarball: $WORK_DIR/workarea/onnx-1.14.0-ubuntu22.04_aarch64.tar.gz

Build TFLite-RT from Source

All the commands below should be run in the Docker container.

Prepare the source and update the build config

./tflite_prepare.sh

Build

./tflite_build.sh

To build the Python wheel package:

./tflite_whl_build.sh

Outputs:

  • Static lib: $WORK_DIR/workarea/tensorflow/tflite_build/libtensorflow-lite.a
  • Wheel file: $WORK_DIR/workarea/tensorflow/tensorflow/lite/tools/pip_package/gen/tflite_pip/python3/dist/tflite_runtime-2.12.0-cp310-cp310-linux_aarch64.whl

Package

./tflite_package.sh

Output tarball: $WORK_DIR/workarea/tflite-2.12-ubuntu22.04_aarch64.tar.gz

Build Neo-AI-DLR from Source

All the commands below should be run in the Docker container.

Prepare the source and update the build config

./dlr_prepare.sh

Build

./dlr_build.sh

Package

./dlr_package.sh

Output wheel package: $WORK_DIR/workarea/neo-ai-dlr/python/dist/dlr-1.13.0-py3-none-any.whl

Build TIDL Modules

The TIDL runtime modules include the TIDL-RT library, the TFLite-RT delegate library, and the ONNX-RT execution provider (EP) library.

All the commands below should be run in the Docker container.

Prepare the source and update the build config

./tidl_prepare.sh

Build

Requirement: the vision-apps Debian packages, which can be built separately with "vision-apps-build". The following downloads the vision-apps Debian packages into ${HOME}/ubuntu22.04-deps or ${HOME}/debian12.5-deps:

./vision_apps_libs_download.sh
./tidl_build.sh
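The deps directory name follows the base image tag (ubuntu:22.04 maps to ubuntu22.04-deps, debian:12.5 to debian12.5-deps). A sketch of that mapping, with the directory scheme taken from the text above:

```shell
# Map a BASE_IMAGE tag to the vision-apps deps directory used by the build.
BASE_IMAGE="${BASE_IMAGE:-ubuntu:22.04}"
DEPS_DIR="${HOME}/$(echo "$BASE_IMAGE" | tr -d ':')-deps"
echo "$DEPS_DIR"    # e.g. ${HOME}/ubuntu22.04-deps for BASE_IMAGE=ubuntu:22.04
```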

Outputs:

  • TIDL-RT library: $WORK_DIR/workarea/arm-tidl/rt/out/${SOC}/${MPU}/LINUX/release/libvx_tidl_rt.so.1.0
  • TFLite-RT delegate library: $WORK_DIR/workarea/arm-tidl/tfl_delegate/out/${SOC}/${MPU}/LINUX/release/libtidl_tfl_delegate.so.1.0
  • ONNX-RT EP library: $WORK_DIR/workarea/arm-tidl/onnxrt_ep/out/${SOC}/${MPU}/LINUX/release/libtidl_onnxrt_EP.so.1.0
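The build emits versioned shared objects (e.g. libvx_tidl_rt.so.1.0); consumers typically load them through an unversioned symlink. A minimal sketch in a scratch directory, using a library name from the list above; the symlink convention is an assumption about your deployment, not something the build scripts do for you:

```shell
# Demonstrate the unversioned -> versioned symlink convention in a scratch dir.
d="$(mktemp -d)"
touch "$d/libvx_tidl_rt.so.1.0"
ln -sf libvx_tidl_rt.so.1.0 "$d/libvx_tidl_rt.so"
readlink "$d/libvx_tidl_rt.so"    # -> libvx_tidl_rt.so.1.0
rm -rf "$d"
```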
