This build system covers building the ONNX-RT, TFLite-RT, NEO-AI-DLR, and TIDL runtime modules from source for Ubuntu/Debian systems running on TI's EdgeAI processors (TDA4VM, AM62A, AM67A, AM68A, and AM69A). Tested on aarch64 Ubuntu 22.04 and aarch64 Debian 12.5.
Supported use cases include:
- Case 1: Compiling with the native GCC in an arm64v8 Ubuntu Docker container directly on an aarch64 build machine
- Case 2: Compiling with the native GCC in an arm64v8 Ubuntu Docker container on an x86_64 machine using QEMU (a quick sanity check is sketched below)
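For Case 2, a quick way to confirm that the x86_64 host can transparently run arm64 containers (a minimal sanity check, assuming QEMU and binfmt are already set up as described below):
# should print "aarch64" when QEMU emulation is working
docker run --rm arm64v8/ubuntu:22.04 uname -m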
Pull the baseline Docker image needed. Assuming you are outside of a proxy network:
docker pull arm64v8/ubuntu:22.04
docker pull arm64v8/ubuntu:20.04
docker pull arm64v8/debian:12.5
Set up edgeai-ti-proxy (repo link). Before docker-build or docker-run, make sure to source edgeai-ti-proxy/setup_proxy.sh, which defines the USE_PROXY env variable and all the proxy settings for the TI network.
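For example (a minimal sketch; the path assumes edgeai-ti-proxy is cloned next to this repo):
# source the proxy setup script
source ./edgeai-ti-proxy/setup_proxy.sh
# verify that USE_PROXY and the proxy variables are set
echo "USE_PROXY=${USE_PROXY}"
env | grep -i proxy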
If QEMU is not installed on the Ubuntu build PC:
sudo apt-get install -y qemu-user-static
# to initialize QEMU
./qemu_init.sh
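To verify that the QEMU binfmt handler for aarch64 is registered (a hedged check; qemu_init.sh is assumed to register qemu-aarch64 via binfmt_misc):
# expect an entry named qemu-aarch64 reported as "enabled"
ls /proc/sys/fs/binfmt_misc/ | grep aarch64
head -1 /proc/sys/fs/binfmt_misc/qemu-aarch64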
BASE_IMAGE=ubuntu:22.04 ./docker_build.sh
BASE_IMAGE=ubuntu:20.04 ./docker_build.sh
BASE_IMAGE=debian:12.5 ./docker_build.sh
BASE_IMAGE=ubuntu:22.04 ./docker_run.sh
BASE_IMAGE=ubuntu:20.04 ./docker_run.sh
BASE_IMAGE=debian:12.5 ./docker_run.sh
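BASE_IMAGE selects the base Docker image. Inside docker_build.sh it is presumably consumed along these lines (a sketch, not the actual script; the image tag edgeai-build and the --build-arg wiring are assumptions):
# default to Ubuntu 22.04 when BASE_IMAGE is unset (hypothetical)
BASE_IMAGE="${BASE_IMAGE:-ubuntu:22.04}"
docker build \
    --build-arg BASE_IMAGE=arm64v8/${BASE_IMAGE} \
    ${USE_PROXY:+--build-arg USE_PROXY=${USE_PROXY}} \
    -t edgeai-build:${BASE_IMAGE} .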
All the commands below should be run in the Docker container.
Update PROTOBUF_VER in onnxrt_prepare.sh by, e.g., checking "git log" at onnxruntime/cmake/external/protobuf. Currently it is set to PROTOBUF_VER=3.20.2.
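For example, once onnxruntime has been cloned (e.g., by onnxrt_prepare.sh below), the pinned protobuf commit can be inspected like this (a minimal sketch):
cd onnxruntime/cmake/external/protobuf
# the latest commit/tag typically names the protobuf release, e.g. v3.20.2
git log -1 --oneline
git describe --tags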
Run the following in the Docker container to download the source from the git repo, apply patches, and download the pre-built protobuf:
./onnxrt_prepare.sh
Update PROTOBUF_VER to match the setting in onnxrt_prepare.sh. The following should be run in the Docker container with QEMU.
(Optional) To build protobuf from source, run the following inside the container:
./onnxrt_protobuf_build.sh
Update "--path_to_protoc_exe
" in onnxrt_build.sh
accordingly. To build ONNX-RT, run the following inside the container,
./onnxrt_build.sh
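For reference, --path_to_protoc_exe is a standard ONNX Runtime build option; inside onnxrt_build.sh the underlying invocation presumably resembles the following (a sketch; the flags and the protoc path are assumptions, not the actual script contents):
# hypothetical protoc location from the pre-built/previously built protobuf
./build.sh --config Release --build_shared_lib --build_wheel --parallel \
    --path_to_protoc_exe ${WORK_DIR}/protobuf/bin/protoc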
Outputs:
- Shared lib: $WORK_DIR/workarea/onnxruntime/build/Linux/Release/libonnxruntime.so.1.14.0+${TIDL_VER}
- Wheel file: $WORK_DIR/workarea/onnxruntime/build/Linux/Release/dist/onnxruntime_tidl-1.14.0+${TIDL_VER}-cp310-cp310-linux_aarch64.whl
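To sanity-check the wheel on the target (hedged; TIDLExecutionProvider is how the TIDL EP is commonly exposed, but confirm against your build):
pip3 install $WORK_DIR/workarea/onnxruntime/build/Linux/Release/dist/onnxruntime_tidl-1.14.0+${TIDL_VER}-cp310-cp310-linux_aarch64.whl
# expect the TIDL EP to appear among the available providers
python3 -c "import onnxruntime as ort; print(ort.__version__, ort.get_available_providers())"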
To package the ONNX-RT artifacts into a tarball:
./onnxrt_package.sh
Output tarball: $WORK_DIR/workarea/onnx-1.14.0-ubuntu22.04_aarch64.tar.gz
All the commands below should be run in the Docker container.
./tflite_prepare.sh
./tflite_build.sh
To build the Python wheel package:
./tflite_whl_build.sh
Outputs:
- Static lib: $WORK_DIR/workarea/tensorflow/tflite_build/libtensorflow-lite.a
- Wheel file: $WORK_DIR/workarea/tensorflow/tensorflow/lite/tools/pip_package/gen/tflite_pip/python3/dist/tflite_runtime-2.12.0-cp310-cp310-linux_aarch64.whl
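A quick import test for the wheel (a minimal sketch):
pip3 install $WORK_DIR/workarea/tensorflow/tensorflow/lite/tools/pip_package/gen/tflite_pip/python3/dist/tflite_runtime-2.12.0-cp310-cp310-linux_aarch64.whl
# confirm the interpreter class is importable
python3 -c "import tflite_runtime.interpreter as tflite; print(tflite.Interpreter)"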
To package the TFLite-RT artifacts into a tarball:
./tflite_package.sh
Output tarball: $WORK_DIR/workarea/tflite-2.12-ubuntu22.04_aarch64.tar.gz
All the commands below should be run in the Docker container.
./dlr_prepare.sh
./dlr_build.sh
./dlr_package.sh
Output wheel package: $WORK_DIR/workarea/neo-ai-dlr/python/dist/dlr-1.13.0-py3-none-any.whl
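A quick install-and-import check for the DLR wheel (a minimal sketch):
pip3 install $WORK_DIR/workarea/neo-ai-dlr/python/dist/dlr-1.13.0-py3-none-any.whl
# print the package version (assuming dlr exposes __version__)
python3 -c "import dlr; print(dlr.__version__)"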
The TIDL runtime modules include the TIDL-RT library, the TFLite-RT delegate library, and the ONNX-RT execution provider (EP) library.
All the commands below should be run in the Docker container.
./tidl_prepare.sh
Requirement: the vision-apps Debian packages are required; they can be built separately with "vision-apps-build". The following downloads the vision-apps Debian packages under ${HOME}/ubuntu22.04-deps or ${HOME}/debian12.5-deps:
./vision_apps_libs_download.sh
./tidl_build.sh
Outputs:
- TIDL-RT library: $WORK_DIR/workarea/arm-tidl/rt/out/${SOC}/${MPU}/LINUX/release/libvx_tidl_rt.so.1.0
- TFLite-RT delegate library: $WORK_DIR/workarea/arm-tidl/tfl_delegate/out/${SOC}/${MPU}/LINUX/release/libtidl_tfl_delegate.so.1.0
- ONNX-RT EP library: $WORK_DIR/workarea/arm-tidl/onnxrt_ep/out/${SOC}/${MPU}/LINUX/release/libtidl_onnxrt_EP.so.1.0
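On the target filesystem these versioned libraries are typically installed with unversioned symlinks so that linker/dlopen lookups resolve (a sketch; the /usr/lib destination is an assumption, adjust to your rootfs layout):
# install the TIDL-RT library and create an unversioned symlink (assumed layout)
sudo cp $WORK_DIR/workarea/arm-tidl/rt/out/${SOC}/${MPU}/LINUX/release/libvx_tidl_rt.so.1.0 /usr/lib/
sudo ln -sf /usr/lib/libvx_tidl_rt.so.1.0 /usr/lib/libvx_tidl_rt.so
sudo ldconfig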