Jetson Nano 2 GB TensorRT Python Bindings
October 17, 2025, 3:33am 1
Hello,
I am currently trying to build TensorRT Python bindings on the Jetson Nano 2 GB. I have been following the instructions in this guide: https://elinux.org/Jetson/L4T/TRT_Customized_Example#TensorRT_Python_Bindings.
However, I am using Python 3.8.18 and my TensorRT binaries are version 8.2.1-1+cuda10.2, so I adjusted the commands accordingly. The commands I run before installing the wheel (the necessary dependencies are already installed) are:
# Fetch CPython 3.8 headers (close enough to the target 3.8.18)
wget https://www.python.org/ftp/python/3.8.1/Python-3.8.1.tgz
tar -xzf Python-3.8.1.tgz
mkdir -p python3.8/include
# Extract pyconfig.h from the Debian dev package (already downloaded)
ar x libpython3.8-dev_3.8.0-3~18.04_arm64.deb
tar -xvf data.tar.xz
cp ./usr/include/aarch64-linux-gnu/python3.8/pyconfig.h python3.8/include/
cp -r Python-3.8.1/Include/* python3.8/include/
git clone https://github.com/pybind/pybind11.git
# Clone TensorRT OSS, then switch to release/8.2 to match the installed 8.2.x binaries
git clone -b release/8.0 https://github.com/NVIDIA/TensorRT.git
cd TensorRT
git fetch origin
git checkout release/8.2
git submodule update --init --recursive
# Clear leftovers from the earlier 8.0 build
git clean -xdf
git submodule foreach --recursive git clean -xdf
cd ~/TensorRT/python
export TRT_OSSPATH=${PWD}/..    # root of the TensorRT OSS checkout
export EXT_PATH=${PWD}/../..    # directory holding python3.8/ and pybind11/
export TARGET=aarch64
export PYTHON_MINOR_VERSION=8
export CMAKE_EXTRA_FLAGS="-DCMAKE_SYSTEM_PROCESSOR=aarch64"
./build.sh
However, this produces an x86_64 wheel (tensorrt-8.2.3.0-cp38-none-linux_x86_64.whl) rather than an ARM64 one, even though I set the target to aarch64.
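One way to confirm what the build actually produced is to inspect the compiled extension inside the wheel rather than trusting the platform tag in the filename; the output directory and module path below are assumptions based on a typical TensorRT OSS build, so adjust them to match the local tree:

# Unpack the wheel and check the ELF architecture of the extension (paths assumed):
unzip -o ~/TensorRT/python/build/dist/tensorrt-8.2.3.0-cp38-none-linux_x86_64.whl -d /tmp/trt_wheel
file /tmp/trt_wheel/tensorrt/tensorrt.so
# Expected on a correct build: "ELF 64-bit LSB shared object, ARM aarch64"
# A genuinely wrong build would instead report "x86-64"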
I originally built the Python bindings for release 8.0 by accident using these same commands, and that produced an ARM64 wheel (tensorrt-8.0.1.6-cp38-none-linux_aarch64.whl), so I am not sure why release 8.2 behaves differently.
Is there a solution to this? If not, is there a way to make 8.0.1.6 work, even though it does not match the version of my binaries?
Thank you very much.
Has anyone encountered this issue before?
Are there any significant changes between TensorRT 8.0 and 8.2 that could have caused this? Also, why is TensorRT 8.1 not publicly available on GitHub?
AastaLLL November 20, 2025, 8:36am 6
Hi,
Sorry for the late update.
Did you build the bindings on the Jetson or on a desktop?
Please note that the commands are expected to be run directly on the Jetson device.
Thanks
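If there is any doubt about the build host, a quick sanity check on the device itself (both commands are standard on L4T systems):

uname -m                     # should print aarch64 on the Nano
cat /etc/nv_tegra_release    # present only on L4T/Jetson systems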
I built the bindings on the Jetson directly, not on an external desktop or computer. Do you have any other suggestions for resolving the platform mismatch between the Jetson and the generated wheel? Thanks.
AastaLLL December 4, 2025, 7:27am 8
Hi,
Could you try changing the line below to aarch64 to see if that works?
Thanks.
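If it is unclear which line sets the architecture, searching build.sh for the hard-coded default will surface it; the replacement shown in the comment is hypothetical and should be verified against the actual script:

# Find where python/build.sh hard-codes its default architecture:
grep -n "x86_64" ~/TensorRT/python/build.sh
# Hypothetical example of the kind of default to change:
#   TARGET=${TARGET_ARCHITECTURE:-x86_64}   ->   TARGET=${TARGET_ARCHITECTURE:-aarch64}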
Hello,
Before I do this, I noticed that my wheel builds as TensorRT version 8.2.3.0, which does not match the version of my binaries. How do I build a wheel for 8.2.1.9? And does the fourth number have to match as well, or just the first three?
Thank you.
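If a wheel matching the installed 8.2.1 binaries is wanted, one option is to pin the OSS checkout to the corresponding release tag instead of the branch head; the tag name below is an assumption, so list the available tags first:

cd ~/TensorRT
git fetch --tags origin
git tag -l '8.2.*'                      # confirm which 8.2.x tags exist
git checkout 8.2.1                      # assumed tag for the 8.2.1 release
git submodule update --init --recursive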
AastaLLL December 8, 2025, 8:54am 10
Hi,
Did you get TensorRT from JetPack, or did you install it from another source?
Thanks.
I got TensorRT preinstalled through JetPack, flashed with SDK Manager when I first set up the Nano.
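For reference, the JetPack-installed version can be read back from the Debian packages on the device:

dpkg -l | grep -E "nvinfer|tensorrt"    # should list packages at 8.2.1-1+cuda10.2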
AastaLLL December 10, 2025, 7:36am 12
Hi,
The version is captured by the code below.
Could you run it manually to see whether the version is what you expect?
Thanks.
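Assuming the wheel version is derived from the NV_TENSORRT_* macros in NvInferVersion.h (an assumption worth verifying against build.sh), the check can be run by hand and compared against the installed headers:

# Version macros in the OSS checkout (what the wheel is built as):
grep -E "#define NV_TENSORRT_(MAJOR|MINOR|PATCH|BUILD)" ~/TensorRT/include/NvInferVersion.h
# Version macros of the installed JetPack binaries, for comparison:
grep -E "#define NV_TENSORRT_(MAJOR|MINOR|PATCH|BUILD)" /usr/include/aarch64-linux-gnu/NvInferVersion.h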