Installing mmdeploy on Jetson
Here are the steps to install mmdeploy on a Jetson device:
1. Create an mmdeploy conda environment with Python 3.6:
```
conda create -n mmdeploy python=3.6 -y
conda activate mmdeploy
```
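As an optional sanity check (this assumes an aarch64 conda distribution such as Miniforge or Archiconda is already installed on the Jetson), confirm the interpreter version and architecture inside the new environment:
```
python -V                                               # should report Python 3.6.x
python -c "import platform; print(platform.machine())"  # should print aarch64 on Jetson
```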
2. Clone the mmdeploy code and install its Python dependencies:
```
git clone --recursive https://github.com/open-mmlab/mmdeploy.git
cd mmdeploy
pip install -r requirements.txt
```
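Optionally, let pip verify that the pinned dependencies resolved without conflicts; the package names in the grep below are only examples of what mmdeploy's requirements typically pull in:
```
pip check                          # reports broken or conflicting requirements, if any
pip list | grep -i -E "onnx|mmcv"  # spot-check a couple of expected packages
```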
3. Build and install mmdeploy:
```
python setup.py build
python setup.py install
```
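A quick optional check is to import the package from the same environment (assuming the build above completed without errors):
```
python -c "import mmdeploy; print(mmdeploy.__version__)"  # prints the installed mmdeploy version
```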
4. Make sure TensorRT is available. On Jetson, TensorRT is installed as part of JetPack rather than from the x86_64 .deb packages on the TensorRT download page (those will not install on aarch64), so simply confirm the libraries are present:
```
dpkg -l | grep -i nvinfer
ls /usr/lib/aarch64-linux-gnu/libnvinfer*
```
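If you plan to call TensorRT from Python inside the conda environment, note that JetPack installs the TensorRT Python bindings under the system interpreter (typically `/usr/lib/python3.6/dist-packages`), so a conda env may not see them by default. A minimal sketch of a workaround, assuming the usual JetPack location (adjust the path if your image differs):
```
# Expose the system-level TensorRT bindings to the conda env's Python
export PYTHONPATH=/usr/lib/python3.6/dist-packages:$PYTHONPATH
python -c "import tensorrt; print(tensorrt.__version__)"  # should print the JetPack TensorRT version
```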
5. Build the mmdeploy TensorRT plugin, pointing the library paths at the TensorRT that ships with your JetPack version:
```
cd mmdeploy
mkdir build
cd build
cmake .. -DTRT_LIB=/usr/lib/aarch64-linux-gnu/libnvinfer.so -DTRT_BIN=/usr/lib/aarch64-linux-gnu/libnvinfer_plugin.so -DPYTHON_INCLUDE_DIR=$(python -c "from distutils.sysconfig import get_python_inc; print(get_python_inc())") -DPYTHON_LIBRARY=$(python -c "import distutils.sysconfig as sysconfig; print(sysconfig.get_config_var('LIBDIR'))")/libpython3.6m.so
make -j$(nproc)
```
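Before moving on, it is worth confirming that the plugin library was actually produced and that it resolves its TensorRT symbols against the system libraries (the location below is an assumption; depending on the mmdeploy version the .so may land in `build/lib` instead of `build`):
```
find . -name "libmmdeploy_tensorrt_ops.so"                                    # locate the built plugin
ldd "$(find . -name 'libmmdeploy_tensorrt_ops.so' | head -n 1)" | grep nvinfer  # check it links against libnvinfer
```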
6. Start tritonserver with the plugin library preloaded (adjust the paths to match where you built mmdeploy and where your model repository lives):
```
LD_PRELOAD=/root/mmdeploy/build/libmmdeploy_tensorrt_ops.so ./tritonserver --model-repository=/root/models
```
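Once the server is running, you can check readiness from another shell using Triton's standard HTTP health endpoint (the default HTTP port is 8000; adjust if you changed it):
```
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8000/v2/health/ready  # expect 200 when ready
```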