# CANN/VGGT Ascend Inference Adaptation
VGGT inference on Ascend Atlas A2

> [Free download link] cann-recipes-spatial-intelligence: this project provides CANN-based optimization samples for typical models and acceleration algorithms in spatial-intelligence scenarios. Project page: https://gitcode.com/cann/cann-recipes-spatial-intelligence

## CANN Environment Preparation

The inference of VGGT depends on the CANN development kit package (cann-toolkit) and the CANN binary operator package (cann-kernels). The supported CANN software version is CANN 8.5.0.

Download the `Ascend-cann-toolkit_${version}_linux-${arch}.run` and `Ascend-cann-${chip_type}-ops_linux-${arch}.run` packages from the CANN Software Package Download Page and install them by referring to the CANN Installation Guide.

The required versions of torch and torch_npu are 2.7.1 and 2.7.1.post2. Download the binary packages from Ascend Extension for PyTorch and install torch and torch_npu:

```shell
conda create -n vggt python=3.11.13
conda activate vggt
pip3 install torch==2.7.1
pip3 install torch-npu==2.7.1.post2
```

## VGGT Model Preparation

Download the open-source VGGT network code from the GitHub repo:

```shell
git clone https://github.com/facebookresearch/vggt.git
```

Download the code of this repository:

```shell
git clone https://gitcode.com/chenhongyang/cann-recipes-spatial-intelligence.git
```

Copy the code from the VGGT repository into this project directory in non-overwrite mode:

```shell
cp -r vggt/examples cann-recipes-spatial-intelligence/models/vggt/
cp -rn vggt/vggt/dependency cann-recipes-spatial-intelligence/models/vggt/vggt/dependency
cp -rn vggt/vggt/heads cann-recipes-spatial-intelligence/models/vggt/vggt/
cp -rn vggt/vggt/layers cann-recipes-spatial-intelligence/models/vggt/vggt/
cp -rn vggt/vggt/utils cann-recipes-spatial-intelligence/models/vggt/vggt/
```

Install Python dependencies:

```shell
pip3 install -r requirements.txt
```

Download the VGGT model weights and copy them to the local path `ckpt`. The resulting directory layout:

```
VGGT
├── examples
├── demo_infer.py
├── eval
├── ckpt
│   └── model.pt
├── quant
└── vggt
    ├── dependency
    ├── heads
    ├── layers
    ├── models
    ├── utils
    └── sp
```

## Performance Measurement

This repo provides scripts to test the functionality and the performance of the VGGT model on
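After the copy steps, it can be useful to sanity-check that the expected layout is in place before running inference. A minimal sketch (the helper `missing_paths` is hypothetical, not part of the repo):

```python
from pathlib import Path

# Relative paths expected under the project root after the copy steps above.
EXPECTED = [
    "examples",
    "vggt/dependency",
    "vggt/heads",
    "vggt/layers",
    "vggt/utils",
]

def missing_paths(root: str, expected=EXPECTED) -> list[str]:
    """Return the expected paths that are absent under `root`."""
    base = Path(root)
    return [p for p in expected if not (base / p).exists()]
```

Calling `missing_paths("cann-recipes-spatial-intelligence/models/vggt")` should return an empty list once the copy steps have completed.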
NPU.

Before executing the test scripts, refer to the Ascend Community CANN installation tutorial to set the environment variables:

```shell
source /usr/local/Ascend/ascend-toolkit/set_env.sh
```

Run the inference script; the output reports the average inference time of the VGGT bf16 model:

```shell
python demo_infer.py --ckpt ckpt/model.pt
```

Run the script below; the output reports the average inference time of the VGGT bf16_sp model:

```shell
bash infer_test.sh
```

Parameter description for multi-NPU inference:

```shell
torchrun --nproc_per_node=1 demo_infer.py \
    --ckpt ${model_base} \
    --images_path examples/kitchen/images \
    --enable_sp \
    --ulysses_degree 1 \
    --ring_degree 1
```

- `nproc_per_node`: torchrun parameter; the number of processes started on each node, which must equal the number of NPU cards used
- `ckpt`: model checkpoint file path
- `images_path`: directory containing the input image sequence
- `enable_sp`: whether to enable sequence parallelism (default: False); requires `nproc_per_node` > 1
- `ulysses_degree`: Ulysses parallelism degree; constraint: `ulysses_degree` × `ring_degree` = `nproc_per_node`, and the number of attention heads must be divisible by `ulysses_degree`
- `ring_degree`: ring parallelism degree; constraint: `ulysses_degree` × `ring_degree` = `nproc_per_node`

To perform VGGT int8 model inference, you first need to build the VGGT int8 model:

```shell
python demo_infer.py --ckpt ckpt/model.pt --build=W8A8
```

The VGGT int8 model will be built in the current path, after which it can be used for inference:

```shell
python demo_infer.py --ckpt VGGT_model_W8A8.pt --enable=W8A8
```

## Accuracy Benchmark

This repo provides an accuracy benchmark to evaluate the VGGT model on NPU.
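The parallelism degree constraints above can be validated before launching `torchrun`. A minimal sketch in Python (the helper name `check_sp_config` and the head count used in the example are illustrative assumptions, not part of the repo):

```python
def check_sp_config(nproc_per_node: int, ulysses_degree: int,
                    ring_degree: int, num_attention_heads: int) -> None:
    """Validate a sequence-parallel launch configuration."""
    # The two parallelism degrees must multiply to the number of NPU processes.
    if ulysses_degree * ring_degree != nproc_per_node:
        raise ValueError("ulysses_degree * ring_degree must equal nproc_per_node")
    # Attention heads are sharded across the Ulysses group,
    # so the head count must divide evenly.
    if num_attention_heads % ulysses_degree != 0:
        raise ValueError("num_attention_heads must be divisible by ulysses_degree")

# Example: 4 NPU cards, heads assumed to be 16 for illustration.
check_sp_config(nproc_per_node=4, ulysses_degree=2, ring_degree=2,
                num_attention_heads=16)  # passes silently
```

A configuration such as `ulysses_degree=4, ring_degree=2` on 4 cards would raise, matching the constraint stated in the parameter list.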
The full benchmark includes three programs to test the accuracy of VGGT on Pose Evaluation, Point Map Evaluation, and Depth Evaluation. Since the full benchmark dataset is large, we can initially test the accuracy of the VGGT model on Pose Evaluation with a subset of the full Co3DV2 dataset.

Dataset preparation: download `CO3D_apple.zip` and `CO3D_backpack.zip` from the CO3D website and unzip them to `datasets/co3d/co3d_data/`:

```
VGGT
└── datasets
    └── co3d
        └── co3d_data
            ├── apple
            ├── backpack
            └── ...
```

Prepare the metadata of the dataset:

```shell
export VGGT_DIR=$(pwd)
cd eval/pose_evaluation/dataset_prepare
python preprocess_co3d.py --category all --co3d_v2_dir $VGGT_DIR/datasets/co3d/co3d_data/ --output_dir $VGGT_DIR/datasets/co3d/co3d_anno/
```

### Accuracy Measurement

Execute the benchmark program.

Use the VGGT bf16 model:

```shell
export VGGT_DIR=$(pwd)
cd eval/pose_evaluation
python eval_co3d.py --co3d_dir $VGGT_DIR/datasets/co3d/co3d_data/ --co3d_anno_dir $VGGT_DIR/datasets/co3d/co3d_anno/ --ckpt $VGGT_DIR/ckpt/model.pt
```

Currently, the measured accuracy of the bf16 model is about 0.911.

Use the VGGT int8 model:

```shell
export VGGT_DIR=$(pwd)
cd eval/pose_evaluation
python eval_co3d.py --co3d_dir $VGGT_DIR/datasets/co3d/co3d_data/ --co3d_anno_dir $VGGT_DIR/datasets/co3d/co3d_anno/ --ckpt VGGT_model_W8A8.pt --enable=W8A8
```

Currently, the measured accuracy of the int8 model is about 0.907.
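The W8A8 build quantizes both weights and activations to int8, which is where the small accuracy gap between the bf16 and int8 models comes from. The repo's actual quantization code lives under `quant/`; as a rough conceptual illustration only (not the repo's implementation), symmetric per-tensor int8 quantization works like this:

```python
def quantize_int8(values):
    """Symmetric per-tensor int8 quantization: map floats to [-127, 127]."""
    scale = max(abs(v) for v in values) / 127.0 or 1.0
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize_int8(q, scale):
    """Recover approximate float values from the int8 codes."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.03, 1.0]
q, scale = quantize_int8(weights)
approx = dequantize_int8(q, scale)
# The round trip introduces bounded quantization error (at most half a
# quantization step per value), which is consistent with the int8 model
# scoring slightly below the bf16 model in the pose benchmark.
```

Per-channel scales and activation calibration reduce this error further in practice; the sketch above only shows the core idea.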