How can the PP-OCRv4_mobile_det_infer model be converted to a .nb file for mobile deployment? #16645
Unanswered · GaoMeng-2021 asked this question in Q&A · Replies: 0 comments
A question: I downloaded the PP-OCRv4_mobile_det inference model from the official site. After unpacking it, I found three files:
inference.json inference.pdiparams inference.yml
I then tried to convert it to a .nb file with Paddle-Lite's opt_linux command-line tool, which fails with the following error:
opt_linux --model_dir=./PP-OCRv4_mobile_det_infer --optimize_out=ocr_opt_v4mdet --valid_targets=arm
[F 10/14 5:30:23.688 ...e-Lite/lite/model_parser/model_parser.cc:132 PrintPbModelErrorMessage]
Error, Unsupported model format!
1. contents in model directory should be in one of these formats:
(1) model + var1 + var2 + etc.
(2) model + var1 + var2 + etc.
(3) model.pdmodel + model.pdiparams
(4) model + params
(5) model + weights
2. You can also appoint the model and params file in custom format:
eg. |-- set_model_file('custom_model_name')
|-- set_param_file('custom_params_name')'
Aborted (core dumped)
How should this problem be handled?
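The error message lists the directory layouts that opt can parse, and the downloaded directory (inference.json + inference.pdiparams + inference.yml) matches none of them: inference.json is the newer JSON/PIR program format, while this opt build expects the legacy protobuf layout such as model.pdmodel + model.pdiparams. Below is a minimal, hypothetical sketch (the helper name and messages are my own, not part of Paddle-Lite) that checks a model directory against the layouts named in the log before invoking opt:

```python
from pathlib import Path

def check_optable(model_dir: str) -> str:
    """Heuristic check: does this Paddle inference directory match a
    layout that Paddle-Lite's opt tool can load? (A sketch based on
    the formats listed in the error message, not an official API.)"""
    d = Path(model_dir)
    files = {p.name for p in d.iterdir()} if d.is_dir() else set()
    # Legacy protobuf layouts: a combined program file plus a params file.
    for model, params in [("model.pdmodel", "model.pdiparams"),
                          ("inference.pdmodel", "inference.pdiparams")]:
        if model in files and params in files:
            # opt can be pointed at explicit files instead of a directory.
            return (f"ok: pass --model_file={d / model} "
                    f"--param_file={d / params} to opt")
    if any(name.endswith(".json") for name in files):
        return ("unsupported: found a .json (PIR-format) program; this opt "
                "build expects the legacy .pdmodel protobuf, so the model "
                "needs to be re-exported in the old format first")
    return "unsupported: no recognizable model/params files found"
```

If an inference.pdmodel is present, opt can be given the files explicitly via `--model_file` and `--param_file` (the command-line counterparts of the `set_model_file` / `set_param_file` hints in the log) instead of `--model_dir`. For a JSON-only download, one option is to obtain or re-export the model in the legacy .pdmodel format compatible with the opt version in use.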