Replies: 1 comment 5 replies
Looking at the error more closely, I found that ccache was missing. After installing it, the error changed: the file C:\Users\XXX\.paddlex\official_models\PP-OCRv5_server_det\inference.json does exist, so could a Chinese character in the path be the cause?
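A quick way to test the non-ASCII-path hypothesis is to check whether the model path contains non-ASCII characters and, if so, copy the cached model directory to an ASCII-only location and point the pipeline at it. This is a diagnostic sketch, not an official PaddleOCR workaround; the helper names (`has_non_ascii`, `copy_model_to_ascii_path`) and the target directory `C:\paddle_models` are made up for illustration:

```python
# Sketch for testing whether non-ASCII characters in the cached model path
# are the problem. Assumption: copying the model directory to an all-ASCII
# path and retrying is enough to confirm or rule out the hypothesis.
import shutil
from pathlib import Path


def has_non_ascii(path: str) -> bool:
    """Return True if any character in the path is outside ASCII."""
    return not path.isascii()


def copy_model_to_ascii_path(model_dir: str, target: str = r"C:\paddle_models") -> Path:
    """Copy a cached model directory to an ASCII-only location for testing."""
    src = Path(model_dir)
    dst = Path(target) / src.name
    shutil.copytree(src, dst, dirs_exist_ok=True)
    return dst
```

If the pipeline loads fine from the copied location, the original path (e.g. a Chinese Windows user name) is the likely culprit.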
The environment is fully set up. Following the installation instructions, I installed the model first and then OCR, but the OCR conversion fails with an error:
E:\>paddleocr ocr -i https://paddle-model-ecology.bj.bcebos.com/paddlex/imgs/demo_image/general_ocr_002.png --use_doc_orientation_classify False --use_doc_unwarping False --use_textline_orientation False
INFO: Could not find files for the given pattern.
D:\Python\Python313\Lib\site-packages\paddle\utils\cpp_extension\extension_utils.py:718: UserWarning: No ccache found. Please be aware that recompiling all source files may be required. You can download and install ccache from: https://github.com/ccache/ccache/blob/master/doc/INSTALL.md
warnings.warn(warning_message)
Creating model: ('PP-OCRv5_server_det', None)
Model files already exist. Using cached files. To redownload, please delete the directory manually:
C:\Users\XXX\.paddlex\official_models\PP-OCRv5_server_det
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "D:\Python\Python313\Scripts\paddleocr.exe\__main__.py", line 7, in <module>
sys.exit(console_entry())
~~~~~~~~~~~~~^^
File "D:\Python\Python313\Lib\site-packages\paddleocr\__main__.py", line 26, in console_entry
main()
~~~~^^
File "D:\Python\Python313\Lib\site-packages\paddleocr\_cli.py", line 128, in main
_execute(args)
~~~~~~~~^^^^^^
File "D:\Python\Python313\Lib\site-packages\paddleocr\_cli.py", line 117, in _execute
args.executor(args)
~~~~~~~~~~~~~^^^^^^
File "D:\Python\Python313\Lib\site-packages\paddleocr\_pipelines\ocr.py", line 650, in execute_with_args
perform_simple_inference(PaddleOCR, params)
~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
File "D:\Python\Python313\Lib\site-packages\paddleocr\_utils\cli.py", line 62, in perform_simple_inference
wrapper = wrapper_cls(**init_params)
File "D:\Python\Python313\Lib\site-packages\paddleocr\_pipelines\ocr.py", line 163, in __init__
super().__init__(**base_params)
~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^
File "D:\Python\Python313\Lib\site-packages\paddleocr\_pipelines\base.py", line 67, in __init__
self.paddlex_pipeline = self._create_paddlex_pipeline()
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
File "D:\Python\Python313\Lib\site-packages\paddleocr\_pipelines\base.py", line 102, in _create_paddlex_pipeline
return create_pipeline(config=self.merged_paddlex_config, **kwargs)
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\pipelines\__init__.py", line 166, in create_pipeline
pipeline = BasePipeline.get(pipeline_name)(
config=config,
...<5 lines>...
**kwargs,
)
File "D:\Python\Python313\Lib\site-packages\paddlex\utils\deps.py", line 202, in _wrapper
return old_init_func(self, *args, **kwargs)
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\pipelines\_parallel.py", line 103, in __init__
self._pipeline = self._create_internal_pipeline(config, self.device)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\pipelines\_parallel.py", line 158, in _create_internal_pipeline
return self.pipeline_cls(
~~~~~~~~~~~~~~~~~~^
config,
^^^^^^^
...<3 lines>...
hpi_config=self.hpi_config,
^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\pipelines\ocr\pipeline.py", line 117, in __init__
self.text_det_model = self.create_model(
~~~~~~~~~~~~~~~~~^
text_det_config,
^^^^^^^^^^^^^^^^
...<6 lines>...
input_shape=self.input_shape,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\pipelines\base.py", line 105, in create_model
model = create_predictor(
model_name=config["model_name"],
...<6 lines>...
**kwargs,
)
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\models\__init__.py", line 77, in create_predictor
return BasePredictor.get(model_name)(
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
model_dir=model_dir,
^^^^^^^^^^^^^^^^^^^^
...<6 lines>...
**kwargs,
^^^^^^^^^
)
^
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\models\text_detection\predictor.py", line 57, in __init__
self.pre_tfs, self.infer, self.post_op = self._build()
~~~~~~~~~~~^^
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\models\text_detection\predictor.py", line 77, in _build
infer = self.create_static_infer()
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\models\base\predictor\base_predictor.py", line 248, in create_static_infer
return PaddleInfer(
self.model_name, self.model_dir, self.MODEL_FILE_PREFIX, self._pp_option
)
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\models\common\static_infer.py", line 284, in __init__
self.predictor = self.create()
~~~~~~~~~~~~^^
File "D:\Python\Python313\Lib\site-packages\paddlex\inference\models\common\static_infer.py", line 383, in create
config = paddle.inference.Config(str(model_file), str(params_file))
RuntimeError: (NotFound) Cannot open file C:\Users\XXX\.paddlex\official_models\PP-OCRv5_server_det\inference.json, please confirm whether the file is normal.
[Hint: Expected paddle::inference::IsFileExists(prog_file) == true, but received paddle::inference::IsFileExists(prog_file):0 != true:1.] (at ..\paddle\fluid\inference\api\analysis_config.cc:117)
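Since the error claims the file cannot be opened even though it appears to exist, it may help to confirm what Python actually sees in that directory before blaming the path. A minimal diagnostic sketch (the helper name `check_model_dir` is made up for illustration; substitute your real model directory):

```python
# Diagnostic sketch: list what is actually inside the cached model directory
# and check the exact file the error names. The directory path comes from the
# traceback above; adjust to your own user name.
import os


def check_model_dir(model_dir: str, expected: str = "inference.json") -> dict:
    """Report whether model_dir and the expected model file exist, plus contents."""
    target = os.path.join(model_dir, expected)
    return {
        "dir_exists": os.path.isdir(model_dir),
        "file_exists": os.path.isfile(target),
        "contents": sorted(os.listdir(model_dir)) if os.path.isdir(model_dir) else [],
    }
```

If `file_exists` is True here but Paddle still raises NotFound, the failure is likely in how the C++ side decodes the path (e.g. non-ASCII characters in the user profile name) rather than a missing file.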