No Best Accuracy Model and Inference Failure with Exported PP-OCRv5_mobile_rec #16559
Answered by lyan-chhay on Oct 3, 2025
Replies: 2 comments 1 reply
For reference, here is the inference.yml configuration.
0 replies
Regarding the missing best_accuracy model: you may have set eval_batch_step too large. For example, with [0, 1000] the trainer only runs its first evaluation after 1000 iterations; if you have a small dataset and train for only a few epochs, training may never reach that iteration, so no best_accuracy checkpoint is ever saved. As for the error during inference with the exported model, I am facing the same problem and am still looking for a solution.
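To make that concrete, here is a minimal sketch of the relevant Global keys in a PaddleOCR recognition training config. The eval_batch_step, epoch_num, save_model_dir, and cal_metric_during_train keys are standard PaddleOCR config fields; the numeric values and the output path below are illustrative assumptions you would adjust to your own dataset.

```yaml
# Hypothetical excerpt of a PP-OCRv5_mobile_rec training config (values are examples).
# best_accuracy is only written after an evaluation has run; evaluation starts at
# iteration eval_batch_step[0] and repeats every eval_batch_step[1] iterations.
Global:
  epoch_num: 50                          # total training epochs
  save_model_dir: ./output/PP-OCRv5_mobile_rec/
  # With a small dataset, lower the second value so that
  #   epoch_num * (num_train_samples / batch_size) >= eval_batch_step[1],
  # otherwise evaluation never triggers and no best_accuracy checkpoint is saved.
  eval_batch_step: [0, 200]              # e.g. instead of [0, 1000]
  cal_metric_during_train: true
```

For instance, with 2,000 training samples, a batch size of 64, and 10 epochs, you get roughly 310 iterations in total, so an eval interval of 1000 is never reached.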
1 reply
Answer selected by BaeSyoon