
Conversation

oak-barry

remove requirements-detail.txt and merge it into requirements.txt
update README.md and configure installation on Linux

@jtydhr88
Contributor

Hi,
if you choose CUDA 12.1 and install onnxruntime_gpu directly with pip install onnxruntime_gpu==1.17.0, then at runtime it will throw
UserWarning: Specified provider 'TensorrtExecutionProvider' is not in available provider names.Available providers: 'AzureExecutionProvider, CPUExecutionProvider'
This is because the default onnxruntime_gpu package is built against CUDA 11.8, not 12.1, according to https://onnxruntime.ai/docs/install/
(screenshot of the CUDA compatibility table from the onnxruntime install docs)
When this happens, inference falls back to the CPU provider and slows down.

For CUDA 12.1, as the doc I referred to describes, we need to run this instead:
pip install onnxruntime-gpu --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/
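
Once the CUDA 12 build is installed, a quick way to confirm it is actually picked up (assuming onnxruntime is importable in the same environment) is:

python -c "import onnxruntime; print(onnxruntime.get_available_providers())"

It should now list CUDAExecutionProvider (and TensorrtExecutionProvider if TensorRT is set up) instead of only AzureExecutionProvider and CPUExecutionProvider.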

@jtydhr88
Contributor

This is also the reason I added the uninstall-then-reinstall step in #15 and the .bat I committed.
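
Roughly, that step boils down to something like the following (a sketch of the CUDA 12.1 case; the exact commands are in the .bat committed in #15):

pip uninstall -y onnxruntime-gpu
pip install onnxruntime-gpu --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/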

@jtydhr88
Contributor

jtydhr88 commented Jun 17, 2024

For the repo owners, their original environment is CUDA 11.8, so it is fine for them to run pip install onnxruntime_gpu directly.
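
If you are unsure which case applies to your environment, one quick check (assuming PyTorch is installed in the same environment) is to print the CUDA version it was built against:

python -c "import torch; print(torch.version.cuda)"

If it prints 12.1, use the CUDA 12 extra index URL above; if it prints 11.8, a plain pip install onnxruntime_gpu is enough.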
