Added MAUI usage example (Android) #1217
base: master
Conversation
- added auto download of the model when missing
Oh ok, then unfortunately there is nothing to do. Thanks anyway for considering it.
Sorry for letting this PR go cold. @AmSmart, since you worked on Android support, would you like to provide any review before merging this? It looks fine to me, but I don't know anything about Android development!
I'll take a look and give feedback shortly.
If you can also resolve the issue I pointed out, I would really appreciate it.
This pull request has been automatically marked as stale due to inactivity. If no further activity occurs, it will be closed in 7 days.
Hi @AmSmart, do you have any updates on this PR?
Added MAUI example (Android)
Disclaimer
I’m new to the project and I hope I have followed all the contribution guidelines and policies correctly. If not, please forgive me and kindly let me know what I should fix or improve.
Context
As suggested by @AmSmart in PR #1179, I extended the Mobile project by developing a chatbot as a basic working example app using LLamaSharp on MAUI.
Important note on functionality (ISSUE)
I noticed that the example works correctly on an Android emulator (running on a PC), but on a real Android device it crashes with the following error related to loading the
CommunityToolkit.HighPerformance.dll
dependency. @AmSmart, could you please check what is going on here?
A simple idea from building the app
While developing the app, it occurred to me that it might be useful to provide an API like LLamaWeights.LoadFromStream to load the model directly from a stream. This could be handy in cases where a small model is bundled with the APK. Currently, since loading requires a file path, the model must be extracted from the APK and saved to device storage, resulting in two copies: one compressed inside the APK and one extracted. With a stream-based load, the app could read the model directly from the APK without extracting it. I understand that in a real-world scenario the model probably won't be shipped with the APK, but I thought it was an interesting possibility and wanted to hear your thoughts on it.
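To make the suggestion concrete, here is a rough C# sketch of the current file-based flow next to the proposed stream-based one. The `LLamaWeights.LoadFromStream` call is purely hypothetical (it does not exist in the library today), and the model filename is illustrative; the rest assumes LLamaSharp's existing `ModelParams`/`LLamaWeights.LoadFromFile` API and MAUI's `FileSystem` helpers.

```csharp
using LLama;
using LLama.Common;

// Current approach: the loader needs a file path, so the asset bundled
// in the APK must first be copied out to app storage.
var modelPath = Path.Combine(FileSystem.AppDataDirectory, "model.gguf");
if (!File.Exists(modelPath))
{
    // Opens the asset compressed inside the APK (MAUI API).
    using var packaged = await FileSystem.OpenAppPackageFileAsync("model.gguf");
    using var output = File.Create(modelPath);
    await packaged.CopyToAsync(output); // second, extracted copy on disk
}
var parameters = new ModelParams(modelPath);
using var weights = LLamaWeights.LoadFromFile(parameters);

// Hypothetical stream-based API that would avoid the extracted copy:
// using var packaged = await FileSystem.OpenAppPackageFileAsync("model.gguf");
// using var weights = await LLamaWeights.LoadFromStream(packaged, parameters);
```

The sketch just illustrates the duplication the suggestion is about: with a stream overload, the `File.Create`/`CopyToAsync` step (and the on-disk copy it produces) would disappear.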