Add local model support #6


Open
mschuwalow opened this issue May 7, 2025 · 6 comments · May be fixed by #7, #8 or #12

Comments

@mschuwalow

mschuwalow commented May 7, 2025

Right now it's not possible to use golem-llm without an API key / with a local model. We should add support for Ollama or one of its alternatives to address that use case.

EDIT 1:
For testing, we should run Ollama in a Docker image and run both the durable and (optionally) portable Wasm builds against it. We can use a very small model that runs fine on CPU for this.
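A minimal CI step for this could look like the following sketch; the model name and tag are assumptions, any small CPU-friendly model would do:

```shell
# Start Ollama in Docker (the official image listens on port 11434)
docker run -d --name ollama -p 11434:11434 ollama/ollama

# Pull a small model that runs acceptably on CPU
# (model choice is an assumption; any tiny model works for smoke tests)
docker exec ollama ollama pull qwen2.5:0.5b
```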

The tests should be written as a dedicated testing component that is composed with the respective golem-llm Wasm binaries using wac. The examples component should be used as the starting point, and its test cases should be included.
As LLMs are non-deterministic by nature, asserting that the test functions pass and produce non-empty outputs is probably enough.
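The composition step could be sketched roughly like this; the file names and the exact `wac plug` invocation are assumptions:

```shell
# Compose the testing component with a golem-llm implementation using wac.
# test.wasm exports the test functions and imports the llm interface;
# golem_ollama.wasm (hypothetical name) provides that interface.
wac plug test.wasm --plug golem_ollama.wasm -o composed.wasm
```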


This composed component should then be executed using Golem as the runtime.
This can be done by using the golem-test-framework. It is also used in the main repo, where many example tests using it can be found.


If there are issues with using the golem-test-framework (it currently has very little documentation), an alternative is to start Golem in CI in one of two ways:

  • using the `golem server run` command of the CLI
  • using the Docker images published as part of the main repo. There are docker-compose files in the repo showcasing various setups
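Either option above might look like this in a CI script (the exact invocations and file name are assumptions based on the description):

```shell
# Option 1: start a local Golem server via the CLI
golem server run &

# Option 2: bring up Golem from the published Docker images,
# using one of the docker-compose files from the main repo
docker compose -f docker-compose.yaml up -d
```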

The component can then be uploaded to Golem and invoked using the CLI.
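A rough sketch of that final step; the subcommand names, flags, component name, and function identifier are all assumptions, not a verified golem-cli invocation:

```shell
# Upload the composed component (component name is hypothetical)
golem component add --component-name llm-test composed.wasm

# Invoke one of the test functions and check it produces a non-empty result
golem worker invoke --component-name llm-test --worker-name test-1 \
  'test:llm-test/api.{run-all}'
```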

@jdegoes

jdegoes commented May 7, 2025

/bounty $750


algora-pbc bot commented May 7, 2025

💎 $500 bounty • Ambati Sahithi

💎 $750 bounty • Golem Cloud

Steps to solve:

  1. Start working: Comment /attempt #6 with your implementation plan
  2. Submit work: Create a pull request including /claim #6 in the PR body to claim the bounty
  3. Receive payment: 100% of the bounty is received 2-5 days post-reward. Make sure you are eligible for payouts

❗ Important guidelines:

  • To claim a bounty, you need to provide a short demo video of your changes in your pull request
  • If anything is unclear, ask for clarification before starting as this will help avoid potential rework
  • Low quality AI PRs will not receive review and will be closed
  • Do not ask to be assigned unless you've contributed before

Thank you for contributing to golemcloud/golem-llm!

| | Attempt | Started (UTC) | Solution |
| --- | --- | --- | --- |
| 🟢 | @neo773 | May 07, 2025, 01:29:14 PM | WIP |
| 🟢 | @Sanket6652 | May 07, 2025, 03:56:38 PM | WIP |
| 🟢 | @varshith257 | May 07, 2025, 07:14:24 PM | #7 Reward |
| 🟢 | @Rutik7066 | May 08, 2025, 11:03:58 AM | #8 Reward |
| 🟢 | @luffy-orf | May 09, 2025, 04:32:21 PM | WIP |
| 🟢 | @Nanashi-lab | May 12, 2025, 09:40:35 AM | #12 Reward |

@neo773

neo773 commented May 7, 2025

/attempt #6

@varshith257 varshith257 linked a pull request May 7, 2025 that will close this issue
@Rutik7066 Rutik7066 linked a pull request May 8, 2025 that will close this issue
@Rutik7066

/attempt #6

@luffy-orf

/attempt #6

@Nanashi-lab

/attempt #6

@Nanashi-lab Nanashi-lab linked a pull request May 12, 2025 that will close this issue
@algora-pbc algora-pbc bot added the $500 label May 15, 2025
6 participants