
Conversation


@Reion19 commented Jun 12, 2025

All Submissions:

  • [ ✅ ] Have you followed the guidelines in our Contributing document?
  • [ ✅ ] Have you checked to ensure there aren't other open Pull Requests for the same update/change?
  • [ ✅ ] Have you added an explanation of what your changes do and why you'd like us to include them?

Summary of Changes

This pull request includes three main commits:

Dependency Updates
Updated the gem dependencies to their latest stable versions.

RubyAI Quality-of-Life Improvements
Added .chat, .models, and .configure class methods to the RubyAI class. This allows configuration and usage without directly referencing RubyAI::Configuration.

Multi-Provider Support + Refactoring

    Refactored the gem’s configuration structure.

    Added support for multiple LLM providers using separate configuration parameters.

    Introduced the Chat class to allow instantiating different model instances and querying them via #call.

    Wrote specs for most new and changed methods.

    Added untested support for Anthropic LLMs (I couldn't test due to lack of access/funding).

    Made it easier to extend support for future providers.

What’s New?

You can now configure the gem "the Rails way" using RubyAI.configure do ... end.
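For example (the attribute names api_key, provider, and model are illustrative here, not taken from the diff):

RubyAI.configure do |config|
  config.api_key  = ENV['OPENAI_API_KEY'] # assumed attribute names
  config.provider = 'openai'
  config.model    = 'gpt-4'
end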

Added a simple .chat method for sending prompts to the configured LLM.
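A minimal sketch of the intended usage, assuming the configuration above:

response = RubyAI.chat('Hello, world!') # sends the prompt to the configured provider
puts response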

The original usage with RubyAI::Client.new(...).call still works.
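Roughly like this (the constructor arguments follow the gem's pre-existing interface, which this excerpt doesn't show, so treat them as illustrative):

client = RubyAI::Client.new(api_key: ENV['OPENAI_API_KEY'], messages: 'Hello!')
puts client.call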

Introduced a Chat class for flexible use of different models.
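A sketch of the idea; the constructor and #call signatures here are assumptions:

openai    = RubyAI::Chat.new(provider: 'openai',    model: 'gpt-4')
anthropic = RubyAI::Chat.new(provider: 'anthropic', model: 'claude-3-haiku')

puts openai.call('Hello from OpenAI')       # query each instance via #call
puts anthropic.call('Hello from Anthropic')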

Preliminary support for Anthropic models.

Easy-to-extend architecture for new LLM providers.
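The diff resolves providers through Configuration::PROVIDERS and Configuration::MODELS, so registering a new one could be as small as the sketch below (the hash shapes are assumptions, and the constants would need to be mutable):

RubyAI::Configuration::PROVIDERS['mistral'] = 'https://api.mistral.ai/v1/chat/completions'
RubyAI::Configuration::MODELS['mistral']    = { 'small' => 'mistral-small-latest' }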

Let me know if you’d like tests for the Anthropic integration once I get access.


def connection
  @connection ||= Faraday.new do |faraday|
    faraday.adapter Faraday.default_adapter
  end
end

Adapter should come last

Author

It's just a copy-paste from the client, so I didn't know it should come last.
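For context: Faraday treats the adapter as the end of the middleware stack, so it has to be declared last inside the block; middleware added after the adapter won't run, and recent Faraday versions raise an error for it. A corrected sketch:

def connection
  @connection ||= Faraday.new do |faraday|
    faraday.request :json                    # middleware goes before the adapter (see next comment)
    faraday.adapter Faraday.default_adapter  # adapter last
  end
end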

response = connection.post do |req|
  req.url Configuration::PROVIDERS[@provider] || Configuration::BASE_URL
  req.headers.merge!(headers)
  req.body = body.to_json
end

Faraday can manage JSON automatically

Author

Same as the previous comment: this was copied over from the client.
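For reference, recent Faraday versions (1.10+ and 2.x) ship JSON middleware: faraday.request :json serializes a Hash body and sets the Content-Type header, and faraday.response :json parses the response body. With that in place, the call above reduces to something like:

connection = Faraday.new do |faraday|
  faraday.request :json   # serializes Hash bodies, sets Content-Type
  faraday.response :json  # parses JSON responses into a Hash
  faraday.adapter Faraday.default_adapter
end

response = connection.post do |req|
  req.url Configuration::PROVIDERS[@provider] || Configuration::BASE_URL
  req.headers.merge!(headers)
  req.body = body  # plain Hash; no manual to_json needed
end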

when 'anthropic'
  {
    'model' => Configuration::MODELS[provider][model],
    'max_tokens' => 1024, # Required parameter for Anthropic API

Why stay at 1024 tokens? Can't we increase this?

Author
@Reion19 commented Jul 2, 2025

My bad, this will be fixed in future commits; I've already changed the provider configuration in one of the next PRs.

I simply forgot about this.
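One hedged option for that follow-up: Anthropic's API requires max_tokens on every request, so it can't simply be dropped, but the value could be read from configuration with 1024 as the fallback (the Configuration.max_tokens accessor here is hypothetical):

'max_tokens' => Configuration.max_tokens || 1024, # Anthropic requires this field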

case provider
when 'openai'
  {
    'Content-Type': 'application/json',

Faraday will add this header automatically

Author

This line will be removed in future commits.
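For reference, once the :json request middleware is in place, Content-Type comes from Faraday and the per-provider headers only need authentication. A sketch (Configuration.api_key is an assumed accessor; the Anthropic header names come from its public API docs):

case provider
when 'openai'
  { 'Authorization' => "Bearer #{Configuration.api_key}" }
when 'anthropic'
  { 'x-api-key' => Configuration.api_key, 'anthropic-version' => '2023-06-01' }
end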
