This is a no-nonsense async Scala client for the OpenAI API supporting all the available endpoints and params **including streaming**, the newest **chat completion**, **vision**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala).
Also, we aimed the lib to be self-contained with the fewest dependencies possible.
In addition to the OpenAI API, this library also supports API-compatible providers (see [examples](./openai-examples/src/main/scala/io/cequence/openaiscala/examples/nonopenai)) such as:
- [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) - cloud-based, utilizes OpenAI models but with lower latency
- [Azure AI](https://azure.microsoft.com/en-us/products/ai-studio) - cloud-based, offers a vast selection of open-source models
- [Anthropic](https://www.anthropic.com/api) - cloud-based, a major competitor to OpenAI, features proprietary/closed-source models such as Claude3 - Haiku, Sonnet, and Opus. 🔥 **New**: now also through Bedrock!
- [Google Vertex AI](https://cloud.google.com/vertex-ai) - cloud-based, features proprietary/closed-source models such as Gemini 1.5 Pro and Flash
- [Groq](https://wow.groq.com/) - cloud-based provider, known for its superfast inference with LPUs
- [Grok](https://x.ai/) - cloud-based provider from x.AI
- [Cerebras](https://cerebras.ai/) - cloud-based provider, superfast (akin to Groq)
- [Mistral](https://mistral.ai/) - cloud-based, leading open-source LLM company
- [Deepseek](https://deepseek.com/) - cloud-based provider from China
- [Ollama](https://ollama.com/) - runs locally, serves as an umbrella for open-source LLMs including LLaMA3, dbrx, and Command-R
- [FastChat](https://github.com/lm-sys/FastChat) - runs locally, serves as an umbrella for open-source LLMs such as Vicuna, Alpaca, and FastChat-T5
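As a quick orientation, a typical call against the core `OpenAIService` might look like the sketch below. This is a hedged example, not the README's official quick-start: the exact factory, message, and settings names (`OpenAIServiceFactory`, `UserMessage`, `CreateChatCompletionSettings`, `ModelId.gpt_4o`) are assumed from recent versions of this library and may differ in yours.

```scala
import scala.concurrent.ExecutionContext.Implicits.global
import akka.actor.ActorSystem
import akka.stream.Materializer
import io.cequence.openaiscala.domain._
import io.cequence.openaiscala.domain.settings.CreateChatCompletionSettings
import io.cequence.openaiscala.service.OpenAIServiceFactory

object QuickStart extends App {
  implicit val system: ActorSystem = ActorSystem()
  implicit val materializer: Materializer = Materializer(system)

  // the factory typically reads OPENAI_API_KEY (and optionally OPENAI_ORG_ID) from the environment
  val service = OpenAIServiceFactory()

  service
    .createChatCompletion(
      messages = Seq(UserMessage("What is the capital of France?")),
      settings = CreateChatCompletionSettings(model = ModelId.gpt_4o) // model id is an assumption
    )
    .map(response => println(response.choices.head.message.content))
    .andThen { case _ =>
      service.close()
      system.terminate()
    }
}
```

Since every call returns a `Future`, remember to keep the JVM alive until the response arrives and to close the service when done.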
The currently supported Scala versions are **2.12, 2.13**, and **3**.

To install the library, add the following dependency to your *build.sbt*
```
"io.cequence" %% "openai-scala-client" % "1.1.2"
```
or to *pom.xml* (if you use maven)
```
<dependency>
    <groupId>io.cequence</groupId>
    <artifactId>openai-scala-client_2.12</artifactId>
    <version>1.1.2</version>
</dependency>
```
If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.2"` instead.
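If you need both the core client and the streaming add-on, the two sbt dependencies above can be combined; a minimal sketch using sbt's standard `libraryDependencies` key (the version is the one stated in this README):

```scala
libraryDependencies ++= Seq(
  "io.cequence" %% "openai-scala-client" % "1.1.2",
  "io.cequence" %% "openai-scala-client-stream" % "1.1.2"
)
```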
## Config ⚙️
Then you can obtain a service in one of the following ways.
2. [Anthropic](https://www.anthropic.com/api) - requires `openai-scala-anthropic-client` lib and `ANTHROPIC_API_KEY`
```scala
val service = AnthropicServiceFactory.asOpenAI() // or AnthropicServiceFactory.bedrockAsOpenAI
```
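Because the adapter exposes the standard `OpenAIService` interface, the Anthropic-backed service is used exactly like the OpenAI one. A hedged continuation of the snippet above (the `NonOpenAIModelId` constant and the exact Claude model id are assumptions based on this library's conventions; `service` is the adapter obtained above):

```scala
import scala.concurrent.ExecutionContext.Implicits.global
import io.cequence.openaiscala.domain._
import io.cequence.openaiscala.domain.settings.CreateChatCompletionSettings

service
  .createChatCompletion(
    messages = Seq(UserMessage("Write a one-line haiku about Scala.")),
    settings = CreateChatCompletionSettings(
      model = NonOpenAIModelId.claude_3_haiku_20240307 // assumed model constant
    )
  )
  .map(response => println(response.choices.head.message.content))
```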