---
title: Supported Models in Refact.ai
description: Supported Models in Refact.ai
---

## Cloud Version

With Refact.ai, access state-of-the-art models in your VS Code or JetBrains plugin and select the optimal LLM for each task.

### AI Agent models
- Claude 3.7 Sonnet
- Claude 3.5 Sonnet
- GPT-4o
- o3-mini

### Chat models
- Claude 3.7 Sonnet
- Claude 3.5 Sonnet
- GPT-4o
- GPT-4o-mini
- o3-mini

For select models, click the `💡Think` button to enable advanced reasoning, which helps the AI solve complex tasks. Available only on the [Refact.ai Pro plan](https://refact.ai/pricing/).

### Code completion models
- Qwen2.5-Coder-1.5B

## BYOK (Bring Your Own Key)

Refact.ai also lets you connect your own API key and use any external LLM, such as Gemini, Grok, OpenAI, DeepSeek, and others. Read the guide in our [BYOK Documentation](https://docs.refact.ai/byok/).
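
Before adding a key to the plugin, it can help to confirm that the key itself works. Below is a minimal, illustrative check in Python against OpenAI's public `/v1/models` endpoint; other providers use their own base URLs and auth headers, so treat the URL and header here as placeholders for whichever provider you bring.

```python
# Illustrative sanity check for an API key you plan to bring to Refact.ai.
# The endpoint shown is OpenAI's; adjust the URL and headers for your provider.
import os
import requests

api_key = os.environ["OPENAI_API_KEY"]  # the key you plan to use

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=30,
)
resp.raise_for_status()
print([m["id"] for m in resp.json()["data"]][:5])  # a few models this key can access
```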

## Self-Hosted Version

In Refact.ai Self-hosted, you can choose among 20+ model options, ready for any task. The full lineup (always up-to-date) is in the [Known Models file on GitHub](https://github.com/smallcloudai/refact-lsp/blob/main/src/known_models.rs).

### Completion models
<table class="full-table">
  <!-- … -->
</tbody>
</table>

### Integrations

In self-hosted mode, you can also configure the **OpenAI** and **Anthropic** API integrations.

1. Go to the **Model Hosting** page → **3rd Party APIs** section and toggle the switches for **OpenAI** and/or **Anthropic**.

![](.png)

2. Click the **API Keys tab** link to open the integrations page, or open the **Settings** dropdown in the header and select **Credentials**.

![](.png)

3. On the **Credentials** page, enter your **OpenAI** and/or **Anthropic** API key.

:::note
Make sure the switch button is enabled for each API you want to use; a key won't be used until its integration is switched on.
:::
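
If a configured provider doesn't behave as expected, a quick way to rule out an invalid key is to call the provider's API directly. The snippet below is an illustrative check of an Anthropic key using the public Messages endpoint; the model alias is an assumption, so substitute any model your key has access to (an equivalent check for an OpenAI key is sketched in the BYOK section above).

```python
# Illustrative direct check of an Anthropic API key before (or after) entering
# it on the Credentials page. The model name is an assumption; use any model
# your key can access.
import os
import requests

resp = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": os.environ["ANTHROPIC_API_KEY"],
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-5-sonnet-latest",
        "max_tokens": 16,
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["content"][0]["text"])
```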