<p className="subtitle">Use Cody's chat to get contextually-aware answers to your questions.</p>

You can **chat** with Cody to ask questions about your code, generate code, and edit code. By default, Cody has the context of your open file and entire repository, and you can use `@` to add context for specific files, symbols, remote repositories, or other non-code artifacts.

You can chat from the **Chat** panel of the supported editor extensions ([VS Code](/clients/install-vscode), [JetBrains](/clients/install-jetbrains), [Visual Studio](/clients/install-visual-studio)) or in the [web](/clients/cody-with-sourcegraph) app.

## Prerequisites

To use Cody's chat, you'll need the following:

- A Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise account
- A supported editor extension (VS Code, JetBrains, Visual Studio) installed

## How does chat work?

Cody answers questions by searching your codebase and retrieving context relevant to your questions. Cody uses several methods to search for context, including Sourcegraph's native search and keyword search. Finding and using context allows Cody to make informed responses based on your code rather than being limited to general knowledge. When Cody retrieves context to answer a question, it tells you which code files it read to generate its response.

Cody can assist you with various use cases, such as:

- Generating an API call: Cody can analyze your API schema to provide context for the code it generates (see the sketch after this list)
- Locating a specific component in your codebase: Cody can identify and describe the files where a particular component is defined
- Handling questions that involve multiple files, like understanding data population in a React app: Cody can locate React component definitions, helping you understand how data is passed and where it originates
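
As an illustration of the first use case, the code below is a minimal, hypothetical sketch of the kind of API call Cody might generate; the `/api/users` endpoint and the `User` type are stand-ins for whatever your own API schema defines.

```ts
// Hypothetical example: the endpoint and the User type are placeholders for your own API schema.
interface User {
  id: string;
  name: string;
}

async function fetchUsers(baseUrl: string): Promise<User[]> {
  const response = await fetch(`${baseUrl}/api/users`);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return (await response.json()) as User[];
}
```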

## Chat features

There are several features that you can use to make your chat experience better. These features may vary depending on the [client](/cody/clients) you are using. You can learn more about the support for these functionalities in the [feature parity reference](/cody/clients/feature-reference#chat).

## Default context

When you start a new Cody chat, the input window opens with default `@-mention` context chips for the open file and the current repository.

At any point, you can edit these context chips or remove them entirely if you do not want to use them as context. A chat without any context chips instructs Cody to use no codebase context. However, you can always `@-mention` an alternate file or symbol to give Cody a new context source.

When you have both a repository and files @-mentioned, Cody will search the repository for context while prioritizing the mentioned files.

## Add new context

You can add new custom context by adding `@-mention` context chips to the chat. At any point, you can `@-mention` a repository, file, line range, or symbol to ask questions about your codebase. Cody will use this new context to generate contextually relevant code.

## OpenCtx context providers

<Callout type="info">OpenCtx context providers are in the Experimental stage for all Cody VS Code users. Enterprise users can also use this, but with limited support. If you have feedback or questions, please visit our [support forum](https://community.sourcegraph.com/c/openctx/10).</Callout>

[OpenCtx](https://openctx.org/) is an open standard for bringing contextual info about code into your dev tools. Cody Free and Pro users can use OpenCtx providers to fetch and use context from the following sources:

- [Webpages](https://openctx.org/docs/providers/web) (via URL)
- [Jira tickets](https://openctx.org/docs/providers/jira)
- [Linear issues](https://openctx.org/docs/providers/linear-issues)
- [Notion pages](https://openctx.org/docs/providers/notion)
- [Google Docs](https://openctx.org/docs/providers/google-docs)
- [Sourcegraph code search](https://openctx.org/docs/providers/sourcegraph-search)

You can `@-mention` web URLs to pull in live information like docs, and you can connect Cody to OpenCtx to `@-mention` non-code artifacts like Google Docs, Notion pages, Jira tickets, and Linear issues.
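
In the VS Code extension, OpenCtx providers are configured through VS Code settings. The snippet below is a sketch only: it assumes the `openctx.providers` setting and the web provider URL from the OpenCtx docs linked above, so confirm the exact keys there before copying.

```jsonc
// settings.json — sketch only; verify the setting name and provider URL in the OpenCtx provider docs.
{
  "openctx.providers": {
    "https://openctx.org/npm/@openctx/provider-web": true
  }
}
```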

## Run offline

<Callout type="info">Support with Ollama is currently in the Experimental stage and is available for Cody Free and Pro plans.</Callout>

Cody chat can run offline with Ollama. The offline mode does not require you to sign in with your Sourcegraph account to use Ollama. Click the button below the Ollama logo and you'll be ready to go.

You can still switch to your Sourcegraph account whenever you want to use Claude, OpenAI, Gemini, Mixtral, etc.

## LLM selection

Cody allows you to select the LLM you want to use for your chat, optimized for speed versus accuracy. Cody Free and Pro users can select from multiple models. Enterprise users with the new [model configuration](/cody/clients/model-configuration) can use the LLM selection dropdown to choose a chat model.

You can read about the supported LLM models [here](/cody/capabilities/supported-models#chat-and-commands).

## Smart Apply and Execute code suggestions

Cody lets you dynamically insert code from chat into your files with **Smart Apply**. Whenever Cody provides a code suggestion, you can click the **Apply** button. Cody will then analyze your open code file, find where that relevant code should live, and add a diff. For chat messages where Cody provides multiple code suggestions, you can apply each in sequence to go from chat suggestions to written code.

Smart Apply also supports executing commands in the terminal. When you ask Cody a question related to terminal commands, you can execute the suggestion in your terminal by clicking the **Execute** button in the chat window.

## Chat history

Cody keeps a history of your chat sessions. You can view it by clicking the **History** button in the chat panel. You can **Export** it to a JSON file for later use or click the **Delete all** button to clear the chat history.

## Prompts and Commands

Cody offers quick, ready-to-use [prompts and commands](/cody/capabilities/commands) for common actions to write, describe, fix, and smell code. These allow you to run predefined actions with smart context-fetching anywhere in the editor, like:

- **New Chat**: Ask Cody a question
- **Document Code**: Add code documentation
- **Edit Code**: Edit code with instructions
- **Explain Code**: Describe your code in more detail
- **Generate Unit Tests**: Write tests for your code (see the sketch below)

<video width="1920" height="1080" loop playsInline controls style={{ width: '100%', height: 'auto' }}>
  <source src="https://storage.googleapis.com/sourcegraph-assets/Docs/Media/cody-prompts-102024-2.mp4" type="video/mp4" />
</video>

Read more about [prompts and commands](/cody/capabilities/commands).
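
For example, running the **Generate Unit Tests** prompt on a small utility function might produce something along these lines. This is a hedged sketch rather than a guaranteed output: the `clamp` function and the Vitest test runner are assumptions made for the example.

```ts
import { describe, expect, it } from "vitest";

// Hypothetical utility under test — stands in for whatever code you run the prompt on.
function clamp(value: number, min: number, max: number): number {
  return Math.min(Math.max(value, min), max);
}

describe("clamp", () => {
  it("returns the value when it is already within the range", () => {
    expect(clamp(5, 0, 10)).toBe(5);
  });

  it("clamps values below the minimum", () => {
    expect(clamp(-3, 0, 10)).toBe(0);
  });

  it("clamps values above the maximum", () => {
    expect(clamp(42, 0, 10)).toBe(10);
  });
});
```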

## Ask Cody to write code

Cody chat can also write code in response to your questions. For example, in VS Code, ask Cody to "write a function that sorts an array in ascending order".
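
The exact suggestion depends on the language and model, but for the prompt above it might look roughly like this sketch (the `sortAscending` name is illustrative, not a fixed output):

```ts
// Illustrative sketch of the kind of suggestion Cody might return for the prompt above.
function sortAscending(numbers: number[]): number[] {
  // Copy the array first so the original is not mutated, then compare numerically.
  return [...numbers].sort((a, b) => a - b);
}

console.log(sortAscending([3, 1, 2])); // [1, 2, 3]
```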

You are provided with code suggestions in the chat window, along with the following options for using the code:

- **Copy Code**: Copy the code suggestion to your clipboard and paste it into your code editor
- **Insert Code at Cursor**: Insert the code suggestion at the current cursor location
- **Save Code to New File**: Save the code suggestion to a new file in your project

If Cody's answer isn't helpful, you can try asking again with a different context:

- **Public knowledge only**: Cody will not use your own code files as context; it'll only use knowledge trained into the base model.
- **Current file only**: Re-run the prompt using just the current file as context.
- **Add context**: Provides @-mention context options to improve the response by explicitly including files, symbols, remote repositories, or even web pages (by URL).