
Commit d058202

Update README.md (1 parent: a20f36d)

File tree: 1 file changed (+78, -48 lines)


README.md

Lines changed: 78 additions & 48 deletions
@@ -116,61 +116,91 @@ This is a **customer-funded Value-Based Delivery (VBD)**. Below, you'll find all
- Uses [Bot Framework](https://dev.botframework.com/) and [Bot Service](https://azure.microsoft.com/en-us/products/bot-services/) to host the Bot API backend and to expose it to multiple channels, including MS Teams.
- Also uses FastAPI to deploy an alternative backend API with streaming capabilities.

---

## **Steps to Run the POC/Accelerator**

**Removed:**

Note: (Pre-requisite) You need to have an Azure OpenAI service already created

1. Fork this repo to your Github account.
2. In Azure OpenAI studio, deploy these models (older models than the ones stated below won't work):
   - "gpt-4o"
   - "gpt-4o-mini"
   - "text-embedding-3-large"
   - "tts"
   - "whisper"
3. Create a Resource Group where all the assets of this accelerator are going to be. Azure OpenAI can be in different RG or a different Subscription.
4. ClICK BELOW to create all the Azure Infrastructure needed to run the Notebooks (Azure AI Search, Cognitive Services, etc):

[![Deploy To Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2Fpablomarin%2FGPT-Azure-Search-Engine%2Fmain%2Fazuredeploy.json)

**Note**: If you have never created a `Azure AI Services Multi-Service account` before, please create one manually in the azure portal to read and accept the Responsible AI terms. Once this is deployed, delete this and then use the above deployment button.

## Preferred Development Environment - Azure Machine Learning

1. Clone your Forked repo to your AML Compute Instance. If your repo is private, see below in Troubleshooting section how to clone a private repo.
2. Install the dependencies on your machine (make sure you do the below pip comand on the same Python 3.12 conda environment that you are going to run the notebooks.) For example, in AZML compute instance run:

```bash
conda create -n GPTSearch python=3.12
conda activate GPTSearch
pip install -r ./common/requirements.txt
conda install ipykernel
python -m ipykernel install --user --name=GPTSearch --display-name "GPTSearch (Python 3.12)"
```
<br>

## Alternate Development Environment - Visual Studio Code

1. Create a python virtual environment (.venv)
   a. When you create the virtual environment, select the /common/requirements.txt

   Or manually

   pip install -r ./common/requirements.txt

2. .venv\scripts\activate

3. Pip install ipykernel

5. Edit the file `credentials.env` with your own values from the services created in step 4.
   - For BLOB_SAS_TOKEN and BLOB_CONNECTION_STRING. Go to Storage Account -> Security + networking -> Shared access signature > Generate SAS

6. **Run the Notebooks in order** using the "GPTSearch (Python 3.12)" kernel (AML) or .venv kernel (vscode). They build up on top of each other.
**Added:**

### **Pre-requisite**
You must have an **Azure OpenAI Service** already created.

### **1. Fork the Repository**
- Fork this repository to your GitHub account.

### **2. Deploy Required Models**
In **Azure OpenAI Studio**, deploy the following models:
*(Note: Older versions of these models will not work)*

- `gpt-4o`
- `gpt-4o-mini`
- `text-embedding-3-large`
- `tts`
- `whisper`
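The Studio steps above can also be scripted. A sketch using the Azure CLI's `az cognitiveservices account deployment create` command; the account name, resource group, and model version below are placeholders you would adjust to your own subscription (available versions vary by region):

```bash
# Placeholder resource names; repeat once per model in the list above
az cognitiveservices account deployment create \
  --name my-aoai-account \
  --resource-group my-rg \
  --deployment-name gpt-4o \
  --model-name gpt-4o \
  --model-version "2024-05-13" \
  --model-format OpenAI \
  --sku-name Standard \
  --sku-capacity 10
```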
### **3. Create a Resource Group**
- Create a **Resource Group (RG)** to house all the assets for this accelerator.
- Note: Azure OpenAI services can exist in a different RG or even a different subscription.

### **4. Deploy Azure Infrastructure**
Click the button below to deploy all necessary Azure infrastructure (e.g., Azure AI Search, Cognitive Services, etc.):

[![Deploy to Azure](https://aka.ms/deploytoazurebutton)](https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2Fpablomarin%2FGPT-Azure-Search-Engine%2Fmain%2Fazuredeploy.json)

**Important:**
If this is your first time creating an **Azure AI Services Multi-Service Account**, do the following manually:
1. Go to the Azure portal.
2. Create the account.
3. Read and accept the **Responsible AI Terms**.
Once done, delete this manually created account and then use the above deployment button.
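The portal button resolves to the repo's `azuredeploy.json` ARM template, so the same deployment can be done from the CLI; a sketch with `az deployment group create` (the resource group name is a placeholder):

```bash
az deployment group create \
  --resource-group my-rg \
  --template-uri https://raw.githubusercontent.com/pablomarin/GPT-Azure-Search-Engine/main/azuredeploy.json
```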
### **5. Choose Your Development Environment**

#### **Option A: Azure Machine Learning (Preferred)**
1. **Clone** your forked repository to your AML Compute Instance.
   - If your repository is private, refer to the **Troubleshooting** section for guidance on cloning private repos.
2. Install the dependencies in a Conda environment. Run the following commands in the **Python 3.12 Conda environment** you plan to use for the notebooks:

```bash
conda create -n GPTSearch python=3.12
conda activate GPTSearch
pip install -r ./common/requirements.txt
conda install ipykernel
python -m ipykernel install --user --name=GPTSearch --display-name "GPTSearch (Python 3.12)"
```
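Notebooks often end up on a different kernel than intended; a quick first-cell sanity check you could add (this helper is a suggestion, not part of the repo):

```python
import sys

def kernel_matches(required=(3, 12)):
    """Return True when the running interpreter is at least the required version."""
    return sys.version_info[:2] >= required

# In the notebook, fail fast if the wrong kernel was selected
print(kernel_matches())
```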
#### **Option B: Visual Studio Code**
1. **Create a Python virtual environment (.venv):**
   - When creating the virtual environment, select the `./common/requirements.txt` file.
   - Alternatively, install dependencies manually:
     ```bash
     pip install -r ./common/requirements.txt
     ```
2. **Activate the virtual environment:**
   ```bash
   .venv\Scripts\activate
   ```
   (On macOS/Linux, use `source .venv/bin/activate` instead.)
3. **Install `ipykernel`:**
   ```bash
   pip install ipykernel
   ```
### **6. Configure Credentials**
Edit the `credentials.env` file with the appropriate values from the services created in Step 4.
- To obtain `BLOB_SAS_TOKEN` and `BLOB_CONNECTION_STRING`, navigate to:
  **Storage Account > Security + Networking > Shared Access Signature > Generate SAS**
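The notebooks read these values from `credentials.env` at runtime (commonly via a dotenv-style loader). A minimal stdlib sketch of that pattern, with placeholder values, just to show the expected `KEY="VALUE"` file shape:

```python
import os

def load_env(path):
    """Parse simple KEY=VALUE lines from a .env-style file into os.environ."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip().strip('"')

# Placeholder values for illustration; real ones come from the portal in Step 6
with open("credentials.env", "w") as f:
    f.write('BLOB_SAS_TOKEN="sv=2024-01-01&sig=placeholder"\n')
    f.write('BLOB_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=placeholder"\n')

load_env("credentials.env")
print(os.environ["BLOB_SAS_TOKEN"])
```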
### **7. Run the Notebooks**
- Execute the notebooks **in order**, as they build on top of each other.
- Use the appropriate kernel:
  - For **AML**, select: `GPTSearch (Python 3.12)`
  - For **VS Code**, select the `.venv` kernel.
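The ordering convention is encoded in the notebook filenames themselves: a numeric prefix sorts them into execution order. A small sketch (the filenames below are hypothetical, not the repo's actual notebooks):

```python
from pathlib import Path

# Hypothetical notebook names just for illustration
for name in ["03-query.ipynb", "01-load-data.ipynb", "02-indexing.ipynb"]:
    Path(name).touch()

# Sorting on the numeric filename prefix recovers the intended execution order
ordered = sorted(p.name for p in Path(".").glob("0[0-9]-*.ipynb"))
print(ordered)
```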
### **Troubleshooting**
- If cloning a private repository: Refer to the detailed guide [here](#).
- For issues with dependency installation: Ensure your Python version matches the required version.
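For private repositories on an AML Compute Instance, one common approach (not necessarily the one the guide above describes) is cloning over HTTPS with a GitHub personal access token; username and token below are placeholders:

```bash
git clone https://<username>:<personal-access-token>@github.com/<username>/GPT-Azure-Search-Engine.git
```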
---

<details>

<summary>Troubleshooting</summary>
