
Commit 1484a2b

restructure examples, add nextjs and fastapi starters (#7)
1 parent 0c4df19 commit 1484a2b

Note: large commits hide some content by default, so several file names in the diff below are missing.

61 files changed: +12114 −183 lines

README.md

Lines changed: 6 additions & 9 deletions
```diff
@@ -6,21 +6,17 @@ Clone the repo to see syntax highlights using the BAML VSCode extension!
 
 ### Requirements
 
-1. BAML CLI, BAML VSCode extension
-2. This repository uses ⚠️⚠️⚠️ Poetry ⚠️⚠️⚠️ as the python environment manager. If you want to use Conda, pip or another dependency mgmt, run `baml init` to get yourself setup properly.
-3. Python >=3.9 (Contact us if you have an older version).
+1. BAML CLI
+2. BAML VSCode extension
+3. OPENAI_API_KEY is set in your .env file. See .env.example at the root.
 
 **Contact us on Discord if you need help running the examples using Conda, pip or another dependency mgmt**.
 
 ### Setup
 
-Note: You can always just copy the `.baml` files you want into your own project that you have initialized with `baml init`.
+We recommend running `baml init` in your own project (unless you just want to clone the NextJS or FastAPI starter projects). Then just copy the .baml files and functions you want.
 
-1. Clone the repo
-2. Install [poetry](https://python-poetry.org/docs/)
-3. Run `poetry shell` in the root
-4. Run `poetry install`
-5. Make sure you can ctrl + s one of the .baml files after you install the BAML VSCode extension to generate a baml_client dir.
+Make sure you can ctrl + s one of the .baml files after you install the BAML VSCode extension to generate a baml_client dir.
 
 ## Troubleshooting
 
@@ -30,6 +26,7 @@ Some common steps that help fix things:
 
 1. Make sure you also add the `baml` pip package to the project dependencies if you are not using poetry (see pyproject.toml for the dependency list).
 1. Make sure you're in a poetry shell when you run the python main.py files.
+1. Enable Python > Analysis: Type Checking Mode - Basic or greater.
 1. Make sure environment variables are set (some baml files use an env.OPENAI_API_KEY). Add a .env file to the root dir with the appropriate api key set to fix this.
 1. Restart VSCode if the playground isn't working
 1. Restart VSCode if you're not getting error highlights for baml-generated code, or ensure the right Python interpreter is set (Command + Shift + P -> Select interpreter)
```
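Several of the troubleshooting steps above hinge on that API key being set. A minimal `.env` at the repo root could look like the sketch below; the variable name matches the `env.OPENAI_API_KEY` references in this commit's `.baml` clients, and the value is a placeholder:

```
OPENAI_API_KEY=sk-...
```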

fastapi-starter/README.md

Lines changed: 20 additions & 0 deletions
## Setup

Open this project in VSCode (we recommend opening only this folder, not the repo root, or VSCode may not detect the Python environment and you may not get type completions for BAML functions).

Ensure your `settings.json` has:

```json
{
  "python.analysis.typeCheckingMode": "basic"
}
```

1. Run `poetry install`
2. Run `poetry shell`
3. Open the VSCode command palette (Command + Shift + P) and select the `.venv` folder created in this directory as the interpreter
4. Run `uvicorn fastapi_starter.app:app --reload`
5. Curl the streaming endpoint (a sketch of what this endpoint could look like follows after this file):

```shell
curl -X GET -H "Content-Type: application/json" http://localhost:8000/extract_resume
```
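The `fastapi_starter/app.py` module that the `uvicorn` command points at is hidden in this diff view. As a rough sketch only, the `/extract_resume` streaming endpoint could be wired up along the following lines, reusing the `b.ExtractResume.stream(...)` API demonstrated in `baml_example_app.py` below; the route body, placeholder resume, and newline-delimited JSON framing are assumptions, not the commit's actual code:

```python
# Hypothetical sketch of fastapi_starter/app.py -- the real file is hidden in
# this diff view. The endpoint shape and NDJSON framing are assumptions; the
# b.ExtractResume.stream(...) usage mirrors baml_example_app.py below.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

from baml_client import baml as b

app = FastAPI()

# Placeholder input; a real endpoint would likely read the resume from the request.
EXAMPLE_RESUME = "John Doe\nUniversity of Illinois\nB.S. in Computer Science\nPython, C++"


@app.get("/extract_resume")
async def extract_resume():
    async def stream_partials():
        # Yield each parseable partial parse as one JSON object per line.
        async with b.ExtractResume.stream(EXAMPLE_RESUME) as stream:
            async for partial in stream.parsed_stream:
                if partial.is_parseable:
                    yield partial.parsed.model_dump_json() + "\n"

    return StreamingResponse(stream_partials(), media_type="application/json")
```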

fastapi-starter/baml-README.md

Lines changed: 46 additions & 0 deletions
# Getting started with BAML

## Installations

Make sure to download the [VSCode Playground](https://marketplace.visualstudio.com/items?itemName=gloo.baml).

To use BAML with either Python or TypeScript, you should run:

```shell
$ baml update-client
```

This keeps the client-side libraries in sync. It also prints the commands being run, which you can run manually if they fail.

## Running tests

You can run tests via:

```shell
# To run tests
$ baml test run

# To list tests
$ baml test

# For more help
$ baml test --help
```

## Integrating BAML with Python / TS

You can run:

```shell
$ python -m baml_example_app
```

The `baml_example_app.py` file shows how to import from the code BAML generates.

## Deploying

You don't need the BAML compiler when you deploy / release. Your `baml_client` folder contains everything you may need.

## Reporting bugs

Report any issues on our [GitHub](https://www.github.com/boundaryml/baml) or [Discord](https://discord.gg/BTNBeXGuaS).

fastapi-starter/baml_example_app.py

Lines changed: 100 additions & 0 deletions
```python
"""
Run this script to see how the BAML client can be used in Python.

python -m baml_example_app
"""

import asyncio
from baml_client import baml as b
from datetime import datetime
from typing import List
from typing_extensions import TypedDict


async def extract_resume(resume: str) -> None:
    """
    Extracts the resume and prints the extracted data.
    """
    print("Parsing resume...")
    print(resume[:100] + "..." if len(resume) > 100 else resume)
    parsed_resume = await b.ExtractResume(resume)
    print(parsed_resume.model_dump_json(indent=2))

    await asyncio.sleep(1)
    print("\n\nNow extracting using streaming")
    async with b.ExtractResume.stream(resume) as stream:
        async for x in stream.parsed_stream:
            if x.is_parseable:
                print(f"streaming: {x.parsed.model_dump_json()}")
        response = await stream.get_final_response()
        if response.has_value:
            print(f"\n final: {response.value.model_dump_json(indent=2)}")
        else:
            print("No final response")


class ChatMessage(TypedDict):
    sender: str
    message: str


async def classify_chat(messages: List[ChatMessage]) -> None:
    """
    Classifies the chat and prints the classification.
    """
    print("Classifying chat...")
    chat = "\n".join(map(lambda m: f'{m["sender"]}: {m["message"]}', messages))
    print(chat[:100] + "..." if len(chat) > 100 else chat)

    classification = await b.ClassifyMessage(
        message=chat, message_date=datetime.now().strftime("%Y-%m-%d")
    )
    print("Got categories: ", classification)


async def main():
    resume = """
John Doe
1234 Elm Street
Springfield, IL 62701
(123) 456-7890

Objective: To obtain a position as a software engineer.

Education:
Bachelor of Science in Computer Science
University of Illinois at Urbana-Champaign
May 2020 - May 2024

Experience:
Software Engineer Intern
Google
May 2022 - August 2022
- Worked on the Google Search team
- Developed new features for the search engine
- Wrote code in Python and C++

Software Engineer Intern
Facebook
May 2021 - August 2021
- Worked on the Facebook Messenger team
- Developed new features for the messenger app
- Wrote code in Python and Java
"""
    await extract_resume(resume)

    messages = [
        {"sender": "Alice", "message": "I'm having issues with my computer."},
        {
            "sender": "Assistant",
            "message": "I'm sorry to hear that. What seems to be the problem?",
        },
        {
            "sender": "Alice",
            "message": "It's running really slow. I need to return it. Can I get a refund?",
        },
    ]
    await classify_chat(messages)


if __name__ == "__main__":
    asyncio.run(main())
```
Lines changed: 6 additions & 0 deletions
```json
{
  "input": {
    "message": "This is so frustrating, i bought a laptop and it's not working properly. I want to return it and get my money back. I'm so disappointed",
    "message_date": "2019-01-01T00:00:00Z"
  }
}
```
Lines changed: 6 additions & 0 deletions
```json
{
  "input": {
    "message": "Hi! I'm having an issue with my account. Can you help me?",
    "message_date": "2019-01-01T00:00:00Z"
  }
}
```
Lines changed: 3 additions & 0 deletions
```json
{
  "input": "Jason Doe\nPython, Rust\nUniversity of California, Berkeley, B.S.\nin Computer Science, 2020\nAlso an expert in Tableau, SQL, and C++\n"
}
```
Lines changed: 3 additions & 0 deletions
```json
{
  "input": "Sarah Montez\nHarvard University\nMay 2015-2019\n3.92 GPA\nGoogle\nSoftware Engineer\nJune 2019-Present\n- Backend engineer\n- Rewrote search and uplifted metrics by 120%\n- Used C++ and Python\nMicrosoft\nSoftware Intern\nJune 2018-August 2018\n- Worked on the Windows team\n- Updated the UI\n- Used C++\n"
}
```

fastapi-starter/baml_src/clients.baml

Lines changed: 23 additions & 0 deletions
```baml
client<llm> GPT4 {
  provider baml-openai-chat
  options {
    model gpt-4
    api_key env.OPENAI_API_KEY
  }
}

client<llm> GPT4Turbo {
  provider baml-openai-chat
  options {
    model gpt-4-1106-preview
    api_key env.OPENAI_API_KEY
  }
}

client<llm> GPT3 {
  provider baml-openai-chat
  options {
    model gpt-3.5-turbo
    api_key env.OPENAI_API_KEY
  }
}
```
Lines changed: 33 additions & 0 deletions
```baml
class Resume {
  name string
  education Education[]
  skills string[]
}

class Education {
  school string
  degree string
  year int
}

function ExtractResume {
  input string
  output Resume
}

impl<llm, ExtractResume> version1 {
  client GPT4
  prompt #"
    Parse the following resume and return a structured representation of the data in the schema below.

    Resume:
    ---
    {#input}
    ---

    Output JSON format (only include these fields, and no others):
    {#print_type(output)}

    Output JSON:
  "#
}
```
