
Commit 7cffbef (1 parent: d1dbdb3)

Add unified client API with sync/async support and auto-session management

File tree

10 files changed: +1539 −17 lines


CHANGELOG.md

Lines changed: 22 additions & 0 deletions

```diff
@@ -4,6 +4,28 @@ All notable changes to LocalLab will be documented in this file.
 
 ## [0.4.48] - 2024-03-15
 
+### Client Library Changes
+
+#### Added
+
+- Added a unified client API that works both with and without async/await
+- Added automatic session closing to the Python client
+- Added proper resource management with atexit handlers and finalizers
+- Improved error handling in the Python client
+- Added synchronous context manager support (`with` statement)
+
+#### Changed
+
+- Simplified the client API: the same methods work in both sync and async contexts
+- Updated the Python client to track activity and close inactive sessions
+- Enhanced client session management to prevent resource leaks
+- Bumped the client package version to 0.2.0
+
+#### Fixed
+
+- Fixed unclosed client sessions causing warnings
+- Improved error propagation in streaming responses
+
 ### Changed
 
 - Removed all response formatting from streaming generation
```
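The "works both with and without async/await" behavior listed under "Added" can be sketched with a decorator that checks for a running event loop. Everything below (`syncable`, `DemoClient`) is a hypothetical illustration of the pattern, not the actual LocalLab client code, which this commit does not show:

```python
import asyncio
import functools

def syncable(async_method):
    """Make an async method callable from plain sync code too.

    With no event loop running, the coroutine is driven to completion
    via asyncio.run(); inside a running loop, the awaitable is returned
    for the caller to await. (Hypothetical helper, not LocalLab's code.)
    """
    @functools.wraps(async_method)
    def wrapper(self, *args, **kwargs):
        coro = async_method(self, *args, **kwargs)
        try:
            asyncio.get_running_loop()
        except RuntimeError:
            # Synchronous caller: no loop, so run the coroutine ourselves
            return asyncio.run(coro)
        # Asynchronous caller: hand back the awaitable unchanged
        return coro
    return wrapper

class DemoClient:
    @syncable
    async def generate(self, prompt: str) -> str:
        await asyncio.sleep(0)  # stands in for the real HTTP request
        return f"echo: {prompt}"
```

With this pattern, the same `client.generate(...)` call works with or without `await`, matching the README examples.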

README.md

Lines changed: 27 additions & 7 deletions

````diff
@@ -56,28 +56,46 @@ When you use LocalLab:
    ```
 
 3. **AI Interaction**
+
    ```python
    # Your code sends requests through the client
+   # Async usage
    response = await client.generate("Write a story")
    print(response)  # Server processes and returns AI response
+
+   # Or sync usage (New!)
+   response = client.generate("Write a story")
+   print(response)  # Same result, no async/await needed!
    ```
 
 ## 💡 Quick Examples
 
 ```python
-# Generate text
-response = await client.generate("Hello!")
+# Generate text (async or sync)
+response = await client.generate("Hello!")  # Async
+response = client.generate("Hello!")        # Sync (New!)
 
-# Chat with AI
-response = await client.chat([
+# Chat with AI (async or sync)
+response = await client.chat([  # Async
+    {"role": "user", "content": "Hi!"}
+])
+response = client.chat([  # Sync (New!)
     {"role": "user", "content": "Hi!"}
 ])
 
-# Process multiple prompts
-responses = await client.batch_generate([
+# Process multiple prompts (async or sync)
+responses = await client.batch_generate([  # Async
     "Write a joke",
     "Tell a story"
 ])
+responses = client.batch_generate([  # Sync (New!)
+    "Write a joke",
+    "Tell a story"
+])
+
+# Context manager support (New!)
+with LocalLabClient("http://localhost:8000") as client:
+    response = client.generate("Hello!")  # Auto-closes when done
 ```
 
 [➡️ See More Examples](./docs/guides/examples.md)
@@ -127,6 +145,8 @@ Our [Documentation Guide](./docs/README.md) will help you:
 - **Resource Efficient**: Automatic optimization
 - **Privacy First**: All local, no data sent to cloud
 - **Free GPU**: Google Colab integration
+- **Unified Client API**: Works with or without async/await (New!)
+- **Automatic Resource Management**: Sessions close automatically (New!)
 
 [➡️ See All Features](./docs/features/README.md)
@@ -144,5 +164,5 @@ Our [Documentation Guide](./docs/README.md) will help you:
 
 ---
 
-Made with ❤️ by Utkarsh Tiwari
+Made with ❤️ by Utkarsh Tiwari
 [GitHub](https://github.com/UtkarshTheDev)[Twitter](https://twitter.com/UtkarshTheDev)[LinkedIn](https://linkedin.com/in/utkarshthedev)
````
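The "Sessions close automatically" feature above pairs with the changelog's "track activity and close inactive sessions" item. A minimal sketch of idle-session reaping follows; the class name, the `touch`/`close_if_idle` methods, and the 300-second default are all hypothetical, not LocalLab's actual API:

```python
import time

class IdleTrackingSession:
    """Illustrative sketch of activity tracking with an idle timeout."""

    def __init__(self, idle_timeout: float = 300.0):
        self.idle_timeout = idle_timeout
        self._last_activity = time.monotonic()
        self.closed = False

    def touch(self):
        # Called on every request so the session counts as active
        self._last_activity = time.monotonic()

    def close_if_idle(self) -> bool:
        # Invoked periodically (e.g. from a background task) to reap
        # sessions with no activity within the timeout window
        if not self.closed and time.monotonic() - self._last_activity > self.idle_timeout:
            self.closed = True
        return self.closed
```

The key design point is using `time.monotonic()` rather than wall-clock time, so system clock adjustments cannot prematurely expire (or immortalize) a session.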

client/python_client/__init__.py

Lines changed: 31 additions & 0 deletions

````python
"""
LocalLab Python Client

A Python client for interacting with LocalLab, a local LLM server.

This client can be used both synchronously and asynchronously:

Async usage:
```python
client = LocalLabClient("http://localhost:8000")
response = await client.generate("Hello, world!")
await client.close()
```

Sync usage:
```python
client = LocalLabClient("http://localhost:8000")
response = client.generate("Hello, world!")
client.close()
```
"""

from .client import LocalLabClient

__version__ = "0.2.0"
__author__ = "Utkarsh"
__email__ = "utkarshweb2023@gmail.com"

__all__ = [
    "LocalLabClient",
]
````
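The changelog's "proper resource management with atexit handlers and finalizers" point can be illustrated with `weakref.finalize`, which runs a cleanup callback when an object is garbage-collected or, failing that, at interpreter exit. A minimal sketch, assuming a client that owns a closable session; all names here are hypothetical, not the real client's internals:

```python
import weakref

class _Session:
    """Stand-in for an HTTP session that must be closed."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

class ManagedClient:
    """Sketch of finalizer-based cleanup (illustrative only)."""

    def __init__(self):
        self._session = _Session()
        # Runs session.close() when the client is garbage-collected,
        # or at interpreter exit if the client is still alive then.
        self._finalizer = weakref.finalize(self, self._session.close)

    def close(self):
        # Calling the finalizer runs the cleanup once and marks it
        # dead, so a later GC or exit will not close the session twice.
        self._finalizer()

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.close()
```

Because the finalizer holds only the bound `session.close` method, it does not keep the client itself alive, and an explicit `close()` (or leaving a `with` block) simply triggers the same cleanup early.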

0 commit comments
