Draft
68 commits
a4e135b
fix: use `.get()` on image URL in `ImagePromptValue.to_string()`
mdrxy Aug 15, 2025
9721684
Merge branch 'master' into wip-v1.0
mdrxy Aug 15, 2025
174e685
feat(anthropic): dynamic mapping of Max Tokens for Anthropic (#31946)
keenborder786 Aug 15, 2025
4dd9110
Merge branch 'master' into wip-v1.0
mdrxy Aug 15, 2025
8bd2403
fix: increase `max_tokens` limit to 64000 re: Anthropic dynamic tokens
mdrxy Aug 15, 2025
f0f1e28
Merge branch 'master' of github.com:langchain-ai/langchain into wip-v1.0
mdrxy Aug 19, 2025
dbc5a3b
fix(anthropic): update cassette for streaming benchmark (#32609)
ccurme Aug 19, 2025
6f058e7
fix(core): (v1) update BaseChatModel return type to AIMessage (#32626)
ccurme Aug 21, 2025
7a10861
Merge branch 'master' into wip-v1.0
ccurme Aug 25, 2025
7e9ae5d
feat(openai): (v1) delete `bind_functions` and remove `tool_calls` fr…
ccurme Aug 25, 2025
a2322f6
Merge branch 'master' into wip-v1.0
mdrxy Aug 26, 2025
2428815
feat: standard content, IDs, translators, & normalization (#32569)
mdrxy Aug 27, 2025
e4b69db
Merge branch 'master' into wip-v1.0
ccurme Aug 27, 2025
9a9263a
fix(langchain): (v1) delete unused chains (#32711)
ccurme Aug 27, 2025
cb4705d
chore: (v1) drop support for python 3.9 (#32712)
ccurme Aug 27, 2025
a47d993
release(core): 1.0.0dev0 (#32713)
ccurme Aug 27, 2025
72b66fc
release(core): 1.0.0a1 (#32715)
ccurme Aug 27, 2025
a80fa1b
chore(infra): drop anthropic from core test matrix (#32717)
ccurme Aug 27, 2025
9b57644
release: anthropic, openai 1.0.0a1 (#32723)
ccurme Aug 27, 2025
ddde1ef
fix: openai, anthropic (v1) fix core lower bound (#32724)
ccurme Aug 27, 2025
e09d90b
Merge branch 'master' into wip-v1.0
mdrxy Aug 28, 2025
925ad65
fix(core): typo in `content.py`
mdrxy Aug 28, 2025
f088fac
Merge branch 'master' into wip-v1.0
mdrxy Aug 30, 2025
b494a3c
chore(cli): drop python 3.9 support (#32761)
mdrxy Aug 30, 2025
830d1a2
Merge branch 'master' into wip-v1.0
mdrxy Aug 31, 2025
0f1afa1
chore(text-splitters): drop python 3.9 support (#32771)
mdrxy Aug 31, 2025
431e6d6
chore(standard-tests): drop python 3.9 (#32772)
mdrxy Aug 31, 2025
a5f92fd
fix: update some docstrings and typing
mdrxy Sep 2, 2025
a487412
chore: move `_convert_openai_format_to_data_block` from `langchain_v0…
mdrxy Sep 2, 2025
365d7c4
nit: OpenAI docstrings
mdrxy Sep 2, 2025
4f8cced
chore: move `convert_to_openai_data_block` and `convert_to_openai_ima…
mdrxy Sep 2, 2025
00def6d
rfc: remove unused `TypeGuard`s
mdrxy Sep 2, 2025
9a3ba71
fix: version equality CI check
mdrxy Sep 2, 2025
820e355
Merge branch 'master' into wip-v1.0
mdrxy Sep 2, 2025
5c8837e
fix some imports
mdrxy Sep 2, 2025
1237f94
Merge branch 'wip-v1.0' of github.com:langchain-ai/langchain into wip…
mdrxy Sep 2, 2025
25d5db8
fix: ci
mdrxy Sep 2, 2025
e15c412
feat(openai): (v1) update default `output_version` (#32674)
ccurme Sep 2, 2025
bf41a75
release(openai): 1.0.0a2 (#32790)
ccurme Sep 2, 2025
2cf5c52
release(core): 1.0.0a2 (#32792)
ccurme Sep 2, 2025
98e4e7d
Merge branch 'master' into wip-v1.0
ccurme Sep 2, 2025
a54f438
Merge branch 'master' into wip-v1.0
ccurme Sep 2, 2025
50b48fa
chore(openai): bump minimum core version (#32795)
ccurme Sep 2, 2025
f98f735
refactor(core): refactors for python 3.10+ (#32787)
cbornet Sep 3, 2025
083fbfb
chore(core): add utf-8 encoding to `Path` `read_text`/`write_text` (#…
cbornet Sep 8, 2025
0b8817c
Merge branch 'master' into wip-v1.0
mdrxy Sep 8, 2025
9e54c5f
Merge branch 'master' into wip-v1.0
mdrxy Sep 8, 2025
b1a105f
fix: huggingface lint
mdrxy Sep 8, 2025
8509efa
chore: remove erroneous pyversion specifiers
mdrxy Sep 8, 2025
3c189f0
chore(langchain): fix deprecation warnings (#32379)
cbornet Sep 8, 2025
188c015
Merge branch 'master' into wip-v1.0
mdrxy Sep 8, 2025
20979d5
Merge branch 'master' into wip-v1.0
mdrxy Sep 9, 2025
a48ace5
fix: lint
mdrxy Sep 9, 2025
544b08d
Merge branch 'master' into wip-v1.0
mdrxy Sep 10, 2025
fded6c6
chore(core): remove beta namespace and context api (#32850)
eyurtsev Sep 10, 2025
cb8598b
Merge branch 'master' into wip-v1.0
mdrxy Sep 11, 2025
311aa94
Merge branch 'master' into wip-v1.0
mdrxy Sep 11, 2025
ced9fc2
Merge branch 'master' into wip-v1.0
mdrxy Sep 11, 2025
8b1e254
fix(core): add `on_tool_error` to `_AstreamEventsCallbackHandler` (#3…
vbarda Sep 11, 2025
750a3ff
Merge branch 'master' into wip-v1.0
mdrxy Sep 11, 2025
5ef7d42
refactor(core): remove `example` attribute from `AIMessage` and `Huma…
mdrxy Sep 11, 2025
ffee515
fix lint
mdrxy Sep 11, 2025
3ef1165
Merge branch 'master' into wip-v1.0
mdrxy Sep 11, 2025
207ea46
Merge branch 'master' into wip-v1.0
mdrxy Sep 11, 2025
387d0f4
fix: lint
mdrxy Sep 11, 2025
7cc9312
fix: anthropic test since new dynamic max tokens
mdrxy Sep 11, 2025
9f14714
fix: anthropic tests (stale cassette from max dynamic tokens)
mdrxy Sep 12, 2025
67aa37b
Merge branch 'master' into wip-v1.0
mdrxy Sep 12, 2025
12 changes: 6 additions & 6 deletions .github/scripts/check_diff.py
@@ -121,25 +121,25 @@ def _get_configs_for_single_dir(job: str, dir_: str) -> List[Dict[str, str]]:
if job == "codspeed":
py_versions = ["3.12"] # 3.13 is not yet supported
elif dir_ == "libs/core":
-py_versions = ["3.9", "3.10", "3.11", "3.12", "3.13"]
+py_versions = ["3.10", "3.11", "3.12", "3.13"]
# custom logic for specific directories
elif dir_ == "libs/partners/milvus":
# milvus doesn't allow 3.12 because they declare deps in funny way
-py_versions = ["3.9", "3.11"]
+py_versions = ["3.10", "3.11"]

elif dir_ in PY_312_MAX_PACKAGES:
-py_versions = ["3.9", "3.12"]
+py_versions = ["3.10", "3.12"]

elif dir_ == "libs/langchain" and job == "extended-tests":
-py_versions = ["3.9", "3.13"]
+py_versions = ["3.10", "3.13"]
elif dir_ == "libs/langchain_v1":
py_versions = ["3.10", "3.13"]

elif dir_ == ".":
# unable to install with 3.13 because tokenizers doesn't support 3.13 yet
-py_versions = ["3.9", "3.12"]
+py_versions = ["3.10", "3.12"]
else:
-py_versions = ["3.9", "3.13"]
+py_versions = ["3.10", "3.13"]

return [{"working-directory": dir_, "python-version": py_v} for py_v in py_versions]

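The matrix logic above can be sketched in miniature. The following is a trimmed, hypothetical reconstruction of `_get_configs_for_single_dir` after the 3.9 drop — the real `.github/scripts/check_diff.py` has more branches (e.g. `PY_312_MAX_PACKAGES`, extended-tests), so treat this as an illustration only:

```python
# Trimmed sketch of the post-3.9-drop version matrix; not the real script.
def get_configs_for_single_dir(job: str, dir_: str) -> list[dict[str, str]]:
    if job == "codspeed":
        py_versions = ["3.12"]  # 3.13 is not yet supported
    elif dir_ == "libs/core":
        py_versions = ["3.10", "3.11", "3.12", "3.13"]
    elif dir_ == "libs/partners/milvus":
        # milvus doesn't allow 3.12 because they declare deps in funny way
        py_versions = ["3.10", "3.11"]
    else:
        py_versions = ["3.10", "3.13"]
    # One CI job entry per (directory, python-version) combination
    return [{"working-directory": dir_, "python-version": v} for v in py_versions]

print(get_configs_for_single_dir("codspeed", "libs/core"))
# → [{'working-directory': 'libs/core', 'python-version': '3.12'}]
```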
2 changes: 1 addition & 1 deletion .github/workflows/_release.yml
@@ -371,7 +371,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
-partner: [openai, anthropic]
+partner: [openai]
fail-fast: false # Continue testing other partners if one fails
env:
ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
8 changes: 4 additions & 4 deletions .github/workflows/scheduled_test.yml
@@ -1,5 +1,5 @@
name: '⏰ Scheduled Integration Tests'
-run-name: "Run Integration Tests - ${{ inputs.working-directory-force || 'all libs' }} (Python ${{ inputs.python-version-force || '3.9, 3.11' }})"
+run-name: "Run Integration Tests - ${{ inputs.working-directory-force || 'all libs' }} (Python ${{ inputs.python-version-force || '3.10, 3.13' }})"

on:
workflow_dispatch: # Allows maintainers to trigger the workflow manually in GitHub UI
@@ -9,7 +9,7 @@ on:
description: "From which folder this pipeline executes - defaults to all in matrix - example value: libs/partners/anthropic"
python-version-force:
type: string
-description: "Python version to use - defaults to 3.9 and 3.11 in matrix - example value: 3.9"
+description: "Python version to use - defaults to 3.10 and 3.13 in matrix - example value: 3.11"
schedule:
- cron: '0 13 * * *' # Runs daily at 1PM UTC (9AM EDT/6AM PDT)

@@ -40,9 +40,9 @@ jobs:
PYTHON_VERSION_FORCE: ${{ github.event.inputs.python-version-force || '' }}
run: |
# echo "matrix=..." where matrix is a json formatted str with keys python-version and working-directory
-# python-version should default to 3.9 and 3.11, but is overridden to [PYTHON_VERSION_FORCE] if set
+# python-version should default to 3.10 and 3.13, but is overridden to [PYTHON_VERSION_FORCE] if set
# working-directory should default to DEFAULT_LIBS, but is overridden to [WORKING_DIRECTORY_FORCE] if set
-python_version='["3.9", "3.11"]'
+python_version='["3.10", "3.13"]'
working_directory="$DEFAULT_LIBS"
if [ -n "$PYTHON_VERSION_FORCE" ]; then
python_version="[\"$PYTHON_VERSION_FORCE\"]"
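The override logic in the `run:` block above reduces to a few lines of shell. This standalone sketch (variable names taken from the workflow, behavior simplified) shows how the default matrix is replaced when a manual version override is supplied:

```shell
# Sketch of the scheduled_test.yml matrix-override logic (simplified).
PYTHON_VERSION_FORCE=""   # empty = no manual override (simulates a default scheduled run)

# Default matrix after the 3.9 drop
python_version='["3.10", "3.13"]'
if [ -n "$PYTHON_VERSION_FORCE" ]; then
  # A forced version collapses the matrix to that single entry
  python_version="[\"$PYTHON_VERSION_FORCE\"]"
fi
echo "matrix={\"python-version\": $python_version}"
```

With `PYTHON_VERSION_FORCE` empty, the default `["3.10", "3.13"]` matrix survives untouched.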
34 changes: 17 additions & 17 deletions docs/docs/how_to/multimodal_inputs.ipynb
@@ -58,7 +58,7 @@
},
{
"cell_type": "code",
-"execution_count": 10,
+"execution_count": null,
"id": "1fcf7b27-1cc3-420a-b920-0420b5892e20",
"metadata": {},
"outputs": [
@@ -102,7 +102,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
@@ -133,7 +133,7 @@
},
{
"cell_type": "code",
-"execution_count": 2,
+"execution_count": null,
"id": "99d27f8f-ae78-48bc-9bf2-3cef35213ec7",
"metadata": {},
"outputs": [
Expand Down Expand Up @@ -163,7 +163,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
@@ -176,7 +176,7 @@
},
{
"cell_type": "code",
-"execution_count": 4,
+"execution_count": null,
"id": "325fb4ca",
"metadata": {},
"outputs": [
@@ -198,7 +198,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
@@ -234,7 +234,7 @@
},
{
"cell_type": "code",
-"execution_count": 3,
+"execution_count": null,
"id": "6c1455a9-699a-4702-a7e0-7f6eaec76a21",
"metadata": {},
"outputs": [
@@ -284,7 +284,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
@@ -312,7 +312,7 @@
},
{
"cell_type": "code",
-"execution_count": 4,
+"execution_count": null,
"id": "55e1d937-3b22-4deb-b9f0-9e688f0609dc",
"metadata": {},
"outputs": [
@@ -342,7 +342,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
@@ -417,7 +417,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())"
+"print(response.text)"
"print(response.text)"
]
},
{
@@ -443,7 +443,7 @@
},
{
"cell_type": "code",
-"execution_count": 2,
+"execution_count": null,
"id": "83593b9d-a8d3-4c99-9dac-64e0a9d397cb",
"metadata": {},
"outputs": [
@@ -488,13 +488,13 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())\n",
+"print(response.text)\n",
"response.usage_metadata"
]
},
{
"cell_type": "code",
-"execution_count": 3,
+"execution_count": null,
"id": "9bbf578e-794a-4dc0-a469-78c876ccd4a3",
"metadata": {},
"outputs": [
@@ -530,7 +530,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message, response, next_message])\n",
-"print(response.text())\n",
+"print(response.text)\n",
"response.usage_metadata"
]
},
@@ -600,7 +600,7 @@
},
{
"cell_type": "code",
-"execution_count": 5,
+"execution_count": null,
"id": "ae076c9b-ff8f-461d-9349-250f396c9a25",
"metadata": {},
"outputs": [
@@ -641,7 +641,7 @@
" ],\n",
"}\n",
"response = llm.invoke([message])\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
8 changes: 4 additions & 4 deletions docs/docs/how_to/multimodal_prompts.ipynb
@@ -54,7 +54,7 @@
},
{
"cell_type": "code",
-"execution_count": 2,
+"execution_count": null,
"id": "5df2e558-321d-4cf7-994e-2815ac37e704",
"metadata": {},
"outputs": [
@@ -75,7 +75,7 @@
"\n",
"chain = prompt | llm\n",
"response = chain.invoke({\"image_url\": url})\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
@@ -117,7 +117,7 @@
},
{
"cell_type": "code",
-"execution_count": 4,
+"execution_count": null,
"id": "25e4829e-0073-49a8-9669-9f43e5778383",
"metadata": {},
"outputs": [
@@ -144,7 +144,7 @@
" \"cache_type\": \"ephemeral\",\n",
" }\n",
")\n",
-"print(response.text())"
+"print(response.text)"
]
},
{
4 changes: 2 additions & 2 deletions docs/docs/integrations/chat/anthropic.ipynb
@@ -1527,7 +1527,7 @@
},
{
"cell_type": "code",
-"execution_count": 1,
+"execution_count": null,
"id": "30a0af36-2327-4b1d-9ba5-e47cb72db0be",
"metadata": {},
"outputs": [
@@ -1563,7 +1563,7 @@
"response = llm_with_tools.invoke(\n",
" \"There's a syntax error in my primes.py file. Can you help me fix it?\"\n",
")\n",
-"print(response.text())\n",
+"print(response.text)\n",
"response.tool_calls"
]
},
6 changes: 3 additions & 3 deletions docs/docs/integrations/chat/bedrock.ipynb
@@ -243,12 +243,12 @@
"id": "0ef05abb-9c04-4dc3-995e-f857779644d5",
"metadata": {},
"source": [
-"You can filter to text using the [.text()](https://python.langchain.com/api_reference/core/messages/langchain_core.messages.ai.AIMessage.html#langchain_core.messages.ai.AIMessage.text) method on the output:"
+"You can filter to text using the [.text](https://python.langchain.com/api_reference/core/messages/langchain_core.messages.ai.AIMessage.html#langchain_core.messages.ai.AIMessage.text) property on the output:"
]
},
{
"cell_type": "code",
-"execution_count": 5,
+"execution_count": null,
"id": "2a4e743f-ea7d-4e5a-9b12-f9992362de8b",
"metadata": {},
"outputs": [
@@ -262,7 +262,7 @@
],
"source": [
"for chunk in llm.stream(messages):\n",
-" print(chunk.text(), end=\"|\")"
+" print(chunk.text, end=\"|\")"
]
},
{
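The recurring `.text()` → `.text` edits in these notebooks reflect the v1 change from a method to a property on message objects. A minimal stand-in (hypothetical — not the real `langchain_core` class) illustrates the new calling convention:

```python
# Hypothetical stand-in for an AIMessageChunk-like object; in langchain v1,
# `text` is exposed as a property, so it is read without call parentheses.
class FakeChunk:
    def __init__(self, content: str) -> None:
        self.content = content

    @property
    def text(self) -> str:
        # Property access: chunk.text (v1), not chunk.text() (v0)
        return self.content

chunks = [FakeChunk("Hello"), FakeChunk(" world")]
streamed = "".join(chunk.text for chunk in chunks)
print(streamed)  # Hello world
```

Code written against the old API that still calls `chunk.text()` would, under this property-based stand-in, raise `TypeError: 'str' object is not callable` — which is why every notebook cell in this diff drops the parentheses.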
4 changes: 2 additions & 2 deletions docs/docs/integrations/chat/litellm.ipynb
@@ -261,7 +261,7 @@
},
{
"cell_type": "code",
-"execution_count": 5,
+"execution_count": null,
"id": "c5fac0e9-05a4-4fc1-a3b3-e5bbb24b971b",
"metadata": {
"colab": {
@@ -286,7 +286,7 @@
],
"source": [
"async for token in llm.astream(\"Hello, please explain how antibiotics work\"):\n",
-" print(token.text(), end=\"\")"
+" print(token.text, end=\"\")"
]
},
{