[Bug]: max_length is not passed in the prompt template reduce_system_prompt in Global search.py #1936

@Harshit-54

Description

Do you need to file an issue?

  • I have searched the existing issues and this bug is not already filed.
  • My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.

Describe the bug

`max_length` is not passed when the `reduce_system_prompt` template is formatted in global search's `search.py`, so the non-streaming reduce step throws an error.
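To illustrate the failure mode, here is a minimal, self-contained sketch; the template text and values are stand-ins for the real prompt, since Python's `str.format` raises a `KeyError` for any placeholder that is not supplied:

```python
# Toy stand-in for reduce_system_prompt; per this report, the real
# template contains a {max_length} placeholder.
template = (
    "---Analyst Reports---\n{report_data}\n"
    "Respond in {response_type}. Keep the answer under {max_length} words."
)

try:
    # Mirrors the reported bug: max_length is missing from the kwargs.
    template.format(report_data="...", response_type="multiple paragraphs")
except KeyError as err:
    print(f"format failed: KeyError {err}")  # format failed: KeyError 'max_length'

# Supplying every placeholder, as the streaming path reportedly does, succeeds.
prompt = template.format(
    report_data="...",
    response_type="multiple paragraphs",
    max_length=2000,
)
```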

Steps to reproduce

Perform a global `search()` rather than a `stream_search()`; the error occurs in the reduce step.

I have created a PR with the fix (Harshit-54:harshit-54/bugfix); a sketch of the shape of the change follows below.
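
Presumably the change amounts to passing the missing argument when the reduce prompt is built. A minimal runnable sketch of that shape, with toy stand-ins for the GraphRAG identifiers (not the actual PR diff):

```python
class ReducePromptDemo:
    """Toy stand-in for the global search reduce step; not GraphRAG source."""

    # Stand-in for reduce_system_prompt, with the same three placeholders
    # the report describes.
    reduce_system_prompt = (
        "---Analyst Reports---\n{report_data}\n"
        "Respond in {response_type}. Keep the answer under {max_length} words."
    )

    def __init__(self, response_type: str = "multiple paragraphs", max_length: int = 2000):
        self.response_type = response_type
        self.max_length = max_length

    def build_reduce_prompt(self, report_data: str) -> str:
        # The fix: pass max_length alongside the other template arguments,
        # mirroring what the streaming variant reportedly already does.
        return self.reduce_system_prompt.format(
            report_data=report_data,
            response_type=self.response_type,
            max_length=self.max_length,
        )


print(ReducePromptDemo().build_reduce_prompt("report 1 ... report N"))
```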

Expected Behavior

Global search should complete and return a response without raising an error.

GraphRAG Config Used

# Paste your config here

Logs and screenshots

[Screenshot: the prompt-formatting call in `_reduce_response`]
[Screenshot: the corresponding call in `_stream_reduce_response`]

Additional Information

  • GraphRAG Version:
  • Operating System:
  • Python Version:
  • Related Issues:
