Fixed indexing when no sites are specified #4822
Conversation
PR Summary
Fixed SharePoint connector's site listing functionality to properly discover and index all available sites when no specific sites are provided in the configuration.
- Modified the _fetch_sites method in backend/onyx/connectors/sharepoint/connector.py to correctly iterate through sites.current_page instead of using a single resource_url
- Improved site discovery by using each site's web_url when building site descriptors, enabling comprehensive organization-wide indexing
- Added support for automatic site discovery when no sites are explicitly specified in the connector configuration
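For context, a minimal sketch of the discovery path described above is shown here. The loop body mirrors the diff further down in this review; the method signature, return type, and inline comments are illustrative assumptions, not the verbatim connector code.

    # Sketch of the fixed _fetch_sites discovery path (illustrative only).
    def _fetch_sites(self) -> list[SiteDescriptor]:
        # Enumerate the sites visible to the app across the whole tenant.
        sites = self.graph_client.sites.get_all().execute_query()
        site_descriptors = []
        for site in sites.current_page:
            site_descriptors.append(
                SiteDescriptor(
                    url=site.web_url,  # each site's own URL, not the collection's resource_url
                    drive_name=None,   # no specific drive restriction
                    folder_path=None,  # no specific folder restriction
                )
            )
        return site_descriptors

When no sites are configured, the connector can fall back to this method, e.g. something like site_descriptors = self.site_descriptors or self._fetch_sites() (the attribute name here is an assumption).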
1 file reviewed, 1 comment
  sites = self.graph_client.sites.get_all().execute_query()
- site_descriptors = [
-     SiteDescriptor(
-         url=sites.resource_url,
-         drive_name=None,
-         folder_path=None,
-     )
- ]
+ site_descriptors = []
+ for site in sites.current_page:
+     site_descriptors.append(
+         SiteDescriptor(
+             url=site.web_url,
+             drive_name=None,
+             folder_path=None,
+         )
+     )
logic: Consider handling pagination here - current_page suggests there might be more pages of sites that aren't being fetched.
Suggested change (page through all results instead of reading only the first page):

    sites = self.graph_client.sites.get_all().execute_query()
    site_descriptors = []
    while True:
        for site in sites.current_page:
            site_descriptors.append(
                SiteDescriptor(
                    url=site.web_url,
                    drive_name=None,
                    folder_path=None,
                )
            )
        if not sites.has_next:
            break
        sites.get_next().execute_query()
@wenxi-onyx can we add a test to the test_sharepoint_connector.py file that checks this case?
@Weves Added a test - note that it only checks whether any docs are retrieved. It does not assert an "expected sites" list, because the number of sites in our tenant may change at any time.
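For reference, a test along those lines might look like the sketch below. This is illustrative only: the test name, constructor argument, credential keys, and environment variable names are assumptions rather than the actual contents of test_sharepoint_connector.py; only the module path and the "any docs retrieved" assertion strategy come from this PR discussion.

    import os

    from onyx.connectors.sharepoint.connector import SharepointConnector


    def test_sharepoint_connector_indexes_all_sites() -> None:
        # No sites passed in, so the connector should fall back to
        # organization-wide site discovery.
        connector = SharepointConnector(sites=[])

        # Credential key and environment variable names are illustrative.
        connector.load_credentials(
            {
                "sp_client_id": os.environ["SHAREPOINT_CLIENT_ID"],
                "sp_client_secret": os.environ["SHAREPOINT_CLIENT_SECRET"],
                "sp_directory_id": os.environ["SHAREPOINT_DIRECTORY_ID"],
            }
        )

        # Assert only that *some* documents come back; the number of sites
        # in the test tenant can change at any time, so no exact site list
        # or document count is asserted.
        document_batches = list(connector.load_from_state())
        assert sum(len(batch) for batch in document_batches) > 0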
* Fixed indexing when no sites are specified
* Added test for Sharepoint all sites index
* Accounted for paginated results
* Typing
* Typing

Co-authored-by: Wenxi Onyx <wenxi-onyx@Wenxis-MacBook-Pro.local>
Description
The SharePoint connector didn't correctly list the sites in the organization when no sites were specified in its configuration.
Fixes https://linear.app/danswer/issue/DAN-2044/sharepoint-indexing-bug
How Has This Been Tested?
Created a SharePoint connector with no sites specified and verified that documents were indexed. Also added a test in test_sharepoint_connector.py covering this case.
Backporting (check the box to trigger backport action)
Note: you have to check that the action passes; otherwise, resolve the conflicts manually and tag the patches.