Slow Query Performance in Postgres Provider Due to Full count on Large Tables #1969

Open
webb-ben opened this issue Mar 18, 2025 · 0 comments
Labels
bug (Something isn't working), OGC API - Features


@webb-ben (Member)

Description
The Postgres provider currently performs a full `.count()` on every request to populate `numberMatched`, which significantly slows queries on large tables. This makes API responses much slower than necessary, especially when a request returns only a small subset of the features in a large dataset.
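One common mitigation (a sketch with illustrative names, not pygeoapi's actual code) is to read PostgreSQL's planner estimate from `pg_class.reltuples` instead of running a full `COUNT(*)`. Note this only applies to unfiltered counts, since the estimate knows nothing about a request's `WHERE` clause:

```python
def estimated_count_sql(table: str, schema: str = "public") -> str:
    """Build SQL that reads the planner's row estimate for a table.

    reltuples is maintained by VACUUM/ANALYZE, so the value can be
    stale, but reading it is O(1) regardless of table size. All names
    here are hypothetical, not from the pygeoapi codebase.
    """
    return (
        "SELECT reltuples::bigint AS estimate "
        "FROM pg_class "
        f"WHERE oid = '{schema}.{table}'::regclass"
    )

print(estimated_count_sql("big_features"))
# → SELECT reltuples::bigint AS estimate FROM pg_class WHERE oid = 'public.big_features'::regclass
```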

Steps to Reproduce

  1. Set up pygeoapi with a large Postgres table as a data source.
  2. Make a request to an endpoint that queries the table.
  3. Observe that the response time is impacted by the .count() operation.

Expected behavior
Queries should return results faster by avoiding expensive .count() operations on large tables.

Potential workarounds include returning an estimated count (e.g. from `pg_class.reltuples`), capping the count at an upper bound, or only performing the exact count when `resulttype=hits`.
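One possible approach (hypothetical helper names, not pygeoapi's API) is to defer the exact count behind `resulttype=hits`, as some other providers already do, and fall back to a cheap estimate otherwise:

```python
from typing import Callable, Optional

def number_matched(
    resulttype: Optional[str],
    exact_count: Callable[[], int],
    estimate: Callable[[], int],
) -> int:
    """Run the expensive exact count only when the client asked for it
    via resulttype=hits; otherwise use a cheap (possibly stale) estimate.
    """
    if resulttype == "hits":
        return exact_count()
    return estimate()

# Usage with stand-in callables:
exact = lambda: 1_234_567   # would run SELECT count(*) ...
rough = lambda: 1_200_000   # would read the reltuples estimate
print(number_matched("hits", exact, rough))  # → 1234567
print(number_matched(None, exact, rough))    # → 1200000
```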


Environment

  • OS: macOS
  • Python version: 3.10
  • pygeoapi version: 0.20.dev0

Additional context
Similar constraints exist in other providers; some only perform a count of features when `resulttype=hits`.
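Another mitigation sometimes used (an illustration, not something this issue proposes) is a capped count: count at most N matching rows and treat `numberMatched` as a lower bound when the cap is reached. A sketch of the SQL such a provider might build, with placeholder identifiers:

```python
def capped_count_sql(table: str, where: str = "TRUE", cap: int = 10_000) -> str:
    """Build SQL that counts at most `cap` matching rows.

    If the result equals `cap`, the true total is >= cap, so the caller
    should report numberMatched as a lower bound. Identifiers here are
    illustrative placeholders, not pygeoapi internals.
    """
    return (
        "SELECT count(*) AS n FROM "
        f"(SELECT 1 FROM {table} WHERE {where} LIMIT {cap}) AS capped"
    )

print(capped_count_sql("big_features", "geom && ST_MakeEnvelope(0, 0, 1, 1, 4326)"))
```

Unlike the `reltuples` estimate, this respects the request's filter, and its cost is bounded by the cap rather than the table size.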

@webb-ben added the bug (Something isn't working) and OGC API - Features labels on Mar 18, 2025