Description
From my point of view:
I replaced RESTful with RPC to illustrate my point better.
GraphQL works very well for composing data, thanks to resolvers and dataloaders.
RPC (especially with OpenAPI) has a mature ecosystem around it, such as rate limiting, caching and monitoring,
and it can generate typed clients from openapi.json.
As a Python / FastAPI developer, I realized one day that the resolver and dataloader pattern can be applied to any class-transforming library in a language that supports type annotations, for example pydantic.
That way we can declare complicated, deeply nested schemas and let resolvers / dataloaders handle the fetching process (no need to worry about N+1 queries),
and let the RPC (OpenAPI) ecosystem generate typed clients and methods for consumers.
I've set up a demo repo for this whole process; it covers:
- defining basic schemas
- composing complicated schemas
- defining resolvers and dataloaders; if a loader's return value matches a base schema, that schema can be inherited and extended (the loaders are sketched after the schema code below)
- exposing the API to clients (a minimal endpoint is sketched near the end)
```python
from typing import Optional
from pydantic import BaseModel
from pydantic_resolve import LoaderDepend

# get_blogs, blog_to_comments_loader and user_loader are sketched further below

# 1. define base schemas
class Comment(BaseModel):
    id: int
    content: str
    user_id: int

class Blog(BaseModel):
    id: int
    title: str

class User(BaseModel):
    id: int
    name: str

# 2. inherit and extend from base schemas
class MyBlogSite(BaseModel):
    name: str

    blogs: list["MyBlog"] = []  # quoted: MyBlog is defined further down
    async def resolve_blogs(self):
        return await get_blogs()

    comment_count: int = 0
    def post_comment_count(self):
        # post_* hooks run after every resolver has finished
        return sum([b.comment_count for b in self.blogs])

class MyBlog(Blog):
    comments: list["MyComment"] = []
    def resolve_comments(self, loader=LoaderDepend(blog_to_comments_loader)):
        return loader.load(self.id)

    comment_count: int = 0
    def post_comment_count(self):
        return len(self.comments)

class MyComment(Comment):
    user: Optional[User] = None
    def resolve_user(self, loader=LoaderDepend(user_loader)):
        return loader.load(self.user_id)
```
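For completeness: `get_blogs`, `blog_to_comments_loader` and `user_loader` are referenced but not defined above. Below is a rough sketch of what they could look like, with toy in-memory rows standing in for database queries (the actual implementations live in the demo repo, and I'm assuming LoaderDepend accepts plain async batch functions as used above). The key point is the dataloader contract: a loader receives all collected keys in one batch and returns one result per key, in the same order, which is how the N+1 query problem is avoided.

```python
# Toy in-memory rows (plain dicts matching the base schemas), for illustration only.
_BLOGS = [dict(id=1, title="hello"), dict(id=2, title="world")]
_USERS = {1: dict(id=1, name="alice"), 2: dict(id=2, name="bob")}
_COMMENTS = {
    1: dict(id=1, content="nice", user_id=1),
    2: dict(id=2, content="+1", user_id=2),
    3: dict(id=3, content="thanks", user_id=1),
}
_BLOG_COMMENTS = {1: [1, 2], 2: [3]}  # blog_id -> comment ids

async def get_blogs() -> list[dict]:
    # in a real app: one query against the blogs table
    return _BLOGS

async def blog_to_comments_loader(blog_ids: list[int]) -> list[list[dict]]:
    # in a real app: SELECT ... WHERE blog_id IN (...), grouped by blog_id.
    # One entry per key, in the same order as blog_ids; batching all keys
    # into a single call is what removes the N+1 queries.
    return [[_COMMENTS[cid] for cid in _BLOG_COMMENTS.get(bid, [])] for bid in blog_ids]

async def user_loader(user_ids: list[int]) -> list[Optional[dict]]:
    # in a real app: SELECT ... WHERE id IN (...)
    return [_USERS.get(uid) for uid in user_ids]
```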
The whole schema can be built up progressively, extending the base schemas step by step.
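To sketch the last step, exposing the API to clients: a minimal FastAPI endpoint could look like the following, using pydantic-resolve's Resolver as the entry point (the route path and sample value here are made up for illustration). The response_model puts the nested schema into openapi.json, so typed clients can be generated from it.

```python
from fastapi import FastAPI
from pydantic_resolve import Resolver

app = FastAPI()

@app.get("/my-blog-site", response_model=MyBlogSite)
async def get_my_blog_site():
    site = MyBlogSite(name="my blog site")
    # Resolver walks the object, runs the resolve_* methods, batches the
    # dataloader calls, then runs the post_* methods, and returns the
    # fully populated nested schema.
    return await Resolver().resolve(site)
```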

repo: https://github.com/allmonday/pydantic-resolve-demo
This could be a simple alternative approach to API integration.