Description
It would be nice to provide first-class support for asynchronous resolvers. It's true, CRuby can't do much in parallel, but what it can do in parallel (I/O) might be worth supporting.
#168 (DeferredExecution) may provide a platform for this: the execution strategy maintains a queue of "threads" and resolves them one at a time. Perhaps resolvers could signal to graphql-ruby that they're "working", and graphql-ruby could push the thread to the end of the queue. Then, when it's done with other things, it could revisit that thread and see if it's ready (and, if it's the only thread left, wait for it).
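To make that concrete, here's a minimal sketch of the requeue loop. All of the names here (`frame`, `finish`, the shape of the queue entries) are hypothetical, not DeferredExecution's actual API:

```ruby
# Minimal sketch of the requeue idea. Each queue entry pairs a field "frame"
# with the async value its resolver returned (an object responding to
# `.ready?`, `.value` and `.wait`). Names are illustrative only.
def drain(queue)
  until queue.empty?
    frame, async_value = queue.shift
    if async_value.ready?
      frame.finish(async_value.value)   # done: write the result into the response
    elsif queue.empty?
      async_value.wait                  # it's the only work left, so block on it
      frame.finish(async_value.value)
    else
      queue.push([frame, async_value])  # still working: revisit it after other fields
    end
  end
end
```

The only subtlety is the last case: once a pending value is the only work left, blocking on it is the best the strategy can do.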
I'd like to support this behavior in a general way without requiring any dependencies. I know graphql-batch uses promise.rb, and personally, I'm interested in concurrent-ruby's Future.
I can imagine general support working like this:
- You provide a GraphQL-friendly wrapper around the async "primitive" of your choice. Perhaps it's an object that responds to `.ready?`, `.value` and `.wait` (a sketch follows this list).
- You tell GraphQL that the resolve function will return an async response, eg

```ruby
field :remote_call, RemoteThingType, async: true do
  resolve -> (obj, args, ctx) { fetch_from_http(obj.thing_id) }
end

# or ...

field :remote_call, RemoteThingType do
  resolve_async -> (obj, args, ctx) { fetch_from_http(obj.thing_id) }
end

# or ...

field :remote_call, RemoteThingType do
  resolve -> (obj, args, ctx) { GraphQL.async(fetch_from_http(obj.thing_id)) }
end

# or ...

field :remote_call, RemoteThingType do
  resolve GraphQL.async(-> (obj, args, ctx) { fetch_from_http(obj.thing_id) })
end
```
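For the wrapper described in the first bullet, here's a rough sketch built on concurrent-ruby. The class name is made up for illustration, and `fetch_from_http` is the same placeholder as in the snippets above:

```ruby
require "concurrent"

# Hypothetical adapter exposing the `.ready?` / `.value` / `.wait`
# interface on top of a Concurrent::Future.
class FutureWrapper
  def initialize(future)
    @future = future
  end

  def ready?
    @future.complete?   # fulfilled or rejected
  end

  def value
    @future.value       # blocks until the result is available
  end

  def wait
    @future.wait        # blocks without returning the result
  end
end

# Used inside a resolver, mirroring the examples above:
field :remote_call, RemoteThingType do
  resolve -> (obj, args, ctx) {
    FutureWrapper.new(Concurrent::Future.execute { fetch_from_http(obj.thing_id) })
  }
end
```

An equivalent adapter could wrap promise.rb's promises just as easily, which is what would keep the gem dependency-free: graphql-ruby would only ever see those three methods.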
Another important element is that this should also work for asynchronous connection resolvers. In that case, GraphQL "messes with" the resolver you give it, so we should be careful to support that wrapping too.
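Purely to illustrate the concern (none of these names are graphql-ruby internals): the connection machinery wraps the resolver you wrote, so an async-aware version of that wrapper would have to unwrap the async value before it paginates anything:

```ruby
# Hypothetical: how a connection wrapper might cope with an async resolver.
# `inner_resolve` is the proc you wrote; `build_connection` stands in for
# whatever pagination logic gets applied to its result.
def wrap_connection_resolver(inner_resolve, &build_connection)
  -> (obj, args, ctx) {
    raw = inner_resolve.call(obj, args, ctx)
    items = raw.respond_to?(:ready?) ? raw.value : raw  # wait for async results first
    build_connection.call(items, args)                  # then paginate as usual
  }
end
```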
What do you think, would this kind of support be valuable in the gem? Are there other factors that should be considered?