Closed hballard closed 4 years ago
This is awesome!
I will take a closer look next week and provide some feedback then :)
@syrusakbary Thoughts?
Nice!!!
@syrusakbary Any update on this?
I took a look at the implementation and besides some small nits (like the Apollo* naming) it looked good!
I'm waiting on a few things:
Thanks for the comments @syrusakbary. I think your graphql / graphene / promises libraries are amazing...any constructive criticism you have I'm happy to hear. I only used the "Apollo.." naming convention because my implementation was initially based on Apollo's graphql subscription transport protocol. But since the final spec should be merged soon, I'm happy to drop that convention, since it only affects the main subscription transport class. I've been tied up the last couple of months since I published this, so I haven't been able to devote much time to improving it. Some priorities in the near term for me:
Thanks!
@hballard @syrusakbary With the official spec now merged, is there room for contributions on getting subscriptions implemented? I'm interested in using graphene with a new django project, and this would be huge.
Indeed. This would make Django a great fit for angular2 real time apps
@Helw150 - I can't speak for @syrusakbary, but I know I'd welcome any contributions on my repo. I'm not sure if @syrusakbary is interested in integrating it into graphene eventually, prefers to fork it, or wants to go his own way. I started this mainly as a hobby project when I was playing w/ Apollo subscriptions and noticed there wasn't an implementation for graphene (python being my preferred server language). I just pushed a commit to the "tests" branch with about half the subscriptions-transport tests; all the subscriptions-manager tests were added a few weeks ago. The rest of the transport tests should be easier to finish up now; the initial test setup for those slowed me down a bit--some of it was new for me. I haven't had a ton of time to devote to this and I don't really use django...so I'm not sure when I would get to that. My next focus would be Python 3 compatibility...which might not be that difficult. Of course, now that the initial commits for graphql subscriptions have been added to the spec, there's probably lots more to be done outside of these focuses. You can read my summary of my transport tests commit on the issue I created for them here
Update: Initial tests were added as of last weekend (see commit here) and I merged a commit last week that added Python 3 compatibility (2.7, 3.4, 3.5, & 3.6). My next two priorities are adding additional executors (threads, asyncio, etc.) and some type of Django compatibility. I don't really use Django...for those that do: is channels the preferred method for adding real-time services to Django now, or something like django-websocket-redis or django-socketio? I've been doing a little reading on Django and channels...
Great work @hballard. I'd like to use this as inspiration for an example in my ReactQL starter kit to show how subscriptions can be served from a non-Node.js server.
@leebenson - Very cool. Any feedback you can provide is appreciated. Let me know if I can be of assistance.
Per @syrusakbary's previous comment above, "Having a subscriptions integration with Django so we assure that the subscriptions structure is abstracted in a scalable way...", I've been thinking about the best way to do that.
My thought is that it should be fairly straightforward to generalize the concurrency executor methods like @syrusakbary did in graphql-core and have separate executor classes for each concurrency library (probably will even borrow some of the logic in graphql-core for each one). Then the RedisPubsub and SubscriptionTransport classes would utilize the corresponding executor passed in by the user when they instantiate each class. I'd welcome any feedback anyone has (@syrusakbary or others) on this structure.
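To make the idea above concrete, here is a minimal, hypothetical sketch of such an executor abstraction (the class and method names other than RedisPubsub are illustrative, not graphql-core's actual API): each concurrency library gets its own executor class with a common interface, and the pubsub/transport classes use whichever executor the user passes in.

```python
from concurrent.futures import ThreadPoolExecutor

class SyncExecutor:
    """Runs work inline; a stand-in for a gevent or asyncio executor class."""
    def submit(self, fn, *args, **kwargs):
        return fn(*args, **kwargs)

class ThreadExecutor:
    """Runs work on a thread pool, returning a Future."""
    def __init__(self, max_workers=4):
        self._pool = ThreadPoolExecutor(max_workers=max_workers)

    def submit(self, fn, *args, **kwargs):
        return self._pool.submit(fn, *args, **kwargs)

class RedisPubsub:
    """Sketch: uses whichever executor it was instantiated with."""
    def __init__(self, executor=None):
        self.executor = executor or SyncExecutor()

    def publish(self, trigger, payload):
        # a real implementation would push `payload` to a redis channel here
        return self.executor.submit(lambda: (trigger, payload))
```

The point of the design is that swapping gevent for asyncio (or django-channels) only changes which executor is constructed, not the pubsub or transport logic.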
I spent a couple hours reading through the Django channels library this weekend and it would seem it could be treated as just another executor class under this model. Also, anyone familiar w/ Django...seems like I would utilize a "class-based consumer" (inherit from WebsocketConsumer) for the SubscriptionServer. The redis pubsub "wait_and_get_message" method in the RedisPubsub class could be implemented as another consumer. Thoughts on this (from anyone more familiar w/ Django channels)?
How about getting someone from the django core team to assist? I think reactive programming is something that would bring new people to django, which would benefit it in the long run. It was the reason for me to look into other solutions. What do you think?
I'm happy to have any assistance from another contributor--particularly a django core developer. I haven't reached out to them since I don't use django myself, and I first needed to abstract the concurrency executor from the base subscriptions manager and websocket server logic in order to support other concurrency frameworks (like django channels). It would be fairly straightforward to integrate the current gevent version with django-socketio, or to create a simple package similar to flask-sockets to integrate geventwebsocket into django directly. I found a small library on bitbucket that seems to do just that -- django-gevent-websocket (here).
I'm currently working on abstracting the concurrency executor to be able to use asyncio for concurrency as well (vs. the current gevent). I should merge a version of that into master in the next week or so (which will allow use w/ the Sanic web framework using uvloop / asyncio)...and then I was going to turn my attention to django integration, using the same abstraction base. But my plan was to focus more on django-channels...since that seems to be the way forward for django concurrency.
Thank you for the update @hballard that sounds really awesome. :D
I am using django in some projects and would love to replace drf (django-rest-framework) with what you are building. I really prefer reactive programming, and I think a GraphQL api with subscriptions would add a lot of value on top of the django project. I am not a core django person, nor do I have experience with django-channels... I do however have practical experience with django. Feel free to reach out to me if it helps. ;)
@hballard will you be at EuroPython? maybe we can find someone to help during the sprints! :)
Afraid not...I live in Texas (US)...and that would be a bit of a hike!
Did you get to give it a try yet? :)
@Eraldo @hballard Yes, I think I should post an update here, as I'm working fully on subscriptions now.
Some thoughts about my journey: the way Apollo-Subscriptions used to manage subscriptions was not very developer-friendly, requiring hacks around the resolution and a specific PubSub implementation that "bypassed" the GraphQL engine to adapt it for subscriptions. The reason is that the GraphQL-js engine was not ready for subscriptions (it was only able to return either a promise or a static value, but not an async iterator).
However, GraphQL-js recently added a way to subscribe to a GraphQL query (returning an async iterator, a.k.a. Observable), which pushed towards simpler and cleaner implementations of subscriptions that decouple the subscription resolution from the "listener" on the subscription.
That led to better implementations of the transport mechanisms in GraphQL subscriptions, like subscriptions-transport-ws.
So, in summary, subscriptions are something that should be bundled fully into the GraphQL engine, in a way that makes it easy to plug in any mechanism - one that doesn't require any specific pub/sub implementation and, eventually, leaves that decision to the developer (in case they want to use one).
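The async-iterator model described above can be sketched with plain asyncio async generators (names here are illustrative, not graphql-core's actual API): `subscribe` yields a stream of execution results rather than returning a single promise or value, so the engine stays decoupled from any particular pub/sub backend.

```python
import asyncio

async def fake_pubsub():
    # stand-in for a pub/sub source emitting three events
    for i in range(3):
        yield {'counter': i}

async def subscribe(source):
    # each pub/sub event is resolved into a GraphQL-style result payload;
    # the engine never needs to know where `source` gets its events from
    async for event in source:
        yield {'data': event}

async def collect():
    return [result async for result in subscribe(fake_pubsub())]

results = asyncio.run(collect())
# results == [{'data': {'counter': 0}}, {'data': {'counter': 1}}, {'data': {'counter': 2}}]
```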
For the next version of Graphene (2.0) I plan to have subscriptions bundled into the engine :)
There is already a branch in graphql-core where I'm doing the research process.
I will keep updating this thread with more information as I keep working on it.
@syrusakbary what ended up being the recommended approach on this? I'm using graphql python and loving it, and I'm working on an IoT device project where they use mqtt for data observations. Curious if there is a recommended approach, or if it is 'do what thou wilt' for subscriptions in graphql python.
Thanks!
Can we have an example of how subscriptions work with Graphene in Django?
@AgentChris Here is how I managed it: https://github.com/graphql-python/graphene/pull/500#issuecomment-325560994 Also look at https://github.com/graphql-python/graphql-core/issues/149
thanks a lot, I was trying to find an example for the last 2 days
Hi @AgentChris, take a look at this module, maybe you might be interested: graphene-django-subscriptions
I would like to know what is the progress in the implementation of subscriptions into the core of Graphene, could someone clarify please?
@Oxyrus it seems that the mechanism has been decided to be rx observables but the method for delivery to the client is still up to you.
Here's a gist with my solution using django-channels. It's a working proof of concept. The next task is to optimize it for a production environment. Feedback welcome!
@tricoder42 interesting code layout. I was using observables to emit subscriptions, but I was having problems making django pretend to be async... Your approach seems cleaner. To avoid repeating queries, you could use promises and dataloaders to group as much repetitive work as possible into just one computation.
@japrogramer What do you mean by "dataloaders to group"?
The main problem is that workers run in different processes, so I need to run the query at least once per process. The second problem is that workers are intended to be short-lived, so I can't take it for granted that the parsed query is available. A worker can be restarted anytime.
@tricoder42 So Is there hope of getting this into graphene or not? Should it be spun off into its own library?
The only part which would fit into graphene is the Subscription class. Everything else depends on the backend (django-channels or redis), so it might be better to either keep it in a separate package or locally in the project. Anyway, django-channels 2.0 is on the way, which makes the implementation a bit cleaner.
@tricoder42 how does django-channels 2.0 make the implementation cleaner?
@japrogramer I haven't tried it yet, but it should be possible to register custom callbacks to a Group.
Right now it's only possible to add another Channel to a Group, and when a message is sent, it's automatically broadcast to all channels in the Group. This is enough if you know the shape of the data before sending a message. E.g. if you have a Serializer (like in a REST API), you can listen for the post_save signal, serialize the new instance and broadcast it to the Group.
In GraphQL, however, we don't know what shape of data the user requested, so we need to serialize the instance for each Channel (subscriber) in a group separately. That's why I have two layers of notifications - one for model changes and a second for graphql subscriptions. In channels 2.0 it should be possible to register a callback which feeds data to an Observable and returns serialized data.
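The two-layer idea above can be sketched in plain Python (Subscriber and notify_model_changed are illustrative names, not django-channels API): one notification per model change, then a per-subscriber serialization step, since each GraphQL client may have requested a different shape.

```python
class Subscriber:
    def __init__(self, requested_fields, send):
        self.requested_fields = requested_fields  # shape this client asked for
        self.send = send                          # transport callback (e.g. websocket send)

    def deliver(self, instance):
        # stand-in for re-executing the client's GraphQL query per subscriber
        self.send({f: instance[f] for f in self.requested_fields})

subscribers = []

def notify_model_changed(instance):
    # layer 1: called once per model change (e.g. from a post_save signal);
    # layer 2: each subscriber serializes the instance for itself
    for sub in subscribers:
        sub.deliver(instance)

out = []
subscribers.append(Subscriber(['id'], out.append))
subscribers.append(Subscriber(['id', 'name'], out.append))
notify_model_changed({'id': 1, 'name': 'Ada'})
# out == [{'id': 1}, {'id': 1, 'name': 'Ada'}]
```

The same model change produced two differently-shaped payloads, which is why a single Group broadcast of pre-serialized data is not enough here.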
After first real-world implementation and testing I updated the gist with recent code changes:
Still using django-channels 1.x.
Any examples of GraphQL/Graphene using subscriptions in a Flask app? Hopefully something comparable with the Apollo suite?
@tricoder42 I just did some experimenting with Channels 2.0; this is what I have. I'm still digging into the docs. I also don't have much time these days, but I'll update once I get something more substantial going.
Note: on_next can't be async, so the graphsend coroutine there has to be scheduled on the event loop (or made sync).

```python
import json
import asyncio
import rx

from channels.consumer import AsyncConsumer
from django.utils.translation import get_language

from graphene_django.settings import graphene_settings as gqsettings
from .views import DataLoaders

schema = gqsettings.SCHEMA


class GQLConsumer(AsyncConsumer):
    # NOTE: asgiref.SyncToAsync for django ORM

    async def graphsend(self, opID, result, message):
        data = result.data
        await self.send({
            'type': 'websocket.send',
            'text': json.dumps({'data': data, 'type': 'data', 'id': opID}),
        })

    # @allowed_hosts_only
    async def websocket_connect(self, event):
        # TODO: This might need some security, auth users or apps only <10-11-17> #
        await self.send({
            'type': 'websocket.accept',
        })

    async def websocket_receive(self, message):
        # `message` is gone from the channels 2.0 call signature; inspect the
        # content of text_data and bytes_data instead
        clean = json.loads(message['text'])
        gqtype = clean.get('type')
        clean = clean.get('payload')

        if gqtype == 'connection_init':
            await self.send({'type': 'websocket.send',
                             'text': json.dumps({'type': 'connection_ack'})})
        elif gqtype == 'start':
            self.operationName = clean.get('operationName')
            self.query = clean.get('query')
            self.foovar = clean.get('variables')

            # This part acts like a request
            message = dict()
            message['reply_channel'] = self.channel_name
            message['scope'] = self.scope
            message['dataloaders'] = DataLoaders(get_language())
            self.kwargs = {'context_value': message}

            # TODO: Implement weight, can this query run for this user or is it too expensive <10-11-17> #
            # TODO: Implement timeout mechanism <10-11-17> #
            result = schema.execute(self.query, variable_values=self.foovar,
                                    allow_subscriptions=True, **self.kwargs)
            if isinstance(result, rx.Observable):
                consumer = self  # the observer below is not the consumer

                class MyObserver(rx.Observer):

                    def on_next(self, x):
                        # on_next is sync, so schedule the graphsend coroutine
                        # (assumes this fires on the event loop's thread)
                        asyncio.ensure_future(
                            consumer.graphsend(consumer.operationName, x, message))

                    def on_error(self, e):
                        ...

                    def on_completed(self):
                        ...

                result = result.publish().auto_connect()
                result.subscribe(MyObserver())
        elif gqtype == 'stop':
            operationName = clean.get('operationName')
            await self.channel_layer.group_discard(operationName, self.channel_name)
        else:
            await self.send({'type': 'websocket.send',
                             'text': json.dumps({'data': 'connection_ack'})})

    async def websocket_disconnect(self, event):
        if 'Groups' in self.scope['session']:
            # for every Group in the session, unsubscribe the current connection
            for x in self.scope['session']['Groups'].split(','):
                await self.channel_layer.group_discard(x, self.channel_name)
            # finally, del the Groups from the session
            del self.scope['session']['Groups']
```
@kavink Not with Flask, but I did build something with Sanic, which is Flask-like. If you are interested I can send it to you.
See here: https://github.com/graphql-python/graphene/issues/545
Adam Hopkins
also super interested in anything with flask or flask-like
@ahopkins Can you please post the example someplace? Would love to look at it and maybe port it to flask.
@kavink Here is a gist: https://gist.github.com/ahopkins/52bcd7d15de1e0356ee22f82b6cbf9c8
@ahopkins Thanks! How does one publish to the feed and subscribe from the server side? i.e. how can I modify the functions to either send data into the feed to the client, or listen to the client from the feed?
@kavink graphene is intended to use RxPY for that part. I'm currently trying to work out how to use that with channels, so I don't know that much about it, but hopefully it leads you down the right road.
@tricoder42 Thank you for your promising code!
Is it possible to use Group instead of GraphQLSubscriptionStore to store queries (as sending to multiple channels in a group is optimized)?
For example:

```python
# subscribed_groups = {'User': set(), ...}
# subscription_string = transform('subscription mysub on UserNode { ... }')

def on_message(message):
    global subscribed_groups
    ...
    subscription_string = 'sub-%s' % hash(message.content['payload'])
    Group(subscription_string).add(reply_channel)
    model_name = message.content['model']
    subscribed_groups[model_name].add(subscription_string)
    ...

@receiver(post_save, sender=model)
def send_model_update(sender, instance, **kwargs):
    global subscribed_groups
    for group_name in subscribed_groups[model.__name__]:
        Group(group_name).send(...)
```
@heyrict Unfortunately you can't in the general case, because each client might be subscribed for different data. In other words: two clients might use the same subscription, but still fetch different data. I'm using a custom redis store because I need to keep the graphql query around. In django-channels 1.x, workers and the interface are different processes, so I can't share data using globals either.
However, in django-channels 2.x this is solved differently; I'm gonna try it soon.
@japrogramer @ahopkins Thanks for the gist and the pointer to RxPY, I will certainly look at it. But what I'm trying to understand is how it would all work together - Graphene/RxPY and the gist (GraphQL subscriptions); basically trying to wrap my head around it. i.e. should I use RxPY to publish and read from the feed? But I don't see a way to create channels in RxPY; architecture-wise, which components tie together? I can then try to reverse-engineer and implement something working.
I just submitted a pull request to graphql-ws that gives an example of a publish-subscribe implementation, to make it easier to use subscriptions with graphql-ws. I modified the README and examples to show how it might work. Not sure if @syrusakbary would want this as a part of the graphql-ws library or in a separate one. Here is my fork in case you want to try it out.
@japrogramer How do you broadcast changes in DB? Do you use django signals?
I've updated my project to django-channels 2.x and also the gist.
TL;DR:
- Model changes are sent to the django.{app_label}.{model} channel.
- Subscribers listen on the group.django.{app_label}.{model} group (this defaults to the subscription's output_type, but might be overridden in the Subscription.subscribe method).
- The subscription receives a (pk, model_label) tuple as the root_value. Subscription.next receives this tuple as its first parameter and returns None or the corresponding object from the db.

I wish I could parse the query just once and pass a coroutine to the Observable. At initialization, the coroutine would subscribe to django.{app_label}.{model}, and then I would simply call coroutine.send((pk, model)) and the Observer would send serialized data to the client. However, it seems that whenever I pass an iterable to an observable, it always tries to consume it whole, even when it should wait for new data. I'm kinda lost here, but it's just an optimization. The implementation works as it is; now I'm just struggling with unit tests.
Any feedback welcome, cheers!
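The "parse once, then push values in" idea from the comment above can be sketched with a plain generator coroutine in place of an rx Observable (subscription_listener and its parameters are hypothetical names): setup happens once, and each .send((pk, model)) wakes the coroutine with new data instead of consuming an iterable eagerly.

```python
def subscription_listener(serialize, send):
    # setup (e.g. parsing the GraphQL query) would happen once, here
    while True:
        pk, model = yield          # suspends until .send((pk, model)) is called
        send(serialize(pk, model))

sent = []
listener = subscription_listener(
    serialize=lambda pk, model: {'model': model, 'pk': pk},
    send=sent.append,
)
next(listener)                     # prime the coroutine to the first yield
listener.send((1, 'User'))
listener.send((2, 'Post'))
# sent == [{'model': 'User', 'pk': 1}, {'model': 'Post', 'pk': 2}]
```

Unlike an eagerly-consumed iterable, nothing runs between sends; the coroutine simply waits at the yield for the next (pk, model) tuple.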
@tricoder42 you could try using rx.Observable.from_iterable. Also take a look at this page for examples; here is a direct link to a relevant one: https://github.com/thomasnield/oreilly_reactive_python_for_data/blob/master/class_notes/class_notes.md#44---an-observable-emitting-tweets
I'm still trying to wrap my head around observables. I like how, if the resolve method for the subscription receives an object instance, it publishes a new result.
@japrogramer Thank you for the link! I finally got it, see updated gist.
Key points:
- New values are resolved in Subscription.next, which returns the instance to be sent as a new result.

The pipeline is the following:
- model_changed - notification is received on the model_changed channel
- StreamObservable.send - calling stream.send pushes new data down the stream
- Subscription.next - the (pk, model) tuple is resolved into a model instance
- GraphQL Executor - all results from the observable are serialized using the GraphQL executor
- GraphqlSubcriptionConsumer._send_result - finally, new data is sent to the client
Hello @syrusakbary.
Thanks for all your hard work on graphene and graphql-python. Awesome library!!
I posted this on #393 earlier this week...reposting here so it's easier to discover.
I implemented a port of the apollo graphql subscriptions modules (graphql-subscriptions and subscriptions-transport-ws) for graphene / python. They work w/ apollo-client.
It is here.
Same basic api as the Apollo modules. It is still very rough...but works so far, based on my limited internal testing. It uses redis-py, gevent-websockets, and syrusakbary/promises. I was going to add a simple example app, a setup.py for easier install, and more info to the readme w/ the API in the next few days. A brief example is below. It only works on python 2 for now. My plan is to start working on tests as well. I figured I'd go ahead and share at this early stage in case anybody is interested...
I'm very new to open source, so any critiques or pull requests are welcome.
Simple example:
Server (using Flask and Flask-Sockets):
Of course, on the server you have to "publish" each time you have a mutation (in this case to a redis channel). That could look something like this (using graphene / sql-alchemy):
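The original server-side example is not reproduced in this thread; the following is a purely hypothetical sketch of the pattern described, where the mutation publishes the new record after the write so subscribers get notified. FakePubsub stands in for a redis client and add_user for a graphene mutation resolver doing a SQLAlchemy insert; all names are illustrative.

```python
class FakePubsub:
    """Stand-in for a redis pubsub client; collects messages instead of publishing."""
    def __init__(self):
        self.published = []

    def publish(self, channel, payload):
        self.published.append((channel, payload))

pubsub = FakePubsub()

def add_user(name):
    user = {'id': 1, 'name': name}   # stand-in for the SQLAlchemy insert/commit
    pubsub.publish('users', user)    # notify the subscription channel after the write
    return user

add_user('ada')
# pubsub.published == [('users', {'id': 1, 'name': 'ada'})]
```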
Client (using react-apollo client):