techknowfile opened this issue 4 years ago
To work around this issue, I've created a custom event that takes a dictionary of lists of Django models, so that I can send a single event over Django Channels containing the entire change payload. I'm now kicking this off from the mutation itself (with the mutation's validated payload) instead of via post_save hooks.
Here's an example of that custom event (note: this code doesn't verify that the objects in the lists are actually Django models):
```python
import json

from django.core import serializers

# base event class; adjust this import to wherever SubscriptionEvent
# actually lives in your setup (graphene-subscriptions assumed here)
from graphene_subscriptions.events import SubscriptionEvent


class DictListModelSubscriptionEvent(SubscriptionEvent):
    """Event whose instance is a dict mapping keys to lists of Django models."""

    def __init__(self, operation=None, instance=None):
        super().__init__(operation, instance)
        # coming back off the channel layer, instance arrives as the JSON
        # string produced by serialize_dict(), so rebuild the model lists
        if isinstance(self.instance, str):
            self.instance = self.deserialize_dict()

    def to_dict(self):
        _dict = super().to_dict()
        _dict["instance"] = self.serialize_dict()
        return _dict

    def serialize_dict(self):
        # serialize each list of models with Django's JSON serializer,
        # then dump the whole mapping to a single JSON string
        result = {}
        for key in self.instance:
            result[key] = serializers.serialize("json", self.instance[key])
        return json.dumps(result)

    def deserialize_dict(self):
        # inverse of serialize_dict(): rebuild the model instances per key
        result = {}
        serialized_dict = json.loads(self.instance)
        for key in serialized_dict:
            result[key] = [
                deserialized.object
                for deserialized in serializers.deserialize("json", serialized_dict[key])
            ]
        return result
```
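For reference, here's roughly how that serialization round-trips. The model names and operation string are placeholders, and the only thing assumed about the base class is the "instance" key that the to_dict() override above already relies on:

```python
# hypothetical models, just to illustrate the round trip
from myapp.models import Author, Book

payload = {
    "authors": list(Author.objects.all()[:2]),
    "books": list(Book.objects.all()[:1]),
}

event = DictListModelSubscriptionEvent(operation="BULK_UPDATED", instance=payload)
wire = event.to_dict()  # wire["instance"] is now a single JSON string

# on the consuming side, the constructor sees a str and deserializes it
received = DictListModelSubscriptionEvent(
    operation="BULK_UPDATED", instance=wire["instance"]
)
assert isinstance(received.instance["authors"][0], Author)
```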
However, I'd still really appreciate a response to the initial question if y'all happen to have more insight:
I have a mutation that takes lists of several different DjangoObjectTypes and calls .save() on each of them inside an atomic transaction. I'm currently queueing up their event.send() calls inside transaction.on_commit() so that the subscription events corresponding to the models' post_save signals only fire if the entire transaction is successful.
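Concretely, that wiring looks roughly like the sketch below (simplified to a single placeholder model, and assuming graphene-subscriptions-style ModelSubscriptionEvent / UPDATED):

```python
from django.db import transaction
from django.db.models.signals import post_save
from django.dispatch import receiver

# assumed imports: graphene-subscriptions' model event and operation constant
from graphene_subscriptions.events import UPDATED, ModelSubscriptionEvent

from myapp.models import Author  # placeholder model


@receiver(post_save, sender=Author)
def queue_author_event(sender, instance, **kwargs):
    event = ModelSubscriptionEvent(operation=UPDATED, instance=instance)
    # defer the send; it only runs if the surrounding atomic block commits
    transaction.on_commit(event.send)
```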
My problem now is this: I want to create a subscription that can send the same payload as what was received in the initial mutation, assuming the mutation's commit is successful (as opposed to one subscription event per change). One way I can see this being done is by keeping my current configuration (one event per Django model updated) and then grouping the events that came from the same transaction together with rxpy.
If I could make the events sent inside of a single on_commit become an Observable sequence, I could simply read each sequence of events into the subscription payload. But it seems that currently, each event becomes its own observable, so there's no notion of a "start" and "end" to the sequence of related events. The best I can do is create a buffer that collects different observables together, but this isn't really a viable solution to my problem.
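To illustrate, the closest I can get with a buffer is something like the sketch below, where an explicit "commit" subject acts as the boundary. Both subjects and the wiring are hypothetical; this isn't how events are currently routed to the subscription:

```python
from rx import operators as ops
from rx.subject import Subject

model_events = Subject()   # each per-model subscription event is pushed here
commit_signal = Subject()  # one emission per successfully committed transaction

# every commit_signal emission closes the current buffer, so each emitted
# list contains exactly the events from one transaction
batches = model_events.pipe(ops.buffer(commit_signal))
batches.subscribe(lambda batch: print(f"{len(batch)} events from one transaction"))

# inside the mutation, after all the saves:
#   transaction.on_commit(lambda: [model_events.on_next(e) for e in queued_events])
#   transaction.on_commit(lambda: commit_signal.on_next(None))
```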
Is it possible to combine multiple Django events into an rxpy observable sequence? Or should I instead be creating a custom event, not tied to my Django models' .save() but rather tied directly to the mutation (something like putting transaction.on_commit(custom_event(payload)) inside the mutate method's transaction.atomic block)?
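For reference, that second option is essentially what the workaround at the top does; a rough sketch (the mutation class, operation name, and build_payload helper are placeholders, and it assumes the same event.send() used for the per-model events):

```python
import graphene
from django.db import transaction


class BulkUpdateMutation(graphene.Mutation):
    ok = graphene.Boolean()

    def mutate(root, info, **kwargs):
        # build_payload is a hypothetical helper that turns the validated
        # input into something like {"authors": [...], "books": [...]}
        validated = build_payload(kwargs)

        with transaction.atomic():
            for instances in validated.values():
                for instance in instances:
                    instance.save()

            # registered inside the atomic block, but it only runs on commit;
            # a rollback means no subscription event goes out
            transaction.on_commit(
                lambda: DictListModelSubscriptionEvent(
                    operation="BULK_UPDATED", instance=validated
                ).send()
            )

        return BulkUpdateMutation(ok=True)
```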