Open yarnball opened 7 years ago
Hello, you can use a proxy model. It's maybe not the cleanest way to do it, but there will probably never be an API for that use case.
Thanks, I've taken your advice, but the new model's index never gets created. Do I need to send it a signal or something?
from django.db import models
from django.db.models.signals import post_save, post_delete, m2m_changed
from django.dispatch import receiver
from django_elasticsearch.models import EsIndexable

# ItemEsSerializer / NewModelEsSerializer are the project's own serializers (not shown)

class Item(EsIndexable, models.Model):
    title = models.CharField(max_length=255, blank=True)
    tag = models.ManyToManyField('Tag', blank=True)

    class Elasticsearch(EsIndexable.Elasticsearch):
        serializer_class = ItemEsSerializer
        fields = ['title', 'tag']

@receiver(post_delete, sender=Item)
def delete_elastic(sender, instance, **kwargs):
    instance.es.delete()

@receiver(post_save, sender=Item)
def index_elastic(sender, instance, **kwargs):
    instance.es.do_index()

# tag_index (defined elsewhere) re-indexes an item when its tags change
m2m_changed.connect(tag_index, sender=Item.tag.through)

class Newmodel(Item):
    class Meta:
        proxy = True

    class Elasticsearch(EsIndexable.Elasticsearch):
        serializer_class = NewModelEsSerializer
        fields = ['title', 'tag']
Yes, unfortunately this is a known Django behavior, see https://code.djangoproject.com/ticket/9318. I didn't really dig, but there should be a workaround in the ticket (something along the lines of the sketch below).
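Roughly, instead of relying on signal sender matching across the proxy boundary, you can trigger indexing through the proxy yourself. This is an untested sketch adapting the receivers from your snippet; it assumes the proxy's es manager really does pick up NewModelEsSerializer, re-fetches the row through the proxy on save, and on delete just builds an unsaved proxy instance carrying the same pk:

from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver

@receiver(post_save, sender=Item)
def index_elastic(sender, instance, **kwargs):
    # Index under Item's own Elasticsearch settings...
    instance.es.do_index()
    # ...and again through the proxy so its serializer / doc type are used too.
    Newmodel.objects.get(pk=instance.pk).es.do_index()

@receiver(post_delete, sender=Item)
def delete_elastic(sender, instance, **kwargs):
    instance.es.delete()
    # The row is already gone, so an unsaved proxy instance just carries the id.
    Newmodel(pk=instance.pk).es.delete()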
If it really doesn't work, another way would be to override the do_index method of the model's ElasticsearchManager. Untested, but it could look like this:
from django_elasticsearch.client import es_client  # the client instance used by django_elasticsearch
from django_elasticsearch.managers import ElasticsearchManager

class ItemEsManager(ElasticsearchManager):
    def do_index(self):
        # Index the instance once per serializer
        for serializer_class in [ItemEsSerializer, NewModelEsSerializer]:
            body = serializer_class(self.model).serialize(self.instance)
            es_client.index(index=self.index,
                            doc_type=self.doc_type,
                            id=self.instance.id,
                            body=body)

class Item(EsIndexable, models.Model):
    es = ItemEsManager()
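And since the goal is for each serializer to end up at its own URL, i.e. its own doc type, the loop could pass a different doc_type per serializer instead of always using self.doc_type. Also untested; the doc type names below are just placeholders taken from the example URLs in the original post:

class ItemEsManager(ElasticsearchManager):
    # Placeholder mapping: which doc type each serializer should be indexed under
    SERIALIZER_DOC_TYPES = {
        ItemEsSerializer: 'model-Item',
        NewModelEsSerializer: 'model-TaskNew',
    }

    def do_index(self):
        for serializer_class, doc_type in self.SERIALIZER_DOC_TYPES.items():
            body = serializer_class(self.model).serialize(self.instance)
            es_client.index(index=self.index,
                            doc_type=doc_type,
                            id=self.instance.id,
                            body=body)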
Hi,
I have a use case where I want to use the same model with two different serializers. Each serializer should be indexed at its own Elasticsearch URL.
Is this possible?
E.g.:
localhost:9200/django/model-Item/
localhost:9200/django/model-TaskNew/