algolia / algoliasearch-client-scala

⚡️ A fully-featured and blazing-fast Scala API client to interact with Algolia.
https://www.algolia.com/doc/api-client/getting-started/install/scala/
MIT License

Serialization of `alternatives` attribute in QueryRules conditions is incorrect #576

Closed jonas-depop closed 4 years ago

jonas-depop commented 4 years ago

Description

Serialization of the `alternatives` attribute in query rule conditions is incorrect.

Steps To Reproduce

Create a query rule with `alternatives` enabled (synonyms, ...) and use `AlgoliaSyncHelper` to export the rules. You will get something like:

  {
    "objectID": "qr-1",
    "condition": {
      "pattern": "test",
      "anchoring": "contains",
      "alternatives": {
      }
    },
    "consequence": {
      "params": {
        "numericFilters": [
          "price<0"
        ]
      },
      "filterPromotes": true
    },
    "description": ""
  }

Instead, we should get:

  {
    "objectID": "qr-1",
    "condition": {
      "pattern": "test",
      "anchoring": "contains",
      "alternatives": true
    },
    "consequence": {
      "params": {
        "numericFilters": [
          "price<0"
        ]
      },
      "filterPromotes": true
    },
    "description": ""
  }
aseure commented 4 years ago

Thank you for reporting this @jonas-depop. I'll take a look this week to reproduce the issue on our end and fix it.

aseure commented 4 years ago

@jonas-depop Could you provide us with a Scala snippet showcasing the issue? I've used the following snippet to generate a query rule (with `condition.alternatives` set) but, as shown in the output, the exported rules are correctly deserialized.

Scala snippet

import algolia.AlgoliaDsl._
import algolia.objects.{Alternatives, Condition, Consequence, Rule}
import algolia.{AlgoliaClient, AlgoliaDsl, AlgoliaSyncHelper}
import org.json4s.Formats
import org.json4s.native.Serialization.write

import scala.concurrent.Await
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

object Main extends App {

  implicit val formats: Formats = AlgoliaDsl.formats
  implicit val duration: Duration.Infinite = Duration.Inf

  val appId = "..."
  val apiKey = "..."
  val indexName = "..."
  val client = new AlgoliaClient(appId, apiKey)

  val rule = Rule(
    objectID = "qr-1",
    condition = Condition(
      pattern = "test",
      anchoring = "contains",
      alternatives = Some(Alternatives.`true`)
    ),
    consequence = Consequence(
      params = Some(
        Map(
          "numericFilters" -> "price<0"
        )
      ),
      filterPromotes = Some(true)
    ),
    description = Some("")
  )

  println("Saving the rule...")
  val f = client.execute {
    save.rule(rule).inIndex(indexName)
  }

  println("Waiting for the rule to be saved...")
  Await.ready(f, duration)

  println("Exporting the rules...")
  val rules =
    AlgoliaSyncHelper(client)
      .exportRules(indexName)
      .toSeq
      .flatten

  println(s"rules' length: ${rules.length}")
  println("rules' content:")
  rules.foreach(r => println(write(r)))

  System.exit(0)
}

Output

Saving the rule...
Waiting for the rule to be saved...
Exporting the rules...
rules' length: 1
rules' content:
{"objectID":"qr-1","condition":{"pattern":"test","anchoring":"contains","alternatives":true},"consequence":{"params":{"numericFilters":"price<0"},"filterPromotes":true},"description":""}
jonas-depop commented 4 years ago

The error was mine: I was using the default formats

  implicit private val formats: Formats                = org.json4s.DefaultFormats
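
For context, a minimal sketch of why `DefaultFormats` produces `{}` here (the names below are illustrative stand-ins, not the client's actual internals): if `Alternatives` values are modeled as case objects with no fields, a reflection-based serializer such as json4s' `DefaultFormats` emits an empty JSON object, while `AlgoliaDsl.formats` registers a custom serializer that maps them to JSON booleans.

```scala
// Illustrative stand-ins -- not the client's real definitions.
sealed trait Alternatives
object Alternatives {
  case object True  extends Alternatives
  case object False extends Alternatives
}

// What a field-based serializer (e.g. json4s DefaultFormats) effectively
// does: emit the object's fields. A case object has none, hence "{}".
def defaultStyle(a: Alternatives): String = "{}"

// What a custom serializer (as registered by AlgoliaDsl.formats) does:
// map each case object to a JSON boolean.
def customStyle(a: Alternatives): String = a match {
  case Alternatives.True  => "true"
  case Alternatives.False => "false"
}

println(defaultStyle(Alternatives.True)) // {}
println(customStyle(Alternatives.True))  // true
```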

Sorry for the inconvenience, and thanks for your help.

aseure commented 4 years ago

Thanks for letting us know!

The error is also on us since it was not properly documented. I'll see if we can add that to the README or elsewhere.

jonas-depop commented 4 years ago

Hi @aseure, I do not know if this is a good idea, but if the formats were declared as an implicit directly in `algolia.AlgoliaDsl`, they would be picked up by default when doing `import algolia.AlgoliaDsl._`.

That way we would not have run into this issue.
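
The suggestion above can be sketched as follows. A stand-in `Formats` trait and the object name `AlgoliaDslSketch` are used so the snippet is self-contained; in the real client the type comes from json4s and the object is `algolia.AlgoliaDsl`.

```scala
// Stand-in for org.json4s.Formats, just to keep the sketch self-contained.
trait Formats

object AlgoliaDslSketch {
  // If the library declared its formats as an implicit val like this,
  // `import AlgoliaDslSketch._` would bring it into implicit scope
  // automatically -- no explicit `implicit val formats = ...` needed
  // at the call site.
  implicit val formats: Formats = new Formats {}
}

import AlgoliaDslSketch._

// A method that needs a Formats implicitly, like json4s' write():
def needsFormats(implicit f: Formats): String = "implicit resolved"

println(needsFormats)
```

The trade-off is that any implicit `Formats` already in the caller's scope could then conflict with the imported one, which may be why the library leaves it explicit.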

Thanks again for your support.