avian2 / jsonmerge

Merge a series of JSON documents.

Merge of all arrays with append strategy fails. #48

Closed dnj12345 closed 4 years ago

dnj12345 commented 4 years ago

Hi. I'm not sure if this is an issue or a lack of understanding on my part. I am trying to merge two complicated JSON objects, and the merge fails with the error:

No element of 'oneOf' validates both base and head: #

I am using a slightly modified version of the schema mentioned in issue #28 and in https://www.tablix.org/~avian/blog/articles/talks/tomaz_solc_jsonmerge.pdf. That is, merge all arrays with the append strategy.

The merge works great for some JSON documents, but it fails for others. I've pasted below a sample Python script that reproduces the problem. Could someone let me know whether there is an error in my code, or whether this is not possible with jsonmerge?

Thanks in advance for any feedback.

Regards,

#!/usr/bin/env python3

import json
import sys

from jsonmerge import Merger

merge_schema = """
{
  "oneOf": [
    { "type": "number" },
    { "type": "string" },
    {
      "type": "array",
      "mergeStrategy": "arrayMergeById",
      "mergeOptions": {
        "idRef": "/"
      }
    },
    {
      "type": "object",
      "additionalProperties": {
        "$ref": "#"
      }
    }
  ]
}
"""

base = """
{
  "version": "1.0",
  "student": {
    "name": "Jane",
    "dob": "1-1-2020",
    "attribute1": {
      "size": 16777216,
      "name": "abc-xyz"
    },
    "class": {
      "type": "custom",
      "exams": [
        "final"
      ],
      "book": {
        "isbn": 1234,
        "name": "ABC Book",
        "author": "JohnDoe"
      }
    },
    "log": [
      "file"
    ]
  },
  "system": {
    "update": true,
    "update-path": "/tmp/file.json",
    "store1": {
      "store-url": "http://www.test.com/students/store1.json",
      "polling-interval": 5,
      "client-id": "abc-112233",
      "expire": {
        "batch-size": 1000,
        "scan-interval": 15
      }
    },
    "store2": {
      "store-url": "http://www.test.com/students/store2.json",
      "polling-interval": 5,
      "client-id": "abc-112233",
      "expire": {
        "batch-size": 1000,
        "scan-interval": 15
      }
    },
    "store3": {
      "store-url": "http://www.test.com/students/store3.json",
      "polling-interval": 5,
      "client-id": "abc-112233"
    },
    "report": {
      "polling-interval": 5,
      "batch-size": 5000,
      "report-type": "file",
      "exports": [
        {
          "name": "export1",
          "server": "server1:8443",
          "topic": "topic1",
          "resonse" : "required",
          "metadata": {
            "meta1": "Meta 1",
            "meta2": "Meta 2",
            "meta3": "Meta 3"
          }
        }
      ]
    }
  }
}
"""

new = """
{
  "student": {
    "class": {
      "type": "regular",
      "name": "no-name-class",
      "exams": [
        "mid-term1",
        "mid-term2"
      ],
      "book": {
        "isbn": 1234,
        "name": "ABC Book",
        "author": "JohnDoe"
      }
    }
  }
}
"""

def JsonMerge(base, new_obj):
  schema = json.loads(merge_schema)
  merger = Merger(schema)
  # The Merger already holds the schema, so it should not be passed to
  # merge() again as a positional argument.
  return merger.merge(base, new_obj)

if __name__ == "__main__":
  bjson = None
  try:
    bjson = json.loads(base)
  except Exception as err:
    print('Base JSON load error. %s' % err)
    sys.exit(-1)

  njson = None
  try:
    njson = json.loads(new)
  except Exception as err:
    print('New JSON load error. %s' % err)
    sys.exit(-1)

  merged = None
  try:
    merged = JsonMerge(bjson, njson)
  except Exception as err:
    print('JSON merge error. %s' % err)
    sys.exit(-1)

  print(json.dumps(merged, indent=2, separators=(',', ': ')))
avian2 commented 4 years ago

Try passing your data through jsonschema.validate before calling merge.
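A minimal sketch of that suggestion, reusing the merge_schema, base and new strings defined in the script above:

import json
import jsonschema

schema = json.loads(merge_schema)

# jsonschema ignores the custom mergeStrategy/mergeOptions keywords and
# checks each document against the plain 'oneOf' structure. The boolean
# value "update": true in the base document matches none of the four
# branches, so this raises a ValidationError pointing at that value.
jsonschema.validate(json.loads(base), schema)
jsonschema.validate(json.loads(new), schema)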

dnj12345 commented 4 years ago

@avian2 Thank you.

After running it through jsonschema.validate, it looks like my schema was missing a branch for booleans. After adding { "type": "boolean" } to the schema, it worked. Thanks again.
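For reference, the corrected oneOf list from the schema above would then read as follows; only the boolean branch is new, everything else is unchanged:

"oneOf": [
  { "type": "number" },
  { "type": "string" },
  { "type": "boolean" },
  {
    "type": "array",
    "mergeStrategy": "arrayMergeById",
    "mergeOptions": {
      "idRef": "/"
    }
  },
  {
    "type": "object",
    "additionalProperties": {
      "$ref": "#"
    }
  }
]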