RyanMarcus / dirty-json

A parser for invalid JSON
GNU Affero General Public License v3.0

Handling duplicate keys with different strategy #19

Closed RezaRahmati closed 4 years ago

RezaRahmati commented 5 years ago

Hi,

I have a string like this:

   { 
    prop1: 'val1', 
    prop2: { 
         prop3: 'val3' , 
         messages: {something:'val'},  
         messages: { something:'val2', x : {x : 1, y : 5} } 
    } 
    , prop4: 'val4' 
 }

As is obvious, `messages` is duplicated. When I parse this with the library, it takes the last one, which is good for most cases.

I was wondering if it would be possible to have an option for how to treat duplicates; possible values could be `TakeLast`, `TakeFirst`, and `ConvertToArray`.

For `ConvertToArray`, the output would be something like this:

  { 
    prop1: 'val1', 
    prop2: { 
         prop3: 'val3' , 
         messages:  [ 
                    {something:'val'},  
                    { something:'val2', x : {x : 1, y : 5} } 
         ] 
    } 
    , prop4: 'val4' 
 }
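For illustration, here is a minimal sketch of how those three strategies could be applied as a post-processing step, assuming the parser can hand over the raw key/value pairs of an object before they are merged. The function name and the entry-list representation are hypothetical, not part of dirty-json's API.

```javascript
// Hypothetical helper: resolve duplicate keys from an ordered list of
// [key, value] entries according to a strategy. Not part of dirty-json.
function resolveDuplicates(entries, strategy) {
  const out = {};
  for (const [key, value] of entries) {
    if (!(key in out)) {
      // First occurrence: ConvertToArray collects values in an array.
      out[key] = strategy === 'ConvertToArray' ? [value] : value;
    } else if (strategy === 'TakeLast') {
      out[key] = value;              // later value wins
    } else if (strategy === 'ConvertToArray') {
      out[key].push(value);          // keep every value
    }
    // TakeFirst: keep the existing value, ignore the duplicate.
  }
  if (strategy === 'ConvertToArray') {
    // Unwrap keys that were never duplicated.
    for (const key of Object.keys(out)) {
      if (out[key].length === 1) out[key] = out[key][0];
    }
  }
  return out;
}

const entries = [
  ['messages', { something: 'val' }],
  ['messages', { something: 'val2' }],
  ['prop3', 'val3'],
];
console.log(resolveDuplicates(entries, 'ConvertToArray'));
// messages becomes an array of both values; prop3 stays a single value
```

Note the `ConvertToArray` caveat the maintainer raises below: if a duplicated value is itself an array, the caller can no longer tell a collected array from an original one.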
RyanMarcus commented 4 years ago

This is a little funky, because there are a lot of cases where a key could be overridden... for example, what if the overridden value is already a list?

The "solution" I came up with was to add a config option to build a sort of linked list out of duplicate keys.

parse('{"key": 1, "key": 2, "key": [1, 2, 3]}', {"duplicateKeys": true})

will give:

{ key: { value: { value: 1, next: 2 }, next: [ 1, 2, 3 ] } }
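A caller who wants a flat array can unwind that linked list after parsing. Below is a small sketch of such a helper; `flattenDuplicates` is a hypothetical name, not part of dirty-json, and it assumes no genuine object in the data has exactly the two keys `value` and `next` (an inherent ambiguity of this representation).

```javascript
// Hypothetical helper: unwind dirty-json's duplicate-key linked list
// ({ value: ..., next: ... }, nesting leftward) into a flat array.
function flattenDuplicates(node) {
  const isLink = node !== null
    && typeof node === 'object'
    && !Array.isArray(node)
    && 'value' in node && 'next' in node
    && Object.keys(node).length === 2;
  if (isLink) {
    // Earlier duplicates are nested under "value"; the latest is "next".
    return [...flattenDuplicates(node.value), node.next];
  }
  return [node];  // a plain value: wrap it in a one-element array
}

const parsed = { key: { value: { value: 1, next: 2 }, next: [1, 2, 3] } };
console.log(flattenDuplicates(parsed.key));
// → [1, 2, [1, 2, 3]]
```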