Closed: jas- closed this issue 2 months ago
You could do this in a few ways, depending on the structure you want. I'm assuming you want each line as an entry in an array, with the fields of each line as an object.
You can use the `:{}` object collection syntax to break up a line in `key1=value1,key2=value2` form, which I think is what you're going for. For example:
```sh
$ line=key1=value1,key2=value2
$ json @line:{}
{"line":{"key1":"value1","key2":"value2"}}
$ # This is the same as
$ json line:string{}="key1=value1,key2=value2"
{"line":{"key1":"value1","key2":"value2"}}
```
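For illustration, here's a rough pure-bash sketch of the kind of splitting `:{}` does on a single line. This is a hypothetical snippet, not json.bash's actual implementation: json.bash also handles JSON escaping properly, whereas this sketch assumes plain keys and values containing no commas or quotes.

```sh
#!/usr/bin/env bash
# Sketch only: split key1=value1,key2=value2 into a JSON object,
# assuming values contain no commas, quotes, or characters needing escaping.
line=key1=value1,key2=value2
IFS=, read -ra pairs <<<"$line"   # split the line on commas
sep="" out="{"
for pair in "${pairs[@]}"; do
  # ${pair%%=*} is the key (text before the first =),
  # ${pair#*=} is the value (text after the first =)
  out+="$sep\"${pair%%=*}\":\"${pair#*=}\""
  sep=,
done
printf '%s}\n' "$out"
# prints {"key1":"value1","key2":"value2"}
```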
So you could use this feature to do it like this (assuming `./dataset` is a file containing your data):
`dataset.sh`:

```sh
#!/usr/bin/env bash
. json.bash
readarray -t dataset < ./dataset
objects=()
for line in "${dataset[@]}"; do
  line=${line//:/=}                  # replace : with =
  out=objects json ...:string{}@line # append this line's object to the objects array
done
json @objects:json[]
```
```sh
$ ./dataset.sh | jq
{
  "objects": [
    {
      "Item": "one",
      "property": "oneA",
      "element": "oneB"
    },
    {
      "Item": "two",
      "property": "two",
      "element": "two"
    },
    {
      "Item": "three",
      "property": "three",
      "element": "three"
    },
    {
      "Item": "four",
      "property": "four",
      "element": "four"
    }
  ]
}
```
The `...` merges the entries of `$line` into the parent object instead of making it `"line": {...}`.
I used bash parameter substitution to replace `:` with `=` rather than `tr`, just because it avoids forking. But in an interactive shell you could use `tr` in a pipeline, for example:
```sh
$ jb ...:{}@<(tr < ./dataset : =)
{"Item":"one","property":"one","element":"one","Item":"two","property":"two","element":"two","Item":"three","property":"three","element":"three","Item":"four","property":"four","element":"four"}
```
Note that this example merges all the lines into one object, without the array. In your case that results in duplicate properties, so I guess that's not your intent, but it can be useful in other situations. E.g. this works well to convert environment variables from `env` into objects.
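As a standalone comparison of the two substitution approaches (plain bash, no json.bash required): both produce the same text, the parameter substitution just stays inside the shell instead of forking a `tr` process.

```sh
#!/usr/bin/env bash
line="Item:one,property:oneA,element:oneB"

# Parameter substitution runs inside the shell; no subprocess is forked.
converted=${line//:/=}
echo "$converted"        # Item=one,property=oneA,element=oneB

# tr does the same job, at the cost of a fork per invocation.
echo "$line" | tr : =    # Item=one,property=oneA,element=oneB
```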
Perhaps not applicable here, but when you have objects with variable properties (e.g. you're adding them conditionally), bash associative arrays are useful. The `:{}` syntax detects/supports associative arrays and includes them as objects:
```sh
$ declare -A things=([foo]="example 1" [bar]="example 2")
$ things[baz]="example 3"
$ json @things:{}
{"things":{"foo":"example 1","bar":"example 2","baz":"example 3"}}
```
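For reference, here's a plain-bash sketch of building such an array conditionally; the `add_bar` flag and the printing loop are just for illustration, and `json @things:{}` would serialise whatever ends up in the array:

```sh
#!/usr/bin/env bash
declare -A things=([foo]="example 1")

add_bar=true   # hypothetical condition for this sketch
if [ "$add_bar" = true ]; then
  things[bar]="example 2"   # property is only present when the condition holds
fi

# Iterate keys and values; iteration order is unspecified for associative arrays.
for key in "${!things[@]}"; do
  printf '%s=%s\n' "$key" "${things[$key]}"
done
```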
Well, I feel embarrassed: the above example is in the docs, and I must have overlooked it. Thanks for pointing out this solution! The associative array support is going to be useful for sure. Again, I really appreciate the support on this library.
No problem! There are quite a few ways to do things, and it's not always clear which way is best.
Ok, I have one more question that I couldn't find in the readme. Say I have an existing dataset I want to quickly convert to a JSON object with this library...

I have been trying something similar to the following, where `${obj[@]}` is equal to the data above. Using `set -x`, it shows each iteration of `${new}` as a complete string, so ONLY the first element is used as a key and the remainder as the value. So the problem is obvious; short of another loop, there has to be a better way. I am trying not to use `eval` for obvious reasons, but there must be a simpler way to convert the elements than what I am doing. Thanks for the tips!