kadler / db2sock-test


Defining multiarray DS json #3

Closed kadler closed 6 years ago

kadler commented 6 years ago

Original report by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


I'm trying to test db2sock with our system, but I have a problem defining JSON that contains DS arrays.

For example, with a program like this:

     H AlwNull(*UsrCtl)

       dcl-ds itemDS qualified;
            field1 char(5);
            field2 char(5);
            field3 char(5);
            field4 char(5);
       end-ds;

       dcl-pr Main extpgm;
         rows zoned(5:0);
         items likeds(itemDS) dim(20);
         last char(10);
       end-pr;

       dcl-pi Main;
         rows zoned(5:0);
         items likeds(itemDS) dim(20);
         last char(10);
       end-pi;

         dcl-s i int(10);
         for i = 1 to rows;
           items(i).field4 = items(i).field1;
           items(i).field3 = items(i).field2;
         endfor;
         last = 'TEST';

       return;

This JSON doesn't crash and gives almost the right results:

{"pgm":[
    {"name":"TPGM",  "lib":"DB2JSON"},
    {"s": {"name":"rows", "type":"5s0", "value":2}},
    {"ds": [
        {"s":[
            {"name":"field1", "type":"5a", "value":"ff1"},
            {"name":"field2", "type":"5a", "value":"ff2"},
            {"name":"field3", "type":"5a", "value":""},
            {"name":"field4", "type":"5a", "value":""}
        ]},
        {"s":[
            {"name":"field1", "type":"5a", "value":"gg1"},
            {"name":"field2", "type":"5a", "value":"gg2"},
            {"name":"field3", "type":"5a", "value":""},
            {"name":"field4", "type":"5a", "value":""}
        ]}
    ]},
    {"s": {"name":"last", "type":"10a", "value":""}}
]}
output(201): {"script":[{"pgm":["TPGM","DB2JSON",{"rows":2},{"":{"field1":"TEST"},{"field2":{}},{"field3":"ff2"},{"field4":"ff1"},{"field1":"gg1"},{"field2":"gg2"},{"field3":"gg2"},{"field4":"gg1"}},{"last":{}}]}]}

The weird thing is that the value TEST goes into DS field1.
Should I define the DS name and dim somewhere? db2sock is still a work in progress, but is there currently a way to get this working?

I am testing this on v7.3, using the newest db2sock built by hand.
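For the record, the `ds` header syntax that this thread converges on (see the later tests like j0184/j0185) attaches `name` and `dim` to the first element of the `ds` array. Applied to the `items` parameter of the program above, that would look something like:

```json
{"ds": [{"name":"items", "dim":20},
    {"s":[
        {"name":"field1", "type":"5a", "value":"ff1"},
        {"name":"field2", "type":"5a", "value":"ff2"},
        {"name":"field3", "type":"5a", "value":""},
        {"name":"field4", "type":"5a", "value":""}
    ]}
]}
```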

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


DS problems should be fixed right now.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Ok, I added the json outareaLen performance change to json_output_printf (slightly changed to make it work, but thanks). I also included the start of the pass-by-value work. You cannot see the full pass-by-value picture/design yet; aka, I needed to test basic register loading on call (uni-size to test). More on this later ...

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


CRLF line endings to LF. Don't know how they got there.

I believe Windows editors tend to employ CRLF. However, I have been on a Linux desktop for decades (LF), so I don't recall all the Windows eccentricities (I think CRLF).

git and CRLF -- but ... opinion seems to fall like rain here for Windows editors
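One common way to stop this kind of CRLF churn (assuming the repo wants LF everywhere; this is standard git behavior, not anything db2sock-specific) is a .gitattributes entry that normalizes text files:

```
# .gitattributes -- store and check out all text files as LF
* text=auto eol=lf
```

With this in place, git stores LF in the repository and checks out LF regardless of which editor touched the file.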

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


Yeah, this definitely needs a straight binary interface. Those changes were just too good to pass up.

many of your changes in commit seem to be no change at all.

The latest commit doesn't actually change anything important; it just converts some CRLF line endings to LF. I don't know how they got there -- they were just messing with my git every time I saved that file.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


to be clear 'big data' ...

We want json to run with the best performance it can (thanks for all the help), but json is not really a 'big data' interface. To be clear, I consider nested arrayed ds structures to be 'big data', which most likely needs a binary-style interface (not yet written in db2sock).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Well, I am not really ready for performance profiling. Too much is changing for just the basics. However, I think the strlen change is a good idea.

BTW -- many of your changes in the commit seem to be no change at all. I think maybe you are using tabs while I am using spaces. I use spaces because most of the code is generated by python scripts. Also, RPG tests generally do not like tabs. So, can you please switch to spaces so I can see proposed changes easily?

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


I got a little impatient, so I did a little "profiling". I found a couple of problematic places and made some fixes. These made things a lot better and the speed is starting to get more reasonable. The biggest difference is definitely in the nested case. Big input json seems to need more work; I'll see if I can find something.

Before:

$ time test1000_sql400json32 j0184_pgm_hamela04-ds-rpg-occurs-500-output
input(5000000):
output(70932):

result:
success (0)

real    0m0.913s
user    0m0.478s
sys     0m0.001s

$ time test1000_sql400json32 j0185_pgm_hamela04-ds-rpg-occurs-500
input(5000000):
output(69932):

result:
success (0)

real    0m2.061s
user    0m1.118s
sys     0m0.002s

$ time test1000_sql400json32 j0188_pgm_hamela05-ds-rpg-nest 
input(5000000):
output(385359):

result:
success (0)

real    0m9.564s
user    0m5.287s
sys     0m0.001s

After:

$ time test1000_sql400json32 j0184_pgm_hamela04-ds-rpg-occurs-500-output
input(5000000):
output(70932):

result:
success (0)

real    0m0.109s
user    0m0.040s
sys     0m0.001s

$ time test1000_sql400json32 j0185_pgm_hamela04-ds-rpg-occurs-500       
input(5000000):
output(69932):

result:
success (0)

real    0m1.314s
user    0m0.516s
sys     0m0.001s

$ time test1000_sql400json32 j0188_pgm_hamela05-ds-rpg-nest      
input(5000000):
output(385359):

result:
success (0)

real    0m0.161s
user    0m0.070s
sys     0m0.001s

edit: It seems that most of the time is used by SQLExecute, so there aren't big gains to be had anymore.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Ok. cleaned up the multi-ds code.

I think you will be able to nest multi-array 'ds' structures to your heart's content. Again, the deeper you nest, the more output json will pop out. So, you may have to continuously increase the buffer of the little test program test1000_sql400json.c.

big data ...

I won't be doing the binary 'big data' interface until after we shake out the major areas in json. We still have to add the popular RPG convention of 'enddo':'count' to help with the massive output of empty array elements. We should probably also add other 'filters' like overlays (see this range cookie cutter), holes (ignore these elements), and so on to allow server-side filtering of the ton-o-json problem you are already witnessing (too slow).
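The 'enddo':'count' filter was not yet implemented at the time of this thread, so the following is a purely hypothetical sketch of how it might attach to the ds header (reusing the outCount field from the tests elsewhere in this thread):

```json
{"ds": [{"name":"output", "dim":200, "enddo":"outCount"},
    {"s":[
        {"name":"out1", "type":"10i0", "value":0},
        {"name":"out2", "type":"5av2", "value":""}
    ]}
]}
```

The idea: the server would emit only as many of the 200 elements as outCount reports, instead of the full array of mostly empty elements.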

Also, on performance: you can't really draw conclusions about performance, or even completely understand the expected output, until we implement all the json filter bells and whistles, caching of ILE resolves, and many other things.

Do you understand? Aka, we have just begun to work with json; this is not to be confused with the endgame.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


BTW -- I was in a hurry last night; I just wanted to fix this for you before the evening meal. I did not fully complete the ile_pgm_copy_ds replacement; I will try to clean the stuff up today. Thanks for your tests.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Nice, no more segfaults.

Cool! More progress then (sausage in the making).

This last test definitely needs straight binary interface because things are starting to take way too long.

Of course, the json interface is not a 'big data' interface. We need a binary interface for 'big data' to be 'fast'.

Or maybe I could explore how something like PHP modules are constructed to get a head start.

Sorry, I already told you the 'big data'/'binary' toolkit interface is not yet written. To be fair, you would most likely be wasting your time. However, education is always a good thing.

We are still doing toolkit 101 basics with json. In fact, I'd offer that it is much easier using the json abstraction to organize toolkit designs. Case in point: the ease with which we are exchanging tests for your needs.

starting to take way too long.

Well, at this point, we are NOT ready to do performance tuning of the json interface. Both ends, PASE and ILE, are missing critical caching at this point. Not to leave you hanging ... I have already mentioned we may be able to simply cache the json 'description' of a program call and only send the json data ... there are many more creative ideas yet to explore here.

Also, you should set a breakpoint in your called RPG program. You will see that calling into your RPG program is already pretty fast, BUT output of tons of json is slow. I imagine there are some output-side performance tuning possibilities, but 1GB of 'string' json is a wild and crazy test to be sure. BTW -- I am very grateful for your wild tests; they will make everything better for all.

explore how something like PHP modules are constructed to get a head start.

Heart of a teacher will be my undoing.

If you are serious about learning how to compile a pecl extension for php (7 or 5.6), I can help you set up a gcc environment that works. I can even provide a little template so you can hit the ground running.

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


Nice, no more segfaults.

This last test definitely needs a straight binary interface because things are starting to take way too long. I have no idea how that will look, so I'll wait and see. Or maybe I could explore how something like PHP modules are constructed to get a head start.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Update: I think I have it fixed now in version 1.0.6-sg1.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Crashing ile_pgm_grow call is coming from ile_pgm_copy_ds

Well, unfortunately, I did not have time today to figure out what got messed up with arrays. I will try to look into the issue tomorrow.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


There is some crazy stuff going on with those values.

Great! I will look into this next test.

I feel my json shouldn't need 1gig of memory, it isn't that horrible.

Yes. This is why I am telling you that the high-speed 'big data' interface will likely not be json.

#!bash

===
1) high speed, big data, exotic calls
high speed direct interface - big data interface (memory calls)
===
php->pecl (unwritten)->libtk400.a (interfaces not written)
note: 
a) should work either within php job (unwritten) 
b) or as stored proc call (db2proc we also use for json, etc. below)

===
2) slower web interfaces, maybe ok for 80% of simple json calls as well
parser interface -- web abstraction interfaces json, xml, etc. (socket, fastcgi, etc.)
===
php->pecl(unwritten)->libjson400.a->libtk400.a 
php->pecl(unwritten)->libxml400.a(unwritten)->libtk400.a 
php->pecl(unwritten)->libcvs400.a(unwritten)->libtk400.a 

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


Indeed, my test program had too small an output buffer. I was wondering why it was crashing somewhat differently than my other program. But now I have a bigger test program that should show some overflows :D

I put some printf statements into ile_pgm_grow to show some values.

diff --git a/toolkit-base/PaseTool.c b/toolkit-base/PaseTool.c
index 8078cea..8954e1e 100644
--- a/toolkit-base/PaseTool.c
+++ b/toolkit-base/PaseTool.c
@@ -1030,7 +1030,10 @@ ile_pgm_call_t * ile_pgm_grow(ile_pgm_call_t **playout, int size) {
     new_len += ILE_PGM_ALLOC_BLOCK;
   }
   /* expanded layout template */
+  printf("size: %d, max: %d, pos: %d, delta: %d, newlen: %d\n",
+    size, layout->max, layout->pos, delta, new_len);
   tmp = tool_new(new_len);
+  printf("tool_new done\n");
   /* copy original data */
   if (orig_len) {
     memcpy(tmp, layout, orig_len);

     H AlwNull(*UsrCtl)

       dcl-ds innerDS qualified;
          field1 char(10);
          field2 char(15);
          field3 char(25);
          field4 char(5);
          field5 char(40);
       end-ds;

       dcl-ds outDS qualified;
          out1 int(10);
          out2 char(5);
          out3 zoned(9:2);
          out4 char(15);
          out5 char(50);
          outTable likeds(innerDS) dim(30);
          out6 char(7);
          out7 char(8);
          out8 char(10);
          out9 zoned(9:2);
       end-ds;

       dcl-ds out2DS qualified;
          o2_out1 char(10);
          o2_out2 char(100);
          o2_out3 char(25);
          o2_out4 char(30);
          o2_out5 zoned(4:2);
       end-ds;

       dcl-pr Main extpgm;
         val int(10);
         inCount int(10);
         input likeds(out2DS) dim(200);
         out2Count int(10);
         output2 likeds(out2DS) dim(200);
         outCount int(10);
         output likeds(outDS) dim(200);
         last char(10);
       end-pr;

       dcl-pi Main;
         val int(10);
         inCount int(10);
         input likeds(out2DS) dim(200);
         out2Count int(10);
         output2 likeds(out2DS) dim(200);
         outCount int(10);
         output likeds(outDS) dim(200);
         last char(10);
       end-pi;

         dcl-s i int(10);
         dcl-s parms int(10);
         val = %parms();
         for i = 1 to %elem(output);
            output(i).out1 = val*val;
            output(i).outTable(1).field2 = 'a' + %char(i);
            output(i).outTable(2).field1 = 'b' + %char(i);
            output(i).outTable(3).field1 = 'c' + %char(i);
            output(i).outTable(4).field2 = 'd' + %char(i);
            output(i).outTable(5).field2 = 'e' + %char(i);
         endfor;
         last = 'TEST';
         outCount = i - 1;
       return;

Running with 64-bit version.

input(1992):                                                                                                                                                                          
{"pgm":[                                                                                                                                                                              
    {"name":"TPGM3", "lib":"DB2JSON"},                                                                                                                                                
    {"s": {"name":"val", "type":"10i0", "value":10}},                                                                                                                                 
    {"s": {"name":"inCount", "type":"10i0", "value":0}},                                                                                                                              
    {"ds": [{"name":"input", "dim":200},                                                                                                                                              
        {"s":[                                                                                                                                                                        
            {"name":"o2_out1", "type":"10a", "value":""},                                                                                                                             
            {"name":"o2_out2", "type":"100a", "value":""},                                                                                                                            
            {"name":"o2_out3", "type":"25a", "value":""},                                                                                                                             
            {"name":"o2_out4", "type":"30a", "value":""}                                                                                                                              
        ]}                                                                                                                                                                            
    ]},                                                                                                                                                                               
    {"s": {"name":"outCount2", "type":"10i0", "value":0}},                                                                                                                            
    {"ds": [{"name":"output2", "dim":200},                                                                                                                                            
        {"s":[                                                                                                                                                                        
            {"name":"o2_out1", "type":"10a", "value":""},                                                                                                                             
            {"name":"o2_out2", "type":"100a", "value":""},                                                                                                                            
            {"name":"o2_out3", "type":"25a", "value":""},                                                                                                                             
            {"name":"o2_out4", "type":"30a", "value":""},                                                                                                                             
            {"name":"o2_out5", "type":"4s2", "value":0}                                                                                                                               
        ]}                                                                                                                                                                            
    ]},                                                                                                                                                                               
    {"s": {"name":"outCount", "type":"10i0", "value":0}},                                                                                                                             
    {"ds": [{"name":"output", "dim":200},                                                                                                                                             
        {"s":[                                                                                                                                                                        
            {"name":"out1", "type":"10i0", "value":0},                                                                                                                                
            {"name":"out2", "type":"5a", "value":""},                                                                                                                                 
            {"name":"out3", "type":"9s2", "value":0},                                                                                                                                 
            {"name":"out4", "type":"15a", "value":""},
            {"name":"out5", "type":"50a", "value":""}
        ]},
        {"ds": [{"name": "innerDS", "dim":30},
            {"s":[
                {"name":"field1", "type":"10a", "value":""},
                {"name":"field2", "type":"15a", "value":""},
                {"name":"field3", "type":"25a", "value":""},
                {"name":"field4", "type":"5a", "value":""},
                {"name":"field5", "type":"40a", "value":""}
            ]}
        ]},
        {"s":[
            {"name":"out6", "type":"7a", "value":""},
            {"name":"out7", "type":"8a", "value":""},
            {"name":"out8", "type":"10a", "value":""},
            {"name":"out9", "type":"9s2", "value":0}
        ]}
    ]},
    {"s": {"name":"last", "type":"10a", "value":""}}
]}

size: 4096, max: 0, pos: 0, delta: 0, newlen: 12288
tool_new done
size: 165, max: 12288, pos: 7942, delta: 26, newlen: 16384
tool_new done
size: 165, max: 16384, pos: 11902, delta: 162, newlen: 20480
tool_new done
size: 165, max: 20480, pos: 16027, delta: 133, newlen: 24576
tool_new done
size: 165, max: 24576, pos: 20152, delta: 104, newlen: 28672
tool_new done
size: 165, max: 28672, pos: 24277, delta: 75, newlen: 32768
tool_new done
size: 165, max: 32768, pos: 28402, delta: 46, newlen: 36864
tool_new done
size: 165, max: 36864, pos: 32527, delta: 17, newlen: 40960
tool_new done
size: 165, max: 40960, pos: 36487, delta: 153, newlen: 45056
tool_new done
size: 169, max: 45056, pos: 40696, delta: 40, newlen: 49152
tool_new done
size: 169, max: 49152, pos: 44752, delta: 80, newlen: 53248
tool_new done
size: 169, max: 53248, pos: 48808, delta: 120, newlen: 57344
tool_new done
size: 169, max: 57344, pos: 52864, delta: 160, newlen: 61440
tool_new done
size: 169, max: 61440, pos: 57089, delta: 31, newlen: 65536
tool_new done
size: 169, max: 65536, pos: 61145, delta: 71, newlen: 69632
tool_new done
size: 169, max: 69632, pos: 65201, delta: 111, newlen: 73728
tool_new done
size: 169, max: 73728, pos: 69257, delta: 151, newlen: 77824
tool_new done
size: 95, max: 77824, pos: 73483, delta: 21, newlen: 81920
tool_new done
size: 80823, max: 81920, pos: 74087, delta: 3513, newlen: 163840
tool_new done
size: 80823, max: 1077952576, pos: 1078033399, delta: -85143, newlen: 1078034432
tool_new done
size: 80823, max: 1078034432, pos: 1078114222, delta: -84110, newlen: 1078116352
Segmentation fault (core dumped)

There is some crazy stuff going on with those values. I feel my json shouldn't need 1 GB of memory; it isn't that horrible.
The crashing ile_pgm_grow call is coming from ile_pgm_copy_ds. Something bad might be happening when calculating those memory positions.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


So, it looks like you will be going nuts with big arrays and nesting. I have decided to increase test1000_sql400json32/64 to 5 million input/output characters. You will need to recompile tests_c (make tgt32 tgt64 install). You can see the output buffer size in the output (below).

#!bash

bash-4.3$ ./test1000_sql400json32 j0188_pgm_hamela05-ds-rpg-nest
input(5000000):

You did not include the invocation of test1000_sql400json32/64, so I don't know how big your in/out buffers are ... very possible you wrote off the end in that last huge nested-array test.

BTW -- again, great to test json with everything (thanks), but the 'big data' interface calling the toolkit directly (without json) is not available yet.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Mmmm ... we must be out of sync with repositories, as this works fine for me ...

Input:

#!json

{"pgm":[
    {"name":"HAMELA05", "lib":"DB2JSON"},
    {"s": {"name":"val", "type":"10i0", "value":10}},
    {"s": {"name":"outCount", "type":"10i0", "value":0}},
    {"ds": [{"name":"output", "dim":200},
        {"s":[
            {"name":"out1", "type":"10i0", "value":0},
            {"name":"out2", "type":"5av2", "value":""}
        ]},
        {"ds": [{"name": "innerDS", "dim":30},
            {"s":[
                {"name":"field1", "type":"10a", "value":""},
                {"name":"field2", "type":"15a", "value":""},
                {"name":"field2", "type":"25a", "value":""},
                {"name":"field2", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"out3", "type":"10av2", "value":""}}
    ]},
    {"s": {"name":"last", "type":"10a", "value":""}}
]}

Output:

#!bash

{
    "script": [{
        "pgm": ["HAMELA05", "DB2JSON", {
            "val": 128
        }, {
            "outCount": 200
        }, {
            "output": [
                [{
                    "out1": 16384
                }, {
                    "out2": {}
                }, {
                    "innerDS": [
                        [{
                            "field1": {}
                        }, {
                            "field2": "a1"
                        }, {
                            "field2": {}
                        }, {
                            "field2": {}
                        }],
                        [{
                            "field1": "b1"
                        }, {
                            "field2": {}
                        }, {
                            "field2": {}
                        }, {
                            "field2": {}
                        }],
                        [{
                            "field1": "c1"
                        }, {
                            "field2": {}
                        }, {
                            "field2": {}
                        }, {
                            "field2": {}
                        }],
                        [{
                            "field1": {}
                        }, {
                            "field2": "d1"
                        }, {
                            "field2": {}
                        }, {
                            "field2": {}
                        }],
                        [{
                            "field1": {}
                        }, {
:
goes on for many pages (big, big, big, return)
:
                        }, {
                            "field2": {}
                        }]
                    ]
                }, {
                    "out3": {}
                }]
            ]
        }, {
            "last": "TEST"
        }]
    }]
}

possibilities...

... maybe update your fork repository to match mine???

... maybe re-compile the test_c directory??? I changed test1000_sql400json.c to have a much bigger buffer for output (a million chars, up from 512k a while back).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Okay, thanks, I got the connect working. Also, if the time comes, the connections will probably be made using something like SQL400Connect(). That seems like a better way to do things.

Yes. In fact, default connections are already made by SQL400Connect. Wherein, default simply means use the current active profile for DB2 operations including toolkit calls.

Need for connection speed ...

Eventually everybody understands the "need for speed" of connection pooling. We have already built connection pooling into the new libdb400.a driver with SQL400pConnect, a persistent connection hash based on db/uid/pwd/qual (a "keyed" connection).

#!bash

php->ibm_db2(newish)->libdb400.a->...
... SQL400Connect(db,uid,pwd)->QSQSRVR/toolkit/db2
    (stateless connection 
     -- open/close each "script")
... SQL400pConnect(db,uid,pwd,qual)->QSQSRVR/toolkit/db2 
    (stateful connection (reuse qual) 
     -- open until force closed)

The json interface to SQL400Connect includes "qual":"anykey"; this will allow connection-pooling calls to SQL400pConnect with "anykey" to re-use the connection (db/uid/pwd filled in automatically, of course).

#!json

{"connect":[{"db":"*LOCAL","uid":"DB2","pwd":"YIKES","qual":"mykey"},
  {"pgm":[{"name":"HELLO","lib":"DB2JSON"},
        {"s":{"name":"char", "type":"128a", "value":"Hi there"}}
       ]}
]}

to infinity and beyond ...

In fact, when libdb400.a implements the new "socket interface" (ssh, traditional, web), we could even set up db2 daemons similar to MySQL. Therein we could have very sophisticated connection pooling, including wild ideas like ...

... "private connection" -- script/user active cursors, many script invocations

... "abandon connections" -- detect bad user programs hanging on MSGW (toolkit)

... so on

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


You have multiple items going on today. Again, thanks for testing. I will look into each.

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


So nested arrays aren't working anymore when I make them stupidly large. Something is overflowing again, or it can't handle this nesting madness.

     H AlwNull(*UsrCtl)

       dcl-ds innerDS qualified;
          field1 char(10);
          field2 char(15);
          field3 char(25);
          field4 char(5);
       end-ds;

       dcl-ds outDS qualified;
          out1 int(10);
          out2 varchar(5:2);
          outTable likeds(innerDS) dim(30);
          out3 varchar(10:2);
       end-ds;

       dcl-pr Main extpgm;
         val int(10);
         outCount int(10);
         output likeds(outDS) dim(200);
         last char(10);
       end-pr;

       dcl-pi Main;
         val int(10);
         outCount int(10);
         output likeds(outDS) dim(200);
         last char(10);
       end-pi;

         dcl-s i int(10);
         dcl-s parms int(10);
         val = %parms();
         for i = 1 to %elem(output);
            output(i).out1 = val*val;
            output(i).outTable(1).field2 = 'a' + %char(i);
            output(i).outTable(2).field1 = 'b' + %char(i);
            output(i).outTable(3).field1 = 'c' + %char(i);
            output(i).outTable(4).field2 = 'd' + %char(i);
            output(i).outTable(5).field2 = 'e' + %char(i);
         endfor;
         last = 'TEST';
         outCount = i - 1;
       return;

input(797):
{"pgm":[
    {"name":"TPGM3", "lib":"DB2JSON"},
    {"s": {"name":"val", "type":"10i0", "value":10}},
    {"s": {"name":"outCount", "type":"10i0", "value":0}},
    {"ds": [{"name":"output", "dim":200},
        {"s":[
            {"name":"out1", "type":"10i0", "value":0},
            {"name":"out2", "type":"5av2", "value":""}
        ]},
        {"ds": [{"name": "innerDS", "dim":30},
            {"s":[
                {"name":"field1", "type":"10a", "value":""},
                {"name":"field2", "type":"15a", "value":""},
                {"name":"field2", "type":"25a", "value":""},
                {"name":"field2", "type":"5a", "value":""}
            ]}
        ]},
        {"s": {"name":"out3", "type":"10av2", "value":""}}
    ]},
    {"s": {"name":"last", "type":"10a", "value":""}}
]}

Segmentation fault (core dumped)

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


I also did a test with nested DS arrays and they seem to give correct results.

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


Okay, thanks, I got the connect working. Also, if the time comes, the connections will probably be made using something like SQL400Connect(). That seems like a better way to do things.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


So, I don't encourage connect in the json interface, but connect is a "parent" to the actions that follow. I recommend the cmd in the previous post.

#!json

bash-4.3$ ./test1000_sql400json32 j0701_connect_pgm_hello                    
input(1000000):
{"connect":[{"db":"*LOCAL","uid":"DB2","pwd":"YIKES"},
  {"pgm":[{"name":"HELLO","lib":"DB2JSON"},
        {"s":{"name":"char", "type":"128a", "value":"Hi there"}}
       ]}
]}

output(63):
{"script":[{"pgm":["HELLO","DB2JSON",{"char":"Hello World"}]}]}
kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Well, for *LIBL I would use a cmd to set it, avoiding a 'connect' at all (1).

#!json

{"script":[
  {"cmd":{"exec":"CHGLIBL LIBL(DB2JSON QTEMP) CURLIB(DB2JSON)"}},
  {"pgm":[{"name":"HELLO"},
        {"s":{"name":"char", "type":"128a", "value":"Hi there"}}
       ]}
]}

(1) This json interface is mostly for REST calling, therefore, passing a "profile" is generally an unnatural act. There is much more to this story ... fun story ... shocking ending ...

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


Good thing you found the real problem with ILE_PGM_MAX_ARGS. I was just poking around and saw that making it bigger "fixed" the problem.

I found out why calling my real program didn't work. The program was crashing because the LIBL wasn't correct and some needed programs weren't found. It is just weird that the call simply stalls; it does not give any errors to the json client.

This program should show the error.

     H AlwNull(*UsrCtl)
       dcl-pr Main extpgm;
         hello char(128);
       end-pr;

       dcl-pr noprog extpgm;
       end-pr;

       dcl-pi Main;
         hello char(128);
       end-pi;
         hello = 'Hello World';
         noprog();
       return;

But I got my program to work by defining the correct libl, like in the example j0301_cmd_pgm_hello.json.

I also tried to use the connect parameter to give it a user that already has the correct libl set, but I can't seem to get it to work; it only gives empty output. Is this how the connect should be defined?

{"script": [
    {"connect":{"db": "*LOCAL", "uid": "USER", "pwd": "PASSWORD"}},
    {"pgm":[
        {"name":"MYPGM"},
    ...
    ]}
]}

output(13): {"script":[]}
kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


I ran a test of 500 occurs output in json (not input). Output is more typical to me, but still seems a bit slow ... needs some performance work in the json interface. Again, not for production yet, so we will have to look at this after getting basic conversion functions to work. Aka, we are still writing hello world programs with arrays and such that have failed, so just not ready for big performance tuning work yet.

SuperDriver - version 1.0.5-sg9

BTW -- I suspect when the high speed direct call libtk400.a is written, it will also need performance work before all is said and done (always the case). Well, this is what you get when you watch sausage being made in the db2sock factory. If you don't like the watching, well, come back in December when done (hopefully).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


BTW -- also, we did not have to change ILE_PGM_MAX_ARGS (a really, really good thing). The problem was in ile_pgm_grow (as I suspected). You can see the source change commit along with the hamela04 occurs 500 test.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Ok fixed your input data occurs 500 test.

SuperDriver - version 1.0.5-sg8

Thanks for the big test. However, I do not consider setting 500 array elements in a 'ds' a json interface style test. This is better suited to the 'big data' direct libtk400.a memory call interface (not written yet).

#!bash

===
1) high speed, big data, exotic calls
high speed direct interface - big data interface (memory calls)
===
php->pecl (unwritten)->libtk400.a (interfaces not written)
note: 
a) should work either within php job (unwritten) 
b) or as stored proc call (db2proc we also use for json, etc. below)

===
2) slower web interfaces, maybe ok for 80% of simple json calls as well
parser interface -- web abstraction interfaces json, xml, etc. (socket, fastcgi, etc.)
===
php->pecl(unwritten)->libjson400.a->libtk400.a 
php->pecl(unwritten)->libxml400.a(unwritten)->libtk400.a 
php->pecl(unwritten)->libcvs400.a(unwritten)->libtk400.a 

BTW -- the only mistake in your test was 499 elements, one short of the required 500.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


€ is the symbol for the euro. It should be at least in the iso8859-15 and utf8 charsets. No idea what should be changed to get it to work. I hope we don't use it often.

I think we should open a different issue for ccsid (hell). As a suggestion, I have found that editors that actually edit in 1208 (utf-8) tend to make issues disappear. That is, your editor may be in iso8859-15, wherein the ascii<>ebcdic conversion fails on this symbol. However, if your editor was in 1208 (utf-8), everything may just work.

Side note: assuming you are using my little json test program test1000_sql400json, we could check the input in this little program and convert from whatever the editor is using (say iso8859-15) into utf-8. There are some handy new SQL400 interfaces that may work (SQL400ToUtf8, SQL400FromUtf8).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Also I must compile toolkit-base, toolkit-parser-json and db2proc to make things work. So I don't know what is going on with it.

Admirably irrepressible, you are probably going to ignore my warning about ILE_PGM_MAX_ARGS, not to mention using the 'big data' interface instead of json here (damn the torpedoes). The following c structure is used both in toolkit-base and ILE-PROC, which is why you must compile both ends to make any change to ILE_PGM_MAX_ARGS.

#!c

#define ILE_PGM_MAX_ARGS 128
#define ILE_PGM_ALLOC_BLOCK 4096
typedef struct ile_pgm_call_struct {
#ifdef __IBMC__
  /* pad blob alignment */
  int blob_pad[3];
  /* ILE address (set ILE side) */
  char * argv[ILE_PGM_MAX_ARGS];
#else
  /* pad pase alignment */
  int blob_pad[4];
  /* ILE address (untouched PASE side) */
  ILEpointer argv[ILE_PGM_MAX_ARGS];
#endif
  int argv_parm[ILE_PGM_MAX_ARGS];
  int arg_by[ILE_PGM_MAX_ARGS];
  int arg_pos[ILE_PGM_MAX_ARGS];
  int arg_len[ILE_PGM_MAX_ARGS];
  char pgm[16];
  char lib[16];
  char func[128];
  int step;
  int max;
  int pos;
  int vpos;
  int argc;
  int parmc;
  int return_start;
  int return_end;
  char * buf;
} ile_pgm_call_t;

Specifically, we cannot/should not 'tag ILE pointers' on the client side of QSQSRVR jobs. Therefore we simply pass the number of argc elements (pointers to parms) to the db2proc ILE program on the stored procedure side. There we can tag the ILE pointers in the correct process (the QSQSRVR process).

#!c

  /* set ILE addresses based memory spill location offset */
  for (argc=0; argc < ILE_PGM_MAX_ARGS; argc++) {
    if (argc < layout->argc) {
      /*  by reference */
      if (layout->argv_parm[argc] > -1) {
        /* ILE address parm location (skip by value slots) */
        parmc = layout->argv_parm[argc];
        offset = layout->arg_pos[parmc];
        /* set ILE address to data */
        layout->argv[argc] = (char *)layout + offset;
      }
    } else {
      layout->argv[argc] = NULL;
    }
  }

Also, you may note we shift spill data by 4 bytes going to/from the stored procedure 'blob' call. This will un-tag pointers on the client side, so they cannot be misused (you little hacker you, I say to nobody at all).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


BTW -- I will look into your 500 element setter test. We should not have to increase ILE_PGM_MAX_ARGS, so something else is wrong. Again, I consider this sort of test 'exotic', therein probably not a 'real life' candidate for json, xml, cvs, etc. You would really want the direct memory 'big data' interface to the toolkit (libtk400.a) ... but ... well ... it is not written yet.

reminder (previous post) ... Now for the bad news, this will take a while to sort out, aka, maybe until the end of the year before we have json bells dinging and whistles tooting at high performance (and async, and web, and socket, and "eval", and, and ...) ... and direct memory 'big data' interface to libtk400.a (and more stuff beyond this...).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


You may be wasting your time with the json interface for these sorts of 'exotic' tests. I appreciate any tests to work out the edges of json, but any 'big data' operation will not likely go through this interface at all. Basically, if you have these sorts of 'big array' problems you will likely use a direct memory call into the toolkit (libtk400.a). That is, there will be a formal 'c code' interface to libtk400.a that can be called directly by any language, much like db2 drivers (no json, no xml, etc.). Maybe I can give you a general idea, as the complete architecture is not all written yet.

#!bash

===
1) high speed, big data, exotic calls
high speed direct interface - big data interface (memory calls)
===
php->pecl (unwritten)->libtk400.a (interfaces not written)
note: 
a) should work either within php job (unwritten) 
b) or as stored proc call (db2proc we also use for json, etc. below)

===
2) slower web interfaces, maybe ok for 80% of simple json calls as well
parser interface -- web abstraction interfaces json, xml, etc. (socket, fastcgi, etc.)
===
php->pecl(unwritten)->libjson400.a->libtk400.a 
php->pecl(unwritten)->libxml400.a(unwritten)->libtk400.a 
php->pecl(unwritten)->libcvs400.a(unwritten)->libtk400.a 
kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Wait a moment ... so ... you are going to list all 500 elements by hand inside a single ds structure. Mmmm ... well this might work, except parameter Arr2Count->500 has no meaning. You have only one gigantic 500 element 'ds' with repeated 's' names (500 times). Very odd (exotic), but maybe that should work. But this is still wrong (ILE_PGM_MAX_ARGS 4096) ... must be some other 'grow' operation failing.

#!json

    {"ds": [{"name":"Arr2"},
        {"s":[{"name":"Arr2P1", "type":"1a", "value":"a"},{"name":"Arr2P2", "type":"30a", "value":"BAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},{"name":"Arr2P3", "type":"4a", "value":"ap3"},{"name":"Arr2P4", "type":"18a", "value":""},{"name":"Arr2P5", "type":"100a", "value":""}]},
        {"s":[{"name":"Arr2P1", "type":"1a", "value":"a"},{"name":"Arr2P2", "type":"30a", "value":"ABAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},{"name":"Arr2P3", "type":"4a", "value":"ap3"},{"name":"Arr2P4", "type":"18a", "value":""},{"name":"Arr2P5", "type":"100a", "value":""}]},
:
500 times (different data in each)
:

BTW -- you miscounted in the test (it only had 499 elements).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Mmm, again, your test is completely wrong (we covered this). You can't set individual elements by listing them inside a 'ds'. This is only working by accident with your expansion of ILE_PGM_MAX_ARGS to 4096 (because 'ds' structs are 'packed'). Your test is wrong (same logic error as last time).

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


I made some changes to the tests, and I can only run the j018x- tests with this change:

-#define ILE_PGM_MAX_ARGS 128
+#define ILE_PGM_MAX_ARGS 4096

Without it I get a Segmentation fault (core dumped) error. Also, I must compile toolkit-base, toolkit-parser-json and db2proc to make things work, so I don't know what is going on with it. I'm currently running my fork, which has all the newest changes:

$ test9999_driver_version32
run (trace=)
version (1.0.5-sg7)
success (0)

I also added a big json test and it seems to work with your bigkey k->count fix. Grab the tests from my fork if you want to include them.

I have no idea what '€' means

€ is the symbol for the euro. It should be at least in the iso8859-15 and utf8 charsets. No idea what should be changed to get it to work. I hope we don't use it often.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


To save you undue effort, I should say that I am not taking contributions to this project yet. This driver is too important to get stuck in legal tar. So until I sort out an IBM-approved vetting process I will chat with anyone, and work on issues. You may also fork if you like, aka, it is okay to fool around with a copy on your own. Just trying to set expectations.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Update ... I guess valid json, one weird character returned ... € ... probably some sort of oddball ascii<>ebcdic conversion. I have no idea what '€' means, so I am just taking this char out so the tests will pass a clean json validate check.

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


BTW -- it should be noted that your 'yikes dude' complex tests are not producing valid json because of all the special characters embedded in the character data.

I am using this validator on-line ... jsonlint

#!bash

Error: Parse error on line 19:
... {                           "Arr1P2": "!#¤%&/()=?+*^_-:;@£
----------------------^
Expecting 'STRING', 'NUMBER', 'NULL', 'TRUE', 'FALSE', '{', '[', got 'undefined'
kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


it works 1.0.5-sg7

Ok, the only fix needed was k->count in json_grow_key. We did not have to modify ILE_PGM_MAX_ARGS 128, which is a very good thing (... we really, really do not want this to happen ... you will see when we do pass-by-value parms).

#!bash

bash-4.3$ ./test1000_sql400json32 j0181_pgm_hamela03-ds-rpg-occurs-set-element
input(1000000):
{"pgm":[
    {"name":"HAMELA03",  "lib":"DB2JSON"},
    {"s": [
        {"name":"Parm1", "type":"1a", "value":""},
        {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
        {"name":"Parm3", "type":"2a", "value":""},
        {"name":"Parm4", "type":"10a", "value":""},
        {"name":"Arr1Count", "type":"3s0", "value":0}
    ]},
    {"ds": [{"name":"Arr1", "dim":10},
        {"s":[
            {"name":"Arr1P1", "type":"7a", "value":""},
            {"name":"Arr1P2", "type":"132a", "value":""},
            {"name":"Arr1P3", "type":"30a", "value":""},
            {"name":"Arr1P4", "type":"1s0", "value":0}
        ]}
    ]},
    {"s": [
        {"name":"Parm5", "type":"1a", "value":""},
        {"name":"Parm6", "type":"4a", "value":""},
        {"name":"Parm7", "type":"18a", "value":""},
        {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
        {"name":"Parm9", "type":"3s0", "value":0},
        {"name":"Parm10", "type":"3s0", "value":0}
    ]},
    {"s": {"name":"Arr2Count", "type":"5s0", "value":3}},
    {"ds": [{"name":"Arr2"},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"a"},
            {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
            {"name":"Arr2P3", "type":"4a", "value":"ap3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"b"},
            {"name":"Arr2P2", "type":"30a", "value":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
            {"name":"Arr2P3", "type":"4a", "value":"bp3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"c"},
            {"name":"Arr2P2", "type":"30a", "value":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
            {"name":"Arr2P3", "type":"4a", "value":"cp3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]},
        {"ds": [{"name":"Arr2Empty", "dim": 7},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":""},
                {"name":"Arr2P2", "type":"30a", "value":""},
                {"name":"Arr2P3", "type":"4a", "value":""},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]}
        ]}
    ]},
    {"s": {"name":"Parm11", "type":"30a", "value":""}}
]}

output(1866):
{"script":[{"pgm":["HAMELA03","DB2JSON",{"Parm1":
{}},{"Parm2":"äöÄÖåÅáÁà"},{"Parm3":{}},{"Parm4":{}},{"Arr1Count":1},
{"Arr1":[[{"Arr1P1":{}},{"Arr1P2":"!#¤%&/()=?+*^_-:;@£${[]}\\<>"},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}],[{"Arr1P1":{}},{"Arr1P2":{}},
{"Arr1P3":{}},{"Arr1P4":0.0}]]},{"Parm5":{}},{"Parm6":{}},
{"Parm7":{}},{"Parm8":"!#¤%&/()=?+*^_-:;@£${[]}\\<>"},
{"Parm9":0.0},{"Parm10":0.0},{"Arr2Count":3},
{"Arr2":[{"Arr2P1":"a"},
{"Arr2P2":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},{"Arr2P3":"ap3"},
{"Arr2P4":{}},{"Arr2P5":"aAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAap3"},
{"Arr2P1":"b"},{"Arr2P2":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
{"Arr2P3":"bp3"},
{"Arr2P4":{}},{"Arr2P5":"bBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBbp3"},
{"Arr2P1":"c"},{"Arr2P2":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
{"Arr2P3":"cp3"},
{"Arr2P4":{}},{"Arr2P5":"cCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCcp3"},
{"Arr2Empty":[[{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},
{"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},{"Arr2P2":{}},
{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},
{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
[{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},
{"Arr2P5":{}}],[{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},
{"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},{"Arr2P2":{}},
{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],[{"Arr2P1":{}},
{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}]]}]},
{"Parm11":"äöÄÖåÅáÁà"}]}]}

result:
success (0)

please help with your tests ...

I decided to keep your old school RPG program to show 'doubting Thomas' folks that there is nothing up my magic sleeves with RPG free (they really are the same 'ds' and 's' and 'occurs' and ... Thanks Barbara Morris, IBM Toronto).

Help I need somebody (The Beatles) ...

Anyway, your tests are getting so complex I am having difficulty knowing what to check in the expected output .exp (j0181_pgm_hamela03-ds-rpg-occurs-set-element.exp). If you would not mind, please post .exp data with your test. Uf Da! (I save Uf Da for special occasions of my perplexed heart as an overworked programmer.)

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Reminder "do not use for production" ...

So, as you can see from the last post with the let's-say-that "eval", there is creative room for architecture change to fit both performance and 'use case' in this toolkit. To infinity and beyond, to coin a phrase.

Now for the bad news, this will take a while to sort out, aka, maybe until the end of the year before we have json bells dinging and whistles tooting at high performance (and async, and web, and socket, and "eval", and, and ...).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


We unfortunately have many programs that take input arrays, so getting this to work is beneficial. But I think this will work, at least for experimenting.

Throw a dog a bone ... you can see setting input arrays using only 'ds' and 's' is difficult. Essentially we are trying to invent an RPG calculation section by rubbing two 'ds' and 's' sticks together (I made fire).

Let's say that ...

My daughter, back in her days as a young girl, used to play "let's say that ..." with the kid next door. Marvels of games, from 'hot lava' on the ground to 'jet planes' on the swing set. Point is, we don't have to color inside the lines. We could introduce a simple json language for modifying data specifications.

#!json

{"pgm":[
    {"name":"HAMELA02",  "lib":"DB2JSON"},
    {"s": {"name":"inCount", "type":"10i0", "value":5}},
    {"ds": [{"name":"inputDS","dim":20},
        {"s":[
            {"name":"in1", "type":"5av2", "value":"i1"},
            {"name":"in2", "type":"5av2", "value":"i2"}
        ]}
    ]},
    {"s": {"name":"outCount", "type":"10i0", "value":5}},
    {"ds": [{"name":"outDS","dim":20},
        {"s":[
            {"name":"out1", "type":"5av2", "value":"o1"},
            {"name":"out2", "type":"5av2", "value":"o2"},
            {"name":"out3", "type":"10av2", "value":"o3"}
        ]}
    ]},
    {"s": {"name":"last", "type":"10a", "value":"ll"}}
]}

Above is a standard RPG-like set of 'ds' and 's' data specifications. Let's say that we 'cache' this specification in the driver under the name 'HAMELA02'. Now we can create a little language, "eval", to update any of the input elements (below). Of course this makes for screaming fast performance, because we never re-parse 'HAMELA02'.

#!json

{"c":[{"eval":"HAMELA02.inputDS(1).in1","equal":"yahoo"}]}

Let's say that ...

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Hey, can you please post only RPG free samples for tests. I don't want to add old style RPG programs to tests_ILE_RPG. I mean you are welcome to try out anything (of course), but I wish to add your tests to my collection for regression testing. That is, regression testing means your tests will be run every release (free tests for your specific needs).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


FYI -- good find on the json parser! Here is the correct location for k->count (needs testing).

#!c
void json_grow_key(json_key_t * k, int i) {
  int g = 0;
  char * old_key = (char *) k->key;
  char * old_val = (char *) k->val;
  char * old_lvl = (char *) k->lvl;
  char * new_key = NULL;
  char * new_val = NULL;
  char * new_lvl = NULL;
  /* already big enough (add grow amount i to count) */
  if (k->max > k->count + i + 1) {
    k->count += i; /* hamela found bug */
    return;
  }
  /* grow by blocks */
  for (g = k->max; k->max < g + i + 1; k->max += JSON400_KEY_BLOCK);
  /* realloc */
  new_key = json_new(k->max * sizeof(int));
  new_val = json_new(k->max * sizeof(char *));
  new_lvl = json_new(k->max * sizeof(int));
  memcpy(new_key,old_key,(k->count * sizeof(int)));
  memcpy(new_val,old_val,(k->count * sizeof(char *)));
  memcpy(new_lvl,old_lvl,(k->count * sizeof(int)));
  k->key = (int *) new_key;
  k->val = (char **) new_val;
  k->lvl = (int *) new_lvl;
  json_free(old_key);
  json_free(old_val);
  json_free(old_lvl);
}
kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


please include the version in posts (from now on) ...

I also do not know what version of db2sock you are running.

#!bash

bash-4.3$ ./tests_c/test9999_driver_version32
run (trace=on)
version (1.0.5-sg6)
success (0)

incorrect 'fix' ...

Mmm ... no. A 'ds' with "dim": 500 does not introduce more pointers to parms. So this fix is incorrect (not accepted).

#!c
-#define ILE_PGM_MAX_ARGS 128
+#define ILE_PGM_MAX_ARGS 4096

explanation ...

You only have 15 parms in this json (below). This will fit easily in ILE_PGM_MAX_ARGS 128.

#!json
{"pgm":[
    {"name":"TPGM2",  "lib":"DB2JSON"},
    {"s": [
(01)        {"name":"Parm1", "type":"1a", "value":""},
(02)        {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
(03)        {"name":"Parm3", "type":"2a", "value":""},
(04)        {"name":"Parm4", "type":"10a", "value":""},
(05)        {"name":"Arr1Count", "type":"3s0", "value":0}
    ]},
(06)    {"ds": [{"name":"Arr1", "dim":10},
        {"s":[
            {"name":"Arr1P1", "type":"7a", "value":""},
            {"name":"Arr1P2", "type":"132a", "value":""},
            {"name":"Arr1P3", "type":"30a", "value":""},
            {"name":"Arr1P4", "type":"1s0", "value":0}
        ]}
    ]},
    {"s": [
(07)        {"name":"Parm5", "type":"1a", "value":""},
(08)        {"name":"Parm6", "type":"4a", "value":""},
(09)        {"name":"Parm7", "type":"18a", "value":""},
(10)        {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
(11)        {"name":"Parm9", "type":"3s0", "value":0},
(12)        {"name":"Parm10", "type":"3s0", "value":0}
    ]},
(13)    {"s": {"name":"Arr2Count", "type":"5s0", "value":500}},
(14)    {"ds": [{"name":"Arr2", "dim": 500},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"a"},
            {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
            {"name":"Arr2P3", "type":"4a", "value":"ap3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]}
    ]},
(15)    {"s": {"name":"Parm11", "type":"30a", "value":""}}
]}

argv[15]
pointer->Parm1
pointer->Parm2
pointer->Parm3
pointer->Parm4
pointer->Arr1Count
pointer->Arr1
pointer->Parm5
pointer->Parm6
pointer->Parm7
pointer->Parm8
pointer->Parm9
pointer->Parm10
pointer->Arr2Count
pointer->Arr2
pointer->Parm11

     C     *Entry        Plist                            
     C                   Parm                    Parm1    
     C                   Parm                    Parm2    
     C                   Parm                    Parm3    
     C                   Parm                    Parm4    
     C                   Parm                    Arr1Count
     C                   Parm                    Arr1     
     C                   Parm                    Parm5    
     C                   Parm                    Parm6    
     C                   Parm                    Parm7    
     C                   Parm                    Parm8    
     C                   Parm                    Parm9    
     C                   Parm                    Parm10               
     C                   Parm                    Arr2Count            
     C                   Parm                    Arr2                 
     C                   Parm                    Parm11               

next ...

I don't know what went wrong, but your fork fixes are not the answer (rejected).

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


So when we change the previous program's Arr2 -> Occurs(500), things start to throw segmentation faults. But I got things working with these changes: change1 change2.

So now even this input works, which is a good sign.

{"pgm":[
    {"name":"TPGM2",  "lib":"DB2JSON"},
    {"s": [
        {"name":"Parm1", "type":"1a", "value":""},
        {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
        {"name":"Parm3", "type":"2a", "value":""},
        {"name":"Parm4", "type":"10a", "value":""},
        {"name":"Arr1Count", "type":"3s0", "value":0}
    ]},
    {"ds": [{"name":"Arr1", "dim":10},
        {"s":[
            {"name":"Arr1P1", "type":"7a", "value":""},
            {"name":"Arr1P2", "type":"132a", "value":""},
            {"name":"Arr1P3", "type":"30a", "value":""},
            {"name":"Arr1P4", "type":"1s0", "value":0}
        ]}
    ]},
    {"s": [
        {"name":"Parm5", "type":"1a", "value":""},
        {"name":"Parm6", "type":"4a", "value":""},
        {"name":"Parm7", "type":"18a", "value":""},
        {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
        {"name":"Parm9", "type":"3s0", "value":0},
        {"name":"Parm10", "type":"3s0", "value":0}
    ]},
    {"s": {"name":"Arr2Count", "type":"5s0", "value":500}},
    {"ds": [{"name":"Arr2", "dim": 500},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"a"},
            {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
            {"name":"Arr2P3", "type":"4a", "value":"ap3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]}
    ]},
    {"s": {"name":"Parm11", "type":"30a", "value":""}}
]}

I just need to figure out why it hangs when I call the real program.

kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


Bear with me, I have more horrible examples.

So there is this program. These are actual parameters from an existing program (names have been changed and Arr2 is actually 500 long). I also wanted to verify that there is no funny business going on when using non-free RPG, as all of our programs are like that. In this example Arr2 is used as an input/output array.
The good thing is this seems to work properly, except the €-char is missing, but I don't know if RPG is supposed to handle that character. Things only start to break horribly if we change Arr2 to Occurs(500).

     D Parm1           S              1A             
     D Parm2           S             18A             
     D Parm3           S              2A             
     D Parm4           S             10A             
     D Arr1Count       S              3S 0           
     D Arr1            DS                  Occurs(10)
     D  Arr1P1                        7A             
     D  Arr1P2                      132A             
     D  Arr1P3                       30A             
     D  Arr1P4                        1S 0           
     D Parm5           S              1A             
     D Parm6           S              4A             
     D Parm7           S             18A             
     D Parm8           S            100A             
     D Parm9           S              3S 0           
     D Parm10          S              3S 0           
     D Arr2Count       S              5S 0           
     D Arr2            DS                  Occurs(10)
     D  Arr2P1                        1A             
     D  Arr2P2                       30A                  
     D  Arr2P3                        4A                  
     D  Arr2P4                       18A                  
     D  Arr2P5                      100A                  
     D Parm11          S             30A                  

     D i               S              5P 0                

     C     *Entry        Plist                            
     C                   Parm                    Parm1    
     C                   Parm                    Parm2    
     C                   Parm                    Parm3    
     C                   Parm                    Parm4    
     C                   Parm                    Arr1Count
     C                   Parm                    Arr1     
     C                   Parm                    Parm5    
     C                   Parm                    Parm6    
     C                   Parm                    Parm7    
     C                   Parm                    Parm8    
     C                   Parm                    Parm9    
     C                   Parm                    Parm10               
     C                   Parm                    Arr2Count            
     C                   Parm                    Arr2                 
     C                   Parm                    Parm11               

     C                   For       i             = 1 To Arr2Count By 1
     C                   Eval      %occur(Arr2)  = i                  
     C                   Eval      Arr2P5        = %trim(Arr2P1) +    
     C                                             %trim(Arr2P2) +    
     C                                             %trim(Arr2P3)      
     C                   Endfor                                       
     C                                                                
     C                   Eval      Parm11        = %trim(Parm2)       
     C                                                                
     C                   Eval      %occur(Arr1)  = 1                  
     C                   Eval      Arr1P2        = %trim(Parm8)
     C                   Eval      Arr1Count     = 1           
     C                                                         
     C                   Eval      *Inlr         = '1'         
     C                   Return                                
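For readers less fluent in fixed-format RPG, the C-specs above loop over the Arr2 occurrences building Arr2P5 from the trimmed fields, copy the trimmed Parm2 into Parm11, and set only the first occurrence of Arr1. A rough Python equivalent, where dicts stand in for the multiple-occurrence data structures and `tpgm2` is a hypothetical name, not part of db2sock:

```python
def tpgm2(parm2, parm8, arr2):
    """Rough stand-in for the RPG C-specs: arr2 is a list of dicts,
    one per occurrence of the Arr2 multiple-occurrence DS."""
    for rec in arr2:                       # For i = 1 To Arr2Count
        rec["Arr2P5"] = (rec["Arr2P1"].strip()
                         + rec["Arr2P2"].strip()
                         + rec["Arr2P3"].strip())
    parm11 = parm2.strip()                 # Parm11 = %trim(Parm2)
    arr1 = [{"Arr1P2": parm8.strip()}]     # occurrence 1 only, Arr1Count = 1
    return parm11, arr1, arr2

p11, a1, a2 = tpgm2("hello ", "!#abc",
                    [{"Arr2P1": "a", "Arr2P2": "AA ", "Arr2P3": "ap3"}])
print(a2[0]["Arr2P5"])  # -> "aAAap3"
```

Note that RPG's %trim strips both leading and trailing blanks, which `str.strip()` mirrors closely enough for this sketch.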
input(2591):
{"pgm":[
    {"name":"TPGM2",  "lib":"DB2JSON"},
    {"s": [
        {"name":"Parm1", "type":"1a", "value":""},
        {"name":"Parm2", "type":"18a", "value":"äöÄÖåÅáÁàÀ"},
        {"name":"Parm3", "type":"2a", "value":""},
        {"name":"Parm4", "type":"10a", "value":""},
        {"name":"Arr1Count", "type":"3s0", "value":0}
    ]},
    {"ds": [{"name":"Arr1", "dim":10},
        {"s":[
            {"name":"Arr1P1", "type":"7a", "value":""},
            {"name":"Arr1P2", "type":"132a", "value":""},
            {"name":"Arr1P3", "type":"30a", "value":""},
            {"name":"Arr1P4", "type":"1s0", "value":0}
        ]}
    ]},
    {"s": [
        {"name":"Parm5", "type":"1a", "value":""},
        {"name":"Parm6", "type":"4a", "value":""},
        {"name":"Parm7", "type":"18a", "value":""},
        {"name":"Parm8", "type":"100a", "value":"!#¤%&/()=?+*^_-:;@£${[]}\\€<>"},
        {"name":"Parm9", "type":"3s0", "value":0},
        {"name":"Parm10", "type":"3s0", "value":0}
    ]},
    {"s": {"name":"Arr2Count", "type":"5s0", "value":3}},
    {"ds": [{"name":"Arr2"},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"a"},
            {"name":"Arr2P2", "type":"30a", "value":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
            {"name":"Arr2P3", "type":"4a", "value":"ap3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"b"},
            {"name":"Arr2P2", "type":"30a", "value":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
            {"name":"Arr2P3", "type":"4a", "value":"bp3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]},
        {"s":[
            {"name":"Arr2P1", "type":"1a", "value":"c"},
            {"name":"Arr2P2", "type":"30a", "value":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
            {"name":"Arr2P3", "type":"4a", "value":"cp3"},
            {"name":"Arr2P4", "type":"18a", "value":""},
            {"name":"Arr2P5", "type":"100a", "value":""}
        ]},
        {"ds": [{"name":"Arr2Empty", "dim": 7},
            {"s":[
                {"name":"Arr2P1", "type":"1a", "value":""},
                {"name":"Arr2P2", "type":"30a", "value":""},
                {"name":"Arr2P3", "type":"4a", "value":""},
                {"name":"Arr2P4", "type":"18a", "value":""},
                {"name":"Arr2P5", "type":"100a", "value":""}
            ]}
        ]}
    ]},
    {"s": {"name":"Parm11", "type":"30a", "value":""}}
]}

output(1863): {"script":[
    {"pgm":["TPGM2","DB2JSON",
        {"Parm1":{}},
        {"Parm2":"äöÄÖåÅáÁà"},
        {"Parm3":{}},
        {"Parm4":{}},
        {"Arr1Count":1},
        {"Arr1":[
            [{"Arr1P1":{}},
                {"Arr1P2":"!#¤%&/()=?+*^_-:;@£${[]}\\<>"},
                {"Arr1P3":{}},{"Arr1P4":0.0}
            ],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}],
            [{"Arr1P1":{}},{"Arr1P2":{}},{"Arr1P3":{}},{"Arr1P4":0.0}]
        ]},
        {"Parm5":{}},
        {"Parm6":{}},
        {"Parm7":{}},
        {"Parm8":"!#¤%&/()=?+*^_-:;@£${[]}\\<>"},
        {"Parm9":0.0},
        {"Parm10":0.0},
        {"Arr2Count":3},
        {"Arr2":[
            {"Arr2P1":"a"},
            {"Arr2P2":"AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"},
            {"Arr2P3":"ap3"},
            {"Arr2P4":{}},
            {"Arr2P5":"aAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAap3"},
            {"Arr2P1":"b"},
            {"Arr2P2":"BBBBBBBBBBBBBBBBBBBBBBBBBBBBBB"},
            {"Arr2P3":"bp3"},
            {"Arr2P4":{}},
            {"Arr2P5":"bBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBbp3"},
            {"Arr2P1":"c"},
            {"Arr2P2":"CCCCCCCCCCCCCCCCCCCCCCCCCCCCCC"},
            {"Arr2P3":"cp3"},
            {"Arr2P4":{}},
            {"Arr2P5":"cCCCCCCCCCCCCCCCCCCCCCCCCCCCCCCcp3"},
            {"Arr2Empty":[
                [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}],
                [{"Arr2P1":{}},{"Arr2P2":{}},{"Arr2P3":{}},{"Arr2P4":{}},{"Arr2P5":{}}]
            ]}
        ]},
        {"Parm11":"äöÄÖåÅáÁà"}
    ]}
]}
kadler commented 6 years ago

Original comment by Teemu Halmela (Bitbucket: teemu_, GitHub: Unknown).


Thank you again for the in-depth explanations. I almost understood it all :smiley:.

I was a little confused about why RPG worked when I gave it dim(5): as you said, RPG expects a 20-element array but only got 5, so the rest of the parameters should be out of sync. But the output was correct, so I went with it to keep the output size down (I was lazy).

Unfortunately we have many programs that take input arrays, so getting this to work would be very beneficial. But I think this will work, at least for experimenting.

{"pgm":[
    {"name":"TPGM",  "lib":"DB2JSON"},
    {"s": {"name":"inCount", "type":"10i0", "value":5}},
    {"ds": [{"name":"input"},
        {"s":[ {"name":"in1", "type":"5av2", "value":"a1"}, {"name":"in2", "type":"5av2", "value":"a2"}]},
        {"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, {"name":"in2", "type":"5av2", "value":"b2"}]},
        {"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, {"name":"in2", "type":"5av2", "value":"c2"}]},
        {"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, {"name":"in2", "type":"5av2", "value":"d2"}]},
        {"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, {"name":"in2", "type":"5av2", "value":"e2"}]},
        {"ds": [{"name": "inputEmpty", "dim":15},
            {"s":[ {"name":"in1", "type":"5av2", "value":""}, {"name":"in2", "type":"5av2", "value":""}]}
        ]}
    ]},
    {"s": {"name":"outCount", "type":"10i0", "value":0}},
    {"ds": [{"name":"output", "dim":20},
        {"s":[ {"name":"out1", "type":"5av2", "value":""}, {"name":"out2", "type":"5av2", "value":""}, {"name":"out3", "type":"10av2", "value":""}]},
    ]},
    {"s": {"name":"last", "type":"10a", "value":""}}
]}
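The padding trick above (a trailing empty 'ds' that fills the array out to the declared dim) is mechanical enough to generate. A minimal Python sketch, where `pad_ds_array` and all of its parameter names are hypothetical helpers, not part of db2sock:

```python
import json

def pad_ds_array(ds_label, records, fields, dim, pad_label):
    """Build a 'ds' node: the literal records first, then a nested
    empty 'ds' padding the array out to the full declared dim.

    fields  -- list of (name, type) pairs for one record
    records -- list of value tuples, one per populated element
    """
    entries = [{"name": ds_label}]
    for rec in records:
        entries.append({"s": [{"name": n, "type": t, "value": v}
                              for (n, t), v in zip(fields, rec)]})
    remaining = dim - len(records)
    if remaining > 0:
        entries.append({"ds": [{"name": pad_label, "dim": remaining},
                               {"s": [{"name": n, "type": t, "value": ""}
                                      for n, t in fields]}]})
    return {"ds": entries}

fields = [("in1", "5av2"), ("in2", "5av2")]
node = pad_ds_array("input", [("a1", "a2"), ("b1", "b2")], fields,
                    20, "inputEmpty")
print(json.dumps(node, separators=(",", ":")))
```

With 2 populated records and dim 20, the generated padding block gets "dim":18, matching the hand-written "inputEmpty" approach above.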
kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


OK, I fixed the basic function of 'ds' within 'ds'. You can find a new 1.0.5-sg6 binary on yips.

this is correct ...

Here is a test that sets input array attributes the 'exotic' way with JSON. Specifically, elements 1-2 are set to the defaults "i1", "i2"; elements 3-5 are set to "bob"/"was", "here"/"mary", "hello"/"alan"; and elements 6-20 are set to the defaults "i1", "i2".

#!bash

bash-4.3$ ./test1000_sql400json32 j0171_pgm_hamela02-ds-set_input_array
input(512000):
{"pgm":[
    {"name":"HAMELA02",  "lib":"DB2JSON"},
    {"s": {"name":"inCount", "type":"10i0", "value":5}},
    {"ds": [{"name":"inputDS"},
      {"ds": [{"name":"inputDS1","dim":2},
        {"s":[
            {"name":"in1", "type":"5av2", "value":"i1"},
            {"name":"in2", "type":"5av2", "value":"i2"}
        ]}
      ]},
      {"s":[
            {"name":"in1_3", "type":"5av2", "value":"bob"},
            {"name":"in1_3", "type":"5av2", "value":"was"}
      ]},
      {"s":[
            {"name":"in1_4", "type":"5av2", "value":"here"},
            {"name":"in1_4", "type":"5av2", "value":"mary"}
      ]},
      {"s":[
            {"name":"in1_5", "type":"5av2", "value":"hello"},
            {"name":"in1_5", "type":"5av2", "value":"alan"}
      ]},
      {"ds": [{"name":"inputDS2","dim":15},
        {"s":[
            {"name":"in1", "type":"5av2", "value":"i1"},
            {"name":"in2", "type":"5av2", "value":"i2"}
        ]}
      ]}
    ]},
    {"s": {"name":"outCount", "type":"10i0", "value":5}},
    {"ds": [{"name":"outDS","dim":20},
        {"s":[
            {"name":"out1", "type":"5av2", "value":"o1"},
            {"name":"out2", "type":"5av2", "value":"o2"},
            {"name":"out3", "type":"10av2", "value":"o3"}
        ]}
    ]},
    {"s": {"name":"last", "type":"10a", "value":"ll"}}
]}

output(1633):
{"script":[{"pgm":["HAMELA02","DB2JSON",
{"inCount":5},
{"inputDS":[
{"inputDS1":[[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}]]},
{"in1_3":"bob"},{"in1_3":"was"},
{"in1_4":"here"},{"in1_4":"mary"},
{"in1_5":"hello"},{"in1_5":"alan"},
{"inputDS2":[
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}]
]}]},
{"outCount":5},
{"outDS":[
[{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
[{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
[{"out1":"bob"},{"out2":"was"},{"out3":"bobwas"}],
[{"out1":"here"},{"out2":"mary"},{"out3":"heremary"}],
[{"out1":"hello"},{"out2":"alan"},{"out3":"helloalan"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}]
]},
{"last":"TEST"}]}]}

result:
success (0)

this is not correct ...

In case you still have doubts, the wrong thinking appears below (my opinion). That is to say, a name appearing more than once in a 'ds' structure does not mean 'moving' to the next record. Here "name":"in1" appears 5 times in ONE record of the 'ds' "name":"input", as does "name":"in2". In other words, this is just one big 'ds' of 10 's' elements with dim(5) that happens to reuse the same names "in1" and "in2" for its elements. The RPG compiler would complain about that, but JSON couldn't care less about "name" until parsing on the client (or some other exotic action like enddo). Also, as mentioned, an RPG program expecting input of dim(20) will not work with "dim":5 truncation. RPG expects 20, and RPG gets what RPG wants, or you get really weird results (your secondary problem description is exactly the weird result that would happen).

#!json

{"ds": [{"name":"input", "dim": 5},

{"s":[{"name":"in1", "type":"5av2", "value":"a1"},
{"name":"in2", "type":"5av2", "value":"a2"}]},

{"s":[ {"name":"in1", "type":"5av2", "value":"b1"}, 
{"name":"in2", "type":"5av2", "value":"b2"}]},

{"s":[ {"name":"in1", "type":"5av2", "value":"c1"}, 
{"name":"in2", "type":"5av2", "value":"c2"}]},

{"s":[ {"name":"in1", "type":"5av2", "value":"d1"}, 
{"name":"in2", "type":"5av2", "value":"d2"}]},

{"s":[ {"name":"in1", "type":"5av2", "value":"e1"}, 
{"name":"in2", "type":"5av2", "value":"e2"}]}

]},

(*) I understand you can argue that 'move to next element' should occur when the same name is duplicated. I will not accept this design idea, because it is much too error prone in my opinion. However, as I mentioned, you can build your own JSON parser and call toolkit-base any way you like, with any JSON syntax you choose (to infinity and beyond ... but not in my parser).
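The point about duplicate names can be checked with any off-the-shelf parser: repeated "name" values inside a JSON *array* are harmless because array order carries the meaning, while duplicate keys inside a single *object* are silently collapsed by most parsers (last one wins in Python). A quick demonstration:

```python
import json

# Duplicate "name" values as separate objects in an array: nothing is
# lost, the array order carries the meaning.
doc = '{"s":[{"name":"in1","value":"a1"},{"name":"in1","value":"b1"}]}'
parsed = json.loads(doc)
print([e["value"] for e in parsed["s"]])   # -> ['a1', 'b1']

# Duplicate keys within ONE object: Python (like most parsers)
# silently keeps only the last pair.
dup = json.loads('{"in1":"a1","in1":"b1"}')
print(dup)                                 # -> {'in1': 'b1'}

# A client that really needs every pair can use object_pairs_hook.
pairs = json.loads('{"in1":"a1","in1":"b1"}',
                   object_pairs_hook=lambda p: p)
print(pairs)                               # -> [('in1', 'a1'), ('in1', 'b1')]
```

This is why a parser cannot reliably treat a repeated "name" as "move to the next element": the information may already be gone by the time the structure is walked.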

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Warning ... I have a problem with nested 'ds' structures. I will fix it in the next release. Until then, you will not be able to try any fancy 'ds'-within-'ds' workaround for your set-array-values experiment (aka, no exotic ds work ... yet).

kadler commented 6 years ago

Original comment by Tony Cairns (Bitbucket: rangercairns, GitHub: rangercairns).


Here is your same test with corrected input JSON (common format).

#!bash
bash-4.3$ ./test1000_sql400json32 j0170_pgm_hamela02-ds
input(4096):
{"pgm":[
    {"name":"HAMELA02",  "lib":"DB2JSON"},
    {"s": {"name":"inCount", "type":"10i0", "value":5}},
    {"ds": [{"name":"inputDS","dim":20},
        {"s":[
            {"name":"in1", "type":"5av2", "value":"i1"},
            {"name":"in2", "type":"5av2", "value":"i2"}
        ]}
    ]},
    {"s": {"name":"outCount", "type":"10i0", "value":5}},
    {"ds": [{"name":"outDS","dim":20},
        {"s":[
            {"name":"out1", "type":"5av2", "value":"o1"},
            {"name":"out2", "type":"5av2", "value":"o2"},
            {"name":"out3", "type":"10av2", "value":"o3"}
        ]}
    ]},
    {"s": {"name":"last", "type":"10a", "value":"ll"}}
]}

output(1564):
{"script":[{"pgm":["HAMELA02","DB2JSON",
{"inCount":5},
{"inputDS":[
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}],
[{"in1":"i1"},{"in2":"i2"}],[{"in1":"i1"},{"in2":"i2"}]
]},
{"outCount":5},
{"outDS":[
[{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
[{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
[{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
[{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
[{"out1":"i1"},{"out2":"i2"},{"out3":"i1i2"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}],
[{"out1":"o1"},{"out2":"o2"},{"out3":"o3"}]
]},
{"last":"TEST"}]}]}

result:
success (0)

The noticeable RPG changes, like "out3":"i1i2", appear only in the first 5 elements, per "inCount":5. It worked.

BTW -- as you can see, we need the "enddo":"outCount" convention to omit unchanged records from the output (if desired).
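Until an "enddo":"outCount"-style convention exists server-side, a client can trim the untouched trailing records itself after parsing. A minimal Python sketch (the `trim_ds` helper and the sample result shape are hypothetical, modeled on the output above):

```python
import json

def trim_ds(pgm_result, ds_name, count_name):
    """Keep only the first `count_name` records of the `ds_name` array
    in a parsed pgm result list (client-side stand-in for an
    "enddo":"outCount" convention)."""
    count = next(e[count_name] for e in pgm_result
                 if isinstance(e, dict) and count_name in e)
    for e in pgm_result:
        if isinstance(e, dict) and ds_name in e:
            e[ds_name] = e[ds_name][:count]
    return pgm_result

# Shape mirrors the "pgm" array above: name, library, then parameters.
result = ["HAMELA02", "DB2JSON",
          {"outCount": 2},
          {"outDS": [[{"out1": "i1"}], [{"out1": "i1"}],
                     [{"out1": "o1"}], [{"out1": "o1"}]]}]
trim_ds(result, "outDS", "outCount")
print(json.dumps(result[3]))  # only the first 2 records remain
```

The same helper works for input-side arrays if you only want to transmit the populated records and let padding fill in the rest.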