JSON TO Stream - Forum - OpenEdge Development - Progress Community

JSON TO Stream


I have a temp-table (TT) and I would like to use WRITE-JSON to get output in the following format.

Each record must be on a separate line by itself.

There shouldn’t be any record-separating commas.

No array brackets.

Spaces between structural punctuation (outside the quoted strings) are not significant; they are acceptable but not required.

A sample is below.

 

{ "Outlet Name" : "Store A", "Outlet Number" : "1", "Outlet Address 1" : "6565 Penn Ave", "Outlet Address 2" : "", "Outlet City" : "Pittsburgh", "Outlet State" : "PA", "Outlet Zip Code" : "15206", "Transaction Date/Time" : "2016-05-09-16:01:08", "Market Basket Transaction ID" : "353063910", "Scan Transaction ID" : "1", "Register ID" : "1", "Quantity" : "1", "Price" : "7.06", "UPC Code" : "2820000477", "UPC Description" : "MARL FSC SLVR BX KG", "Unit of Measure" : "Pack", "Promotion Flag" : "N", "Outlet Multi-pack Flag" : "N", "Outlet Multi-pack Quantity" : "", "Outlet Multi-pack Discount Amount" : "", "Account Promotion Name" : "", "Account Discount Amount" : "", "Manufacturer Discount Amount" : "", "Coupon PID" : "", "Coupon Amount" : "", "Manufacturer Multi-pack Flag" : "N", "Manufacturer Multi-pack Quantity" : "", "Manufacturer Multi-pack Discount Amount" : "", "Manufacturer Promotion Description" : "", "Manufacturer Buy-down Description" : "", "Manufacturer Buy-down Amount" : "", "Manufacturer Multi-pack Description" : "", "Account Loyalty ID Number" : "", "Coupon Description" : "" }
{ "Outlet Name" : "Store A", "Outlet Number" : "1", "Outlet Address 1" : "6565 Penn Ave", "Outlet Address 2" : "", "Outlet City" : "Pittsburgh", "Outlet State" : "PA", "Outlet Zip Code" : "15206", "Transaction Date/Time" : "2016-05-10-00:08:58", "Market Basket Transaction ID" : "353012783", "Scan Transaction ID" : "1", "Register ID" : "2", "Quantity" : "2", "Price" : "5.40", "UPC Code" : "2820030992", "UPC Description" : "L&M FSC BX 100", "Unit of Measure" : "Pack", "Promotion Flag" : "Y", "Outlet Multi-pack Flag" : "Y", "Outlet Multi-pack Quantity" : "2", "Outlet Multi-pack Discount Amount" : "0.50", "Account Promotion Name" : "Buy 2 packs get $1.00 off", "Account Discount Amount" : "", "Manufacturer Discount Amount" : "", "Coupon PID" : "", "Coupon Amount" : "", "Manufacturer Multi-pack Flag" : "N", "Manufacturer Multi-pack Quantity" : "", "Manufacturer Multi-pack Discount Amount" : "", "Manufacturer Promotion Description" : "", "Manufacturer Buy-down Description" : "", "Manufacturer Buy-down Amount" : "", "Manufacturer Multi-pack Description" : "", "Account Loyalty ID Number" : "", "Coupon Description" : "" }
{ "Outlet Name" : "Store B", "Outlet Number" : "2", "Outlet Address 1" : "123 Main Ave", "Outlet Address 2" : "", "Outlet City" : "Oakmont", "Outlet State" : "PA", "Outlet Zip Code" : "15140", "Transaction Date/Time" : "2016-05-09-15:08:13", "Market Basket Transaction ID" : "353065229", "Scan Transaction ID" : "1", "Register ID" : "1", "Quantity" : "10", "Price" : "1.08", "UPC Code" : "2590022731", "UPC Description" : "SS CGRLO TROPIC 2/$.99", "Unit of Measure" : "Pack", "Promotion Flag" : "N", "Outlet Multi-pack Flag" : "N", "Outlet Multi-pack Quantity" : "", "Outlet Multi-pack Discount Amount" : "", "Account Promotion Name" : "", "Account Discount Amount" : "", "Manufacturer Discount Amount" : "", "Coupon PID" : "", "Coupon Amount" : "", "Manufacturer Multi-pack Flag" : "N", "Manufacturer Multi-pack Quantity" : "", "Manufacturer Multi-pack Discount Amount" : "", "Manufacturer Promotion Description" : "", "Manufacturer Buy-down Description" : "", "Manufacturer Buy-down Amount" : "", "Manufacturer Multi-pack Description" : "", "Account Loyalty ID Number" : "", "Coupon Description" : "" }

I have tried using WRITE-JSON on the TT to both a FILE and a STREAM, but all the output is put on one line. If I use the formatted option, there are too many lines.

TEMP-TABLE ttblScanData:WRITE-JSON ("STREAM",  /* target type...in this case our STREAM */
                                   "EJ",   /* our STREAM name */
                                   FALSE,  /* formatted = false */
                                   ?,   /* encoding */
                                   FALSE,  /* omit initial values */
                                   TRUE,   /* omit outer object */
                                   FALSE   /* write before image */
                                   ).

I have also tried looping through the TT and using SERIALIZE-ROW to a stream...again all records on the same line.

hBuffer:SERIALIZE-ROW ("JSON",  /* target format...we want JSON */
                          "STREAM", /* target type..we want to use a stream */
                          "EJ",   /* our STREAM name */
                          FALSE,  /* formatted = false */
                          ?,   /* encoding */
                          FALSE,  /* omit initial values */
                          TRUE   /* omit outer object */
                          ).

I have also tried using SERIALIZE-ROW to a JsonObject and then using WriteStream on the JsonObject, but again all records end up on one line.

hBuffer:SERIALIZE-ROW ("JSON",    /* target format...we want JSON */
                          "JsonObject",  /* target type...we want a JsonObject */

                          oJsonObject,  /* the target JsonObject */
                          FALSE,    /* formatted = false */
                          ?,     /* encoding */
                          FALSE,    /* omit initial values */
                          TRUE     /* omit outer object */
                          ).
                         
   oJsonObject:WriteStream("EJ",
                           FALSE,
                           ?).

Am I missing something, or will I have to use SERIALIZE-ROW to a LONGCHAR and then write that to the STREAM with a SKIP?
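For reference, that LONGCHAR fallback is a short loop: serialize each row to a LONGCHAR, then PUT it followed by a SKIP. A rough sketch (the output file name is made up; hBuffer is assumed to be the TT's default buffer handle; PUT UNFORMATTED accepts a LONGCHAR in recent OpenEdge releases):

DEFINE VARIABLE lcRow   AS LONGCHAR NO-UNDO.
DEFINE VARIABLE hBuffer AS HANDLE   NO-UNDO.
DEFINE STREAM EJ.

hBuffer = TEMP-TABLE ttblScanData:DEFAULT-BUFFER-HANDLE.

OUTPUT STREAM EJ TO VALUE("scandata.json").  /* hypothetical file name */
FOR EACH ttblScanData:
    hBuffer:SERIALIZE-ROW("JSON",     /* target format */
                          "LONGCHAR", /* target type */
                          lcRow,      /* target variable */
                          FALSE,      /* formatted = false */
                          ?,          /* encoding */
                          FALSE,      /* omit initial values */
                          TRUE).      /* omit outer object */
    PUT STREAM EJ UNFORMATTED lcRow SKIP.    /* one record per line */
END.
OUTPUT STREAM EJ CLOSE.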

All Replies
  • This is technically not JSON (each line is a JSON object, but the file as a whole is not a valid JSON document). You'll need to do something like:

    method writerow (input hBuffer as handle, input pStreamHandle as handle):

        define variable i      as integer no-undo.
        define variable hField as handle  no-undo.

        put stream-handle pStreamHandle unformatted "~{ ".

        do i = 1 to hBuffer:num-fields:
            hField = hBuffer:buffer-field(i).

            put stream-handle pStreamHandle unformatted
                quoter(hField:name) " : ".

            case hField:data-type:
                // process data output per data type
                when "character" then
                    put stream-handle pStreamHandle unformatted
                        quoter(hField:buffer-value).
                otherwise
                    put stream-handle pStreamHandle unformatted
                        quoter(string(hField:buffer-value)).
            end case.

            if i < hBuffer:num-fields then
                put stream-handle pStreamHandle unformatted ", ". // field delimiter
        end.

        put stream-handle pStreamHandle unformatted " ~}~n".

    end method.

  • You will need to handle serialization of each entry in your temp-table manually.

    USING Progress.Json.ObjectModel.JsonObject FROM PROPATH.
    USING Progress.Json.ObjectModel.JsonArray FROM PROPATH.

    define temp-table test
        field afield as character
        field bfield as character.

    create test.
    test.afield = "a".
    test.bfield = "b".

    create test.
    test.afield = "c".
    test.bfield = "d".

    define variable tjson as JsonArray no-undo.
    tjson = new JsonArray().

    temp-table test:DEFAULT-BUFFER-HANDLE:write-json("JsonArray", tjson).

    define stream jsonout.
    output stream jsonout to value("c:\temp\test.json").

    define variable i as integer no-undo.
    define variable trow as JsonObject no-undo.

    do i = 1 to tjson:Length:
        trow = tjson:GetJsonObject(i).
        trow:WriteStream(stream jsonout:handle:name).
        put stream jsonout skip.
    end.

    output stream jsonout close.

  • Thanks Peter...I already explained to the vendor that it is not JSON. We have no control over the format...it is either this or a PIPE-delimited text file.

  • Thanks Matt...I will give that a try.

  • Matt,

    I have this working using the JsonArray as you mentioned. I also have it working by looping through the TT and using a JsonObject followed by a SKIP. I will have to test to see which performs better. THANKS.
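    For reference, the per-row JsonObject variant described above might look like the following sketch (the file name is hypothetical; hBuffer is assumed to be the TT's default buffer handle, so it is positioned by the FOR EACH):

    USING Progress.Json.ObjectModel.JsonObject FROM PROPATH.

    DEFINE VARIABLE oRow AS JsonObject NO-UNDO.
    DEFINE STREAM EJ.

    OUTPUT STREAM EJ TO VALUE("scandata.json").  /* hypothetical file name */
    FOR EACH ttblScanData:
        oRow = NEW JsonObject().
        hBuffer:SERIALIZE-ROW("JSON", "JsonObject", oRow, FALSE, ?, FALSE, TRUE).
        oRow:WriteStream("EJ", FALSE, ?).  /* write the row unformatted */
        PUT STREAM EJ SKIP.                /* newline after each record */
    END.
    OUTPUT STREAM EJ CLOSE.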

    I do have one more issue. I have a field in the TT defined as follows. The vendor requires these names, and the "Transaction Date/Time" turns into "Transaction Date\/Time" in the JSON file. Is there any way to resolve this?

    FIELD TransactionDateTime AS CHARACTER SERIALIZE-NAME "Transaction Date/Time"

  • It appears WriteStream is escaping the forward slash. From what I read, this is not necessary. Is this a bug?

  • This is not a bug.  While it might not be necessary, our implementation of the JSON serialization chose to escape the forward slash, backslash, etc. characters.  See www.json.org for the layout of a JSON string.
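    If the escaped slash ever becomes a problem for the vendor, one possible workaround (not discussed in this thread) is to serialize each row to a LONGCHAR and un-escape it before writing; REPLACE accepts LONGCHAR arguments. A sketch, assuming lcRow, hBuffer, and the EJ stream are set up as in the earlier examples:

    hBuffer:SERIALIZE-ROW("JSON", "LONGCHAR", lcRow, FALSE, ?, FALSE, TRUE).
    /* Turn the escaped "\/" back into "/" -- JSON permits an unescaped
       forward slash.  Note this is only safe when the field values
       themselves contain no backslashes. */
    lcRow = REPLACE(lcRow, "\/", "/").
    PUT STREAM EJ UNFORMATTED lcRow SKIP.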

  • Thanks for the info.