Problem with escape characters when receiving JSON messages through a REST endpoint - Forum - OpenEdge General - Progress Community


When the JSON message below is sent to the REST endpoint (web service), the sender receives an error 500 and the JSON message is not loaded into the temp-table.
The Progress Application Server and the database run with cpcoll Norwegian and cpinternal/cpstream ISO8859-1.

The REST endpoint is running on Progress Application Server 11.7.2 (PASOE).

The problem is how to handle '\u2019'.

{
  "dsProd": {
    "Prod": [
      {
        "prodnr": "KOID280818",
        "besk": "Fighting the Giants: Castro\u2019s Revolution vs The World",
        "StartPeriode": 201804,
        "SluttPeriode": 209904
      }
    ]
  }
}

All Replies
  •  
     
    Start by verifying the “Content-Type” header on the client request.  It MUST state that the character encoding is UTF-8 or UTF-16; JSON requires this.
     
    Most web servers, including PASOE, are set up to use ISO8859-1 by default, as required by the standard.  A kbase article was recently created on this.  The default can be changed, but probably shouldn’t be.  Because of this, if the client doesn’t specify a character encoding for a message, it is assumed to be ISO8859-1.  That is wrong for JSON: JSON requires Unicode, which is either UTF-8 or UTF-16.
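    As a minimal sketch of what a correct client request might look like (the endpoint URL is hypothetical, the field names are taken from the question, and the fetch call itself is left commented out):

```javascript
// Build a JSON POST whose Content-Type declares UTF-8 explicitly, so the
// server never falls back to its ISO8859-1 default. The URL is hypothetical.
const payload = {
  dsProd: {
    Prod: [{ prodnr: "KOID280818", besk: "Castro\u2019s Revolution" }]
  }
};

const requestOptions = {
  method: "POST",
  headers: {
    // JSON bodies must be Unicode; say so on the wire.
    "Content-Type": "application/json; charset=utf-8"
  },
  body: JSON.stringify(payload)
};

// In the browser (or Node 18+):
// fetch("/rest/Prod", requestOptions).then(r => r.json());
```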
     
    This looks like the right single quotation mark that MS Word inserts as a “smart quote”.
     
    Can this character be converted to ISO8859-1?
     
  •  
    Answering my own question:
     
    Seems like this character doesn’t exist in ISO8859-1…
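    Indeed: ISO8859-1 covers only code points U+0000 through U+00FF, and U+2019 is code point 8217, well outside that range.  A quick sketch of a check for whether a string can be represented in Latin-1 at all:

```javascript
// ISO8859-1 (Latin-1) maps one byte per character, covering only U+0000..U+00FF.
// Anything above that range, like U+2019, simply has no Latin-1 encoding.
function fitsInLatin1(str) {
  return [...str].every(ch => ch.codePointAt(0) <= 0xFF);
}

fitsInLatin1("Castro's");       // true: plain ASCII apostrophe
fitsInLatin1("Castro\u2019s");  // false: curly apostrophe is U+2019 (8217)
```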
     
     
    The PASOE team should probably comment, but returning a 500 for a client encoding issue isn’t terribly friendly.  A 400 with a clear error message about the encoding failure would be nicer here.
     
    A few things they can do:
    1. Make sure they are declaring the proper content-type character encoding.
    2. Reject the request on the client side and tell the user to fix it.
    3. Consider stripping the bad character on the client before sending to the server (this doesn’t address the bad server behavior).
    4. Encode the bad character to something safer on the client before sending it to the server, e.g. with JavaScript’s encodeURIComponent() function, then decode it on the server.
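    Options 3 and 4 above might look like this on the client (a sketch; the replacement map covers only the common Word “smart” punctuation):

```javascript
// Option 3 (variant): map Word-style curly quotes to their ASCII equivalents
// instead of stripping them outright, so no text is lost.
function replaceSmartQuotes(str) {
  return str
    .replace(/[\u2018\u2019]/g, "'")
    .replace(/[\u201C\u201D]/g, '"');
}

replaceSmartQuotes("Castro\u2019s");  // "Castro's"

// Option 4: percent-encode the value; every byte becomes ASCII on the wire,
// and the server decodes it back after receipt.
encodeURIComponent("Castro\u2019s");  // "Castro%E2%80%99s"
```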

  •  
     
    Further info for anyone reading this:
     
    Browsers will submit an XHR request back to a server using the code page from the original request.  So if a .jsp page runs JavaScript that makes an XHR request, the browser selects the encoding for the XHR request based on what the .jsp request’s content type is set to.
     
    So if the .jsp page is declared without a content type, the server default is used, which is probably going to be ISO8859-1.  If the .jsp sets its content type to UTF-8, the browser will send any XHR requests that originate from that page as UTF-8, unless that is explicitly overridden when setting the headers for the individual XHR request.
     
     
    So if you don’t want to specify the character encoding for every XHR request, you can set your .jsp pages to UTF-8 and the browser will do the right thing.  Or you can give each XHR request a Content-Type header that includes the character set encoding.
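    Overriding the inherited page encoding for a single request might be sketched like this (the URL is hypothetical, and the XHR object is passed in as a parameter so the snippet does not depend on a browser):

```javascript
// Send one JSON POST with an explicit charset, regardless of what encoding
// the hosting .jsp page declared. The xhr argument is the browser's
// XMLHttpRequest in production (injected here so it can be stubbed).
function postJson(xhr, url, data) {
  xhr.open("POST", url);
  // The explicit header wins over the page-inherited encoding.
  xhr.setRequestHeader("Content-Type", "application/json; charset=utf-8");
  xhr.send(JSON.stringify(data));
}

// In the browser:
// postJson(new XMLHttpRequest(), "/rest/Prod", { besk: "Castro\u2019s" });
```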
     
    The body encoding for the JSON must end up as Unicode; otherwise you’re violating the JSON standard, and the server is going to reject the request if there are characters outside the code page.
     
    But if they have to move these characters over to the AVM (since this is PASOE), they still need to deal with the characters in that context as well, and then also in the database if the data is going to be stored there.
     
     
    mattB
     
  • Thank you for the answer. I will try it out.