I need to read and write CSV files for external systems. The data involved is highly context-dependent, so the plain "Export" approach is not what I am looking for. Also, when creating the CSV, I need to handle certain things like single and double quotes, newlines, and other "control" codes that might be in the data.
I have a text-only development environment. I have been experimenting with a FOR EACH over the tables and conditions needed for the output, using a DISPLAY statement and simply concatenating the trimmed fields together with double quotes and commas. This appears to create what I want, but I am struggling to keep it from truncating the line.
Perhaps there is an easier way...
On the import side, I am looking for a starting point.
Any trivial code examples for the read or write would be nice.
There is a command line utility called quoter that comes with an OpenEdge install that might make your life easier when reading the .csv file. In newer versions I think it is built into the language.
For writing the .csv files, look into using the PUT UNFORMATTED statement instead of DISPLAY since it gives you a lot more control over how the output is formatted.
To read, use IMPORT DELIMITER ",".
To write, use EXPORT DELIMITER ",".
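A minimal sketch of that suggestion, assuming an illustrative Customer table with CustNum and Name fields:

```
/* Write: EXPORT quotes character fields and emits one record per line. */
OUTPUT TO "customers.csv".
FOR EACH Customer NO-LOCK:
    EXPORT DELIMITER "," Customer.CustNum Customer.Name.
END.
OUTPUT CLOSE.

/* Read it back: IMPORT parses the same delimited format. */
INPUT FROM "customers.csv".
REPEAT:
    CREATE Customer.
    IMPORT DELIMITER "," Customer.CustNum Customer.Name.
END.
INPUT CLOSE.
```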
As I indicated, Miles, the Export approach is not good - been there, done that. What is lacking is control over "what" is exported and the format of the data.
Matthew - ok, let me look over the PUT...
Ah yes! The PUT statement works fine as follows:
OUTPUT TO ")...
This allows me to use REPLACE on fields where I run into trouble-causing values, such as single and double quotes and newlines.
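For anyone following along, the PUT UNFORMATTED + REPLACE combination might look something like this - a sketch only, with placeholder file, table, and field names:

```
/* Write one CSV row per Customer record (illustrative names). */
DEFINE VARIABLE cName AS CHARACTER NO-UNDO.

OUTPUT TO "customers.csv".
FOR EACH Customer NO-LOCK:
    /* Escape embedded double quotes by doubling them (CSV convention)
       and strip CR/LF characters that would break the row. */
    cName = REPLACE(Customer.Name, '"', '""').
    cName = REPLACE(REPLACE(cName, CHR(13), " "), CHR(10), " ").
    PUT UNFORMATTED
        '"' cName '",'
        '"' STRING(Customer.CustNum) '"' SKIP.
END.
OUTPUT CLOSE.
```

Because PUT UNFORMATTED emits exactly what you hand it, there is no display-format truncation of the kind DISPLAY causes.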
Now, the "read" side - I was looking at the INPUT FROM statement - perhaps with an INSERT statement... Thoughts?
Consider IMPORT UNFORMATTED for maximum control. You can pick this apart using ENTRY() and do any special processing you want into temporary variables, then create the buffer and assign everything.
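A rough sketch of that pattern, assuming a hypothetical Customer buffer and a two-column file; note that a plain ENTRY() split does not honor commas inside quoted fields, so add your own handling if the data can contain the delimiter:

```
/* Read each raw line, pick it apart with ENTRY(), then create the buffer. */
DEFINE VARIABLE cLine AS CHARACTER NO-UNDO.
DEFINE VARIABLE iNum  AS INTEGER   NO-UNDO.
DEFINE VARIABLE cName AS CHARACTER NO-UNDO.

INPUT FROM "myFile.csv".
REPEAT:
    IMPORT UNFORMATTED cLine.
    ASSIGN
        iNum  = INTEGER(ENTRY(1, cLine))
        cName = TRIM(ENTRY(2, cLine), '"').  /* strip surrounding quotes */
    CREATE Customer.          /* illustrative table */
    ASSIGN
        Customer.CustNum = iNum
        Customer.Name    = cName.
END.
INPUT CLOSE.
```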
Consulting in Model-Based Development, Transformation, and Object-Oriented Best Practice http://www.cintegrity.com
The best approach for reading a CSV file is the following:
DEFINE STREAM myStream.
DEFINE VARIABLE c AS CHARACTER NO-UNDO EXTENT 50.

INPUT STREAM myStream FROM "myFile.CSV".
REPEAT:
    IMPORT STREAM myStream DELIMITER "," c.
END.
INPUT CLOSE.
The extents contain everything, including single and double quotes, so you don't need to worry about them.
If you use the UNFORMATTED method, you need to do additional validation of each cell, the line string, etc. With this approach you have everything clean.