FILL operation - Forum - OpenEdge Deployment - Progress Community

FILL operation

  • I am working on data-fetching performance enhancements and need to find an article that discusses the ramifications of client-server fill versus AppServer data fill.  I also need to understand the impact of BATCH-SIZE, the number of rows fetched, the number of columns in the dataset, static versus dynamic temp-table/dataset handling, et al.  I am doing some experimentation, performing the data gathering using all these methods, but I am not sure what to adjust for optimum performance.  If you can point me to an article that discusses these issues and provides recommendations for optimum performance, I would appreciate it.

    Environment is Progress 10.2B, .NET, Unix database server.
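    For reference, BATCH-SIZE here is the buffer attribute that limits how many rows one ProDataSet FILL fetches. A minimal sketch of a batched FILL, one of the knobs the question asks about (table and variable names are hypothetical; error handling omitted):

    DEFINE TEMP-TABLE ttCustomer NO-UNDO LIKE Customer.
    DEFINE DATASET dsCustomer FOR ttCustomer.
    DEFINE DATA-SOURCE srcCustomer FOR Customer.

    DEFINE VARIABLE hBuffer AS HANDLE NO-UNDO.

    hBuffer = DATASET dsCustomer:GET-BUFFER-HANDLE("ttCustomer").
    hBuffer:ATTACH-DATA-SOURCE(DATA-SOURCE srcCustomer:HANDLE).

    /* Fetch only the first 50 rows; with FILL-MODE "APPEND" a later
       FILL continues from where the data source left off. */
    hBuffer:BATCH-SIZE = 50.
    hBuffer:FILL-MODE  = "APPEND".
    DATASET dsCustomer:FILL().

    hBuffer:DETACH-DATA-SOURCE().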

  • doug.macmillan wrote:

    I am working on data-fetching performance enhancements and need to find an article that discusses the ramifications of client-server fill versus AppServer data fill.  I also need to understand the impact of BATCH-SIZE, the number of rows fetched, the number of columns in the dataset, static versus dynamic temp-table/dataset handling, et al.  I am doing some experimentation, performing the data gathering using all these methods, but I am not sure what to adjust for optimum performance.  If you can point me to an article that discusses these issues and provides recommendations for optimum performance, I would appreciate it.

    Environment is Progress 10.2B, .NET, Unix database server.

    There's been some discussion on this topic over time here on PSDN. I'd ask more specific client-side questions in the http://communities.progress.com/pcom/community/psdn/openedge/gui4dotnet community, and server-side ones probably in the http://communities.progress.com/pcom/community/psdn/openedge/abl community, although there's a fair amount of cross-pollination between the ABL and OO forums.

    (I don't have permissions to move this question to another forum, so you may have to ask it again).

    -- peter

  • You can also take a look at the presentation I did on batching, filtering and sorting in 2009. There is a PowerPoint that touches on some of the mentioned subjects, and a code sample that illustrates how to support this in a layered architecture like OERA. Note that the general OERA and UI code is oversimplified, as the focus is only on batching, sorting and filtering.

    http://communities.progress.com/pcom/docs/DOC-102833

    Dale Hubbuck pointed out a problem with the sample code in newer versions and provided a solution in the following post.

    http://communities.progress.com/pcom/message/112200#112200

  • Is it fair to compare the performance of a .NET GUI application using non-native controls to a Progress V7-style application using browses, etc.?  In other words, is it reasonable to believe that a GUI for .NET app that fetches data in client-server mode and populates an Infragistics grid will ever be as fast as an old-style Progress app that fills a browse?  It feels like a losing battle when the .NET app is continually compared to the old Progress app from a performance perspective.

    In the old Progress app, the DB table is bound directly to the browse via a query, and unwanted records are thrown out both via the query and via a FIND trigger that performs business logic and returns an error if the record should not be included.  If a record is to be included, additional business logic is performed to fill in calculated fields, etc.

    In .NET the model is different.  The query string is created and fills a temp-table via a dataset, with records being excluded via the query and also via the BEFORE-ROW-FILL event.  This all works, but it is not nearly as fast as the old Progress app.  There are indexing issues (many multi-field indexes versus single-field indexes) that prevent the most effective bracketing, which forces me to throw records out on the back end in BEFORE-ROW-FILL instead of via the query.

    There are things I can do to fill the grid faster and provide a better UI experience for the user, but they involve delaying the additional business logic for calculated fields and don't really address the performance question I am asking.  I may go this route out of necessity and run the additional business logic on row activate in the grid (or something similar), so that the data is available for viewing but is not all built up as part of the initial FILL operation.  Any comments or suggestions are welcome.

  • Another (much more brief!) question...

    In an old-style Progress app where the DB table is bound to a browse via a query, is there internal data batching built into the Progress browse and query?

  • In an old-style Progress app where the DB table is bound to a browse via a query, is there internal data batching built into the Progress browse and query?

    Yes - by 50 or 80 records or so, depending on whether or not the query contains joins.

  • what do you mean exactly by "depending on joins in the query or not"?

  • what do you mean exactly by "depending on joins in the query or not"?

    For queries with a join the browse reads 50 rows; for queries without a join, 80 rows.

    Or the other way around - it must be somewhere in the documentation ...

  • >  IOW - is it reasonable to believe that the performance of a GUI .Net app that fetches data in client server mode and populates an IG grid will ever be as fast as an old style progress app that fills a browse. 

    The Progress browse is very specialized in optimizing performance when reading from a Progress query, especially when using INDEXED-REPOSITION.

    Out of the box, the ProBindingSource only supports forward batching. But what about jumping to the last of 100,000 records, or fetching the 90,000th record? That would require both forward and backward batching. The ProBindingSource and the .NET grids work with 0-based row indexes that cannot become negative...

    So without coding around this issue, it's a battle the .NET Controls cannot win.

    We've implemented INDEXED-REPOSITION as part of our GUI for .NET framework, based on ProDataSets and business entities. It requires far more code on the frontend and the backend than I would have liked (the ProDataSet's batching mechanism is designed for forward batching only as well), but the result was worth the work. We can fetch the last batch in almost no time, jump to any record in between just as fast, and then batch forward and backward.
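    The jump-to-position building block might be sketched roughly like this, assuming the kind of dataset and data source shown elsewhere in this thread (names are hypothetical; this is only an illustration of the RESTART-ROWID/NEXT-ROWID batching hooks on the data-source handle, not the framework described above):

    DEFINE VARIABLE hBuffer AS HANDLE NO-UNDO.
    DEFINE VARIABLE hSource AS HANDLE NO-UNDO.

    hBuffer = DATASET dsCustomer:GET-BUFFER-HANDLE("ttCustomer").
    hSource = hBuffer:DATA-SOURCE.

    /* Fetch the batch that starts at a ROWID saved from a prior FILL;
       combined with ascending/descending queries this gives forward
       and backward batching as well as "jump to record N". */
    hBuffer:BATCH-SIZE       = 50.
    hSource:RESTART-ROWID(1) = rStartHere.
    DATASET dsCustomer:FILL().

    /* hSource:NEXT-ROWID(1) now identifies where the following batch starts. */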

    beforerowfill in lieu of via the query

    Try to avoid the row-level event handlers of the dataset! They kill performance. When only a minority of records are to be excluded, use a table-level AFTER-FILL and delete the unwanted records.
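    The table-level alternative could look roughly like this - a callback registered once per table with SET-CALLBACK-PROCEDURE("AFTER-FILL", ...) rather than per row (the table, field names and the exclusion rule are hypothetical):

    PROCEDURE AfterDataFill:
        DEFINE INPUT PARAMETER DATASET FOR dsCustomer.

        /* One pass per batch instead of one callback per row:
           delete the minority of rows the query could not exclude,
           and fill the calculated fields in the same pass. */
        FOR EACH ttCustomer:
            IF ttCustomer.Balance < 0 THEN  /* hypothetical rule */
                DELETE ttCustomer.
            ELSE
                ttCustomer.cFullName = ttCustomer.FirstName + " " + ttCustomer.LastName.
        END.
    END PROCEDURE.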

    additional business logic on row activate in the grid (or something similar) so that the data is available for viewing but it is not all built up as part of the initial fill operation.

    Are you talking about RowActivate (selection of a row) or OnInitializeRow (similar to ROW-DISPLAY)? I always compute all data in the business entity, also so that all the fill logic can be reused with other clients (mobile devices, web browsers, etc.).

    There are things that I can do to boost performance

    It's tough - but it can be done. And it should be done in a framework, not in every Form over and over again, so that further optimization can happen in a central place in the future. Make sure to get your batching working - forward and backward. Divide and conquer is the key to performance, like the Progress browse does.

  • Are you saying that row level fill events are very expensive and should be avoided?

    This is what I am doing... (code snippet below)

    The EventProcHandle is the reference to a persistent procedure, and each of the callbacks is run for each row if it is defined in that procedure.  Typically, BeforeDataRowFill has the job of excluding records I don't want that are not excluded via the query.  AfterDataRowFill has the job of doing extra business logic for calculated fields to fill out the dataset.

    To your comment about throwing away records with a table-level fill event, a couple of questions/comments...

    Seems that this would affect the batch size in the dataset, since I am including records that I am ultimately going to throw away en masse.

    I could mark these records and dispose of them at the end easily enough, but managing the batch would appear to be cumbersome, to say the least.

    Is there a better way of filling the additional dataset fields (calculated fields) instead of in the AfterDataRowFill procedure?

    IF VALID-HANDLE(EventProcHandle) THEN DO:

        IF LOOKUP("BeforeDataFill",EventProcHandle:INTERNAL-ENTRIES) NE 0 THEN
            hDataSource:GET-BUFFER-HANDLE(TempTableName):SET-CALLBACK-PROCEDURE
                ("before-fill","BeforeDataFill",EventProcHandle).

        IF LOOKUP("BeforeDataRowFill",EventProcHandle:INTERNAL-ENTRIES) NE 0 THEN
            hDataSource:GET-BUFFER-HANDLE(TempTableName):SET-CALLBACK-PROCEDURE
                ("before-row-fill","BeforeDataRowFill",EventProcHandle).

        IF LOOKUP("AfterDataRowFill",EventProcHandle:INTERNAL-ENTRIES) NE 0 THEN
            hDataSource:GET-BUFFER-HANDLE(TempTableName):SET-CALLBACK-PROCEDURE
                ("after-row-fill","AfterDataRowFill",EventProcHandle).

        IF LOOKUP("AfterDataFill",EventProcHandle:INTERNAL-ENTRIES) NE 0 THEN
            hDataSource:GET-BUFFER-HANDLE(TempTableName):SET-CALLBACK-PROCEDURE
                ("after-fill","AfterDataFill",EventProcHandle).

    END.

  • Is there a better way of filling the additional dataset fields (calculated fields) instead of in the AfterDataRowFill procedure?

    Use the AFTER-FILL of either the table or the dataset, FOR EACH through the whole temp-table, and do your computations that way.

    Each invocation of the event handlers has an overhead. AFTER-ROW-FILL is executed once per row; AFTER-FILL once per batch, which means a factor of (batch size) fewer calls.

    BEFORE-ROW-FILL is a different thing. If you are about to throw away most records, the query is the way to go.

    You would have to provide many more details to discuss this further.

    When you are throwing away 90% of the records in the current BEFORE-ROW-FILL code and "cannot" do that in the query, I'd rather not use the FILL method at all. You can also FOR EACH the DB table and BUFFER-COPY into the temp-table in those cases.
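    That alternative might be sketched like this (the WHERE clause and the fIsWanted filter function are hypothetical stand-ins for whatever the query can and cannot express):

    /* Skip FILL entirely: bracket what the query can, apply the
       business-logic filter in ABL, and copy only the keepers. */
    FOR EACH Customer NO-LOCK
        WHERE Customer.SalesRep = "HXM":

        IF NOT fIsWanted(BUFFER Customer:HANDLE) THEN NEXT.

        CREATE ttCustomer.
        BUFFER-COPY Customer TO ttCustomer.
    END.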

  • I understand the comments regarding detail - the data varies widely, so the appropriate implementation differs based on the data conditions. I am just trying to figure out the best way to get the data - and fast. I still don't believe comparing performance in GUI for .NET to a native Progress browse is fair, but it is what it is. Thanks for the detailed info.