KoolReport's Forum

Official Support Area, Q&As, Discussions, Suggestions and Bug reports.
Forum's Guidelines

How to consume large array data on datagrid #2160

Closed saiful opened this topic on Jun 25, 2021 - 2 comments

saiful commented on Jun 25, 2021

Hello, I have a project that uses a DataGrid to handle millions of rows, and the datasource I use is an array converted from the JSON our backend APIs return. When I use an array, a memory-exhausted error always appears. I've seen and tried the examples of handling big data with server-side mode and fastRender, but those use datasources that come directly from PDO, which we don't think is suitable for us. The question is: how do I handle a large array datasource with a DataGrid without sacrificing a lot of memory? Is there a way to process it asynchronously, or some kind of lazy load?

Sebastian Morales commented on Jun 28, 2021

Saiful, does your API support paging? There's no way to load millions of rows into memory and output them all to the client/browser at once without memory or processing-time issues. That's why we use DataGrid/DataTables' server-side mode: most SQL database servers (MS SQL Server, Oracle, PostgreSQL, MySQL, etc.) support paging, so each request only pulls one page of rows.
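For reference, server-side mode in KoolReport's DataTables widget is enabled with the `"serverSide"` option, so the browser requests one page of rows per call instead of the whole dataset. A minimal sketch (the `"mysql"` datasource name and the `orders` query are illustrative, not from this thread):

```php
<?php
// In a report view file -- sketch assuming KoolReport's DataGrid package
use \koolreport\datagrid\DataTables;

DataTables::create([
    "dataSource" => function() {
        // The database applies LIMIT/OFFSET paging per request,
        // so only one page of rows is ever materialized.
        return $this->src("mysql")
            ->query("SELECT id, name, amount FROM orders");
    },
    "serverSide" => true, // browser fetches one page at a time
    "options" => [
        "paging" => true,
        "pageLength" => 25,
    ],
]);
```

With a plain PHP array as the datasource this mode cannot help, because the full array must already exist in memory before paging is applied.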

If your API does support paging, there might be a way to make it work with DataTables. Rgds,
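The idea above can be sketched in plain PHP with a generator that pulls one page at a time from a paging API, so memory stays bounded by the page size rather than the total row count. The names (`pagedRows`, `$fetchPage`) are illustrative, not a KoolReport API; a real `$fetchPage` would make an HTTP call with page parameters:

```php
<?php
// Lazily stream rows from a paging API: only one page is held in memory.
function pagedRows(callable $fetchPage, int $pageSize): \Generator
{
    $page = 0;
    do {
        // e.g. GET /rows?page=$page&size=$pageSize, decoded from JSON
        $rows = $fetchPage($page, $pageSize);
        foreach ($rows as $row) {
            yield $row;
        }
        $page++;
    } while (count($rows) === $pageSize); // a short page means we hit the end

}

// Demo with a fake API serving 10 rows, 4 per page
$fakeApi = function (int $page, int $size): array {
    $all = range(1, 10);
    return array_slice($all, $page * $size, $size);
};

$total = 0;
foreach (pagedRows($fakeApi, 4) as $row) {
    $total += $row;
}
echo $total; // 55
```

The same page-by-page fetch could back a server-side DataTables endpoint: each DataTables request maps to one API page, so no request ever loads the full dataset.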

saiful commented on Jul 1, 2021

Thanks for the explanation. Our team is considering modifying the API source.


DataGrid