We have a table with approximately 11 million rows and 10 columns (with fields of various lengths). Will Essential Grouping be able to cope with this and display the records using ASP.NET?
Administrator Syncfusion Team March 9, 2005 12:24 PM
Bob,
Assuming you don't want all 11 million rows on the client page at the same time, if you specify filter settings that reduce the number of rows displayed (to 100 or so), the performance should be acceptable.
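For example, a minimal sketch of that idea (not code from the product: 'LoadOrders', 'groupingGrid', and the column names in the filter expression are hypothetical placeholders), applying the filter on the server before data binding so only the reduced row set ever reaches the grid:

using System;
using System.Data;

public partial class OrdersPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // 'LoadOrders' is a hypothetical data-access call.
        DataTable ordersTable = LoadOrders();

        // Cut the millions of rows down to a displayable subset on the server.
        DataView view = new DataView(ordersTable);
        view.RowFilter = "Region = 'West' AND OrderDate >= #1/1/2005#";

        // Bind only the filtered rows (ideally a hundred or so) to the grid.
        groupingGrid.DataSource = view;
        groupingGrid.DataBind();
    }
}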
-Praveen
BobHope March 9, 2005 12:56 PM
Hi,
Thanks for the response. I wouldn't necessarily need to show all of the rows at once, but I would need to be able to show the summary info.
That is, of my 10 columns, the user will need to be able to pick at least one (possibly even up to 5 or 6) and group by them. I will then only need to show the summary results.
My main concern is that the Essential Grouping component would need to load all of these rows in order to do the grouping; does anyone know whether it can handle this situation?
Thanks
Bob
Administrator Syncfusion Team March 9, 2005 01:54 PM
Hi Bob,
11 million rows might be too much to handle.
You could test-drive such a scenario with the GroupingPerf example that ships with the Windows Forms Grouping Grid. It lets you specify the number of records and whether to group and categorize them. Once you go above 500,000 or a million records, memory consumption may become an issue.
One way to handle more rows would be to fall back to the plain grouping engine, without grid and nested-table support. In the samples that ship with the plain grouping engine, you will notice that you can set the property RecordsAsDisplayElements = true. This cuts down the grouping engine's memory usage, and you should be able to load it with a couple of million records. 11 million might still be too much, though.
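A rough sketch of what that could look like for the scenario above (member names are from memory of the Essential Grouping API and may not be exact; the 'Region' and 'Product' column names are placeholders, and the shipped samples show where RecordsAsDisplayElements and the group collections actually live):

using System;
using System.Collections;
using Syncfusion.Grouping;

class GroupingSketch
{
    static void SummarizeGroups(IList sourceRows)
    {
        Engine engine = new Engine();

        // Assumed location of the memory-saving flag mentioned above;
        // check the samples for where RecordsAsDisplayElements really lives.
        // engine.Table.RecordsAsDisplayElements = true;

        // Load the (very large) flat list of records.
        engine.SetSourceList(sourceRows);

        // Group by the one to six columns the user picked.
        engine.TableDescriptor.GroupedColumns.Add("Region");
        engine.TableDescriptor.GroupedColumns.Add("Product");

        // Show only summary information: walk the groups, not the individual records.
        foreach (Group group in engine.Table.TopLevelGroup.Groups)
        {
            Console.WriteLine("{0}: {1} records", group.Category, group.Records.Count);
        }
    }
}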
Stefan