Intelligence (Analytics)
Performance problem in report with large data set
IN_BIRTY
Hi
We are facing a serious performance problem in a report that has a large dataset. There are about 9 million rows in the Oracle database, and we are retrieving about 2,000 claim records from it. The report does render, but it takes around 2.5 minutes to navigate to the next page of the report.
The report's dataset is comparatively large because it fetches many columns joined across multiple tables.
Any idea why paging through the report data is so slow with a large dataset?
Any suggested solutions?
Thanks in advance.
Indu
Comments
bhanley
The bottleneck is likely at the database level. Have you run your query outside of BIRT to see how it performs? Any filtering you can do at the database level (via your query) rather than against your data set in BIRT will be a huge performance gain. You will leverage the resources of the database (which are optimized to filter and organize data) as well as send less information over the wire to the report server.
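As a sketch of the advice above: move the claim filter into the SQL itself with a bind parameter, so Oracle returns only the ~2,000 matching claims instead of shipping rows for BIRT to filter. The table and column names (claims, payees, claim_date, etc.) are hypothetical stand-ins for your schema, and the in-memory list just simulates the effect of filtering at the source.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PushdownSketch {
    // Hypothetical query: the filtering happens in Oracle, not in a BIRT
    // data set filter. BIRT binds a report parameter to the "?" placeholder.
    static final String FILTERED_SQL =
        "SELECT c.claim_id, c.claim_date, p.payee_name " +
        "FROM claims c JOIN payees p ON p.payee_id = c.payee_id " +
        "WHERE c.claim_date >= ?";  // bind variable: only matching rows cross the wire

    public static void main(String[] args) {
        // Tiny stand-in for the large table, to show the effect of filtering
        // at the source versus fetching everything and filtering afterwards.
        List<Integer> table =
            IntStream.range(0, 9_000).boxed().collect(Collectors.toList());

        // This filter plays the role of the WHERE clause: the report engine
        // only ever sees the rows that survive it.
        List<Integer> filteredAtSource = table.stream()
                .filter(id -> id < 2_000)
                .collect(Collectors.toList());

        System.out.println("rows transferred: " + filteredAtSource.size());
    }
}
```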
EDH
Can you recommend any design patterns for producing reports efficiently given a large result set?
Assume that my database query performs as it should.
Assume that, even so, it can return a large number of result rows; let's say 25,000 rows.
How should I design my BIRT report so that I don't run into an OutOfMemoryError when several of my users request the report at the same time?
That is, is there any way to retrieve the data in chunks and produce the report in chunks? If not, how will it ever be possible to create "big" reports?
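One common pattern, sketched here under the assumption that the result set can be ordered by a stable key: page through the query in fixed-size chunks so only one chunk is ever in memory. The OFFSET/FETCH pagination syntax shown exists in Oracle 12c and later (older versions use ROWNUM); the claims table, 1,000-row chunk size, and 25,000-row result set are all illustrative stand-ins.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ChunkedFetchSketch {
    static final int CHUNK_SIZE = 1_000;

    // Hypothetical Oracle 12c+ pagination query. A deterministic ORDER BY on a
    // stable key is essential, or pages can overlap or skip rows.
    static String pageSql(int offset) {
        return "SELECT claim_id, claim_date FROM claims ORDER BY claim_id " +
               "OFFSET " + offset + " ROWS FETCH NEXT " + CHUNK_SIZE + " ROWS ONLY";
    }

    public static void main(String[] args) {
        // Stand-in for a 25,000-row result set; in a real report each chunk
        // would come from executing pageSql(offset) against the database.
        List<Integer> resultSet =
            IntStream.range(0, 25_000).boxed().collect(Collectors.toList());

        int chunks = 0;
        for (int offset = 0; offset < resultSet.size(); offset += CHUNK_SIZE) {
            List<Integer> chunk = resultSet.subList(
                    offset, Math.min(offset + CHUNK_SIZE, resultSet.size()));
            // Render or emit this chunk, then let it go out of scope: heap use
            // stays bounded by CHUNK_SIZE instead of the full result set.
            chunks++;
        }
        System.out.println("chunks processed: " + chunks);
    }
}
```

A related lever on the JDBC side is `Statement.setFetchSize`, which controls how many rows the Oracle driver buffers per round trip even when you run a single large query.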
Best regards,
EDH