BIRT Out Of Memory exception
eliac
Dear All,
I am using BIRT 3.7.2 to render my reports.
My JVM configuration is -Xms512m -Xmx1024m.
Whenever a report with more than 20,000 rows is launched, the engine hangs and throws an out-of-memory error.
I added some System.out statements to ResultIterator.java, and after following the BIRT logs it appears that the exception occurs in the getValue() method, so the report is not rendered.
Is there any way to avoid this, for example by dividing the original number of records into chunks of, say, 10,000, pushing the records onto several stacks, and then rendering the report?
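A chunking approach like the one described above can be sketched in plain Java (a generic illustration, independent of BIRT; the class name and batch size are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class RowBatcher {
    // Split a large result list into fixed-size batches, so each batch
    // can be rendered (and released) before the next one is processed.
    public static <T> List<List<T>> partition(List<T> rows, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int start = 0; start < rows.size(); start += batchSize) {
            int end = Math.min(start + batchSize, rows.size());
            batches.add(new ArrayList<>(rows.subList(start, end)));
        }
        return batches;
    }
}
```

Note that this only limits memory if each batch is released before the next one is loaded; rendering all batches into one report document still accumulates the full output.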
Or is there any recommended configuration for Tomcat 6, BEA 10/11, or WAS servers for huge reports?
Thanks in advance for your help
Regards
Comments
icirillo
Hi eliac,
I just executed a report containing 25,000 rows using the BIRT engine 3.7.2 (via the command line).
Could you try setting the following Java options?
-Xms128m -Xmx512m -XX:MaxPermSize=128m -Djava.awt.headless=true
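On Tomcat these options are typically set via CATALINA_OPTS, for example in a setenv.sh next to catalina.sh (a sketch assuming a standard Tomcat layout; adjust the values to your heap needs):

```shell
# CATALINA_HOME/bin/setenv.sh — picked up automatically by catalina.sh on startup
export CATALINA_OPTS="-Xms128m -Xmx512m -XX:MaxPermSize=128m -Djava.awt.headless=true"
```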
CBR
Hi,
The simplest way would be to increase the heap size. If you are running on 64-bit (I would recommend that), that shouldn't be a problem. I'm running very large reports with 4 GB RAM on a Tomcat without any issue.
But it is quite hard to make recommendations without knowing your typical report. A few questions for you:
- Do you use scripted data sets or JDBC data sets?
- What's the typical data volume of a single report? You can roughly estimate it from the number of rows and columns. If one of your columns contains binary data it is much worse, because that data has to be loaded into memory as well.
- If you use JDBC, have you configured the JDBC driver to stream rows? (Some drivers load all rows before returning the very first one to BIRT; this causes an OutOfMemory in most cases when a report needs to load a high data volume.)
- If you are already streaming rows from the database, you can configure BIRT to use a disk cache once the number of rows exceeds a certain limit. You can define this limit as a parameter for the report engine. This is only possible through code changes to your application; if you are using the Web Viewer it is not a real option for you (unless you want to modify that code).
- How do you display the data? A data cube, a normal table or list, or a chart?
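The rough data-volume estimate mentioned above can be put into numbers; for example (plain Java; the 64-byte average per value is an assumed figure for illustration):

```java
public class ReportMemoryEstimate {
    // Very rough estimate of a result set's in-memory footprint:
    // rows * columns * average bytes per value. Real usage is typically
    // several times higher once Java object headers, boxing, and
    // per-String overhead are included.
    public static long estimateBytes(long rows, int columns, int avgBytesPerValue) {
        return rows * columns * (long) avgBytesPerValue;
    }

    public static void main(String[] args) {
        // 20,000 rows x 17 columns x ~64 bytes per value: ~20 MB of raw
        // payload, so easily far more after object overhead.
        long bytes = estimateBytes(20_000, 17, 64);
        System.out.println(bytes / (1024 * 1024) + " MB (raw payload)");
    }
}
```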
eliac
Hello Cbrell,
Thank you for the detailed reply.
Concerning your questions, please find my replies below:
1- I am using JDBC data sets.
2- My report contains about 17 columns. It works fine below 20,000 rows; once this limit is exceeded I get the out-of-memory issue.
3- I haven't changed the JDBC driver configuration; we are using the drivers provided by Oracle, along with the Struts, Spring and iBATIS frameworks. I will look into how this configuration is done and see whether it solves my problem.
4- At first I used the BIRT Web Viewer, but now I have created my own viewer using the different render options available from BIRT (HTML, Excel, PPT...), so setting the disk cache option might help me as well. I will look deeper into it.
5- The particular report causing the issue uses normal tables.
Regards
CBR
Hi eliac,
I would be interested in your problem. Does the dataset contain columns with binary or large data?
If you are using BIRT through the API, can you try to add a RAM limit for the dataset? You can do that via the application context; it can be set on any task.
Just try:
task.getAppContext().put(DataEngine.MEMORY_BUFFER_SIZE, 500);
The task variable can hold any task (RunAndRenderTask or RunTask... setting it on a RenderTask doesn't make sense, but would be possible :-)). This will write dataset data to disk if it consumes more than 500 MB of RAM. If the issue does not disappear, you will have to take a closer look at the JDBC settings.
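The spill-to-disk behaviour that MEMORY_BUFFER_SIZE enables can be illustrated with a simplified stand-in (a conceptual sketch only, not BIRT's actual implementation; the class name and row threshold are invented for illustration):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.ArrayList;
import java.util.List;

public class SpillBuffer {
    private final int maxInMemoryRows;
    private final List<String> inMemory = new ArrayList<>();
    private Path spillFile;      // created lazily on first overflow
    private int spilledRows = 0;

    public SpillBuffer(int maxInMemoryRows) {
        this.maxInMemoryRows = maxInMemoryRows;
    }

    // Keep rows in memory up to the threshold, then append overflow
    // rows to a temporary file instead of growing the heap.
    public void add(String row) {
        if (inMemory.size() < maxInMemoryRows) {
            inMemory.add(row);
            return;
        }
        try {
            if (spillFile == null) {
                spillFile = Files.createTempFile("spill", ".rows");
            }
            Files.write(spillFile, List.of(row), StandardOpenOption.APPEND);
            spilledRows++;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public int inMemoryCount() { return inMemory.size(); }
    public int spilledCount()  { return spilledRows; }
}
```

The trade-off is the same as with BIRT's disk cache: rows beyond the limit cost a disk write and a later read back, but the heap stays bounded.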
CBR
Would you be able to share the report? (I just wanted to see whether I can spot any issues in it.)
By the way, if you speak German you can have a look at the slides of one of my talks:
http://www.oio.de/m/konf/jax2012/High-Performance-BIRT-Reports-JAX-2012.pdf
eliac
Hello,
Thanks again for the reply.
Unfortunately I cannot post the report since it's proprietary to the company I work for, but I was able to solve the issue both on the API side and on the BIRT Viewer side. The viewer side requires a workaround, but it helped fix the issue.
In the API I used the following (this only works for the RunAndRenderTask (for XLS, PDF), not for the RenderTask (which works only with HTML)):
Map context = BirtUtility.getAppContext( request );
context.put(DataEngine.MEMORY_BUFFER_SIZE, 50);
context.put(DataEngine.MEMORY_DATA_SET_CACHE, new Integer(0));
where the second line makes the engine use the disk cache when possible while writing the reports.
For the BIRT Viewer the workaround is simple (I used it for the PDF and XLS formats, where I was getting the out-of-memory issue and which interest me the most, but I think the same approach could be applied to other formats as well).
The trick is to use the /run servlet instead of /frameset; the main difference is that /run creates a RunAndRenderTask while /frameset creates a RenderTask.
First, modify the __exportAction : function( ) in BirtExportReportDialog.js: after
if( action.search( reg ) < 0 )
{
action = action + "&" + Constants.PARAM_OVERWRITE + "=false";
}
else
{
action = action.replace( reg, "$1=false" );
}
add the code below:
if( format == "pdf" || format == "xls" ){
action = action.replace( "frameset", "run" );
}
Then, as a second step, modify the BIRT source code: in ReportEngineService.java, in both the createRenderTask() and createRunAndRenderTask() methods, add
context.put(DataEngine.MEMORY_BUFFER_SIZE, 50);
context.put(DataEngine.MEMORY_DATA_SET_CACHE, new Integer(0));
after the line Map context = BirtUtility.getAppContext( request );
I hope the above helps someone else get rid of the error as it did for me.
Thanks a lot for the hints as well.
CBR
Hi,
Does the RenderTask cause an OutOfMemory as well? The RenderTask doesn't load any data, so as far as I know the settings you provided do not influence it.
Is there a reason why you deactivated the cache? (If you don't need it, that's fine; I just wanted to know whether it causes any issue.)
Thanks a lot for sharing your solution!
eliac
Hello,
Yes, the RenderTask was causing the problem when the data exceeded a certain limit (in my case 20,000 rows), even when using the HTML view option. When I added context.put(DataEngine.MEMORY_DATA_SET_CACHE, new Integer(0)); to the createRenderTask() method in ReportEngineService.java, the error disappeared and I was able to see the report in HTML view. However, the out-of-memory problem reappeared from within the viewer when exporting to Excel or any other format, so I modified the export screen code and replaced /frameset with /run to redirect the viewer to the modified createRunAndRenderTask() method, and the problem was solved.
Arif shah
(quoting cbrell's earlier reply suggesting task.getAppContext().put(DataEngine.MEMORY_BUFFER_SIZE, 500);)
Hi,
I am also facing the same issue. I am calling the report from the BIRT Viewer on Tomcat and getting an OutOfMemory exception. Can you please tell me where I have to put the above code to increase the memory? Is there another configuration file where I can increase the RAM? Which file do I have to edit to change the Java options for increasing the memory?
Thanks
Arif