Report Failing for Large Data
kapadiatap
We have our own plugin to create the data source, and we are executing the BIRT reports from WebSphere on a Linux server.
We are using BIRT version 2.5.
We get the following exception for large data sets:
[1/14/12 1:19:50:376 EST] 00000021 prefs W Could not lock User prefs. Unix error code 24.
[1/14/12 1:19:50:376 EST] 00000021 prefs W Couldn't flush user prefs: java.util.prefs.BackingStoreException: Couldn't get file lock.
[1/14/12 1:20:03:332 EST] 0000001e ReportEngine E An error happened while running the report. Cause:
java.lang.NullPointerException
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:76)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:89)
at org.eclipse.birt.core.util.IOUtil.writeInt(IOUtil.java:217)
at org.eclipse.birt.data.engine.executor.cache.ResultObjectUtil.writeData(ResultObjectUtil.java:341)
at org.eclipse.birt.data.engine.executor.cache.ResultObjectUtil.writeData(ResultObjectUtil.java:249)
at org.eclipse.birt.data.engine.executor.cache.disk.DataFileWriter.write(DataFileWriter.java:105)
at org.eclipse.birt.data.engine.executor.cache.disk.RowFile.writeRowsToFile(RowFile.java:140)
at org.eclipse.birt.data.engine.executor.cache.disk.RowFile.writeRows(RowFile.java:106)
at org.eclipse.birt.data.engine.executor.cache.disk.DiskDirectExport.outputResultObjects(DiskDirectExport.java:86)
at org.eclipse.birt.data.engine.executor.cache.disk.DiskDataExport.innerExportRestData(DiskDataExport.java:142)
at org.eclipse.birt.data.engine.executor.cache.disk.DiskDirectExport.exportRestDataToDisk(DiskDirectExport.java:58)
at org.eclipse.birt.data.engine.executor.cache.disk.DiskCacheResultSet.processRestResultObjects(DiskCacheResultSet.java:81)
at org.eclipse.birt.data.engine.executor.cache.disk.DiskCache.<init>(DiskCache.java:111)
at org.eclipse.birt.data.engine.executor.cache.SmartCacheHelper.populateData(SmartCacheHelper.java:351)
at org.eclipse.birt.data.engine.executor.cache.SmartCacheHelper.initInstance(SmartCacheHelper.java:283)
at org.eclipse.birt.data.engine.executor.cache.SmartCacheHelper.getResultSetCache(SmartCacheHelper.java:244)
at org.eclipse.birt.data.engine.executor.cache.SmartCache.<init>(SmartCache.java:69)
at org.eclipse.birt.data.engine.executor.transform.pass.PassUtil.populateOdiResultSet(PassUtil.java:132)
at org.eclipse.birt.data.engine.executor.transform.pass.PassUtil.pass(PassUtil.java:62)
at org.eclipse.birt.data.engine.executor.transform.pass.PassManager.doSinglePass(PassManager.java:183)
at org.eclipse.birt.data.engine.executor.transform.pass.PassManager.pass(PassManager.java:93)
at org.eclipse.birt.data.engine.executor.transform.pass.PassManager.populateResultSet(PassManager.java:74)
at org.eclipse.birt.data.engine.executor.transform.ResultSetPopulator.populateResultSet(ResultSetPopulator.java:196)
at org.eclipse.birt.data.engine.executor.transform.CachedResultSet.<init>(CachedResultSet.java:135)
at org.eclipse.birt.data.engine.executor.DataSourceQuery.execute(DataSourceQuery.java:855)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery$OdaDSQueryExecutor.executeOdiQuery(PreparedOdaDSQuery.java:399)
at org.eclipse.birt.data.engine.impl.QueryExecutor.execute(QueryExecutor.java:1045)
at org.eclipse.birt.data.engine.impl.ServiceForQueryResults.executeQuery(ServiceForQueryResults.java:232)
at org.eclipse.birt.data.engine.impl.QueryResults.getResultIterator(QueryResults.java:158)
at org.eclipse.birt.report.engine.data.dte.QueryResultSet.<init>(QueryResultSet.java:98)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:168)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:265)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1875)
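For context, Unix error code 24 is EMFILE ("too many open files"), and the NullPointerException is thrown while BIRT's disk cache is writing result rows to temporary files, so exhausted file descriptors or temp space on the Linux server are plausible suspects. A minimal diagnostic sketch using standard JDK APIs (the class name is illustrative, and the open-descriptor counters are only exposed on Unix-style JVMs):

```java
import java.io.File;
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

public class DiskCacheDiagnostics {
    public static void main(String[] args) {
        // BIRT's disk cache writes to java.io.tmpdir by default
        File tmp = new File(System.getProperty("java.io.tmpdir"));
        System.out.printf("tmpdir: %s, usable space: %d MB%n",
                tmp, tmp.getUsableSpace() / (1024 * 1024));

        // On Unix-style JVMs the open file-descriptor count is exposed
        // via the com.sun.management extension of OperatingSystemMXBean
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof com.sun.management.UnixOperatingSystemMXBean) {
            com.sun.management.UnixOperatingSystemMXBean unix =
                    (com.sun.management.UnixOperatingSystemMXBean) os;
            System.out.printf("open FDs: %d of max %d%n",
                    unix.getOpenFileDescriptorCount(),
                    unix.getMaxFileDescriptorCount());
        }
    }
}
```

Running this inside the WebSphere JVM (or comparing `ulimit -n` between the two machines) would show whether the descriptor limit is being hit when the large result set spills to disk.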
Reports without any dynamic computed columns execute without exceptions from my local Windows machine with the same 1.4 million records, whereas they fail on the Linux server.
Also note that when I run these reports from my local machine, I am using the same data source as on the Linux server.
I also see warning messages from BIRT:
WARNING: handle type: org.eclipse.birt.report.model.api.OdaDataSetHandle
Jan 14, 2012 1:40:23 PM null null
Can you please help?
~T~
Comments
Yaytay
What is the heap usage like on the Linux and Windows boxes?
BIRT has a habit of trying to load entire recordsets into memory, and 1.4M rows is likely to use a fair chunk of heap.
I haven't ever seen such ugly errors as a result of this though.
Jim
kapadiatap
Yaytay wrote (15 January 2012 - 11:07 PM):
> What is the heap usage like on the Linux and Windows boxes?
> BIRT has a habit of trying to load entire recordsets into memory, and 1.4M rows is likely to use a fair chunk of heap.
> I haven't ever seen such ugly errors as a result of this though.
> Jim

Thanks for your reply.
On the Linux box the reports run on WebSphere with 2 GB of memory.
On Windows they run with 1 GB of memory in Eclipse.
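As an aside, the heap ceiling in both environments is set with the standard -Xmx JVM flag (in WebSphere under the server's Generic JVM arguments, in Eclipse under the launch configuration's VM arguments). The sizes below are illustrative, not a recommendation:

```
-Xms512m -Xmx2048m
```

A process given 2 GB of RAM may still be running with a much smaller heap unless -Xmx says otherwise, which is worth verifying before concluding the Linux box really has more headroom.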
Yaytay
kapadiatap wrote (16 January 2012 - 07:59 AM):
> Thanks for your reply.
> On the Linux box the reports are running on WebSphere with 2 GB memory.
> On Windows it's running with 1 GB memory in Eclipse.

But how much is actually being used?

Use jconsole (provided with the JDK) or some other JMX tool to examine the actual memory usage within the JVM.
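If attaching jconsole to the WebSphere process is inconvenient, the same heap figures can be logged from inside the JVM with the standard java.lang.management API (a minimal sketch; the class name is illustrative):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapReport {
    public static void main(String[] args) {
        // The same used/committed/max numbers jconsole displays,
        // printed from inside the running JVM
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.printf("heap used: %d MB, committed: %d MB, max: %d MB%n",
                heap.getUsed() / (1024 * 1024),
                heap.getCommitted() / (1024 * 1024),
                heap.getMax() / (1024 * 1024));
    }
}
```

Logging this just before the report runs would show how close the 2 GB ceiling actually is when the 1.4 million rows start to spill to the disk cache.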