There is an error in loading files of data set cache
remo
Hi
I designed a BIRT report (v2.5.2) containing two data sources (one scripted and one JDBC) and four data sets. When deployed to my application in the development environment, the report runs without any problems.
However, when I deploy the same *.rptdesign file to the production system, I get the following error:
org.eclipse.birt.report.engine.api.EngineException: There is an error in loading files of data set cache
Malformed input around byte 35
I've only heard about design-time data set caches, but not about runtime data set caches. From the error message I assume that this cache is corrupt. Where are these caches located? How can I flush/delete them?
Thanks for any help.
Remo
Comments
JasonW
Remo,
Is your scripted datasource reading a file? BTW you can "clean" the runtime by looking for the configuration directory under /WEB-INF/Platform/ and deleting everything in it except the config.ini. Make sure the app server is not running the app when you do this.
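If you want to script the cleanup, something along these lines should work (an untested sketch; the path is just an example, adjust it to your deployment):

import java.io.File;

// Sketch: wipe the BIRT runtime configuration area, keeping only config.ini.
// Run this while the app server is stopped.
public class CleanBirtConfiguration {
    public static void main(String[] args) {
        // Example path; point this at your web app's configuration directory.
        File configDir = new File("/path/to/webapp/WEB-INF/platform/configuration");
        File[] entries = configDir.listFiles();
        if (entries == null) {
            throw new IllegalStateException("Not a directory: " + configDir);
        }
        for (File entry : entries) {
            if (!"config.ini".equals(entry.getName())) {
                deleteRecursively(entry);
            }
        }
    }

    // Recursively delete a file or directory tree.
    private static void deleteRecursively(File f) {
        File[] children = f.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        f.delete();
    }
}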
Jason
remo
Jason,
I was unable to find a configuration directory under /WEB-INF/Platform. However, there is a configuration directory under ReportEngine, which contains the config.ini and three other directories.
I assumed that's the one and deleted everything but the config.ini. Unfortunately, it's still the same error.
And no, my scripted data source is not reading any files.
Do you have any further ideas?
Thanks,
Remo
JasonW
Remo,
Are you running the same versions of the runtime in production?
Jason
remo
Jason,
It's the exact same version.
I have other serious issues with BIRT right now (see http://www.birt-exchange.org/org/forum/index.php/topic/20333-outofmemoryerror-without-useoldaliasmetadatabehaviortrue/). Could there be a connection between these two cases?
Remo
JasonW
Remo,
I doubt this is an issue, but are you setting the max perm size on the runtime?
-XX:MaxPermSize=256m
Jason
remo
Jason,
Is this a suggestion to solve this issue here (malformed byte...) or the other one (OutOfMemoryError...)?
No, so far I haven't increased the max perm size. Do I have to add it to the Tomcat VM parameters?
Remo
JasonW
This usually fixes the out of memory error. I am wondering if the two issues are related. I set mine in the catalina.bat.
set JAVA_OPTS=%JAVA_OPTS% -XX:MaxPermSize=256m -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Djava.util.logging.config.file="%CATALINA_BASE%\conf\logging.properties"
Jason
remo
Unfortunately, increasing the perm gen space solves neither problem...
Remo
JasonW
Can you upload the report?
Jason
remo
Jason,
The report is attached to this post. Before posting, I ran the report again, once in my development environment and once in the production environment. Here are the results:
Development environment:
Exception in thread "Thread-65" java.lang.OutOfMemoryError: Java heap space
at org.eclipse.birt.core.util.IOUtil.convertBytes2String(IOUtil.java:1233)
at org.eclipse.birt.core.util.IOUtil.readUTF(IOUtil.java:1127)
at org.eclipse.birt.core.util.IOUtil.readString(IOUtil.java:795)
at org.eclipse.birt.data.engine.executor.cache.ResultObjectUtil.readData(ResultObjectUtil.java:185)
at org.eclipse.birt.data.engine.executor.dscache.CacheUtilFactory$DiskLoadUtil.loadObject(CacheUtilFactory.java:551)
at org.eclipse.birt.data.engine.executor.dscache.DataSetFromCache.loadObject(DataSetFromCache.java:76)
at org.eclipse.birt.data.engine.executor.dscache.DataSetFromCache.fetch(DataSetFromCache.java:58)
at org.eclipse.birt.data.engine.executor.cache.OdiAdapter.fetch(OdiAdapter.java:161)
at org.eclipse.birt.data.engine.executor.cache.RowResultSet.next(RowResultSet.java:105)
at org.eclipse.birt.data.engine.executor.cache.ExpandableRowResultSet.next(ExpandableRowResultSet.java:63)
at org.eclipse.birt.data.engine.executor.cache.SmartCacheHelper.populateData(SmartCacheHelper.java:311)
at org.eclipse.birt.data.engine.executor.cache.SmartCacheHelper.initInstance(SmartCacheHelper.java:283)
at org.eclipse.birt.data.engine.executor.cache.SmartCacheHelper.getResultSetCache(SmartCacheHelper.java:244)
at org.eclipse.birt.data.engine.executor.cache.SmartCache.<init>(SmartCache.java:69)
at org.eclipse.birt.data.engine.executor.transform.pass.PassUtil.populateOdiResultSet(PassUtil.java:142)
at org.eclipse.birt.data.engine.executor.transform.pass.PassUtil.pass(PassUtil.java:62)
at org.eclipse.birt.data.engine.executor.transform.pass.PassManager.doSinglePass(PassManager.java:183)
at org.eclipse.birt.data.engine.executor.transform.pass.PassManager.pass(PassManager.java:93)
at org.eclipse.birt.data.engine.executor.transform.pass.PassManager.populateResultSet(PassManager.java:74)
at org.eclipse.birt.data.engine.executor.transform.ResultSetPopulator.populateResultSet(ResultSetPopulator.java:196)
at org.eclipse.birt.data.engine.executor.transform.CachedResultSet.<init>(CachedResultSet.java:152)
at org.eclipse.birt.data.engine.executor.dscache.DataSourceQuery.execute(DataSourceQuery.java:143)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery$OdaDSQueryExecutor.executeOdiQuery(PreparedOdaDSQuery.java:399)
at org.eclipse.birt.data.engine.impl.QueryExecutor.execute(QueryExecutor.java:1045)
at org.eclipse.birt.data.engine.impl.ServiceForQueryResults.executeQuery(ServiceForQueryResults.java:232)
at org.eclipse.birt.data.engine.impl.QueryResults.getResultIterator(QueryResults.java:158)
at org.eclipse.birt.report.engine.data.dte.QueryResultSet.<init>(QueryResultSet.java:98)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:168)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:265)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1875)
at org.eclipse.birt.report.engine.executor.QueryItemExecutor.executeQuery(QueryItemExecutor.java:80)
at org.eclipse.birt.report.engine.executor.DataItemExecutor.execute(DataItemExecutor.java:75)
Production environment:
org.eclipse.birt.report.engine.api.EngineException: There is an error in loading files of data set cache
Malformed input around byte 35
I have no idea why it no longer runs in the development environment but now also ends in an OutOfMemoryError. It seems more likely now that there's a connection to my other post (http://www.birt-exchange.org/org/forum/index.php/topic/20333-outofmemoryerror-without-useoldaliasmetadatabehaviortrue).
The test data I use is a very small fixture database with only a few records. It's highly unlikely that the amount of data causes an OutOfMemoryError.
The Tomcat in my development environment runs with: -Xmx1024M -XX:MaxPermSize=256m
The Tomcat in the production environment runs with: -Xmx4096m -XX:MaxPermSize=256m
Thanks for having a look at the report!
Remo
remo
Jason,
After looking at the above stack trace, I stepped into the code and realized the following:
The byte[] ret in IOUtil:1125 has a length of 822 MB! In convertBytes2String() at line 1233, BIRT tries to allocate a second array of that size, which of course fails with the available memory in the development environment, but not in production. This explains the difference between the two.
Maybe you know what's going on there, i.e. why it is trying to allocate that much memory...
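To illustrate what I think is going on (just a sketch of the general length-prefixed read pattern, not the actual BIRT code): the cache file stores a length before the string bytes, so a corrupted prefix makes the reader try to allocate whatever size it claims. On my small development heap the allocation itself fails with an OutOfMemoryError; on the 4 GB production heap it apparently succeeds, and decoding the garbage then fails with the "malformed input" error.

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Sketch of a length-prefixed string read; a corrupt prefix causes a huge allocation.
public class LengthPrefixedRead {
    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(new FileInputStream("dataset.cache"));
        try {
            int length = in.readInt();       // a corrupted prefix can claim ~822 MB here
            byte[] bytes = new byte[length]; // allocation fails outright on a 1 GB heap
            in.readFully(bytes);
            // converting the bytes to a String needs a second buffer of similar size,
            // which is where convertBytes2String() blows up for me
            String value = new String(bytes, "UTF-8");
        } finally {
            in.close();
        }
    }
}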
Thanks,
Remo
JasonW
Remo,
You may need to open a bug for this, but can you try adding the following to a beforeFactory script:
reportContext.getAppContext().put("org.eclipse.birt.data.cache.memory","0");
reportContext.getAppContext().put("org.eclipse.birt.data.cache.RowLimit", "0");
You may also want to move the js functions that you have in the dynamic text to the beforeFactory script as well.
Jason
remo
Jason,
I opened bug #323563 for this issue. Adding
reportContext.getAppContext().put("org.eclipse.birt.data.cache.memory","0");
reportContext.getAppContext().put("org.eclipse.birt.data.cache.RowLimit", "0");
to the beforeFactory script didn't help.
Remo
JasonW
Thanks for opening. BTW did you move the scripts from the dynamic text element?
It would be nice if you could keep simplifying the report in order to find out what is happening.
Jason