BIRT Cubes scaling up for huge data
<p>Hi,</p>
<p>I have a BIRT cube that takes data from a huge data set (around 18 million records), which is filtered down to about 182,500 records that are aggregated in the cube and linked to a chart.<br>
The report hangs during processing.<br>
How much data can a cube process? Are there any limits? How does performance scale with very large data sets?</p>
Comments
<p>I understand that the engineering team tests data cubes with 1 million records, as explained in a similar post at <a data-ipb='nomediaparse' href='https://www.eclipse.org/forums/index.php/t/207942/'>https://www.eclipse.org/forums/index.php/t/207942/</a>. If you have a portable example design, you can post it here. If you are seeing problems in the designer, you can try increasing the heap memory available to the designer, or do the same for the JVM if you are using the BIRT runtime.</p>
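<p>As a sketch of the heap-increase suggestion above (the 2 GB value is just an example; pick a size that fits your machine): for the designer, the maximum heap is set in the <code>eclipse.ini</code> file next to the Eclipse executable, after the <code>-vmargs</code> line; for the standalone BIRT runtime, the same standard JVM flag is passed on the command line of whatever Java process hosts the report engine.</p>
<pre>
# eclipse.ini (BIRT designer) -- anything after -vmargs goes to the JVM
-vmargs
-Xms512m
-Xmx2048m

# BIRT runtime embedded in your own Java application (class name is
# illustrative -- substitute your actual main class and classpath)
java -Xmx2048m -cp birt-runtime/lib/* com.example.RunReport
</pre>
<p>Note that <code>-Xmx</code> only raises the ceiling; if the cube still exhausts 2 GB, reducing the number of records fed into the cube (filtering at the data-source level rather than in BIRT) is usually more effective than adding more heap.</p>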
<p>For anyone using the commercial iHub products, it is recommended to use a BIRT Data Object instead of a data cube, as explained here: <a data-ipb='nomediaparse' href='http://blogs.opentext.com/birt-data-objects-more-than-meets-the-eye/'>http://blogs.opentext.com/birt-data-objects-more-than-meets-the-eye/</a></p>