OLAP session not found

      Hi All,

      I want to build a cube. After creating the dimensions I started loading the data, and after a few minutes I get a pop-up window (see attachment).

      I don't understand this, because yesterday there was no problem creating the cube, and nothing has changed on the server since then.

      And when I click the message away, I get a window saying that the Tomcat service is down, but when I look on the server I see that the service is running.

      When looking in the execution log, the last message is:

      FATAL : Abnormally terminated execution of job XXXXXX: Java heap space

      Does someone know the answer?

      I am using:


      Palo ETL Server Version: 5.0.2.3687
      Palo ETL Web Client Version: 5.0.0.794



      Thx in advance,

      Johan
      Images
      • Session.jpg (340×191)


      Connection problems

      Thx for your reply.

      It didn't solve the problem. I did a union on 4 CSV files and got a total of more than 1.500.000 records.

      I just split them and ran them as single CSVs. The first three were no problem, but the last CSV file, which has around 595.000 records, fails to load.

      I join this file with 2 other files, one with 450.000 records and the other with 54.000 records.

      After a few minutes I get a message that the Tomcat service is down, but it isn't, and again I get the pop-up shown in the attached file.

      I ran the failed file with only 499.998 records in it and voilà, it runs with no problem.

      Then I ran the same file but now with 520.000 records in it and it fails.
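
      (In case it helps anyone with the same problem: the splitting can be scripted roughly like this. Just a quick sketch in Python; the file name, the ";" separator and the chunk size are only examples.)

      import csv

      CHUNK_ROWS = 499998                      # keep each part below the size that fails
      SOURCE = "E_KPI_xxx2013.csv"             # example file name

      with open(SOURCE, newline="", encoding="utf-8") as src:
          reader = csv.reader(src, delimiter=";")
          header = next(reader)                # repeat the header line in every part
          part, out, writer = 0, None, None
          for i, row in enumerate(reader):
              if i % CHUNK_ROWS == 0:
                  if out:
                      out.close()
                  part += 1
                  out = open(f"E_KPI_xxx2013_part{part}.csv", "w", newline="", encoding="utf-8")
                  writer = csv.writer(out, delimiter=";")
                  writer.writerow(header)
              writer.writerow(row)
          if out:
              out.close()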



      Then I tried the following:

      2014-08-18 10:56:13,244 INFO : Data retrieval from extract E_KPI_fff2013
      2014-08-18 10:56:13,259 INFO : Data retrieval from extract E_KPI_aaa2013
      2014-08-18 10:56:14,150 INFO : Lines read from file E_KPI_fff2013: 53682
      2014-08-18 10:56:14,166 INFO : Data retrieval from extract E_KPI_xxx2013
      2014-08-18 10:56:29,869 INFO : Lines read from file E_KPI_aaa2013: 450365
      2014-08-18 10:59:23,071 INFO : Lines read from file E_KPI_xxx2013: 597695
      2014-08-18 11:02:26,258 ERROR : Cannot import Data into Cube xxx_fff_aaa: Out of memory.; SQL statement:



      What I did is:
      aaa LEFTOUTERJOIN fff into mmm
      xxx LEFTOUTERJOIN mmm into final input for cube
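
      In pandas terms those two joins are roughly the following (just an illustration of why the whole result has to fit in memory at once; the join key "KEY" is made up):

      import pandas as pd

      # the three extracts from the log above
      fff = pd.read_csv("E_KPI_fff2013.csv", sep=";")   #  53.682 rows
      aaa = pd.read_csv("E_KPI_aaa2013.csv", sep=";")   # 450.365 rows
      xxx = pd.read_csv("E_KPI_xxx2013.csv", sep=";")   # 597.695 rows

      mmm = aaa.merge(fff, how="left", on="KEY")        # aaa LEFTOUTERJOIN fff
      final = xxx.merge(mmm, how="left", on="KEY")      # xxx LEFTOUTERJOIN mmm -> input for the cube

      # "final" has at least 597.695 rows (more if KEY is not unique in mmm) and the columns
      # of all three files, and it is built in one piece - hence the memory pressure.
      print(len(final), "rows x", len(final.columns), "columns")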



      SETENV.BAT
      set Min_Memory=256
      set Max_Memory=4096 (was 512)
      set PermGen_Max_Memory=1024 (was 512)
      set JVM_Optional_Parameters=-XX:MinHeapFreeRatio=10;-XX:MaxHeapFreeRatio=20;
      set Log_Directory=F:\JEDOX\Jedox Suite\log\tomcat
      set Java_Directory=C:\Program Files\Java\jre7
      set Library_Path=



      Is there a limit on how many rows a CSV file can have? If so, that really sucks.

      Thx in advance.


      Hi greatmaze,

      From my point of view CSV is not well suited to your use case. 1.5M rows is nothing for an RDBMS, but it is a lot for CSV files, especially if you have many columns in them.

      Don't you have the possibility to import the files into a MySQL database, for instance? This is quite easy, and you could adapt your queries accordingly.
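
      Something along these lines could do it (only a rough sketch, assuming Python with pandas, SQLAlchemy and a MySQL driver is available; the connection string, table name and separator are just examples to adapt):

      import glob

      import pandas as pd
      from sqlalchemy import create_engine

      # example connection: user "etl", database "kpi" on the local MySQL server
      engine = create_engine("mysql+pymysql://etl:secret@localhost/kpi")

      # load every yearly CSV into one table, in chunks so memory stays low
      for path in glob.glob("E_KPI_*.csv"):
          for chunk in pd.read_csv(path, sep=";", chunksize=100000):
              chunk.to_sql("kpi_facts", engine, if_exists="append", index=False)

      Afterwards your extracts can read from that table (and filter per year) instead of from the CSV files.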
      laloune

      Post hoc, non est propter hoc

      OLAP session not found

      Hi laloune,

      Thx for your reply.

      Yes, I know working with CSV files is not the best solution, but I don't really have the option to use a database here.

      Or is it also possible to use an Access database?

      But the problem is that you then need an extra step: load the new monthly file into the database and then add it to the cube.

      OLAP session not found

      I managed to get it to build the cube without errors, but now something strange has happened.

      As you can see below, for 2013 it says lines read from extract Temporary instead of from the AAAA file, as in the other years.

      Where does this come from? I deleted everything related to 2013 and rebuilt it, but I still get the Temporary. I have never made a Temporary file.



      2014-08-19 14:20:50,694 INFO : Lines read from file E_KPI_AAAA2008: 38151
      2014-08-19 14:20:51,163 INFO : Lines read from file E_KPI_BBBB2008: 35184
      2014-08-19 14:20:55,038 INFO : Lines read from file E_KPI_CCCC2008: 12772

      2014-08-19 14:21:18,881 INFO : Lines read from file E_KPI_AAAA2009: 142120
      2014-08-19 14:21:20,975 INFO : Lines read from file E_KPI_BBBB2009: 124317
      2014-08-19 14:21:30,741 INFO : Lines read from file E_KPI_CCCC2009: 26760

      2014-08-19 14:22:15,756 INFO : Lines read from file E_KPI_AAAA2010: 263138
      2014-08-19 14:22:25,178 INFO : Lines read from file E_KPI_BBBB2010: 219317
      2014-08-19 14:22:37,834 INFO : Lines read from file E_KPI_CCCC2010: 38797

      2014-08-19 14:23:46,475 INFO : Lines read from file E_KPI_AAAA2011: 394738
      2014-08-19 14:24:01,897 INFO : Lines read from file E_KPI_BBBB2011: 321448
      2014-08-19 14:24:37,116 INFO : Lines read from file E_KPI_CCCC2011: 63552

      2014-08-19 14:26:32,928 INFO : Lines read from file E_KPI_AAAA2012: 566946
      2014-08-19 14:28:03,897 INFO : Lines read from file E_KPI_BBBB2012: 439832
      2014-08-19 14:47:29,944 INFO : Lines read from file E_KPI_CCCC2012: 71487

      2014-08-19 14:50:08,022 INFO : Lines read from extract Temporary: 597695
      2014-08-19 14:50:16,084 INFO : Lines read from file E_KPI_BBBB2013: 450365
      2014-08-19 14:50:43,538 INFO : Lines read from file E_KPI_CCCC2013: 53682

      2014-08-19 14:52:25,381 INFO : Lines read from file E_KPI_AAAA2014: 411673
      2014-08-19 14:52:42,288 INFO : Lines read from file E_KPI_BBBB2014: 358789
      2014-08-19 14:53:06,616 INFO : Lines read from file E_KPI_CCCC2014: 39616

      I guess that the multiple joins create a temporary "file" on the fly; that is why it shows up in the log.

      Yes, I guess it should be possible to use an Access DB, but you would probably have to execute a VBScript to load the CSV into the database.

      But you could even use Talend to do that (get the CSV data, load it into your database, and even execute your Jedox ETL job): all 3 steps done in 1 single process!
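
      Even without Talend you could chain the steps in one small script, for example (a rough sketch in Python; the loader is the same idea as above, and the command that starts the ETL job is only a placeholder for whatever you normally use, e.g. the ETL command-line client):

      import subprocess
      import sys

      import pandas as pd
      from sqlalchemy import create_engine

      def load_csv_into_mysql(path):
          # same idea as the loader sketched above; connection and table are examples
          engine = create_engine("mysql+pymysql://etl:secret@localhost/kpi")
          for chunk in pd.read_csv(path, sep=";", chunksize=100000):
              chunk.to_sql("kpi_facts", engine, if_exists="append", index=False)

      if __name__ == "__main__":
          load_csv_into_mysql(sys.argv[1])                     # 1) + 2) get the new monthly CSV and load it into the DB
          subprocess.run(["start_my_etl_job.bat", "MyProject", "MyJob"], check=True)   # 3) placeholder for starting the Jedox ETL job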

      hope this helps,
      laloune

      Post hoc, non est propter hoc