Palo vs Excel capacity


  • Palo vs Excel capacity

    I just heard about Palo and am very curious about what it's going to bring us. I'm going to investigate it.

    For now I would like to get a response from other people who have been struggling with large data files in Excel:

    To feed our "on the spot" troubleshooting needs for a production process, we've been looking for a means to analyse large amounts of data. The data was first stored in an Excel sheet but, as you know, was limited there to 65536 rows. I introduced a solution that keeps the data outside of Excel by extracting it from a CSV data file with SQL, for further analysis in an Excel pivot table. This works fine (though very slowly) for data files with fewer than 370k records, but for larger files only Excel 2003 was capable of handling the data (otherwise a "not enough memory" alert would appear). I could go up to 460k records (I have not tested the upper boundary yet).
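    The workaround described above (keep the raw data outside the spreadsheet, query it with SQL, and only feed the summary to a pivot table) can be sketched in Python with the standard-library sqlite3 module. The column names and sample rows below are hypothetical, since the post does not show the file's schema:

```python
import csv
import io
import sqlite3

# Hypothetical sample data standing in for the CSV test-results file;
# the original post does not show its actual schema.
raw = io.StringIO(
    "batch,value\n"
    "A,10\n"
    "A,20\n"
    "B,30\n"
)

# Load the CSV into an SQL table instead of a worksheet.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (batch TEXT, value REAL)")

reader = csv.reader(raw)
next(reader)  # skip the header row
conn.executemany("INSERT INTO results VALUES (?, ?)", reader)

# Aggregate with SQL; only this small summary would go into the pivot table.
summary = conn.execute(
    "SELECT batch, AVG(value) FROM results GROUP BY batch ORDER BY batch"
).fetchall()
print(summary)  # [('A', 15.0), ('B', 30.0)]
```

    This way the row count is limited by the database engine rather than by the worksheet's 65536-row ceiling.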

    Please let me know what your experiences have been and how Palo has answered your needs...

    11:30 am - I just tried to load the 460k CSV data file into Palo; after 20 minutes I received the message that I was "out of virtual memory". (Virtual memory is set at over 1024 MB.) On this system my previous solution works... Any comments?



  • RE: Palo vs Excel capacity

    >>...just tried to load the 460k csv datafile in Palo - after 20 minutes received message I was "out of virtual memory". (Vir mem is set at over 1024MB.)

    Hi BartH,
    I have been working with and testing PALO from the very beginning, and after the normal labour pains I easily managed to import CSV files with 18,000+ records (file size 2.2 MB); it takes approx. 10 min to properly import those records.
    From pre-version 1b onwards, PALO was stable enough for some serious testing.
    I am running it in parallel with my existing OLAP software, and up till now the results still reconcile. The resulting cube is approx. 12 MB, containing close to 37,000 records.

  • RE: Palo vs Excel capacity

    Hello Mitchener,

    Happy to read that you're OK with 37,000+ records.

    I'm still out of luck with Palo and my 460,000+ records (unlike when I use Excel 2003 and import into a pivot table), and this is not even the maximum number of records I'm expecting to get. These 460k records sit in a 111 MB CSV file and cover 4 months of test results, so if I want to analyse a whole year, I'll get well over 1.5 million records. Could a quicker processor or more virtual memory help? My RAM is 785 MB...
    We provide a problem for every solution!


  • RE: Palo vs Excel capacity

    Hi Bart,

    thanks a lot for sending the 111 MB flat file (837,828 records) with which you were experiencing problems. We did some thorough analysis here and found that the problem is basically related to RAM size. In the Import Wizard, after the flat file is selected, it is read entirely into memory, which is where a problem may arise. On a system with 1 GB RAM there is no problem reading your file; with 512 MB the process gets stuck, depending of course on which other processes are running.
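    The difference described above (reading the whole flat file at once versus streaming it) can be illustrated with a small Python sketch; the in-memory sample below is hypothetical and merely stands in for the 111 MB file on disk:

```python
import io

# Hypothetical stand-in for the large flat file on disk.
src = io.StringIO("batch,value\n" + "A,1\n" * 5)

# Reading the whole file at once (as the Import Wizard reportedly does)
# needs memory proportional to the file size:
whole = src.read()
lines_in_memory = len(whole.splitlines())
print(lines_in_memory)  # 6 lines held in memory at once

# Streaming line by line keeps memory use roughly constant,
# no matter how large the file is:
src.seek(0)
count = sum(1 for _ in src)
print(count)  # 6
```

    With whole-file reading, an 111 MB file needs at least 111 MB of free memory just for the raw text, before any parsing overhead, which matches the 512 MB machine getting stuck.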

    We recommend splitting the import file into several smaller ones; on the 512 MB machine, for example, I had no problem importing data from a 17 MB flat file (275,000 records).
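    The recommended split can be scripted so that each smaller file keeps the header row and stays under a chosen row count. A minimal sketch (the demo data is hypothetical; a real run would read the flat file from disk and write each chunk out as its own CSV):

```python
import csv
import io

def split_csv(src, rows_per_chunk):
    """Split a CSV stream into chunks, repeating the header in each chunk."""
    reader = csv.reader(src)
    header = next(reader)
    chunks, current = [], []
    for row in reader:
        current.append(row)
        if len(current) == rows_per_chunk:
            chunks.append([header] + current)
            current = []
    if current:  # flush the final, possibly shorter chunk
        chunks.append([header] + current)
    return chunks

# Hypothetical demo input with 5 data rows.
demo = io.StringIO("batch,value\nA,1\nA,2\nB,3\nB,4\nC,5\n")
parts = split_csv(demo, rows_per_chunk=2)
print(len(parts))  # 3 chunks of at most 2 data rows each
```

    For the 111 MB file, a chunk size of around 250,000 rows would stay within what the 512 MB machine handled above.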

    Apart from that, we will rework the import routine to avoid such memory issues.

    Jedox Quality Assurance
