Trying to connect SQL Database with Jedox Excel using PALO ETL version 3.2

      Hi,



I would be grateful if you could lend me some help.



Using Palo Community Version 3.2 (the free edition that supports write-back), I'm trying to create a cube in Jedox from a SQL database, but I'm having some difficulties.

I've created all the dimensions before creating the cube. To do that, I ran some queries from the Relational Extract menu.
Then I created another Relational Extract to build the source that I'll load into the cube. As shown in the Jedox ETL Manual on page 59, "the source defines in its rows the data to be written to the OLAP Cube. The source must contain in its first columns the name of the dimension elements of the cube. The last column contains the value which is written in the cube cell."
I've loaded the dimension elements normally.
But when I try to load the cube, I have to complete these fields:
      - Source: no problem
      - Target Connection: no problem
- Target Cube Name: I must fill this in, but I don't know how to do it without the OLAP Manager, which is not included in this version.

      I would really appreciate any help in this matter.



      Lots of thanks.



      :)



(I'm writing from Chile, although I'm Spanish. Please excuse any mistakes.)
      I had this problem earlier.

In the web interface, go to the page where you defined and browsed the dimensions (in my case, on 5.0, it sits under Navigation -> OLAP Manager -> Modeller).

Select your database using the drop-down at the top.

Then, below the right pane, look for the three cube icons. Click on the left icon (it looks like a 3D cube with a little grey square overlaid on it). That will define a new cube. Add the dimensions you defined earlier to this cube (your dimensions should be visible in the left pane).

After you save the cube definition, it should appear in the "Target Cube Name" drop-down when you create the cube Load in ETL.
      Hi,

You don't have to create a cube before loading data into it. ETL Server will create the cube for you during the first load.
Keep in mind: the columns must have the same names as the existing dimensions, plus one (last) column for the values. The cube will then be created with the dimensions you supplied through your extract.
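
For example, for a cube with two hypothetical dimensions Region and Product, the source rows would look like this (names and values are made up):

Source Code

Region | Product | Value
North  | Chairs  | 1250.5
South  | Tables  | 830.0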

      Regards.
      Hi,

      I have this schema:

      Connections
      ...Palo
      ......ConnectionPalo
      ------> Test: OK
      ...SQL Server
      ......ConnectionBBDD
      ------> Test: OK
      Extracts
      ...Relational
      ......Dim_Escenario
      ------> Preview data: OK
      ......Dim_Empresa
      ------> Preview data: OK
      ......Dim_Rubro
      ------> Preview data: OK
      ......Dim_DIV-LN-LP
      ------> Preview data: OK
      ......Ventas_1
      ------> Preview data: OK

      Transforms
      ...TableTransform
      ......Ventas_2
      .........Source: Ventas_1
      .........Mode: Normalization
      .........Normalizer field: Medidas
      .........Target:
      ............Field name........................Input
      ...............Dim_Escenario..................Código Escenario
      ...............Dim_Empresa....................Código Empresa
      ...............Dim_Rubro.........................Código Rubro
      ...............Dim_DIV-LN-LP...................Código DIV-LN-LP
      .........Measure: Venta
      .........Aggregation: sum
      ----------> Preview data: OK
      .............| Dim_Escenario | Dim_Empresa | Dim_Rubro | Dim_DIV-LN-LP | Medidas | #Value |
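(For illustration, a made-up example row of the normalized output; Medidas holds the measure name "Venta" and #Value holds the figure:)
.............| E1 | Emp01 | Rub01 | Div01 | Venta | 1250.5 |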

      Loads
      ...Dimension
      ......Dim_Escenario
      .........Type: Dimension
      .........Source: Dim_Escenario
      .........Target Connection: ConnectionPalo
      .........Target Dimension: EMPTY
      .........Standard mode: update
      ------> Test: OK

      ......Dim_Empresa
      .........Type: Dimension
      .........Source: Dim_Empresa
      .........Target Connection: ConnectionPalo
      .........Target Dimension: EMPTY
      .........Standard mode: update
      ------> Test: OK

      ......Dim_Rubro
      .........Type: Dimension
      .........Source: Dim_Rubro
      .........Target Connection: ConnectionPalo
      .........Target Dimension: EMPTY
      .........Standard mode: update
      ------> Test: OK

      ......Dim_DIV-LN-LP
      .........Type: Dimension
      .........Source: Dim_DIV-LN-LP
      .........Target Connection: ConnectionPalo
      .........Target Dimension: EMPTY
      .........Standard mode: update
      ------> Test: OK

      ......Ventas_2
      .........Type: Dimension
      .........Source: Ventas_2
      .........Target Connection: ConnectionPalo
      .........Target Dimension: EMPTY
      .........Standard mode: update
      ------> Test: OK

      ...Cube
      ......Type: Cube
      ......Source: Ventas_2
      ......Target Connection: ConnectionPalo
      ......Target Dimension: EMPTY
      ......Target Cube Name: EMPTY
      ......Standard mode: update
------> Test: KO ------------> The component "Cubo_Ventas" is tested with errors: java.rmi.RemoteException

      I would really appreciate any help in this matter.



      Lots of thanks.


      It's true!!! Now the test is OK.

      Now my problem is running the job:

      Jobs
      ...Standard
      ......Masterdata
      .........Jobs and Loads to be executed:
      ............Dim_Escenario
      ............Dim_Empresa
      ............Dim_Rubro
      ............Dim_DIV-LN-LP
      ............Ventas_2

      ......CubeData
      .........Jobs and Loads to be executed:
      ............Cubo_Ventas

      ......Default
      .........Jobs and Loads to be executed:
      ............Masterdata
      ............CubeData

      2013-06-13 15:28:25,986 INFO : Using jpalo for communicating with palo olap server.
      2013-06-13 15:28:26,017 INFO : Starting execution of job Masterdata
      2013-06-13 15:28:26,017 INFO : Execution Parameters: #logLimit:100 ; #failOnError:true
      2013-06-13 15:28:26,017 INFO : Starting load DimEmpresa of Dimension DimEmpresa
      2013-06-13 15:28:26,017 INFO : Data retrieval from extract Empresas
      2013-06-13 15:28:26,048 INFO : Lines read from extract Empresas: 3
      2013-06-13 15:28:26,048 INFO : Creating Palo database ETL Jedox BBDD
      2013-06-13 15:28:26,048 ERROR : Cannot load Dimension DimEmpresa: invalid database name [database name contains an illegal character] (Palo Error 2000)

      I would really appreciate any help in this matter.

      Lots of thanks.
      Hm.

      Try separating the dimension loads from the cube loads (put the dimension loads into a separate job.)
Run the dimension loads, then go to the web GUI and look at the OLAP Manager -> Modeller -> ConnectionPalo server.

      Double click on the Dimensions like "Dim_Empresa" and verify they're populated with the elements you created. If not, then we can investigate why the Dimension loading didn't work.

Did your cube get defined by the ETL? (Did it show up on the OLAP Manager page?) If it did show up, right-click on it and see if there are any elements in it.
      I've corrected the last ERROR "Cannot load Dimension DimEmpresa: invalid database name [database name contains an illegal character] (Palo Error 2000)"

I've separated it into three jobs:
      ...Masterdata: with Dimensions
      ...Cubedata: with the Cube
      ...Default: Dimensions + Cube

When I execute the job Masterdata, there is no longer an ERROR, only a warning:
      -------------> Source Ventas2 of load Ventas2 has to be a tree for the loading of consolidations. Dimension will be flat.

      2013-06-13 16:33:19,862 INFO : Using jpalo for communicating with palo olap server.
      2013-06-13 16:33:19,878 INFO : Starting execution of job Masterdata
      2013-06-13 16:33:19,878 INFO : Execution Parameters: #logLimit:100 ; #failOnError:true
      2013-06-13 16:33:19,878 INFO : Starting load DimEmpresa of Dimension DimEmpresa
      2013-06-13 16:33:19,878 INFO : Data retrieval from extract Empresas
      2013-06-13 16:33:19,909 INFO : Lines read from extract Empresas: 3
      2013-06-13 16:33:19,909 INFO : No new elements are loaded
      2013-06-13 16:33:19,924 INFO : New consolidations loaded: 3
      2013-06-13 16:33:19,924 INFO : Finished load DimEmpresa of Dimension DimEmpresa
      2013-06-13 16:33:19,924 INFO : Starting load DimEscenario of Dimension DimEscenario
      2013-06-13 16:33:19,924 INFO : Data retrieval from extract Escenarios
      2013-06-13 16:33:19,924 INFO : Lines read from extract Escenarios: 2
      2013-06-13 16:33:19,924 INFO : No new elements are loaded
      2013-06-13 16:33:19,940 INFO : New consolidations loaded: 2
      2013-06-13 16:33:19,940 INFO : Finished load DimEscenario of Dimension DimEscenario
      2013-06-13 16:33:19,940 INFO : Starting load DimCentroCosto of Dimension DimCentroCosto
      2013-06-13 16:33:19,940 INFO : Data retrieval from extract CentrosCosto
      2013-06-13 16:33:19,940 INFO : Lines read from extract CentrosCosto: 43
      2013-06-13 16:33:19,956 INFO : No new elements are loaded
      2013-06-13 16:33:20,018 INFO : New consolidations loaded: 43
      2013-06-13 16:33:20,018 INFO : Finished load DimCentroCosto of Dimension DimCentroCosto
      2013-06-13 16:33:20,018 INFO : Starting load Ventas2 of Dimension Ventas1
      2013-06-13 16:33:20,018 INFO : Data retrieval from transform Ventas2
      2013-06-13 16:33:20,018 INFO : Data retrieval from extract Ventas1
      2013-06-13 16:33:22,218 INFO : Lines read from extract Ventas1: 11966
      2013-06-13 16:33:22,296 INFO : Lines read from extract Ventas2: 13
      2013-06-13 16:33:22,296 INFO : No new elements are loaded
      2013-06-13 16:33:22,296 INFO : Data retrieval from transform Ventas2
      2013-06-13 16:33:22,342 WARN : Source Ventas2 of load Ventas2 has to be a tree for the loading of consolidations. Dimension will be flat.
      2013-06-13 16:33:22,342 INFO : Data retrieval from transform Ventas2
      2013-06-13 16:33:22,389 INFO : Lines read from extract Ventas2: 13
      2013-06-13 16:33:22,389 INFO : For attribute DimEmpresa, values loaded: 13
      2013-06-13 16:33:22,405 INFO : For attribute DimCentroCosto, values loaded: 13
      2013-06-13 16:33:22,405 INFO : For attribute Medidas, values loaded: 13
      2013-06-13 16:33:22,405 INFO : For attribute #Value, values loaded: 13
      2013-06-13 16:33:22,405 INFO : Finished load Ventas2 of Dimension Ventas1
      2013-06-13 16:33:22,405 INFO : Finished execution of job Masterdata with status: Completed with Warnings

In the Community version I only have four items:
      - File Manager
      - User Manager
      - ETL Manager
      - Connection

I don't have the OLAP Manager :(
Why is it good if there are warnings?? :(
I tried the CubeData job and it fails... :(
I think I must solve the warning "Source Ventas2 of load Ventas2 has to be a tree for the loading of consolidations. Dimension will be flat." before trying the CubeData job again... :(

I'll try all the steps again:

      Build dimensions:
1. Relational extract for each dimension (a plain SQL query; see the sketch below)
2. Build a tree for each dimension (e.g. TreeFH)
      3. Build one dimension for the facts.
      4. Load dimensions
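
To make step 1 concrete, the extract for a dimension is just a SQL query like the following (table and column names are placeholders I made up):

Source Code

-- distinct elements for the company dimension
SELECT DISTINCT CodigoEmpresa AS Dim_Empresa
FROM dbo.Ventas;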

      Build cube:
1. Relational extract for cube -> select the base fields for each dimension and select the facts (see the sketch after this list)
      2. Build TableTransform and take the relational extract for cube (step 1) as input.
      -> mode: normalize
      -> set the dimension for the facts
      -> set up the aggregation for the facts
      3. Load cube
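
And the cube extract (step 1) would be something like this; again, the table and column names are only placeholders:

Source Code

-- one row per fact: dimension codes first, measure last
SELECT
    CodigoEscenario AS Dim_Escenario,
    CodigoEmpresa   AS Dim_Empresa,
    CodigoRubro     AS Dim_Rubro,
    Venta
FROM dbo.Ventas;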

      And I think things will go well... :)
      The warnings imply that the Dimension was created, but that the C nodes (the consolidation nodes) were not created correctly. In most cases this means that the format of the file you're using as the source for the hierarchy definition has problems.

      But the fact that it loaded the Dimension implies that it should be able to construct a cube using your dimensions (meaning that the cube load should work.)

I was going on the assumption that loading the cube was more important than correcting the dimension creation, since you can rerun the ETL later to fix a malformed Dimension.

      Are you getting the same error message as before when you run the CubeData job?
I have run the CubeData job successfully!!

But Masterdata and Default obviously still show the same warning message that I haven't solved yet: "Source Ventas2 of load Ventas2 has to be a tree for the loading of consolidations. Dimension will be flat."

To verify that it works, I'm trying to connect to the cube from Excel. Do you think I will be able to connect from the Modeller? Or, since there are warnings, won't I be able to?

      Thanks a lot!!
Yeah. You should be able to view the cube with the Modeller now.

The warnings were purely about consolidations (e.g. collapsing all the countries in Europe into a single Europe node won't work).

For that problem, check the formatting of the TreeFH example on p. 189 of the admin manual. It's kind of a bad example. Just realize that if you want to create a hierarchy like

      Source Code

World
├── Europe
│   └── France
│       ├── Paris
│       └── Nice
└── Asia
    └── China
        └── Beijing

you need to have *all* levels of the hierarchy specified for each element, e.g.
      World | Europe | France | Paris
      World | Europe | France | Nice
      World | Asia | China | Beijing