ETL - Measure (Date) of Cube 1 as Time Dimension in Cube 2 -- SOLVED

      Dear Community,

      I have a little problem setting up an ETL process for my Palo project.

      What I'm facing is several cubes with various dimensions, each including a time dimension. The measure dimension of my transaction data cubes consists of various values and three date values. Put simply, I would like to take a date from the measure dimension and turn it into the time dimension of another cube. First question: is this possible at all? And if it is, which transform functions have to be chosen? I've tried a lot, but I haven't been able to find a solution yet.

      Summary: Cube 1: Date (Measure) --transform --> Cube 2: Date (Dimension)
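
      To make the goal concrete, here is a minimal sketch of the desired reshaping in pandas terms (the real work happens in the Palo ETL transforms; all column and value names here are hypothetical):

      ```python
      import pandas as pd

      # Hypothetical fact rows from Cube 1: BookingDate is the current
      # time dimension, DueDate is a date-typed measure, Amount a value.
      cube1 = pd.DataFrame({
          "BookingDate": ["2011-01-03", "2011-01-05"],
          "Customer":    ["A", "B"],
          "DueDate":     ["2011-02-01", "2011-03-15"],
          "Amount":      [100.0, 250.0],
      })

      # Desired load for Cube 2: drop the old time dimension and promote
      # the date measure to the time coordinate.
      cube2 = cube1[["DueDate", "Customer", "Amount"]].rename(
          columns={"DueDate": "Date"}
      )
      print(cube2)
      ```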

      Thanks in advance for all suggestions.

      Greets,
      Peter.

      Hm.

      How about just creating a fully populated time dimension? (Your date measures should then be a subset of the time dimension.)

      A few questions about the time dimension: Will you aggregate on it, or will it be a flat dimension? Are the time samples evenly distributed across the measure space?

      If the time data requires minute/second granularity and the time range spans tens of years, then a fully populated time dimension may not work for you. (At day grain, ten years is only about 3,650 elements; at second grain it is over 300 million.)

      If you're dead set on creating/syncing your own dimension from measure data, you can look at the examples for palo_eadd (I saw some examples in the Palo SDK under PHP/sample/ResourceId/palo_eadd.php).
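
      If it helps, here is a rough sketch of that create/sync pattern (in Python rather than PHP; the add_element callback is hypothetical and stands in for the actual API call such as palo_eadd):

      ```python
      def sync_time_elements(measure_dates, existing_elements, add_element):
          """Add every date found in the measure data that is not yet
          an element of the time dimension. `add_element` is a
          hypothetical callback standing in for the real API call."""
          for d in sorted(set(measure_dates) - set(existing_elements)):
              add_element(d)

      # Example: one date from the fact data is already in the dimension.
      existing = {"2011-02-01"}
      dates = ["2011-02-01", "2011-03-15"]
      sync_time_elements(dates, existing, lambda d: print("adding", d))
      # prints: adding 2011-03-15
      ```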

      I don't really recommend that approach, though, and would do everything I could to move to a standard time dimension.

      PS. One more thing you could do is use a regular time dimension and do bucketing to store the data. Remember, if you want to reuse a dimension in a cube, you'll need to clone it, and the ETL can help you do that.
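
      To illustrate the bucketing idea (a minimal sketch, assuming day-level buckets):

      ```python
      from datetime import datetime

      def bucket_to_day(ts: str) -> str:
          """Collapse a full timestamp to its day bucket, so arbitrary
          times map onto a manageable, fully populated day dimension."""
          return datetime.fromisoformat(ts).date().isoformat()

      print(bucket_to_day("2011-03-15T14:23:07"))  # -> 2011-03-15
      ```
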
      Hi Byoshimi,

      First of all, thanks for your reply, and sorry for not responding earlier this week. This topic is still one of my top priorities, and I need to find a solution for it.

      Yes, it's a fully populated time dimension; the grain level is days only. I set up two parallel hierarchies, one for years and months and one for calendar weeks, which are all the aggregations I need; both use the same base elements, which are, as stated above, days. I hope that's enough on that.

      So basically I need to take a date from my measure dimension, which is a different date than the actual transaction date (the time dimension used in the first cube), and use it as the time dimension for the second cube.

      A very close solution I figured out was to use a TableTransform to denormalise my cube. I will post a screenshot later today. The result is that my date measure gets its own column, as does my actual value, which is the only one that needs to be transferred to the second cube. The next step would be to use a FieldTransform to get rid of the actual time dimension and use the date from the measure dimension as the new time dimension for the load. This may sound ridiculously complicated, but it is a very easy and practical way. The only error I get so far is that I receive two rows per transaction: one with the value and one with the date from the measure dimension.
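
      For illustration, the intermediate result of that denormalisation looks roughly like this (a pandas sketch with hypothetical column names): each source cell becomes its own output row, so the value and the date measure of one transaction land on two separate rows.

      ```python
      import pandas as pd

      # Hypothetical normalized rows: one row per cube cell, so each
      # transaction contributes two rows (its value and its date measure).
      rows = pd.DataFrame({
          "BookingDate": ["2011-01-03", "2011-01-03"],
          "Customer":    ["A", "A"],
          "Measure":     ["Amount", "DueDate"],
          "Value":       ["100.0", "2011-02-01"],
      })

      # Pivoting the measure dimension into columns without any grouping
      # keeps the two source rows apart; the other column stays empty.
      denorm = rows.pivot(columns="Measure", values="Value")
      print(pd.concat([rows[["BookingDate", "Customer"]], denorm], axis=1))
      # Row 0: Amount=100.0, DueDate=NaN
      # Row 1: Amount=NaN,   DueDate=2011-02-01
      ```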

      Believe me, as soon as you see my snapshot you'll understand my problem.

      Thanks once again.

      Hello everyone,

      I just solved my problem last night. Thanks, Noel, for clarifying; I was not sure if this was the best solution for it.

      Unfortunately, I had ticked a checkbox which constantly gave me an error message. After unticking it, I just had to pick "first" as the option for the text element (date) and "sum" for my value elements.
      The next step was to create a FieldTransform to finally remove the old date and use the measure (date) as the new time dimension. I also set up a mapping to assign the value to a proper measure dimension element. And there you go: the result was exactly what I needed.
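
      In the same hypothetical pandas terms as the earlier sketches, the aggregation plus FieldTransform step amounts to this: group on the remaining dimensions with "first" for the text (date) column and "sum" for the value column, then swap the date measure in as the time coordinate.

      ```python
      import pandas as pd

      # The two-rows-per-transaction intermediate from the denormalisation.
      denorm = pd.DataFrame({
          "BookingDate": ["2011-01-03", "2011-01-03"],
          "Customer":    ["A", "A"],
          "Amount":      [100.0, None],
          "DueDate":     [None, "2011-02-01"],
      })

      # "first" keeps the (only) non-empty date, "sum" keeps the value.
      merged = denorm.groupby(["BookingDate", "Customer"], as_index=False).agg(
          {"DueDate": "first", "Amount": "sum"}
      )

      # Drop the old time dimension and load the date measure as the new one.
      cube2_load = merged[["DueDate", "Customer", "Amount"]].rename(
          columns={"DueDate": "Date"}
      )
      print(cube2_load)  # Date=2011-02-01, Customer=A, Amount=100.0
      ```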

      Thanks to all who tried to help.

      Greets,
      Peter