How to lookup a "range value"


    • How to lookup a "range value"


      I have a source table with thousands of products. Every product has a unique ID. In another table the products are grouped; every group is represented by a range of product IDs. Now I want to join these tables like in this SQL statement:


      SELECT source.ProductID, productgroup.GroupID, ...
      FROM source, productgroup
      WHERE source.ProductID >= productgroup.StartProductID
        AND source.ProductID <= productgroup.EndProductID

      Unfortunately, the Join transform and the LookUp function can only match equal values in the tables involved, not ranges, and I see no way to use more than one table in a SELECT clause at the extract step of the ETL process. So I tried to write a Groovy function to solve my problem. It works in principle, but the performance is so horrible that nobody can really use it: it runs for more than an hour! (The source table has thousands of products, and every product needs a loop through the group table.)

      For those interested, here is the code responsible for the horrible runtime:

      Source Code

      PF = ProductID_Format;
      xGroupID = "UNDEF";
      processor = API.getProcessor("TR_Product_Basis_Field");
      // NOTE: the right-hand side of this assignment was lost in the forum post;
      // presumably a processor call that fetches the next row.
      def row = processor.getNextRow();
      while (row != null) {
          // linear scan over the whole group table for every single product
          if (PF >= row.getColumn("StartProductID").getValueAsString() && PF <= row.getColumn("EndProductID").getValueAsString()) {
              xGroupID = row.getColumn("GroupID").getValueAsString();
              break;
          }
          row = processor.getNextRow(); // this call was also lost in the forum post
      }
      processor.setFirstRow(0);
      return xGroupID;

      Does anyone have an idea how to solve my problem in a performant way? (By the way: I have no way to change the tables before starting the ETL process.)
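      The linear scan over the group table for every product is what makes this quadratic. A common fix is to load the group ranges once, sort them by start ID, and binary-search per product. This is not the Palo ETL API; it is a minimal sketch in Java with illustrative names, assuming non-overlapping ranges and product IDs whose string ordering matches their logical ordering (as the Groovy code above already assumes):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class RangeLookup {
    // One product-group range: [startId, endId] mapped to a group ID.
    record Group(String startId, String endId, String groupId) {}

    private final List<Group> groups; // kept sorted by startId

    RangeLookup(List<Group> groups) {
        this.groups = new ArrayList<>(groups);
        this.groups.sort(Comparator.comparing(Group::startId)); // sort once: O(n log n)
    }

    // Binary-search the last range whose startId <= productId, then verify endId.
    // Each lookup is O(log n) instead of a full scan of the group table.
    String lookup(String productId) {
        int lo = 0, hi = groups.size() - 1, match = -1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;
            if (groups.get(mid).startId().compareTo(productId) <= 0) {
                match = mid;      // candidate: startId is still <= productId
                lo = mid + 1;     // look for a later candidate
            } else {
                hi = mid - 1;
            }
        }
        if (match >= 0 && productId.compareTo(groups.get(match).endId()) <= 0) {
            return groups.get(match).groupId();
        }
        return "UNDEF"; // same fallback as the Groovy function
    }

    public static void main(String[] args) {
        RangeLookup r = new RangeLookup(List.of(
                new Group("1000", "1999", "G1"),
                new Group("2000", "2999", "G2")));
        System.out.println(r.lookup("1500")); // G1
        System.out.println(r.lookup("3500")); // UNDEF
    }
}
```

      If the scripting environment allows caching state between rows, the same idea applies there: read the group table into a sorted structure once, before the per-row function runs, rather than re-reading it for every product.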

      Best regards
    • Hi,

      I had the same problem (with start dates and end dates), and unfortunately there is no alternative to looping over the range of values (as you did). This is a rather classic issue in Business Intelligence.

      For that I used Talend (sorry for the ad), which lets you do this rather easily (using a tMap component).

      Post hoc, non est propter hoc
    • Hello laloune,

      Thank you very much for the answer. But I don't want to abandon hope of solving the problem with Palo. Also, because of security policies at my company, I'm not able to use third-party software.

      Maybe another expert has an idea. It would be a big step forward if there were a trick to use more than one table in a SELECT clause at the extract step of the ETL process.

      Best regards
    • Range by rounding (a "better than nothing" solution)


      I had the same problem and solved it roughly, maybe it helps you, too:…4421&highlight=#post14421

      I wanted to look up a target value out of a storage cube for the complete set of these values, based on a calculated value out of another cube. But PALO.DATA only accepts a hardcoded reference or a !'xyz' dimension reference, which I don't have due to the calculated source of my reference value. So I used the ROUND function to streamline the calculated value (though with only one digit after the decimal point, less than desired) and made my reference data in the storage cube more detailed (more work than desired). The formula now looks like this:

      Source Code

      PALO.DATA("DB_all","ReferenceStore",LEFT(!'Client',3),
        STR(ROUND(PALO.DATA("DB_all","Sales",!'Country',!'Fullfiller',!'Product',!'Date',!'Customer',!'Store',"M-Share"),1),3,1))

      M-Share is the calculated value. The STR() is needed because PALO.DATA doesn't seem to accept the reference otherwise, due to the number type resulting from ROUND. Like this it works...
      If M-Share is calculated as 47.5678%, the reference becomes 47.6, and the ReferenceStore then contains a dimension element named (!) 47.6 with a cell value found via it (e.g. 97), aligning the calculated value to a fixed scale that users use to judge the value.
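      The round-then-stringify trick can be sketched outside Palo as well: round the calculated value to one decimal, format it as a string, and use that string as the lookup key. A minimal Java illustration; the reference-store map and its values are made up for the example, mirroring STR(ROUND(x, 1), 3, 1):

```java
import java.util.Locale;
import java.util.Map;

public class RoundedKeyLookup {
    // Hypothetical reference store: one-decimal string keys -> target values,
    // standing in for the dimension elements of the ReferenceStore cube.
    static final Map<String, Integer> REFERENCE_STORE = Map.of(
            "47.5", 95,
            "47.6", 97,
            "47.7", 99);

    // Round to one decimal place and format as a string key.
    static String roundedKey(double value) {
        return String.format(Locale.ROOT, "%.1f", value);
    }

    public static void main(String[] args) {
        double mShare = 47.5678;                      // the calculated value
        String key = roundedKey(mShare);              // "47.6"
        System.out.println(REFERENCE_STORE.get(key)); // prints 97
    }
}
```

      The trade-off is the same as described above: one decimal place limits precision, and the reference store must contain an entry for every possible rounded key, or lookups fall through to nothing.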

      Best regards,