
Transport of data from model to model

Former Member
0 Kudos

Hi all,

I came across the following:

We want to transport transaction data from one model to another.

For some reason, they do not balance once the package has been successfully executed.

I did some investigation and it appears that some records don't append.

To clarify, here is an extraction I made from BW (RSA1):

The REEF_OFF_REEF dimension is not in the destination cube (OIS).

The records in yellow for the cost center (TH3690) appended (summed up) correctly to 135.892.

Why would the cost center TH5900 (in purple), not append?

I know the value is the same with just a different Reef, but because OIS is not using Reef, it is supposed to append, right? Am I wrong in assuming this?

Technical:

BPC version: BPC 10 NetWeaver

BPC Support Package: BPC SP 17

BPC EPM version: EPM 21

Process Chain used in DM package: /CPMB/LOAD_DELTA_IP

Any suggestions would be helpful!

Thanks!

Jaco de Kock

Accepted Solutions (1)

gajendra_moond
Contributor
0 Kudos

Hi Jaco

While loading, are you using Overwrite or Append? You should use Append.

Former Member
0 Kudos

Hi Gajendra,

I cleared the data in OIS and re-ran the package using Append; it is still out by 9.74.

gajendra_moond
Contributor
0 Kudos

Hi Jaco

Could you try sending data using script logic in the source model?

*DESTINATION_APP = "Your destination model"

*WHEN Costcenter

*IS *

*REC(EXPRESSION = %VALUE%)

*ENDWHEN
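(A note on syntax: *IS * with a space after IS scopes all members of the dimension; "Your destination model" is a placeholder to replace with the target model's technical name.)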

damovand
Product and Topic Expert
0 Kudos

Hi,

Is there any way that the two equal values can be summed up in the source InfoProvider before being loaded?

Regards,

Leila

Former Member
0 Kudos

Hi Leila,

How would that be possible, when only one dimension differs in the data string?

The rest of the record is the same.

We still need all the detail in all the other dimensions, and there is no dimension that is unused...

Regards,

Jaco

former_member186338
Active Contributor
0 Kudos

But why not test the script logic solution?

Scope all members of the dimension you want to sum.

It's a standard approach...

Vadim

damovand
Product and Topic Expert
0 Kudos

Hello,

The only other thing that I can think of is to have REEF_OFF_REEF present as a dimension in BPC. That way it can be used in the BPC transformation file to differentiate the two incoming values from each other.

This may be a good solution if you expect to see many such values that can only be differentiated by the value of that column.
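For illustration, a minimal sketch of the relevant *MAPPING section such a transformation file could contain; ZREEF is a placeholder for the actual technical name of the source InfoObject in BW:

*MAPPING

REEF_OFF_REEF=ZREEF

The remaining dimensions would be mapped as before.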

Best Regards,

Leila Lappin

Former Member
0 Kudos

Hi Gajendra,

I created the script, linked it to a data manager package, and executed the package.

The script:
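// Scope the source data to the Category and Time selected in the DM package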

*XDIM_MEMBERSET CATEGORY = %CATEGORY_SET%

*XDIM_MEMBERSET TIME = %TIME_SET%

*DESTINATION_APP = AMPS_OIS
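// Skip dimensions that exist only in the source model; add fixed members for dimensions that exist only in the target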

*SKIP_DIM = CUSTOMER, INVESTMENTCENTRE, PROD_ACCOUNT, REEF_OFF_REEF, WORK_PLACE

*ADD_DIM COST_ELEMENT = NO_COST_ELEMENT, EQUIPMENT_SIO = NO_EQUIPMENT_SIO, PATTERSON = NO_PATTERSON, PROFIT_CENTRE = NO_PROFIT_CENTRE, VENDOR = NO_VENDOR
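// For every source record in category ALT, write a zero first, then the value itself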

*WHEN CATEGORY

*IS ALT

*REC(FACTOR=0)

*REC(EXPRESSION=%VALUE%)

*ENDWHEN

*COMMIT

The two models balance with one another, which is perfect!

Thank you so much for the advice,

My last question is: is it BPC standard practice to push the data with a script like this, or to pull it, as we did previously?

Thanks again!

Jaco de Kock.

former_member186338
Active Contributor
0 Kudos

Hi Jaco,

Push with the script is the correct and recommended approach.

The only additional thing I will recommend is to implement the RUNLOGIC_PH BADI and, before pushing data, clear the destination scope with a script in the target model launched by RUNLOGIC_PH.

How To Implement the RUNLOGIC_PH Keyword in SAP... | SCN

Vadim

P.S. "*REC(FACTOR=0)" is useless - it will not clear if there is no record in the source model. *COMMIT is always useless!

Former Member
0 Kudos

Hi Vadim,

REC(FACTOR=0):

I included this afterwards, because when I ran the package again, the values doubled up.

I only want to clear the value if there is a value for that specific record in the data set.

Thus it worked fine for me: I can run the package 5 times, but only the new data is copied, and I don't have to run a clear package before running the script again.

*COMMIT:

I also included this afterwards, because one specific user is unable to see the data in a report. From another model we use EPMRetrieveData, but the values are empty for him.

He ran the report on my laptop and is able to see it there. He is also the only user who cannot see the data.

I will start a new discussion about this with more detail; I don't want to nest issues.

Thanks Vadim!

Jaco.

former_member186338
Active Contributor
0 Kudos

"I included this afterwards, because when I ran the package again, the values doubled up." - absolutely strange - not possible ! *REC(EXPRESSION=%VALUE%) will always overwrite target! I think you simply not scoped the source cube properly!

For each dimension in *SKIP_DIM = CUSTOMER, INVESTMENTCENTRE, PROD_ACCOUNT, REEF_OFF_REEF, WORK_PLACE

You have to scope data!

"I only want to clear the value if their is a value for that specific record in the data set." - bad idea, you have to count the case when the record in the source cube is missing and the target record will not be cleared!

"*COMMIT:

I also included this afterwards, because one specific user is unable to see the data in a report. From another model we use EPMRetrieveData, but the values is empty for him."

*COMMIT has absolutely no relation to the report behavior!

Vadim

Former Member
0 Kudos

""I included this afterwards, because when I ran the package again, the values doubled up." - absolutely strange - not possible ! " - I am telling you, it was doubled, that's when I cleared the model, included the Rec(FACTOR=0), ran the package (script) 5 times, and it did not happen again. It was double on a report, and I did not change the context lock options on the sheet.


"*COMMIT has absolutely no relation to the report behavior!" - I agree, but I had to start looking at as to why he is not able to see the data, so I added it (Just to check, but I have not yet removed it) , but still with no luck.


Jaco.

former_member186338
Active Contributor
0 Kudos

"and it did not happen again" - sorry, but looks like you have some issues with report!


DM package with the script like:


*REC(EXPRESSION=%VALUE%)


writing data to another model using DESTINATION_APP

will NEWER double values on the next run! Never !


But anyway, please correctly scope the dimensions I mentioned in my previous post.


Please remove *COMMIT - the only effect it has is to reset the script scope... It has no effect on saving data - the *ENDWHEN loop performs an autocommit!


Vadim

Former Member
0 Kudos

Unfortunately, the business does not want to follow this approach of pushing data.

They want to pull it, so we have opened an SAP OSS message, because it is supposed to aggregate records before posting them.

Thanks all!

former_member186338
Active Contributor
0 Kudos

Sorry, but your requirements are not clear...

What exactly do you mean by "pull" in this case?

Vadim

"pull" from the business point of view!

Former Member
0 Kudos

They want to extract data from one model to another.

Thus they want to extract Cube A into Cube B, using a data manager package in Cube B with a transformation file and the BPC process chain /CPMB/LOAD_INFOPROV_UI.

But as referenced above, some records are skipped.

former_member186338
Active Contributor
0 Kudos

To my mind they simply want to launch a DM package from Cube B; the rest are technical details unknown to the users!

In our system we use the following:

1. We implemented the RUNLOGIC_PH BADI in our system (How To Implement the RUNLOGIC_PH Keyword in SAP... | SCN) - it takes just a few minutes to import the transport; ignore the version!

2. We have a script in Cube B with the following:

//Scope is defined

//Clear target scope in Cube B

*WHEN ACCOUNT

*IS *

*REC(EXPRESSION=0)

*ENDWHEN

//Launch push script in Cube A

*START_BADI RUNLOGIC_PH

QUERY = OFF

WRITE = ON

DEBUG = OFF

APP = Cube_A

LOGIC = CALLED_LOGIC_IN_CUBE_A.LGF

DIMENSION CATEGORY = %CATEGORY_SET%

DIMENSION TIME = %TIME_SET%

DIMENSION CUSTOMER = <ALL>

//... all dimensions of Cube A

DIMENSION COST_ELEMENT = <NONE>

//... all dimensions in Cube B missing in Cube A

*END_BADI

3. And in Cube A we have a push script CALLED_LOGIC_IN_CUBE_A.LGF

with DESTINATION_APP etc...

The scope will be passed by RUNLOGIC_PH.
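For illustration, a minimal sketch of what CALLED_LOGIC_IN_CUBE_A.LGF could contain, modeled on the push script posted earlier in this thread (no *XDIM_MEMBERSET lines are needed because RUNLOGIC_PH passes the scope, and per the advice above there is no *REC(FACTOR=0) and no *COMMIT):

*DESTINATION_APP = AMPS_OIS

*SKIP_DIM = CUSTOMER, INVESTMENTCENTRE, PROD_ACCOUNT, REEF_OFF_REEF, WORK_PLACE

*ADD_DIM COST_ELEMENT = NO_COST_ELEMENT, EQUIPMENT_SIO = NO_EQUIPMENT_SIO, PATTERSON = NO_PATTERSON, PROFIT_CENTRE = NO_PROFIT_CENTRE, VENDOR = NO_VENDOR

*WHEN CATEGORY

*IS *

*REC(EXPRESSION = %VALUE%)

*ENDWHEN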

Vadim

P.S. The difference between push and pull:

With push you start from the records in the source cube - you push those records to the target cube, independent of what records exist in the target cube.

With pull you loop over the records in the target cube, reading data from the source - which is incorrect!

Answers (1)

damovand
Product and Topic Expert
0 Kudos

Hello,

The only thing I see that could explain this is that the two records in purple have the exact same amounts for the exact same data region. BPC is not loading the second record because it thinks it is a duplicate.

Regards,

Leila Lappin

Former Member
0 Kudos

Hi Leila,

Thanks for the reply, that was also the only explanation I could think of, but I thought that because it is reading transaction data from a source, it should add them up, since the source should not have duplicates.

Thus, transporting the data should load both records (even though they will be identical in the OIS model), and they would then be summed up when running a report or compressing the cube. Transaction data should be able to have duplicates?

It is the same when you clear data: there will be a record with, for example, 25.00 and then a record with -25.00, resulting in 0.00 when running a report. These records are removed when optimizing.

Thanks,

Jaco

damovand
Product and Topic Expert
0 Kudos

Hi Jaco,

I am not aware of the details of delta loading, but I think configuring an InfoProvider for delta loading is done outside of BPC packages. This could be the reason why BPC is not recognizing the second record and treats it as a duplicate. Maybe someone else can add more insight.

Best Regards,

Leila