thomas_rinneberg
Before Export Check
I do not know how you organize your BW development team to collect its changes on transport requests. SAP recommends always using the transport collector in transaction RSOR, but for organizational reasons this may not always be applicable. Either way, you probably know the situation: you get errors in the import post-processing when importing a transport request into a BW target system. Usually the error is caused by a missing object, e.g. an InfoObject that an InfoCube requires. If you are developing BW content, this problem of inconsistent transports is even more complicated, because you will not see the error until the content is installed. This problem is now tackled at the root: releasing a task or transport request triggers a check of the BW objects contained on it. The following conditions are checked:

  • The object exists, i.e.
    • For transport of active versions: the object exists, is unchanged since activation and has not been deactivated via impact
    • For transport of content: the content version resp. shadow object exists
    If the object does not exist, a warning is issued that the transport will lead to a deletion. This might be intended, but it might also be an error.
  • The object is not generated (generated objects must not be transported)
  • All dependent objects (new functionality was developed here to find dependent objects directly on shadow objects) fulfill the following:
    • they exist
    • they have a TADIR entry (i.e. are registered in the transport framework)
    • they are not on a local package (e.g. $TMP)
    • they are recorded either on the same transport request or on another, already released transport request. If they are on another transport request that is not yet released, this gives a warning, since the other request should be released first

In case one of the conditions is not fulfilled, you will see a popup like the following:



On the above popup, you can press the red cross (abort), which cancels the release of the request and is the recommended choice. You may also press the green check mark, which shows one more warning that the request contains errors. If you confirm it, the request is released despite the errors.

If the popup contains only warnings, you may likewise press the red cross or the green check mark; in the latter case the request is released without an additional warning. Perhaps surprisingly for those of you who know the performance of RSOR, this Before Export Check is quite fast, so there should be no need to switch it off. Switching it off is nevertheless possible, both in general and for a particular request or for particular objects on a particular request.
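The checked conditions can be sketched in code. The following Python sketch is purely illustrative: the flat repository dictionary, the record fields and the function name are my assumptions for the example, not the actual SAP BW data model or API.

```python
# Illustrative sketch of the before-export check; the record layout and
# field names are hypothetical, not the real SAP BW repository model.

LOCAL_PACKAGES = {"$TMP"}

def check_request(request_id, objects, repo):
    """Check all BW objects on a request; repo maps object name -> record."""
    errors, warnings = [], []
    for name in objects:
        rec = repo.get(name, {})
        if not rec.get("exists"):
            # condition 1: the (active or content) version must exist
            warnings.append(f"{name}: missing - import would delete the object")
            continue
        if rec.get("generated"):
            # condition 2: generated objects must not be transported
            errors.append(f"{name}: generated objects must not be transported")
        for dep in rec.get("deps", []):
            d = repo.get(dep, {})
            if not d.get("exists"):
                errors.append(f"{dep}: dependent object does not exist")
            elif not d.get("tadir"):
                errors.append(f"{dep}: no TADIR entry")
            elif d.get("package") in LOCAL_PACKAGES:
                errors.append(f"{dep}: on a local package")
            elif d.get("request") != request_id and not d.get("released"):
                warnings.append(f"{dep}: on unreleased request {d.get('request')}")
    return errors, warnings

# Example: an InfoCube whose InfoObject still sits on the local package $TMP
repo = {
    "CUBE1": {"exists": True, "tadir": True, "package": "ZBW",
              "request": "DEVK900001", "deps": ["IOBJ1"]},
    "IOBJ1": {"exists": True, "tadir": True, "package": "$TMP"},
}
errors, warnings = check_request("DEVK900001", ["CUBE1"], repo)
print(errors)    # one error: IOBJ1 is on a local package
```

Note that, as described above, a missing object or an unreleased foreign request only produces a warning, while the other violations count as errors.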

InfoPackage Split
The 7.x source system dependent objects "new" DataSource, Transformation and DTP support something special when imported into a BW target system: you can develop them for one source system in the development BW, but import them for more than one source system in the target BW. This is customized in view V_RSLOGSYSMAP in the target BW system:

Here, before BW release 7.00, only one "target source system" per "original source system" could be entered. Since BW release 7.00, more than one target can be entered per original for the 7.x objects mentioned above, and these objects are then imported several times even though they were exported only once. The additional targets are flagged with the "7.0" flag in the view. The InfoPackage, being a 3.x object, is missing from the above list of objects, even though it is still relevant in a pure 7.x data flow as well. With release 7.30, this gap is finally closed: InfoPackages also count as 7.x objects in V_RSLOGSYSMAP and can be distributed to several source systems in the BW target system.
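The split can be pictured with a small toy model. In the sketch below, the dictionary stands in for the V_RSLOGSYSMAP entries and the object dicts for exported metadata objects; all names and the function are illustrative assumptions, not SAP's import implementation.

```python
# Toy model of the V_RSLOGSYSMAP import split: one exported object is
# imported once per mapped target source system. All names are made up.

# original source system -> target source systems in the importing BW
LOGSYS_MAP = {"DEVCLNT100": ["PRDCLNT200", "PRDCLNT300"]}

def import_objects(exported, logsys_map):
    """Create one copy of each exported object per mapped target system."""
    imported = []
    for obj in exported:
        # unmapped systems fall back to a 1:1 import
        targets = logsys_map.get(obj["source_system"], [obj["source_system"]])
        for target in targets:
            imported.append(dict(obj, source_system=target))
    return imported

exported = [{"name": "DTP_SALES", "source_system": "DEVCLNT100"}]
result = import_objects(exported, LOGSYS_MAP)
# exported once, imported twice: one DTP per mapped target source system
```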

Dummy Source Systems
SAP BW has some very big customers, one of them being so big that the aforementioned split of source system dependent objects during import is not sufficient. This customer wants to split the objects onto some of his many target source systems, but for other target source systems the objects shall look slightly different, e.g. with other scheduling times or other selection conditions. In this respect, however, the customer is as tiny and cost-conscious as any other: in the development BW, he has only one single development source system. How could he develop different flavours of his source system dependent objects with only one development source system? The answer is: Dummy Source Systems.

A Dummy Source System is a source system that references an existing real source system but is only an alias in BW. All source system dependent metadata objects in BW (including 3.x objects) can reference this alias. The remote function calls that are necessary to replicate DataSources or to actually activate and run the metadata objects are redirected to the real source system.

However, one thing is not possible with Dummy Source Systems: loading delta. This is because the real source system itself does not know about the dummies. The real source system exists only once, and to it, the BW also exists only once (there are no dummy BWs in the source system), so the delta is recorded only once for this BW system. Consequently, the BW cannot request different deltas from each of the Dummy Source Systems. You can still load full data, and of course you can create and maintain delta data flows for the Dummy Source Systems. You just cannot try them out.
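The reason for the delta restriction can be made concrete with a toy model: the real source system keeps one delta pointer per subscribing BW, and requests from all dummies resolve to the same real subscription. The class and all names below are purely illustrative, not SAP's delta queue implementation.

```python
# Toy model of delta handling: the real source system tracks one delta
# pointer per real subscribing BW; dummy source systems are invisible to it.

class RealSourceSystem:
    def __init__(self):
        self.changes = []        # recorded change log
        self.pointers = {}       # delta pointer per *real* BW subscriber

    def record(self, change):
        self.changes.append(change)

    def fetch_delta(self, bw_id):
        """Return all changes since the last delta request of this BW."""
        start = self.pointers.get(bw_id, 0)
        self.pointers[bw_id] = len(self.changes)
        return self.changes[start:]

src = RealSourceSystem()
src.record("rec1")
src.record("rec2")

# Requests from two dummy source systems both arrive as the same real BW:
delta_via_dummy_a = src.fetch_delta("BWP")  # gets rec1 and rec2
delta_via_dummy_b = src.fetch_delta("BWP")  # gets nothing: delta consumed
```

Since the second request finds the pointer already advanced, a second dummy can never receive the same delta again, which is exactly the limitation described above.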

 

Now, nice concept, but you might have figured out that you need to duplicate the whole data flow onto the Dummy Source System just to change one little property like the scheduling options. And creating BW objects is such a lot of work! But behold: there is a solution for this as well in BW 7.30, the DataFlow Copy Wizard. That, however, shall be covered in another blog.

 

Ah, and by the way, you can reach the creation of Dummy Source Systems by right-clicking on the header line of the source system tree:




