on 04-09-2014 1:27 PM
Dear All,
I have created a Generic DataSource based on an SAP Query. During creation I selected the "Additive Delta" option and used CALDAY as the delta-specific field. I have not loaded a delta with a Generic DataSource before.
1. How should I load the Delta Records for Generic DataSource, without missing any records?
2. Why do I have a pointer (transaction RSA7, BW Delta Queue Maintenance) in the Status column for the Generic DataSource?
3. What are the safety time intervals for?
4. Do I need to run any V3 job on the ECC side before the delta load, since this SAP Query is based on PURCHASING tables?
I would appreciate your reply.
Many Thanks!!!
Tariq Ashraf
Hi,
1. For CALDAY, leave the lower limit blank and set the upper limit to 1. Run the delta InfoPackage only once a day.
2. It just shows the delta-relevant field and the last time a delta was loaded into BW.
3. By using safety intervals we can load delta data without missing records and without creating duplicates.
For more detail, please search for the same topic.
4. No, V3 jobs are not needed for a generic DataSource.
Thanks
Hello Raman and Yogesh,
Thank you both for your prompt responses. On 10.04.2014 I created a Generic DataSource with Generic Delta, used CalDay as the delta-specific field, left the lower limit blank, and set the upper limit to 1. On 11.04.2014 I initialized the delta with data transfer using a process chain at 1:15 p.m.
See the screenshot below:
Result: The process chain was scheduled to start immediately; it executed, initialized the delta, and brought in records as well. See the RSA7 screenshot below:
Q1: Why is the generic delta's current status 10.04.2014, when I scheduled the delta through a process chain yesterday at 20:50? And why does the Total column show 0 records, even though I checked Display Data Entries with Delta Repetition?
The following is the InfoPackage I created for the delta loads, used in another process chain scheduled for deltas (daily at 20:50).
The following is the DTP used in the delta process chain:
Today, when I checked my InfoCube, I found that the same number of records had been loaded again, despite the InfoPackage and DTP being created as delta. See the screenshot:
See the PSA table below:
Q2: Why am I unable to load the delta records, and why is the same number of records being loaded again?
Q3: Yesterday, around 1:15 p.m., I initialized the delta with data successfully; it was my first delta load. Could this be the problem, given that the posting period was open when I initialized and I tried to load the delta on the same day?
Q4: Have I missed something in the design or data flow?
Q5: What is the best practice when you load full data, initialize the delta, and schedule deltas for Generic DataSources with CalDay as the delta-specific field?
I would appreciate your replies.
Many Thanks!!!
Tariq Ashraf
Hi Raman,
Once more, thanks a lot for your quick reply. Now, after designing my data flow, I'm going to carry out the following steps:
1. Initialize Delta ( Init without Data Transfer)
2. Full Load
3. Then schedule my "Deltas" using process chains (Period Values: Daily, Start Date: 10/04/2014, Time: 23:45:01)
Regarding the above steps:
1. Do I need to ask for a posting-free time and schedule my deltas accordingly?
2. Will tonight's delta fetch all records up to the initialization pointer?
3. What if document processing is running during the night? Will tomorrow's delta fetch all the records that were not fetched by today's delta?
4. Will I miss even a single record?
Many thanks!
Tariq Ashraf
Hello Tariq
1. Do I need to ask for a posting-free time and schedule my deltas accordingly?
Ans: You have a safety interval in place, so I believe there is no need for a posting-free time. In any case, it is going to fetch data up to today's date minus the safety upper limit, so don't worry.
2. Will tonight's delta fetch all records up to the initialization pointer?
Ans: Yes.
3. What if document processing is running during the night? Will tomorrow's delta fetch all the records that were not fetched by today's delta?
Ans: It will only get transactions from the last extraction up to the day before the upper limit. So yes, you will get them.
4. Will I miss even a single record?
Ans: Not likely (i.e., no).
Hope this helps !
Regards
Yogesh Narwani
Hi,
1. Not required. Daily transactions do not depend on the BW triggering/loading time; they depend on the business. Have a word with your reporting users and run your loads either before business hours start or after they end. But you need to schedule only once a day.
2. No need to worry about the delta records: with the upper limit set to 1 and the lower limit blank, all delta records are captured.
3. Yes, they will be fetched as per the date.
4. Not sure, but most of the time records won't be missed if you follow the safety limits. Even if some are missed, you can load the missed delta records using a repair full request with proper selections.
Thanks.
Dear All,
Thank you so much Raman and Yogesh for your prompt replies.
As per your suggestion, I've used the delta-specific field Date on Which Record Entered (CPUDT).
1. What are the other two options (radio button selections) for?
i. Time Stamp (UTC)
ii. Time Stamp (Local)
iii. Calend. Day
Settings:
2. Why did I have to put 1 in the Safety Interval Upper Limit?
3. Why did I leave the Safety Interval Lower Limit blank?
4. What are these two options for, and when do we use which one?
i. New Status for Changed Records
ii. Additive Delta
Many Thanks!
Tariq Ashraf
Hi,
1. There is also a timestamp delta method. If you use a timestamp, you need to select the proper variant based on the time reference (Local is your own server's time, UTC is universal time).
Calday: you are using CALDAY, so select it and enter the delta-specific field name CPUDT.
2 and 3.
Lower limit: the selection starts from the last extraction date.
Upper limit: the selection ends at the present date minus 1.
Example:
The last CALDAY extraction happened on 07 Apr 2014.
The next delta extraction, run on 09 Apr 2014, will pull records from 07 Apr 2014 to 08 Apr 2014.
Because the upper limit is one, it pulls records only up to yesterday.
The upper and lower limits are used so that delta records are pulled without missing a single record.
4. Based on that, we can use the related targets on the BW side:
i. New Status for Changed Records: you can use a DSO.
ii. Additive Delta: you can use a Cube.
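The window arithmetic above can be sketched with a small simulation (illustrative Python, not SAP code; the `delta_window` helper and the run dates are assumptions made up to mirror the 07 Apr example, with upper limit 1 and lower limit blank/0):

```python
from datetime import date, timedelta

def delta_window(last_pointer, run_date, upper_limit=1, lower_limit=0):
    """Selection window for a CALDAY generic delta.
    Lower bound: the day after the last pointer, widened by lower_limit;
    upper bound: the run date minus upper_limit (the safety interval)."""
    low = last_pointer + timedelta(days=1 - lower_limit)
    high = run_date - timedelta(days=upper_limit)
    return low, high

# Simulate daily runs: each run extracts [low, high] and the pointer
# moves to the upper bound, so the next run starts where this one ended.
pointer = date(2014, 4, 7)          # date of the last extraction
extracted = []
for day in range(8, 12):            # runs on 08..11 Apr 2014
    run = date(2014, 4, day)
    low, high = delta_window(pointer, run)
    d = low
    while d <= high:
        extracted.append(d)
        d += timedelta(days=1)
    if high >= pointer:
        pointer = high              # delta pointer advances

print(extracted)
# → [datetime.date(2014, 4, 8), datetime.date(2014, 4, 9), datetime.date(2014, 4, 10)]
```

Each posting date lands in exactly one window: no gaps and no duplicates, which is what the safety limits are for.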
Thanks
Hello Tariq ,
As you are looking for the various options, here are some references
Settings
1. When you are loading the data based on CALDAY (meaning once per day), the Safety Interval acts as a safety net for fetching the delta records.
Suppose you are running the load on 10/04/2014. The upper limit of the load is 1, so the upper bound of the delta becomes 10/04 - 1 = 09/04; only records up to that day will be fetched.
2. The lower limit works on the other end of the window: left blank (0), the window starts right after the last extraction, so the records for 09/04 are fetched exactly once.
So ultimately the stated day's delta is available.
Options
Additive Delta: with additive delta you get the difference, i.e., with the key of the record kept the same:
second record - first record.
If the latest record has a value of 5000 and the same record previously had a value of 2000, the overall delta will be
5000 - 2000 = 3000.
With this you can use the result directly to update a Cube (but not an ODS).
New Image:
An altogether new record is generated. You can update this into a DSO.
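A minimal sketch of the difference (illustrative Python, not SAP code; the values 2000 and 5000 are taken from the example above, and the variable names are made up):

```python
old_value, new_value = 2000, 5000   # same record key, before and after the change

# Additive delta: the extractor sends only the difference; a Cube,
# which aggregates key figures, adds it to what it already holds.
additive_image = new_value - old_value      # 3000
cube_total = old_value + additive_image     # 2000 + 3000 = 5000

# New status for changed records: the extractor sends the full new
# value; a DSO overwrites the old record with it.
new_status_image = new_value                # 5000
dso_value = new_status_image                # overwrite, not add

assert cube_total == dso_value == 5000      # both targets end up consistent
```

This is why an additive delta pairs naturally with a summing target (Cube), while a new-status (after-image) delta pairs with an overwriting target (DSO).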
Time Options
1. Time Stamp (UTC): stamps with Universal Time (GMT/atomic time).
2. Time Stamp (Local): time local to your server.
3. Calend. Day: this works per your delta field, one calendar day at a time.
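For the two timestamp variants, the practical difference is only the reference clock. A rough illustration (plain Python, not SAP code) of why the chosen variant must match how the source system stamps its records:

```python
from datetime import datetime, timezone

# One and the same moment, stamped two ways. A UTC pointer is
# unambiguous; a local stamp depends on the server's time zone
# (and shifts with daylight saving), so the variant selected in
# the generic delta settings must match how records are stamped.
moment_utc = datetime(2014, 4, 10, 20, 50, tzinfo=timezone.utc)
moment_local = moment_utc.astimezone()  # converted to this machine's local zone

print(moment_utc.isoformat())    # e.g. 2014-04-10T20:50:00+00:00
print(moment_local.isoformat())  # same instant, possibly different wall-clock digits
```

Both values compare equal as instants in time; only their representation differs.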
Hope this helps !
Regards
YN
Hello Tariq ,
As you have used CALDAY as the delta, which is generally used to fetch records from the application tables to BW on a daily basis (frequency: once a day), you can use this delta by keeping appropriate safety limits (upper and lower).
1. Keep the lower limit blank to fetch all the delta records of the day.
2. The pointer acts as an identification mark, or tag, so that the system/user can recognize up to which point the delta has been moved successfully to the respective system.
3. Safety intervals are a precautionary mechanism used to avoid any discrepancy while loading the data; by discrepancy I mean losing or missing a record, duplicating a record, and so on. We have to be careful in deciding the safety intervals.
4. I believe no (and I believe you might have scheduled them already).
Hope this helps !
Regards
YN