
Data Extraction at DB Level

sowdip
Explorer

Hi,

I have a Write Optimized DSO of huge volume (12 TB), holding close to 3 years of data.

Now I want to retain only 6 months of data, but all traditional deletion methods fail with dumps.

Selective loads to another DSO aren't helping either.

I was wondering whether I can create smaller tables out of this big active table at the Oracle DB level.

My question is

1) Is it possible to create these smaller tables at DB level?

2) If I succeed with the above, how can I load the result into a brand-new DSO?

Thanks,

Prashanth

Accepted Solutions (0)

Answers (2)


sowdip
Explorer

Sure will do.

Thanks a lot Matthew !

I'm closing this discussion.

matt
Active Contributor

I assume your DSO has some kind of period or date InfoObject.

A DSO is implemented as a transparent table (or collection of tables). Therefore a competent ABAPper could certainly write a program that deletes all data with a date of more than 6 months ago. It may even be as simple as DELETE FROM /bic/aDSONAME00 WHERE date LT six_months_ago.
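A minimal sketch of such a program, deleting in batches with a commit per batch so the rollback segments stay small. /BIC/ADSONAME00 and CALDAY are placeholders for your active table and date InfoObject, and the key fields assume the usual write-optimised DSO key of REQUEST / DATAPAKID / RECORD:

```abap
REPORT zdso_batch_delete.

* Sketch only - /BIC/ADSONAME00 and CALDAY are placeholders for your
* DSO's active table and date InfoObject.
TYPES: BEGIN OF ty_key,
         request   TYPE c LENGTH 30,
         datapakid TYPE n LENGTH 6,
         record    TYPE n LENGTH 9,
       END OF ty_key.

DATA: lt_keys   TYPE STANDARD TABLE OF ty_key,
      lv_cutoff TYPE d.

lv_cutoff = sy-datum - 180.            " roughly six months back

DO.
  " Select a limited batch of keys so memory use stays bounded
  SELECT request datapakid record
    FROM /bic/adsoname00
    INTO TABLE lt_keys
    UP TO 100000 ROWS
    WHERE calday < lv_cutoff.

  IF lt_keys IS INITIAL.
    EXIT.                              " nothing older than the cutoff left
  ENDIF.

  " Delete by primary key, then commit so each rollback unit stays small
  DELETE /bic/adsoname00 FROM TABLE lt_keys.
  COMMIT WORK.
ENDDO.
```

Each COMMIT WORK releases the locks and rollback information for that batch, which is exactly what a single huge DELETE cannot do.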

sowdip
Explorer

An ABAP program is also an option, but since the number of records in the active table is so huge, the SELECT statement crashes.

matt
Active Contributor

What is the nature of the crash? Out of memory? Time out?

sowdip
Explorer

As informed by Basis, it's out of memory and puts a lot of stress on CPU utilization.

We have been instructed not to run this program again 😞

We also have some other approaches, like creating smaller flat files using UNIX-based tools (via PuTTY), placing them on the application server, and then loading them back into a new DSO.

I'm looking at some more out of the box approach to tackle this.

matt
Active Contributor

Presumably this is a one-off (once you're running regularly, the volumes will be higher), and it's a business requirement. Basis cannot tell the business "you cannot run this program" - they need to find a solution. If my Basis team were to try to instruct me to do anything (which they wouldn't, because they're competent professionals), I'd tell them where to get off. The business instructs Basis, not the other way around - you should escalate if necessary.

By the way "out of the box" solution is one that is prepackaged. I think you're asking for "outside of the box" solution.

The problem is probably due to rollback - the entire DB operation needs to be able to be rolled back - and that's what's causing the stress on the DB server.

Here's a simple solution.

1) Create a write optimised DSO

2) Load that DSO with the data you need from the old DSO

3) Drop the old DSO contents at DB level. A simple drop shouldn't require any rollback.

4) Load the old DSO from the new DSO.
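For step 3, one way to clear the contents at DB level without generating any undo is a native-SQL TRUNCATE - a sketch only, with the quoted name a placeholder for your actual active table, to be run only after the needed data is safely copied and all loads are stopped:

```abap
* Sketch - TRUNCATE is DDL, so Oracle performs it without undo/rollback.
* "/BIC/ADSONAME00" is a placeholder; confirm the exact name in SE11.
EXEC SQL.
  TRUNCATE TABLE "/BIC/ADSONAME00"
ENDEXEC.
```

Note that a DB-level truncate bypasses BW's request administration, so the DSO's request list needs to be tidied up on the BW side afterwards.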

sowdip
Explorer

The DSO-to-DSO solution is already in place. Each request is ~100 million records.

Selective loads from a write-optimized DSO, reading each request, are taking hours to even kick off the first data package.

If it was a standard DSO, it would probably have been easier.

I have already proposed one more option to re-create required data from the Cube into a new DSO and drop data from old DSO. Looks like this is safest and recommended at the moment.

As I mentioned I was just checking for more tested options. 

matt
Active Contributor

Another way: create a Z table with the same structure as your /BIC/Adso00 table. Write an ABAP program that populates this Z table with the required data from the DSO (and can later write back into the DSO from the Z table).

Run the program to populate the Z table

Drop the DSO.

Run the program to repopulate the DSO

The program should use OPEN CURSOR and SELECT, to minimise server load.
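A sketch of that OPEN CURSOR pattern, assuming a Z table ZDSO_COPY that mirrors the active table's structure (all names are placeholders):

```abap
DATA: lv_cursor TYPE cursor,
      lt_buffer TYPE STANDARD TABLE OF zdso_copy,
      lv_cutoff TYPE d.

lv_cutoff = sy-datum - 180.            " keep roughly six months

* WITH HOLD keeps the cursor open across the COMMIT WORK below
OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM /bic/adsoname00
    WHERE calday >= lv_cutoff.

DO.
  " Fetch in packages so the full result set is never held in memory
  FETCH NEXT CURSOR lv_cursor
    INTO TABLE lt_buffer
    PACKAGE SIZE 50000.
  IF sy-subrc <> 0.
    EXIT.                              " no more packages
  ENDIF.

  INSERT zdso_copy FROM TABLE lt_buffer.
  COMMIT WORK.                         " commit once per package
ENDDO.

CLOSE CURSOR lv_cursor.
```

The reverse direction (repopulating the DSO from ZDSO_COPY after the drop) is the same loop with the table names swapped.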

You should also look at the various notes for reducing the tables that hold requests for loads. Implement and run the program for this DSO. (Sorry, don't know the note numbers off the top of my head)

sowdip
Explorer

Yes, definitely an option.

Will the active table be locked while I'm inserting data into the new Z table and simultaneously loading the DSO as part of the regular load?

matt
Active Contributor

You are only reading from the DSO, so you should be fine. For absolute safety though, I'd stop loads while the processing is going on. And of course, test test test in your QA system.