
Out of Memory Errors

Former Member

Hi Experts,

While running our packages, we're encountering the following errors. The odd thing is that we can run these packages three to five times without error, and then suddenly the errors appear. When we run the same package again, it completes successfully. Sometimes we need to restart the server, since in some cases packages run for more than 24 hours; after the restart, packages run normally again. What could be the possible cause of this?

Error 1:

Execute formulasError

Failed to commit transaction

3 errors writing data

There is insufficient system memory to run this query.

The ROLLBACK TRANSACTION request has no corresponding BEGIN TRANSACTION.

Send through SendGovernor failed.

Error 2:

Execute formulasError

Error in ProcessSQL: Out of memory

Error 3:

Execute formulasError

Error connecting to OLAP: Errors in the OLE DB provider. A valid instance named 'BPCDB' cannot be found on the 'FTBPCDB' server.

Error 4:

Execute formulasError

Error querying fact: -2147024882 - Exception of type 'System.OutOfMemoryException' was thrown.

We're on a multi-server setup: SAP BPC 7.0 MS SP7 with SQL Server 2005.

Thanks,

Marvin

Accepted Solutions (1)

sorin_radulescu
Employee

What version of SAP BPC are you using?

Please implement note 1404769 or 1409651, depending on your version of SAP BPC. Either way the content is the same; each note just references the correct version of SAP BPC.

In any case, I think your problem is more related to script logic, because you are not using XDIM_MAXMEMBERSET.

The package works when you run it two or three times after a full optimize with compress, because there aren't many rows yet; but each time you run the package, more rows are generated and the selection becomes bigger and bigger.

At some point the COM+ component reaches its limit and the DM package crashes. Do a full optimize with compress and run the package again, and you will see it works.

So to solve the problem, implement the note mentioned above, and also make sure your script logic uses the XDIM_MAXMEMBERSET instruction.

Check in the script logic log which selection is producing the error, and you will see for which dimension you have to use XDIM_MAXMEMBERSET.
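For example, a minimal script logic sketch (the dimension name and batch size here are assumptions; use whichever dimension the log shows is producing the oversized selection):

// Process at most 50 ENTITY members per query so each selection stays small
*XDIM_MEMBERSET CATEGORY = ACTUAL
*XDIM_MAXMEMBERSET ENTITY = 50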

Regards

Sorin Radulescu

Former Member

Hi Sorin,

We're already running SAP BPC 7.0 MS SP7. We do process large volumes of data, but we already run a Full Optimize twice a day, and we watch the size of the fact tables every time we run the packages. Yesterday we ran the packages five times straight without those errors and then ran a full optimization afterwards. But when we ran them again, those errors popped up; after five or six unsuccessful runs, the package completed successfully again. The strange thing is that we never do any optimization or processing of the cube in between; we just re-ran it until it succeeded.

I'm currently looking through the 'Configuration Guide SAP Business Planning and Consolidation 7.0 version for the Microsoft Platform' for configuration adjustments, and I see an item that might help us: Setting '3GB Support' for Memory on the COM+ Object. We're not familiar with this yet since we haven't tried it. What would be the pros and cons of applying it to our server? Will it help solve the issue?
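From what I can tell, that setting boils down to adding the /3GB switch to boot.ini on 32-bit Windows Server 2003, roughly like the line below (the ARC path is machine-specific, so this is only a sketch):

[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB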

Thanks,

Marvin

Former Member

Hi Marvin,

You can try using the following statement to shrink the log file. This will free up a lot of space and may help with the issue.

dbcc shrinkfile (Appset_Log, 1); -- replace 'Appset' with the name of your application set
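To see how full each log actually is before and after the shrink, you can also run:

DBCC SQLPERF(LOGSPACE); -- reports log size and percent-used for every database on the instance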

Regards,

Santosh

Answers (2)

sorin_radulescu
Employee

I don't advise you to use the /3GB option; it is meant for other situations.

Again, I have to mention that your problem is really the script logic, and the only solution is to add XDIM_MAXMEMBERSET.

Otherwise it will keep throwing errors from time to time, depending on the number of records selected from the database.

Kind regards

Sorin Radulescu.

Former Member

Hi Sorin,

We're already using that in some of our logics, and there aren't many heavy logics in the default formulas. What puzzles me is that a package (basically we're just running the Default Formulas for testing purposes) runs successfully on the first attempt, and then a re-run of the very same package returns an error; if there were a problem with the query, it shouldn't have run successfully in the first place.

Right now I just tried clearing some data, but I can't proceed because of those errors. I've already tried running a full optimize, but no luck; it still results in an error. I've also tried removing all the logics from the default formulas, but the clear package still returns this error: "ExecuteSQL::Not enough storage is available to complete this operation.;CommitTrans::Out of memory;". As far as I can check, there is still 76 GB of free disk space on the server.

Thanks,

Marvin

Former Member

Hi Marvin,

Please try the following steps:

1. Shrink the DB log file.

2. Do a full process on the Application.

3. Do a Full Optimize with the Compress Database and Index Defragmentation options selected.

4. If step 3 fails, check the FACT, FAC2, and FACTWB tables for any records with SOURCE <> 0 and delete them (see the sketch after this list). Re-run the Full Optimize with only Compress Database selected; once that completes, re-run it with only Index Defragmentation checked.
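For step 4, here is a sketch of the check, assuming an application named Finance (BPC MS fact tables follow the tblFact<App>, tblFac2<App>, tblFactWB<App> naming convention; substitute your own application name):

-- Count records left behind by a failed optimize (table names are assumptions)
SELECT COUNT(*) FROM tblFactFinance WHERE SOURCE <> 0;
SELECT COUNT(*) FROM tblFac2Finance WHERE SOURCE <> 0;
SELECT COUNT(*) FROM tblFactWBFinance WHERE SOURCE <> 0;

-- Delete them only after confirming they are leftovers, then re-run the optimize
DELETE FROM tblFactFinance WHERE SOURCE <> 0;
DELETE FROM tblFac2Finance WHERE SOURCE <> 0;
DELETE FROM tblFactWBFinance WHERE SOURCE <> 0;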

Hope these steps will help resolve your issue.

Regards,

Santosh

Former Member

Hi Experts,

We switched to SQL 2008, applied SAP Note 1468161, and we're getting positive results.

Thanks for your help guys.

Cheers,

Marvin

Former Member

We've been running out of memory with some of our packages, too. In our case, TempDB ran out of space because it was set to a fixed size. We saw improvements and fewer memory errors when we switched TempDB to grow by a percentage instead. We also had Audit running, and that process was re-indexing the tables every six hours; we've removed the six-hour re-index interval and are still trying to determine whether that has an impact.
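In T-SQL terms, the change looks roughly like the sketch below ('tempdev' is SQL Server's default logical file name for the tempdb data file; verify yours with sp_helpfile first):

-- Let the tempdb data file grow by 10% instead of stopping at a fixed size
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILEGROWTH = 10%);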

We schedule a Lite Optimize each hour. We do a Full Optimize about once a week.

Former Member

Hi Marvin,

I'm currently facing the same problem you describe at the beginning of this post.

I understand that you solved your problem by:

- migrating from SQL 2005 to 2008,

- and applying SAP Note 1468161.

But while reading that note, I've seen that it only applies to EVDRE() issues, not DM issues.

Could you please explain in more detail how you solved it? Is it really that SAP Note that fixed your 'Exception Out Of Memory' error?

Thanks a lot for your help,

Regards,

Olivia.

Former Member

Hi Marvin,

The system maintains a transaction log for every single transaction, and over time it grows a lot. I would suggest you take a backup of your SQL database and, when prompted, include the transaction log in the backup. This will free up the space.
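A minimal T-SQL sketch of that log backup (the database name and path are placeholders to replace with your own):

-- Back up the transaction log so the space inside the log file can be reused
BACKUP LOG AppSet TO DISK = 'D:\Backups\AppSet_log.bak';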

Hope this helps.

Former Member

Hi nilanjan,

We've already applied SAP Note 1402580 ('BPC: The size of BPC SQL Server database is growing very quickly'), because last time we ran out of disk space due to very large log files. But now we still have plenty of free disk space, about 76 GB, and we do a full optimize every day. What puzzles me is that we can't predict when the error will occur, and when we re-run the very same package that failed, it runs fine again. Are there any settings or setup items we need to check?

Thanks,

Marvin