
PB 12 Memory issue with huge arrays

Former Member

A French guy has memory issues with PB 12: the memory is not deallocated after the GarbageCollect() call.

Here is the code to reproduce the problem:

Double t_1[], t_2[]
long ll_i

for ll_i = 1 to 50000000
    t_1[ll_i] = 1
next

t_1 = t_2
GarbageCollect()

During the loop the memory usage increases, but after clearing the variable and calling GarbageCollect() the memory is still in use.

He has to exit the application to free the memory.

He tried a fixed-size array, but PB does not allow 50 million elements in one.

Any suggestion?

Link (in French):

PowerBuilder à Donf ! / Comment libérer complètement la mémoire avec PB (How to completely free memory with PB)

Accepted Solutions (0)

Answers (5)

Former Member

"A French guy has memory issues..." sorry that first sentence caught me off guard.

I know an American guy with memory issues (me).

Former Member

Hi Chris and Christian,

I will pass your answers along.

Thank you for your expertise.

Former Member

If you want to speed up the memory allocation, fill the array backwards.

Like this:

Double t_1[], t_2[]
long ll_i

for ll_i = 50000000 to 1 STEP -1
    t_1[ll_i] = 1
next

t_1 = t_2

Because the highest index is assigned first, the memory for the array is allocated once, up front, instead of being grown repeatedly during the loop.

Regards, Christian

Former Member

Hi Patrick;

   Changing your code as follows should do the trick ....

Double t_1[], t_2[]
long ll_i

SetNull(t_2)

for ll_i = 1 to 50000000
    t_1[ll_i] = 1
next

t_1 = t_2
GarbageCollect()

HTH

Regards ... Chris

Former Member

The way PowerBuilder handles memory, it does not release memory back to the operating system. Whenever it needs more memory, it reuses the memory it has already allocated, which is faster.

50 million entries in an array is just poor application design. The data should be stored in a database and retrieved as needed.

If there is a valid reason to have it all in memory at once, a datastore would probably be a better choice.
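For illustration, here is a minimal sketch of the DataStore approach. The DataWindow object name (d_values), the retrieval arguments, and the range variables are all hypothetical; it assumes SQLCA is already connected.

// Fetch only the slice of data currently needed instead of
// holding 50 million values in a PowerScript array
long ll_from_id = 1, ll_to_id = 100000
DataStore lds_values

lds_values = CREATE DataStore
lds_values.DataObject = "d_values"
lds_values.SetTransObject(SQLCA)
lds_values.Retrieve(ll_from_id, ll_to_id)

// ... work with the rows in lds_values ...

DESTROY lds_values  // drop the rows when done

As noted above, PB may keep the freed memory for reuse rather than return it to the OS, but the working set stays far smaller than with a 50-million-element array.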

Former Member

I completely agree with you about 50 million records in an array and the role of the database.

I don't know the reason for keeping so much data in memory.

But from my point of view, using a DataStore to hold 50 million records is also bad practice.

I will let you know the French guy's answer.

Thank you for your expertise.