on 03-29-2014 8:19 AM
A French developer is having memory issues with PB 12: the memory is not deallocated after the GarbageCollect() call.
Here is the code to reproduce the problem:
Double t_1[], t_2[]
long ll_i

for ll_i = 1 to 50000000
	t_1[ll_i] = 1
next

// Clear the array by assigning an empty array, then collect
t_1 = t_2
GarbageCollect()
During the loop, memory usage increases, but after clearing the variable and calling GarbageCollect(), the memory is still in use.
He has to exit the application to free the memory.
He tried a fixed-size array, but PB does not allow 50 million elements in an array declaration.
Any suggestions?
Link (in French):
PowerBuilder à Donf ! / Comment libérer complètement la mémoire avec PB ("How to free memory completely with PB")
"A French guy has memory issues..." sorry that first sentence caught me off guard.
I know an American guy with memory issues (me).
Hi Chris and Christian,
I will pass your answers along.
Thank you for your expertise.
If you want to speed up the memory allocation, go backwards:

Double t_1[], t_2[]
long ll_i

// Assigning the highest index first forces the array to its full size immediately
for ll_i = 50000000 to 1 STEP -1
	t_1[ll_i] = 1
next

t_1 = t_2

The memory for the array is then allocated once, instead of being reallocated repeatedly as the array grows.
Regards, Christian
Hi Patrick;
Changing your code as follows should do the trick:

Double t_1[], t_2[]
long ll_i

SetNull(t_2)  // make t_2 a null array

for ll_i = 1 to 50000000
	t_1[ll_i] = 1
next

// Assigning the null array should let GarbageCollect() reclaim t_1's storage
t_1 = t_2
GarbageCollect()
HTH
Regards ... Chris
The way PowerBuilder handles memory, it does not release memory back to the operating system. Whenever it needs more memory, it reuses memory it has already allocated, which is faster than requesting more from the OS.
50 million entries in an array is simply poor application design. The data should be stored in a database and retrieved as needed.
If there is a valid reason to have it all in memory at once, a datastore would probably be a better choice.
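To illustrate the retrieve-as-needed idea, here is a minimal PowerScript sketch. The DataWindow object name (d_values) and its start/end retrieval arguments are hypothetical, not from this thread; the point is to pull a bounded slice of rows at a time instead of holding 50 million values in memory.

```
// Hypothetical sketch: paged retrieval with a DataStore.
// Assumes a DataWindow object "d_values" with two long retrieval arguments
// (start and end of the page) defined in its SELECT.
DataStore lds_values
long ll_rows

lds_values = CREATE DataStore
lds_values.DataObject = "d_values"
lds_values.SetTransObject(SQLCA)

// Retrieve only the first 100,000 rows; fetch the next page when needed
ll_rows = lds_values.Retrieve(1, 100000)

// ... process ll_rows rows here ...

DESTROY lds_values
```

The same DataStore can be reused for each page, so memory stays bounded by the page size rather than the full result set.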
I completely agree with you about 50 million records in an array and the role of the database.
I don't know the reason for keeping so much data in memory.
But from my point of view, using a datastore to hold 50 million records is also bad practice.
I will let you know the French developer's answer.
Thank you for your expertise.