on 03-21-2014 7:17 PM
Hello all,
I was trying to run Apriori and then HANA Writer on my database.
Apriori executes fine, but after I configure and try to execute HANA Writer, it keeps running for more than half an hour and then gives the following error: Error 131: Transaction rolled back by lock wait timeout. Lock timeout occurs while waiting for a TABLE_LOCK of mode EXCLUSIVE.
Bimal suggested that I reduce the data to be analysed, so I reduced it to a fairly small subset and executed again.
It gave me the same error after more than 30 minutes of execution.
After getting this error, I tried executing just Apriori, but with a filter of 5 transactions, and then it worked. After that I tried to visualize the results and got the following error:
Does that mean my HANA memory is full?
I got this view from HANA:
Regards!
Has anyone run into the same problem?
Even though I'm working with Predictive Analysis, I've posted this message in the HANA forum because the error occurs in a HANA component, not in an algorithm from Predictive Analysis (the algorithm worked fine).
And when I try to execute Apriori and then the HANA Writer, I get the following error:
Hi Jurgen,
I'm a developer on the PAL team in Shanghai.
Could you please provide some detailed information about this case:
- How many records did you use?
- What values of MIN_SUPPORT and MIN_CONFIDENCE did you use?
The Apriori algorithm consumes a lot of memory when the input data is very large and MIN_SUPPORT is low.
So I suggest you first set MIN_SUPPORT and MIN_CONFIDENCE to 0.9 and check whether it can output a result.
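The reason a low MIN_SUPPORT blows up memory is combinatorial: every itemset that clears the support threshold is kept as a candidate for the next level, so a low threshold can keep an exponential number of itemsets alive. The sketch below illustrates this with a naive, pure-Python frequency count over a toy transaction list (the data and function names are made up for illustration; the real PAL Apriori is far more optimized, but the pruning property it relies on is the same).

```python
from itertools import combinations

# Toy market-basket data; in the real case this would come from the HANA table.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "milk", "butter"},
]

def frequent_itemsets(transactions, min_support):
    """Return itemsets whose support (fraction of transactions) >= min_support."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    result = {}
    for k in range(1, len(items) + 1):
        found = False
        for cand in combinations(items, k):
            count = sum(1 for t in transactions if set(cand) <= t)
            if count / n >= min_support:
                result[cand] = count / n
                found = True
        if not found:
            # Apriori property: if no k-itemset is frequent,
            # no (k+1)-itemset can be frequent either, so stop.
            break
    return result

# A low threshold keeps many more itemsets in memory than a high one.
low = frequent_itemsets(transactions, 0.1)
high = frequent_itemsets(transactions, 0.9)
print(len(low), len(high))  # 7 itemsets survive at 0.1, only 1 at 0.9
```

On real data with millions of rows the gap is far more dramatic, which is why starting at 0.9 and lowering the threshold gradually is a safe way to probe the memory limit.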
Thanks,
Peng
Hi Peng, thank you so much for your suggestion.
I talked with some people from SAP Brazil and they suggested I upgrade my HANA to the latest version, SPS7 Rev 73 (because the one I was using has some memory bugs), so I'm asking the person responsible to make this change for me.
I was using the confidence and support values it came with originally: 0.9 and 0.1.
I also tried analysing only a small data sample (about 200k lines), and it also didn't work.
I noticed that after trying to visualize the results from any execution, the resident memory didn't decrease, and after each execution the memory fills up further. Do you know what I mean?
So now I'll wait to see if they upgrade it for me, and after that I'll try the steps you suggested!
I'll get back to you with my feedback.
Regards