on 02-02-2016 5:46 PM
My Dear Colleagues,
We have a table with two columns that serve as text and image holders.
We are running on version 15.7 SP 134 and have the relevant configuration parameters in place: deallocating text, compression, and so on.
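For readers following along, the settings meant here are typically along these lines on ASE 15.7; syntax and available options vary by release, so treat this as a sketch and verify against your documentation rather than a checklist:

-- deallocate the first text page when the LOB value is null
sp_chgattribute 'BC_MSG_LOG', 'dealloc_first_txtpg', 1
go
-- enable LOB compression on the table (levels vary by release; see the docs)
alter table BC_MSG_LOG set lob_compression = 100
go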
Do the following details suggest that a fragmentation issue is present?
The total row count is 59958, and the average datalength for the text/image columns is 1991830.495 bytes.
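A figure like that average is typically gathered with datalength(); the column names below are hypothetical, and the convert() keeps the running sum from overflowing an int at these value sizes:

select count(*) as total_rows,
       avg(convert(numeric(20,3), datalength(text_col) + datalength(image_col))) as avg_lob_bytes
from SAPSR3DB.BC_MSG_LOG
go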
1> sp_estspace 'SAPSR3DB.BC_MSG_LOG', 59958, null, null, 16, 1991830.495
2> go
name                   type          idx_level  Pages  Kbytes
---------------------  ------------  ---------  -----  ------
SAPSR3DB.BC_MSG_LOG    data                  0   6683  106933
SAPSR3DB.BC_MSG_LOG    text/image            0  59958  959328
BC_MSG_LOG_6669636632  nonclustered          0    927   14832
BC_MSG_LOG_6669636632  nonclustered          1     15     240
BC_MSG_LOG_6669636632  nonclustered          2      1      16
I_BC_MSG_LOG_TIME      nonclustered          0     42     656
I_BC_MSG_LOG_TIME      nonclustered          1      1      16
(1 row affected)
Total_Mbytes
-----------------
1056.66
name                   type          total_pages  time_mins
---------------------  ------------  -----------  ---------
SAPSR3DB.BC_MSG_LOG    data                59958          0
BC_MSG_LOG_6669636632  nonclustered          943          0
I_BC_MSG_LOG_TIME      nonclustered           43          0
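A quick sanity check on the text/image line, using the 16K page size passed to the procedure: 59958 pages * 16 KB = 959,328 KB, i.e. sp_estspace has charged exactly one page per row. But an average datalength of 1991830.495 bytes works out to roughly ceiling(1991830.495 / 16384) = 122 pages per value, or about 59958 * 122 * 16 KB ≈ 112 GB even before page-header overhead. The gap reflects sp_estspace's one-page-per-LOB default (discussed in the reply below) rather than fragmentation in the table itself.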
Ajit, Mark,
When I was a Sybase, Inc. employee in Professional Services in NYC back in the early nineties, I was sent to perform capacity planning at a client prior to an upgrade.
I had to rely heavily on sp_estspace for the task but quickly found out, as you have, Ajit, that it was not completely up to it.
The results of the sizing exercise would have a direct impact on how much would be spent on storage, so I decided to take the guesswork out of it and modify the procedure myself to avoid under- or over-estimating the size of the data.
If you extract the code with the "defncopy" utility, and assuming they have not removed the comments, you will see that it was altered by Malcom Colton and Hal Spitz, and probably others in addition to myself, for similar reasons.
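If you have not used defncopy before, the extraction looks roughly like this; server, login, and file names are placeholders, sp_estspace lives in sybsystemprocs, and the Utility Guide for your release has the exact options:

defncopy -Usa -SMYSERVER out sp_estspace.sql sybsystemprocs sp_estspace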
In my case, I modified the procedure to take into account an average number of pages per text blob field, because the default was only one page, and my client wanted best-, average-, and worst-case scenarios.
So, I exhort you to take a shot at it. The code is not very complex, and you will easily find the sections you need to alter to suit your environment and your system; a rough sketch of the idea follows.
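To be clear, this is my illustration of the kind of change involved, not the actual sp_estspace source; the variable names and the flat 16384-byte page assumption are mine:

declare @rows int, @avg_lob numeric(18,3), @pgbytes int, @pages_per_lob int
select @rows = 59958, @avg_lob = 1991830.495, @pgbytes = 16384
-- charge ceiling(avg LOB length / page size) pages per value
-- instead of the stock procedure's flat one page per row
select @pages_per_lob = convert(int, ceiling(@avg_lob / @pgbytes))
select @rows * @pages_per_lob as lob_pages,
       @rows * @pages_per_lob * (@pgbytes / 1024) as lob_kbytes
go

With Ajit's numbers that charges 122 pages per value instead of one, which is where the roughly 112 GB figure above comes from.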
I hope that helps.
Regards,
Jean-Pierre