What does Pack do?
17/06/2014 06:55:42
General information
Forum: Visual FoxPro
Category: Coding, syntax & commands

Environment versions
Visual FoxPro: VFP 9 SP2
OS: Windows Server 2012
Network: Windows 2008 Server
Database: Visual FoxPro
Application: Desktop

Miscellaneous
Thread ID: 01601949
Message ID: 01601957
Views: 60
>>A customer has bloat on an FPT that is in near constant use. I've gone over the code to see where I might be adding too much data, removed some fields etc., but it still runs towards the 2GB limit. I can pack the file at night, generally after midnight, but that's not my favourite thing to do.
>>Can someone tell me what PACK does, please? I've read the online help but it doesn't match what I see in my testing.
>>Is there any way to reduce the size of the FPT while users are in the system?
>>In a test area I have a DBF of 120MB and a related FPT of 850MB.
>>If I use the SELECT command from this file into a new DBF, the FPT reduces to 50MB; this is the same whether DELETED is set ON or OFF.
>>If I use the COPY TO command from this file into a new DBF, the FPT reduces to 50MB; this is the same whether DELETED is set ON or OFF.
>>I then tried to select into a cursor and loop through the original file, replacing the 3 memo fields with the cursor version via a variable. The FPT got bigger. I also tried replacing each memo field with a blank first, but it still got bigger. I removed the variable element and did a direct replace from cursor.memo1 to the file; again the file got bigger.
>>Finally I used the file that I had selected into, where the FPT was reduced to 50MB, and did a replace into the original file. The FPT got bigger.
>>~M
>
>(1) PACK does two things:
>(a) Permanently deletes the records that have been marked as deleted
>(b) Reduces the memo file (FPT)
>
>(2) Replacing the contents of a memo field allocates a new block
>
>(3) It may be worthwhile to set its blocksize to 1 byte (SET BLOCKSIZE TO 0) rather than the default of 64 bytes, which allocates a whole 64-byte block even for a 1-byte memo value
>
>as follows
>
>
>= AlterBlockSize('TableName', 0)
>
>*_______________________________________________________________________________
>function AlterBlockSize(_table, n)
>
>	local OldBlockSize, location
>
>	* Remember the current BLOCKSIZE setting so it can be restored later
>	OldBlockSize = set('BlockSize')
>
>	* Open the table exclusively and work out its path and stem,
>	* needed later to delete the backups that ALTER TABLE leaves behind
>	use (m._table) excl in 0
>	location = fullpath(dbf(m._table))
>	location = addbs(JustPath(m.location)) + JustStem(m.location)
>
>	* Adding and then dropping a dummy column forces VFP to rebuild the
>	* table, which rewrites the memo file using the new block size
>	set blocksize to m.n
>	alter table (m._table) add x123456 L default .f. NOVALIDATE
>	alter table (m._table) drop x123456 NOVALIDATE
>	use in (m._table)
>
>	set blocksize to (m.OldBlockSize)
>
>	* Remove the .BAK and .TBK backups created by the rebuild
>	delete file (m.location + '.BAK')
>	delete file (m.location + '.TBK')
>
>endfunc
>*_______________________________________________________________________________
>
Thanks for all the replies
What I didn't understand is where the bloat is coming from, since the entries in the table, including the memo fields, are not editable by users. Also, out of 900,000 records only 8-10 are deleted, so the bloat is not coming from there.
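For anyone reading this later, here is a minimal throwaway sketch of point (2) above: a plain REPLACE into a memo grows the FPT even when the content stays the same size, and PACK MEMO reclaims the space. The table and field names are made up:

* Throwaway test: REPLACEing a memo writes new blocks into the .fpt (the
* old blocks are orphaned), and PACK MEMO reclaims the orphaned space.
CREATE TABLE memotest (id I, notes M)
LOCAL i
FOR i = 1 TO 1000
	INSERT INTO memotest (id, notes) VALUES (m.i, REPLICATE("x", 500))
ENDFOR
FLUSH

LOCAL ARRAY laFpt[1]
ADIR(laFpt, "memotest.fpt")
? "FPT size after initial fill:", laFpt[1,2]

* Rewrite every memo with content of the same length - the file still grows
REPLACE ALL notes WITH REPLICATE("y", 500)
FLUSH
ADIR(laFpt, "memotest.fpt")
? "FPT size after REPLACE ALL: ", laFpt[1,2]

* PACK MEMO rewrites the .fpt and drops the orphaned blocks (needs EXCLUSIVE)
USE memotest EXCLUSIVE
PACK MEMO
FLUSH
ADIR(laFpt, "memotest.fpt")
? "FPT size after PACK MEMO:   ", laFpt[1,2]
USE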
I have SET BLOCKSIZE TO 0 in the calling prg; however, the table was most likely not created with the blocksize set to 0, so your program above should sort that out. I think that best explains the bloat that I am seeing.
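On the question of what block size the existing table was actually created with: SYS(2012) returns the memo block size of the table open in the current work area, so a quick check like this (table name is made up) should confirm it:

* Check what memo block size an existing table is actually using.
* SYS(2012) returns the block size, in bytes, of the memo file
* belonging to the table open in the current work area.
USE mytable SHARED	&& hypothetical table name
? "Memo block size in use:", SYS(2012)
USE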
Thanks for the reading material Tore, it explains the bloat, especially with regard to the blocksize parameter.
The file is that bit too big, and used too often and in too many places, to run a PACK at each use, so I was thinking about a nightly job that tries to pack it.
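A rough sketch of the kind of nightly job I have in mind (table name and log file are made up; it assumes exclusive use can occasionally be had after midnight, and it simply skips the run when it cannot):

* Nightly maintenance sketch: try to open the table exclusively and pack
* the memo file; if someone still has it open, log it and give up until
* the next night.
LOCAL llPacked, loErr
llPacked = .F.
TRY
	USE mytable IN 0 EXCLUSIVE ALIAS nightpack
	SELECT nightpack
	PACK MEMO	&& rewrites the .fpt, dropping the orphaned memo blocks
	llPacked = .T.
CATCH TO loErr
	* Most likely "File is in use" - leave it for the next run
	STRTOFILE(TTOC(DATETIME()) + " pack skipped: " + loErr.Message + CHR(13) + CHR(10), ;
		"packlog.txt", 1)
FINALLY
	IF USED("nightpack")
		USE IN nightpack
	ENDIF
ENDTRY
RETURN m.llPacked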
Go raibh maith agat (thank you)

~M