General information
Category:
Databases, Tables, Views, Indexes and SQL syntax
>>I have an app that updates insurance plans. I store some general notes about each plan in a memo field. This field usually contains 7 or 8 short sentences. My problem is that when I populate this information (one sentence at a time) for thousands of plans, the memo file bloats to over 330 MB. After packing the memo (which takes 10 minutes) it shrinks to 50 MB. I'm worried about users running out of disk space. Is there a way to reduce this without stopping in the middle of the routine to do a pack memo? I update the memo fields with the following code:
>>
>>REPL myMemo WITH ALLT(myMemo)+CHR(13)+cNewLine IN myTable
>>
>>I have also tried the ADDITIVE clause:
>>
>>REPL myMemo WITH CHR(13)+cNewLine ADDITIVE IN myTable
>>
>>Neither gives any relief on the file size.
>>
>
>Nope - VFP does not reuse blocks in a memo field when the field is updated, so each REPLACE is expensive: it allocates new blocks and writes the entire field content fresh each time you update it. Why not build the memo field content before doing the REPLACE?
I will be processing between 10 and 50 files, each updating different parts of the insurance plans, and each file may or may not touch the memo field for a given plan. Instead of updating the plan table directly, I could create a cursor with planID N(6), memo_line C(100) and populate that cursor instead. I could then scan through the cursor, build a variable holding the full text for each plan, and do a single REPLACE with that variable into the Plan table. Does this sound feasible?
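In rough code, the idea would be something like this (just a sketch; curNotes, myTable, and the planID key in myTable are assumed names):

```foxpro
* Collect all memo lines first, one record per plan/line.
CREATE CURSOR curNotes (planID N(6), memo_line C(100))
* ... populate curNotes while processing the input files ...

* Order by plan so each plan's lines are contiguous.
SELECT curNotes
INDEX ON planID TAG planID

lcMemo = ""
lnPlan = 0
SCAN
    IF curNotes.planID <> lnPlan
        IF lnPlan > 0
            * One write per plan instead of one per line.
            UPDATE myTable SET myMemo = lcMemo WHERE planID = lnPlan
        ENDIF
        lnPlan = curNotes.planID
        lcMemo = ""
    ENDIF
    lcMemo = lcMemo + ALLTRIM(curNotes.memo_line) + CHR(13)
ENDSCAN
IF lnPlan > 0
    UPDATE myTable SET myMemo = lcMemo WHERE planID = lnPlan
ENDIF
```

That way each memo field is written once, so only one set of blocks per plan is abandoned rather than one per sentence.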
Thanks,
Marcus.