>Thought I would post a results message to the end of this thread.
>
>1. Building a single replace statement and using macro expansion to run DID improve performance
Expected that: a single REPLACE should be faster than a series of separate ones.
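For anyone following along, a rough sketch of the single-REPLACE technique (laChanged is a hypothetical two-column array of field-name / value-expression pairs; the original poster's actual code will differ):

```foxpro
LOCAL lcCmd, i
lcCmd = "REPLACE "
FOR i = 1 TO ALEN(laChanged, 1)
    IF m.i > 1
        lcCmd = m.lcCmd + ", "
    ENDIF
    lcCmd = m.lcCmd + laChanged[m.i, 1] + " WITH " + laChanged[m.i, 2]
ENDFOR
&lcCmd    && one trip through the engine instead of one REPLACE per field
```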
>2. Building a single "UPDATE" statement and using macro expansion did NOT, indeed it made things worse.
Did not expect that, to be honest.
>3. Using GETPEM() DID improve performance, quite considerably actually.
Yes, I tested that as well and it is faster.
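For readers who have not used it, a minimal sketch of the GETPEM() approach when the property name is only known at run time (object and property names here are made up for illustration):

```foxpro
* Hypothetical object with one property
oData = CREATEOBJECT("Empty")
ADDPROPERTY(m.oData, "cName", "Smith")
lcProp = "cName"

* Slower: build and evaluate an expression string
lcValue = EVALUATE("m.oData." + m.lcProp)

* Faster: direct lookup of the property by name
lcValue = GETPEM(m.oData, m.lcProp)
```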
>4. Changing memory management DID improve performance, minimal memory when running USE/REPLACE
> maximum when running SELECT statement. I actually made it a workstation setting and we run a speed test utility on each
> workstation to calculate optimal memory settings.
Are you talking about SYS(3050) and/or SYS(1104)?
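If it is SYS(3050), the switching described in point 4 might look roughly like this (the buffer sizes below are arbitrary examples, not the tuned per-workstation values the poster calculates):

```foxpro
* Small foreground memory buffer while doing the USE / REPLACE work
SYS(3050, 1, 32 * 2^20)      && 32 MB
* ... USE / REPLACE processing here ...

* Large foreground buffer before running the big SELECT
SYS(3050, 1, 512 * 2^20)     && 512 MB
* ... SELECT ... INTO CURSOR ...
```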
>5. I changed from using aMembers, I created my own array of fields as part of the scatter data method,
> scanning through my own array IS faster than aMembers!.
Do you mean you create the array once per instantiated object and reuse it, rather than calling AMEMBERS() for each save?
Since you have to visit every element of the array anyway, FOR EACH is faster than an indexed loop, i.e.

for i = 1 to alen(aa)
    x = m.aa[m.i]
endfor

versus

for each x in m.aa
    * m.x already holds the current element
endfor

The second form is faster because the explicit assignment to x never has to be executed.
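The "build the array once and reuse it" idea from point 5 could be sketched like this (a hypothetical class; the poster's scatter-data method will look different):

```foxpro
* Cache the member list once, at Init, instead of calling
* AMEMBERS() on every save
DEFINE CLASS DataObj AS Custom
    DIMENSION aFieldList[1]
    nFieldCount = 0

    PROCEDURE Init
        LOCAL laTemp[1]
        This.nFieldCount = AMEMBERS(laTemp, This)
        DIMENSION This.aFieldList[This.nFieldCount]
        ACOPY(laTemp, This.aFieldList)
    ENDPROC
ENDDEFINE
```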
>I have not yet gone over to separate cloned objects and using CompObj() but I am looking into it.
With this approach you keep an extra object around, and COMPOBJ() only tells you whether any property differs. To know which one changed, you still have to compare all the fields yourself.
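A sketch of what that comparison might look like (oSnapshot being a clone taken at load time; all names are hypothetical):

```foxpro
IF NOT COMPOBJ(m.oSnapshot, m.oCurrent)
    * Something changed, but COMPOBJ() will not say what,
    * so walk the properties to find the modified field(s)
    LOCAL laProps[1], i, lcProp
    FOR i = 1 TO AMEMBERS(laProps, m.oCurrent)
        lcProp = laProps[m.i]
        IF NOT (GETPEM(m.oSnapshot, m.lcProp) == GETPEM(m.oCurrent, m.lcProp))
            * m.lcProp is a field that needs to be written back
        ENDIF
    ENDFOR
ENDIF
```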
>Hope above helps anyone else in similar positions.
>
>
>Gary.
Gregory