Side by side comparison (strings & local data)
Message
From: Dragan Nedeljkovich (Zrenjanin, Serbia), 28/12/2003 01:32:12
      "Now officially retired"
To:   message of 27/12/2003 15:28:00
General information
Forum: Visual FoxPro
Category: Visual FoxPro and .NET, Miscellaneous
Thread ID: 00861648
Message ID: 00862315
Views: 35
>>>Hi all
>>>
>>>Just did a quick side-by-side comparison of C# and VFP8. The test is aimed at string building from local data using ADO.NET and Fox's local data engine. Can anyone tell me if I've messed up in any way, or if there is a quicker way of doing this:
>>>
>>>Here's the VFP program:
>>
>>Here's the VFP program the way I'd write it:
>>
>>LOCAL lcBigXML
>>Close Databases all
>>*Start time
>>SET DECIMALS TO 3
>>
>>USE C:\DEV\CSHARP\DEMO\DEMOS\DATA\CLIENT IN 0
>>
>>SELECT * FROM Client WHERE UPPER(Cl_Sname) = "LAWRENCE" INTO CURSOR TEST NOFILTER
>>
>>=AFIELDS(laFields, "Test")
>>
>>StartTime = SECONDS()
>>Set Textmerge to memvar lcText noshow
>>Set Textmerge delimiters to "{{","}}"
>>Set Textmerge on
>>\local lcXml
>>\	lcXML ="<CLIENT>"
>>	FOR f = 1 TO ALEN(laFields, 1)
>>\		lcXML = lcXML + "<{{Alltrim(laFields(f, 1))}}>" + ;
>>\			TRANSFORM(Test.{{Alltrim(laFields(f, 1))}}) +;
>>\			"</{{Alltrim(laFields(f, 1))}}>"
>>	ENDFOR
>>\	lcXML = lcXML + "</CLIENT>"
>>\return lcXml
>>Set Textmerge to
>>=StrToFile(lcText, "runner.prg")
>>Compile runner.prg
>>lcBigXml="<TEST>"
>>Scan
>>	lcBigXml=lcBigXml+runner()
>>ENDSCAN
>>
>>lcBigXML = lcBigXML + "</TEST>"
>>
>>*Show Time-Taken
>>?SECONDS() - StartTime
>>
>>Now if you don't mind, compare this for speed.
>
>
>Hi Dragan. Would you please explain why creating and compiling a separate prg is faster? Thanks.

Because it compiles only once. ExecScript() compiles the code on every call (unless we include the SCAN/ENDSCAN inside it, so it runs only once), and EVALUATE() has to look up the expression again for each field of each record.
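To make that concrete, here is roughly what the generated runner.prg ends up containing (the field names below are made up; the real ones come from AFIELDS() on the Test cursor), followed by the EVALUATE()-per-record version it avoids:

* Generated runner.prg, compiled once before the SCAN loop:
LOCAL lcXml
lcXML = "<CLIENT>"
lcXML = lcXML + "<CL_FNAME>" + TRANSFORM(Test.CL_FNAME) + "</CL_FNAME>"
lcXML = lcXML + "<CL_SNAME>" + TRANSFORM(Test.CL_SNAME) + "</CL_SNAME>"
lcXML = lcXML + "</CLIENT>"
RETURN lcXml

* The EVALUATE() alternative: the "Test.<field>" expression is looked up
* again for every field of every record, which is where the time goes.
SCAN
	lcXML = "<CLIENT>"
	FOR f = 1 TO ALEN(laFields, 1)
		lcField = ALLTRIM(laFields(f, 1))
		lcXML = lcXML + "<" + lcField + ">" + ;
			TRANSFORM(EVALUATE("Test." + lcField)) + ;
			"</" + lcField + ">"
	ENDFOR
	lcXML = lcXML + "</CLIENT>"
	lcBigXML = lcBigXML + lcXML
ENDSCAN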

Writing a simple .prg generator makes much more sense when we have to do some special selection among the fields. I once had an import procedure which brought over the same 1000 records, times about 128 fields, over and over, in case there was an update somewhere. There were several classes of records (same structure, but not all the fields were involved each time), and the structure of these records was guaranteed only for some of the columns; the others might or might not be there. So instead of checking for each field on each record, I generated a routine which would do the conversion only for the fields that were present. By my calculation, this saved some 40,000 IF statements per run, and it was about 60 times faster than the original version, and about 3 times faster than my last version before that.
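Roughly, the idea looks like this (everything here is made up for illustration: the Import and Target aliases, convert.prg, and the REPLACE-per-field conversion); the point is that the field checks happen once, at generation time, not once per record:

* Hypothetical sketch: generate a converter only for the fields that are
* actually present, instead of IF-ing on every possible field per record.
LOCAL lcCode, lnCount, f, lcField
lnCount = AFIELDS(laSrc, "Import")
lcCode = ""
FOR f = 1 TO lnCount
	lcField = ALLTRIM(laSrc(f, 1))
	* Emit a conversion line only for fields the target table also has
	IF TYPE("Target." + lcField) <> "U"
		lcCode = lcCode + "REPLACE Target." + lcField + ;
			" WITH Import." + lcField + CHR(13) + CHR(10)
	ENDIF
ENDFOR
=STRTOFILE(lcCode, "convert.prg")
COMPILE convert.prg

SELECT Import
SCAN
	SELECT Target
	APPEND BLANK
	DO convert
ENDSCAN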

back to same old

the first online autobiography, unfinished by design
What, me reckless? I'm full of recks!
Balkans, eh? Count them.