Heya Sorin ...
ah -
the concept of a 'test bed' ..
what's your test bed look like ?
and - moving forward - what criteria for anyone else's test bed
would be significant from a measurement standpoint ?
I suggest:
1. OS name/version/SP
2. amount of RAM
3. details on cache / VMM / swap file settings
4. complete disclosure on OS config files
5. complete disclosure on VFP config files
6. LOC [lines of code]
7. number of loops / iterative processes [per module]
8. any OUTSIDE drivers required
9. LOG file of runtime [from sysinternals FILELOG]
10. driver types for data at test time [odbc/ole-db/native]
11. disk type
12. hd interface type
13. cache on HD
14. cache on HD controller
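The checklist above could be partly automated so every tester reports it the same way. A minimal, hypothetical Python sketch (stdlib-only; items the stdlib can't read - RAM, caches, drivers, config files - are left as manual-disclosure fields, and the function/field names are my own invention, not anything from the thread):

```python
import json
import platform
import shutil

def capture_test_bed():
    """Record the machine-readable subset of the test-bed metrics.

    Items the standard library cannot inspect (RAM size, HD/controller
    cache, VMM/swap settings, driver types, config files) are emitted
    as None so the tester is forced to fill them in by hand.
    """
    total, used, free = shutil.disk_usage("/")
    return {
        "os": platform.platform(),        # item 1: OS name/version/SP
        "machine": platform.machine(),    # CPU architecture
        "disk_total_bytes": total,        # items 11-14 still need manual notes
        "disk_free_bytes": free,
        "ram": None,                      # item 2: fill in by hand
        "vmm_swap_settings": None,        # item 3
        "loc_per_module": None,           # item 6
        "data_driver": None,              # item 10: odbc / ole-db / native
    }

bed = capture_test_bed()
print(json.dumps(bed, indent=2))
```

Dumping it as JSON means two testers' beds can be diffed side by side before anyone argues about the numbers.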
I would suggest any test on W2K with 256MB RAM will be significantly different than on W2K with 1GB RAM, for example, as VFP is a memory hog [uh - .NET is as well on the server side of the CLR]
if you make a RAM DISK for the temp space, that's skewing the results, and that 'test bed' shouldn't count.
--Nota Bene - I'm not singling *YOU* out for this - but any real test should at least have *THOSE* metrics available, or the results will seem un-objective [at least in me small wee heid] ...
I'm sure there's the same kind of content for blot net tests - so let's see what other 'named' metrics will surface that match what VFP can do [that is measurable] ..
Note To JayVeePee - not your bailiwick, this time - you don't OWN the hardware.
mondo regards [Bill]
ps - that list wasn't exhaustive - but it's a good place to start.