>That's the part that makes one want to cry. We are working on computers which are at least 100X more powerful than those of a few years ago not to mention a few decades ago. The first mainframe I worked on (IBM 7040) had about 80KB of memory (16K words), a clock cycle around 1 microsecond, and no disk drive. At a cost of $100,000 the university computer center later doubled (maxed) the memory. A Burroughs B5500 computer of the same era (mid-sixties) could compile Algol programs at 3000 lines per minute.
>
>My HP Pentium 166 MHz can execute over a billion cycles in the time it takes to load FoxPro. What is it doing with all those cycles? Why are the programs so huge? There's got to be enormous waste in there.
I guess VFP starts by acknowledging the specifics of your machine and cutting your memory into usable pieces (memory-variable space, string space, buffers, .prg cache, etc.). There are just too many options to account for: "am I running a right-to-left version", "what fonts can I use", "which codepage", "how many file handles are available", "what colour shoes do I wear today", "where are my bitmaps", etc. etc. Besides, lots of the code around may come from different vendors - it has to check the OCXes and their accompanying .DLLs (if any), check the versions of a million things, etc. etc.
Your B5500 worked with its own Algol only and never had to bother about anything else. I also doubt the overall wisdom of things as they are now; many of them seem to be rather loosely coupled black boxes, and I wouldn't bet that each box was debugged or optimized for either speed or size. Just look at the hex dump of almost any of today's .exe files - why does nearly every one of them have to have a font list compiled inside?
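You can check that last point yourself without staring at a hex dump; a minimal `strings`-style scan pulls the embedded text, font names included, straight out of a binary. A sketch in Python (the blob here is a made-up stand-in for a real .exe, not an actual executable):

```python
# Minimal sketch of a "strings"-style scan: extract runs of printable
# ASCII from a binary, the way one would eyeball a hex dump looking
# for embedded font names. Illustrative only - the sample blob below
# is fabricated, not a real executable image.
import re

def printable_strings(data: bytes, min_len: int = 6):
    """Return all runs of printable ASCII at least min_len bytes long."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.group().decode("ascii") for m in re.finditer(pattern, data)]

# Fake "executable" blob with a font name baked in among binary noise:
blob = b"\x00\x01MZ\x90\x00" + b"Times New Roman\x00" + b"\xff\xfe\x07"
print(printable_strings(blob))  # -> ['Times New Roman']
```

Run it over any modern .exe and the font lists, dialog captions, and vendor boilerplate tumble out - a quick way to see where some of those megabytes went.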