65,000 total elements, which means the product of the array's dimensions can't exceed that limit - so the largest you can declare is [2,32500], [3,21666], [4,16250] ... [65,1000] and so on, in a single array.
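A quick sketch of what that limit looks like in practice (variable names are illustrative; the exact error VFP raises when you exceed it may vary by version):

```
* The product of all dimensions must stay within ~65,000 elements
DIMENSION laOk[65, 1000]      && 65 * 1000 = 65,000 -- fine
DIMENSION laOk2[2, 32500]     && 2 * 32,500 = 65,000 -- fine
* DIMENSION laBad[2, 32501]   && 65,002 elements -- exceeds the limit and fails
```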
Since VFP is data oriented anyway, and its cursor/Rushmore/SQL engine is far better than its array engine, I always think twice before reaching for arrays. For up to a few hundred elements arrays are fine, but if I need to do a lot of ASCAN()s, or want the array sorted in different orders at different times, I'd rather create a cursor and index it.
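A minimal sketch of that cursor-and-index approach (cursor and tag names here are made up for illustration):

```
* Instead of DIMENSION + ASORT()/ASCAN(), stuff the values into a cursor:
CREATE CURSOR csrNames (cName C(40))
INSERT INTO csrNames VALUES ("Smith")
INSERT INTO csrNames VALUES ("Jones")

* One index gives a sorted, reusable order and Rushmore-optimized lookups:
INDEX ON cName TAG cName
SEEK "Jones"            && indexed lookup instead of a linear ASCAN()
? FOUND()

* Need a different order later? Just add another tag and switch to it:
INDEX ON UPPER(cName) TAG cNameDsc DESCENDING
SET ORDER TO TAG cName  && and back again, without re-sorting anything
```

The point is that each INDEX ON is paid for once, after which any ordering or lookup is a SET ORDER or SEEK away, whereas an array has to be re-ASORT()ed for every new ordering.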
Manual sorting algorithms have their place... I wrote a multipass procedure of that kind in CP/M Turbo Pascal 3 back in 1987, when there was no other way to achieve what we needed. Nowadays, a built-in tool would have to be critically slow and/or unreliable to be worth replacing with a homemade routine.
>Nice point. An array can hold no more than ~65,000 elements.
>
>>>Hi!
>>>
>>>Do you want to discuss this topic from the science point of view? Here we do not use ANY assumption about the distribution of records. Anyway, a loop and a lot of statements in VFP are FAR slower than ASORT() for ANY distribution, as far as I know, just because VFP commands are run by a pure interpreter, while ASORT() is C++ code executed directly by the processor. Maybe for a small number of elements a VFP routine can be quicker; however, with a large number of elements it is not comparable.
>>
>>With a large number of elements you can't use arrays anyway. That's where I'd rather stuff them into a cursor and index it. At a few hundred thousand records this is pretty much the fastest option.