>Al,
>
>>I agree that this technique is elegant. However, Nadya asks for "efficient" in the thread title. As a guess, I'd say this technique isn't that efficient. For example, if you are trying to pull 40 random records out of a million, this technique means a million records will be pulled over the wire, sorted, then the top 40 sliced off the top. Doing 40 random GOTOs in the same million-record table may be more efficient.
>
>Certainly, if you have a huge table or a slow network-based file and only want a few records, it's much more time-efficient to do something like:
>
>
>rand(-1)
>lnRecs = reccount( "TheSampledTable" )
>select * ;
> from TheSampledTable ;
> into cursor TheSample readwrite ;
> where 1=2
>
>do while reccount( "TheSample" ) < lnSampleSize
> select TheSampledTable
> goto int( lnRecs * rand() ) + 1
> if ( ! deleted() )
> scatter name oData memo
> select TheSample
> append blank
> gather name oData memo
> endif
>enddo
>
I don't see how you're handling the case where you might select the same record more than once — unless you're DELETEing each picked record in TheSampledTable as you go. Otherwise, checking whether a given record has already been selected can get expensive.
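One cheap way around the duplicate problem (a language-neutral sketch, here in Python rather than VFP) is to draw all the distinct record numbers up front, before fetching anything, instead of testing each new pick against the records already gathered. The `sample_distinct_recnos` name and the 1-based record numbering mirroring RECNO() are my own illustration, not anything from the post above:

```python
import random

def sample_distinct_recnos(total_recs, sample_size, seed=None):
    """Pick sample_size distinct 1-based record numbers in one shot.

    Choosing the record numbers before fetching guarantees no record
    is visited twice, so no per-pick "have I seen this?" scan of the
    growing result cursor is needed.
    """
    rng = random.Random(seed)
    # random.sample draws without replacement
    return rng.sample(range(1, total_recs + 1), sample_size)

# Example: 40 distinct picks out of a million records
recnos = sample_distinct_recnos(1_000_000, 40, seed=1)
assert len(recnos) == len(set(recnos)) == 40
assert all(1 <= r <= 1_000_000 for r in recnos)
```

In the VFP loop above, the equivalent would be keeping the drawn RECNO() values in an array (or a small indexed cursor) and re-drawing on a collision; with 40 picks out of a million records, collisions are rare enough that the re-draws cost almost nothing. Records flagged DELETED() would still need a re-draw either way.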
Regards, Al
"Violence is the last refuge of the incompetent." -- Isaac Asimov
"Never let your sense of morals prevent you from doing what is right." -- Isaac Asimov
Neither a despot, nor a doormat, be
Every app wants to be a database app when it grows up