>>In my case, the 100,000 records are split over about 12-15 locations, so it would be reasonable as a first guess to assume that the view size would be on the order of 5,000-10,000 records (for a "main" office possibly quite a bit higher, perhaps 20-30K), so indexing on-the-fly could become a performance issue. Do you agree?
>
>I am not sure - indexing should be fairly fast, I think; I am more worried about fetching the records for the view.
>
>Speed tests are warranted.
>
>Also, see if you can add a second criterion to reduce the result set even more. I mean, does the user really need to see 20,000 records at a time???
This is a social-services agency that needs to be able to quickly look up ANY client, so I can't think of a very practical way to further limit the result set.
I'm afraid you're right that testing is the only good way to get some answers. Looks like I'll be rolling out some Cartesian joins to generate large test data sets :-(
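For anyone following along: the idea of using a Cartesian join for test data is that crossing a few small seed lists multiplies their sizes, so you reach 100K+ rows from tiny inputs. A minimal sketch in Python (the seed names, counts, and field layout are illustrative assumptions, not details from this thread):

```python
import itertools

# Hypothetical seed lists -- the product of their sizes sets the row count.
first_names = ["Alice", "Bob", "Carol", "Dave"]            # 4 values
last_names = ["Smith", "Jones", "Lee", "Brown", "Diaz"]    # 5 values
offices = [f"Office-{i:02d}" for i in range(1, 16)]        # 15 locations

def make_test_records(copies=350):
    """Cartesian join of the seed lists, repeated `copies` times.
    Yields 4 * 5 * 15 * copies records (105,000 with the default)."""
    records = []
    client_id = 0
    for _ in range(copies):
        for first, last, office in itertools.product(
            first_names, last_names, offices
        ):
            client_id += 1
            records.append(
                {"id": client_id, "first": first, "last": last, "office": office}
            )
    return records

records = make_test_records()
print(len(records))  # 105000
```

The same cross-product trick works in SQL (`SELECT ... FROM a, b, c` with no join condition); the point is just that record volume grows multiplicatively, so realistic per-office view sizes (5K-30K rows) are easy to hit for speed testing.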
Thanks for your help, in any case. The FAQ was particularly useful.
Ray Roper