>This one is quite particular, and I am not sure there is a way to predict the ideal index for optimization.
>
>We offer an advanced search. It has up to 20 fields in which the user may enter criteria to search a table of 2 to 4 million records. Most of those fields are relational, and some are character and date fields. They all have individual indexes. There are also some multi-field indexes built for other processes in the application, which the execution plan sometimes uses depending on the WHERE clauses involved.
>
>This usually works pretty well. However, the two most used fields are character fields: one must be searched with a LIKE clause, and the other, which has a full-text index, with a CONTAINS clause. Since those predicates cannot be fully optimized, some queries time out, taking more than 5 minutes, and this has to be fixed.
>
>For searches with fixed, predefined conditions, I have optimized things so they always respond immediately. Here, however, we cannot know in advance which fields will be used, so adding one or more fields changes how the execution plan works. I thought of creating an index that would include all 20 fields, but is that really the way to do it? Also, based on some quick analysis last night, even with two or three fields it does not work as expected.
>
>How do you handle such a situation? The goal is to get fast responses no matter which fields, and how many of them, are used in the search.
Michel, is this one large query, or a procedure with multiple steps / queries?
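One common way to attack this kind of "catch-all" search, regardless of the answer, is to build the WHERE clause dynamically from only the criteria the user actually filled in, and ask the optimizer for a fresh plan per combination. Below is a minimal sketch in Python assuming a SQL Server back end; the table and column names (`Records`, `LastName`, `Notes`) are made up for illustration, and values are always passed as parameters, never concatenated into the SQL text.

```python
def build_search_query(criteria):
    """Build a parameterized SQL statement from user-supplied criteria.

    criteria: dict mapping column name -> (operator, value), where the
    operator is "LIKE", "CONTAINS", or "=" (equality).
    Returns (sql_text, parameter_list).
    """
    clauses = []
    params = []
    for column, (op, value) in criteria.items():
        if op == "LIKE":
            clauses.append(f"{column} LIKE ?")
            params.append(f"%{value}%")   # leading wildcard: not seekable
        elif op == "CONTAINS":
            # Full-text predicate; the search term is a parameter too.
            clauses.append(f"CONTAINS({column}, ?)")
            params.append(value)
        else:
            clauses.append(f"{column} = ?")
            params.append(value)
    where = " AND ".join(clauses) if clauses else "1 = 1"
    # OPTION (RECOMPILE) asks SQL Server to compile a plan for exactly
    # this combination of predicates instead of reusing a cached plan
    # built for a different combination.
    sql = f"SELECT * FROM Records WHERE {where} OPTION (RECOMPILE)"
    return sql, params

sql, params = build_search_query({
    "LastName": ("LIKE", "smith"),
    "Notes": ("CONTAINS", "invoice"),
})
```

The point of the sketch is that the optimizer only ever sees the two or three predicates the user supplied, shaped for the indexes that cover them, rather than twenty optional conditions in one fixed statement.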