Help Optimizing a query
Message
From: 14/08/2001 15:11:59
To: 14/08/2001 14:55:31
General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax - Miscellaneous
Thread ID: 00542086
Message ID: 00543698
Views: 11
>>>>>>1. I am accessing the data locally.
>>>>>>2. The total number of records is 7,127.
>>>>>>3. The speed is approximately 17 seconds with this data set.
>>>>>>
>>>>>
>>>>>Did you try deleting all the tags and adding them back one by one, adding a new condition each time? 17 sec. seems too long to me. It should be 1-3 sec. Did you try it on different machines?
>>>>>
>>>>>>>Did you access this table locally or through the network? What's the number of records in a table and what speed do you get?
>>>>>>>
>>>>
>>>><snip>
>>>>
>>>>The last thing I tried was to create a simple index which would return 20,000 records. This test CDX contains no indexes other than that one. I ran a test on this one index and it still took 27+ seconds on my machine.
>>>
>>>Can you please provide more details? Especially, what is the SQL - SELECT command, and what index(es) you have?
>>>
>>>Hilmar.
>>
>>In this case I have a table with about 1.2 million records.
>>I have a character field, named RampCode, which has a width of one.
>>I have only one index, on the RampCode field.
>>As a pre-digest idea, I was going to store the values into this table so that
>>my SQL would basically be like this.
>>
>>Select * From VinData Where RampCode = lcRampCode Into Cursor qDetails NoFilter
>>
>>This takes approximately 21 seconds. I've updated the time from 27+. That was with SET DELETED ON.
>
>First, I wonder why the time changed at all once DELETED was set OFF?
>You call it a "test CDX". Does that mean that this is a separate CDX, apart from the 'production' CDX?
>If VINDATA records are wide, then naming the desired fields should make a significant difference.
>
>Honestly, I've found this thread a bit hard to follow. It seems that you report the latest result without reporting how the query/data/indexes differed. It might help to have all of the facts.
>
>good luck
>JimN

We've gone in quite a few different directions trying to come up with a solution so I can see why the thread would be hard to follow.

What I was attempting above was to determine if the size of the CDX was slowing the query down. I wanted to have one index only against the same data that all the other tests were run against.

I copied the data without any indexes. I then added a field called RampCode and indexed only on RampCode. I replaced RampCode with a value of "1" in 20,000 records, chosen based on the Ramp, Vendor and HourCount. In my current data that is the largest set a user might be able to select.
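For reference, the test setup described above amounts to something like the following. This is only a sketch: the Ramp, Vendor and HourCount field names and the lc* variables are taken loosely from this thread, and the FOR condition is a placeholder for whatever selection actually marks those 20,000 records.

* Copy the data only (no WITH CDX/WITH PRODUCTION, so no indexes come along).
USE VinData
COPY TO VinTest
USE VinTest EXCLUSIVE

* Add the pre-digested one-character key and build the single test tag.
ALTER TABLE VinTest ADD COLUMN RampCode C(1)
INDEX ON RampCode TAG RampCode

* Mark the candidate set (placeholder condition).
REPLACE ALL RampCode WITH "1" FOR Ramp = lcRamp AND Vendor = lcVendor AND HourCount = lnHourCount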

You're right about the number of fields. I changed the query to select just the fields required for the report and ran the test again. This time it was 25 seconds. Really, anything greater than 6 seconds is unacceptable, because this report has to run on a website. I believe Rick Strahl would only be happy with times under one second. I'm not even close to that.
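The field-list version of the query would look something like this; the column names here are hypothetical stand-ins for whatever the report actually needs:

SELECT Vin, Ramp, Vendor, HourCount ;
    FROM VinData ;
    WHERE RampCode = lcRampCode ;
    INTO CURSOR qDetails NOFILTER

Naming the fields matters because with SELECT * on a wide record, every byte of every matching row has to be copied into the result cursor; a narrow field list shrinks that copy even when the index lookup itself is fast.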

Actually, the DELETED setting can make a dramatic difference. I was doing a series of COUNTs against a large table and the times were 12 seconds, 15 seconds and 9 seconds, where just the other day they had been averaging about .3 seconds. Once I figured out that SET DELETED was ON, I turned it off and the times returned to about .3 seconds.
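The usual explanation is that with SET DELETED ON, every query carries an implicit NOT DELETED() filter; without an index on DELETED(), Rushmore cannot optimize that filter and falls back to checking the deletion flag record by record. If DELETED must stay ON, the commonly recommended fix is to add a tag on the DELETED() expression itself (tag name here is just a suggestion):

* Lets Rushmore optimize the implicit deleted-record filter
* instead of testing the flag one record at a time.
USE VinData EXCLUSIVE
INDEX ON DELETED() TAG DelMark

It would be worth re-running the 21-second query with SET DELETED ON after adding this tag, to see whether the times drop back toward the sub-second range.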