Can this SELECT be improved to go faster?
Message
From
24/07/2000 13:41:35

General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax
Miscellaneous
Thread ID: 00395152
Message ID: 00396239
Views: 11

Hi Larry,

>1. Issue the command =sys(3054,1) and then run your query. Sys(3054) reports the level of Rushmore Optimization that is being used.

Rushmore Optimization is "none" for table1 and "full" for table2. The index on survey_year is being used for optimization.
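
For reference, a minimal sketch of that check, assuming the tables are opened as in the SELECT quoted below (SYS(3054) is the documented ShowPlan switch):

=SYS(3054, 1)    && report the Rushmore optimization level for each table in the next query
* ... run the SELECT here ...
=SYS(3054, 0)    && turn the ShowPlan output back off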

>2. Check the indexes that you have for table1 and table2. It looks like you should have one on element, beg_odometer and end_odometer.

Table1 has only its PK index, and table2 has regular indexes on survey_year, element, and beg_odometer, but not end_odometer. Do you mean it should have one index on all three fields, like str(element)+str(beg_odometer)+str(end_odometer)?
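
As a sketch of the two readings, assuming table2 is already open under that alias (the STR() widths are guesses, since the real field sizes aren't shown):

SELECT table2
INDEX ON end_odometer TAG end_odo    && one tag per field, as with the existing tags
INDEX ON STR(element,10)+STR(beg_odometer,10)+STR(end_odometer,10) TAG elem_odo    && single compound tag

Rushmore matches index key expressions against the WHERE clause, so per-field tags line up with the comparisons in the query below, while a compound tag would only be used by a WHERE expression written exactly the same way.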

Thanks,
Bridget


>
>>Hi, all.
>>
>>The goal is for the resulting table to have one record for each lane that exists in each section of highway. The sections in table2 (which has the lane data) are smaller than those in table1, so there may be several sections in table2 that fall at least partly within a single section in table1. Here is my SELECT, which produces correct results:
>>
>>SELECT DISTINCT table1.section_id, table2.lane_number FROM data\table1 ;
>> INNER JOIN data\table2 ON table2.element = table1.element ;
>> WHERE ((table2.beg_odometer>=table1.beg_odometer AND table2.beg_odometer<=table1.end_odometer) OR ;
>> (table2.end_odometer>=table1.beg_odometer AND table2.end_odometer<=table1.end_odometer)) AND ;
>> table2.survey_year = cYear INTO TABLE data\my_test DATABASE data\temp
>>
>>I am just wondering if this can be refined to run faster, or if there is some other better approach. In my testing so far, I have about 350 records in table1 and 1.5 million records in table2, and it takes an average of 80 seconds. The full size of table1 will be about 32,000 records... so now we're talking hours. This may be acceptable if it's the fastest way, but I'm hoping that isn't the case. (I haven't worked with tables this large before, so I'm not sure what is a reasonable time to expect.)
>>
>>Thanks in advance for any advice!
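
For timing candidate rewrites against each other, a rough harness with SECONDS() (a standard VFP function) is enough; the body is just the SELECT above:

lnStart = SECONDS()
* ... run the SELECT here ...
? "Elapsed:", SECONDS() - lnStart, "seconds"
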
Bridget K. Dawes