Windows systems - is file fragmentation bad?
From: Hilmar Zonneveld, Independent Consultant, Cochabamba, Bolivia (01/01/2003 10:02:57)
To: (reply to message of 01/01/2003 09:51:30)
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax; Miscellaneous
Thread ID: 00736741
Message ID: 00737153
Views: 18
>I'm trying to dream up some simple tests to check this out and measure (basically) the impact of originally-laid (fragmented) versus defragged.

I am thinking about this.

I don't know whether it is better to start with a complete defrag or not - but either way, when adding records the files will become fragmented (since information is added to both the DBF and the CDX).

How about creating a table and an index, then inserting 10,000 or perhaps 100,000 records? This should result in a nicely fragmented table.
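Just as an illustration, the setup could look something like this in VFP; the table name testtab, the field list, and the record count are only assumptions on my part:

* Create a test table with a structural index tag, then load it with
* enough records that the DBF and the CDX grow interleaved on disk.
CREATE TABLE testtab (id N(10), name C(30), amount N(12,2))
INDEX ON id TAG id        && goes into the structural CDX (testtab.cdx)

LOCAL lnI
FOR lnI = 1 TO 100000
    INSERT INTO testtab (id, name, amount) ;
        VALUES (lnI, "Name " + TRANSFORM(lnI), RAND() * 1000)
ENDFOR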

Now, some ideas for the speed tests.

  • Sequential processing, with a SCAN. Do something simple, like copy a field to a variable each time.
  • SEEK tests in a loop.
  • A SELECT - SQL command that gets perhaps a total of 100 records.
  • Hmmm... multitable selects might be appropriate as well. This will require additional table(s) in the setup phase.

I consider the SQL - SELECT statements especially relevant, because I think they tend to be the most time-consuming.
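For what it is worth, the three timing loops could be sketched roughly as follows; SECONDS() is used for elapsed time, and the table and field names match the assumed setup above, so this is only a sketch, not a finished benchmark:

* Open the test table if it is not open already.
IF NOT USED("testtab")
    USE testtab IN 0
ENDIF
SELECT testtab

LOCAL lnStart, lcName, lnI, lnFound

* 1. Sequential processing with SCAN: copy a field to a variable each time.
SET ORDER TO 0                && physical order for the sequential pass
lnStart = SECONDS()
SCAN
    lcName = testtab.name
ENDSCAN
? "SCAN:  ", SECONDS() - lnStart

* 2. SEEK tests in a loop (needs the index order to be active).
SET ORDER TO TAG id
lnStart = SECONDS()
lnFound = 0
FOR lnI = 1 TO 10000
    IF SEEK(INT(RAND() * 100000) + 1)
        lnFound = lnFound + 1
    ENDIF
ENDFOR
? "SEEK:  ", SECONDS() - lnStart

* 3. A SELECT - SQL that returns about 100 of the 100,000 records.
lnStart = SECONDS()
SELECT id, name FROM testtab WHERE MOD(id, 1000) = 0 INTO CURSOR crTest
? "SELECT:", SECONDS() - lnStart
USE IN crTest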

Now, defragment your hard disk, and repeat the tests.

Hilmar.
Difference in opinions hath cost many millions of lives: for instance, whether flesh be bread, or bread be flesh; whether whistling be a vice or a virtue; whether it be better to kiss a post, or throw it into the fire... (from Gulliver's Travels)