>I have wondered that also, having come up from dBase 3 and using only tables. I've continued to use them because I understand them and they do everything I want, plus I can easily identify them and send them to others as stand alone files if the need exists (I sometimes have to copy to xxx fox2x to make them readable).
>
>But my question is different, although related. I have around 1000 individual files in the one directory for one application. It didn't start that way of course, it just grew with the application. This would of course be reduced if I used databases, but I'm not worried about the number unless having that many files in one directory slows things down. They used to say under Dos not to put too many files in the one directory or performance will be degraded. Does that matter any more?
John,
Yes, it does matter (significantly), depending on the OS the tables are held under. So:
- Dos -> bad
- W9x -> bad
- Novell 2.x, 3.x -> bad
- NT (all versions) -> good on NTFS (on a FAT volume it is still bad, so avoid FAT there)
- Novell 4.x, 5.x -> good.
All the "bad" ones scan the directory sequentially; the directory ends up cached in memory, but every lookup still burns CPU walking the entries. For a server this is BAD.
All the "good" ones have direct access to the directory entries (the entries are kept in a sorted/indexed structure).
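The difference between the two lookup strategies can be sketched in Python (a toy illustration only, not a measurement of the file systems themselves; the file names and counts are made up):

```python
import timeit

# Simulate the two directory-lookup strategies described above:
# a FAT-style flat directory that must be scanned sequentially,
# versus an indexed directory with direct access to entries
# (here modeled with a hash set).

def make_names(n):
    return [f"file{i:05d}.dbf" for i in range(n)]

def sequential_lookup(entries, name):
    # O(n): walk the directory table entry by entry.
    for entry in entries:
        if entry == name:
            return True
    return False

def indexed_lookup(index, name):
    # O(1) on average: direct access via the index.
    return name in index

names = make_names(10_000)
flat = list(names)        # flat directory table
index = set(names)        # indexed directory

target = "file09999.dbf"  # worst case for the sequential scan
t_seq = timeit.timeit(lambda: sequential_lookup(flat, target), number=100)
t_idx = timeit.timeit(lambda: indexed_lookup(index, target), number=100)
print(f"sequential: {t_seq:.4f}s  indexed: {t_idx:.4f}s")
```

With 10,000 entries the sequential scan is orders of magnitude slower per lookup, which is why a server handling many such lookups degrades.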
Now note that the bad ones get even worse, because FAT directory entries remain behind even after you delete the files. Copy 10,000 different files into a directory ten times (deleting the previous set each time) and you won't even get an answer anymore. Deleting the directory itself will solve that.
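That directory-bloat effect can be sketched with a toy model (a deliberate simplification of the behaviour described above, not a faithful FAT implementation -- real FAT can reuse freed entry slots, but the directory never shrinks):

```python
# Toy model of a flat directory in which deleted entries are only
# marked free, never compacted away.
DELETED = None

class FlatDirectory:
    def __init__(self):
        self.slots = []          # the on-disk directory table

    def create(self, name):
        self.slots.append(name)  # simplification: never reuse freed slots

    def delete(self, name):
        # Mark the entry as deleted; the slot itself stays behind.
        self.slots[self.slots.index(name)] = DELETED

    def lookup_cost(self, name):
        # A lookup scans every slot, dead or alive, until it hits the name;
        # a failed lookup scans the whole table.
        for scanned, entry in enumerate(self.slots, start=1):
            if entry == name:
                return scanned
        return len(self.slots)

d = FlatDirectory()
for batch in range(10):
    names = [f"set{batch}_{i:05d}.dbf" for i in range(10_000)]
    for n in names:
        d.create(n)
    if batch < 9:                # keep only the last set of files
        for n in names:
            d.delete(n)

live = sum(1 for e in d.slots if e is not DELETED)
print(live, len(d.slots), d.lookup_cost("missing.dbf"))
```

Only 10,000 files are left alive, but the table holds 100,000 slots and every failed lookup walks all of them, which matches the "you won't even get an answer anymore" symptom. Removing and recreating the directory resets the table to empty.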
About having fewer files when using a DBC? Not true IMO; the tables will still be there as always, and the DBC is just an additional file (another table holding info about the otherwise-free tables).
HTH,