>With launch, the program performs a number of tests on the source files and I am starting to get "Not a Table" messages when the files get large. I have a Utility that rebuilds the files from backup structures to deal with this but it is becoming annoying for a few of my clients.
>
Typically this is corruption of the table header. I'd guess that either the workstations/servers aren't being shut down in an orderly fashion (this often happens when someone has modified a table and then their station locks up), or that the server has too few shared handles available and is dropping inactive sessions to make room for new active ones, which leads to cache-coherency problems. You may want to investigate some registry changes on the server: disable opportunistic locking, increase the number of available file handles, and turn off some of the caching options in Windows 2000 Server.
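If you go the registry route, the oplock and caching values look roughly like the sketch below. This is an assumption-laden example, not a recipe: verify the exact value names and data against Microsoft's Knowledge Base articles for your service-pack level, and back up the registry before merging anything.

```
Windows Registry Editor Version 5.00

; Server side: stop the LanmanServer service granting opportunistic locks
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters]
"EnableOplocks"=dword:00000000

; Client side: tell the SMB redirector not to request oplocks
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\MRXSmb\Parameters]
"OplocksDisabled"=dword:00000001

; Client side: disable the redirector's local caching of file data
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters]
"UtilizeNtCaching"=dword:00000000
```

The file-handle limits live under the same LanmanServer\Parameters key, but the right numbers depend on your client count, so I'd look those up rather than copy someone else's values. Reboot (or restart the Server/Workstation services) for the changes to take effect.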
>In Windows 2000 there is no ScanDisk, so I have them activating CHKDSK every Friday night so it will run on the Monday morning. This has helped, but there are still problems. So I am now thinking I should ask them to run Defrag as well, but I don't actually know whether Windows 2000 has a Defrag utility like Windows 95/98.
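For what it's worth, the scheduling can be automated with the tools built into Windows 2000. A rough sketch (the drive letter and time are placeholders for your setup):

```
REM Force an autochk (CHKDSK) of drive C: at the next reboot:
chkntfs /c C:

REM Or schedule it for Friday night via the AT service; the piped "y"
REM answers CHKDSK's prompt to check the volume at the next boot,
REM since /f cannot run while the volume is in use:
at 23:00 /every:F cmd /c "echo y | chkdsk C: /f"
```

Either way the actual check happens at the next restart, which fits your "run Friday, take effect Monday morning" pattern.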
>
Windows 2000 does have a defragmentation tool: a limited version of Diskeeper, a commercial disk-maintenance product. I prefer the commercial version, since it offers more flexibility for unattended operation than the bundled one.
>Has anyone else had problems dealing with files on the order of 10-15 MB in the Windows NT environment? If so, what should I be telling my clients?
>
I seriously doubt it's the file size; I have files of several hundred MB operating in the Win2K environment. I'd investigate the clients' behavior regarding orderly shutdown and unexpected dropped LAN connections.