>Dear Andrus,
>
>I never cancel the connection, since it is not good practice to stop a connection in the middle of a fetch. SQL Server writes an error log entry each time, and if it is done repeatedly over a long period it increases the chance of crashing your server. Unless you live next to your servers and can do maintenance very often, I don't recommend doing this.
>
>I have struggled with this problem myself, too. Fetching ten thousand records takes a lot of time... I have thought about it for a long time, and I see two scenarios:
>
>1. If the 10,000 records are not updated very frequently (every few days or every several hours), I use a stored procedure on the SQL server to extract the records, and then I download the tables via FTP... if you have IPSTUFF, you can download via HTTP instead.
>
>Downloading the records with SQL pass-through (SQL-PT) would take a lot of time; if I create a separate table just for downloading, it is much faster. However, that table is not protected by SQL Server...
>
>2. If the data changes every few minutes or seconds, I limit the records using an offset, i.e. I only allow the user to download a portion of the table at a time, e.g. records 200-300, 100 records per fetch, then press "next page" for another fetch...
I have found that PostgreSQL has a LIMIT clause which can return part of a
table, like
SELECT customer_id FROM customer ORDER BY customer_id LIMIT 100 OFFSET 500
So using PostgreSQL it is possible to add a "fetch next 100 records" button to my grid control.
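That LIMIT/OFFSET query maps directly onto a "fetch next 100 records" button. A minimal sketch of the paging logic, using Python's built-in sqlite3 module purely as a stand-in for PostgreSQL (both accept the same `LIMIT ... OFFSET ...` syntax; the table contents and the `PAGE_SIZE` constant here are illustrative, not from the original post):

```python
import sqlite3

# In-memory table standing in for the real customer table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (customer_id INTEGER PRIMARY KEY)")
conn.executemany(
    "INSERT INTO customer (customer_id) VALUES (?)",
    [(i,) for i in range(1, 1001)],  # 1000 sample rows
)

PAGE_SIZE = 100  # records per "next page" click

def fetch_page(page):
    """Return one page of customer_ids; page 0 is the first page."""
    cur = conn.execute(
        "SELECT customer_id FROM customer "
        "ORDER BY customer_id LIMIT ? OFFSET ?",
        (PAGE_SIZE, page * PAGE_SIZE),
    )
    return [row[0] for row in cur.fetchall()]

first = fetch_page(0)  # rows 1..100
sixth = fetch_page(5)  # rows 501..600, same as LIMIT 100 OFFSET 500
```

One caveat: OFFSET still makes the server walk past all the skipped rows, so very deep pages get slower; for large tables, keyset paging (`WHERE customer_id > last_seen_id ORDER BY customer_id LIMIT 100`) avoids that cost.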
Should I switch to PostgreSQL ?
Andrus