Level Extreme platform
TABLEUPDATE() on a large number of records
Message
From
25/08/2004 12:29:04
 
General information
Forum: Visual FoxPro
Category: Coding, syntax & commands / Miscellaneous
Thread ID: 00936296
Message ID: 00936349
Views: 19
Yeah, I saw it. It takes me forever to write a message so I didn't see it after I posted. :)

>Did you see Steve's reply to me? Seems MSDE has an overall 2GB limit for the database, not just the tables. So this kills that idea. Looks like it's native tables, full SQL Server, Oracle or MySQL for your archives.
>
>>Hi Mark
>>
>>Thanks for the reply. That is an interesting idea to use MSDE or SQL Server for the archived tables. For now though, I think I'll just stick to FoxPro. But if it turns out this mechanism is way too slow because of checking for the 2 gig/1 billion record limit, I'll make that suggestion to the powers that be. Speaking of my question from yesterday, I decided to check every 10,000th record. We are only going to use 90% of the limits, so I'd really be checking for 1.8 gig/900 million records.
>>
>>It sounds a lot like him to commit every 100th record so that the changes would seem faster. I was just wondering if TABLEUPDATE() could fail on a large number of records, but what you said makes sense.
>>
>>Thanks,
>>Chris
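
For reference, a minimal sketch of the "check every 10,000th record" idea described above. The alias name and the rollover handling are hypothetical; the thresholds are 90% of VFP's 2GB-per-table and 1-billion-record limits, per the figures in the thread.

```foxpro
* Sketch only -- "archive" is a hypothetical alias.
#DEFINE MAX_BYTES 1932735283   && 90% of 2 GB
#DEFINE MAX_RECS  900000000    && 90% of 1 billion records

IF MOD(RECCOUNT("archive"), 10000) = 0
    * Commit the pending buffered changes for this table.
    IF NOT TABLEUPDATE(.T., .F., "archive")
        AERROR(laErr)
        * ... handle the failed update here
    ENDIF

    * Estimate table size from header plus record size times count,
    * and compare both size and record count against the 90% limits.
    IF HEADER("archive") + RECSIZE("archive") * RECCOUNT("archive") > MAX_BYTES ;
            OR RECCOUNT("archive") > MAX_RECS
        * ... roll over to a new archive table here
    ENDIF
ENDIF
```

This only estimates the DBF size (memo files grow separately in the FPT), so an FSIZE()-style check on the actual files would be a stricter test.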