VFP and SQL Server in One app
General information
Forum: Visual FoxPro
Category: Other, Miscellaneous
Thread ID: 01647697
Message ID: 01647819
Yeah, I was wondering if anyone would even bother to try to follow it :) - it's a pretty long and painful read for sure - glad you did. It seems like I might actually have something pretty slick here once I get it all put together. I'm developing a sort of 'wizard' to do this automatically - so in theory I should be able to take any set of FoxPro data, run my wizard, select the tables I want to auto-sync, and the wizard will create all the '_pending' tables and the local and remote views in VFP, and generate a .SQL script that runs via SQLEXEC to create the tables on whatever backend you want. The other thing this can do is update more than one remote database at a time: when someone updates a VFP record, it can go out and update multiple backends at once - say PostgreSQL, Oracle, and SQL Server all at the same time (rough sketch below). I actually wrote it with that possibility in mind partly because they insisted on using PostgreSQL and I wanted to use SQL Server, and my concern is that someday they will come to me and say they want to switch backends. I don't know how useful that would really be to anyone else, but I suspect other VFP developers might like the tool itself so they can accomplish what I'm doing here with just a couple of clicks.
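
Just to give an idea of what that last step looks like, here's a rough sketch of running the wizard-generated .SQL script against several backends via SQLEXEC. The connection strings and the script name are made up for illustration, and the real wizard does more error handling than this:

* Rough sketch: run the generated DDL script on each backend
LOCAL laConn[3], laCmds[1], lcScript, lnCmds, lnHandle, lnI, lnJ

laConn[1] = "Driver={PostgreSQL Unicode};Server=pgbox;Database=mydata;Uid=me;Pwd=secret;"
laConn[2] = "Driver={SQL Server};Server=sqlbox;Database=mydata;Trusted_Connection=Yes;"
laConn[3] = "DSN=OracleBox;Uid=me;Pwd=secret;"

lcScript = FILETOSTR("create_sync_tables.sql")   && script generated by the wizard
lnCmds   = ALINES(laCmds, lcScript, 1, ";")      && naive split: one statement per element

FOR lnI = 1 TO ALEN(laConn, 1)
    lnHandle = SQLSTRINGCONNECT(laConn[lnI])
    IF lnHandle < 1
        ? "Could not connect to backend " + TRANSFORM(lnI)
        LOOP
    ENDIF
    FOR lnJ = 1 TO lnCmds
        IF !EMPTY(laCmds[lnJ])
            IF SQLEXEC(lnHandle, laCmds[lnJ]) < 0
                ? "Statement failed on backend " + TRANSFORM(lnI)
            ENDIF
        ENDIF
    ENDFOR
    SQLDISCONNECT(lnHandle)
ENDFOR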

The reason I created this was the reports they want. There are a few hundred clients, but each client can have more than one company, and each company has its own VFP database - so if a client has 10 companies, all their data is in 10 different VFP databases on 10 different servers, and there is no practical way to generate a report with any analytics for their entire enterprise (all 10 companies). By doing this I've got all the data in one PostgreSQL database, with the records for each identified by a company number and an enterprise number - and of course the data ends up being 100% up to date because of the syncing. So reports can be run against the PostgreSQL database (quick example below) :)
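
For example, once everything is in the one database, an enterprise-wide rollup becomes a single query. The table and column names here (invoice, inv_total, enterprise_no, company_no) are hypothetical, just to show the shape of it:

* Hypothetical enterprise-wide sales rollup against the consolidated PostgreSQL data
LOCAL lnHandle
lnHandle = SQLSTRINGCONNECT("Driver={PostgreSQL Unicode};Server=pgbox;Database=consolidated;Uid=rpt;Pwd=secret;")
IF lnHandle > 0
    SQLEXEC(lnHandle, ;
        "SELECT company_no, SUM(inv_total) AS sales " + ;
        "FROM invoice WHERE enterprise_no = 42 " + ;
        "GROUP BY company_no ORDER BY company_no", "crSales")
    SELECT crSales
    BROWSE NORMAL    && or feed the cursor to a report
    SQLDISCONNECT(lnHandle)
ENDIF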

I thought about using the Stonefield Database Toolkit and extended DBC properties to handle some of this stuff, but I ended up not doing that because I don't really have a lot of time here. If I had to do it again and actually had a realistic time frame, I would look into that further, because I think my slick idea here could be even slicker with SDT involved - it could be even more data-driven than it is now.

The only thing that has irked me on this project so far is how they've decided to do the reports. I of course recommended the SQL Server reporting tools - but they don't want to pay for SQL Server licences - so I said we should use Crystal Reports - and got the same blowback: don't wanna pay for it. So, much to my disapproval, they hired a guy to create a report writer - and he's writing it in 32-bit Delphi for Windows! Arrrgg! When I realized they had made this move I asked the Delphi guy if he could compile it into a web app - because, ya know, it's 2017 and it doesn't seem like a good idea to create a 32-bit Windows app for reporting - but he says he doesn't know how to do that. It's my understanding that there is a way to use the same Delphi code base to create a web app, but I really don't know anything about that. I really feel this is sort of shooting yourself in the foot, and I laid out all my reasons for thinking so and brought it up repeatedly in meetings, but no one seemed to listen to me. I didn't even realize they were hiring anyone to do this until after they had hired the guy - and naturally he's a personal friend of the big boss, so there's not much I can say now to convince anyone.
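
For anyone trying to picture the trigger/stored-procedure piece described in my earlier message (quoted below), it boils down to something roughly like this. All the names here (customer, cust_pk, name, city, the connection string) are hypothetical, and the real procedures also handle inserts, deletes, the "apply the _pending record first" check, and the mapping to the normalized tables:

* Wired up once in the DBC:
*   CREATE TRIGGER ON customer FOR UPDATE AS SyncToPg("customer")

PROCEDURE SyncToPg(tcTable)
    LOCAL lnHandle, llOk
    llOk = .F.
    lnHandle = SQLSTRINGCONNECT("Driver={PostgreSQL Unicode};Server=pgbox;Database=consolidated;Uid=sync;Pwd=secret;")
    IF lnHandle > 0
        * The trigger fires with the source table's record current, so its
        * fields can be passed straight through as parameters.
        llOk = (SQLEXEC(lnHandle, ;
            "UPDATE customer SET name = ?customer.name, city = ?customer.city " + ;
            "WHERE cust_pk = ?customer.cust_pk") > 0)
        SQLDISCONNECT(lnHandle)
    ENDIF
    IF !llOk
        * Backend down or statement failed: park the change in the matching
        * '_pending' table so it can be pushed later.
        SCATTER MEMVAR MEMO
        INSERT INTO (tcTable + "_pending") FROM MEMVAR
    ENDIF
    RETURN .T.    && returning .T. lets the local VFP update go through either way
ENDPROC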



>Good grief, that's brilliant. And painful to read. :) All those parts flying this way and that. Congratulations.
>
>>>My company has a legacy system running pure VFP8.
>>>One of our clients wants to move to SQL Server as a back end.
>>>We thought about rewriting the app using C#.
>>>But we have decided to use the same VFP code and simply add SQL Server capability... with the same table structures.
>>>
>>>So, the app can run in full VFP mode or full SQL Server mode.
>>>
>>>Can anyone make an argument AGAINST doing such a thing?
>>>
>>>Thanks in advance..
>>
>>A lot depends on how the VFP app was written and how practical this is. One thing I've done quite a few times in the past is change the VFP app to use just remote views.
>>I do it like this:
>>1) Create a DBC with nothing in it but remote views that point to the VFP data.
>>2) Create identical tables in SQL Server
>>3) Now you can change the connection string in the remote views to point to the SQL Server tables.
>>
>>That is of course the short version of the story - but you get the idea. I've done this quite a few times and had pretty good success with it.
>>
>>Now that is all fine and dandy depending upon how much of your code has to be re-worked to make use of the remote views.
>>
>>Right now I'm dealing with a similar issue where they want all the data in PostgreSQL tables - but the app is so old that none of the tables have primary keys, the data is not at all normalized, and they did not use .CDX index files or a DBC. The application has thousands of screens and miles of code, so rewriting it to use remote views and such is not a practical plan - but they want the data in PostgreSQL now - so this is what I'm doing instead:
>>1) Put all the tables in a DBC and created primary keys on each of the tables
>>2) Created an identical set of tables in the VFP DBC with '_pending' at the end of the name, and added a few extra fields to each
>>3) Created a normalized set of tables in PostgreSQL (so some tables are broken into multiple tables)
>>4) Created remote views to the PostgreSQL tables, and local views to the '_pending' VFP tables
>>5) Created insert, update, and delete triggers for each of the VFP tables (not the _pending ones)
>>6) Created a stored procedure in the DBC that, with the use of the triggers:
>> a) adds to, deletes from, and updates the PostgreSQL tables via the remote views - any data mapping is done here (because the tables are now different, since the data is normalized)
>> b) If the PostgreSQL update fails, the record's data is logged to the '_pending' table.
>> c) Prior to doing the add/update/delete action, it uses the local views to the _pending tables to see if there is a record in there with a matching primary key - if there is, it applies the _pending record first, THEN the record currently sent via the trigger.
>>7) Created another stored procedure in the DBC to run record validation
>> a) The record validation rule logs the last update time to the source VFP table's record.
>>8) Created another stored procedure for add/update/delete - but this time all it does is update the '_pending' tables instead of the PostgreSQL tables (this will make sense in a minute...)
>>9) To implement it all, I created a tiny app to do the initial data upload: it imports the old VFP tables from a recent backup into my new VFP tables and runs the same stored procedures to update the PostgreSQL tables. So, the rollout:
>> a) kick all users out
>> b) import VFP tables into new VFP tables in DBC - don't create any triggers yet
>> c) after import, create triggers to update to pending tables
>> d) allow users back in
>> e) now import all the data from VFP to PostgreSQL - use the tiny app to run the stored procedure on each record (due to the size of the data, in my case this takes several days)
>> f) kick users back out
>> g) change triggers to update to PostgreSQL tables
>> h) let users back in
>> i) push the '_pending' records up to the PostgreSQL server (use my tiny app for that)
>>
>>....so the result here is that any time someone makes a change to a record in a VFP table, the PostgreSQL table(s) are automatically updated - and with any luck whatsoever this will only require some very minor adjustments to the VFP application's code, and the users won't have to be out of the app that long to implement it. The only other piece of the puzzle would be a little gizmo that runs to push the _pending tables - either as part of the stored procedure or my tiny app on a timer.
>>
>>Now of course this creates a live set of PostgreSQL data - but there is nothing here that allows the PostgreSQL data to update the VFP tables, which in my case is fine because the PostgreSQL data will be read-only. But I think I could come up with a way to do the same thing in reverse, with triggers and stored procedures in the PostgreSQL database to update the VFP tables (and come up with a way to prevent an endless loop) - and I think I could probably pull that off if it ever came down to it.
>>
>>Maybe my idea here is nuts or someone else has a better idea - but to rewrite the VFP app would take 10 to 12 programmers a solid 2 years, and I am just 1 programmer and they need the data in a few weeks - so this is what I came up with. Long term, I figure I can slowly migrate the app to something web-based and have the users use 2 apps for a while, 1 VFP app and 1 web app - eventually getting rid of the VFP app altogether.
ICQ 10556 (ya), 254117