Level Extreme platform
Which is best for Desktop Apps, VFP or .NET?
Message
From: Walter Meester, Hoogkarspel, Netherlands
Date: 01/02/2004 15:38:10

General information
Forum: Visual FoxPro
Category: Other, Miscellaneous
Thread ID: 00860600
Message ID: 00872808
Views: 114
Hi Rick,

>Sorry for the late response to this. I'm on vacation and I promised myself I won't spend more than an hour a day near the computer <g>...

Well, you're better off than me. My wife would not let me anywhere near a computer for two weeks if I'm on vacation.

>Maybe I expressed myself a little vaguely in that post (can happen when replying to messages at 2 in the morning <g>). My point here is that a) you can get strong typing at design time including compiler support for type checking making it possible to catch many errors and typos at design time rather than at runtime and b)

I don't see much value in this for the final application per se. There is much to say about strong vs. weak typing. I've got experience with both, and I must say that when doing, for example, C++, working around the problems that come with strong typing (as you might not know the type of object that will be passed) seems more of a problem than the problems that come with weak typing. However, this is a discussion that should not be scoped to the database side of programming, as it is a much broader subject.

>the ability to use a clean object model to access information about the data that you're working with at runtime. You can do the latter in VFP as well, but it's not nearly as complete nor as clean (laFields[x,2] to retrieve a type is not particularly clear, for example).

I recognize this is a cleaner situation in .NET.
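To make the difference concrete, here is a rough sketch in Python (standing in for both styles, since neither VFP nor C# code fits this page; the table and `Column` wrapper are invented for illustration): positional metadata access in the spirit of laFields[x,2] versus named per-column objects in the spirit of ADO.NET's column model.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT, balance REAL)")
cur = conn.execute("SELECT * FROM customer")

# Positional style, roughly like VFP's laFields[x, 2]:
# each entry of cursor.description is a 7-tuple, and you must
# remember that element [0] happens to be the column name.
positional = [d[0] for d in cur.description]

# Object-model style, roughly like a DataColumn: a named property
# instead of a magic array index.
class Column:
    def __init__(self, name):
        self.name = name

columns = [Column(d[0]) for d in cur.description]

print(positional)                  # ['id', 'name', 'balance']
print([c.name for c in columns])   # ['id', 'name', 'balance']
```

Both give the same answer; the object model just makes the intent readable without knowing the array layout by heart.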

>ADO.Net also provides a clean abstracted data model that allows you to pass data around which is not easy to do with cursors period.

Agreed.

>You can’t take a business object that uses a cursor as its underlying data store and pass it over COM or a Web Service to another tier of an application. There are ways to get around this (as I have built into my own WWWC framework, for that matter), but it takes a lot more work or a framework to do this. The main point I’m trying to make with this is that ADO.Net is a set of consistent related classes and an object model that provides you access to the data functionality.

This has always been the main advantage of objects; I agree. However, this is something you should use with care: you don't want to send objects with large data structures over the network when it's not needed.

>This is not to say that you can’t do the things in VFP, but this is trying to say that you can do just about everything that you do with VFP with ADO.Net. But you will do it differently. I think a lot of people who give .Net a bad rap are doing so based on doing things the VFP way, which if you ask any non-VFP developer is not a common practice… Just because things that we do in VFP don’t work the same in .Net doesn’t mean it can’t be done or even that it may require more code. It’s just done differently.

I think you (not you personally) have to take a higher-level view of how data should be handled over a network, especially in enterprises. There is a lot to read about this subject in the hardcore database magazines. When you 'downgrade' those practices and theories to the normal app, you get a much better view of what 'best practices' are. And I must say that .NET with its data handling does not come close to what you can do in VFP, because of the relational local database engine (for local data munging).

>>for anyone. However, even when you're considering switching to .NET, you've got to realize that you'll leave the luxury of a local database engine behind and that handling data is substantially different in many ways. You've got to decide for yourself if this is something you're willing to give up.

>Well, I agree on this and I do miss a local data engine. But that’s part of the choice you have to make as a developer comparing tools.

Which part do you miss the most?

>> 1. A database server should in its essence not be misused for such tasks. Data munging can be very resource-consuming and therefore should not take place on any database server; it greatly increases the risk of performance problems. I know a few DBAs who have to deal with those situations just about every day, so this certainly is not a non-issue. A database server is meant to give you your data in its raw form (just as a file server gives you your raw files). Data munging is far better left to the client, *IF* the client is capable of handling it.

>I don’t agree with this statement at all. Name one other tool that is not x-Base that does things with a local data engine?

There are a lot of tools outside the xBase world: MS Access, Paradox, and a whole lot of other desktop database tools that can connect to remote databases and download temp tables from the server.

>Read any pure Database relational text and there will be no mention of offloading data for ‘local processing’. That’s a silly notion – that’s what the database is there for.

I don't think this is correct. In the corporate world this is not uncommon practice. In OLTP environments it is a hard requirement. Typically in such an environment you have one heavy database server serving raw data, and you'll have a data warehouse or an OLAP database for extracting 'information' (read: munged data) out of it. There are a lot of possible infrastructures, but they all share the goal of offloading the server that takes the most transactions and doing the analysis on another server, in business objects, or even in the front-end application.

Even in DS systems where you have lots of data on your front ends (not untypical in ERP/ERM) you want to avoid stressing the database. You don't want to wait 5 minutes to save your data because some management report is running and locking data you are about to change.

>Load balancing and dealing with data performance are real issues but people who know what they’re doing with the data server should be able to deal with this.

Well, the fact is that the people administering the databases are almost never the ones writing the applications, so they have to optimize what can be optimized. But this does not mean the problem could not have been avoided in the first place.

>Even discounting that, here again you make this assumption that you can’t filter or traverse the data as you can in VFP. It’s true that there are fewer options for looking at the data, but OTOH you’re dealing with a simple structure that can be traversed extremely quickly, so ‘filtering’ data can be done simply by traversing the list for example and pulling out what you need into a custom view. This is no less efficient than filtering or sorting on the fly in VFP. And if you use the built in high level structures in .Net such as DataViews it doesn’t even take more code than it does with VFP.

>DML inside of a language outside of xBase is a myth. Just about every other programming environment uses an object approach to data access where you pass SQL strings and return a result that is returned to you in some sort of object. DML is a Data Engine thing and only in XBase does the line between data engine and 4GL language mix…

As I said, I don't think this is correct. There are many languages that do use a DML: MS Access, Paradox, PowerBuilder, and a few others as well.

>> 1. It is a 3GL solution to a 4GL problem. In a 4GL you specify more of what you want rather than how it is implemented.
>> 2. Iterating through collections is far less readable than a few solid SQL or xBase commands. Though I admit that setting up a good object model with useful method naming eases the pain a bit.
>> 3. As a result of 2, writing bug-free code is much harder to do.
>> 4. As a result of 2 and 3, readability is much harder also.
>
>First I’m not sure what you’re billing as 3GL vs. 4GL. I think you’re comparing Views vs. a Data object?

No, 3GL = third-generation language. The use of objects and iterating through them is considered a 3GL operation, whereas using a DML such as SQL is considered 4GL.

>The rest of the points are totally subjective and based on your opinion. Bug-free code of all things has no place in this discussion as this is a completely separate issue. If anything ADO.Net makes this much less of an issue by full support for Intellisense and compile time validation of the code you write including, if you choose, of your data (typed DataSets or typed DataRows for example).

The fact is that if you have to implement a SQL query on local data in ADO.NET, you've got to write a lot of code. As a result, it is more difficult to write bug-free code, as you literally have to program what the SQL command would have done. The risk of having a bug in there is much more of a problem than with a single SQL command. Readability is harder too, because a developer has to wade through the code that accesses the collections and objects, trying to figure out how things are done. With a simple SQL command this is not a problem, because you're not interested in how it is handled internally; you assume beforehand that it is bug-free.
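A small Python sketch of the 3GL-vs-4GL point (the table and values are invented; SQLite stands in for any local SQL engine): the same aggregation written once as a declarative SQL statement and once as the kind of loop you end up writing against in-memory collections. The loop is where the extra bug surface lives.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 50.0);
""")

# 4GL style: one declarative statement saying WHAT you want.
sql_total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE customer = 'acme'"
).fetchone()[0]

# 3GL style: fetch the rows and say HOW to aggregate them, by hand.
# Every line here (the filter, the accumulator, the unpacking) is a
# place where a typo becomes a runtime bug.
total = 0.0
for _id, customer, amount in conn.execute("SELECT * FROM orders"):
    if customer == "acme":
        total += amount

assert sql_total == total == 200.0
```

Two lines of SQL versus half a dozen lines of loop, and the loop grows with every join, group, or sort you add.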

>> I know looping and filtering is possible. But how about SETting an index ORDER and drilling down the index with a SCAN FOR WHILE? These are just the DML commands I so highly appreciate, because they are the fine building blocks that are key to success in data munging. And yes, I'm talking about cursors and views, not DBFs specifically.
>
>A DataView provides this functionality. And lest you think this is a lot of code – it is not. Creating a DataView requires two lines of code, after which you get a View that you can traverse or data bind.

The DataView must be built up first, and given all the internal handling behind that process and the absence of the index 'concept', performance-wise it is no alternative to the very quick DML of setting an index and drilling down to get your data.
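A rough illustration of why the index drilldown is fast, in Python (the `bisect` module standing in for a VFP index; the data is invented): SEEK binary-searches straight to the first matching key, and SCAN WHILE stops at the first non-match, so only the matching slice is touched, whereas a filter must visit every row.

```python
import bisect

# Rows kept sorted on the key column -- the moral equivalent of SET ORDER.
rows = sorted([("globex", 3), ("acme", 1), ("initech", 4), ("acme", 2)])
keys = [r[0] for r in rows]

def seek_scan_while(key):
    """SEEK + SCAN WHILE: jump to the first match, stop at the first miss."""
    i = bisect.bisect_left(keys, key)            # SEEK: O(log n) jump
    out = []
    while i < len(rows) and rows[i][0] == key:   # SCAN ... WHILE key matches
        out.append(rows[i])
        i += 1
    return out

# Versus a filter, which always walks the whole table:
linear = [r for r in rows if r[0] == "acme"]

assert seek_scan_while("acme") == linear == [("acme", 1), ("acme", 2)]
```

On three rows the difference is invisible; on a million rows with a handful of matches, the seek-and-scan touches only the matches while the filter touches everything.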

>Ultimately developers have to make their own decisions and not just look at one or two bullet points that support their point of view to justify their choice. I know I use .Net because a) I like it, b) it works for the apps that I'm building both internally and for customers and because c) I believe it is the future for development with Microsoft development tools. The road to get there hasn't been easy and you can bet I still get frustrated at the things that don't work right in .Net. But heck, what dev tool works 'right' in the first place? VFP surely is one that has many, many quirks and odd behaviors. The trick is mastering the tweaks and peculiarities and taking advantage of them wherever possible.

And this takes more than a few months. I've been programming the Fox for around 10 years now, and I can say I'm pretty comfortable munging my data into any form very quickly. This has always been my expertise and will be in the future. This is why I love VFP: VFP is about data and offers you a VERY extensive toolset for getting at your data. This is exactly my objection against ADO.NET: it is not mature enough to give me the performance I need.

I've been busy the last week optimizing a routine that munges data from a dozen tables into one. There is a lot of calculation involved. The calculation, however, was nothing compared to getting the data from the database (either DBFs or SQL Server) through SPT or views. The performance on a slow network (10 Mb) on a 700 MHz machine was disappointing. I could not optimize the data retrieval with SQL, either with VFP or SQL Server. For the VFP version of the server I reverted to the technique of SETting ORDER, SEEKing, and SCAN WHILE. The data retrieval was about 10 times as fast. I'm still baffled about the performance difference (something I need to find out). This illustrates once again that you have so many ways in VFP to skin the cat.

Walter,