Message

To: Walter Meester (Hoogkarspel, Netherlands)
Date: 08/02/2005 11:13:29

General information
Forum: Visual FoxPro
Category: Conferences & events, Miscellaneous
Thread ID: 00983141
Message ID: 00985008
Views: 41
>>>>You won't get arguments from me on those (although 64 bit is questionable and probably will be provided anyway once the MS C++ compilers start spitting out 64 bit code more easily).
>
>>However, what do you hope to get out of this? What will that buy you really?
>
>I'm not saying this is something I or anybody else needs right now. You were talking about ground-breaking enhancements.

It's harder and harder to come up with a 'ground-breaking' technology in anything. After all, software development works a certain way and has for a long while. I think what's happening now is that Microsoft is trying to make the tools more efficient and more integrated, for better or for worse. The stuff in Whidbey (the Whitehorse visual class designers - class designers, not UI - the various data-binding hookup wizards and mappers, and the ever-expanding role of IntelliSense in the environment) is making the process more integrated.

But for core language and development features I think things are pretty stale.

>I seldom if ever use UDFs in SQL commands, and I try to avoid any functions in there too, for this particular reason.

You may have that discipline, but most people (myself included) often don't have that sort of constraint. For you it's probably because you're not using VFP data in the first place (SQL Server, right?) - you don't really have a choice. It's different if I'm using VFP data and I'm pretty sure I won't move to a SQL backend. Especially since using UDFs or VFP functions may reduce the number of queries you have to run in some situations.
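As a hedged illustration of this point (plain Python with sqlite3, not VFP): some local engines let you call an application-level function right inside a query, which can collapse a query-then-postprocess round trip into one statement - but a server backend like SQL Server knows nothing about that function, so the query no longer ports as-is. The table and function names below are made up for the example.

```python
import sqlite3

# In-memory database standing in for local (VFP-style) data.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.0), (3, 40.0)])

# An application-level function registered as a UDF. Convenient
# (one query instead of query-then-postprocess), but it ties the
# statement to the local engine.
def discount(amount):
    return amount * 0.9 if amount >= 20 else amount

con.create_function("discount", 1, discount)

rows = con.execute(
    "SELECT id, discount(amount) FROM orders ORDER BY id").fetchall()
print(rows)  # [(1, 10.0), (2, 22.5), (3, 36.0)]
```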

>Currently I see .NET moving towards VFP (VFP data features being implemented in .NET), not the other way around.

Again, I'm not sure what exactly it is in the language that makes VFP so indispensable.

>You certainly have a point in that the 'VFP community' looks for solutions to data munging on the client, while .NET-ers look for solutions on the server. Neither is the right way; it should be processed where it belongs. No arguments from me that it is possible to write massive systems in .NET, but the type of products I'm busy with are not best suited to all-server processing. Not only would the architecture have to be totally reviewed, but it does not make sense to store certain data on the server. I generally work with lots of metadata that is stored either in a database or in the executable. Munging both local and remote data is very common - not because I prefer it that way, but mainly because it is difficult to do otherwise.
>I'm perfectly aware that the types of applications I write are not found too often. They are very data-intensive.

I can relate. I have a couple of consumer-level applications (Help Builder) that I had at some point considered porting to .NET. But there were a number of issues that would have made this process difficult, and ultimately there wasn't going to be a lot of benefit to it. So the application remains in VFP, quirks and all. But ultimately, choosing what works best is a choice we luckily all have.


>I'm a big proponent of avoiding large SPs doing data munging on the server. SQL servers are typically not well equipped for that. The main problem is that these processes require too many resources. Not a problem when only a few processes run simultaneously, but a big problem with larger numbers of users all running their SPs for reporting or other data-munging services. This really becomes a nightmare when the isolation levels need to be set high, so read and write locks are going to block other processes (transactions). There are all kinds of other problems with it as well.

Personally, I actually agree, and the truth is I probably use that approach in most of the apps I'm involved in. But for scalable systems, local data processing is often not a good option because you lose control.

Also, it's important to note that local .NET data management is not slow per se - it's just slow for certain things in large-data scenarios. Those scenarios can often be addressed differently - using data readers and manipulating the data as it comes in. But granted, this is more work than using a cursor.
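The data-reader idea can be sketched in plain Python (a rough analogy using sqlite3, not ADO.NET itself): aggregate while iterating the cursor instead of materializing the whole result set the way a DataSet does. Table and column names here are invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("east", 100.0), ("west", 50.0), ("east", 25.0)])

# DataSet-style: pull everything into memory, then work on it.
all_rows = con.execute("SELECT region, amount FROM sales").fetchall()

# DataReader-style: aggregate while iterating; the cursor yields one
# row at a time, so the full result set is never held in memory.
totals = {}
for region, amount in con.execute("SELECT region, amount FROM sales"):
    totals[region] = totals.get(region, 0.0) + amount

print(totals)  # {'east': 125.0, 'west': 50.0}
```

The second form is more code than `SELECT region, SUM(amount) ... GROUP BY` run against a local cursor, which is exactly the trade-off described above.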

>>My strength isn't in data optimizations, and I personally am also a 'client data' guy because the systems I deal with usually make this possible. However, in my experience, for the typical systems I've been involved with - which are medium to largish - .NET works very well with ADO.NET's data features.
>
>Whether you do medium or largish is really not a measure of complexity. Most large applications are set up very simply, relying heavily on a solid framework. Having a lot of tables, forms, and reports really does not tell you anything.

True, but I'm talking about data size here - transactional vs. heavy data processing. Transactional applications are very well suited to .NET. I think the main issue is data processing when you have to process data on the client.


>>I feel I have to say this again - I'm not trying to sell anybody on .NET. However, I think some of the people involved in this thread don't know what they're talking about when they deride .NET, because they haven't looked at it in any detail, or they tried some simplistic demo, saw that it didn't do what VFP does or required more code, and dismissed it.

>I know you are not trying to sell us anything. I find your message one of the most valuable in this respect.
>
>>I've followed your discussions of what you've been through and I trust your opinion that you couldn't do what you needed easily. But, I also doubt that it's not because it can't be done...
>
>I've got a firm background in relational theory and I do know databases (in fact I graduated in the subject), both set-oriented and record-oriented. Of course I'm not talking about things being impossible in .NET. They are just HARDER or VERY HARD. You might have more problems with resource consumption and performance (when dealing with large recordsets). Writing classes might ease the pain in some circumstances, but in general it won't cut it, because you really have to write classes to create a DML as powerful and complete as the xBase commands. Else I end up with classes that are too specific and not widely applicable. For example, I could write a class to overcome the problem of joining a local dataset with another using certain algorithms, but that would not be the solution to a similar problem. Actually, you want to write a SQLHandle class which takes a SQL command applicable to ADO recordsets. Well, go ahead - not my idea of being productive. Maybe a great idea for a
>software vendor trying to sell such a class, but not for Joe Average just wanting to be productive.

I understand, but again, it might be an issue of perspective, of how you choose to do things. I'm not saying that this is wrong, but in some cases you must do things differently with different tools. All it means is that it's not impossible. Here again, the server-side point would come up if you talked to a typical SQL backend person. Not better or worse, but different.
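For what it's worth, the kind of client-side join described above can be sketched generically in a few lines (plain Python, not ADO or VFP; all names here are hypothetical). It also hints at why a reusable "SQL over local recordsets" layer quickly grows into a framework of its own once you add outer joins, filters, and ordering.

```python
# Two local "recordsets" as lists of dicts.
customers = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
orders    = [{"cust_id": 1, "total": 250.0},
             {"cust_id": 1, "total": 75.0},
             {"cust_id": 3, "total": 10.0}]

def inner_join(left, right, left_key, right_key):
    """Hash join: index the left side, then probe it with each right row."""
    index = {}
    for row in left:
        index.setdefault(row[left_key], []).append(row)
    joined = []
    for row in right:
        for match in index.get(row[right_key], []):
            joined.append({**match, **row})
    return joined

result = inner_join(customers, orders, "id", "cust_id")
print(result)
# [{'id': 1, 'name': 'Acme', 'cust_id': 1, 'total': 250.0},
#  {'id': 1, 'name': 'Acme', 'cust_id': 1, 'total': 75.0}]
```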

>I did not say that. The reason for them taking so long might have been not wanting to be on the bleeding edge. A few of the controls I use (Crystal Reports viewer, LeadTools, Pegasus) are now offered as .NET components. Some of them are still not as complete as their COM or ActiveX equivalents. Anyway, there is no sign they are going to withdraw from the ActiveX market.

I don't think any vendors will withdraw from the market as long as people are buying the controls. It doesn't cost them anything to keep selling the existing controls.

My point is that these controls are not likely to see any future enhancements. The vendor community is driven by sales - and ActiveX control sales are definitely in a hard decline. All you need to do is look at the component retail sites to see what they are putting front and center - it's not ActiveX controls. Those retail sites put up what sells so they can move product.

Personally, that more than anything is what scares me with VFP - the fact that ActiveX as a technology is finished. Microsoft has pretty much said as much: new system APIs and components are not likely to be exposed through COM APIs in the future. Which means that if new technology comes along that we may need to integrate with, there may not be a convenient COM API available to take advantage of it.

(By that time, though, we can hope that .NET will at least be prevalent enough that you can integrate with .NET if you need to, and use .NET with a COM front end to perform whatever task you need.)

>I'm not specifically talking about wizards and designers, but just the language containing complex functions (e.g. the DML, like SQL).

>I agree, but my main focus is the integrated database engine, hence my comments.

Understood, it depends on how much you truly rely on the language.

This reminds me of something a client I was doing training for told me once. He had an existing, very complex Fox 2.x application. Ugly as all hell, with a cryptic user interface, but very, very fast. The code was scattered all over the place and nearly unmaintainable by anyone but the designer. It was mostly pre-Fox 2.x code, so there was no SQL code anywhere - just SEEK, SEEK and SEEK again <g>... The code was next to unmanageable (even for this customer, finding code was difficult). Yet he insisted that every time he looked into using Rushmore and SELECT statements to clean up pages and pages of code, performance dropped considerably. While there's certainly some truth to this (using VFP commands is often faster than SELECT), most of the time the problem was using SQL SELECT incorrectly, not properly optimizing for Rushmore, or simply using SQL where the old code might actually have been more efficient. The gentleman went on raving about the DML in Visual FoxPro. Basically, he ended up migrating the application to VFP 7 just for the UI, leaving the business logic using basically the same pre-2.x code.
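The SEEK-versus-SELECT trade-off in that story boils down to indexed point lookups versus scanning when no usable index exists; a toy analogy in Python (an illustration of the general idea, not of Rushmore itself, with made-up data):

```python
# A "table" of records plus an index on the key, standing in for an
# index tag on a DBF.
records = [{"key": k, "value": k * 2} for k in range(100_000)]
index = {rec["key"]: rec for rec in records}

# SEEK-style: one indexed lookup - constant time here.
hit = index[73_512]

# SELECT-style with no usable index: scan record by record until found.
scan_hit = next(r for r in records if r["key"] == 73_512)

print(hit["value"])  # 147024
```

A well-optimized query uses the index too, of course - which is why the real problem in the anecdote was usually queries written so that Rushmore couldn't optimize them.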

Sometimes we use our way of doing things as an excuse not to make changes that are necessary, or to ignore other opportunities to expand the scope of our application - or even, as in this example, to improve its maintainability by reducing complexity.

I've done it, and I'm sure many of you reading this have done it. Heck, I'm probably doing it today with some technologies like Java - or, looking back, even worse in the past with VB6.

Technological choices are seldom based purely on technical issues - a lot of what goes into these decisions is what you are familiar with, and some fear of the unfamiliar as well...

Certainly this topic has a lot of that going on...
+++ Rick ---

West Wind Technologies
Maui, Hawaii

west-wind.com/
West Wind Message Board
Rick's Web Log
Markdown Monster
---
Making waves on the Web

Where do you want to surf today?