Level Extreme Platform
Mixed Emotions
Message
 
 
To
27/07/2001 02:10:59
Al Doman (Online)
M3 Enterprises Inc.
North Vancouver, British Columbia, Canada
General information
Forum: Visual FoxPro
Category: Other
Title: Misc.
Thread ID: 00534404
Message ID: 00536233
Views: 8
>Apparently a university group in England developed a language that could be used in narrow, critical domains (e.g. air traffic control) to eliminate bugs. However, I believe it included its own real-time OS, not a large general-purpose OS like Windows.

Exactly; the permutations became exponentially smaller, and the application was very narrow. I could probably get that pretty stable, but I bet I could still find a few bugs. :-)

>You claimed on another branch that you're able to crash your microwave and other embedded devices. That, I find surprising and interesting. I'd be interested to find out, in general terms, how your testing methods compare to those of the embedded/real time markets, whether you have much in common, etc.

Well, I don't have a repro (Microsoft-speak for "reproducible") scenario for the microwave (happens a few times a year), or my later-mentioned camera (happened once), but I can crash my cell phone at will. However, the cell phone has software that has to run on multiple platforms (models of cell phones).

>My understanding is that testing for embedded/real time systems is the most stringent anywhere, because the consequences of bugs are so severe.

No, it's because the environment is so limited. Test an HP heart monitor? Piece of cake, because I don't have to burn resources testing against HP Heart Monitor 98 running on a Gateway heart monitor, HP Heart Monitor 2000 on a Compaq, HP Heart Monitor 2000 upgraded from HP Heart Monitor 98, etc. I use HP as an example because their medical equipment software quality kicks some butt. But the permutations are pretty limited, too. I don't think the testing is more stringent, I just think they can have better test coverage because the environment is more stringent.

OTOH, maybe their testing is more stringent, because they can afford to burn the resources for a limited environment. All I know is that the testing trade journals don't seem to indicate that the embedded folks have some secret formula that the rest of us don't.
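The point about permutations can be made concrete. A hypothetical sketch (the configuration axes below are invented for illustration, not an actual test plan): the matrix for a general-purpose Windows app multiplies out across OS versions, hardware vendors, and install paths, while an embedded device ships as one fixed combination.

```python
from itertools import product

# Hypothetical configuration axes for a general-purpose Windows app.
os_versions = ["Win95", "Win98", "WinME", "NT4", "Win2000"]
hardware = ["Gateway", "Compaq", "Dell", "HP"]
install_paths = ["clean install", "upgrade from previous version"]

general_matrix = list(product(os_versions, hardware, install_paths))
print(len(general_matrix))  # 5 * 4 * 2 = 40 configurations to cover

# An embedded device is one fixed combination of firmware and board.
embedded_matrix = list(product(["device firmware"],
                               ["device board"],
                               ["factory image"]))
print(len(embedded_matrix))  # 1 configuration
```

Every new axis multiplies the matrix, which is why the embedded team can afford full coverage and the general-purpose team has to sample.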

>However, it would be nice to think that perhaps embedded/real time design, and testing techniques could be applied to more consumer and systems software (if they're not already).

The techniques are the same, but the scope for general-use software is much wider. For instance, in the case of the latest hotfix firestorm (the edit box bug), that bug can only be found by having a human being visually inspect the editbox to make sure the scrollbars are visible. Automated testing would never find it. In the case of one OS on one piece of hardware, a test team could afford to have a human examine every visual test (assuming the tester doesn't slit his or her wrists from boredom) to make sure that if you click in the very limited number of cases where you can click, and if you scroll in the very limited number of cases where you can scroll, it works as expected. In the case of software with many permutations of visual interaction, it's possible something may get missed.
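The scrollbar example can be sketched in code. This is a toy model (the `EditBox` class and its fields are invented for illustration): the automated test checks the logical property and the behavior, both of which are fine, so it passes even though the control draws nothing on screen.

```python
# Toy model of a visual bug that functional assertions cannot see.
class EditBox:
    def __init__(self):
        self.scrollbars_enabled = True    # logical property the API reports
        self.scrollbars_rendered = False  # what actually appears on screen (the bug)

    def scroll(self, lines):
        # Scrolling still works even though the scrollbar is invisible.
        return f"scrolled {lines} lines"

box = EditBox()

# The automated test passes: the API says scrollbars are on, and scrolling works.
assert box.scrollbars_enabled
assert box.scroll(3) == "scrolled 3 lines"

# Only a human looking at the screen notices scrollbars_rendered is False.
print("automated checks passed; visual defect undetected:",
      not box.scrollbars_rendered)
```

The test harness only sees what the API exposes; if the API reports the wrong state (or no state at all) for a rendering problem, only eyeballs catch it.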

Embedded/real time software teams use the same testing techniques and methodologies as any other professional software test team; they just don't have as much to cover, IMO.
Mike Stewart