Parsing expressions defined by end users
Message
From: Lutz Scheffler, Lutz Scheffler Software Ingenieurbüro, Dresden, Germany
Sent: 28/04/2016 15:11:44
To: reply to message of 28/04/2016 14:25:37
General information
Forum: Visual FoxPro
Category: Other
Environment versions
Visual FoxPro: VFP 9 SP2
Miscellaneous
Thread ID: 01635536
Message ID: 01635577
Views: 56
>Up to a point. Tokenization -- breaking the original input apart into parts representing symbolic components (e.g. identifier names and operators) -- is only one of the stages. The next stage is to build a (logical) parse tree that captures the relationships between those parts -- some of which are driven by the context of the symbolic information found -- and this in turn can be used to check validity (e.g. make sure the syntax is OK).
>
>Back in the DOS days, one of the sample programs you got with a copy of Turbo C was a mini spreadsheet. The sample contained a set of scripts you'd use with yacc and lex (programs you'd find on Unix systems). Since yacc and lex aren't generally found on DOS systems, they provided a copy of the C code those scripts generate. The lex script specified the lexical analysis -- essentially a state machine that breaks the input into tokens. The yacc script specified the grammar -- i.e. defined the arrangement of the tokens and how they'd be logically related to each other. These scripts were what enabled entering expressions into the individual cells. IIRC a version of the same sample spreadsheet program was available in Turbo Pascal (which utilized the separately compiled OBJ files generated in Turbo C).
>
>One of the novel uses of yacc and lex I'd seen was in the creation of the program "spew" -- a random text generator driven by a rules file that specified the structure of the output. lex was used to tokenize the rules file, and yacc was used to generate the parser that made sense of the token stream and drove the text generator. The standard rules file generated headlines that read like stuff you'd see in the National Enquirer. Rather clever coding even allowed you to encode rules for pluralization of nouns, as well as output that properly handled past/present/future tense.
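
To make the tokenizing stage described above concrete: a minimal sketch in VFP. TokenList() is a hypothetical helper of mine (not part of the Turbo C sample), and it assumes single-character operators and no string literals:

* Sketch of the tokenizing stage: split an expression into
* identifier/number tokens and one-character operator tokens.
* TokenList() is hypothetical, for illustration only.
FUNCTION TokenList(tcExpr)
    LOCAL lnPos, lcChar, lcToken, lcList
    STORE "" TO lcToken, lcList
    FOR lnPos = 1 TO LEN(m.tcExpr)
        lcChar = SUBSTR(m.tcExpr, m.lnPos, 1)
        IF ISALPHA(m.lcChar) OR ISDIGIT(m.lcChar) OR INLIST(m.lcChar, "_", ".")
            lcToken = m.lcToken + m.lcChar        && grow identifier/number token
        ELSE
            IF NOT EMPTY(m.lcToken)               && flush the pending token
                lcList  = m.lcList + m.lcToken + CHR(13)
                lcToken = ""
            ENDIF
            IF NOT m.lcChar == " "                && operator or parenthesis token
                lcList = m.lcList + m.lcChar + CHR(13)
            ENDIF
        ENDIF
    ENDFOR
    IF NOT EMPTY(m.lcToken)
        lcList = m.lcList + m.lcToken + CHR(13)
    ENDIF
    RETURN m.lcList                               && one token per line
ENDFUNC

For example, TokenList("VAL(x) + 2 * rate") returns VAL, (, x, ), +, 2, *, rate, one per line -- the raw material the parse-tree stage would then work on.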

National Enquirer? No idea. Not my nation, I guess.

Sounds a bit too big a solution. It's only about an expression, and an expression is nothing but functions, operators (special notation for a function) and operands. The real problem is limiting the functions that are allowed. Checking for validity is as simple as TRY/CATCH plus a test of the result type (sketched below).
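
A minimal sketch of that idea in VFP. ValidExpr() and the whitelist are hypothetical names of mine, just to illustrate: reject any function call not on the whitelist, then EVALUATE() the expression inside TRY/CATCH and test the result type. Operand variables (x, rate, ...) must be in scope when the expression is evaluated.

* Sketch: validate a user-supplied expression.
* ValidExpr() and the whitelist are hypothetical, for illustration only.
FUNCTION ValidExpr(tcExpr)
    LOCAL lcAllowed, lnI, lcWord, luResult, llOk
    lcAllowed = ",IIF,STR,VAL,UPPER,LOWER,LEN,"    && functions the user may call
    llOk = .T.
    * Any word followed by "(" must be on the whitelist. AT() finds the
    * first occurrence only -- good enough for a sketch.
    FOR lnI = 1 TO GETWORDCOUNT(m.tcExpr, "+-*/()=, ")
        lcWord = GETWORDNUM(m.tcExpr, m.lnI, "+-*/()=, ")
        IF ISALPHA(m.lcWord) ;
           AND SUBSTR(m.tcExpr, AT(m.lcWord, m.tcExpr) + LEN(m.lcWord), 1) == "(" ;
           AND NOT ("," + UPPER(m.lcWord) + ",") $ m.lcAllowed
            llOk = .F.
        ENDIF
    ENDFOR
    IF m.llOk
        TRY
            luResult = EVALUATE(m.tcExpr)
            llOk = (VARTYPE(m.luResult) == "N")   && expecting a numeric result here
        CATCH
            llOk = .F.                            && invalid syntax or runtime error
        ENDTRY
    ENDIF
    RETURN m.llOk
ENDFUNC

So ValidExpr("VAL(x) + 2") returns .T. (given x is in scope), while ValidExpr("FOPEN(cFile)") fails the whitelist. The whitelist is what keeps file- and system-level functions out of reach; the VARTYPE() test catches expressions that evaluate fine but return the wrong type.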

BTW: pluralization of nouns etc. -- try this with German. I believe they'd do it about as cruelly as the German in the media is these days. ;)

Update:

removed some citations
Words are given to man to enable him to conceal his true feelings.
Charles Maurice de Talleyrand-Périgord

Weeks of programming can save you hours of planning.

Off

There is no place like [::1]