>>>The purpose of the process I'm looking for is to limit access to potentially dangerous objects and functions. So I think a simple de-tokenization that can identify authorized functions and constants of the supported data types will do. The evaluation itself will be left to the VFP parser (with a proper error handler).
>>
>>I have no idea what you mean by de-tokenization here :(
>
>If I'm not mistaken, he's basically saying that once you've parsed the input string into a sequence of tokens, you can perform a lexical analysis by traversing the parse tree to identify the identifiers and check them against a list of what you want to allow or prohibit.
If he's using VFP's expression evaluator, the whole parsing will be done by the EVAL() function. But I think the check should be at the level of individual variables and/or functions. For functions it's probably easy to put together a list; for variables, I'd put any of the allowed ones, plus the fields from permissible aliases, into another list (or as properties of an empty object). So if a name isn't on the list (or PEMSTATUS(theEmptyObject, lcVar, 5) = .F.), the variable is not allowed: throw a tantrum.
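To make the idea concrete, here's a minimal sketch of that scan-and-whitelist step, shown in Python only for illustration (in VFP you'd do the same scan over the string and use PEMSTATUS() or ASCAN() against your list). The whitelist names and the helper function are hypothetical, not anything from the thread; the point is just: pull the identifiers out of the expression, classify each as a function call or a variable, and reject anything not on the corresponding list before ever handing the string to EVAL().

```python
import re

# Hypothetical whitelists -- in VFP these could be arrays, or
# properties of an empty object checked with PEMSTATUS().
ALLOWED_FUNCS = {"upper", "lower", "len"}
ALLOWED_VARS = {"lcname", "lnqty"}

IDENT = re.compile(r"[A-Za-z_]\w*")

def check_expression(expr):
    """Return the identifiers in expr that are NOT whitelisted.

    An identifier immediately followed by "(" is treated as a
    function call; anything else is treated as a variable/field.
    An empty result means the expression is safe to hand to the
    real evaluator.
    """
    rejected = []
    for m in IDENT.finditer(expr):
        word = m.group(0).lower()          # VFP names are case-insensitive
        is_call = expr[m.end():m.end() + 1] == "("
        allowed = ALLOWED_FUNCS if is_call else ALLOWED_VARS
        if word not in allowed:
            rejected.append(word)
    return rejected

print(check_expression("UPPER(lcName)"))   # → []
print(check_expression("ERASE(lcFile)"))   # → ['erase', 'lcfile']
```

Only after check_expression() comes back empty would the string be passed to the actual evaluator, so dangerous names never reach it; the evaluator's own error handler still catches anything this lexical pass can't judge, such as type mismatches.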