BOOL Datatype
General information
Date: 10/04/2000 14:44:56
Forum: Visual FoxPro
Category: Windows API functions
Title: Miscellaneous
Thread ID: 00357504
Message ID: 00357641
Views: 19
>I was going to argue that C, as a precursor of C++, predates BASIC, but it does not - apparently BASIC was invented in 1964. However, C does predate QuickBASIC. (I wonder if the old big-iron BASICs treated TRUE and FALSE the same way QB does.)

In my experience, any BASIC interpreter that supported bitwise operations defined TRUE and FALSE the same way QB does.
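
To make that concrete, here's a rough C sketch of the convention; the QB_ names are mine, invented for illustration, and the -1 result assumes two's complement:

#include <stdio.h>

#define QB_FALSE 0
#define QB_TRUE  (~QB_FALSE)   /* bitwise NOT of 0: all bits set, -1 in two's complement */

int main(void)
{
    printf("QB_TRUE = %d\n", QB_TRUE);                  /* prints -1 */
    printf("NOT TRUE = %d\n", ~QB_TRUE);                /* prints 0, back to FALSE */
    printf("TRUE AND TRUE = %d\n", QB_TRUE & QB_TRUE);  /* still -1, still true */
    return 0;
}

The nice property is that the bitwise operators double as logical ones: AND, OR, and NOT applied to -1/0 values always yield -1 or 0 again, so a "true" result never tests false.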

>Since I can't argue that C predates BASIC and therefore BASIC should conform to C and not the other way around, I'd have to argue space efficiency. As you pointed out, using 0 and 1 lets you store a boolean in a single bit. In these days of HD prices dropping to the $1/GB level, we forget how expensive storage used to be.
>
>Now you might wonder: if HD prices are so low, why does SQL Server still store booleans as bits? If earlier versions did the same thing, backwards compatibility would be a compelling reason. But efficiency may still matter as well.
>
>Modern DBMSs achieve peak performance only with data held in RAM cache. If these servers are disk-bound, they have CPU cycles to burn, and the overhead of packing and unpacking eight booleans per byte may be of little consequence compared to paging out to disk. So, to get the most mileage out of the usually limited and precious RAM, using 1 bit instead of a whole byte may actually improve overall performance.
>
>So, I guess efficiency will never truly go out of style :-)
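
For what it's worth, the packing scheme the argument above describes looks roughly like this in C (the flag names here are invented for illustration):

#include <stdio.h>

#define FLAG_ACTIVE  (1 << 0)
#define FLAG_DELETED (1 << 1)
#define FLAG_LOCKED  (1 << 2)

int main(void)
{
    unsigned char flags = 0;          /* eight booleans in one byte */

    flags |= FLAG_ACTIVE;             /* set a bit   */
    flags &= ~FLAG_DELETED;           /* clear a bit */

    if (flags & FLAG_ACTIVE)          /* test a bit  */
        printf("active\n");

    printf("sizeof(flags) = %u byte\n", (unsigned)sizeof flags);  /* 1 */
    return 0;
}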

I don't think the value is being stored in a bit. I've seen C++ headers from VS define FALSE as 0 and TRUE as 1. Regardless of the data type, they're going to occupy at least one byte, if not more, of memory.
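
From memory, the relevant lines in those headers (windef.h) are roughly these - take it as a sketch rather than a verbatim quote:

#include <stdio.h>

typedef int BOOL;   /* as in windef.h: BOOL is just an int */
#define FALSE 0
#define TRUE  1

int main(void)
{
    BOOL b = TRUE;
    printf("sizeof(BOOL) = %u bytes\n", (unsigned)sizeof(BOOL));  /* 4 on Win32 */
    printf("b = %d\n", b);
    return 0;
}

So a BOOL occupies a full machine word, not a bit, regardless of which value TRUE was given.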

I just find it a bit strange that TRUE isn't defined as the logical opposite of FALSE. Computers do what they do because a bit has exactly two complementary states: open or closed, set or clear. When I was studying programming principles, defining things in terms of something that had already been defined was not only encouraged but expected. It's simply consistent with the way the machine works.
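
In C terms, what I'd have expected is something like the following (the MY_/LOGICAL_/BITWISE_ names are mine; note that the logical and bitwise complements of 0 disagree):

#include <stdio.h>

#define MY_FALSE     0
#define LOGICAL_TRUE (!MY_FALSE)   /* logical NOT of 0 is 1  */
#define BITWISE_TRUE (~MY_FALSE)   /* bitwise NOT of 0 is -1 */

int main(void)
{
    printf("!0 = %d\n", LOGICAL_TRUE);   /* 1: what the VS headers picked */
    printf("~0 = %d\n", BITWISE_TRUE);   /* -1: what QB picked */
    return 0;
}

Either way TRUE is derived from FALSE; the two camps just disagree on which complement to take.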

Not that defining TRUE as 1 is wrong or anything. You could define it as 42 (yes, that's a Douglas Adams reference) or anything except zero. I just find it odd; apparently that's simply how MS did it, for no particular reason.
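
And indeed any nonzero value would work in a test; the only trap is comparing against the constant. A quick sketch:

#include <stdio.h>

#define TRUE 42   /* legal, if eccentric */

int main(void)
{
    int answer = 1;        /* "true" by the usual 0/1 convention */

    if (answer)            /* fine: any nonzero value is true */
        printf("truthy\n");

    if (answer == TRUE)    /* trap: 1 != 42, so this never prints */
        printf("never printed\n");

    return 0;
}
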
George

Ubi caritas et amor, deus ibi est ("Where there is charity and love, God is there")