George,
Sure, I'll try.
The concept of NULL in relational algebra terms is the unknown value.
For example consider a logical field whose value is unknown (NULL).
There are three possible values for this field: .T., .F., or unknown. Logic says that:
.T. = .T.
and
.F. = .F.
are both true
However, since NULL may be either .T. or .F. (the value is actually unknown), it follows that:
.T. = NULL
is neither .T. nor .F. Because the NULL value may be either .T. or .F., the result of the comparison is also unknown (there's that NULL again).
How about:
NULL = NULL
Well, again the first value may be .T. or .F., and so may the second; they may be equal or they may not, so the answer is again unknown (that NULL value once more).
NULL is not actually a value; it represents the value being unknown.
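If you want to play with this, here's a quick sketch of three-valued logic in Python, with None standing in for NULL (the tri_eq name is just mine, for illustration):

```python
def tri_eq(a, b):
    """Three-valued equality: returns True, False, or None (unknown)."""
    if a is None or b is None:
        return None          # any comparison with an unknown operand is unknown
    return a == b

print(tri_eq(True, True))    # True   (.T. = .T.)
print(tri_eq(True, None))    # None   (.T. = NULL is unknown)
print(tri_eq(None, None))    # None   (NULL = NULL is unknown too)
```

Note that NULL = NULL comes out unknown, not true, which is exactly why SQL uses IS NULL rather than = NULL.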
Where does NULL really come into its own? Imagine an application that must calculate statistics based on the daily temperature. The temperature may be 0 degrees. There may also be days when the temperature was not recorded (and is therefore unknown). If we treat those unrecorded temperatures as 0, our statistics will be inaccurate. We must exclude the unknown measurements from our calculations, so we record them as NULL and then exclude NULL values from the calculations.
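To see the exclusion in action, here's a small Python sketch (again using None for NULL; the sample temperatures are made up):

```python
# None marks days where the temperature was not recorded (NULL).
temps = [0.0, 5.5, None, -3.0, None, 12.0]

# Keep only the recorded readings -- note that 0.0 stays in,
# because zero degrees is a real measurement, not an unknown one.
recorded = [t for t in temps if t is not None]

average = sum(recorded) / len(recorded)
print(average)   # average over the 4 recorded days only
```

Treating the two None days as 0 would drag the average down; skipping them gives the honest figure, which is the whole point of recording NULL instead of 0.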
Hope this helps.