It strikes me that in both of the first two tables, x and + need to be substituted by /\ (intersection) and \/ (union) respectively.

Consider a universe having exactly one element, so 0 and 1 stand for the cardinalities of its subsets. Then 1 \/ 1 = 1 (instead of 1 + 1 = 2), and everything else is the same.
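A quick Python sketch (my own, not part of the original tables) of that comparison: taking meet as min and join as max on the cardinalities {0, 1}, only the 1 \/ 1 entry disagrees with ordinary arithmetic.

```python
# Compare meet (min) and join (max) on {0, 1} with ordinary x and +.
for a in (0, 1):
    for b in (0, 1):
        meet, join = min(a, b), max(a, b)
        print(f"{a} /\\ {b} = {meet} (x gives {a * b}), "
              f"{a} \\/ {b} = {join} (+ gives {a + b})")
# Only the last line differs: 1 \/ 1 = 1 while 1 + 1 = 2.
```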

x and + don't respect the elegant laws of a lattice, let alone a complemented distributive lattice; you end up with the messier laws of a ring instead. You can form a lattice from the divisors of a number (with gcd as meet and lcm as join), but + has no place in it (and all the hard number theory problems mix x and +).
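To make the lattice-law failure concrete, here is a small check (my own illustration) of the absorption law a \/ (a /\ b) = a, which every lattice satisfies. With min/max it holds for all small nonnegative integers; the arithmetic analogue a + (a x b) = a does not.

```python
# Absorption law: a \/ (a /\ b) = a, modeled with max/min.
vals = range(5)
lattice_holds = all(max(a, min(a, b)) == a for a in vals for b in vals)

# Arithmetic analogue with + and x: a + (a * b) = a. Fails, e.g. a = b = 1.
ring_analogue_holds = all(a + a * b == a for a in vals for b in vals)

print(lattice_holds, ring_analogue_holds)  # True False
```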
