The few secure systems that have been built with a graphical user interface have been designed for users indoctrinated in military security principles. Military systems often impose security policies that limit the destinations to which a user can send data that he can see; this is called mandatory security. We will simplify things here by treating the user as the master of such policies. Military security systems impose externally specified information categories, attach corresponding security labels, and restrict, by category, the flow of such information. Again we will assume that the user is master of his own categories and category-based restrictions. We also will not separate the roles of policy maker and information user. Thus we may be unable to support an institutional policy of “company confidential” printed at the ends of certain pages.
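The category-and-label scheme described above can be sketched as a simple dominance check in the style of Bell-LaPadula mandatory access control. This is a minimal illustration under assumed rules, not a mechanism from the text; the function and category names are hypothetical.

```python
# Illustrative sketch: mandatory-security flow check by category labels.
# A piece of data carries a set of category labels; a destination may
# receive the data only if it is cleared for every category the data
# carries. The rule and names here are assumptions for illustration.

def may_send(data_categories: set, destination_clearances: set) -> bool:
    """Allow the flow only if the destination's clearances dominate
    (are a superset of) the categories attached to the data."""
    return data_categories <= destination_clearances

# A document labeled {"crypto", "ops"} may flow to a destination cleared
# for {"crypto", "ops", "logistics"}, but not to one cleared for {"ops"}.
print(may_send({"crypto", "ops"}, {"crypto", "ops", "logistics"}))  # True
print(may_send({"crypto", "ops"}, {"ops"}))  # False
```

Under such a policy the labels travel with the data, and every output channel (screen, printer, network destination) is itself assigned a clearance set against which the check is made.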
The military has a long and somewhat successful tradition of directing people to keep secrets without necessarily telling them why in each case. Mandatory security is an extension of this tradition to the computer. It seems cumbersome and unnecessary for the individual who wishes to keep his own secrets.
We will ignore here another military information technology called “polyinstantiation,” which I would call misinformation systems. A magnificent example of a (non-computerized) misinformation system was described in Anthony Cave Brown’s “Bodyguard of Lies,” where a systematic feint seems to have deceived the Germans about the Allies’ actual intended point of attack on the European continent during World War II. Neal Stephenson’s book Cryptonomicon is a fictional reconstruction of many details of this saga. The design of database systems may be heavily involved in current versions of such technologies.