Keywords: Bug Feature, stable design

I think I have written on this before, but lacking obvious keywords, I don't know how to find what I wrote.

The plain issue is that when software is not well specified, it is not clear which behaviors are features and which are security holes. I have noted that it is difficult to discern just when a Unix system call to open a file should succeed: the information is scattered and perhaps incomplete. Under those circumstances programmers write code, and if their file-open calls work, they figure that they must have obeyed all of the rules. If they have in fact exploited a system behavior later deemed to be a security flaw, their application may fail in the future when they are not at hand to fix it. They may then be accused by a later generation of programmers of having violated good practice, an ex post facto judgment made relative to ideas that may or may not be better than the old practices.
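To make the point concrete, here is a minimal C sketch of what a program actually learns from a successful open call. The path and flags are invented for illustration; the comments enumerate only some of the scattered rules that the kernel consults on the program's behalf.

    /* The success of a single open() call bundles together many
     * scattered rules: mode bits on the file, search permission on
     * every directory in the path, flags such as O_CREAT and O_TRUNC,
     * per-process descriptor limits, and more. A program that merely
     * observes "open() worked" has verified none of them individually. */
    #include <stdio.h>
    #include <string.h>
    #include <errno.h>
    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        const char *path = "/tmp/example.dat";   /* hypothetical path */

        int fd = open(path, O_RDWR | O_CREAT, 0644);
        if (fd < 0) {
            /* errno names one proximate cause, not the full rule set
             * that was consulted: EACCES, ENOENT, EMFILE, EROFS, ... */
            fprintf(stderr, "open(%s) failed: %s\n", path, strerror(errno));
            return 1;
        }

        /* Success says only that today's kernel, under today's
         * permissions and limits, allowed the call. It does not say
         * which rules the program relied on, or which of them might
         * later be reclassified as a security hole and tightened. */
        printf("open(%s) succeeded on fd %d\n", path, fd);
        close(fd);
        return 0;
    }

The programmer who ships this code has no record of the assumptions it embodies; the next kernel release, or the next security audit, decides which of them were features.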

Principled design can go a long way toward solving this problem, even when the designers have not explicitly enunciated the principles. Together with documentation that conveys those principles, explicitly or implicitly, architectures can survive the ravages of age.

One such principle, implicit in the IBM 360 and 370 architectures, supported virtualizability: a privileged instruction executed in problem state traps rather than doing something silently different, which lets a control program intercept the trap and emulate the instruction on behalf of a guest.
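A toy sketch of that trap-and-emulate idea, in C, may help. Everything here is invented for illustration (the opcodes, the virtual_cpu structure, the field layouts); it shows only the shape of the mechanism, not any real 360/370 hypervisor.

    /* When a guest in problem state executes a privileged instruction,
     * the hardware traps to the supervisor. A hypervisor catches the
     * trap and applies the instruction to the guest's *virtual* state
     * instead of the real machine's. */
    #include <stdio.h>
    #include <stdint.h>

    enum opcode { OP_LOAD_PSW, OP_SET_STORAGE_KEY, OP_START_IO };

    struct virtual_cpu {
        uint64_t psw;              /* guest's virtual program status word */
        uint8_t  storage_key[16];  /* guest's virtual storage keys        */
    };

    /* Invoked on a privileged-operation trap taken by a guest. */
    static void emulate_privileged(struct virtual_cpu *vcpu,
                                   enum opcode op, uint64_t operand)
    {
        switch (op) {
        case OP_LOAD_PSW:
            vcpu->psw = operand;   /* update virtual, not real, PSW */
            break;
        case OP_SET_STORAGE_KEY:
            vcpu->storage_key[operand & 0xF] = (uint8_t)(operand >> 4);
            break;
        case OP_START_IO:
            printf("emulating guest I/O request %llx\n",
                   (unsigned long long)operand);
            break;
        }
    }

    int main(void)
    {
        struct virtual_cpu vcpu = {0};
        /* Pretend the guest just executed a privileged instruction
         * and the hardware trapped into us. */
        emulate_privileged(&vcpu, OP_LOAD_PSW, 0x0008000080001000ULL);
        printf("guest PSW is now %llx\n", (unsigned long long)vcpu.psw);
        return 0;
    }

The design point is that the principle costs the architecture almost nothing, yet it is exactly what made later hypervisors possible on hardware whose designers never wrote the principle down.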