In desperation to find some real information on Palladium I finally used the fairly prominent search feature on Microsoft’s home page and quickly came to their white paper. It is a pretty good high-level description of some security properties that it would be useful for a personal computer to have. There is little there to convince the reader that they know how to build such a system, but that may not be the point of the paper.

They say that Palladium is orthogonal to DRM (Digital Rights Management), and that both are important. I am surprised, but as I am not sure what they mean by either term, I cannot yet argue. They mention "trusted e-mail" under DRM, which really confuses me. Perhaps they mean e-mail in a mandatory security framework wherein A controls whether B can send data to C. DRM can be viewed in this light. Perhaps they separate DRM from Palladium for reasons of internal turf politics. Perhaps they actually see it as a separate technology. I think it may be their view of the market. Some may want one and not the other.

They describe one scenario where a corporate employee at home connects to the corporate computer. Within the several computers there is a distributed TCB that protects the company’s interests and includes some software on the computer at home. Another TCB within the home computer protects the employee’s data from the employer. Those are my words for what they describe.

The white paper proposes that systems are shipped with "Palladium turned off", and that only the user, not software, can turn it on. It has been a good many years since I directly changed any hardware feature with my bare hands; recently software has always intervened. At trusted boot, however, this is feasible.

The paper speaks of separate, physically isolated memory, as if they had not heard of memory maps. Perhaps this is merely to gain credibility in a wider audience, perhaps it bears somehow on tamper-resistance.

They acknowledge the need for a secure path to the user.

I quote:

"Palladium" will not eliminate any features of Windows that users have come to rely on; everything that runs today will continue to run with "Palladium".
It is hard to imagine a system that eliminates viruses that write on the boot block and yet does not disable VMware or Linux. Perhaps they mean "any features that Microsoft had intended users to rely upon." Alternatively, they may plan to architect a layer of hardware and software beneath booting, allow those viruses to perform to spec, and maintain their "vaults" at the lower level. I think that this is a bad idea.

I quote once more:

If a banking application is to be trusted to perform an action, it is important that the banking application has not been subverted.
Again, "trusted by whom?". The bank and the bank customer have different interests. The customer may hope for a virtual paper trail with which he can verify the correct behavior of the bank or prove incorrect behavior, while the bank may want to vouch for data on the customer’s computer. In short where did the software come from and who is it loyal to? Is it the bank’s agent or is it the user’s agent? Their example is unclear here.

They speak of "sealed storage" which seems to mean just data abstraction, with teeth, of course.

I quote:

Some platforms may allow a user to restrict the TORs that are allowed to run, but the user will still be in full control of this policy.
This may be the architect talking or it may be Microsoft talking. I think it would be in Microsoft’s best interests if this were so. The ball is actually more in Intel’s court. Will they choose to document the new hardware well enough to warrant trust? Will they tell Microsoft things that they do not tell others?

Attestation

This is a message signed by a processor-specific private key, sent with the permission of the user, indicating that the system is in some secure state; it may convince the receiver of the message that the platform is suitable to receive confidential data from that receiver. They refer to this as a hardware feature. I suppose that they are thinking "microcode".
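The shape of such a protocol can be sketched in a few lines of stdlib Python. Everything here is my guess at the structure, not Microsoft's design: the software stack is hashed, the processor signs the digest together with a verifier-supplied nonce, and the verifier checks the quote before releasing confidential data. A real processor would use an asymmetric per-chip key; HMAC stands in so the sketch needs no third-party library, which also means this toy verifier unrealistically shares the key.

```python
import hashlib
import hmac

# Hypothetical per-chip secret, burned in at manufacture (assumption).
PROCESSOR_KEY = b"burned-in-at-manufacture"

def measure(components):
    """Digest of the loaded software stack (TOR, trusted agents, ...)."""
    h = hashlib.sha256()
    for _name, code in components:
        h.update(hashlib.sha256(code).digest())
    return h.digest()

def attest(components, nonce):
    """Produce a quote: a signed statement of the current software state.
    The nonce prevents replay of an old quote."""
    digest = measure(components)
    sig = hmac.new(PROCESSOR_KEY, nonce + digest, hashlib.sha256).digest()
    return {"digest": digest, "sig": sig}

def verify(quote, nonce, expected_digest):
    """Remote party checks the quote before sending confidential data."""
    if quote["digest"] != expected_digest:
        return False
    good = hmac.new(PROCESSOR_KEY, nonce + quote["digest"],
                    hashlib.sha256).digest()
    return hmac.compare_digest(quote["sig"], good)
```

Note that the quote says only what software is running, not whose interests that software serves, which is exactly the "trusted by whom?" question above.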

The TOR

This is a sort of kernel that runs in "kernel mode". I suppose that kernel mode is more privileged than privileged mode. Is it less privileged than the mechanism that produces the attestation? It supports "trusted agents", carrying messages between them and administering "seals" that determine which code can see which data.

In their brief description of trusted agents they clearly understand that different agents are trusted by different interests. They are one step ahead of classic military computer security in this regard.

Viruses and Signing

Arghhh! They think that code signing is the answer to viruses!!! This is the first thing that makes me think that they have no concept of capabilities. I was about to close on an optimistic note. Oh well.


They say that the architecture guards against break once, run anywhere. Perhaps so. It seems not to guard against: Break once; Transcode a movie there; Play it anywhere. Keykos does not solve this either.

They promise bravely that they will provide means of making backups. This is a complex issue of conflicting commitments. Keykos has this problem and perhaps more flexibility to solve it or at least manage various compromises.

Other Interesting Pages on Palladium

Little of the following has been reflected in the above.
I disagree a bit with Ross Anderson’s FAQ, but it raises many interesting points and questions. Some technical clues here.
I made some notes on TCPA before reading about Palladium.
Two relevant patents from Lampson et al. at MS.
P.S. Recent thoughts on Palladium (NGSCB).

Another note on Palladium