Page 22, in answer to “Surely we are more than the sum of our parts!”, I would say, “So is a desk computer.”
The book is shaping up as a sequel to his “Society of Mind” with the term ‘resource’ supplanting ‘agent’. Many more details are proposed.
I find many of Minsky’s mechanisms unobvious yet compelling. Many are common sense, but many are unfamiliar. Minsky explains many familiar subjective phenomena with his mechanisms.
Page 30, quibble: Not all instincts that we are born with are present at birth; morphogenesis is not yet done then. (Minsky makes this point on page 180!)
Minsky writes as if he presumes my notions of self-reflection as described here. Perhaps he imagines them to be so obvious as not to need description. Perhaps he is right. Perhaps he will get around to it later.
Page 51: Minsky ‘parodies’ two notions of society — the ‘social contract’ and those of sociobiology. This is an interesting comparison but a false dichotomy. It seems clear to me that the ‘social contract’ is an emergent phenomenon whose mechanism is well explained by sociobiology. It seems clear that the contract proper was to be taken as a metaphor—it was an “as if” explanation. It made many good predictions, which is all that is demanded of a metaphor. The question remains: “How often is the social contract a superior way of thinking, with regard to economy of thought?”
Page 85: I am amused that his scheme called “Rule Based Reaction-Machine” is just like Dijkstra’s guarded clause programming construct.
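The parallel can be made concrete in a few lines. This is my own illustration, not Minsky’s: a reaction-machine as a set of (guard, action) pairs, where a rule whose guard holds fires. The state shape and rule names below are invented for the sketch.

```python
# A sketch of a rule-based reaction-machine in the spirit of Dijkstra's
# guarded commands: each rule is a (guard, action) pair, and a rule
# whose guard is true may fire. (Dijkstra's construct chooses
# nondeterministically among true guards; for simplicity this sketch
# deterministically fires the first one.)

def react(state, rules):
    """Fire the first rule whose guard holds; return the new state."""
    for guard, action in rules:
        if guard(state):
            return action(state)
    return state  # no guard holds: the machine does nothing

# Example: a trivial thermostat-like agent (purely illustrative).
rules = [
    (lambda s: s["temp"] < 18, lambda s: {**s, "heater": "on"}),
    (lambda s: s["temp"] > 22, lambda s: {**s, "heater": "off"}),
]

print(react({"temp": 15, "heater": "off"}, rules))
```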
Page 233: I agree with Minsky. James says “I cannot imagine what kind of emotion or fear would be left if the feeling neither of quickened heart beats nor of shallow breathing, neither of trembling lips nor weakened limbs, neither of goose flesh nor of visceral stirrings, were present”. This cannot be imagined, for the signals that produce these effects are sent from the head to the body, and the sending event and apparatus are inaccessible to consciousness.
My subjective experience suggests that I don’t smile when I enjoy humor alone. This corroborates Minsky’s position.
Page 242: “Cognitive Contexts”: This section is highly parallel to the computer design problem of interrupts, and I suspect inspired by that subject.
Page 313: “Dumbbell Ideas and Dispositions”
Minsky asks why we have so many ‘two part distinctions’. The best that mathematicians can do to map a manifold with coordinates is to use enough real numbers to serve as coordinates, at least locally. Real numbers have two extremes, big and little. I think that that answers Minsky’s question.
My conclusion is that these are features that we must understand and insert into AIs if we want those AIs to ‘function among us’. The book tries to identify many of these but is still far too vague to implement the features now.
Minsky mentions AI as only an incidental application of his ideas. Minsky’s mechanisms seem possible to program, but only with additional observation of people, including subjective observation. My impression is that this would be a project of perhaps unprecedented size. I am impressed that our DNA code-base occupies about 6×10⁹ bits of information—roughly one gigabyte! That is smaller than Microsoft’s Windows package! Both are known to have considerable redundancy—I don’t know which has more. Windows relies on but does not include the expression of the hardware design. Our DNA includes the design of the corresponding underlying ‘technology’. Nature has many ‘coding tricks’ that we have not fathomed. This does not directly bear on the length of the path to AI but does suggest that there are levels of concept that neither Minsky nor others have produced yet. It smells of a ‘silver bullet’, which has been silently hoped for by many researchers—yet broadly discredited. Even so, a gigabyte is a lot of code to reverse engineer. It is a hard way to gain those concepts.
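The gigabyte figure can be checked by arithmetic: roughly 3×10⁹ base pairs in the human genome, at two bits per base pair (four possible bases), gives the 6×10⁹ bits above.

```python
# Back-of-envelope check of the DNA code-base size cited in the text.
base_pairs = 3e9             # roughly 3×10^9 base pairs in the human genome
bits = base_pairs * 2        # four possible bases -> 2 bits per base pair
gigabytes = bits / 8 / 1e9   # 8 bits per byte
print(gigabytes)             # 0.75 -- under a gigabyte, matching the rounding above
```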
I think that Minsky is right to focus attention at this level of the mystery. Much of the biological scientific community focuses on chemistry, neuron design, and brain maps as our best path towards understanding the brain. The latter will be useful in medical treatment of us carbon-based critters; but I think that AI will gain relatively little from such science. I suspect that there are yet unimagined mechanisms, intermediate between neuron layout and Minsky’s mechanisms, that are both real phenomena and necessary to grasp in order to understand the brain. Pessimistically, these may be mechanisms that defeat the power of our sort of abstractions.
I speak of ‘real’ abstractions above, whereas abstractions are often taken as mere mental constructs, useful and perhaps necessary to understand some phenomenon. When the same typewriter is successfully used to produce a play and a physics lecture, that typewriter is a real part of the abstraction of human language. The typewriter is a human artifact, but nature has produced its own such abstractions, language among them.