
Saturday 23 March 2013


The forgotten story behind IBM's 'first mainframe'


Thus, small sweet sherries all round for the birthday of IBM's S/360 mainframe, launched 40 years ago.
The venerable machine is being feted around the world as the grandfather of modern computing: it brought such innovations as lookahead, pipelining, branch prediction, multitasking, memory protection, generalised interrupts, and the 8-bit byte to the commercial market. For those of us who've been brought up on a diet of microprocessor roadmaps, it's a welcome reminder that the latest, greatest chips depend on inventions dating back to the days when the Beatles still wanted to hold your hand.

It is no coincidence that the end of the Second World War saw the start of digital computing. As well as the now-famous work done by Turing and others at Bletchley Park, atomic weaponry research in the US had proved two things -- that nuclear and thermonuclear bombs would define the course of the rest of the century, and that designing the things required more sums to be done than was humanly possible. The push for high-powered computation was on.
By 1955, the University of California Radiation Lab was looking for a computer faster than any built before. IBM bid but lost to Univac -- then the biggest computer company -- and IBM hated to lose. The company came back a year later with a proposal to Los Alamos for a computer with "a speed at least a hundred times greater" than existing machines. It won that one, and had four years to deliver the beast. The project was officially called the 7030, but was far better known as Project Stretch -- it would stretch every aspect of computing.
The innovations began right at the start. Stretch would be built with a brand-new invention, the transistor, and it was the first design to rely on a simulator. This was built by John Cocke and Harwood Kolsky early on, and let the designers try out new ideas before committing them to the final machine -- a method of working that has since become universal.
It's hard to list all the ideas that Stretch embodied and that have since become canon law in processor design. It could fetch and decode multiple instructions simultaneously -- remember the superscalar hype of the late 90s? -- and pipeline them, decoupling decoding from execution. It could predict the results of calculations and speculatively execute code depending on its best guess, and could look ahead to unexecuted instructions to make the best use of its internal resources.
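To make the pipelining idea concrete, here is a minimal sketch in Python -- entirely illustrative, not a model of Stretch's actual hardware. The three stage names, the cycle counts and the misprediction penalty are simplifying assumptions of my own, but they show why overlapping fetch, decode and execute pays off, and why a mispredicted branch, which forces the speculatively fetched work to be thrown away, costs roughly a pipeline refill.

```python
# Toy model of a three-stage pipeline: illustrative only, not the 7030's design.
STAGES = ["fetch", "decode", "execute"]
DEPTH = len(STAGES)

def sequential_cycles(n_instructions):
    """Each instruction finishes all stages before the next one starts."""
    return n_instructions * DEPTH

def pipelined_cycles(n_instructions):
    """Stages overlap: once the pipeline is full, one instruction completes per cycle."""
    if n_instructions == 0:
        return 0
    return DEPTH + (n_instructions - 1)

def pipelined_with_mispredictions(n_instructions, mispredicted_branches):
    """Each mispredicted branch flushes the speculative work already in flight,
    costing roughly one pipeline refill (DEPTH - 1 extra cycles)."""
    return pipelined_cycles(n_instructions) + mispredicted_branches * (DEPTH - 1)

for n in (1, 10, 100):
    mispredicted = n // 10  # assume one in ten instructions is a mispredicted branch
    print(f"{n:3d} instructions: sequential={sequential_cycles(n):3d} cycles, "
          f"pipelined={pipelined_cycles(n):3d}, "
          f"with {mispredicted} mispredictions={pipelined_with_mispredictions(n, mispredicted):3d}")
```

Even in this crude model, the gain from overlapping stages, and the price of guessing a branch wrongly, is visible at a glance.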
So, while you toast the success of the S/360 -- another small sherry? -- remember that it and almost everything else you'll touch with a chip inside is the inheritor of a burst of unmatched innovation, one that flowered years before, in the unholy light of Trinity.

Thursday 21 March 2013


Biological Tooth Replacement Is a Step Closer


Scientists have developed a new method of replacing missing teeth with a bioengineered material generated from a person's own gum cells. Current implant-based methods of whole-tooth replacement fail to reproduce a natural root structure, and as a consequence of the friction from eating and other jaw movements, loss of jaw bone can occur around the implant.
The research, led by Professor Paul Sharpe, an expert in craniofacial development and stem cell biology at King's College London, is published in the Journal of Dental Research.
Research towards achieving the aim of producing bioengineered teeth -- bioteeth -- has largely focused on the generation of immature teeth (teeth primordia) that mimic those in the embryo and can be transplanted as small cell 'pellets' into the adult jaw to develop into functional teeth.
Remarkably, despite the very different environments, embryonic teeth primordia can develop normally in the adult mouth, and thus, if suitable cells can be identified and combined in such a way as to produce an immature tooth, there is a realistic prospect that bioteeth can become a clinical reality. Subsequent studies have largely focused on the use of embryonic cells, and although it is clear that embryonic tooth primordia cells can readily form immature teeth following dissociation into single-cell populations and subsequent recombination, such cell sources are impractical to use in a general therapy.
Professor Sharpe says: 'What is required is the identification of adult sources of human epithelial and mesenchymal cells that can be obtained in sufficient numbers to make biotooth formation a viable alternative to dental implants.'
In this new work, the researchers isolated adult human gum tissue from patients at the Dental Institute at King's College London, grew more of it in the lab, and then combined it with the cells of mice that form teeth. By transplanting this combination of cells into mice the researchers were able to grow hybrid human/mouse teeth containing dentine and enamel, as well as viable roots.
Professor Sharpe concludes: 'Epithelial cells derived from adult human gum tissue are capable of responding to tooth inducing signals from embryonic tooth mesenchyme in an appropriate way to contribute to tooth crown and root formation and give rise to relevant differentiated cell types, following in vitro culture.
'These easily accessible epithelial cells are thus a realistic source for consideration in human biotooth formation. The next major challenge is to identify a way to culture adult human mesenchymal cells to be tooth-inducing, as at the moment we can only make embryonic mesenchymal cells do this.'

You spend a third of your life sleeping. What if your dreams are real? Perhaps our dismissal of dreams as “just dreams” is based on a misunderstanding of the nature of consciousness and physical reality.
“I am real,” said Alice (in Wonderland). “If I wasn’t real, I shouldn’t be able to cry.”
“I hope you don’t suppose those are real tears?” Tweedledum interrupted in a tone of great contempt.
We take for granted how our mind puts everything together. Everything we experience is a whirl of information occurring in our heads. Biocentrism — a new “theory of everything” — tells us that space and time aren’t the hard objects we think they are, but rather tools our mind uses to put everything together. They’re the key to consciousness, and explain why, in experiments with particles, space and time — and indeed the properties of matter itself — are relative to the observer. During both dreams and waking hours, your mind collapses probability waves to generate a physical reality, replete with a functioning body. You’re able to think and experience sensations in a 3D world.
We dismiss dreams because they end when we wake up. However, the duration of an experience doesn’t determine whether it has a basis in physical reality. Certainly we don’t think day-to-day life is less real because we fall asleep or die. It’s true we don’t remember events in our dreams as well as in waking hours, but the fact that Alzheimer’s patients may have little memory of events doesn’t mean their life is any less real. Nor do individuals who take psychedelic drugs fail to experience physical reality, even if the spatio-temporal events they experience are distorted or they don’t remember all of them when the drugs wear off.
We also dismiss dreams as unreal because they’re associated with brain activity during sleep. But are our waking hours unreal because they’re associated with the neural activity in our brain? Certainly, the bio-physical logic of consciousness — whether during a dream or waking hours — can always be traced backwards, whether to neurons or the Big Bang. But according to biocentrism, reality is a process that involves our consciousness.
In contrast to dreams, we assume the everyday world is just “out there” and that we play no role in its appearance. We think they’re different. Yet experiments show just the opposite: day-to-day reality is no more objective or observer-independent than dreams. The most vivid illustration of this is the famous two-hole experiment. When you watch a particle go through the holes, it behaves like a bullet, passing through one hole or the other. But if no one observes the particle, it exhibits the behavior of a wave and can pass through both holes at the same time. This and other experiments tell us that unobserved particles exist only as waves of probability.
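As a rough way to see what “watching” changes in the two-hole experiment, here is a small Python sketch. It is a simplified textbook model rather than a description of any specific apparatus: the wavelength, slit separation and screen distance are made-up illustrative numbers. Without which-path information the two path amplitudes are added and then squared, producing interference fringes; with the particle observed at the holes, the two path probabilities are added instead and the fringes vanish.

```python
import cmath

WAVELENGTH = 1.0        # arbitrary units; illustrative values only
SLIT_SEPARATION = 5.0
SCREEN_DISTANCE = 100.0

def amplitude(path_length):
    """Complex amplitude accumulated along a single path of the given length."""
    return cmath.exp(2j * cmath.pi * path_length / WAVELENGTH)

def intensity_at(x, observed):
    """Relative probability of detecting the particle at screen position x."""
    top = ((x - SLIT_SEPARATION / 2) ** 2 + SCREEN_DISTANCE ** 2) ** 0.5
    bottom = ((x + SLIT_SEPARATION / 2) ** 2 + SCREEN_DISTANCE ** 2) ** 0.5
    a1, a2 = amplitude(top), amplitude(bottom)
    if observed:
        # Watched at the holes: bullet-like behaviour, add probabilities P1 + P2.
        return abs(a1) ** 2 + abs(a2) ** 2
    # Unobserved: wave-like behaviour, add amplitudes first, then square |A1 + A2|^2.
    return abs(a1 + a2) ** 2

for x in range(-10, 11, 2):
    print(f"x={x:+3d}  unobserved={intensity_at(x, False):5.2f}  "
          f"observed={intensity_at(x, True):5.2f}")
```

The unobserved column oscillates between bright and dark fringes while the observed column stays flat, which is the behavioural difference the paragraph above appeals to.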
Critics claim this behavior is limited to the microscopic world. But this “two-world” view (that is, one set of physical laws for small objects, and another for the rest of the universe) has no basis in reason and is being challenged in labs around the world. In 2009 (Nature 459, 683), researchers showed that quantum behavior extends into the everyday realm. Pairs of vibrating ions were coaxed to entangle so their physical properties remained bound together when separated by large distances (“spooky action at a distance,” as Einstein put it). “Such situations are not observed in nature,” stated the authors. “This may be simply due to our inability to sufficiently isolate the system of interest from the surrounding environment — a technical limitation.” Other experiments with huge molecules called “Buckyballs” also show that quantum reality extends beyond the microscopic world. And in 2005, KHCO3 crystals exhibited entanglement ridges one-half inch high, quantum behavior nudging into the ordinary world of human-scale objects.
Whether awake or dreaming, you’re experiencing the same bio-physical process. True, they’re qualitatively different realities, but if you’re thinking and feeling, it’s real. Hence René Descartes’ famous statement: Cogito, ergo sum (“I think, therefore I am”).