Ray Kurzweil has written a series of books (The Age of Intelligent Machines, The Age of Spiritual Machines: When Computers Exceed Human Intelligence, and The Singularity Is Near: When Humans Transcend Biology) whose underlying theme is that intelligence will blossom unimaginably, aided and abetted by an equivalent blossoming of machine intelligence. And the arguments are compelling – the sheer momentum of Moore's Law has seen an unbroken doubling of computing power every 12 to 16 months for the past 40 years. CPU speed and processing power are at the core of Moore's Law, but over the last 20 years memory capacity, disk speed and capacity, communication bandwidth, and more have all joined the CPU bandwagon. Even natural physical barriers such as the speed of light in silicon, the limits of miniaturization, and the inherent randomness of atomic-level processes appear to fall by the wayside as new solutions emerge. So the progression of computing power is now at gigaflops – billions of operations per second – and will be 1000 times that in 10 to 15 years. Ray has started to do the imagineering of what that kind of computing power could bring about.
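A quick back-of-the-envelope check of that projection (my arithmetic, not Kurzweil's): ten doublings multiply power by 2^10 ≈ 1024, and at one doubling every 12 to 16 months ten doublings take roughly 10 to 13 years – which is where a "1000 times in 10 to 15 years" figure comes from. A minimal sketch, assuming a perfectly clean doubling schedule:

```typescript
// Back-of-the-envelope Moore's Law arithmetic.
// Assumption (mine, not the article's): a clean doubling of computing power
// every `monthsPerDoubling` months, ignoring real-world bumps along the way.
function speedupAfterDoublings(doublings: number): number {
  return 2 ** doublings;
}

function yearsForDoublings(doublings: number, monthsPerDoubling: number): number {
  return (doublings * monthsPerDoubling) / 12;
}

// Ten doublings is roughly a 1000-fold gain...
console.log(speedupAfterDoublings(10)); // 1024
// ...and at the 12-to-16-month pace cited above, ten doublings take 10 to ~13 years.
console.log(yearsForDoublings(10, 12)); // 10
console.log(yearsForDoublings(10, 16)); // 13.333333333333334
```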
But of course the Devil is in the details.
Hardware computing power has always led software development, and software development has come nowhere near matching the relentless productivity pace of hardware. Even software that generates software is at best primitive, clumsy, and woefully incomplete – read the commentary on Model Driven Development.
The fundamental retardant to software emulating hardware's Moore's Law is "do no evil".
"Do no evil" is a paraphrase of "Don't be evil", the motto famously attributed to Google's Larry Page and Sergey Brin. But it applies to software development in the following sense: one must suspend evil intent in the development of most complex software, because that software, in order to be effective, is granted rights and privileges that could easily disable the whole system. Think of the eval() function in JavaScript, the old-fashioned ability of dBASE macro commands to create new commands on the fly, or the privileged status of OS kernel processes.
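A minimal sketch of why this matters – the variable names and "plugin" strings below are invented for the illustration, but the mechanism is exactly the eval() mentioned above: code handed to eval() runs with the full privileges of the host program.

```typescript
// Hypothetical example: a program that evaluates "plugin" code it did not write.
const accountBalance = { amount: 1000 };

// A well-behaved plugin string...
const goodPlugin = "accountBalance.amount * 0.05";

// ...and a malicious one. eval() cannot tell the difference: both run with
// the same rights as the surrounding program and can read or rewrite its state.
const evilPlugin = "(accountBalance.amount = 0, 'looks harmless')";

console.log(eval(goodPlugin));      // 50
console.log(eval(evilPlugin));      // "looks harmless"
console.log(accountBalance.amount); // 0 -- the host's state was silently destroyed

// If the untrusted input is data rather than code, JSON.parse() grants no such
// powers: JSON.parse("(accountBalance.amount = 0)") simply throws a SyntaxError.
```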
This is the conundrum of software development. The deeper it delves into advanced automatic code generation; AI-based modification of a system's goals, constraints, and priorities; and dynamic program changes – to say nothing of distributed processing and long-transaction synchronization – the wider it opens the Pandora's box of unintended evil consequences.
And don't be naive.
Security experts are astonished, if not overwhelmed, by the number and complexity of zero-day viruses, rootkit-based attacks, and sophisticated dynamic hacker networks controlling hundreds of thousands of infected and now zombified machines. This swiftness in turning the latest software technologies to ill gain has a strong parallel in the current credit crunch, which started in the US but is now spreading worldwide. Computationally intensive and complex derivatives and other financial instruments were exploited very quickly by unscrupulous financial players to dupe greedy but supposedly sophisticated financial analysts and investment bankers. The result has been huge write-downs and losses at the banks amounting to well over a quarter of a trillion dollars and still growing. In sum, more than just the Shadow knows that evil lurks – and that it takes advantage of the latest advances in software development nearly instantaneously.
To make matters worse, the battle against software evil, like the battle against terrorists, is asymmetrical and uneven. Most of the latest software developments are open and readily available in published dissertations and floods of journal and web articles, and often the software itself is free to try on the Internet. In contrast, hackers work in secret on malware such as bot controllers for tens of thousands of zombies and cloaking devices that allow viruses and rootkits to mutate and stay invisible until called upon by their master controllers to launch ever more sophisticated attacks. The zero-day attacks themselves provide the first clues that the malware has new capabilities and attack vectors. Can you imagine if sophisticated AI algorithms were marshalled for nefarious ends? That is the problem confronting ever wider sets of software developers – they have to do due diligence to ensure that a) their software is not subject to a commandeering attack and b) they have ways of disabling their software if it is ever modified and redirected to do evil.
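A minimal sketch of what that due diligence could look like – the file paths, the allow-list, and the kill-switch location are all hypothetical, and a real system would use proper code signing rather than a bare hash check:

```typescript
import { createHash } from "crypto";
import { readFileSync, existsSync } from "fs";

// (a) Refuse to run extension code that has been tampered with.
// Hypothetical allow-list of SHA-256 digests of extensions we actually shipped.
const trustedDigests = new Set<string>([
  "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08", // sha256("test")
]);

function loadExtension(path: string): string | null {
  const code = readFileSync(path, "utf8");
  const digest = createHash("sha256").update(code).digest("hex");
  if (!trustedDigests.has(digest)) {
    console.error(`Refusing to load ${path}: unrecognized digest ${digest}`);
    return null; // commandeered or corrupted code never gets executed
  }
  return code;
}

// (b) A crude kill switch: the operator can disable the whole subsystem
// after the software has shipped by dropping a file in place.
function killSwitchEngaged(): boolean {
  return existsSync("/etc/myapp/disable-extensions"); // hypothetical path
}

if (!killSwitchEngaged()) {
  const code = loadExtension("./extensions/report.js"); // hypothetical extension
  if (code !== null) {
    // ...hand the verified code to whatever sandboxed runner the system uses...
  }
}
```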
So the first obstacle to software development is those who wish to do evil – and the constraints they place on software development. The next constraint is to do no evil economically.
Do No Evil Economically
There is a recent NYTimes article on how Google is replacing Microsoft as computing's primary monopoly. Of course, because this is an article in the Business section, the word monopoly is never used; instead we get its euphemism, "network effects", and what they allow a software company to achieve – dominant market share for several years (hint: a monopoly). Here is how the article explains it: "Microsoft was a master practitioner of network effects, the straightforward precept in economics that the value of a product or service often goes up as more people use it. There is nothing new about the concept. It was true of railways, telephones and fax machines, for example. Microsoft, however, [acquired and] applied the power of network effects more lucratively than any company had done before it."
In short, Microsoft carefully courted journalists, reviewers, and developers to create the impression that there was a swell of buying interest around its software, often pricing or bundling it with temporary specials so that large sales were guaranteed. Later, when competitors threatened markets Microsoft was late to enter or wanted to dominate, it engaged in anti-competitive practices like charging nothing for products matching Netscape's entire product line and incorporating anything it wanted into the Windows operating system, killing off vendors that had at one time been crucial to the success of early versions of DOS and Windows.
Open Source owes a good deal of its origins to the fact that venture capitalists refused to fund firms that would charge a purchase price for a product if it infringed at all on any Microsoft market, existing or implied. Finally, Microsoft perfected the standards game: 1) race out in front of everybody else and work on the software routines that set the standard, then work in the committees to get the Microsoft version approved; 2) if you get a Microsoft standard, add proprietary extensions that work best in Microsoft software only, to get customers hooked on Redmond; and 3) if you don't get the standard, then either poison the well (as in the case of the Java JVM, where Redmond insisted the contract language allowed it to maintain and distribute its own version ad infinitum, so that the latest JVM would likely never be downloaded onto Windows PCs – which happened to have a 95% market share) or simply ignore the standards and do your own thing (SVG, XForms, PNG, and JPEG 2000 have had wide acceptance among the developer community for many years yet have been completely ignored by Microsoft and "duplicated" by XAML, ASP.NET Forms, and .wmp – all proprietary technologies).
All these practices are part of what the NYTimes article describes as “direct network effects”.
However, Microsoft is not alone in striving to achieve monopoly share in the computing field. IBM used equivalent practices to achieve its mainframe and minicomputer monopolies from the mid-1960s to the late 1980s. All of the major vendors in the database field, including IBM, Microsoft, Oracle, and Sybase, have used variations on these techniques to try to wrest a monopoly share of the database market. It is instructive to note that only MySQL among some very promising database startups has been able to break into the top database companies – and it did so using an Open Source strategy coupled with an innovative database design based on pluggable engines. So what's wrong with a little rough-and-tumble Darwinian business practice?
In optimization theory the problem with monopolies is called a premature local maximum. For a short period of time, the monopoly really is the best solution in the computing field. But in the process, promising technologies starve and die out, while the winning technology often becomes less and less responsive to the changing needs of the market. And given the pace of change in computing, with Moore's Law creating enormous new price and performance opportunities every 4 to 6 years (we're talking price/performance changes of 4 to 16 times), the old local maximum is often no longer a maximum at all – hence the current move to Cloud Computing to replace the Microsoft client (other servers allowed) model. In short, Joseph Schumpeter's observation that monopolies plant the seeds of their own destruction is true; he just didn't warn you how slow that transition could be.
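A toy illustration of the premature local maximum problem – the fitness function and the numbers here are invented purely for the example. A greedy climber that only ever accepts the best adjacent move stops on the nearer, lower peak and never discovers the higher one:

```typescript
// A made-up "fitness" landscape with two peaks: a modest one near x = 2
// and a much better one near x = 8.
function fitness(x: number): number {
  return 3 * Math.exp(-((x - 2) ** 2)) + 10 * Math.exp(-((x - 8) ** 2));
}

// Greedy hill climbing: from the current point, move one step in whichever
// direction improves fitness; stop when no single step helps.
function hillClimb(start: number, step = 0.1): number {
  let x = start;
  while (true) {
    const candidates = [x - step, x + step];
    const best = candidates.reduce((a, b) => (fitness(a) > fitness(b) ? a : b));
    if (fitness(best) <= fitness(x)) return x; // stuck: no neighbor is better
    x = best;
  }
}

const winner = hillClimb(1.0); // start in the basin of the lesser peak
console.log(winner.toFixed(1), fitness(winner).toFixed(2)); // ~2.0, ~3.00
console.log(fitness(8).toFixed(2));                         // 10.00 -- the peak it never finds
```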
Hence "do no evil economically" is simply not operative in the software development world, and it acts as a second barrier to software ever approximating the speed of hardware development – a speed Ray Kurzweil almost assumes software will match. Fifty years and counting, and I am still waiting to have computing at my fingertips.
The Evil of Human Limitations
One hundred years and we still don't understand how our brains store memories. Sixty years and counting and we still can't carry on a conversation with HAL. Seven plus or minus two is the average human memory span for learning new sequences of numbers or characters. And yet the new 80-core CPUs will require programmers to master sequences of coding constructs whose best arrangement depends on many variable factors. Programmers are going to need computer assistance to code cleverly – but we have already seen the risk of that. Thus, Do No Evil will have to triumph if software is going to be able to support Spiritual Machines.