Sept 30, 2009
My talk just got accepted for Toorcon San Diego 11. It's an incredible honor, and I hope to live up to it by working my tail off for the next 26 days. I'm happy about everything, and I'm looking forward to swimming in the ocean no matter how cold it is.
I told my mother about my interest in the singularity today. The singularity describes the point at which exponential technology growth becomes impossible to keep up with. The explanation I gave my mother is like showing a television to a child: the television does something the child understands, but the child doesn't understand why. Confusion with sufficiently advanced technology is certainly expected (the CEO of Amazon thought DSL was complex in 2003), but that isn't what the singularity is about. The singularity assumes that technological change will become so rapid that no single human can understand even a reasonably sized chunk of it. You can understand your specialty and you can understand what your colleagues say, but a person won't be able to understand the gadgets they carry around. Yet education is incredibly important to people in our society. It doesn't take long to document what you're doing, but even with completely open source it takes far longer to understand how something works than to figure out how to use it. With user interfaces improving through repetition and copying, as well as impressively intelligent designers putting time into UI, understanding will always lag behind usage. If we assume that technology will continue to grow exponentially, we are in for a treat as it washes over us like H.G. Wells's Eloi in The Time Machine. My plan is to reassess the situation every so often and decide what to do on a case-by-case basis. Planning too far ahead for something like the singularity seems like a bad idea.
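The exponential-vs-linear gap I'm describing can be sketched with a toy calculation (the doubling period and learning rate here are made-up numbers, purely for illustration):

```python
# Toy model: technological capability doubles every couple of years,
# while one person's understanding grows roughly linearly. The fraction
# of technology a single person can follow shrinks toward zero.

def fraction_understood(years, doubling_period=2.0, learning_rate=1.0):
    """Fraction of total capability one person can follow after `years`."""
    capability = 2 ** (years / doubling_period)   # exponential growth
    understanding = 1 + learning_rate * years     # linear growth
    return understanding / capability

for y in (0, 10, 20, 40):
    print(y, fraction_understood(y))
```

With these arbitrary numbers you start out understanding everything, follow about a third of it after a decade, about 2% after two decades, and effectively nothing after four. Whatever constants you pick, the exponential eventually wins.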
The person who coined the term is giving the keynote at Toorcon 11. Science fiction authors are definitely at the forefront of singularity prediction. My favorite sci-fi author, Cory Doctorow, is in the habit of writing about societies that have passed the theoretical technology point for a singularity. He may be considered an anti-singularity writer, but that's just a viewpoint on the society that ends up with the technology, not on the technology itself. For me, the technology in the hands of, say, plants is profound enough to change the way we understand the world. A person who wishes to understand what they use should have no problem finding documentation, but it would probably read like calculus, AI, and Python mangled into some sort of web UI. Or maybe it'll look like the Gibson in Hackers. I mean, the UI designers will probably be the right age for that movie by the time the singularity occurs.
Back in 2005 I felt it necessary to write down what I foresaw as the proper way to bring about the singularity. At the time, there were three obvious ways I could see the singularity occurring in the next 10-100 years. This is what I wrote:
1) Artificial Intelligence: if computer scientists are able to create an AI that learns well enough to become as smart as a human, two things will follow: all manual labor can be done by robots, and the AI will be able to become smarter than a human by adding more computational power. The AI will then do what we humans are having a tough time doing: bringing about the singularity by some other method.
2) Nanotechnology: the MNT (molecular nanotechnology) assembler would be able to manufacture any object from the ground up. That includes itself, which means it could replicate a copy of itself. The nanites would then be able to create an object of any size, given the proper amount of material. They could create solar panels, build a fusion reactor, transport water, create and maintain power lines, build the appliances that run on them, etc. All the materials could be designed to be as strong or as convenient as possible. It's quite obvious that mature nanotech in the next decade or two will cause the singularity.
3) Fusion power: economical fusion power would give the world a source of energy so large that we could grow plants in our houses under sunlamps for cheaper than groceries. Cars would drive at the same price they do now, except with no price jumps due to shortages or natural disasters. Since supply will outstrip demand, prices will drop until power is extraordinarily cheap. This will likely cause a renaissance that allows anyone to work on anything they want -- a singularity not as glamorous as the nanotech singularity, but one likely to cause the AI or nanotech singularity to occur.
As you might notice, the above was definitely not written today. How can you tell? The writing is similar but missing key ideas that I have tried to get across recently. By 2005 I had written hundreds of blog posts, but none of them had a reasonably sized audience, and in none of them did I have an interest in making my words precise. Anyhow, the ideas are fairly worthwhile.
Two criticisms I have of the above thoughts are:
1) Nanotech won't solve fusion as easily as I claimed. Fusion is not about mechanical or chemical processes; it's about the physics of atoms undergoing fusion. On the other hand, making solar electricity free would mean that fusion wouldn't be necessary to power a massive economy.
2) Cheap power created by fusion won't automatically cause a singularity, since a singularity also requires software. It would make for free housing (free electricity plus rented machining technology turns the crust into warm housing for underground dwellers) and free food, but that wouldn't create a singularity without the software to match. It would be pretty silly to make everything free and then have everyone work day jobs to sustain it.
So, how will the singularity play out in the best-case scenario? My bet is on the tortoise -- AI and nanotech will slowly integrate themselves into every technology we create. They won't seem fancy because we'll understand them as much as we want to. When the last pieces come together, there'll be plenty of time to switch over to the free stuff while we're all waiting for something profound to happen. We'll spend a lot of time chatting and writing about it, and many of us will complain about and resist the new age. Unlike Kurzweil, I don't think luddites will disagree so much as opt out. Opting out of a technological change or a societal construct is a well-respected pastime.
For example, I consider a majority of my virtues to hinge on the things I opt out of. I opt out of war because I value human life. I opt out of fashion because it degrades the value of human character, and I value human character. I opt out of dating because I consider it a rejection of autonomy in human relationships, and I value freedom. Since many people have caused me to question my assumptions, I dated two women over the past 2.5 years and learned much. I also opt out of driving because I consider automobiles too expensive and unnecessary. Since 2003 I have mostly opted out of eating meat, dairy, eggs, or any animal products. I'm not a vegan (long story) because I have a difference in opinion, but I value my source of food and would never want to eat more than I need. I often opt out of politics because I consider large blameless committees and corrupt politicians an improper way to decide the use of public funds. Out of the ~9 years I have been paying taxes, I have opted out of paying a large amount in ~8 of them because I don't believe in government, taxes, law, politicians, war, or coercion. I opt out of watching professional sports because I don't value the development of muscle for sport. I opt out of fads when I see them because I don't like being part of a group unless they're smart or right. I opt out of haircuts, sometimes for months, because I can hardly understand why people care so much about them. Opting out is not very special; it's just the natural state of being when you aren't doing something. You don't even need a reason to opt out. It's not like I have to opt out of not existing.
I guess this blog needs a conclusion, eh? Let's say the singularity doesn't happen. What does that change today? Can the future affect today? Sure, but knowing whether or not the future will be weird doesn't affect you, right? If I base my business around the singularity and it doesn't happen, it's a failed business, and a big kink in my business model. Someone who writes science fiction for a living but misses the whole singularity thing might make a rather large mistake, writing a future like the one we expected in the '50s and '60s. The flying car was an obvious mistake if you think back with hindsight. If you think ahead with foresight, you shouldn't be thinking about flying cars anymore, because a shift in our priorities has made them undesirable even if they were possible. A science fiction writer these days needs imagination and a grip on the readers' understanding of the world (which is difficult without a similar thought process) while looking at the future or a strange new world. In the same way, a programmer who is starting a business needs imagination, skill, and a grip on the customer's needs (which is difficult without a similar need) while looking at the future or an exotic new technology.