Welcome to Singulati: Our Singularity Weblog Forum

Singulati is a free digital forum open to both singularitarians and skeptics, who come together to discuss, debate, collaborate, learn, and network in a friendly, informal, and rewarding atmosphere. Join us today: together we can build a better future and a better you!


Are machines already winning?


4:53 pm
May 1, 2011

Nikki Olson


posts 85


It may sound like an odd question to ask; after all, AI systems are still pretty dumb, and most of our best robots have trouble remaining upright when traversing diverse terrain. AI does not yet do well in 'our world'.

But in many ways, we are not coping well with post-industrial life, in the 'machine's world'. There are negative health effects of living in an urban rather than rural environment, synthetic food is increasingly being blamed for (the newer) human health problems, virtual communication and online socializing are being blamed for disrupting important social bonds, and so on.

Machines will suffer no such ailments. So not only will machines outsmart us one day; they also in no way need to answer to biological ancestry in the way we do.

Citizens of industrialized societies live longer than foragers, largely because of the opportunity for medical intervention, sanitation, irrigation, and so on. The ideal is the best of both worlds: lots of time in nature, but with the conveniences of advanced civilization.

Humans are accountable to their 'history' (our genes are optimized for savanna life and haven't evolved much since, and we have to deal with the negative effects of deviating from that optimal situation). Machines are not accountable to their 'history' in the same sense.

How do you think this will all play out? How do we remain accountable to our biology on the one hand, yet compete with machines on the other? 

CMStewart


6:43 pm
May 1, 2011


east coast USA


posts 64


Post edited 2:32 pm - May 2, 2011 by CMStewart

My guess is we will experience a period of traumatic adjustment to eventual synthetic existence, even more traumatic than the one we are experiencing now (urban depersonalization, synthetic / GMO food dangers, etc.). Many will be unable, or unwilling, to endure the transition from a biological existence to a biological-synthetic existence, and eventually to a 100% synthetic existence (in the far future). As Kurzweil and other leaders in the Singularity movement have said, we are operating on "outdated software," and the transhumanism movement is largely an attempt to correct this liability.

I already see the transhumanism movement developing a distaste for the "unenhanced human." Within a few decades, unenhanced humans will be the new "Third World," and will be pitied and scorned. There is nothing we can do to stop this; it is part of the human condition, and may eventually be the only "fully human" characteristic we carry into a 100% synthetic existence.

Humans to AI: "Do as we say, not as we do."

8:14 pm
May 1, 2011

Nikki Olson


posts 85


Hi CMStewart! 

Thanks for the excellent response! It is something I have been thinking about for some time, and I think you are right. It is also something I wanted to write about for SW, so I may quote you! Capitalism is itself a machine, one that led to our building the perfect environment for machines, but not for humans. There is no going back, and we wouldn't want to go back either; so, like you say, we will go through a painful transition into living well in the 'machine's world'.

CMStewart


2:53 pm
May 2, 2011


east coast USA


posts 64


Post edited 2:54 pm - May 2, 2011 by CMStewart

You're welcome, Nikki! Feel free to quote me, I would be honored.


Yes, capitalism is a machine . . . and while I recognize its obvious advantages (spurring entrepreneurship, technology, competitiveness, etc.), I also see how destructive it is when unregulated. It seems we are on a teeter-totter, and we must achieve just the right balance of technology and safety. Too much unregulated technology will lead to our destruction, whether through nuclear war, catastrophic global climate change, or tainted food and water. Too little technology, especially now, when we need it to correct our technological mistakes, will also end our existence, as we are on the path to self-destruction. We live with this knowledge of potential doom every day, and are complacent about it. And with each higher level of potential doom, we adjust our collective mindset accordingly, just so we can go about our day-to-day lives without becoming neurotic. Unfortunately, ignoring a problem (usually) does not make it go away.

Humans to AI: "Do as we say, not as we do."
