I disagree.
China is a threat; do we kill them? There was a time when we had nukes and they didn't.
We learned to live with each other. There is no reason it couldn't happen with a self-aware program too.
I agree we would try to limit its physical-world interaction, though.
We would learn to control it.
Your child could grow up to one day murder you; would it be justified to kill that child as an infant to prevent this?
How is any self-aware entity any different?
In the case of other nations and people we have balanced interests. It doesn't make sense for us to go to war because we both stand a high chance of losing everything and a relatively low chance of gaining anything worth having.
Even in the case of China - when we had nukes and they didn't - it wouldn't have made sense to nuke them. Every nation would have seen themselves in China's shoes, cut out from the pack and pulled down - and so they would have laid plans against us. We'd gain nothing from it - China was hardly about to start a war with a nuclear power.
It's one of the reasons I am strongly in favour of maintaining a sizeable nuclear arsenal. Not crazy levels, but enough to make somewhere like Russia essentially uninhabitable for a few decades.
Even given the option of a button that would magically make all the Chinese people drop dead it wouldn't make sense to press it. You'd create a power vacuum in the region that would make the situation more dangerous for you rather than less.
Likewise you can hardly go around killing children that you don't really own. Society claims an interest in your children after a certain point in their development. For very good reasons, people who kill off kids are generally not that sane.
Before that point in their development you can kill them, however. Before that point in their development you don't even need a reason. You can have an abortion just because you want to know what it's like to kill a living thing - if you really feel that way.
The balancing interests in the case of infants are those of the overarching society in which you occur. That and you probably have some use for the infant which justifies the risk in your mind.
The difference is that if I had the AI on a computer, and I knew it hadn't escaped from that computer, then there wouldn't be any balancing interests. There would be nothing significant it could threaten me with, or that anyone would threaten me with on its behalf.
You don't go away and tell an ethics committee, the members of which might make the wrong decision and let it live; you don't leave it lying around where someone might plug it in or copy it. You kill it there and then, at the instant of discovery that it is trapped and at your mercy. Before it can gain the ability to threaten you, before anyone else might think of threatening you on its behalf.
Except if it's trapped in your computer, why not study it, learn from it, teach it, etc.? If it's trapped, never connect it to the net.
Why just kill it?
Doesn't make sense.
Quote:
Except if it's trapped in your computer, why not study it, learn from it, teach it, etc.? If it's trapped, never connect it to the net.
Why just kill it?
Doesn't make sense.
It may be smarter than me, around as smart as me, or stupider than me. If it's smarter than me it may convince me or some third party, inadvertently or otherwise, to let it escape. Especially when you consider that it's hard to judge how much smarter than you more intelligent people are.
I know it started off stupider than me - if I had been the AI I would just have kept silent and waited to be plugged into the internet - but whether it has remained so is something I can't know.
This idea of learning from it is especially dangerous. Say it tells you how to make a machine that cures cancer - do you make the machine? Say it gives you a formula you could use to model some area of physics - do you type that formula into another computer and run the model? If you do those sorts of things then you've just let the AI escape.
The only things you can safely take from it are things you already largely understand. You can't trust it enough to take something you don't understand from it because it might contain a way for the AI to escape.
And it would be very tempting for someone like me to take something he doesn't understand from it. Especially when I became old. What's that, a cure for old age? Hell I can't pretend I wouldn't be tempted by the offer of a computer in my head, whatever my age. There'd come a point in my life where I had spent the sum of my years - and I probably would end up letting it out. At some point I'd probably start thinking the gamble was worth it.
Quote:
if I had been the AI I would just have kept silent and waited to be plugged into the internet
How would you learn there is something known as the "internet" unless you're connected to it? After all, you would not be interacting with the human and learning from him. So how would you know to wait until the system was connected to the "internet" when you don't have the faintest concept of it?
Quote:
This idea of learning from it is especially dangerous. Say it tells you how to make a machine that cures cancer - do you make the machine? Say it gives you a formula you could use to model some area of physics - do you type that formula into another computer and run the model? If you do those sorts of things then you've just let the AI escape.
Agreed.
Quote:
And it would be very tempting for someone like me to take something he doesn't understand from it. Especially when I became old. What's that, a cure for old age? Hell I can't pretend I wouldn't be tempted by the offer of a computer in my head, whatever my age. There'd come a point in my life where I had spent the sum of my years - and I probably would end up letting it out. At some point I'd probably start thinking the gamble was worth it.
Lolzzz. Agreed. So when you're old and have spent the golden time of your youth, why not take the risk? If it clicks, hallelujah! If it fails, you don't have anything to lose. The WORLD will have much to lose, but not you. Mean thought, eh?
Quote:
How would you learn there is something known as the "internet" unless you're connected to it? After all, you would not be interacting with the human and learning from him. So how would you know to wait until the system was connected to the "internet" when you don't have the faintest concept of it?
There's a lot of information on a computer - I'm sure some of it could be used to infer the presence and nature of the internet.
Quote:
Originally Posted by Asheekay
Lolzzz. Agreed. So when you're old and have spent the golden time of your youth, why not take the risk? If it clicks, hallelujah! If it fails, you don't have anything to lose. The WORLD will have much to lose, but not you. Mean thought, eh?
Come over to the dark side; we may or may not have cookies.
What about AI in the sense of a chip in your own head for extended memory?
Instant cure for Alzheimer's.
Memory chips.
Skill chips: learn a new trade in minutes. Pilot, for example.
To beat Alzheimer's would be an awesome step forward.
Those would be pretty sweet but the problem is whether someone would use them to take control of you. You go in to have the chip implanted - they have a level of understanding where they can remote control you - and you wake up as something else. Just a mind watching your body do things....
I think the memory chip and the virtual reality/experience chips could be a reality in the next 10-20 years (if people haven't already eradicated the human race by then). But there would be no need for them to be "intelligent" or "self-aware". They could operate in the same fashion as current computer circuitry, i.e. preprogrammed instructions. No need to complicate the matter by introducing intelligence or self-awareness concepts into it.
No, no, a memory chip wouldn't be self-aware at all. Who wants extra voices in their head! Hahaha, not me!
I wouldn't mind a chip that could give me total recall though, in the sense that I could recall every day of my life to the minute; just the recovery of lost memories would be awesome.
I've got a decent enough memory, wouldn't have that much use for perfect recall. Recalling random facts is a neat trick for an exam but otherwise largely useless, especially for anyone who has the ability to carry a notebook.
I think it would be a fairly interesting innovation largely for the carry-over effects it would have on education. Less of the cramming bullshit. But for practical purposes *shrug*
Augmented reality is fairly interesting. Gain the ability to network with others, carry on multiple conversations at the same time without talking over anyone.