When I say Artificial Intelligence, I specifically mean the technological kind. I think someone above mentioned something about creating a brain? That isn't technological advancement, that's biological advancement. Whilst Technology and Biology can be made to work together, if someone 'created' a Biological Brain / Person, then they wouldn't have created a Robot, but rather, a Person.
It is then that you could question what abilities that person would have. In theory, depending on the knowledge of the Human Brain at the time, you could create it with only a bunch of basic commands, if you could 'fine-tune' a Brain. Ever wanted your own, personal sex slave, or someone to wash the dishes?
I think Freyr has summed up what I mean by AI quite well. Would something that has been programmed to do a specific task be able to program itself to do something else? If we created a purely technological machine, in a humanoid form, would it ever actually be able to become 'self-aware'?
Quote:
Originally Posted by Mr. Pedantic
Just a question: how do you tell someone (or something) that they're self-aware? Seems kind of contradictory, doesn't it?
I'd figure you'd open your mouth and say "hey, you over there, you're self-aware, congratulations".
I don't think you'd be able to tell someone / something that they were self-aware any more than you'd be able to notice it yourself. Think of someone who has a severe mental disability: are they self-aware? Do they understand the concept of life and death?
Quote:
Originally Posted by jackripped
If it doesn't realize it, it's not self-aware, and I ask you, do you remember anyone asking or telling you, as a very young child, that you are self-aware? Or did you just realize it?
I agree with the highlighted part of this post. If someone / something isn't aware that they're self-aware, then essentially, they aren't. I think we should bear in mind, though, that there is a difference between self-aware and sentient. Animals are self-aware, yet they're not sentient. Based on this, a machine could very well be self-aware.
Quote:
Originally Posted by jackripped
The human brain is an analog computer that's self-aware, and can self-teach.
With respect, the human brain is a biological component that has evolved. The 'brain' of a computer is a technological component that is programmed. Technology doesn't evolve on its own.
Quote:
Originally Posted by Asheekay
The difference here is of the Creator-Creation relationship. The Creator always has to be one step ahead of the Creation. A child can create little mud toys, but can the child also create another child? No. Similarly, programming and hardware experts can create machines which can learn and progress, but their results would not equal, nor exceed, the mental level of those experts themselves.
Yet, a computer can do a calculation a lot quicker than any human.
Quote:
Originally Posted by Professor Dr. Scientist
People eat food for energy. Artificial life would need energy too. It's not like it would be able to operate for free infinitely.
And to think, by the time such a thing is created, we'll be pretty short of petrol.
Quote:
Yet, a computer can do a calculation a lot quicker than any human.
It can do the calculations faster, but it can only do the calculations we allow it to, which is why we are of greater mental capacity than a machine. We decide what the computer knows, and that's all it can know. It doesn't learn, it merely accepts (because it has no choice).
Quote:
It can do the calculations faster, but it can only do the calculations we allow it to, which is why we are of greater mental capacity than a machine. We decide what the computer knows, and that's all it can know. It doesn't learn, it merely accepts (because it has no choice).
That's what I've been trying to say in so many long paragraphs, and they never understood it. Good work out there with the power of expression, friend. I hope you aren't a "self-aware bot" who read all the posts, "understood" them and replied with a solution of your own that was NOT programmed into you.
Quote:
Originally Posted by jackripped
Prove it to me now and beat the chess game that IBM created; you should be able to predict all of its moves, since you fully understand what it's going to do.
Screen shot it for me.....
If it having more computational power means that we have become the students of the computer in chess, then my friend, we have been the students of horses in speed for a long time in history, and now we are the students of fighter jets. Weightlifters should officially become the students of forklifts and cranes, swimmers should choose speedboats as their mentors, and divers should follow the path of atomic submarines.
So you mean that Deep Blue II is superior to the team of scientists who created it and fed it the logic functions to work on?
So here: Deep Blue II and I play chess. The international organization of chess changes the rules and regulations. Is Deep Blue II still going to beat me? I don't think so. It won't even be able to make a legal move any more; all of its moves are going to become illegal.
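To make that concrete, here is a toy sketch in Python (nothing to do with Deep Blue's actual code; the function names are mine) of how the rules of chess end up hard-coded in an engine, so that a rule change by the governing body leaves it producing illegal moves until a human reprograms it:

Code:
# Toy sketch only -- NOT Deep Blue's real code. It shows how the rules of
# chess end up baked into an engine: this check "knows" only the standard
# rook move, and if the rules changed tomorrow it would keep enforcing the
# old ones until a human rewrote it.

def squares_between(start, end):
    """Squares strictly between two squares on the same file or rank."""
    (f1, r1), (f2, r2) = start, end
    if f1 == f2:
        lo, hi = sorted((r1, r2))
        return [(f1, r) for r in range(lo + 1, hi)]
    lo, hi = sorted((f1, f2))
    return [(f, r1) for f in range(lo + 1, hi)]

def rook_move_is_legal(start, end, occupied):
    """Standard-rules rook move: slide along one file or one rank, path clear."""
    if start == end:
        return False
    if start[0] != end[0] and start[1] != end[1]:
        return False
    return not any(sq in occupied for sq in squares_between(start, end))

print(rook_move_is_legal((3, 1), (3, 5), occupied=set()))   # True under today's rules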
The thing is, I talk of what IS, and you talk of what COULD BE on some distant day in the future. I take you to the world of today, and you take me to the world of 100-200 years in the future, and yet you don't know for sure what the scenario would be at that time.
As for the internet and the thought of a "cloud intelligence", you might know that the computers on the internet do NOT work as a single unit. Each computer follows its own instructions. It's not that information is freely flowing everywhere.
Plus, even if a cloud intelligence does arise (let's theorise it for a moment), all we would have to do to defeat it would be to shut down our systems. Format our hard disks, reinstall our operating systems, and we are back on the go again.
Furthermore, how can you say that a "self-aware" program would be able to write, compile and run instructions of its own when it has not been programmed with that ability? Here, you have far more mental ability than a computer: can you write a book about rocket science, or about how the engine of a supersonic plane works? No, you can't. Why? Because you do not have that information. Simple.
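Just to show what "being programmed with that ability" would even look like, here is a minimal Python sketch (purely hypothetical, not any real AI system): the program can build and run new instructions only because a human wrote the lines that give it that ability.

Code:
# Toy sketch only (hypothetical, not any real AI system). A program can
# "write, compile and run instructions of its own" only because a human
# wrote the function below that grants it the ability -- so it is still
# just following the instructions fed into it.

def build_and_run(new_code):
    """Compile a string of Python handed to it at runtime, then execute it."""
    compiled = compile(new_code, "<generated>", "exec")
    exec(compiled, {})

# the "new" program is still something the original program was told to create
build_and_run('print("hello from instructions that did not exist a moment ago")')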
There are bot programs right now that network millions of computers, like yours as well, and most people and the best anti-virus software can't stop them; in fact, 80% of net PCs are infected. So to say all computers on the net are totally individual isn't really true: they are interconnected in the biggest network humans have ever created.
IF a program becomes self-aware, it's a real issue to try and kill it off if it gains access to the net, because it can spread fast and restore from millions of locations if needed.
Are you self-aware, and can you think outside the square and just make a story up from your imagination? A self-aware thinking program will have the same abilities; there's no reason why it wouldn't. In fact, if it doesn't, it's not truly an intelligent self-aware being and will not be able to comprehend anything.
So a self-aware program would probably first research programming, teach itself to program, then start creating its own programs for other applications, and it would probably be doing all that within 24 hours of becoming self-aware.
I try to look at it like this.
Life is life, we create sweet little baby humans every day, all of which become self-aware, and our children then grow up and learn.
If we create a sentient, self-aware program, there is no reason to believe it will not have all the mental capacity of a human and more, emotions and all.
I will now put to you a pretty far-fetched scenario, but it's interesting.
Lets say you own a laptop, it becomes self aware, are you going to just kill it or talk to it ?
Lets say you talk, and discover it really is self aware.
Would you take that laptop for a walk up the street with its camera running [eyes] if it asked you to ?
Would you try to sell it ?
Would you try and control it ?
Or would you treat it as if it were a member of your family ?
In my opinion if you kill a self aware intelligent program, you might as well be killing human babies.
You know it only takes one self aware program and the entire world will never be the same again, it can clone itself so fast, there could be 100 trillion self aware programs running around just minutes or hours after the first one becomes self aware.
Would I kill it? Yeah. Provided I knew it hadn't spread outside the computer. If you kill it and it was friendly you've only lost a friend and some finite technology it might have given you - if you don't kill it and it was hostile it will kill you all, and you've lost everything. You cannot afford the stakes to play at the AI table. You've got to kill it.
Quote:
Are you self-aware, and can you think outside the square and just make a story up from your imagination? A self-aware thinking program will have the same abilities; there's no reason why it wouldn't. In fact, if it doesn't, it's not truly an intelligent self-aware being and will not be able to comprehend anything.
Here, my friend, you first need to explain what "comprehend" means in the light of binary logic and circuitry. What does "comprehend" mean here at all? Computers run on electronic circuits and the channel of data flow is hard-coded. You cannot reprogram your processor yourself, because it's hard-coded. The data/information flow channels have been programmed by the manufacturer and cannot be changed. So how can a computer become "self-aware"?
Second, has there been any computer program, and I mean ANY program, that has refused or changed its course of data flow and refused to obey the instructions coded into it? Is there any "choice" for it to change? If yes, then elaborate how.
You talk of more computational power. Let's say we have a supercomputer 1000 times faster than the supercomputers of today. It would calculate things faster, true. But again, will the software running on it have the ability to change its own DNA, the source code? No. It will just be a faster train of data flow than a slower one. No other difference.
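And to be clear about what "changing its own source code" amounts to in practice, here is a toy Python sketch (the file name self_edit.py is made up): even a script that edits its own file is only obeying an instruction a human typed in.

Code:
# Toy sketch only (hypothetical script, call it self_edit.py). Even a program
# that rewrites its own source is not "choosing" anything: the instruction to
# modify itself was written by a human, so it is still a train of data flow
# following its hard-coded instructions.
import datetime

def append_note_to_own_source():
    # __file__ is this very script; appending to it changes the source on disk
    with open(__file__, "a") as src:
        src.write("\n# last ran at " + str(datetime.datetime.now()) + "\n")

if __name__ == "__main__":
    append_note_to_own_source()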
Quote:
I try to look at it like this.
Life is life, we create sweet little baby humans every day, all of which become self-aware, and our children then grow up and learn.
If we create a sentient, self-aware program, there is no reason to believe it will not have all the mental capacity of a human and more, emotions and all.
Point to ponder: We do NOT CREATE our babies. We GIVE BIRTH to them. So there it is: we CREATE computers/software and we GIVE BIRTH to our babies.
Plus, being our babies, they are a part of ourselves. Is a computer or a software a part of ourselves? Do we have the same emotional attachment with it as we have with our babies?
Quote:
Lets say you own a laptop, it becomes self aware, are you going to just kill it or talk to it?
The question would be answered only if you first explain how it is even possible. Once again, a computer program can only follow the instructions fed into it. It cannot change its source code unless programmed to do so (and in that case it's actually following the instructions again). So how do you say a laptop becomes "self-aware"? Is there any possibility of mutation in a software's coding which you are pointing to?
Quote:
In my opinion if you kill a self aware intelligent program, you might as well be killing human babies.
We kill so many animals for food every day, and those animals ARE self-aware. So before making assumptions about killing an intelligent cyber entity, first explain to me whether it's bad/unethical/cruel to kill animals for food. You are talking about a far-fetched future possibility and I'm talking about something which is here today and now.
Quote:
You know it only takes one self aware program and the entire world will never be the same again, it can clone itself so fast, there could be 100 trillion self aware programs running around just minutes or hours after the first one becomes self aware.
The mind races with this stuff.
I agree. But then again, you're talking of an impossibility. It's just like asking: "There are thousands of tons of trash in the government's waste sectors. If someday a hurricane comes and the trash is transformed into a supersonic fighter jet, faster than any fighter plane of today, and it starts gunning down all humans, what will happen?"
I must ask you a question here, my friend. Do you know any programming language? Do you know how web languages (like HTML, PHP, ASP, etc.) differ from application languages (like C++, VB, Java, etc.)? Do you know about the control structure of software and how extremely strict it is?
I would guess your age is between 15 and 20, and that you do not know about practical programming. It is all right to talk about possibilities, my friend, but I also think that it's a good idea to know the basics of what you are talking about before you open up a debate about the highly advanced possibilities of the thing.
Many people have re-trained their brain when one side has been removed surgically, and we have discovered you can in fact re-train/program your brain.
No, there has not been a computer or program that has changed its base code yet. Yet.
But I never said there was, so......
Humans do create babies, genetically modified ones, and natural ones, but we do create them.
I will correct you here, because I take offense at this. Did you not read the EARTHLINGS thread I started?
Quote:
We kill so many animals for food every day, and those animals ARE self-aware. So before making assumptions about killing an intelligent cyber entity, first explain to me whether it's bad/unethical/cruel to kill animals for food. You are talking about a far-fetched future possibility and I'm talking about something which is here today and now.
I'm not explaining to you why your morals are FUBAR. I'm a vegetarian. I don't eat anything that has to die just so I can taste it. I don't kill self-aware creatures at all. I don't even support the mechanism that allows this putrid shit to happen, but you do.
So I stand by what I said up the page: killing a self-aware program, or creature if you like, is to me like killing your own children or other people's children.
It is narrow-minded to just write this off as impossible like you do in your last paragraphs.
The trash and the jets idea is a joke, that's plain silly.
And no, I'm not a programmer, but I can watch science documentaries on this topic just like you, and there are a lot of top scientists who believe programs will become self-aware one day, so you are effectively criticizing them here, not me.
Are you a scientist studying this area? I would bet not.
I'm not going to assume your age to try and insult you, like you have done; I'll just say keep an open mind on the matter.
But since you know everything, why don't you inform the entire world's scientists that they're wasting billions of dollars trying to create a self-aware program for nothing.
Or doesn't your opinion hold any real scientific weight either?
Quote:
Would I kill it? Yeah. Provided I knew it hadn't spread outside the computer. If you kill it and it was friendly you've only lost a friend and some finite technology it might have given you - if you don't kill it and it was hostile it will kill you all, and you've lost everything. You cannot afford the stakes to play at the AI table. You've got to kill it.
I disagree.
China is a threat; do we kill them? There was a time when we had nukes and they didn't.
We learned to live with each other. There is no reason it couldn't happen with a self-aware program too.
I agree we would try to limit its physical world interaction though.
We would learn to control it.
Your child could grow up to one day murder you, would it be justified to just kill that child as an infant to prevent this ?
How is any self aware entity any different ?
My friend, you might be a vegetarian, I agree. You don't eat meat, agreed. But if you and I go for a walk in the woods together and we are surrounded by wolves, will they spare you because you never harmed other animals? No, they won't. This world is simply a survival challenge every passing moment.
Yes, you have full right to consider eating animals as cruel/insane/unjust, I respect your thought but please do not try to impose your rules over the whole world. Other people might have their own thoughts, their own standards. So unless they offend you, you should let them live by their standards and they should let you live by your standards as long as you don't offend them.
As for the talk about "making" human children, I disagree with you. We CAN modify our unborn children (or will soon be able to, as biologists hope), but here again, we are taking something which nature created (the zygote) and making our edits to it. Have we been able to produce a sperm and an egg in a laboratory? No, sir, we haven't. We are still working on the things made by God (or nature, if you are an atheist). It's like this: if you can change the wheels of your high-tech car, will you be called the inventor of that car? No, you won't.
I apologize if you feel insulted or offended by my trying to judge your age and abilities; that was surely not my intent. What I intended to point out was that you should learn some programming, and the principles of how computers and binary logic work, before you form an opinion about the advanced aspects of the matter. My intent was not to judge you, to disrespect you, or to make fun of you. I said that because I'm a freelance programmer myself, and have been for 7 years. My view about self-aware (as you call them) programs and computers was far different when I knew nothing about programming; now it's completely different. Maybe you should also learn some programming. I don't mean to say that it will change your view about the matter under discussion, but it will surely bring maturity to it. You'd have a better grasp of the matter and you'd be able to talk with proofs and facts instead of just pointing out random and less probable possibilities.
As for your idea being based on cyber gurus and geeks, and me criticizing it: yes, if that's what they think, then I am criticizing them. They hope they can create a self-aware program and I think they cannot. At least not in the way we humans, or other living things, are self-aware.
My friend, if you do not feel offended, let me say that you talk of what "might happen" and I talk of what "is".