Robot slaves - do we have the right?

Started by Unbeliever, December 15, 2015, 06:06:06 PM


Baruch

Humans aren't sentient.  Why do you think that a machine can be sentient?  The idea that a machine can be sentient, is proof enough that one is not sentient ... at least sufficiently that it makes any difference.  And according to physics, we are machines.  Let us join the PofTA mutants in worshipping the Holy Bomb.

Ah yes, u-topia aka no-place.  Feel the burn of the logical contradiction, while proclaiming what a rationalist one is.  Again proof of un-sentience. 

Political-economics (as it is properly called) shows that politics and economics are inseparable.  Again proof of un-sentience.

The idea that the upper class will suddenly behave differently than they have for the last 5000 years.  What is insanity?  Doing the same thing over and over again, even though it doesn't work.  Again proof of un-sentience.

The goal is to have robot domestics ... so that you can't even work for the 1% as a domestic.  They don't like you, they don't want to employ you.

Yes, merge the human with the machine ... cyborgs ... won't it be fun to robotize your wife ... we don't like our spouse or our children either.  Such sentience we have.
Ha’át’íísh baa naniná?
Azee’ ła’ish nanídį́į́h?
Táadoo ánít’iní.
What are you doing?
Are you taking any medications?
Don't do that.

Baruch

Quote from: Shiranu on December 18, 2015, 04:06:58 AM
The ape (I mean that as a statement about man and not you) thinks itself far more important than it truly is.

We were bred to work by hundreds of millions of years of evolution... this idea that we shouldn't need to work to live, and have the ability to make it so, is younger than our average life span. I am all for work becoming easier and necessities coming free... but any luxuries (more than the most basic food or a roof over your head) should require work, because... again... that's what we were bred for. One day these support systems will fall, and if humanity is so dependent on them... what then?

You already answered this, with the TEDx of the guy from Thailand.  There is no future for technological society.  Just a few villagers living naturally, working 2 hours per day max.  But that requires enlightenment.  There is no enlightenment in technological society.  All Eloi, no Morlocks.

doorknob

We should have to work for some things, sure. But if there are no jobs available, then what? Be poor and miserable for the rest of your life? Or how about just die? Once androids become a thing, humans will be obsolete. And why the hell would androids want to work just to support us? So there would have to be some human jobs left, probably mandated by law, like the minority-hiring thing: you have to employ x amount of humans or some such. And then humans will be paid less than androids because, let's face it, we are slow and weak, and stupid by comparison.

It's interesting but I guess we will cross that bridge when we get there. It may be scary but we will get there. People are working on it as we speak. But I digress.

I think we are already having job and economic problems so we may self destruct before that comes.

CloneKai

#48
Quote from: doorknob on December 18, 2015, 10:53:50 AM
We should have to work for some things, sure. But if there are no jobs available, then what? Be poor and miserable for the rest of your life? Or how about just die? Once androids become a thing, humans will be obsolete. And why the hell would androids want to work just to support us? So there would have to be some human jobs left, probably mandated by law, like the minority-hiring thing: you have to employ x amount of humans or some such. And then humans will be paid less than androids because, let's face it, we are slow and weak, and stupid by comparison.

It's interesting but I guess we will cross that bridge when we get there. It may be scary but we will get there. People are working on it as we speak. But I digress.

I think we are already having job and economic problems so we may self destruct before that comes.
The thing is, we are going that way.
Far before robots start asking 'who am I?', we will be using machines to replace most of the human labor force: agriculture, markets, finance, transport, and many more. Only jobs requiring creativity will be safe; everything else, a machine can do, including fixing your appliance or maybe even helping you with legal issues. Most of these tasks can be handled without "artificial intelligence".
So most people will be unemployed in that world, and something must be done to keep them busy and peaceful.

CloneKai

Quote from: Baruch on December 18, 2015, 07:30:19 AM
Humans aren't sentient.  Why do you think that a machine can be sentient?  The idea that a machine can be sentient, is proof enough that one is not sentient ... at least sufficiently that it makes any difference.  And according to physics, we are machines.  Let us join the PofTA mutants in worshipping the Holy Bomb.

Ah yes, u-topia aka no-place.  Feel the burn of the logical contradiction, while proclaiming what a rationalist one is.  Again proof of un-sentience. 

Political-economics (as it is properly called) shows that politics and economics are inseparable.  Again proof of un-sentience.

The idea that the upper class will suddenly behave differently than they have for the last 5000 years.  What is insanity?  Doing the same thing over and over again, even though it doesn't work.  Again proof of un-sentience.

The goal is to have robot domestics ... so that you can't even work for the 1% as a domestic.  They don't like you, they don't want to employ you.

Yes, merge the human with the machine ... cyborgs ... won't it be fun to robotize your wife ... we don't like our spouse or our children either.  Such sentience we have.
Other than myself, I can't say for sure that anything else is 'sentient'.
I have "the ability to feel, perceive, or experience"; I don't know whether others can do that, or whether they are simply following a well-written program to trick me, or are a Borg-like creature with nested minds, with only me being awake. Or maybe even I am not real; just think that I am  :shocked:

"The idea that the upper class will suddenly behave differently than they have for the last 5000 years.  What is insanity?  Doing the same thing over and over again, even though it doesn't work."

They might not like us, but they might like us to be peaceful and good-looking.
After all, they can't guarantee that their children won't like us, after watching so much TV. Better to be safe than sorry.

doorknob

You bring up an interesting point; however, artists are a dime a dozen. It's one of the reasons I didn't pursue my dream of being an artist. So that's not really a good option.

I have a feeling there will be a civil war where people are literally fighting for their lives, as the rich get rid of welfare (and they are!). They take away the middle-class jobs and make everyone poor. With minimum wages being raised to a so-called "livable" wage while middle-class wages are not being raised, soon the two will be equivalent. Not saying that poor people don't deserve to be paid a livable wage; they absolutely do. But the rich only raise prices so their piggy bank doesn't get hurt. The government should force the powers that be to pay their employees fair wages out of their own bank accounts! They can afford it! No one else can!

If a small percentage of people would stop hogging all the resources there would be no problem.

CloneKai

#51
Quote from: doorknob on December 18, 2015, 11:31:54 AM
You bring up an interesting point; however, artists are a dime a dozen. It's one of the reasons I didn't pursue my dream of being an artist. So that's not really a good option.

In the world I speak of, I think artists will be valued a little higher than today. Hopefully.
Oh, and not just artists. I was talking about people who have to find creative solutions, things like that: doctors, designers.
These people will be safe. But other occupations that follow some sort of steps or procedures can be taken over by computer programs, like accountant or mechanic.
Quote from: doorknob on December 18, 2015, 11:31:54 AM
I have a feeling there will be a civil war where people are literally fighting for their lives, as the rich get rid of welfare (and they are!). They take away the middle-class jobs and make everyone poor. With minimum wages being raised to a so-called "livable" wage while middle-class wages are not being raised, soon the two will be equivalent. Not saying that poor people don't deserve to be paid a livable wage; they absolutely do. But the rich only raise prices so their piggy bank doesn't get hurt. The government should force the powers that be to pay their employees fair wages out of their own bank accounts! They can afford it! No one else can!

If a small percentage of people would stop hogging all the resources there would be no problem.
That is the issue: do the rich want to be safe and live in a great society, or do they want to live behind high towering walls with 24-hour security, surrounded by security guards whenever they leave those walls?

Think of it like this: would a billionaire be happier in the USA or in Pakistan?
If the rich want to have a good life in a stable society, WE need to be happy and non-violent.


trdsf

Quote from: stromboli on December 16, 2015, 09:53:56 PM
You are assuming that we build something, give it the capacity to become an independent thinker, and then don't let it? If something is designed and built for a purpose, it is used for that purpose. Building something that would, all on its own, become independent would imply the understanding that it would do so.
This exactly.  Slave implies the denial of rights and dignity to an equal sentient being.  A robot is by definition a machine -- this is what made Asimov's robot stories so different: Karel Capek aside, he was pretty much the first writer to treat robots as an industrial product rather than the creation of a mad genius, and he was the first one to posit built-in controls to permit reasonably abstract thought without full free will.

If and when robots become commonplace, this is how it will happen: as industrial products.  And corporations aren't going to produce anything that even remotely has the chance of thinking for itself.
"My faith in the Constitution is whole, it is complete, it is total, and I am not going to sit here and be an idle spectator to the diminution, the subversion, the destruction of the Constitution." -- Barbara Jordan

stromboli

Most people aren't aware that it was Capek who invented the word 'robot'. It comes from his play R.U.R. (Rossum's Universal Robots).

Asimov defined the Three Laws of Robotics. Both are still considered the standard for naming and behavior. Blade Runner (Do Androids Dream of Electric Sheep?) explored the concept of sentience, and the humanity (or lack of it) in how we treated an android or semi-sentient species. The idea that an android could be built to kill and, given the right circumstances, find justification to do so comes from there. I think Blade Runner should be required reading for every human being on the planet.

All that said, it still comes back to what capabilities we give an autonomous device to function separately from human control. The Mars Rover, albeit a robot in terms of functionality, does not have independent decision-making that I am aware of. And again, to assume that an android has the capability of reaching independent thought and self-awareness implies that it was given that capability.

I can see a scenario similar to Skynet in Terminator or the Matrix in that a vast intelligence gathering/collecting system that has disseminating/decision making capability could, given sufficient memory and independence, become self aware and make decisions. But we are talking about a vast net of independent systems given the ability to do so and left to its own devices. The key being given the ability to do so. Yes, it could theoretically happen, but meeting green men in my bathroom next Tuesday could also theoretically happen. My personal belief is the likelihood of it is vanishingly small at best.

If green men show up in my bathroom next Tuesday I will let you know.

trdsf

Quote from: stromboli on December 23, 2015, 09:47:38 AM
Most people aren't aware that it was Capek who invented the word 'robot'. It comes from his play R.U.R. (Rossum's Universal Robots).

I quite like Čapek; I first read R.U.R. when I was 12 and was quite fascinated.  Later, I heard a BBC Radio production of his War with the Newts that was also excellent.  I really need to get a copy of the original novel.

Quote from: stromboli on December 23, 2015, 09:47:38 AM
Asimov defined the Three Laws of Robotics. Both are still considered the standard for naming and behavior. Blade Runner (Do Androids Dream of Electric Sheep?) explored the concept of sentience, and the humanity (or lack of it) in how we treated an android or semi-sentient species. The idea that an android could be built to kill and, given the right circumstances, find justification to do so comes from there. I think Blade Runner should be required reading for every human being on the planet.

All that said, it still comes back to what capabilities we give an autonomous device to function separately from human control. The Mars Rover, albeit a robot in terms of functionality, does not have independent decision-making that I am aware of. And again, to assume that an android has the capability of reaching independent thought and self-awareness implies that it was given that capability.

I can see a scenario similar to Skynet in Terminator or the Matrix in that a vast intelligence gathering/collecting system that has disseminating/decision making capability could, given sufficient memory and independence, become self aware and make decisions. But we are talking about a vast net of independent systems given the ability to do so and left to its own devices. The key being given the ability to do so. Yes, it could theoretically happen, but meeting green men in my bathroom next Tuesday could also theoretically happen. My personal belief is the likelihood of it is vanishingly small at best.

If green men show up in my bathroom next Tuesday I will let you know.

Asimov's Three Laws were an interesting first stab, but I've always thought they require genuine sentience in order to be able to process them, except in the most simple situations.

The Mars rovers are semi-autonomous, and have decision-making ability under certain circumstances.  I wouldn't call what they do any level of intelligence, since it's all programmatic, though I recall that Carl Sagan placed the expected intelligence of a rover anywhere between a bacterium and a grasshopper.  They certainly have enough autonomous action that I would consider them robots rather than mere waldoes -- and really, they do have the functional equivalent of the Second and Third Laws of Robotics built into them.  A rover 'knows' which grades are too steep for it and would present a risk of knocking it over, and avoids them: that's Third Law.  It accepts new programming and new targets: that's Second Law.  First Law is unnecessary, since the only harm it could do would be to deliberately refuse to accept orders and return data, and that takes a great deal more self-awareness than any of the rovers have.
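That Second/Third-Law mapping can be sketched as a trivial rule check. This is purely a toy illustration of the idea, not actual rover software, and the threshold value and function names are made up for the example:

```python
# Toy sketch: programmatic analogues of Asimov's Second and Third Laws
# in a rover-style command handler. The grade limit is a hypothetical number.
MAX_SAFE_GRADE_DEG = 25.0  # assumed tip-over threshold, for illustration only

def handle_command(target, grade_deg):
    """Return (status, detail) for a requested drive target."""
    # Third Law analogue: refuse maneuvers that risk toppling the rover.
    if grade_deg > MAX_SAFE_GRADE_DEG:
        return ("refused", "grade %.1f deg exceeds safe limit" % grade_deg)
    # Second Law analogue: otherwise accept new orders and targets.
    return ("accepted", target)

print(handle_command("rock_A", 10.0))      # ('accepted', 'rock_A')
print(handle_command("crater_rim", 40.0))  # refused: slope too steep
```

Nothing here requires anything like a First Law, which matches the point above: the self-preservation and obedience behaviors are just condition checks, with no awareness involved.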

AI of course is to computer science as fusion generators are to physics -- no matter when you are, they're always twenty to fifty years in the future (in fairness, I think physicists are closer to fusion power than programmers are to AI).  And the Turing Test is no longer really a good test for sentience: language parsing is near or even at the point that a computer with a sufficiently deep database could simulate holding a conversation.  Certainly they can "understand" natural language -- you can type a full grammatical question into Google, and get sensible search results.

I'm not sure that AI is even the goal anymore -- just better parsing of natural language and better recognition of the local environment and how to interact with it in a complex but programmatic way.
"My faith in the Constitution is whole, it is complete, it is total, and I am not going to sit here and be an idle spectator to the diminution, the subversion, the destruction of the Constitution." -- Barbara Jordan

Unbeliever

God Not Found
"There is a sucker born-again every minute." - C. Spellman

pr126

Well-meant utopias turn out to be inhuman dystopias with alarming regularity, usually with mountains of corpses.




Baruch

#57
Quote from: doorknob on December 18, 2015, 11:31:54 AM
You bring up an interesting point; however, artists are a dime a dozen. It's one of the reasons I didn't pursue my dream of being an artist. So that's not really a good option.

I have a feeling there will be a civil war where people are literally fighting for their lives, as the rich get rid of welfare (and they are!). They take away the middle-class jobs and make everyone poor. With minimum wages being raised to a so-called "livable" wage while middle-class wages are not being raised, soon the two will be equivalent. Not saying that poor people don't deserve to be paid a livable wage; they absolutely do. But the rich only raise prices so their piggy bank doesn't get hurt. The government should force the powers that be to pay their employees fair wages out of their own bank accounts! They can afford it! No one else can!

If a small percentage of people would stop hogging all the resources there would be no problem.

I wanted to be an artist too when I was pre-HS.  But I figured it would never pay, and after I heard about Van Gogh, it might even be dangerous ;-(  The arts don't pay, aside from a few super-stars.  Like most professional sports.  The entertainment business (and sports is entertainment) is the prototype for all economics ... a few super-stars at the top and most of us waiting tables, waiting to be discovered ;-))  I will never be a 7 ft tall African-American ... so I can forget about the NBA.

CloneKai ... the goal is to live like Emperor Akbar of the Mughal Empire.  Peasants?  That is what the army is for ... to make sure the peasants know who is boss.  If one has to keep up a regiment of war elephants ... one will have a need for resources.

https://www.youtube.com/watch?v=nbuM0aJjVgE

stromboli

Quote from: trdsf on December 23, 2015, 01:58:33 PM
I quite like Čapek; I first read R.U.R. when I was 12 and was quite fascinated.  Later, I heard a BBC Radio production of his War with the Newts that was also excellent.  I really need to get a copy of the original novel.

Asimov's Three Laws were an interesting first stab, but I've always thought they require genuine sentience in order to be able to process them, except in the most simple situations.

The Mars rovers are semi-autonomous, and have decision-making ability under certain circumstances.  I wouldn't call what they do any level of intelligence, since it's all programmatic, though I recall that Carl Sagan placed the expected intelligence of a rover anywhere between a bacterium and a grasshopper.  They certainly have enough autonomous action that I would consider them robots rather than mere waldoes -- and really, they do have the functional equivalent of the Second and Third Laws of Robotics built into them.  A rover 'knows' which grades are too steep for it and would present a risk of knocking it over, and avoids them: that's Third Law.  It accepts new programming and new targets: that's Second Law.  First Law is unnecessary, since the only harm it could do would be to deliberately refuse to accept orders and return data, and that takes a great deal more self-awareness than any of the rovers have.

AI of course is to computer science as fusion generators are to physics -- no matter when you are, they're always twenty to fifty years in the future (in fairness, I think physicists are closer to fusion power than programmers are to AI).  And the Turing Test is no longer really a good test for sentience: language parsing is near or even at the point that a computer with a sufficiently deep database could simulate holding a conversation.  Certainly they can "understand" natural language -- you can type a full grammatical question into Google, and get sensible search results.

I'm not sure that AI is even the goal anymore -- just better parsing of natural language and better recognition of the local environment and how to interact with it in a complex but programmatic way.

I read "War With The Newts" about the same age. It was from an old library in a country town, an original copy. I've also had my hands on an original copy of "The Hobbit" with Tolkien's illustrations in pristine condition but couldn't figure out a way to steal it. I wanted it bad enough to.

Unbeliever

I expect the main thing we might have to fear in this regard is the development of autonomous weapon systems.

Fully Autonomous Weapons Fact Sheet