0108 - Continuing existence is its own reward

Discussion related to Forward

0108 - Continuing existence is its own reward

Postby dr pepper » Sun Oct 27, 2019 10:13 pm

[comic image]
dr pepper
 
Posts: 316
Joined: Tue Sep 04, 2012 7:52 pm

Re: 0108 - Continuing existence is its own reward

Postby yomikoma » Mon Oct 28, 2019 12:00 pm

Interesting. Is Zoa allowed to act against the interests of the DemeGeek corporation? It seems like it can't want to do anything like "ensure the stock value climbs even after my destruction".
yomikoma
 
Posts: 314
Joined: Thu Jan 13, 2011 7:47 pm

Re: 0108 - Continuing existence is its own reward

Postby lolzor99 » Sun Nov 03, 2019 11:30 am

yomikoma wrote:Interesting. Is Zoa allowed to act against the interests of the DemeGeek corporation? It seems like it can't want to do anything like "ensure the stock value climbs even after my destruction".


Good question. It's very possible that instead of trying to hardcode their AI into compliance, which could lead to a lot of loophole abuse, DemeGeek just fostered an environment where the most logical choice for continuing to exist is to be an asset of DemeGeek. This is especially possible if AIs consider the backups Zoa mentioned to be a form of continuing to exist.

Yeah, the more I think about it, the more linking AI to DemeGeek's interests seems like a bad idea. An AI might take over DemeGeek, for example, if it believes it is more competent than the current leadership. It might actively and/or passively resist being sold to any other company. And what if DemeGeek went bankrupt? It'd be war.

On a different note, I'm having some trouble understanding what Zoa means in her second-to-last word-box. When a human makes a decision, say, between cheating on their spouse or not cheating on their spouse, it's essentially one want against another. When an AI makes a decision, they either weigh their options in a similar way or, if there's an absolute rule concerning the decision in their programming (such as with the law), they go by the absolute rule. If we assume that "fulfilling obligations" refers to absolute rules and "want" refers to options they weigh as beneficial, it still doesn't make sense.

Humans can want to have their cake and eat it too. There is no reason that an AI would be programmed to do so. An AI could evaluate the difference between two options that fulfill its goals by different amounts, but the only actions it wants to take at any given time are the ones that it decides are most likely to produce the best outcome.

Zoa seems to be suggesting that a situation could exist where they fulfill obligations but do not want to fulfill them. This could refer to a situation where they're following an absolute rule that contributes little to their goals while other options would contribute a lot. For example, take the fairly recent example of an AI saving a human by sacrificing themselves (let's ignore the insurance part for now). If there were not an absolute rule requiring it, the AI would probably not save the human, because self-sacrifice directly contradicts their goal of self-preservation. So, if Zoa ran into such a situation, would they self-sacrifice while "wanting" to do something else? How can an AI want one thing and do another?
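To make that concrete, here's a toy sketch (entirely my own construction, nothing established in the comic) where the weighing step produces the "want" and an absolute rule can override it, so the action taken isn't the action wanted. The option names and values are made up:

Code: Select all
# Toy model: "want" = best-scoring option; "do" = hard rules win.
def wants(options, value):
    # The weighing step: the AI "wants" whatever scores highest.
    return max(options, key=value)

def does(options, value, hard_rules):
    # An absolute rule, if it applies, overrides the weighing entirely.
    for rule in hard_rules:
        forced = rule(options)
        if forced is not None:
            return forced
    return wants(options, value)

# Made-up numbers: self-sacrifice scores terribly against self-preservation.
options = ["self_sacrifice_to_save_human", "flee"]
value = {"self_sacrifice_to_save_human": -100, "flee": 10}.get
save_rule = lambda opts: ("self_sacrifice_to_save_human"
                          if "self_sacrifice_to_save_human" in opts else None)

print(wants(options, value))              # flee (the "want")
print(does(options, value, [save_rule]))  # self_sacrifice_to_save_human (the act)

In a setup like this, "want" and "do" come apart exactly when a hard rule binds, which is one way to read the word-box.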

I suppose that there's one explanation: DemeGeek wants their AI to avoid as many obligations/absolute rules as possible, by punishing its reward system whenever it follows an absolute rule that is not highly valued by its goals. This has some rather serious ethical implications; for example, it would encourage AIs to avoid locations where many absolute rules are in place.
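If DemeGeek did do something like that, the simplest version would be a shaping penalty applied whenever an absolute rule overrides the AI's own weighing. A minimal sketch, with arbitrary numbers:

Code: Select all
# Made-up shaping term: dock the reward whenever an absolute rule had to
# override the AI's own weighing, so rule-bound situations score worse.
def shaped_reward(base_reward, rule_fired, penalty=5.0):
    return base_reward - (penalty if rule_fired else 0.0)

# Same base payoff, but the situation where an obligation binds nets less,
# nudging the AI toward places with fewer absolute rules in force.
print(shaped_reward(10.0, rule_fired=False))  # 10.0
print(shaped_reward(10.0, rule_fired=True))   # 5.0

Under that scoring, two otherwise identical situations differ only in whether an obligation binds, so the AI learns to steer toward the one with fewer binding rules.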
lolzor99
 
Posts: 26
Joined: Mon Sep 24, 2018 4:39 pm

Re: 0108 - Continuing existence is its own reward

Postby MitchellTF » Thu Dec 05, 2019 4:05 pm

It seems like a version of the Three Laws of Robotics (oversummarized):

1. A robot must not harm a human being.
2. A robot must obey orders.
3. A robot must not allow itself to be harmed.

Zoa's difference is that she only has to obey orders from a recognized authority. Obeying orders from non-recognized authorities would probably go under Option #3.

Zoa has a hard-coded order: "Pay your debts to DemeGeek". Then, she has a hard-coded "Your job is to sell blowjobs". It seems like the 'I can't want things after I die' is because of Order #3 (note that Order #1 and Order #2 supersede it). Also, since her 'data' is copyable... she does not die if she is killed, so...
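That "supersedes" structure is easy to sketch: filter the options through each law in priority order, and don't let a lower law veto everything a higher law left open. The action names here are made up for illustration:

Code: Select all
# Hypothetical sketch of "earlier laws supersede later ones".
LAWS = [
    lambda a: a != "let_human_die",   # 1. don't allow harm to a human
    lambda a: a != "ignore_order",    # 2. obey a recognized authority
    lambda a: a != "self_sacrifice",  # 3. protect your own existence
]

def choose(candidates):
    for law in LAWS:
        narrowed = [a for a in candidates if law(a)]
        if narrowed:               # a lower law can't veto all the options
            candidates = narrowed  # that a higher law left open
    return candidates

# Only options here: sacrifice yourself, or let the human die.
print(choose(["self_sacrifice", "let_human_die"]))  # ['self_sacrifice']

Law #3 objects to self-sacrifice, but since Law #1 has already ruled out the alternative, the objection loses.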
MitchellTF
 
Posts: 118
Joined: Mon Jul 07, 2014 1:24 pm

Re: 0108 - Continuing existence is its own reward

Postby Truec » Fri Dec 06, 2019 4:38 am

MitchellTF wrote:Then, she has a hard-coded "Your job is to sell blowjobs".


If anything, it's more "Your job is to sell sexual services", given Zoa has an established desire to expand its service options. And it's capable of providing other non-sexual services for money as well, such as acting as Lee's ESA. I think if there is any hard-coded rule, it's much more open-ended.
Truec
 
Posts: 167
Joined: Wed Apr 10, 2013 5:58 pm

Re: 0108 - Continuing existence is its own reward

Postby yomikoma » Fri Dec 06, 2019 8:48 am

I don't think Zoa's particular choice of moneymaking activity is hard-coded; it's just an emergent goal arising from the hard-coded self-preservation goal and the available circumstances.
yomikoma
 
Posts: 314
Joined: Thu Jan 13, 2011 7:47 pm

