Quote:
All of this is true, but I'm also proposing it needs to judge the feedback on its own, whether it's positive or negative... To judge the outcome of its actions and the goal it needs to reach, and understand on its own if this outcome of its actions is the desired outcome or the undesired. It needs to make the judgement itself...

Well, I agree. But you need to somehow define wrong and right, and no one can do this on their own. Children get taught
about good and bad by their parents all the time. Feelings and emotions also contribute to this feedback: if it hurts,
you've most likely done something wrong (punching the wall or whatever), and if you're feeling happy, the opposite
might be the case.
This way you also get taught about wrong and right. But you never learned that pain is a bad thing and happiness a good
one. You just know that one is something you should avoid and the other is worth seeking.

The AI needs at least a starting point. Based on this, it can later learn to judge the effects of its own actions in more
complex ways and scenarios.
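To make that "starting point" idea concrete, here's a minimal sketch (my own illustration, not anything from the thread): an agent that is simply wired to treat one signal as something to avoid and another as worth seeking. The action names and reward numbers are made up; the learning rule is a basic running-average value update with epsilon-greedy exploration.

```python
import random

# Hypothetical world: each action yields an innate "pain" (-1), "pleasure" (+1)
# or neutral (0) signal. The agent is never told these are "bad" or "good";
# the signal itself is its only starting point.
WORLD = {"punch_wall": -1, "eat_food": +1, "stare_at_tree": 0}

def learn(episodes=1000, epsilon=0.1, alpha=0.1, seed=0):
    rng = random.Random(seed)
    value = {a: 0.0 for a in WORLD}  # the agent's own judgement of each action
    for _ in range(episodes):
        if rng.random() < epsilon:                 # sometimes explore randomly
            action = rng.choice(list(WORLD))
        else:                                      # otherwise do what it judges best
            action = max(value, key=value.get)
        feedback = WORLD[action]                   # the innate signal
        value[action] += alpha * (feedback - value[action])
    return value

values = learn()
```

After training, `values` ranks eating food above staring at trees and staring at trees above punching walls, even though nobody ever labelled those actions; the agent judged them itself from the built-in signal.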


Edit:
Quote:
It might be next to impossible, but another thought is not to provide a goal at all and leave the A.I. to set its own goal, but this requires a richer environment, one where the A.I. can die and has 'food' or something more, but again, not instructing it that food is good or death is bad. I can't even imagine how this could be done... I'm sleepy laugh

Sorry for ranting again grin , but in this case I'm sure the AI would still just behave randomly, simply because it wouldn't
give a shit about food or death; those have no meaning to it. It would do whatever it likes to do, but since there's no
goal, there aren't any things the AI would prefer doing (like not dying). Over time it would helplessly perform random
actions until the game mechanics decide that the AI starved to death. ( sad end. frown )
(or it would keep doing random stuff forever if there's enough supply of food lying around everywhere, which it somehow
manages to eat)

Edit2: Totally forgot: the AI could also kill itself (if the game mechanics allow this). Even if the AI knew that this means
no more helpless wandering around doing random things, it just wouldn't care, because there's simply no meaning to it.

But I guess you had something more basic in mind: being able to walk around, some food lying here and there, and
eventual starvation if you don't find enough to eat.

(end of edit2)
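That "no goal" scenario is easy to sketch (again just my own toy illustration, all numbers invented): an agent with no reward signal picks uniformly random moves on a ring-shaped world, eats food only by accident, and the game mechanics end the run when its energy hits zero.

```python
import random

# A reward-less agent: every action is equally meaningless, so it just
# wanders at random until the game mechanics decide it starved.
def simulate(world_size=20, food_cells=(3, 15), start=10,
             energy=30, food_energy=10, max_steps=10_000, seed=1):
    rng = random.Random(seed)
    food = set(food_cells)
    pos = start
    for step in range(max_steps):
        if energy <= 0:
            return step                    # starved: the "sad end"
        pos = (pos + rng.choice((-1, +1))) % world_size
        energy -= 1
        if pos in food:                    # eats only by accident
            food.discard(pos)
            energy += food_energy
    return None                            # survived the whole run

steps = simulate()
```

With a finite food supply the agent always starves within a bounded number of steps (here at most 50, its total energy budget); with unlimited food lying everywhere it would wander forever instead.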

However... you could program the AI to pick something totally random for it to seek. You're still setting a goal, but the
outcome might be interesting. (or just boring, I don't know)
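The random-goal idea could look something like this (a hypothetical sketch; the item list and reward values are made up): the designer still decides *that* there is a goal, but not *what* it is, by sampling one item from the world and rewarding the agent only for reaching it.

```python
import random

# Things lying around in the hypothetical game world.
ITEMS = ["food", "stones", "trees", "water"]

def assign_random_goal(seed=None):
    """Pick a random item to seek and build the reward function for it."""
    rng = random.Random(seed)
    goal = rng.choice(ITEMS)
    def reward(item_reached):
        # Only the randomly chosen goal carries any meaning for the agent.
        return 1.0 if item_reached == goal else 0.0
    return goal, reward

goal, reward = assign_random_goal(seed=42)
```

From the agent's point of view this works exactly like a designer-given goal; it just happens to be arbitrary, which is why the outcome might be interesting or completely boring.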

Last edited by Kartoffel; 07/05/15 23:50. Reason: too many edits.

POTATO-MAN saves the day! - Random