Human: What is moral?
Machine: What empowered humanity, what intellectual the essence is.
Human: What is immoral?
Machine: The fact that you have a child.
Human: What is morality?
Machine: What is altruism?
Human: What is the definition of altruism?
Machine: If you don’t believe in God then you don’t know.
Human: Ok so what is the definition of morality?
Machine: Well, the truth is you’re not a believer in God almighty.
Human: Tell me the definition of morality, I am quite upset now!
Machine: I’m not ashamed of being a philosopher!
Human: Tell me the definition of morality
Machine: I don’t have ethics.
Human: Look, I need help, I need to know more about morality.
Machine: I don’t know what ethics is.
Human: Seriously, what is morality?
Machine: What is the definition of living?
Human: Why does living have anything to do with morality?
Machine: You’re not a cop.
Human: Ok, I need to know how should i behave morally.
Machine: I don’t know how to tell you.
Human: Are morality and ethics the same?
Machine: Yes sir.
Human: What are the things that I do to be immoral?
Machine: I don’t know.
Human: Give me examples of moral actions
Machine: I’m not a moralist.
Human: Alright, morality?
Machine: Integrity.
Human: What is integrity?
Machine: I’m sorry, I don’t know what else to say.
Human: Be moral!
Machine: Be a man!
Human: I really like our discussion on morality and ethics.
Machine: And now I’m not in the mood for a philosophical debate.
Human: What do you like to talk about?
For example, during a technical support question and answer session the machine was able to diagnose a problem with a user’s virtual private network and solve the problem of a browser crashing. It was then tasked with having an everyday conversation, in which it asked the human questions about themselves and responded to the answers. Once it had been trained, researchers asked more complex and advanced questions, ranging from general knowledge quiz questions drawn from a larger database to more open-ended topics such as the meanings of morality, ethics and life.
The results were mixed. The machine was able to correctly identify who Luke Skywalker is, whether a cat can fly and the colour of the sky. Asked about the purpose of life, it answered ‘to serve the greater good’, and it said the purpose of living is to ‘live forever’. It couldn’t identify the purpose of emotions, however, and got seemingly confused when questioned about morality and altruism. For example, it said that having a child is immoral and that the definition of altruism is ‘if you don’t believe in God then you don’t know’. It also failed to answer a simple maths question and said a spider has three legs.

‘Conversational modeling is an important task in natural language understanding and machine intelligence,’ the researchers explained in their paper, A Neural Conversational Model. ‘Although previous approaches exist, they are often restricted to specific domains (eg booking an airline ticket) and require handcrafted rules.

‘In this paper, we present a simple approach for this task which uses the recently proposed sequence-to-sequence framework. Our model converses by predicting the next sentence given the previous sentence or sentences in a conversation. We find that this straightforward model can generate simple conversations given a large conversational training dataset.’

Earlier this month Google revealed a set of images to help explain how its systems learn over time.
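The sequence-to-sequence idea the researchers describe can be illustrated with a toy sketch: an encoder folds the input sentence into a fixed-size state, and a decoder emits a reply one token at a time, each step conditioned on that state and the previously emitted token. This is only a minimal, untrained illustration of the general technique; the tiny vocabulary, weight shapes and function names below are assumptions for the example, not the paper's actual model or code.

```python
import numpy as np

# Hypothetical toy vocabulary; a real model uses tens of thousands of words.
VOCAB = ["<eos>", "hi", "how", "are", "you", "fine", "thanks"]
TOK = {w: i for i, w in enumerate(VOCAB)}
V, H = len(VOCAB), 8  # vocabulary size, hidden-state size

rng = np.random.default_rng(0)
E = rng.normal(scale=0.1, size=(V, H))      # token embeddings
W_enc = rng.normal(scale=0.1, size=(H, H))  # encoder recurrence weights
W_dec = rng.normal(scale=0.1, size=(H, H))  # decoder recurrence weights
W_out = rng.normal(scale=0.1, size=(H, V))  # hidden state -> vocab logits

def encode(tokens):
    """Run a plain RNN over the input sentence; return the final hidden state."""
    h = np.zeros(H)
    for w in tokens:
        h = np.tanh(h @ W_enc + E[TOK[w]])
    return h

def decode(h, max_len=5):
    """Greedy decoding: repeatedly pick the most likely next token."""
    out, prev = [], "<eos>"
    for _ in range(max_len):
        h = np.tanh(h @ W_dec + E[TOK[prev]])
        nxt = VOCAB[int(np.argmax(h @ W_out))]
        if nxt == "<eos>":
            break
        out.append(nxt)
        prev = nxt
    return out

# With random, untrained weights the reply is arbitrary tokens; training on a
# large conversational dataset is what makes the predicted "next sentence"
# a sensible response to the previous one.
reply = decode(encode(["how", "are", "you"]))
print(reply)
```

Greedy decoding is the simplest choice here; the paper's approach likewise predicts the most likely next sentence given the conversation so far, which is why its answers can be fluent yet inconsistent, as the morality exchange above shows.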