The Three Laws of Robotics

The Three Laws are a set of rules devised by science-fiction author Isaac Asimov, intended to ensure that robots cannot endanger human beings and that they serve humanity as well as possible.

Law I: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Law II: A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.

Law III: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.