According to Isaac Asimov, there are three laws of robotics:
1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Would you want to be a robot? Life might be easier if you were. You would be programmed, so there would be no decisions to make. You would have no soul and, therefore, no responsibility for your actions. Robots do not have free will. God didn’t make robots; He wanted people who would freely choose to love Him back.

If humans were robots, then according to the First Law there would be no murders, and everyone would watch out for others and do whatever they could to help or protect them. A robot would never “sin,” because the Second Law requires obedience. And because of the Third Law, no robot would ever become depressed and take its own life.

When God made mankind, He, too, established some Laws for us to live by. We know them as the Ten Commandments:

You shall have no other gods before Me.
You shall make no idols.
You shall not take the name of the Lord your God in vain.
Keep the Sabbath day holy.
Honor your father and your mother.
You shall not murder.
You shall not commit adultery.
You shall not steal.
You shall not bear false witness against your neighbor.
You shall not covet.

These commandments weren’t established to make your life miserable but, in reality, to make all of our lives good. Living by them is a wonderful way to live. So, would I want to be a robot? No. I am not a robot. I have accepted God and His Son, Jesus. I believe in the power and presence of the Holy Spirit. I do my best to live by these commandments. I worship God and praise His holy name. I do all of this of my own free will, out of love for the one true God who first loved me.

Copyright © 2024 Mark Brady. All rights reserved.