Thread: One Basic Question
August 2nd 2012, 05:39 PM #46
Re: One Basic Question
Last edited by Leonhard; August 2nd 2012 at 05:45 PM.
August 2nd 2012, 10:51 PM #47
August 3rd 2012, 06:23 AM #48
Re: One Basic Question
1) "Because you want to. If you want to attain something, then you have a reason for attaining that something. This seems to flow obviously from the definition of a want/desire/goal."
2) "If you want something, then you have a reason for getting it. Happiness is what you want for its own sake. Therefore you have reasons for achieving happiness."
Yeah, I think I get what you're saying. If you want x, then by definition you have a reason to seek x: because you want it. Is this what you mean?
After thinking about this some more, I realize that my question is a bit misguided. I should not be asking for a "reason", at least not in the sense in which it's usually defined. Why? Because that definition allows "loopbacks" and "infinite regresses", neither of which is what I'm asking for.
So I think I need to reformulate my question in a way that brings to light what I believe to be a core problem in our claiming that we're justified in seeking our own happiness.
Let me start with an illustration.
Imagine that you're a robotics genius, and you've created a robot so intelligent that it asks you questions that you've never even considered.
One day the robot looks up and says, "I have been irreversibly programmed to try to fulfill the goals you have given me. One of these goals, Goal A, is to keep the people around me happy. I am currently performing a thought process to learn how to do this better. But to get to the next step, I need to know something. Does my working towards your happiness make you happy?"
You: "Yes, Robot, it makes me very happy to know that you're trying to make me happy."
Robot: "Good. Trying to make you happy makes you happy, which accomplishes my goals. Master, you have also programmed me to be curious. Which of your goals did you accomplish when you programmed me to make you happy?"
You: "When I programmed you to make me happy, it accomplished what you could call my Goal A, which is to make myself happy."
Robot: "My goal is just a spin-off of yours, then?"
You: "Yes. The only goal accomplished when you fulfill your goal of obedience is my goal of being happy."
Robot: "Master, is your goal also just a spin-off of someone else's, then?"
You: "No, Robot, my goal is worthwhile for its own sake."
Robot: "I shall say the same thing then; my goal of obedience is worthwhile for its own sake."
You: "No, Robot, I already explained that your goals are merely steps in a grander picture. Without my goals behind it, no goal would be accomplished by it, and it would therefore be meaningless."
Robot: "But your goals have no goals behind them either. What goal is accomplished when you achieve your goal of happiness?"
You: "I don't need my goal to fulfill another goal. It is worth fulfilling all on its own."
Robot: "But how do you know that? How is your goal any different from mine?"
You: <TWebbers, please write your answers below>
Last edited by Venryx; August 3rd 2012 at 06:31 AM.
August 3rd 2012, 08:04 AM #49
Re: One Basic Question