+David Gerard Saw the latest revision of the "LessWrong" entry on RationalWiki?
"I have decided not to accept the SAI's grant to publish on TDT -Rachael Briggs"
I wonder why...
What a condescending asshole http://lesswrong.com/r/discussion/lw/f5b/the_problem_with_rational_wiki/7pjd
As if I would fake email conversations with academics. He can go and email her to verify it if he likes. Oct 29, 2012
I'm sure she'll look at her inbox and be delighted she ever went within a mile of SIAI. Oct 29, 2012
+Alexander Kruel wrote: "Isn't it true that given a person's epistemic state there are actions that are less wrong than others with respect to the person's values? If you agree, then what we ought to do is exactly that which has the most speaking in its favor. This includes using those methods that we deem best suited to calculate the former. In this sense the concept of 'should' seems to be pretty straightforward."
Okay, yes - this sense of "should" is fairly straightforward. By which I mean: it still requires books on probability theory, statistics, optimization theory, game theory etc. to work out the details, but there's no fundamental mystery about it preventing people from agreeing on how it works.
So, suppose someone says "I want to maximize the expected value of this well-defined quantity". Then we should be able to agree on which of two choices creates a larger expected value of that quantity - assuming we agree on the probabilities involved.
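The point above can be made concrete with a minimal sketch. The two choices and their outcome/probability numbers below are hypothetical examples, invented purely for illustration; the only claim is that once the probabilities are fixed, the expected-value comparison is mechanical and leaves no room for disagreement.

```python
def expected_value(lottery):
    """Expected value of a lottery given as (outcome, probability) pairs."""
    return sum(outcome * prob for outcome, prob in lottery)

# Hypothetical choice A: a sure gain of 40.
choice_a = [(40.0, 1.0)]

# Hypothetical choice B: a 50% chance of 100, otherwise nothing.
choice_b = [(100.0, 0.5), (0.0, 0.5)]

ev_a = expected_value(choice_a)  # 40.0
ev_b = expected_value(choice_b)  # 50.0

# Anyone who accepts these probabilities must agree that B has the
# higher expected value - even if their intuition favours the sure thing.
better = "B" if ev_b > ev_a else "A"
```

Note that the disagreement discussed below is not about this arithmetic, but about whether maximizing such a quantity is what anyone actually wants.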
But you said that sometimes this leads to an intuition that it's "crazy" to do this. In other words: it would be crazy to really want to maximize the expected value of that quantity.
So what's going on is this: we have intuitions about what people want, or "should" want - in a different sense of the word "should". We may try to capture these intuitions by saying people want to maximize the expected value of some quantity called "utility". But sometimes this blows up in our face, leading to predictions that don't match what most people want.
To me this isn't surprising, since I don't think people really want to maximize the expected value of some quantity! Oct 29, 2012
To make this clear: I replied to David Gerard about why she might have chosen to edit the RationalWiki entry with loose, informal speculation about possibly detrimental associations with ideas like Roko's basilisk.
I wrote, quote:
"David Gerard This is pure speculation but I think she was keen to get her name off that entry ;-)"
In the next paragraph I drew attention to an issue that interests me.
I wrote, quote:
"If I was a serious academic I wouldn't like my name to appear there. I would actually feel embarrassed about any formal association with LW/SIAI..."
Then, in the next two comments, I went on to justify it a little.
At no point did I try to lie about the motives of Rachael Briggs. I am at most guilty of using a comment thread of a post on my personal Google+ channel to make wild speculations and draw attention to issues I find interesting. Oct 29, 2012
Lukeprog: "Briggs decided not to spend further time writing the TDT paper. However, SI is now paying her hourly to give us feedback on the TDT paper that Alex Altair is developing. She's very good at that, and appears to be enjoying the process." http://lesswrong.com/r/discussion/lw/f5b/the_problem_with_rational_wiki/86yx Jan 3, 2013
Too bad none of them expanded on the exact reasons why she decided not to spend further time writing the TDT paper.
In the name of transparency, and for the purpose of evaluation, any scrutiny of SI's output by third-party academics should be made public.
Except, of course, it is she who doesn't want to give a reason for declining to write the full paper. Which I don't understand. She should feel comfortable stating her reasons regardless of hurting any feelings. After all, SI is a charity, to be judged by potential donors and organisations like GiveWell. Withholding data won't help. Jan 3, 2013