EA - Clarifications on diminishing returns and risk aversion in giving by Robert Wiblin

The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund



Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: "Clarifications on diminishing returns and risk aversion in giving", published by Robert Wiblin on November 25, 2022, on The Effective Altruism Forum.

In April, when we released my interview with SBF, I attempted to very quickly explain his views on expected value and risk aversion for the episode description, but unfortunately did so in a way that was both confusing and made them sound more like a description of my views than of his.

Those few paragraphs have gotten substantial attention because Matt Yglesias pointed out where they could go wrong, and wasn't impressed, thinking that I'd presented an analytic error as "sound EA doctrine".

So it seems worth clarifying what I actually do think. In brief, I entirely agree with Matt Yglesias that:

- Returns to additional money are certainly not linear at large scales, which counsels in favour of risk aversion.
- Returns become sublinear more quickly when you're working on more niche cause areas like longtermism, relative to larger cause areas such as global poverty alleviation.
- This sublinearity becomes especially pronounced when you're considering giving on the scale of billions rather than millions of dollars.
- There are other major practical considerations that point in favour of risk aversion as well.

(SBF appears to think the effects above are smaller than Matt or I do, but it's hard to know exactly what he believes, so I'll set that aside here.)

The offending paragraphs in the original post were:

"If you were offered a 100% chance of $1 million to keep yourself, or a 10% chance of $15 million — it makes total sense to play it safe. You'd be devastated if you lost, and barely happier if you won.

But if you were offered a 100% chance of donating $1 billion, or a 10% chance of donating $15 billion, you should just go with whatever has the highest expected value — that is, probability multiplied by the goodness of the outcome [in this case $1.5 billion] — and so swing for the fences.

This is the totally rational but rarely seen high-risk approach to philanthropy championed by today's guest, Sam Bankman-Fried. Sam founded the cryptocurrency trading platform FTX, which has grown his wealth from around $1 million to $20,000 million."

The point from the conversation that I wanted to highlight — and what is clearly true — is that for an individual who is going to spend the money on themselves, the fact that one quickly runs out of any useful way to spend the money to improve one's well-being makes it far more sensible to receive $1 billion with certainty than to accept a 90% chance of walking away with nothing.

On the other hand, if you plan to spend the money to help others, such as by distributing it to the world's poorest people, then the good done by dispersing the first dollar and the billionth dollar is much closer together than if you were spending them on yourself. That greatly strengthens the case for taking the risk of receiving nothing in return for a larger amount on average, relative to the personal case.

But the impact of the first dollar and the billionth dollar isn't identical, and in fact could be very different, so calling the approach 'totally rational' was somewhere between an oversimplification and an error.

Before we get to that, though, we should flag a practical consideration that is as important as, or maybe more important than, getting the shape of the returns curve precisely right. As Yglesias points out, once you have begun a foundation and people are building organisations and careers in the expectation of a known minimum level of funding for their field, there are particular harms to risking your entire existing endowment in a way that could leave them and their work stranded and half-finished.

While in the hypothetical your downside is meant to be capped at zero, in reality, 'swinging for the fences' with...
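The expected-value comparisons above can be sketched as a toy calculation. The utility functions below — log utility for personal spending and an arbitrary exponent of 0.95 for mildly diminishing philanthropic returns — are illustrative assumptions of mine, not anything from the article; the point is only that the verdict on the gamble depends on how concave the returns curve is.

```python
import math

def expected_utility(lottery, utility):
    """Expected utility of a lottery given as (probability, amount) pairs."""
    return sum(p * utility(x) for p, x in lottery)

# Utility curves (illustrative assumptions):
linear = lambda x: x               # risk-neutral: every dollar counts equally
log_u = lambda x: math.log(x + 1)  # sharply diminishing returns (personal spending)
mild = lambda x: x ** 0.95         # mildly diminishing returns (large-scale giving)

# Personal case: a sure $1 million vs a 10% shot at $15 million.
sure_million = [(1.0, 1_000_000)]
risky_fifteen = [(0.1, 15_000_000), (0.9, 0)]

# Giving case: a sure $1 billion vs a 10% shot at $15 billion.
sure_billion = [(1.0, 1e9)]
risky_fifteen_b = [(0.1, 1.5e10), (0.9, 0)]

# Risk-neutral, the gamble wins on expected value in both cases...
assert expected_utility(risky_fifteen, linear) > expected_utility(sure_million, linear)
# ...but under sharply diminishing personal utility, the sure $1 million wins.
assert expected_utility(sure_million, log_u) > expected_utility(risky_fifteen, log_u)
# Under only mildly diminishing returns to giving, the gamble still wins,
# while a sharply concave curve would again favour the sure thing.
assert expected_utility(risky_fifteen_b, mild) > expected_utility(sure_billion, mild)
assert expected_utility(sure_billion, log_u) > expected_utility(risky_fifteen_b, log_u)
```

The sketch captures both halves of the argument: how much risk aversion is warranted turns entirely on the curvature of the returns function, which is why "the first dollar and the billionth dollar aren't identical" matters.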
