paul-crowe / Bug-Tracker

trackin' sum bugz

EA anchors too hard on existing orgs/ideas/strategies #2

Open paul-crowe opened 1 year ago

paul-crowe commented 1 year ago

Example: Lack of productive competition between orgs

Summary: "To encourage this, I'd love to see more support for individuals doing great projects who are better suited to the flexibility of doing work independently of any organization, or who otherwise don't fit a hole in an organization."

Date: Feb 8th 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: Only 'whitelisted' activities/goals are really EA

Summary: If it isn't on the shortlist of approved effective activities, it's a waste of time. Examples of whitelisted things: working at an EA-branded organization, or working directly on AI safety.

Date: 7th Feb 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses: In general there seems to be a lot of acknowledgement of this pressure, but also a good deal of pushback:

paul-crowe commented 1 year ago

Example: EAs might not actually change their mind much about values and goals, or form new opinions

Summary: As listed

Date: 8th Feb 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

This one doesn't seem to hold up, especially since the mass shift of focus to AI/longtermism. Examples of people who updated their values:

paul-crowe commented 1 year ago

Example: Over-focused, over-confident, over-reliant

Summary:

Date: 1st May 2014

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses: Over-focused:

Over-confident:

Over-reliant:

paul-crowe commented 1 year ago

Example: Inconsistent Rigor / Standard of Evidence

Summary: "Effective altruists insist on extraordinary rigor in their charity recommendations—cf. for instance GiveWell’s work. Yet for many ancillary problems—donating now vs. later, choosing a career, and deciding how “meta” to go (between direct work, earning to give, doing advocacy, and donating to advocacy), to name a few—they seem happy to choose between the not-obviously-wrong alternatives based on intuition and gut feelings."

Date: 12th Feb 2013

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses: It should be noted that doing anything to address this (presenting newbies with a prescribed list of 'approved' life paths) would just feed into the "EA is an overbearing cult" objection. Also, this accusation of a "follow your gut" attitude contradicts the claims of Only 'whitelisted' activities/goals are really EA.

paul-crowe commented 1 year ago

Example: EA has a motivated reasoning problem

Summary: EA has a problem with motivated reasoning and emotional biases which impairs its truth-seeking powers.

Date: 14th Sept 2021

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: EA makes implicit and mute assumptions

Summary: Looking at the underlying assumptions that create EA culture, and in turn create "intellectual blind spots", specifically relating to homogeneity, hierarchy, and intelligence.

Date: 15th May 2020

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses: Point: has there ever been a group which didn't make implicit and mute assumptions? Is this an "EA" issue or a "human being" issue?

paul-crowe commented 1 year ago

Example: EA is overly hierarchical and top-down

Summary: "Cultural norms around intelligence keep diversification at bay. A leader’s position is assumed justified by his intelligence and an apprehension to appear dim, heightens the barrier to voicing fundamental criticism."

EA is driven by the notion of solving all the world's problems through the sheer power of intellect. This leads to a pecking order of smarts, which in turn leads to fear of criticising those on top, lest ye be considered dumb. Doubt = lack of understanding. Guru worship.

Date: 15th May 2020

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: Longtermism and feedback loops

Summary: No way to tell how things are going, since the results won't be known for another 1000 years. Thus feedback tends to come from peers, increasing the risk of groupthink.

Date: 24th March 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: Needs qualitative research

Summary: Too much of a focus on numbers, which can allow mistakes to slip through unexamined.

Date: -

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: Lack of mentorship and guidance

Summary: Too many people going it alone. Nothing designed to increase group effectiveness.

Date: 2nd Jul 2017

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses: Note that the rate of posts on the "personal development" board has exploded over the past few years.

paul-crowe commented 1 year ago

Example: Neglectedness may be a poor predictor of marginal impact

Summary: The assumption that more good can be done in areas not receiving a lot of attention could be misguided.

Date: 9th November 2018

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: EA is being slow to recognise its own limitations

Summary: "So EA is discovering the limits of the philosophy that underpins it (Rational Choice Theory). It's just slow. It could move much faster by rejecting it and adopting Effectual logic wholesale."

Date: 28 Apr 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: OpenPhil made inflation worse

Summary: As listed

Date: 24th March 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses:

paul-crowe commented 1 year ago

Example: Earning to give should have focused more on “entrepreneurship to give”

Summary: Entrepreneurship can offer a potentially higher reward than the tried-and-true path of earning to give as an employee.

Date: 9th Aug 2022

Status:

Lag to response:

Current canonical instance:

Prior status of critic:

Fundamental criticism:

Public responses: