Are Founders Allowed To Lie?

Interesting anthropology re: the lambda heat. Here’s what I perceive to be the social contract of SF tech:

a little bit of lying is OK, as long as you are doing something useful

but you can never say this out loud!

if you are found to be a liar, you need to re-establish ingroup/outgroup boundaries around deceit so that your network does not reject you: there is a level of deceit that you are absolutely not OK with, and you must now also point out that people overstepping this new boundary are not only deceptive but strange, that there is something uncanny about them (it’s fashionable right now to post about Theranos, how obviously fraudulent and strange the founder was, not like you!)

it’s basically up to the biggest ingroup liar to set these boundaries and make them really clear, so that other people know which frauds to call out (mustn’t tread on any ingroup toes, etc.)

This is a bit gross in practice, but really the big question is: if we value ‘progress’, how much deceit do we allow in the people actually making an unusual amount of progress happen? We let Elon get away with it; he has earned it… Which billionaires are navigating this particularly well?


I’m also curious how different things get if we max out the social contract for progress.

My understanding is that high performance is actually correlated with honesty and intellectual authenticity, and it’s pretty much a psyop to suggest that you need to tolerate dishonest people in order for progress to be faster. It’s literally just the case that we have a bunch of low-integrity low-performers ruining things for everyone.

Is this true? What’s the best writing people have read on this?

The question I find most interesting is this:

Is there a hidden reserve of high competence that only emerges once we fix the dishonesty problem?

My strong suspicion is that the answer is yes, and that by adjusting the social contract to stop rewarding liars we could maybe double or even 10x the number of extreme high performers, with a similar multiplier on global progress. Has this ever been estimated, or even discussed, once? Seems like a big deal…

I’m also curious what infrastructure (a decentralized reputation measure, or whatever) you could build to solve this. And the devil’s-advocate arguments, the downsides of policing honesty: what’s the false-positive rate? Do useful things look fraudulent to start off?

Highly important point: I have never met a cynic who doesn’t occasionally miss massive upside, or predict that something will fail or is a scam when it ends up becoming very good. This is very, very important, and if a cynic does not acknowledge it then they are no better than a fraud, in my opinion, since they are implicitly pushing cynicism as a superior ideology.


There’s an Alex Danco blog for everything apparently:

