4 Comments

Not the most exciting article, but one that you’ll reference a lot

Agreed! I just feel like it needed to be said.

I understand where you're coming from in these articles, and you're raising excellent points. But what is your take on the whistleblowers who've left OpenAI? Sam Altman puts up a good front and gets great press, but then again, so did Sam Bankman-Fried.

You pointed out, in a previous article, that heavy restrictive regulation could have severely damaged the growth of personal computers back in the 1980s. Fair enough. But I see another comparison.

Gain-of-function research is valuable but inherently dangerous. Covid-19 hits. Must be from a wet market... not a lab leak (and don't say that, because it's racist). Lots of CYA in my opinion. Screw-ups happen and damage gets done, often because participants were more concerned with progressing to the next step than with travelling along a safer path. The result there was 3,000,000+ dead.

Are you going to get involved with this: https://aitreaty.org/ or this: https://righttowarn.ai/

Jonathan:

I'm quite supportive of properly scoped whistleblower protections for AI company employees, and indeed of broader transparency measures (I'm still figuring out exactly how I think these should be crafted, but more to come soon).

It's not obvious to me that gain-of-function microbiology research and AI development are similar, though I do see why one might imagine that, and I wouldn't necessarily discount that possibility. But I don't think we know that yet.

I will look into these efforts!
