Abstraction
Why AI X-Risk Gets Overestimated
Most AI X-Riskers have only ever encountered strawman skeptics
May 16 · Jonathan Mann
How AI Could Take Over the World
and bring an end to human civilization
Apr 17 · Jonathan Mann
AGI is a Red Herring
When it comes to the threats from AI, there isn't some magic threshold where it suddenly becomes dangerous
Apr 7 · Jonathan Mann
Against Forecasting Tournaments
Why tournaments often encourage biased forecasting
Feb 6 · Jonathan Mann
Judicious Prospecting
Finding opportunities worth pursuing
Jan 24 · Jonathan Mann
About
Abstraction will mostly be about forecasting and predictions, information processing platforms like prediction markets, AI safety and x-risk, and other…
Dec 24, 2021 · Jonathan Mann
Abstraction
The ideal is when nothing more can be abstracted away
By Jonathan Mann · Launched 4 months ago