---
date: '2022-12-19T14:09:11'
hypothesis-meta:
  created: '2022-12-19T14:09:11.863238+00:00'
  document:
    title:
    - My AI Safety Lecture for UT Effective Altruism
  flagged: false
  group: __world__
  hidden: false
  id: tmH8RH-mEe27ArstPwKXEA
  links:
    html: https://hypothes.is/a/tmH8RH-mEe27ArstPwKXEA
    incontext: https://hyp.is/tmH8RH-mEe27ArstPwKXEA/scottaaronson.blog/?p=6823
    json: https://hypothes.is/api/annotations/tmH8RH-mEe27ArstPwKXEA
  permissions:
    admin:
    - acct:ravenscroftj@hypothes.is
    delete:
    - acct:ravenscroftj@hypothes.is
    read:
    - group:__world__
    update:
    - acct:ravenscroftj@hypothes.is
  tags:
  - nlproc
  target:
  - selector:
    - endContainer: /div[2]/div[2]/div[2]/div[1]/p[43]
      endOffset: 779
      startContainer: /div[2]/div[2]/div[2]/div[1]/p[43]
      startOffset: 174
      type: RangeSelector
    - end: 16443
      start: 15838
      type: TextPositionSelector
    - exact: 'And famously, self-driving cars have taken a lot longer than many
        people expected a decade ago. This is partly because of regulatory barriers
        and public relations: even if a self-driving car actually crashes less than
        a human does, that''s still not good enough, because when it does crash the
        circumstances are too weird. So, the AI is actually held to a higher standard.
        But it''s also partly just that there was a long tail of really weird events.
        A deer crosses the road, or you have some crazy lighting conditions—such
        things are really hard to get right, and of course 99% isn''t good enough
        here.'
      prefix: ' the last jobs to be automated. '
      suffix: We can maybe fuzzily see ahe
      type: TextQuoteSelector
    source: https://scottaaronson.blog/?p=6823
  text: I think the emphasis is wrong here. The regulation is secondary. The long
    tail of weird events is the more important thing.
  updated: '2022-12-19T14:09:11.863238+00:00'
  uri: https://scottaaronson.blog/?p=6823
  user: acct:ravenscroftj@hypothes.is
  user_info:
    display_name: James Ravenscroft
in-reply-to: https://scottaaronson.blog/?p=6823
tags:
- nlproc
- hypothesis
type: annotation
url: /annotations/2022/12/19/1671458951
---
> And famously, self-driving cars have taken a lot longer than many people expected a decade ago. This is partly because of regulatory barriers and public relations: even if a self-driving car actually crashes less than a human does, that's still not good enough, because when it does crash the circumstances are too weird. So, the AI is actually held to a higher standard. But it's also partly just that there was a long tail of really weird events. A deer crosses the road, or you have some crazy lighting conditions—such things are really hard to get right, and of course 99% isn't good enough here.
I think the emphasis is wrong here. The regulation is secondary. The long tail of weird events is the more important thing.