---
date: '2022-12-19T14:09:11'
hypothesis-meta:
  created: '2022-12-19T14:09:11.863238+00:00'
  document:
    title:
    - My AI Safety Lecture for UT Effective Altruism
  flagged: false
  group: __world__
  hidden: false
  id: tmH8RH-mEe27ArstPwKXEA
  links:
    html: https://hypothes.is/a/tmH8RH-mEe27ArstPwKXEA
    incontext: https://hyp.is/tmH8RH-mEe27ArstPwKXEA/scottaaronson.blog/?p=6823
    json: https://hypothes.is/api/annotations/tmH8RH-mEe27ArstPwKXEA
  permissions:
    admin:
    - acct:ravenscroftj@hypothes.is
    delete:
    - acct:ravenscroftj@hypothes.is
    read:
    - group:__world__
    update:
    - acct:ravenscroftj@hypothes.is
  tags:
  - nlproc
  target:
  - selector:
    - endContainer: /div[2]/div[2]/div[2]/div[1]/p[43]
      endOffset: 779
      startContainer: /div[2]/div[2]/div[2]/div[1]/p[43]
      startOffset: 174
      type: RangeSelector
    - end: 16443
      start: 15838
      type: TextPositionSelector
    - exact: " And famously, self-driving cars have taken a lot longer than many people\
        \ expected a decade ago. This is partly because of regulatory barriers and\
        \ public relations: even if a self-driving car actually crashes less than\
        \ a human does, that\u2019s still not good enough, because when it does crash\
        \ the circumstances are too weird. So, the AI is actually held to a higher\
        \ standard. But it\u2019s also partly just that there was a long tail of\
        \ really weird events. A deer crosses the road, or you have some crazy lighting\
        \ conditions\u2014such things are really hard to get right, and of course\
        \ 99% isn\u2019t good enough here."
      prefix: ' the last jobs to be automated. '
      suffix: '




        We can maybe fuzzily see ahe'
      type: TextQuoteSelector
    source: https://scottaaronson.blog/?p=6823
  text: I think the emphasis is wrong here. The regulation is secondary. The long
    tail of weird events is the more important thing.
  updated: '2022-12-19T14:09:11.863238+00:00'
  uri: https://scottaaronson.blog/?p=6823
  user: acct:ravenscroftj@hypothes.is
  user_info:
    display_name: James Ravenscroft
in-reply-to: https://scottaaronson.blog/?p=6823
tags:
- nlproc
- hypothesis
type: annotation
url: /annotations/2022/12/19/1671458951

---

<blockquote> And famously, self-driving cars have taken a lot longer than many people expected a decade ago. This is partly because of regulatory barriers and public relations: even if a self-driving car actually crashes less than a human does, that’s still not good enough, because when it does crash the circumstances are too weird. So, the AI is actually held to a higher standard. But it’s also partly just that there was a long tail of really weird events. A deer crosses the road, or you have some crazy lighting conditions—such things are really hard to get right, and of course 99% isn’t good enough here.</blockquote>I think the emphasis is wrong here. The regulation is secondary. The long tail of weird events is the more important thing.