Add 'brainsteam/content/annotations/2023/01/29/1674989844.md'
continuous-integration/drone/push Build is passing

This commit is contained in:
ravenscroftj 2023-01-29 11:00:04 +00:00
parent aa8ba5f150
commit fddf4f92ac
1 changed file with 61 additions and 0 deletions


@@ -0,0 +1,61 @@
---
date: '2023-01-29T10:57:24'
hypothesis-meta:
created: '2023-01-29T10:57:24.658922+00:00'
document:
title:
- 2301.11305.pdf
flagged: false
group: __world__
hidden: false
id: tnNraJ_DEe2YBceDAVt0Uw
links:
html: https://hypothes.is/a/tnNraJ_DEe2YBceDAVt0Uw
incontext: https://hyp.is/tnNraJ_DEe2YBceDAVt0Uw/arxiv.org/pdf/2301.11305.pdf
json: https://hypothes.is/api/annotations/tnNraJ_DEe2YBceDAVt0Uw
permissions:
admin:
- acct:ravenscroftj@hypothes.is
delete:
- acct:ravenscroftj@hypothes.is
read:
- group:__world__
update:
- acct:ravenscroftj@hypothes.is
tags:
- chatgpt
- detecting gpt
target:
- selector:
- end: 22349
start: 22098
type: TextPositionSelector
- exact: "Empirically, we find predictive entropy to be positively cor-related\
\ with passage fake-ness more often that not; there-fore, this baseline uses\
\ high average entropy in the model\u2019spredictive distribution as a signal\
\ that a passage is machine-generated."
prefix: tropy) predictive distributions.
suffix: ' While our main focus is on zero'
type: TextQuoteSelector
source: https://arxiv.org/pdf/2301.11305.pdf
text: this makes sense and aligns with [GLTR](http://gltr.io) - humans add more
entropy to sentences by making unusual vocabulary choices that a model would
not.
updated: '2023-01-29T10:57:24.658922+00:00'
uri: https://arxiv.org/pdf/2301.11305.pdf
user: acct:ravenscroftj@hypothes.is
user_info:
display_name: James Ravenscroft
in-reply-to: https://arxiv.org/pdf/2301.11305.pdf
tags:
- chatgpt
- detecting gpt
- hypothesis
type: annotation
url: /annotations/2023/01/29/1674989844
---
<blockquote>Empirically, we find predictive entropy to be positively correlated with passage fake-ness more often than not; therefore, this baseline uses high average entropy in the model's predictive distribution as a signal that a passage is machine-generated.</blockquote>

this makes sense and aligns with [GLTR](http://gltr.io) - humans add more entropy to sentences by making unusual vocabulary choices that a model would not.
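
As a rough illustration of the entropy baseline the quote describes (a minimal sketch, not the paper's code - the choice of GPT-2, and the helper name `avg_predictive_entropy`, are my own assumptions), the score is just the mean entropy of the model's next-token predictive distributions over the passage:

```python
# Sketch of the entropy baseline: score a passage by the average entropy
# of a language model's next-token predictive distribution, using GPT-2
# via Hugging Face transformers. Assumed names, not the paper's code.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def avg_predictive_entropy(text: str) -> float:
    """Mean entropy (in nats) of the model's next-token distributions."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits              # (1, seq_len, vocab)
    log_probs = torch.log_softmax(logits, dim=-1)
    entropy = -(log_probs.exp() * log_probs).sum(dim=-1)  # (1, seq_len)
    return entropy.mean().item()

# Under this baseline, a higher average entropy is taken as a signal
# that the passage is machine-generated.
print(avg_predictive_entropy("The quick brown fox jumps over the lazy dog."))
```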