Annotation by James Ravenscroft (acct:ravenscroftj@hypothes.is), 2023-01-29, public, on https://arxiv.org/pdf/2301.11305.pdf (DetectGPT):
> Empirically, we find predictive entropy to be positively correlated with passage fake-ness more often than not; therefore, this baseline uses high average entropy in the model's predictive distribution as a signal that a passage is machine-generated.
This makes sense and aligns with [GLTR](http://gltr.io): humans add more entropy to sentences by making unusual vocabulary choices that a model would not.
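
To make the quoted baseline concrete, here is a minimal sketch of how that average-entropy score could be computed. The choice of GPT-2 (via Hugging Face `transformers`) as the scoring model and the example passage are my assumptions, not from the paper; the paper evaluates raw scores rather than a fixed threshold.

```python
# Sketch of the entropy baseline: score a passage by the mean entropy of the
# model's next-token distribution, with higher scores read as evidence that
# the passage is machine-generated.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()


def avg_predictive_entropy(text: str) -> float:
    """Mean entropy (in nats) of the model's next-token distribution,
    averaged over every position in the passage."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits  # (1, seq_len, vocab_size)
    log_probs = torch.log_softmax(logits, dim=-1)
    # H(p) = -sum_v p(v) * log p(v), computed at each token position.
    entropy = -(log_probs.exp() * log_probs).sum(dim=-1)  # (1, seq_len)
    return entropy.mean().item()


if __name__ == "__main__":
    passage = "The quick brown fox jumps over the lazy dog."
    print(f"avg predictive entropy: {avg_predictive_entropy(passage):.3f} nats")
```

Note that this scores the entropy of the model's predictive distribution at each position, not the surprisal of the tokens actually chosen; the latter is closer to what GLTR visualises, which is why the two signals are related but not identical.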