
---
date: 2022-11-21T20:13:05
type: reply
url: /replies/2022/11/21/1669061585
in-reply-to: https://www.researchgate.net/profile/Lin-Gui-5/publication/342058196_Multi-Task_Learning_with_Mutual_Learning_for_Joint_Sentiment_Classification_and_Topic_Detection/links/5f96fd48458515b7cf9f3abd/Multi-Task-Learning-with-Mutual-Learning-for-Joint-Sentiment-Classification-and-Topic-Detection.pdf
tags:
  - NLProc
  - topic modelling
  - neural networks
  - hypothesis
hypothesis-meta:
  id: 6KsbkmnYEe2Y3g9fobLUFA
  created: 2022-11-21T20:13:05.556810+00:00
  updated: 2022-11-21T20:13:05.556810+00:00
  document: IEEEtran-7.pdf
  user: acct:ravenscroftj@hypothes.is
  display_name: James Ravenscroft
  link: https://hypothes.is/a/6KsbkmnYEe2Y3g9fobLUFA
---
> In recent years, the neural network based topic models have been proposed for many NLP tasks, such as information retrieval [11], aspect extraction [12] and sentiment classification [13]. The basic idea is to construct a neural network which aims to approximate the topic-word distribution in probabilistic topic models. Additional constraints, such as incorporating prior distribution [14], enforcing diversity among topics [15] or encouraging topic sparsity [16], have been explored for neural topic model learning and proved effective.
Neural topic models are often trained to mimic the behaviours of probabilistic topic models - I should come back and look at some of the works:
* R. Das, M. Zaheer, and C. Dyer, “Gaussian LDA for topic models with word embeddings”
* P. Xie, J. Zhu, and E. P. Xing, “Diversity-promoting Bayesian learning of latent variable models”
* M. Peng, Q. Xie, H. Wang, Y. Zhang, X. Zhang, J. Huang, and G. Tian, “Neural sparse topical coding”
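The core idea in the quoted passage — a neural network whose (softmaxed) decoder weights approximate the topic-word distribution of a probabilistic topic model, optionally regularised towards sparsity — can be sketched in a few lines. This is a toy illustration, not any of the cited models: the weights are random rather than learned, the vocabulary is tiny, and the sparsity penalty (negative entropy) is just one plausible choice of regulariser.

```python
import math
import random

random.seed(0)

VOCAB = 6   # toy vocabulary size
TOPICS = 2  # toy number of topics


def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]


# Hypothetical decoder weights: one row of logits per topic. In a real
# neural topic model these would be learned; here they are random.
topic_logits = [[random.gauss(0, 1) for _ in range(VOCAB)]
                for _ in range(TOPICS)]

# Softmaxing each row yields a distribution over the vocabulary --
# the analogue of the topic-word distribution (beta_k) in LDA-style models.
beta = [softmax(row) for row in topic_logits]


def doc_word_dist(theta):
    """Mix the topic-word distributions by document-topic proportions theta."""
    return [sum(theta[k] * beta[k][v] for k in range(TOPICS))
            for v in range(VOCAB)]


def sparsity_penalty(dist, eps=1e-12):
    """Negative entropy of a distribution: higher (closer to zero) means
    more peaked. Adding a term like this to the training loss is one way
    to encourage topic sparsity."""
    return sum(p * math.log(p + eps) for p in dist)


theta = softmax([0.5, -0.5])   # toy document-topic proportions
p_w = doc_word_dist(theta)     # mixture is itself a valid distribution
```

A peaked (sparse) topic scores higher on `sparsity_penalty` than a uniform one, so maximising it (or minimising its negation) during training pushes topics towards using fewer words, in the spirit of the sparse topical coding reference above.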