| Notation | Description |
| --- | --- |
| $N=\{n_1,n_2,\ldots,n_N\}$ | Set of news items; $N$ is the size of the news dataset |
| $y_i\in\{0,1\}$ | Label: $y_i=1$ is fake news, $y_i=0$ is real news |
| $U=\{u_1,u_2,\ldots,u_U\}$ | Set of users; $U$ is the number of users |
| $(u_0,sc_0,t_0)$ | A tuple: user $u$ and social context $sc$ at timestamp $t$ |
| $C_{n_i}$ | Content of news item $n_i$ |
| $SC_{n_i}=\{(u_0,sc_0,t_0),\ldots\}$ | Sequence of a user's social contexts on a news item; $\lvert SC\rvert$ is the size of $SC$ |
| $\hat{y}_{n_i}\in\{0,1\}$ | Predicted label for news item $n_i$ |
| $\hat{y}_{n_i}=M(n_i,SC_{n_i})$ | Model $M$ predicts a label for a news item from its news features and social contexts (see the first sketch after this table) |
| $X=(x_1,x_2,\ldots,x_k)$ | Sequence of $k$ input vector representations; $k$ is the length |
| $X'=(x'_1,x'_2,\ldots,x'_k)$ | Sequence of embedding vectors obtained from $X$ |
| $f:X_{1:k}\to Y_{1:l}$ | Mapping $f$ from an input sequence of $k$ vectors to a sequence of $l$ target vectors |
| $x''$; $X''$ | Output vector representation of input $x'$, and the sequence of such output vectors |
| $\kappa_i$; $v_i$; $q_i$; $K$ | Key vector; value vector; query vector; set of key vectors |
| $W_v$, $W_k$, $W_q$, SoftMax | Trainable weight vectors for $v$, $\kappa$, and $q$, respectively; SoftMax activation function (see the second sketch after this table) |
| $\bar{X}_{1:k}$; $Y_{1:l}$ | Contextualized input sequence to the decoder; the target vector sequence |
| $f_{\theta_{\mathrm{enc}}}$; $p_{\theta_{\mathrm{dec}}}$; $L_i,\ i=1,\ldots,k$ | Encoder function; decoder function; logit vector |
| $y'$; $y''$ | Vector representations of $y$ and of $y'$, respectively |
| $\langle S\rangle$; $[S]$; $h_S$ | Token in the decoder; last state of the token; hidden state |
| $p\in[0,1]^2$ | Probability distribution over the two classes $\{0,1\}$ |
| $W$; $b$; $h$ | Projection matrix; bias term; cross-entropy function |
| $X$; $X'$; $X''$; $\bar{X}$ | Input sequence to the encoder; sequence generated from $X$; sequence generated from $X'$; output encoding sequence from $X''$ |
| $Y$; $Y'$; $Y''$; $Y'''$; $\bar{Y}$ | Target sequence in the decoder; sequences generated from $Y$, $Y'$, $Y''$, and $Y'''$, respectively |
| $y_j$; $\hat{y}_j$ | Ground-truth label; model prediction |
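To make the dataset notation concrete, here is a minimal Python sketch of how the news set $N$, labels $y_i$, and social-context sequences $SC_{n_i}$ fit together with the generic prediction $\hat{y}_{n_i}=M(n_i,SC_{n_i})$. The type and function names (`SocialContext`, `NewsItem`, `predict_all`) are hypothetical illustrations, not taken from the paper, and any concrete detector can be plugged in as the model.

```python
# Minimal sketch of the notation as Python data structures (hypothetical names,
# not taken from the paper; the concrete model M is assumed, not prescribed).
from dataclasses import dataclass
from typing import Callable, List, Tuple

# A social-context tuple (u_0, sc_0, t_0): user id, context payload
# (e.g. a share or comment), and timestamp.
SocialContext = Tuple[str, str, float]

@dataclass
class NewsItem:
    content: str                   # C_{n_i}: content of news item n_i
    contexts: List[SocialContext]  # SC_{n_i}: sequence of (u, sc, t) tuples
    label: int                     # y_i in {0, 1}: 1 = fake, 0 = real

# The model M maps (news features, social contexts) to a predicted label.
Model = Callable[[str, List[SocialContext]], int]

def predict_all(model: Model, news: List[NewsItem]) -> List[int]:
    """Apply y_hat_{n_i} = M(n_i, SC_{n_i}) to every item in the dataset N."""
    return [model(n.content, n.contexts) for n in news]

# Example usage with a trivial placeholder model (always predicts "real"):
baseline: Model = lambda content, contexts: 0
```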
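Similarly, the attention symbols ($q_i$, $\kappa_i$, $v_i$, $W_q$, $W_k$, $W_v$, SoftMax) and the two-class head ($W$, $b$, $p\in[0,1]^2$, cross-entropy) can be illustrated with a short NumPy sketch. The dimensions, mean pooling, and single attention head are assumptions for illustration only, not the authors' configuration.

```python
# Illustrative single-head attention and two-class head for the symbols above
# (NumPy sketch under assumed dimensions; not the paper's implementation).
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
k, d = 6, 16                       # sequence length k, embedding size d
X_emb = rng.normal(size=(k, d))    # X': embedding vectors of the input X

W_q = rng.normal(size=(d, d))      # trainable weights producing queries q_i
W_k = rng.normal(size=(d, d))      # trainable weights producing keys kappa_i
W_v = rng.normal(size=(d, d))      # trainable weights producing values v_i

Q, K, V = X_emb @ W_q, X_emb @ W_k, X_emb @ W_v
A = softmax(Q @ K.T / np.sqrt(d))  # attention weights via SoftMax
X_out = A @ V                      # X'': output vector representations

# Classification head: projection matrix W, bias b, distribution p in [0,1]^2.
W = rng.normal(size=(d, 2))
b = np.zeros(2)
p = softmax(X_out.mean(axis=0) @ W + b)

y_true = 1                             # ground-truth label y_j
cross_entropy = -np.log(p[y_true])     # cross-entropy loss for this item
y_hat = int(p.argmax())                # predicted label y_hat_j
```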