Algorithm 1 Information-Theoretic Watermark Embedding

Require: prompt, LM, EMB, , , , NGram, k
Ensure: watermarked_text
 1: Initialize watermarked_text = "", bits_embedded = 0
 2:                                ▹ Track total entropy manipulation
 3:                                ▹ Initial probability distribution
 4: while not termination condition do
 5:                                ▹ Maximum likelihood token
 6:                                ▹ Original entropy
 7:                                ▹ Token embedding
 8:                                ▹ Semantic neighborhood
 9:                                ▹ N-gram probabilities from context
10:
11:     if bits_embedded / len(watermarked_text) <  then
12:
13:                                ▹ Boost green tokens by , zero red tokens
14:
15:                                ▹ Entropy change
16:         if  then               ▹ Entropy constraint
17:                                ▹ Equation (6)
18:             watermarked_text +=
19:             bits_embedded +=
20:
21:         else
22:                                ▹ Fallback to original
23:             watermarked_text +=
24:         end if
25:     else
26:
27:         watermarked_text +=
28:     end if
29:     prompt ← Update(prompt, )
30:                                ▹ Update distribution
31: end while
32: return watermarked_text,
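Several of the algorithm's symbols were lost in extraction, but its comments outline the core mechanism: bias the logits of a context-dependent "green" token list, suppress "red" tokens, and accept the biased distribution only when the entropy perturbation stays within a budget, otherwise fall back to the original token. A minimal Python sketch of one such step, assuming a hash-seeded green list, a bias `delta`, and an entropy tolerance `tau` (all names and the SHA-256 partitioning are illustrative, not the paper's exact construction):

```python
import math
import hashlib

def shannon_entropy(probs):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def softmax(logits):
    """Numerically stable softmax over a {token: logit} dict."""
    m = max(logits.values())
    exps = {t: math.exp(v - m) for t, v in logits.items()}
    z = sum(exps.values())
    return {t: e / z for t, e in exps.items()}

def green_list(context, vocab, green_fraction=0.5):
    """Hypothetical green/red split: hash the context to derive a
    deterministic, pseudo-random partition of the vocabulary."""
    seed = hashlib.sha256(context.encode()).hexdigest()
    ranked = sorted(vocab,
                    key=lambda t: hashlib.sha256((seed + t).encode()).hexdigest())
    cut = int(len(ranked) * green_fraction)
    return set(ranked[:cut])

def watermark_step(logits, green, delta=2.0, tau=1.0):
    """One embedding step: boost green-token logits by delta, mask red
    tokens, and keep the biased distribution only if the entropy change
    is within tau; otherwise fall back to the original distribution.
    Returns (chosen token, whether a watermark bit was embedded)."""
    p_orig = softmax(logits)
    h_orig = shannon_entropy(p_orig.values())

    # Boost green tokens by delta, zero (mask) red tokens.
    biased = {t: v + delta for t, v in logits.items() if t in green}
    p_new = softmax(biased)
    h_new = shannon_entropy(p_new.values())

    if abs(h_orig - h_new) <= tau:          # entropy constraint holds
        return max(p_new, key=p_new.get), True
    return max(p_orig, key=p_orig.get), False  # fallback to original
```

In this sketch the greedy argmax stands in for whatever sampling rule the paper uses, and the per-step entropy check is a simplification of the cumulative entropy-manipulation tracking hinted at in the algorithm's comments.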