Algorithm 2 Information-Theoretic Feature Extraction and Detection
Require: text, LM, EMB, N, NGram, Classifier, k
Ensure: is_watermarked, confidence
 1: // Probability Curvature Features
 2: ℓ_orig ← log p_LM(text)
 3: C ← [ ]                                        ▹ Curvature values
 4: for i ← 1 to N do
 5:     text_i ← Perturb(text, EMB)                ▹ Random synonym replacement, preserve structure
 6:     ℓ_i ← log p_LM(text_i)
 7:     C.append(ℓ_orig − ℓ_i)
 8: end for
 9: curvature ← mean(C)
10: // Information-Theoretic Features
11: H ← average token-level entropy of LM's predictive distribution over text
12: H_ngram ← n-gram entropy of text under NGram
13: CE ← cross-entropy of text under NGram
14: PPL ← exp(H)                                   ▹ Perplexity from entropy
15: // Watermark Detection Features
16: LLR ← log p(text | watermarked) − log p(text | unwatermarked)   ▹ Log-likelihood ratio
17: green ← 0, red ← 0, hits ← [ ]
18: for each token t in text do
19:     seed ← Hash(preceding k tokens)
20:     G ← GreenList(seed)
21:     if t ∈ G then
22:         green ← green + 1
23:         hits.append(t)
24:     else
25:         red ← red + 1
26:     end if
27: end for
28: z ← (green − E[green]) / StdDev(green)         ▹ Green-token z-score
29: // Feature Aggregation
30: F_curv ← statistics of C (mean, variance)
31: F_info ← [H, H_ngram, CE, PPL]
32: F_wm ← [LLR, z, green / (green + red)]
33: F ← concat(F_curv, F_info, F_wm)
34: // Classification with Confidence
35: p ← Classifier(F)                              ▹ Class probabilities
36: is_watermarked ← argmax(p)
37: confidence ← max(p)
38: return is_watermarked, confidence
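
To make steps 1–9 concrete, the following is a minimal sketch of the probability-curvature features. It assumes a scoring callable `log_prob` that returns the total log-probability of a string under LM and a `perturb` callable that performs the random synonym replacement; both names are illustrative placeholders rather than the paper's implementation.

```python
from statistics import mean, pvariance
from typing import Callable, List


def curvature_features(text: str,
                       log_prob: Callable[[str], float],
                       perturb: Callable[[str], str],
                       n_perturbations: int = 20) -> List[float]:
    """Mean and variance of log-probability gaps over N perturbed copies (steps 1-9)."""
    l_orig = log_prob(text)                       # log p_LM(text)
    gaps = []
    for _ in range(n_perturbations):
        perturbed = perturb(text)                 # synonym replacement, structure preserved
        gaps.append(l_orig - log_prob(perturbed))
    return [mean(gaps), pvariance(gaps)]
```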
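
Steps 10–14 can be sketched as below, assuming the detector already has per-token log-probabilities from LM. The n-gram entropy here is an empirical estimate from the text's own n-gram counts, which is one plausible reading of the NGram feature rather than the paper's exact definition.

```python
import math
from collections import Counter
from typing import List, Sequence


def info_features(token_logprobs: Sequence[float],
                  tokens: Sequence[str],
                  n: int = 2) -> List[float]:
    """Entropy, perplexity, and empirical n-gram entropy (steps 10-14)."""
    # Token-level entropy estimate: average negative log-probability under the LM.
    H = -sum(token_logprobs) / len(token_logprobs)
    ppl = math.exp(H)                             # perplexity from entropy (step 14)
    # Empirical n-gram entropy of the text itself (assumes len(tokens) >= n).
    grams = Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    total = sum(grams.values())
    H_ngram = -sum((c / total) * math.log(c / total) for c in grams.values())
    return [H, ppl, H_ngram]
```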
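
For the green-list counting and z-score in steps 15–28, the sketch below follows the common construction in which the preceding k token ids seed a PRNG that selects a "green" fraction gamma of the vocabulary. The hashing scheme and the default gamma = 0.5 are assumptions, since the algorithm's exact watermark parameters are not specified here.

```python
import hashlib
import math
import random
from typing import Sequence


def greenlist_zscore(token_ids: Sequence[int], vocab_size: int,
                     k: int = 1, gamma: float = 0.5) -> float:
    """Green-token count and z-score over a token sequence (steps 15-28)."""
    green = 0
    for i in range(k, len(token_ids)):
        # Seed a PRNG with the preceding k token ids (assumed hashing scheme).
        context = tuple(token_ids[i - k:i])
        seed = int(hashlib.sha256(repr(context).encode()).hexdigest(), 16) % (2 ** 32)
        rng = random.Random(seed)
        # A real detector would test membership without materializing the full list.
        green_set = set(rng.sample(range(vocab_size), int(gamma * vocab_size)))
        if token_ids[i] in green_set:
            green += 1
    T = len(token_ids) - k                        # number of scored tokens
    expected, std = gamma * T, math.sqrt(T * gamma * (1 - gamma))
    return (green - expected) / std
```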
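
Finally, steps 29–38 reduce to concatenating the feature groups and passing them to a trained classifier. The sketch below uses scikit-learn's `predict_proba` as a stand-in for the paper's Classifier and returns the probability of the predicted class as the confidence.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def classify(features, clf: LogisticRegression):
    """Return (is_watermarked, confidence) from the aggregated features (steps 29-38)."""
    x = np.asarray(features, dtype=float).reshape(1, -1)
    proba = clf.predict_proba(x)[0]               # class probabilities
    # Assumes the classifier was trained with labels 0 = clean, 1 = watermarked.
    is_watermarked = bool(proba[1] >= 0.5)
    confidence = float(proba.max())               # probability of the predicted class
    return is_watermarked, confidence
```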