Table 2.
CPU time and speedup for simulated data experiments. Times are reported as the sum of execution times across all processes.
| Method | Average SSDᵃ Time (min) | Average Iteration Time (min) | # EM Iterations | Total CPU Time (hrs) |
|---|---|---|---|---|
| Simulated 0.05 SNR | | | | |
| StandardEM | 4004 | 4022 | 10 | 670 |
| SubspaceEM-1x | 2.8 | 6.1 | 6 | 0.8 |
| SubspaceEM-2x | 4.3 | 7.2 | 6 | 0.9 |
| SubspaceEM-Overall | 3.5 | 6.7 | 12 | 1.7 |
| Simulated 0.02 SNR | | | | |
| StandardEM | 3914 | 3924 | 7 | 458 |
| SubspaceEM-1x | 2.6 | 7.1 | 5 | 0.8 |
| SubspaceEM-2x | 3.7 | 6.8 | 6 | 0.9 |
| SubspaceEM-Overall | 3.2 | 7.0 | 11 | 1.6 |
| Speedupᵇ | | | | |
| Simulated 0.05 SNR | 1144 | 600 | - | 394 |
| Simulated 0.02 SNR | 1223 | 561 | - | 280 |
ᵃ SSD = sum of squared differences. In StandardEM, the differences are calculated between images and projections; in SubspaceEM, the differences are between approximated images and approximated projections.

ᵇ Speedup is calculated as (StandardEM time) / (SubspaceEM-Overall time).
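For concreteness, a minimal sketch of the two quantities defined in the footnotes. The pixelwise form of the SSD is an assumption (any normalization used by the methods is not stated in the table); the speedup example is computed directly from the table's 0.05 SNR Total CPU Time entries.

```latex
% SSD between an image I and a projection P, summed over pixels k
% (pixelwise form assumed; normalization, if any, is not specified here):
\mathrm{SSD}(I, P) = \sum_{k} \left( I_k - P_k \right)^2

% Worked speedup example (footnote b), Total CPU Time at 0.05 SNR:
\mathrm{Speedup}
  = \frac{\text{StandardEM time}}{\text{SubspaceEM-Overall time}}
  = \frac{670\ \text{hrs}}{1.7\ \text{hrs}}
  \approx 394
```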