S1 Appendix. Convergence of J(x) using the gradient descent of the TV function
The optimization of Eq. (11) is carried out using the gradient descent method with the iterative formula

$$x^{k+1} = x^k - h S^k, \qquad (S.1)$$

where $k$ is the iteration number of the $J(x)$ minimization, $h$ is a positive step size, and $S^k$ is the normalized gradient of $J(x^k)$,

$$S^k = \frac{\nabla J(x^k)}{\left\| \nabla J(x^k) \right\|}, \qquad \nabla J(x) = 2(x - v) + \alpha \nabla \mathrm{TV}(x).$$

Here, we show that $S^k$ can be replaced by the normalized gradient of the TV function,

$$s^k = \frac{\nabla \mathrm{TV}(x^k)}{\left\| \nabla \mathrm{TV}(x^k) \right\|},$$

as in the gradient descent of Eq. (21), which guarantees a non-increasing $J(x)$ under a condition on the step size $h$, thus leading to the projection onto $C_3$, $P_{C_3}$.
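To make the iteration concrete, here is a minimal Python sketch of the descent step of Eq. (S.1) with the TV-gradient direction $s^k$. It assumes a 2D image and a smoothed isotropic TV; the smoothing constant eps, the helper names tv_gradient and tv_descent_step, and the forward-difference discretization are our illustrative choices, not specifics from the original method.

```python
import numpy as np

def tv_gradient(x, eps=1e-2):
    """Gradient of a smoothed isotropic 2D TV: sum_ij sqrt(dx^2 + dy^2 + eps).

    eps > 0 avoids division by zero and keeps the gradient Lipschitz.
    """
    # Forward differences; duplicating the last row/column makes the
    # boundary difference zero (Neumann-like boundary condition).
    dx = np.diff(x, axis=0, append=x[-1:, :])
    dy = np.diff(x, axis=1, append=x[:, -1:])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    px, py = dx / mag, dy / mag
    # Apply the adjoint of the forward differences (a negative divergence):
    # grad[i, j] = px[i-1, j] - px[i, j] + py[i, j-1] - py[i, j].
    g = -px - py
    g[1:, :] += px[:-1, :]
    g[:, 1:] += py[:, :-1]
    return g

def tv_descent_step(x, h):
    """One iteration of Eq. (S.1) with S^k replaced by s^k (Eq. (21))."""
    g = tv_gradient(x)
    s = g / np.linalg.norm(g)   # s^k = grad TV / ||grad TV||, so ||s^k|| = 1
    return x - h * s
```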
It can be shown that $J(x)$ has a Lipschitz continuous gradient [5], so that

$$J(x^{k+1}) \le J(x^k) + \left\langle \nabla J(x^k),\, x^{k+1} - x^k \right\rangle + \frac{L}{2} \left\| x^{k+1} - x^k \right\|^2, \qquad (S.2)$$

where $L$ is the Lipschitz constant, whose value depends on the specified operation of 2D or 3D differentiation and can be determined numerically [5].
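Reference [5] determines $L$ numerically; we do not reproduce that procedure here. As a stand-in, one common hedge is to sample the ratio $\| \nabla J(x_1) - \nabla J(x_2) \| / \| x_1 - x_2 \|$ over random pairs, which yields a lower estimate of $L$ that is then inflated by a safety factor. A sketch, reusing the tv_gradient helper above (estimate_lipschitz and its parameters are hypothetical names):

```python
def estimate_lipschitz(v, alpha, shape=(64, 64), trials=200, safety=2.0, seed=0):
    """Sampled lower estimate of Lip(grad J), inflated by a safety factor.

    grad J(x) = 2(x - v) + alpha * grad TV(x), per the appendix.
    """
    rng = np.random.default_rng(seed)
    grad_J = lambda x: 2.0 * (x - v) + alpha * tv_gradient(x)
    ratio = 0.0
    for _ in range(trials):
        x1 = rng.standard_normal(shape)
        x2 = rng.standard_normal(shape)
        num = np.linalg.norm(grad_J(x1) - grad_J(x2))
        den = np.linalg.norm(x1 - x2)
        ratio = max(ratio, num / den)
    return safety * ratio
```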
From Eq. (21), we have $x^{k+1} = x^k - h s^k$. Substituting this, together with $\nabla J(x^k) = 2(x^k - v) + \alpha \nabla \mathrm{TV}(x^k)$, into the inequality of Eq. (S.2) gives

$$J(x^{k+1}) - J(x^k) \le -h \left\langle 2(x^k - v) + \alpha \nabla \mathrm{TV}(x^k),\, s^k \right\rangle + \frac{L h^2}{2} \left\| s^k \right\|^2 = -h \left( \left\langle 2(x^k - v),\, s^k \right\rangle + \alpha \left\| \nabla \mathrm{TV}(x^k) \right\| \right) + \frac{L h^2}{2}, \qquad (S.3)$$

where we use the facts that $\left\| s^k \right\| = 1$ and that $\left\langle \nabla \mathrm{TV}(x^k),\, s^k \right\rangle = \left\| \nabla \mathrm{TV}(x^k) \right\|$, since $s^k$ is the normalized gradient of the TV function.
In order to have a non-increasing objective function, the right-hand side of Eq. (S.3) must be less than or equal to zero, i.e.,

$$-h \left( \left\langle 2(x^k - v),\, s^k \right\rangle + \alpha \left\| \nabla \mathrm{TV}(x^k) \right\| \right) + \frac{L h^2}{2} \le 0. \qquad (S.4)$$
Therefore, the condition for a non-increasing $J(x)$ can be satisfied by selecting $h$ according to

$$0 \le h \le T^k = 2 \left( \left\langle 2(x^k - v),\, s^k \right\rangle + \alpha \left\| \nabla \mathrm{TV}(x^k) \right\| \right) / L. \qquad (S.5)$$
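The bound $T^k$ of Eq. (S.5) is cheap to evaluate, so the step size can be clipped on the fly at each iteration. A sketch, assuming the tv_gradient and tv_descent_step helpers above (step_size_bound and safeguarded_step are our names, and clipping h into [0, T^k] is one possible policy, not necessarily the original one):

```python
def step_size_bound(x, v, alpha, L):
    """T^k of Eq. (S.5): the largest step size guaranteeing J does not increase."""
    g = tv_gradient(x)
    g_norm = np.linalg.norm(g)
    s = g / g_norm                             # s^k, the normalized TV gradient
    inner = np.sum(2.0 * (x - v) * s)          # <2(x^k - v), s^k>
    return 2.0 * (inner + alpha * g_norm) / L

def safeguarded_step(x, v, alpha, L, h):
    """One step of Eq. (S.1) along s^k, with h clipped into [0, T^k]."""
    t_k = step_size_bound(x, v, alpha, L)
    h_k = min(max(h, 0.0), max(t_k, 0.0))      # T^k <= 0 means stay put
    return tv_descent_step(x, h_k)
```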
If the sequence $J(x^k)$ approaches a local minimum, it approaches the global minimum as well, since $J(x)$ is convex. Thus, although Eq. (21) is intended for TV minimization, it can also be used to find the projection of $v$ onto the set $C_3$, i.e., $\mathrm{TV}(P_{C_3}(v)) \le t$, provided that Eq. (S.5) is satisfied.
The condition of Eq. (S.5) depends on the Lagrange multiplier $\alpha$, whose lower bound can be determined by Eq. (16). Indeed, the TV-POCS method used a step size that implicitly satisfied this condition (Fig 9) and led to good reconstruction behavior.
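As a quick numerical sanity check of the non-increase claim, the following loop runs the safeguarded iteration from the sketches above and asserts that $J(x^k)$ never increases. The constants are illustrative: with smoothing eps = 1e-2, L = 2 + 8·alpha/√eps is a crude analytical upper bound on the Lipschitz constant of $\nabla J$ (Lipschitz constant 2 from the data term, at most 8/√eps from the smoothed TV with forward differences), standing in for the numerical determination of [5].

```python
def J(x, v, alpha, eps=1e-2):
    """J(x) = ||x - v||^2 + alpha * TV_eps(x), with the same smoothed TV."""
    dx = np.diff(x, axis=0, append=x[-1:, :])
    dy = np.diff(x, axis=1, append=x[:, -1:])
    return np.sum((x - v) ** 2) + alpha * np.sum(np.sqrt(dx**2 + dy**2 + eps))

rng = np.random.default_rng(0)
v = rng.standard_normal((64, 64))       # stand-in for the image to project
alpha = 0.1
L = 2.0 + 8.0 * alpha / np.sqrt(1e-2)   # analytical upper bound on Lip(grad J)
x = v.copy()
prev = J(x, v, alpha)
for k in range(20):
    x = safeguarded_step(x, v, alpha, L, h=1.0)
    cur = J(x, v, alpha)
    assert cur <= prev + 1e-9, "J increased: step-size condition violated"
    prev = cur
print("J(x^k) was non-increasing over 20 iterations")
```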