Using the subgradient method to solve the lasso problem
The problem is to solve, for a single standardized predictor \(\mathbf{z}=(z_1,\dots,z_N)\) with scalar coefficient \(\beta\):
\[
\underset{\beta}{\operatorname{minimize}}\left\{\frac{1}{2N} \sum_{i=1}^{N}\left(y_{i}-z_{i} \beta\right)^{2}+\lambda|\beta|\right\}
\]
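For concreteness, this one-dimensional objective can be evaluated directly in code. The following is a minimal sketch assuming NumPy; the names `lasso_objective`, `z`, `y`, and `lam` are illustrative, not from the original.

```python
import numpy as np

def lasso_objective(beta, z, y, lam):
    """(1/(2N)) * sum_i (y_i - z_i * beta)^2 + lam * |beta| for one predictor z."""
    N = len(y)
    return np.sum((y - z * beta) ** 2) / (2 * N) + lam * abs(beta)
```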
Subgradient Optimality:
\[
0 \in \partial\left\{\frac{1}{2N} \sum_{i=1}^{N}\left(y_{i}-z_{i} \beta\right)^{2}+\lambda|\beta|\right\}
\]
\[
\Longleftrightarrow 0 \in -\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)+\lambda \partial|\beta|
\]
Let \(v \in \partial|\beta|\). By the definition of the subgradient, we have
\[
v \in\left\{\begin{array}{ll}
\{1\} & \text{ if } \beta>0 \\
\{-1\} & \text{ if } \beta<0 \\
{[-1,1]} & \text{ if } \beta=0
\end{array}\right.
\]
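This case analysis translates directly into a rule for picking one valid subgradient of \(|\beta|\); the sketch below is illustrative (the name `abs_subgradient` is an assumption), choosing \(v=0\) at \(\beta=0\):

```python
def abs_subgradient(beta):
    """Return one element of the subdifferential of |beta|."""
    if beta > 0:
        return 1.0
    if beta < 0:
        return -1.0
    return 0.0  # at beta = 0, any value in [-1, 1] is a valid subgradient
```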
The subgradient optimality condition is
\[
\frac{1}{N}\sum_{i=1}^{N}z_i\left(y_i-z_i\beta\right)=\lambda v
\]
- If \(\beta>0\), then \(v=1\) and
\[
\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)=\lambda,
\]
which we can solve for \(\beta=\frac{\sum z_iy_i-\lambda N}{\sum z_i^2}\). Since \(z_i\) is standardized, \(\sum z_i^2=N\), so
\[
\beta=\frac{\sum z_iy_i-\lambda N}{N}=\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle-\lambda,
\]
which is consistent with \(\beta>0\) exactly when \(\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle>\lambda\).
- If \(\beta<0\), then \(v=-1\) and
\[
\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)=-\lambda,
\]
which we can solve for \(\beta=\frac{\sum z_iy_i+\lambda N}{\sum z_i^2}\). Since \(z_i\) is standardized, \(\sum z_i^2=N\), so
\[
\beta=\frac{\sum z_iy_i+\lambda N}{N}=\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle+\lambda,
\]
which is consistent with \(\beta<0\) exactly when \(\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle<-\lambda\).
- If \(\beta=0\), then \(|v|\le 1\) and
\[
\left|\frac{1}{N}\sum_{i=1}^{N}z_i(y_i-z_i\beta)\right|\le\lambda.
\]
Since \(\beta=0\), this reduces to \(\frac{1}{N}|\langle\mathbf{z}, \mathbf{y}\rangle| \leq \lambda\).
In conclusion, we have:
\[
\widehat{\beta}=\left\{\begin{array}{ll}
\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle-\lambda & \text{ if } \frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle>\lambda \\
0 & \text{ if } \frac{1}{N}|\langle\mathbf{z}, \mathbf{y}\rangle| \leq \lambda \\
\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle+\lambda & \text{ if } \frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle<-\lambda
\end{array}\right.
\]
i.e.
\[
\widehat{\beta}=\mathcal{S}_{\lambda}\left(\frac{1}{N}\langle\mathbf{z}, \mathbf{y}\rangle\right)
\]
where \(\mathcal{S}_{\lambda}\) is the soft-thresholding operator $$\mathcal{S}_{\lambda}(x)=\operatorname{sign}(x)(|x|-\lambda)_{+},$$ with \((a)_{+}=\max(a, 0)\).
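To tie this back to the title, the closed form can be checked numerically against plain subgradient descent on the same objective. The sketch below is a minimal illustration, not a reference implementation; the function names, the synthetic data, the step-size schedule \(1/t\), and the iteration count are all assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding operator S_lam(x) = sign(x) * max(|x| - lam, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def lasso_1d_closed_form(z, y, lam):
    """Closed-form solution: soft-threshold the inner product <z, y> / N."""
    return soft_threshold(z @ y / len(y), lam)

def lasso_1d_subgradient(z, y, lam, n_iter=20000):
    """Subgradient descent with diminishing steps; returns the best iterate."""
    N = len(y)
    beta, best_beta, best_obj = 0.0, 0.0, np.inf
    for t in range(1, n_iter + 1):
        v = 1.0 if beta > 0 else (-1.0 if beta < 0 else 0.0)
        g = -z @ (y - z * beta) / N + lam * v   # a subgradient of the objective
        beta -= g / t                           # step size 1/t
        obj = np.sum((y - z * beta) ** 2) / (2 * N) + lam * abs(beta)
        if obj < best_obj:
            best_obj, best_beta = obj, beta
    return best_beta

# Quick check on synthetic data (illustrative).
rng = np.random.default_rng(0)
N = 200
z = rng.normal(size=N)
z = (z - z.mean()) / z.std()            # standardize so that sum(z_i^2) = N
y = 1.5 * z + rng.normal(size=N)
lam = 0.4
print(lasso_1d_closed_form(z, y, lam))  # roughly 1.5 - 0.4, up to noise
print(lasso_1d_subgradient(z, y, lam))  # should agree to a few decimals
```

The subgradient method converges slowly (hence the large iteration count); the derivation above shows that the one-dimensional problem admits the exact closed form \(\mathcal{S}_{\lambda}\left(\frac{1}{N}\langle\mathbf{z},\mathbf{y}\rangle\right)\), so no iteration is actually needed in this case.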