Sparse Representation Software and Toolkits




Fast l-1 Minimization Algorithms: Homotopy and Augmented Lagrangian Method
-- Implementation from Fixed-Point MPUs to Many-Core CPUs/GPUs

Allen Y. Yang, Arvind Ganesh, Zihan Zhou,
Andrew Wagner, Victor Shia, Shankar Sastry, and Yi Ma


© Copyright Notice: It is important that you read and understand the copyright of the following software packages as specified in the individual items. The copyright varies with each package due to its author(s). The packages should NOT be used for any commercial purposes without direct consent of their author(s).

This project is partially supported by NSF TRUST Center at UC Berkeley, ARO MURI W911NF-06-1-0076, ARL MAST-CTA W911NF-08-2-0004.

Publications

  1. Allen Yang, Arvind Ganesh, Zihan Zhou, Shankar Sastry, and Yi Ma. A Review of Fast l1-Minimization Algorithms for Robust Face Recognition. (preprint)
  2. Allen Yang, Arvind Ganesh, Shankar Sastry, and Yi Ma. Fast l1-Minimization Algorithms and An Application in Robust Face Recognition: A Review. ICIP 2010.
  3. Victor Shia, Allen Yang, Shankar Sastry, Andrew Wagner, and Yi Ma. Fast l1-Minimization and Parallelization for Face Recognition. Asilomar 2011.
MATLAB Benchmark Scripts
  • L-1 Benchmark Package: http://www.eecs.berkeley.edu/~yang/software/l1benchmark/l1benchmark.zip

The package contains a consolidated implementation of nine l-1 minimization algorithms in MATLAB. Each function uses a consistent set of parameters (e.g., stopping criterion and tolerance) to interface with our benchmark scripts.
  1. Orthogonal Matching Pursuit: SolveOMP.m
  2. Primal-Dual Interior-Point Method: SolveBP.m
  3. Gradient Projection: SolveL1LS.m
  4. Homotopy: SolveHomotopy.m
  5. Polytope Faces Pursuit: SolvePFP.m
  6. Iterative Thresholding: SolveSpaRSA.m
  7. Proximal Gradient: SolveFISTA.m
  8. Primal Augmented Lagrange Multiplier: SolvePALM.m
  9. Dual Augmented Lagrange Multiplier: SolveDALM.m; SolveDALM_fast.m

The package also contains a script to generate the synthetic data shown in the paper [1].
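
As a rough illustration of this common interface, the sketch below generates a small synthetic sparse-recovery problem and calls one of the solvers on it. The exact option names (e.g. 'tolerance') and the SolveHomotopy argument order are assumptions here; please consult the package's demo scripts for the authoritative usage.

    % Minimal MATLAB sketch (assumed solver interface; verify against the package).
    n = 1000;  d = 400;  k = 50;                    % ambient dim., measurements, sparsity
    A = randn(d, n);
    A = A ./ repmat(sqrt(sum(A.^2, 1)), d, 1);      % normalize the columns of the projection
    x0 = zeros(n, 1);
    x0(randperm(n, k)) = randn(k, 1);               % k-sparse ground-truth signal
    y = A * x0;                                     % noiseless measurements
    xhat = SolveHomotopy(A, y, 'tolerance', 1e-6);  % any of the nine solvers above
    fprintf('Relative recovery error: %.2e\n', norm(xhat - x0) / norm(x0));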
Note:
1. To run the alternating direction method (YALL1), one needs to separately download the package from its authors (following the link at the end of the page).
2. Please properly acknowledge the respective authors in your publications when you use this package.


Single-Core l-1 Minimization Library in C
  • Homotopy and ALM algorithms implemented in C with MATLAB wrapper: http://www.eecs.berkeley.edu/~yang/software/l1benchmark/L1-Homotopy-ALM.zip
Fixed-Point l-1 Minimization for Mobile Platforms
  • Fixed-point Homotopy algorithm implemented in Java: http://www.eecs.berkeley.edu/~yang/software/l1benchmark/fixed_point_homotopy_java.zip
Many-Core l-1 Minimization Library in C/CUDA
  • Coming soon ...
Benchmark Results
Simulations
  • Noiseless Delta-Rho Plot at 95% Confidence

The delta-rho plot measures the percentage of successful recoveries of a sparse signal at each (delta, rho) pair, where delta = d/n is the sampling rate and rho = k/n is the sparsity rate. A fixed success rate of 95% can then be interpolated over all values of delta as a curve in the plot, as shown on the left. In general, the higher the success rates, the better an algorithm recovers dense signals in the l-1 problem.
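
To make the construction concrete, the following sketch tabulates success percentages on a coarse (delta, rho) grid by repeated random trials; the solver call is a placeholder for any of the benchmark functions, and the grid, trial count, and success threshold are illustrative assumptions (the actual benchmark uses far more trials and grid points).

    % Sketch of a delta-rho success grid (assumed solver interface, coarse grid).
    n = 200;  trials = 20;  success_tol = 1e-2;
    deltas = 0.1:0.1:0.9;                           % sampling rates delta = d/n
    rhos   = 0.05:0.05:0.5;                         % sparsity rates  rho   = k/n
    success = zeros(numel(rhos), numel(deltas));
    for i = 1:numel(deltas)
        d = round(deltas(i) * n);
        for j = 1:numel(rhos)
            k = round(rhos(j) * n);
            for t = 1:trials
                A = randn(d, n);
                A = A ./ repmat(sqrt(sum(A.^2, 1)), d, 1);
                x0 = zeros(n, 1);  x0(randperm(n, k)) = randn(k, 1);
                y = A * x0;
                xhat = SolveHomotopy(A, y);         % placeholder for any solver
                success(j, i) = success(j, i) + (norm(xhat - x0) / norm(x0) < success_tol);
            end
        end
    end
    success = 100 * success / trials;               % percentage of successful recoveries
    % The 95% curve is obtained by finding, for each delta, the largest rho at
    % which the interpolated success percentage still reaches 95%.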

Observations:
  1. Setting aside speed and data noise, the interior-point method PDIPA achieves the highest success rate of all the algorithms in the figure, especially as the signal becomes dense.
  2. The success rates of L1LS and Homotopy are similar, and they are very close to those of PDIPA.
  3. The success rates of FISTA and DALM are comparable over all sampling rates, and both show a significant improvement over the IST algorithm SpaRSA.
  • Fixed Low Sparsity Simulation (only speed is shown here)

The figure on the left shows the average run time over various projection dimensions d, where the ambient dimension is n=2000. A low sparsity is fixed at k=200.
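
The run-time measurement itself can be sketched with a simple timing loop of the kind below; the solver call is again a placeholder for any of the benchmark functions, and tic/toc over a handful of random trials stands in for the more careful averaging used in the paper.

    % Rough sketch of the fixed-low-sparsity timing experiment (n = 2000, k = 200).
    n = 2000;  k = 200;  trials = 5;
    dims = 600:200:1800;                            % projection dimensions d to test
    avg_time = zeros(size(dims));
    for i = 1:numel(dims)
        d = dims(i);  t_total = 0;
        for t = 1:trials
            A = randn(d, n);
            A = A ./ repmat(sqrt(sum(A.^2, 1)), d, 1);
            x0 = zeros(n, 1);  x0(randperm(n, k)) = randn(k, 1);
            y = A * x0;
            tic;  xhat = SolveDALM(A, y);  t_total = t_total + toc;   % repeat per solver
        end
        avg_time(i) = t_total / trials;
    end
    % Plotting avg_time against dims for each solver reproduces the run-time curves;
    % fixing d = 1500 and sweeping k instead gives the high-sampling-rate experiment below.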

Observations:
  1. The computational complexity of PDIPA grows much faster than that of the other algorithms. More importantly, in contrast to its noise-free performance, its estimation error also grows exponentially, and the algorithm fails to converge to an estimate close to the ground truth (please refer to the paper [1]).
  2. L1LS and Homotopy take much longer to converge than SpaRSA, FISTA, and DALM.
  3. The average run time of DALM is the smallest over all projection dimensions.
  • Fixed High Sampling Rate Simulation (only speed is shown here)

The figure on the left shows the average run time over various sparsity ratios rho, where the ambient dimension is again n=2000. A high sampling rate is fixed at d=1500.

Observations:
  1. Again, PDIPA significantly underperforms the other five algorithms in terms of both accuracy and speed.
  2. The average run time of Homotopy grows almost linearly with the sparsity ratio, while the other algorithms are relatively unaffected. Thus, Homotopy is more suitable for scenarios where the unknown signal is expected to have a very small sparsity ratio.
  3. DALM is again the fastest algorithm, ahead of both SpaRSA and FISTA.
Robust Face Recognition

Under Construction ...

The CMU Multi-PIE database can be purchased from here: http://cmu.wellspringsoftware.net/invention/detail/2309/
  • Solving Cross-and-Bouquet Model in Robust Face Recognition
This experiment selects 249 subjects from Multi-PIE and uses 7 extreme illumination conditions as the training images. The testing images are corrupted at random pixel coordinates, at corruption levels ranging from 0% to 90%. We measure the average classification rate and the speed under the different corruption percentages.
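
For context, the cross-and-bouquet formulation augments the training dictionary A with an identity matrix so that a second sparse vector e absorbs the corrupted pixels, and the concatenation [x; e] is recovered with any of the l-1 solvers above. The sketch below uses synthetic stand-in data and an assumed SolveHomotopy call purely for illustration; it is not the actual experiment code.

    % Illustrative cross-and-bouquet sketch with synthetic stand-in data
    % (the real experiment uses vectorized Multi-PIE face images).
    d = 300;  n = 100;  numSubjects = 10;
    A = randn(d, n);  A = A ./ repmat(sqrt(sum(A.^2, 1)), d, 1);
    labels = repmat(1:numSubjects, 1, n / numSubjects);   % subject label per training column
    y = A(:, 5);                                          % pretend test image = 5th training image
    corrupt = randperm(d, round(0.3 * d));                % corrupt 30% of its pixels
    y(corrupt) = rand(numel(corrupt), 1);
    B = [A, eye(d)];                                      % bouquet (faces) plus cross (identity)
    w = SolveHomotopy(B, y);                              % assumed call: min ||w||_1 s.t. B*w = y
    x = w(1:n);  e = w(n+1:end);                          % sparse coefficients and sparse error
    % Classify by the smallest class-wise residual after removing the error term.
    res = zeros(1, numSubjects);
    for c = 1:numSubjects
        xc = zeros(n, 1);
        xc(labels == c) = x(labels == c);
        res(c) = norm(y - e - A * xc);
    end
    [~, identity] = min(res);                             % predicted subject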

Observations:
  1. In terms of accuracy, Homotopy achieves the best overall performance, with PDIPA a very close second. FISTA, on the other hand, obtains the lowest recognition rates.
  2. In terms of speed, Homotopy is also one of the fastest algorithms, especially when the pixel corruption percentage is small.

Other Public l-1 Minimization Libraries

  • SparseLab: http://sparselab.stanford.edu/
    • Orthogonal Matching Pursuit (OMP): SolveOMP
    • Primal-Dual Basis Pursuit (BP): SolveBP.m
    • Lasso : SolveLasso.m
    • Polytope Faces Pursuit (PFP): SolvePFP.m
  • l1magic: http://www.acm.caltech.edu/l1magic/
    • Primal-Dual Basis Pursuit (BP): l1eq_pd.m
  • L1LS: http://www.stanford.edu/boyd/l1_ls/
    • Truncated Newton Interior-Point Method: l1_ls.m
  • GPSR: http://www.lx.it.pt/~mtf/GPSR/
    • Gradient Projection for Sparse Reconstruction: GPSR_BB
  • l1-Homotopy: http://users.ece.gatech.edu/~sasif/homotopy/
    • Homotopy Method: BPDN_homotopy_function.m
  • SpaRSA: http://www.lx.it.pt/~mtf/SpaRSA/
    • Iterative Shrinkage-Thresholding Algorithm: SpaRSA.m
  • FISTA: http://www.eecs.berkeley.edu/~yang/software/l1benchmark/
    • Fast IST Algorithm: SolveFISTA.m
  • FISTA for wavelet-based denoising:
    • http://iew3.technion.ac.il/~becka/papers/wavelet_FISTA.zip
  • NESTA: http://www.acm.caltech.edu/~nesta/
    • Nesterov's Algorithm: NESTA.m
  • YALL1: http://www.caam.rice.edu/~optimization/L1/YALL1/
    • Alternating Direction Method: yall1.m
  • Bregman Iterative Regularization: http://www.caam.rice.edu/~optimization/L1/bregman/
    • Fixed-Point Continuation and Active Set: FPC_AS.m

