Anomaly Detection Using Sparse Modeling
1. Introduction
1-1. Task Definition
In signal processing, the task of sparse modeling (sparse coding) is to represent input signals as linear combinations of a few dictionary elements. In the figure (omitted here):

- Input signals: patches (red, green, and blue)
- Linear combinations: coefficients (0.4, 0.1, and 0.9)
- Few dictionary elements: striped-pattern boxes in the dictionary

- Y: Actual signals. Y.shape -> (num_features, num_patches)
- D: Dictionary; basis vectors are stored. D.shape -> (num_features, num_basis)
- X: Sparse representation; coefficients are stored. X.shape -> (num_basis, num_patches)

The goal is to find D and a sparse X such that Y ~= DX.
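As a concrete illustration, this decomposition can be computed with scikit-learn's `DictionaryLearning`. This is a minimal sketch on random toy data (not the repository's actual pipeline); the variable names follow the shapes above, so the sklearn inputs and outputs are transposed:

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
num_features, num_patches, num_basis = 16, 100, 8
Y = rng.normal(size=(num_features, num_patches))  # toy signals

# sklearn stores samples in rows, so fit on Y.T and transpose back.
learner = DictionaryLearning(n_components=num_basis,
                             transform_algorithm="omp",
                             transform_n_nonzero_coefs=3,
                             random_state=0)
X = learner.fit_transform(Y.T).T  # (num_basis, num_patches)
D = learner.components_.T         # (num_features, num_basis)

# Each patch (column of Y) is approximated by at most 3 atoms.
print(np.count_nonzero(X, axis=0).max())  # <= 3
```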
1-2. Apply to Anomaly Detection
To apply sparse modeling to anomaly detection, we have one assumption: "a dictionary learned only from normal images reconstructs normal patches well, but fails on anomalous patches." A large reconstruction error therefore signals an anomaly.
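Under this assumption, a patch's anomaly score can be defined as its reconstruction error under a dictionary learned on normal data. A minimal sketch; the function name `anomaly_scores` and the row-per-patch layout are illustrative assumptions, not the repository's API:

```python
import numpy as np
from sklearn.decomposition import sparse_encode

def anomaly_scores(Y, D, n_nonzero_coefs=3):
    """Per-patch reconstruction error under dictionary D.

    Y: (num_patches, num_features) test patches (sklearn row convention),
    D: (num_basis, num_features) dictionary learned on normal patches.
    """
    X = sparse_encode(Y, D, algorithm="omp",
                      n_nonzero_coefs=n_nonzero_coefs)
    residual = Y - X @ D
    return np.linalg.norm(residual, axis=1)  # one score per patch
```

Patches whose score exceeds a threshold calibrated on normal data are flagged as anomalous.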
1-3. Comparison between Sparse Modeling and Deep Learning
|  | Sparse Modeling | Deep Learning |
| --- | --- | --- |
| Data | Focuses only on the essential parts of the signal and works with small amounts of information. | Needs a very large amount of training data to learn and build up a model. |
| Explainability | Maintains a transparent model that can be both reviewed and verified, so the results are understandable and explainable. | The models work as a black box and don't explain why they return a certain result or decision. |
| Computational resource | Consumes minimal power, so it can easily be embedded into low-cost equipment or FPGAs. | The special computer equipment required to process big data is expensive. |
2. Orthogonal Matching Pursuit
Orthogonal Matching Pursuit (OMP) is a greedy algorithm that optimizes the sparse representation X while the dictionary D is held fixed.
Question: which basis vector is the ideal one to select at each iteration?
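The standard answer: at each iteration, the ideal basis vector is the atom with the largest absolute correlation with the current residual; the coefficients of all selected atoms are then jointly refit by least squares. A minimal from-scratch sketch, assuming D has unit-norm columns and following the shapes in section 1-1:

```python
import numpy as np

def omp(y, D, n_nonzero_coefs):
    """Greedy OMP: y (num_features,), D (num_features, num_basis)
    with unit-norm columns. Returns the sparse coefficient vector x."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(n_nonzero_coefs):
        # Ideal next atom: maximal |correlation| with the residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Jointly refit all selected coefficients (least squares).
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x
```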
3. K-SVD
K-SVD is an algorithm to optimize the dictionary D, updating one basis vector (atom) at a time together with the coefficients that use it.
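The core of the K-SVD atom update is the Eckart-Young theorem: the best rank-1 approximation of the residual matrix comes from the leading singular triplet, which simultaneously yields the new atom and its coefficients. A toy illustration (the matrix E is an arbitrary example, not real data):

```python
import numpy as np

# Best rank-1 approximation of a residual matrix E via SVD
# (Eckart-Young): E ~= s1 * u1 v1^T.
E = np.array([[2.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
U, S, Vt = np.linalg.svd(E, full_matrices=False)
atom = Vt[0, :]             # new dictionary atom (unit norm)
coefs = U[:, 0] * S[0]      # updated coefficients for that atom
E1 = np.outer(coefs, atom)  # rank-1 reconstruction
print(np.linalg.norm(E - E1))  # prints 1.0: only the 2nd singular value remains
```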
4. Sparse Modeling
Alternating the two steps (OMP for the sparse codes, K-SVD for the dictionary update) gives the full training loop. Note that this code follows sklearn's row convention, i.e. transposed relative to the shapes in section 1-1:

```python
import numpy as np
from sklearn.decomposition import sparse_encode

# sklearn row convention: Y (num_patches, num_features),
# D (num_basis, num_features), X (num_patches, num_basis).
for i in range(max_iteration):
    # Sparse coding step: fix D, solve for X with OMP.
    X = sparse_encode(Y, D, algorithm="omp")
    # Dictionary update step (K-SVD): refit one atom at a time.
    for j in range(num_basis):
        nonzero_index = X[:, j] != 0  # patches that use atom j
        if not np.any(nonzero_index):
            continue  # unused atom, nothing to update
        X[nonzero_index, j] = 0
        # Residual without atom j's contribution.
        error = Y[nonzero_index, :] - np.dot(X[nonzero_index, :], D)
        U, S, Vt = np.linalg.svd(error, full_matrices=False)
        X[nonzero_index, j] = U[:, 0] * S[0]  # updated coefficients
        D[j, :] = Vt[0, :]  # updated atom: first right singular vector
```
5. Model Comparison on MVTec AD Dataset
5-1. Graphical Results
5-2. AUROC Scores

| Category | PaDiM | RIAD | Sparse Modeling |
| --- | --- | --- | --- |
| zipper | 0.923 | 0.975 | 0.743 |
| wood | 0.992 | 0.965 | 0.934 |
| transistor | 0.998 | 0.918 | 0.462 |
| toothbrush | 0.883 | 0.972 | 0.758 |
| tile | 0.994 | 0.997 | 0.862 |
| screw | 0.815 | 0.799 | 0.744 |
| pill | 0.958 | 0.786 | 0.467 |
| metal_nut | 0.992 | 0.920 | 0.402 |
| leather | 1.000 | 1.000 | 0.766 |
| hazelnut | 0.985 | 0.890 | 0.967 |
| grid | 0.959 | 0.983 | 0.772 |
| carpet | 0.997 | 0.781 | 0.375 |
| capsule | 0.937 | 0.731 | 0.577 |
| cable | 0.930 | 0.655 | 0.408 |
| bottle | 1.000 | 0.971 | 0.625 |
Sources:
- https://github.com/taikiinoue45/PaDiM#1-auroc-scores
- https://github.com/taikiinoue45/RIAD#1-auroc-scores
- https://github.com/taikiinoue45/SMAD#1-auroc-scores
5-3. PRO30 Scores

| Category | PaDiM | RIAD | Sparse Modeling |
| --- | --- | --- | --- |
| zipper | 0.935 | | 0.727 |
| wood | 0.891 | | 0.822 |
| transistor | 0.949 | | 0.659 |
| toothbrush | 0.915 | | 0.902 |
| tile | 0.826 | | 0.645 |
| screw | 0.936 | | 0.951 |
| pill | 0.952 | | 0.873 |
| metal_nut | 0.933 | | 0.700 |
| leather | 0.978 | | 0.768 |
| hazelnut | 0.937 | | 0.976 |
| grid | 0.866 | | 0.642 |
| carpet | 0.952 | | 0.515 |
| capsule | 0.921 | | 0.796 |
| cable | 0.918 | | 0.658 |
| bottle | 0.951 | | 0.705 |