
Salient Object Detection: A Benchmark


Ali Borji, Ming-Ming Cheng, Huaizu Jiang, Jia Li

Notice: You are welcome to contact Ming-Ming Cheng to add new comparisons. Submissions of new results should include either source code or an executable.

Abstract

We extensively compare, qualitatively and quantitatively, 42 state-of-the-art models (30 salient object detection, 10 fixation prediction, 1 objectness, and 1 baseline) over 6 challenging datasets for the purpose of benchmarking salient object detection and segmentation methods. From the results obtained so far, our evaluation shows consistent, rapid progress over the last few years in terms of both accuracy and running time. The top contenders in this benchmark significantly outperform the models identified as the best in the previous benchmark conducted just two years ago. We find that models designed specifically for salient object detection generally work better than models in closely related areas, which in turn provides a precise definition and suggests an appropriate treatment of this problem that distinguishes it from other problems. In particular, we analyze the influence of center bias and scene complexity on model performance, which, along with the hard cases for state-of-the-art models, provides useful hints for constructing more challenging large-scale datasets and better saliency models. Finally, we propose possible solutions for several open problems, such as evaluation scores and dataset bias, which also suggest future research directions in the rapidly growing field of salient object detection.

Papers

  • Salient Object Detection: A Benchmark, Ali Borji, Ming-Ming Cheng, Huaizu Jiang, Jia Li, IEEE TIP, 2015.  [pdf] [Project page] [Bib]
  • Salient Object Detection: A Survey, Ali Borji, Ming-Ming Cheng, Huaizu Jiang, Jia Li, arXiv eprint, 2014.  [pdf] [Project page] [Bib]

Code

Downloads

We provide the evaluation data (images, ground truth, saliency maps, etc.) for download here to facilitate future research. We suggest using BitTorrent software to download these zip files via their URL lists, which are available here. Evaluation results, in the form of Matlab and CSV files for plots and tables, can be downloaded here. If you use any part of our results, please cite the corresponding paper above. You can also download the data via Baidu Netdisk (百度网盘).
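If you work with the CSV results, a few lines of pandas are enough to reproduce offline the re-ranking that the tables below offer interactively. This is a minimal sketch only: the filename and column labels are hypothetical and should be adjusted to the actual layout of the downloaded files.

```python
import pandas as pd

# Hypothetical filename and column labels; adjust to the downloaded CSV layout.
df = pd.read_csv("fmeasure_results.csv", index_col="Model")

# Re-rank models by max F-measure on ECSSD (the E-M column below),
# mirroring the click-to-rerank behavior of the tables on this page.
print(df.sort_values("E-M", ascending=False).head(10))
```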

Performance: F-measure of saliency maps and salient object segmentations

Model | T-M | T-A | T-S | J-M | J-A | J-S | D-M | D-A | D-S | S-M | S-A | S-S | M-M | M-A | M-S | E-M | E-A | E-S
MBD | .622 | .594 | .642 | .472 | .422 | .470 | .624 | .592 | .636 | .799 | .803 | .759 | .849 | .830 | .890 | .739 | .703 | .785
ST | .631 | .580 | .648 | .455 | .394 | .459 | .631 | .577 | .635 | .818 | .805 | .768 | .868 | .825 | .896 | .752 | .690 | .777
QCUT | .651 | .625 | .620 | .509 | .454 | .480 | .683 | .647 | .647 | .810 | .801 | .672 | .874 | .843 | .843 | .779 | .738 | .747
HDCT | .602 | .571 | .636 | .412 | .378 | .422 | .609 | .572 | .643 | .822 | .802 | .758 | .837 | .807 | .877 | .705 | .669 | .74
RBD | .596 | .566 | .618 | .457 | .403 | .461 | .63 | .58 | .647 | .837 | .825 | .75 | .856 | .821 | .884 | .718 | .68 | .757
GR | .551 | .509 | .546 | .418 | .338 | .378 | .599 | .54 | .58 | .798 | .753 | .639 | .816 | .77 | .83 | .664 | .583 | .677
MNP | .495 | .523 | .603 | .367 | .337 | .405 | .467 | .486 | .576 | .621 | .778 | .765 | .668 | .724 | .822 | .568 | .555 | .709
UFO | .579 | .557 | .61 | .432 | .385 | .433 | .545 | .541 | .593 | .742 | .781 | .729 | .842 | .806 | .862 | .701 | .654 | .739
MC | .61 | .603 | .6 | .46 | .42 | .434 | .627 | .603 | .615 | .779 | .803 | .63 | .847 | .824 | .855 | .742 | .704 | .745
DSR | .611 | .604 | .597 | .454 | .421 | .41 | .626 | .614 | .593 | .794 | .821 | .632 | .835 | .824 | .833 | .737 | .717 | .703
CHM | .612 | .591 | .643 | .417 | .368 | .424 | .604 | .586 | .637 | .75 | .75 | .658 | .825 | .804 | .857 | .722 | .684 | .735
GC | .533 | .517 | .497 | .384 | .321 | .342 | .535 | .528 | .506 | .729 | .73 | .616 | .794 | .777 | .78 | .641 | .612 | .593
LBI | .519 | .534 | .618 | .371 | .353 | .416 | .482 | .504 | .609 | .692 | .776 | .764 | .696 | .714 | .857 | .586 | .563 | .738
PCA | .544 | .558 | .601 | .432 | .404 | .368 | .554 | .554 | .624 | .754 | .796 | .701 | .782 | .782 | .845 | .646 | .627 | .72
DRFI | .67 | .607 | .674 | .475 | .419 | .447 | .665 | .605 | .669 | .831 | .839 | .702 | .881 | .838 | .905 | .787 | .733 | .801
GMR | .597 | .594 | .579 | .454 | .409 | .432 | .61 | .591 | .591 | .773 | .789 | .643 | .847 | .825 | .839 | .74 | .712 | .736
HS | .585 | .549 | .602 | .442 | .358 | .428 | .616 | .565 | .616 | .811 | .776 | .713 | .845 | .8 | .87 | .731 | .659 | .769
LMLC | .54 | .519 | .588 | .375 | .302 | .397 | .521 | .493 | .551 | .653 | .712 | .674 | .801 | .772 | .86 | .659 | .6 | .735
SF | .5 | .495 | .342 | .373 | .319 | .219 | .519 | .512 | .377 | .764 | .794 | .509 | .779 | .759 | .573 | .619 | .576 | .378
FES | .547 | .575 | .426 | .424 | .411 | .333 | .52 | .555 | .38 | .617 | .785 | .174 | .717 | .753 | .534 | .645 | .655 | .467
CB | .581 | .556 | .615 | .444 | .375 | .435 | .542 | .534 | .593 | .73 | .704 | .657 | .815 | .775 | .857 | .717 | .656 | .761
SVO | .554 | .441 | .609 | .414 | .279 | .419 | .557 | .407 | .609 | .744 | .667 | .746 | .789 | .585 | .863 | .639 | .357 | .737
SWD | .528 | .56 | .649 | .434 | .386 | .454 | .478 | .506 | .613 | .548 | .714 | .737 | .689 | .705 | .871 | .624 | .549 | .781
HC | .386 | .401 | .436 | .286 | .257 | .28 | .382 | .38 | .435 | .736 | .759 | .646 | .677 | .663 | .74 | .46 | .441 | .499
RC | .61 | .586 | .639 | .431 | .37 | .425 | .599 | .578 | .621 | .774 | .807 | .649 | .844 | .82 | .875 | .741 | .701 | .776
SEG | .5 | .425 | .58 | .376 | .268 | .393 | .516 | .45 | .562 | .704 | .64 | .669 | .697 | .585 | .812 | .568 | .408 | .715
MSS | .478 | .49 | .2 | .341 | .324 | .089 | .476 | .49 | .193 | .743 | .783 | .298 | .696 | .711 | .362 | .53 | .536 | .203
CA | .458 | .494 | .557 | .353 | .33 | .394 | .435 | .458 | .532 | .591 | .737 | .565 | .621 | .679 | .748 | .515 | .494 | .625
FT | .386 | .4 | .238 | .278 | .25 | .132 | .381 | .388 | .259 | .715 | .734 | .436 | .635 | .628 | .472 | .434 | .431 | .257
AC | .41 | .431 | .068 | .227 | .199 | .049 | .354 | .383 | .04 | .684 | .729 | .14 | .52 | .566 | .014 | .411 | .41 | .038
LC | .386 | .408 | .289 | .264 | .246 | .156 | .327 | .353 | .243 | .683 | .752 | .486 | .569 | .589 | .432 | .39 | .396 | .219
OBJ | .498 | .482 | .593 | .368 | .282 | .413 | .481 | .445 | .578 | .685 | .723 | .731 | .718 | .681 | .84 | .574 | .456 | .698
BMS | .568 | .578 | .594 | .434 | .404 | .416 | .573 | .576 | .58 | .713 | .76 | .627 | .805 | .798 | .822 | .683 | .659 | .69
COV | .51 | .587 | .398 | .429 | .427 | .315 | .486 | .579 | .373 | .518 | .724 | .212 | .667 | .755 | .394 | .641 | .677 | .413
SS | .415 | .482 | .523 | .344 | .321 | .397 | .396 | .443 | .502 | .533 | .696 | .641 | .572 | .642 | .675 | .467 | .441 | .574
SIM | .372 | .429 | .568 | .295 | .292 | .384 | .358 | .402 | .539 | .498 | .685 | .725 | .498 | .585 | .794 | .433 | .391 | .672
SeR | .374 | .419 | .536 | .316 | .285 | .388 | .385 | .411 | .532 | .521 | .714 | .702 | .542 | .607 | .755 | .419 | .391 | .596
SUN | .387 | .432 | .486 | .303 | .291 | .285 | .321 | .36 | .445 | .504 | .661 | .613 | .505 | .596 | .67 | .388 | .376 | .478
SR | .374 | .457 | .002 | .279 | .27 | .001 | .298 | .363 | 0 | .504 | .7 | .002 | .473 | .569 | .001 | .381 | .385 | .001
GB | .526 | .571 | .65 | .419 | .396 | .455 | .507 | .548 | .638 | .571 | .746 | .695 | .688 | .737 | .837 | .624 | .613 | .765
AIM | .427 | .461 | .559 | .317 | .26 | .36 | .361 | .377 | .495 | .541 | .718 | .693 | .555 | .575 | .75 | .449 | .357 | .571
IT | .373 | .437 | .005 | .297 | .283 | 0 | .378 | .449 | .005 | .579 | .697 | .008 | .471 | .586 | .158 | .407 | .414 | .003
AVG | .458 | .569 | .62 | .392 | .367 | .411 | .406 | .514 | .534 | .388 | .524 | .64 | .58 | .692 | .779 | .597 | .627 | .756
Columns labeled M report the maximum F-measure of the average precision-recall curve, columns labeled A the average F-measure under adaptive thresholding, and columns labeled S the average F-measure of the SalCut segmentation results. The subtitle of each column is in the [Dataset]-[Evaluation Metric] format, where [Dataset] is the initial letter of one of the 6 benchmarks {THUR15K, JuddDB, DUT-OMRON, SED2, MSRA10K, ECSSD}. Click a column title to re-rank the table by that metric.
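For reference, the sketch below shows how the M and A columns can be computed with plain NumPy. It is a minimal illustration under the standard conventions of this benchmark, not the released evaluation code: the weighted F-measure is F_beta = (1 + beta^2) * Precision * Recall / (beta^2 * Precision + Recall) with beta^2 = 0.3, the M score takes the maximum over the thresholded precision-recall curve, and the A score uses an image-adaptive threshold, commonly set to twice the mean saliency value.

```python
import numpy as np

BETA2 = 0.3  # beta^2 = 0.3, the weighting convention used in this benchmark


def f_measure(precision, recall, beta2=BETA2):
    # F_beta = (1 + beta^2) * P * R / (beta^2 * P + R); epsilon avoids 0/0
    return (1 + beta2) * precision * recall / (beta2 * precision + recall + 1e-12)


def precision_recall(pred, gt):
    # pred, gt: boolean masks of the same shape
    tp = np.logical_and(pred, gt).sum()
    return tp / max(pred.sum(), 1), tp / max(gt.sum(), 1)


def max_f_measure(saliency, gt, n_thresholds=256):
    # Max F-measure over the PR curve of a [0,1] saliency map vs. a binary mask
    gt = gt.astype(bool)
    return max(
        f_measure(*precision_recall(saliency >= t, gt))
        for t in np.linspace(0, 1, n_thresholds, endpoint=False)
    )


def adaptive_f_measure(saliency, gt):
    # F-measure at an image-dependent threshold: twice the mean saliency value
    t = min(2.0 * saliency.mean(), 1.0)
    return f_measure(*precision_recall(saliency >= t, gt.astype(bool)))
```

Note that the benchmark averages the precision-recall curve over all images of a dataset before taking the maximum, so per-image scores as computed above will differ slightly from the tabulated numbers.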

Performance: AUC & MAE

Method | T-AUC | T-MAE | J-AUC | J-MAE | D-AUC | D-MAE | S-AUC | S-MAE | M-AUC | M-MAE | E-AUC | E-MAE
MBD | 0.915 | 0.162 | 0.838 | 0.225 | 0.903 | 0.168 | 0.922 | 0.137 | 0.964 | 0.107 | 0.917 | 0.172
ST | 0.911 | 0.179 | 0.806 | 0.240 | 0.895 | 0.182 | 0.922 | 0.145 | 0.961 | 0.122 | 0.914 | 0.193
QCUT | 0.907 | 0.128 | 0.831 | 0.178 | 0.897 | 0.119 | 0.860 | 0.148 | 0.956 | 0.118 | 0.909 | 0.171
HDCT | 0.878 | 0.177 | 0.771 | 0.209 | 0.869 | 0.164 | 0.898 | 0.162 | 0.941 | 0.143 | 0.866 | 0.199
RBD | 0.887 | 0.15 | 0.826 | 0.212 | 0.894 | 0.144 | 0.899 | 0.13 | 0.955 | 0.108 | 0.894 | 0.173
GR | 0.829 | 0.256 | 0.747 | 0.311 | 0.846 | 0.259 | 0.854 | 0.189 | 0.925 | 0.198 | 0.831 | 0.285
MNP | 0.854 | 0.255 | 0.768 | 0.286 | 0.835 | 0.272 | 0.888 | 0.215 | 0.895 | 0.229 | 0.82 | 0.307
UFO | 0.853 | 0.165 | 0.775 | 0.216 | 0.839 | 0.173 | 0.845 | 0.18 | 0.938 | 0.15 | 0.875 | 0.207
MC | 0.895 | 0.184 | 0.823 | 0.231 | 0.887 | 0.186 | 0.877 | 0.182 | 0.951 | 0.145 | 0.91 | 0.204
DSR | 0.902 | 0.142 | 0.826 | 0.196 | 0.899 | 0.139 | 0.915 | 0.14 | 0.959 | 0.121 | 0.914 | 0.173
CHM | 0.91 | 0.153 | 0.797 | 0.226 | 0.89 | 0.152 | 0.831 | 0.168 | 0.952 | 0.142 | 0.903 | 0.195
GC | 0.803 | 0.192 | 0.702 | 0.258 | 0.796 | 0.197 | 0.846 | 0.185 | 0.912 | 0.139 | 0.805 | 0.214
LBI | 0.876 | 0.239 | 0.792 | 0.273 | 0.854 | 0.249 | 0.896 | 0.207 | 0.91 | 0.224 | 0.842 | 0.28
PCA | 0.885 | 0.198 | 0.804 | 0.181 | 0.887 | 0.206 | 0.911 | 0.2 | 0.941 | 0.185 | 0.876 | 0.248
DRFI | 0.938 | 0.15 | 0.851 | 0.213 | 0.933 | 0.155 | 0.944 | 0.13 | 0.978 | 0.118 | 0.944 | 0.166
GMR | 0.856 | 0.181 | 0.781 | 0.243 | 0.853 | 0.189 | 0.862 | 0.163 | 0.944 | 0.126 | 0.889 | 0.189
HS | 0.853 | 0.218 | 0.775 | 0.282 | 0.86 | 0.227 | 0.858 | 0.157 | 0.933 | 0.149 | 0.883 | 0.228
LMLC | 0.853 | 0.246 | 0.724 | 0.303 | 0.817 | 0.277 | 0.826 | 0.269 | 0.936 | 0.163 | 0.849 | 0.26
SF | 0.799 | 0.184 | 0.711 | 0.218 | 0.803 | 0.183 | 0.871 | 0.18 | 0.905 | 0.175 | 0.817 | 0.23
FES | 0.867 | 0.155 | 0.805 | 0.184 | 0.848 | 0.156 | 0.838 | 0.196 | 0.898 | 0.185 | 0.86 | 0.215
CB | 0.87 | 0.227 | 0.76 | 0.287 | 0.831 | 0.257 | 0.839 | 0.195 | 0.927 | 0.178 | 0.875 | 0.241
SVO | 0.865 | 0.382 | 0.784 | 0.422 | 0.866 | 0.409 | 0.875 | 0.348 | 0.93 | 0.331 | 0.857 | 0.404
SWD | 0.873 | 0.288 | 0.812 | 0.292 | 0.843 | 0.31 | 0.845 | 0.296 | 0.901 | 0.267 | 0.857 | 0.318
HC | 0.735 | 0.291 | 0.626 | 0.348 | 0.733 | 0.31 | 0.88 | 0.193 | 0.867 | 0.215 | 0.704 | 0.331
RC | 0.896 | 0.168 | 0.775 | 0.27 | 0.859 | 0.189 | 0.852 | 0.148 | 0.936 | 0.137 | 0.892 | 0.187
SEG | 0.818 | 0.336 | 0.747 | 0.354 | 0.825 | 0.337 | 0.796 | 0.312 | 0.882 | 0.298 | 0.808 | 0.342
MSS | 0.813 | 0.178 | 0.726 | 0.204 | 0.817 | 0.177 | 0.871 | 0.192 | 0.875 | 0.203 | 0.779 | 0.245
CA | 0.83 | 0.248 | 0.774 | 0.282 | 0.815 | 0.254 | 0.853 | 0.229 | 0.872 | 0.237 | 0.784 | 0.31
FT | 0.684 | 0.241 | 0.593 | 0.267 | 0.682 | 0.25 | 0.82 | 0.206 | 0.79 | 0.235 | 0.661 | 0.291
AC | 0.74 | 0.186 | 0.548 | 0.239 | 0.721 | 0.19 | 0.831 | 0.206 | 0.756 | 0.227 | 0.668 | 0.265
LC | 0.696 | 0.229 | 0.586 | 0.277 | 0.654 | 0.246 | 0.827 | 0.204 | 0.771 | 0.233 | 0.627 | 0.296
OBJ | 0.839 | 0.306 | 0.75 | 0.359 | 0.822 | 0.323 | 0.87 | 0.269 | 0.907 | 0.262 | 0.818 | 0.337
BMS | 0.879 | 0.181 | 0.788 | 0.233 | 0.856 | 0.175 | 0.852 | 0.184 | 0.929 | 0.151 | 0.865 | 0.216
COV | 0.883 | 0.155 | 0.826 | 0.182 | 0.864 | 0.156 | 0.833 | 0.21 | 0.904 | 0.197 | 0.879 | 0.217
SS | 0.792 | 0.267 | 0.754 | 0.301 | 0.784 | 0.277 | 0.826 | 0.266 | 0.823 | 0.266 | 0.725 | 0.344
SIM | 0.797 | 0.414 | 0.727 | 0.412 | 0.783 | 0.429 | 0.833 | 0.384 | 0.808 | 0.388 | 0.734 | 0.433
SeR | 0.778 | 0.345 | 0.746 | 0.379 | 0.786 | 0.352 | 0.835 | 0.29 | 0.813 | 0.31 | 0.695 | 0.404
SUN | 0.746 | 0.31 | 0.674 | 0.319 | 0.708 | 0.349 | 0.789 | 0.307 | 0.778 | 0.306 | 0.623 | 0.396
SR | 0.741 | 0.175 | 0.676 | 0.2 | 0.688 | 0.181 | 0.769 | 0.22 | 0.736 | 0.232 | 0.633 | 0.266
GB | 0.882 | 0.229 | 0.815 | 0.261 | 0.857 | 0.24 | 0.839 | 0.242 | 0.902 | 0.222 | 0.865 | 0.263
AIM | 0.814 | 0.298 | 0.719 | 0.331 | 0.768 | 0.322 | 0.846 | 0.262 | 0.833 | 0.286 | 0.73 | 0.339
IT | 0.623 | 0.199 | 0.586 | 0.2 | 0.636 | 0.198 | 0.682 | 0.245 | 0.64 | 0.213 | 0.577 | 0.273
AVG | 0.849 | 0.248 | 0.797 | 0.343 | 0.814 | 0.288 | 0.736 | 0.405 | 0.857 | 0.26 | 0.863 | 0.276
Comparison of AUC scores (larger is better) and MAE scores (smaller is better). As in the table above, the subtitle of each column is in the [Dataset]-[Evaluation Metric] format, where [Dataset] is the initial letter of one of the 6 benchmarks {THUR15K, JuddDB, DUT-OMRON, SED2, MSRA10K, ECSSD}. Click a column title to re-rank the table by that metric.
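For completeness, here is a minimal NumPy sketch of the two scores under their usual definitions (the helper names are ours, not those of the released Matlab evaluation code): MAE is the mean per-pixel absolute difference between the continuous saliency map and the binary ground truth, and AUC is the area under the ROC curve traced by sweeping a threshold over the saliency map.

```python
import numpy as np


def mae(saliency, gt):
    # Mean absolute error between a [0,1] saliency map and a binary mask
    return float(np.abs(saliency - gt.astype(np.float64)).mean())


def auc(saliency, gt, n_thresholds=256):
    # Area under the ROC curve obtained by thresholding the saliency map
    gt = gt.astype(bool)
    pos, neg = max(gt.sum(), 1), max((~gt).sum(), 1)
    tpr, fpr = [0.0], [0.0]  # start at the (0, 0) corner of ROC space
    for t in np.linspace(1, 0, n_thresholds):  # high to low: rates increase
        pred = saliency >= t
        tpr.append(np.logical_and(pred, gt).sum() / pos)
        fpr.append(np.logical_and(pred, ~gt).sum() / neg)
    # Trapezoidal integration of TPR over FPR
    tpr, fpr = np.asarray(tpr), np.asarray(fpr)
    return float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0))
```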

Salient object detection datasets

Abbr. | Images | References
MSRA10K | 10,000 | Learning to Detect A Salient Object, IEEE CVPR 2007, Liu et al.; Frequency-tuned Salient Region Detection, IEEE CVPR 2009, Achanta et al.; Global Contrast based Salient Region Detection, IEEE TPAMI 2015, Cheng et al.
ECSSD | 1,000 | Hierarchical Saliency Detection, IEEE CVPR 2013, Yan et al.
THUR15K | 15,000 | SalientShape: Group Saliency in Image Collections, The Visual Computer 2013, Cheng et al.
JuddDB | 900 | What is a salient object? A dataset and a baseline model for salient object detection, arXiv ePrints
DUT-OMRON | 5,000 | Saliency Detection Via Graph-Based Manifold Ranking, IEEE CVPR 2013, Yang et al.
SED2 | 100 | Image segmentation by probabilistic bottom-up aggregation and cue integration, IEEE CVPR 2007, Alpert et al.

Information about different methods

Abbr. | Date | Run time (s) | Code | Booktitle | Paper title
MB+ | 2015 | – | C | IEEE ICCV | Minimum Barrier Salient Object Detection at 80 FPS
ST | 2014 | – | M | IEEE TIP | Saliency Tree: A Novel Saliency Detection Framework
QCUT | 2014 | – | M+C | ICPR | Automatic Object Segmentation by Quantum Cuts
HDCT | 2014 | 4.12 | M | IEEE CVPR | Salient Region Detection via High-dimensional Color Transform
RBD | 2014 | 0.269 | M | IEEE CVPR | Saliency Optimization from Robust Background Detection
GR | 2013 | 1.35 | M+C | Sig. Proc. Lett. | Graph-Regularized Saliency Detection With Convex-Hull-Based Center Prior
MNP | 2013 | 21.0 | M+C | The Vis. Comp. | Saliency for Image Manipulation
UFO | 2013 | 20.3 | M+C | IEEE ICCV | Salient Region Detection by UFO: Uniqueness, Focusness and Objectness
MC | 2013 | 0.195 | M+C | IEEE ICCV | Saliency Detection via Absorbing Markov Chain
DSR | 2013 | 10.2 | M+C | IEEE ICCV | Saliency Detection via Dense and Sparse Reconstruction
CHM | 2013 | 15.4 | M+C | IEEE ICCV | Contextual Hypergraph Modeling for Salient Object Detection
GC | 2013 | 0.037 | C | IEEE ICCV | Efficient Salient Region Detection with Soft Image Abstraction
LBI | 2013 | 251 | M+C | IEEE CVPR | Looking Beyond the Image: Unsupervised Learning for Object Saliency and Detection
PCA | 2013 | 4.34 | M+C | IEEE CVPR | What Makes a Patch Distinct?
DRFI | 2013 | 0.697 | M+C | IEEE CVPR | Salient Object Detection: A Discriminative Regional Feature Integration Approach
GMR | 2013 | 0.149 | C | IEEE CVPR | Saliency Detection via Graph-based Manifold Ranking
HS | 2013 | 0.528 | C | IEEE CVPR | Hierarchical Saliency Detection
LMLC | 2013 | 140 | M+C | IEEE TIP | Bayesian Saliency via Low and Mid Level Cues
SF | 2012 | 0.202 | C | IEEE CVPR | Saliency Filters: Contrast Based Filtering for Salient Region Detection
FES | 2011 | 0.096 | M+C | Image Ana. | Fast and Efficient Saliency Detection Using Sparse Sampling and Kernel Density Estimation
CB | 2011 | 2.24 | M+C | BMVC | Automatic salient object segmentation based on context and shape prior
SVO | 2011 | 56.5 | M+C | IEEE ICCV | Fusing Generic Objectness and Visual Saliency for Salient Object Detection
SWD | 2011 | 0.190 | M+C | IEEE CVPR | Visual saliency detection by spatially weighted dissimilarity
HC | 2011 | 0.017 | C | IEEE CVPR | Global Contrast based Salient Region Detection
RC | 2015 | 0.136 | C | IEEE TPAMI | Global Contrast based Salient Region Detection
SEG | 2010 | 10.9 | M | ECCV | Segmenting salient objects from images and videos
MSS | 2010 | 0.076 | C | IEEE ICIP | Saliency detection using maximum symmetric surround
CA | 2010 | 49.0 | M+C | IEEE CVPR | Context-aware saliency detection
FT | 2009 | 0.072 | C | IEEE CVPR | Frequency-tuned salient region detection
AC | 2008 | 0.129 | M | ICVS | Salient region detection and segmentation
LC | 2006 | 0.009 | C | ACM Multi. | Visual attention detection in video sequences using spatiotemporal cues
OBJ | 2010 | 3.01 | M+C | IEEE CVPR | What is an object?
BMS | 2013 | 0.575 | M+C | IEEE ICCV | Saliency Detection: A Boolean Map Approach
COV | 2013 | 25.4 | M | J. of Vis. | Visual saliency estimation by nonlinearly integrating features using region covariances
SS | 2012 | 0.053 | M | IEEE PAMI | Image Signature: Highlighting sparse salient regions
SIM | 2011 | 1.11 | M | IEEE CVPR | Saliency estimation using a non-parametric low-level vision model
SeR | 2009 | 1.31 | M | J. of Vis. | Static and space-time visual saliency detection by self-resemblance
SUN | 2008 | 3.56 | M | J. of Vis. | SUN: A bayesian framework for saliency using natural statistics
SR | 2007 | 0.040 | M | IEEE CVPR | Saliency detection: A spectral residual approach
GB | 2006 | 0.735 | M+C | NIPS | Graph-based visual saliency
AIM | 2009 | 8.66 | M | J. of Vis. | Saliency, attention, and visual search: An information theoretic approach
IT | 1998 | 0.302 | M | IEEE PAMI | A model of saliency-based visual attention for rapid scene analysis
Detailed information for each method. Regarding source code type: 'C' means C/C++, 'M' means Matlab, and 'M+C' means a mixture of Matlab and C/C++. A dash in the run-time column indicates that no run time is listed.

News

  1. 2015/4/24: evaluation results of QCUT have been added.
  2. 2015/10/24: evaluation results of MBD have been added.

