Smartphone Image Denoising Dataset

Abdelrahman Abdelhamed1             Stephen Lin2             Michael S. Brown1

1York University             2Microsoft Research

SIDD Benchmark

A new version of SIDD Benchmark (SIDD+) is being hosted as a challenge at the New Trends in Image Restoration and Enhancement (NTIRE 2020) workshop in conjunction with CVPR 2020.
The participating solutions and results will be published in the challenge report in the CVPR 2020 Workshop proceedings.

Challenges can be accessed at the following Codalab competitions:
NTIRE 2020 Real Image Denoising Challenge - Track 1 - rawRGB
NTIRE 2020 Real Image Denoising Challenge - Track 2 - sRGB

Download

SIDD Benchmark Code v1.2 (9 KB) [ Mirror 1 | Mirror 2 | Mirror 3 | Mirror 4 ]

SIDD Benchmark Data as single .mat arrays of dimensions [#images, #blocks, height, width, #channels]:
Noisy raw-RGB data: [ Mirror 1 | Mirror 2 ]
Noisy sRGB data: [ Mirror 1 | Mirror 2 ]

SIDD Validation Data and Ground Truth as single .mat arrays of dimensions [#images, #blocks, height, width, #channels]:
Noisy raw-RGB data: [ Mirror 1 | Mirror 2 ]
Noisy sRGB data: [ Mirror 1 | Mirror 2 ]
Ground-truth raw-RGB data: [ Mirror 1 | Mirror 2 ]
Ground-truth sRGB data: [ Mirror 1 | Mirror 2 ]
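A minimal sketch of loading and indexing one of these arrays with scipy. The variable key inside the .mat file is an assumption here (inspect the file's keys to confirm), and very large files saved in MATLAB v7.3 format may need h5py instead of scipy. To keep the example runnable, it first writes a tiny stand-in array with the documented layout:

```python
import numpy as np
import scipy.io as sio

# Stand-in array with the documented layout
# [#images, #blocks, height, width, #channels], so the example runs end-to-end.
demo = np.random.rand(2, 3, 8, 8, 3).astype(np.float32)
sio.savemat("demo_blocks.mat", {"BenchmarkNoisyBlocksSrgb": demo})

# "BenchmarkNoisyBlocksSrgb" is an assumed key; check sio.loadmat(...).keys().
data = sio.loadmat("demo_blocks.mat")["BenchmarkNoisyBlocksSrgb"]
img0_block1 = data[0, 1]  # one (height, width, #channels) block of image 0
print(data.shape, img0_block1.shape)
```

Indexing the first two axes selects a single block of a single scene, which is the unit the benchmark evaluates on.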

SIDD Benchmark Data (full-frame images, 1.84 GB) [ Mirror 1 | Mirror 2 | Mirror 3 | Mirror 4 ]
MD5: decd113eaf99a8dbd1dbb7f7c9dafedd   SHA1: b8092d990139f41b6da97b4afa679a2876de53bd
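A downloaded archive can be checked against the published digests before use; a minimal sketch using Python's standard hashlib (the file path is a placeholder):

```python
import hashlib

def file_digests(path, chunk_size=1 << 20):
    """Compute MD5 and SHA1 of a file in one streaming pass."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            md5.update(chunk)
            sha1.update(chunk)
    return md5.hexdigest(), sha1.hexdigest()

# Example: verify the full-frame benchmark archive (placeholder file name).
# md5, sha1 = file_digests("SIDD_Benchmark_Data.zip")
# assert md5 == "decd113eaf99a8dbd1dbb7f7c9dafedd"
```

Streaming in 1 MB chunks keeps memory flat even for the 1.84 GB archive.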

Description

The SIDD Benchmark consists of 40 images representing 40 scene instances. These images can be used to benchmark denoising methods.

For each image, the following is provided in one directory:

  1. Noisy raw-RGB image (.MAT). Black level subtracted, values normalized to [0, 1].
  2. Noisy sRGB image (.PNG). Gamma corrected, without any tone mapping.
  3. Metadata extracted from the DNG file (.MAT). For example, black and saturation levels, as-shot neutral, noise level function, etc.
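The raw-RGB normalization described in item 1 can be reproduced from the black and saturation levels in the metadata; a minimal numpy sketch (the 12-bit level values below are illustrative, not taken from any specific camera):

```python
import numpy as np

def normalize_raw(raw, black_level, saturation_level):
    """Subtract the black level and scale raw values to [0, 1]."""
    raw = raw.astype(np.float64)
    out = (raw - black_level) / (saturation_level - black_level)
    # Values below the black level are clipped to 0.
    return np.clip(out, 0.0, 1.0)

# Illustrative 12-bit sensor: black level 64, saturation level 4095.
raw = np.array([[64, 2079], [4095, 32]], dtype=np.uint16)
print(normalize_raw(raw, 64, 4095))
```

The clip handles pixels that fall below the black level due to read noise, which otherwise would map to negative values.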

The PSNR and SSIM values are calculated only on 32 blocks of size 256 by 256 pixels. The block positions are provided in a file named "BenchmarkBlocks32.mat".
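The block-wise PSNR evaluation can be sketched as follows. The block coordinates here are illustrative stand-ins for those in BenchmarkBlocks32.mat, under the assumption that each entry gives a block's top-left corner; the example uses tiny 4x4 blocks instead of 256x256 so it runs on toy data:

```python
import numpy as np

def psnr(ref, noisy, peak=1.0):
    """PSNR in dB for images normalized to [0, peak]."""
    mse = np.mean((ref.astype(np.float64) - noisy.astype(np.float64)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def mean_block_psnr(ref, noisy, blocks, size=256):
    """Average PSNR over blocks given as (top, left) corners."""
    scores = []
    for top, left in blocks:
        r = ref[top:top + size, left:left + size]
        n = noisy[top:top + size, left:left + size]
        scores.append(psnr(r, n))
    return float(np.mean(scores))

# Toy example: two 4x4 blocks of a 16x16 image.
rng = np.random.default_rng(0)
ref = rng.random((16, 16))
noisy = np.clip(ref + 0.01 * rng.standard_normal((16, 16)), 0.0, 1.0)
print(mean_block_psnr(ref, noisy, [(0, 0), (8, 8)], size=4))
```

SSIM is averaged over the same blocks; for that, an off-the-shelf implementation such as scikit-image's structural_similarity can be substituted for the psnr helper.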

Follow the instructions in the Code_v/_ReadMe.txt file to prepare your results for submission.

Upload your results

Click here to submit your results

Benchmark Results

Results of the NTIRE 2019 Challenge on Real Image Denoising can be found in this paper.

The following tables show the benchmark results published in the paper.

Results of denoising in raw-RGB space

Method  PSNR (Raw/Raw)  PSNR (Raw/sRGB)  SSIM (Raw/Raw)  SSIM (Raw/sRGB)  Time (seconds)
(PSNR and SSIM are listed per "applied on / evaluated on" space; a time of -1.00 indicates the runtime was not reported.)
BM3D 45.52 30.95 0.980 0.863 34.3
NLM 44.06 29.39 0.971 0.846 210.7
KSVD 43.26 27.41 0.969 0.832 2243.9
KSVD-DCT 42.70 28.21 0.970 0.784 133.3
KSVD-G 42.50 28.13 0.969 0.781 153.6
LPG-PCA 42.79 30.01 0.974 0.854 438.1
FoE 43.13 27.18 0.969 0.812 6097.2
MLP 43.17 27.52 0.965 0.788 131.2
WNNM 44.85 29.54 0.975 0.888 1975.8
GLIDE 41.87 25.98 0.949 0.816 12440.5
TNRD 42.77 26.99 0.945 0.744 15.2
EPLL 40.73 25.19 0.935 0.842 653.1
DnCNN 43.30 28.24 0.965 0.829 51.7
Benchmark01 40.48 26.94 0.910 0.623 0.07
Noise 36.75 23.68 0.840 0.480 23.36
aRID 48.05 35.78 0.980 0.902 23.36
MCU-Net 48.80 36.54 0.990 0.875 158.84
HWD 48.97 35.29 0.990 0.903 -1.00
BOE-IOT-AIBD 51.56 39.01 0.990 0.952 0.40
RDUNet 51.96 39.52 0.990 0.957 16.45
DnCNN_Denoise_ours 47.65 34.50 0.980 0.892 -1.00
DnCNN_Denoise_real 47.35 34.04 0.980 0.885 -1.00
Resnet_Denoise_real 47.72 34.97 0.980 0.901 -1.00
BOE-IOT-AIBD 51.87 39.48 0.990 0.956 15.91
HWD_combine 48.94 35.40 0.990 0.896 -1.00
AAC_TECHNOLOGIES 51.71 39.30 0.990 0.955 -1.00
MLRI+1.5CBM3D 48.45 35.31 0.990 0.912 -1.00
RCNN+1.5CBM3D 48.36 35.24 0.990 0.906 -1.00
RCNN+1.5CBM3D (VST) 48.56 35.51 0.980 0.914 -1.00
MLRI+1.5CBM3D (VST) 49.48 36.06 0.990 0.922 -1.00
MLRI+1.5CBM3D 49.43 36.06 0.990 0.922 -1.00
Bitonic MX 49.41 37.25 0.990 0.934 8.91
CycleISP-public-weights 47.98 35.02 0.950 0.846 0.20
test_e 51.43 38.83 0.990 0.952 -1.00
b2ub 50.81 38.00 0.990 0.943 0.88
Regueb 36.75 23.68 0.840 0.480 -1.00

Results of denoising in sRGB space

Method  PSNR (sRGB/sRGB)  SSIM (sRGB/sRGB)  Time (seconds)
(A time of -1.00 indicates the runtime was not reported.)
BM3D 25.65 0.685 27.4
NLM 26.75 0.699 621.9
KSVD 26.88 0.842 9881.0
KSVD-DCT 27.51 0.780 96.3
KSVD-G 27.19 0.771 92.2
LPG-PCA 24.49 0.681 2004.3
FoE 25.58 0.792 12166.8
MLP 24.71 0.641 564.8
WNNM 25.78 0.809 8882.2
GLIDE 24.71 0.774 36091.6
TNRD 24.73 0.643 45.1
EPLL 27.11 0.870 1996.4
DnCNN 23.66 0.583 158.9
CBDNet 33.28 0.868 4.48
DeepProxies BM3D 34.34 0.911 6.69
mwresnet 38.52 0.949 47.73
mwresnet 39.31 0.956 77.34
mwresnet 39.64 0.958 25.14
Path-Restore 38.21 0.946 0.89
HT-MWResnet 39.80 0.959 63.45
test 39.78 0.958 -1.00
test2 37.97 0.942 -1.00
test3 37.97 0.942 -1.00
Benchmark08 25.50 0.559 0.02
BT02 36.71 0.924 30.00
BT02_2 38.34 0.946 30.00
KPCN 38.60 0.948 0.00
VDN 39.26 0.955 0.15
URDNet_boostnet 38.88 0.952 2.22
UNet_D 38.88 0.952 2.28
URDNet_genet 38.88 0.952 2.22
UNet_D 37.92 0.944 2.24
BoostNet 38.57 0.950 2.22
UNet_ND 38.88 0.952 2.27
noisy_test 23.70 0.480 0.00
DNW-AMC 35.76 0.909 42.00
DNW-MNV1 38.12 0.947 42.00
DNW-Baseline 38.30 0.948 42.00
HI-GAN 38.88 0.952 2.27
AINDNet+TF* 39.08 0.953 0.00
FAN 3.43 0.136 -1.00
FAN 39.33 0.956 -1.00
DANet+ 39.43 0.956 0.09
DANet 39.25 0.955 0.09
Noise2Blur 34.64 0.926 0.06
DnCNN 30.71 0.695 0.00
DnCNN+ 32.59 0.861 0.00
GMSNet-A 39.51 0.958 0.00
noisy 23.70 0.480 0.04
GMSNet-B 39.69 0.958 0.00
RDB-Net 27.57 0.681 -1.00
RDB-Net 38.11 0.945 -1.00
MWUnet 17.28 0.312 -1.00
MWUnet 38.76 0.952 -1.00
unsupervised-30k 34.56 0.897 -1.00
CLeanToN-DnCNN 34.00 0.907 0.36
CLeanToN-DIDN_1 35.35 0.937 9.30
PseudoISP(PT-MWRN) 39.92 0.959 57.03
WTYNet_v1 39.29 0.956 -1.00
N2V 13.77 0.240 -1.00
n2v 13.77 0.240 -1.00
n2v 27.68 0.668 -1.00
RND_Base 38.48 0.950 -1.00
ADANI 37.64 0.944 0.11
RND_Jacobian 37.17 0.937 -1.00
RND_Jacobian2 38.69 0.951 -1.00
MoD-NAS 39.29 0.955 0.00
RND_Jacobian_200 38.69 0.951 -1.00
RND_JacobianV1 38.65 0.951 -1.00
Jacobian_RDN 38.62 0.951 -1.00
Jacobian_RDN_Big 38.63 0.950 -1.00
My_DNCNN 37.21 0.936 -1.00
InvDN 39.28 0.955 0.00
RND_Jacobian_Big 38.63 0.950 -1.00
My_FFDNet 38.28 0.948 -1.00
MWU_net 39.20 0.954 -1.00
ffdnet 38.27 0.948 -1.00
XGHdenoiser 18.72 0.508 0.28
submit_MWRN_e300_mat 39.71 0.958 57.03
Deam 39.35 0.955 -1.00
Deam+ 39.43 0.956 -1.00
FHGZ 38.53 0.948 1.91
Test 39.25 0.955 0.00
Res_UNet 39.48 0.956 0.05
STWave_2 39.49 0.956 0.05
STWave_22 39.49 0.956 0.05
mytest 36.89 0.938 -1.00
test(IDNet) 12.34 0.008 -1.00
test(IDNet) 12.34 0.008 0.00
test(IDNet) 39.22 0.955 0.00
PDTNet+ 39.35 0.957 -1.00
u2u-2-zz 13.00 0.091 0.00
u2u-2-z3 39.39 0.957 -1.00
u2u-2-z4 39.58 0.958 0.00
PPDNet 39.69 0.958 0.00
test_test 39.14 0.954 0.00
Bitonic MX 36.67 0.933 42.24
BM3D G 36.20 0.929 17.25
skip1 38.63 0.954 -1.00
DnWSTrans 39.65 0.958 0.05
TSAFormerV1 5.57 0.363 7.99
TSAFormerV1 39.13 0.956 8.17
BOE-IOT-AIBD_2021 39.83 0.959 4.33
NLCN 35.84 0.922 -1.00
Thunder 39.47 0.957 0.00
test 38.27 0.947 -1.00
testnet5_all_to_r_0.3_0.5 39.27 0.955 -1.00
jtfrn_3cancha 39.09 0.953 -1.00
ujft_down 39.33 0.955 -1.00
svid250 34.32 0.922 0.10
Feature_Ensamble 18.72 0.521 -1.00
Feature_Ensamble 38.95 0.953 -1.00
Scaoed_2 39.44 0.956 -1.00
Scaoed_1 39.44 0.956 -1.00
Scaoed_2 39.48 0.957 -1.00
LPID_trial1012_68 39.24 0.955 -1.00
test-qy-ql 39.72 0.959 -1.00
test2-qy-ql 39.90 0.959 -1.00
MyNet 39.35 0.955 -1.00
CVF-SID (T) 34.43 0.912 0.04
CVF-SID (S) 34.51 0.916 0.06
CVF-SID (S^2) 34.71 0.913 0.06
carl_SIDD 12.34 0.008 -1.00
Inv_multi 39.20 0.955 0.00
resize_4level_grad 39.47 0.957 -1.00
resize_4level_grad0110 39.47 0.957 -1.00
svid250 35.87 0.928 0.09
DeamNetforNNZS 31.67 0.837 -1.00
DeamNet 29.03 0.709 -1.00
chenzj_selfdn 35.23 0.916 -1.00
chenzj_selfdn_2 35.12 0.916 -1.00
chenzj_selfdn_2.5 35.37 0.919 -1.00
chenzj_selfdn_3 35.13 0.919 -1.00
chenzj_selfdn_3.5 35.58 0.925 -1.00
chenzj_selfdn_4 23.70 0.480 -1.00
nonlocal 34.44 0.910 -1.00
learning 36.96 0.935 -1.00
image processing 36.96 0.935 -1.00
image 36.96 0.935 -1.00
NLNet 38.28 0.947 -1.00
NLNet620K 38.38 0.948 -1.00
NLNet550K 38.17 0.946 -1.00
NLNet670 38.48 0.948 -1.00
NLNet695K 38.29 0.947 -1.00
chenzj_selfdn_6 35.88 0.925 -1.00
NLNet650K 38.22 0.948 -1.00
NLNet680 38.46 0.948 -1.00
NAFNet 40.15 0.960 -1.00
MMBP_try 23.83 0.123 -1.00
Daformer 39.96 0.960 -1.00
new_chenzj_selfdn_1 35.94 0.921 -1.00
NLNet600 38.20 0.946 -1.00
learning600 38.20 0.946 -1.00
DnCNN 37.73 0.941 -1.00
chenzj_new_selfdn_2 35.88 0.926 -1.00
chenzj_selfdn_final 35.94 0.925 -1.00
APBSN-author 35.97 0.925 -1.00
APBSN-my 37.05 0.934 -1.00
SSVQ 34.53 0.904 3625944.35
SSVQ 34.62 0.906 3630752.07
SSVQ 34.88 0.908 3592136.16
SSVQ 35.03 0.907 3595889.02
LPIENet 37.73 0.943 -1.00
demo1 35.05 0.907 -1.00
demo2 35.38 0.911 -1.00
demo3 35.38 0.911 -1.00
demo3 36.39 0.932 -1.00
NAGNet_RIDNet_NIL 13.17 0.037 -1.00
WTV_Net 37.55 0.943 -1.00
WTV_Net 18.81 0.528 -1.00
WTV_Net 37.31 0.940 -1.00
RS_WTV 38.65 0.951 -1.00
RS_WTV 38.75 0.952 -1.00
RS_WTV 38.70 0.952 -1.00
ADFNet 39.63 0.958 -1.00
AAP (S)(E) 37.00 0.933 -1.00
AAP (S) 36.64 0.920 -1.00
AAP 36.54 0.919 -1.00
Noise2Code 35.85 0.921 3251749.52