Table 1 Performance comparison of ECG signal quality algorithms

From: Quality estimation of the electrocardiogram using cross-correlation among leads

| Study | E1 | E2 | E3 |
|---|---|---|---|
| **Physionet/CinC Challenge (accuracy scores)** | | | |
| Xia et al. [3]^a | 0.932 | 0.914 | 0.845 |
| Clifford et al. [4] | 0.926 | – | – |
| Tat et al. [7] | 0.920 | – | – |
| Hayn et al. [8, 17] | 0.916 | 0.834 | 0.873 |
| Kalkstein et al. [5] | 0.912 | – | – |
| Jekova et al. [9, 18] | 0.908 | – | – |
| Zausender et al. [6] | 0.904 | – | – |
| Noponen et al. [10] | 0.900 | – | – |
| Moody [11] | 0.896 | 0.896 | 0.802 |
| Johannesen et al. [13, 20] | 0.880 | 0.880 | 0.791 |
| Langley et al. [12] | 0.868 | 0.868 | 0.814 |
| Chudacek et al. [14] | 0.828 | 0.833 | 0.872 |
| **Other studies (accuracy score in the test set^b)** | | | |
| Clifford et al. [16] | 0.970 | | |
| Xia et al. [1]^c | 0.951 | | |
| Langley et al. [19] | 0.914 | | |

The Physionet/CinC Challenge was divided into three events: event 1 (E1), where participants were not required to submit their code; event 2 (E2), where participants were required to submit their code; and event 3 (E3), where the open-source code of E2 was tested on a data set not available to participants. Accuracy scores for both E1 and E2 were calculated on Dataset B by the Challenge organizers (see "Methods" section), while for E3 the accuracy was calculated on Dataset C.
^a The score reported in [3] differs from the official entry [2].
^b The test set is different from the Physionet/CinC Challenge set.
^c The reported accuracy was calculated on the training set.
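For context on the metric tabulated above: accuracy is simply the fraction of records whose predicted quality label (acceptable vs. unacceptable, as in the Challenge annotations) matches the reference label. A minimal sketch — the label lists here are made-up illustrative data, not Challenge results:

```python
# Illustration of the accuracy score reported in Table 1:
# accuracy = (correctly classified records) / (total records).
# Labels: 1 = acceptable ECG quality, 0 = unacceptable. Data below is made up.

def accuracy(reference, predicted):
    """Fraction of records where the predicted label matches the reference."""
    if len(reference) != len(predicted):
        raise ValueError("label lists must have the same length")
    correct = sum(r == p for r, p in zip(reference, predicted))
    return correct / len(reference)

reference = [1, 1, 0, 1, 0, 1, 0, 1, 1, 0]   # annotators' quality labels
predicted = [1, 1, 0, 1, 1, 1, 0, 0, 1, 0]   # a hypothetical algorithm's output
print(f"accuracy = {accuracy(reference, predicted):.3f}")  # 8 of 10 correct -> 0.800
```

A score of 0.932 in the table thus means 93.2% of the evaluated recordings were classified in agreement with the reference annotations.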