In medicine, before replacing an old device with a new one, we need to know whether the results of the old and new devices are similar. This is called determining the agreement between methods. In this paper, we first discuss various ways to determine the agreement between methods measuring continuous variables, including the t test, the correlation coefficient and the Bland-Altman plot. In the second part, we discuss methods to determine the agreement between categorical variables, such as the chi-squared test and Cohen's kappa. The latter are often used when studying the agreement between clinicians, definitions, formulas or different data sources. Copyright (c) 2012 S. Karger AG, Basel
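The Bland-Altman approach mentioned above summarizes agreement between two continuous measurement methods by the mean of the paired differences (the bias) and the 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch of that calculation, assuming paired measurements in matching order (function name and data are illustrative, not from any of the cited papers):

```python
from statistics import mean, stdev

def bland_altman_limits(a, b):
    """Bias and 95% limits of agreement for two measurement methods.

    a, b: paired measurements of the same subjects, in the same order.
    Returns (bias, lower_limit, upper_limit).
    """
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)                # mean difference between methods
    sd = stdev(diffs)                 # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Example with made-up paired readings from an "old" and a "new" device
old = [10.0, 12.0, 11.0, 9.0, 10.0]
new = [9.0, 11.0, 12.0, 10.0, 9.0]
bias, lo, hi = bland_altman_limits(old, new)
```

In the full Bland-Altman plot, each difference is plotted against the pair's mean, with horizontal lines at the bias and the two limits; the numbers above give those three lines.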
This classic methods paper (Bland and Altman, 2010) considers the assessment of agreement between me...
In a contemporary clinical laboratory it is very common to have to assess the agreement between two ...
Abstract: Kappa statistics are used for the assessment of agreement between two or more raters when th...
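Cohen's kappa, referenced in several of the abstracts above, corrects the observed proportion of agreement between two raters for the agreement expected by chance. A minimal sketch for two raters and categorical labels (illustrative code, not from the cited papers):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters assigning one categorical label per subject.

    r1, r2: lists of labels, same subjects in the same order.
    """
    n = len(r1)
    # Observed agreement: fraction of subjects both raters labelled identically
    po = sum(x == y for x, y in zip(r1, r2)) / n
    # Chance agreement: product of each rater's marginal label frequencies
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (po - pe) / (1 - pe)  # undefined if chance agreement is perfect (pe == 1)
```

For example, with ratings ['a', 'a', 'b', 'b'] and ['a', 'a', 'b', 'a'], observed agreement is 0.75, chance agreement is 0.5, and kappa is 0.5.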
Background: Accurate values are a must in medicine. An important parameter in determining the qualit...
Before new tests are implemented, it is important to compare their results with those of other measu...
Background: Currently, we are not aware of a method to assess graphically on one simple plot agreeme...
Correlation and agreement are 2 concepts that are widely applied in the medical literature and clini...
With advances in medical technology, simpler and safer methods for diagnosis and therapy are increas...
INTRODUCTION AND OBJECTIVE: Most of the important variables measured in medicine are in numerical forms ...
Agreement between measurements refers to the degree of concordance between two (or more) sets of mea...
In clinical measurement comparison of a new measurement technique with an established one is often n...
Measuring agreement between qualified experts is commonly used to determine the effectiveness of a ...
A common goal in radiological studies is the search for alternatives for image processing and analys...
Abstract: Agreement measures are used frequently in reliability studies that involve categorical data....