Data analysis
To assess inter-observer agreement, we calculated the kappa
coefficient weekly for the interpretation of the alveolar pattern; the
kappa coefficient is widely reported in the literature as a standard
measure of inter-observer agreement. Fleiss' kappa was used to assess
agreement on the chest X-ray findings among the three raters.
Subsequently, we compared each radiologist's findings with those of the
gold-standard radiologist using Cohen's kappa, to identify the
radiologist with the lowest agreement with the gold standard. Descriptive
analysis of the variables, including radiological findings and relevant
clinical variables, was performed using frequencies and measures of
central tendency and dispersion. To explore potential associations
between radiological findings and clinical variables, we conducted
bivariate analysis using Pearson's chi-square test, Fisher's exact test,
or the Kruskal-Wallis rank-sum test, as appropriate for the type of
data. All analyses were performed in R.
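The study's analyses were run in R (the specific packages are not stated here). Purely as an illustration of what these agreement coefficients compute, the formulas behind Fleiss' kappa (multi-rater, per-subject category counts) and unweighted Cohen's kappa (two raters) can be sketched in plain Python; the data below are hypothetical, not the study's:

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Unweighted Cohen's kappa between two raters' label sequences."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from each rater's marginal label frequencies.
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[cat] * c2[cat] for cat in c1) / n ** 2
    return (observed - expected) / (1 - expected)

def fleiss_kappa(counts):
    """Fleiss' kappa. counts[i][j] = number of raters who assigned
    subject i to category j; every subject is rated by the same n raters."""
    N = len(counts)            # number of subjects
    n = sum(counts[0])         # raters per subject
    k = len(counts[0])         # number of categories
    # Mean per-subject agreement.
    p_bar = sum((sum(c * c for c in row) - n) / (n * (n - 1))
                for row in counts) / N
    # Chance agreement from the overall category proportions.
    p_e = sum((sum(row[j] for row in counts) / (N * n)) ** 2
              for j in range(k))
    return (p_bar - p_e) / (1 - p_e)

# Toy example: three raters score alveolar pattern present (column 0)
# or absent (column 1) on five hypothetical films.
films = [[3, 0], [3, 0], [0, 3], [0, 3], [2, 1]]
print(fleiss_kappa(films))                    # ≈ 0.73 for this toy table
print(cohen_kappa([1, 1, 0, 0], [1, 0, 0, 0]))  # 0.5
```

In practice one would use an established implementation (e.g., the R packages the authors used, or library routines in other languages) rather than hand-rolled code; the sketch only makes the observed-versus-chance-agreement structure of both coefficients explicit.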
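The bivariate tests likewise reduce to standard contingency-table computations. A minimal Python sketch of two of them, again for illustration only (the chi-square function returns the test statistic, not a p-value; Fisher's exact test is shown for the 2×2 case; Kruskal-Wallis is omitted for brevity):

```python
from math import comb

def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    # Sum of (observed - expected)^2 / expected over all cells.
    return sum((table[i][j] - row[i] * col[j] / n) ** 2
               / (row[i] * col[j] / n)
               for i in range(len(row)) for j in range(len(col)))

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum of hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed one."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    def prob(k):  # probability that cell (1,1) equals k, margins fixed
        return comb(r1, k) * comb(r2, c1 - k) / comb(n, c1)
    p_obs = prob(a)
    return sum(prob(k)
               for k in range(max(0, c1 - r2), min(r1, c1) + 1)
               if prob(k) <= p_obs * (1 + 1e-9))

# Hypothetical 2x2 table, e.g. finding present/absent vs. outcome.
table = [[8, 2], [1, 5]]
print(chi_square_stat(table))       # chi-square statistic ≈ 6.11
print(fisher_exact_2x2(8, 2, 1, 5)) # two-sided p ≈ 0.035
```

The usual decision rule (small expected cell counts favor Fisher's exact test over the chi-square approximation; ordinal or non-normal numeric outcomes across groups favor Kruskal-Wallis) matches the "as appropriate for the type of data" choice described above.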