Elena Damian,1 Bart Meuleman,1 Wim van Oorschot1
This study investigates the degree of transparency in cross-national research. It provides insight into which information is most likely to be omitted from empirical studies and the extent to which researchers report sufficient detail to allow readers to assess the quality of their studies or to replicate them.
The data set is composed of 305 comparative studies that were published between 1986 and 2016 in 1 of 29 sociology, political science, and cross-cultural psychology journals. First, we selected all journals from these fields that accept manuscripts on a broad variety of topics and publish comparative research. Second, we selected all articles that (1) use data from at least 1 of 7 international surveys that offer free data access (ie, Afrobarometer, Eurobarometer, European Social Survey, European Values Study, International Social Survey Program, World Values Survey, and Latinobarometer); (2) include 5 or more countries in their analyses; (3) use any type of comparative analysis; and (4) do not have a purely methodological aim. This selection resulted in 1007 studies, from which we drew a random sample of 305 articles. Third, we created a questionnaire and coded, for each article, what information regarding the empirical analysis was reported (eg, sampling design, description and measurement of the variables, and information about the data used for contextual variables).
We found that most studies include basic information about the empirical analysis: a description of the population sample (81%; n = 246 of 305), of the dependent variables (97%; n = 297 of 305), and of the contextual variables (95%; n = 191 of the 202 articles with variables from external sources); the exact questions used to measure the dependent variables (70%; n = 212 of 305); a list of the countries included in the study (89%; n = 271 of 305); and the final study sample size (82%; n = 249 of 305). However, fewer than half of the articles provide crucial information needed to assess the quality of the study, ie, information about the sample design (39%; n = 118 of 305), survey mode (16%; n = 50 of 305), response rate (9%; n = 28 of 305), use of weights (29%; n = 87 of 305), number of missing values (10%; n = 30 of 305) and their treatment (31%; n = 94 of 305), precise references to the data sources used to create the contextual variables (42%; n = 85 of 202), or data set version (18%; n = 55 of 305). In addition, of all 305 articles analyzed, only 2 provided full and accessible replication materials.
These preliminary results reveal that most cross-national studies published in sociology, political science, and cross-cultural psychology journals omit essential information needed to assess their quality.
1Centre for Sociological Research, University of Leuven, Leuven, Belgium, firstname.lastname@example.org
Conflict of Interest Disclosures:
The authors received no external funding beyond support from their employer (University of Leuven).
Role of the Funder/Sponsor:
The sponsor of the study had no role in the study design, data collection, data analysis, data interpretation, or writing of the report.