
The promise of open survey questions—The validation of text-based job satisfaction measures


Authors: Indy Wijngaards aff001;  Martijn Burger aff001;  Job van Exel aff002
Author affiliations: Erasmus Happiness Research Organization, Erasmus University Rotterdam, Rotterdam, the Netherlands aff001;  Erasmus School of Health & Policy Management, Erasmus University Rotterdam, Rotterdam, the Netherlands aff002;  Erasmus School of Economics and Tinbergen Institute, Erasmus University Rotterdam, Rotterdam, the Netherlands aff003
Published in: PLoS ONE 14(12), 2019
Category: Research Article
DOI: https://doi.org/10.1371/journal.pone.0226408

Abstract

Recent advances in computer-aided text analysis (CATA) have allowed organizational scientists to construct reliable and convenient measures from open texts. As yet, there is a lack of research into using CATA to analyze responses to open survey questions and into constructing text-based measures of psychological constructs. In our study, we demonstrated the potential of CATA methods for constructing text-based job satisfaction measures from responses to a completely open and a semi-open question. To do this, we employed three sentiment analysis techniques (Linguistic Inquiry and Word Count 2015, SentimentR, and SentiStrength) and quantified the forms of measurement error they introduced: specific factor error, algorithm error, and transient error. We conducted an initial test of the text-based measures’ validity, assessing their convergence with closed-question job satisfaction measures. We adopted a time-lagged survey design (N = 996 in wave 1; N = 116 in wave 2) to test our hypotheses. In line with our hypotheses, we found that specific factor error is higher in the open question text-based measure than in the semi-open question text-based measure. As expected, algorithm error was substantial for both the open and semi-open question text-based measures. Transient error in the text-based measures was higher than expected, as it generally exceeded the transient error in the human-coded and the closed job satisfaction question measures. Our initial test of convergent and discriminant validity indicated that the semi-open question text-based measure is especially suitable for measuring job satisfaction. Our article ends with a discussion of limitations and an agenda for future research.
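The three techniques named above are all lexicon-based sentiment scorers: each word in a response is looked up in a polarity dictionary and the matched values are aggregated into a single score. The following is a minimal sketch of that idea in Python; the tiny lexicon, the tokenization, and the averaging rule are illustrative assumptions, not the actual LIWC 2015, SentimentR, or SentiStrength implementations (which add negation handling, amplifiers, and much larger dictionaries).

```python
# Minimal lexicon-based sentiment scorer: average polarity of matched words.
# The lexicon below is a toy example, not a real LIWC/SentimentR dictionary.
LEXICON = {
    "enjoy": 1.0, "great": 1.0, "supportive": 0.8,
    "stress": -0.8, "boring": -1.0, "underpaid": -0.9,
}

def sentiment(text: str) -> float:
    """Mean polarity of lexicon words found in the response (0.0 if none match)."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    hits = [LEXICON[t] for t in tokens if t in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

# A mixed open-question response: one positive and one negative match cancel out.
print(sentiment("I enjoy my colleagues but the work is boring."))
```

A score near zero for the mixed response above illustrates one source of what the article calls algorithm error: averaging can mask opposing sentiments within a single answer, which richer scorers try to mitigate with clause-level analysis and valence shifters.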

Keywords:

Employment – Jobs – Algorithms – Surveys – Emotions – Research validity – Software tools – Labor studies



