Human perception and biosignal-based identification of posed and spontaneous smiles
Authors:
Monica Perusquía-Hernández (aff001); Saho Ayabe-Kanamura (aff003); Kenji Suzuki (aff002)
Author affiliations:
aff001: Communication Science Laboratories, NTT, Atsugi, Kanagawa, Japan
aff002: Artificial Intelligence Laboratory, University of Tsukuba, Tsukuba, Ibaraki, Japan
aff003: Faculty of Human Sciences, University of Tsukuba, Tsukuba, Ibaraki, Japan
Published in:
PLoS ONE 14(12)
Category:
Research Article
DOI:
https://doi.org/10.1371/journal.pone.0226328
Abstract
Facial expressions are behavioural cues that represent an affective state. Because of this, they are an unobtrusive alternative to affective self-report. The perceptual identification of facial expressions can be performed automatically with technological assistance. Once the facial expressions have been identified, the interpretation is usually left to a field expert. However, facial expressions do not always represent the felt affect; they can also be a communication tool. Therefore, facial expression measurements are prone to the same biases as self-report. Hence, the automatic measurement of human affect should also make inferences about the nature of the facial expressions instead of only describing facial movements. We present two experiments designed to assess whether such automated inferential judgment could be advantageous. In particular, we investigated the differences between posed and spontaneous smiles. The aim of the first experiment was to elicit both types of expressions. In contrast to other studies, the temporal dynamics of the elicited posed expressions were not constrained by the eliciting instruction. Electromyography (EMG) was used to discriminate between them automatically. Spontaneous smiles were found to differ from posed smiles in magnitude, onset time, and onset and offset speed, independently of the producer's ethnicity. Agreement between the expression type and the EMG-based automatic detection reached 94% accuracy. Finally, measurements of agreement between human video coders showed that although agreement on perceptual labels is fairly good, it worsens with inferential labels. A second experiment confirmed that laypersons' accuracy in distinguishing posed from spontaneous smiles is poor. Therefore, the automatic identification of inferential labels would benefit affective assessments and further research on this topic.
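The EMG-based discrimination described above relies on dynamic features of each smile episode. The following minimal Python sketch illustrates how the reported feature set (magnitude, onset time, onset speed, offset speed) could be extracted from a rectified, smoothed EMG envelope; the sampling rate, the 10% onset criterion, the cut-off value, and the rule-based classifier are illustrative assumptions, not the authors' published pipeline.

```python
# Hypothetical sketch: dynamic smile features from a rectified, smoothed EMG
# envelope. All numeric choices below are assumptions for illustration only.
import numpy as np

FS = 1000  # assumed EMG sampling rate in Hz


def smile_features(envelope: np.ndarray, fs: int = FS) -> dict:
    """Compute coarse dynamic features of one smile episode.

    `envelope` is a rectified, low-pass-filtered EMG amplitude trace covering
    a single episode (baseline -> apex -> return to baseline).
    """
    peak_idx = int(np.argmax(envelope))
    magnitude = float(envelope[peak_idx])           # apex amplitude
    threshold = 0.1 * magnitude                     # assumed 10% onset criterion

    above = envelope >= threshold
    onset_idx = int(np.argmax(above))               # first sample above threshold
    # last sample above threshold marks the end of the offset phase
    offset_end_idx = len(envelope) - 1 - int(np.argmax(above[::-1]))

    onset_time = (peak_idx - onset_idx) / fs        # seconds from onset to apex
    offset_time = (offset_end_idx - peak_idx) / fs  # seconds from apex to end
    onset_speed = (magnitude - threshold) / max(onset_time, 1e-6)
    offset_speed = (magnitude - threshold) / max(offset_time, 1e-6)

    return dict(magnitude=magnitude, onset_time=onset_time,
                onset_speed=onset_speed, offset_speed=offset_speed)


def classify(features: dict) -> str:
    """Toy rule: assumes onset dynamics separate the two classes.

    Both the direction of the rule and the cut-off are purely illustrative;
    a real system would learn them from labelled data.
    """
    slow_onset = features["onset_time"] > 0.5       # illustrative cut-off (s)
    return "spontaneous" if slow_onset else "posed"


if __name__ == "__main__":
    t = np.linspace(0, 4, 4 * FS)
    demo = np.exp(-((t - 2.0) ** 2) / 0.5)          # synthetic bell-shaped episode
    feats = smile_features(demo)
    print(feats, classify(feats))
```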
Keywords:
Behavior – Emotions – Face recognition – Perception – Face – Ethnicities – Experimental design – Electromyography