Geschlechtsübergreifende Maskulina im Sprachgebrauch: Eine korpusbasierte Untersuchung zu lexemspezifischen Unterschieden (Cross-gender masculines in language use: a corpus-based study of lexeme-specific differences)
By: Carolin Müller-Spitzer, Samira Ochs, Jan Oliver Rüdiger, and others
Potential Business Impact:
Shows how masculine nouns used for mixed or unspecified groups can make women less visible in language.
This study examines the distribution and linguistic characteristics of generic masculines (GM) in contemporary German press texts. The use of masculine personal nouns to refer to mixed-gender groups or unspecified individuals has been widely debated in academia and the public, with conflicting perspectives on its gender-neutrality. While psycholinguistic studies suggest that GM is more readily associated with male referents, corpus-based analyses of its actual use remain scarce. We investigate GM in a large corpus of press texts, focusing on lexeme-specific differences across different types of personal nouns. We conducted manual annotations of the whole inflectional paradigm of 21 personal nouns, resulting in 6,195 annotated tokens. Our findings reveal considerable differences between lexical items, especially between passive role nouns and prestige-related personal nouns. On a grammatical level, we find that GM occurs predominantly in the plural and in indefinite noun phrases. Furthermore, our data show that GM is not primarily used to denote entire classes of people, as has been previously claimed. By providing an empirical insight into the use of GM in authentic written language, we contribute to a more nuanced understanding of its forms and manifestations. These findings provide a solid basis for aligning linguistic stimuli in psycholinguistic studies more closely with real-world language use.
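To make the kind of analysis described above concrete, here is a minimal, purely illustrative Python sketch of how lexeme-specific GM shares and grammatical breakdowns could be computed from a manually annotated token table. The column names, annotation labels, and example lexemes are assumptions for illustration, not the authors' actual annotation scheme or data.

```python
# Hypothetical sketch: per-lexeme share of generic-masculine (GM) readings
# and a grammatical breakdown of GM tokens, computed from an annotated
# token table. All labels and example rows are invented for illustration.
import pandas as pd

# Each row stands for one annotated corpus token of a masculine personal noun.
tokens = pd.DataFrame({
    "lexeme":       ["Fußgänger", "Fußgänger", "Politiker", "Politiker", "Politiker"],
    "reading":      ["GM", "male-specific", "GM", "male-specific", "male-specific"],
    "number":       ["plural", "singular", "plural", "singular", "plural"],
    "definiteness": ["indefinite", "definite", "indefinite", "definite", "definite"],
})

# Share of GM readings per lexeme: the kind of lexeme-specific difference
# (e.g. passive role nouns vs. prestige-related nouns) the study reports.
gm_share = (
    tokens.assign(is_gm=tokens["reading"].eq("GM"))
          .groupby("lexeme")["is_gm"]
          .mean()
          .rename("gm_share")
)
print(gm_share)

# Cross-tabulation of GM tokens by number and definiteness, mirroring the
# grammatical-level finding (GM mostly plural and in indefinite NPs).
gm_tokens = tokens[tokens["reading"].eq("GM")]
print(pd.crosstab(gm_tokens["number"], gm_tokens["definiteness"]))
```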
Similar Papers
Beyond Content: How Grammatical Gender Shapes Visual Representation in Text-to-Image Models
Computation and Language
AI-generated images shift depending on a word's grammatical gender.
Integrating gender inclusivity into large language models via instruction tuning
Computation and Language
Tunes language models to produce gender-inclusive language.