      Structural priming of code-switches in non-shared-word-order utterances: The effect of lexical repetition


          Abstract

          Code-switching is generally dispreferred at points of non-shared word order across a bilingual's two languages. In priming studies, this dispreference persists even following exposure to a code-switched non-shared-word-order utterance. The present study delves deeper into the scope of code-switching priming by investigating whether lexical repetition across target and prime, a factor known to boost structural priming, can increase code-switching at points of word order divergence. Afrikaans–English bilinguals (n = 46) heard prime sentences in which word order, lexical repetition, and switch position were manipulated and subsequently produced code-switched picture descriptions. The results show that lexical repetition boosts the priming of code-switching in a non-shared word order. The findings demonstrate that code-switching in production is affected by a dynamic interplay between factors both language-internal (i.e., word order) and language-external (i.e., priming, and specifically lexical repetition).
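As a rough illustration of the design described above, the three manipulated prime-sentence factors (word order, lexical repetition, and switch position) can be fully crossed. The level labels below are hypothetical, since the abstract does not spell them out:

```python
# Hypothetical sketch of the crossed prime-sentence design: the abstract
# names three manipulated factors but not their levels, so the level
# labels here are illustrative assumptions, not taken from the study.
from itertools import product

word_order = ["shared", "non-shared"]
lexical_repetition = ["repeated", "not-repeated"]
switch_position = ["early", "late"]  # assumed levels

conditions = list(product(word_order, lexical_repetition, switch_position))
print(len(conditions))  # 8 fully crossed prime conditions
```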

          Most cited references: 75

          Selected references:

          • Fitting Linear Mixed-Effects Models Using lme4
          • lmerTest Package: Tests in Linear Mixed Effects Models
          • Random effects structure for confirmatory hypothesis testing: Keep it maximal.

            Linear mixed-effects models (LMEMs) have become increasingly prominent in psycholinguistics and related areas. However, many researchers do not seem to appreciate how random effects structures affect the generalizability of an analysis. Here, we argue that researchers using LMEMs for confirmatory hypothesis testing should minimally adhere to the standards that have been in place for many decades. Through theoretical arguments and Monte Carlo simulation, we show that LMEMs generalize best when they include the maximal random effects structure justified by the design. The generalization performance of LMEMs including data-driven random effects structures strongly depends upon modeling criteria and sample size, yielding reasonable results on moderately sized samples when conservative criteria are used, but with little or no power advantage over maximal models. Finally, random-intercepts-only LMEMs used on within-subjects and/or within-items data from populations where subjects and/or items vary in their sensitivity to experimental manipulations always generalize worse than separate F1 and F2 tests, and in many cases, even worse than F1 alone. Maximal LMEMs should be the 'gold standard' for confirmatory hypothesis testing in psycholinguistics and beyond.
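The "keep it maximal" recommendation can be sketched on simulated data. The snippet below uses Python's statsmodels as a stand-in for R's lme4/lmerTest (an assumption of convenience; the cited packages are R libraries, and this is not the paper's actual analysis). It compares a random-intercepts-only model with a maximal model that also includes by-subject random slopes:

```python
# Sketch (not the study's analysis): simulate within-subject data where
# subjects vary in their sensitivity to a manipulation, then fit a
# random-intercepts-only LMEM and a "maximal" LMEM with by-subject
# random slopes, in the spirit of Barr et al.'s recommendation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_trials = 20, 30
subj = np.repeat(np.arange(n_subj), n_trials)
cond = np.tile([0, 1], n_subj * n_trials // 2)   # within-subject factor
b0 = rng.normal(0.0, 1.0, n_subj)[subj]          # random intercepts
b1 = rng.normal(0.0, 0.5, n_subj)[subj]          # random slopes
y = 2.0 + b0 + (1.0 + b1) * cond + rng.normal(0.0, 1.0, subj.size)
df = pd.DataFrame({"subject": subj, "condition": cond, "rt": y})

# Random-intercepts-only LMEM: the structure Barr et al. warn against
# for confirmatory tests when subjects vary in their sensitivity.
m_intercepts = smf.mixedlm("rt ~ condition", df, groups=df["subject"]).fit()

# Maximal LMEM: random intercepts *and* random slopes by subject.
m_maximal = smf.mixedlm("rt ~ condition", df, groups=df["subject"],
                        re_formula="~condition").fit()
print(m_maximal.params["condition"])  # fixed-effect estimate
```

Because the simulated subjects genuinely differ in slope, the maximal model's standard error for the condition effect reflects that between-subject variability; per the abstract above, the intercepts-only structure generalizes worse in exactly this situation.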

                Author and article information

                Journal: Bilingualism: Language and Cognition
                Publisher: Cambridge University Press (CUP)
                ISSN: 1366-7289 (print); 1469-1841 (online)
                Published: February 07 2023, pp. 1–14
                DOI: 10.1017/S1366728923000044
                © 2023; free to read
                License: http://creativecommons.org/licenses/by/4.0/ (CC BY 4.0)
