When hearing or reading words and sentences in a second language (L2), we face many uncertainties about how the people and objects referred to are connected to one another. So what do we do under these conditions of uncertainty? Because relatively proficient L2 speakers have access to the grammar and lexicon of each language when comprehending words and sentences or when planning spoken utterances, and because recent research suggests that these linguistic systems are not entirely independent, a critical question arises about how knowledge of two languages affects basic aspects of language processing. In this article, I review how eye-tracking methodology has been used as a tool to address this question. I begin by discussing why eye-movement measures are useful in language processing research, and I describe one experimental paradigm developed to explore eye movements during reading. I then present recent developments in the use of eye tracking to study L2 spoken-language comprehension. I also highlight the importance of using multiple measures of online sentence processing by discussing results obtained with a moving-window task and with eye-tracking records while L2 speakers read syntactically ambiguous relative clauses. Next, I discuss research investigating syntactic processing when L2 speakers process mixed language. I end with suggestions for future research directions.