Date of Completion

8-23-2016

Embargo Period

8-10-2016

Keywords

critical evaluation, online reading, disciplinary literacy, scientific literacy, prior knowledge, gender, socioeconomic status, offline reading

Major Advisor

Donald J. Leu

Associate Advisor

Mary Anne Doyle

Associate Advisor

Michael Coyne

Associate Advisor

Christopher Rhoads

Field of Study

Educational Psychology

Degree

Doctor of Philosophy

Open Access

Open Access

Abstract

This study investigated how seventh-grade students performed on a measure of online critical evaluation in science, the Online Research and Comprehension Assessment (ORCA). The analysis included evaluating the extent to which critical evaluation also appeared to be an aspect of the other elements of online research and comprehension: reading to locate information, reading to synthesize information, and reading and writing to communicate information. Additionally, this study examined the extent to which several important individual difference variables affected students’ ability to critically evaluate information during online reading in science: prior knowledge, gender, socioeconomic status, and offline reading ability. Participants (n = 1,434) were seventh-grade students from two states in the northeastern United States.

A multiple theoretical perspectives approach (Labbo & Reinking, 1999) was used to frame the study. Three perspectives were employed: theories of offline reading (RAND Reading Study Group, 2002; Anderson & Pearson, 1984) and online reading (Leu, Kinzer, Coiro, Cammack, & Henry, 2013); perspectives on individual differences (Afflerbach, 2015); and a disciplinary literacy framework for science (Shanahan & Shanahan, 2008). These perspectives were integrated to form the basis of a framework for the critical evaluation of online information in science, one that takes into account the role of individual differences in the reading comprehension process.

Multiple regression analysis was used to evaluate the shared variance between critical evaluation and the three other skill areas. Multilevel modeling (MLM) was used to compare mean differences in scores between critical evaluation and the other three skill areas, and also to evaluate the effects of the four individual difference variables on students’ online critical evaluation abilities. Both student-level and school-level effects were evaluated. Findings suggest that critical evaluation is a somewhat unique and difficult dimension of online research and comprehension. Findings also suggest that student-level prior knowledge, gender, and offline reading ability, as well as school means for offline reading, have a significant effect on students’ ability to evaluate online information in science. Results are discussed in the context of theory development, research, assessment, and instruction.
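To illustrate the general shape of this analytic approach, the sketch below shows how a multiple regression of shared variance and a random-intercept multilevel model with student-level predictors and a school-mean predictor might be specified. It is not the author's analysis; the column names (critical_eval, locate, synthesize, communicate, prior_knowledge, gender, ses, offline_reading, school_id) and the data file are hypothetical, and Python's statsmodels is used only as one possible tool.

```python
# Illustrative sketch only: column names and the data file are hypothetical,
# not taken from the study.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per student, with a school identifier.
df = pd.read_csv("students.csv")

# Multiple regression: shared variance between critical evaluation and the
# other three skill areas (locate, synthesize, communicate).
ols_fit = smf.ols(
    "critical_eval ~ locate + synthesize + communicate", data=df
).fit()
print(ols_fit.summary())

# Multilevel model: students nested in schools, with student-level predictors
# and a school-mean offline reading score as a school-level predictor.
df["school_mean_reading"] = (
    df.groupby("school_id")["offline_reading"].transform("mean")
)
mlm_fit = smf.mixedlm(
    "critical_eval ~ prior_knowledge + gender + ses + offline_reading"
    " + school_mean_reading",
    data=df,
    groups=df["school_id"],
).fit()
print(mlm_fit.summary())
```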
