Education reforms outlined in the National Science Education Standards (NRC, 1996) emphasize the importance of teaching for understanding. The assessment of understanding requires performances or artifacts that demonstrate the connections learners make between concepts. Yet the range and nature of these tasks in specific science contexts are not well understood. Analysis of student work is therefore important for determining overall educational value, identifying misconceptions or gaps in conceptual understanding, and assessing students' achievement levels.

This dissertation examined ninth-grade, student-generated artifacts produced during an 18-week project-based science curriculum in which students engaged in an interdisciplinary (biology, chemistry, and earth science) study of stream ecology. Students created artifacts (essays, investigative reports, and dynamic computer models) to demonstrate their developing understandings of stream ecology. I employed content analysis to identify scientific understandings across the measures, using the National Science Education Standards (NRC, 1996) as the lens for cataloging those understandings.

Findings indicate that students' understandings of the stream ecosystem began as unconnected pieces of knowledge but grew substantially in both breadth and depth across the chemistry, biology, earth science, and environmental science domains. Students developed robust understandings of the niche concept and the interactions of earth systems. The dynamic computer models provided rich environments for demonstrating ecosystem understandings. An implication for instruction is that well-designed artifacts provide significant assessment of student understanding across curriculum content.

As an assessment tool, the Standards show both promise and problems. The tool shows some promise in addressing issues of validity, reliability, and impact on instruction.
Using the Standards as a frame of reference means that information about student achievement generated through different assessments in different contexts can now have common meaning and value in the science education community. The issues encountered in deploying this tool included (1) a lack of correspondence between content in the curriculum and content in the Standards; (2) how to handle different levels of specificity in the assessment criteria (expected content, "if-might" content, and serendipitous content); and (3) translating content standards into performance standards in order to assess learning below the proficiency level defined by the Standards.