Michele IBX Eval


Back to IBX Proposal

Back to Information Advocates Notebook


This is Michele's Evaluation Document


Masucci, 2010 – TAGS Evaluation Discussion

This project will address a gap in the literature by examining the relative benefits of culturally relevant, locally based field experiences as compared with traditional ones, to determine whether there are differences between them in fostering student understanding, cartographic skills, and engagement with geography. The questions guiding the research and evaluation processes related to the project are: (a) Are there differences in the nature of field experiences that can address engagement and other measures of persistence in STEM fields? (b) Does knowledge of core skills and methods in geography contribute to student choices of pathways in geography and/or other STEM fields of study? and (c) Does participation in collaborative teamwork using state-of-the-art technologies provide students with gains in self-efficacy, leadership, and strategizing skills that connect with their movement toward STEM educational and career pathways?

We plan to analyze these questions through examining the following aspects of program participation:

• The impacts TAGS experiences have on students’ development of core skills and knowledge in technology, communication, research, and analysis;
• The impacts TAGS experiences have on students’ aspirations and goals with respect to high school course taking, college planning, and career interests in technology, math, and/or science; and
• The impacts that training and hands-on experiences with media, mapping, e-communications, and coding technologies have on students', teachers', and parents' spatial understandings of their communities.

The goal of the formative evaluation plan is to assess: (a) successful strategies in the project design and implementation; (b) the effectiveness of the instructional approach; (c) the viability of the field exploration, social media, lab, and coding approaches for fostering engagement; and (d) project implementation areas that prove more challenging, so that this information can be used to reassess pedagogy, instruction, implementation, and involvement. The formative component will focus on assuring that project activities progress as planned, that the curriculum and learning activities are adapted or revised in response to feedback, and that the activities engage students, provide the planned educational experiences, and assure project quality. The indicators of the formative evaluation include how well the project progresses; how the recruitment and delivery methods and strategies align with project objectives; the cohesiveness of the curricular resources, software tools developed, and technology use implementation; and the extent to which project quality aligns with project goals and participants' expectations.

[Figure: Conceptual framework linking TAGS program participation to the intended student outcomes]

The summative evaluation plan will entail two components: (a) a student-based evaluation model and (b) a teacher-based evaluation model. The intended outcomes for students include:

• Knowledge and skills related to maps, media, e-communications and coding;
• Knowledge and skills related to park environments and associated problems;
• Geography, Geographic Information Science and STEM career aspirations and course-taking patterns;
• Increased interaction and engagement with parents, teachers and community partners around these STEM fields; and
• Increased self-confidence in science and leadership skills.

The conceptual framework for assessing these outcomes is represented in the figure above. We plan to develop instruments to measure baseline skill sets for mapping, visualization, and e-communications related to environmental and geographic core concepts, and we will pilot test the new instruments to measure the skill set and knowledge base related to program goals. The survey instruments will be tested using exploratory factor analysis to identify latent concepts and Cronbach's alpha to assess reliability. We will use a panel of experts to assess content validity, drawing on support from the Institute for Survey Research and faculty advisors. We will explore existing instruments for engagement, including adapting items from the well-known National Survey of Student Engagement to the local context and to high school students. In addition, we will explore using existing instruments from the literature for leadership and self-confidence in science (see Berson and Berson 2006; Bertrams and Dickhäuser 2009; Broda et al. 2003; Carter and Spotanski 1989; Casillas et al. 2006; Chang and Cheng 2008; Gegner et al. 2009; Fields 2009; Fowler et al. 2009; Kastens et al. 2009; Kastens and Liben 2007; Lane 2004; Lin and Tsai 2008; March 1998; Miri et al. 2007; Sanders 2009). All of these instruments will be pilot tested for validity and reliability.
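
As a minimal sketch of how this pilot-test psychometric step might be run, assuming item responses are stored one column per survey item (the simulated data, column names, and two-factor choice below are hypothetical illustrations, not project specifications):

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: rows are students, columns are Likert-type items.
rng = np.random.default_rng(0)
pilot = pd.DataFrame(rng.integers(1, 6, size=(60, 8)),
                     columns=[f"map_item_{i}" for i in range(1, 9)])

print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")

# Exploratory factor analysis to probe the latent structure of the scale.
fa = FactorAnalysis(n_components=2, random_state=0).fit(pilot)
loadings = pd.DataFrame(fa.components_.T, index=pilot.columns,
                        columns=["factor_1", "factor_2"])
print(loadings.round(2))
```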

Students will be pre-tested on each of the instruments before starting the program. To measure treatment, data will be collected on dosage of exposure to the treatments (after-school programming throughout the academic year and the summer program/field experiences). Dosage will be measured in both days and hours of exposure for each student in the program, and if students return in subsequent years, we will track dosage of the programming longitudinally. Post-test instruments, parallel to the pre-test instruments, will be developed to measure program effectiveness. In addition to data on dosage of the broad treatments, we plan to track the "active ingredients" of each intervention, including time at the computer versus time in the field, time spent with mentors, and other relevant key elements of the program. Lastly, data will be tracked on the level of parental and teacher involvement for each student, as we hypothesize that higher parental involvement is associated with better student outcomes. We plan to conduct follow-up interviews with students after they graduate to collect data on course-taking patterns as well as job placement or college entrance.
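
As one way this dosage tracking might be operationalized, the sketch below aggregates a per-session attendance log into per-student dosage in both days and hours; the table schema, identifiers, and values are assumptions for illustration:

```python
import pandas as pd

# Hypothetical attendance log: one row per student per session attended.
log = pd.DataFrame({
    "student_id": ["s01", "s01", "s02", "s02", "s02"],
    "component":  ["after_school", "summer_field", "after_school",
                   "after_school", "summer_field"],
    "hours":      [1.5, 6.0, 1.5, 1.5, 6.0],
})

# Dosage per student and program component: days (session count) and hours.
dosage = (log.groupby(["student_id", "component"])["hours"]
             .agg(days="count", hours="sum")
             .reset_index())
print(dosage)
```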

Methodology for the quantitative student-based evaluation will be quasi-experimental. There will be no random assignment to treatment, but we will control for student demographics, including school attended, and other socio-educational indicators of "at-risk" status. Pre-measures will be collected on the outcomes specified above to assess the effect of the treatments. Data analysis techniques will include multiple regression, including path analysis and/or structural equation modeling, to simultaneously examine the effects of intermediary treatments on the outcomes as well as the latent structure of the concepts we are attempting to measure. Because of the way students are recruited, we do not anticipate enough clustering of students within teacher/parent to warrant hierarchical models, and we will therefore treat all students as independent observations unless the data indicate otherwise after recruitment.
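
A minimal sketch of the regression step under these assumptions, with simulated data standing in for the real pre-/post-test and dosage records (all column names are hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level data; in practice this would come from the
# pre-/post-test instruments and the dosage records described above.
rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "pretest": rng.normal(50, 10, n),
    "dosage_hours": rng.uniform(10, 80, n),
    "school": rng.choice(["A", "B", "C"], n),
    "at_risk": rng.integers(0, 2, n),
})
df["posttest"] = (0.7 * df["pretest"] + 0.2 * df["dosage_hours"]
                  + rng.normal(0, 5, n))

# OLS with controls for school attended and at-risk status; students are
# treated as independent observations, as described above.
model = smf.ols("posttest ~ pretest + dosage_hours + C(school) + at_risk",
                data=df).fit()
print(model.summary())
```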

Similarly, for the teacher-based evaluation, the intended outcomes for teachers and parents include:

• Knowledge and skills related to Geographic Information Science (mapping, visualization, representation);
• Knowledge and skills related to media, map, communication, and coding applications;
• Increased use of the tools learned in Philadelphia School District classrooms.

Instrument development, data collection, and analytical techniques for the teacher-based evaluation will be similar to those described above for students. One especially important teacher-based outcome for the broader impacts of this program is the application of the tools in Philadelphia classrooms. We plan to track the teachers longitudinally to see whether they explore and use the techniques they learned in their own classes, or whether they leverage that knowledge by bringing their students or other teachers from their school into our program in successive years. As final summative measures, we will track rates of participation and project completion, the diversity of students participating in the project, the accessibility of the project to the population we aim to involve, and the value participants place on the project. In addition, we will draw on in-school performance measures and community indicators of project impacts, including high school attendance, improvement in academic performance, and increased interest and involvement in STEM subject matter.

Qualitatively, data will be collected via observation, focus groups, interviews, and archival sources. Observation will be conducted to record a sample of the workshops involving teachers and parents, and three project staff meetings will be observed and recorded. In addition to direct observation, three focus groups will be used to gather data. Two of the focus groups will target high school students participating in the summer intensive program and the after-school program; their primary purpose is to assess and further understand the students' ontology (how they see the world and themselves in that world). A third focus group will involve parents who participated in the workshops, to assess participants' understanding of the technology approaches developed by the project. Interviews with project staff and assessment of archives of student work will complete the qualitative assessments. Direct observation, focus group, and interview content will be transcribed and coded with a codebook that incorporates culturally relevant and responsive education. The entire evaluation assessment is described in detail in the supporting documentation provided by the external evaluation specialist.
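
As an illustrative sketch only, the snippet below shows one lightweight way transcript segments might be tagged against a codebook; the codes, keywords, and transcript lines are invented placeholders, and in practice the coding would be done by trained coders rather than keyword matching:

```python
# Hypothetical codebook: each code maps to indicative keywords.
codebook = {
    "spatial_identity": ["neighborhood", "my community", "where I live"],
    "tech_engagement": ["mapping", "coding", "app"],
}

# Hypothetical transcript segments from a student focus group.
transcript = [
    "I never thought about mapping my neighborhood before.",
    "The coding part was hard at first but fun.",
]

# Tag each segment with every code whose keywords appear in it.
for segment in transcript:
    tags = [code for code, kws in codebook.items()
            if any(kw in segment.lower() for kw in kws)]
    print(tags, "<-", segment)
```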