Action Research Report

Using Student Response Systems to Improve Benchmark Results
By Jana Dugas

//Abstract//

Six times a year, students are assessed with benchmark tests in five subjects. In the current method, tests are copied and bubble sheets are run off for each child. Students fill in answers on bubble sheets, which are then scanned and uploaded into data processing software. Students in this economically disadvantaged school struggle with reading scores. It also takes a significant amount of time and effort for some students to bubble in the answer sheets, especially in first and second grade. The school of study has access to student response systems that are sitting in the building, not being utilized. The proposed idea is to use the student response systems to increase test scores. Data was collected every six weeks for a 24-week period. The sample of students ranged from on grade level to well below grade level, including two non-readers. The study revealed that the students using student response systems did not score higher than the students who used bubble answer sheets.

//Introduction / Background//

The study was conducted at Vidor Elementary School in Vidor, Texas. The district has previously struggled with AYP. The school was recognized by TEA in the past year and has earned Recognized or Exemplary ratings in the past. The student population at Vidor Elementary is 99% white and 76% economically disadvantaged. Technology use on this campus is minimal; teachers use projectors, document cameras/video devices, and laptops to deliver lessons. Students have access to 3-5 computers per classroom that connect to the Internet. There is a computer lab available that will accommodate 25 students; however, there is not enough bandwidth for all of the students to access the Internet at the same time. The study was conducted by a fourth-grade, self-contained teacher. The sample ranged from students on grade level to well below grade level, including two non-readers.

The students in the previous year had scored low on the Texas Assessment of Knowledge and Skills (TAKS) in the area of reading. More specifically, the economically disadvantaged students scored poorly on the Reading TAKS. Accelerated Reader is utilized at the school, and students are required to earn a certain number of points every grading period. Students were able to read and comprehend on grade level. After-school tutoring was available to struggling students, as well as small-group instruction during the school day. Many interventions had been put in place to give students multiple opportunities to master skills. Low reading scores have been a problem in the past, and students have also struggled in math on the TAKS.

The study was conducted to see if using student response systems could improve benchmark test results, thereby improving Reading TAKS (or STAAR) scores. Student response systems were already available at the school and were not being used. Could they be used to input student answers and upload them into the data software? Would the use of technology motivate the students to achieve higher test scores?

The teachers and students would benefit from this process because students would get a chance to use technology. 21st-century learners yearn for technology. As a campus with such economic disadvantages, the students most likely rarely have the access to technology that most other students their age do. Teachers would also benefit from learning how to use the available technology through professional development. The student response system is designed to let students input answers using a remote control that contains answer choices. The previous method for collecting test data was to have students fill in bubble sheets with their answer choices. The bubble sheets were then taken to a scanner and scanned into a computer. The computer would send the results to software that collects the data and grades the tests. Using the student response systems, the answers could be entered and uploaded in a shorter amount of time and in a more positive and engaging way.

A student response system consists of remote controls (one for each student), a receiver, and computer software that records answers and grades them for you. AYP refers to Adequate Yearly Progress; if students are not meeting the set standard, then schools can face consequences from TEA. There are many factors that go into the AYP formula, not just test scores. TEA is the Texas Education Agency, which governs Texas public schools and sets the standards that students must meet. The Texas Assessment of Knowledge and Skills, or TAKS, has now been replaced with the State of Texas Assessments of Academic Readiness (STAAR).

//Literature Review//

There have been several studies on student response systems. While some of them have been used at the collegiate level, others have been used in K-12. The research that exists explores student response system use, impacts on students and learning, and effects on classroom environment.

Uses of student response systems include lectures or lessons and formal assessment. When used in a lecture or lesson, the systems provide a form of feedback for the teacher to assess whether students are grasping the subject being taught. “The teachers believe the ability for all students to answer questions helps them to determine their students’ learning during instruction, and this enables them to provide prompt, corrective feedback according to their students’ learning needs” (Beck, n.d.). Most researchers agree that “when used in classes, ARS (Audience Response System) clickers typically have either a benign or positive effect on student performance on exams” (Caldwell, 2007). The student response systems can also have a positive impact on assessment. Research also finds that using student response systems showed “significant improvement in exam performance on those parts of the courses where clickers were used more often” (McCune, n.d.). Student response systems increase students’ engagement, motivation, and participation (Kay & Knaack, 2009). Several studies discussed the positive effect that student response systems had on the classroom environment. “The energy in the class and the excitement of the students made for a great learning environment” when using the clickers (“CBC news,” 2011).

//Action Research Design//

//Subjects//

The target population was my class of 9-10 year olds (fourth grade); there was no way to exclude anyone, as it simply wouldn’t have been fair. There were 18 students, 10 girls and 8 boys. The sample ranged from students on grade level to well below grade level, including two non-readers. All of my students were able to take benchmarks using the student response system; there was also a blind student in the class for half of the year who, with assistance, was able to use the student response systems. Once other teachers and my principal found out that I was using the student response systems, they became interested. This project has led me to teach a professional development class to my peers in August 2012 on how to use the student response systems for benchmark testing. Any student of any skill level can use the student response systems; some will need more assistance than others. I compared my sample to a control group: the rest of the grade level. The control group had about 90 students of similar learning abilities who received similar teaching. Our grade level plans lessons together, and we teach in similar styles using similar methods.

//Procedures//

The first thing that had to happen was to find out whether the student response system would upload scores into the data software that was used to store the data. Jana Cash, the technology coach for Vidor I.S.D., helped to figure this step out. This occurred in October of 2011. The next step was to set everything up; it was not hard. A sensor had to be connected to the USB port and positioned high enough that everyone in the room could point the remotes at it. A dashboard, as well as a software update, needed to be downloaded. Tests were already created in the data software. In the data software program, a box had to be checked to show that student response pads would be used. Remotes were handed out to the students, and they entered their answers.

It was fairly easy to manage the project. No money was spent; the student response systems and the software had already been purchased. I relied on my technology coach, my students, and myself; no other people were involved. The overall time to download the dashboard and get the student response systems going was about half an hour. After that, it was a matter of minutes to get the student response systems distributed and working each time we used them. As soon as I showed the student response systems to the students, the atmosphere in the classroom transformed from blah to boisterous. The students were excited to show what they had learned and did not miss the bubble sheets. The only public resources used were the technology coach and my students. The student response systems that the school had already purchased were used; they had been bought 4 or 5 years earlier and were not being utilized.

//Data Collection//

When all the answers had been entered with the student response pads, I uploaded the data to the software website. Data was instantly collected and the tests were graded. The data within the software is organized in multiple ways: individual student scores, class scores, school scores, and district scores. This was done every six weeks until school was out in May 2012, excluding the third six weeks due to a laptop crash.

//Findings//

Students in the second six weeks actually scored lower than the grade level; however, in the sixth six weeks, they scored exactly the same. Students really enjoyed using the response systems, so it created a positive atmosphere, especially during a testing time. When the laptop crashed and students weren’t able to use the response systems, they were very upset and morale was low.



//Conclusions and Recommendations//

It’s my belief that if the student response systems had been used on a daily basis, the test results would have been higher for those students. Research suggests that student response systems increase scores for students who use them more frequently. The students loved using them, so to me it was worth it. Putting the remotes in the hands of the students will get them excited about testing. It will create a more positive atmosphere during a time that is stressful for most.

I would recommend that if you really want to improve test scores and you have access to student response systems, you use them more often. They are great tools, and they are also good for tracking student progress; they make it easy to collect data and grade. The results from this project will be disseminated in professional development training on how to set up and use the systems. There are a few teachers in each grade level who will support me in what I am trying to do; I will ask for their help in getting the rest of their teammates to try using the systems. I think that with a core team of supporters and users, others will want to follow suit, especially when the students start talking about using the response systems. I also have support from my administrators in what I am doing. The staff will receive a pamphlet with step-by-step how-to instructions on setting up and using the student response system and data software together so that everyone on our campus can use it.

To communicate with the public about my action research, the technology coach and I are going to tell the other elementary campuses about this project. We will give a professional development session on how to set up and use the student response systems to take benchmark tests and then how to upload and analyze the data in the data software. All administrators and the superintendent have seen the video that I plan on sharing in the professional development training, which shows me using the student response systems to give a benchmark test.

When I share my action research with others, I will show a video of the process and the students’ reactions. I think this will create acceptance of what I am about to show them. To revise the project, I would want the student response systems to be used for more than just benchmark tests. I don’t think using them only for benchmark testing will improve scores; they would have to be used on a more daily basis. With that in mind, we would need more systems; our school only has four. However, since our school is just emerging into accepting technology usage in the classroom, I do think this is a great starting point. If I can get one grade level all using the student response systems, then I think we will have success. I plan to motivate the staff by getting the teachers themselves to use the student response systems and see how fun they are; I think this will help them see what their kids will be experiencing. Another motivational tool will be making myself available before and after school to help them set everything up and guide them through the process; I would even use my conference and lunch times to do that. If they know they have support right there, I think they will use it.

This meets the needs of students by letting them utilize technology, which doesn’t happen very often in their school day. Our elementary campuses are barely using technology in the classrooms; resources are just not available. It also meets the needs of the teachers: they have more time available to do other things instead of having to print out bubble sheets, make sure they are filled in correctly, and then scan them. The bubble sheet process takes up time and effort. Using the student response systems cuts down on paperwork, and setup time is minimal. They can be up and running in a matter of minutes. I think once we start seeing how excited our kids are to use them, there will be a push to purchase more systems. I think that is when we will truly see improvements in test scores. If students are using them for more than just tests, then they will be more engaged in the learning process. Scores will then improve.

Key words or phrases: student response systems, benchmark testing

References:
 * Beck, S. K. (n.d.). //Just click it: Using a student response system for assessment and feedback//. Unpublished manuscript, Catoosa County Public Schools, Georgia. Retrieved from []
 * Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. //CBE - Life Sciences Education//, //6//(Spring), 9-20. Retrieved from []
 * Kay, R., & Knaack, L. (2009). Exploring the use of audience response systems in secondary school science classrooms. //Journal of Science Education & Technology//, //18//, 382-392.
 * McCune, V. (n.d.). //The University of Edinburgh: College of Science and Engineering//. Retrieved from []
 * CBC news. (2011, May 13). Retrieved from []