Abstract

1. Introduction
2. Instructional Need

Science education has not done a good job of making science an attractive field for women (Lynch, 2000). We believe a major factor is that many of the topics typically used to teach the scientific method are less interesting to girls (e.g., soil samples, earthworms, levers and pulleys). By highlighting the more complex and interesting behavioral sciences through videos of topics such as children playing, college students' social behavior, and gorillas feeding, COR may help enhance women's and girls' interest in science.

The scientific method requires an understanding of probability and statistics; however, even students who have taken statistics in a previous course have difficulty applying what they learned to their research methods course. Students find research methods courses difficult and boring, leaving faculty members teaching unhappy and unmotivated students (Blasko, Kazmerski, Corty, & Kallgren, 1998). As a result, research methods and statistics courses often receive lower student evaluations than other courses (Cashin, 1990).

Many instructors attempt to combat these problems by teaching research methods courses with hands-on activities. However, it is difficult to find good subjects to observe (Blasko, Kazmerski, Corty, & Kallgren, 1998). Many object to using animals on ethical grounds, and it is costly for a school to maintain its own laboratory facilities. Similarly, observing children can be difficult to arrange and requires parental approval, which can be hard to obtain. In our case, we have gone to the local zoo to observe animal behavior, but with short observation sessions one cannot know whether the animals will behave in ways that give students a valuable learning experience; if an animal sleeps for the entire session, the student learns little. Additionally, students typically require considerable practice to learn these concepts well, which means they need many observation opportunities.
3. COR’s Instructional Model

Despite the stereotype of the lonely scientist, the scientific process is largely collaborative. Scientists typically work in teams, and a diversity of skills and learning styles can enhance creativity and efficiency. The hallmarks of good research are that the findings are both reliable (repeatable) and valid (generalizable to new populations), so it is critical to compare the results of one observer to those of another, independent observer. With this in mind, COR was designed to enhance collaboration. Students work in small teams of two or three to define their research question, develop the best methodology to test it, conduct the observation, and then test their interrater reliability. This type of collaboration can develop teamwork skills and enhance learning by encouraging deeper processing of course materials as students explain concepts to each other (Jones & Carter, 1998; Teasley, 1995).

3.1 Lessons
A subsequent lesson takes the student’s data and shows how to perform interrater reliability testing. Students may use the expert’s data or their own. Using their own data, they can see how tests of interrater reliability (percent agreement and Cohen’s kappa) are applied, and they then evaluate whether their data have sufficient interrater reliability to continue with data analysis and significance testing.
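To make the two statistics concrete, the sketch below computes percent agreement and Cohen’s kappa for two hypothetical observers. The coding categories, data, and function names are illustrative inventions, not COR’s own code or data.

```python
# A minimal sketch of the two reliability statistics named above: percent
# agreement and Cohen's kappa. The observers' codes are hypothetical.
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of observation intervals on which the two coders agree."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for the level of agreement expected by chance."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    categories = set(coder_a) | set(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Two observers coding the same ten intervals as aggressive (A) or not (N).
obs1 = ["A", "N", "N", "A", "A", "N", "N", "N", "A", "N"]
obs2 = ["A", "N", "A", "A", "A", "N", "N", "N", "N", "N"]
print(f"Percent agreement: {percent_agreement(obs1, obs2):.2f}")   # 0.80
print(f"Cohen's kappa:     {cohens_kappa(obs1, obs2):.2f}")        # 0.58
```

Because kappa discounts the agreement expected by chance, it is typically lower than raw percent agreement, which is why the lesson has students examine both.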
3.2 Case Study

COR presents a case study in which a fictional young girl, Sarah, may be removed from her afterschool program because of her aggressive behavior toward the other children. Sarah’s teacher claims the child is not aggressive, but a staff member at the care center claims she is. Sarah’s parents have come to you for help in determining whether Sarah really is behaving more aggressively than her peers or whether gender stereotypes or other issues are influencing the staff member. Students observe Sarah in a play setting and assess whether her behavior is more aggressive than that of her age-matched peers. Students must use two observers, calculate interrater reliability, and then determine whether their results are statistically significant. Afterwards, students face the difficult task of deciding whether Sarah should be removed from the afterschool program. To demonstrate the complexity of such decisions, students then have the opportunity to watch Sarah play in another setting where she acts very differently.

At the beginning of the case study, students are introduced to Dr. Wellington, a fictional expert in observational research who is there to help them through the case study. She guides students through the case by asking questions about such things as how they should observe Sarah, how they will design their coding sheets, what their level of interrater reliability is and whether it is acceptable, whether their results are statistically significant, and what their results tell them. At every step, students receive feedback from Dr. Wellington that helps them understand what they just did or how it affects the next step. Dr. Wellington also tells students when they give incorrect answers, but she does not simply supply the answer; she gives hints, tips, and other leading feedback to make students think more deeply about the question and try again. Through answering these questions, the student is led through the entire observational research process.
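The text does not specify which significance test the case study has students apply, so the sketch below uses an independent-samples t-test on counts of aggressive acts per observation interval purely as an illustration; Sarah’s and the peers’ counts are invented.

```python
# Illustrative only: an independent-samples t-test comparing hypothetical
# counts of aggressive acts per observation interval for Sarah versus her
# age-matched peers. The actual test used in COR is not stated in the text.
from scipy import stats

sarah_counts = [3, 4, 2, 5, 3, 4]          # Sarah, acts per interval (invented)
peer_counts = [2, 1, 3, 2, 2, 1, 3, 2]     # pooled age-matched peers (invented)

t_stat, p_value = stats.ttest_ind(sarah_counts, peer_counts)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Sarah's rate of aggressive acts differs reliably from her peers'.")
else:
    print("No statistically significant difference from peers was found.")
```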
We believe this “guide on the side” approach is an invaluable way for students to learn. They are able to think about what they are doing and are given feedback at every step. This prevents a student from making an early mistake that renders the rest of the assignment, and their practice with the concepts, incorrect. Students are given a great deal of help, but in a way that makes them think about exactly what they are doing, as well as why, and how it applies in the context of the case study. This is something a textbook simply cannot do. The case study concludes by asking students to write a report that outlines their results and conclusions regarding Sarah. This report can be graded by the instructor, allowing the case study to be used as a class assignment.
3.4 Library
3.5 Glossary
4. Programming Environment and Use of Technology

Video integration is key to COR’s ability to help students strengthen their observational research skills. With Director® we were able to create “Dr. Wellington’s coding,” in which a model coding sheet is filled in while the video runs. By tracking the timing of the video, we can show data being written on the computer coding sheet as if the expert were coding the video, which creates a very powerful learning tool for students. We also capitalized on Director®’s ability to read external files to create a program that can be easily customized by the instructor. Instructors can provide different coding for a particular video, either to modify the way the computer codes the video or to have the computer code the video for a different behavior. In much the same way, the instructor can also modify the glossary by adding more terms or editing the definitions provided.
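COR itself is built in Director, and the paper does not describe its file format; the Python sketch below only illustrates the underlying idea under assumed details: an external, instructor-editable text file maps video timecodes to coding-sheet entries so the expert coding can be replayed in sync with playback or swapped out for a different behavior. The file layout, field names, and functions are all hypothetical.

```python
# Hypothetical illustration of instructor-customizable expert coding driven by
# an external file; not COR's actual Lingo code or file format.
CODING_FILE = """# seconds, behavior, value   (assumed layout)
12, aggression, hit
27, aggression, push
45, prosocial, share
"""

def load_coding(text):
    """Parse an instructor-supplied coding file into (time, behavior, value) rows."""
    rows = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        time_s, behavior, value = (field.strip() for field in line.split(","))
        rows.append((float(time_s), behavior, value))
    return sorted(rows)

def entries_up_to(rows, playback_seconds):
    """Entries that should appear on the coding sheet at this point in the video."""
    return [row for row in rows if row[0] <= playback_seconds]

rows = load_coding(CODING_FILE)
print(entries_up_to(rows, 30.0))   # what the "expert" has coded 30 s into the video
```

Editing the external file (or pointing the program at a different one) changes what the expert coding displays without touching the program itself, which is the customization the paragraph above describes.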
5. Use and Evaluation

Although additional material has also been added to the lessons, the major innovation in COR Version 2 is the interactive case study module. In the spring 2003 semester, COR 2 was used for the first time in a Basic Research Methods class at Penn State Erie. Because we had extensively evaluated the lessons as described above, here we focused our attention on the new case study. The program was evaluated in several stages. Students read material on nonexperimental designs (Bordens & Abbott, 2002) and completed an online quiz before coming to class. The instructor then presented the lessons component of COR during five hours of class time. Students took a pretest based on their knowledge to date and were then given a list of topics and asked to rate their confidence in actually performing each task. Students then used the case study at individual laptop computers in class over five hours. The class went to the zoo to complete an observation in two-person teams and wrote up the results of their observational study in an APA-style laboratory report. They then rated their confidence in the same topics of observational research as before. All students completed a posttest on the concepts as part of their course exam. In the final step, all students completed a usability survey about COR, and this information was used to make improvements to the program.
Figure 3. Students’ mean confidence ratings (expressed as a percentage) and mean test scores before (pre) and after (post) using the case study. Error bars show the standard error of the mean.
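For readers who want to reproduce the kind of summary shown in Figure 3, the sketch below computes the mean, the standard error of the mean, and a paired pre/post comparison. The confidence ratings are invented, and the paired t-test is an assumption rather than the analysis the authors report.

```python
# Invented pre/post confidence ratings; mean, SEM, and a paired t-test as an
# illustration of the Figure 3 summary. Not the study's data or reported analysis.
from scipy import stats

pre_confidence = [55, 60, 45, 70, 50, 65, 40, 58]     # hypothetical percentages
post_confidence = [80, 85, 70, 90, 75, 88, 65, 82]

for label, scores in (("pre", pre_confidence), ("post", post_confidence)):
    mean = sum(scores) / len(scores)
    print(f"{label:4s} mean = {mean:5.1f}%, SEM = {stats.sem(scores):.1f}")

t_stat, p_value = stats.ttest_rel(post_confidence, pre_confidence)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```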
The results of the usability analysis showed that students found both the lessons and the case study attractive, interesting, and easy to use. Students had difficulty creating and using data files in the case study, and we have modified the program to alleviate this. Many students spent considerable time creating screen captures of the COR lessons, which they used when writing their papers and studying for the test; enhancing the print features of COR is on our list of desired improvements. Overall, students felt strongly that COR was a valuable addition to the class and that they now understood the application of observational techniques in much more depth.
6. Summary and Future Directions

We would also like to examine the feasibility of delivering portions of COR over the web. Although significant drawbacks remain because of bandwidth limitations, major strides have been made in recent years in delivering more interactive programs and higher-quality video over the internet.
8. References

Blasko, D. G., Kazmerski, V., Corty, E., & Kallgren, C. (1998). Courseware for Observational Research (COR): A new approach to teaching naturalistic observation. Behavior Research Methods, Instruments, & Computers, 30(2), 217-222.

Bordens, K. S., & Abbott, B. B. (2002). Research design and methods: A process approach (5th ed.). Boston: McGraw-Hill.

Cashin, W. (1990). Students do rate different academic fields differently. In M. Theall & J. Franklin (Eds.), Student ratings of instruction: Issues for improving practice. San Francisco: Jossey-Bass.

Fossey, D. (2000). Gorillas in the mist. Boston: Mariner Books.

Johnson, H., & Solso, R. (1978). An introduction to experimental design in psychology: A case study approach (2nd ed.). New York: Harper & Row.

Jones, M. G., & Carter, G. (1998). Small groups and shared constructions. In J. J. Mintzes, J. H. Wandersee, & J. D. Novak (Eds.), Teaching science for understanding: A constructivist view (pp. 261-279). San Diego: Academic Press.

Kazmerski, V. A., & Blasko, D. G. (1999). Teaching observational research in introductory psychology: Computerized and lecture-based methods. Teaching of Psychology, 26(4), 295-298.

Lynch, S. (2000). Equity and science education reform. Mahwah, NJ: Lawrence Erlbaum Associates.

National Research Council (1996). National Science Education Standards. Washington, DC: National Academy of Sciences Press. Available online at http://www.nap.edu/readingroom/books/nses/html/

National Science Foundation (1999). Women, minorities, and persons with disabilities in science and engineering: 1998 (NSF 99-338). Arlington, VA.

Teasley, S. D. (1995). The role of talking in children’s peer collaborations. Developmental Psychology, 31, 207-220.