In this issue:
- Course evaluations
- Making history with a teaching technology
- Conflict management for instructors
- Detecting plagiarism
- Strengthening the link between teaching and research
- Teacher "credibility"
- Declining academic standards
- New teaching assistant developer
- Workshop registration notice
- Spring TRACE events
Many of us have just gone through the process of having a course evaluation done and will soon receive the results. And likely, many of us will be disappointed that our students don't seem to think we are as good as we think we are. If you are not satisfied with your ratings, why not try to improve them by taking a couple of hours to analyze what students are saying?
Course evaluations usually provide two kinds of information: quantitative ratings and open-ended comments. For analysis purposes, note the areas in the quantitative ratings that are lower than you would like. Remember, you are never likely to please all the students in a class, so ratings are unlikely to be perfect. You should be concerned when ratings fall below the midpoint of the rating scale or somewhat below the department or faculty average (if known). Keep in mind that, by definition, about 50% of ratings will fall below department or faculty averages, even though most teaching on university campuses is rated above the midpoint of the scale.
With the open-ended comments, a good procedure is to type them into a word-processing document. As you do, organize them into groups of similar comments. When finished, arrange the groups so that the largest comes first, the next largest second, and so on. Then look at each group and consider what it is saying. For example, if under strengths of the course there are many comments about your availability to students, that indicates an area students really appreciate in your teaching. If a large number of comments suggest that the text was not very good, you should consider a change or, at the next offering of the course, at least explain why you are continuing to use that text. Sometimes just bringing students into your decision-making process will reduce their concerns.
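For those who prefer a spreadsheet or script to a word processor, the grouping-and-ordering procedure above can be sketched in a few lines of code. This is a minimal illustration only; the theme labels and comments are invented examples, and assigning each comment to a theme is still a judgment you make by reading it.

```python
from collections import defaultdict

# Each comment has been read and tagged with a theme by the instructor.
# (Themes and comments below are hypothetical examples.)
comments = [
    ("availability", "The professor was always available after class."),
    ("textbook", "The textbook was hard to follow."),
    ("textbook", "I didn't find the text useful."),
    ("availability", "Office hours were really helpful."),
    ("textbook", "The readings didn't match the lectures."),
]

# Collect comments into groups of similar themes.
groups = defaultdict(list)
for theme, comment in comments:
    groups[theme].append(comment)

# Arrange the groups largest-first, so the feedback most students
# agree on is considered before isolated remarks.
for theme, items in sorted(groups.items(), key=lambda kv: len(kv[1]), reverse=True):
    print(f"{theme} ({len(items)} comments)")
    for c in items:
        print(f"  - {c}")
```

Running this prints the "textbook" group (three comments) before the "availability" group (two comments), mirroring the advice to concentrate on the largest groups first.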
Instructors are sometimes really upset by the single sexist or otherwise hurtful comment in a set of evaluations. Grouping the comments helps to put such a comment in perspective. The groups of similar comments are the things to concentrate on. An issue with only a single comment can be ignored. You want to concentrate on the things that many students agree on.
It is not uncommon to have some students who really like an aspect of the course, such as the textbook, and others who dislike it. If there are groups of comments on both sides of an issue, there may be no clear way to deal with the issue. But in the next offering of the course, you might ask students for more specific feedback on why they like or do not like something. This will provide better information on what to change and how it might be beneficial.
Groups of comments on positive aspects of the course are indicators of things you should continue. Groups of comments on areas that are not working well are things to consider changing. You may not always agree with a group of comments suggesting a problem.
However, you should not ignore these. If it is a concern with a method you use, when you use it again, you may just need to explain why you are using that method, indicate that some students may have problems with it, and suggest strategies for those students to adapt to the method. If you are uncertain how to deal with a problem indicated by a group of comments, TRACE may be able to provide suggestions. We also would be happy to work with you on an analysis of your course evaluations. Contact Gary Griffin (x2579) or Donna Ellis (x5713) for assistance.
By Donna Cooper
Everything old was made new again last fall. That's when Ken McLaughlin and Tracy Light used a modern-day teaching technology to help undergraduate History students step back in time. History 108: Genealogy is a popular evening course that explores the intertwining of historical events with family life stories. Traditionally, the course has posed a challenge to instructors, since it is based primarily on independent genealogical research done by students.
Unfortunately, it's usually "not until the end of the course" that students get a chance to share their research findings, said Ken. "So what we wanted to do, from the very beginning of the course, was to provide a way for students to get to know each other better," to discuss their projects, "and to establish a kind of rapport without necessarily taking away from class time."
PhD candidate and History 108 teaching assistant (TA) Tracy Light recognized the course as a perfect forum for Web CT, an innovative on-line course management system. Students were divided into discussion groups and assigned three questions over the course of the term, all of which dealt with links between genealogy and traditional history. The discussion topics provided students with a conceptual framework for other assignments as well, which included a research paper and an investigation of their own family history.
Although the class met weekly for lectures on campus, discussion groups were carried out completely on-line, with groups posting their finished responses for the entire class to read. Think of Web CT discussions as virtual seminars, where students carried on conversations asynchronously, taking the time to consider comments of fellow students before responding, then working collaboratively to formulate a polished group response.
Thanks to a feature of the Web CT program, Ken and Tracy were able to keep track of each student's participation. They occasionally dropped in on the discussions to offer insight, encouragement or praise. Both instructors agree that cross-fertilization of ideas and increased levels of participation (especially from otherwise quiet students) were definite benefits of the initiative. "It was really fascinating to see them thinking about core concepts throughout the course, as opposed to just on the final exam," said Tracy.
The on-line discussions helped bring students closer together. "That's the paradox of it," Ken pointed out. Even though work at the computer was done independently from home, a great deal of collaborative thinking and peer editing was required. As a result, written responses were of a higher quality than in previous offerings of History 108, where Web CT was not used.
But did Ken and Tracy have to work harder so their students could work better? Integrating Web CT technology into the course "required more time than I had anticipated," said Tracy. "Especially setting up the program."
Tracy held additional workshops to introduce students to the teaching technology as well as the group work it would entail. These sessions put her well over the standard ten-hour TA work week during the early part of the course.
Still, both consider their additional hours an investment. Ken likened the process to designing a new course, where one "has to decide on readings, read all those readings in advance," then create a courseware package for students. Next time around, the course won't require nearly as much set-up.
"We learned a lot of lessons about using course time more valuably," said Tracy. Feedback from students indicated that models of effective on-line discussions would be useful.
Ken and Tracy's investment is expected to pay off this fall, when History 108 will run once again using Web CT.
By Geneviève Desmarais
No matter how hard we work at preventing conflicts, they are a part of life, and all instructors, from the most entertaining to the most challenging, have encountered them at times. Though we prepare ourselves as instructors to transmit knowledge, we receive very little guidance on how to cope with distracting, upset, or angry students.
Last term, TRACE held a workshop on the topic of conflict management. A tip sheet was created from the topics covered in the workshop that might be of interest to you. This tip sheet provides some suggestions on how to prevent conflicts from occurring in the first place, and on how to manage conflicts if they arise.
When conflicts do arise in your classroom, don't ignore them. Most classroom conflicts concern distracting student behaviour, such as talking, and are easy to cope with. Move around the classroom to keep things dynamic and hold the students' attention on you, make eye contact with the distracting students, or pause during your lecture. Students are not necessarily discussing topics unrelated to your lecture, so don't make assumptions: ask questions and allow them to share their opinions. If the behaviour persists or degenerates, you may consider addressing the class verbally or asking the disruptive students to leave.
Sometimes classroom conflicts depart from mere distractions and escalate into authority challenges. The key to getting through these public confrontations is to avoid taking the challenge personally. You should try to relax and keep calm, and keep your voice sounding clear and firm instead of defensive or sarcastic. Acknowledge the value of the student's opinion, and maybe suggest discussing the matter privately at a later time.
Many of the conflicts we encounter as instructors will occur in private, for example over a grade complaint. The key goals here are to maintain professionalism, and to focus on the issues at hand, not the person. If the discussion becomes heated, suggest scheduling a later appointment. This will give both parties enough time to calm down and have a second look at the problem. When conduct gets out of hand, make a record and don't hesitate to seek assistance.
Managing conflicts is a matter of style and there is no one best way to cope with them. You should therefore choose the approach you are most comfortable with.
Technology has made it easier for students to obtain and submit work that is not their own for credit purposes. Technology tools can also assist instructors in tracking down plagiarized materials.
One demonstration tool available to assist in detecting plagiarism is FindSame.com, an Internet search engine that searches the web for content, not keywords. A demo of the service, provided by Digital Integrity, is free. To use the service, go to FindSame's home page (http://www.FindSame.com), paste the passage from the student's paper into the text box, and click the "search" button.
The following statement appears on the homepage. "This demo shows Digital Integrity's ability to search for content, not keywords. You submit an entire document, and we return a list of Web pages that contain any fragment of that document longer than about one line of text. Paste some text in the box below, or upload a file by clicking on the "Browse" button. Then click the "search" button and we'll show you where on the Web any piece of that text appears."
When the search is complete, FindSame presents a list of links to web sites containing matching text. The text can be displayed and matching text will appear in colour. By clicking on the "Side-by-Side" button, the program will display the information you submitted on one side of the screen and the passage it found on the other. Matching text from the two documents will appear in colour. This makes it very easy to determine whether plagiarism actually occurred.
You can see how the system operates without submitting text by clicking on "Discover plagiarism in a student report" and then clicking on "search." A sample student text is displayed along with an indication of the text that is matched by other sources. At the end of the student text, a list of sources is presented, starting with those that contain the largest percentage of overlap. You will note that most of the student's text is matched by the sources found. By checking a few of the matching texts with a large percent overlap, you can see how the paper was put together.
You can add a Digital Integrity command bar to Microsoft Word 2000. "The command bar allows direct submission of Word documents or selected sections to the Digital Integrity demo search engine. The results are presented in Internet Explorer." You can also add a Digital Integrity command to your browser toolbar (Netscape or Internet Explorer). These command bars can be downloaded from the web site.
Unfortunately, this is only a demonstration site with extensive but somewhat dated coverage of internet material. Digital Integrity does have products available that provide better coverage.
The population at large does not have a very good idea of what a university is all about. The teaching component is something they relate to easily, but many people are relatively unaware of the research activity that involves so much of a professor's life and so much of the university's resources. How many times have you heard, "Now that classes are over, what are you going to do with your summer off?" Since many of the adults in our society are now products of post-secondary education, the lack of knowledge in the general population may be partially the fault of universities. University graduates may not understand the link between the teaching activities and research activities of faculty members.
Not only can this link improve public understanding of what universities are about, it can also serve to improve undergraduate education. An article by Elton (2001) suggests that a positive link can be established. He argues that "a positive research and teaching link primarily depends on the nature of students' learning experiences, resulting from appropriate teaching and learning processes, rather than on particular inputs or outcomes."
Essentially, Elton is arguing that to develop the link, students must be "actively involved in the learning process and indeed come - at least in part - to own it. This is the very antithesis of learning through didactic teaching, which in most instances can lead to only superficial learning." He believes the link between research and teaching depends primarily on the curriculum.
He then discusses curriculum designs that favour good learning outcomes. One such design is problem-based learning, which originated in medicine but is spreading to other disciplines. The changed curricula shift the process from teacher- to student-centered learning. Here the process shifts from the excellence of the teacher to the excellence of the learning experience.
Elton's paper and his ideas are heavily influenced by changes in the British system of higher education. However, his ideas are worth considering.
Fink provides a three-dimensional model that may assist in evaluating your teaching and offers ideas about what you can do to improve your credibility with students. The model comes from communication research on speaker credibility. Teaching behaviour is divided into three areas: competence, trustworthiness, and dynamism. Teachers perceived as competent, trustworthy, and dynamic are likely to be deemed more credible and to receive higher ratings.
Competence refers to the perceived "expertness" of the instructor. This is influenced not only by level and breadth of knowledge but also by things such as classroom management skills and the ability to answer student questions. Trustworthiness involves making students feel welcome as participants in class, sensitivity to student concerns and gender and cultural issues, and fairness in grading. Dynamism focuses on your enthusiasm for teaching and also involves presentation skills.
Fink indicates that each of the three dimensions of teacher credibility is based on learnable teaching behaviours. She claims that some instructors at her institution who used the model significantly improved end-of-semester teaching evaluations. No data are provided to substantiate that claim.
In examining credibility, the first step is to collect information from your students. Course evaluations, as described in the article Course Evaluations, can help here. Then categorize the data according to the credibility model. Identified strengths, problems, or gaps may point to areas to work on to improve credibility. Fink also suggests other techniques that you can use, and, if requested, TRACE will assist with any of them. Fink's paper can be found by going to http://www.ou.edu/idp/ideas.html and looking under "Interacting With Students."
Discussions about declining academic standards are common in faculty meetings and in various publications. The evidence provided is usually anecdotal or based on impressions. An article by Miller and Goyder (2000), of Waterloo's sociology department, provides actual evidence of a decline in mathematics preparation of students entering the University of Waterloo.
Miller and Goyder analyzed a series of scores from a "mathematics preparedness test" (MPT) taken by all incoming students to the Faculty of Engineering. "The exam was designed to improve first year performance and is used as an early identification tool to determine which students will likely have difficulty in the engineering programs so that assistance can be sought before problems arise (Ford, 1995). The test is revised every three years." Data from the 1991-1993 and 1994-1996 cycles were considered to be highly comparable. In 1997, the mathematics faculty also began using the test, and it was changed more fundamentally. Data were analyzed separately for the three test periods (1991-1993, 1994-1996, and 1997-1999), corresponding to the three versions of the test. A total of 6194 student scores were included in the analyses.
Faculty members who had instructed at Waterloo for at least ten years, had done some instruction in first-year courses, and came from the faculties of AHS, Engineering, Mathematics and/or Science were surveyed about the preparation of incoming students. "In all, 52 out of 78 eligible instructors answered the survey."
The results indicated "that students taking the test in 1992 scored about 1.8 (rounded) fewer questions correct than did their 1991 counterparts. The next year's decline was 1.842, again rounding to 1.8. Essentially here we would conclude that each year's class was getting nearly two fewer questions correct than in the previous year, rather a notable decline." The models for the other two periods revealed only minor fluctuations with no statistical significance. "It is convincing that a sharp drop in the early 1990's, with a tapering later, was predicted by our informants independently of the time series just described."
A similar trend was detected at the University of Western Ontario where Essex (1997) took a diagnostic test first used in 1984 and re-administered it in 1992.
In the survey of professors at Waterloo, 47% felt academic preparation of first-year students decreased relative to first-year students in 1987, 41% perceived no trend, and 12% indicated an increase in preparation. The authors give some possible explanations for the differences in perception. For example, "Maybe it is to be expected that results from the survey of faculty member perceptions gave rather divided views, especially given that the years of exposure to first year students does vary within the sample, and the sample size is too small to control this." Miller and Goyder also provide quotations from faculty members on the differences they perceived. Generally, the comments imply problems with insufficient preparation in the high schools and first-year students' attitudes and expectations based on their high school experiences. Although student preparation appeared to decline in the early 1990's, a number of faculty members did not attribute this to less able students, just to less prepared students.
In the paper's conclusion, the authors also point to differences in "workplace culture among faculty in mathematics compared to sister faculties such as engineering, science or applied health sciences." "Judging from the questionnaires, many mathematics instructors feel particularly pressured to adjust their grades. This aspect of the organizational culture of the mathematics faculty seems to make frontline instructors particularly sensitive to issues of standards."
Shannon McKenna is a Master of Arts student in the Department of Sociology. She has been a teaching assistant (TA) for the Introduction to Sociology and Sociology of Adolescence courses since enrolling in the program. As part of her TA duties, she has had the opportunity to provide assistance to sociology students at both the individual and group level.
Before coming to Waterloo, Shannon was involved with students as a lab facilitator in the communication department at the University College of Cape Breton. This experience allowed her to exercise her knowledge in a variety of communication-related areas, such as presentation skills, interviewing techniques, conflict management, group work, class participation, and methods of assessment.
Shannon comes to the TRACE Office eager to share her recent experiences and to broaden her understanding of university teaching. Feel free to contact her at: firstname.lastname@example.org.
We have learned that our web-based registration system does not forward to us registrations completed using Internet Explorer. If using Explorer, you will receive a registration confirmation; however, it is not valid since we do not receive your registration. The solution? For now, please use Netscape to register or call us at Ext. 3132. We will continue to work to resolve this technical glitch. Thank you for your patience.
Workshops for the spring 2001 term:
- Improving students' learning practices: May 8, 12 - 1:30 p.m.
- Varying your teaching activities: June 6, 12 - 1:30 p.m.
- Reflecting on your teaching: July 12, 12 - 1:30 p.m.
- Teaching dossiers (part TBA): date and time TBA
For more specific details, watch for notices in your department and via the Certificate listserv. If you would like to join the listserv, please email the TRACE Office.
TRACE now uses a Web workshop registration form. Please read the: "Workshop Registration Notice" for an update.