AAA HQ & Digital Services Blog

news and views from AAA headquarters


    Want to get your students engaged? Try PollEverywhere
    blog entry posted December 29, 2011 by Julie Smith David, last edited February 10, 2012 
    1194 Views, 6 Comments

    For the last year, the Conference on Teaching and Learning in Accounting (CTLA) has been participating in the Regional Meetings, and I've had the great fortune to work with some of the best scholars of teaching in those events... and I have learned a LOT from them!!

    One thing I have struggled with is how to engage students, efficiently, during class... and I got some help with this from Mike Meyer.  He introduced all of us to a tool called PollEverywhere that allows you to take real time polls - using your audience's cell phones!!  You embed a poll into your PowerPoints, your students send text messages for their responses, and the PowerPoint updates in real time so you can see what your students are thinking.  It's clickers without the pain of the devices!!

    Have you used clickers - or tools like PollEverywhere - to liven up your classes?  If so, please click "more..." below, and then use comments to let us know how you've done it, and any keys to success. (And I've included some more details about using the tool that you'll see when you click "more...", too) 

    more:

    To get started, just click the "Create your first poll" button, enter your question, and press enter.  If you want it to be multiple choice, click "Convert to Multiple Choice" and enter your options, and then click "Continue."

    Once your poll is created, people send text messages to submit their responses.  In general, there will be one number to which you send the texts.  To record the response, the text body will be a specified number (for multiple choice) or a number and free form text (for open ended questions).  PollEverywhere provides you with overheads to explain the process - and I used them the first time I introduced this to class - and it was EASY to follow!
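    Purely as an illustration of the mechanics described above (the keyword codes and messages here are made up; this is not PollEverywhere's actual protocol or API), tallying texted responses amounts to mapping each keyword code onto an answer choice and counting:

```python
from collections import Counter

# Hypothetical mapping: each multiple-choice option is assigned a keyword
# code that students text to a shared number (codes invented for this sketch).
OPTION_CODES = {"12345": "A", "12346": "B", "12347": "C"}

def tally_votes(messages):
    """Count incoming text bodies against the registered option codes,
    ignoring anything that isn't a recognized code."""
    counts = Counter()
    for body in messages:
        code = body.strip()
        if code in OPTION_CODES:
            counts[OPTION_CODES[code]] += 1
    return counts

votes = tally_votes(["12345", "12346", "12345", "hello", "12347"])
print(dict(votes))  # {'A': 2, 'B': 1, 'C': 1}
```

    The real service does this server-side and pushes the running totals into your slide; the point is just that the "text a number" step is a simple keyword-to-choice lookup.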

    You can create an account so you can store and reuse your questions.  And if you have 30 or fewer respondents, then the tool is FREE (my holiday gift to you!!).  For more students, or record tracking, there are several packages that you can explore.

    Let me know what you discover, and how it works!!


    • Dan Stone

      Thanks Julie!

      I used and liked clickers (with turning point software). Only drawback was getting the clickers (either students had to buy them or I had to pick up and deliver them daily). 

      I asked my University's IT shop about using polleverywhere and they were (unsurprisingly) cautious. I think their concern is that they don't want to get into supporting a product they are unfamiliar with, and one that competes with an existing contracted supplier (Turning Point). 

      Still, I may test the waters on polleverywhere later this month in my advanced systems class.

       

       

    • Robert E Jensen

      Over 24 years ago, Barry Rice believed in the learning power of classroom electronic response pads (clickers).
      He was right if they are used correctly ---
      http://www.trinity.edu/rjensen/000aaa/thetools.htm#ResponsePads

      "Does Using Clickers in the Classroom Matter to Student Performance and Satisfaction When Taking the Introductory Financial Accounting Course?" by Ronald F. Premuroso, Lei Tong, and Teresa K. Beed, Issues in Accounting Education, November 2011, pp. 701-724
      http://aaajournals.org/doi/abs/10.2308/iace-50066
      There is a fee for the full text version

      ABSTRACT:

      Teaching and student success in the classroom involve incorporating various sound pedagogy and technologies that improve and enhance student learning and understanding. Before entering their major field of study, business and accounting majors generally must take a rigorous introductory course in financial accounting. Technological innovations utilized in the classroom to teach this course include Audience Response Systems (ARS), whereby the instructor poses questions related to the course material to students who each respond by using a clicker and receiving immediate feedback. In a highly controlled experimental situation, we find significant improvements in the overall student examination performance when teaching this course using clickers as compared to traditional classroom teaching techniques. Finally, using a survey at the end of the introductory financial accounting course taught with the use of clickers, we add to the growing literature supporting student satisfaction with use of this type of technology in the classroom. As universities look for ways to restrain operating costs without compromising the pedagogy of core requirement classes such as the introductory financial accounting course, our results should be of interest to educators, administrators, and student retention offices, as well as to the developers and manufacturers of these classroom support technologies.

       

    • Robert E Jensen

      "Some interesting findings and unanswered questions about clicker implementations," by Robert Talbert, Chronicle of Higher Education, January 4, 2012 ---
      Click Here
      http://chronicle.com/blognetwork/castingoutnines/2012/01/04/some-interesting-findings-and-unanswered-questions-about-clicker-implementations/?sid=wc&utm_source=wc&utm_medium=en

      I have been using clickers in my classes for three years now, and for me, there’s no going back. The “agile teaching” model that clickers enable suits my teaching style very well and helps my students learn. But I have to say that until reading this Educause article on the flight out to Boston on Sunday, I hadn’t given much thought to how the clicker implementation model chosen by the institution might affect how my students learn.

      Different institutions implement clickers differently, of course. The article studies three different implementation models: the students-pay-without-incentive (SPWOI) approach, where students buy the clickers for class but the class has no graded component for clicker use; the students-pay-with-incentive (SPWI) approach, where students purchase clickers and there’s some grade incentive in class for using them (usually participation credit, but this can vary too); and the institution-pays-clicker-kit (IPCK) approach, where the institution purchases a box of clickers (a “clicker kit”) for an instructor, and the instructor brings them to class.

      For me, the most interesting finding in the study was that there appears to be a threshold for the perceived usefulness of clickers among students. The study found that in the SPWOI approach, 72% of student respondents said they would buy a clicker if it was used in at least three courses they were taking per semester. But drop that number to “at least two courses” and the percentage drops to 24%! So once the saturation level of clicker use reaches something like 50–75% of a student’s course load, they start seeing the devices as worth the money, even with no grade attached to its use. (Only a depressing 13% of students said they would pay $50 for a clicker based solely on its value as a learning tool. We have some P.R. to do, it seems.)

      In the SPWI approach, 65% of respondents said they would buy a clicker if the contribution of clicker use toward their course grades was between 3% and 5%. (This is sort of mystifying. What do the other 35% do? Steal one? Just forfeit that portion of their grade?) The study doesn’t say explicitly, but it implies that if the grade contribution is less than 3%, the percentage would drop — how precipitously, we don’t know.

      The study goes on to give a decision tree to help institutions figure out which implementation model to choose. Interestingly, if it gets down to choosing between the SPWI and SPWOI models, the deciding factor is whether the institution can manage cheating with the clickers. If so, then go with SPWI. Otherwise, go SPWOI — that is, if you can’t control cheating, don’t offer incentives.

      Here at GVSU, I use the SPWI approach. Students have to pay for the clickers, but they get 5% of their course grade for participation. I take attendance at each class using the Attendance app for the iPhone. Then, once or twice a week, I’ll cross-check the attendance records with the clicker records for the day. If a student is present but doesn’t respond to all the clicker questions, they lose participation credit for the day. This method also mitigates cheating; if a student is absent for the day but has records of clicker response, then I hold the student guilty of cheating, because someone else is entering data for them. (Putting the burden on the absent student makes it less likely they’ll give their clicker to someone else to cheat for them.)

      Continued in article
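      Talbert's weekly cross-check boils down to comparing two sets of student IDs. A minimal sketch (with hypothetical IDs and record formats, not his actual tooling) could look like this:

```python
# Sketch of the attendance/clicker cross-check described in the quoted
# passage: students present without clicker responses lose participation
# credit; clicker responses from absent students suggest cheating.
def cross_check(present_ids, clicker_ids):
    """Return (lost_credit, suspected_cheating) as sets of student IDs."""
    present = set(present_ids)
    clicked = set(clicker_ids)
    lost_credit = present - clicked   # attended, but no clicker responses
    suspected = clicked - present     # clicker data while marked absent
    return lost_credit, suspected

lost, flagged = cross_check({"s01", "s02", "s03"}, {"s02", "s03", "s04"})
print(sorted(lost), sorted(flagged))  # ['s01'] ['s04']
```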

    • Richard E Lillie

      Hi Julie,

      I was aware of Poll Everywhere but had not used it with a course.  However, after reading your post, I decided to give it a try.  I'm using Poll Everywhere with my ACCT 620 (Internal Audit and Management Decisions) course during WQ 2012.  After talking with the tech support people at Poll Everywhere,  I subscribed to the "Personal" option for $15/month.  This is pretty reasonable for what I want to do for a one-quarter course and allows up to 50 responses per poll question.

      I plan to use it to take class attendance and for questions peppered throughout course topics.  I'll post comments to the Teaching with Technology blog to share my experience with using Poll Everywhere.

      Thanks for posting your comments.  It got me started using Poll Everywhere.

      Rick Lillie

    • Julie Smith David

      This software has a new release, with a pretty cool mobile interface.  If you use an iPad or your phone during class, you can start and stop polls from that device, and push the poll to your audience's mobile devices.  I haven't tried it, but it looks interesting: 

      http://blog.polleverywhere.com/new-mobile-presenter-tools/

    • Robert E Jensen

      Clickers (Response Pads)  in the Classroom ---
       http://www.trinity.edu/rjensen/000aaa/thetools.htm#ResponsePads

      Question
      Should instructors stop using clickers in the classroom?

      "Teaching Tax: On Clickers and Laptops," by Sam Brunson, SurlySubgroup Blog, June 8, 2016 ---
      https://surlysubgroup.com/2016/06/09/teaching-tax-on-clickers-and-laptops/

      Classroom Clickers After 20 Years of Application ---
      http://www.trinity.edu/rjensen/000aaa/thetools.htm#ResponsePads

      "Episode 90: Growing Pains for ‘Clickers’," Jeffrey R. Young, Chronicle of Higher Education, December 7, 2011 --- Click Here
      http://chronicle.com/blogs/techtherapy/2011/12/07/episode-90-growing-pains-for-%E2%80%98clickers%E2%80%99/?sid=wc&utm_source=wc&utm_medium=en

      Classroom response systems, or “clickers,” have been around for years, but only a small percentage of classes use them. Competing and incompatible brands, faculty reluctance to try new technologies, and confusion about which campus group should provide support for the devices all contribute to a slow adoption, says Derek Bruff, director of Vanderbilt University’s Center for Teaching and author of Teaching With Classroom Response Systems. The Tech Therapy team looks at how those gadgets can be seen as an example of the difficulty in moving technology beyond the early-adopter stage.

      Download this recording as an MP3 file, or subscribe to Tech Therapy on iTunes.

      Each month, The Chronicle’s Tech Therapy podcast offers analysis of and advice on what the latest gadgets and buzzwords mean for professors, administrators, and students. Join hosts Jeff Young, a Chronicle reporter, and Warren Arbogast, a technology consultant who works with colleges, for a lively discussion—as well as interviews with leading thinkers in technology.

      Jensen Comment
      Response pads have a long history dating back over 20 years in the classroom. HyperGraphics was one of the first companies to shift from wired to wireless clickers using the old DOS HyperGraphics course (learning) management software. My first dog and pony technology shows featured my managerial accounting course in HyperGraphics. My first gig was at the University of Wisconsin.

      It was October 4-5, 1990 when I made my first away-from-home dog and pony show featuring HyperGraphics technology --- at the University of Wisconsin. HyperGraphics software pretty much died after Windows replaced the DOS operating system in PCs. I then shifted my managerial accounting and accounting theory courses to ToolBooks for the PC. My out-of-town dog and pony shows really commenced to roll when my university hosts invested in those old three-barrel color projectors that predated LCD projectors. I eventually made hundreds of presentations of HyperGraphics and then ToolBooks on college campuses in the United States, Canada, Mexico, Finland, Sweden, Germany, Holland, and the United Kingdom (where I lugged my full PC and LCD projector between five campuses as the European Accounting Association Visiting Professor). Many of my campus visits and topics are listed at http://www.trinity.edu/rjensen/Resume.htm#Presentations

      Shortly thereafter Loyola's Barry Rice with his ToolBooks became a much heavier user of clickers than me in his large accounting lectures.

      I think Bill Ellis at Furman University is a current user of clickers in his accounting courses.


      Use Plickers for quick checks for understanding to know whether your students are understanding big concepts and mastering key skills ---
      https://www.plickers.com/
      Thank you Sharon Garvin for the heads up.


      Audience Response --- http://en.wikipedia.org/wiki/Audience_response

      "App Tries to Increase Student Participation by Simplifying Clicker Technology," by Angela Chen, Chronicle of Higher Education, July 11, 2012 --- Click Here
      http://chronicle.com/blogs/wiredcampus/app-tries-to-increase-student-participation-by-simplifying-clicker-technology/37855?cid=wc&utm_source=wc&utm_medium=en

      From clickers to programs like Learning Catalytics—which data-mines to match students with discussion partners—student-response systems are becoming more and more sophisticated. But Liam Kaufman, a graduate of the University of Toronto, thinks that the key to effective feedback is a tool with fewer bells and whistles.

      Mr. Kaufman is the developer of Understoodit, a browser-based app that lets students indicate their level of comprehension during class, and then see how much everyone else understands.

      The idea is that, during a lecture, everyone runs the Understoodit Web site, which is also accessible via mobile and tablet devices. Students press buttons to indicate that they either understand the material or are confused by it. The feedback is displayed in real time, in the form of a “confus-o-meter” and an “understand-o-meter,” which show the percentage of students who comprehend the material.

      The app was inspired by clickers, Mr. Kaufman says. But whereas clickers usually require students to answer questions so the professor can gauge their understanding, Understoodit lets them directly indicate confusion or comprehension, which is then available for everyone to see. That approach, he hopes, will encourage students to ask more questions when they realize that others are confused as well.

      Mr. Kaufman first tested the app on an entry-level computer-science class at the University of Toronto in February. The app is still in beta testing, and available by invitation only. More than 2,000 people have signed up so far, Mr. Kaufman says, including professors at institutions such as Harvard University, Stanford University, and the University of Pennsylvania.

      Continued in article

       


      January 10, 2012 reply from Steve Hornik

      Late reply to this thread, but my memory is pretty bad and I was trying to remember a "clicker" alternative. I finally did: it's Poll Everywhere, and it works the same way as clickers. I've used it for presentations at AAA meetings a few years ago; here's a link if anyone is interested in finding out more:

      http://www.polleverywhere.com/ 

      _________________________
      Dr. Steven Hornik
      University of Central Florida
      Dixon School of Accounting
      407-823-5739
      http://about.me/shornik

       


      Will classroom clickers be obsolete if each student in class is online?

      October 22, 2009 message from Bill Ellis [bill.ellis@furman.edu]

      http://www.sapweb20.com/blog/powerpoint-twitter-tools/

      Here’s new software from a reliable source. I’ve not tried this yet, but it might have a use in classrooms.

      FREE PowerPoint Twitter Tools
      Ever wanted to make presentations a more interactive, Web 2.0 experience? A prototype version of the PowerPoint Twitter Tools is now available for testing. Created using SAP BusinessObjects Xcelsius <http://www.sap.com/solutions/sapbusinessobjects/sme/reporting-dashboarding/index.epx> (but requiring only PowerPoint for Windows and Adobe Flash to run), the twitter tools allow presenters to see and react to tweets in real-time, embedded directly within their presentations, either as a ticker or refreshable comment page. There are currently six tools:

      • PowerPoint Twitter feedback slide
      • PowerPoint Twitter ticker bar
      • PowerPoint Twitter update bar
      • PowerPoint Twitter voting — bar charts and pie chart
      • PowerPoint Mood meter
      • PowerPoint Crowd meter     

      Jensen Comment
      Thanks for this heads up Bill. For over a decade I taught in an electronic classroom where each student work station had software that made clickers unnecessary, although clickers would still be useful for students not having computers at their seats. The above software does more than most electronic classroom software to date.

      I summarize the history of classroom clickers (response pads) below.
      This includes a previous message from Bill Ellis and reference to an early adopter back in the 1980s --- our own AECM founder Barry Rice (who by the way was a very popular old-style ToolBook lecturer when using response pads).

      The main advantage of response pads, in my viewpoint, is that they help hold student attention in a lecture because of fear/anticipation of being called on. I used an Excel program that not only called on a student at random, it flashed his/her picture on the screen.

      My electronic classroom software could also instantly flash whatever was on any student’s workstation screen. This prevented students from doing email or playing computer games in class --- or so I discovered after embarrassing a few students early on in the course. If a student seemed to be furiously typing an email message in class, I flipped that student’s screen in front of the class. Some would begin “Dear Mom.”

      Students can write to their moms after class.

      Bob Jensen

      May 26, 2009 message from Bill Ellis [bill.ellis@furman.edu]

      I thought I’d pass along this email on clickers and recommend a new book by Derek Bruff.

      I’ve been using clickers for almost two years now in Principles, Advanced and Governmental accounting courses at GTC and Furman. The comments by Derek Bruff, a Furman graduate, below are right on target.

      Accountability and engagement are the primary two features clickers have brought into my classrooms. There is no place for shy students to hide. A response is demanded and every student’s score is recorded. Every student is engaged not only by having to answer questions throughout the lecture, but in discussions using “think-pair-share” techniques that reinforce learning in a very active way.

      I don’t use clickers for grades but do let students know their “scores” and class averages. I’ve seen a high positive correlation between responses on the question “how many hours did you study this week?” to a student’s clicker score for the lecture. If students miss a question that gives me an early warning that I should go over that learning objective again.

      I’m convinced that clickers when used creatively help confidence, teaching and learning to improve.

      Bill Ellis, CPA, MPAcc
      Furman University
      Accounting UES

      May 26, 2009 message from Rick Reis <reis@stanford.edu>

      Date: Tue, 26 May 2009 08:30:20 -0700
      From: Rick Reis <reis@stanford.edu>
      Subject: TP Msg. #950 Clickers
      To: tomorrows-professor@lists.stanford.edu 

      "Instead of creating chaos, faculty find that when everyone gets a remote control (and you ask good questions), everyone ends up on the same channel."

      Folks:

      The posting below looks at the impact of an important new technology on faculty lecturing and student learning. It is by James Rhem, executive director of the National Teaching & Learning Forum, and is #45 in a series of selected excerpts from the NT&LF newsletter reproduced here as part of our "Shared Mission Partnership." NT&LF has a wealth of information on all aspects of teaching and learning. If you are not already a subscriber, you can check it out at [http://www.ntlf.com/] The on-line edition of the Forum, like the printed version, offers subscribers insight from colleagues eager to share new ways of helping students reach the highest levels of learning. National Teaching and Learning Forum Newsletter, Volume 18, Number 3, March 2009. Copyright 1996-2009. Published by James Rhem & Associates, Inc. All rights reserved worldwide. Reprinted with permission.

      Regards,

      Rick Reis reis@stanford.edu 

      Tomorrow's Teaching and Learning

      Clickers

      Clickers have been quietly marching over the horizon of attention for several years. Only early adopters, however, and schools with enough money and vision to try them have come to understand that, far from being simply the latest new gadget, they offer students a pedagogically powerful blend of intimacy and anonymity that can move them from passive to active learning with the click of a button (and a battery of well-crafted questions).

      Rapid improvements in the technology and especially the publication of Derek Bruff's Teaching with Classroom Response Systems: Creating Active Learning Environments (Jossey-Bass, 2009) seem poised to place clickers in faculty consciousness across the board. The attention the book has already received offers some index of the growing interest in clickers. Bruff has already been profiled by the on-line newsletter Inside Higher Education and the Chronicle of Higher Education.

      How They Work

      For those who don't know, clickers are hand-held devices similar to the remote controls for televisions and other media devices. They can send a specific electronic signal to a central receiving station connected to a computer equipped with software that tabulates the responses and can then display the distribution of answers on a bar graph.

      In operation, especially in quantitative fields with concrete correct and incorrect answers, a professor presents a multiple choice or true/false question. Students respond by pushing buttons for answers (a), (b), (c), and so on. Then, normally, the professor shows the bar graph of how the class answered. Quickly, students can see where they stand in terms of how well they understand the material, and (just as importantly) where their classmates stand, and where they stand in relation to these peers. And students get all of this very specific feedback on their learning without risking a moment of embarrassment. The anonymity of the system allows students to confront little important truths about their progress (or lack of it) without risking a thing.
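      The tabulate-and-display step described above is simple to sketch. Assuming responses arrive as single answer letters (a simplification; real systems tag each response with a device ID), a text-only version might be:

```python
from collections import Counter

def bar_graph(responses, width=20):
    """Tabulate responses and render a simple text bar graph showing
    the share of the class that chose each answer."""
    counts = Counter(responses)
    total = len(responses)
    lines = []
    for choice in sorted(counts):
        share = counts[choice] / total
        lines.append(f"({choice}) {'#' * round(share * width)} {share:.0%}")
    return "\n".join(lines)

print(bar_graph(["a", "b", "a", "a", "c"]))
# (a) ############ 60%
# (b) #### 20%
# (c) #### 20%
```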

      To faculty schooled a few generations back, when shame and guilt were felt to have at least some pedagogical value (that is to say, in a time when students felt ashamed to make a poor grade or come to class unprepared), the ascendance of this new teaching environment may seem strange. However, as the emphasis in education has shifted over the centuries from building character to simply learning, it all makes sense. (And, of course, whether shame and guilt actually built character remains an open question.)

      Anonymity's Advantages

      The anonymity is "pretty important," says Derek Bruff, who teaches mathematics and serves as assistant director of the Vanderbilt Center for Teaching. "Students are often hesitant to speak up in front of their peers," he says. "A key element in that is the desire not to be wrong or foolish in front of their peers, especially in a class where there are right/ wrong answers. In other classes, they don't want to stand out or be the one with the strange opinion."

      Peer pressure, says Bruff, "dampens conversation." The anonymity that clickers provide is one way of dealing with that. "It's not the only way," Bruff concedes. "There are professors that are able to create a safe environment where that's not a problem."

      If escaping peer pressure and taking refuge in anonymity prove such positive elements in teaching and learning, a question that comes immediately to mind is, where do cooperative learning and other small group activities fit in? The answer? On the next click, so to speak.

      Offering an answer via the clicker establishes a "buy-in," says Bruff, a commitment not simply to an answer but to the learning process. With this threshold crossed, passivity has begun to be left behind. The anonymity allows cumbersome emotional baggage to be left behind as well, lending both a purity and a more animated sense of mission to the next step, the familiar "think-pair-share."

      The "Think Moment"

      "We use the think-pair-share method a lot here," says Bruff, "think, talk with one, talk in the larger group. There's more risk at each stage, but giving students a warm-up experience is important because many need that moment. If a hand in the first row goes up to answer a question, their thinking is stopped. The class is then moving on. Maybe they needed 30 more seconds. Giving the 'think moment' is helpful. Then, in the pair, they get to practice saying what they think, and they get to hear other thinking which then sharpens theirs."

      The silent, private "think moment" operates like easing from warm water into hotter and hotter baths at a hot spring, and finally into strong currents where one may have to swim against the tide intellectually.

      Just as this technologically enhanced learning environment intensifies the focus on learning and recognizing where everyone stands in the process moment to moment, it also intensifies the burden on faculty to become "agile teachers." For example, when clickers first began to be used, showing the bar chart of student responses immediately was expected. As their use has grown and influenced faculty understanding of group behavior and learning patterns, whether to show or not to show the graph has become an important "thinking-on-your-feet" decision. Even if most students agree on a correct answer, how deeply do they understand the reasoning behind it? Sometimes, to make sure their learning goes more deeply, faculty withhold the results and ask students to turn to their neighbor and talk out the reasons for their answer, especially if their neighbor gave a different answer.

      "When I have that happen," says Bruff, "I tell my groups, 'Even if you agree, talk it out because you could both be wrong.' I want them to test themselves a little bit."

      It's the "thinking-on-your-feet" challenge that burdens faculty. "That's a roadblock for some faculty," says Bruff. "They want 'ballistic teaching,'" he says with a laugh. "Launch lecture, and once it's off, it's off on its way." Clickers offer lots of chances for mid-course corrections, but their use also demands something of a chess player's mentality of knowing not only how the pieces move, but which move to make next for maximum advantage. Sometimes, the best move does turn out to be "creating times for telling," says Bruff (using a phrase coined by Schwartz and Bransford), time for a little lecture students need and which skillful use of clicker questions can lead them to want. For example, anticipating a common misconception, faculty may ask a question experience has shown them most students will answer incorrectly.

      "The instructor then reveals the correct answer," says Bruff, "often through a demonstration. The students are surprised most of them got the answer wrong and it makes them want to hear why the right answer is right and the answer they gave is wrong."

      Making Good Questions

      Successful use of clickers turns on the skillful use of good questions. "Writing good questions I would have to say is the hardest part" of teaching with clickers, says Bruff. But it's also the most exciting part because it causes faculty to become intensely intentional about their teaching moment to moment, not just lecture to lecture. "That's why I like to talk about clickers with faculty," says Bruff, "because it generates this kind of conversation: 'What are my learning goals for my students?'"

      There are content questions asking for recall of information, conceptual questions seeking evidence of understanding, application questions, critical thinking questions, and free-response questions. Knowing when and how to ask the right kind of question, given where the students sitting before the faculty member actually are in their learning, becomes the proof of good teaching in that moment.

      One of the most interesting aspects to emerge from the use of clickers has to do with the flexibility of the multiple choice question to stimulate thinking and learning. "Many people think of the multiple choice question as being only about factual recall," says Bruff, but the one-best-answer variation probes much deeper. "A really good teacher can write really good wrong answers to a question," says Bruff, ones that key into common student difficulties with material. "I really like 40-60% of my students to get it wrong. And I'd like them to be split between a right choice and several wrong choices, because then that means I have tapped into some misconceptions that are fairly common and need to be addressed and the question is hard enough to be worth talking about."
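Bruff's rule of thumb — 40-60% answering incorrectly, with the wrong answers split across several distractors — can be sketched as a simple check on a response tally. This is just an illustration of the heuristic described above; the function name and exact thresholds are assumptions, not part of any clicker product.

```python
def worth_discussing(counts, correct, low=0.40, high=0.60):
    """Heuristic from Bruff's description: a question is worth talking
    about when roughly 40-60% of students answer incorrectly and the
    wrong answers are spread across several distractors.
    `counts` maps each choice to the number of votes it received."""
    total = sum(counts.values())
    if total == 0:
        return False
    wrong = total - counts.get(correct, 0)
    frac_wrong = wrong / total
    # How many distinct wrong choices actually attracted votes?
    wrong_choices = sum(1 for choice, n in counts.items()
                        if choice != correct and n > 0)
    return low <= frac_wrong <= high and wrong_choices >= 2

# 50% wrong, split across two distractors: worth discussing
print(worth_discussing({"A": 10, "B": 6, "C": 4}, correct="A"))  # True
```

A lopsided tally such as {"A": 19, "B": 1} would fail the check: nearly everyone is right, so there is little misconception to surface.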

      Metacognition and Confidence

      Some of the problems that have emerged in using clickers have also turned out to reveal opportunities for increasing student learning, or rather, student learning about their own learning. Bruff, a mathematician, began to ponder how much confidence he could have in student learning reported via true/false questions or even some multiple choice questions. In a true/false situation, for example, students might guess and have a 50% chance of lodging a correct answer. Multiple choice questions might be constructed to include an "I don't know" option, but then discouraging student engagement becomes an issue: students might retreat to the safety of an "I don't know" answer rather than commit to a response they felt uncertain about. Pondering this problem has led a number of pioneers in clicker use, like Dennis Jacobs at Notre Dame, to marry self-assessments of confidence levels with decisions about right or wrong answers. So, for example, in Jacobs' system (where clicker responses are graded) a correct answer in which a student indicated high confidence would receive five points. An incorrect answer that a student had expressed high confidence in would receive no points. On the other hand, an incorrect answer in which a student indicated low confidence would receive two points.

      "If a student gives a right answer," says Bruff, "but realizes they aren't confident in it, they have a little metacognitive moment thrust upon them: they have to ask themselves 'Why wasn't I more confident in my answer? What are the standards of evidence in this field that would allow me to be confident in my answer?'" By the same token, a student aware enough of his own learning to express low confidence in an incorrect answer receives partial credit for sensing that he didn't know, thus encouraging him as a learner rather than thumping him for getting something wrong. With this system, he gets both the positive and negative points to be made through the question.

      Creative Options Everywhere

      One of the strengths of Bruff's book on clicker use lies in the wide range of faculty examples he includes. That range evinces impressive imagination and commitment among faculty to improving student learning, a pleasure in itself when reading the book. And, while the dominant use of clickers falls in scientific fields, the book includes rich examples of skillful use of clickers in humanities courses as well. Moreover, while clickers offer the most efficient means of collecting student responses, the overall emphasis falls on the purpose of collecting those responses and on the dimensions of psychology, motivation, and cognition involved in their use. Hence, Bruff includes discussion of some low-tech means of collecting student responses as well.

      With clickers, as with so many other new technologies, the greatest benefit seems to lie in the way they uncover new means of improving one of the most ancient of transactions: teaching and learning. Socrates would be proud.

      Contact Derek Bruff at: Derek.bruff@vanderbilt.edu

      May 27, 2009 reply from Bob Jensen

      Hi Bill and Rick,

      One of the enthusiastic early adopters of response pads (clickers) in the hands of students during lectures was our AECM founder Barry Rice. Barry used the early technology called HyperGraphics for screen presentations and student responses on screen --- http://www.trinity.edu/rjensen/000aaa/thetools.htm#ResponsePads 

      HyperGraphics was DOS-based before the Windows operating system came on the scene. HyperGraphics had a unique niche in the DOS world but never competed well in the Windows/Mac worlds when ToolBook and Authorware came on the scene --- http://www.trinity.edu/rjensen/290wp/290wp.htm  This illustrates how technology can make and destroy software. ToolBook and Authorware, in turn, never competed well in academe after course technology became more Web-based. Now we have HTML, XML, Wikis, chat rooms, instant messaging, etc.

      But response pads (clickers) are still popular with many faculty in various academic disciplines. In a lecture, clickers offer a limited version of the response capabilities that online students get with full network access from their PC stations.

      I’m certain Barry Rice will be pleased with your 2009 testimonial about clicker use, a technique he used successfully as far back as 1989. Barry would probably still use clickers in lectures had he not switched to full-time administration many years ago.

      I had the luxury of teaching in an electronic classroom over the past two decades. Each student sat in front of a PC capable of easily interacting on screen and via earphones with the instructor and each other. With a flick of a button I could flash any student’s screen in front of the class just as a clicker response can be flashed in front of the entire class.

      What I did not develop software for was response aggregation. One advantage of clicker software is the power to instantly aggregate joint responses of all students in the class such as the number of responses for each of the choices in a multiple choice question. I think the Trinity University electronic classrooms now have such aggregation software that can slice and dice multiple student responses.
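The kind of instant aggregation described here amounts to tallying one response per student into a count per choice, which the software then renders as a bar chart. A minimal sketch, assuming responses arrive as a simple list of choice labels:

```python
from collections import Counter

def aggregate_responses(responses):
    """Tally clicker responses per choice, the instant aggregation
    step that precedes drawing the bar chart."""
    return Counter(responses)

votes = aggregate_responses(["A", "B", "A", "C", "A", "B"])
print(votes.most_common())  # [('A', 3), ('B', 2), ('C', 1)]
```

Real clicker software adds layers on top of this (per-student identity, cross-question slicing), but the core "slice and dice" operation is just this tally grouped by whatever dimension interests the instructor.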

      While many faculty users of clickers minimize clicker cheating by not providing student performance grades based on clicker usage, there are some that give credit in some form, including quiz points based upon clicker responses. This can create problems. One study on clicker cheating can be found at http://www.lychock.com/portfolio/Documents/final report.pdf

      Another problem in very large lectures might arise when clickers are used for taking attendance. Clickers are not very reliable for taking roll unless accompanied by some verification controls.

      Bob Jensen