Late last year, a discussion on Twitter revolved around how to give AP Stats students more regular experience with multiple-choice questions. One idea, put forth by @druink, was an MC Monday routine. She shared a form she created and asked for input and collaboration on its viability: number of questions, student reflection, etc.
I wanted to use the form but tweak it some more to allow me to track both general student progress and progress in the four strands of the AP Stats curriculum: Exploring Data, Generating Data, Probability, and Inference Methods. Part of this is to complete my teacher evaluation student growth component, but it will also help me see if there are areas that need revisiting later in the course (of course, there always are, but in the past spotting them was hit or miss rather than data driven).
One hindrance to tracking by strand was how to identify each question easily so it didn't become a time-suck. In addition, I needed a reasonably easy way to generate questions that were essentially at the AP level. So my department purchased a new ExamView test generator for our text, which already had its questions tagged with the AP standards. I also found that GradeCam can assign a learning target to questions. Unfortunately, the available targets were CCSS or state standards; luckily, the CCSS has "almost matching" statements for the four big strands, so I could label questions. Because I have been using GradeCam for unit tests as well as midterms and finals, students are very comfortable with the process of scanning their answers for me.
So today was the first day of trying out the process and I think it went great! Now I'll be able to track long-term retention of the key ideas as well as monitor individual students' growth (or lack thereof).
Yup, I'm a grading machine! Lots of finals to assess, lots of lives to change. Well, not really, but I think my students sometimes think that's the case. As I grade, I try to keep in mind final "growth" in understanding for each student. Isn't it tough to pick questions that give a good snapshot (and that is all the questions are, just a snapshot) of a student's understanding at that moment? There's so much pressure to make a final assessment that truly reflects each student's knowledge and understanding at the end of the semester.
I’m also a “muncher” as I grade, so I tried to be more healthy…see the grapes 🙂
As I mentioned yesterday, I am becoming a big fan of Google Forms as an easy and flexible medium for assessing student work. Needless to say, the possibilities for building student voice into the classroom are broad.
Today, I began to read through and grade my AP Statistics students' Bias Projects. The posters were hung in the classroom, as you saw yesterday. I took my high stool and my iPad and worked my way around the room. The Google Form I set up made it easy to assess and record. I used the "4" (top rating) descriptors under each category to help me focus on the key aspects of the category. For instance, one category is Data Collection. The descriptors included:
- Method of data collection is clearly described,
- Includes appropriate randomization,
- Describes efforts to reduce bias, variability, confounding,
- Quantity of data collected is appropriate.
The descriptors helped me gauge the quality of my students' products, and the Google Form drop-downs made it a breeze to grade quickly.
I spent about 7 minutes per poster reading and evaluating the quality of the work.
Once I finished assessing the final products, I posted them outside my room to share with the school. As students and teachers walk by, I hear lots of interesting comments…I sure hope it translates into increased enrollment next year.
Today my AP Stats students had the final unit test of this semester. It was on Sampling Distributions which can be fraught with conceptual as well as mechanical errors. So how do I address them? How do I remember them? How do I learn from them? How can I use them to inform my practice next year?
One thing I do is an Assessment Debrief on a lined Post-It note. As I grade and notice consistent errors or misconceptions, I document right then what I'm thinking. I used to just think about these things and would forget the gist by the time I handed back the assessments. This helps me organize so I can communicate better with my kiddos.
But the even better thing is that I stick the note on the key, so when I begin planning for the chapter the next year, I have detailed notes on the process errors as well as the big conceptual misunderstandings. Then I can plan ahead about how to draw out these misconceptions or errors early in the chapter. YAY!!
Well, here we are! Today's the day for my kiddos to shine. Although only a few students popped in this morning for last-minute help (that's a good thing, right?), the class seemed a little nervous at the start of the period, so I offered them a smile and a pat on the back at any time during the exam…some took me up on it at the end of the test…it sure made me smile!!
I wish I were more organized outside of school so that I could bake some pi sugar cookies (I think I have a pi cookie cutter somewhere). What do you do to help your students relax prior to and during an assessment?
I have always handed back my quizzes and some tests because I think it is important for students to take the time to review mistakes, ask questions, and seek better understanding through mistake analysis. At my school, the math department has been required by the school board to return all assessments, although most other departments are not. We asked if we could instead keep the assessments and make them available to students in our classrooms at any time; the answer was NO. This past year, we did finally convince our school board that we can't hand back our finals because of the time involved in creating them, and this was agreed upon.
Subsequently, over the years, students have created Facebook accounts for the various math courses we offer at the high school, and the returned assessments are scanned and posted for all to see. The purpose of these sites is to help students get good grades: they have access to the rich questions asked on our assessments and can go over them with their tutors (who also have the exams on file) until they have practiced the responses over and over again. Thus, for many students, the assessment no longer assesses their ability to bring conceptual understanding and procedural fluency to bear in new and unfamiliar situations.
For example, one question we have asked is the following:
Find the equation of the line (in the most efficient form) through the two given points on y = sec(x):
The goals of this question are for students to read and use the scale, evaluate y = sec x for various inputs, apply old knowledge in a new way (write an equation of a line with non-integer values), and be comfortable with unusual forms of answers. If students see this question prior to the test, it becomes a routine problem, and their responses no longer inform me about how well they are making sense of the structure inherent in the problem, applying basic skills of evaluating trig expressions and finding slope, and writing the equation in the most efficient form, namely point-slope form. Our calculus teachers say that students can use the calculus to generate information, but when it comes to writing an equation of a line (good ol' Alg 1 skill) they freeze up, mostly because the numbers aren't integers or decimals. We also emphasize why point-slope is a more efficient form than slope-intercept when working with radicals, radians, etc., because one doesn't have to calculate the y-intercept with "ugly" numbers.
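To illustrate what the work looks like (the points here are hypothetical, since the original graph isn't reproduced in this post), suppose the two marked points on y = sec x are at x = π/4 and x = π/3:

```latex
% Hypothetical points on y = sec x (the original graph is not shown here):
% at x = \pi/4, \sec(\pi/4) = \sqrt{2}; at x = \pi/3, \sec(\pi/3) = 2.
\[
  m \;=\; \frac{2 - \sqrt{2}}{\pi/3 - \pi/4}
    \;=\; \frac{2 - \sqrt{2}}{\pi/12}
    \;=\; \frac{12\,(2 - \sqrt{2})}{\pi}
\]
\[
  \text{Point-slope form:}\qquad
  y - 2 \;=\; \frac{12\,(2 - \sqrt{2})}{\pi}\left(x - \frac{\pi}{3}\right)
\]
```

Stopping at point-slope form avoids computing a y-intercept from these "ugly" numbers, which is exactly the efficiency argument above.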
This community practice makes it challenging to create new and authentic assessments that actually test student knowledge and not their ability to mimic assessments they have seen. Yes, I know students could pass around old assessments even if we kept them, but the effort and diligence to do this was large, and the follow-through for most students was very small. Now, with the electronic posting option, little effort or forethought is needed for a student to preview every single previously given assessment.
So today, for AP Stats, I am adjusting some of the questions on Thursday's assessment to update them, as well as including new questions that get at conceptual understanding along with computational fluency and contextual interpretation. I created (based on initial work done on the College Board site) a question grid like the one below. I have seen AP question grids done in an Excel spreadsheet so that searching for specific kinds of questions is easier. Maybe some day I'll switch over.
I spent some time looking for a question about Combining Random Variables and finalizing this year’s exams.
How much time do you spend writing assessments? Are you required to return assessments and if so, what is the impact on your assessment preparations and practice? Would love to hear some time saving ways to create authentic and powerful assessment questions.
My precalculus teaching partner and I decided we wanted to encourage long-term retention while also giving students some experience with multiple-choice exams, since our final exam has an MC component. So we instituted Quarter Midterms. Students do tend to perform a little below their usual level, but part of that is due to taking a multiple-choice exam in math, where the distractors are so enticing!
This year I used GradeCam to quick-score them, and it also gave me an item analysis. I hope to scatter the highly missed questions into second quarter as openers or exit questions. They will also make good discussion questions next year.
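For anyone without GradeCam, the item analysis itself is simple to reproduce by hand. Here is a minimal sketch: given each student's answers and the key, compute the miss rate per question and flag the most-missed items for reuse as openers or exit questions. (The data layout is hypothetical; GradeCam's actual export format differs.)

```python
# Hypothetical answer key and student responses (GradeCam's real export differs).
key = ["B", "D", "A", "C", "B"]

responses = {
    "student1": ["B", "D", "A", "A", "B"],
    "student2": ["B", "C", "A", "C", "D"],
    "student3": ["A", "D", "A", "A", "B"],
}

def miss_rates(key, responses):
    """Return a list of (question number, fraction of students who missed it)."""
    n = len(responses)
    rates = []
    for q, correct in enumerate(key):
        missed = sum(1 for answers in responses.values() if answers[q] != correct)
        rates.append((q + 1, missed / n))
    return rates

# Questions missed by at least half the class are candidates for revisiting.
highly_missed = [q for q, rate in miss_rates(key, responses) if rate >= 0.5]
print(highly_missed)  # → [4]
```

The 0.5 cutoff is arbitrary; lowering it surfaces more questions to recycle as warm-ups.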
I am soooooo excited. Today my AP Stats students start their end-of-year project. I just love to see how everything comes together for them. Every year I try a new medium in which they complete the project and as I posted earlier, this year students will present their results in an infographic.
I spent some time with a social studies colleague of mine who did infographics with her freshmen. She shared some best practices she gained from doing the project with her students. One of the key takeaways was that she had her students use Piktochart as the free medium for actually creating the infographic.
It is easy to use, entering or importing data is easy, there are pre-made templates, and once the infographic is complete, it can be turned into a slideshow presentation, so my students don't need to create a PowerPoint to present their findings.
As part of the set-up I had two basic guidelines:
Ask and Answer an Intellectually Interesting Question
For the first part of your application project, your pair will describe a particular problem that can be addressed through the primary statistical techniques we are studying this year: confidence intervals, hypothesis testing, and linear regression.
- Determine an intellectually interesting question and at least 5 supporting questions
- Identify the population of interest, needed sample size, etc.
- Design an unbiased survey/method to collect data that answers your question
- Collect the data (you will hand in the raw data collected)
- Analyze your question using statistical inference
Creating the Infographic
You should use appropriate data visualizations and other visual elements (colors, shapes, lines, typography, whitespace, and so on) in ways that enhance your infographic’s potential for communicating your work on the project. Your infographic can be any size or shape, but it must be of sufficient resolution to display well on the course blog. Your infographic should be designed so as to make sense to a fellow student in an AP Statistics course. Thus, you may assume that your audience is familiar with the material we have covered together as a class this year.
- Keep in mind that the central idea of the study should be a prominent feature of the infographic.
- Infographic Title should be informative
- Use statistical principles studied in this course: data displays, numerical analysis, inference and decisions (interpret p-value)
- Reveal data at several levels of detail, from a broad overview (at a distance) to the fine structure (closer inspection reveals more intricate information).
- Use your supporting questions to guide how you display various aspects of your data to convince your audience
- The graphic's legend should clearly match the graph
- Results of inference for the main question as culminating evidence: connect statistical results to the context of the problem.
- Harmonious and strategic use of color and design elements. No trivial or extraneous information, etc.
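As a quick illustration of the kind of inference and p-value interpretation the rubric asks for (the numbers here are hypothetical):

```latex
% Hypothetical one-proportion z-test: of n = 100 students surveyed,
% 62 answered "yes"; test H_0\colon p = 0.5 against H_a\colon p \neq 0.5.
\[
  z \;=\; \frac{\hat{p} - p_0}{\sqrt{p_0(1 - p_0)/n}}
    \;=\; \frac{0.62 - 0.50}{\sqrt{(0.5)(0.5)/100}}
    \;=\; \frac{0.12}{0.05} \;=\; 2.4
\]
\[
  \text{p-value} \;=\; 2\,P(Z \geq 2.4) \;\approx\; 0.016
\]
```

Interpreted in context: if the true proportion were really 0.5, a sample proportion at least this far from 0.5 would occur only about 1.6% of the time, so the sample gives convincing evidence against the null hypothesis.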
I am using a Google Form to collect information from the students about their partner choices, their main and supporting questions, hypotheses, data collection methods, etc. More to come in the following days!
I can't believe that all of my posts this week are about Precalculus and Conics! But my AP Stats students are taking their final in little chunks because of the Smarter Balanced testing and weird schedules of 30-minute periods.
So my students had class time to prepare, and to my delight, they actually worked in teams, went to the boards to explain problems to others, and generally used the time so productively. I felt like I had rays of sunbeams radiating from my smile 'cuz I was so darn proud of them!
P.S. Please excuse my messy room! We are still under construction and all of my “stuff” is just out there on the shelves. Embarrassing!