I've been happily teaching high school science for over 13 years. This website serves as a way for me to reflect on my practice, give back to the science educators' community, help other science teachers who may need a place to start, and build a strong community of science learners and educators.
What’s the big problem with many of the science research assignments we give our students today? They can all be done with ChatGPT and other AI tools. I have a colleague who ran into this problem with her science class and their space research project. Now, I still think research projects are useful – but, going forward, I’m going to do 3 things to AI-proof my research projects. Watch the video for more details. The tasks are briefly summarized below.
Episode Notes
Task 1: Write an argument using the information acquired from the research assignment. I can assess a student’s ability to write an argument through this task.
Task 2: Solve a puzzle using the information acquired from the research assignment. I can assess a student’s ability to apply knowledge to other scenarios through this task.
Task 3: Evaluate bias using the information acquired from the research project. I can assess a student’s ability to evaluate bias in secondary sources through this task.
Thanks for watching, and let’s talk science education again soon.
Our resources are free. We aren’t collecting emails for our resources. However, it would help us out if you liked us on our Facebook page and subscribed to our Youtube Channel. Thanks!
When it comes to writing tests while using standards-based assessment, my students need to demonstrate greater depth in their written responses in order to demonstrate greater levels of proficiency. However, because I need students to give more depth, many students now spend more time forming their responses and, thus, are unable to finish their tests. One solution is to make the tests shorter by giving fewer questions – but I’ve already streamlined my tests to the fewest questions possible. So, in the end, I decided to split my tests up.
Episode Notes
For tests created to assess objectives and standards (i.e., standards-based grading), start by declaring which standards to assess. For example, for a chemistry test, are you planning to assess a student’s ability to analyze data and/or to write an argument? What about other standards?
Then, decide on how you’ll go about assessing the standard on the test. Will you use multiple choice questions or extended response questions or both? Or, will you ask students to write an essay? Aim to create a 30-40 minute task for each standard.
Split the test into separate parts that are based on the standards. Thus, on one day, give students the part that assesses one standard. Then, on the next day, give students the part that assesses another standard. For example, on Monday, give students questions that assess how well students analyze data and information. Then, on Tuesday, give students an essay question that assesses how well students can write an argument.
Make sure to let students know ahead of time which standards will be assessed on which day.
Thanks for watching, and let’s talk science education again soon.
Strength-based comments take a lot of time to write. Recently, it took me 7 hours to enter the comments for each of my students in 6 science classes. It takes this long for a number of reasons: it’s difficult to find comments that match the strength-based format in traditional comment banks; and most teachers don’t think about comments until the end of the term or school year – and, as a result, we’re left scrambling to remember what each student is strong and weak in. After my 7-hour experience with comments this time, I figured out that I had it all backwards – and that I can potentially produce strength-based comments for my students by starting with the comments first (i.e., at the beginning of the year/term).
Episode Notes
Strength-based comments emphasize a student’s capabilities and aptitudes. Comments about a student will be individualized, specific, and may include the student’s own voice.
One framework for strength-based comments has teachers write comments in the following 3 categories for each student: (1) Strengths, (2) Areas for Further Development, and (3) Ways to Support Learning at Home.
Using a template of prompts from the Yukon Ministry of Education to come up with a range of comments for each category, I write a list of comments that speak to the skills and concepts we covered and practiced this term. Then, I copy and paste the full list of comments for each student. And, as I go through each student’s report card, I delete the comments that do not apply to that particular student.
To save time next reporting period, I’ll work backwards and come up with the range of comments at the beginning of the term. This will not only help focus my teaching but will also help focus what I’ll be looking for in each student. Then, at the end of the term, it will be faster to delete all the comments that don’t apply.
Thanks for watching, and let’s talk science education again soon.
Are you looking for an easy and engaging activity to introduce CER next year? Here’s one you can try – and to add a bit of fun, you’ll be working with a colleague and collecting some summer vacation artifacts over the summer to use as part of a show-and-tell for this CER activity. Don’t worry, it won’t take up much time – no different than keeping a small summer memory box.
Episode Notes
CER is a framework to help students write an argument, which is a skill that is taught in our curriculum (under “Communicating” in the BC Science curriculum and under Science and Engineering Practices in the NGSS). Thus, for a CER intro activity, let’s have our students engage in argument – but, let’s make it easy for them to come up with one.
For this CER intro activity, get a colleague to collect 5-7 unique artifacts over the summer that paint a picture of what they did during their summer break. This could be a ticket stub to a movie, a seashell from the beach, a sweater they knit – you get the idea. And you should do the same during your vacation – collect 5-7 unique artifacts over the summer as well.
When you do your intro class to CER, lay out all the artifacts that you and your colleague have collected and tell students where the artifacts are from. Then, tell your students to use CER to structure their argument to this question: Which teacher – Teacher A or Teacher B – had the better summer vacation?
Thanks for watching, and let’s talk science education again soon.
How do we get more marks out of the projects we assign? This is important because I want to reduce the amount of time I spend marking projects while still getting more feedback and marks for my students. This is especially true because switching to Standards Based Grading has forced me to change what I assess and how I assess it. For example, I now need to assess Planning and Conducting and Designing Solutions (aka. Applying and Innovating). I could assign a bunch of labs and projects – say, 8 each term – and mark each one for a student’s ability to plan and conduct experiments or design solutions. But that’s a lot of work, and a lot of marking. Instead, I’ve learned how to get more marks off of one project – and this has saved me a lot of time.
Episode Notes
Assign projects that allow students to build and test solutions to a problem – AND, the results of these solutions need to be measurable. For each project, require students to build and test multiple prototypes. And, mark each prototype. This sounds like a lot of work, but I mark prototypes by looking at the measurable results each one achieved, which doesn’t take a lot of time at all.
For each prototype, record how well the prototype worked against a standard that you set. Record these marks for Designing Solutions. If you require students to make 3 prototypes over the course of the project, then you’ll have 3 marks for Designing Solutions (aka. Applying and Innovating).
Also, give marks for improvements made between prototypes. For example, if students made 3 prototypes, give an improvement mark between prototype 1 and prototype 2. And, give another improvement mark between prototype 2 and prototype 3. These marks can be assigned to Planning and Conducting.
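The scheme above boils down to a quick calculation. Here’s a minimal sketch, assuming each prototype’s result is a single measurable number scored against a standard you set – the function names, the capped-fraction scoring, and the bridge-load example are mine for illustration, not from the episode:

```python
# Hypothetical sketch of the marking scheme described above: prototype
# results are scored against a set standard (Designing Solutions), and
# each pair of consecutive prototypes earns an improvement mark
# (Planning and Conducting).

def design_marks(results, standard):
    """One Designing Solutions mark per prototype: the measured result
    as a fraction of the standard, capped at 1.0."""
    return [min(r / standard, 1.0) for r in results]

def improvement_marks(results):
    """One Planning and Conducting mark per pair of consecutive
    prototypes: full mark if the later prototype measurably improved."""
    return [1.0 if later > earlier else 0.0
            for earlier, later in zip(results, results[1:])]

# Example: a student builds 3 bridge prototypes; we measure the load
# (in grams) each one holds, against a standard of 500 g.
loads = [300, 450, 520]
print(design_marks(loads, 500))    # -> [0.6, 0.9, 1.0]  (3 Designing Solutions marks)
print(improvement_marks(loads))    # -> [1.0, 1.0]       (2 improvement marks)
```

So one project with 3 prototypes yields 5 marks with almost no extra marking time, since each mark comes straight off a measurement.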
Thanks for watching, and let’s talk science education again soon.
An issue some teachers using Standards Based Grading (SBG) are grappling with is that there are no percentages, no cumulative points, no tests out of 50 or 100 or whatever in SBG. Instead, we use proficiency scales and rubrics that assess whether a student’s current level of proficiency is emerging, developing, proficient, or extending. But, how do we determine what’s developing, proficient, or extending? Turns out, we can look to our old standardized exams – especially their answer keys – to help us construct our proficiency scales.
Episode Notes
Many old standardized exams have answer keys that show exactly what steps students need to provide on an extended response question to get partial or full marks (see handouts for a sample). From what I was told, those who marked these standardized exams first spent a day discussing what steps students needed to show in order to earn each mark on a multi-mark question.
With regard to proficiency scales and standards-based grading, we need to sit down with our colleagues and discuss what students need to demonstrate at each proficiency level for each competency, skill, or practice – so that both we and our students know the expectations at each proficiency level.
One way we can do this is by holding a group marking session where we take test questions, have everyone mark the same question by themselves, and then share out what proficiency they felt the student had achieved. It’s through these conversations with our colleagues that we can hash out exactly what is proficient, developing, or extending. It doesn’t have to be complicated – you can take a test that you’ve already given students and pass around a few responses for colleagues to discuss.
Thanks for watching, and let’s talk science education again soon.
Are you looking for a way to improve student reading comprehension in your science classroom? Have you tried Bionic Reading? It’s an app that bolds the first few letters of each word in a text – and it’s theorized to help individuals read faster and comprehend more. Does it work in the science classroom? We tried it with our science students, and we invite you to use our handouts to do the same with yours.
Watch the video to see how we set up our class experiment
Episode Notes
According to their website, Bionic Reading is supposed to guide the eyes through text with artificial fixation points. As a result, the reader focuses only on the highlighted initial letters and lets the brain complete the word.
For students with ADHD, who may be sensitive to a lot of stimuli, the bolded letters may help ground them in the text. When I asked my students – many of whom did not have ADHD but used Bionic Reading for my experiment – how they felt using it, many echoed the same thing: that their eyes were able to skip over words more quickly and their minds were able to predict the words. This supports the idea that Bionic Reading may help engage the mind more during reading.
My preliminary results – which include 132 students from Grades 8 to 12 – show that students who used Bionic Reading read, on average, 10 seconds faster with approximately the same test accuracy as students who didn’t use Bionic Reading.
Thanks for watching, and let’s talk science education again soon.
Here’s a question that comes up a lot in my workshops: If Standards Based Grading is meant to assess the science skills (aka. curricular competencies, SEPs – science and engineering practices) a student can demonstrate, what about the science content? Aren’t we doing a disservice by not testing to see if students actually understand and know the science content?
Episode Notes
The content is still there – we still teach content – but the content should be used to teach and practice science skills (aka. curricular competencies, SEPs in the NGSS), which is what we assess in Standards Based Grading.
Craft better test questions that require students to include content while demonstrating a skill. For example, students could be tasked to write an argument based on data that is given to them on a science test. Students could be required to use the CERR (Claim, Evidence, Reasoning, and Rebuttal) framework for their response. Under the reasoning and/or rebuttal sections, students would need to include science concepts and content that provide explanations to their argument.
Thanks for watching, and let’s talk science education again soon.
How do you give retests when using standards-based grading? My answer – I don’t give retests. This is an excellent benefit of using standards-based assessment because, now, I don’t spend additional time creating and marking retests.
Episode Notes
A traditional marks book setup focuses on mastery of information and content. Since specific content will appear only on specific tests, students who don’t do well on specific content will want a retest for the corresponding test.
A marks book setup where students are assessed on curricular skills and practices using Standards Based Grading focuses on mastery of skill. If a student doesn’t do well on specific skills, they are given other opportunities on future tests to demonstrate mastery of that skill. Thus, I don’t have to give a retest when the focus is on skills and practices because those skills come up over and over.
Content still appears on tests. But the content is what I use to evaluate the skills. For example, the argument students craft on a chemistry test needs to include chemistry ideas in order for it to be considered PRF (proficient) or EXT (extending).
Thanks for reading, and let’s talk science education again soon.
Can multiple choice questions be used to assess standards? Yes, they absolutely can. But, to ensure we’re using multiple choice to assess standards effectively, we need to re-examine the multiple choice questions we use and come up with certain types of multiple choice questions we consistently use to assess standards. In this episode, I go over examples of questions I’ve used to assess curricular standards like Questioning and Predicting.
Using Multiple Choice in Standards Based Grading
NOTE: Our transcript is below. Download handouts at the bottom of our page and follow along! Or, watch the video.
Truth be told, I love multiple choice because crafting good multiple choice questions AND good multiple choice responses is both art and science. When I come across a good multiple choice question, there’s just something about it that makes me say “ah-ha, that’s a cool way of testing that idea” – and I think all teachers sort of geek out that way too. That’s how I approach using multiple choice in SBG – I look to see if a question gives me that “ah-ha” feeling.
Thus, there are questions that simply don’t give me that feeling – and these are the stereotypical multiple choice questions that are based on recall – like a question that asks “what is the definition of”. We need to get rid of these questions because they’re more memorization than skill. Having said that, I’m not saying there isn’t going to be content on a standards based test – just that the content will be used differently on our test.
So, we need to come up with some types of questions to use for SBG. And here’s the key I’ve discovered to doing this effectively and efficiently: for each competency I’m assessing, I use the same passage types and types of questioning.
For standards or competencies related to Questioning and Predicting, I give students the passage type known as Dueling Hypotheses – where I bring up multiple hypotheses to a phenomenon and students need to analyze the hypotheses and form predictions off these hypotheses. Here’s a passage I created where I give students 3 hypotheses for the formation of acne, which is of interest to students. One hypothesis says acne is caused by diet. Another states acne is caused by bacteria and dirt. A 3rd hypothesis says acne is caused by cosmetics.
Then, the questions I tend to ask are ones that require students to see which hypothesis will be supported or refuted by specific samples of evidence. For example, if an antibacterial medication were used and Jordan’s face cleared up, which hypothesis would this support?
Thus, since I use a consistent format in questions and passages, students get regular practice on how to analyze and predict the effect of different hypotheses. And, because I’m basing passages on the content I’m assessing, I’m providing a variety of examples and hypotheses to students as well. Also, the consistency in format saves me time in creating standards based questions because I’ve already decided on the types of questions I’m going to ask, which is half the battle when coming up with a test. And isn’t this what we – as teachers – want at the end of the day? To save time?
Check out my handouts, where I also provide an example of how I assess standards or competencies related to Planning and Conducting.
Thanks for reading, and let’s talk science education again soon.