Tuesday, September 30, 2014

Why I always ask my students to provide evidence

Without going into much philosophical detail regarding my take on the purpose of science education and the pedagogy that goes with it, one of the most important things I want my students to take with them in their life beyond my classroom is this:

Claims made by you or by others should be treated as nothing more than unsupported assertions until whoever is making the claim provides evidence to back it up.  Until then....stay skeptical.
The point is simple and has practical uses in countless scenarios.  Just a few I can think of (note: none of these claims is necessarily true; they are just claims I have heard or claims that I know others have heard):

  • Doctor comes in and tells you that you have cancer
  • You hear from a friend that your girlfriend is cheating on you
  • The quality of education a school provides is based solely on the funding it receives
  • Low-income students will naturally have low academic scores
  • Climate change is simply not true
  • African-Americans can jump higher due to subtle physiological differences
  • Females are better at multitasking than males
  • Students learn better when the teacher accommodates for different learning styles 
  • Water is made up of 2 atoms of hydrogen and 1 atom of oxygen
The list could go on and on.  Regardless of the claim, each one requires evidence to back it up.  For example, when the doctor tells you that you have cancer, it may very well be true that you ACTUALLY have cancer, but we all know that no one is going to simply accept a life-threatening diagnosis without SOME sort of evidence to back it up.  It's not that I don't trust the doctor, and it's not that I don't trust the people who make the claim.  It's the fact that ALL claims about reality require evidence; otherwise you're simply accepting things on faith.  

So what was the straw that broke the camel's back and prompted me to write this?

The other day, in my high school chemistry class, we were trying to determine the thickness of aluminum foil.  In order to do this, we needed to use our newly-derived equation from a mass vs. volume graph.

Density = mass / volume

Along with this, we needed to use an equation for volume:

Volume = Area * thickness

To solve for thickness, you need to know the area of the aluminum foil square and its volume.  Area was no big deal, simply measure length and width and multiply them--cool.  However, the problem arose when my students went to solve for volume using their original density equation.  Keep in mind that I gave them the density of aluminum (2.7 g/cm^3) and they found mass on their own by putting it on a scale.  When solving for volume, EVERY single lab group did the following thing:


Hopefully you can see the problem with this.  So how does this tie in with what I've been saying about providing evidence for claims?  Let me continue...
After giving a brief "algebra lesson," I tried to get the students to understand that you simply can't do this when trying to get the denominator by itself.  So why did a bunch of 11th and 12th graders make a basic 8th-grade algebra mistake?  I think it was due to years of mindlessly "knowing" that they can multiply both sides by the denominator to get rid of it.....so why not carry the same logic over to the numerator?  Then this stupid thing came up......



I can't tell you how much I hate something like this.  Any math or science teacher has seen it, and it totally promotes a "no thought required" approach.  The claims it makes are simple:
density = mass / volume
mass = density * volume
volume = mass / density

In this case, all claims provided by the triangle are true.  But the reason they are true IS NOT because I, as a teacher, showed them this little triangle "trick".  They are true for very basic mathematical reasons!  In fact, if I really wanted to, I could plot density vs. volume, show them how to calculate the area under the curve, and literally show them where something like mass = density * volume comes from.  In any case, you can use proven rules of algebra to solve for ANY of the variables.  
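That area-under-the-curve idea is easy to check numerically.  Here's a quick Python sketch (the volume is a made-up number; only aluminum's density is real): summing a constant density over thin volume slices recovers mass = density * volume.

```python
# Approximate the area under a constant density-vs-volume curve by
# adding up thin rectangular slices.  (The total volume is made up.)
density = 2.7            # g/cm^3, constant for aluminum
volume = 10.0            # cm^3, total volume (hypothetical)

n = 100_000              # number of slices
dv = volume / n          # width of each slice
mass = sum(density * dv for _ in range(n))   # area under the curve

# The "area" matches density * volume, i.e. the mass.
assert abs(mass - density * volume) < 1e-6
```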
Interestingly enough, when I showed my students this triangle, their faces lit up and it was like I had given them a brand new present.  Kids love it when you show them procedural or algorithmic ways of doing something--because it requires very little (if any) thought.  However, if I require you to derive the density formula from a mass vs. volume graph (which we did) and then algebraically rearrange that equation to solve for any of the variables....I've asked too much of you.  
It's not a matter of capability either.  EVERY single one of my students is fully capable of doing everything mentioned above.  I just don't want them to accept that mass = density * volume or volume = mass / density just because they heard it from a teacher.  I want them to know it because they can prove it using their own reasoning skills along with the math skills we have taught them.
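For the foil lab itself, the intended chain of reasoning can be sketched in a few lines of Python.  The measurements below are made up for illustration (only aluminum's density of 2.7 g/cm^3 comes from the lab):

```python
# Given quantities (mass, length, and width are hypothetical values).
density = 2.7                # g/cm^3, given for aluminum
mass = 0.54                  # g, read off the balance
length, width = 10.0, 10.0   # cm, sides of the foil square

# Rearrange density = mass / volume to solve for volume...
volume = mass / density      # cm^3
# ...then rearrange volume = area * thickness to solve for thickness.
area = length * width        # cm^2
thickness = volume / area    # cm

print(round(thickness, 4))   # 0.002 cm -- about 20 micrometers
```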

Finally, the last thing I got into with my class was a little detour from chemistry.  I asked them, "what is pi?"  Immediately, every single one of them responded with, "3.14159 blah blah blah".  Some of them even knew it to something like 7 decimal places....impressive!  But then I asked them, "how do you know this number or, better yet, where does this number even come from?"  Not a single student had the slightest clue.  They had been taught this number since they were young, and not a single teacher had managed to take 10 minutes to let them plot a circle's circumference vs. its diameter and calculate the slope (which is the value of pi).  "That's where this number comes from!" I said with a smile.  They weren't too amused.  By the way, it's not their fault....I didn't know this until I was 22, for the same reasons they don't know it at 17 and 18.  
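That circumference-vs-diameter plot is easy to sketch in Python.  The diameters below are made up, and the "measured" circumferences are generated from the true relation (standing in for what a student would measure with a string and ruler), so the fitted slope comes out to pi:

```python
import math

# Hypothetical circles: diameters are made up; circumferences are
# generated from C = pi * d in place of real measurements.
diameters = [2.0, 4.0, 6.0, 8.0, 10.0]              # cm
circumferences = [math.pi * d for d in diameters]   # cm

# Least-squares slope of a line through the origin:
# slope = sum(d * C) / sum(d^2)
slope = (sum(d * c for d, c in zip(diameters, circumferences))
         / sum(d * d for d in diameters))

print(round(slope, 5))   # 3.14159 -- the slope IS pi
```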

Do I really care THAT much that kids know where the value of pi comes from or that mass = density * volume?  No....but don't miss the point.  If we consistently fail to leave room for students to naturally use their reasoning to investigate WHY things are the way they are or HOW things happen, and instead simply tell them the facts, then we as teachers are no different from a Google search.  One of the cornerstones of a successful, scientifically-literate democracy is the ability to reason.  If we willingly inhibit this ability, we are not only stunting the growth of our students' thinking skills, but we are robbing them of a much deeper understanding of the amazing universe around them.  

Monday, September 8, 2014

Reflections on Ball Bounce Challenge

About a week ago, I posted a quick bit on the benefits of using Twitter to expand my professional learning community.  This particular post revolved around the idea of actually doing some physics the first couple of days instead of the routine syllabus check and other typical first-day "to-do" things.  While browsing some potential activities, I came across Frank Noschese's blog, which included a fun ball bounce challenge activity with a high-speed camera.  After consulting with Frank via Twitter about the logistics of the activity, I decided to give it a try.

The first day went a little something like this:
In front of me are several balls.  As you can see, some are tennis balls and some are golf balls.  It shouldn't be much of a surprise that when I bring each ball up and drop it, the ball bounces back to a certain height.  Each ball bounces higher the higher I start it and, as you can see, the golf ball appears to have more "bounciness" to it.  We aren't going to concern ourselves with WHY this happens right now--that's for a later time.  However, we are going to use this phenomenon to collect some data and try to make some bold predictions.  
I have a very specific goal I want each group to try and achieve: start the ball at the appropriate height so that it bounces back up to a target height that I have already designated for you.  Keep in mind that you will not be given any practice rounds and you will not be told the height I want your ball to reach until we are ready for the challenge itself.  You and your group members must decide on how you are going to do this......SO GO!

At this point, groups got their balls (tennis or golf) as well as a meter stick (or 2) and I quickly began to see a variety of things happen.

  • some groups simply dropped their ball against a meter stick and "eyeballed" its bounce height
  • some groups dropped their ball and, using their iPhone, recorded its bounce height so they could view it later
  • some groups only dropped the ball once and decided that one data point was good enough
  • 1 or 2 groups really didn't know (or maybe care) what they were doing and just sort of bounced the ball a few times 
From my perspective, I thought all of this was interesting for a couple reasons:
  1. It gave me insight into how students value data.  Is one data point good enough or should I have more?  How accurate do I need to be, and are there measuring tools or techniques that will allow me to attain more accurate data?
  2. It also gave me insight into who already sort of "gets it."  One quarter of my class qualifies for special education, and several others have demonstrated poor academic skills in the past (based either on my having had them in class before or on hearing it from other teachers).  This is not to say that "teaching this class physics will be impossible," but it does have a profound effect on the extent to which I, as a teacher, can assume my students have certain skills.
We spent the entire first period, which just so happened to be a short day due to an earlier school assembly, collecting our data.  I assigned a number to each ball so that they would have the same ball the following day when attempting the challenge.

The second day:
I told the students I wanted them to have their ball bounce to a height of approximately 62 cm.  This was essentially the height of two lab stools.  As I walked around the classroom trying to get some insight as to how they were going to determine their starting height, I noticed the following methodologies from the students' perspective:
  1. "Based on our data, the ball seems to bounce to a height that is about 85% of its original height.  Therefore, we're just going to calculate the original height so that 85% of it is 62cm."
  2. "We found that our ball loses (some amount) of inches in height for every 1 foot we bring it down"
  3. "We know that the ball has to start somewhere above 62 cm so we are just going to sort of eyeball it and hope that we come close"
  4. "Our data shows that for every 10 cm we raise the ball, it bounces about 7 cm.  So we will try to use that so it bounces to 62 cm"
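Method 1 above is the easiest to put into code.  A minimal Python sketch (the 85% rebound ratio is the figure that group quoted; treat it as illustrative):

```python
# If the ball rebounds to ~85% of its drop height, find the drop
# height that produces a 62 cm bounce.
rebound_ratio = 0.85          # bounce height / drop height, from data
target_height = 62.0          # cm, the challenge height

drop_height = target_height / rebound_ratio   # solve target = ratio * drop

print(round(drop_height, 1))  # 72.9 cm -- start the ball about 73 cm up
```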
Again, I didn't really care how the students went about collecting their data or how they were going to use it to make the ball bounce to the target height.  The physics behind this activity is actually pretty cool and fun, but it has its own place in the course down the road.  What was cool to me was that, when given the freedom to develop their own experimental procedure, students sometimes come up with creative ways to achieve the objective--sometimes ways that I would have never thought of before!  This can go the other way too.  If given the freedom to choose, some students will essentially do nothing, and any nudge I try to give them in the right direction will just go in one ear and out the other.  But that's a different story.

So what did the results of the challenge look like?
  • As you will see in the videos, I placed a meter stick between the two stools.  I tilted it upright so that the broad side was facing the camera.  I told the students that if they got any part of their ball level with the meter stick, they succeeded in the challenge.  Because this was just for fun, I figured I'd allow some room for error.  Winners got to reach into a bag of mini Twix and take a handful.  I ended up going through 3 bags!!  All very large hands from teenage boys....
Winner


Winner


Winner




As you can see from the videos, some attempts were pretty good and some went horribly wrong.  However, the groups that either won or got really close all had one thing in common: they had a plan that allowed them to collect enough data and some way to use that data to make a prediction.  It was their first attempt at using models to make predictions!

Overall, the activity was fun.  The students had a good time watching the YouTube videos as a class and seeing who actually won.  It was easy for me to record the videos and quickly upload them to my YouTube channel that same class period.  Next year I will add a few new things to the activity so that each group provides some sort of data and makes a whiteboard showing their methods as well as their data.  This activity was really cool, and I'm sure the students appreciated not having to spend yet another class period listening to their teacher read the syllabus :)