Tuesday, February 7, 2012

"Knowledge Building"

And here it is: higher education finally getting on the bandwagon and engaging in constructivist learning techniques (re-packaged, apparently, with the new label "knowledge building").  This article, I'm proud to say, comes from my own alma mater, Smith College: "What Do You Know? And How Well Do You Think?"

I will let the article speak for itself, but I'm glad to see that innovative instructional design involving active, constructive ("knowledge-building") learning is taking place at the college level.

Library Video Using Xtranormal

You might have seen these types of videos floating around -- people are making them for a wide variety of reasons.  I've seen library-related and educational videos made with this online software, and my husband commented last night that people on his Corvette forum have posted these types of videos as well (I didn't ask what, exactly, the videos on his forum were about).

Here's one I created about our library's one-on-one consultation service. It was easy and fun to do.  

And this one was created by my co-worker, Jessica Ambrose.

Here's another one that I really liked that was posted by Sheila Webber on her blog.



From a pedagogical standpoint, I admit I really didn't understand the point of these until I made one. I think that, as with any technology, what matters is what you do with it and how thoughtfully you use it.

A few points/rules of thumb that I can think of:

  • Keep them short.  More than two to three minutes and you've lost your audience. All of the above videos are 40-50 seconds long, yet each serves its purpose: to inform about one concept, service, or process. 
  • Make them as content-rich as possible in that limited time. 
  • Think about your audience and write the script accordingly.  Don't be overly corny or try too hard to be cool if you are writing for students. 
  • Make use of humor to capture the audience's attention.  Selecting two plain-looking characters and pairing them with a boring script, for example, defeats the purpose. If you want to make a boring video, then make a regular video and make it professional-looking.  Only use Xtranormal or a similar script-reading video technology if you are going to capitalize on the tongue-in-cheek nature of the graphics and the crudeness of the technology. (Disclaimer: by crudeness I am referring to the "low-tech" nature of it -- I am NOT suggesting that it should be used to create lewd or lascivious videos!)

Monday, February 6, 2012

College Scorecard?

The White House is trying to create a "scorecard" to help families determine the "affordability and value" of prospective colleges. As of now, the scorecard looks very limiting and misleading, in my opinion: http://www.whitehouse.gov/issues/education/scorecard

Fortunately, they are looking for feedback.  I understand that the types of statistics they are using for this scorecard are easy to gather and can seemingly paint a picture of a college's affordability and value; however, suggesting that each individual student's experience will be the same, or even close to the same, as the "average" this data creates is very misleading.  A student who attends a college with a high percentage of at-risk or lower-achieving students could get an excellent education there if that college places a high value on teaching and learning effectiveness.  The statistics at that college might look bleak, but a student with a lot of potential could take advantage of a favorable student-faculty ratio, tutoring services, and leadership opportunities there, for example, to be very successful.


I appreciate the government's efforts and intentions, but I fear that another shallow tool (shallower, in my opinion, than the U.S. News and World Report rankings that come out each year) will end up "punishing" colleges that accept at-risk and low-income students, as those colleges worry about "raising their score" on this scorecard to attract more applicants. There are so many factors that go into college value, and they vary greatly depending on the goals and purpose of the individual student attending.  In general, though, I believe that measures of the effectiveness of the teaching and learning that take place at a college -- particularly in the areas of critical thinking and information literacy -- can be a good indicator of the college's value, no matter what the financial data might say.

Of course, this data is harder to capture. But just because data can be easily obtained does not mean that it should be used. It's like the expression "You get what you pay for."  Simplistic data leads to simplistic behaviors; robust, qualitative data leads to robust, high-quality results -- but it is more complicated to gather and compile. Yet colleges and universities have been compiling this data for years, and in the last five years they have been under a lot of pressure from accrediting agencies to compile even more.  The data is there -- now it needs to be read and used accordingly to place real value on a college education.

Thursday, February 2, 2012

The Purpose--and Mis-Use--of Formative Assessment

This article from Education Week very simply and clearly explains what formative assessment is supposed to be, and how it is often misunderstood and misused.

A poignant point from the article: "[I]t’s not the test per se that is formative or summative. It is the use to which the test’s results are put."

This, to me, is one of the most important things to keep in mind.  I have seen "formative assessment" defined here at my institution in such a way that the emphasis is on assessing student progress rather than on how that data is used to assess the effectiveness of the instruction. There is a big difference, but I think many educators (especially, I'd venture to say, those in higher education who do not necessarily have degrees in education fields) do not fully grasp it. Giving quizzes or assignments throughout a course can be one means by which educators employ formative assessment, but it is neither the only means nor necessarily the most effective.

Regardless of how formative assessment data is obtained, the purpose of formative assessment must be kept in mind:  to evaluate whether teaching and learning activities are effective, and then to change the lesson design and instructional methods accordingly.  Formative assessment is a much more subtle, nuanced, and complicated process than simply monitoring students' quiz results or grades on assignments to assess their progress in class.  What is done with these grades, quiz results, or other more informal assessment techniques (such as simply asking questions in class to gauge students' understanding at a given moment) is what distinguishes formative assessment from summative assessment: the latter evaluates what students have learned; the former evaluates how instruction or learning processes can be improved.