Developing and Measuring Transferable Skills at Scale – An Invitation to Participate

Universities are very good at transmitting content.  In my Introduction to Psychology course, for example, I know I have a responsibility to inform students about the methods of psychology, the most interesting experiments and the theories they support or refute, the critical figures in the field, and so on.  We all feel pretty comfortable on that level.  Clearly transmitting content is important, but in this fast-paced world where future professions are hard to predict, and where students ultimately move from one sort of job to another, we are repeatedly told that while content is relevant, perhaps the more important thing we should be doing is developing core transferable skills in our students – skills like Critical Thought, Creative Thought, Clear Effective Communication, and Metacognitive Awareness.  This raises two questions: HOW do we develop these skills, and HOW DO WE KNOW if we’re doing it well?

The HOW DO WE KNOW question implies a need for measurement.  When we teach content, we use tools like exams to measure how well the transmission worked.  That assures us our efforts were worthwhile, and it gives reflective educators a measure they can use to assess the impact of changes they make to their courses.  Do certain ways of teaching produce better test scores, for example?  If we could similarly measure the transferable skills we value, we would be in a stronger position to push for their development, and to develop them in a data-informed manner.

The HOW question represents a different challenge.  The thing is, humans learn information differently than they learn skills.  Some bit of information can be learned quickly from an engaging lecture, text, or other educational resource.  Skills, however, “develop,” and the process required to develop them involves repeated practice with the skill, preferably in a feedback-rich and structured learning environment.  As I sometimes like to say, you can learn a lot ABOUT karate in a one-hour lecture, but if you want to learn TO DO karate, it’s a much slower and more work-intensive process.  It is not immediately obvious to most educators how one provides students with this sort of practice as it relates to skills like critical thought.

Next year (Fall 2018 – Winter 2019) our lab will be overseeing a very wide-scale attempt to test a concrete answer to both the HOW and the HOW DO WE KNOW questions.  Specifically, we will be assessing the potential of technology-enabled peer-assessment to meet these challenges.  Already we have over 35 institutions from 5 countries participating, and we’d like to go even bigger.  So consider this an invitation!  But what exactly am I inviting you to do?

The process we are testing involves taking some activity you might already use (or perhaps creating a new one if you prefer) and providing it to students in the context of peer-assessment and the formative use of feedback.  Specifically, we provide the peer-assessment technology we have created and evolved in our lab (see vision.peerScholar.com), and that technology supports the following activity flow, sketched in code below.  First, students submit a composition in response to your instructions.  In the next step, they see and are asked to assess a randomly selected, anonymously presented subset of their peers’ work (in my case they see and assess 6 peer compositions, but that is up to the instructor).  In the final phase, they see the constructive comments peers have applied to their initial composition, they are asked to assess those comments, and then they revise their work in light of the comments and write a reflection describing their decisions and the justifications underlying them.  In previous research we have described precisely how this process exercises core transferable skills, and we have provided data showing that it works.
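For readers who think in code, here is a minimal Python sketch of the three phases.  The class and function names are my own illustration of the flow described above; they are not the actual peerScholar interface.

```python
# Illustrative sketch of the submit / assess / revise-and-reflect flow.
# These names are hypothetical -- this is not the peerScholar API.
import random
from dataclasses import dataclass, field

@dataclass
class Submission:
    author: str
    text: str
    feedback: list = field(default_factory=list)  # peer comments received
    revision: str = ""                            # phase-3 revised text
    reflection: str = ""                          # phase-3 reflection

def assign_reviews(submissions, per_student=6):
    """Phase 2: give each student a random, anonymous set of peer work.

    The number of reviews per student (6 here) is up to the instructor.
    """
    assignments = {}
    for sub in submissions:
        peers = [s for s in submissions if s.author != sub.author]
        assignments[sub.author] = random.sample(peers, min(per_student, len(peers)))
    return assignments
```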

Here is the novel bit of the new study.  The Association of American Colleges and Universities has developed a set of so-called VALUE rubrics that connect directly with the core transferable skills educators have voiced as important.  For example, one rubric highlights the factors connected to Critical Thought and provides a mechanism for grading the extent to which each factor is present within a given composition.  Thus the rubrics can be used to translate a qualitative composition into a quantitative measure of transferable skills.

So what if we bring these two things together?  That is, what if, when students assess the work of their peers, we ask them to evaluate that work by applying a VALUE rubric?  Essentially we would be asking our students to look at a given composition and assess how much, say, critical thought is apparent in that piece.
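As a rough sketch of what that could look like computationally: suppose each peer fills out the rubric as a set of level choices from 1 (benchmark) to 4 (capstone), which is the general structure of the VALUE rubrics.  The criterion names below are paraphrased for illustration only; the official wording lives in the AAC&U materials.

```python
# Hypothetical sketch: turning one peer's rubric judgments into a number.
# Criterion names are paraphrased, not the official AAC&U wording.
CRITICAL_THINKING_CRITERIA = [
    "explanation of issues",
    "use of evidence",
    "influence of context and assumptions",
    "student's position",
    "conclusions and related outcomes",
]

def rubric_score(levels):
    """levels: dict mapping each criterion to a level from 1 (benchmark)
    to 4 (capstone), as chosen by the assessing peer."""
    assert set(levels) == set(CRITICAL_THINKING_CRITERIA)
    assert all(1 <= v <= 4 for v in levels.values())
    return sum(levels.values()) / len(levels)  # mean level across criteria

# Example: one peer judges every criterion to be at level 3.
one_assessment = {c: 3 for c in CRITICAL_THINKING_CRITERIA}
print(rubric_score(one_assessment))  # -> 3.0
```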

Let me highlight the potential synergies supported by this notion.  First, asking students to apply a rubric associated with some skill gives them a clear sense of what that skill is, and what it looks like when it’s present or absent.  Second, they are getting this experience in the context of an educational process that has been shown to develop a range of transferable skills.  Third, it provides quantitative measures associated with the skills we are interested in.  In fact, if students are asked to assess 6 peers, then each student’s work would ultimately receive 6 student ratings – 6 quantitative measures of the amount of, say, critical thought evident in that work.

While any single student rating might provide a relatively unreliable measure, when you average a number of such measures the resulting average has a much higher level of reliability.  We showed this experimentally in work we did some time ago, and we have replicated that finding in a study we ran last year (we are now preparing the report).  Thus there is good reason to believe that the student-derived measures are useful, valid, and informative.
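The statistical intuition here is the standard one: if each rating is the composition’s true skill level plus independent rater noise, the noise in an average of n ratings shrinks roughly with the square root of n.  Here is a toy simulation of that point under those assumptions; the parameter values are invented for illustration and are not numbers from our studies.

```python
# Toy simulation: an average of several noisy peer ratings is more
# reliable than any single rating. All parameter values are invented.
import random
import statistics

def typical_errors(true_score=3.0, rater_sd=0.8, n_raters=6, trials=10_000):
    single, averaged = [], []
    for _ in range(trials):
        ratings = [random.gauss(true_score, rater_sd) for _ in range(n_raters)]
        single.append(abs(ratings[0] - true_score))
        averaged.append(abs(statistics.mean(ratings) - true_score))
    return statistics.mean(single), statistics.mean(averaged)

one, six = typical_errors()
print(f"typical error of one rating: {one:.2f}")
print(f"typical error of a 6-rating average: {six:.2f}")  # ~ one / sqrt(6)
```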

Given all of the above, the research we are planning for next year is essentially imagined as a demonstration of concept.  That is, we hope to show that this process can work in almost any field, at any level, in a way valued by both students and faculty.  Thus we want the study to be as large as possible, so not only will we provide the technology and support to faculty who participate, we will also provide it to their entire institution during the study, in hopes that more colleagues come on board throughout the year.

Would you like to join us?  If so, go to the Participate in Research portion of this website to learn more and sign up by filling out the Google form.
