Competency and credit

I worked as an IT support technician for several years, both in college and after completing my bachelor’s degree. I liked the work, for the most part: helping users figure out why their Windows [version redacted] machines weren’t behaving as expected and administering a few servers. But it was clear to me almost from day one that the vast majority of what I learned in college wasn’t going to be particularly helpful in this business (not counting the ability to speak, write, and think clearly, which is invaluable anywhere). This may have something to do with the fact that I was an interdisciplinary humanities major rather than a computer science major, but it didn’t change the fact that I needed further training to perform my job well. Some of that training came informally, by learning on the job. Some came more formally, through seminars and workshops, some supplied by my employers and some I sought out myself.

This is the traditional path in the IT support world. Some higher-level techs come out of college with degrees in computer science and hit the ground running as programmers, network administrators, or DBAs. But I would guess that most IT folks work through tutorials and study for an exam like CompTIA’s A+. Passing such an exam demonstrates competency in a particular set of skills. Other competency exams await those looking for further certifications or career advancement, even those who do graduate from college with computer science degrees.

This model seems very effective to me. If I study for and pass an exam built on a known competency (like troubleshooting PC hardware and software), then a potential employer has a verifiable way of knowing that I can perform certain types of work. When someone graduates from college, though, what does a transcript really offer a potential employer? In my interdisciplinary studies degree, I learned to write much better than I had in high school, but there was no writing exam that certified my competency in writing academic research papers. Though I tried to absorb Weberian sociological analysis, when I finished my degree my transcript did not say that I was “well versed in the Protestant work ethic.” My transcript was so useless, in fact, that none of my potential employers ever even asked for it. Though the bachelor’s degree is itself a type of credential, it is a murky one: can we assume that a diploma from a reputable four-year institution certifies solid thinking, writing, and problem-solving skills?

This leads me to wonder whether college should be more like the IT exams. I’m not suggesting that high-stakes examinations should become the core of the higher ed system, though this is often the case anyway (by virtue of the dreaded Cumulative Final). The model I envision would evaluate specific skills, tied both to the discipline in question and to more general thinking and writing abilities, demonstrated through project- or test-based assessments. If a student does well in a calculus class, we have a general sense of the quantitative skills he or she might possess. But if that student had instead demonstrated the ability to calculate a derivative and apply it to a real-world problem (say, using the derivative of a position function to find velocity), wouldn’t that be of more use both to the student and to those who might look at the transcript?

To take this even further, suppose that graduation from any given institution of higher learning required a student to complete a series of competencies. Rather than logging a specific amount of in-class time, measured in Carnegie units, students would demonstrate competencies at their own pace. Southern New Hampshire University is already doing something like this with its “College for America” program, so it will be interesting to see the results of that experiment. The benefits of such a system seem obvious, though, especially for job seekers and their potential employers, because these competencies would be listed directly on the student transcript and would become public knowledge about anyone graduating from that school (or, taken to scale, any school in the country). Colleges could work directly with employers (and some already do) to understand the types of skills they want in employees, thus closing the loop between educational attainment and employability.

I don’t mean to suggest that all higher education needs to be career-focused. I think there is immense personal value in taking courses simply because one is interested in the material. But the fact remains that the vast majority of students who complete a degree are headed to the workforce in one way or another, so doesn’t it make sense to ensure they’re properly prepared?
