Testing labor quality without degrees and credit hours

Quite possibly this could be a more significant development — for the United States at least — than on-line education:

Testing firms are offering new ways to measure what students learn in college. Their next generation of assessments is billed as an add-on – rather than a replacement – to the college degree. But the tests also give graduates something besides a transcript to send to a potential employer.

The latest arrival of the bunch will be a revised version of the Collegiate Learning Assessment, dubbed the CLA+, which the Council for Aid to Education is rolling out this fall. The new test, which is the CLA’s first upgrade in a decade, includes a work readiness component and more student-level data.

Earlier this year the Educational Testing Service (ETS) introduced two new electronic certificates for student learning. And ACT Inc. continues to develop its WorkKeys skills assessment system.

There is more here.  In the longer run, what happens when a student shows up with a good score but no degree?


If this catches on, we'll see a lot more teaching to the test - education aimed solely at doing well on standardized tests. Is that really what we should be encouraging at the college level?

Look at the supposedly hardest (Level 7!!) sample question on the ACT Applied Math test:

"The farm where you just started working has a vertical cylindrical oil tank that is 2.5 feet across on the inside. The depth of the oil in the tank is 2 feet. If 1 cubic foot of space holds 7.48 gallons, about how many gallons of oil are left in the tank?"

I don't think this has any chance of eroding the utility of a traditional Engineering / Science degree anytime soon.


For a math/hard science major, of course this isn't a difficult question. However, how many college students could answer it today? I bet fewer than 50% could.

To me, that is exactly the point. Sure, those sorts of questions will differentiate physicists from English majors, but we already have information to do that from courses enrolled, etc. A hiring firm would have that information too: I cannot imagine an engineering firm hiring a Roman Literature major who happened to be pretty quick with word problems. It is not clear how questions of this difficulty would separate those with the strongest math skills from one another; it seems like repackaged SAT/GRE math.

I can see some utility like this: say I am hiring a Sales Manager or Accounts Exec or for one of the many jobs that list a generic degree as a prerequisite (just because they can). Now I can differentiate between an acceptable, math-literate History grad and a dumber, say, Philosophy grad.

The basic flaw they are compensating for is that universities allow students with such gaps in basic math / analytic / writing skills to graduate at all.

This is ridiculous. Non-STEM majors will typically have last taken a math class in their freshman year, and might not have practiced geometry since early in high school.

Besides, any job that requires answering a question like this isn't something one trains for at a university.

Any adult should be able to figure that problem out just by handling units. Maybe the one trick is knowing that A=πr². If it's not a multiple choice problem you could give that hint.

So the question is, "Have you memorized the formula for the area of a circle/cylinder?" Or, if they are not allowed a calculator, whether the person can multiply decimals in their head?
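For what it's worth, the whole problem is one cylinder-volume calculation. A quick sketch in Python, with the numbers taken straight from the question:

```python
import math

# Figures from the question itself:
# tank is 2.5 ft across inside (so radius 1.25 ft), oil depth is 2 ft,
# and 1 cubic foot holds 7.48 gallons.
radius_ft = 2.5 / 2
depth_ft = 2.0
gallons_per_cubic_ft = 7.48

volume_cubic_ft = math.pi * radius_ft ** 2 * depth_ft  # V = pi * r^2 * h
gallons = volume_cubic_ft * gallons_per_cubic_ft

print(round(gallons))  # about 73 gallons
```

The only "trick," as noted above, is remembering A = πr² and that the diameter given must be halved.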

That beats teaching them the inane fantasies promulgated by the education rackets.


"If this catches on, we’ll see a lot more teaching to the test – education aimed solely at doing well on standardized tests. Is that really what we should be encouraging at the college level?"


But what we have now is not always useful information. Some colleges, including highly ranked ones, award A grades to 40 percent of students in each class. Others, including some state colleges, give 10 percent As. How can a potential employer distinguish scholarly excellence from a college transcript?

Yes, it is. This always struck me as a lame excuse for lousy instruction. You should always be teaching to the test, because the test is the evaluation of the student's knowledge. If the test doesn't adequately do that, the problem is with the test.

I'd say that students showing up with good scores but no degrees would render most degrees meaningless, but we know that won't be the case, as education level is pretty much the only type of filtering still permitted in the job market. It'll become more apparent that most schools (or at least departments within those schools) are diploma mills, but the degree requirement won't go away. Not until discrimination by education level is outlawed too.

"what happens when a student shows up with a good score but no degree?" Don't ask us, ask the Supreme Court.

Related to this: for people entering college, the data is clear on the limited value of high test scores for students who did not take high school seriously. Students who complete high school with a good class rank are much better students, with a much greater chance of graduating from college, than students with high test scores but lower class ranks (a better high school class rank is achieved by a mixture of grade point average and selection of more challenging honors classes). I would think the same holds true (on average; there are some Bill Gateses out there) for college students: people who don't bother finishing college have some other attributes that are undesirable in employees, even if they are brilliant. The Spence signalling model is hard to ignore.

Gates took Math 55. That's all you need to know about his ability and diligence in college.

Agreed. There are fields where some minimum test score cutoff seems necessary for success, and there are times when test scores can be an antidote to poor performance in high school that does not reflect on personal character (e.g., a semester of low grades due to a bout of mono, or due to a romantic disaster). But the general observation that employers want reasonably intelligent grinders who won't rock the boat, rather than genius divas, is a solid one.

Griggs v. Duke Power decided this issue. My bet is the court will treat these exams as IQ tests. Employers will inevitably refuse to consider them for fear of getting sued.

Has anyone gotten sued for using a test, even an IQ test, for a reasonably high or intellectually demanding position (programmer, engineer, quant, doctor, economist, statistician etc.)? All the cases I've read are about firefighters or janitors or some such.

Sometimes I think the worry of getting sued is overblown.

Paul Oyer at Stanford did some research on this, related to the Civil Rights Act of 1991. Buried somewhere in that paper is the idea that firms overestimated the risk of getting sued, although I don't see that result in the abstract.


When I was in the marketing research business, Procter & Gamble used a rigorous written test that they had paid a lot of money to get validated as meeting "business necessity". P&G got their money's worth: the quality of P&G staffers was high.

On the other hand, D&B refused to have written hiring tests for fear of being harassed by the government. Another firm I worked for used a written test (one of the founders' Advanced Quantitative Methods in Marketing final exams) for about 10 years, with excellent effect, until the EEOC complained.

Did Tyler intentionally put this post in close proximity to the "Superstar Teachers" post? Testing is the missing link that enables superstar teachers, no?

The Korean teacher is actually pretty much a suneung test prepper à la Stanley Kaplan, meaning this post is actually perfectly positioned, as an example of two for-profit professions feeding off each other in an endless spiral.

God forbid someone make a profit from teaching

"what happens when a student shows up with a good score but no degree?"

Employer: "Your test results show that you're very intelligent, and your lack of degree shows you avoided going 50K+ in debt for a useless humanities degree. Hmmmm, while you're unable to enunciate the myriad cultural constructions and gendered prejudices of contemporary white patriarchal society, you seem very wise at managing money and debt by not spending thousands of dollars on what's ostensibly become a mechanism for cultural indoctrination. It also shows here that instead of getting drunk for 4 years while spouting bong-flavoured marxism you went and earned actual working experience. We feel you'd be a good fit for our team. Welcome aboard!"

I'm not sure I completely agree. There is research that compares high school graduates to those with the GED, and it tends to show that GED holders don't perform as well later in life. I think a reasonable explanation is that high school and college degrees indicate you can jump through hoops and do as you're told for years on end, and I think this is important for a lot of employers.

Surely there must be a cheaper way to prove one's capacity for conformity

Join the armed forces for a couple of years? They'll pay you to prove your capacity for conformity...

Clearly you have never been in the service.

The Air Force, Navy, Coast Guard and, I believe, the Marines won't take you if you haven't finished high school. The Army may, but only if you get a good score on their test. It's harder to get into the military than it is to get into most colleges.

To your point Alan, I have heard that the Marines, in fact, have the highest requirements when it comes to the ASVAB (armed services vocational aptitude battery).

The Marines have fewer slots to fill from a larger pool of applicants. Their market position gives them the luxury of setting a higher price. Plus, cooler uniforms.

No, at present, the Marines have the lowest entrance test percentile requirement (if you have a high school diploma, you must score in the 32nd percentile or higher on the SAT-like AFQT, 50th percentile with just a GED), and the Coast Guard the highest. From Wikipedia:

Standards for enlistment

AFQT required minimum scores for people with a high school diploma as of December 2012 (unless otherwise noted) are as follows:

Minimum AFQT scores

Branch               Tier I (≥ HS diploma)   Tier II (= GED)
Army                 35                      50
Navy                 35                      50
Air Force            40                      65
Marines              32                      50
Coast Guard          45                      50 (with 15 college credits)
Army National Guard  35                      50
Air National Guard   35                      50


This has been essentially my pitch - went to a cheaper state school where I could use AP courses to graduate in a couple of years, scored well on the GMAT, and passed professional tests (CFA). With hindsight I wouldn't recommend it. Networking is limited, and nontraditional doesn't do well with HR screens. Perhaps if it became a little more normal, though...

Definitely not what HR in normal firms is looking for; they just want to know which box to put you in. Don't know about the US, but over here in the Antipodes, in some jobs (e.g., actuary) you just have to find a way past HR and get your CV to someone who knows how to properly interpret your background. I also think there is a perception that such a tactic signals some kind of abnormality/defect, or a lack of confidence in oneself.

"Perhaps if it became a little more normal though." Not likely. I think it is a system with multiple equilibria, and the one we are stuck in is not good.

Teaching to the test for critical thinking could improve social sciences programs. Proving the ability to find information could root out those who found out where to buy research papers.

I never get this crazy fear of "teaching to the test", at least not at the baseline we are at.

Knowing how to solve an algebraic equation, or to compute compound interest, is a skill I'd rather have even if it was imparted via the goal of "teaching to a test".
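To the commenter's example: compound interest really is just a few lines of arithmetic. A sketch, with figures invented purely for illustration:

```python
# Invented figures, purely for illustration: $1,000 at 5% annual
# interest, compounded yearly for 10 years.
principal = 1000.0
rate = 0.05
years = 10

amount = principal * (1 + rate) ** years  # A = P * (1 + r)^n
print(round(amount, 2))  # grows to roughly $1,628.89
```

That's the kind of test-taught skill that pays off long after the test is over.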

Yeah. The only challenging (or interesting) courses I had in high school were AP classes, which followed the set AP curriculum with the goal of getting us to do well on tests. The other classes just had much less content, or spent way too much of the semester on one topic (like parabolas), then squeezed everything else into a couple of weeks.

I think an excellent way to avoid conflict of interest is a division of responsibilities: (1) Deciding what needs to be taught, (2) actually teaching it, and then (3) testing if what was desired to be taught was indeed learnt. These are tasks best done by separate people.

I think one of the weak points of US higher ed is conflating these three tasks.

Yes! I agree. I think this too, though I see the main conflict being between 2 and 3. Is the university providing an education or a degree? Dumbing down the curriculum and grade inflation tells me it is the degree.

I think the mentality makes sense for a teacher who already believes he can pick good subject matter. Such a teacher would be upset to see the results not reflected in the tests, and would naturally get upset about the mechanically bureaucratic thinking that will inevitably go into the test.

However, it is hard to see why a centralized education bureaucracy with a mandatory curriculum is any better.

Employers in STEM fields have informal ways of achieving this same goal. Anyone who has had the pleasure of hiring programmers, for example, knows the drill. If you are hiring out of college, you get good at testing the skill of the applicants. So many pop out with useless diplomas that sound like the real thing that you have no choice but to create your own informal testing methods. The irony is that college dropouts are wildly overrepresented at the higher end of computing. A kid who got bored by junior year and dropped out to start working is probably going to be a creative programmer.

What are the legal issues in the US? Why are these not construed as "aptitude tests" or whatever the bogeyman is supposed to be?

Mostly that they are done face-to-face with nothing written down (bar a whiteboard), and not mandated from above, thus making them legally difficult to distinguish from any other subjective job interview. Those places that do have explicit coding challenges will typically use problems very close to those actually faced by their programmers, which makes them perfectly legitimate tests under the applicable court rulings.

Programming tests often aren't directly related to the task at hand. I've been asked to do FizzBuzz a number of times, which is just a test to see whether the person in front of you is a complete charlatan.
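For anyone who hasn't seen it, FizzBuzz is about as simple as screening exercises get; one common Python rendition:

```python
# The classic FizzBuzz screen: list 1..n, replacing multiples of 3
# with "Fizz", multiples of 5 with "Buzz", and multiples of both
# with "FizzBuzz". Its only job is to catch complete charlatans.
def fizzbuzz(n):
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(" ".join(fizzbuzz(15)))
```

If a candidate can't produce something like this in a few minutes, no résumé or diploma will save the interview.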

On programming forums, there are often a lot of complaints from the special snowflakes that they shouldn't have to submit to tests, but despite their anger they've never managed to actually get a lawsuit going because someone could not implement quicksort for a job that doesn't require writing sorting algorithms.

They tend to be quite free-form, rather than a standardized test everyone's scores are compared against.

Good tests resemble a real task, but can be done in a very short amount of time with little preparation. On one end of the spectrum, you get questions that are barely more complex than a puzzle. On the other, the candidate might receive part of the test in advance. For instance, a candidate might be handed a very small application, that they should become familiar with pre-interview. Then, in the interview itself, the candidate would be asked to make changes to the application, or comment on which parts of it have the most maintainability problems.

There's just so much variation in productivity between programmers of the same experience that nothing short of a practical test will give you an idea of the real skills of the candidate. Some can be ten times as productive as your average, and others be ZMP workers, but you'll never be able to remove the ZMPers by looking at a resume.

This assumes intelligence is the key to doing a good job. If somebody is highly intelligent, but can't keep his ass in a seat to complete a long-term task, that can be a problem. And no, the solution isn't always a more flexible work environment.

That there are several traits/skills relevant to a person's ability to do a job well is not a good argument for excluding intelligence from that set. Unless you are arguing intelligence is anti-correlated with those traits, in which case it still makes sense to measure intelligence, if only for the opposite reason.

No, I'm saying an IQ test won't tell you everything you want to know. Obviously, intelligence is important, but I've known several highly intelligent people that I wouldn't hire.

It won't tell you everything you want to know, but it is the single best indicator of on-the-job success if you have to choose one factor (according to studies). Interview results come in near the bottom of the list. Good interview results probably just get you a sociopath.

I was told just recently by a recruiter about a candidate who showed up in a tuxedo and top hat. Interviews are good for filtering out those people.

Duh, you don't wear a top hat with a tuxedo!

I believe that storing fine grained information about skills gained by a student and using it could be a big pay-off from computer aided education. Verifying and taking advantage of skills already gained during retraining could make this much quicker and easier. This would increase employment flexibility and reduce the risk of starting vocational training for job X and then finding no jobs were available or you were not good enough to do X professionally.

I still think it is only a matter of time before actual colleges and universities themselves offer comprehensive testing out for degrees, frosh to senior. Combine that with a comprehensive personality test and you will have something an employer might well use for assessment. All without having to waste 4-6 years and $100K plus.

Things like the CLA+ are going to face the same problem IQ tests face today: the fear of being sued for disparate impact. Of course, the financial pressure of ridiculously escalating college costs may force political adjustment of things like Griggs, so it is an open question.

What incentive does a university have to do this comprehensive testing? Universities want high scores and are willing to fudge to get them. If students do poorly, the university loses money: students leave, and new students stop coming because they naturally blame the professors who can't teach. The university loses prestige because employers see that the students at this supposedly good school aren't that good.

The university doesn't want any of this, so they will make the tests easier. Proof that this is the natural tendency is grade inflation today.

Then the employers will start to doubt the value of these schools and so will parents. Presto! Back where we started.

A top quality college/university gets to screen the students prior to admission, Keith, via HS grades and comprehensive verbal/writing/mathematics tests. I agree, that after that screen, a university/college may have an incentive to dumb things down so that they have higher graduation rates, but my idea is that comprehensive testing out for the degrees themselves is for people who never attended physically in the first place. There, I can't imagine a university would have an incentive to dumb it down. The incentive seems to run the other way.

Recruiting for a medical group, I have found test scores to be nearly meaningless. Our worst hires had great test scores. I made it a point to track this down and found that teaching to the test is the norm for many of the weaker training programs. The size and quality of the training program matter much more than test scores in our experience.


I think this depends on how you design a test.

You can design tests that are not bubble tests, but ones which require people to solve problems and display creativity.

That makes some sense. The better the law school, the less it teaches to the bar exam. The good schools want to teach you to figure out the law for yourself, including where the law is unclear, how and why the law has changed over time, and how to think analytically and unemotionally about legal issues. If you are someone who should be a lawyer, that plus a summer of intense studying for the bar exam will be enough.

All this talk about college as a "waste of time" is more indicative of a certain disposition (i.e. nerdy tightwads) than a true indication of the value of the college experience. File under mood affiliation?

Well, it was a waste of time if you were trying to learn valuable job skills. It's not a waste of time if you want to have fun. When are you ever going to have a chance to be 18-22 in a city of 18-22-year-olds again? I wasted way too much time learning things in college that I never use. I should have drunk way more.

I think this is great. And, in fact, I predicted this development to a faculty member at a graduate school who was revising the core business school curricula to ensure that students had mastered their areas before graduating. The premise was that you would "sell" the students to employers by showing that they had mastered accounting or marketing to some level of proficiency.

No more demands for easy A's but rather demand for substance rather than educator popularity contests.

It will also be good for online education, because you can compare a Yale or Harvard graduate to other graduates. It might also be good for business to test their own staffs and encourage them to brush up and take some classes.

Most empirical research on personnel selection shows that cognitive assessments and work sample assessments are the two best predictors of employee success.

The non-conformity / signaling issue seems pretty important, though. I wonder if, given a candidate who did not go to college, there is a difference in the predictive power of cognitive assessments vs. work sample assessments.

Reading through the objections, I find a common theme: the assertion that taking the 4 years to get the degree is some sort of valid signal of conformity, working well with others, work ethic, etc.

Native intelligence isn't going to allow a person to pass, let's say, a comprehensive test for differential equations, organic chemistry, or even poetry/literature analysis. To do so will require a person to have actually demonstrated the work ethic to learn these subjects well enough to pass the exams, and doing it on their own. It is a basic mistake to think these are actually the equivalent of an IQ test.

Now, colleges/universities do offer actual learning experiences that one can't easily obtain by independent study. Here, I am thinking of practical laboratory experiences. However, there is literally no rationale for having such experiences spread out via 1 day/week for 4 years. You can literally do an entire semester of organic laboratory course in a week's time.

Colleges and universities may be dragged kicking and screaming towards more efficient modes of operation, but it will happen.

Re: "Colleges and universities may be dragged kicking and screaming towards more efficient modes of operation"

+1 Kicking and screaming into the 20th, and not the 21st, century.

You can do the in-lab hours that quickly, and that *might* be OK for the first two years. But the really valuable labs in physics and engineering involve detailed analysis of the experiment, set out in a report that takes about a week to write.

"the assertion that taking the 4 years to get the degree is some sort of valid signal of conformity, working well with others, work ethic, etc."

Any signal that measures those traits is valuable to employers. If you can devise a test that measures them (e.g., the famous "marshmallow test") and that isn't easily faked when used repeatedly on large numbers of applicants, the result would be used a great deal by employers.

The famous example in entry-level law firm hiring is the polished shoe test. Conventional wisdom in big law is that only people who are strong in this dimension of ability/personality have gleamingly polished dress shoes with no scuffing (and, more generally, unwrinkled shirts, crisp ties, etc.). I guarantee you that far more law firms hiring entry-level employees use the polished shoe test than use LSAT scores, which are a good and standardized proxy for IQ that every applicant took within the last two or three years. Another proxy for this trait often used by large law firm hiring partners for entry-level employees is participation in team sports as a college undergraduate.

Is this the "hagwonization" of higher ed?

Compare the difficulty of professional certification to getting a degree. Sure, for lots of people neither is particularly challenging, but the former is definitely harder in many fields. I mean, people with college degrees in business- and finance-related fields often complain that the CFA is difficult. This does not say much for what they learned in their degrees.

I think people are missing the utility that such tests present for employers in jobs that need brains but not necessarily training. Secretaries. Bank tellers. Higher-end service jobs (baristas, waitstaff at mid-level chain restaurants, which is why you find college students working there). The high school diploma is useless. So how do you find functional people who have, say, 10th-12th grade reading skills and basic math literacy? Right now, the only way to manage this is to ask for a college degree or college attendance. This test would do nicely to reduce the need for that demand, except it will weed out huge numbers of blacks and Hispanics at the top end.

And even at the low end--McDonalds, Taco Bell, hotel maids--it would do to separate the functional from the non-functional.

It would be considered severely lacking if a STEM major justified ignorance of what the Civil War was about, or the location of Mexico on a map, or the policies of non-discrimination, or some such.

Being able to calculate the volume of a cylinder is a skill as basic as any of that and I'm surprised anyone would try and rationalize that ignorance.

Perhaps the deeper malady is that breadth requirements have been one-way: humanities majors are not being taught as much of the science/math basics as might make sense in today's world.

Wouldn't hiring on the basis of a test expose employers to discrimination-based lawsuits? I thought that was the primary reason that large employers especially were reluctant to use tests....

Wasn't this what the Ricci case was about? All the firefighters who passed the test for promotion were white, so the city threw the test out in fear of a lawsuit...


The court decided against the city in this case, but I still think there is a lot of hesitation attached to test-based hiring because of this.

Would be really interested to see this used in combination with some (potentially modified version) of the pre-college assessments (i.e., SAT or ACT) to create an individual and college-level "value-added" score for students. If done well, could be an interesting, quantitatively based way of separating unobservables (e.g., talent, motivation) from schooling in evaluating post-college prospects. Granted, it would likely be biased, depending on exactly what you measured, but that doesn't make it uninteresting...

It makes sense. Hell, I've floated trial balloons about similar ideas myself. But, I don't really believe it will catch on.

When particular narrow skills are at issue, certifications created by dominant players in the industry (like Cisco and Microsoft), not third-party certifications, are what matter; or a traditional regulated-profession system is devised if the consumers of the skill set are diffuse.

General educational tests, however you choose to package them, are basically proxies for IQ, and there are lots of long-available proxies for IQ that employers have never cared about before. For example, employers could, but don't, routinely ask applicants for their SAT or GRE scores. I've never seen a law firm ask for an LSAT score or bar exam score (apart from pass/fail) from an applicant, even though studies have shown that these are strongly correlated with success in the firm. Even transcript requests are rare, despite the fact that transcripts are also strongly correlated.

Why is this rational? I think the main answer is that completing a traditional degree in a timely fashion signals not just the IQ that tests like these can measure, but also what Steve Hsu calls "W": work ethic, grit, persistence, personal organization, etc. Employers may recognize that someone can be smart or knowledgeable enough to do a job without a degree, but they don't want to hire someone who is too lazy to endure the grind of studying for midterms, doing labs, writing term papers, and just showing up to class regularly for three to five years, even if they have the brainpower needed. In the same vein, high school diplomas are a much more valuable credential than GEDs (a large share of which are earned by people who are incarcerated or had conduct issues in high school), even though the GED is well validated to measure the same level of academic ability as a high school diploma.

Even in jobs that are fairly demanding in terms of IQ (e.g., law, investment banking, management consulting), employers often think of "W"-type ability and IQ as roughly equal in importance. And most jobs do not demand nearly so much in terms of IQ and demand relatively more in terms of W. Hence, experiences other than college that validate W (e.g., years of active-duty service in the military with an honorable discharge, Peace Corps service, or time spent as a self-employed business owner, even if the business isn't all that big or impressive) are often valued by employers, while IQ-oriented test scores are rarely sought out.

Comments for this post are closed