Why are there so few computer science majors?

That is a long and very interesting post by Dan Wang, it is hard to summarize, here is one tiny excerpt but better to read the whole thing:

2. You don’t need a CS degree to be a developer. This is another valid statement that I don’t think explains behaviors on the margin. Yes, I know plenty of developers who didn’t graduate from college or major in CS. Many who didn’t go to school were able to learn on their own, helped along by the varieties of MOOCs and boot camps designed to get them into industry.

It might be true that being a software developer is the field that least requires a bachelor’s degree with its associated major. Still: Shouldn’t we expect some correlation between study and employment here? That is, shouldn’t having a CS major be considered a helpful path into the industry? It seems to me that most tech recruiters look on CS majors with favor.

Although there are many ways to become a developer, I’d find it surprising if majoring in CS is a perfectly useless way to enter the profession, and so people shun it in favor of other majors.

And this, which runs close to my own thoughts:

Perhaps this is a good time to bring up the idea that the tech sector may be smaller than we think. By a generous definition, 20% of the workers in the Bay Area work in tech. Matt Klein at FT Alphaville calculates that the US software sector is big neither in employment nor in value-added terms. Software may be eating the world, but right now it’s either taking small bites, or we’re not able to measure it well.

Finally, a more meditative, grander question from Peter Thiel: “How big is the tech industry? Is it enough to save all Western Civilization? Enough to save the United States? Enough to save the State of California? I think that it’s large enough to bail out the government workers’ unions in the city of San Francisco.”

Here is Dan’s follow-up tweet on other answers to the puzzle.


"I think that it’s large enough to bail out the government workers’ unions in the city of San Francisco."

Not to mention pouty secret funders of lawsuits targeting the media.

CNN has no special distinction under the first amendment. That means that I am media too. Do I have a sufficiently compelling first amendment interest in publishing a sex tape, made surreptitiously and acquired through theft, of a minor public figure with a non-public figure, such that I can publish it globally and ignore a court order to remove it? In our world of fake anonymous sources at Washpo, NYT, and CNN, maybe I do?

But the real question is whether "covfefe" proves that Donald Trump is a homosexual with Vladimir Putin, like Stephen Colbert had a first amendment interest in broadcasting, or whether it only proves that Donald Trump will let Vladimir Putin purchase Washington D.C. in exchange for renaming a Russian city Trumpingrad? Tune in to CNNMSNBCNYTWASHPONPR to find out!

The publisher of the sex tape did not acquire it through theft; it was made by the owner of the camera/premises on which it was recorded, and thus does not infringe the copyright of those recorded. So yes, you have an absolute right as long as there is no copyright violation.

It was made with explicit deception of the owner, then stolen from the owner and sold to the publisher...

I should note that possibly a great deal of this puzzle can be explained by the hangover from the dotcom bust in 2001. Several people pointed this out to me on Twitter and in my comments section.

The issue is that I may have started the measurement as the bubble worked its way out of the college system, i.e. in 2005, four years after the bubble. Still, as I note in an addendum, this may be a bit too neat of a story: "Should it take 15 years after the popping of the bubble before we see that college students are graduating with the same degrees again? I guess so, and I’m interested in whether other industries have experienced a similar lag. Were people entering school, say in 2003, acutely aware of how badly fresh graduates were suffering? Were they very well aware of then-current market conditions, and decided that things were too risky? Why didn’t freshmen/sophomores course-correct earlier when they saw that the bubble had burst?"

Another issue: It's not totally clear how the National Center for Education Statistics aggregates these categories; it may be the case that there are CS majors hiding in other categories.

Rather than discuss these known issues, I'd be pleased if more discussion focused on the power law distribution among top programmers, the question I'm most interested in and which I raise in the piece.

'I should note that possibly a great deal of this puzzle can be explained by the hangover from the dotcom bust in 2001.'

Or a fundamental misunderstanding, as shown by this quote from the article itself: 'not just what the problems are in the industry, but how they deter college students from pursuing a major in CS'

The number of programmers capable of performing certain tasks (those where the term 'developer' might be accurate) is in many areas quite small. Unfortunately, the exact quote doesn't seem to be available online, but the man who created one of the technically leading OSes of the 1990s - https://en.wikipedia.org/wiki/Jean-Louis_Gass%C3%A9e - remarked that his company had 30 of the 200 people in the world capable of writing an OS.

That remark was made around the same time that another person was also developing an OS, essentially all on his own at first, though his 'team' expanded steadily. And oddly enough, it appears that his OS - some people are likely familiar with it from using a smartphone not made by Apple, or possibly the Internet - was essentially his thesis. Yes, it is likely Torvalds was taking advantage of Finnish government-funded higher education to pursue a hobby, so to speak - 'Torvalds attended the University of Helsinki between 1988 and 1996, graduating with a master's degree in computer science from NODES research group. His academic career was interrupted after his first year of study when he joined the Finnish Army Uusimaa brigade, in the summer of 1989, selecting the 11-month officer training program to fulfill the mandatory military service of Finland. In the army he held the rank of Second Lieutenant, with the role of a ballistic calculation officer. Torvalds bought computer science professor Andrew Tanenbaum's book Operating Systems: Design and Implementation, in which Tanenbaum describes MINIX, an educational stripped-down version of Unix. In 1990, he resumed his university studies, and was exposed to UNIX for the first time, in the form of a DEC MicroVAX running ULTRIX. His M.Sc. thesis was titled Linux: A Portable Operating System' https://en.wikipedia.org/wiki/Linus_Torvalds

The realm where 'developers' live is extremely rarefied. People capable of writing compilers may benefit - extensively - from taking computer science courses, as there is no reason to reinvent, much less reimplement, decades-old concepts. This is one of the explicit benefits of the GPL: it allows code to be reused, examined, and tweaked without a developer needing to waste time on mundane work, or on solving for themselves a problem that was originally solved three decades ago. This is also a major reason that Torvalds is still in a position to guide an OS, by the way - the GPL world tends to work on the basis of talent, not credentials. The world where developers like Torvalds live (possibly somebody has also heard of this little thing called Git - yeah, Torvalds again, using the GPL - https://en.wikipedia.org/wiki/Git ) is not adequately covered by looking at CS majors in the U.S.

Dude, Torvalds wrote Linux in the 1990s as essentially a clone of Unix for desktop PCs. In the ensuing 20+ years, it has changed drastically and evolved far beyond his original code. You can't credit him with Android. There were entire eras of Linux development - Red Hat, Ubuntu - between his version and Android, which was developed for yet another platform.


ARPAnet is responsible for the whole internet. Even though the actual internet was developed independently at universities by bored academics who wanted to build an open version of it that anyone could connect to.

It's UNIX. Android runs the Linux kernel, which is the bit of Linux-the-OS that Torvalds actually wrote. And while academics built out the 'net (Berkeley Sockets) I don't know that they "added" openness. They too used openness.

The Linux kernel has changed a lot since 1993. Everything about Linux has changed a lot.

ARPAnet was a closed network, you had to be hardcoded into it. It didn't use ethernet, or go over telephone lines, it used dedicated physical lines that only specific machines could connect to.

The internet that we know today was never physically connected to ARPAnet. ARPAnet was a demonstration project that proved certain concepts, but it was never the "backbone" of the internet. The internet that we know today actually evolved out of USENET, not ARPAnet.

Commercial Internet service providers (ISPs) began to emerge in the late 1980s. The ARPANET was decommissioned in 1990.

1990 was very late in 'net evolution, and much important (open) architecture and usage (ISPs) were in place by then.

It would have taken Google four years to get to the Linux starting point. They were (and are) smart, so they used an existing, working, and well-maintained base to build their system upon. This is a characteristic of a large number of very popular software systems: OS X is a variant of a Unix OS, and Safari and Chrome are built upon an HTML engine, KHTML, written by a couple of Germans who wanted a challenge. Any software gets rewritten and replaced over time as maintenance and operation change, but again, Apple used KHTML to build Safari and got a two-year head start.

What was and is revolutionary about Linux and Linus is the process. Software is a bunch of small and important details put together to make something larger; the complexity is enormous. Failed software systems are almost the norm. The tooling for Linux, Git, the build systems, etc are used extensively in the industry. Read the story of how git was developed; Linus used patches sent via email because the software repository systems didn't fit. He said what he needed, someone wrote it, and everyone is using it now. It is called Git.

The same thing happened with Node, which is the basis for a substantial proportion of internet web applications. A guy had an idea, wrote some code, and gave a talk that is painful to watch because of his shyness. It worked better than what existed, and a bunch of other people started pounding on it and improving it. Google and Facebook web applications use it. Their demands and smarts have improved it substantially. Facebook released a number of their internal software libraries, as have Microsoft and Google. Netflix as well. They know that their applications can improve with the ideas of other people.

Yes, so it's unfair to say "Linus Torvalds developed Android" when it was actually a collaborative effort involving thousands of people over 20+ years. Torvalds was certainly important, but you wouldn't credit the two Germans with inventing Chrome just because they invented KHTML. All sorts of software is developed based on pre-existing code written by someone else. It's the norm for software to be developed this way. The one guy who developed Node doesn't claim credit for Google and Facebook.

I started a top-tier CS program in 2002 and dropped it in 2003; while the market wasn't the main driver, it certainly didn't help.

@Dan Wang: the Stack Overflow user survey you link is quite informative. 75% of them have bachelor's degrees or higher.

I think this shows a trend. Today, a good fraction of math, physics and engineering graduates have developer experience with Matlab, R, C++ or Python. This kind of experience is enough for a great variety of jobs that require "coding" but also analytical skills, stats, and specialized scientific knowledge.

I have a geoscience background, and coding C++ is part of the job. However, there are more complicated tasks that need a computer science background. One of my thesis advisors leads a software company that develops numerical models. When I asked about job opportunities, the answer was that the company needs one or two guys with physics knowledge, and the rest of the employees are "hard" computer science.

Thus, computer science people are employed in very specific tasks: HPC, machine learning, automated image processing, and software/hardware development. But not many people around the world work on these challenges. Not as many as do applied coding.

The student loan program offers loans for any major at the same terms and at the same interest rates. Think about that for a second.

Dear Lord Hazel, you aren't proposing a system of student loan underwriting that would consider the probability that a student would pay that loan back are you? That would make you a racist and a sexist under your own social justice worldview, given the racist system of privilege that forms the basis of varying levels of compensation in our economy. For shame.

You know, Thomas, it is possible to be a free market libertarian, AND not be a racist!

Hazel, you seem to have missed my point. You seem to subscribe to this social justice worldview. My point was: look how easy it is to find racism anywhere. If Donald Trump proposed what you just proposed TNC et al. would claim it is a racist policy on the basis of disproportionate impact. The takeaway is that maybe you should be a little more careful before you jump on the bandwagon of racist outrage du jour.

I see, anyone who thinks that racism still exists and that blacks (and other minorities) are still treated unfairly in many aspects of life, such as law enforcement, is a "Social Justice Warrior". You're either on the alt-right "race realist" bandwagon, or you're the ENEMY, a "Social Justice Warrior", in league with communists who are trying to destroy America.

Yeah, these guys don't get nuance, moderation, or degree of difference. It's part of what makes them so shrill.

There are only two options: 1. disparate impact is dispositive proof of white racism, or 2. disparate impact is not dispositive proof of white racism. The people who believe the second thing range from Ben Sasse to the KKK; that group is composed of racists and non-racists alike. The people who believe the first are uniformly racist, because the first requires a belief in a unique, innate racism which is expressed in white skin.

I have no idea what you are talking about. Disparate impact may, in some cases, be the result of implicit racism in policy. In other cases, it might have nothing whatsoever the fuck to do with racism. It would be absurd to conclude that disparate impact NEVER happens because politicians are predisposed to cater to the interests of white voters and thus pursue political policies that intentionally benefit white voters more.

What we really need to look at is whether the policy is objectively just without regard to race. That's precisely the value of limited government. It prevents government from biasing the market in favor of favored groups, such as whites. The fewer laws there are, the more people's success is determined by their own merits, and not by being part of a favored class.
Now, we can have a debate about anti-discrimination policies and societal racism and whether government should do anything to counteract the fact that some people are going to be assholes to certain groups of people. But I favor non-governmental solutions to that - i.e. having social norms where racism is heavily frowned upon. Which is what makes the alt-right so fucking awful. The last thing society needs is a bunch of people walking around saying it's okay to be racially prejudiced. Because absent such social norms the only way to make sure that black people ARE treated equally and ARE given a fair shot WILL be government. If you don't like anti-discrimination laws, you should be radically in favor of political correctness. If you don't want government to have to step in to force people to stop being assholes to people based on race, then you have to work to enforce norms against it.

"Being assholes to people because of race" is not committed only or even mostly by whites. I encounter racism all the time, and it's all against whites.

We've had laws against discrimination (other than so-called affirmative action) for 50+ years. More to the point, the only whites who are old enough to have ever committed it are over 70. So it's about time we end AA and make both laws and mores 100% color-blind -- and permanently accept the fact that those who are still economic failures today, when nobody is oppressing them, are responsible for their own failure.

I agree that laws and mores (social norms) should be 100% neutral with regard to race. If we want a society where people are treated like individuals instead of as the "average black person", then we have to have really powerful norms condemning racism. However, I'm not willing to say, in today's society, that if someone is not successful that's totally their own fault and has nothing to do with white racism. There's still a lot of racism, and there are still a lot of ways in which governments pursue policies designed to benefit white voters more so than others.

You should check out 'The Color of Law: A Forgotten History of How Our Government Segregated America', which details how past federal, state and local policies deliberately enforced racially segregated neighborhoods, destroyed integrated neighborhoods, and systematically turned established black neighborhoods into slums by rezoning them for industrial uses. That's a perfect example of "disparate impact" as a racist outcome of how the system is biased against minorities. Rezoning black neighborhoods into industrial zones destroyed the equity value that black families had in real estate, and it was done precisely because blacks had less political power than whites. This sort of thing is still going on. Minority neighborhoods are more likely to be condemned as "blighted", taken through eminent domain, and bulldozed in redevelopment projects. And minority neighborhoods continue to be more likely to be used as sites for undesirable things like landfills and water treatment plants.

You are rapidly approaching the Art Deco limit for shitposting, except he is actually scholarly on a small subset of subjects, whereas you appear to be an ignorant blowhard on absolutely everything.

I perceive your distaste with what I've written, but perhaps, since it is sheer ignorance, you could enlighten me. Show me that disparate impact is not evidence of racism to the left and I shall never post here again. I mean, you can't, but please try.

Since the Great Recession, there has been a massive shift in demand from soft majors like English and History to the hard sciences, with Computer Science taking the lead. Enrollment capacity in CS has tripled in many schools but not kept pace with demand, leaving departments scrambling to hire new faculty and having to ration scarce slots. Decades ago, entering freshmen were mostly "undeclared," sometimes not settling on a major until sophomore year. Nowadays, college applications require that a major be specified; acceptance is jointly to the school and to the major. Acceptance rates within a school vary greatly by major, and CS is by far the toughest. (Undeclared is a possible choice, but it's very tough too: the expectation is that an undeclared applicant is likely to be a CS major in disguise.) The binding constraint on CS enrollment is on the supply side, and we haven't reached equilibrium yet.

There wasn't any demand for English and history majors BEFORE the recession.

Women are over represented in English and under represented in Computer Science, and yet, you've proposed that we charge a higher interest rate for English majors and a lower one for Computer Science majors. Hazel, it seems like you have a lot of internalized misogyny, you know, according to your standards of proof for misogyny. You're also a racist.


I think student loans should be handled by private banks, rather than a government agency. Which would result in the banks deciding who to give loans to and for what purposes. Generally this means the low risks will pay less interest and have an easier time getting a loan. Which would mean that students would be incentivized to enter academic fields that are in demand in the job market.

Do note that this applies independently of whether the student in question is male or female. Women are just as capable of getting CS degrees as men. If everyone is incentivized to enter STEM fields, EVERYONE is incentivized. Maybe this will encourage more girls to keep up their superior math skills from grade school. (Research shows that girls outperform boys in math until sometime in high school - which IMO, is a cultural artifact of the signals girls get about what is considered normal or cool for girls. Also, computer programming is so detached from any gendered evolutionary task, hunting vs. gathering, that it's hard to imagine a reason why women would be any less good at it.)

Hazel, it would be more expensive for everybody because student loans are subsidized.

"If everyone is incentivized to enter STEM fields, EVERYONE is incentivized."

It works in theory.

"Research shows that girls outperform boys in math until sometime in high school"

Link? If your evidence is that girls get better grades in math, it doesn't count. Teachers grade on effort, not ability.

"I think student loans should be handled by private banks, rather than a government agency. Which would result in the banks deciding who to give loans to and for what purposes. Generally this means the low risks will pay less interest and have an easier time getting a loan. Which would mean that students would be incentivized to enter academic fields that are in demand in the job market."

I agree, Hazel, and this makes us both racist and sexist according to the new left.

Fascinating. Evidence that shows girls score better than boys in math doesn’t count because (insert confabulated reasoning here).



" Evidence that shows girls score better than boys in math doesn’t count because"

Grades are not "scores," and no, they don't count. As far as test scores, the link shows them to be equal, no advantage for girls. It's hardly evidence against biological causation as you imply. Young girls are also much more equal to young boys in physical strength than are adult females. We know from twin and adoption studies that heredity as a factor becomes more important as a child ages.

Yes, grades are scores. Grades also test persistence over time better, as noted in the article, rather than success on isolated performance tests. Some people might score well on a standardized test once a year, but don't consistently study the material well enough to get good grades. Also, consistent effort matters, genius being one part inspiration and 9 parts perspiration and all. I know tons of smart guys who are total slackers who will never amount to much in spite of their brains.

Hazel, you admit that grades conflate ability and persistence, yet you use grades to argue equal ability within demographics without controlling for persistence. Hmm.

Persistence IS ability. Some people are more disciplined by nature, and that's part of what makes them more successful.

Persistence and capability are different. I could try, as millions have, to be a breakthrough mathematician, or an NBA MVP, or an Olympic sprinter. No amount of persistence can allow one to exceed one's limits. We know IQ correlates with mathematics, we know male IQ has fatter tails, and we know research mathematics is dominated by men. The story fits. This should be the default view, but it isn't, because of sheer sexist tribalism and self-dealing.

The NBA isn't racist and sexist, nor the math department, nor the Olympics, nor the Navy SEALs.

@Jason Bayz:

Are you saying American schools don't grade based on work output, but rather on perceptions of the amount of work put into the tasks (which would be very subjective)?

So, if a student with high math ability correctly answers all the questions on a math test, he will get the same (or lower) grade as another student who couldn't answer all the questions correctly but who the teacher thought put in a lot of effort?

If the above is true, then it would be mind-blowing (and mind-blowingly stupid). If not, then can you state precisely, and with examples, what you meant about grading?

Again, the argument that the article makes is that isolated standardized tests don't actually test "ability". Classroom grades are actually a better measure of ability because they track consistent performance over time. This is not about "effort"; it's that girls get better scores on average if you measure repeatedly throughout the semester, but boys outperform on point tests. Someone can cram for a point test as a challenge, but basically be a slacker the rest of the year. Being able to score high on isolated point tests doesn't mean you have more math ability. Being able to consistently get better grades, over and over, throughout the school year does.
That's what I'm talking about. It's not "effort"; it's that being consistently good at something over time is a better measure of ability than getting a high score once a year.

Yes, but Wall Street at least was happy to hire people who had solid grades in any academic subject.

> I’d be pleased if more discussion focused on the power law distribution between top programmers

Two things: firstly, is there a power law distribution in skill in other knowledge work? I suspect so, most things in the universe seem to follow power laws, especially living systems.

Secondly, as for a partial explanation: the best performance I see in software engineering centers around seeing the big picture and not being distracted by the details. However, being able to focus intently on the details is the only way to become a decent programmer. This tension defines the arc of good engineers: early focus on minutia to the exclusion of all else followed by a shift to a holistic perspective in which fulfilling needs is the most important thing and mastery of technicalities is simply a means to an end. Not many are able to navigate all the way from one end to the other.
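The tail-heaviness claimed above is easy to visualize with a toy simulation. This sketch uses made-up numbers, not data about actual programmers: it just contrasts how much of total "output" the top decile accounts for under a thin-tailed normal distribution versus a heavy-tailed Pareto one.

```python
import random

random.seed(42)
N = 10_000

# Hypothetical "output" per worker under two distributions.
# Parameters are illustrative only.
normal = [max(0.0, random.gauss(100, 15)) for _ in range(N)]   # bell curve
pareto = [random.paretovariate(1.5) for _ in range(N)]          # heavy tail

def top_decile_share(xs):
    """Fraction of total output produced by the top 10% of workers."""
    xs = sorted(xs, reverse=True)
    return sum(xs[:len(xs) // 10]) / sum(xs)

print(f"normal: top 10% produce {top_decile_share(normal):.0%} of total")
print(f"pareto: top 10% produce {top_decile_share(pareto):.0%} of total")
```

Under the normal draw the top decile's share stays close to 10%, while under the Pareto draw it is several times larger; that gap is the intuition behind "power law of programmer skill" claims.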

"Two things: firstly, is there a power law distribution in skill in other knowledge work? I suspect so, most things in the universe seem to follow power laws, especially living systems."

One program can reach billions of users. That can't be said about the product of most fields.

> One program can reach billions of users. That can’t be said about the product of most fields.

True and important... but I don't see how that impacts developer performance.

Perhaps you're saying that the outsized impacts of good developers magnifies their importance and therefore perceived skill.

Perhaps the reason the tech sector is big neither in employment nor in value-added terms - or maybe the reason it is hard to measure - is that a large number of the big "tech" firms are really just media companies (Google, Facebook, Snapchat, Twitter, LinkedIn, Netflix).

And British Airways is "just" an airline. Almost all companies are tech companies, more or less these days. Some good, some bad (e.g. British Airways).

Yeah. At the end of the day, CS is about making websites.

General Electric Aviation, headquartered in Ohio, makes jet engines, but is not a "tech" company.

Snapchat makes a messaging app and is a "tech" company.

I think you are mistaken.


One of the most fascinating articles I have read about business. Now tell me, what business was Target Canada actually in? Or was in, rather - they aren't anymore, because they couldn't master it.

Any business which is successful and dominant in its market is a tech company. They have to be.

An opposite example would be Amazon. They had to become CS leaders and innovators to "just" sell books.

The cloud is invisible but don't misjudge the technical achievement. Much much bigger than glamour tech like AI or VR.

the very nature of software development, as a profession, is that it doesn't scale well by throwing manpower (or money, which is typically just a proxy for man-hours) at a problem. this is robustly discussed within the field, but perhaps most famously in The Mythical Man Month [0]. Expecting mass employment from a field that is naturally disinclined to use extra human workers is unrealistic. An aphorism within the profession is that a good developer will automate their own job away and then go find a new project.
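Brooks's scaling argument can be sketched in a few lines. This toy calculation is my illustration of the idea, not an excerpt from the book: each pair of people on a team is a potential communication channel, so coordination paths grow quadratically while labor grows only linearly.

```python
# Sketch of the Mythical Man-Month scaling intuition:
# pairwise communication channels grow as n(n-1)/2,
# while the labor added grows only as n.
def channels(n: int) -> int:
    """Number of pairwise communication channels in a team of n people."""
    return n * (n - 1) // 2

for n in (2, 5, 10, 50):
    print(f"{n:>3} people -> {channels(n):>5} channels")
```

Going from 10 to 50 people multiplies labor by 5 but coordination paths by roughly 27 (45 to 1,225), which is why "throwing manpower at the problem" tends to backfire.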

the "value add" component is precisely why the most famous tech firms are really media/advertising companies. that is an easy source of value add in a field that trends towards value subtract as a default. software subtracts value because it removes human participation and thus tends to lock users into already established paths-of-least-resistance (those that have been automated). thus the difficulty of getting internet users to pay for content, for example. it's harder (and more expensive, for the user) than free content.

then there are the soft considerations. this isn't so frequently discussed in the literature (for political reasons I suspect), but there is good reason to believe that not everyone can even do programming at a competent level, irrespective of their intelligence or even specific intelligence in mathematics. this isn't quite so robustly established, but there is reason to believe [1] that the problem is intractable because there are some foundational concepts about how programming functions on a logical and syntactical level that are sufficiently foreign to a subset of people that they are unlearnable to within acceptable professional standards by that subset of people. In other words, not everyone is capable of forming a correct mental model of what a computer does.

0 - https://en.wikipedia.org/wiki/The_Mythical_Man-Month
1 - https://blog.codinghorror.com/separating-programming-sheep-from-non-programming-goats/

I am a computer programmer (with a degree in the subject) and it suits me. My nephew asked me about computer programming just before applying to university, having expressed no previous interest in the subject. I advised him that articles by academics (which I had seen in CACM) showed a roughly 50% success rate in turning out competent computer programmers, and said that I was employable as a computer programmer because while in university I was keen enough on programming for fun to spend a lot of my spare time getting the practice that put me in the top 50%. My nephew is doing a PPE now, which he enjoys. If he doesn't find glamorous PPE-related work, he will probably train as an accountant.

I don't know if it is different in the top end of the industry, with every employee very highly skilled and a company desire to derive every possible advantage from automation, but in blue-collar programming there are a lot of people doing jobs like manual testing and click-by-click documentation of procedures that are neither glamorous nor particularly enjoyable. Less skilled programmers tend to do more of this and enjoy their work less.

The top of the industry is very picky about hiring, mostly using mechanisms that barely work, and cuts out a lot of competent programmers. This turns the industry into one with social classes, not unlike law: the $500K+ total compensation exists, but good luck getting into those firms directly from a weak university. It's a slog as you climb up from lower-quality places, either bit by bit or by investing a lot of personal time in your public image.

Those firms at the top of the industry are really defined more by their great selection of market niche. They then get to hire people who are in general more competent, but ultimately what one has to do in them is not fundamentally different from what one does in a mid-tier place (for instance, a top employer in a crappy metro). Once you ask employees from the top tiers who have been there for a bit, what you hear is boredom and politics, as there are only so many challenging projects to go around. Thus you find that many things that would be handled by the barely literate in a weak firm are handed to people who would be seen as top talent somewhere else. They just decide it's OK to be there due to the very different compensation.

If there's a secret to the computer science market, it's that there are huge differences in compensation caused by what the market can bear. The skills used in most positions at the big four are no different from those used everywhere else: it's just that those firms have money fountains, so they can afford to pay a lot more. A lot of programming work out there just doesn't get done because the firm doesn't have the economies of scale that would make the job worthwhile, so such firms end up paying less.

This is why the saddest part of the industry is that we are not very good at identifying quality, not even during interviews: Some of the best programmers I know would never pass the interview that I got through at a top employer, while quick-thinking yet unproductive people work everywhere. Career outcomes and skill seem only loosely correlated.

Re: This is why the saddest part of the industry is that we are not very good at identifying quality, not even during interviews:

This is true across the whole economy. The professionalization of the hiring process via HR departments has not in fact improved the process at all, except maybe to add a degree of shielding from discriminatory hiring lawsuits.

We're actually decent. Are mistakes made? Once in a while, but generally the top firms are quite good at identifying talent.

I worked at a large Wall Street firm for eight years. I have considerable respect for our HR people there, and my department head boss was the best boss I've ever had. Yet they made their hiring mistakes too. At the end of the day, hiring is something of a crap shoot with a great deal of uncertainty to it. Every hiring manager who has been at it for a while can tell tales of candidates who had great resumes and interviewed well-- and were total disasters as employees.

I strongly doubt the reactionary-faculty idea. What I hear is that CS has the reputation of being the easiest of the engineering majors. I heard this when I was an undergrad, and multiple people who work in academia told me they noticed the same thing from the inside. One reason they identified is that CS faculty often see their jobs as temporary positions, so they don't bother themselves with details such as whether half the class plagiarizes their code.

I think prestige is the most important factor: engineers are nerds, but programmers are the uber-nerds. This is worsened by immigration; programming has become associated with Indians. Ironically, the White flight this causes helps cancel out the wage-lowering effects of those immigrants for the natives who remain.

When I went into CS, I knew it had low prestige but didn't care. I wanted to make money; who cares what the normies thought? But I didn't consider that having low prestige would have real-world consequences. Companies hate paying high wages to low-prestige workers, thus the push both to replace Americans with H1Bs (there's no similar push for other high-paying jobs like law, medicine, or finance, and other forms of "engineering" are targeted to a lesser degree) and to encourage ever more people to go into the field. I'm sure that explains much of the "women in computing" nonsense as well: companies want to replace their high-paid and often rebellious nerds* with lower-paid, more compliant women. It hasn't worked, but someday it might.

*At my company, programmers can show a level of contempt for management that would be unheard of from other employees; the disrespect goes both ways.

Programming is a blue collar job. Who wants to spend tens of thousands going to university to wind up in a blue collar profession?

If programming is all someone knows, it is indeed a blue collar job.

However, engineering + programming = $$$ and status.

Really? I don't think so.

If 80-100K per year in your first job is not enough, what is enough? Consider: these people actually like the job and it's not a manager position.

I can tell you that in a few weeks I'll be looking for a job in science + coding. The offers look good; whatever I get is fine.

Uhhh, yeah, that's pretty much nobody. I get that a PhD in physics with a strong emphasis on mathematical modelling and knowledge of numerical methods can be lucrative, but a CS degree isn't really going to get you there, and neither will any typical engineering undergrad where you also learn mostly useless crap.

Does no one here have experience in SF or Seattle? I work in west coast tech, and it's weird reading people speculate on how much people can make, or whether it's high status or not.

Are you all east coasters and Europeans assuming it's the same everywhere?

The average starting salary for my CS friends, including stock, is about 120-150k/year at age 22. I'm a few years older and make just shy of 200 a year. Most people on my team make more.

It's considered one of the highest prestige jobs in the city.

This talk of 80k/year and low prestige just isn't true for US west coast tech.

For much of the country (outside some high-cost metro areas) the wages are much lower -- but so is the cost of living.

Huh? In what sense is it blue collar? You work on your butt, not your feet, it's done in offices not factories or mines, and you shower before work not after.

Janitors are low prestige.

Are you a janitor?

Well venture capital beseeches you, Natasha. It's what us humpty dumpties stoogies called new money paying off student loans in the old days when people used to throw bottle caps into buckets. And that game you're actually curating? No one gives a flying color why you play it on the subway. Look games are fun no doubt in my mind. Let's play a game called liar's die.

If you donate 50% of your salary to that orphanage, play that game honey. Let's called that a perfect six. That's pretty prestigious.

If you donate 40% of your salary to the government to help pay for the student debt bubble, I would stop playing that game immediately.

If you are really busy at work building a game that manufactures and more than that manicures and more than that mannequins and more than that transforms people lives into drudgery and then curates off it, it's called wagging the dog. Let's called an Ace of Spades. You've never eaten grass in your life. And that's a problem. Games are important no doubt in my mind, they make you feel good.

Ex programmer here. I never blamed my employer for a wage. I went and got another one. And another. Until I found equity in an IPO.

Nobody cares about plagiarizing code in industry either. It is standard practice. Why reinvent the wheel? Nobody is going to read the source code anyway.


>Nobody is going to read the source code anyway.

That is highly dependent on whether the source code is encapsulated - does it reliably produce the output that is expected given the inputs I give it?

If it doesn't, the source code will be studied rather intensively.

Yes, it's a good idea to only steal code that works.

THey didn't like the lint from gustav klimt - bobio dylano

actually the industry cares quite a lot about how, where, and by whom code is re-used. there is a reason the GPL exists in the first place after all.

> companies want to replace their high-paid and often rebellious nerds* with lower-paid, more compliant, women. It hasn’t worked, but someday it might.

I believe it will never work and many companies will dig their own graves trying to make it work. the job itself cannot be adequately performed by compliant, agreeable people. there's decades worth of evidence for this and yet nobody really seems willing to accept it as the truth despite the evidence staring us all in the face.

Please explain. To be sure, any job that requires innovation needs something more than go-along-to-get-along types, but wages are simply a function of supply and demand so lower wages do not necessarily point to incompetent workers, and any workplace that has too many "rebellious" or just plain nasty and unpleasant people in it is headed for trouble. The truest advice Scott Adams ever gave (from "The Dogbert Management Handbook") was "Fire the A$$holes".

I wish I had some good links and citations on hand for you, but I'm posting on my coffee break right now, so I don't have time to adequately source materials -- which is not that easy to do in this field anyway, since study of the psychology and pedagogy of computer science is still very nascent.

however, from my own experience, and from my reading of history, I find that there is something essential about a somewhat disagreeable attitude when it comes to doing computer programming. I'll just outline a few thoughts here that hopefully provoke some good discussion.

1. the task itself is a bit on the anti-social side (which isn't to say it is not collaborative, it is highly collaborative, but the working conditions involve hours spent alone in your own head)

2. it is not accidental that the archetype of "elite" computer programmer is the hacker. hacking (in its traditional programmer usage meaning skillful deconstruction of complex systems, not its hollywood usage meaning breaking-and-entering) absolutely requires the programmer to think about things in a disagreeable way. it is predicated on doing what you are specifically advised not to do. find the edge cases, the glitches, the overflows, the unhandled exceptions, etc.

3. debugging is an arduous task that generally consists of wading through opaque, inadequate, or sometimes outright deceptive feedback cycles in order to isolate the root cause of a problem. it legitimately feels (to me anyway) like being in a heated argument with a machine. another highly disagreeable task.

4. there is a bit of folk wisdom in the industry that says "the users really have no idea what they want and whatever they tell you they want is wrong", and this turns out to be true so frequently that it has become established as wisdom. that is a fundamentally disagreeable outlook. don't trust the users. they don't really know what they're doing, and it is your job to discover what they need even when they are telling you that they want something else.

5. disagreeable personality is not the same as "nasty and unpleasant". this might be a misunderstanding on your part, and it is probably a common misunderstanding. a disagreeable person can be very polite and decorous but the way they conceptualize problems and social relationships will be very different than an agreeable person. a disagreeable person is more concerned with correctness or truth than with social harmony. that is very nearly a template for being good at programming.
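The edge-case hunting described in point 2 can be made concrete with a toy sketch (hypothetical code of my own, not from any commenter): a function that satisfies the happy path but falls over on the input nobody planned for.

```python
def average(xs):
    """Arithmetic mean -- fine for the non-empty input the author imagined."""
    return sum(xs) / len(xs)

def average_safe(xs):
    """The hardened version, after someone went hunting for the edge case."""
    return sum(xs) / len(xs) if xs else None

print(average([1, 2, 3]))   # happy path: 2.0
print(average_safe([]))     # the input nobody planned for: None
# average([]) raises ZeroDivisionError -- exactly the kind of unhandled
# exception the hacker mindset described above goes looking for first.
```

The "disagreeable" move is simply to feed the code what its author specifically did not intend it to receive.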

Humans interact one way. Computers interact another way. Programmers are humans who are willing to go halfway and speak computer languages. Those scripts, queries, etc. are arbitrary but precise abstractions of pure logic.

Not all humans want to mess with pure logic. Dangerous stuff.

Any program based on a wish, but not harsh logic, will fail.

Programmers do tend to be introverted, and I can see that as a helpful trait for the reasons you state. However, it's a long way from that to being an outright jerk who alienates coworkers and creates trouble in the workplace. (We may have a confusion over what type of person we are talking about -- I am thinking of the 4Chan type of anal orifice, not the shy guy in the corner cube.) Also, finding someone who can think like an end user is absolutely golden, since all too often the worst mistakes are made by developers/programmers who assume everyone thinks and works the way they do -- "No idiot would ever do this, therefore I won't code for the eventuality" -- and that leads to support calls and bug fixes because, yep, users can and will do that. I once got a job at a prestigious financial institution because in a very simple programming test I stuck a button labeled "Run" on the spreadsheet that contained the VBA program I was asked to create -- no other candidate did that; they just assumed the person using it would know how to navigate to the macro view and run the code.

The reason programmers are left in the dark is because their lamp shades have Utrecht . Across the danube, on the other side of town, the editorialists and the sales people cross hairs in regard to how best implement the platform. The reason sales people are now being given more responsibility is because they have the vision for what is necessary while the editorialists and other project managers can only play what is sufficient. The programmers can look at their product on the internet and add together the thoughts that cross their head and imagine a way to negotiate a larger steak at the dinner they aren't invited to.

part of it is that few of the practicing software engineers I know feel that their college education added much if any value to their careers in terms of actual skills acquired vs. signaling benefits, and this is backed up by data*. But I guess that's true for every profession except academia.

*Google, for example, doesn't put much weight on your educational background beyond the specific projects you contributed significantly to, I suppose.

This is true for virtually all engineering disciplines. Universities do a horrible job at training in technical disciplines.

I will add that top 50 schools probably do a better job. Google does actually care somewhat about degree pedigree.

I continue to strongly believe that the main advantage of the top schools is network effects / reputation: people believe that, say, MIT is better than, say, University of Iowa, so stronger students try to get into MIT and employers then think the reason the MIT grads are better is the school, not the raw material. I think that coursewise, the main difference between the "top tier" schools and the "middle tier" schools is that the top tier schools may have more opportunities for quality undergrad research. But, at least in engineering, that can actually be a negative: some schools tend to turn out research-oriented graduates, and some turn out more design-oriented graduates. Most companies need far more of the latter than the former.

Well, the trend is just the opposite here in India. Every student who pursues an engineering degree here (there are more than 1.5 million of them every year, btw) wishes to take up computer science & engineering or a related branch like information technology. The main reason for that is market demand (which is near its saturation point now). And the goal of every CSE student is to get a tech job in the US (for which there are not enough local CS graduates). But with Trump's new policies there will be a lot of vacancies in this field, and I don't think the demand will be fulfilled in the foreseeable future. It takes time to generate interest in a particular field when the current numbers are so discouraging.

The engineer has no place in this work really anymore. If we think about DaVinci, was he a scientist or an artist? All his other scientific studies were perfect and most of them rival modern 3D archetypes. The main way you know you are a programmer is when you have to go downstairs.

Many years ago in high school my friends and I learned FORTRAN IV at the local university (we'd finished calc and ordinary dif eq and were out of stuff to do) and it was a revelation. One buddy who also monitored the courses went on to Microsoft in the early 80s and has given away more than I'll ever make. He has no degree; so feel free to look down on him as he flies overhead. I'm on to STAN in R these days and I can still say that learning a language is like learning to fly. Once you've got it you might as well leave the nest and see what there is to be seen. No need to wait around for "those who can't do, teach" academics to lay another brick in the wall between you and where you want to be.

'Many years ago in high school my friends and I learned FORTRAN IV at the local university '

What, they did not teach it in high school, using a school-system-owned HP mainframe? Not really the high point of my education, as two years of Fortran programming convinced me that there are better things to do with life than program.

Alas, I'm well into my 50s now, and it was an IBM 360 with the WATFOR compiler. I failed to foresee "shrinkage".

We had old IBM punchcard machines that they would take to the local university to run. FORTRAN was the game but by the time I graduated we had PASCAL on a VAX at localU (if you got the shoulder tap) and of course a bunch of TRS-80s where you could write code stored on cassette tape to recreate Pong.

'We had old IBM punchcard machines'

My high school had a couple - IBM of course - in the typing classroom, oddly enough. But by that point, punchcards had truly faded, compared to the luxury of time sharing and scheduling time in front of a terminal. And scheduling time at the printer.

Learning how to type is a basic skill for a programmer, but I am going to guess that it has been decades since most school systems have offered such an elective class to students. Assuming that in today's U.S., students even have much opportunity to take different subjects as electives - I honestly don't know.

I'm in my 50s too - and Fortran is still used.

I see it clearly now and now staring serenely, I only must say that hear me goliath for I have lost and thou art won and I have made my peace with eggs

"as two years of Fortran programming convinced me that there are betters things to do with life than program"

And what a stunning success that turned out to be. LOL.

There's a lot of focus on demand side effects, but the supply side seems to be the more promising explanation. Anecdotally (based on the University of Washington, if it matters), CS enrollment is limited by the CS department. CS departments have overly high standards (or explicit enrollment caps) because CS departments have a lot of political power, which is also an effect of the growth of programming jobs in economic importance and prestige.

Furthermore, individual professors tend to have more power, as their job skills are more valuable and more immediately transferable than in many other fields. The weird flip side of the academy being too theoretical is that the deep-pocketed tech industry is willing to fund basic research and directly compete for talent of all kinds. One friend lost 2 consecutive PhD advisors to industry. This power to leave for similar research in the private sector keeps departments from compelling too much teaching from any one professor, thus keeping the number of majors down.

I make a distinction between coders and software engineers. Coders have expertise in a specific language that they can ply for a living. The downside is that they can be caught in a talent trap and find their value diminished if the industry moves in a different direction. Software engineers are trained in the principles of building systems but aren't necessarily trained in any specific language. Since the demand is for coders most software engineers wind up with a specific language of choice.

There is definitely a split in attitude along those lines. The people who jump on new tech cycles do better than those who find a niche.

Personally I've used the terms "programmers" and "plumbers" to describe this phenomenon. "Plumbers" only know how to take off-the-shelf open-source components and hook them up to the other elements within the application framework. (Plumbers because they simply "plumb" the components together without understanding how they work.)

"Programmers" on the other hand have some knowledge and, if pressed, can build the components and create software--not just wire up a bunch of pre-existing stuff.

The difference to me is that when a new project is started, a programmer starts sketching the architecture of the application, while the plumber goes off to GitHub.com to start looking for components that do roughly what they think they need.

There is such a huge demand for software developers in our field that plumbers aren't getting weeded out (as they did when the dot-com crash happened in the early 2000s), and we're seeing a lot of very bad stuff being pushed on customers. (How often do we hear about a security bug being found in some library, only to find that the library has been used in thousands of different mobile or desktop apps?)

Plumbers because they simply “plumb” the components together without understanding how they work. [CITATION NEEDED]

I've been doing this for decades. I've worked at weak firms, and currently I work at one of those places everyone would recognize, building distributed systems. However, I've yet to see someone who is good at a single language but whose skills just can't translate.

The idea that there's such a thing as a special 'software engineering' discipline, with separate training, whose practitioners do work of a different nature, is something people tell themselves to feel better than others. Places where people with that mentality are in the majority create nonsensical architect jobs, where the easiest part of the work is sent to the people with higher status.

I find it quite sad that we have an industry so full of misconceptions.

Most "programmers" could benefit from academic study of algorithms, and of finite state machines in particular. I can't count the number of times I've seen a "programmer" build a naive O(n^2) algorithm when some time spent on analysis and design would have shown that there was an O(1) solution. This becomes extremely important when you're doing hard real-time systems, where e.g. the airplane crashes if the flight computer overruns the fly-by-wire frame's time deadline.
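As a minimal illustration of the quadratic trap this comment describes (my sketch, not the commenter's): repeated membership tests against a list cost O(n) each, so the whole loop is O(n^2), while a hash set makes each test O(1) on average.

```python
def common_items_naive(a, b):
    # `x in b` scans the list b from the start every time:
    # O(len(a) * len(b)) overall -- quadratic when the lists are similar sizes.
    return [x for x in a if x in b]

def common_items_fast(a, b):
    # Build a hash set once; each membership test is then O(1) on average,
    # making the whole loop O(len(a) + len(b)).
    b_set = set(b)
    return [x for x in a if x in b_set]

print(common_items_naive([1, 2, 3], [2, 3, 4]))  # [2, 3]
print(common_items_fast([1, 2, 3], [2, 3, 4]))   # [2, 3]
```

Same answer either way; the difference only shows up as the inputs grow, which is exactly when a deadline-bound system can't afford the slow version.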

Here's a personal anecdote

About ten years ago when I entered the university, I had to make a choice between computer science and physics. Both subjects interested me, computer science perhaps a bit more than physics. I ended up choosing physics anyway. A major factor behind this decision was rumors about horrible working conditions in the software companies: programmers are made to work like dogs when project deadlines are closing; people developing chronic back problems by the age of 30 because of sitting on a chair 8 hours a day for ten years; people losing their mental health due to the intensive nature of the work combined with high pressures to perform.

At no point was my decision based on salaries of programmers vs physicists. I didn't have any clue about the relative salaries of the two professions and as a naive 18 year old living with my parents the question didn't really interest me at all.

I've since learned that these factors depend a lot on where and how you work and are not unique to software industry. However, writing software for living still has a reputation of devouring people's souls strong enough to persuade someone who is actually interested in learning programming, like my younger self, to not enter the field.

Re: people developing chronic back problems by the age of 30 because of sitting on a chair 8 hours a day for ten years;

Welcome to pretty much any white collar job.

At most, only 1.5 years of my CS degree was of any use to me -- probably less than that. The rest was a waste of time, money, and lots of energy. It was mostly a jobs program for academics. I think that's true for most college degrees. I suspect that over time we'll see the university walls come tumbling down in other unlicensed fields. If any of my kids want to become programmers, I'd encourage them to explore avenues outside the university. Unless you want to work in a niche field, there's no need to take all the difficult science and mathematics courses required for a CS degree, not to mention all of the other fluff courses.

I graduated with a CS degree in the early 90s, when you still needed that degree to get your foot in the door. It was rare back then to see a programmer without some kind of degree.

The demand of the .com boom changed everything, along with workplace dress codes. People who had no business programming were getting jobs as programmers. The .com bust corrected much of that. However, the previous barriers to entry never returned. (Likewise, most places don't allow shorts and flip-flops anymore, but I've not met a programmer who still has to wear a suit and tie to work.)

Part of that also has to do with the nature of the companies hiring programmers. Back when you needed a degree, it was mostly larger corporations that had the programming jobs. In the 90s that all began to shift to smaller companies. The smaller companies were more likely to hire someone without the CS degree. The .com companies were even more likely to do so.

As a young person in the 80s and much of the 90s, it was too difficult to be self-taught. The hardware, software, and training materials were out of reach to most people.

Today, it is amazing how much material is out there. The challenge now for an aspiring self-taught programmer is figuring out how to narrow his focus. It's a Bernie Sanders problem these days: too many choices!

We probably do have too many 4 year degrees for 2 year topics.

Still, a good program could use those 4 years to survey a very wide field.

I agree there are huge differences between the average university programs in CS and other STEM fields and the top 30, but most people go to the mediocre schools, which are a total rip-off.

Agree. A CS degree has merit for doing advanced things or going to grad school, but not for being a run-of-the-mill developer. You don't need it to learn new programming languages or software packages. You only need it to do high-level design or advanced things like machine learning.

The knowledge you need to do that kind of stuff isn't CS, it is math. Or physics, where you learn how to model complex scenarios mathematically.

And at these higher levels it isn't the education, it is the individual with the knowledge and ability to put it all together. Education is a tool for the very capable to accomplish these things.

But there are only a handful of people to whom this applies. Where a degree is important is in a situation a friend described to me: they have a very expensive software system at their bank, and the technical people who show up need to have the credentials to match their customers. The skills are important, but at this level aren't that difficult. It is the ability to convince someone to hand over their business functioning to someone from who-knows-where. A degree is the first step.

I got a B.A. and M.S. in CS in the mid-to-late 90s and have worked in software since. I would still encourage a young person considering entering the field to get the degree, depending on their situation. While the absolute necessity of it may be decreasing over time, at the moment it's still immensely helpful to getting into the industry.

Going to college is extremely expensive and time-consuming. Why would one spend four years and tens of thousands of dollars getting a degree that has marginal utility in terms of landing a job?


This is the biggest load of crap. For the average kid who can afford to go to college, it is a great investment. Key word there is "average". Sure, if you are very smart, or very motivated, or very (insert positive trait here), you don't need college. But the average kid is none of these things. A decent university, however, will force that average kid to learn enough to hopefully get out of blue-collar range. The "you don't need college" meme is such bs. The people who don't need college are the smart people; they are going to be successful with or without school. If you are a regular Joe, however, you'd better put your butt in school.


Another thing to consider: the more smart and motivated you are, the less a degree is likely to cost.

I actually take the opposite view from you. Smart people often benefit hugely from a degree (and potentially also a postgraduate degree). Doctors, lawyers, investment bankers, CEOs, research scientists, college professors, etc.

The people for whom college is not a great deal are those who are either not smart enough or not motivated enough to get adequate grades in a marketable field. For them, the degree truly *does* have marginal utility.

I'm talking about CS, specifically. Right now, in the current job market, getting a CS degree doesn't really give you *that* much of a market advantage. And it costs a lot and takes away four years of job experience you could be gaining.

In general going to college is a great idea, assuming you're going into a STEM field. Its utility just happens to be somewhat reduced for CS because of market conditions and the high cost of tuition at the moment.

I'd say the utility isn't marginal, and the degree needn't be expensive. Even when you include opportunity cost.

In an ideal world tuition wouldn't cost $30,000/year. We happen to have a fucked up education market at the moment which is part of the reason people are turning to bootcamps and self-learning. Again, are you going to spend $120,000 and four years of your life to learn stuff you could learn in a year on the job with a couple of code bootcamps?

You're focusing on the "sticker price" number. Most students don't pay that much. When I graduated high school you could get a free ride (including room and board) to either the University of Oklahoma or LSU if you scored an 800 on the SAT math section. That's no mean feat, but recall that most aspiring C.S. types are mathematically inclined *and* the SAT, and particularly the math section, is eminently gameable. And you can take the SAT multiple times; it only takes one 800 to get the free ride (supposing it still exists).

The bigger thing is that only wealthy students pay full price these days. I used the govt.-mandated "cost calculator" at the University of Texas at Austin to estimate how much it would cost for me to send my son there. My household income is around the 10th percentile. Total cost: $10k/year. Humorously, Harvard wasn't much more.

Also, I posit there are things I learned while earning my bachelor's degree that I did not learn during the first year on the job and would not have learned in a code boot camp. But it's not really about the education; it's about the signalling and the value thereof. In my experience the degree is still valuable (esp. if you get a M.S., which is sort of the new B.S.), and worth the cost if you can get it at a reasonable price.

#1) It's not just MOOCs and non-traditional college options, even within the traditional university system, often physics or pure math or statistics or EE or even finance or business is a better route to desirable tech jobs than CS.

#2) Young people under 25 fantasize about starting the next Facebook, or being the next Elon Musk, or curing diseases, or even about history or politics, or even math or statistics. CS is often less interesting on that level. No young people fantasize about maintaining database servers or writing really obscure internal business apps.

#3) Tech offers less predictable career development tracks than everything else. Some people don't like rigid predictable career paths, and those people are probably less interested in highly structured CS programs.

I'm a developer with a CS degree (2002). I can tell you the reason.

Programming (and, to an even greater extent, majoring in CS) requires certain relatively rare (but not necessarily positive outside of the CS realm) personality traits. I've worked as a programmer in companies that have many programmers and many non-programmers, and specifically many non-programmers who often want to move (or are pushed by their job roles) into more technical roles. There is absolutely a personality "type" (use whatever metric you want, Myers-Briggs or whatever) that A) makes a good programmer and B) makes the person enjoy programming. If you don't have at least some combination of those traits, you will hate programming, and it's hard to be any good at something you hate doing. I can always tell ahead of time which non-programmers will A) show no interest in (and usually dislike) any programming or technical issue outside of what they need to do their job, as opposed to B) become fascinated by it, drop their old role, and dive in 100%. Is it because I'm some amazing judge of character? Nope: almost everyone in group B has already done it, so it's easy to tell.

There is almost no barrier to becoming a programmer (of some variety) at all, and among the traits I mention above is some melange of curiosity, self-motivation, and a strange impulse to optimize things; if you have them, you were already driven toward that type of field. The number of people in the US (I bet the stats in India/China are different) who would enjoy programming and be good at it but who are somehow not doing it (either as a hobby or professionally) is pretty low.

And finally, to do the even rarer thing of majoring in CS, you have to have that trait (and others), and you have to be willing to put up with 4 years of study (after all, there's likely a "college" personality that you then need to have alongside the even rarer "programmer" brain) in an academic field that is very different from the practical application. For example, a CS databases course (at least 15 years ago) is filled with arcane and esoteric features that aren't that applicable in real-life database situations. The traits that benefit from/enjoy this experience (rule-following, conformity, top-down instruction, a desire to learn arcane-but-useless academic points - as opposed to arcane-but-useful, which programmer personalities love) are dissimilar to many of the other personality traits that make a good developer: simplicity, a desire to optimize, self-guidance, etc... so you don't get a lot of people who have both sets of traits. I do, but I'm weird. Not in a good way or a bad way, just in a programmer/CS-major way.

To me it's "does this person really get into video games". Gamers generally are good programmers, because they tend to see coding as a game. You win when it compiles and runs without errors. That makes it fun. They also tend to enjoy optimization, as in getting the highest score in terms of speed, performance, etc.

I hate databases though. I'm an engineer. I just write C++ and certain scripting languages. No desire to learn SQL.

Hazel, I think you're absolutely right, and I have seen the video-game/programmer correlation (and am an example of it myself).

And there are definitely sub-fields in CS that appeal to different personality types. DB's are definitely one that, as you say, rubs some people the wrong way. (personally I love SQL). :)

Computer Science is too hard.
To be a good programmer for most tasks you do not need to be able to understand the most difficult algorithms. There are programming jobs that just require careful, tedious work.
Programming is also something that a person of moderate IQ like me can get good enough at through practice over many years.
I say start them slower and build up slower and help them.
Make CS a not-so-hard major; the geniuses can show their advanced skill by doing extra work and building portfolios.
I think if it is an easier major, more people will take it and stick with it through graduation.

Programming can be hard, it can also be dull. A manager who supervised at least 70 well paid USA programmers said that the smartest people often quit or get fired because they just don't want to do the kinds of programming assignments that need to get done. Even when they are attracted to the salary and employment perks, they just don't want to do the day to day tasks. There often isn't the opportunity to apply exotic math or science or intellectual growth experiences that some people crave.

Yeah, this is why it's more fun to be in some technical discipline that uses programming, rather than just be a coder. You get a job doing some fun and interesting scientific task, and then you write code to make that task easier and do analysis. I'd probably be bored silly too if I just had to write snippets of code for tasks that I didn't care about and wasn't interested in.


My speciality is aircraft flight control, which I find fascinating and fun; I do a lot of programming in the course of my job, but I'm not a programmer. And that's how I like it :-)

+1 to Hazel's comment

Re: Programming can be hard, it can also be dull.

That describes about 90% of all jobs.

I've led IT orgs for some time and find that the most valuable traits are creativity and curiosity, followed by a drive to make things better. None of these traits show up in the degree or the school you went to, but they are readily apparent in someone who is self-taught or otherwise accidentally ended up in the field.

This has shown to be so true that, even when interviewing recent grads, I will place far more emphasis on what they have done outside the classroom (even the most mundane personal projects that apply tech in some way) than the classes they took or the grades they got.

Twenty percent in the Bay Area sounds like a lot, but I doubt so many are what you'd think of as tech employees. I'm in tech. I'm learning a bit of R so I can become a data scientist, but if you asked me to do anything beyond extremely basic HTML, I'd be unable to. A tech company employs marketers, product managers, HR, QAs, sales teams, designers, and a ton of other non-technical positions. None of those jobs need knowledge of code (some designers do), but are all part of the tech industry. In my last job, actual coders were about a quarter of the team dedicated to putting out new products. Additionally, there are companies putting out tools that greatly reduce the need for devs on the front end. My current team invested in a good content management system and we're going from twenty devs to five. That doesn't reduce the need for copy writers, designers, marketers, sales, or back end devs, but does greatly reduce front end developer jobs.

If you want to talk about numbers, consider that the most popular major at the Ivies appears to be economics. CS sometimes cracks the top 3.
Is econ enough to save the world?

Maybe there would be demand for a combined CS/philosophy/math/cog sci degree, something like the Symbolic Systems program at Stanford. A lot of CS is applied logic and set theory, and it would fit well with courses in analytic philosophy and cognitive science. That might be a way to draw into CS some humanities people who wouldn't necessarily be interested otherwise.

The good computer programmers I know have majored in fields as different as Sociology, Architecture (not the data kind), and of course Computer Science. Many are self taught.

It seems like kids are getting the memo that you should be majoring in STEM and learning skills that can translate to IT or data work. But they seem to be choosing STEM majors that are not CS. So perhaps the real question is why are some of the non-CS majors like physics so attractive when CS seems to be more practical?

Maybe kids see CS as boring compared to other majors that also give STEM credentials?

> The good computer programmers I know have majored in fields as different as Sociology, Architecture (not the data kind), and of course Computer Science. Many are self taught.

LOL, no.

I agree with what seems to be a consensus among the actual software developers speaking up here. The vast bulk of programmers out there are essentially just plugging together various libraries with a little bit of glue code.

When I started in computing, such libraries didn't exist, or at least not on the scale they do today. If you wanted to draw a chart on a screen, you had to plot it yourself, which meant understanding the equations involved and how to manipulate graphics directly. If you needed to rotate a 3D image, you were doing your own matrix calculations. If you needed to sort some items, you wrote your own algorithms, which required you to understand at least enough math to know what was efficient and what wasn't. If you were writing your own simulator of something in the physical world, you were writing code that dealt with integrals and other high-end math.
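For instance, rotating a point by hand meant code along these lines (a minimal, illustrative Python sketch of the idea; back then it would have been C or assembly, and the function name here is my own):

```python
import math

def rotate_z(point, angle_rad):
    """Rotate a 3D point about the z-axis by hand: build the rotation
    matrix from sin/cos and multiply it out row by row, no library."""
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    # Standard z-axis rotation matrix:
    # | c -s  0 |
    # | s  c  0 |
    # | 0  0  1 |
    return (c * x - s * y, s * x + c * y, z)

# Rotating the unit x-vector by 90 degrees lands (up to rounding) on the y-axis.
rotated = rotate_z((1.0, 0.0, 0.0), math.pi / 2)
```

Today that whole function is one call into a graphics or math library, which is exactly the shift described above.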

Also, back in the day we were severely memory and performance constrained, and therefore coding style, efficiency, and a high-level understanding of machine architecture were critical. You had to know things like boolean algebra, algebraic reduction methods, NP completeness, pointer math, etc. That's why CS was historically run out of the math department.

Computer science degrees were aimed at programmers who needed to do such work, and programmers needed sufficient theoretical background that they could build their own solutions from scratch.

Today, if you need to rotate a 3D image, you get some image library and call the rotate() function. If you need to simulate the physical world, you grab a math library that has everything you need and just call the appropriate methods.

As a result, for most modern programmers the actual code is relatively simple, and the real 'engineering' is all about the process around it. Programmers need to learn how to set up and run build systems, integration tests, unit tests, version control, etc. A good 2-year college will turn out perfectly adequate application programmers who can do this sort of thing.

The other value of a degree is for signalling. It used to be that a CS degree was a guarantee that you at least had a good understanding of the field and could code well enough to get through the degree. But what we've learned in recent years is that the difference between good programmers and bad ones is rarely about their education. Instead it's character traits like intelligence, diligence, attention to detail, creativity, ability to estimate and meet commitments, being able to work effectively on a small team, etc. A CS degree these days just doesn't signal any of that.

Another factor is that a lot of programmers are working in areas where they need serious domain knowledge outside of CS, and employers have found that it's often easier to take a control engineer or a biologist and teach them to program than it is to take a programmer and teach them control engineering or biology. So you find a lot of 'computer engineers' who have degrees in physics, or math, or some other non-CS specialty.

The result of all this is that many top employers have dropped degree requirements entirely, and replaced them with interview questions, programming assignments and tests. Google is famous for not requiring a degree of any sort, but having extremely difficult tests that applicants must take.

Essentially, the industry has figured out how to eliminate the degree as a signal by directly testing applicants for the traits they need. This is a lot easier in CS, where there are concrete metrics and where we can essentially give IQ tests to applicants, than it is in other fields where this is not possible.

In my own experience interviewing many candidates, the only thing I look for in education is to see what kind of options the person took, whether they took the easiest CS classes or the harder ones, and what kind of grades they got. And that's only for people with no real work experience. Once you've been working as a programmer for a few years, your degree is completely irrelevant. What I really look for are those character traits, and what the person does in his or her spare time. If you're a 'computer engineer' and your hobbies are watching TV, hiking, and playing sports, you're a hard sell. If your hobbies are HAM radio, writing computer games, building hardware, running web sites and other 'geeky' pursuits, I'm much more interested.

My own company just announced that we are dropping all degree requirements for candidates for programming jobs. And I have personally seen new hires with 2-year diplomas or just high school diplomas rocket past Harvard Computing Science graduates, because once you are in the door of the company all anyone cares about is how you perform, and not your credentials. We employ thousands of programmers around the world.

In today's culture, being a founder is cool and glamorous, but being a coder is still associated with pizza and fat people.

Two things:

1) There may be double majors / minors that are not counted in this metric.

2) A lot of people who go into coding don't necessarily have a computer science degree.

I laughed when I read a comment above about "plagiarized code". That is definitely coming from someone who never programmed in his/her life.

"Plagarized code"

When I was at the university, as a math/cs double major, a cs professor flunked an entire compiler class for cheating, which was widespread in the early 80s boom.

For our start up (https://get.chatterbeak.com), we find that many CS majors don't have enough math or engineering background, so we hire Math or EE majors.

Maybe the problem is this: for "handle cranking" jobs (making United Airline's website), you don't need a CS degree. For real engineering (like what we do, mixing millions of separate, unique audio streams in real-time simultaneously), CS degrees aren't "good enough."

This is definitely true, but it didn't use to be so. I started out with a 2-year computer engineering technology diploma in 1982, and even in that class we took the following math: Differential Calculus, Integral Calculus, Linear Algebra, Boolean Algebra, Differential Equations, and Partial Differential Equations. After transferring to University, I needed complex functions, statistics, linear algebra 2, and one or two others for specialty subjects. I would guess that the average CS grad back then graduated with at least 6-8 math classes.

I just looked up the requirements for CS at the same university today. They require '2 math courses' in year one, a single math course in year 2, and that's it. No mention of what those classes have to be, so I guess you could take bonehead algebra, a math for business majors class, etc.

I hope that the CS classes themselves have some math built in, because it's hard to fathom how anyone can graduate from a STEM field like computing without having to take the minimum math courses that even biology majors have to take. But this just goes to the earlier point that computing science isn't so much a 'science' anymore as it is an applied technology trade. That is, unless you get into real hardcore science programming, in which case you're likely to have a degree in physics or math or chemistry or whatever field you're writing programs for. Or if you're leading a team of such people, you're more likely a Computer Engineering grad than a CS grad.

I hope the original author understands that there is a difference between computing science and computer engineering. From what I've seen, computer engineering has become much more popular, and it's much more rigorous. Maybe that's where all the new programmers are coming from?

Clueless gatekeepers -- recruiters and HR people -- like the CS degree, sure. So, if you are trying to get your very first coding job, having a CS degree gives you an advantage against other people seeking their first coding job, mostly at places that have clueless gatekeepers. And then, after three years, you leave coding for some other work, because it turns out that your job is completely unlike academic computer science, and you neither like nor are good at your job. Then the project manager at that workplace sighs in relief, and hopes HR doesn't stick him with another dud.

If you manage to get your first job without a CS degree -- which is only marginally harder than getting it with one, especially at a firm where the hiring process for coders is under the control of coders -- then you've got the qualifications to get another job coding, and what letters are on your degree (or often enough, even if you have one at all) is irrelevant.

Because, well, CS programs don't teach what you need. The "good" CS programs are heavy on theory, which is wonderfully rigorous and not particularly related to most real-world programming tasks, any more than a degree in theoretical physics is of much use to working mechanical engineers. And the "weak" CS programs make you pretty good at making toy programs, rather than learning how to do development work under anything resembling realistic conditions.

Re: theory. In my experience, it's the guys who pooh-pooh the usefulness of "theory" who end up writing fantastically inefficient code that I have to come back in later and rewrite so that it can scale.
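A toy illustration of the kind of rewrite meant here (my own example, not any commenter's actual code): deduplicating a list while preserving order, once with a quadratic list scan and once with a set, which a little complexity theory tells you is linear.

```python
def dedup_quadratic(items):
    """O(n^2): scans the entire output list for every input element."""
    out = []
    for item in items:
        if item not in out:  # linear scan nested inside the loop
            out.append(item)
    return out

def dedup_linear(items):
    """O(n): a set makes each membership test constant time on average."""
    seen, out = set(), []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

data = [3, 1, 3, 2, 1, 4]
```

Both return the same answer on small inputs, which is exactly why the quadratic version survives code review until the data grows and it stops scaling.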

I don't mean to denigrate theory. Good mechanical engineers need a solid grounding in basic theory, too. But a physicist isn't going to make a good mechanical engineer without learning a lot of practical things . . . any more than somebody trained in an auto mechanic school is going to make a good mechanical engineer without learning theory. And with the good computer science programs being good at making computer scientists, and the bad ones tending to be expensive code boot camps married to general education requirements, the result is that pretty much everybody has to pick up the missing pieces to do real software development on the job.

It kind of looks like I'm saying we need "software engineering" graduates, but I don't know if the current software engineering programs that exist do what's necessary, either.

My undergraduate program was geared toward theory and left out some practical topics that would have been highly useful once I made my way into industry. To be fair I didn't do a good job of filling in the gaps on my own. Then again, it's often hard to know what you don't know. If you know what I mean.

Some of the code I find myself refactoring, though...wow. Maybe it's not so much a lack of understanding of theory as it is a basic lack of analytical/logical ability.

Let me give an actual example. At one point I worked on an Android app. When you launched it, the first thing it did was go download a bunch of images. It was a dumb design, but it was a requirement that the thing be "dynamic" and re-configurable without an update in the app store, so we couldn't just bundle them into the app. The guy before me who wrote the code that downloaded the images just requested them serially, one after the other. Most were small, but there were many of them, and there's latency involved since we're talking about mobile connections. The upshot is that if the images weren't already cached, depending on network conditions it could take up to 30 seconds to load them all. Imagine that: you launch an app, and it just sits there at the splash screen with a spinner for 30 seconds before you can do anything.

My insight, which just seems like common sense, was to load them in parallel. I created a thread pool using the stock classes and shrunk the load time by a factor of six.

Did that require an understanding of theory? Sort of, if by "theory" we mean "take advantage of parallelism when possible". Or maybe the other guy understood this but just didn't care enough to take the extra time to do it right? Maybe what I brought to the table wasn't so much knowledge of how to do it "right", but an inability to allow it to continue to be "wrong"? Maybe he didn't understand how to do concurrent programming? Who knows. But I see boneheaded stuff like that all...the...time.
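Not the actual app code, obviously, but the serial-versus-parallel difference can be sketched in a few lines (a Python stand-in for the Android thread pool; the fake `fetch` just sleeps to simulate network latency, and all the names are mine):

```python
import time
from concurrent.futures import ThreadPoolExecutor

URLS = [f"image-{i}.png" for i in range(8)]

def fetch(url):
    """Stand-in for an image download: latency dominates, so the thread
    mostly waits -- exactly the case where a small pool helps."""
    time.sleep(0.05)
    return url.upper()

# Serial: total time is roughly the sum of all the latencies.
start = time.perf_counter()
serial = [fetch(u) for u in URLS]
serial_time = time.perf_counter() - start

# Parallel: a pool of 4 workers overlaps the waits.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(fetch, URLS))  # map preserves input order
parallel_time = time.perf_counter() - start
```

With 8 downloads of ~50 ms each, the serial version takes roughly the sum of the latencies while the pooled version takes roughly the latency of the slowest batch, which is the same multiple-times speedup described above.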

There is very little science in software.

Well, unless you are working in AI, simulation, numerical method libraries, writing game engines, building device drivers, writing microcontroller code, writing audio and video transforms, building control systems, writing robust, scalable cloud applications, building analytics, writing computer models for real world systems....

There is plenty of science in software - just not everywhere. And the science is so varied that you can't teach it all in CS. Which is why it's a shame that so much math seems to have been cut from the curriculum.

Science is everywhere, in all the things you listed. None of it is software. How much software was required to implement e=mc^2?

Software is not science, except maybe for some proofs of grammars, and there are a few protocol proofs needed. Otherwise, having written the stuff all my life, I never once went to a software repository to find the science; I generally went to the mathematicians. I would be an absolute fool if I thought AI came out of software. AI has a science; I know because we have 7 billion AI agents running around, none of them written in software, yet all have a science in their AI.

How about this. Computers are a tool that is frequently used to "do" science. Computers need software. Software is often written in the service of scientific pursuits. As one anecdotal example, I have a friend who's on a Physics faculty. He studies black holes. Much of his time is spent coding up and running simulations.

You have to understand the science to write the software. Otherwise, GIGO.

These tech companies are a joke.
