Saturday assorted links

Comments

5.8 Finally.

5.2 Is this a statement of identity that discounts what is said?

This website certainly has all the info I needed about this subject
and didn't know who to ask.

They say they value "intellectual progress," but has anything of value come out of their collective minds, like a new scientific law or field of study? Everything they ever write is just thinly disguised political punditry, like most of economics.

#3 Doctoral degrees aren't about earning a lot of money. Who'da thunk it?

If you got your PhD from Trump University in Russian studies, you probably won't earn a whole lot of money.

Still sounds a lot more marketable than a Ph.D. in Women's Studies from Random State University.

1. This is a breakthrough? I notice this every day.

3. Small potatoes.

5. Which half?

6. It will be gone soon. I haven't been there yet, but it doesn't seem like a "real" place to go anymore.

#4. Surprisingly interesting and accessible.

Venice has been a purely tourist spot for decades. That's not a recent change. Disneyland is purely a tourist spot, but you can like it for what it is.

5. Caplan: "We value intellectual progress over emotional comfort. . . . We are genuinely non-partisan." The folks at GMU are just as self-unaware as the folks at Yale.

Well, the author did say to discount everything unless you're willing to give 1000:1 odds, so probably he's just exaggerating. Yale = GMU? That's a flattering comparison!

Bonus trivia: the Yale endowment did very well under that guy who wrote that book (David F. Swensen: "Under Swensen's guidance the Yale Endowment saw an average annual return of 11.8 percent from 1999 to 2009. As of the 2016 fiscal year, Yale's endowment had risen by 3.4%, the most out of any Ivy League school, according to Institutional Investor.")

5.2 Like most people, GMU bloggers don't understand their own motivations.

We are genuinely non-partisan. The Democratic and Republican parties both seem like absurd churches to us. Even if one is markedly worse, we’ll never join either because both are “often wrong and never in doubt.”

That's unintentionally funny.

#4 Well a big part of the reason for this is that software development ("engineering," LOL) is generally a "profession" for the low-IQ set. Think of it MUCH more like modern-day blue-collar factory work than some intellectual endeavour. It's really for people who did some college but didn't have the chops to get degrees in serious subjects like Law or Medicine.

Law or Medicine is not that brainy, take it from me, a law school dropout, or ask Raymond. It's a lot of reading and pigeonholing of material, same with doctors. As for programming, I do that for fun, and I think the author's complaints about bloatware are valid; better coding practices, like the Gang of Four design patterns and the Managed Extensibility Framework (MEF) in C#, would also help. But, as the author says, users don't care about bloatware and programmers are scarce (albeit underpaid; I personally would not work as a professional programmer for the measly $120k they make for the hours they put in), so bloatware and ill-maintained code will persist.
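Since the Gang of Four patterns come up here, a minimal sketch of the Strategy pattern in Python (the names, like `CompressionStrategy` and `FileSaver`, are invented for illustration, not taken from any of the linked articles) shows the kind of structure the comment has in mind:

```python
from abc import ABC, abstractmethod

class CompressionStrategy(ABC):
    """Interchangeable behaviour, selected at runtime."""
    @abstractmethod
    def compress(self, data: bytes) -> bytes: ...

class NoCompression(CompressionStrategy):
    def compress(self, data: bytes) -> bytes:
        return data

class RunLengthCompression(CompressionStrategy):
    def compress(self, data: bytes) -> bytes:
        # Naive run-length encoding: emit (count, byte) pairs.
        out = bytearray()
        i = 0
        while i < len(data):
            j = i
            while j < len(data) and data[j] == data[i] and j - i < 255:
                j += 1
            out += bytes([j - i, data[i]])
            i = j
        return bytes(out)

class FileSaver:
    """The context delegates to whichever strategy it was given."""
    def __init__(self, strategy: CompressionStrategy):
        self.strategy = strategy

    def save(self, data: bytes) -> bytes:
        return self.strategy.compress(data)
```

Swapping `NoCompression` for `RunLengthCompression` changes behaviour without touching `FileSaver`, which is the maintainability point the pattern is meant to buy.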

As someone who got into programming recently after 15 years of factory work, but had a 35 ACT/1480 SAT, I should be able to comment on this.

One similarity is the need for constant learning about more efficient ways to accomplish the same tasks, because of time pressure.

One difference is how regularly your coworkers accomplish something you could not because they are *smarter*. In a factory it's usually that they're stronger or more experienced with better instincts.

The particular flavor of IQ that doctors use is very heavy on memorization and pattern recognition. Programmers display more of what I think of as "intelligence", abstract and logical reasoning that structures the world and predicts the unknown.

I think doctors and lawyers are more distinguished by conscientiousness and competitiveness than raw IQ. The IQ tests are gatekeeping.

I agree that you're rarely puzzled in the way you would be trying to figure out the diameter of a spiral or something, but there sure is a lot of "this could only be true if that were true" logic and "I have to plan now or this will all come apart later" that make it a job where high IQ thrives, even if it's not necessary.

Very interesting, thanks. Do you use MEF (or equivalent) in your code?

Bonus trivia: Elon Musk used to code in 'tar ball' style, 1000 lines of code in one module, and then get upset when people modified it to make it easier to maintain. It's in the excellent biography by Ashlee Vance. Bill Gates was also a coder, not sure about his code but he did code in Basic, which is not a serious language.

Bill Gates was likely the best programmer, by far, of the billionaires.

He programmed BASIC, in 4k.

Even then he knew to repurpose an existing code base, Dartmouth BASIC if I remember correctly.

https://en.m.wikipedia.org/wiki/Dartmouth_BASIC

The language specification was clearly used; it is BASIC, after all.

I would doubt that much of the code was taken from Dartmouth; one is an interpreter and the other is a compiler. I also see no references whatsoever that any code was shared.

There is old history here. I am not sure of all the details, but I do think it is interesting that the Dartmouth-to-Microsoft arc represented the age so well: the transition from shared programs as a default to more aggressive copyright:

Thus, I first met Bill at COMDEX when you could actually stop by and say “Hi”. But then, I also had the opportunity to meet other key people in the business and sat in on discussions regarding its future. In 1977, for example, one of the major issues was Bill’s dispute with MITS and his “Open Letter to Hobbyists” (of Feb. 3, 1976).

In what many considered to be significant hypocrisy, Bill essentially accused "hobbyists" of stealing BASIC. To understand why they thought this, we should take a quick look at BASIC. The beginner's all-purpose symbolic instruction code set was invented at Dartmouth College in 1963 by John George Kemeny and Tom Kurtz as a teaching tool for undergraduates. It was developed over time as a student project largely funded via public (NSF) grants. Thus, it was open source "shareware" from the start. When Bill and Paul read about the MITS computer (above), they (like many) saw its market potential and thought that some computer language was needed to make it more functional and appealing. MITS founder Ed Roberts agreed and accepted an offered demonstration. But Bill and Paul didn't have a product – so they "stole" Dartmouth BASIC[6], modified it using stolen computer time (at Harvard) and sold an unfinished product to MITS.

http://richwritings.com/opinion5.htm

That's a story that goes back decades. I first heard it in the 80's.

There is a lot of sour grapes in that story. And it is rehashing with gusto various stories/spins about Microsoft’s rise to power.

I see nothing in that story that says that Bill didn’t write the BASIC interpreter. It does note that Bill did not write the floating point portions, but that has never been in dispute.

He and Paul were both impressive programmers.

As I say, the stories have been going around for 40 years, but sure make up your own mind.

Don't get me wrong, I think Bill is incredibly bright, and that shows now especially when he is in global-philanthropist mode.

And hey maybe the bright thing was to grab that Dartmouth code and rework it hard for an 8080 target.

4K of RAM. Sorry for being terse.

Eh, I'd rather be a rich software engineer like Zuckerberg, Gates, or Bezos and get laughed at for being a nerd than be a slimy lawyer who creates nothing of value (the #1 profession of politicians), or a self-hating doctor (highest suicide rate of all major professions). Tech's real competition is finance (see how Google and Goldman trade each other's staff like it's fantasy football). Even then, Tech is becoming more like finance (see Bitcoin) and finance is becoming more like Tech (see banks trying to poach tech talent and calling themselves tech companies).

4. was really good, but depressing.

It seems like another variant of worse is better ( https://en.wikipedia.org/wiki/Worse_is_better ). It was even referenced in the article.

Well, software development is kind of a fake profession, filled with people who don't actually care about producing good products but rather think complexity and learning lots of frameworks is cool in and of itself. Once you understand that software is developed by losers, it makes more sense.

If the market rewards fake professions then literally fake it til you make it.

Maybe. I started with a 4K-byte machine, graduated to 64K bytes, and thought I was stylin'. We ran a preemptive multitasker in that 64K (for embedded medical electronics). I have gone through several iterations since then on development speed vs. efficiency. I can do node.js, for instance, but I can't say I love it.

That said, I saw a disruption in the narrative here:

Look around: our portable computers are thousands of times more powerful than the ones that brought man to the moon. Yet every other webpage struggles to maintain a smooth 60fps scroll on the latest top-of-the-line MacBook Pro. I can comfortably play games, watch 4K videos but not scroll web pages? How is it ok?

I'm pretty sure all that playback is now done by special hardware, DMA'd to display memory, and not to mention bandwidth limited.

So .. while there might be something here, I'm not really sure the author nailed it down.

And to argue the converse, I can buy a $5 Raspberry Pi Zero, plug it into an old phone charger, and have a UNIX stack running 7x24 for maybe a buck a year worth of electricity.

Which efficiency do you really want, in cycles or deployment?

The playback that the author is referring to is the rendering of webpages, not video.

Clearly video is accelerated today, either by the video card or the CPU, depending upon codec and random factors.

Webpages OTOH are decidedly not accelerated. They are rendered by massive, and moderately old, code bases. Further, for modern webpages very large amounts of JavaScript are added to the mix, further reducing performance.

Two key things in that paragraph, smooth scrolling and 4K video, are clearly hardware accelerated. And more:

GPU acceleration is en vogue. After slowly but steadily moving out of the 3D niche it has arrived in the mainstream. Today, applications like Microsoft Office leverage the GPU, but even more so do web browsers. Chrome, Firefox and Internet Explorer all have hardware acceleration turned on by default. People generally seem to be happy about that – GPUs are super-efficient, the more work they do the fewer remains for the CPU, overall energy consumption is reduced and battery life increases. Or so the myth goes. Interestingly, facts to prove that are hard to find. Nobody seems to have measured how GPU acceleration affects CPU usage. Let’s change that.

https://helgeklein.com/blog/2014/12/impact-gpu-acceleration-browser-cpu-usage/

And basically it is a huge error to confuse software efficiency with ever greater bit-pushing demands of high resolution, high dynamic range, high frame-rate media.
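To put a number on the 60fps complaint in the quoted paragraph: the refresh rate fixes a hard per-frame budget, and any script, layout, or paint work that exceeds it shows up as dropped frames. A back-of-the-envelope sketch (the helper names and the 40 ms example are mine, purely illustrative):

```python
import math

def frame_budget_ms(fps: float) -> float:
    """Time available to produce one frame at a given refresh rate."""
    return 1000.0 / fps

def frames_dropped(work_ms: float, fps: float = 60.0) -> int:
    """Refresh intervals a single long task blanks out beyond its own frame."""
    return max(0, math.ceil(work_ms / frame_budget_ms(fps)) - 1)

# At 60fps a page has ~16.7 ms per frame for script, style, layout and paint;
# a 40 ms main-thread task therefore skips two whole refresh intervals.
```

Which is why "smooth scroll" is so unforgiving: hardware-accelerated video only has to decode into a buffer, but page rendering has to finish its entire pipeline inside that ~16.7 ms window, every frame.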

#4 - the details have changed but the story remains the same, as it was in say 1995.
It all comes down to money.

Too many developers chasing too few real dollars with too little real function to offer.

Have a nice day.

1. Contrary-wise, the president of the United States, well known and identified in all respects, is the biggest troll in the world(*).

I don't see the anonymii affecting us quite that much.

* - and probably not even compos mentis at this point.

https://www.businessinsider.com/trump-regrets-not-firing-comey-when-obama-was-still-in-office-2018-9

Guess you were fooled when Trump asked for Russian help in finding Hillary's missing emails, too.

Can you imagine the unmitigated hell this "Anonymous" has been living in for the last two years?

It would take a heart of stone not to.... laugh.

Hey look, my 401k went up again!

We have actually had many up markets in the past, without dotards to some nominal degree "in charge."

(But not so "in charge" that he can fire Rosenstein for recognizing his obvious impairment.)

Did you receive a hot stock tip from Chris Collins?

2. "our results provide no evidence that TABOR affected the level of taxes or spending in Colorado"

Intuitively this seems broadly generalizable, given the long history of failure in spending-restraint legislation. The list of abject failures would have to include the Budget Control Act of 2011 (spending caps subsequently raised willy-nilly), the Statutory Pay-As-You-Go Act of 2010 (exemptions, exemptions, and more exemptions), the Balanced Budget Act of 1997 (legislatively reversed in short order), the Budget Enforcement Act of 1990 (discretionary spending caps circumvented), and the Gramm–Rudman–Hollings Balanced Budget Act (key provisions held unconstitutional). And Governor Schwarzenegger's Proposition 58 balanced-budget control measures did not reduce state debt growth or prevent subsequent debt increases.

If we assume that governmental entities are not going to default, and that is perhaps too big an assumption to swallow looking at the debt per capita of states like Illinois and Rhode Island as well as Puerto Rico, the semantics of "spend" and "tax" are clearly problematic. Spending is taxation. Taxing is only a matter of timing.

Instead of saying stuff like "the President proposes X trillion in spending" or "Congress passed X trillion in appropriations" it probably makes more sense to just say that the President or Congress "imposes X trillion in future taxes." The spending is the liability and since taxes are the primary means by which the liability will be satisfied, spending = tax burden. A pretty vapid observation to be sure, but unfortunately one that seems to be very convenient to ignore.

Taxes are only one way of paying the bill for spending. Inflation is one way, supplying the demand for safe investments with low-interest-rate bonds is another. Sometimes there's a bailout. Sometimes you can raid pensions or something. You could declare a draft so that the "tax" for our military spending was borne by slave labor.

The fact you used a larger share of resources ("spending") can really only be paid for by people forgoing those resources in the present. Future taxes represent the most responsible way of repaying your obligation to them, but it is not set in stone that you will do that.

I don't think there's any evidence increased spending today or higher deficits today means higher taxes in the future. Zero. Can you point me to a single tax increase in the 1950's that was due to the massive spending and deficits of WWII? In fact that debt was never paid off and what was a huge portion of GDP now could be paid off tomorrow with the GDP of a small state.

I would put forth spending today is not future taxation at all. The cost of spending cannot be incurred in the future but can only be incurred in the present. The gov't could run 100% deficits and see no issue with inflation.

"The gov't could run 100% deficits and see no issue with inflation."

Then why not do it??

Because no one really understands the dynamics of debt, hence they fall back on a general feeling that debt must be bad, or on spooky rules (i.e., if your debt-to-GDP ratio goes above 90% bad things happen, but no one can explain why).

That, plus this only applies if you are dealing with a relatively large economy that issues debt in the same currency it pays it back in. It also has to have control over its own money supply, and it needs an independent central bank. Greece, for example, uses the Euro and issues debt in Euros, but the money supply is controlled by Germany and France, with Greece having almost no influence.

Likewise, many countries do not have as independent a central bank as the US does. For this to work, the gov't can issue as much debt as it wants, but the central bank should be conducting monetary operations to keep inflation and unemployment low. In that case the market acts as a check on excessive gov't spending.

4. I think it is even worse than they describe. There is a real potential of the whole thing consuming all available resources on dealing with security flaws and their consequences, and on maintenance just to keep the damn thing running.

Hanson's demanding that people listen only to his literal meanings, and not infer any signalling, is the most grotesque hypocrisy.

'Critics often scold GMU econ bloggers for violating shared cultural norms'

Yes, that is true. and here is an example concerning eugenics - 'and ultimately the Nazi connection will be seen as a bump in the road.' Talk about violating shared cultural norms when seeing the intentional deaths of millions of humans as a bump in the road.

'The critics presume we’re part of their culture.'

No, a number of these critics are fully aware that someone who can dismiss the deaths of millions in the pursuit of a much better world through eugenics as a bump in the road is not part of the same culture.

'What matters are statistics, not emotions – and arguments, not stories.'

Except when statistics are simply waved away as a bump in the road, that is.

'Hyperbole is the worst thing in the universe.'

Self-recommending.

You bring this up constantly, and I don’t understand the logic chain here.

You’re equating CRISPR with the Holocaust, and then smearing Cowen with the association.

Murder of millions of people compared to editing genes.

Is removing horrific genetic diseases the same as genocide?

What is your deal.

Not to mention the quote he provides is a prediction of what other people will think

#5. 5-8 is only sometimes correct. For instance, when a man makes a logical argument that maybe getting sexually assaulted isn't all that bad, we should absolutely take into account the fact that he has probably not had the experience of being assaulted or of fearing it, and thus might not correctly imagine the costs at play here. The broader rules of engagement suggest that a man making such an argument should at least preface comments with the recognition that he really doesn't know what the threat of assault feels like. By refusing these rules of engagement, certain GMU bloggers are only removing themselves from the conversation. You may love your own house rules, but if you're blind to a basic decency requested by the rest of the people in the discussion, you should perhaps not be surprised if everyone else stops listening. Which, frankly, is what is happening to Robin Hanson.

This is exactly right. I would add that the blogger's refusal to confront critics in any meaningful way is also grounds for disqualification. The best bloggers are the ones who are not on the GMU faculty (Sumner and Henderson). Caplan dashes off stream-of-consciousness ramblings and then disappears for some days until the next itch needs to be scratched.

If everyone is really ceasing to listen to Hanson because of his offensiveness, then I suppose we will see fewer people complaining about how offensive his posts are in the future. Right?

When did Hanson make the argument that rape and sexual assault are okay?

This is classic mood affiliation. He asks questions that people do not want to confront, because they delve into biases and the sacred spaces of ideologies.

When groups of people refuse to engage an argument on its own logic because of the identity of the author (male? White? Whatever), then we’ve already crossed the Rubicon and are firmly in the realm of stupidity, bias, mood affiliation, and insanity.

Logic is logic. The party that refuses to engage in the argument and instead attacks the speaker based on race or gender immediately loses. We used to call this the ad hominem fallacy.

Now the university teaches it as privilege and “splaining.”

Where did these people attack Hanson for being a white male? They attacked his ideas for being stupid. You people need to stop being so easily offended by facts, reason, and logic.

If a man presented an argument that rape wasn't harmful, I would agree the input of women and especially rape victims (who are not all women, BTW) would be very valuable. But I've never seen any man make such an argument. If one did, though, I suspect I could easily refute its logic even if there were no rape victims nearby to refute it from a testimonial perspective.

I think where this analogy comes into play is perception bias. It's not on the radar for most men. Having women in the discussion puts it on the radar. This does not overturn rationality or objective observation. When alerted, we can all measure it and observe it but we need different points of view to call it out to us.

I'm afraid everyone has already stopped listening to you, Test subject (well, I stopped in the middle of your too-long and boring comment), and that Robin Hanson's audience is increasing with this latest tantrum of the internet mob.

By the way, 5 is great, and I would be very happy to know what Tyler disagrees with.

#6 What is the optimal number of tourists?
#Coase

A few dozen a year.

6. "What matters are statistics, not emotions – and arguments, not stories."
In the real world, people care about the stories of their lives and don't give a damn what a bunch of limp-wristed academics have to say. Caplan needs to hit a weight room, enjoy a plate of pit beef with some boiled vegetables, and stop acting like a cuck. Be a real man.

Imagine if this kind of comment was all you had to offer the world.

5. I enjoyed this post and admire many of its principles. I mostly encounter these blogs through MR, but I appreciate how often their willingness to flout conventions leads to novel ideas, such as Caplan's book on parenting, or to disregarding conventional ones. The ideal I see most in this group is perhaps a form of "I'm more rational than X," which may be true, and yet there's also a predictable consistency of views, and a predictable absence of concern for some issues, expressed amongst this group that I find also in the groups they critique. I suspect they're all more driven by Haidt's elephant than they'd prefer to admit.

1, 3, 5: didn't a biologist say it better here?
https://www.youtube.com/watch?v=WozTbBN7aoU

How big a pile of money do you get if somebody puts a camera in your house?

#4. What a rant! There are also lots of boring jobs and workers frustrated with their situation. There are also lots of people building exciting things with software.

The specific rants here are weak. To pick one: text editors. Emacs and Vim are quite widely used and well maintained. If that's what you want use them. "What could be simpler?" Well, you can make a bare bones text editor, sure that's simple. And you could optimize for response time if you want. Many text editors are quite sophisticated and clever and have innovative feature sets, interfaces, and architectures.

On #4 I guess this is because of the very fast cycle in software development, people have to release their software when it is good enough, not when it is optimised. In fact this is like evolution - organisms are just good enough, not optimised. As in the famous quote where someone apologises for writing a long note because they didn't have time to write a short one, optimisation takes time, which the developers don't have. I also assume that part of this bloat is due to breaking the development of the code among many individual programmers, which is necessary now since the equipment and the software is so complex. No one person understands how Windows 10 works for instance. You may therefore have lots of sections of the code basically doing the same thing.

TABOR: Colorado is like the fifth lowest on education spending, so they must be doing something right.

Moore's Law is what has enabled software bloat, wherein even the simplest task requires GBs of memory and zillions of CPU cycles.

Yet Moore's Law has slowed, and perhaps digital hardware is asymptotically approaching its physical limits.

If so, perhaps further improvements in system performance will have to come from improvements in software efficiency?

Or perhaps "The Machine Stops" (E. M. Forster) will be prophetic after all, as all our digital devices just gradually take more and more time to do less and less, until one day the machine just ... stops.

From 5: "7. Hyperbole is the worst thing in the universe."

+ google

1: This paper is intriguing and its conclusions are troubling. They don't seem to have good evidence about pseudonymous writers gaining credibility, but that's a minor weakness.

We already see attempts to "short and distort" when the conspirators occasionally get caught; there are undoubtedly many more who do not get caught. And according to this article they are sometimes successful.

I guess the only consolation is that their overall effect on the market is hopefully small; $20B according to the article, which is certainly an amount to be reckoned with, but small relative to the size of US capital markets.
