My interview with Eric John Barker

on September 29, 2013 at 10:48 am in Books, Economics, Film, Science, Web/Tech

It is excerpted here, mostly about Average is Over, but with some twists. Here is one part:

TC:…That’s right, and Obi-Wan also tells Luke, “Finish your training in the Dagobah system,” right? How many times did he tell him? Yoda tells him. Yoda. What does Luke do? He tells Yoda to get lost. So I think as humans we’re somewhat programmed to be a bit rebellious and to not want to be controlled, which is perfectly understandable given that others are trying to control us as often as they are. But that’s going to mean in those new settings, which we’ve never biologically evolved to handle, we’re going to screw up an awful lot. Just like Luke did not finish his training in the Dagobah system.

Eric’s very interesting blog you will find here.

n September 29, 2013 at 11:04 am

Great read – my favorite AIO review/interview yet. Nice to see TC recognize that people shouldn’t just pursue something because it is in demand – they should pursue something they are good at and that they enjoy. More Star Wars please.

prior_approval September 29, 2013 at 11:18 am

‘given that others are trying to control us as often as they are’

But in the oncoming seminal age of marketing, we will all feel better about it. As a matter of fact, there is a book on sale concerning this point, if reviews are to be trusted.

And the arising doctrine of algorithmic infallibility will allow absolution for any individual marketing the greater good who happens to possess a glimmer of self-awareness or historical understanding.

Here is a guiding quote – ‘Many more of us will become like preachers. Imagine a reverend or a deacon in a church trying to grab everyone’s attention, get them to follow the religious moral code.’

Then imagine how the heretics will be treated, if the past is any guide.

Hark to the words of the prophet –

‘The people who listen to the machines, they’re going to do better. They’ll have a better chance of being happily married. They’ll choose better dates. They’ll kiss at the right time or whatever it is the machine tells you. They’ll have better portfolios. They’ll have better diets. Whatever it will be, but I fully expect that something like roughly half of the human race isn’t going to want to listen.’

Though I still wonder – what happens when your device indicates the conditions are optimal, and your partner’s device indicates they aren’t? Somehow, the doctrine of algorithmic infallibility does not seem to get very far when one thinks about it for five seconds. But who cares what the heretics think.

jeff September 29, 2013 at 12:00 pm

This has turned my view of you upside down. You thought Luke should have stayed?

Andrew' September 29, 2013 at 2:24 pm

From a certain point of view. I am with the Bryan Caplan reading.

Andrew' September 29, 2013 at 2:29 pm

BTW, it’s just an illustration, but when Luke returns (!), Yoda tells him no more training does he require. They still don’t give him the damn degree. What does a mofo have to do, kill his own father who happens to be Darth Vader?!?
