Wednesday assorted links

Comments

2: finally, a worthwhile twitter thread though mainly for entertainment value. Some of the comments were excellent: Veruca Salt, chip dysplasia, The Matrix, the (alleged) capital-labor ratio of click farms in China, and this quote about automated click robots responding to content generated by automated content-generating robots.
https://pbs.twimg.com/media/D1cVsw-W0AA4mwa.jpg

If twitter were like that more often, then I'd read it. But the signal-to-noise ratio is far too low.

I forgot that the best comment was probably the one that cited
https://thispersondoesnotexist.com/
which creates "photos" of people who do not exist. It's not clear what the algorithm is, though; if all they're doing is taking an existing photo and doctoring it, that's not very exciting. But if they're able to create, at will, an image according to certain specifications (young male from the Middle East, middle-aged European female, maybe even distinguishing between the appearance of white Europeans and white Americans), then that's pretty impressive.

"The site is the creation of Philip Wang, a software engineer at Uber, and uses research released last year by chip designer Nvidia to create an endless stream of fake portraits. The algorithm behind it is trained on a huge dataset of real images, then uses a type of neural network known as a generative adversarial network (or GAN) to fabricate new examples."
https://www.theverge.com/tldr/2019/2/15/18226005/ai-generated-fake-people-portraits-thispersondoesnotexist-stylegan

Explained more with video here: https://medium.com/syncedreview/gan-2-0-nvidias-hyperrealistic-face-generator-e3439d33ebaf
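The adversarial setup the Verge describes can be sketched in miniature. Below is a toy illustration (entirely my own, with made-up hyperparameters, and nothing to do with Nvidia's StyleGAN): a two-parameter "generator" learns to mimic samples from a 1-D Gaussian while a logistic "discriminator" tries to tell real samples from fakes, the two trained against each other.

```python
import numpy as np

# Toy GAN: generator g(z) = a*z + b tries to mimic samples from N(4, 1);
# discriminator d(x) = sigmoid(w*x + c) tries to tell real from fake.
rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

a, b = 1.0, 0.0    # generator parameters (made up for this sketch)
w, c = 0.1, 0.0    # discriminator parameters
lr, batch = 0.05, 64

for step in range(2000):
    real = rng.normal(4.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator update: ascend on log d(real) + log(1 - d(fake)).
    dr, df = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - dr) * real) - np.mean(df * fake))
    c += lr * (np.mean(1 - dr) - np.mean(df))

    # Generator update: ascend on log d(fake) (non-saturating loss).
    df = sigmoid(w * fake + c)
    a += lr * np.mean((1 - df) * w * z)
    b += lr * np.mean((1 - df) * w)

print(f"generated mean ~ {b:.2f} (real data mean is 4.0)")
```

Run long enough, the generator's output mean drifts toward the real data's mean of 4: the same adversarial pressure, in one dimension, that StyleGAN applies to faces.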

If your twitter feed sucks, it's probably someone else's fault amirite?

#4. Flash-fiction-adjacent commentary thereto:

http://fictionaut.com/stories/strannikov/velocities-of-disputation-fact-measurement-perception-volition

Is Dr. Cowen's sister married?

1. This is why most Minimum Wage Increase efforts have been gradualist, allowing for adjustment time.

5. That's fascinating. I hope it works, and that they try to get you placed in as high-paying a position as possible so you can more easily pay them back.

. . . Honestly, I bet there's a ton of potential for stuff like that out there. It's just most obvious in computer stuff because of the high pay and demand.

"5. Lambda School and equity-sharing."

According to the description, the minimum payment is $25K and the max is $50K over a 5 year program and the expenditure is $18K.

That's in addition to tuition which is $20K or 17% of income for 2 years.

This is all for a programming training course lasting nine months to 60 weeks.

You’re wrong. It’s 20K if you pay upfront. It’s 30K if they train you for nine months with no cash upfront. It’s 50K if they pay you $2,000 a month for the nine months they’re training you.

https://lambdaschool.com/about/

> There are no up-front costs required to attend Lambda School; we only get paid when you do. Once you’re earning at least $50k per year you’ll pay back 17% of your income for the first two years.

> Total tuition possible is capped at a maximum of $30k, so no matter how much you’re getting paid the most you could possibly pay is $30k.

> Alternatively, you may opt to pay a tuition of $20k up-front with no income-based repayment.
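To make the arithmetic in the quoted terms concrete: 17% of income for two years, owed only while earning at least $50k, capped at $30k total. A minimal sketch (the function name and defaults are mine; it ignores the separate living-stipend program and any terms not quoted above):

```python
def isa_total(annual_income, share=0.17, years=2,
              floor=50_000, cap=30_000):
    """Total repayment under the quoted income-share terms (a sketch)."""
    if annual_income < floor:
        return 0.0               # below the $50k floor, nothing is owed
    return min(share * annual_income * years, cap)

# A $100k salary hits the cap; a $60k salary does not.
print(isa_total(100_000))  # capped at 30000
print(isa_total(60_000))   # 0.17 * 60000 * 2
print(isa_total(45_000))   # below the floor, so nothing owed
```

So under the quoted terms, no one pays more than $30k unless they also take the stipend, which is a separate program.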

"You’re wrong."

Maybe, but I'm merely quoting directly from their site.

I believe the parts you are quoting refer to 'tuition'. The $2,000 per month is the stipend program, which is in addition to the tuition costs.

https://lambdaschool.com/stipend/

It has little to do with adjustment time. If anything this gives us the clearest picture of the marginal effects because in the long run (a few years) capital is variable. Adjustment time also obscures the effect because inflation erodes part of the value of the increase in minwage. Workers or potential workers also make human capital choices based on expected wages and employment. If the increase is phased in, then the adverse effects will also be phased in.

If minimum wage is not a binding constraint for most workers, raising it won't have any deleterious (or good) effects if the new minimum is also not binding. So incremental increases might appear harmless when they really aren't.

6: I'm reminded of those giant hornets in Japan (one and a half inches long, and as ornery and dangerous as you might fear) that can massacre a honeybee hive: the bees' stingers have no chance and a single hornet can kill 40 bees per minute. But the endemic Japanese honeybees have evolved to defend their hives by ganging up on the hornets and raising their body heat to temperatures that kill the hornet.
https://en.wikipedia.org/wiki/Japanese_giant_hornet
Video here:
https://www.iflscience.com/plants-and-animals/hornet-cooked-alive-bees/

Those giant hornets killed 18 in China a few years ago. Videos of those things give me nightmares.

https://www.theguardian.com/world/2013/sep/26/hornet-attacks-kill-18-china

1 is not a minimum wage study, or at least has no relevance to US minimum wages. Why can’t Tyler/Alex understand this?

The minimum wage jumps in a major way after age 18: a strong drop in total employment and labor hours purchased follows.

How many times are min wage fanboys going to keep moving the goalposts to fit the argument on this one? It's a crappy policy tool with super-crappy supporting evidence.

Better to ditch it for more EITC....

"Better to ditch it for more EITC...."

Yes, the EITC is a better solution. But of course, it's one that government has to pay for directly. Whereas, the cost of minimum wage laws can be passed on to employers.

“Whereas, the cost of minimum wage laws can be passed on to employers.”

True, and you can add to that the personal cost to marginal employees who are now unemployed or unable to find work.

Thought experiment:

How many ex-cons can immediately get a job at $15 an hour?

How many at $20?

EITC would spread the burden over the entire tax base at progressive rates while the burdens of minwage are borne largely by low wage earners, low income consumers, and producers who serve both groups.

I have no idea what the relative deadweight costs are.

Maybe, but there is a valid concern about the government subsidizing the ordinary costs of doing business.

But it makes up for it by raising taxes on everyone.

The most successful government policies, in terms of accomplishing goal X, are those where they raise taxes and then spend the money to make X happen. The worst are those where the government tries to force other people to make X happen though kludges, but those are often implemented because they can "keep taxes low" (but we pay in other ways) or because they really want to punish some outgroup.

Well, the EITC gets attacked because an estimated 21% to 25% of the payments are fraudulent. And I'm wondering how much of the drop in employment is 18-year-olds desiring to spend more time at university and less time working. I can't find a breakdown for unemployment above and below 18 years of age, but what is interesting is the 2.4% rate for 16-24 year olds vs. the 8% rate for 25-29 year olds.

https://www.statista.com/statistics/586091/rate-of-unemployment-by-age-group-in-denmark/

"Well the EITC gets attacked because an estimated 21% to 25% of the payments are fraudulent."

Converting the EITC over to a wage boost that is paid directly via the employee's paycheck would resolve that issue. The current program is primarily aid to working mothers and is paid via federal income tax rebates.

To be clear, making the program universal (not just aid to working mothers) and implementing it via paychecks would substantially reduce the fraud. Most of the fraud is claiming too many children on a federal tax return.

The minimum wage has always been a superb policy tool for decreasing employment among the low-skilled.

With a disparate impact on "marginalized" groups. Intersectionalism for me but not for thee.

You really have no idea what you're talking about.

Keep going...don’t just leave it at that...

"Why can’t Tyler/Alex understand this?"

Because demand curve slopes downward?

3: Tyler leaves out the description of Polymath University: "In this model, as a condition for graduation, students must major in three disparate disciplines. So, a student could not major in history, English and philosophy, or accounting, finance and business administration. Instead, a student would be required to major in, say, philosophy, sociology and finance, or accounting, history and design. There is some interesting literature on students who double major in such nonadjacent subjects, and the kinds of creative, innovative thinkers they become. A student who triple majors in widely divergent subjects would develop a supple and complex mind."

Sounds like a liberal arts education on steroids, going beyond stiff distribution requirements as at say the University of Chicago to triple major requirements.

Which goes against the generally disaggregated curriculum that Tyler and Alex seem to favor. Or as exemplified in Tyler's link #5, Lambda University. (Most of these new-fangled "universities" are either scams or noble experiments doomed to fail, but Lambda sounds like it might work.)

I wonder about the research on students who double major. There's some possible selection bias: they're probably above average students to start with. Is it useful as a signal? I think it may depend on the school.

It's very common for students at Bowdoin College to double major, something like 30-40% of their graduates do. So a double major from Bowdoin might not signal much.

Conversely, it's rare for students at Reed College to double major; the senior thesis is so intense that very few students can or want to tackle two theses. So although a double major might signal something at Reed, it's so rare as to not be a useable signal.

But maybe there are schools in between those extremes where a double major does signal something?

"So although a double major might signal something at Reed, it's so rare as to not be a useable signal."

Huh? If anything, the rarity and difficulty of it would make it an intensely strong signal.

An intensely strong signal for the one student a year who graduates with a double major. That's so rare as to be useless.

Want to hire a smart college graduate? Hire a double major from Reed. Good luck finding one.

Hence, not a useable signal.

(Plus, I don't know if the double majors from Reed College are especially smart or good; I'm just basing this on that guy's claims that research shows that double majors have an extra intellectual something.

To even try to research this would probably require over 20 years of data comparing Reed's double majors to its single majors, due to the tiny sample sizes we're dealing with. So Reed's double major data may be useless even for research purposes, unless the researcher can get decades of data about Reed College, and/or combine it with data from other colleges.)

I did a Compsci + Finance double major and maybe eight of the seventy students who started the program ended up graduating in it.

It was hard mostly due to workload. You don't have the freedom to take basket-weaving options, and although professors within a department coordinate to avoid overloading students, that benefit doesn't always apply to double majors.

For this specific degree, the type of person that makes a good programmer will find the drudgery of accounting and fake-math economics intolerable.

The limited respect I held for economics vanished when the professor spent a lesson explaining how to integrate functions by counting squares on graph paper while I was simultaneously taking 300-level calculus.

2. So... a fake user views ads fed to it by an algorithm, and likes click-bait fake news re-posted by an aggregator, drawn in by deceptive Google results, then scrolls through catfish profiles, angry posts by paid trolls, AI conversations with bots, and honey-trap posts by spooks and the cops, all supported by ads for products resold by accounts not connected to the producer, sold on a platform that disclaims a connection to any of it.

And somewhere, somehow, at the bottom of this, someone is still paying a lot of money for the clicks, and has not yet wised up?

Please tell me that the cable repair guy and pizza delivery boy on Pron-Hub are still real at least.

Literally LOL'ed

“Please tell me that the cable repair guy and pizza delivery boy on Pron-Hub are still real at least.”

You have strange taste in pron, sir, but the rest was funny.

4. The science of quantum mechanics proves that "objective" reality is indeed a social construct. Postmodernists rejoice; Jordan Peterson fans will have a heart attack.

Nah, just more misleading pop-quantum-science headlines

Jordan Peterson dabbles in quantum mysticism and probably appreciates this result. He is not a realist.

Yeah, and no discussion of how Everett supposedly collapsed the apparent paradox/contradiction with Many Worlds 50 years ago. Might this result then provide another argument for MW?

Are there ways to disprove the MW hypothesis? If the answer is no, then it is at best metaphysics, not physics. And it also overwhelms Occam's Razor by introducing immense complexities, rather like Ptolemaic epicycles or the old aether hypothesis. At least explanations based on conscious observation invoke something which we know exists even if we don't understand it.

No, it just proves that (maybe) one natural phenomenon is not easy (or even possible) to measure. Most phenomena are still easy to measure, and that is why you and I are able to be chatting here. Postmodernists are simply crying babies unhappy with reality.

Does the experiment reflect an "objective" reality or is it (and therefore any conclusion drawn from it) just a social construction?

Jack is bluffing a jack, I called. The dealer turned over an ace of diamonds.
“All-in,” I said.
“Call.” Jack turned over a four and a five.

Locality also suffers in experiments with paired particles. Space-time is an approximation that normally works very well.

4. Quantum seems yet more confusing.
------------

Causality costs energy when setting up the experiment. If observer A measured the photon, then observer A altered the quantum field prior to the experiment; observer A needed a reference point and cleared the field prior to firing the photon. The exchange was hedged prior to the occurrence. To make the measurement is to apply energy already liquid, which has to be accumulated prior to measurement. Thus observers A and B will be partitioned. Neither can know the preparation of the other until the experiment, and then it is too late. This is another case of the 'error correction' we hear about in our abstract algebra tree. There will again be a superposition of possibilities because a process in the tree trunk keeps enough capacity to maintain roundness; it keeps a bit error for rounding to the nearest integer. That has to be accumulated prior to the experiment, hence the paradox: they conserve energy earlier than the experiment.

Here is another way to think on it. For observer A to have a fair test on observer B, then at some point prior to the experiment they exchange time, and other wise agreed to build potential energy across all observers. All parties already agreeing that the outcome is a compressed finite channel, a tree trunk. In doing so, their experiment will hold 'bit error' round off energy that forces the experiment to follow a path partitioned from the other observers. The more accurate the collective test, the greater the indifference curve parallelism, the more quantum will the results appear, order will be ambiguous since no party knows exactly how the rounding off went when creating finite ratios.

Like word spaghetti.
Somewhere between playing baseball and executing these experiments, the experimenters had to share the vector space and swap energies. Nothing happens without error correction.

https://www.thesun.co.uk/tech/8624077/time-machine-experiment-quantum-physics-electrons/

The “balls” scattered and, according to the laws of physics, should have appeared to split in a haphazard way.

But researchers managed to make them reform in their original order — looking as if they were turning back time.

---------

Another or similar?

Same issue. Once the experimenters set up the experiment, the original state of the electrons is already limited to a superposition of the possible outcomes; it is slightly skewed from the order the experimenters think. Setting up the experiment creates our abstract tree, experimenters are stuck in their own algebra, and the experiment rounds the circle. Solution space is already partitioned, and contradictory, before the experiment even started.

#4: the speed of light keeps us from exploring the largest things in our universe. Quantum weirdness keeps us from exploring the smallest things in our universe. It's almost as if a designer were keeping us from understanding the edges of our existence and finding out the details of the simulated universe we live in.

Prediction: once the designers of the simulation decide to give our universe more computational power, we'll suddenly discover some new way of overcoming these limitations.

Enthusiastic +1, yes.

I have posted the same thing. I'm not, in the end, a 'we are living in a simulation' guy, but if we are, then the speed of light and quantum weirdness are probably the closest things to evidence that we are.

I think of it more as trying to pop a water balloon while wearing dishwashing gloves.

3) It's funny because Beethoven composed sharp and hard, and soft. Mozart, on the other hand, had a unison even if a bit acoustic. It occurs to me that Papa Haydn's Farewell Symphony is so moving for me because it's small steps into great slides, as if a child learning to read a word. And who makes out on that transaction? There is this idea that families are the fundamental "unit," that somehow friendship is less than. In so much as they are biological units, families help us understand society, but that does not require that they help one understand oneself. To understand oneself requires transcendence that family cannot offer, because families are intrinsically unequal. In fact, the family object is necessarily a delusion. So, one delusion or delusions of grandeur?

There was a recent prizewinning essay on similar lines: https://fqxi.org/community/forum/topic/3006

If we're going to reason along these lines (and it's probably futile to), this seems more like evidence that the world was not built for us, but rather for something else for which these are not tough barriers.

"...rather something else for which these are not tough barriers."

Are they tough barriers? Humans have been around for 100K+ years. 1,000 years ago, moving at the speed of a cheetah was a power of the gods. Now it's an ability we grant to 16-year-olds.

In 1,000 years, the speed of light may not seem so restrictive.

https://en.wikipedia.org/wiki/Sumatran_rhinoceros#/media/File:Dicerorhinus_sumatrensis_Bell_1793.jpg

https://en.wikipedia.org/wiki/Sumatran_rhinoceros#/media/File:Sumatran_Rhino_001.jpg

https://en.wikipedia.org/wiki/Sumatran_rhinoceros#/media/File:Sumatran_Rhino_London-1872.jpg

Battle unicorns.

4. Quantum seems yet more confusing.

There is a Bell's Theorem, really a math theorem. If you assume local variables and quantum effects (action at a distance), then light must be instantaneous. There are a dozen manifestations of this equivalency all through math. Quantum effects need to be understood as a prior connection: there is already a tree trunk involved, and two botanists are looking at opposite branches. The more closely a quantum effect is measured, the rounder the tree. We prep the experiment, ex ante, to make a rounder tree trunk; otherwise we never see the effect. The paradox is that the scientist preps a tree rounder than is sustainable without the prep work, and the results end up saying: "Tree trunk too round." Another version of Bell's theorem, but it applies generally to compacting aggregate systems.

The new math connects Bell's theorem and queuing systems to holograms, and we can get the connection between the Walmart checkout manager and Bell's Theorem: the checkout stand is stable when it meets Bell's theorem to best estimate, which is error-corrected to the limited set of outcomes, the prior algebra.
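Tree-trunk imagery aside, the Bell's theorem being invoked here has a concrete, checkable core: the CHSH inequality. The sketch below is my own illustration (not from any paper linked in this thread): any local-hidden-variable model yields a CHSH value |S| of at most 2, while the quantum prediction for an entangled singlet state reaches 2√2.

```python
import numpy as np

# CHSH: S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Local hidden-variable models obey |S| <= 2; quantum mechanics predicts
# up to 2*sqrt(2) for entangled particles (Tsirelson's bound).

a, ap = 0.0, np.pi / 2            # Alice's two measurement angles
b, bp = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement angles

# Quantum prediction for the singlet state: E(x, y) = -cos(x - y).
def E_quantum(x, y):
    return -np.cos(x - y)

S_quantum = (E_quantum(a, b) - E_quantum(a, bp)
             + E_quantum(ap, b) + E_quantum(ap, bp))

# A simple local-hidden-variable model: each particle pair carries a
# shared hidden angle lam; each outcome is the sign of cos(angle - lam).
rng = np.random.default_rng(1)
lam = rng.uniform(0, 2 * np.pi, 200_000)

def E_local(x, y):
    out_a = np.sign(np.cos(x - lam))
    out_b = -np.sign(np.cos(y - lam))  # anticorrelated partner
    return np.mean(out_a * out_b)

S_local = (E_local(a, b) - E_local(a, bp)
           + E_local(ap, b) + E_local(ap, bp))

print(f"|S| quantum: {abs(S_quantum):.3f}")  # 2*sqrt(2), about 2.828
print(f"|S| local:   {abs(S_local):.3f}")    # about 2 (the LHV bound)
```

The gap between 2 and 2.828 is what the experiments measure; no prior "error correction" by a local model closes it.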

4.

There is really nothing interesting here. All the "alternative realities" and "objective reality" stuff is just a bizarre and overwrought way to frame a straightforward situation. (A quantum situation, yes, but a straightforward one).

In any case, this experiment doesn't do anything that Wigner's thought experiment didn't. I don't know if I'd even call it an experiment. It's a demonstration rather.

4. Now would be a good time to repost Brukner's own paper on the quantum measurement problem. https://arxiv.org/abs/1507.05255

This has long seemed to me to be the correct view to take, given everything we already know. I interpret it as meaning antirealism about the quantum theory itself, not "reality" whatever that may be. Or if you prefer, that reality can inherently not be known to us to the fullest extent. (A kind of instrumentalism if you will) And this follows from the structure of the theory itself, so in a sense this experiment was "unnecessary" as John says. (No idea what Matt Young is saying though.)

Of course this does not rule out something like 't Hooft's superdeterminism, or the latest Bohmian idea or the one by Wolfram (these have their own issues), nor does it contradict Everettian ideas either (which, again, have their own issues amply exhibited here: https://www.mat.univie.ac.at/~neum/physfaq/topics/manyworlds).

**YOUTH** minimum wages in Denmark.

Alex Tabarrok posted something about this back in 2017

https://marginalrevolution.com/marginalrevolution/2017/06/minimum-wage-evidence-danish-discontinuity.html

This article about Pursuit shows a lot of similarities between Pursuit and the Lambda School, although somewhat different aims or at least target students. As with Lambda School, this is one of the few new-fangled education "disruptors" where I think there might actually be real value to this idea.
https://www.nytimes.com/2019/03/15/business/pursuit-tech-jobs-training.html?action=click&module=Well&pgtype=Homepage&section=Technology

Comments for this post are closed