The EU “link tax” has been resurrected

And here is commentary from Ben Thompson:

This is why the so-called “link tax” is doomed to failure — indeed, it has already failed every time it has been attempted. Google, which makes no direct revenue from Google News, will simply stop serving Google News to the EU, or dramatically curtail what it displays, and the only entities that will be harmed — other than EU consumers — are the publications that get traffic from Google News. Again, that is exactly what happened previously.

There is another way to understand the extent to which this proposal is a naked attempt to work against natural market forces: Google’s search engine respects a site’s robots.txt file, wherein a publisher can exclude their site from the company’s index. Were it truly the case that Google was profiting unfairly from the hard work of publishers, then publishers have a readily-accessible tool to make them stop. And yet they don’t, because the reality is that while publishers need Google (and Facebook), that need is not reciprocated. To that end, the only way to characterize money that might flow from Google and Facebook (or a €10-million-in-revenue-generating Stratechery) to publishers is as a redistribution tax, enforced by those that hold the guns.

Here is the full post, excellent as always.


Does anyone else find it ironic to see so many people who disparaged Brexit suddenly awake to reality?

The EU has always abused its power to enforce dumb shit. They regulate the power draw of kettles, for Christ's sake. As any high school graduate will know, the amount of power it takes to heat water is constant. Lower power just makes it take longer.

"As any high school graduate will know, the amount of power it takes to heat water is constant. Lower power just makes it take longer."

Actually, you're wrong, it's worse than that. You ignored the heat loss from the top of the kettle. The longer it takes, the more heat you've wasted warming the atmosphere around the kettle. The amount of energy is not constant; it goes up. So lower power makes it less efficient and takes much longer.

In the worst case, your power is less than your conductive and radiative heat losses and the kettle will never reach the boiling point.

EU regulations of kettle power were necessary. If northern Virginia had regulated kettle power draw, they wouldn’t have supported eugenics or nuclear power like Mercatus does. After all, genetic research is just one step away from eugenics. Small steps to a better world. Right Cowen?

The Kochs have ruined kettles, and European regulation prevents catastrophic kettle disasters. Why do you think the Kochs pay Cowen’s salary? Follow the kettle money SHEEPLE.

Germans would never stoop to this. And unlike dirty Americans who haven’t become expats, Germans have never let their sense of superiority affect their behavior or cause any outside group harm.

Random Wikipedia article that’s tangential to the point at best

Unnecessary block quote that’s irrelevant to the comment

Nuclear energy is bad. Cowen is bad.

I was fired from George Mason. Also he owned slaves.

Unions and labor terms. But in German without translation.

I generally frown on sock puppets, except where the puppetry is obvious to all and funny.

Ich auch. ("Me too.")

I am pro-Brexit (I think it is best for the UK), and I agree that most regulations made, and most actions taken, by the EU are absurd. If for once they make a good choice, though, it must be defended. And are you sure that the UK, which is the main historical victim of American disrespect for the intellectual property of others, doesn't agree with the EU on this specific point?

Except you're talking a load of shit, because there are no EU regulations relating to kettle power draw. Why are you foaming wingnuts all so gullible?

The EU was really drawing up regulations to limit kettle power, as reported by numerous credible sources up to late 2016. I don't believe this has happened (yet), but it's not clear whether the plans are ongoing, on hold, or have been dropped.

Next there will be absurd claims that the EU wants to regulate the suction power of vacuum cleaners.

No, they wanted to set standards for the conditions under which sucking power is measured, so manufacturers couldn't deceptively fudge the numbers the way VW did with emissions standards.

Dyson, whose founder is a leading Brexiteer, made a stink about it because he thought the standards were unfair to his products. More likely they showed how poorly those overpriced contraptions perform, but hey...

'showed how poorly those overpriced contraptions perform'
Actually, not really. The basic dispute was that the tests were done with pristine bags and did not reflect the loss of suction as the bag fills over time; avoiding that loss is the main selling point of Dyson's overpriced contraptions.

So yes, the tests were unfair, as they avoided testing vacuum cleaners with bags in real use conditions, but they did not show how poorly his product performed.

I just saw a great saying earlier: "Never attribute to conspiracy that which is adequately explained by incompetence." I would think the dismissal of all these reports as conspiracy could be answered by that quote.

The problem is that robots.txt is all-or-none. Either pages can be indexed in their entirety, with Google choosing what content to display limited only by fair use, or, if the pages are hidden by robots.txt, they won't show in Google's results at all. An alternative would be for the website to detect Google's indexing "bot" and give it only the content it does want indexed, excluding the portions of the web page it does not want indexed.

Ben Thompson's point would still seem to stand. Google "pays" firms for their content by indexing the content that the firm does want indexed. The firm can decide for itself whether that's worth enough to allow the rest of its content to get indexed.
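The all-or-none behavior described above is visible in the robots.txt format itself, which only speaks in terms of whole paths (the paths here are illustrative, not from any real site):

```text
# robots.txt operates per-path: a page is either fully crawlable
# or fully hidden. There is no way to say "index only the title,
# URL, and description of these pages."
User-agent: Googlebot
Disallow: /articles/    # hide everything under /articles/ from Google
Allow: /                # everything else may be indexed in full
```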

But this defense of Google conflates Google News with the Google search engine, which is in a quasi-monopoly position (91.82% of searches in the EU are made on Google). As such, it is vulnerable to antitrust laws and action from the European Union.

Seems like a trivial problem to solve if anyone actually cared.

The entire concept of robots.txt is to prevent certain parts of the site from being indexed. Ben Thompson, as always, is spot on.

Groucho Marx said it best - Politics is the art of looking for trouble, finding it everywhere, diagnosing it incorrectly and applying the wrong remedies.

robots.txt is all-or-nothing on a per-page basis. So, there is no way (using robots.txt) to say "you can index this page, but only the page title, url, and description (of my choosing)". The alternative solution I suggest is that content providers selectively render content to the search-engine bots based on what they do/don't want indexed. This is feasible today.
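The selective-rendering idea suggested above can be sketched in a few lines. This is a hypothetical illustration, not a real framework API: the field names and bot signatures are assumptions, and the point is only that a site can trim what a self-identifying crawler sees.

```python
# Sketch: serve search-engine bots only the fields the publisher
# wants indexed, keyed off the User-Agent header. The field names
# and bot list are illustrative assumptions.

INDEXABLE_FIELDS = ("title", "url", "description")
BOT_SIGNATURES = ("Googlebot", "Bingbot")  # bots that disclose themselves

def render_for(user_agent, page):
    """Return the full page for humans, a trimmed version for known bots."""
    if any(sig in user_agent for sig in BOT_SIGNATURES):
        return {k: v for k, v in page.items() if k in INDEXABLE_FIELDS}
    return dict(page)

page = {
    "title": "EU link tax resurrected",
    "url": "https://example.com/link-tax",
    "description": "Commentary on the proposal.",
    "body": "Full article text the publisher does not want indexed.",
}

bot_view = render_for("Mozilla/5.0 (compatible; Googlebot/2.1)", page)
human_view = render_for("Mozilla/5.0 (Windows NT 10.0)", page)
```

Note that this only works if crawlers honestly identify themselves, which is exactly why the transparency argument below matters.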

Not trying to belabor the point, but there are various meta directives that instruct further on how a page should be treated. I think the point is that EU regulators are seeing problems where none exist.
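For what it's worth, the meta directives mentioned above look like this; they give per-page (though still not per-element) control over how a search engine treats the page:

```html
<!-- Robots meta directives: finer than robots.txt, but still page-level -->
<meta name="robots" content="noindex">       <!-- don't index this page -->
<meta name="robots" content="nosnippet">     <!-- index it, but show no snippet -->
<meta name="googlebot" content="noarchive">  <!-- no cached copy (Googlebot only) -->
```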

Ben Thompson, in his article, provides a possible solution too:
"If regulators, EU or otherwise, truly want to constrain Facebook and Google — or, for that matter, all of the other ad networks and companies that in reality are far more of a threat to user privacy — then the ultimate force is user demand, and the lever is demanding transparency on exactly what these companies are doing."

These meta directives and robots.txt files still don't allow the content provider to say "you can index this page, but only the page title, URL, and description (of my choosing)". I agree with Ben Thompson's conclusion that enforcing transparency is the right approach here. Search-engine indexing bots should be required by law to disclose themselves via their user-agent. That would give content providers very fine-grained control over what content gets indexed and what does not.

In other words the content provider should incur the cost of protecting its own content from search engines.

Is that always the heart of political/regulator disputes: how to assign costs to some group or other?

I wish I had the self-confidence of an EU bureaucrat. They seemingly believe that anything can be bent to their will.

Among continental Europeans there has always been a historical tendency against market forces. The British have long acknowledged this in reference to the Italians and French as well as the Germans; hence the refusal to enter the EU monetary union, and more recently Brexit.

There is great irony to me that the EU as a project started out as trying to be American in flavor but as time has gone on it has ended up being "Capetian" in taste. Nothing has changed.

Generally speaking, how is Europe able to regulate US internet companies' websites? If Google were to redirect European users to its US website or serve webpages from a server in the US, would it still be subject to European law? In other words, what determines jurisdiction when a user is in one country and a server is in another? China, of course, blocks content from outside China, but isn't Europe open?

Europe is open, and the problem is not Google *sending* content to Europe, it is Google *taking* without authorization privately owned content from European citizens and companies.

This is a fight that the US can't win. If Google keeps doing this with US government support, the EU can simply declare that all American intellectual property is fair game for European citizens and companies and can be copied freely: movies, series, books, music, software, patented medications, technological patents, even all of Google's technology (just contact some high-ranking engineers at Google, pay them twice what they earn, give them villas on the Riviera and EU passports, and all of Google's secrets will fly away like butterflies). The US has much more to lose in this game than the EU.

My question is when does US IP law apply and when does EU law apply? As far as I know, if an American inventor wants to enforce patent rights in Europe, he or she needs to get a European patent. The US patent only covers the US.

Surely, US law governs a user in the US using Google. What about a European user that remotely connects to a US computer (through Remote Desktop, TeamViewer, etc.) that accesses Google? That seems only a small step from a European user accessing Google from a computer in Europe.

Jurisdiction is defined by physical geographic borders, but data can be redirected virtually to anywhere.

"Google, which makes no direct revenue from Google News, will simply stop serving Google News to the EU."

Fine. Just do that. And stop complaining.

News aggregation makes no sense when media outlets have Twitter accounts. Google is slower to begin with.

It's worse. Germany already has this as the "Leistungsschutzrecht" (ancillary copyright for press publishers).

From that page:
"Numerous German publishers entered a zero-cost licensing agreement with Google so that their content continued to be displayed on Google News."

So it is a barrier to entry for any competitor to Google thus enshrining Google's monopoly.

Only if the newspapers want it to make Google powerful. They should be free to give content away if they wish.

Every time I visit a website and an annoying pop-up tells me that it uses cookies (it might as well tell me that it uses HTML too), I curse the EU.

Actually, that pop-up is telling you they use JavaScript. I don't normally use JavaScript (it gets turned on explicitly to reply to comments here), and I never see those pop-ups.

"Excellent?" Really?

"Moreover, the cost of copyright infringement to copyright holders has in fact decreased dramatically. Here I am referring to cost in a literal sense: to “steal” a copyrighted work in the analog age required the production of a physical product with its associated marginal costs; anyone that paid that cost was spending real money that was not going to the copyright holder. Digital goods, on the other hand, cost nothing to copy; pirated songs or movies or yes, Stratechery Daily Updates, are very weak indicators at best of foregone revenue for the copyright holder."

Foregone revenue is irrelevant; it is foregone profit that matters, which is unchanged. Stratechery is hereby lowered in esteem.

Yes, the marginal cost of an extra copy of a work is zero; the scarcity is in the existence of the first copy. For the aggregator mentality of Stratechery, all those things have no cost, because they are stolen from someone else.

Still, you can bet that the law will take me down if I offer up a derivative work based on Google search results. THAT is different, because internet.

It could be a tragedy of the commons. It doesn't make sense for any one publisher to apply a robots.txt file to curtail Google's access to their site; but it might make sense if they all take part, enforced by law.

In this instance, I can't identify where the commons is though.

People are easily manipulated. The "link tax" doesn't resolve that problem, but at least it imposes a charge on the manipulators. What about market forces? What about propaganda?

My favorite part of this new 'link tax' is that it is 'inalienable' -- meaning that publishers are not permitted to negotiate their own deals with aggregators or waive the fees entirely. The point, of course, is to prevent 'defectors' from letting Google link to their content for free and gain advantage over their rivals.

The next question -- which organizations are to be deemed to be publishers who must charge link fees and which organizations putting up internet content are not publishers and may, therefore, allow Google to link to them for free? Fun times.

I really doubt Google doesn't make revenue from Google News.
