The Evil of Pagination

October 2, 2012 at 6:45 am in Economics, Education, Web/Tech

I agree with Farhad Manjoo:

Splitting articles and photo galleries into multiple pages is evil. It should stop.

Pagination is one of the worst design and usability sins on the Web, the kind of obvious no-no that should have gone out with blinky text, dancing cat animations, and autoplaying music. It shows constant, quiet contempt for people who should be any news site’s highest priority—folks who want to read articles all the way to the end.

Pagination persists because splitting a single-page article into two pages can, in theory, yield twice as many opportunities to display ads—though in practice it doesn’t because lots of readers never bother to click past the first page. The practice has become so ubiquitous that it’s numbed many publications and readers into thinking that multipage design is how the Web has always been, and how it should be.

Neither is true: The Web’s earliest news sites didn’t paginate, and the practice grew up only over the past decade, in response to pressure from the ad industry. It doesn’t have to be this way—some of the Web’s most forward-thinking and successful publications, including BuzzFeed and the Verge, have eschewed pagination, and they’re better off for it.

Joshua Gans October 2, 2012 at 6:57 am

After apparently trying hard not to paginate, the PS of the original article is on the second page.

Mo October 2, 2012 at 2:34 pm

It’s a joke.

Anon. October 2, 2012 at 6:58 am

>Splitting articles and photo galleries into multiple pages is evil.

First world problems, anyone?

prior_approval October 2, 2012 at 7:03 am

‘It shows constant, quiet contempt for people who should be any news site’s highest priority’

Or toward people unable to click the (almost always) provided Print button – almost as if hyperlinking were something people on the web are unaware of.

A (commercial) web site’s highest priority is paying the bills that allow it to exist as a news site – a reality far too easy to lose sight of as the business model of providing news as a profitable economic activity has collapsed.

Though of course, there are plenty of ‘news’ sites that have little to nothing to do with surviving based on revenue from people reading their ‘news.’ Mainly because the people paying the bills of those ‘news’ sites have an entirely different goal than attempting to make a bottom line profit.

The home page (what a quaint term) of http://www.buzzfeed.com/ would benefit immensely from pagination. And calling BuzzFeed a news site is roughly the same as calling People magazine a news weekly.

John Thacker October 2, 2012 at 11:19 am

Though of course, there are plenty of ‘news’ sites that have little to nothing to do with surviving based on revenue from people reading their ‘news.’ Mainly because the people paying the bills of those ‘news’ sites have an entirely different goal than attempting to make a bottom line profit.

Ah, so we’re all agreed that we have a prediction that the Voice of America and PBS should paginate less? BTW, I’m not sure why you’re using sneer quotes in relation to those two.

John Thacker October 2, 2012 at 11:22 am

It’s certainly true that the historical business model has collapsed (especially with classifieds being siphoned off by sites such as Craigslist, Zillow, Redfin, etc.), but it’s not clear that frustrating your customers is the right answer. I realize that subscriptions have apparently worked for some sites, but I’m not a fan of them either.

Cliff October 2, 2012 at 4:38 pm

Why is “Home page” a quaint term?

Dave October 2, 2012 at 7:13 am

Meet deslide.

Rahul October 2, 2012 at 7:18 am

Very nice. Slide shows are super-evil.

MadNumismatist October 2, 2012 at 11:17 am

Great find. I read the article and ignored it, then I just hit a slide show, used it, et voilà. Thanks.

Rahul October 2, 2012 at 7:17 am

Totally agree. My personal solution is always clicking on “Print version”.

Another (rarer) evil is websites that disable scrolling with Page Up / Page Down, instead resorting to some gimmicky drag-with-mouse widget.

Oreg October 2, 2012 at 2:28 pm

I guess you mean pages built with Adobe Flash. Indeed, Flash is evil.

Amakudari October 3, 2012 at 3:35 am

Let’s not forget those stupid and evil sites that disable right-clicking. They’ve thankfully gotten rarer as well, although as a resident of Japan I can report we’re behind the US (the Nikkei, basically the local WSJ, had it until very recently). Not only is it foiled completely by disabling JavaScript on those sites, but it’s super-easy for anyone who actually wants to steal content to do so in a number of ways (Ctrl+A, Ctrl+C; view source; etc.). We have lots of other nasty anti-patterns here, like impermanent URLs.

Kevin October 2, 2012 at 7:45 am

Aside from ads, pagination is a simple (but old) method of controlling web-page size, which ultimately affects the user experience. In principle you can design a page that has rotating ads (as the user scrolls), no pagination, and fast loading using some AJAX. In practice, pagers are simple to implement and robust, and so still vastly more common.

Sigivald October 3, 2012 at 2:45 pm

Does page size really affect UX that much, though? It’s not 1996, and we’re not using dialup anymore…

Further, pagers are ubiquitous on big budget sites that can totally manage a professional AJAX system.

I don’t think that anyone believes for a second that big players are paginating so heavily in anything other than a futile attempt to juice their page-view numbers.

(Futile because the people buying ad space know damned well the numbers are inflated, and are willing to pay that much less per view because of it…)

Joseph W. October 4, 2012 at 11:11 am

It makes a big difference if you’re using it from a work computer where, every time a new page full of ads loads, you have to answer half a dozen Internet Explorer queries about whether you want to let this site open content on your computer.

gregorylent October 2, 2012 at 8:40 am

always thought that was for goosing the page-view stats .. and i dislike it also

RPLong October 2, 2012 at 8:42 am

Now wait just a minute here.

WTF is wrong with dancing cat animation?

MIckey October 2, 2012 at 10:10 am

The Boston Herald is really bad about this. Most of their articles run two pages, with only a sentence or two on the second page.

JimK October 2, 2012 at 10:27 am

With all the Javascript tools out there it is a real crime, especially for tech mags, to use full page refresh for slideshows.

John Schilling October 2, 2012 at 10:35 am

“…constant, quiet contempt for people who should be any news site’s highest priority”

Really? Because from the rest of the article, I thought pagination was what the advertisers wanted. Possibly they are foolishly acting against their own interest, but it is hardly contemptible to give them what they ask for. What they actually pay for, with real money.

At least at a commercial website, “folks who want to read articles all the way to the end”, without paying real money, are not the top priority because they are not the customers. They are the product, and product doesn’t get to complain about how it is packaged.

Rahul October 2, 2012 at 10:58 am

So, on websites where I pay a subscription, are all articles one page?

Memnon October 2, 2012 at 11:49 am

The advertisers do not want pagination.
The advertisers want to reach the audience. They will pay for an audience.
The audience does not want pagination. The audience wants the content.
The advertisers are untroubled by a state of maximal pagination. The audience is very troubled in such a state.
The intermediaries want to inflate reported audience size to get more money from advertisers. Pagination is a mild scam against the advertiser when only some intermediaries use it. It is inconsequential to the advertiser when all providers use it.

Hit-count-based payment causes a pagination arms race between content providers, who are forced to adopt the pagination methods of the providers with the least empathy for their audience.

Only a switch in metric can force a new stable configuration.
It is not anti-market to pray for such a change, but it is unrealistic to expect pagination to disappear through voluntary restraint.

RM October 2, 2012 at 11:07 am

I agree that pagination is bad, especially in local newspapers. But I had always assumed that the practice started off in the era of slower internet speeds, and that it was a way to provide quick downloads. I had assumed that the time to download a long page was more than twice the time to download two pages. But it is the ads … eh? I would never have guessed because I assumed that — like me — people never click on ads.

John Thacker October 2, 2012 at 11:25 am

In the age of slower Internet speeds, there were simply far fewer ads, bandwidth-intensive graphics, and so on. There was also much less rigid page structure, so documents could be displayed as they downloaded, with the text reformatted on the fly as more arrived. The newer way of displaying things is prettier to most people, but it forces the use of pagination because it’s more difficult to display a page while it’s still being downloaded and have it look sensible at all. (Frames actually made that easier, but looked ugly to most.)

mulp October 2, 2012 at 11:16 pm

When will the age of slower internet speeds end in the USA?

Not everyone lives in high population density areas, and a national standard has not been adopted.

Besides, the US has among the highest costs for high speed, while income inequality is increasing.

Sigivald October 3, 2012 at 2:48 pm

“Inequality” is a non-issue. (What does “inequality” have to do with the actual notional issue of how many people can afford broadband access? Nothing. Inequality is not synonymous with poverty.)

And what would “a national standard” fix? How?

(Something like 95%* of the population can get wired broadband, and of the rest, pretty much everyone can get wireless broadband of some sort or other.

What’s the problem your “national standard” will be fixing, exactly?

* Number completely pulled out of a hat, but absolutely plausible.)

Henry October 2, 2012 at 12:29 pm

You are absolutely right. I don’t know what Mr. Manjoo’s age is, but in the early days of the internet, with 24K modems, if you didn’t use pagination it would take forever to load a page. Furthermore, back then the browsers were more than a little flaky; add to that limited memory, and things got very unstable if you put too much stuff on a page, even if it was all plain text/HTML.

Rahul October 2, 2012 at 1:58 pm

I’m skeptical of that theory: lots of technical pages, manuals, and FAQs from the early days of the WWW were essentially long, monolithic plain-text blocks, and they hardly ever used pagination. The textual content of a news page hardly consumes any bandwidth at all.

For example, the RFCs:

http://www.rfc-editor.org/rfc/rfc5416.txt

Amakudari October 3, 2012 at 4:00 am

Yeah, I posted on the original article that only a teensy fraction of the page comes from the actual article’s text. People severely misunderstand what “problem” modern pagination is solving, especially since for most sites it began long after the days of the primordial web.

1 MB = full text of the Bible = 2-or-so megapixel picture (JPG) = a fraction of a second of HD video (H.264)

Those figures can change with compression, but the point remains that text just isn’t that big a deal. Slate could save about 200 KB, by Chrome’s estimate, just by using gzip compression wherever available. It could save another boatload by dropping social-network buttons. It saves nothing by adding pagination buttons, because those often require downloading identical content again on the next page, some of which won’t or can’t be cached. UX isn’t even the obstacle, since implementing an AJAX approach with a fallback to serving new pages has been trivial for years. All they want is to rack up pageviews.
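The gzip point above is easy to check: article text and markup are highly redundant, so gzip shrinks them dramatically. A minimal sketch — the repeated line below is a stand-in for a text-heavy page, and the 200 KB Slate figure is the commenter’s own estimate, not reproduced here:

```python
import gzip

# Simulate a text-heavy article page: repetitive HTML markup plus prose.
article = ("<p class='story-body'>Pagination is one of the worst design "
           "and usability sins on the Web.</p>\n" * 500).encode("utf-8")

compressed = gzip.compress(article)
ratio = len(compressed) / len(article)

print(f"original: {len(article)} bytes")
print(f"gzipped:  {len(compressed)} bytes")
print(f"ratio:    {ratio:.1%}")
```

Real pages compress less than this artificial example, but prose-plus-HTML routinely shrinks by 70–80% under gzip, which is why serving it uncompressed wastes far more bandwidth than the article text itself ever costs.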

careless October 3, 2012 at 11:31 am

And still, somehow disqus manages to stress modern computers with comment threads.

Mo October 2, 2012 at 2:40 pm

Plain text does not take up much bandwidth. The hogs are the images. For example, that giant manual was 172 K, total.

For the record, I remember using a 2400 baud modem.

Bender Bending Rodriguez October 2, 2012 at 10:10 pm

Hayes or Hayes-compatible? Were you always having to jiggle your handset on the acoustic coupler?

aaron October 3, 2012 at 3:14 pm

The images don’t even use up much bandwidth. It’s Flash and ad code that do it.

I manually rip the article body out of NYTimes articles to avoid their JAVA crap. It loads instantaneously.

LS October 2, 2012 at 11:18 am

prior_approval — Sure, they have to pay their bills.
But when so many mainstream sites spread stories over multiple pages clogged with widgets and autoplaying ads leading to slow-loading pages (in essence, interfering with getting the whole story), I just go elsewhere.
The irony is that with dying newspapers’ print editions, the stories that “jumped” pages weren’t for the page views, but just a way to squeeze stories into the space left over after the ads are in place. (The ad profile determines the “news hole”). Plus, print ads didn’t scream at you and get in the way of your reading.
Anyway, there is a reasonable amount of clicking online to get through a story. Some sites just go too far.

Todd Fletcher October 2, 2012 at 12:03 pm

I’m a web developer for a well-known media company. We’ve had to do all of these tricks for the simple reason that, I’m sad to say, they work.

We had obnoxious subscription popups that blocked the content when you first landed on an article page. This to me is a gigantic usability no-no. I, among others, advocated turning them off on the theory that subscriptions would go up because visitors would have a more positive reaction to the site. Nope. They plummeted. We had forum pages that required a click and a page load to expand each thread – a terrible user experience – but when we went to a single long page, again to benefit the users, ad impressions dropped off a cliff.

The price of ad impressions is falling all the time, so these tactics are used just to keep on an even keel. The only way out seems to be selling leads from the users, which does make more money per pop, but it’s a lot harder to get people to give up their info and consent to a sales call. The fact is, it’s very hard to monetize a content-based web site.

Rahul October 2, 2012 at 2:02 pm

The other option is to improve content.

mulp October 2, 2012 at 11:24 pm

You volunteering to provide high value content for $2 a day?

Rahul October 3, 2012 at 8:35 am

$60 a month is a lot of money. I’m wondering, what’s the most expensive online magazine subscription? Ok, excluding academic journals.

NYT’s most expensive option is $35 a month. WSJ $26.

Todd Fletcher October 3, 2012 at 12:05 am

That’s the problem, in a nutshell. But how?

Josh Nelson October 2, 2012 at 1:18 pm

If you’re looking at tons of data (a web representation of database info), I find pagination extremely helpful and 100% necessary.
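For data tables the point stands: nobody wants 100,000 rows in one response. A minimal sketch of the usual offset/limit scheme — the `paginate` helper and its field names are illustrative, not from any particular framework:

```python
from typing import Sequence, TypeVar

T = TypeVar("T")

def paginate(items: Sequence[T], page: int, per_page: int = 25) -> dict:
    """Return one page of a result set, plus the metadata a UI needs.

    `page` is 1-based; out-of-range pages return an empty item list.
    """
    if page < 1 or per_page < 1:
        raise ValueError("page and per_page must be positive")
    start = (page - 1) * per_page
    total = len(items)
    return {
        "items": list(items[start:start + per_page]),
        "page": page,
        "pages": max(1, -(-total // per_page)),  # ceiling division
        "total": total,
    }

# A 103-row result set split into pages of 25: five pages, last one short.
result = paginate(range(103), page=5)
print(result["pages"], len(result["items"]))  # 5 pages, 3 rows on the last
```

This is the legitimate use case: the pagination is bounded by the data, not by a pageview target, and each page is genuinely cheaper to produce and render than the whole set.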

Frank Johns October 2, 2012 at 2:35 pm

I had assumed some blogs, like the Economist’s, which only started paginating recently and don’t display ads anymore, do it in order to count how many clicks the individual articles get. It is a pain in the ass to keep clicking and waiting, though.

joemama1333 October 2, 2012 at 3:24 pm

You’re missing an important point about pagination as far as ad revenues go – it really doesn’t improve monetization much, if at all (especially once you account for people who drop off). I work for a display ad exchange, and many factors contribute to the prices paid for ads. One is scarcity: when you paginate, you create multiple opportunities for advertisers to reach that customer. In an auction-based model, that decreases competition for the ad space, so advertisers can bid lower, knowing that if they don’t get the user the first time, they can get them the next. Also, prices go down as your session depth on a site increases, so publishers lose on that front as well. It’s basically the same as auto-refreshing a page, and at the end of the day it pushes prices down by a significant amount – and any gain is almost certainly offset by the drop-off of users.

SgtBob October 2, 2012 at 5:01 pm

In Dark Ages journalism, teaching was don’t jump a story unless absolutely necessary, because most readers won’t turn to the jump page. Obviously advertising wasn’t in charge.

sdrace October 2, 2012 at 6:47 pm

Anyone interested in the polar opposite of pagination should visit the Daily Mail home page and scroll, and scroll, and scroll… http://www.dailymail.co.uk/

Gherald October 2, 2012 at 6:55 pm

AutoPager extensions!

Google Chrome: https://chrome.google.com/webstore/detail/mmgagnmbebdebebbcleklifnobamjonh

Firefox: https://addons.mozilla.org/en-US/firefox/addon/autopager/

Verily, the first thing I do when sitting down on a new computer is install Chrome or Firefox (a portable version, if I don’t have administrator rights), then the Adblock Plus extension ( http://adblockplus.org/ ) , then finally AutoPager. It’s great.

*daniel October 3, 2012 at 11:55 am

Auto-scroll is the answer here. If you don’t load the second “page” of content until the user gets that far, you can still deliver your ads, and you can deliver them to the kind of person who reads that far. That can be a select group.

Michael Van Beek October 3, 2012 at 1:05 pm
aaron October 3, 2012 at 3:04 pm

I’ve suspected that a two-page split also gauges the reader’s interest in the content.

I don’t see how this could be valuable enough to justify the evil.

Scott Saliency October 6, 2012 at 2:33 pm

The web needs low-transaction-cost micropayments. A digital currency with low transaction costs would do wonders for the web and eliminate much that is “evil.”
