Comments on: Seriously.
http://www.metafilter.com/151667/Seriously/
Seriously.
http://www.metafilter.com/151667/Seriously
<a href="http://blog.lmorchard.com/2015/07/22/the-verge-web-sucks/">The Verge's web sucks.</a> post:www.metafilter.com,2015:site.151667Wed, 29 Jul 2015 20:04:06 -0800the man of twists and turnscookiesupercookietrackerthevergewebbrowsermozillamosaicfirefoxchromesafariieBy: oceanjesse
http://www.metafilter.com/151667/Seriously#6146272
This is why I give money to this website!
posted by oceanjesse at Wed, 29 Jul 2015 20:15:17 -0800
http://www.metafilter.com/151667/Seriously#6146277
<a href="http://mike.teczno.com/notes/bandwidth.html">The Verge's Web Has Always Sucked</a> (self link).comment:www.metafilter.com,2015:site.151667-6146277Wed, 29 Jul 2015 20:26:57 -0800migurskiBy: verb
http://www.metafilter.com/151667/Seriously#6146278
This is a real and pressing problem; over the past couple of years I've worked on redesign projects for large and mid-sized ad-funded sites, and it is <em>rough</em> fighting for lighter-weight pages. One of the problems is concentration risk: if you put all of your eggs in one basket with a single ad network and/or metrics system, you run the risk of falling behind in a sort of arms race that pushes shockingly thin margins around like water seeking the lowest point. In businesses without a mature ad tech infrastructure, one network is often used to handle "workaday" ads, another is used to manage a small number of higher-margin ads that are willing to sell straight to the site, Google Analytics is there because literally everyone uses it, and another system or two get folded in to provide "value added" metrics for the direct advertising partners who want more information for the higher rates they're paying. Then add the social sharing widgets, which everyone accepts are necessary because such a huge slice of everyone's traffic comes straight from Twitter/Facebook/Reddit...
It's even worse on mobile, where the execution time for those scripts takes a hit compared to desktop even beyond the raw download time.
There are techniques for making things better without upsetting the whole apple cart, but they're generally isolated and not well integrated into the "baseline" of modern web design. Things have just gotten a lot more complicated, and lazy-loading of script and media assets, careful planning of responsive ad placement for mobile, etc. are techniques that take expertise and the willingness to spend time integrating them into an evolving design.
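(To make the lazy-loading idea concrete, here is a minimal sketch — the ad-tag URL is a made-up placeholder — in which nothing third-party downloads until the page's own content has finished loading:)
<pre><script>
// Sketch: defer a third-party tag until the page's own content is done.
// "https://ads.example.com/tag.js" is a hypothetical placeholder URL.
window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = 'https://ads.example.com/tag.js';
  s.async = true;
  document.body.appendChild(s);
});
</script></pre>
The same pattern works for media: ship a lightweight placeholder and swap the real asset in once the element is actually needed.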
Finally, really fixing the problem requires that technologists and designers engage with the business models and business needs driving these decisions. No one likes paged articles on the web, for example, but they're the only (simple) way for a site to ensure that they get enough ad impressions to justify the time and expense of producing a 6,000-word piece compared to the popular but fluffy listicles. Understanding what gaps the third metrics script and the fourth ad network were filling is necessary, because figuring out alternatives that won't destroy the site's slim revenue model means understanding and predicting the impact of major changes. This kind of stuff is what's driven me to start learning more about the nitty gritty of digital ad and metrics platforms—not because I want to do SEO crap or get excited about ads <em>per se,</em> but because I want to figure out how to help fix these things <em>and</em> how to help sites I really care about thrive in the "better web."
The Verge's article angered me because it blamed the technical infrastructure of the web for what is really a <em>publishing industry</em> challenge.
posted by verb at Wed, 29 Jul 2015 20:27:32 -0800
http://www.metafilter.com/151667/Seriously#6146283
Is it a problem though (for them)? We buy new cellphones every two years and create jobs for all of the advertising agencies and cellphone makers.
posted by oceanjesse at Wed, 29 Jul 2015 20:36:44 -0800
http://www.metafilter.com/151667/Seriously#6146287
<em>No one likes paged articles on the web</em>
Is this true? Obviously I hate the FOX 38 East Boise Tonight news article that spreads one paragraph of text over 12 pages while launching five auto-play video ads, a flash game, and Significantly More Than One Weird Trick. But I would take a stripped-down pagination scheme for a longform article any day over some wretched parallax shit that's designed to look good on someone's telephone.
posted by threeants at Wed, 29 Jul 2015 20:39:00 -0800
http://www.metafilter.com/151667/Seriously#6146288
Ghostery shows 12 blocked trackers on theverge.com, gawker.com shows 16. So they are slightly less evil than gawker. It's kind of like choosing between syphilis and gonorrhea.
posted by doctor_negative at Wed, 29 Jul 2015 20:40:01 -0800
http://www.metafilter.com/151667/Seriously#6146291
<em> Is this true? Obviously I hate the FOX 38 East Boise Tonight news article that spreads one paragraph of text over 12 pages while launching five auto-play video ads, a flash game, and Significantly More Than One Weird Trick. </em>
Well, that's sort of a different matter. Many of the big high-concept media-rich longform articles are design projects in and of themselves; the "articles longer than <em>n</em> words get paginated for better ad metrics" stuff exists in a kind of parallel universe.
posted by verb at Wed, 29 Jul 2015 20:44:29 -0800
http://www.metafilter.com/151667/Seriously#6146293
<em>Often, one network is used to handle "workaday" ads, another is used to manage a small number of higher-margin ads that are willing to sell straight to the site, Google Analytics is there because literally everyone uses it, and another system or two get folded in to provide "value added" metrics for the direct advertising partners who want more information for the higher rates they're paying. Then add the social sharing widgets, which everyone accepts are necessary because such a huge slice of everyone's traffic comes straight from Twitter/Facebook/Reddit...</em>
Is this why when I, for example, load up my hometown newspaper I get 11 blocks on Ghostery and 32 adblocks? If your ad-based business model relies on scrounging for whatever little bits of ad dollars you can and spending resources doing whatever manipulation is necessary to maximize that while simultaneously annoying your customers to no end, don't you think you'd be better off just charging them outright or something? I don't get it.
posted by downtohisturtles at Wed, 29 Jul 2015 20:47:58 -0800
http://www.metafilter.com/151667/Seriously#6146298
This is why I love Reader Mode in Safari on iOS. One tap and all of their CSS and JS shit goes out the fucking window into the trash. I can't wait until iOS 9 with Safari View Controller (Newsblur will finally get the ability to use reader mode *fingers crossed*) and Safari Content Blocker which will give the middle finger to "automatically redirect to app store" ads.
posted by Talez at Wed, 29 Jul 2015 20:54:00 -0800
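(For reference, those Safari content blockers are driven by a static JSON rule list. A minimal sketch of a rule blocking every third-party script — illustrative only, not a tested extension:)
<pre>[
  {
    "trigger": {
      "url-filter": ".*",
      "resource-type": ["script"],
      "load-type": ["third-party"]
    },
    "action": { "type": "block" }
  }
]</pre>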
http://www.metafilter.com/151667/Seriously#6146301
Reader Mode is my favorite thing as well, though NYTimes has been messing with it lately with the introduction of the "show full article" button on mobile. Loading only partial content on initial view prevents Safari from seeing the full article. Safari seems to determine what to include at page load time, not at button-press time.
posted by migurski at Wed, 29 Jul 2015 21:00:03 -0800
http://www.metafilter.com/151667/Seriously#6146302
<em>Is this why when I, for example, load up my hometown newspaper I get 11 blocks on Ghostery and 32 adblocks?</em>
Probably something like that, yeah. I mean, I think that is bad business for a lot of reasons; among other things, inventory ads are just a terrible business. Visitors are literally more likely to be in an airplane crash than <em>intentionally</em> click on an ad image. Barring teams with the experience (or time/resources) to dig in and come up with a different gameplan, many sites/publishers have slowly drifted into this ad-encrusted world rather than planning it and rolling it out in one big launch.
<em>If your ad-based business model relies on scrounging for whatever little bits of ad dollars you can and spending resources doing whatever manipulation is necessary to maximize that while simultaneously annoying your customers to no end, don't you think you'd be better off just charging them outright or something? I don't get it.</em>
Well, one challenge is that only the absolute tippy-top of the food chain can make a compelling enough case for their content to get people to pay up. The bread and butter of newspapers (especially local ones) was never subscribers, it was local advertisers and classified ads. Those are being shredded by the move to "everyone has a web site, and they're SEO'ing the hell out of it" as well as the emergence of competing free services like Craigslist. I mean, this isn't really new stuff—the gutting of the news industry's revenue model has been an ongoing slow-motion wreck for a long time.
The consequence is that the "Or Something" in "just charge them or something" is really a big blinking question mark. Lots of sites are experimenting, some are succeeding, others are failing, and others are flailing. Ads are a rough way to monetize, but they are a well-trod path that you can plan and execute on without taking big risks on bold monetization models. I don't mean to make excuses for the practices that have gotten us here; rather, I'm trying to dig into the "whys" that make it such a pervasive problem. It is not simply a technical problem, and it is often one that the people writing and producing publications <em>hate</em> themselves.
posted by verb at Wed, 29 Jul 2015 21:01:20 -0800
http://www.metafilter.com/151667/Seriously#6146304
My thinking too, Turtles. If I'm really only worth $6.20, and that funds the whole web, then someone please explain or link to an explanation of why micropayments are such a lousy idea. Hell, I'd go twenty bucks a month for clean, fast, low-load-time content. I used to buy a couple dozen magazines a month and didn't mind ads either - until they went to blow-ins, tear-outs, smellies, etc. and I couldn't "thumb" through my mags any more (even if I filleted them first).
Is this truly not a workable business model: make something good and sell it to people?
P.S. This is not snark; I would like to know.
Really, all this grief for $6.20?
posted by Alter Cocker at Wed, 29 Jul 2015 21:04:23 -0800
http://www.metafilter.com/151667/Seriously#6146307
<a href="http://www.metafilter.com/user/24138">mefi's own</a>comment:www.metafilter.com,2015:site.151667-6146307Wed, 29 Jul 2015 21:12:34 -0800potchBy: Sys Rq
http://www.metafilter.com/151667/Seriously#6146312
I don't know about The Verge, but it seems similar to my experiences with The A.V. Club. If you block ads and scripts, it's a great site, but if you don't, it's just a big ol' bucket o' barf. Every spare pixel is crammed with ads and lowest-common-denominator linkbait, and all kinds of scripts are running doing who knows what.
The solution to all this is not exactly rocket surgery: "Buy a subscription to view our site (or network of sites) without ads or tracking." But I suppose that assumes a site's whole raison d'être isn't merely to generate pages to fill with ads, then attract readers to click the ads; unfortunately, that's exactly what even the most nobly journalistic sites seem to resort to in pathetically short order.
Also, looking at The Verge now in Firefox, I don't at all understand this guy's need to laud its aesthetics. It's fug-pugly and fatally broken. The headline text is literally an inch tall, allcaps, italic, Arial; even with scripts enabled and ads unblocked, none of the images load, none of the articles load, none of the fonts load, none of the ads load. It's just a header bar and a load of headlines linking to nothing. Enabling third-party tracking fixes the ads and the fonts, but the images and articles -- you know, what some might call <em>the actual website</em> -- are still nowhere to be found. If the site doesn't work with every filter turned off, yeah, The Verge's web definitely sucks. What actually is there looks like Hitler and the Daily Mirror had a litter of randomly-coloured gradients. Also He-Man, because their logo is that He-Man font.
posted by Sys Rq at Wed, 29 Jul 2015 21:25:53 -0800
http://www.metafilter.com/151667/Seriously#6146315
The Verge is clickbait. Best to avoid them whenever possible.
posted by a lungful of dragon at Wed, 29 Jul 2015 21:31:44 -0800
http://www.metafilter.com/151667/Seriously#6146321
Tracking scripts and ad networks are the reason the average web page has grown larger every year for ages. It's not the ever-increasing JavaScript libraries or pictures of women laughing at salad, it's just "one more thing" for the folks down the hall so they can get their numbers for a spreadsheet.
There are mechanisms to manage them, but you have to do a bunch of work, and with attrition, who can remember that this one template calls this one JS library (which in turn calls another, which calls another)? Before you know it, your full load time has taken a dive and everybody points fingers at everyone else.
News sites are the absolute worst, though. I think my personal Ghostery record is 85 trackers on one page.
posted by fifteen schnitzengruben is my limit at Wed, 29 Jul 2015 21:34:06 -0800
http://www.metafilter.com/151667/Seriously#6146347
The Verge's Web is unfortunately our (mainstream) Web. Everybody does this shit. Check out a (non-wiktionary) dictionary site.
posted by mrgrimm at Wed, 29 Jul 2015 22:13:52 -0800
http://www.metafilter.com/151667/Seriously#6146350
Also, <a href="https://www.squarefree.com/bookmarklets/zap.html">zaps</a>.
posted by mrgrimm at Wed, 29 Jul 2015 22:17:37 -0800
http://www.metafilter.com/151667/Seriously#6146351
There are major websites that I just can't read on my iPhone, like the <em>Jerusalem Post</em>. It takes forever to load and then there's this weird draggy effect as you scroll. <em>The Guardian</em> is about half as bad: I don't get the dragginess, but I often get page crashes and half the time the front page won't load <strong>at all.</strong> <em>The Toast</em> has these horrible little ads that slide in from the side and obstruct your view. They've fixed them a bit so you can get them to go away, but not always.
I don't read <em>Newsweek</em>, but for some reason when I click on a link it always says that I've read "your five free articles this month, would you like to subscribe?" Haha, no. And then there are the ones that literally fill a quarter of the screen with a stupid header ("BLOOMBERG VIEW") in case you forgot what website it was, and put a strip at the bottom so you can Twatter or Farce it or whatever kids do nowadays. The <strong>actual article</strong> appears in a sort of slit like a letterbox.
Lots of these sites - all except <em>The Guardian</em> in fact - deliberately sabotage Safari's reading mode. Because they know you're using it to avoid the ads. But here's the thing:<em> their sites are literally unreadable without it.</em> Nobody reads a website that's crashed! So why do they do it? Because they can't risk letting anyone avoid their ads, and they can't force their advertisers and metricisers and so forth to make sure that their code works across devices and across webpages and in conjunction with a zillion other random blobs of marketing code. Result, inertia. They would literally rather die than get things right.
And you might think that subscribing would let you avoid this crap. I am here to tell you that I subscribed to <em>The Atlantic</em> just so I could use their iPhone app. And guess what? It has a letterboxed page, to remind you that you're in the app and in case you want to see every one of its little icons and menus. I just counted: there are eleven buttons (in three separate rows) on every page while you're reading an article. These companies are like the Cretaceous dinosaurs weighed down with shells and frills and horns so they can barely move. An extinction event would probably be a mercy.
posted by Joe in Australia at Wed, 29 Jul 2015 22:20:02 -0800
http://www.metafilter.com/151667/Seriously#6146355
<i>Also, looking at The Verge now in Firefox, I don't at all understand this guy's need to laud its aesthetics. It's fug-pugly and fatally broken.</i>
Hi. I'm "this guy". I like how their site looks and the content often entertains me. Your taste & mileage may vary. They have some <a href="http://www.theverge.com/2015/4/2/8285139/max-headroom-oral-history-80s-cyberpunk-interview">good stories sometimes</a>, and their's was one of the first news sites with a responsive layout that wasn't entirely terrible on all my devices. I also appreciate magazine-style sites heavy in visual design, so I was trying to give the benefit of the doubt and be nice - those are also often heavy downloads. The blog post is basically my mental narrative as I realized the page weight had very little to do directly with what I actually enjoy from the site.comment:www.metafilter.com,2015:site.151667-6146355Wed, 29 Jul 2015 22:23:53 -0800deusxBy: scose
http://www.metafilter.com/151667/Seriously#6146358
Yup. After I lost my phone recently, I started using a really old, bottom-of-the-barrel android 2.0 smartphone with a tiny screen and slooow processor. Most sites take forever to load, or straight up don't render. Non-interactive sites whose only goal is to present an article with some images.
Mainstream web design is a wasteland infested with awfulness in every place that could possibly support it.
posted by scose at Wed, 29 Jul 2015 22:24:46 -0800
http://www.metafilter.com/151667/Seriously#6146365
<i>If I'm really only worth $6.20, and that funds the whole web, then someone please explain or link to an explanation why micropayments is such a lousy idea.</i>
In a nutshell, my understanding is that the overhead of micropayments has typically killed them. Transaction fees, administration, cognitive load & annoyance, etc. That said, there are a <a href="https://flattr.com/">few</a> <a href="https://www.google.com/contributor/welcome/">micropayment</a> <a href="https://www.syndicoin.co/">schemes</a> out there.
But folks have learned that the web is free of charge to visit, and most everyone is voting with their wallets. Content creators ask for money; sometimes they get it.
But <a href="http://hackingdistributed.com/2014/12/31/costs-of-micropayments/">getting folks to fork over many payments of small increments is hard</a>.
Content creators put up ads, and the money rolls in way more reliably. Lather, rinse, repeat until you've got at least 7 different ad networks running realtime auctions on your eyeballs.
posted by deusx at Wed, 29 Jul 2015 22:36:58 -0800
http://www.metafilter.com/151667/Seriously#6146423
Are we missing something important here?
This certainly sucks wet farts out of dead pigeons. It's another aspect of the content-as-steg-payload thinking which also mucks things up for VI (visually impaired) people. But surely it's also fixable really quite simply through search.
Most traffic on the Web - that precious juice which drives everything - lands on sites through Google. So, build a search that downranks sites for bloat, or tracker density, or whichever metric most offends the senses. The result would be a superfast Web experience for users, and one that would heavily encourage the swift generation of new networks of content provision that aren't dependent on this mutant growth. Revenue... well, worry about that later, in the traditional manner. It's not as if what we've got now is doing much other than drive the mutant ecosystem.
Google can't do it, because it would cause regulatory implosion.
So not only is there a conceptually simple answer, it creates a new environment where the old guard can't follow.
posted by Devonian at Thu, 30 Jul 2015 02:20:02 -0800
http://www.metafilter.com/151667/Seriously#6146436
Wow, this explains a lot. For the past month we've had no internet at home and have tethered to our phones. Despite turning off all updates, avoiding videos and music, and basically just browsing text-based sites for a couple of hours a day, we have been chewing through our monthly phone data allowance (1.5GB each) in less than a week. I couldn't see how text could be doing that, but it's true that on mobile (phone and iPad) I'm not using an adblocker. Grrr.
posted by lollusc at Thu, 30 Jul 2015 03:14:46 -0800
http://www.metafilter.com/151667/Seriously#6146439
How efficient are the ad downloads? I know that altogether they're enormous, but is that a large number of tiny, optimised scripts, or a reasonable number of bloated buggy bullshit scripts with uncompressed images? I'm wondering if there's any payoff in putting pressure on the ad networks to get more efficient.
posted by harriet vane at Thu, 30 Jul 2015 03:22:05 -0800
http://www.metafilter.com/151667/Seriously#6146444
I mostly read The Verge through NewsBlur which just gives me the text and lets me blow past the 4/5 of the posts that either I don't care about or just repeat stuff from other posts.
posted by octothorpe at Thu, 30 Jul 2015 03:40:37 -0800
http://www.metafilter.com/151667/Seriously#6146479
Coincidentally, I was <a href="https://www.tumblr.com/blog/clvrmnky">just bitching about this exact thing</a> now that I mostly use a mobile browser.
But even regular browsers on modest lappies can barely render some sites these days. The madness has to stop.
posted by clvrmnky at Thu, 30 Jul 2015 05:25:21 -0800
http://www.metafilter.com/151667/Seriously#6146492
<em>Then add the social sharing widgets, which everyone accepts are necessary because such a huge slice of everyone's traffic comes straight from Twitter/Facebook/Reddit...</em>
The annoying thing is you can do this without all the external javascripts.
posted by srboisvert at Thu, 30 Jul 2015 05:42:21 -0800
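(Concretely: the major share endpoints are plain URLs, so script-free share links are just ordinary anchors. A sketch, with example.com standing in for the article being shared:)
<pre><a href="https://twitter.com/intent/tweet?url=http%3A%2F%2Fexample.com%2Farticle">Tweet</a>
<a href="https://www.facebook.com/sharer/sharer.php?u=http%3A%2F%2Fexample.com%2Farticle">Share on Facebook</a>
<a href="http://www.reddit.com/submit?url=http%3A%2F%2Fexample.com%2Farticle">Post to reddit</a></pre>
No SDK download, no tracking iframe; you lose the live share counters, but the links work even with JavaScript disabled.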
http://www.metafilter.com/151667/Seriously#6146499
Devonian, I've always heard that Google <em>does</em> penalize sites for slow load times.
But maybe that's not accurate, because how then to explain the success of 9 MB <strike>articles</strike> tracker delivery systems?
posted by escape from the potato planet at Thu, 30 Jul 2015 05:53:22 -0800
http://www.metafilter.com/151667/Seriously#6146532
Google doesn't necessarily get served the same page that viewers do; in fact I presume it doesn't, since it's totally standard to serve different versions of a page to different browsers. Also, Google is smart enough to not download junk it doesn't need to see.
posted by Joe in Australia at Thu, 30 Jul 2015 06:27:01 -0800
http://www.metafilter.com/151667/Seriously#6146561
I feel forced to use ad blockers just to stop these sites from crashing all the time.
posted by humanfont at Thu, 30 Jul 2015 07:03:39 -0800
http://www.metafilter.com/151667/Seriously#6146584
<a href="/151667/Seriously#6146561">humanfont</a>: "<i>I feel forced to use as blockers just to stop these sites from crashing all the time.</i>"
I can only read my local newspaper's site reliably on a desktop with ad-block as it consistently manages to crash chrome on Android. I'm not sure that I understand the business plan of no actually letting me access their site.comment:www.metafilter.com,2015:site.151667-6146584Thu, 30 Jul 2015 07:23:37 -0800octothorpeBy: Ferreous
http://www.metafilter.com/151667/Seriously#6146588
Someone wake me when the trend for annoying background scrolling with images that fade in and out on longform stories dies off.
posted by Ferreous at Thu, 30 Jul 2015 07:24:15 -0800
http://www.metafilter.com/151667/Seriously#6146601
This article has hit a nerve; I've been seeing discussion all over the place. I think it's because it connects two topics that make users angry: constant surveillance on the Internet combined with gratuitously slow web pages. A 7MB web page takes literally 54 seconds to load on my rural Internet connection. And 52 seconds of that are to load software that is literally spyware, code that is mildly to significantly harmful to my privacy.
As a software guy myself, what annoys me most is how poorly implemented it all is. 7MB of Javascript code is ridiculous. When I load the page today it's 5.4MB (and counting; some ad code keeps tickling my network every 30 seconds.) But FWIW most of the large objects on that page have nothing to do with ad trackers. The single biggest one, for example, is an HTML5 video player. That's supposed to be a good thing, better than Flash video, but the need for 329kb of code to make it work is not a ringing endorsement. The real code problem with those 20 ad trackers is that it implies 20 new HTTP connections. Also 20 different places whose security you're trusting. Regularly, ad sites get subverted and end up serving straight-up malware.
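(You can see that connection fan-out for yourself from the browser console; a rough sketch using the Resource Timing API, counting the distinct hosts a page pulled resources from:)
<pre><script>
// Sketch: count how many distinct hosts this page fetched resources from.
var hosts = {};
performance.getEntriesByType('resource').forEach(function (entry) {
  hosts[new URL(entry.name).hostname] = true;
});
console.log(Object.keys(hosts).length + ' distinct hosts contacted');
</script></pre>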
I couldn't use the Web anymore without uBlock Origin blocking ads. And I run Ghostery in blocking mode, although that's a struggle; it breaks various websites daily. Not because the "Like me on Facebook!" page is integral to the site's content, of course, but because the knucklehead programmer didn't make his page code robust, and the presence of a Javascript error loading (say) Facebook ends up breaking the Javascript code doing something elsewhere on the page. And so I see the "Submit" button doesn't work, and I reload without Ghostery blocking. Every day, it's tedious.
posted by Nelson at Thu, 30 Jul 2015 07:29:54 -0800
http://www.metafilter.com/151667/Seriously#6146622
With the move by browsers to block trackers and other more egregious silliness, I wonder if this is a problem that will go the way of pop-up ads. Remember those? Remember how we used to have plugins to disable them, then the browsers just implemented that functionality, and now you hardly ever see them?
posted by selenized at Thu, 30 Jul 2015 07:36:17 -0800
http://www.metafilter.com/151667/Seriously#6146645
The problem is popups were relatively easy to stop. Also everyone hated them, even the developers. And we still do have popups anyway, those giant page-covering interstitials that show up on the first visit to a site. They're mostly being used to upsell you to a mobile app or to get your email address, but they're still popup ads.
The popup-blocking equivalent for trackers is blocking third-party HTTP cookies. There never was a good reason to allow them in the first place, and now that major browsers are moving to block them, user privacy gets a little better. OTOH that blocking is also partially responsible for the javascript proliferation; instead of a simple cookie, now sites are doing more devious things to plant third-party ad trackers in their web pages, things that take a little more code to download and run. It's an arms race the consumers are losing.
posted by Nelson at Thu, 30 Jul 2015 07:45:45 -0800
http://www.metafilter.com/151667/Seriously#6146652
(Also thank you Metafilter for being a relatively friendly site. This page itself is 237kb, or 3% the size of that Verge article. It has a few trackers: Google Analytics, Quantcast, ChartBeat, but those are relatively harmless compared to a lot of the crap we see daily. And a couple of ads of course, particularly if you're not logged in, but the entire page is 17 HTTP requests and only like 5 of those are in service of advertising. That's not counting the ping tracker which is tracking page dwell time though, a few bytes every 15 seconds.)
posted by Nelson at Thu, 30 Jul 2015 07:49:17 -0800
http://www.metafilter.com/151667/Seriously#6146669
Just incidentally, if you use Firefox for Android, you can install adblocking plugins. And apparently, iOS 9 is going to be allowing adblocking extensions for Safari, so there's hope for iFolks as well.
posted by gilrain at Thu, 30 Jul 2015 07:56:06 -0800
http://www.metafilter.com/151667/Seriously#6146691
<em>Google doesn't necessarily get served the same page that viewers do; in fact I presume it doesn't, since it's totally standard to serve different versions of a page to different browsers.</em>
This is far from "totally standard"—in fact, I've been building websites for 15 years, and I have almost <em>never</em> encountered this. The closest technique that's ever been commonly used (well, aside from IE's conditional comments, which are on their way out) is to host a desktop-optimized site at www.example.com, and a mobile-optimized site at m.example.com. And that isn't serving different pages to different browsers—that's <em>two different sites</em> which happen to contain similar content.
Anyway, serving different content to Google's bots would be a transparent attempt to game their algorithms, and probably a good way to get your site's ranking penalized.
The dominant approach these days is <a href="https://en.wikipedia.org/wiki/Responsive_web_design">responsive design</a>, where the entire <em>point</em> is to serve the same code to all clients (so the site owners don't have to build ten different versions of the site for all the myriad devices that people use to browse the web these days).
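(The core mechanism is just CSS media queries over one set of markup; a minimal sketch:)
<pre><style>
/* One page, one codebase: the layout adapts to the viewport width. */
.sidebar { float: right; width: 300px; }
@media (max-width: 600px) {
  /* On narrow screens the sidebar stacks below the main content. */
  .sidebar { float: none; width: auto; }
}
</style></pre>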
<em>Also, Google is smart enough to not download junk it doesn't need to see.</em>
Of course. And we don't know all the intricacies of Google's crawler, but <em>Google themselves</em> publicly advise that it <em>does</em> <a href="http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html">download and execute JavaScript</a>. This is practically a necessity if they're going to index today's heavily JavaScript-dependent web.
(Google also downloads and renders CSS and images—<em>e.g.</em>, so site owners can't keyword-spam by putting white text on a white background. Search engines do a lot more than just scraping HTML these days.)
posted by escape from the potato planet at Thu, 30 Jul 2015 08:08:48 -0800
http://www.metafilter.com/151667/Seriously#6146695
If you're paying for data, not blocking ads on your mobile device is madness.
posted by straight at Thu, 30 Jul 2015 08:11:16 -0800
http://www.metafilter.com/151667/Seriously#6146701
<em>Google can't do it, because it would cause regulatory implosion.</em>
<a href="https://www.doubleclickbygoogle.com/solutions/digital-marketing/ad-exchange/">And Google runs an ad exchange, which is partially the cause of all of the bloat.</a> It sounds easy to say publishers shouldn't use programmatic ad services, but they need display ads for revenue. Subscriptions don't cut it because most people won't pay for content.
posted by missmerrymack at Thu, 30 Jul 2015 08:15:00 -0800
http://www.metafilter.com/151667/Seriously#6146808
<em>really fixing the problem requires that technologists and designers engage with the business models and business needs driving these decisions</em>
Really fixing the problem client-side is super-simple. Just run Adblock Plus and turn the entire revolting industry <em>off</em>. Done.
posted by flabdablet at Thu, 30 Jul 2015 09:15:22 -0800
http://www.metafilter.com/151667/Seriously#6146823
To restate from above: <em>Mobile Firefox can run uBlock Origin, Ghostery, etc.</em>
If you're having problems with data consumption, pages crashing, etc., it's an amazing solution. It's replaced the default Android browser for me ever since so many pages started getting out of control. Rural coverage here in southwestern Virginia (2.5G EDGE @ ~150 kbit/sec) means most of the Internet would be unusable without it. (Props to the folks at Mozilla for maintaining their infrastructure such that most plugins are available on the mobile version.)
posted by introp at Thu, 30 Jul 2015 09:22:57 -0800
http://www.metafilter.com/151667/Seriously#6146833
Why the web sucks in '15: It's the money.
posted by Artw at Thu, 30 Jul 2015 09:30:58 -0800
http://www.metafilter.com/151667/Seriously#6146838
The original Verge article seems to boil down to "phones should have more memory" and "content stacks are potentially stifling", neither of which is all that untrue either.
posted by Artw at Thu, 30 Jul 2015 09:35:40 -0800
http://www.metafilter.com/151667/Seriously#6146849
Yeah, the perverse thing about phone browsing is that people expect it to be as fast as the desktop when:
* phones have limited memory and slower processors
* network latency on mobile is ridiculous
* ubiquitous ad networks are not optimized for anything other than enormous banner ads and tracking scripts
You CAN get a time-to-glass of 1 second on a page, but the amount of hoop-jumping you have to do is pretty intense. For example, reducing round trips between the phone and the server by UUencoding images into the first request, that sort of thing.
posted by fifteen schnitzengruben is my limit at Thu, 30 Jul 2015 09:46:32 -0800
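(The modern form of that inlining trick is a base64 data URI rather than uuencode proper: the image bytes ride along inside the HTML instead of costing another round trip. A sketch — the base64 here is the classic 1×1 transparent GIF:)
<pre><img alt="" width="1" height="1"
     src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"></pre>
The tradeoff is that inlined assets can't be cached separately from the page, so it only pays off for small images on the critical path.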
http://www.metafilter.com/151667/Seriously#6146853
My thinking is: "if it won't load on an iPad, you should fire your web team."
If it doesn't work on mobile it doesn't work, period. And not the newest version of everything either. I should not need a brand new device with gigabytes of memory to read a listicle. Test on the OLDEST system capable of running current software - if it fails there your site is bad and you should feel bad.
posted by caution live frogs at Thu, 30 Jul 2015 09:49:28 -0800
http://www.metafilter.com/151667/Seriously#6146868
I think the linked article goes very well with <a href="http://idlewords.com/talks/web_design_first_100_years.htm">this one</a>, previously posted on MetaFilter, which explains why the author doesn't foresee much change in the web in the next few decades.
In particular, things like The Verge's design speak to the vision for the web that companies like Facebook and Google have, which is a non-democratic one of their own design. They dictate what is best for the proles, even if we don't want it their way. They know best.
Currently I'm a web designer, not a great one and a little behind the times. I come from a fine arts background and this is the way I make ends meet. Initially I was all about bumbling around making very image-intensive sites that I thought looked cool. Now, as I'm learning more about designing responsively and heavily considering the mobile experience, I have done a 180 and just want to pare everything down as far as possible. I want everything clean and minimal, and to keep any design flourishes to an absolute minimum. I think it's counterintuitive for most creative people to want to tamp down the big ideas, but they need to be channeled in better ways if a site's design is to be coherent and functional before all else.
posted by picea at Thu, 30 Jul 2015 09:57:35 -0800
http://www.metafilter.com/151667/Seriously#6146942
caution live frogs, I agree if we're talking about, say, a typical article on a magazine site—something where the main *substance* of the page is text, maybe with a few images.
But the web is capable of a lot more than that these days, and certain applications of the web's technology stack simply require more horsepower. Those pages are *always* going to be slower when you're pushing them down a smaller pipe, to a smaller processor. And I don't think we should *not build* those pages simply because they're a bit clunky on less powerful devices. I mean, there are still people out there browsing the web on old Windows XP desktops—and much of the modern web is clunky or broken for them, too. If we wait for old devices to go completely extinct before we exploit new possibilities, we'll never get anywhere.
That said, there is a *ton* that could be done to improve the performance of typical glossy-magazine-article-type pages. I was amazed how much faster the web got after I installed Ghostery.
I recently stumbled across a site I built for a friend 14 years ago. It's still online. It has no scripts whatsoever. One stylesheet. A handful of small images. And pages load *blazingly* fast. It feels like a single-page app.
posted by escape from the potato planet at Thu, 30 Jul 2015 10:35:33 -0800
http://www.metafilter.com/151667/Seriously#6146982
I usually fetch web pages from other sites by sending mail to a program (see git://git.gnu.org/womb/hacks.git) that fetches them, much like wget, and then mails them back to me. Then I look at them using a web browser, unless it is easy to see the text in the HTML page directly. I usually try lynx first, then a graphical browser if the page needs it (using konqueror, which won't fetch from other sites in such a situation).
posted by entropicamericana at Thu, 30 Jul 2015 11:01:12 -0800
http://www.metafilter.com/151667/Seriously#6146985
A comment here turned me on to this FF extension: <a href="https://github.com/gorhill/uMatrix">uMatrix</a>, which I use with a great deal of pleasure. I like the fine controls for blocking/unblocking, and how you can see the number of requests blocked/not blocked. <a href="http://i.imgur.com/xo75YL7.png">Here is metafilter</a>, for example. You can click directly in the matrix to block/unblock items.
posted by maxwelton at Thu, 30 Jul 2015 11:02:54 -0800
http://www.metafilter.com/151667/Seriously#6146986
You're looking at this all wrong. For sites like The Verge, the tiny part of the page that is the text of the article isn't the application of the web's technology stack. The article is the cost center, the news hole, the bait to reel in the eyeballs. The web product is all this shitty Javascript and ads and trackers and cookies and image beacons and Flash cookies.
posted by Nelson at Thu, 30 Jul 2015 11:03:15 -0800
http://www.metafilter.com/151667/Seriously#6147286
<a href="https://twitter.com/xbs/status/626781529054834688">Average web page size is approaching the size of a Doom install.</a>
A friend of mine pointed out that the Boston Globe <a href="http://yellowlab.tools/result/e3htgnas57/rule/jQueryVersionsLoaded">loads SIX versions of jQuery</a>. He mentioned this months and months ago, and apparently, no one's going to address that.
posted by ignignokt at Thu, 30 Jul 2015 13:12:28 -0800
http://www.metafilter.com/151667/Seriously#6147292
And to be fair, I bet the developers working on the Globe know very well that it's horrible to load six versions of jQuery. Quite possibly, the pages are sectioned off, each governed by a different business unit with its own competing incentives, and no one's gonna give for the greater good of the overall experience.
posted by ignignokt at Thu, 30 Jul 2015 13:16:12 -0800
http://www.metafilter.com/151667/Seriously#6147493
Well, there's two things here - The Verge could have the best, most mobile-optimized code in the world and still struggle under the weight of the ads and tracking there.
posted by Artw at Thu, 30 Jul 2015 15:18:33 -0800
http://www.metafilter.com/151667/Seriously#6147502
Wait, are all these jQueries coming in with the ads?
posted by Artw at Thu, 30 Jul 2015 15:22:23 -0800
http://www.metafilter.com/151667/Seriously#6147507
<i>And to be fair, I bet the developers working on the Globe know very well that it's horrible to load six versions of jQuery.</i>
That doesn't make it better, that makes it worse.
posted by enn at Thu, 30 Jul 2015 15:25:22 -0800
http://www.metafilter.com/151667/Seriously#6147513
I am putting analytics crap on a page RIGHT NOW. The pain...
posted by Artw at Thu, 30 Jul 2015 15:27:02 -0800
http://www.metafilter.com/151667/Seriously#6147547
Modern browsers are effectively a miniature operating system, and these web pages are like system images with a bunch of interacting programs loaded from and reporting to different addresses. We wouldn't place similar demands on <em>real</em> operating systems: the separate programs would be sandboxed and restricted to their own windows. It's pretty amazing that our browsers are not just expected to load and run these web pages, but display them in a consistent and aesthetically-pleasing way.
posted by Joe in Australia at Thu, 30 Jul 2015 15:54:07 -0800
http://www.metafilter.com/151667/Seriously#6147628
Timely topic for me -- I'm right at this moment in the early stages of building a Thing that is a response to the kind of issues people are talking about here, with an eye to helping out site users, small publishers, and advertisers, all with a social responsibility aspect built in. I'm terrified and overwhelmed, a bit, because I've never actually started a business <em>qua</em> business before, but it's very exciting to be planning it.
posted by stavrosthewonderchicken at Thu, 30 Jul 2015 17:21:41 -0800
http://www.metafilter.com/151667/Seriously#6147632
<b>Artw</b>, I don't know for sure that they mostly come from ads, but here is a little thing I ran in one of their news article pages:
<code>> document.querySelectorAll('iframe').length
43</code>
<b>enn</b>, it is worse, but I don't blame the developers. They probably want to coordinate and eliminate craziness like that. But probably can't.
posted by ignignokt at Thu, 30 Jul 2015 17:30:50 -0800
http://www.metafilter.com/151667/Seriously#6147633
This is what we get for rejecting Flash...
posted by Artw at Thu, 30 Jul 2015 17:32:08 -0800
http://www.metafilter.com/151667/Seriously#6147645
<i>This is what we get for rejecting Flash...</i>
Literally the lesser of two evils?
posted by Talez at Thu, 30 Jul 2015 17:54:04 -0800
http://www.metafilter.com/151667/Seriously#6147725
<a href="/151667/Seriously#6146601">Nelson:</a>
<blockquote>
But FWIW most of the large objects on that page have nothing to do with ad trackers. The single biggest one, for example, is an HTML5 video player. That's supposed to be a good thing, better than Flash video, but the need for 329kb of code to make it work is not a ringing endorsement.
</blockquote>
This is a great example of how long developers' earliest learning experiences persist even after the environment has changed radically. This is what an HTML5 video player requires:
<pre><video controls><source type="video/mp4" src="path/to/something.mp4"></video></pre>
That works in <a href="http://caniuse.com/#feat=mpeg4">90% of browsers</a> – i.e. comfortably more than Flash – and it loads quickly, has <a href="https://developer.mozilla.org/en-US/docs/Web/HTML/Element/track">an easy path for subtitles</a>, <a href="https://dev.opera.com/articles/html5-video-flash-fallback-custom-controls/">has a painless fallback mechanism</a>, and the native browser widgets look good and perform well. (Accessibility for the browsers' built-in controls is <a href="http://terrillthompson.com/blog/635">mixed but improving</a>, and since many of the JavaScript/Flash-based players are worse, it really requires a specific comparison to say whether it's a net win or loss vs. adding your own buttons with accessible markup.)
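(Even with subtitles and a fallback, the no-library version stays tiny; a sketch, with the file paths as placeholders:)
<pre><video controls>
  <source type="video/mp4" src="path/to/something.mp4">
  <!-- subtitles via the native track element -->
  <track kind="subtitles" srclang="en" label="English" src="path/to/something.vtt">
  <!-- rendered only by browsers that can't play HTML5 video -->
  <p><a href="path/to/something.mp4">Download the video</a> instead.</p>
</video></pre>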
The catch, of course, is that a generation of developers learned that video is hard and requires a barge-sized external library and never asked whether the change should be more than switching from encoding <code>.flv</code> files to <code>.mp4</code>. I still see sites where they use Flash first and rely on JavaScript to enable a native HTML5 player only after the Flash version fails to load. On far too many sites, the video will fail to play at all when JavaScript is disabled, hasn't completely loaded yet, or when their analytics tracking service fails to record your click correctly.
posted by adamsc at Thu, 30 Jul 2015 19:16:24 -0800
http://www.metafilter.com/151667/Seriously#6147787
<em> The catch, of course, is that a generation of developers learned that video is hard and requires a barge-sized external library and never asked whether the change should be more than switching from encoding .flv files to .mp4. </em>
Well... sort of. For simple playback of stored assets, that's fine. What gets complicated is when (stop me if this is familiar) you actually need to monetize the content that you're publishing. Then, you end up wanting things like pre-roll ads, metrics and tracking to monitor how many people see which ads, content protection to make sure that people don't just yoink your videos and post them to bittorrent... And if you care about time on site, you want to make sure that your video player supports playlists and recommendations once a video is finished.
And don't even get me started on what it takes to build the online presence for an actual cable television station. You'd think that (say) MSNBC would actually have the right to throw its video feed up wherever it wants to, but as it turns out? Nope. The abomination that is Adobe Pass is no one's friend, but for the moment it's the best we've got.
I know, I know. Ads make everything terrible. Monetizing things destroys what makes them wonderful. I <em>do</em> agree that browser-level support for HTML5 media is fantastic these days, but it's also sadly the extra stuff that really bogs things down. Showing a static ad is easy, too—just an image tag! But all the additional crap that comes along for the ride is where things turn into a horror show.
Despite what Nelson says, I don't think that the people who make these publications consider the actual content they produce to be a "cost center", any more than Toyota's designers and engineers consider the factory a "cost center." Folks here get just as het up when they discover that businesses don't pay their writers well, or that they try to lowball web developer salaries; that's the other side of the publishing coin. For better or worse, ad revenue has historically been a much more reliable and consistent way of getting those people paid than any alternatives.
We can do it better, much better, but for the foreseeable future it's a question of doing ads and metrics better rather than not doing them at all. Either that, or accepting that we will have a "professional content" ecosystem that is orders of magnitude smaller than today's.
posted by verb at Thu, 30 Jul 2015 20:51:23 -0800
http://www.metafilter.com/151667/Seriously#6147887
<em>six versions of jQuery</em>
or quite likely six copies of the <em>same</em> version, just pulled from different CDNs.
posted by flabdablet at Fri, 31 Jul 2015 00:23:10 -0800
http://www.metafilter.com/151667/Seriously#6147924
For quite a while I've wished there were an extension that handled redirecting all requests for jquery to a locally stored version, as well as ga.js. That would make me feel a lot better about browsing, and when I experimented with using a proxomitron-type setup to pull them from a local webserver, it did seem to speed up the web quite a bit. I'm not quite that clever of a coder or I would have made something better by now.
posted by mcrandello at Fri, 31 Jul 2015 03:05:10 -0800
http://www.metafilter.com/151667/Seriously#6147940
<em>I'm right at this moment in the early stages of building a Thing that is a response to the kind of issues people are talking about here...</em>
Godspeed, stav. Let us know when you need beta testers.
posted by harriet vane at Fri, 31 Jul 2015 04:07:45 -0800
http://www.metafilter.com/151667/Seriously#6147963
It's going to be really interesting to watch what happens after Apple launches ad-blocking in mobile Safari this autumn. It's going to be one-click to a better web, and I think they're going to see huge uptake.
So how will publishers respond? More sponsored editorial? Harder-to-block ads served from the same domain as the content? Limiting site functionality if you have adblock enabled (an ad-wall equivalent of the paywall)? Will they start dumping the web and moving more to native apps?
I don't see them taking it lying down, that's for sure. But at least we'll get a few months of blissful fast mobile browsing again.
posted by bonaldi at Fri, 31 Jul 2015 05:05:16 -0800
http://www.metafilter.com/151667/Seriously#6148257
<a href="/151667/Seriously#6147787">Verb</a>:
<blockquote>
Well... sort of. For simple playback of stored assets, that's fine. What gets complicated is when (stop me if this is familiar) you actually need to monetize the content that you're publishing. Then, you end up wanting things like pre-roll ads, metrics and tracking to monitor how many people see which ads, content protection to make sure that people don't just yoink your videos and post them to bittorrent... And if you care about time on site, you want to make sure that your video player supports playlists and recommendations once a video is finished.
</blockquote>
That's certainly true for many sites but even there it's something of a separate issue from the question of whether you need hundreds of KB of JavaScript. Tracking metrics is just a matter of listening to events; mandatory ads do require JavaScript, but again the job can be done with at least an order of magnitude less.
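(A sketch of the listening-to-events point: the native media events are already there, and reporting them takes a few lines. The /metrics endpoint is hypothetical, and sendBeacon assumes a reasonably current browser:)
<pre><script>
// Sketch: report playback events without a player library.
var video = document.querySelector('video');
['play', 'pause', 'ended'].forEach(function (type) {
  video.addEventListener(type, function () {
    // sendBeacon queues the report without blocking playback or unload.
    navigator.sendBeacon('/metrics', JSON.stringify({
      event: type,
      position: video.currentTime
    }));
  });
});
</script></pre>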
Advertising and DRM requirements certainly do exact an unavoidable tax but I think the biggest problem is simply that the user experience has been considered a minor frill. Until executives realize that poor performance and usability also have a cost, any pressure to do better will be much weaker than the endless demand for new features.
posted by adamsc at Fri, 31 Jul 2015 09:31:22 -0800
http://www.metafilter.com/151667/Seriously#6149723
<em>For quite a while I've wished there were an extension that handled redirecting all requests for jquery to a locally stored version, as well as ga.js.</em>
What we actually need is to get serious about constructing content-addressable distribution infrastructure.
At the very minimum, consistent inclusion of an X-Content-URN header as proposed <a href="https://lists.w3.org/Archives/Public/www-talk/2001NovDec/0090.html">here</a> would allow browsers to recognize content that already existed in their caches from HTTP response headers alone, allowing them to avoid redundant body downloads.
Obviously it would be even better if it became normal practice for static content linked from web pages to be referenced via something like urn:sha1:e5wip7z6dyyvkz44ggetrz2olqnxnwaj instead of http://some.random.cdn/jquery-1.11.3.min.js. This would allow the browser to fulfill such content directly from cache, if it already had it, without a round-trip to <em>any</em> server.
Making this work well would require assistance from network infrastructure similar in purpose to DNS, for resolving hash-based URNs to lists of URLs; in fact it might even be feasible to press DNS itself into service for this.
posted by flabdablet at Sat, 01 Aug 2015 10:36:03 -0800
http://www.metafilter.com/151667/Seriously#6149978
flabdablet: this is most likely to happen using the <a href="https://w3c.github.io/webappsec/specs/subresourceintegrity/">Subresource Integrity</a> spec, which provides a standard way to list strong hashes for the contents of a link so that, e.g., your visitors would be protected against an outside CDN being compromised. It <a href="https://code.google.com/p/chromium/issues/detail?id=355467">will ship in Chrome 45</a> and is <a href="https://bugzilla.mozilla.org/show_bug.cgi?id=992096">underway in Firefox</a>.
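(In markup it's just an attribute on the existing tag; a sketch, where the hash is a placeholder rather than the real digest of that file:)
<pre><script src="https://some.random.cdn/jquery-1.11.3.min.js"
        integrity="sha384-PLACEHOLDERBASE64DIGESTGOESHERE"
        crossorigin="anonymous"></script></pre>
The browser fetches the file, hashes it, and refuses to execute it if the digest doesn't match.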
Various Google & Mozilla developers have talked about using this for shared caching but <a href="http://www.w3.org/TR/SRI/#security-considerations">the security considerations have led them to defer implementation</a>.
posted by adamsc at Sat, 01 Aug 2015 16:38:48 -0800
http://www.metafilter.com/151667/Seriously#6150444
I completely fail to see how using a shared copy of data with a shared hash value has anything to do with security. The security considerations you link all have to do with maintaining the integrity of the hash values included <em>in the links</em>; the ones associated with the returned data are completely determined by that data, are therefore not subject to spoofing, and should be completely safe to use for de-duping.
Do you have any links to dev discussions on this? I'd be interested in their reasoning.
posted by flabdablet at Sun, 02 Aug 2015 08:09:38 -0800
http://www.metafilter.com/151667/Seriously#6152755
It's implied by the <a href="http://www.w3.org/TR/SRI/#hash-collision-attacks">hash collision attacks</a> section, which really comes down to preventing <a href="https://www.owasp.org/index.php/Cache_Poisoning">cache poisoning</a>.
Basically, imagine what happens if this catches on: everyone starts using CDNs with e.g. jQuery v1.2.3 and a known hash, and the web is nicely faster, particularly for mobile users, so plenty of sites start doing this. Then a few years later, someone comes up with a way to generate a collision (more technically, a <a href="https://en.wikipedia.org/wiki/Preimage_attack">preimage attack</a>) such that they can generate a malware script with the same hash and then try to get users to visit a site so anyone whose browser didn't happen to have it cached already will store the malware and subsequently inject it into legitimate sites which reference the real code.
It's not a high-probability attack – and anyone who can do it has many other fun angles to try, like compromising TLS or the code-signing systems Microsoft, Apple, etc. use – but it'd be hard to recover from since it'd realistically require a browser update to disable this feature if an attack was discovered in the wild. What they have now – strongly recommending against use of the weaker hash functions – is probably enough but it wouldn't surprise me if this eventually shipped with a requirement to e.g. specify multiple hashes from different families just in case.
Google is failing me trying to find some of the threads I've [sporadically] followed but it's a surprisingly noisy topic. I'll update if I find the thread again.comment:www.metafilter.com,2015:site.151667-6152755Mon, 03 Aug 2015 20:27:26 -0800adamscBy: flabdablet
http://www.metafilter.com/151667/Seriously#6152772
<em>it'd be hard to recover from since it'd realistically require a browser update to disable this feature if an attack was discovered in the wild.</em>
Nope. It would require that web pages relying on compromised shared resources got updated to request those resources using a different, stronger hash.
Secure hash functions are under constant, active attack from within the cryptography community. Whenever somebody finds a way to reduce the time required to crack one from a fuctillion years to half a fuctillion, it makes the news. The idea that some hidden black hat is going to be able to bypass all that analysis at a stroke and release a previously unsuspected preimage attack - especially one whose plaintext resembles usefully executable malware - is ludicrous. There will <em>always</em> be plenty of time to deprecate hash functions that start to display signs of weakness.comment:www.metafilter.com,2015:site.151667-6152772Mon, 03 Aug 2015 20:47:02 -0800flabdabletBy: [insert clever name here]
http://www.metafilter.com/151667/Seriously#6152821
Let's not forget politics that lead to shitty coding/loading behavior. I've been on teams more than once where devs didn't like the work they were tasked with, and made it perform as badly as possible in hopes of the code getting pulled. Sometimes that works; sometimes it doesn't. It's frequently aimed at marketing departments, on the assumption that they don't know enough to call the bluff. One instance was eons ago, when the CEO demanded the dev team drop everything to implement a Facebook like button (back when likes went into the liker's news feed). And so they did, but only after explaining how it would slow the page load down. I don't remember the specifics, other than that they put the JavaScript call inline and shipped it to production. It borked loading badly, because at the time the scripts from fb ran at a snail's pace. But it was a business initiative, so it stayed a stalemate until I pointed out we could load the JavaScript at the end of the page or, even better, lazy load it. We did, and I was persona non grata for pointing out an easy workaround.
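The workaround is roughly this – an illustrative sketch, not the actual code from that job, with Facebook's current SDK URL standing in for whatever we loaded back then:

<script>
  // defer the third-party widget until the page itself has finished
  // loading, so a slow remote server can't block rendering
  window.addEventListener('load', function () {
    var s = document.createElement('script');
    s.src = 'https://connect.facebook.net/en_US/sdk.js';
    s.async = true;
    document.body.appendChild(s);
  });
</script>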
I am certain there are many instances like this that don't get caught, because that wasn't the only petty dev team I worked with. And I find it hard to believe I was the only one privy to these shenanigans.comment:www.metafilter.com,2015:site.151667-6152821Mon, 03 Aug 2015 21:47:34 -0800[insert clever name here]By: flabdablet
http://www.metafilter.com/151667/Seriously#6153344
Oh yeah, another thing: a suddenly successful method for mounting pre-image attacks against SHA hashes would enable the <a href="https://blog.mozilla.org/security/2014/09/23/phasing-out-certificates-with-sha-1-based-signature-algorithms/">spoofing of SSL certificates</a>, which opens <em>many</em> more ways to monetize the attack than fartarsing around with substituting malware for shared resources. That <em>really would</em> be problematic to respond to, and yet we don't find people refusing to implement SSL in browsers because of "security considerations".comment:www.metafilter.com,2015:site.151667-6153344Tue, 04 Aug 2015 09:43:06 -0800flabdabletBy: Joe in Australia
http://www.metafilter.com/151667/Seriously#6154161
Is that necessarily the case? SSL certificates have a (semi-)fixed format. You might have a way to pad messages in such a way that you create a collision, but not have the ability to create arbitrary collisions. You could do it for source and object files, since they have a lot of room for garbage, but your chances of being able to fit a padded message into the SSL certificate format would still be remote.comment:www.metafilter.com,2015:site.151667-6154161Tue, 04 Aug 2015 16:49:03 -0800Joe in AustraliaBy: adamsc
http://www.metafilter.com/151667/Seriously#6154267
<a href="/151667/Seriously#6152772">flabdablet</a>:
<blockquote>
<blockquote>it'd be hard to recover from since it'd realistically require a browser update to disable this feature if an attack was discovered in the wild.
</blockquote>
Nope. It would require that web pages relying on compromised shared resources got updated to request those resources using a different, stronger hash.
</blockquote>
Note that I said "realistically" – I considered that option, but I think it's unrealistic to assume you could fix an active exploit by getting a bunch of otherwise unconnected sites to update their deployed pages. To use the most obvious example, suppose the jQuery, Google Analytics or Fonts, Bootstrap, React, etc. CDN offerings started including the integrity attribute in their code samples – how long would it take for that change to circulate across every site? You'd get the major players fairly quickly, but simply finding everyone to contact would be a non-trivial problem, much less getting prompt releases pushed through everywhere.comment:www.metafilter.com,2015:site.151667-6154267Tue, 04 Aug 2015 18:03:26 -0800adamscBy: adamsc
http://www.metafilter.com/151667/Seriously#6154284
<blockquote>
Oh yeah, another thing: a suddenly successful method for mounting pre-image attacks against SHA hashes would enable the spoofing of SSL certificates, which opens many more ways to monetize the attack than fartarsing around with substituting malware for shared resources.
</blockquote>
That's true but it's not quite the same: you still wouldn't have the ability to affect other sites without also being able to conduct a MITM attack. Shared cache pollution would allow large-scale attacks with no other access, which is different in potentially interesting ways.
Again, I haven't read anyone saying not to do it, only that they wanted to really think the implementation through first. Given how many "harmless" features in SSL/TLS have turned out to be exploitable by someone sufficiently creative, that seems reasonable – and given how much work went into the current implementation, it's not surprising that they shipped the core feature rather than holding everything back until shared caching was settled.comment:www.metafilter.com,2015:site.151667-6154284Tue, 04 Aug 2015 18:13:33 -0800adamsc