Uphill

Nick Sabalausky via Digitalmars-d digitalmars-d at puremagic.com
Tue Jun 2 21:14:58 PDT 2015


On 06/02/2015 07:51 PM, weaselcat wrote:
>
> I have to disable javascript on amazon.com to be able to use the site or
> else it brings my browser to a crawl.
>

When I finally figured out how to hack a modern Firefox into something 
tolerable (by "hack" I mean "load it down with an endless list of 
add-ons" and THEN waste an evening configuring them), I thought I was 
finally in pretty decent shape. (Well, at least until I was subjected to 
Mozilla's next couple rounds of UI blunders anyway...)

But now that I've finally accepted defeat on "screw cellphones" and been 
in the Android habit for over a year, I've found myself in a new web 
hell: Now, no matter what mobile browser I use (Chrome, Firefox, 
Dolphin, or the amazingly-incorrectly named "Internet"), I have a choice:

A. Use the goofy, straitjacketed, sluggish "mobile" versions of 
websites. (Wikipedia's is particularly bad, what with the auto-collapsing 
of every section on the entire page *while* you're scrolling and reading, 
and then again every time you navigate "back" to a section you'd already 
re-expanded manually. Gee, thanks for closing my book, Wikipedia; never 
mind that I was reading it.)

Or B. Switch on "request desktop site" mode and get an *improved* mobile 
UX, except that each page can easily take 30 seconds to a full minute 
before becoming responsive enough to accept a tap on a link. And that's 
for pages that aren't even dynamic beyond the initial flurry of JS 
onLoad() nonsense.

(Seriously, nearly anything that can be run during onLoad() BELONGS on 
the server. Why would ANYONE in their right mind EVER make every single 
client do several Ajax/REST/whatever requests, THEN render the EXACT 
SAME page, for every single incoming request? Instead of, oh, 
I dunno, rendering the EXACT SAME page ONCE when the content ACTUALLY 
changes and having the server spit THAT out to every request? Not 
enterprisey enough, I guess. Really, how often does a blog or news site 
actually post or modify an article? Do those exact same pages REALLY need 
to get completely regenerated on every hit even though NOTHING has 
changed since the last 500 hits?)
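For what it's worth, the "render ONCE when content changes" idea above 
is just a server-side cache keyed on the article's version. A minimal 
sketch in Python (every name here -- render_page, serve, the article 
dict and its fields -- is made up for illustration, not any real 
framework's API):

```python
# In-memory cache: article id -> (version, rendered HTML).
# The page is rendered only when the stored version no longer
# matches the article's current version.
_cache = {}

def render_page(article):
    # Stand-in for the expensive template/render step that the
    # post argues should NOT run on every hit.
    return ("<html><body><h1>%s</h1><p>%s</p></body></html>"
            % (article["title"], article["body"]))

def serve(article):
    cached = _cache.get(article["id"])
    if cached is not None and cached[0] == article["version"]:
        return cached[1]            # cache hit: spit out the same bytes
    html = render_page(article)     # cache miss: render once...
    _cache[article["id"]] = (article["version"], html)
    return html                     # ...then reuse until the version bumps
```

So the 500 identical hits between edits all return the cached string, 
and only an actual post/modify (a version bump) pays for a re-render.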

What I find very interesting is that it's consistently big businesses 
that have the most impossible-to-use sites. Ex: Just look at any site by 
SCE. You'd almost think they *don't* want any viewers and customers.


