Category Archives: Javascript

Allow more JavaScript, maintain privacy

I’ve long regarded JavaScript in the browser as one of the biggest security holes in web browsing, and at the same time the Internet works less and less well without it. In 2008 Joel Spolsky made the observation that for some people the Internet is just broken:

Spolsky:   Does anybody really turn off JavaScript nowadays, and like successfully surf the Internets?

Atwood:   Yeah, I was going through my blog…

Spolsky:   It seems like half of all sites would be broken.

Which is not wrong.  Things have changed in the last five years, and now the Internet is even more broken if you’re not willing to do whatever random things the site you’re looking at tells you to, and whatever other random sites that site links off to tell you to, plus whatever their JavaScript in turn tells you to. This bugs me because it marginalizes the vulnerable (the visually impaired, specifically), and is also a gaping security hole.  And the performance drain!

Normally I rock JavaScript-disabling tools as part of my tin-foil-hat approach to the Internet, but I’m now seeing that the Internet is increasingly dependent on fat clients. I’ve seen blogging sites that come up empty, because they can’t lay out their content without client-side scripting and refuse to fall back gracefully.

So, I need finer granularity of control.  Part one is RequestPolicy for Firefox; the nearest Chrome equivalent (though not as fine-grained) is Cross-Domain Request Filter.

The extensive tracking performed by Google, Facebook, Twitter et al gives me the willies. These particular organisations can be blocked by ShareMeNot, but the galling thing is that the ShareMeNot download page demands JavaScript to display a screenshot and a clickable graphical button – which could just as easily have been implemented as an image wrapped in a plain link. What the hell is wrong with kids these days?

Anyway, here’s the base configuration for my browsers these days:

Firefox | Chrome | Reason
HTTPSEverywhere | HTTPSEverywhere | Avoid inadvertent privacy leakage
Self Destructing Cookies | “Third party cookies and site data” blocked via the browser’s Settings, with manual approval of individual third-party cookies | Avoid tracking; StackOverflow (for example) completely breaks without cookies
RequestPolicy | Cross-Domain Request Filter for Chrome | Browser security and performance, avoid tracking
NoScript | NotScripts | Browser security and performance, avoid tracking
AdBlock Edge | Adblock Plus | Ad blocking
DoNotTrackMe | DoNotTrackMe | Avoid tracking – use social media when you want, not all the time
Firegloves (no longer available; could be replaced with Blender or Blend In) | (none) | I’ve had layout issues when using Firegloves and couldn’t turn it off site-by-site

Pressing a button does not demand JavaScript

The state of software produced by web developers is highly variable.  The things the good programmers can do are little short of astonishing, as has always been the case in limited environments.  But the bad programmers…

Fifteen years ago I did a Microsoft certification thingy, and now they want me to do a satisfaction survey on it – for no compensation.  I think not.  But I notice an unsubscribe link at the bottom of the email, so I follow it, see the Submit button, click on it… and nothing happens.  Then I realise – it needs JavaScript to press.  A button, one of those things right at the heart of HTML 2.0.  What is this, amateur hour?  Turns out, yes it is, because if you follow the hacked URL above – which I filled with bogus data – and click Submit, the back end proceeds happily without validating any of the data, and asks you another question before confirming that it’s done:
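Client-side validation is a convenience at best; the server has to re-check everything regardless of what the browser’s script did or didn’t do. A minimal sketch of the kind of back-end check they skipped – field names and accepted values are invented for illustration:

```javascript
// Server-side validation sketch (hypothetical field names).
// Never trust that client-side JavaScript ran, or ran honestly.
function validateUnsubscribe(payload) {
  const errors = [];
  // Crude shape check: something@something.something
  if (!payload.email || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(payload.email)) {
    errors.push("email: not a valid address");
  }
  // Only accept reasons we actually offered on the form
  const reasons = ["no-longer-relevant", "too-frequent", "never-signed-up"];
  if (payload.reason && !reasons.includes(payload.reason)) {
    errors.push("reason: not one of the known options");
  }
  return errors; // empty array means the request is acceptable
}
```

Ten lines, and the “bogus data straight to the back end” trick stops working.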

We’re sorry you no longer want to receive e-mails from us. Please allow one week for us to process this request, during which time you may still receive e-mails from us. We apologize for any inconvenience.
To help us improve our service, please tell us the primary reason why you no longer wish to receive our messages:

There appears to be some kind of problem with their computers.  Last time I checked, the time it takes a computer to remove a record from a database is in the vicinity of “I’m already finished”, not one week.

I’m of the opinion that people who construct software ought to be required to put their name on it in a visible way, so they can go on my list of people to smack in the face when I meet them.  It’s for the best.

Chrome doesn’t sandbox the CPU; Google Docs waits really hard

Chrome doesn’t attempt to sandbox CPU consumption. I just closed an inactive Google Docs spreadsheet, and watched CPU fall from pegged-at-100% to bubbling along at 10%.

Does it really need every available CPU cycle just to wait for the other end to do something? Apparently so, the way it’s coded.

Google: not as clever as the press release makes out.

MySchool: so wrong

Background: The Australian federal government has finally pushed out a web site publishing performance metrics for all schools throughout Australia. There has been much brouhaha regarding this. For some reason, the go-live wasn’t a quiet one, but a very loud, flick-on-the-switch big-bang go live.

Naturally, the website asploded.

Any website that’s going to be hit by 1% of the Australian population the moment it goes live is going to blow up unless there are some cluey, experienced people behind it. Clever, inexperienced people – or experienced idiots with a large budget – might stand a chance if load grew progressively over time, but turn it on and hammer it on day one? This site clearly does not have cluey, experienced people behind it. There are various signs.

For a start, what is it with the TLD? A .gov.au address seems fine, so what’s wrong with a redirect from the department’s own site, given they were the folks running around promoting it? It’s not like myschool is an education institution.

Then you get there. Guess what? It won’t work without JavaScript. At all. Because typing in a string and hitting enter demands the availability of JavaScript. Using <form> is so 2000s. Get with the new decade! It’s so vital to the site that users must not be allowed in if they don’t have JavaScript. Screw the blind! They’ve only got one school to go to anyway.
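Typing a string and hitting enter has worked without script since the 1990s. A sketch of the kind of markup that would have done the job (the action URL is invented):

```html
<!-- A search box that works with JavaScript disabled: the browser
     performs the submission itself via a plain GET request.
     The /search URL is hypothetical. -->
<form action="/search" method="get">
  <label for="q">Find a school:</label>
  <input type="text" id="q" name="q">
  <button type="submit">Search</button>
</form>
```

Script can still be layered on top of this for the people who have it; the point is that nobody is locked out when it’s absent.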

And the site is slow, amazingly slow. But I guess if you’ve got to download all that JavaScript just to enter a string, of course it’s going to be slow. Switching to a different set of data? They couldn’t download it up front and just do a hide/show; no, you’ve got to do some kind of AJAX-y postback crap with a massive round-trip delay. If you were dealing with rapidly changing data, that might almost make sense; this website gets data updates once a year, so no: it makes no sense. I clicked, and a long time later something happened to the web page. In the meantime, I went off to get a drink. Alternatively, you could just show a table for each year and skip the damn JavaScript altogether. Why there’s even a backend is beyond me; this whole thing could be served perfectly well – and mind-numbingly quickly – from static pages.

And for the purpose it’s intended for – parents picking a school for their kids – can you compare schools? No. Open them up in different browser tabs, if you have a tabbed browser (remember: the blind can go take a flying leap). Good thing the site is chock-full of JavaScript. And the JavaScript is used for handy things like map-based locating of schools, and – oh, hang on, no it’s not. There’s no Google Maps mash-up. Good thing the site is chock-full of JavaScript.

Clearly, the entire site has been an exercise in some programmer somewhere bolstering their resume rather than giving the client something appropriate. Either that, or a manager was in charge of the feature spec, and demanded all the latest buzzwords that they had heard but didn’t understand. I’m betting it took more than a year to build. Feel free to speculate.

I’m also willing to bet the price on this site was more than the $50,000 it should have cost (one person, three months). I’m imagining about two or three orders of magnitude more. I’m figuring the servers required for this aren’t running in some guy’s bedroom, even though that would be about all that’s required for such a simple dataset that’s presented in such a straightforward way.

Must try harder.

ANZ computerised banking is user-hostile

I have an ANZ Bank account. Using their website to pay bills is an exercise in frustration. I only have one account, but the website insists on me picking it out of a dropdown with two entries – the first one, the default, instructing me to pick an account. Failure to do so results in an error – “Please choose a From Account.” I only have ONE! Assume that’s where I want to pay from! Then one must pick who to pay, with an option to pick previous billers from a drop-down list. If you pick from the dropdown without JavaScript enabled, you get the error “Please select a biller from the drop-down list or enter a biller code.” – with JavaScript it fills in a few fields for you, but why does it even need you to fill those fields in if you’ve picked your biller already? Fill them in when I click the “I’m done” button!
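Defaulting a one-entry dropdown is a one-liner. A sketch of the missing logic – function and account names are invented, this is not ANZ’s code:

```javascript
// If the customer has exactly one account, there is nothing to choose:
// use it. Only force a selection when a real choice exists.
function defaultFromAccount(accounts) {
  if (accounts.length === 1) return accounts[0]; // no dropdown needed
  return null; // multiple accounts: the user genuinely has to pick
}
```

The same principle applies to the biller fields: once the biller is picked, the system already knows everything those fields would contain.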

Finally, we come to a bugbear I have with ANZ currency fields. You can’t enter a plain dollar amount; it has to include a decimal point with two cents digits after it – they can’t infer from the lack of a decimal point that you’re talking about whole dollars. They enforce this rule on their website, and at an ATM they insist you enter the number of cents you wish to withdraw. Given the smallest unit of currency available from an ATM is $20, what is wrong with this picture?
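Inferring the cents is not hard. A hedged sketch of a more forgiving parser (my invention, not ANZ’s validation logic):

```javascript
// Accept "20", "20.5", "20.50" and "$20" as dollar amounts; a missing
// decimal point just means zero cents. Returns an integer number of
// cents, or null for input that isn't a plausible amount.
function parseDollars(input) {
  const m = /^\$?(\d+)(?:\.(\d{1,2}))?$/.exec(input.trim());
  if (!m) return null;
  const cents = (m[2] || "0").padEnd(2, "0"); // "5" → "50", absent → "00"
  return Number(m[1]) * 100 + Number(cents);
}
```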

Nifty: Force-Directed Graphs in JavaScript

[Screenshot: starts off as a mess, then…]
Kyle Scholz has developed code to represent force-directed graphs in JavaScript, and you can interact with the nodes. We’re talking mathematical graphs here – nodes joined by edges – which you might know as networks.

Basically, there’s a bunch of nodes and they settle themselves into a stable state minimizing tension between them – the graphs balance themselves out, and you can see it happening – it’s animated. And interactive – you can grab a node and move it around. It is ubercool.

Downside is that it sucks huge CPU.
[Screenshot: …eventually becomes balanced]
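The core idea fits in a few lines of JavaScript. This is a toy sketch of the technique, not Kyle Scholz’s code: every pair of nodes repels, every edge acts as a spring, and iterating settles the graph into a low-tension arrangement – which is also why it eats CPU, since each step is O(n²) over the node pairs.

```javascript
// One step of a naive force-directed layout. Nodes are {x, y} objects,
// edges are [indexA, indexB] pairs. Mutates node positions in place.
function layoutStep(nodes, edges, repulsion = 1000, spring = 0.05) {
  const forces = nodes.map(() => ({ x: 0, y: 0 }));
  // Every pair of nodes pushes apart, strongly when close.
  for (let i = 0; i < nodes.length; i++) {
    for (let j = i + 1; j < nodes.length; j++) {
      const dx = nodes[j].x - nodes[i].x;
      const dy = nodes[j].y - nodes[i].y;
      const d2 = dx * dx + dy * dy || 0.01; // avoid divide-by-zero
      const f = repulsion / d2;
      const d = Math.sqrt(d2);
      forces[i].x -= (f * dx) / d; forces[i].y -= (f * dy) / d;
      forces[j].x += (f * dx) / d; forces[j].y += (f * dy) / d;
    }
  }
  // Every edge pulls its endpoints together like a spring.
  for (const [a, b] of edges) {
    const dx = nodes[b].x - nodes[a].x;
    const dy = nodes[b].y - nodes[a].y;
    forces[a].x += spring * dx; forces[a].y += spring * dy;
    forces[b].x -= spring * dx; forces[b].y -= spring * dy;
  }
  nodes.forEach((n, i) => { n.x += forces[i].x; n.y += forces[i].y; });
  return nodes;
}
```

Run it in a loop (with the step sizes damped over time) and redraw each frame, and you get the settling animation described above.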

AJAX and Screenreaders: Screw the blind, this is the new web!

Sitepoint tries to figure out how well Web 2.0 works for blind people.

Basically, no US government website, and none that loves blind people, will be able to implement an AJAX-only site – a noscript version will have to be available. And this stems from the fact that it’s too hard to make the various screenreaders act in a standard way in response to changes to the document. Which sounds to me like a perfect problem for World Wide Web Consortium standardisation.
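For what it’s worth, the W3C’s WAI-ARIA live regions are aimed at exactly this problem: mark a region of the document, and conforming screenreaders announce changes to it instead of leaving the user to hunt for what moved. A minimal sketch:

```html
<!-- A WAI-ARIA live region: when script replaces the contents of this
     element, supporting screenreaders announce the new text.
     "polite" waits for the user to pause; "assertive" interrupts. -->
<div id="status" aria-live="polite"></div>
<script>
  // After an AJAX update completes, describe the change in words:
  document.getElementById("status").textContent = "3 new results loaded";
</script>
```

How uniformly the various screenreaders honour this is, of course, exactly the standardisation problem the article is complaining about.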