There’s no Algorithm for Common Sense!

Virtual has drawn up a list of 25 Free Website Checkers, with a brief description of what each one does.  The checkers are split into handy sections – General, Disability, and Usability – but automated checkers will only check the easy bits, such as colour contrast and the validity of HTML (HyperText Markup Language) and CSS (Cascading Style Sheets) code – in other words, the bits for which an algorithm can be written.

However, whilst a website checker can verify that alt text, for example, is present on an image and will tell you if it’s missing, it can’t tell you whether that alt text actually makes sense.  For example, alt text of “an image” or “asdfg” is not going to be very useful to someone who doesn’t download images, or to someone who uses tooltips to find out the relevance of the image (particularly where a description or title hasn’t been provided).  So developers and content authors need a hefty dose of common sense to make sure that the aspects of a website that can’t automatically be checked by a computer are actually usable and accessible.
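To make the limitation concrete, here is a minimal sketch (a hypothetical example, not how any real checker is implemented) of the sort of algorithmic test a checker can run – it can detect a *missing* alt attribute, but it waves through meaningless alt text like “asdfg”:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags that have no alt attribute at all."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                self.missing.append(attrs.get("src", "(no src)"))
            # An image with alt="asdfg" or alt="an image" passes this
            # check: no algorithm here can judge whether the text
            # makes sense -- that still needs a human.

checker = AltTextChecker()
checker.feed('<img src="churchill.jpg"><img src="cigar.jpg" alt="asdfg">')
print(checker.missing)  # only the first image is flagged
```

The second image sails through even though its alt text is useless – exactly the gap that common sense has to fill.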

It’s often quoted (though I can’t remember by whom) that one could implement the whole of WCAG (Web Content Accessibility Guidelines) and still end up with an inaccessible site. Whilst an automated checker might pass the site as accessible based on a simple checklist, a human may find it unusable.  Human involvement in checking accessibility is therefore still necessary, and as well as common sense it requires an understanding of accessibility issues and of context.  For example, whilst a photo of Winston Churchill might have the alt text of “Photo of Winston Churchill”, if the photo is illustrating a particular point, it could be more relevant to say “Photo of Winston Churchill smoking a cigar” or “Photo of Winston Churchill in London in 1949”, depending on context.

So whilst automated web accessibility checkers have their uses, it’s important to remember that they generally don’t include an algorithm for common sense!

Technology and Control: the Designer v. the User

“Chapter Two: Framing Conversations about Technology” of “Information Ecologies: Using Technology with Heart” by Bonnie Nardi and Vicki O’Day looks at the differing views of technology, from the dystopic to the utopic.  The authors make some interesting comparisons between the technology we have now and the technology of the recent past, as well as some thought-provoking observations.

Nardi and O’Day have noticed that because the advance of technology is seen as inevitable, people do not critically evaluate the technologies they use, even though those technologies were designed and chosen by people.  In other words, we accept the technology that is placed before us, but we forget that we have a choice as to the type of technology we actually use and the way in which we use it.

The authors compare the differing views of Nicholas Negroponte (technophile and director of the MIT Media Lab) and Clifford Stoll, author of “Silicon Snake Oil”, programmer and astronomer.  Interestingly, although their views are remarkably different (one utopic, the other dystopic), they both agree that “the way technology is designed and used is beyond the control of the people who are not technology experts” (Nardi & O’Day).

Nevertheless, people often use technology in ways that are completely different from the way in which the designer intended. For example, Johnny Chung Lee has developed some interesting and unusual uses for the Nintendo Wii controller.  Thinking out of the box can bring control back to the user and it’s probably fair to say that we all (from expert users to newbies) use the technology we have in ways which weren’t even considered by designers, even if it’s just using a CD as a coaster for a coffee mug.

So although technology (hardware and software) designers may have only a limited perspective on the ways in which they expect their technology to be used, once it is out in the world, alternative uses or ways of working will often be developed and exploited.