Sharon Perry » low_literacy (Cetis Blog)
http://blogs.cetis.org.uk/accessibility

Icon Chat and Search Engine for People with Low Literacy
http://blogs.cetis.org.uk/accessibility/2009/01/19/icon-chat-and-search-engine-for-people-with-low-literacy/
Mon, 19 Jan 2009

Following on from my post about “Taking Symbols for Granted”, where I reviewed Jonathan Chetwynd’s paper “Communication with symbols: from the web to the internet and beyond”, Jonathan has just let me know about the launch of his Icon Chat and Search Engine at openicon.org. The Opera browser is recommended for viewing the site, although you can use Firefox to get the general gist (support in IE7 is rather intermittent).

The site includes links to three short YouTube videos describing:
* how to chat and search with icons using the application. It shows how chat can take place using symbols, such as using a “heart” for love;
* how to create a web page with a “live icon” – i.e. a graphic feed which updates automatically, for example a weather symbol that changes as the forecast for that location changes (see the sketch below);
* a look at a feed which uses icons, namely Zanadu (SVG-enabled, so it may not work in Internet Explorer), which has live feeds of pet images from Flickr, the latest news and weather, a direct link to play Radio 4, and so on.

The aim is to present information in the form of images or symbols which are drawn in from feeds. This is similar to a widget-based approach, but the concept behind the site allows people with low literacy levels to use the internet without having to navigate complex external sites.
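To make the “live icon” idea more concrete, here is a minimal sketch of one way a page could keep a symbol up to date from a feed. It is only an illustration: the feed URL, the JSON shape, and the element id are all hypothetical, and openicon.org’s actual (SVG-based) mechanism may work quite differently.

```typescript
// Minimal sketch of a "live icon": an image on the page that is refreshed
// from a feed. The feed URL, the JSON shape and the element id are
// hypothetical; openicon.org's own (SVG-based) mechanism may differ.
const icon = document.getElementById("weather-icon") as HTMLImageElement;

async function refreshIcon(): Promise<void> {
  try {
    // Assumed feed shape: { "symbol": "http://example.org/icons/rain.svg" }
    const response = await fetch("http://example.org/weather-feed.json");
    const data = await response.json();
    icon.src = data.symbol; // swap the graphic whenever the feed changes
  } catch {
    // If the feed is unreachable, keep showing the previous icon.
  }
}

refreshIcon();
setInterval(refreshIcon, 15 * 60 * 1000); // poll the feed every 15 minutes
```

The point of the sketch is simply that the symbol, not a block of text, is the unit of information the user sees, and it stays current without the user visiting the external site.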

Taking Symbols for Granted
http://blogs.cetis.org.uk/accessibility/2009/01/16/taking-symbols-for-granted/
Fri, 16 Jan 2009

The Journal of Assistive Technologies (Vol. 2, Issue 3) has recently published a paper entitled “Communication with symbols: from the web to the internet and beyond” (see the list of contents and a sample article) by Jonathan Chetwynd.

Chetwynd begins by reminding us of how many symbols there are around us – from road signs in the physical world to emoticons in the virtual world. We use them so much in our everyday lives that we take them for granted, often seeing and understanding graphics before (or even without) reading any associated text (cf. the Apple iPhone interface, which has a large symbol for each application with a small text name underneath).

Symbols are a useful means of communication for people with low literacy levels or who do not speak the local language (although some symbols may have localised or cultural meanings that differ from the universally understood one). Chetwynd suggests that, as online computer games rely almost entirely on graphics and symbols, games developers may be well placed to make useful contributions to the development of symbol-based communication.

He also laments the fact that many groups which have an influence on web accessibility are effectively only open to people from large organisations, because of the financial or resource costs involved. However, it’s not all bad news. The W3C SVG (World Wide Web Consortium Scalable Vector Graphics) Working Group has opened its work up to the public and has recently chartered a public interest group. Chetwynd’s peepo website is SVG-enabled.

SVG is a graphics specification which can include text and metadata descriptions, so that the graphics themselves can be searched. It is a very flexible format and is well suited to small and mobile devices. Most browsers (with the exception of Internet Explorer) have built-in support for SVG, and even some mail clients are SVG-enabled. In theory, this means that e-mails containing only symbols can be exchanged and understood.
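As an illustration of how an SVG symbol can carry its meaning as searchable text, here is a minimal sketch. The heart shape, the wording of the title and description, and the target element id are all assumptions made for the example; the point is simply that the <title> and <desc> elements travel with the graphic.

```typescript
// Sketch of an SVG symbol that carries its meaning as searchable text.
// The <title> and <desc> travel with the graphic, so search engines and
// assistive technology can read them. The drawing, wording and target
// element id are all illustrative.
const heartSymbol = `
  <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24" role="img" width="24" height="24">
    <title>Love</title>
    <desc>A red heart symbol, used in icon chat to mean love.</desc>
    <path fill="red" d="M12 21 4 13a5 5 0 0 1 8-6 5 5 0 0 1 8 6z"/>
  </svg>`;

// Append the symbol to a (hypothetical) chat message; its text stays part
// of the document rather than being locked inside a bitmap.
document.getElementById("message")?.insertAdjacentHTML("beforeend", heartSymbol);
```

Because the descriptive text is part of the markup rather than baked into pixels, a symbol-only message of this kind remains searchable and readable by assistive technology.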

Chetwynd’s paper reminds us that we don’t just live in a text-based world. Symbols are all around us. They instruct and remind us, and help us to communicate and navigate. They can be seen as a common, almost universal language, and they benefit people who, for whatever reason, have difficulty understanding text or language. By using them more in web-based resources, alongside text labels, we will not only help others to communicate but also help ourselves.

BBC Podcast: Accessibility in a Web 2.0 World?
http://blogs.cetis.org.uk/accessibility/2008/01/14/bbc-podcast-accessibility-in-a-web-20-world/
Mon, 14 Jan 2008

I’ve just listened to the BBC’s podcast “Accessibility in a Web 2.0 World” (around 43 minutes long, available in MP3 and Ogg Vorbis formats). The podcast takes the form of a facilitated discussion between a number of experts about what Web 2.0 applications mean for accessibility, and it includes representatives from the BBC, commercial web design companies, and the AbilityNet charity.

There were some interesting comments, and if you don’t get a chance to listen to the whole thing, here’s a brief run-down of the ideas and issues I thought were particularly salient.

* Social networking sites can take the place of face-to-face networking, particularly where the user has motor or visual disabilities. However, many sites require the user to respond initially to a CAPTCHA request, which can be impossible for people with visual or cognitive disabilities. Some sites do allow people with voice-enabled mobiles to get around the CAPTCHA issue, but not everyone has such technology. Once the user has got past such validation, they then have to navigate the content which, being user-generated, is unlikely to be accessible.

* One of the panellists felt that people with disabilities did not complain enough about inaccessible websites, and that a greater level of user input would help make web-based content more accessible.

* Jonathan Chetwynd, who has spoken to the CETIS Accessibility SIG in the past (see Putting the User at the Heart of the W3C Process), stated that users were not involved in the specification and standards process because it was led by large corporations. He also felt that users with low levels of literacy or technical ability were being overlooked in this process.

* There was some interesting discussion about the W3C (World Wide Web Consortium) and the way in which its accessibility guidelines are developed. Anyone can be involved in the W3C process, but as a fee is charged for membership, it is mostly companies, universities, and some not-for-profit organisations which take part. As some companies don’t want their software to appear inaccessible, their motives for joining the W3C may be less than altruistic. It was said that it was actually easier to “fight battles” within the W3C working groups than to take them outside and reach a consensus. As a result, there is not enough engagement outside the working groups, which has led to a lot of dissatisfaction with the way the W3C works.

* We are now in a post-guideline era, so we need to move away from the guideline-and-specification approach towards one which considers the whole process. This means taking into account the audience and their needs, the assistive technology they use, and so on. Accessibility is not just about ticking boxes. The BSI’s PAS 78 (Guide to Good Practice in Commissioning Accessible Websites), for example, gives guidance on how to arrive at such a process and ensure that people with disabilities are involved at every stage of development. However, developers often want guidelines and specifications they can take to people who don’t understand the issues around accessibility.

* It is important that everyone is given equivalence of experience, so there is a need to separate what is being said from how it needs to be said for the relevant audience. The web is moving from a page-based to an application-based approach. One panellist likened Web 2.0 applications to new toys that developers were playing and experimenting with; he felt that this initial sandpit phase would settle down and that accessibility would then start to be considered.

* Assistive technology is trying hard to keep up with the changing nature of the web, but it is not succeeding. Although many Web 2.0 applications are not built to current developer standards (not the paper kind!), many of the issues are not really developer issues. For example, multimodal content may have captions embedded as part of the file or provided as standalone text, and both browsers and assistive technologies need to know how to access them.
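As a present-day illustration of the standalone-captions case, the sketch below attaches a separate WebVTT caption file to a video using the HTML5 <track> element, so the browser (and, through it, assistive technology) can expose the captions. The file names are hypothetical, and <track> post-dates this 2008 podcast, so this shows the general idea rather than what was available at the time.

```typescript
// Sketch: attach standalone captions (a WebVTT file) to a video, so the
// captions live outside the media file and the browser can expose them to
// users and assistive technology. File names are hypothetical.
const video = document.createElement("video");
video.src = "lecture.mp4";
video.controls = true;

const captions = document.createElement("track");
captions.kind = "captions";
captions.src = "lecture.en.vtt"; // standalone caption text, not embedded in the MP4
captions.srclang = "en";
captions.label = "English captions";
captions.default = true;

video.appendChild(captions);
document.body.appendChild(video);
```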

* People with disabilities are often expected to be experts in web technology and in their assistive technology, but this is frequently not the case.

After the discussion, the panel members were asked what they felt would advance the cause of web accessibility. My favourite reply was that we should all consider ourselves TAB (Temporarily Able-Bodied) and design accordingly. The rationale was that we will all need some sort of accessibility feature at some stage, so the sooner we start to build such features in and become familiar with them, the better it will be for everyone!

Could on-screen narration be discriminatory?
http://blogs.cetis.org.uk/accessibility/2007/10/05/could-on-screen-narration-be-discriminatory/
Fri, 05 Oct 2007

Cathy Moore has an interesting blog post entitled “Should we narrate on-screen text?”, in which she suggests that automatic narration of on-screen text can actually be detrimental to learners. She states that learners generally read on-screen text more quickly than the narration is spoken (screen reader users can “read” text very quickly), and that learners are then forced to move at the pace of the narration.

Although some learners may find on-screen narration useful (such as young children, learners who are still learning the language of the resource, and learners with low literacy skills or cognitive difficulties), it should not be included just to try to fulfil obligations towards students with disabilities. Automatically including on-screen narration simply to meet SENDA (Special Educational Needs and Disability Act) obligations could actually discriminate against students who do not need it.

Therefore, the provision of on-screen narration should be weighed very carefully and, where it is deemed necessary, offered as an alternative or an option, while the original resource (or an alternative) remains accessible to screen reader and other assistive technology users.
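By way of illustration, here is a minimal sketch of narration offered as an opt-in control rather than played automatically, so that readers, including screen reader users, keep their own pace. The button id and audio file name are hypothetical.

```typescript
// Sketch: narration offered as an opt-in control rather than autoplayed,
// so every reader keeps their own pace. The button id and audio file are
// hypothetical.
const narration = new Audio("page-narration.mp3");
const toggle = document.getElementById("narration-toggle") as HTMLButtonElement;

toggle.addEventListener("click", () => {
  if (narration.paused) {
    void narration.play();
    toggle.textContent = "Pause narration";
  } else {
    narration.pause();
    toggle.textContent = "Play narration";
  }
});
```

The underlying page text stays in the document throughout, so screen reader users can ignore the narration entirely and read at their own speed.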
