XVI Systems

Computer access for the visually impaired and blind

Speech/Braille Server API specification

Introduction

Any generic computer system with a screen, to be usable by the visually impaired or blind, must convey the information on the screen in a form these users can understand. With today's technology the usual means of interface is speech, braille, or both.

Modern computer systems are complex, and applications are typically developed in a structured manner, building where possible on the tested and well-defined work of others. The BEAM Speech/Braille server is intended to be such a standard. Its objectives are twofold: firstly, to provide a framework upon which suppliers of braille and speech hardware can build a generic system; secondly, to formalise and implement a set of procedures that may be used by any application wishing to output speech and/or braille.

The speech system
Speech is used to convey information to the user. Typically the information on the display is logically partitioned, allowing the user to traverse the display, asking for or being provided with information at each step. For example, when traversing an application menu bar the words "File" and "Edit" might be spoken, and when "File" is selected the words "pop up menu New" spoken.

The purpose of this API is not to define the speech functionality but to provide an application with the resources to generate and control this speech.

Braille displays
[Picture of an Alva 440 braille display]
A braille display has a number of electrically driven cells, typically with eight pins per cell, which are used to display the six-dot braille character set. The two additional pins are used for extra information such as cursor position. A braille display may also have a number of keys or buttons that can be used to aid or control screen reading; for example, a button might move down a line, or to the next character, screen object and so on.
The BEAM XVI screen reader, for example, when configured for use with an Alva 440 braille display, uses the rubber keys on the front panel for what we refer to as object/text traversal. Put simply, an object is in essence a screen entity such as a button, text entry cell, menu entry and so on.

The twin rows of buttons above the status cells (the left-most three cells) are used as mouse buttons and to scroll the line of text shown on the remaining 40 cells.
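
By way of illustration only, the sketch below models this cell layout in C++. The type and constant names are invented for this document and are not part of the XVI API.

    // Hypothetical sketch only: one way to model an eight-pin braille cell
    // and the Alva 440 layout (3 status cells + 40 text cells) as bitmasks.
    // None of these names come from the XVI API.
    #include <array>
    #include <cstdint>

    // Each bit represents one pin; bits 0-5 form the standard six-dot
    // character, bits 6-7 are the extra pins (e.g. cursor marking).
    using BrailleCell = std::uint8_t;

    constexpr int kStatusCells = 3;   // left-most cells on an Alva 440
    constexpr int kTextCells   = 40;  // remaining cells show a line of text

    struct BrailleLine {
        std::array<BrailleCell, kStatusCells> status{};
        std::array<BrailleCell, kTextCells>   text{};
    };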

Features

  • A generic interface for speech and/or braille devices.
  • Support for simultaneous use by 'braille/speech enabled' applications and dedicated screen reader software.
  • Simple addition of new speech/braille devices. The server will dynamically load the modules required. Third party and user modules can be easily developed.
  • Fully networked remote access to braille/speech devices.
  • Speech system supports output stream trigger points.
  • Server automatically supports US and DK braille.
  • Application history support to allow streamed logs such as kernel messages to be accessed.

The Speech/Braille Server Architecture

BEAM have adopted a client/server architecture to implement speech/braille access. This approach allows both generic screen reader software and 'braille/speech enabled' applications to co-exist on the same computer. By 'braille/speech enabled' we mean applications, such as emacs and the various 'console' readers available, that directly support speech output. Generic screen reader software typically allows the reading of applications that do not support braille or speech directly.

A server daemon process sits awaiting connections from one or more clients that wish to use the braille/speech device. Connection is made via TCP/IP sockets, so it is not necessary for the client and server to run on the same host machine. Applications make use of the server through the functions provided by a small interface library.
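
For illustration only, the following sketch shows the kind of TCP/IP connection the interface library might establish on a client's behalf. The function name and the host/port handling are assumptions, not something defined by this specification.

    // Hypothetical sketch: how a client-side library might open its
    // TCP/IP connection to the speech/braille server daemon.
    // The host and port are supplied by the caller; nothing here is
    // part of the actual SBClient library.
    #include <netdb.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int sb_open_connection(const char* host, const char* port) {
        addrinfo hints{};
        hints.ai_family   = AF_UNSPEC;     // server may be on a remote host
        hints.ai_socktype = SOCK_STREAM;   // connection is a TCP stream

        addrinfo* res = nullptr;
        if (getaddrinfo(host, port, &hints, &res) != 0)
            return -1;

        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd >= 0 && connect(fd, res->ai_addr, res->ai_addrlen) != 0) {
            close(fd);
            fd = -1;
        }
        freeaddrinfo(res);
        return fd;  // valid socket descriptor, or -1 on failure
    }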

Client Access

An application linked with an appropriate library can connect to the speech/braille server; such an application is referred to as a speech/braille client. Before covering the calls that these clients may make, the braille device object model will be described.

From the perspective of a client, a braille device has the following characteristics :-

The functions of this speech/braille device access library are outlined below :-

The speech/braille device access library, linked with an application, facilitates connection to, and use of, a speech/braille device server.

The speech/braille device API is available in both 'C++' and 'C' forms. The 'C' implementation comprises a set of 'C' functions that, when linked with an application, allow that application to output speech and braille. Additional functionality allows an application to perform actions at pre-determined points in the speech output flow.
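
The actual calls are documented in the SBClient 'C' API referenced below. Purely to illustrate the intended flow (connect, output speech and braille, register a trigger point, disconnect), the sketch here uses invented names such as sb_connect and sb_set_trigger; it declares them as prototypes only and is not a linkable program.

    // Hypothetical usage sketch only: these functions are invented for
    // illustration and are NOT the actual SBClient 'C' API names.
    #include <cstdio>

    extern "C" {
        typedef void (*sb_trigger_fn)(void*);
        int  sb_connect(const char* host);                        // attach to the server
        void sb_say(int conn, const char* text);                  // queue speech output
        void sb_braille(int conn, const char* text);              // write to the display
        void sb_set_trigger(int conn, sb_trigger_fn cb, void* arg); // trigger point
        void sb_close(int conn);
    }

    static void spoken(void*) { std::puts("speech reached the trigger point"); }

    int main() {
        int conn = sb_connect("localhost");     // server may also be remote
        if (conn < 0) return 1;
        sb_say(conn, "File menu");              // spoken output
        sb_braille(conn, "File  Edit  View");   // brailled output
        sb_set_trigger(conn, spoken, nullptr);  // act when output reaches this point
        sb_close(conn);
        return 0;
    }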

SBClient 'C' API
SBClient 'C++' API

Server Driver Modules


The speech/braille server essentially provides an interface between a client and a speech and/or braille device.
Imperative in the architectural specification were the following requirements :-

The first requirement has been addressed by developing two simple C++ classes, Braille and Speech, from which the specific device classes are derived. The method is best illustrated by examination of our source files Alva.h, Alva.cc, Apollo.h and Apollo.cc.
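
The outline below is only a rough guess at the shape of that derivation, with invented method names; the authoritative definitions are in the source files listed above.

    // Hypothetical outline only: the real Braille base class lives in the
    // server sources. The method names here are assumptions used to show
    // the derivation pattern, not the actual interface.
    #include <string>

    class Braille {                       // assumed shape of the base class
    public:
        virtual ~Braille() = default;
        virtual bool open(const std::string& port) = 0;
        virtual void writeCells(const std::string& text) = 0;
        virtual int  readKey() = 0;       // front-panel key events
    };

    class Alva440 : public Braille {      // device-specific driver class
    public:
        bool open(const std::string& port) override;   // definitions live in
        void writeCells(const std::string& text) override; // the driver source
        int  readKey() override;
    };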

The server program has been developed with the ability to dynamically load object files. When the server first starts up, it reads the text configuration file /usr/xvi/etc/sbserver.conf. This simple <TAG>: <data> format file contains the two tags SPEECHDEV: and BRAILLEDEV:; the values read indicate the names of the object files to load.
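
A minimal sketch of how such a configuration file might be read, and the named object file loaded with the standard dlopen() mechanism, is given below. The example tag values and the entry-point symbol are assumptions; the server's actual loading code may differ.

    // Hypothetical sketch: read /usr/xvi/etc/sbserver.conf and dynamically
    // load the object file named by the BRAILLEDEV: tag.  An example file
    // (module names are assumptions) might look like:
    //
    //     SPEECHDEV:   apollo.so
    //     BRAILLEDEV:  alva.so
    //
    #include <dlfcn.h>
    #include <fstream>
    #include <sstream>
    #include <string>

    static std::string lookupTag(const char* path, const std::string& tag) {
        std::ifstream conf(path);
        std::string line;
        while (std::getline(conf, line)) {
            if (line.rfind(tag, 0) == 0) {               // line starts with <TAG>:
                std::istringstream rest(line.substr(tag.size()));
                std::string value;
                rest >> value;                           // first word after the tag
                return value;
            }
        }
        return {};
    }

    int main() {
        std::string module = lookupTag("/usr/xvi/etc/sbserver.conf", "BRAILLEDEV:");
        if (module.empty()) return 1;

        void* handle = dlopen(module.c_str(), RTLD_NOW); // load the driver module
        if (!handle) return 1;

        void* sym = dlsym(handle, "create_device");      // assumed entry-point symbol
        return sym ? 0 : 1;
    }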