Monday, August 13, 2018

Niamey 1978 & Cape Town 2018: 1. Some thoughts about extended Latin & content in African languages

Image features the 31 modified letters & diacritic combinations in
the African Reference Alphabet, 1978. (Not all are currently in use.)

The world of 40 years ago, when the Meeting of Experts on Transcription and Harmonization of African Languages took place in Niamey, and that of the Wikimania 2018 conference in Cape Town (which ended last month) seem very distant from each other. But from the angle of the written form of African languages at least, the concerns of the two events are not so distant.

One of these concerns is the extended Latin alphabets that were on the agenda in Niamey, and which are used in about half of the African language editions of Wikipedia. This post and the next consider these two vantage points, asking whether extended Latin is associated with less content creation, and what might be done to facilitate use of the longer Latin alphabet.

Adapting the Latin script to African realities


In 1978, representatives of countries that had gained independence no more than a couple of decades earlier, or in some cases only a few years before, met in Niamey to advance work on writing systems for the first languages of the continent. One of the linguistic legacies of the colonial period was the Latin alphabet (even in lands where other systems had been used). But given phonological requirements sometimes very different from what Latin letters represented in Europe, linguists added various modified letters, diacritics, and digraphs to write African languages (sometimes even devising a special system for a single publication1).

So, that legacy also often took the form of multiple alphabets and orthographies for a single language, reflecting the different origins of European linguists (frequently Christian missionaries from different denominations), the locations in which they worked (perhaps places where speakers of a language had particular dialects or accents), and individual skills and choices. After independence, many African countries undertook to simplify this situation, but they still often ended up with alphabets and spelling conventions different from those in neighboring countries.

The linguists and language specialists in Niamey, as in other such conferences of that era (many of which, like the one in Bamako in 1966, were supported by UNESCO), were concerned with further reducing these discrepancies, and with accurate and consistent transcription of languages that were for the most part spoken in two or more countries (whose speaker communities were divided by borders). That included adopting certain modified letters and diacritic combinations for sounds that distinguish meaning in African languages (some of which correspond with characters in the International Phonetic Alphabet).

Language standardization, which is actually a complex set of decisions, was a real concern where there were on the one hand diverse peoples grouped in each state, and on the other hand limited resources for producing materials and training teachers. At its most basic level, though, standardization of any sort required an agreed-upon set of symbols and conventions for transcription.2

A reference alphabet for shared orthographies


The African Reference Alphabet (ARA)3 produced by the Niamey meeting was an effort in that direction. It built on the longer post-independence process to facilitate use and development of written forms of African languages - a process that had its roots in the early introduction of the Latin script (before the formal establishment of colonial rule) and in efforts during the colonial period such as the influential (at least in the British colonies) 1928 Africa Alphabet. The ARA was intended - and to some degree at least still serves - as a sort of palette from which characters could be drawn to address specific linguistic, multilingual national, and cross-border language needs.4

And that set of concerns - alphabets, orthographies and spelling conventions - turned out to be the starting point for later efforts in the context of information and communication technology (ICT) to localize software and interfaces, including Wikipedia and other Wikimedia interfaces, and to develop African language content online, including for Wikimedia projects - even if this set of concerns does not seem as visible as other challenges.

What I haven't seen is an evaluation of the efforts at Niamey and the other expert meetings on harmonization of transcriptions, although the most used of the characters in the ARA can be seen in various publications, and all but perhaps one are in the Unicode standard.

In any event, the situations of the various African languages are diverse, with some having well established corpora while others are "less-resourced," and in the worst case, inconsistently written.

Extended Latin and composing on digital devices


One important element in the discussions of the process of which Niamey was part was the role of modified letters - what are now called extended Latin characters - in transcribing many African languages. The ARA includes no fewer than 30 of them (22 modified letters and 8 basic Latin with diacritics5). These added characters and combinations are not all intended to be used in any one language, but represent standard options for orthographies. The incorporation of some of these into the writing of a single language makes the writing clearer, and has no drawbacks for teaching, learning, reading, or handwriting (although there are arguments against the use of diacritics). Since the establishment of Unicode for character encoding, the screen display of these characters is not a problem (so long as fonts have been created that include glyphs for the characters).
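As a concrete illustration of the encoding side, here is a minimal Python sketch showing how a handful of these extended characters (an illustrative sample of my own choosing, not the full ARA) are represented in Unicode, and how a diacritic combination can be encoded in two equivalent ways:

```python
# A few of the extended Latin characters used in African orthographies,
# with their Unicode code points and canonical names.
import unicodedata

for ch in "ɓɗɛŋɔƙ":
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")
# e.g. U+0253  ɓ  LATIN SMALL LETTER B WITH HOOK
#      U+025B  ɛ  LATIN SMALL LETTER OPEN E

# Diacritic combinations may be encoded precomposed or as a base letter
# plus a combining mark; Unicode normalization reconciles the two forms.
assert unicodedata.normalize("NFC", "e\u0301") == "\u00e9"  # e + combining acute == é
```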

However, the presence of even just one or two extended Latin characters leads to problems with standard keyboards and keypads - where are you going to place an additional character, and how is the user to know how to find it? This is a set of issues that was of course recognized back in the era of typewriters. One of the spinoffs from the Niamey conference was the 1982 proposal by Michael Mann and David Dalby (who attended the meeting) for an all lower-case "international niamey keyboard," which put all the modified characters (of an expanded version of the ARA) in the spots normally occupied by upper-case letters.

While that proposal never went far (I hope to return to the subject later) - due in large part to its abandonment of capital letters - it was but one extreme approach to a conundrum that is still with us. That is, how to facilitate input of Latin characters and combinations that are not part of the limited character sets that physical keyboards and keypads are primarily designed for. It's not that there aren't ways of facilitating input - virtual keyboard layouts (keyboard drivers that can be designed and shared, as with Keyman, and onscreen keyboards) have been with us for years, and there are other input systems (voice recognition / speech-to-text being one). The problem is the lack of standard arrangements and systems for many languages. Or perhaps in the matter of input systems, the old quip, "the nice thing about standards is there are so many to choose from," applies.
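To give a sense of how such input systems work, here is a minimal Python sketch of a sequence-based virtual layout that rewrites typed ASCII sequences into extended characters. The sequences below are hypothetical examples of my own, not any published layout - which is exactly the standardization problem described above:

```python
# Sketch of a sequence-based input method: typed ASCII sequences are
# rewritten into extended Latin characters as the user types.
# The mappings are hypothetical illustrations, not a published standard.
SEQUENCES = {
    "b'": "ɓ",   # hooked b
    "d'": "ɗ",   # hooked d
    "e)": "ɛ",   # open e
    "o)": "ɔ",   # open o
    "n~": "ŋ",   # eng
}

def rewrite(text: str) -> str:
    # Greedily replace two-character sequences, left to right.
    out, i = [], 0
    while i < len(text):
        pair = text[i:i + 2]
        if pair in SEQUENCES:
            out.append(SEQUENCES[pair])
            i += 2
        else:
            out.append(text[i])
            i += 1
    return "".join(out)

print(rewrite("d'an e)ji"))  # -> ɗan ɛji
```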

The result, arguably, may be a drag on widespread use of extended Latin characters, and as a consequence, on popular use on digital devices of languages whose orthographies include them. Or a choice to ASCIIfy text (using only basic Latin), as has been the case with Hausa on international radio websites. Or even confusion based on continued use of outdated 8-bit font + keyboard driver systems, as witnessed in at least one case with Bambara (see discussion and example).
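To illustrate what ASCIIfication entails, here is a rough Python sketch that folds Hausa's hooked letters onto basic Latin. The sample words are ordinary Hausa words chosen only for illustration, and the mapping is a simplified sketch rather than any broadcaster's actual practice:

```python
# Rough sketch of ASCIIfication: extended characters in Hausa text are
# folded onto basic Latin, discarding the distinctions they encode.
ASCIIFY = str.maketrans({
    "ɓ": "b", "Ɓ": "B",  # b with hook
    "ɗ": "d", "Ɗ": "D",  # d with hook
    "ƙ": "k", "Ƙ": "K",  # k with hook
    "ʼ": "'",            # modifier apostrophe (as in ʼy) -> ASCII apostrophe
})

for word in ["ɓera", "ɗaya", "ƙofa"]:
    print(word, "->", word.translate(ASCIIFY))
# ɓera -> bera, ɗaya -> daya, ƙofa -> kofa
```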

What can the level of contributions to African language editions of Wikipedia tell us about the effect of extended Latin? This will be explored in the next post: Extended Latin & African language Wikipedias.

1. For example, some works on forest flora that had lists of common names in major languages of the region.
2. Arguably, in the case of a language written in two or three different scripts, one could have a system in each script and an accepted way to transliterate between or among them.
3. The only other prominent use I found of the term "reference alphabet" was that of the ITU for their version of ISO 646 (basically the same as ASCII): "International Reference Alphabet." The concept of reference alphabet seems to be a useful one in contexts where many languages are spoken and writing systems for them aren't yet established.
4. This approach - adopting a standard or reference alphabet for numerous languages - was taken by various African countries, for example Cameroon and Nigeria. These efforts were without doubt influenced by the process of which Niamey and the ARA were part.
5. By comparison, the Africa Alphabet had 11 modified letters and did not use diacritics. All 11 of the characters added in the Africa Alphabet were incorporated in the ARA. It is worth noting that in the range of modified letters / special characters created over the years, some are incorporated into many orthographies, others fewer, and some are rarely used if at all.
