Multilingual Typeface Design

By Jayce Nguyen

Most people are already familiar with the Latin alphabet, one of the most “widely used alphabetic writing system[s] in the world” and the standard script not only for English but also for languages such as French, German, Spanish, and even Vietnamese. Because the Latin script is so widely used, it is often easier to find a variety of typefaces supporting Latin-based languages. From familiar standards – like Times New Roman and Arial – to more decorative display typefaces, it’s easy to find a font that will convey a certain message or meaning. There’s no shortage of Latin typefaces out there, and a quick search will bring up hundreds of results. Unfortunately, the same cannot be said of non-Latin scripts.

Where Latin-script users are met with variety and accessibility while browsing the web, those who use languages with non-Latin scripts are often faced with scarcity and other typographic difficulties. In the early days of the internet, users whose primary languages were lesser known could not always navigate the web easily due to the lack of broad, multilingual support for non-Latin scripts. With the expansion of the internet and of global communication networks over the past decade, new developments in type and web fonts have helped give a voice to users previously silenced by these barriers.

The Unicode Standard

A big part of this growth in multilingual support started with the development of Unicode – a now-universal system of character encoding found on nearly every platform and application we use today. Before the Unicode Standard was developed, earlier systems of character encoding were limited in comparison. No single system could encompass all of the world’s languages, and particularly not languages whose scripts are logographic in nature, such as Chinese or Japanese. With different systems in play, there was always a risk of encountering conflicting encodings, as well as errors when transferring data between devices.

A scroll through Character Map on Windows for the Arial font, which displays the unique Unicode number of the selected character at the bottom.

The Unicode Standard addressed these issues and offered a new way for writing systems around the world to fall under one universal standard of character encoding. With a single system, “Unicode provides a unique number for every character, no matter what the platform, program, or language is.” The continued growth and development of the Unicode Standard allowed multilingual accessibility on the internet to expand, giving all sorts of languages a newfound voice. As of this writing, the latest released version of Unicode (12.1) has a total of 137,929 assigned graphic and format characters, and the standard is expected to grow even more next year.
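To see that “unique number for every character” in practice, here is a minimal sketch using only Python’s standard library: each character, whatever its script, maps to a single Unicode code point with an official character name.

```python
import unicodedata

# Each character, regardless of script, has exactly one code point.
for ch in "Aα中あ":
    # ord() gives the character's unique Unicode number;
    # unicodedata.name() gives its official Unicode name.
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")
```

Running this prints the same four code points on any platform – which is the whole point of the standard – for example `U+0041 LATIN CAPITAL LETTER A` and `U+4E2D CJK UNIFIED IDEOGRAPH-4E2D`.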

Despite these advances, it should be noted that Unicode is a system of encoding, not a typeface in itself. Although the encoding may support all sorts of languages and characters, unless a font provides a glyph for a given character, users are met with the .notdef glyph, which Unicode formally refers to as a replacement glyph or missing character. Often appearing as a blank rectangle (e.g. ▯), this glyph is not only an annoyance but a hindrance for someone trying to communicate in their own language.
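The distinction between an encoded character and a rendered glyph can be illustrated with a small sketch (Python standard library, recent interpreter assumed): Unicode can confirm that a code point is assigned and named even when no installed font has a glyph for it – which is exactly the situation in which the ▯ appears.

```python
import unicodedata

def is_assigned(ch: str) -> bool:
    """True if the code point has an official Unicode name.

    Assigned characters (control characters aside) have names;
    unassigned code points raise ValueError. Whether the character
    actually *renders* depends on the fonts installed, not on Unicode.
    """
    try:
        unicodedata.name(ch)
        return True
    except ValueError:
        return False

print(is_assigned("\U0001E900"))  # ADLAM CAPITAL LETTER ALIF -> True
print(is_assigned("\u0378"))      # unassigned code point     -> False
```

U+1E900 is a valid, assigned Adlam character, yet on a system without an Adlam font it renders as a blank rectangle – the encoding is there, the glyph is not.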

Noto Font Family

In the age of digital communication, there have been different approaches to solving the problem of missing glyphs. One significant example is the Noto font family, a typeface developed by Google in partnership with Monotype. Initially released in 2014 and currently covering over 800 languages and 100 writing systems, Noto aims to get rid of these blank squares (which the team refers to as “tofu” because of their appearance). In an extensive project spanning over five years, Monotype worked alongside cultural experts as well as the communities that use these scripts and languages in order to create the fonts you see today. According to Toshi Omagari, senior type designer at Monotype, the ultimate aim of Noto is to “provide at least one free font for every language in the world.”

A list showcasing a number of writing systems included in Noto.

The research and work behind the Noto project also draw attention to the importance of cultural preservation through language and type. Scripts that previously faced the risk of being lost and forgotten over time can now be preserved on the internet through the creation of a digital typeface. Take, for example, Monotype’s work through the Noto project on the Adlam script – a relatively new script for the Fulani language of Africa. Designers were able to work directly with the script’s original creators, gaining direct insight into the script and language. By working together in this manner, type designers are better able to “incorporate stylistic choices and features that would reflect the creators’ original intentions.” Bringing this script to the digital realm gave Fulani speakers a new voice online.

Toshi Omagari, a senior Japanese type designer at Monotype and designer of the Tibetan script for Noto, describes typeface design as “giving users the voice they want” – an idea that is especially important in the context of multilingual typeface design. He is no stranger to the craft, having designed typefaces for “Latin, Cyrillic, Greek, Mongolian, Tibetan, Arabic, and classical or minority scripts like Phags-Pa, Soyombo, Siddham, and Lepcha.”

A page from the notebook that Toshi Omagari kept while studying the calligraphic traditions of Tibetan.
A sample of the Noto Sans Tibetan font designed by Toshi Omagari.

When asked about the application of the Noto font family and the importance of “universal typefaces” like it, Toshi notes that “for many, Noto turns out to be the first font ever” and that it is “being used in all sorts of places, from Google itself, Facebook, and to regular documents [as well].” One example he cites was on Facebook, where he saw Noto Mongolian used in text form for the first time, when “they were sending pictures of a text before” the font family was created. He emphasizes the concept and importance of voice here, and believes that “without fonts, people are voiceless on the internet.” The creation of Noto, with its “universal aesthetic” and “technical treatment of every language,” filled a desperate gap in an era that depends on digital communication.

Challenges

Despite the success of a typeface like Noto, many challenges still come with multilingual type design. When asked for his thoughts on the issue, Toshi felt that the biggest problem was “design decisions in [L]atin being applied to non-Latin [scripts].” He sometimes has clients who ask for “Japanese or Arabic versions” of a geometric typeface, even though a majority of the scripts found around the world are not constructed in the same way that the Latin script is. For example, Toshi states that “complex scripts” like Arabic and Devanagari, “whose character shape can vary depending on context,” are more difficult to work with, as they “require a lot of research and technical understanding on the designer’s part” but are “also the most fun to design.” When working with non-Latin scripts, it is important for designers to have a basic understanding of exactly what they are working with, as they cannot simply condense or thicken a font as they see fit. Many scripts have evolved differently over time and, as Toshi said, “there are shapes and details you should avoid in Latin” if you want to create a typeface that will (potentially) cover a wide range of scripts.
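Toshi’s point about context-dependent shapes can be seen directly in Unicode’s own data: the Arabic letter beh is encoded once as an abstract character, while a legacy “presentation forms” block records the four positional shapes a text renderer chooses between. A quick sketch with Python’s standard library:

```python
import unicodedata

# The abstract letter is encoded once; the renderer picks its shape.
beh = "\u0628"
print(unicodedata.name(beh))  # ARABIC LETTER BEH

# Legacy presentation forms record the four contextual shapes.
forms = {
    "isolated": "\uFE8F",
    "final":    "\uFE90",
    "initial":  "\uFE91",
    "medial":   "\uFE92",
}
for position, ch in forms.items():
    print(position, unicodedata.name(ch))
```

Modern fonts handle this shaping internally with substitution rules rather than with these legacy code points, but the four named forms make the design burden visible: one “letter” in a geometric Latin face can correspond to several distinct drawings in Arabic.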

A good example of these challenges in practice can be seen in the experience of designer Peter Biľak of Typotheque, described in Designing Hebrew Type. To preface, Biľak doesn’t speak Hebrew and was initially unfamiliar with the script and alphabet. He encountered problems early on in his endeavor, as his own understanding of “letter proportions, the balance of forms and counter-forms” applied more to the Latin script and was completely different from that of Hebrew readers. The initial application of his own optical rules looked odd to native designers, and it took years for him to “retrain” his eyes and brain to adjust and work with the script. Biľak also encountered the issue of serifs – a concept that is “foreign to most non-Latin scripts.” The problems that Biľak faced demonstrate many of the fundamental challenges of multilingual type design that Toshi Omagari outlines.

Early sketches of Hebrew letter terminations by Peter Biľak.

Further Development

Despite the difficulties of multilingual type design, the growth of web fonts and the increased support for non-Latin type, alongside the Unicode Standard, have permanently changed the digital landscape. Although it requires a considerable amount of work and resources, the creation of more “universal” typefaces like Noto has helped to increase accessibility on the internet. Communities that previously did not have a voice are now able to participate in global communication, introducing their ideas online and having their language and culture preserved for years to come.

Even with these advances and accomplishments, multilingual type can still be further diversified. Toshi Omagari believes that “we desperately lack display styles”: although typefaces like “Noto Sans and Neue Frutiger World are great, [they are still] conventional text faces,” not comparable to something like Cooper Black. He points out, however, that “such a design project is hard to justify financially, but it’s great fun if you have time.”

With the foundations already in place, nothing is stopping multilingual type design from being pushed further. It is ultimately up to both current and emerging designers to decide how they want to take on the challenge of multilingual design in order to continue giving people voices through the use of type.

Sources/References

Biľak, Peter. “Designing Hebrew Type.” Typotheque, Typotheque, 5 Oct. 2017, www.typotheque.com/articles/designing_hebrew_type.

“Glossary of Unicode Terms.” Glossary, Unicode, Inc., 11 Mar. 2019, unicode.org/glossary/#replacement_glyph.

“Google Noto Fonts.” Google Noto Fonts, Google, www.google.com/get/noto/.

Matteson, Steve. “Creating Noto for Google.” Monotype, Monotype Imaging Inc., 6 Oct. 2019, www.monotype.com/resources/case-studies/more-than-800-languages-in-a-single-typeface-creating-noto-for-google.

Omagari, Toshi. “Toshi Omagari.” Monotype, Monotype Imaging Inc., 2019, www.monotype.com/studio/toshi-omagari.

“Overview.” Unicode, Unicode, Inc., 17 July 2019, home.unicode.org/basic-info/overview/.

“Peter Biľak.” Typotheque, Typotheque, 2019, www.typotheque.com/authors/peter_bilak.

The Editors of Encyclopaedia Britannica. “Latin Alphabet.” Encyclopædia Britannica, Encyclopædia Britannica, Inc., 12 June 2013, www.britannica.com/topic/Latin-alphabet.

“Unicode® Version 12.1 Character Counts.” Unicode, Unicode, Inc., 23 Sept. 2019, www.unicode.org/versions/stats/charcountv12_1.html.
