AfriBERTa brings the power of natural language processing to African languages
Researchers at the Cheriton School of Computer Science have developed a data-efficient pretrained transformer-based neural language model that covers 11 African languages. Their new neural network model, which they have dubbed AfriBERTa, is based on BERT — Bidirectional Encoder Representations from Transformers — a deep learning technique for natural language processing developed in 2018 by Google.