Towards a More Inclusive AI: Progress and Perspectives in Large Language Model Training for the Sámi Language

Full description
Sámi, an indigenous language group comprising multiple languages, faces digital marginalization due to the limited availability of data and of sophisticated language models designed for its linguistic intricacies. This work focuses on increasing technological participation for the Sámi language. We draw the attention of the ML community to the language modeling problem of Ultra Low Resource (ULR) languages. ULR languages are those for which the amount of available textual data is very low and the number of speakers is also very small. ULR languages are also not supported by mainstream Large Language Models (LLMs) such as ChatGPT, which makes gathering artificial training data for them even more challenging. Mainstream AI foundational model development has given little attention to this category of languages. Because these languages generally have very few speakers, they are also hard to find. However, it is important to develop foundational models for these ULR languages to promote inclusion and the tangible ...

Bibliographic Details
Main Authors: Paul, Ronny; Buckchash, Himanshu; Parida, Shantipriya; Prasad, Dilip K.
Format: Article in Journal/Newspaper
Language: unknown
Published: arXiv, 2024
Subjects: Computation and Language (cs.CL); Artificial Intelligence (cs.AI); FOS: Computer and information sciences
Online Access: https://dx.doi.org/10.48550/arxiv.2405.05777
https://arxiv.org/abs/2405.05777
Institution: Open Polar
Collection: DataCite
Genre: Sámi
Rights: arXiv.org perpetual, non-exclusive license (http://arxiv.org/licenses/nonexclusive-distrib/1.0/)
DOI: https://doi.org/10.48550/arxiv.2405.05777