The Use of N-Gram Language Model in Predicting Nepali Words
DOI: https://doi.org/10.3126/paj.v5i1.45040
Keywords: word suggestion, Katz model, Viterbi model, N-gram, back-off smoothing
Abstract
This paper studies the problems of automated generation and understanding of natural human language. Word prediction and word completion (tab-completion while typing) are particularly useful for minimizing keystrokes for users with special needs and for reducing mistakes and typographic errors. Word prediction techniques are well-established methods frequently used as communication aids for people with disabilities: they accelerate writing, reduce the effort needed to type, and suggest correct words. A predictor anticipates the next word from the preceding context. Prediction can be based either on word statistics or on linguistic rules. The N-gram model predicts the nth word from the preceding N-1 words by assigning probabilities to sentences and to sequences of words. To meet this objective, the research uses a statistical corpus of Nepali text containing diverse word types to predict the correct word with as much precision as possible. Within this statistical approach, the research applies the N-gram method to predict the next word for the Nepali language, using the Viterbi algorithm for decoding.
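The abstract's core idea, an N-gram model that assigns probabilities to word sequences and falls back to a simpler estimate when a context is unseen, can be sketched as follows. This is only an illustrative sketch, not the paper's implementation: the toy corpus, the predict_next helper, and the crude relative-frequency back-off are assumptions standing in for the paper's Nepali corpus, Katz back-off estimates, and Viterbi decoding.

```python
# Minimal sketch of bigram next-word prediction with a simple back-off to
# unigram counts. The corpus below is a stand-in; the paper's actual Nepali
# corpus, vocabulary, and discounting parameters are not specified here.
from collections import Counter, defaultdict

corpus = [
    ["म", "घर", "जान्छु"],      # hypothetical tokenized Nepali sentences
    ["म", "स्कुल", "जान्छु"],
    ["ऊ", "घर", "जान्छ"],
]

unigrams = Counter()
bigrams = defaultdict(Counter)
for sentence in corpus:
    for i, word in enumerate(sentence):
        unigrams[word] += 1
        if i + 1 < len(sentence):
            bigrams[word][sentence[i + 1]] += 1

def predict_next(prev_word, k=3):
    """Rank candidate next words after prev_word.

    Uses bigram relative frequencies when the context has been seen,
    otherwise backs off to unigram frequencies (a crude stand-in for
    Katz back-off smoothing).
    """
    if prev_word in bigrams:
        total = sum(bigrams[prev_word].values())
        scored = {w: c / total for w, c in bigrams[prev_word].items()}
    else:
        total = sum(unigrams.values())
        scored = {w: c / total for w, c in unigrams.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)[:k]

print(predict_next("म"))   # e.g. [('घर', 0.5), ('स्कुल', 0.5)]
```

In this sketch the back-off simply reuses unigram frequencies; Katz back-off additionally discounts the seen bigram counts and redistributes the reserved probability mass to unseen continuations.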
License
Copyright (c) 2022 Authors and Centre for Research and Innovation (CRI)
This work is licensed under a Creative Commons Attribution 4.0 International License.