Functional and Cognitive Analysis of Grammar in the Georgian Language Using the BERT Model

Authors

Mirtskhulava, L.

DOI: https://doi.org/10.52340/lac.2025.10.42

Keywords:

Linguistics; functional and cognitive dimensions of Georgian grammar; communicative purposes

Abstract



References

Aronson, H. I. (1990). Georgian: A reading grammar. Slavica Publishers.

Chau, C., Yimam, S. M., & Gurevych, I. (2020). Low-resource language model pretraining: A case study on Tibetan. Proceedings of the 28th International Conference on Computational Linguistics (COLING), 1087–1093.

Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 4171–4186. https://doi.org/10.48550/arXiv.1810.04805

Givón, T. (1990). Syntax: A functional-typological introduction (Vol. 1). John Benjamins.

Goldberg, Y. (2019). Assessing BERT's syntactic abilities. arXiv preprint arXiv:1901.05287. https://arxiv.org/abs/1901.05287

Halliday, M. A. K. (1994). An introduction to functional grammar (2nd ed.). Edward Arnold.

Hewitt, B. G. (1995). Georgian: A structural reference grammar. John Benjamins.

Lakoff, G. (1987). Women, fire, and dangerous things: What categories reveal about the mind. University of Chicago Press.

Langacker, R. W. (1987). Foundations of cognitive grammar, Vol. 1: Theoretical prerequisites. Stanford University Press.

Linzen, T. (2020). How can we accelerate progress towards human-like linguistic generalization? Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), 5210–5217.

Martin, L., Muller, B., Suárez, P. J. O., Junczys-Dowmunt, M., & Sagot, B. (2020). Towards a universal model for cross-lingual named entity recognition. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 7163–7174.

Rogers, A., Kovaleva, O., & Rumshisky, A. (2020). A primer in BERTology: What we know about how BERT works. Transactions of the Association for Computational Linguistics, 8, 842–866. https://doi.org/10.1162/tacl_a_00349

Talmy, L. (2000). Toward a cognitive semantics: Concept structuring systems (Vol. 1). MIT Press.

Tenney, I., Das, D., & Pavlick, E. (2019). BERT rediscovers the classical NLP pipeline. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 4593–4601.

Tomasello, M. (2003). Constructing a language: A usage-based theory of language acquisition. Harvard University Press.

Published

2025-06-03

How to Cite

Mirtskhulava, L. (2025). Functional and Cognitive Analysis of Grammar in the Georgian Language Using the BERT Model. Language and Culture, (10), 291–297. https://doi.org/10.52340/lac.2025.10.42