Distributional semantic modeling: a revised technique to train term/word vector space models applying the ontology-related approach
Published in the Proceedings of the 12th International Conference of Programming (UkrPROG 2020), volume 2866 of CEUR Workshop Proceedings. Also published in the scientific journal "Problemy programmirovaniâ" (2020)
We design a new technique for distributional semantic modeling that uses a neural network-based approach to learn distributed term representations (term embeddings), yielding term vector space models as a result. It is inspired by the recent ontology-related approach to the identification of terms (term extraction) and of relations between them (relation extraction), called semantic pre-processing technology (SPT), which exploits different types of contextual knowledge: syntactic knowledge, terminological knowledge, semantic knowledge, etc. Our method relies on automatic term extraction from natural language texts and the subsequent formation of problem-oriented or application-oriented (and deeply annotated) text corpora in which the fundamental entity is the term, covering both non-compositional and compositional terms. This lets us change over from distributed word representations (word embeddings) to distributed term representations (term embeddings). The transition makes it possible to generate more accurate semantic maps of different subject domains, including maps of relations between input terms, which is useful for exploring clusters and oppositions or for testing hypotheses about them. A semantic map can be represented as a graph using Vec2graph, a Python library for visualizing word embeddings (term embeddings in our case) as dynamic and interactive graphs. Vec2graph coupled with term embeddings will not only improve accuracy in solving standard NLP tasks, but also update the conventional concept of automated ontology development. The main practical result of our work is a development kit (a set of toolkits exposed as web service APIs and a web application) that provides all the routines needed for the basic linguistic pre-processing and the semantic pre-processing of natural language texts in Ukrainian for subsequent training of term vector space models.
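To illustrate the shift from word embeddings to term embeddings described above, the sketch below shows one common way to prepare a term-annotated token stream: occurrences of extracted multi-word terms are merged into single underscore-joined tokens, so a standard embedding trainer learns one vector per term rather than per word. This is a minimal illustration, not the paper's SPT pipeline: the term list, the greedy longest-match strategy, and the underscore-joining convention are all assumptions made here for the example (in the actual approach, terms come from the SPT term-extraction stage).

```python
def merge_terms(tokens, terms):
    """Replace occurrences of multi-word terms in a token list with single
    underscore-joined tokens, matching longer terms first, so that each
    term becomes one vocabulary entry for embedding training."""
    # Sort term token sequences longest-first for greedy matching.
    term_seqs = sorted((t.split() for t in terms), key=len, reverse=True)
    out, i = [], 0
    while i < len(tokens):
        for seq in term_seqs:
            if tokens[i:i + len(seq)] == seq:
                out.append("_".join(seq))  # compositional term -> one token
                i += len(seq)
                break
        else:
            out.append(tokens[i])  # ordinary word, kept as-is
            i += 1
    return out

tokens = "the semantic map of the subject domain".split()
terms = ["semantic map", "subject domain"]
print(merge_terms(tokens, terms))
# → ['the', 'semantic_map', 'of', 'the', 'subject_domain']
```

A corpus pre-processed this way can then be fed to any standard word-embedding trainer (e.g. word2vec-style models), which will produce one vector per term token.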
Recommended citation: O.V. Palagin, V.Yu. Velychko, K.S. Malakhov, and O.S. Shchurov. Distributional semantic modeling: a revised technique to train term/word vector space models applying the ontology-related approach. In Ivan Sergienko and Philip Andon, editors, Proceedings of the 12th International Scientific and Practical Conference of Programming (UkrPROG 2020), volume 2866 of CEUR Workshop Proceedings, pages 342-352. CEUR-WS.org, 2020. http://ceur-ws.org/Vol-2866/ceur_342-352palagin34.pdf