Word Embeddings and Their Application in Deep Learning
Parul Verma1, Brijesh Khandelwal2
1Parul Verma, Assistant Professor, Dept. of IT, Amity Institute of Information Technology, Amity University Uttar Pradesh, Lucknow, U.P., India.
2Brijesh Khandelwal, Associate Professor, Dept. of Computer Science, Amity School of Engineering & Technology, Amity University Chhattisgarh, Raipur, Chhattisgarh, India.
Manuscript received on 29 August 2019. | Revised Manuscript received on 13 September 2019. | Manuscript published on 30 September 2019. | PP: 337-341 | Volume-8 Issue-11, September 2019. | Retrieval Number: K13430981119/2019©BEIESP | DOI: 10.35940/ijitee.K1343.0981119
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: Word embedding, in simple terms, can be defined as representing text in the form of vectors. Vector representations of text help in finding similarities, because words that regularly appear in similar contexts tend to lie in close proximity in the vector space. The motivating factor behind such a numerical representation of a text corpus is that it can be manipulated arithmetically just like any other vector. Deep learning and neural networks are not new at all; both concepts have been around for decades, but a major bottleneck was the unavailability and inaccessibility of computational power. Deep learning is now being used effectively in Natural Language Processing, aided by improvements in techniques such as word embedding, mobile enablement, and attention mechanisms. This paper discusses the two popular word-embedding architectures of the Word2Vec model that can be used for deep learning, and compares them. The implementation steps of the Skip-gram model are also discussed in the paper, along with challenging issues for the Word2Vec model.
Keywords: Deep Learning, CBOW model, Skip-gram model, Word Embedding, Word2Vec