
Context based Text-generation using LSTM networks

Santhanam, Sivasurya (2018) Context based Text-generation using LSTM networks. Artificial Intelligence International Conference – A2IC 2018, 21-23 Nov 2018, Barcelona, Spain.

Full text: PDF, 1MB

Abstract

Long short-term memory (LSTM) units in sequence models are used in translation, question-answering systems, and classification tasks because of their ability to learn long-term dependencies. Text generation models, an application of LSTMs, have recently become popular due to their impressive results. LSTM models applied to natural language are good at learning grammatically stable syntax, but the downside is that the system has no notion of context: given a set of input words, it generates text irrespective of the use case. The proposed system trains the model to generate words given the input words together with a context vector. Depending on the use case, the context vector is derived per sentence or per paragraph; it could be a topic (from topic models), the word with the highest tf-idf weight in the sentence, or a vector computed from word clusters. During training, the same context vector is applied to every window within a sentence when predicting the successive word, so the model learns the relation between the context vector and the target word. At prediction time, the user can provide keywords or topics to steer the system towards generating text around a certain context. Beyond the syntactic structure captured by current text-generation models, the proposed model thus also provides semantic consistency. Based on how the context vectors are computed, the model has been tried in two variations (tf-idf and word clusters). The proposed system could be applied in question-answering systems, to respond on a relevant topic, and in text generation of stories from given hints. The results are to be evaluated manually, judging how semantically close the generated text is to the given context words.
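The two ingredients the abstract describes — a per-sentence tf-idf context word and training windows that repeat that context for every successive-word prediction — can be sketched as follows. This is a minimal illustration under our own assumptions (whitespace tokenization, a raw tf-idf score, the helper names `tfidf_context` and `training_pairs` are hypothetical), not the paper's implementation:

```python
# Hypothetical sketch of the tf-idf variant of context-based training data.
import math
from collections import Counter

def tfidf_context(sentences):
    """Pick the highest-tf-idf word of each sentence as its context word."""
    docs = [s.split() for s in sentences]
    df = Counter(w for d in docs for w in set(d))  # document frequency
    n = len(docs)
    contexts = []
    for d in docs:
        tf = Counter(d)  # term frequency within the sentence
        contexts.append(max(d, key=lambda w: tf[w] * math.log(n / df[w])))
    return contexts

def training_pairs(sentence, context, window=2):
    """Build (input, target) pairs: the input couples the sentence-level
    context with a sliding window of words; the target is the next word.
    The same context repeats across every window of the sentence."""
    words = sentence.split()
    return [((context, tuple(words[i:i + window])), words[i + window])
            for i in range(len(words) - window)]
```

In the paper's full model the context would be a vector (topic vector, tf-idf word embedding, or word-cluster centroid) fed into the LSTM alongside each window; at prediction time a user-supplied keyword takes the place of the derived context.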

elib URL of this entry: https://elib.dlr.de/139595/
Document type: Conference contribution (talk)
Additional information: Generation of a sequence of text based on a specific context using long short-term memory neural networks is proposed in this work.
Title: Context based Text-generation using LSTM networks
Authors:
Santhanam, Sivasurya | Sivasurya.Santhanam (at) dlr.de | ORCID: https://orcid.org/0000-0001-5117-8288 | ORCID Put Code: not specified
Date: November 2018
Refereed publication: No
Open Access: Yes
Gold Open Access: No
In SCOPUS: No
In ISI Web of Science: No
Status: published
Keywords: Natural language processing, Machine learning, Text generation, Neural networks
Event title: Artificial Intelligence International Conference – A2IC 2018
Event location: Barcelona, Spain
Event type: international conference
Event dates: 21-23 Nov 2018
Organizer: PremC
HGF research field: Aeronautics, Space and Transport
HGF programme: Space
HGF programme topic: Space Systems Technology
DLR focus area: Space
DLR research area: R SY - Space Systems Technology
DLR sub-area (project): R - Project SISTEC (old)
Location: Köln-Porz
Institutes & facilities: Institute for Simulation and Software Technology > Distributed Systems and Component Software
Institute for Simulation and Software Technology
Deposited by: Santhanam, Sivasurya
Deposited on: 14 Dec 2020 09:28
Last modified: 14 Dec 2020 09:28

