Date of Publication
12-16-2019
Document Type
Master's Thesis
Degree Name
Master of Science in Computer Science
Subject Categories
Computer Sciences
College
College of Computer Studies
Department/Unit
Computer Science
Thesis Adviser
Ethel Chua Joy Ong
Defense Panel Chair
Charibeth Cheng
Defense Panel Member
Nathalie Rose Lim-Cheng
Ethel Chua Joy Ong
Abstract/Summary
The area of story content generation has been widely explored in the field of natural language processing. Previously, analogy-based methodologies have been used to approach this task. However, with the improvement of technology, more and more research has turned to recurrent neural networks, specifically long short-term memory (LSTM) networks, to accomplish this task. These works train LSTM models to learn the story, either through sequences of scenes or, more commonly, through sequences of action events, and have the models predict a subsequent event from this input. While the approach proves to be more complex, general findings from these studies indicate an inability to provide consistently decent responses, a recurring problem attributed mainly to the poor quality of the training dataset. This research takes this opportunity and provides an alternative method of extracting and encoding event sequences in stories. A performance analysis of the event extraction system over 8 stories yielded an F1 score of 82%. The effectiveness of the encoding was evaluated by using the encoded events extracted from a set of children's stories to train an LSTM network. Results show the system's ability to generate a decent response more than half the time, with its ability limited by the current size of the dataset.
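As a rough illustration of the next-event prediction setup described in the abstract, the sketch below trains an LSTM on integer-encoded event sequences. It is not the thesis' actual implementation; the vocabulary size, window length, layer sizes, library choice (Keras), and the toy data are all assumptions made for illustration only.

# Hypothetical sketch: LSTM next-event prediction over encoded event sequences.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE = 500   # number of distinct encoded events (assumed)
SEQ_LEN = 5        # number of context events per prediction (assumed)

# Toy data: each row of X is a window of SEQ_LEN event ids; y is the next event id.
X = np.random.randint(0, VOCAB_SIZE, size=(1000, SEQ_LEN))
y = np.random.randint(0, VOCAB_SIZE, size=(1000,))

model = Sequential([
    Embedding(input_dim=VOCAB_SIZE, output_dim=64),  # dense embeddings for events
    LSTM(128),                                       # sequence model over the event window
    Dense(VOCAB_SIZE, activation="softmax"),         # distribution over the next event
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Predict the most likely next event for a new context window.
context = np.random.randint(0, VOCAB_SIZE, size=(1, SEQ_LEN))
next_event = int(model.predict(context, verbose=0).argmax(axis=-1)[0])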
Abstract Format
html
Language
English
Format
Electronic
Accession Number
CDTG007957
Keywords
Natural language generation (Computer science); Parsing (Computer grammar); Computational linguistics
Upload Full Text
wf_yes
Recommended Citation
Villaluna, W. D. (2019). Extracting and encoding event sequences for use in recurrent neural networks. Retrieved from https://animorepository.dlsu.edu.ph/etd_masteral/6517
Embargo Period
12-5-2022