
Text Representation Learning for Matrix CapsNets


Description In recent years, deep learning has become very popular in the research community, in both fundamental and applied research. This has led to steady advances in existing and newly developed neural network architectures. Many architectures emerge in the field of computer vision and are subsequently adapted to further modalities such as text and audio. For example, convolutional neural networks, originally inspired by human visual perception, also achieve state-of-the-art performance on many text classification tasks.
A recent and unique innovation for image data are the so-called CapsNets [1] and their second generation, the Matrix Capsules [2], which improve the calculation of the network's forward pass. First attempts have already been made to adapt them to text, achieving remarkable results on topic and sentiment classification tasks [3][4]. The aim of this study is to explore new ways of learning text representations, such as (visual) character quantization [5], with matrix capsules and thus to investigate the combination of the text modality and capsule networks in greater depth.
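To make the idea of character quantization concrete, the following minimal Python sketch one-hot encodes a string into a fixed-size 2-D matrix that can be treated like a single-channel image. The alphabet and the fixed sequence length are illustrative assumptions in the spirit of [5], not values prescribed by this topic.

import numpy as np

# Illustrative alphabet and fixed document length in the spirit of [5];
# both values are assumptions, not prescribed by the thesis topic.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789-,;.!?:'\"/\\|_@#$%^&*~`+=<>()[]{}"
CHAR_TO_IDX = {c: i for i, c in enumerate(ALPHABET)}
SEQ_LEN = 1014

def quantize(text):
    # One-hot encode a string into a (SEQ_LEN, |alphabet|) matrix.
    # Unknown characters and padding positions stay all-zero, so the
    # result can be fed to a visual (2-D) input layer like an image.
    mat = np.zeros((SEQ_LEN, len(ALPHABET)), dtype=np.float32)
    for pos, char in enumerate(text.lower()[:SEQ_LEN]):
        idx = CHAR_TO_IDX.get(char)
        if idx is not None:
            mat[pos, idx] = 1.0
    return mat

x = quantize("Great product, would buy again!")
print(x.shape)  # (SEQ_LEN, len(ALPHABET))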
 
Task In this thesis, the student designs and implements a new text representation suitable for a visual input layer. In addition, this text representation is compared to others previously used with CapsNets, such as word2vec visual embeddings and an N-gram convolutional layer. For this purpose, a benchmark is performed on the popular NLP task of text sentiment classification using the Amazon Review 5-class polarity dataset. The implementation of the Matrix Capsules in TensorFlow can be based on [6].
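As a rough illustration of how such a quantized text "image" could enter a visual input layer, the following TensorFlow sketch feeds a matrix as produced above into a ReLU convolutional stem like the one that precedes the primary capsules in [2]. The dense softmax head is only a placeholder for the capsule layers and EM routing of [6], and all layer sizes are assumptions.

import tensorflow as tf

SEQ_LEN, ALPHABET_SIZE = 1014, 68  # must match the quantization sketch above

# The quantized character matrix enters the network as a one-channel image.
inputs = tf.keras.Input(shape=(SEQ_LEN, ALPHABET_SIZE, 1))
# ReLU conv stem as used before the primary capsules in [2]; kernel size,
# stride, and filter count are illustrative assumptions.
stem = tf.keras.layers.Conv2D(32, kernel_size=5, strides=2,
                              activation="relu")(inputs)
# Placeholder head: the capsule layers and EM routing of [6] would go here.
pooled = tf.keras.layers.GlobalAveragePooling2D()(stem)
outputs = tf.keras.layers.Dense(5, activation="softmax")(pooled)  # 5 polarity classes

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()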
Utilises TensorFlow, Matrix CapsNets, Representation Learning.
Requirements Advanced knowledge in Machine Learning and Natural Language Processing; good programming skills (e.g. Python, C++).
Languages English or German.
Supervisor Lukas Stappen, M. Sc.
 (lukas.stappen@informatik.uni-augsburg.de)
Shahin Amiriparian, M. Sc.
 (shahin.amiriparian@informatik.uni-augsburg.de)