
Genchi, Walter (2019) Design compact and efficient recurrent neural networks for natural language processing tasks. [Magistrali biennali]

Full text available as: PDF (2216 kB)

Abstract

The present work examines the compactness and efficiency of Recurrent Neural Networks (RNNs) for solving Natural Language Processing (NLP) tasks. RNNs are a class of Artificial Neural Networks (ANNs). Compared to Feed-forward Neural Networks (FNNs), the RNN architecture is cyclic, i.e. the connections between nodes form cycles. This subtle difference has a huge impact on solving sequence-based problems such as NLP tasks. In particular, the first advantage of RNNs is their ability to model long-range time dependencies, a very desirable property for natural language data, where a word's meaning is highly dependent on its context. The second advantage of RNNs is that they are flexible and accept many different data types and representations as input. This is again the case for natural language data, which can come in different sizes, e.g. words of different lengths, and types, e.g. sequences or trees.
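To make the cyclic-connection idea concrete, the following is a minimal sketch of the standard Elman recurrence, h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h). The dimensions, weights, and function names are illustrative assumptions, not code from the thesis itself.

import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # One step of a vanilla (Elman) RNN: the new hidden state depends
    # on both the current input x_t and the previous hidden state
    # h_{t-1}, which is what makes the connection graph cyclic.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Toy dimensions (hypothetical): 4-dimensional inputs, 8-dimensional state.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 4, 8, 5
W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

# Unrolling the cycle over a sequence: the same weights are reused at
# every time step, so information from early inputs can persist in the
# hidden state, which is how long-range dependencies are carried.
h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(seq_len, input_dim)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h.shape)  # (8,)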

Item Type: Magistrali biennali (two-year Master's thesis)
Degree course: Scuola di Scienze > Matematica
Uncontrolled Keywords: Neural network, deep learning, NLP, RNN
Subjects: Area 01 - Scienze matematiche e informatiche > INF/01 Informatica
ID Code: 62766
Supervisor: Tolomei, Gabriele
Co-supervisor: Chernyak, Ekaterina
Thesis date: 19 July 2019
Library: Polo di Scienze > Biblioteca di Matematica
Document access type: online for full texts
Experimental (Yes) or compilation (No) thesis?: Yes
