Geometric Data Analysis (M104)

Announcements

Assignment of coursework for the final assessment.
- Thursday, January 7, 2021 at 2:42 PM -

Good morning and happy new year.

In collaboration with Prof. Emiris, as part of the course assessment we propose a literature review on 3 topics (listed below), centered on the publications suggested for each of them.

Your assessment will take place (on a date to be announced in good time) in the form of a 30-minute presentation (plus questions).

The 5 students taking the course for credit should, after coordinating among yourselves, split into 3 groups (e.g. 2+2+1; the single-member group will obviously have a lighter workload) and let us know by January 12 via an email to Prof. Emiris with a copy to me.


Best wishes,

M. Vazirgiannis
===================

1. Graph convolutional Networks

- The first graph neural network model came from Scarselli et al. 2009
[1].
- The most impactful work after Scarselli came from Kipf & Welling 2016
[2], who introduced GCN, a matrix-form spectral neural network for
graphs that incorporates the graph topology through matrix
multiplication.
- A next crucial step was recasting the GCN as a propagation scheme over
node neighborhoods, as in GraphSAGE by Hamilton et al. 2017 [3]. This
work highlighted a first connection between spectral-based GCNs and
spatial-based GNNs, the latter being described via local aggregation
rules.
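The GCN propagation rule described above, H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W), can be sketched in a few lines of NumPy. This is an illustrative sketch; the toy graph, features, and weights are ours, not from the papers:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W).

    Sketch of the Kipf & Welling rule; all names are illustrative.
    """
    A_tilde = A + np.eye(A.shape[0])           # add self-loops
    d = A_tilde.sum(axis=1)                    # degrees of the augmented graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))     # D^-1/2
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_hat @ H @ W, 0.0)      # linear transform + ReLU

# toy 3-node path graph, 2-dim features, identity weights
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.eye(3, 2)
W = np.eye(2)
H_next = gcn_layer(A, H, W)
print(H_next)
```

Note how the graph topology enters purely through the matrix product with the normalized adjacency A_hat, which is exactly the point of the bullet above.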

[1]
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1015.7227&rep=rep1&type=pdf
[2] https://arxiv.org/abs/1609.02907

[3] https://arxiv.org/abs/1706.02216
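The spatial, locally aggregational view that GraphSAGE exemplifies can be sketched as a mean-aggregator layer. Using separate self/neighbor weight matrices is one common variant (the original concatenates the two vectors); all names and toy inputs below are illustrative:

```python
import numpy as np

def sage_mean_layer(adj_list, H, W_self, W_neigh):
    """One GraphSAGE-style mean-aggregation step (illustrative sketch):
    h_i' = ReLU(h_i W_self + mean_{j in N(i)} h_j W_neigh)."""
    out = np.empty((len(adj_list), W_self.shape[1]))
    for i, neigh in enumerate(adj_list):
        # average the neighbors' current representations
        m = H[neigh].mean(axis=0) if neigh else np.zeros(H.shape[1])
        out[i] = H[i] @ W_self + m @ W_neigh
    return np.maximum(out, 0.0)

adj_list = [[1], [0, 2], [1]]   # 3-node path as adjacency lists
H = np.eye(3)                   # one-hot node features
W_self = np.ones((3, 2))
W_neigh = np.ones((3, 2))
out = sage_mean_layer(adj_list, H, W_self, W_neigh)
print(out)
```

Unlike the spectral GCN above, nothing here requires the full adjacency matrix: each node only touches its local neighborhood, which is what makes this family of models inductive.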

2. Graph attention networks and expressiveness

- Most works until 2017 described isotropic graph learning
architectures, which treated all neighbors of a node equally. As a
complement to that, Velickovic et al. 2017 proposed Graph Attention
Networks [4], where the notion of attention over graph neighborhoods
was introduced.
- Since 2018, there has been increasing interest in measuring and
extending the expressivity of graph neural networks. Xu et al. 2018/2019
[5] presented one of the first efforts to characterize expressive power
in terms of equivalence with the Weisfeiler-Lehman (WL) isomorphism
test.
- In parallel with the WL comparison, efforts have been made to
understand the role of depth in GNNs and why most deep GNN models fail
to perform on par with shallow ones, such as the findings of Oono &
Suzuki 2019/2020 [6].
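The anisotropic aggregation of GAT amounts to computing, for each node, softmax-normalized attention weights over its neighborhood (self-loops included). A single-head NumPy sketch; variable names and toy inputs are ours, not from the paper:

```python
import numpy as np

def gat_attention(A, H, W, a, slope=0.2):
    """Single-head graph attention coefficients (Velickovic et al. sketch).

    Returns a row-stochastic matrix: att[i, j] is how much node i
    attends to neighbor j. All names are illustrative.
    """
    n = A.shape[0]
    Z = H @ W                                   # transformed features
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # e_ij = LeakyReLU(a^T [z_i || z_j])
            s = a @ np.concatenate([Z[i], Z[j]])
            e[i, j] = s if s > 0 else slope * s
    mask = (A + np.eye(n)) > 0                  # attend only to neighbors + self
    e = np.where(mask, e, -np.inf)
    e = e - e.max(axis=1, keepdims=True)        # numerically stable softmax
    att = np.exp(e)
    att /= att.sum(axis=1, keepdims=True)
    return att

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 2))
a = rng.normal(size=4)
att = gat_attention(A, H, W, a)
print(att)
```

The key contrast with the isotropic models above is that the weights over a neighborhood are learned (through a and W) rather than fixed by the degrees.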


[4] https://arxiv.org/abs/1710.10903
[5] https://arxiv.org/abs/1810.00826
[6] https://arxiv.org/pdf/1905.10947.pdf
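The oversmoothing effect studied in [6] can be observed numerically: repeatedly applying the normalized propagation matrix collapses node representations toward a one-dimensional subspace (spanned by the square roots of the node degrees), so nodes become nearly indistinguishable. A minimal demo on an illustrative toy graph:

```python
import numpy as np

def norm_adj(A):
    """GCN-style symmetrically normalized adjacency with self-loops."""
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    return A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def spread(H):
    """Largest pairwise distance between node representations."""
    n = len(H)
    return max(np.linalg.norm(H[i] - H[j])
               for i in range(n) for j in range(n))

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)            # connected 4-node path
H = np.array([[1., 0.], [0., 1.], [1., 1.], [0., 0.]])  # distinct features

A_hat = norm_adj(A)
for k in (0, 5, 50):
    # k smoothing steps (no weights or non-linearity, to isolate the effect)
    Hk = np.linalg.matrix_power(A_hat, k) @ H
    print(k, round(spread(Hk), 4))
```

The printed spread shrinks sharply with depth: after many propagation steps the feature matrix is numerically rank one, which is the collapse that deep GCN stacks suffer from.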

3. Graph based methods for NLP

Graphs have proven very effective for NLP tasks. The following papers
present the original graph-of-words approach [10], aspects such as
keyword extraction [7], and GNN-based approaches [9].
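As a sketch of the graph-of-words idea behind [10] and the main-core keyword extraction of [7]: terms become nodes, edges link terms co-occurring within a sliding window, and the nodes of the maximum k-core are retained as keywords. The window size, tokenization, and toy sentence below are illustrative simplifications:

```python
from collections import defaultdict

def graph_of_words(tokens, window=3):
    """Unweighted graph-of-words: terms as nodes, edges between terms
    co-occurring within a sliding window (illustrative sketch)."""
    adj = defaultdict(set)
    for i, u in enumerate(tokens):
        for v in tokens[i + 1:i + window]:
            if u != v:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def main_core(adj):
    """Nodes of the maximum k-core, by iterative degree pruning."""
    adj = {u: set(vs) for u, vs in adj.items()}
    best = set(adj)
    k = 0
    while adj:
        k += 1
        changed = True
        while changed:                      # cascade removals at level k
            low = [u for u in adj if len(adj[u]) < k]
            changed = bool(low)
            for u in low:
                for v in adj.pop(u):
                    if v in adj:
                        adj[v].discard(u)
        if adj:
            best = set(adj)                 # k-core still non-empty
    return best

tokens = "graph methods for nlp use graph of words for nlp tasks".split()
core = main_core(graph_of_words(tokens))
print(sorted(core))
```

The pruning exploits that k-cores are nested, so the surviving nodes at the largest non-empty level form the main core, i.e. the retained keywords.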

[7] F. Rousseau, M. Vazirgiannis. Main core retention on graph-of-words
for single-document keyword extraction. European Conference on
Information Retrieval, pp. 382-393.
[9] G. Nikolentzos, A. Tixier, M. Vazirgiannis. Message passing
attention networks for document understanding. Proceedings of the AAAI
Conference on Artificial Intelligence 34 (05), pp. 8544-8551.
[10] F. Rousseau, M. Vazirgiannis. Graph-of-word and TW-IDF: new
approach to ad hoc IR. Proceedings of the 22nd ACM CIKM, 2013.