Boxue Forum, Session 3: Self-Attentive Hawkes Process


Posted by: Lin Zhuliang
Topic
Self-Attentive Hawkes Process
Date
-
Venue
Haibin Red Building No. 13, Zhuhai Campus
Speaker
Zhang Qiang

Abstract:
Capturing the occurrence dynamics is crucial to predicting which type of event will happen next and when. A common way to do this is through Hawkes processes. To enhance their capacity, recurrent neural networks (RNNs) have been incorporated, owing to RNNs' success in processing sequential data such as language. Recent evidence suggests that self-attention is more competent than RNNs at handling language; however, the effectiveness of self-attention in the context of Hawkes processes remains unexplored. This study aims to fill that gap by designing a self-attentive Hawkes process (SAHP). SAHP employs self-attention to summarise the influence of historical events and compute the probability of the next event. One deficit of conventional self-attention, when applied to event sequences, is that its positional encoding considers only the order of a sequence, ignoring the time intervals between events. To overcome this deficit, we modify the encoding by translating time intervals into phase shifts of sinusoidal functions. Experiments on goodness-of-fit and prediction tasks show the improved capability of SAHP. Furthermore, SAHP is more interpretable than RNN-based counterparts, because the learnt attention weights reveal the contributions of one event type to the occurrence of another. To the best of our knowledge, this is the first work to study the effectiveness of self-attention in Hawkes processes.
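The key modification mentioned in the abstract, translating inter-event time intervals into phase shifts of the sinusoidal positional encoding, can be sketched as follows. This is only an illustration of the idea, not the paper's exact formulation: the function name, the fixed phase scale `alpha`, and the per-dimension frequency schedule are assumptions here (in the actual model such parameters would typically be learnt).

```python
import numpy as np

def time_shifted_encoding(positions, intervals, d_model, alpha=1.0):
    """Sinusoidal positional encoding in which each event's time interval
    enters as a phase shift, so two events at the same sequence position
    but with different inter-event gaps get different encodings.

    positions : event indices in the sequence (0, 1, 2, ...)
    intervals : time elapsed since the previous event of each position
    d_model   : encoding dimension (must be even)
    alpha     : hypothetical scale mapping time to phase (fixed here)
    """
    enc = np.zeros((len(positions), d_model))
    for i, (pos, dt) in enumerate(zip(positions, intervals)):
        for k in range(0, d_model, 2):
            # standard Transformer-style frequency for dimension pair k
            freq = 1.0 / (10000.0 ** (k / d_model))
            phase = alpha * dt  # the time interval becomes a phase shift
            enc[i, k] = np.sin(pos * freq + phase)
            enc[i, k + 1] = np.cos(pos * freq + phase)
    return enc
```

With all intervals set to zero this reduces to the ordinary order-only sinusoidal encoding; nonzero intervals perturb the phases, which is how temporal information reaches the self-attention layers in this sketch.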