Adjustments to intros and conclusions

Signed-off-by: Riccardo Finotello <riccardo.finotello@gmail.com>
2020-10-13 17:48:21 +02:00
parent 295fe19683
commit 43a73d6beb
13 changed files with 260 additions and 249 deletions


@@ -1,9 +1,9 @@
- In this thesis we present topics in phenomenology of string theory ranging from particle physics amplitudes and Big Bang-like singularities to the study of state-of-the-art deep learning techniques based on recent advancements in artificial intelligence for string compactifications.
+ We present topics in phenomenology of string theory ranging from particle physics amplitudes and Big Bang-like singularities to the study of state-of-the-art deep learning techniques for string compactifications based on recent advancements in artificial intelligence.
- In particular we show the computation of the leading contribution to amplitudes in the presence of non Abelian twist fields in intersecting D-branes scenarios in non factorised tori.
+ We show the computation of the leading contribution to amplitudes in the presence of non-Abelian twist fields in intersecting D-brane scenarios on non-factorised tori.
This is a generalisation of the current literature, which mainly covers factorised internal spaces.
We also study a new method to compute amplitudes in the presence of an arbitrary number of spin fields, introducing point-like defects on the string worldsheet.
- This method can then be treated as an alternative computation with respect to bosonization and older approaches based on the Reggeon vertex.
+ This method can then be treated as an alternative to bosonisation and to approaches based on the Reggeon vertex.
We then present an analysis of Big Bang-like cosmological divergences in string theory on time-dependent orbifolds.
We show that the divergences are due not to gravitational feedback but to the lack of an underlying effective field theory.
We also introduce a new orbifold structure capable of fixing the issue and reinstating a distributional interpretation of field theory amplitudes.
@@ -14,7 +14,7 @@ We also include a methodological study of machine learning applied to data in st
We thus show how such an approach can help improve results by processing the data before it is used.
We then show how deep learning can reach the highest accuracy on the task with smaller networks and fewer parameters.
This is a novel approach to the task: unlike previous attempts, we focus on convolutional neural networks capable of reaching higher accuracy on the predictions and ensuring the phenomenological relevance of the results.
The approach is inspired by recent advancements in computer science, in particular by Google's research in the field.
In fact, parameter sharing and concurrent scans of the configuration matrix retain better generalisation properties and adapt to the task better than fully connected networks.
% vim: ft=tex
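
As context for the last paragraph of the abstract, the following is a minimal sketch of the kind of architecture it describes: two convolutional branches whose shared kernels scan a zero-padded configuration matrix concurrently along rows and columns, instead of a fully connected layer over the flattened matrix. The library (PyTorch), the 12x15 padding, the branch widths and the single regression output are illustrative assumptions for this sketch, not the networks studied in the thesis.

import torch
import torch.nn as nn

class ConfigMatrixCNN(nn.Module):
    """Toy convolutional model over zero-padded configuration matrices."""

    def __init__(self, height=12, width=15):
        super().__init__()
        # Shared kernels scan the matrix concurrently along its two directions:
        # one branch over rows, one over columns (sizes are illustrative).
        self.row_scan = nn.Conv2d(1, 16, kernel_size=(1, width))
        self.col_scan = nn.Conv2d(1, 16, kernel_size=(height, 1))
        self.head = nn.Sequential(
            nn.Linear(16 * height + 16 * width, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # e.g. a single Hodge number as a regression target
        )

    def forward(self, x):
        # x: a batch of zero-padded configuration matrices, shape (N, 1, height, width)
        rows = torch.relu(self.row_scan(x)).flatten(1)  # (N, 16 * height)
        cols = torch.relu(self.col_scan(x)).flatten(1)  # (N, 16 * width)
        return self.head(torch.cat([rows, cols], dim=1))

model = ConfigMatrixCNN()
batch = torch.zeros(8, 1, 12, 15)   # dummy stand-in for real configuration matrices
print(model(batch).shape)           # torch.Size([8, 1])

Because the same small kernels are reused at every position of the matrix, the parameter count stays far below that of a fully connected layer over the flattened input, which is one way to read the abstract's claim about parameter sharing and better generalisation.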