Corrections and adjustments to the introduction
Signed-off-by: Riccardo Finotello <riccardo.finotello@gmail.com>
@@ -1,4 +1,4 @@
-We present topics of phenomenological relevance in string theory ranging from particle physics amplitudes and Big Bang-like singularities to the study of state-of-the-art deep learning techniques for string compactifications based on recent advancements in artificial intelligence.
+We present topics of (semi-)phenomenological relevance in string theory ranging from particle physics amplitudes and Big Bang-like singularities to the study of state-of-the-art deep learning techniques for string compactifications based on recent advancements in artificial intelligence.
 
 We show the computation of the leading contribution to amplitudes in the presence of non-Abelian twist fields in intersecting D-brane scenarios on non-factorised tori.
 This is a generalisation of the current literature, which mainly covers factorised internal spaces.
@@ -11,8 +11,8 @@ We also introduce a new orbifold structure capable of fixing the issue and reins
 We finally present a new artificial intelligence approach to algebraic geometry and string compactifications.
 We compute the Hodge numbers of Complete Intersection Calabi--Yau $3$-folds using deep learning techniques based on computer vision and object recognition.
 We also include a methodological study of machine learning applied to data in string theory: as in most applications, machine learning almost never relies on the blind application of algorithms to the data but requires careful exploratory analysis and feature engineering.
-We thus show how such an approach can help in improving results by processing the data before using it.
-We then show that the deep learning approach can reach the highest accuracy in the task with smaller networks, less parameters and less data.
+We thus show how such an approach can help in improving results by processing the data before utilising them.
+We then show that deep learning the configuration matrix of the manifolds reaches the highest accuracy in the task with smaller networks, fewer parameters and less data.
 This is a novel approach to the task: unlike previous attempts, we focus on convolutional neural networks, which reach higher accuracy on the predictions and ensure the phenomenological relevance of the results.
 In fact, parameter sharing and concurrent scans of the configuration matrix yield better generalisation properties and adapt better to the task than fully connected networks.
 
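As an illustration of the approach described in the revised abstract, the following is a minimal sketch (not the thesis code) of a convolutional network that scans a CICY configuration matrix to predict a single Hodge number. The 12x15 input shape, the layer sizes and the use of PyTorch are illustrative assumptions, not details taken from the commit.

# Minimal sketch: treat the CICY configuration matrix as a one-channel
# "image" and let a small convolutional network regress a Hodge number.
# Shapes and layer sizes are assumptions for illustration only.
import torch
import torch.nn as nn

class CICYConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Convolutions share parameters while scanning rows (projective-space
        # factors) and columns (defining equations) of the matrix.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to one feature vector per matrix
        )
        self.head = nn.Linear(64, 1)  # regression output for one Hodge number

    def forward(self, x):
        # x: batch of configuration matrices, shape (B, 1, 12, 15),
        # zero-padded to a common size.
        return self.head(self.features(x).flatten(1))

# Random batch standing in for padded configuration matrices.
model = CICYConvNet()
batch = torch.randint(0, 6, (8, 1, 12, 15)).float()
print(model(batch).shape)  # torch.Size([8, 1])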