
Luca Rubini

  • PhD: 29th cycle
  • Matriculation number: 700096

PhD thesis

This thesis focuses on the training of Single Layer Feed Forward Neural Networks by pseudoinversion methods. This family of techniques uses the pseudoinverse to compute the output-layer weights given an initial configuration of input weights; because the input weights stay fixed, training is faster than classical backpropagation. Although pseudoinversion-based approaches are popular for their effectiveness and lower computational cost, they sometimes suffer from numerical instability and overfitting, which can dramatically compromise learning performance; we therefore build on the framework of Tikhonov regularisation, proposing techniques to overcome these issues and to improve the generalisation capabilities of the model.
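To make the scheme concrete, the following minimal Python sketch trains a single-hidden-layer network in this fashion, assuming a tanh activation and Gaussian random input weights; the function names and defaults are illustrative, not the exact thesis implementation:

    import numpy as np

    def train_slfn(X, T, n_hidden, seed=0):
        """X: (n_samples, n_features) inputs, T: (n_samples, n_outputs) targets."""
        rng = np.random.default_rng(seed)
        # Input weights and biases are drawn at random and never trained.
        W_in = rng.standard_normal((X.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = np.tanh(X @ W_in + b)        # hidden-layer output matrix
        # Output weights solve min ||H W_out - T|| via the Moore-Penrose pseudoinverse.
        W_out = np.linalg.pinv(H) @ T
        return W_in, b, W_out

    def predict(X, W_in, b, W_out):
        return np.tanh(X @ W_in + b) @ W_out

Only the final matrix product is learned; everything before it is a fixed random feature map, which is what makes training fast.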


In the first part of this thesis we focus on setting the input weights through random projections, a simple and powerful tool for handling large amounts of complex data. We show that doing so often saves computational time and memory; we then analyse the impact on performance when the completely random sampling of the input-weight space is replaced by a multi-start local search strategy from the operational research domain.
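As an illustration of why random projections save memory, a sparse Achlioptas-style projection draws input weights from {-sqrt(s), 0, +sqrt(s)}, so most entries are exactly zero. The sketch below uses the common choice s = 3, which is an assumption rather than the thesis setting:

    import numpy as np

    def sparse_random_projection(d_in, n_hidden, s=3, seed=0):
        # Achlioptas-style entries: +/- sqrt(s) with probability 1/(2s) each, 0 otherwise.
        rng = np.random.default_rng(seed)
        vals = np.array([np.sqrt(s), 0.0, -np.sqrt(s)])
        probs = [1 / (2 * s), 1 - 1 / s, 1 / (2 * s)]
        idx = rng.choice(3, size=(d_in, n_hidden), p=probs)
        return vals[idx]

With s = 3, roughly two thirds of the entries are zero, so the projection can be stored and applied sparsely.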


In the second part of this thesis we propose a procedure named OCReP, which uses the condition number as a diagnostic tool for identifying instability. By imposing well-conditioning requirements on the relevant matrices, our theoretical analysis identifies an optimal value for the regularisation parameter without resorting to classical k-fold cross-validation schemes.
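A minimal sketch of the underlying idea, condition-number-driven Tikhonov regularisation, is to choose the parameter so that the regularised normal matrix H'H + lam*I meets a target condition number kappa. The closed form below follows from the singular values of H and is an illustrative derivation, not the exact OCReP formula:

    import numpy as np

    def regularised_output_weights(H, T, kappa=1e6):
        # Singular values of the hidden-layer matrix H, in descending order.
        s = np.linalg.svd(H, compute_uv=False)
        smax2, smin2 = s[0] ** 2, s[-1] ** 2
        # Solve (smax2 + lam) / (smin2 + lam) = kappa for lam;
        # clip at 0 if H is already well conditioned.
        lam = max((smax2 - kappa * smin2) / (kappa - 1), 0.0)
        n_hidden = H.shape[1]
        W_out = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
        return W_out, lam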

We test the proposed ideas on several UCI benchmark datasets for both regression and classification tasks, observing considerable improvements over state-of-the-art methods.

Last update: 08/05/2017 11:31