Planned seminars

Europe/Lisbon

Taban Baghfalaki, Bordeaux University, Bordeaux, France

Dynamic event prediction, using joint modeling of survival time and longitudinal variables, is extremely useful in personalized medicine. However, estimating joint models that include multiple longitudinal markers remains a computational challenge because of the large number of random effects and parameters to be estimated. We propose a model-averaging strategy to combine predictions of the event from several joint models, each including only one longitudinal marker or a pair of longitudinal markers. The prediction is computed as the weighted mean of the predictions from the one-marker and two-marker models, with time-dependent weights estimated by minimizing the time-dependent Brier score. This method makes it possible to combine a large number of predictions issued from joint models to achieve a reliable and accurate individual prediction. The advantages and limitations of the proposed method are highlighted by comparing its predictions with those from well-specified and misspecified all-marker joint models, as well as from one-marker and two-marker joint models, using the PBC2 dataset to predict the risk of death in patients with primary biliary cirrhosis. The method is also applied to the French 3C cohort study, in which seventeen longitudinal markers are considered to predict the risk of death.
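As a rough illustration of the averaging step described in the abstract, the sketch below estimates convex weights that minimize a Brier score at a fixed prediction horizon and forms the weighted-mean prediction. It is a minimal Python sketch, not the authors' implementation: the function names, the plain (non-censoring-adjusted) Brier score, and the choice of optimizer are assumptions.

import numpy as np
from scipy.optimize import minimize

def brier_weights(pred_matrix, event_observed):
    """Estimate convex weights w (w >= 0, sum w = 1) minimizing the Brier
    score of the averaged prediction at a fixed horizon.

    pred_matrix   : (n_subjects, n_models) predicted event probabilities
    event_observed: (n_subjects,) binary event indicator at the horizon
    (Hypothetical names; no censoring adjustment in this toy version.)
    """
    n_models = pred_matrix.shape[1]

    def brier(w):
        averaged = pred_matrix @ w                      # weighted-mean prediction
        return np.mean((event_observed - averaged) ** 2)

    w0 = np.full(n_models, 1.0 / n_models)              # start from equal weights
    res = minimize(
        brier, w0, method="SLSQP",
        bounds=[(0.0, 1.0)] * n_models,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    )
    return res.x

# Toy usage: predictions from 3 candidate models for 5 subjects
preds = np.array([[0.2, 0.4, 0.1],
                  [0.7, 0.6, 0.9],
                  [0.1, 0.3, 0.2],
                  [0.8, 0.5, 0.7],
                  [0.3, 0.2, 0.4]])
events = np.array([0, 1, 0, 1, 0])
weights = brier_weights(preds, events)
averaged_prediction = preds @ weights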

Joint seminar CEMAT and CEAUL

Europe/Lisbon

Cécile Mercadier, Université Claude Bernard – Lyon 1, France
To be announced

Joint seminar CEMAT and CEAUL

Europe/Lisbon
Room P3.10, Mathematics Building, Instituto Superior Técnico (https://tecnico.ulisboa.pt)

Diogo Pereira, CEMAT, Instituto Superior Técnico

The maximum likelihood problem for Hidden Markov Models is usually solved numerically by the Baum-Welch algorithm, which uses the Expectation-Maximization algorithm to compute the parameter estimates. This algorithm has a recursion depth equal to the sample size and cannot be computed in parallel, which limits the use of modern GPUs to speed up computation. A new algorithm is proposed that provides the same estimates as the Baum-Welch algorithm and requires about the same number of iterations, but is designed in such a way that it can be parallelized. As a consequence, it leads to a significant reduction in computation time. We illustrate this by means of numerical examples, considering simulated data as well as real datasets.
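For context, the sketch below shows the standard scaled forward recursion of the Baum-Welch E-step, whose step-by-step dependence over the T observations is the sequential bottleneck the abstract refers to. It is an illustrative NumPy sketch with assumed variable names, not the parallel algorithm presented in the talk.

import numpy as np

def forward(pi, A, B, obs):
    """Scaled forward recursion for a discrete-emission HMM.

    pi : (K,)   initial state distribution
    A  : (K, K) transition matrix, A[i, j] = P(state j | state i)
    B  : (K, M) emission matrix,  B[i, o] = P(symbol o | state i)
    obs: (T,)   observation sequence of integer symbols
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))
    loglik = 0.0
    alpha[0] = pi * B[:, obs[0]]
    c = alpha[0].sum(); alpha[0] /= c; loglik += np.log(c)
    for t in range(1, T):                    # sequential: step t depends on step t-1
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c = alpha[t].sum(); alpha[t] /= c; loglik += np.log(c)
    return alpha, loglik

Because each step is a matrix-vector product, the recursion can in principle be rewritten as a product of per-observation matrices and evaluated with a parallel prefix scan; whether the proposed algorithm takes this or another route is not stated in the abstract.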

Joint seminar CEMAT and CEAUL