Tutorials, Sunday June 27

09:30-13:00 T1. Structural Decomposition Methods: Identifying easy instances of hard problems
Francesco Scarcello and Gianluigi Greco
15:00-18:30 T2. Multivariate Models for Relational Learning
Volker Tresp

T1. Structural Decomposition Methods: Identifying easy instances of hard problems

Many NP-hard problems in different application areas, ranging, e.g., from AI and Database Theory to Game Theory, are known to be efficiently solvable when restricted to instances whose underlying structures can be modelled via acyclic graphs or hypergraphs. However, structures arising from real applications are rarely precisely acyclic. Yet, they are often not very intricate and, in fact, tend to exhibit only a limited degree of cyclicity, which suffices to retain most of the nice properties of acyclic structures. Therefore, considerable effort has been devoted to investigating invariants that are best suited to identify nearly-acyclic graphs/hypergraphs, leading to the definition of a number of so-called structural decomposition methods.

The tutorial will illustrate the main structural decomposition methods proposed in the literature within a unifying framework, discuss their game-theoretic and logical characterizations, and present the algorithmic approaches that can be used to efficiently solve nearly-acyclic instances, by looking at the so-called islands of tractability. As a reference scenario for the exposition, the well-known problem of efficiently evaluating conjunctive queries in relational databases will be considered.
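As a purely illustrative aside (not part of the tutorial material), the following Python sketch implements the classical GYO reduction, which decides whether a hypergraph, given as a list of vertex sets such as the variable sets of a conjunctive query's atoms, is alpha-acyclic; the function name and the sample queries are assumptions made for this note.

    from collections import Counter

    def is_alpha_acyclic(hyperedges):
        """GYO reduction: True iff the hypergraph is alpha-acyclic.

        `hyperedges` is a list of vertex collections, e.g. the variable
        sets of the atoms of a conjunctive query.
        """
        edges = [set(e) for e in hyperedges]
        changed = True
        while changed:
            changed = False

            # Rule 1: delete every vertex that occurs in exactly one hyperedge.
            counts = Counter(v for e in edges for v in e)
            for e in edges:
                lonely = {v for v in e if counts[v] == 1}
                if lonely:
                    e -= lonely
                    changed = True

            # Rule 2: delete hyperedges that are empty or contained in another
            # hyperedge (keeping one representative of duplicated edges).
            kept = []
            for i, e in enumerate(edges):
                removable = not e or any(
                    (e < f) or (e == f and j < i)
                    for j, f in enumerate(edges) if j != i
                )
                if removable:
                    changed = True
                else:
                    kept.append(e)
            edges = kept

        # The hypergraph is alpha-acyclic iff the reduction empties it.
        return not edges

    # A chain query R(a,b), S(b,c), T(c,d) is acyclic; a "triangle" is not.
    print(is_alpha_acyclic([{"a", "b"}, {"b", "c"}, {"c", "d"}]))   # True
    print(is_alpha_acyclic([{"a", "b"}, {"b", "c"}, {"c", "a"}]))   # False

Roughly speaking, the decomposition methods covered in the tutorial quantify how far an instance is from being reducible in this way, with acyclic hypergraphs sitting at the lowest width.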

The tutorial is intended both for academic researchers and for implementers wishing to familiarize themselves with methods and algorithms tailored to efficiently solving NP-hard problems. The tutorial will be self-contained. No specific prerequisites are required.

Francesco Scarcello

Francesco Scarcello is an associate professor of Computer Science at the Università della Calabria, Italy. His research interests are database theory, constraint satisfaction problems, nonmonotonic reasoning, knowledge representation, knowledge compilation, planning, and computational complexity. The results of his research have been published in high-level international journals and conferences. In 2008, his paper on the complexity of Nash equilibria (co-authored with Gianluigi Greco and Georg Gottlob) received the IJCAI-JAIR Best Paper Award. In 2009, his PODS'99 paper in which hypertree decompositions were defined (co-authored with Nicola Leone and Georg Gottlob) received the ACM PODS Alberto O. Mendelzon Test-of-Time Award, which is given every year to the paper published in the PODS proceedings ten years prior that had the most impact in terms of research, methodology, or transfer to practice over the intervening decade.

Gianluigi Greco

Gianluigi Greco is an assistant professor of Computer Science at the Università della Calabria, Italy. His research interests focus on database theory, game theory, constraint satisfaction problems, nonmonotonic reasoning, knowledge representation, machine learning, and computational complexity. The results of his research have been published in more than 90 high-level international journals and conferences. In 2008, his paper on the complexity of Nash equilibria (co-authored with Francesco Scarcello and Georg Gottlob) received the IJCAI-JAIR Best Paper Award, which is presented annually to an outstanding paper published in the Journal of Artificial Intelligence Research in the preceding five calendar years. In 2009, he received the Marco Somalvico Prize from the Italian Association for Artificial Intelligence (AI*IA) for his contributions to the field of Artificial Intelligence.


T2. Multivariate Models for Relational Learning

Multivariate modeling concerns the prediction of several response variables given the inputs. The main motivation behind multivariate modeling is that, via combined modeling, the prediction accuracy for each individual response variable can be improved. In the first half of the tutorial, we will cover some of the main approaches to multivariate modeling, e.g., hierarchical Bayesian modeling, mixed models, projection methods, matrix decomposition, and structured prediction. The second half of the tutorial is concerned with the application of multivariate models to relational learning. Examples include the infinite hidden relational model, stochastic relational models, and multivariate models used in Semantic Web computing.
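As a purely illustrative aside (not part of the tutorial material), the following Python sketch contrasts fitting each response variable separately by ordinary least squares with a combined reduced-rank regression in which all responses share a low-rank coefficient matrix; the toy data, dimensions, and variable names are assumptions made for this example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: d inputs, q correlated responses generated from a rank-2 model.
    n, d, q, rank = 200, 10, 6, 2
    X = rng.normal(size=(n, d))
    B_true = rng.normal(size=(d, rank)) @ rng.normal(size=(rank, q))
    Y = X @ B_true + 0.5 * rng.normal(size=(n, q))

    # Separate modeling: one ordinary least-squares fit per response column.
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

    # Combined modeling (reduced-rank regression): constrain the coefficient
    # matrix to low rank, so all responses share a common latent subspace.
    U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
    B_rrr = B_ols @ Vt[:rank].T @ Vt[:rank]   # project onto the top-`rank` directions

    print("OLS coefficient error:", np.linalg.norm(B_ols - B_true))
    print("RRR coefficient error:", np.linalg.norm(B_rrr - B_true))

On data of this kind the coupled estimate typically tracks the true coefficients more closely than the per-response fits, which is the effect combined modeling is after.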

We believe that this area of research is of major interest and should attract a wide audience. Technically, only basic knowledge in probability and machine learning is assumed.

Volker Tresp

Volker Tresp received a Diploma degree from the University of Göttingen, Germany, in 1984, and a Ph.D. from Yale University in 1989. Since 1989 he has been the head of a research team in machine learning at Siemens, Corporate Research and Technology. In 1994 he was a visiting scientist at the Massachusetts Institute of Technology's Center for Biological and Computational Learning. Since 2003 he has given an annual summer lecture on machine learning and data mining at the University of Munich. He has served on the leading program committees in machine learning and is on the organizing committee of the annual Learning Workshop. He has served on numerous industrial and academic advisory boards.