
    Antoine Bordes
    Facebook
    Wednesday, June 20th


    Teaching Machines to Understand Natural Language

    Abstract: Despite the recent successes of deep learning on tasks ranging from image segmentation to speech recognition, understanding language remains a largely unsolved problem for machines. It is still highly challenging for several reasons, such as the intrinsic complexity of language, the need for machine common sense, and the difficulty of actually evaluating natural language understanding. Yet current research is making progress, and this talk will present some of it in the areas of open-domain question answering (answering questions on any topic) and machine reading (answering questions about a short piece of text). We will show how combining innovative neural network architectures with new training and test benchmarks can yield promising results. The slides.
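
    As a point of reference for the machine-reading task described above, here is a minimal sketch of the task shape: given a short passage and a question, return the passage sentence most likely to contain the answer. The talk concerns neural architectures, which are not reproduced here; this bag-of-words overlap baseline, with its toy passage, is purely an assumed illustration.

    import re
    from collections import Counter

    def tokenize(text):
        return re.findall(r"[a-z']+", text.lower())

    def answer_sentence(passage, question):
        # Score each sentence by the number of words it shares with the question.
        q = Counter(tokenize(question))
        best, best_score = None, -1
        for sent in re.split(r"(?<=[.!?])\s+", passage):
            overlap = sum((q & Counter(tokenize(sent))).values())
            if overlap > best_score:
                best, best_score = sent, overlap
        return best

    passage = ("Facebook AI Research was founded in 2013. "
               "It works on machine learning and natural language understanding.")
    print(answer_sentence(passage, "When was Facebook AI Research founded?"))
    # -> "Facebook AI Research was founded in 2013."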







    John Shawe-Taylor
    UCL
    Thursday, June 21st


    New Approaches to Training and Analysing Deep Networks

    Abstract: The talk will discuss how mini-batch gradient information can be used to identify weight updates that are, with high probability, good for the whole training set. This suggests a novel deep learning algorithm that shows interesting behaviour in real applications. We will also discuss how this and other data-distribution-dependent approaches may be used to analyse generalisation. The slides.
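
    The abstract does not specify the algorithm; purely as a hedged sketch of the general idea that mini-batch gradient statistics can flag weight updates likely to be good beyond the batch, the code below applies an update only on coordinates whose mean per-example gradient is large relative to its standard error. The linear model, squared loss and threshold z are assumptions made for this demo, not the speaker's method.

    import numpy as np

    rng = np.random.default_rng(0)

    def per_example_grads(w, X, y):
        # Gradients of the squared loss for a linear model, one row per example.
        residuals = X @ w - y                  # shape (B,)
        return 2 * residuals[:, None] * X      # shape (B, d)

    def confident_update(w, X, y, lr=0.1, z=2.0):
        G = per_example_grads(w, X, y)
        mean = G.mean(axis=0)
        sem = G.std(axis=0, ddof=1) / np.sqrt(len(G))  # standard error of the mean
        mask = np.abs(mean) > z * sem          # keep statistically reliable coordinates
        return w - lr * mean * mask

    # Toy data: y = X w* + noise, recovered by masked mini-batch updates.
    d, n, B = 5, 1000, 32
    w_true = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.1 * rng.normal(size=n)

    w = np.zeros(d)
    for _ in range(500):
        idx = rng.choice(n, size=B, replace=False)
        w = confident_update(w, X[idx], y[idx])
    print("parameter error:", np.linalg.norm(w - w_true))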




    Jean-Bernard Lasserre
    CNRS
    Friday, June 22nd


    Moments and Positivity Certificates in and outside Optimization

    Abstract: We first give a brief description of the moment-SOS (sum-of-squares) hierarchy in global optimization, which is based on powerful positivity certificates from real algebraic geometry. Combined with semidefinite programming, it makes it possible to define a hierarchy of convex relaxations. Each relaxation in the hierarchy is a semidefinite program of increasing size, and the associated monotone sequence of optimal values converges to the global minimum. Finite convergence is generic and fast in practice. In fact, this methodology also applies to solving the Generalized Problem of Moments (GPM), of which global optimization is only a particular instance, and indeed the simplest. We then briefly describe several of its many other applications outside optimization, notably in applied mathematics, probability, statistics, computational geometry, control and optimal control. The slides.
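
    To make the first level of the hierarchy concrete, here is a sketch (not from the talk) of the degree-2 SOS relaxation for globally minimizing the univariate quartic p(x) = x^4 - 3x^3 + 2x^2 + 1: maximize gamma such that p(x) - gamma = z^T Q z with z = (1, x, x^2) and Q positive semidefinite, a single semidefinite program. cvxpy and the example polynomial are choices made for this illustration.

    import cvxpy as cp

    Q = cp.Variable((3, 3), symmetric=True)   # Gram matrix of the SOS certificate
    gamma = cp.Variable()                     # candidate lower bound on min p(x)

    # Match coefficients of p(x) - gamma with those of z^T Q z, z = (1, x, x^2):
    constraints = [
        Q >> 0,                               # Q positive semidefinite
        Q[0, 0] == 1 - gamma,                 # constant term
        2 * Q[0, 1] == 0,                     # x coefficient
        2 * Q[0, 2] + Q[1, 1] == 2,           # x^2 coefficient
        2 * Q[1, 2] == -3,                    # x^3 coefficient
        Q[2, 2] == 1,                         # x^4 coefficient
    ]

    cp.Problem(cp.Maximize(gamma), constraints).solve()
    print("SOS lower bound:", gamma.value)

    In the univariate case nonnegative polynomials are exactly sums of squares, so this first relaxation is already tight (the bound is the global minimum, roughly 0.38 here); in general one climbs the hierarchy by enlarging the monomial vector z.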