A Graphical Model For Simultaneous Partitioning And Labeling Philip Cowans & Martin Szummer AISTATS, Jan 2005 Cambridge
Motivation – Interpreting Ink [Figure: a hand-drawn diagram and its machine interpretation]
Graph Construction • Vertices, V, and edges, E, form a graph G. • Vertices are grouped into parts. • Each part is assigned a label.
Labeled Partitions • We assume: • Parts are contiguous. • The graph is triangulated. • We’re interested in probability distributions over labeled partitions conditioned on observed data.
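As a concrete illustration of the representation (a minimal sketch of mine, not the authors' code; all names are hypothetical), a labeled partition can be stored as a set of disjoint parts of the vertex set, each part carrying one label:

    # Minimal sketch (hypothetical, not from the paper): a labeled partition
    # stores disjoint parts of the vertex set, each part carrying one label.
    from dataclasses import dataclass

    @dataclass
    class LabeledPartition:
        parts: list          # list of frozensets of vertices (pairwise disjoint)
        labels: dict         # frozenset (part) -> label

        def part_of(self, v):
            """Return the part containing vertex v."""
            for part in self.parts:
                if v in part:
                    return part
            raise KeyError(v)

    # Example: five vertices, one 'container' part and one 'connector' part.
    p1, p2 = frozenset({1, 2, 3}), frozenset({4, 5})
    Y = LabeledPartition(parts=[p1, p2], labels={p1: "container", p2: "connector"})
    assert Y.part_of(2) == p1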
Conditional Random Fields • CRFs (Lafferty et al.) provide a joint labeling of graph vertices. • Idea: define parts to be contiguous regions with the same label. • But… • A large number of labels is needed. • Symmetry problems / bias. [Figure: vertices labeled +1, +2, +2, −1, −1, +3, +3, encoding parts via distinct labels]
A Better Approach… • Extend the CRF framework to work directly with labeled partitions. • Complexity is improved: we don't need to deal with so many labels. • No symmetry problem: we're working directly with the representation in which the problem is posed.
Projection • Projection maps labeled partitions onto smaller subgraphs. • If G ⊆ V, then the projection of Y onto G is the unique labeled partition of G that is 'consistent' with Y.
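A hedged sketch of projection (my code, not the authors'): restrict each part to the subset and keep its label, so two vertices of the subset share a part in the projection exactly when they share a part in Y.

    # Hedged sketch of projection onto a vertex subset (hypothetical code).
    def project(parts, labels, subset):
        """parts: disjoint frozensets; labels: part -> label; subset: set of vertices."""
        proj_parts, proj_labels = [], {}
        for part in parts:
            restricted = frozenset(part & subset)
            if restricted:                    # parts disjoint from the subset vanish
                proj_parts.append(restricted)
                proj_labels[restricted] = labels[part]
        return proj_parts, proj_labels

    p1, p2 = frozenset({1, 2, 3}), frozenset({4, 5})
    print(project([p1, p2], {p1: "container", p2: "connector"}, {2, 3, 4}))
    # ([frozenset({2, 3}), frozenset({4})], {frozenset({2, 3}): 'container', frozenset({4}): 'connector'})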
The Model • Unary potentials, defined per vertex. • Pairwise potentials, defined per edge. A plausible overall form is sketched below.
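The slide's equations were images and are not recoverable verbatim; the following is a hedged reconstruction in standard conditional-model notation, where y_i denotes the label of the part containing vertex i, i ~_Y j indicates that i and j share a part, and θ are the weights (all notation assumed, not from the slide):

    % Hedged reconstruction of the model (standard CRF-style form, not verbatim):
    P(Y \mid x, \theta) = \frac{1}{Z(x, \theta)}
        \prod_{i \in V} \phi_1(y_i, x; \theta)
        \prod_{(i,j) \in E} \phi_2\big(y_i, y_j, \mathbf{1}[i \sim_Y j], x; \theta\big)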
Training • Train by finding the MAP weights on example data with a Gaussian prior (using BFGS). • We require the value and gradient of the log posterior: the value requires normalization (computing Z), and the gradient requires marginalization.
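A hedged sketch of the MAP objective, assuming exponential-family potentials with feature vector f and prior variance σ² (these symbols are assumptions, not from the slide):

    % Hedged sketch of the log posterior and its gradient (notation assumed):
    \mathcal{L}(\theta) = \sum_n \Big[ \theta^{\top} f(Y^{(n)}, x^{(n)})
        - \log Z(x^{(n)}, \theta) \Big] - \frac{\lVert\theta\rVert^{2}}{2\sigma^{2}}

    \nabla_{\theta}\mathcal{L} = \sum_n \Big[ f(Y^{(n)}, x^{(n)})
        - \mathbb{E}_{Y \sim P(\cdot \mid x^{(n)}, \theta)}\, f(Y, x^{(n)}) \Big]
        - \frac{\theta}{\sigma^{2}}

The log Z term is where normalization enters, and the expected features are exactly the marginals the slide refers to.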
Prediction • New data is processed by finding the most probable labeled partition. • This is the same as normalization with the summation replaced by a maximization.
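In symbols (a standard max-product statement, consistent with the slide's description):

    % Prediction: the most probable labeled partition, computed like Z
    % but with the summation replaced by a maximization (max-product).
    Y^{\ast} = \operatorname*{arg\,max}_{Y} P(Y \mid x, \theta)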
Inference • These operations require summation or maximization over all possible labeled partitions. • The number of terms grows super-exponentially with the size of G. • Efficient computation is possible using message passing, since the distribution factors. • Proof based on Shenoy & Shafer (1990).
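To make the growth concrete (an illustration of mine that ignores the contiguity constraint): with L labels, the number of labeled partitions of an n-set is the sum over k of S(n, k)·L^k, where S(n, k) is the Stirling number of the second kind.

    # Illustration (assumption: counts all labeled partitions of an n-set with
    # L labels, ignoring contiguity): sum over k of S(n, k) * L**k.
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def stirling2(n, k):
        """Stirling number of the second kind: partitions of n items into k blocks."""
        if k == 0:
            return 1 if n == 0 else 0
        if k > n:
            return 0
        return k * stirling2(n - 1, k) + stirling2(n - 1, k - 1)

    def num_labeled_partitions(n, num_labels=2):
        return sum(stirling2(n, k) * num_labels**k for k in range(1, n + 1))

    for n in (5, 10, 20):
        print(n, num_labeled_partitions(n))  # grows super-exponentially in n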
Message Passing • Junction tree constructed from cliques on the original graph. • An 'upstream' message summarizes the contribution from 'upstream' to the sum for a given configuration of the separator. [Figure: example graph with vertices 1–9 and its junction tree with cliques {1,7,8}, {2,9}, {1,2,3,4}, {2,3,4,5}, {4,5,6}]
Message Update Rule • Update messages (for summation) according to the rule sketched below. • Marginals are found by combining a clique's potential with its incoming messages. • Z can be found explicitly.
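The slide's equations were images; the following is a hedged Shafer–Shenoy-style reconstruction, writing Ψ_i for the potential on clique i, S_ij for the separator between neighboring cliques i and j, and Π_S(·) for projection onto S as defined earlier (the notation is assumed, not verbatim from the talk):

    % Hedged Shafer–Shenoy-style reconstruction (notation assumed, not verbatim).
    % Message update (for summation):
    \mu_{i \to j}(y_{S_{ij}}) =
        \sum_{Y_i : \Pi_{S_{ij}}(Y_i) = y_{S_{ij}}}
        \Psi_i(Y_i) \prod_{k \in \mathcal{N}(i) \setminus \{j\}}
        \mu_{k \to i}\big(\Pi_{S_{ik}}(Y_i)\big)

    % Clique marginals, and Z obtained explicitly at any single clique:
    P(Y_i \mid x) \propto \Psi_i(Y_i) \prod_{k \in \mathcal{N}(i)}
        \mu_{k \to i}\big(\Pi_{S_{ik}}(Y_i)\big),
    \qquad
    Z = \sum_{Y_i} \Psi_i(Y_i) \prod_{k \in \mathcal{N}(i)}
        \mu_{k \to i}\big(\Pi_{S_{ik}}(Y_i)\big)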
Experimental Results • We tested the algorithm on hand-drawn ink collected using a Tablet PC. • The task is to partition the ink fragments into perceptual objects and label them as containers or connectors. • The data set was 40 diagrams from 17 subjects, with a total of 2157 fragments. • 3 random splits (20 training and 20 test examples).
Labeling Results • Labeling error: fraction of fragments labeled incorrectly. • Grouping error: fraction of edges locally incorrect.
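A small hedged sketch of how these two metrics could be computed (the function names and the part-id encoding are hypothetical, not from the talk):

    # Hedged sketch of the two error metrics (hypothetical helper code).
    def labeling_error(true_labels, pred_labels):
        """Fraction of fragments whose predicted label is wrong."""
        wrong = sum(t != p for t, p in zip(true_labels, pred_labels))
        return wrong / len(true_labels)

    def grouping_error(edges, true_part, pred_part):
        """Fraction of edges that are locally incorrect: an edge (i, j) is
        correct when prediction and truth agree on whether i and j share a part."""
        bad = sum((true_part[i] == true_part[j]) != (pred_part[i] == pred_part[j])
                  for i, j in edges)
        return bad / len(edges)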
Conclusions • We have presented a conditional model defined over labeled partitions of an undirected graph. • Efficient exact inference is possible in our model using message passing. • Labeling and grouping simultaneously can improve labeling performance. • Our model performs well when applied to the task of parsing hand-drawn ink diagrams.
Acknowledgements Thanks to: Thomas Minka, Yuan Qi and Michel Gangnet for useful discussions and for providing software, and Hannah Pepper for collecting our ink database.