Abstract: First-order logic provides a powerful and flexible mechanism for representing natural language semantics. However, how best to integrate it with uncertain, weighted knowledge, for example regarding word meaning, remains an open question. This paper describes a mapping between predicates of logical form and points in a vector space. This mapping is then used to project distributional inferences to inference rules in logical form. We then describe first steps of an approach that uses this mapping to recast first-order semantics into the probabilistic models that are part of Statistical Relational AI. Specifically, we show how Discourse Representation Structures can be combined with distributional models for word meaning inside a Markov Logic Network and used to successfully perform inferences that take advantage of logical concepts such as negation and factivity as well as weighted information on word meaning in context.
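As a minimal sketch of the idea the abstract describes, one can map each predicate symbol to a vector and turn the distributional similarity between two predicates into the weight of a first-order inference rule, as a Markov Logic Network expects. The toy vectors and the `weighted_rule` helper below are illustrative assumptions, not the paper's actual model:

```python
import math

# Hypothetical toy vectors for predicate symbols; in practice these
# would come from a trained distributional model of word meaning.
VECTORS = {
    "slay": [0.9, 0.1, 0.3],
    "kill": [0.85, 0.2, 0.25],
    "own":  [0.1, 0.9, 0.4],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def weighted_rule(lhs, rhs):
    """Project a distributional similarity onto a weighted
    first-order inference rule of the form: w :: lhs(x) => rhs(x)."""
    w = cosine(VECTORS[lhs], VECTORS[rhs])
    return w, f"{lhs}(x) => {rhs}(x)"

w, rule = weighted_rule("slay", "kill")
print(f"{w:.3f} :: {rule}")
```

Similar predicates yield high-weight rules (here, `slay => kill`), while unrelated ones yield low weights, so probabilistic inference over the resulting rule set can prefer plausible lexical entailments.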