Stochastic search has recently been shown to be successful for solving large boolean satisfiability problems. However, systematic methods tend to be more effective in problem domains with a large number of dependent variables: that is, variables whose truth values are directly determined by a smaller set of independent variables. In systematic search, truth values can be efficiently propagated from the independent to the dependent variables by unit propagation. Such propagation is more expensive in traditional stochastic procedures. In this paper we propose a mechanism for effectively dealing with dependent variables in stochastic search. We also provide empirical data showing the procedure outperforms the best previous stochastic and systematic search procedures on large formulas with a high ratio of dependent to independent variables.
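The abstract contrasts systematic solvers, which propagate forced values through dependent variables by unit propagation, with stochastic local search, where such propagation is costly. A minimal sketch of unit propagation over CNF clauses follows; the DIMACS-style encoding (clauses as lists of signed integers) is an illustrative assumption, not the paper's representation.

```python
def unit_propagate(clauses, assignment):
    """Repeatedly assign variables forced by unit clauses.

    clauses: list of clauses, each a list of nonzero ints (positive
    literal = variable true, negative = false). assignment: dict
    mapping variable -> bool. Returns the extended assignment, or
    None if some clause is falsified (a conflict).
    """
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            unassigned = []
            satisfied = False
            for lit in clause:
                var, want = abs(lit), lit > 0
                if var in assignment:
                    if assignment[var] == want:
                        satisfied = True
                        break
                else:
                    unassigned.append(lit)
            if satisfied:
                continue
            if not unassigned:
                return None           # conflict: every literal is false
            if len(unassigned) == 1:  # unit clause: value is forced
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment
```

For example, on the chain `x1, x1→x2, x2→x3` (clauses `[[1], [-1, 2], [-2, 3]]`) propagation alone fixes all three variables, which is the kind of dependent-variable structure the abstract describes.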
The difficulty of finding information on the World Wide Web by browsing hypertext documents has led to the development and deployment of various search engines and indexing techniques. However, many information-gathering tasks are better handled by finding a referral to a human expert, rather than by simply interacting with online information sources. A personal referral allows a user to judge the quality of the information he or she is receiving, as well as to potentially obtain information that is deliberately not made public. The process of finding an expert who is both reliable and likely to respond to the user can be viewed as a search through the network of social relationships between individuals, as opposed to a search through the network of hypertext documents. The goal of the ReferralWeb project is to create models of social networks by data-mining the Web, and to develop tools that use the models to assist in locating experts and related information search and evaluation tasks.
The past several years have seen much progress in the area of propositional reasoning and satisfiability testing. There is a growing consensus by researchers on the key technical challenges that need to be addressed in order to maintain this momentum. This paper outlines concrete technical challenges in the core areas of systematic search, stochastic search, problem encodings, and criteria for evaluating progress in this area.
It is well known that the performance of a stochastic local search procedure depends upon the setting of its noise parameter, and that the optimal setting varies with the problem distribution. It is therefore desirable to develop general principles for tuning such procedures. We present two statistical measures of the local search process that allow one to quickly find the optimal noise settings. These properties are independent of the fine details of the local search strategies, and appear to be relatively independent of the structure of the problem domains. We applied these principles to the problem of evaluating new search heuristics, and discovered two promising new strategies.
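To make the role of the noise parameter concrete, here is a minimal WalkSAT-style local search loop: with probability `noise` the procedure flips a random variable from an unsatisfied clause (a random-walk step), and otherwise makes a greedy flip. This is a generic sketch of that family of procedures, not the specific heuristics or measures from the paper; the clause encoding and tie-breaking are illustrative choices.

```python
import random

def walksat(clauses, n_vars, noise=0.5, max_flips=10000, seed=0):
    """WalkSAT-style search over CNF clauses (lists of signed ints).

    `noise` is the probability of a random-walk flip instead of a
    greedy flip when repairing an unsatisfied clause.
    """
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}

    def satisfied(clause):
        return any(assign[abs(l)] == (l > 0) for l in clause)

    def num_unsat():
        return sum(not satisfied(c) for c in clauses)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return assign                  # found a satisfying assignment
        clause = rng.choice(unsat)
        if rng.random() < noise:
            var = abs(rng.choice(clause))  # noisy step: random variable
        else:
            # greedy step: flip whichever variable in the clause
            # leaves the fewest clauses unsatisfied
            def cost(lit):
                v = abs(lit)
                assign[v] = not assign[v]
                c = num_unsat()
                assign[v] = not assign[v]
                return c
            var = abs(min(clause, key=cost))
        assign[var] = not assign[var]
    return None                            # give up after max_flips
```

Setting `noise` too low leaves the search stuck in local minima; too high and it degenerates to a random walk, which is why principled tuning of this single parameter matters.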
A shorter version of my PhD thesis, "A Formal Theory of Plan Recognition", that corrects a few minor errors.
Research in discourse analysis, story understanding, and user modeling for expert systems has shown great interest in plan recognition problems. In a plan recognition problem, one is given a fragmented description of actions performed by one or more agents, and expected to infer the overall plan or scenario which explains those actions. This thesis develops the first formal description of the plan recognition process.
Beginning with a reified logic of events, the thesis presents a scheme for hierarchically structuring a library of event types. A semantic basis for non-deductive inference, called "minimum covering entailment", justifies the conclusions that one may draw from a set of observed actions. Minimum covering entailment is defined by delineating the class of models in which the library is complete and the set of unrelated observations is minimized. An equivalent proof theory forms a preliminary basis for mechanizing the theory. Equivalence theorems between the proof and model theories are presented. Minimum covering entailment is related to a formalism for non-monotonic inference known as "circumscription". Finally, the thesis describes a number of algorithms which correctly implement the theory, together with a discussion of their complexity.
The theory is applied to a number of examples of plan recognition, in domains ranging from an operating system advisor to the theory of speech acts. The thesis shows how problems of medical diagnosis, a similar kind of non-deductive reasoning, can be cast in the framework, and an example previously solved by a medical expert system is worked out in detail.
The analysis provides a firm theoretical foundation for much of what is loosely called "frame-based inference", and directly accounts for problems of ambiguity, abstraction, and complex temporal interactions, which were ignored by previous work. The framework can be extended to handle difficult phenomena such as errors, and can also be restricted in order to improve its computational properties in specialized domains.