“PHASETRANS”

HUMAN RESOURCES AND MOBILITY (HRM) ACTIVITY

MARIE CURIE ACTIONS
Marie Curie Re-Integration Grants (IRG)
PART B
“PHASETRANS”


SCIENTIFIC QUALITY OF THE PROJECT

1. RESEARCH TOPIC

The theory of NP-completeness [GJ79] is arguably the most important intellectual contribution of Theoretical Computer Science to general scientific knowledge. Thousands of NP-complete problems are known, in scientific areas as diverse as protein folding, statistical physics, or computational social sciences. NP-completeness is an indicator of computational intractability, since the computational time of all known algorithms (and indeed of all algorithms, assuming that the famous conjecture P ≠ NP is true) scales superpolynomially with instance size. Despite this success, the concept of an NP-complete problem does not completely model experimental reality: NP-completeness is a worst-case concept, so any NP-complete problem contains some “hard” instances. But the theory does not address the “density” of such hard instances, and indeed NP-complete problems can differ significantly with respect to the computational difficulty of “typical” instances arising in practice.

Recent years have witnessed the emergence of a new field of research, at the crossroads of Statistical Physics, Theoretical Computer Science and Artificial Intelligence: the study of phase transitions in combinatorial problems [PIM06]. This remarkable development has generated substantial excitement in the scientific [GS05, Rob05] and the popular press [Joh99, Hay97, Hay03]. Concepts from the physics of spin glasses, such as first-order phase transitions, backbones, or the replica method, often have natural combinatorial counterparts, and have increased our understanding of the fundamental reasons for computational intractability. Most importantly, knowledge of the phase structure of combinatorial problems is responsible for some of the most remarkable progress in designing efficient algorithms. For instance, the celebrated survey propagation (SP) algorithm [BMZ05] has dramatically increased the size of satisfiability instances that can be solved in practice, compared to the previous state of the art. The algorithm owes its existence to the description of the typical solution space predicted by the so-called 1-step RSB ansatz and the cavity method [MMZ06].

Such exciting advances are not merely of theoretical interest: satisfiability solving is the basis for some of the best computational tools in areas of Artificial Intelligence such as planning [KS92], temporal reasoning or constraint satisfaction. Efficient satisfiability solving is also instrumental in the formal verification of reactive systems using model checking [CGP99]. Such methods are employed to find errors in large complex software systems, and more generally wherever algorithmic approaches are needed. Besides the original field of hardware verification, areas where the model checking approach is important include domains as diverse as agent-oriented software engineering [BFVW04], medical diagnosis [BBD+96], systems biology [HK06] and security protocols for e-commerce (e.g. [CJM98]). The use of formal methods in mission-critical software systems (such as NASA's Mars Rover [RH04]) underscores, perhaps even more dramatically than in the case of satisfiability, the need for powerful solvers that will help the detection and ultimately the prevention of undesired system behavior.
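To make the notion of a phase transition in a combinatorial problem concrete, the following small Python sketch (our own illustration, not taken from any of the cited works) estimates the probability that a random 3-SAT formula is satisfiable as the clause-to-variable ratio alpha varies; empirically this probability drops sharply near alpha ≈ 4.27. The solver is a deliberately minimal Davis-Putnam-style procedure, adequate only for the small instances used here:

    import random

    def random_3sat(n, m, rng):
        """m random 3-clauses over variables 1..n: distinct variables, random signs."""
        return [frozenset(v if rng.random() < 0.5 else -v
                          for v in rng.sample(range(1, n + 1), 3))
                for _ in range(m)]

    def assign(clauses, lit):
        """Make literal `lit` true; return the simplified formula, or None on conflict."""
        out = []
        for c in clauses:
            if lit in c:
                continue                  # clause satisfied, drop it
            if -lit in c:
                c = c - {-lit}            # opposite literal falsified, shrink clause
                if not c:
                    return None           # empty clause: contradiction
            out.append(c)
        return out

    def dpll(clauses):
        """Minimal Davis-Putnam-style solver: unit propagation plus branching."""
        while True:
            unit = next((c for c in clauses if len(c) == 1), None)
            if unit is None:
                break
            clauses = assign(clauses, next(iter(unit)))
            if clauses is None:
                return False
        if not clauses:
            return True                   # all clauses satisfied
        lit = next(iter(clauses[0]))      # branch on a literal of the first clause
        left = assign(clauses, lit)
        if left is not None and dpll(left):
            return True
        right = assign(clauses, -lit)
        return right is not None and dpll(right)

    rng = random.Random(0)
    n, samples = 50, 100
    for alpha in (3.0, 3.5, 4.0, 4.27, 4.5, 5.0):     # clause-to-variable ratio
        sat = sum(dpll(random_3sat(n, round(alpha * n), rng)) for _ in range(samples))
        print(f"alpha = {alpha:4.2f}   P[satisfiable] ~ {sat / samples:.2f}")

Running the loop reproduces, in miniature, the sharp satisfiable/unsatisfiable crossover that the work described in this proposal seeks to explain and exploit.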
The applicability of satisfiability solving to formal verification is not a coincidence: the main problem addressed by model checking is the identification of “anomalous” patterns of behavior (e.g. deadlock) in a large distributed system. Roughly speaking, this is done by representing the unwanted behavior as a formula in temporal logic and then systematically searching the (often large) set of system states for a “bad” one. The problem is very similar to searching for a satisfying assignment of a logical formula in the exponentially large space of truth assignments. A translation is possible, and indeed many of the advances in this area have come from considering this connection (e.g. satisfiability solving is the underlying technology behind the NuSMV model checker). In recent work [MIDV05], we have shown that the study of phase transitions in problems related to model checking is feasible. Since formal method researchers have started to recognize the usefulness of randomized methods in software engineering [Men01], it is reasonable to hope that the transfer of methods and ideas from the study of phase transitions can prove useful to the field of formal verification.


Moreover, one of the expected outcomes of such work, the creation of benchmark generators for verification algorithms, would benefit and guide recent studies comparing the efficiency of approaches to formal verification [BDG+04]. Detailed knowledge of the structure of solutions has produced one of the most significant algorithmic advances in the recent literature, an achievement all the more impressive given that this work was competing against significant efforts in the same direction from the Artificial Intelligence community. The issue this proposal addresses is how to make these conceptual and experimental advances available to a larger class of problems, particularly to areas where such advances would be most helpful. In a nutshell, we propose:

• to increase the generality of the approach, by developing it into a systematic theory;
• to develop the practicality of the approach, by experimental investigation of phase transitions for instances with regular structure, with a special focus on instances arising from formal verification.

2. PROJECT OBJECTIVES

The state of the art in the study of phase transitions in computational problems (discussed e.g. in [HW05, PIM06]) has a number of problems:

• Many of the physicists' methods lack a rigorous mathematical justification, and include several unreasonable features (such as functions of a negative number of variables, in one approach to the replica method). It is a significant scientific challenge to put these intuitions on a rigorous basis.
• The study of phase transitions is still largely a case-by-case enterprise, and has not yet resulted in a general theory. It has connected rather poorly with Computational Complexity, the theory of computational difficulty that the study of phase transitions was supposed to refine.
• Correspondingly, the recent efficient algorithms are not general, but are only defined for a few combinatorial problems. For instance, variants of the survey propagation algorithm have been defined, besides the original satisfiability case, for only a handful of cases, such as graph coloring [BMMZ06] and the vertex cover problem [WZ06].
• Last but not least, there exists a disconnect between the distributions of instances considered in the theoretical study of phase transitions and computational practice. So far, the study of phase transitions has emphasized random instances. In contrast, instances of practical relevance arise from highly structured constructions [HMRS06] and typically have a high degree of regularity. A study of phase transitions in such structured instances is needed, and is potentially of great impact for target application areas such as model checking.

To address these issues we propose a research project with two interrelated components:

• [C1] the development of a systematic theory relating phase transitions to the computational complexity of combinatorial problems;
• [C2] the investigation of phase transitions in structured instances arising from model checking problems, and their impact on the performance of the verification methods.

Component [C1]: The approach will be primarily theoretical, and will consist of the investigation of notions from computational complexity theory that naturally relate to results from Statistical Physics. There are a number of good reasons that make the quest for such a general theory likely to succeed.


Computational complexity theory simplifies a great deal for problems that can be represented using logical frameworks from descriptive complexity: unlike the general case, where problems of complexity “intermediate” between polynomial time and NP-complete exist, for logically defined problems only these two extreme cases can occur [Sch78]. Furthermore, there essentially exist only three nontrivial cases that are easy, and they correspond to very natural variants of satisfiability. Descriptive complexity thus functions as a way to eliminate many “artificial” decision problems allowed by the machine-based definition of the complexity class NP. Also, in previous research we have obtained a number of results that relate phase transitions to computational complexity. They include:

• a precise determination of the phase transition in the most practical version of satisfiability, so-called Horn satisfiability [Ist02]. The phase transition in the remaining two easy cases has also been rigorously settled by other researchers. By contrast, the transition points for most of the NP-complete cases cannot yet be determined rigorously.
• a precise characterization [Ist00, Ist05] of those versions of satisfiability that have a sharp threshold, the kind of phase transition most studied in Discrete Mathematics and Theoretical Computer Science. These characterization results have some similarities to the classification of the worst-case complexity of satisfiability problems [Sch78]. For instance, the moral of the results in [Ist00] is that problems that lack a sharp threshold are similar to Horn satisfiability, one of the easy versions of satisfiability. Thus the lack of a phase transition (at least in limited contexts) does have computational implications.
• Statistical Physics arguments [MZK+99] have suggested that so-called first-order phase transitions are responsible for the exponential complexity of Davis-Putnam algorithms on instances at or above the transition point. Inspired by this result, we have rigorously shown [IPB05] that a first-order transition in a combinatorial version of the order parameter from Statistical Physics is indeed correlated with exponential complexity of Davis-Putnam algorithms; the reason for this correlation is that the two phenomena (the first-order transition and the exponential complexity) have a common cause. Moreover, results in [IPB05] suggest that satisfiability problems that lack a first-order phase transition are structurally similar to 2-Satisfiability, another of the easy cases of satisfiability. Again, the lack of a (first-order) phase transition makes the problem “easy”, at least for unsatisfiable instances.

There are other reasons that make a connection with descriptive complexity plausible. For instance, the survey propagation algorithm for graph coloring in [BMMZ06] works essentially by “reducing” the problem to the case of satisfiability. The reason is that the natural reduction between the two problems has a number of special properties: it is a so-called local reduction [HMRS06], which also preserves the cluster structure of the original problem. Not all reductions in Computational Complexity theory have these properties; the precise identification of the properties of reductions that make this possible, and the study of such reductions, is an interesting research problem.
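To make the locality property concrete, here is the standard encoding of graph k-colorability into CNF, given as a minimal Python sketch (our illustration; [HMRS06] develops the formal notion of local reduction). Every clause mentions only the variables of a single vertex, or of the two endpoints of a single edge, so the reduction acts locally on the instance:

    def coloring_to_sat(edges, n_vertices, k):
        """Encode k-colorability of a graph as a CNF formula (list of clauses).

        Variable var(v, c) means "vertex v receives color c". Every clause
        below mentions a single vertex, or the two endpoints of a single
        edge -- the locality that makes the reduction structure-preserving.
        """
        var = lambda v, c: v * k + c + 1          # 1-based, DIMACS-style numbering
        clauses = []
        for v in range(n_vertices):
            clauses.append([var(v, c) for c in range(k)])       # at least one color
            for c1 in range(k):                                 # at most one color
                for c2 in range(c1 + 1, k):
                    clauses.append([-var(v, c1), -var(v, c2)])
        for (u, w) in edges:                                    # endpoints get distinct colors
            for c in range(k):
                clauses.append([-var(u, c), -var(w, c)])
        return clauses

    # A triangle is 3-colorable but not 2-colorable; the encoding has
    # n*k variables and n*(1 + k*(k-1)/2) + |E|*k clauses.
    triangle = [(0, 1), (1, 2), (0, 2)]
    print(len(coloring_to_sat(triangle, 3, 3)), "clauses for the triangle, k = 3")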
Another connection between phase transitions and notions of computational complexity comes from the cavity method that underlies the survey propagation algorithm for satisfiability. This method involves computing the marginal probability of a given variable over the set of satisfying assignments by “reducing” the problem to the instance obtained by deleting the variable (and the clauses it participates in). This type of reduction is very familiar in Computational Complexity, under the name of self-reducibility. However, the precise type of self-reducibility underlying the cavity method has not been properly defined and investigated.

To sum up, we will advance the systematic study of phase transitions by studying a number of foundational problems relating phase transitions to areas of computational complexity theory, particularly to descriptive complexity. The list of issues we plan to address is detailed (under headings [C.1.1] to [C.1.4]) in the next section. Issues [C.1.1] and [C.1.2] are foundational, while issues [C.1.3] and [C.1.4] are largely independent of each other.

Component [C2]: The second component of our research program has a predominantly applied/experimental character, and leverages the existing strengths of the host institution in the area of formal verification.


The point of departure is the observation that formal verification problems are often related to Horn satisfiability, a problem whose phase transition we know how to analyze [Ist02]; a small illustration appears after the list below. However, while this is an encouraging connection, exploited e.g. in [MIDV05], the intractability of model checking shows that this connection is by no means universal. Indeed, we anticipate that significant difficulties will occur in adapting the phase transition approach to problems in model checking. For instance, we do not believe that random instances are a good model for typical instances in this area (see the discussion of Problem [C.2.2] in the next section). Rather, the instances are likely to have a significant degree of regularity. We hope to use the extra insight provided by the study of phase transitions at a higher level of abstraction in Component [C1] to guide our intuitions and efforts in this direction. This extra insight is the element that integrates the two research directions, and motivates our two-pronged approach to phase transition problems in formal verification. Our work will proceed according to the following sequence of stages. Each step below corresponds to a problem/expected outcome. The problems are labeled [C.2.1] to [C.2.4], and are detailed in the next section.

• comprehensive investigation of the potential of existing approaches (including SP) for model checking;
• specification of models of random instances of problems in formal verification, and experimental investigation of phase transitions in such models;
• systematic identification of “challenging” instances for model checkers;
• using knowledge of the structure of the solution space identified by the phase transition studies, the development of propagation-based algorithms that bring the computational advances related to phase transitions to the area of formal verification.
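As a small illustration of why Horn satisfiability is an attractive point of departure, the sketch below generates random Horn formulas in a toy model (chosen for brevity; it is not the distribution analyzed in [Ist02]) and decides satisfiability in polynomial time by forward chaining, the Horn analogue of unit propagation:

    import random

    def random_horn(n, m, k, rng):
        """m random Horn clauses over atoms 1..n, written as (head, body):
        the clause not-b1 or ... or not-bj or head, i.e. b1 & ... & bj -> head,
        with 0 <= j <= k antecedents and an optional positive head."""
        clauses = []
        for _ in range(m):
            body = frozenset(rng.sample(range(1, n + 1), rng.randint(0, k)))
            head = rng.randint(1, n) if rng.random() < 0.5 else None
            if head is None and not body:
                head = rng.randint(1, n)      # avoid the trivially false empty clause
            clauses.append((head, body))
        return clauses

    def horn_sat(clauses):
        """Polynomial-time Horn satisfiability by forward chaining: derive every
        atom forced to be true; the formula is unsatisfiable iff a headless
        (all-negative) clause has its whole body derived."""
        true_atoms, changed = set(), True
        while changed:
            changed = False
            for head, body in clauses:
                if body <= true_atoms:
                    if head is None:
                        return False          # goal clause violated
                    if head not in true_atoms:
                        true_atoms.add(head)
                        changed = True
        return True

    rng = random.Random(1)
    n, k, samples = 30, 3, 300
    for ratio in (1, 2, 4, 8, 16):            # clauses per variable
        sat = sum(horn_sat(random_horn(n, ratio * n, k, rng)) for _ in range(samples))
        print(f"m/n = {ratio:2d}   P[satisfiable] ~ {sat / samples:.2f}")

Measuring the fraction of satisfiable formulas as the clause density grows is exactly the kind of experiment that, in more refined models, locates the Horn satisfiability phase transition.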

3. SCIENTIFIC ORIGINALITY AND INNOVATION

The study of phase transitions in combinatorial optimization problems is one of the research directions that holds significant promise for a better understanding of the underlying reasons that make a problem computationally hard. The development of a systematic theory of phase transitions would represent a significant intellectual advance, and a novel direction in Computational Complexity Theory. Very little work of this type exists so far. Research in this direction has the potential to enrich Computational Complexity with a whole new set of ideas and techniques, and to bring it closer to experimental evidence. Some of the theoretical problems we plan to investigate (and, correspondingly, some of the advances we expect to make in this component of the project) include:

• [C.1.1] We will develop a descriptive complexity theory approach to characterizing the nature of phase transitions and their connection to the complexity of algorithms. For instance, we aim to characterize the sharpness of the phase transition in problems represented in the logical framework of Monotone Monadic Strict NP without inequality [FV98]. We plan to study logical frameworks encoding both decision and optimization problems. Similarly, instead of concentrating on Davis-Putnam algorithms, we will attempt to relate phase transitions to the complexity of algorithms expressible in various proof systems related to (and extending) resolution.
• [C.1.2] We will develop a suitable notion of “reduction” between combinatorial problems that preserves phase structure properties, and will classify problem complexity with respect to the new type of reductions. The precise definition of reduction will most likely include ingredients from [HMRS06], with a number of additional requirements to be determined. The reductions will have very little computational power compared to the usual m-reductions in Computational Complexity theory.


This is not necessarily a problem, since recent results in this area have shown that even extremely weak reductions (e.g. the so-called NC^0 reductions) play a role in Complexity Theory. A consequence of the existence of such reductions will be the generalization of the survey propagation algorithm to a larger class of constraint satisfaction problems. As an experimental counterpart to this theoretical work, we will undertake a comprehensive computational evaluation of the newly defined algorithms.
• [C.1.3] Despite some recent progress [MMW05], there are still outstanding issues concerning the mathematical analysis of the survey propagation algorithm. We will aim to obtain such an analysis. We will also investigate the precise notion of self-reducibility that underlies the cavity approach, a notion where the similarity of solutions is preserved by the reduction.
• [C.1.4] Finally, we will develop a notion of average-case complexity that captures the average behavior uncovered by phase transition studies. The existing theory (surveyed in [Wan97]) is inadequate, as it fails to apply to most natural NP-complete problems. Some results [Fei02] suggest that average-case hardness of satisfiability does indeed have computational implications, but no full-fledged theory exists.

There are, of course, a number of research questions that do not have the large scope of the previous ones, but are amenable to successful analysis. We choose to list only one problem here: computing bounds on the bisection width of random graphs. It is rigorously known that the edge probability at which the bisection width becomes positive is the critical point where the giant component of the random graph reaches size n/2. However, the exact value of the bisection width is not rigorously known. In unpublished recent work we have proposed a new heuristic algorithm, the core-peeling algorithm, for computing good bisections. The basis for this algorithm is the partitioning of the giant component into a large two-core and several trees attached to the two-core, together with the hypothesis that, as long as the size of the two-core remains below n/2, optimal solutions do not involve cutting the two-core. Recent rigorous results are able to compute quantities such as the size of the two-core of the giant component, so it seems plausible that a rigorous analysis of the width of the bisection produced by this core-peeling algorithm is attainable.

As for the application of phase transition methods to model checking, together with the conceptual advances represented by the application of a new methodology to formal verification, the most significant advance we are looking for is to bring to the area of formal verification the algorithmic advances that we have seen in the area of constraint satisfaction. The list of problems we are going to address (and the expected results) includes the following:

• [C.2.1] We will undertake a comprehensive computational evaluation of the survey propagation algorithm [BMZ05], on various benchmarks and distributions of random instances of satisfiability. This satisfiability algorithm is so far the greatest practical contribution of the area, with preliminary experimental tests showing success on random instances at the transition point (the “hardest” instances) with up to 10^6 variables. By contrast, the best previous algorithms were able to solve such random “hardest” instances with on the order of 10^4 variables. The algorithm is amenable to efficient parallelization, so additional gains are possible.
However, due mainly to its novelty, a comprehensive evaluation is missing from the literature. We believe that survey propagation is not a “magic” approach, equally suited for all instances of satisfiability. Rather, we expect that it is fine-tuned for instances mirroring the intuitions arising from Statistical Physics. In particular, a question of interest is how survey propagation performs on satisfiability instances arising from formal verification studies.
• [C.2.2] The study of phase transitions in random models arising from problems in formal verification. The results from [MIDV05] were motivated by the automata-theoretic approach to formal verification [WV94]. In this approach a formula specified in temporal logic is translated into an exponentially large Büchi automaton. The system to be verified is also modeled by an automaton, and the product of the two automata is checked for emptiness. The random model considered in [MIDV05] corresponds to a finite automaton with random transitions and a random set of accepting states. The two control parameters encode the density of the transition graph and that of the set of accepting states.


A finite automaton only models so-called safety properties of linear temporal logic, and does not directly correspond to the product construction. We will consider the case of phase transitions arising from random Büchi automata, as well as alternative models such as fixed formulas in linear temporal logic paired with random transition systems. We will also study other random models developed in conjunction with the formal methods experts at the host institution, such as, for instance, models arising from infinite-state model checking.
• [C.2.3] The development and analysis of a benchmark generator for formal verification methods. This will be done in collaboration with the formal methods experts at the host institution, and based on the phase transition phenomena we aim to uncover. We expect to combine this work with techniques from the study of phase transitions in constraint satisfaction problems, such as hiding solutions in random instances [AJM04]. In parallel, we will attempt to collect a number of benchmark instances arising from computational experience in formal verification.
• [C.2.4] We will develop propagation algorithms for instances of model-checking problems, similar to the SP algorithm. The hope is that these methods (to be developed by leveraging our understanding of the phase structure of model-checking problems) will bring to model checking the kind of computational gains experienced by the field of satisfiability solving.
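As a first experiment in the spirit of [C.2.2], the following sketch implements a simplified variant of the random automaton model (our simplification; the model of [MIDV05] differs in its details): n states, each transition present with probability d/n, each state accepting with probability f. The automaton's language is nonempty iff some accepting state is reachable from the initial state and lies on a cycle, and the experiment tracks how the probability of nonemptiness changes with the transition density d:

    import random
    from collections import deque

    def random_automaton(n, d, f, rng):
        """n states; each directed transition present with probability d/n
        (so expected out-degree d); each state accepting with probability f;
        state 0 is initial. Labels are irrelevant for emptiness, so omitted."""
        edges = {u: [v for v in range(n) if rng.random() < d / n] for u in range(n)}
        accepting = {q for q in range(n) if rng.random() < f}
        return edges, accepting

    def reachable(edges, source):
        """All states reachable from `source`, by breadth-first search."""
        seen, queue = {source}, deque([source])
        while queue:
            u = queue.popleft()
            for v in edges[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return seen

    def buchi_nonempty(edges, accepting):
        """Nonempty iff some accepting state is reachable from state 0 and
        lies on a cycle, i.e. is reachable again from one of its successors."""
        for q in reachable(edges, 0) & accepting:
            if any(q in reachable(edges, v) for v in edges[q]):
                return True
        return False

    rng = random.Random(2)
    n, f, samples = 100, 0.1, 200
    for d in (0.5, 0.8, 1.0, 1.2, 1.5, 2.0):          # transition density
        hits = sum(buchi_nonempty(*random_automaton(n, d, f, rng))
                   for _ in range(samples))
        print(f"d = {d:3.1f}   P[language nonempty] ~ {hits / samples:.2f}")

The rise of the nonemptiness probability near d ≈ 1 is a percolation-type effect in random digraphs, and is the kind of transition whose interplay with the hardness of model checking we intend to study.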

4. RESEARCH METHOD

The area of Phase Transitions is one of growing interest and rapid expansion. This is witnessed by the recent publication of two books providing an overview of the field [HW05, PIM06], as well as by the high profile that work in this area has held in the scientific literature. Recent years have seen many summer schools and conferences on this topic, both in the European Union (several editions of a summer school/workshop at the Abdus Salam International Center for Theoretical Physics in Trieste) and in the United States (the Santa Fe Institute, the Institute for Pure and Applied Mathematics at U.C.L.A., the DIMACS Center for Discrete Mathematics and Theoretical Computer Science, the Mathematical Sciences Research Institute in Berkeley).

In the years since the publication of the original Cheeseman, Kanefsky and Taylor paper, significant advances have been made in our understanding of the typical complexity of random instances of combinatorial problems. Recent advances have for the first time provided an understanding of the complexity of instances in both the satisfiable and the unsatisfiable phase. Such results include:

• the discovery and theoretical understanding of the analogies between first-order phase transitions and the computational difficulty of random instances in the unsatisfiable phase;
• the recent studies of clustering of solutions. This research is primarily motivated by the need to explain the fact that many algorithms, both exact and incomplete (such as local search methods), seem to suffer a degradation in performance at a point substantially lower than the satisfiability/unsatisfiability transition. Many problems display symmetry breaking in the solution space, and it is this property that seems responsible for the slowing down of decision algorithms on satisfiable instances.

These advances have been complemented by an increased understanding of the structure of particular problems. For instance:

• The phase space properties of the number partitioning problem have many features that are reminiscent of the so-called random energy model in Statistical Physics.
• The NP-complete problem 1-in-k Satisfiability is structurally similar to the tractable problem 2-Satisfiability. The reason for this similarity is that the transition in both problems is determined by a “giant component percolation” phenomenon.


• The threshold of the NP-complete Hamiltonian cycle problem can be rigorously characterized, since it is essentially determined by a simpler (polynomial-time computable) property: the disappearance of nodes of degree at most 1 (a peeling process sketched at the end of this section).

Finally, the research has started to produce concrete algorithmic improvements.

It is important to ask the following question: why did previous work on Phase Transitions in Combinatorial Problems not result in a systematic theory? The answer is partly cultural. The connection between phase transitions and computational complexity does not extend to all problems and all algorithms, and would not be able to tackle issues such as the P ≠ NP question. To identify the extent of such a connection, one needs to consider more refined concepts of complexity classes and reductions, such as those from descriptive complexity theory [Imm99]. Such concepts are neither well known nor of significant interest to the Statistical Physics community, so this kind of research direction has not been considered so far by physicists, the main leaders of the research agenda in this area. This is an ideal opportunity for Computer Scientists to broaden the research agenda in the field, in a direction of substantial relevance. We strongly believe that the time is ripe for a systematic approach that will connect and extend these advances, and lead to a unified theory of phase transitions in combinatorial problems. This is an important intellectual challenge for Theoretical Computer Science, one that will yield substantial benefits to the field.

As for the idea of applying methods inspired by phase transition research to the analysis of problems arising from model checking: although formal methods have become a featured application of satisfiability research, the research connecting them to ideas from Statistical Physics is in its infancy. As one of the leading researchers in the area of formal methods, Gödel Prize winner Moshe Vardi, recently observed [TV05]: “... we lack realistic benchmarks of finite automata (as generators of random instances, in the automaton-based approach to model checking) [...] Hopefully, with the recent increase in popularity of finite-state formalisms in industrial temporal specification languages [...] such benchmarks will become available in the not too far future, enabling us to evaluate our findings on such benchmarks. [...] At any rate, gaining a deeper understanding why one method is better than another method is an important challenge.” This proposal is about developing the conceptual tools to address this challenge.
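A single elementary operation, repeatedly peeling vertices of degree at most 1, underlies both the core-peeling bisection heuristic of Section 3 and the Hamiltonian cycle threshold mentioned above. The following sketch (our illustration) computes the 2-core of a random graph G(n, p) by exactly this peeling, and reports its relative size as the average degree grows:

    import random

    def gnp(n, p, rng):
        """Erdos-Renyi random graph G(n, p) as an adjacency-set dictionary."""
        adj = {v: set() for v in range(n)}
        for u in range(n):
            for v in range(u + 1, n):
                if rng.random() < p:
                    adj[u].add(v)
                    adj[v].add(u)
        return adj

    def two_core(adj):
        """Repeatedly delete vertices of degree <= 1; the survivors form the
        2-core. This is the peeling step behind the core-peeling bisection
        heuristic and the degree-1 analysis of Hamiltonicity."""
        adj = {v: set(nb) for v, nb in adj.items()}           # work on a copy
        queue = [v for v, nb in adj.items() if len(nb) <= 1]
        while queue:
            v = queue.pop()
            if v not in adj:
                continue                                      # already peeled
            for u in adj.pop(v):
                if u in adj:
                    adj[u].discard(v)
                    if len(adj[u]) <= 1:
                        queue.append(u)
        return set(adj)

    rng = random.Random(3)
    n = 2000
    for c in (0.5, 1.0, 1.5, 2.0, 3.0):                       # average degree
        core = two_core(gnp(n, c / n, rng))
        print(f"average degree {c:3.1f}:  |2-core| / n ~ {len(core) / n:.2f}")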

References

[AJM04] D. Achlioptas, H. Jia, and C. Moore. Hiding satisfying assignments: two are better than one. In Proceedings of AAAI, 2004.
[BBD+96] S. Baumler, M. Balser, A. Dunets, W. Reif, and J. Schmitt. Verification of medical guidelines by model checking - a case study. In Proceedings of the 13th SPIN Workshop on Model Checking Software, 2006.
[BDG+04] G. Brat, D. Drusinsky, D. Giannakopoulou, A. Goldberg, K. Havelund, M. Lowry, C. Pasareanu, A. Venet, W. Visser, and R. Washington. Experimental evaluation of verification and validation tools on Martian rover software. Formal Methods in System Design, 25(1):167–198, 2004.
[BFVW04] R. Bordini, M. Fisher, W. Visser, and M. Wooldridge. Model-checking rational agents. IEEE Intelligent Systems, 19(5):46–52, 2004.


[BMMZ06] A. Braunstein, M. Mézard, R. Monasson, and R. Zecchina. Constraint satisfaction through survey propagation. In A. Percus, G. Istrate, and C. Moore, editors, Computational Complexity and Statistical Physics, pages 123–143. Oxford University Press, 2006.
[BMZ05] A. Braunstein, M. Mézard, and R. Zecchina. Survey propagation: an algorithm for satisfiability. Random Structures and Algorithms, 27:201–226, 2005.
[CGP99] E. Clarke, O. Grumberg, and D. Peled. Model Checking. MIT Press, 1999.
[CJM98] E. Clarke, S. Jha, and W. Marrero. A machine checkable logic of knowledge for specifying security properties of electronic commerce protocols. In LICS Security Workshop, 1998.
[Fei02] U. Feige. Relations between average-case complexity and approximation complexity. In Proceedings of the 34th ACM Symposium on Theory of Computing, pages 534–543, 2002.
[FV98] T. Feder and M. Vardi. The computational complexity of monotone monadic SNP and constraint satisfaction: a study through Datalog and group theory. SIAM Journal on Computing, 28(1):57–104, 1998.
[GJ79] M. Garey and D. Johnson. Computers and Intractability: A Guide to the Theory of NP-Completeness. W. H. Freeman and Company, 1979.
[GS05] C. Gomes and B. Selman. Computational science: Can get satisfaction. Nature, 435:751–752, 2005.
[Hay97] B. Hayes. Can't get no satisfaction. American Scientist, 85(2):108–112, March–April 1997.
[Hay03] B. Hayes. On the threshold. American Scientist, 91(1):12, 2003.
[HK06] J. Heath and M. Kwiatkowska. Probabilistic model checking of complex biological pathways. In Proc. Computational Methods in Systems Biology (CMSB'06), 2006.
[HMRS06] H. Hunt, M. Marathe, D. Rosenkrantz, and R. Stearns. Towards a predictive computational complexity theory for periodically specified problems: a survey. In A. Percus, G. Istrate, and C. Moore, editors, Computational Complexity and Statistical Physics, pages 319–367. Oxford University Press, 2006.
[HW05] A. Hartmann and M. Weigt. Phase Transitions in Combinatorial Optimization Problems. Wiley-VCH, 2005.
[Imm99] N. Immerman. Descriptive Complexity. Springer Graduate Texts in Computer Science, 1999.
[IPB05] G. Istrate, A. Percus, and S. Boettcher. Spines of random constraint satisfaction problems: definition and connection with computational complexity. Annals of Mathematics and Artificial Intelligence, 44(4):353–372, 2005.
[Ist00] G. Istrate. Computational complexity and phase transitions. In Proceedings of the 15th IEEE Annual Conference on Computational Complexity (CCC'00), 2000.
[Ist02] G. Istrate. The phase transition in random Horn satisfiability and its algorithmic implications. Random Structures and Algorithms, 20(4):483–506, 2002.
[Ist05] G. Istrate. Threshold properties of random boolean constraint satisfaction problems. Discrete Applied Mathematics, 153:141–152, 2005.


[Joh99] G. Johnson. Separating the insolvable and the merely difficult. New York Times, July 13 edition, 1999.
[KS92] H. Kautz and B. Selman. Planning as satisfiability. In Proceedings of the European Conference on Artificial Intelligence, 1992.
[Men01] T. Menzies. How AI can help SE; or: Randomized search not considered harmful. In Proceedings of the Fourteenth Canadian Conference on Artificial Intelligence, 2001.
[MIDV05] C. Moore, G. Istrate, D. Demopoulos, and M. Vardi. A continuous-discontinuous second-order transition in the satisfiability of a class of Horn formulas. In Proceedings of the International Workshop on Randomization and Computation (RANDOM'05), volume 3624 of Lecture Notes in Computer Science. Springer Verlag, 2005.
[MMW05] E. Maneva, E. Mossel, and M. Wainwright. A new look at survey propagation and its generalizations. In Proceedings of the 16th ACM-SIAM Symposium on Discrete Algorithms, 2005.
[MMZ06] S. Mertens, M. Mézard, and R. Zecchina. Threshold values of random k-SAT from the cavity method. Random Structures and Algorithms, 28(3), 2006.
[MZK+99] R. Monasson, R. Zecchina, S. Kirkpatrick, B. Selman, and L. Troyansky. Determining computational complexity from characteristic phase transitions. Nature, 400:133–137, 1999.
[Pap94] C. Papadimitriou. Computational Complexity. Addison-Wesley, 1994.
[PIM06] A. Percus, G. Istrate, and C. Moore, editors. Computational Complexity and Statistical Physics. Oxford University Press, 2006.
[RH04] P. Regan and S. Hamilton. NASA's mission reliable. Computer Magazine, January 2004.
[Rob05] S. Robinson. Math/physics collaboration sheds new light on computational hardness. SIAM News, May 2005.
[Sch78] T. J. Schaefer. The complexity of satisfiability problems. In Proceedings of the 10th ACM Symposium on Theory of Computing, pages 216–226. ACM Press, 1978.
[TV05] D. Tabakov and M. Vardi. Experimental evaluation of classical automata constructions. In Proc. of the 12th International Conference on Logic for Programming, Artificial Intelligence and Reasoning (LPAR'05), 2005.
[Wan97] J. Wang. Average-case complexity theory. In L. Hemaspaandra and A. Selman, editors, Complexity Theory Retrospective II, pages 295–328. Springer Verlag, 1997.
[WV94] P. Wolper and M. Vardi. Reasoning about infinite computations. Information and Computation, 115(1):1–37, 1994.
[WZ06] M. Weigt and H. Zhou. Message passing for vertex cover. Technical report, arXiv.org e-print cond-mat/0605190, 2006.
