Coarsest Controllability-Preserving Minimization for Nondeterministic Plants

J. Markovski

Abstract—A recent process-theoretic approach to supervisory control theory identified a so-called partial bisimilarity preorder as a suitable behavioral relation to capture the notion of controllability for nondeterministic discrete-event systems. The equivalence relation induced by the partial bisimilarity preorder can then be employed to minimize the unsupervised system, referred to as the plant, and to optimize the process of supervisor synthesis. We present an efficient minimization algorithm that computes the partial bisimilarity equivalence by partitioning the state space of the plant. Our algorithm has space and time complexity comparable to that of the efficient counterpart algorithms for minimization by simulation equivalence.

J. Markovski is affiliated with the Eindhoven University of Technology, P.O. Box 513, NL-5600 MB Eindhoven, The Netherlands, [email protected]. This work is part of the EU project C4C, contract number FP7-ICT-223844.

I. INTRODUCTION

Supervisory control theory of discrete-event systems [1], [2] deals with the synthesis of high-level supervisory controllers based upon formal models of the hardware and the control requirements. The supervisory controller observes the discrete-event behavior of the machine by receiving sensor signals from ongoing activities. Based upon these signals, it decides which activities are allowed to be carried out and sends back control signals to the hardware actuators. Under the assumption that the supervisory controller can react sufficiently fast to machine input, one can model this feedback loop as a pair of synchronizing processes. The model of the machine, referred to as the plant, is restricted by the model of the controller, referred to as the supervisor. Traditionally, the plant is modeled as a set of observable traces of events, given as a set of synchronizing automata whose joint recognized language corresponds to the observed traces. The events are split into controllable events, which can be disabled by the supervisor in the synchronous composition, and uncontrollable events, which must always be allowed by the supervisor. The control requirements specify the allowed behavior, again as sequences of events, leading to event-based supervisory control theory [1], [2]. The basic notion in supervisory control is controllability, which identifies sufficient and necessary conditions for the existence of a supervisor for a given plant that satisfies the control requirements while never disabling uncontrollable events. A coalgebraic approach to supervisory control introduced partial bisimulation as a behavioral relation suitable to define controllability of deterministic discrete-event systems [3]. In essence, it states that controllable events should be simulated, i.e., whenever the supervised plant can execute an event, the original plant must also have enabled that event, whereas

uncontrollable events should be bisimulated, i.e., whenever the original plant can perform an uncontrollable event, the supervisor must also allow that event if the trace leading to it was enabled. This relation between the supervised and the original plant ensures that no uncontrollable events are disabled in any reachable supervised state. A recent process-theoretic approach employed the preorder induced by partial bisimulation to cater for supervisory control of nondeterministic discrete-event systems as well [4]. Nondeterminism naturally occurs in systems with multiple parallel components, and it enables abstract (under)specifications for greater modeling convenience [5]. Note that nondeterministic automata are not disallowed in [1], but the semantics remains in terms of accepted languages. Previous work dealt with trace-based [6], [7] and failure-based semantics [8], [9]. Even though it has been argued that refinements for failure and bisimulation semantics have similar properties [10], we consider (bi)simulation an elegant notion to capture nondeterminism [5], [11]. Moreover, there exist efficient partitioning algorithms for minimization by (bi)simulation [12], already employed in the deterministic setting to optimize supervisor synthesis [13]. In this paper, we give a minimization algorithm for computing the coarsest partial bisimilarity quotient of the plant, which by definition preserves controllability [4]. To the best of our knowledge, this is the first proposal for minimization of nondeterministic plants that preserves controllability. We note that in the deterministic case any behavioral relation would suffice, as in that case all behavioral relations coincide with trace semantics [11], [13]. The worst-case time complexity of our algorithm is O(ST + AS³) for a given labeled transition system with S states, T transitions, and A action labels, while exhibiting a space complexity of O(AS² log S + N log S). For comparison, the best minimization algorithm for the simulation preorder, which is less complicated due to simpler stability conditions, has a better, but comparable, time complexity of O(ST) [14], whereas we maintain the best space complexity [12], [15]. We note that the proofs of the claims made in this paper can be found in [16].

II. PARTIAL BISIMILARITY AND PARTITION PAIRS

The underlying models that we consider are labeled transition systems (with successful termination options), following the notation of [5]. A labeled transition system G is a tuple G = (S, A, ↓, →), where S is a set of states, A a set of event labels, ↓ ⊆ S a successful termination predicate that takes the role of marked states [1],


we denote that there exists p' ∈ P' such that p →ᵃ p'. We distinguish two types of (Galois) transitions between the partition classes [12]: P →ᵃ∃ P', if there exists p ∈ P such that p →ᵃ P', and P →ᵃ∀ P', if for every p ∈ P it holds that p →ᵃ P'. It is straightforward that P →ᵃ∀ P' implies P →ᵃ∃ P'. Also, if P →ᵃ∀ P', then Q →ᵃ∀ P' for every Q ⊆ P. We define the stability conditions for partial bisimilarity equivalence of a partition pair with respect to the termination predicate and the transition relation.
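For illustration, the two Galois transition types can be computed directly from the transition relation. The following is a minimal sketch under our own data layout (an LTS as a set of (p, a, q) triples); it is not the counter-based representation used later in the paper.

```python
# Sketch: decide whether P -a->_exists P' and P -a->_forall P' hold for two
# partition classes P and P'. Data layout and names are ours.

def class_transitions(trans, P, a, P_prime):
    """Return a pair (exists, forall) of booleans for the two Galois
    transitions between the classes P and P_prime under label a."""
    reaches = [any((p, a, q) in trans for q in P_prime) for p in P]
    return any(reaches), all(reaches)

# Example: only state 1 reaches class {3} via 'a', so the existential
# transition holds while the universal one does not.
trans = {(1, 'a', 3), (2, 'a', 4)}
print(class_transitions(trans, {1, 2}, 'a', {3}))  # (True, False)
```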

[2], [4], and → ⊆ S × A × S is the labeled transition relation. For p, q ∈ S and a ∈ A, we write p →ᵃ q if (p, a, q) ∈ → and p↓ if p ∈ ↓.

Definition 1 A relation R ⊆ S × S is a partial bisimulation with respect to the bisimulation action set B ⊆ A if for all p, q ∈ S such that (p, q) ∈ R it holds that:
1) if p↓, then q↓;
2) if p →ᵃ p' for some a ∈ A, then there exists q' ∈ S such that q →ᵃ q' and (p', q') ∈ R;
3) if q →ᵇ q' for some b ∈ B, then there exists p' ∈ S such that p →ᵇ p' and (p', q') ∈ R.
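Definition 1 can be checked directly on a finite labeled transition system. The following is a minimal, illustrative sketch of such a check; the data layout and function names are ours and not part of the paper.

```python
# Sketch of Definition 1: check whether a candidate relation R is a partial
# bisimulation for the bisimulation action set B.

def is_partial_bisimulation(R, trans, term, B):
    """R: set of (p, q) pairs; trans: set of (p, a, q) triples;
    term: set of successfully terminating states; B: subset of the labels."""
    labels = {a for (_, a, _) in trans}

    def succ(p, a):
        return {q for (s, l, q) in trans if s == p and l == a}

    for (p, q) in R:
        if p in term and q not in term:                      # condition 1
            return False
        for a in labels:                                     # condition 2
            if any(not any((p1, q1) in R for q1 in succ(q, a))
                   for p1 in succ(p, a)):
                return False
        for b in B:                                          # condition 3
            if any(not any((p1, q1) in R for p1 in succ(p, b))
                   for q1 in succ(q, b)):
                return False
    return True
```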

Definition 2 Let G = (S, A, ↓, →) be a labeled graph. We say that (P, ⊑) ∈ P over G is stable with respect to ↓, →, and B ⊆ A if the following conditions are fulfilled:
a. For all P ∈ P, it holds that P↓ or P↓̸.
b. For all P, Q ∈ P, if P ⊑ Q and P↓, then Q↓.
c. For every P, Q, P' ∈ P and a ∈ A, if P ⊑ Q and P →ᵃ∃ P', there exists Q' ∈ P with P' ⊑ Q' and Q →ᵃ∀ Q'.
d. For every P, Q, Q' ∈ P and b ∈ B, if P ⊑ Q and Q →ᵇ∃ Q', there exists P' ∈ P with P' ⊑ Q' and P →ᵇ∀ P'.
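As an illustration only, the four conditions can be checked naively as follows, reusing class_transitions from the sketch above; the partition is a set of frozensets of states and lb holds the proper little brother pairs. This is our own sketch, not the counter-based representation developed in Section IV.

```python
# Naive check of the stability conditions a-d of Definition 2. `lb` contains
# only proper little brother pairs; the reflexive pairs of the partial order
# are added below. Names and layout are ours.

def is_stable(partition, lb, trans, term, B):
    labels = {a for (_, a, _) in trans}
    order = set(lb) | {(P, P) for P in partition}      # reflexive closure
    for P in partition:                                # condition a
        if 0 < len(P & term) < len(P):
            return False
    for (P, Q) in order:
        if P <= term and not Q <= term:                # condition b
            return False
        for a in labels:                               # condition c
            for P1 in partition:
                if class_transitions(trans, P, a, P1)[0] and not any(
                        (P1, Q1) in order and class_transitions(trans, Q, a, Q1)[1]
                        for Q1 in partition):
                    return False
        for b in B:                                    # condition d
            for Q1 in partition:
                if class_transitions(trans, Q, b, Q1)[0] and not any(
                        (P1, Q1) in order and class_transitions(trans, P, b, P1)[1]
                        for P1 in partition):
                    return False
    return True
```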

If (p, q) ∈ R, we say that p is partially bisimilar to q and we write p ≤B q. If q ≤B p holds as well, we write p ↔B q. Note that ≤B is a preorder relation, making ↔B an equivalence relation for all B ⊆ A [4]. If B = ∅, then ≤∅ coincides with the strong similarity preorder and ↔∅ coincides with strong similarity equivalence [5], [11]. When B = A, ↔A turns into strong bisimilarity [5], [11]. For p, r, and s labeled transition systems, representing the plant, the control requirements, and the supervisor, respectively, we write p | s ≤∅ r and p | s ≤U p for the conditions of controllability, where U ⊆ A is the set of uncontrollable events [4]. This setting covers both the existing deterministic and nondeterministic definitions of controllability for discrete-event systems [4]. From the definition, it is not difficult to observe that one obtains the same supervised behavior for every p' ↔U p. Thus, one can apply minimization by partial bisimilarity equivalence to obtain the coarsest representation of the plant that preserves the same supervised behavior. We compute this quotient using partitioning algorithms for the states of the labeled transition system. To this end, we need to define so-called little and big brother states. Let p →ᵃ p' and p →ᵃ p'' with p' ≤B p''. Then we say that p' is a little brother of p'', or that p'' is a big brother of p'. The little brothers play an important role in defining the quotient of a labeled transition graph.

Given a relation R ⊆ S × T on some sets S and T, define R⁻¹ ⊆ T × S as R⁻¹ = {(t, s) | (s, t) ∈ R}. If R is a preorder, then R ∩ R⁻¹ is an equivalence relation. The following theorem shows that every partial bisimulation preorder induces a stable partition pair.

Theorem 2 Let G = (S, A, ↓, →) and let R be a partial bisimulation preorder over S with respect to B ⊆ A. Let ↔B = R ∩ R⁻¹. If P = S/↔B and, for all p, q ∈ S, (p, q) ∈ R implies [p]↔B ⊑ [q]↔B, then (P, ⊑) ∈ P is stable with respect to ↓, →, and B.

Stable partition pairs, in turn, induce partial bisimulation preorders.

Theorem 3 Let G = (S, A, ↓, →) and (P, ⊑) ∈ P. Define R = {(p, q) ∈ P × Q | P ⊑ Q}. If (P, ⊑) is stable with respect to ↓, →, and B ⊆ A, then R is a partial bisimulation preorder for B.

Next, we define a relation ⊏ ⊆ P × P that identifies when one partition pair is finer than another with respect to inclusion.

Theorem 1 Let p ≤B q ≤B r for p, q, r ∈ S. Then a.p + a.q ↔B a.q if a ∉ B, and b.p + b.q + b.r ↔B b.p + b.r if b ∈ B.

Definition 3 Let (P, ⊑) and (P', ⊑') be partition pairs. We say that (P, ⊑) is finer than (P', ⊑'), notation (P, ⊑) ⊏ (P', ⊑'), if and only if for all P, Q ∈ P such that P ⊑ Q there exist P', Q' ∈ P' such that P ⊆ P', Q ⊆ Q', and P' ⊑' Q'.

Theorem 1 states how to eliminate little brothers when dividing states. In the sequel, we represent the partial bisimilarity preorder by means of partition-relation pairs [12]. The partition identifies mutually partially bisimilar states, whereas the relation represents the little brother relation. Let G = (S, A, ↓, →) and let P ⊆ 2^S. The set P is a partition over S if ⋃_{P ∈ P} P = S and, for all P, Q ∈ P, if P ∩ Q ≠ ∅, then P = Q. A partition pair over G is a pair (P, ⊑), where P is a partition over S and the (little brother) relation ⊑ ⊆ P × P is a partial order, i.e., a reflexive, antisymmetric, and transitive relation. We denote the set of partition pairs by P. The refinement operator always produces partition pairs with the little brother relation being a partial order, provided that the little brother relation of the initial partition pair is a partial order [12], [15]. For all P ∈ P, we write P↓ and P↓̸ if p↓ for all p ∈ P and p↓̸ for all p ∈ P, respectively. For P' ∈ P, by p →ᵃ P'

The relation ⊏ as given in Definition 3 is a partial order. The following theorem states that coarser partition pairs with respect to ⊏ produce coarser partial bisimulation preorders.

Theorem 4 Let G = (S, A, ↓, →) and (P1, ⊑1), (P2, ⊑2) ∈ P. Define Ri = {(pi, qi) ∈ Pi × Qi | Pi ⊑i Qi} for i ∈ {1, 2}. Then (P1, ⊑1) ⊏ (P2, ⊑2) if and only if R1 ⊆ R2.

Next, for every two stable partition pairs over a labeled graph, there exists a ⊏-coarser stable partition pair.

Theorem 5 Let G = (S, A, ↓, →) and let (P1, ⊑1), (P2, ⊑2) ∈ P be stable with respect to ↓, →, and B ⊆ A. Then there exists (P3, ⊑3) ∈ P that is also stable, with (P1, ⊑1) ⊏ (P3, ⊑3) and (P2, ⊑2) ⊏ (P3, ⊑3).

Theorem 5 implies that stable partition pairs form an upper lattice with respect to ⊏. Now, it is not difficult to observe that finding the ⊏-maximal stable partition pair over a labeled graph G coincides with the problem of finding the coarsest partial bisimulation preorder over G.


Theorem 6 Let G = (S, A, ↓, →). The ⊏-maximal (P, ⊑) ∈ P that is stable with respect to ↓, →, and B ⊆ A is induced by the partial bisimilarity preorder ≤B, i.e., P = S/↔B and [p]↔B ⊑ [q]↔B if and only if p ≤B q.

Theorem 7 enables us to update the topological sorting by locally replacing each class with the results of the splitting, without having to re-compute the whole sorting in every iteration, as is done in [12], [15]. As a result, the classes whose nodes belong to the same parent are neighboring with respect to the topological sorting. Moreover, it also provides us with a procedure for searching for a little or a big brother of a given class: all little brothers of a given class are sorted in descending order to its left, and all big brothers are sorted in ascending order to its right. Now, we can define a refinement fix-point operator S. It takes as input (Pi, ⊑i) ∈ P and an induced parent partition pair (P'i, ⊑'i), with (Pi, ⊑i) ⊏ (P'i, ⊑'i), for some i ∈ N, which are stable with respect to each other. Its result is (Pi+1, ⊑i+1) ∈ P and a parent partition P'i+1 such that (Pi+1, ⊑i+1) ⊏ (Pi, ⊑i) and (P'i+1, ⊑'i+1) ⊏ (P'i, ⊑'i). Note that P'i and P'i+1 differ only in one class, which is induced by the splitter that we employ to refine Pi to Pi+1. This splitter comprises classes of Pi that are strict subsets of some class of P'i. The refinement stops when a fix point is reached for some m ∈ N with Pm = P'm. In the following, we omit partition pair indices when they are clear from the context. Now, suppose that (P, ⊑) ∈ P has P' as parent with (P, ⊑) ⊏ (P', ⊑'), where ⊑' is induced by ⊑. Condition a of Definition 2 requires that all states in a class have, or alternatively do not have, termination options. We resolve this issue by choosing a stable initial partition pair, for i = 0, that fulfills this condition, i.e., for all classes P ∈ P0 it holds that either P↓ or P↓̸. For condition b, we specify ⊑0 such that P ⊑0 Q with P↓ holds only if Q↓ holds as well. Thus, following the initial refinement, we only need to ensure that stability conditions c and d are satisfied, as shown in Theorem 9 below. For convenience, we rewrite these stability conditions for (P, ⊑) with respect to (P', ⊑').

Theorem 6, supported by Theorem 5, induces an algorithm for computing the coarsest mutual partial bisimulation over a labeled transition system G = (S, A, ↓, →) by computing the ⊏-maximal partition pair (P, ⊑) such that (P, ⊑) ⊏ ({S}, {(S, S)}). We develop an iterative algorithm that refines this partition pair until it reaches the ⊏-maximal stable partition pair.

III. REFINEMENT OPERATOR

We refine the partitions by splitting them in the vein of [17], i.e., we choose subsets of nodes that do not adhere to the stability conditions, referred to as splitters, in combination with the other nodes from the same class, and, consequently, we place them in a separate class. To this end, we define parent partitions and splitters.

Definition 4 Let (P, ⊑) ∈ P be defined over S. Partition P' is a parent partition of P if for every P ∈ P there exists P' ∈ P' with P ⊆ P'. The relation ⊑ induces a little brother relation ⊑' on P', defined by P' ⊑' Q' for P', Q' ∈ P' if there exist P, Q ∈ P such that P ⊆ P', Q ⊆ Q', and P ⊑ Q. Let S' ⊆ P' for some P' ∈ P' and put T' = P' \ S'. The set S' is a splitter of P' with respect to P if for every P ∈ P with P ⊆ P' either P ⊆ S' or P ∩ S' = ∅, where S' ⊑' T' or S' and T' are unrelated. The splitter partition is P' \ {P'} ∪ {S', T'}.

A consequence of Definition 4 is that (P, ⊑) ⊏ (P', ⊑'). Note that P' contains a splitter if and only if P' ≠ P. For the implementation of the refinement operator we need the notion of a topological sorting. A topological sorting with respect to a preorder relation is a linear ordering of elements such that topologically "smaller" elements are not preorder-wise greater with respect to each other.

Definition 6 Let (P, ⊑) ∈ P and let (P', ⊑') be its parent partition pair, where for all P' ∈ P' either P'↓̸ or P'↓. Then (P, ⊑) is stable with respect to P' and B ⊆ A if:
1) For all P ∈ P, a ∈ A, and P' ∈ P', if P →ᵃ∃ P', there exists Q' ∈ P' with P' ⊑' Q' and P →ᵃ∀ Q'.
2) For all P ∈ P, b ∈ B, and P' ∈ P', if P →ᵇ∃ P', there exists Q' ∈ P' with Q' ⊑' P' and P →ᵇ∀ Q'.
3) For all P, Q ∈ P, a ∈ A, and P' ∈ P', if P ⊑ Q and P →ᵃ∀ P', there exists Q' ∈ P' with P' ⊑' Q' and Q →ᵃ∀ Q'.
4) For all P, Q ∈ P, b ∈ B, and Q' ∈ P', if P ⊑ Q and Q →ᵇ∀ Q', there exists P' ∈ P' with P' ⊑' Q' and P →ᵇ∀ P'.

Definition 5 Let (P, ⊑) ∈ P. We say that ⪯ is a topological sorting over P induced by ⊑ if for all P, Q ∈ P it holds that P ⪯ Q if and only if Q ⋢ P.

Definition 5 implies that if P ⪯ Q, then either P ⊑ Q or P and Q are unrelated. In general, topological sortings are not uniquely defined. A topological sorting can be represented as a list ⪯ = [P1, P2, . . . , Pn], for some n ∈ N, where P = {Pi | i ∈ {1, . . . , n}} and Pi ⪯ Pj for 1 ≤ i ≤ j ≤ n. The following property provides for efficient updating of the topological order.
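As a small illustration of Definition 5 (our own sketch, with hypothetical names), a list representation of a topological sorting can be produced by repeatedly emitting a class whose little brothers have all been emitted already, so that little brothers always end up to the left of their big brothers.

```python
# Sketch: compute a list [P1, ..., Pn] in which no class appears before one
# of its little brothers. `lb` maps each class to the set of its little
# brothers; class names and the mapping are hypothetical.

def topological_sorting(classes, lb):
    order, placed, remaining = [], set(), set(classes)
    while remaining:
        ready = next(C for C in remaining if lb.get(C, set()) <= placed)
        order.append(ready)
        placed.add(ready)
        remaining.remove(ready)
    return order

# Example: Q is a little brother of P, so Q must precede P in the sorting.
print(topological_sorting(['P', 'Q'], {'P': {'Q'}}))  # ['Q', 'P']
```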

It is not difficult to observe that stability conditions 1-4 replace stability conditions c and d of Definition 2. They are equivalent when P = P 0 , which is the goal of our fix point refinement operation. From now on, we refer to the stability conditions above instead of the ones in Definition 2.

Theorem 7 Let (P1, ⊑1) ∈ P with P1 = {P1, . . . , Pn} and let ⪯1 = [P1, P2, . . . , Pn] be a topological order over P1 induced by ⊑1. Suppose that Pk ∈ P1, for some 1 ≤ k ≤ n, is split into Q1 and Q2 such that Pk = Q1 ∪ Q2 and Q1 ∩ Q2 = ∅, resulting in P2 = P1 \ {Pk} ∪ {Q1, Q2} with (P2, ⊑2) ⊏ (P1, ⊑1). Suppose that either Q1 ⊑2 Q2 or Q1 and Q2 are unrelated. Then ⪯2 = [P1, . . . , Pk−1, Q1, Q2, Pk+1, . . . , Pn] is a topological sorting over P2 induced by ⊑2.


The form of the stability conditions is useful, as conditions 1 and 2 are used to refine the splitters, whereas conditions 3 and 4 are used to adjust the little brother relation. Moreover, if the conditions of Definition 6 are not fulfilled for (P, ⊑) ⊏ (P', ⊑'), then the partition pair (P, ⊑) is not stable.

The refinement operator ultimately produces the coarsest stable partition pair with respect to a labeled graph.

Theorem 12 Let G = (S, A, ↓, →), let (P0, ⊑0) be the initial stable partition pair and P'0 the initial parent partition as given by Definition 7, and let S'0 be a splitter. Suppose that (Pc, ⊑c) is the coarsest stable partition pair with respect to ↓, →, and B ⊆ A. Then there exist partitions P'i and splitters S'i for i ∈ {1, . . . , n} such that S(Pi, ⊑i, P'i, S'i) are well defined, with Pn = P'n and (Pn, ⊑n) = (Pc, ⊑c).

Theorem 8 Let (P, ⊑) ∈ P, let P' be a parent partition, and suppose that the conditions of Definition 6 do not hold. Then (P, ⊑) is not stable.

The initial stable partition pair and parent partition are induced by the termination options and outgoing transitions of the comprising states. To this end, we define the set of outgoing labels of a state p ∈ S to be OL(p) = {a ∈ A | p →ᵃ}. Let P ⊆ S. If for all p, q ∈ P we have that OL(p) = OL(q), we define OL(P) = OL(p) for any p ∈ P.

We can summarize the high-level algorithm for computing the coarsest partition pair in Algorithm 1.

Definition 7 Let G = (S, A, ↓, →), let P↓̸0 = {p ∈ S | p↓̸}, and let P↓0 = S \ P↓̸0. The initial parent partition is given by {P↓̸0, P↓0}, where P↓̸0 or P↓0 is omitted if empty. The initial stable partition pair (P0, ⊑0) is defined as the coarsest partition pair where, for every P ∈ P0, either P↓̸ or P↓ holds, OL(P) is well-defined, and for every P, Q ∈ P0, if OL(P) = OL(Q), then P = Q. For every P, Q ∈ P0, P ⊑0 Q holds if and only if OL(P) ∩ B = OL(Q) ∩ B, OL(P) ⊆ OL(Q), and, if P↓, then Q↓ as well.
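To illustrate Definition 7, the initial classes and the initial little brother pairs can be derived directly from the termination options and the OL sets. This is a sketch under our own data layout, not the tree-based Algorithms 2 and 3 given later.

```python
# Sketch of Definition 7: group states by termination option and outgoing
# label set, and derive the initial little brother pairs from the OL sets.

def initial_partition_pair(states, trans, term, B):
    OL = {p: frozenset(a for (s, a, _) in trans if s == p) for p in states}
    classes = {}
    for p in states:
        classes.setdefault((p in term, OL[p]), set()).add(p)

    def little_brother(kp, kq):
        (tp, olp), (tq, olq) = kp, kq
        return olp & B == olq & B and olp <= olq and (not tp or tq)

    lb = {(kp, kq) for kp in classes for kq in classes if little_brother(kp, kq)}
    return classes, lb

# Example: with B = {'a'}, a state with OL {a} ends up in a class that is a
# little brother of the class with OL {a, c}, since the sets agree on B and
# are related by inclusion.
P0, lb0 = initial_partition_pair(
    {1, 2}, {(1, 'a', 1), (2, 'a', 2), (2, 'c', 1)}, set(), frozenset({'a'}))
```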

Algorithm 1: Algorithm for computing the coarsest stable partition pair for G = (S, A, ↓, →) and B ⊆ A
  Compute the initial stable partition pair (P0, ⊑0) and parent partition P'0 over S with respect to ↓, →, and B ⊆ A;
  while P0 ≠ P'0 do
    P := P0; P' := P'0;
    Find a splitter S' for P' with respect to P;
    Compute (P0, ⊑0) := S(P, ⊑, P', S');
    P'0 := P' \ {P'} ∪ {S', P' \ S'};

The algorithm implements the refinement steps by splitting a parent P' ∈ P' into S' and P' \ S' and, subsequently, splitting every class in P with respect to the splitter S' in order to satisfy the stability conditions. Using Theorem 1, the minimized labeled transition system has the classes P ∈ P0 as states instead of the original states, and P →ᵃ Q for a ∉ B if P →ᵃ∀ Q and there does not exist R ≠ Q with Q ⊑ R and P →ᵃ∀ R, and P →ᵇ Q for b ∈ B if P →ᵇ∀ Q and there do not exist R1, R2 ≠ Q with R1 ⊑ Q ⊑ R2, P →ᵇ∀ R1, and P →ᵇ∀ R2.
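The transition pruning described above can be illustrated as follows; this is our own sketch, where forall_trans(P, a) is assumed to return the classes Q with P →ᵃ∀ Q and lb is assumed to hold the little brother pairs between classes of the final partition.

```python
# Sketch of the quotient construction: classes become states and, following
# Theorem 1, transitions to little brothers are pruned. The helper
# forall_trans and the relation lb are assumptions, not the paper's API.

def quotient_transitions(classes, labels, B, forall_trans, lb):
    result = set()
    for P in classes:
        for a in labels:
            targets = forall_trans(P, a)
            for Q in targets:
                bigger = any(R != Q and (Q, R) in lb for R in targets)
                smaller = any(R != Q and (R, Q) in lb for R in targets)
                if a not in B:
                    keep = not bigger                 # keep only maximal targets
                else:
                    keep = not (bigger and smaller)   # drop "middle" targets only
                if keep:
                    result.add((P, a, Q))
    return result
```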

For every stable (P, ⊑) ∈ P, we have (P, ⊑) ⊏ (P0, ⊑0); otherwise, some stability condition of Definition 2 fails.

Theorem 9 Let (P, ⊑) ∈ P and let (P0, ⊑0) be given as in Definition 7. If (P, ⊑) is stable, then (P, ⊑) ⊏ (P0, ⊑0).

Next, we give the fix-point refinement operator S that is iteratively applied to the initial stable partition pair (P0, ⊑0) and P'0.

Definition 8 Let (P, ⊑) ∈ P and let P' be a parent partition of P with P ≠ P'. Let ⪯ be a topological sorting over P induced by ⊑. Let S' ⊂ P' for some P' ∈ P' be a splitter of P' with respect to P. Suppose that P' = ⋃_{i=1}^{k} Pi for some Pi ∈ P and k > 1, with P1 ⪯ . . . ⪯ Pk and S' = P1 ∪ . . . ∪ Ps for some 1 ≤ s < k. Put T' = P' \ S'. Define S(P, ⊑, P', S') = (Pr, ⊑r), where (Pr, ⊑r) is the coarsest partition pair with (Pr, ⊑r) ⊏ (P, ⊑) that is stable with respect to P' \ {P'} ∪ {S', T'}.

IV. MINIMIZATION ALGORITHM

We give alternative representations of the sets and relations required for the computation of the refinement operator in order to provide a computationally efficient algorithm. The partition is represented as a list of states that preserves the topological order induced by ⊑, whereas the parent partition is a list of partition classes. The little brother relation ⊑ is given as a table, whereas for ⊑' we use a counter cnt⊑(P', Q') that keeps the number of pairs (P, Q), for P, Q ∈ P, such that P ⊆ P', Q ⊆ Q', P ≠ Q, and P ⊑ Q. When splitting P' into S' and T', we have cnt⊑(P', P') = cnt⊑(S', S') + cnt⊑(S', T') + cnt⊑(T', T'). We keep only one Galois relation →∃∀ = →∀ ∪ →∃, with two counters cb∀(P, a, P') and cℓ∀(P, b, P') for P ∈ P, P' ∈ P', a ∈ A, and b ∈ B, where cb∀(P, a, P') keeps the number of Q' ∈ P' with P' ⊑' Q' and P →ᵃ∀ Q', and cℓ∀(P, b, P') keeps the number of Q' ∈ P' with Q' ⊑' P' and P →ᵇ∀ Q'. This way, we can check the conditions of Definition 6 efficiently. By := we denote assignment and, for compactness, we write Y op= X instead of Y := Y op X for op ∈ {+, −, \, ∪}. The initial stable partition pair is computed in three steps. The first step, given by Algorithm 2, groups the nodes into

The existence of the coarsest partition pair (Pr, ⊑r) is guaranteed by Theorem 5. Once a stable partition pair is reached, it is no longer refined.

Theorem 10 Let G = (S, A, ↓, →) and let (P, ⊑) ∈ P over S be stable with respect to ↓, →, and B ⊆ A. For every parent partition P' such that P' ≠ P and every splitter S' of P' with respect to P, it holds that S(P, ⊑, P', S') = (P, ⊑).

When refining two partition pairs (P1, ⊑1) ⊏ (P2, ⊑2) with respect to the same parent partition and splitter, the resulting partition pairs are also related by ⊏.

Theorem 11 Let (P1, ⊑1), (P2, ⊑2) ∈ P with (P1, ⊑1) ⊏ (P2, ⊑2). Let P' be a parent partition of P2 and let S' be a splitter of P' with respect to P2. Then S(P1, ⊑1, P', S') ⊏ S(P2, ⊑2, P', S').

classes according to their outgoing labels. This algorithm is also used to compute the initial partition when performing minimization by bisimulation [17]. It employs a binary tree to decide in which class to place a state, by encoding that children in the left subtree do not have the associated label as an outgoing label, whereas those in the right subtree do. We assume that the action labels are given by a set A = {a1, . . . , an}, where a1, . . . , ak ∈ B, so the tree has height n.
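The paper's Algorithm 2 (below) realizes this grouping with a binary tree of height n. As an aside, the same grouping can be sketched with a dictionary keyed by the outgoing label set, which may help when reading the pseudocode; the names here are ours.

```python
# Sketch: group states into initial classes by their outgoing label sets
# (and termination option), which is what Algorithm 2 and the first part of
# Algorithm 3 compute with a binary tree of height n.

def group_by_outgoing_labels(states, trans, term):
    OL = {p: frozenset(a for (s, a, _) in trans if s == p) for p in states}
    groups = {}
    for p in states:
        groups.setdefault((OL[p], p in term), set()).add(p)
    return groups   # each value is one initial class

# Example: states 1 and 2 share OL {a} and land in one class, while state 3
# with OL {a, b} forms its own class.
print(group_by_outgoing_labels(
    {1, 2, 3}, {(1, 'a', 2), (2, 'a', 1), (3, 'a', 1), (3, 'b', 2)}, set()))
```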

Algorithm 2: SortStatesByOL()
  new(root);
  for p ∈ S do
    move := root;
    for i := 1, . . . , n − 1 do
      if ai ∈ OL(p) then
        if move.right = null then new(move.right);
        move := move.right;
      else
        if move.left = null then new(move.left);
        move := move.left;
    if an ∈ OL(p) then
      if move.right = null then new(move.right);
      move := move.right;
      move.set := move.set ∪ {p};
    else
      if move.left = null then new(move.left);
      move := move.left;
      move.set := move.set ∪ {p};
  Return root;

When the binary tree is computed, we only have the classes, so we have to compute the little brother pairs as well. Recall that the left subtree at level i, for 1 ≤ i ≤ n, leads to classes that do not have the action ai among their outgoing labels, whereas the right subtree leads to classes that do. The initial little brother relation is based on the inclusion of outgoing label sets. Thus, if we traverse the tree by recursively visiting first the left subtree, then the root, and finally the right subtree, and keep track of corresponding subtrees that comprise the same or bigger sets of outgoing labels, we can fill in the little brother pairs as given in Algorithm 3. When the states of the labeled graph are sorted according to their outgoing labels and the little brother pairs are computed, we have to initialize the partition pair and its parent partition, and compute cb∀ and cℓ∀. The parent partition comprises only the two classes P↓0 and P↓̸0. Finally, we have to ensure that the initial partition pair is stable by invoking the procedure for splitting, as given in Algorithm 4. Once the initialization is finished, we proceed with the iterations as indicated by Algorithm 1. The refinement operator is implemented in two steps, given by Algorithms 5 and 6. The former finds a splitter and updates →∃∀, cnt⊑, cb∀, and cℓ∀, whereas the latter splits the classes of P and updates the little brother relation and the counters. The actual splitting of a class is performed by Algorithm 7, which first checks for the existence of little or big brothers, depending on which the splitting of the class is postponed or not. When splitting a class, we employ Algorithm 8 to update the little brother relation. Due to the updating of the little brother relation,

Algorithm 3: InitialPP(move, BBNodes, ℓ)
  if move.left = null and move.right = null then
    P↓̸ := {p ∈ move.set | p↓̸}; P↓ := move.set \ P↓̸;
    if P↓̸ ≠ ∅ then
      Make new class P↓̸; par(P↓̸) := P↓̸0; P↓̸0 ∪= {P↓̸};
      P↓̸ := P↓̸ · [P↓̸];
      for b ∈ BBNodes do b.LBClasses ∪= {P↓̸};
      for Q ∈ move.LBClasses such that Q↓̸ do
        ⊑ ∪= {(Q, P↓̸)}; cnt⊑(P↓̸0, P↓̸0) += 1;
    if P↓ ≠ ∅ then
      Make new class P↓; par(P↓) := P↓0; P↓0 ∪= {P↓};
      P↓ := P↓ · [P↓];
      for b ∈ BBNodes do b.LBClasses ∪= {P↓};
      for Q ∈ move.LBClasses do
        ⊑ ∪= {(Q, P↓)};
        if Q↓̸ then cnt⊑(P↓̸0, P↓0) += 1; else cnt⊑(P↓0, P↓0) += 1;
  newBBNodes := ∅;
  if move.left ≠ null then
    for b ∈ BBNodes do
      if b.left ≠ null then newBBNodes ∪= {b.left};
      if b.right ≠ null and ℓ > k then newBBNodes ∪= {b.right};
    InitialPP(move.left, newBBNodes, ℓ + 1);
  if move.right ≠ null then
    for b ∈ BBNodes do
      if b.right ≠ null then newBBNodes ∪= {b.right};
    InitialPP(move.right, newBBNodes, ℓ + 1);

it can happen that pairs in the little brother relation no longer hold; these failed pairs are kept in F. Then, we have to update cb∀ and cℓ∀ as given by Algorithm 9 and Algorithm 10, respectively. The complexity of computing the initial partition pair is O(AS), where A is the number of event labels and S is the number of classes. The complexity of splitting the classes is the same as the splitting of classes for minimization by bisimulation [17], given by O(ST) for S classes and T transitions. For updating the little brother pairs, we note that the algorithm performs S iterations, as there will be S splitters. Moreover, for each iteration there are at most S² little brother pairs. For the iterations regarding the eliminated little brother pairs, we note that for each failed pair we have to update 2S counters. However, there can be at most S² failed pairs in total, leading to a complexity of O(S³) for updating the little brother pairs, and a total time complexity of O(ST + S³).

V. CONCLUDING REMARKS

By employing a process-theoretic approach to supervisory control theory, we represented the notion of controllability using a behavioral preorder, termed partial bisimilarity. The equivalence induced by this preorder enables us to minimize plants, while preserving their supervised behavior with respect to a given set of uncontrollable events. To compute the

Algorithm 4: Initialize
  root := SortStatesByOL();
  InitialPP(root, ∅, 1);
  P := P↓̸ · P↓;
  P' := [];
  if P↓̸0 ≠ ∅ then P' := P' · [P↓̸0];
  if P↓0 ≠ ∅ then P' := P' · [P↓0];
  for P ∈ P do
    for a ∈ A do
      →∃∀ ∪= {(P, a, P↓̸0), (P, a, P↓0)};
      cb∀(P, a, P↓̸0) := 0; cb∀(P, a, P↓0) := 0;
      cℓ∀(P, a, P↓̸0) := 0; cℓ∀(P, a, P↓0) := 0;
  Split(P↓̸0, P↓0, ∅);

Algorithm 5: UpdateSplitter – finds a splitter and updates supporting data
  Find a splitter S' for some P' ∈ P' with respect to P;
  Make a new parent S' and set par(P) := S' for P ∈ S';
  P' := P' \ S';
  Insert S' ⪯'-before P' in P';
  for P ∈ P do
    for a ∈ A do
      if P →ᵃ∃∀ P' then →∃∀ ∪= {(P, a, S')};
  Compute cnt⊑(S', S') and cnt⊑(S', P');
  cnt⊑(P', P') −= cnt⊑(S', S') + cnt⊑(S', P');
  for P ∈ P do
    for a ∈ A do
      cb∀(P, a, S') := 0; cb∀(P, a, P') := 0;
      cℓ∀(P, a, S') := 0; cℓ∀(P, a, P') := 0;
  L := {Q' ∈ P' | cnt⊑(Q', P') > 0};
  K := {Q' ∈ P' | cnt⊑(P', Q') > 0};
  for Q' ∈ P' do
    if cnt⊑(P', Q') > 0 then
      Compute cnt⊑(S', Q');
      cnt⊑(P', Q') −= cnt⊑(S', Q');
      for P ∈ P do
        for a ∈ A do
          if cb∀(P, a, Q') > 0 then
            if cnt⊑(S', Q') > 0 then cb∀(P, a, S') += 1;
            if cnt⊑(P', Q') > 0 then cb∀(P, a, P') += 1;
    if cnt⊑(Q', P') > 0 then
      Compute cnt⊑(Q', S');
      cnt⊑(Q', P') −= cnt⊑(Q', S');
      for P ∈ P do
        for a ∈ A do
          if cb∀(P, a, Q') > 0 then
            if cnt⊑(Q', S') > 0 then cℓ∀(P, a, S') += 1;
            if cnt⊑(Q', P') > 0 then cℓ∀(P, a, P') += 1;
  Return (S', P', L);

Algorithm 6: Refine(S', P', L) – refines P using the splitter and the stability conditions of Definition 6
  F := ∅;
  for a ∈ A do
    for P ∈ P do Split(P, a, P');
    for Q' ∈ P' do
      if cnt⊑(Q', P') > 0 then
        for P ∈ P do
          if cb∀(P, a, P') > 0 then cb∀(P, a, P') += 1;
    for P ∈ P do Split(P, a, S');
    for Q' ∈ P' do
      if cnt⊑(Q', S') > 0 then
        for P ∈ P do
          if cb∀(P, a, S') > 0 then cb∀(P, a, Q') += 1;
  for a ∈ A do
    for Q' ∈ L do
      for P ∈ P do UpdateCount∀(P, a, Q');
  while F ≠ ∅ do
    tmpF := F; F := ∅;
    for a ∈ A do
      for (Q', R') ∈ tmpF do
        cnt⊑(Q', R') := 0;
        for P ∈ P such that cb∀(P, a, R') > 0 do UpdateCount∀(P, a, Q');

Algorithm 7: Split(P, a, R') – uses splitter data to split P ∈ P
  if cb∀(P, a, R') = 0 or (cℓ∀(P, a, R') = 0 and a ∈ B) then
    if P →ᵃ∃∀ R' then
      P@ := {p ∈ P | p ↛ᵃ R'};
      P := P \ P@;
      if P@ ≠ ∅ and P ≠ ∅ then
        Make new class P@;
        ⊑ ∪= {(P@, P)}; cnt⊑(par(P), par(P)) += 1;
        Copy →∃∀, cb∀, cℓ∀, ⊑, and par from P to P@;
        →∃∀ \= {(P, a, R')};
        cb∀(P@, a, R') := 0; cb∀(P, a, R') := 1;
        cℓ∀(P@, a, R') := 0; cℓ∀(P, a, R') := 1;
        Insert P@ ⪯-before P in P;
        UpdateLittleBros@(P@, a, R');
      else if P@ ≠ ∅ then
        P := P@;
        →∃∀ \= {(P, a, R')};
        UpdateLittleBros@(P, a, R');
    else
      cb∀(P, a, R') := 1; cℓ∀(P, a, R') := 1;
  else
    UpdateLittleBros@(P, a, R');

minimized process, we employ partition pairs as an alternative characterization of the partial bisimilarity preorder, and we show that there exists a fix-point refinement operation that results in a coarsest partition pair satisfying a set of stability conditions. Such partition pairs coincide with the maximal partial bisimilarity preorder over a labeled transition system. We develop a minimization algorithm that computes this fix-point refinement operator, while efficiently using time and space with respect to counterpart minimization algorithms for simulation equivalence. A prototype implementation validates the algorithm for simulation and bisimulation equivalences. As future work we schedule a broad range of

Algorithm 8: UpdateLittleBros@(P, a, R') – updates cnt⊑

[6] M. Fabian and B. Lennartson, “On non-deterministic supervisory control,” Proceedings of the 35th IEEE Decision and Control, vol. 2, pp. 2213–2218, 1996. [7] C. Zhou, R. Kumar, and S. Jiang, “Control of nondeterministic discrete-event systems for bisimulation equivalence,” IEEE Transactions on Automatic Control, vol. 51, no. 5, pp. 754–765, 2006. [8] M. Heymann and F. Lin, “Discrete-event control of nondeterministic systems,” IEEE Transactions on Automatic Control, vol. 43, no. 1, pp. 3–17, 1998. [9] A. Overkamp, “Supervisory control using failure semantics and partial specifications,” IEEE Transactions on Automatic Control, vol. 42, no. 4, pp. 498–510, 1997. [10] R. Eshuis and M. M. Fokkinga, “Comparing refinements for failure and bisimulation semantics,” Fundamenta Informaticae, vol. 52, no. 4, pp. 297–321, 2002. [11] R. J. v. Glabbeek, “The linear time–branching time spectrum I,” Handbook of Process Algebra, pp. 3–99, 2001. [12] R. Gentilini, C. Piazza, and A. Policriti, “From bisimulation to simulation: Coarsest partition problems,” Journal of Automated Reasoning, vol. 31, no. 1, pp. 73–103, 2003. [13] G. Barrett and S. Lafortune, “Bisimulation, the supervisory control problem and strong model matching for finite state machines,” Discrete Event Dynamic Systems, vol. 8, no. 4, pp. 377–429, 1998. [14] F. Ranzato and F. Tapparo, “An efficient simulation algorithm based on abstract interpretation,” Information and Computation, vol. 208, pp. 1–22, 2010. [15] R. J. v. Glabbeek and B. Ploeger, “Correcting a space-efficient simulation algorithm,” in Proceedings of CAV, ser. Lecture Notes in Computer Science, vol. 5123. Springer, 2008, pp. 517–529. [16] J. C. M. Baeten, D. A. van Beek, B. Luttik, J. Markovski, and J. E. Rooda, “Partial bisimulation,” Eindhoven University of Technology, SE Report 10-04, 2010, available from http://se.wtb.tue.nl. [17] C. Baier and J.-P. Katoen, Principles of Model Checking. MIT Press, 2008.

  for Q ∈ P such that Q ⊑ P do
    if cb∀(Q, a, R') > 0 or (cℓ∀(Q, a, R') > 0 and a ∈ B) then
      ⊑ \= {(Q, P)}; cnt⊑(par(Q), par(P)) −= 1;
      if cnt⊑(par(Q), par(P)) = 0 then
        cnt⊑(par(Q), par(P)) := 1; F ∪= {(par(Q), par(P))};

Algorithm 9: Updatecb∀(P, a, R') – updates cb∀
  cb∀(P, a, R') −= 1; Split(P, a, R');
  if cb∀(P, a, R') = 0 then
    K := {Q' ∈ P' | cnt⊑(Q', R') > 0};
    while there exists Q' ∈ K do
      K \= {Q'}; cb∀(P, a, Q') −= 1; Split(P, a, Q');
      if cb∀(P, a, Q') = 0 then
        K ∪= {Q'' ∈ P' | cnt⊑(Q'', Q') > 0};

Algorithm 10: Updatecℓ∀(P, a, R') – updates cℓ∀
  cℓ∀(P, a, R') −= 1; Split(P, a, R');
  if cℓ∀(P, a, R') = 0 then
    K := {Q' ∈ P' | cnt⊑(R', Q') > 0};
    while there exists Q' ∈ K do
      K \= {Q'}; cℓ∀(P, a, Q') −= 1; Split(P, a, Q');
      if cℓ∀(P, a, Q') = 0 then
        K ∪= {Q'' ∈ P' | cnt⊑(Q', Q'') > 0};

industrial cases, in order to estimate the gain of minimizing the plant for supervisor synthesis. We also intend to analyze how the size of the bisimulation action set affects the level of minimization. Additionally, we will look into a state-based setting, where the control requirements are given in terms of states instead of sequences of events.

REFERENCES

[1] P. J. Ramadge and W. M. Wonham, “Supervisory control of a class of discrete event processes,” SIAM Journal on Control and Optimization, vol. 25, no. 1, pp. 206–230, 1987. [2] C. Cassandras and S. Lafortune, Introduction to discrete event systems. Kluwer Academic Publishers, 2004. [3] J. J. M. M. Rutten, “Coalgebra, concurrency, and control,” Center for Mathematics and Computer Science, Amsterdam, The Netherlands, SEN Report R-9921, 1999. [4] J. C. M. Baeten, D. A. van Beek, B. Luttik, J. Markovski, and J. E. Rooda, “A process-theoretic approach to supervisory control theory,” in Proceedings of ACC 2011. IEEE, 2011, available from: http://se.wtb.tue.nl. [5] J. C. M. Baeten, T. Basten, and M. A. Reniers, Process Algebra: Equational Theories of Communicating Processes, ser. Cambridge Tracts in Theoretical Computer Science. Cambridge University Press, 2010, vol. 50.

