MULTIPLE STATE PROBLEM REDUCTION AND DECISION MAKING CRITERIA HYBRIDIZATION

Background. Since decision making always involves a great many approaches and heuristics, and poor statistics and the time course can generate a series of decision making problems, the problem of regarding multiple states and criteria is considered. Objective. The goal is to develop an approach for reducing the multiple state decision making problem along with regarding multiple criteria by their hybridization, so as to solve a single decision making problem unambiguously. Methods. An algorithm for reducing a finite series of decision making problems to a single problem is suggested. Also, a statement is formulated for hybridizing decision making criteria, allowing one to obtain a single optimal alternatives' set. Results. In practice, this set contains just a single alternative. And, owing to the law of large numbers (of multiple criteria), the more criteria are involved in the hybridization, the more reliable the decision by the formulated statement is. Conclusions. The presented multiple state problem reduction and decision making criteria hybridization both provide a researcher with one decision making problem whose number of optimal solutions is expected to be smaller than by any other approach. Besides, the approach allows ranking alternatives at higher reliability and validity. Furthermore, reliable weights (priorities) for scalarizing multicriteria problems are produced.


Introduction
Decision making always involves a great many approaches and heuristics. They concern both estimation procedures [1, 2] and criteria to optimize decisions [3, 4]. Selecting a single approach or criterion, along with the point evaluation, is a non-trivial problem needing supplementary knowledge and statistical observations. Otherwise, without prior statistics, a method selected over the ordinarily point-evaluated decision matrix is going to fail or just be ineffective [1, 2, 5, 6].
A similar difficulty exists when multicriteria problems are solved. Without statistical data, scalarization appears to be the only way to pay attention to every plausible method and criterion. For this, minimax-based approaches are widely applied [7, 8]. Besides, the sets of both alternatives and states, and their cardinalities, may vary as time goes by [1, 2, 6, 9, 10]. Therefore, to solve decision making problems (DMPs) properly under uncontrollable uncertainties, all non-excluded aspects and methods should be regarded.

Problem statement
Inasmuch as a finite series of DMPs is an aftermath of poor statistics and the time course influence, an approach for reducing this series to a solvable DMP is needed. A variety of decision making criteria should be admitted as well. The goal lies in reducing the multiple state DMP (MSDMP) along with regarding multiple criteria to solve a single DMP. This goal is going to be reached after fulfilling the following steps:
1. Formalization of the MSDMP.
2. Reduction of the finite series of DMPs generating the MSDMP in order to get an optimal alternatives' set (OAS) at disambiguation.
3. Decision making criteria hybridization for a single DMP.
4. Discussion of the reduction and hybridization.

Reduction of a finite series of DMPs
Henceforward, let all decision evaluations be risks. Any risk is evaluated non-negatively. Suppose that, in the k-th condition (metastate), there is a finite set of M alternatives (decisions) and a finite set of N states, so an M × N decision matrix R_k = [r_ij^(k)] corresponds to the k-th metastate, where the entry r_ij^(k) is a risk after the i-th decision has been made and the j-th state has occurred. A finite series of K such DMPs constitutes the MSDMP, because those K DMPs are related anyhow. Occasionally, the M × N DMP associated with the matrix R_k may be assigned a probability p_k.

Denote by X_k^* the OAS of the k-th DMP. If all K OASs coincided, further decision making would not be needed, and the MSDMP would be solved to that single OAS. But this is a rare case even when every one of those K DMPs is solved by the same decision making criterion. However, the condition (4) is not excluded.
If the OASs do not coincide then, by the occasion (6) and a short-term statistical trend, the union of solutions of those K DMPs should be considered. This makes sense, however, only if the inclusion (8) holds; then a new single DMP may be derived whose set of alternatives is (9) and whose set of states is (10). If (8) fails for a short-term statistical trend (Fig. 1 is a sketch for cases when the inclusion (8) holds and when it fails), then the most probable OASs should be practiced one after another, according to the descending order of their probabilities. Such a selection is relevant for T < K or about that. The worst occasion is when (6) is true and the probabilities are unknown; then the DMP is taken with the set of alternatives (13) and the set of states (10). This single M × N DMP is finally formalized upon the decision matrix (14). An algorithmic representation of the described reduction of K DMPs is shown in Fig. 2. Practicing an OAS X_k^* with the probability p_k refers to [11]: a variate Θ, uniformly distributed on the half-interval [0; 1), is raffled; its value is θ; and if θ falls into the z-th subinterval of the cumulative probabilities then, in the current round, the OAS X_z^* is chosen.
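The raffle for practicing an OAS X_k^* with probability p_k can be sketched as follows (a minimal illustration; the function and variable names are not from the paper):

```python
import random

def practice_oas(oas_list, probs, theta=None):
    """Pick one OAS from the series of K DMPs by the raffle:
    a variate Theta uniform on [0, 1) is drawn, and the OAS whose
    cumulative-probability subinterval contains theta is chosen."""
    if theta is None:
        theta = random.random()  # uniform on [0, 1)
    cumulative = 0.0
    for oas, p in zip(oas_list, probs):
        cumulative += p
        if theta < cumulative:
            return oas
    return oas_list[-1]  # guard against floating-point round-off

# Three OASs practiced with probabilities 0.5, 0.3, 0.2:
oas_series = [{"x1"}, {"x2", "x3"}, {"x4"}]
print(practice_oas(oas_series, [0.5, 0.3, 0.2], theta=0.6))  # -> {'x2', 'x3'}
```

Repeated over many rounds, each OAS is chosen with a frequency approaching its assigned probability.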
For the reduction, the set of OASs {X_k^*}_{k=1}^K is used. The algorithm in Fig. 2 does not specify what criterion is applied to solve either the DMPs with the matrices R_k or the single DMP with the matrix R. Selection of criteria is a separate task.

Decision making criteria hybridization
A large number of decision making criteria can be applied to solve a DMP [3, 4, 10, 12, 13]. A consequence of that, generally speaking, is different OASs whose intersection often occurs empty. Hence, a single criterion which might include the merits of all plausible criteria should better be used. Such a single criterion or approach will produce just one OAS, disambiguating the final decision selection.
Various criteria operate with differently measured values. This is why the risk decision matrix R is normalized into the nonnegative matrix R̃ by the known standardization rule (15):

r̃_ij = (r_ij − min_{i,j} r_ij) / (max_{i,j} r_ij − min_{i,j} r_ij) ∈ [0; 1].

The Savage criterion normalized regret matrix (SCNRM) F̃ is deduced from the matrix R̃. When the Germeyer criterion is on, it uses the stochastic matrix (16) of the states' probabilities.
The Germeyer criterion takes the decision matrix normalized identically to (15). The standardization rule (15) is not suitable for the product criterion because all M products ∏_{j=1}^N r̃_ij must be positive. Instead of (15), if the matrix R contains zero entries (say, the minimal risk has been evaluated to zero), the rule (19) gives a positive matrix by adding some γ > 0 to every entry, where we do not need to justify the selection of that γ.
When decision making criteria use the matrices R̃, F̃, P, or the normalized matrices ⟨•⟩R̃, ⟨•⟩R, the expected risk (estimated by a criterion), not depending upon states, comes within the segment [0; 1] and has no units of measurement. Let r^(h)(x_i) be the risk estimated by the h-th criterion for the alternative x_i. Then

X^* = arg min_{x_i} Σ_{h=1}^H λ_h · r^(h)(x_i)   (20)

is a single OAS with the h-th criterion weight λ_h (21), where H ∈ N \ {1} is the number of criteria involved to solve the DMP.
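A minimal sketch of the hybridization (20) with equal weights, assuming the min-max normalization (15) and three criteria (minimax, Savage, product); the shift by γ for the product criterion follows the idea of rule (19), with γ = 1 taken here as an assumption:

```python
def normalize(R):
    """Standardization rule (15): map all entries into [0, 1]."""
    lo = min(min(row) for row in R)
    hi = max(max(row) for row in R)
    return [[(r - lo) / (hi - lo) for r in row] for row in R]

def expected_risks(R, gamma=1.0):
    """Expected risks of each alternative by the minimax, Savage,
    and product criteria, computed over the normalized matrix."""
    Rn = normalize(R)
    m, n = len(Rn), len(Rn[0])
    # Savage regrets: subtract the column-wise (per-state) minimum
    col_min = [min(Rn[i][j] for i in range(m)) for j in range(n)]
    risks = []
    for i in range(m):
        minimax = max(Rn[i])                                   # worst-case risk
        savage = max(Rn[i][j] - col_min[j] for j in range(n))  # worst regret
        prod = 1.0
        for r in Rn[i]:
            prod *= r + gamma  # gamma > 0 keeps the product positive
        risks.append((minimax, savage, prod))
    return risks

def hybridize(R, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Statement (20): argmin of the weighted sum of expected risks."""
    totals = [sum(w * r for w, r in zip(weights, triple))
              for triple in expected_risks(R)]
    best = min(totals)
    return {i for i, t in enumerate(totals) if t == best}

print(hybridize([[1, 4], [2, 3]]))  # -> {1}: the second alternative wins
```

The weighted sum makes an alternative that is merely decent by every criterion able to beat one that is the best by a single criterion but poor by the others, which is the point of the hybridization.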
As an example, consider the 5 × 8 risk decision matrix (22), wherein the minimax criterion gives a single optimal alternative. However, the Savage criterion, by its regret matrix (23), gives another OAS. Moreover, having added 1 to the matrix (22), we get a positive matrix wherein the product criterion gives the five products 36288, 8400, 145800, 8640, 12600, correspondingly, for the alternatives x_1, …, x_5, so its OAS is {x_2}. This is an instance where the DMP with (22) has three different OASs by three criteria.
For disambiguation, hybridize those criteria. According to the normalization (15), the matrix (22) is standardized into (24), and the corresponding regret matrix is (25). Without any priorities, the weights (21) can be put equal, and (20) is stated as the arg min of the sum of the three expected risks, where the minimax, Savage, and product criteria are indexed by h = 1, h = 2, h = 3, respectively; this arg min is the solution. But let us think of how the SCNRM (25) was calculated. It was deduced from the normalized risk decision matrix (24). However, the SCNRM could be calculated straightforwardly by normalizing the original regret matrix (23), using a standardization rule identical to (15). Denote such an SCNRM by ⟨F̃⟩_1. In the example being considered, this gives (26), and the risk r^(2)(x_i) with the SCNRM (26) differs. Thereby the statement (27) is more general than (20).
For the example of the risk decision matrix (22) with (27), we write, by denotation, the expected risks {r^(3)(x_i)} of the product criterion, which are very small. And this is a distinctive feature of the expected risk by the product criterion over the normalized matrix ⟨•⟩R̃: when the number of states increases, the expected risk drastically decreases, hardly influencing the grand total. In the example, those expected risks can be rounded even to zero, and the truncation error is still insignificant. To prevent this drawback of the product criterion normalization, the expected risks (29) are better to use
with the h-th criterion weight (21).
Completing the example of the risk decision matrix (22), we get (31) as the ultimately best solution. Note that the minimax and Savage criteria come very close with their risks 1.398545 and 1.407848, although the product criterion appears far behind them.
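The shrinkage of the product criterion's expected risk with the number of states can be checked numerically; the geometric mean shown below is just one illustrative stabilization, not necessarily the paper's rule (29):

```python
def product_risk(row):
    """Product of normalized risks over all states."""
    p = 1.0
    for r in row:
        p *= r
    return p

def geometric_mean(row):
    """N-th root of the product: stays on the scale of a single risk."""
    return product_risk(row) ** (1.0 / len(row))

# the same moderate normalized risk 0.5, repeated over more and more states
for n in (2, 8, 32):
    row = [0.5] * n
    print(n, product_risk(row), geometric_mean(row))
# the raw product collapses toward zero while the geometric mean stays 0.5
```

This is exactly why the raw products become negligible in the weighted sum: with 8 states the product is already below 0.004 while the other criteria's expected risks remain on the order of 1.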

Discussion
The MSDMP and its formalization can be imagined as a stratification of a finite series of DMPs with their matrices. Each layer is a DMP matrix. The reduction into a single DMP is similar to scalarization in solving multicriteria problems. The algorithm in Fig. 2 has two sides. The first one is that it relies on statistics, supposing the probabilities {p_k}_{k=1}^K are known. This also often assumes that there is a long-term statistical trend, enough for practicing the OASs round by round. The second side is far more real: the probabilities {p_k}_{k=1}^K cannot be evaluated as points, or they are just unknown, and there is a short-term statistical trend for the metastates of the MSDMP. In this way, a union-like DMP with the set of alternatives (13) and the set of states (10) is the most relevant. A short-term statistical trend nonetheless implies the DMP with the set of alternatives (9) and the set of states (10) when the inclusion (8) turns true.
Cases in which (1) or (2) turn true are practically impossible unless the DMPs have a very weak relation. Nevertheless, such "scattered" DMPs may rather be assigned with probabilities. The condition (5) is rarely possible, requiring at least the condition (7).
Decision making criteria hybridization aims at disambiguation as well. Sometimes the normalization ⟨•⟩R is needed to compare expected risks as they are. Then the formulas (20) and (27) could be useful. Normalizing expected risks by (29), meritoriously, brings a simple hybridization effect by (30). That requires only weights whose values, in statistically poor cases of the DMP, are set identical. In most practical events, probability-based criteria (say, Germeyer, modal, minimal variance, maximal probability, etc.) are not reliable. This is because the stochastic matrix (16) is influenced by a great many factors and varies badly as time goes by. So, when (30) is constructed, the weights corresponding to probability-based criteria could be taken smaller.
For non-risk matrices, those normalization rules fit as well. Only γ > 0 must be justified in cases when the rule (19) is non-applicable. For gain (profit) matrices, the minimum in (30) is substituted with the maximum. And the expected gains are weighted as usual, but, if the minimal variance criterion is included, the minimal variance expected values are taken with minuses. The same concerns the Savage criterion regret expected values.
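For a gain matrix, the substitution described above can be sketched like this (the names and the particular criteria set are illustrative assumptions, not the paper's):

```python
def hybrid_gain_oas(expected_values, weights, negated=frozenset()):
    """Hybridization for gain (profit) matrices: the minimum in (30) is
    replaced by a maximum, and criteria whose expected values must enter
    with minuses (minimal variance, Savage regret) are listed in `negated`.
    expected_values[h][i] is the h-th criterion's value of alternative i."""
    m = len(expected_values[0])
    totals = []
    for i in range(m):
        t = sum((-w if h in negated else w) * expected_values[h][i]
                for h, w in enumerate(weights))
        totals.append(t)
    best = max(totals)
    return {i for i, t in enumerate(totals) if t == best}

# expected gains (h = 0) and variances (h = 1) for two alternatives;
# the variance criterion enters with a minus:
print(hybrid_gain_oas([[0.9, 0.5], [0.1, 0.4]], [0.5, 0.5], negated={1}))
```

Here the first alternative wins: its higher expected gain outweighs its variance penalty, whereas the second alternative's gain is nearly consumed by its variance.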

Conclusions
The presented multiple state problem reduction in Fig. 2 and the decision making criteria hybridization by (30) both provide a researcher with one DMP having a single OAS, which usually contains fewer elements than an OAS by any other approach. Here, the problem of selecting a unique decision from the OAS is not solved. But, with a sufficiently great number of criteria involved in the hybridization, the OAS is believed to contain just one element, that unique decision. This is a manifestation of the law of large numbers transfigured into the law of multiple approaches (criteria). The greater the number of criteria involved, the more reliable the decisions by the statement (30) are.
In addition to the improved substantiation of optimality, unification and normalization allow ranking alternatives at higher reliability and validity [14, 15]. For instance, after the solution (31), the alternatives are ranked accordingly. In this way, further work is going to be connected with multiple criteria which are applied to solving multicriteria problems.
Practicing the OASs with the probabilities {p_k}_{k=1}^K does not solve this MSDMP at once. This is because we get into a probabilistic domain requiring strong statistical series. In particular, if the conditions and metastates of the MSDMP recur periodically for at least a few hundred times, then the OAS X_k^* should be practiced with the probability p_k. But if they recur just a few times, or singly at all, then the probabilities {p_k}_{k=1}^K are counted unavailable anyway.

As the SCNRM can be deduced either from the normalized risk matrix, as F̃, or by normalizing the original regret matrix, as ⟨F̃⟩_1, both ought to be regarded, while each of them produces a different result. Therefore, if some h_0 ∈ {1, …, H} in (20) corresponds to the Savage criterion, the more general statement (27) applies. Related DMPs have a stronger relation to each other, which actually impedes distinguishing them. Despite any relation strength, an OAS by the reduction exists both when the inclusion (8) is true and when it fails.