

Survey on combination of rough sets and other soft computing theories


TANG Jian-guo 1,2, William ZHU 1, SHE Kun 1, CHEN Wen 1,3


(1. School of Computer Science & Engineering, University of Electronic Science & Technology of China, Chengdu 611731, China; 2. School of Computer Science & Engineering, Xinjiang University of Finance & Economics, Urumqi 830012, China; 3. Dept. of Computer Science, Fuzhou Polytechnic, Fuzhou 350108, China)

Abstract: In recent years there has been more and more research on rough sets. In particular, combinations of rough sets with other soft computing theories have become prominent and have produced many meaningful results. In view of this, this paper summarizes the current status of this major line of research. It focuses on the combination of rough sets with other soft computing theories such as fuzzy sets, neural networks, and evidence theory, and concludes with our own view of future developments in this area.

Keywords: rough sets; soft computing; fuzzy sets; rough-fuzzy sets; fuzzy-rough sets

0 Introduction

With the rapid development and wide application of computer and network technology, human society has entered an era of information explosion, and how to process and effectively use this information has become a hot research topic for scholars around the world. Soft computing emerged in response to this demand. The term was proposed in 1994 by Zadeh [1], the founder of fuzzy set theory; it denotes a collection of methods for handling uncertain, imprecise, and incomplete data that tolerate deviations from exact truth in exchange for low processing cost, tractability, and high robustness. Current soft computing theories and methods include neural networks, fuzzy sets, rough sets, genetic algorithms, and evidence theory.

Rough set theory is a mathematical theory, developed rapidly in recent years, for analyzing and handling uncertain and imprecise data. It was proposed by the Polish mathematician Pawlak [2] in 1982. Its basic idea is to partition the universe into a number of equivalence classes by means of an equivalence relation on it, and then to use this knowledge to give an approximate characterization of the imprecise or uncertain concepts to be processed.

The most distinctive feature of rough set theory is that its partition of the universe depends only on the data to be processed and requires no prior information, so its description of problems and its handling of uncertainty are more objective; this is also the most significant difference between it and the other soft computing theories. On the other hand, rough sets cannot handle data that are themselves imprecise or uncertain, which is exactly where the other soft computing theories are strong, so the two sides are highly complementary. Combining rough sets with other soft computing theories and methods has therefore become an important research topic. This paper surveys the research on combining rough sets with fuzzy sets, neural networks, genetic algorithms, concept lattices, and evidence theory, and points out future research directions in this respect.

1 An overview of rough set theory

Rough set theory is a mathematical tool for solving problems of uncertainty. In rough set theory, knowledge is understood as the ability to distinguish objects; it manifests itself as a partition of the universe and is therefore represented by equivalence relations on the universe. Rough sets characterize concepts by a pair of upper and lower approximation operators and require no prior knowledge beyond the data themselves, so they are highly objective. Rough sets are currently widely used in decision analysis, machine learning, data mining, and other fields [3-8].

1.1 Basic concepts of rough sets [9]

Definition 1 Universe and concept. Let U be a non-empty finite set consisting of the objects under study, called the universe. Any subset X ⊆ U is called a concept in U, and any family of subsets of U is called knowledge about U.

Definition 2 Knowledge base. Given a universe U and a family S of equivalence relations on U, the pair K = (U, S) is called a knowledge base (or approximation space) over U.

Definition 3 Indiscernibility relation. Given a universe U and a family S of equivalence relations on U, if P ⊆ S and P ≠ ∅, then ∩P is still an equivalence relation on U, called the indiscernibility relation over P and denoted IND(P).
The partition U/IND(P) is called the P-basic knowledge of the knowledge base K = (U, S) about U.
Definition 4 Upper and lower approximation. Given a knowledge base K = (U, S), where U is the universe and S is a family of equivalence relations on U, for any X ⊆ U and any equivalence relation R ∈ IND(K), the lower and upper approximations of X with respect to R are, respectively,

Lower approximation: R_(X) = ∪{ Y ∈ U/R | Y ⊆ X }

Upper approximation: R̄(X) = ∪{ Y ∈ U/R | Y ∩ X ≠ ∅ }

The upper and lower approximations are the core concepts of rough set theory, and its numerical and topological characteristics are described in terms of them. When R_(X) = R̄(X), X is called an R-exact set; when R_(X) ≠ R̄(X), X is called an R-rough set, i.e. X is a rough set.
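To make the two operators concrete, the following minimal Python sketch (an illustration written for this survey setting, not code from the cited literature) computes the lower and upper approximations of a set X from a partition U/R given as a list of equivalence classes; the partition and X are made up.

```python
def lower_approx(partition, X):
    """R_(X): union of classes Y in U/R with Y a subset of X."""
    X = set(X)
    return set().union(*(set(y) for y in partition if set(y) <= X))

def upper_approx(partition, X):
    """R^(X): union of classes Y in U/R that intersect X."""
    X = set(X)
    return set().union(*(set(y) for y in partition if set(y) & X))

U_over_R = [{1, 2}, {3, 4, 5}, {6}]   # hypothetical partition of U
X = {1, 2, 3}
print(lower_approx(U_over_R, X))      # {1, 2}
print(upper_approx(U_over_R, X))      # {1, 2, 3, 4, 5}  -> X is rough
```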

1.2 Knowledge Reduction in Rough Sets

In an information system, some of the attributes describing the objects may be redundant, so these redundant attributes need to be removed in order to improve the efficiency of the system.

Given a knowledge base K = (U, S), P ⊆ S and R ∈ P, if IND(P) = IND(P − {R}) holds, then R is dispensable (unnecessary) in P; otherwise R is indispensable (necessary) in P. If every R in P is indispensable, then P is independent.

Definition 5 Reduct and core. Given a knowledge base K = (U, S) and a family of equivalence relations P ⊆ S, for any G ⊆ P, if G is independent and IND(G) = IND(P), then G is called a reduct of P, denoted G ∈ RED(P). The set of all indispensable relations in P is called the core of P, denoted Core(P). The relation between reducts and the core is Core(P) = ∩RED(P), i.e. the core is the intersection of all reducts.
Common knowledge reduction algorithms in rough sets include the blind-delete reduction method, reduction based on Pawlak attribute significance, and reduction based on the discernibility matrix. The blind-delete method arbitrarily chooses an attribute to delete and checks whether it is necessary: if it is necessary it is kept, otherwise it is deleted. This method is simple and intuitive, but the resulting reduct is not necessarily satisfactory. The Pawlak attribute-significance method carries out reduction according to the significance of attributes; it can obtain an optimal or near-optimal reduct of the information system, but it may fail to find a reduct even when one exists. The discernibility-matrix method represents, in matrix form, the sets of attributes that distinguish any two objects of the universe; from this matrix the core and all reducts of the information system can be derived intuitively, but the approach suffers from combinatorial explosion on large problems. In addition, some scholars have proposed new and improved reduction algorithms: the literature [10, 11] optimizes the reduction of attributes and attribute values based on neighborhood rough sets, and the literature [12] proposes a new attribute reduction method, ReCA, which improves the performance of knowledge reduction on continuous-valued attributes.
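The blind-delete idea can be illustrated with a short Python sketch; it is a hypothetical fragment, not the algorithms of [10-12]: IND(P) is computed as the partition induced by a set of attributes, and an attribute is dropped whenever the partition does not change without it.

```python
def ind_partition(objects, attrs):
    """U/IND(attrs): group objects that have identical values on the chosen attributes."""
    classes = {}
    for name, vals in objects.items():
        classes.setdefault(tuple(vals[a] for a in attrs), set()).add(name)
    return set(map(frozenset, classes.values()))

def blind_delete_reduct(objects, attrs):
    """Drop attributes one by one while IND stays unchanged (finds one reduct, not all)."""
    reduct = list(attrs)
    full = ind_partition(objects, attrs)
    for a in attrs:
        trial = [b for b in reduct if b != a]
        if trial and ind_partition(objects, trial) == full:
            reduct = trial                      # a is dispensable, remove it
    return reduct

# hypothetical information system: object -> {attribute: value}
objs = {"x1": {"a": 0, "b": 1, "c": 1},
        "x2": {"a": 0, "b": 1, "c": 0},
        "x3": {"a": 1, "b": 0, "c": 0}}
print(blind_delete_reduct(objs, ["a", "b", "c"]))   # e.g. ['b', 'c']
```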
The unique ability of rough sets to handle uncertain problems has attracted great interest from many scholars, who have studied extensions of the theory [13-17], including covering rough sets [18-21], variable precision rough sets [22], and much other new content. The literature [23] conducts an in-depth study of the axiomatization of rough sets and obtains two minimal axiom groups for rough sets; the literature [24] relaxes the indiscernibility and compatibility conditions between objects and gives a new rough set model based on a tolerance relation; the literature [25] constructs discernibility conditions for objects of a decision table and, using the discernibility matrix and discernibility function, presents a complete reduction method; the literature [16] introduces combination entropy and combination granulation into rough sets and establishes the relationship between the two; the literature [26] proposes a new approach to knowledge reduction in inconsistent information systems; the literature [27] approaches attribute reduction from the viewpoint of attribute partitions and designs a partition-based attribute reduction algorithm, ARABP; the literature [28] measures the uncertainty of rough sets from the perspectives of attributes and information entropy. These studies have greatly promoted the development of rough set theory and its applications.

2 Rough sets and fuzzy sets

Fuzzy set theory was proposed by the American scholar Zadeh in 1965. A fuzzy set is a set each of whose elements belongs to it only to some degree; the function used to measure this degree of belonging is called the membership function. Through the membership function, every element of a fuzzy set is assigned a corresponding degree of membership.

2.1 The basic concept of fuzzy set theory

Definition 6 Degree of membership, membership function. Let U be a universe and A a fuzzy set on U. If for every x ∈ U a number μ_A(x) ∈ [0, 1] can be determined to represent the degree to which x belongs to A, this number is called the degree of membership of x in A, where μ_A is the mapping μ_A: U → [0, 1], x ↦ μ_A(x) ∈ [0, 1]; μ_A is called the membership function of A.

The membership function is the core concept and foundation of fuzzy sets; it identifies and describes a fuzzy set. On the same universe, different membership functions determine different fuzzy sets: for example, if μ_A(x) and μ_B(x) are two different membership functions on the universe U, they determine two different fuzzy sets A and B. Fuzzy set theory is an extension of classical set theory: when the membership degrees of a fuzzy set can only take the values 0 or 1, i.e. μ_A(x) ∈ {0, 1}, the fuzzy set A degenerates into an ordinary set of classical set theory.
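A membership function is just a mapping from U into [0, 1]; a common concrete choice is a triangular function. The sketch below is a generic illustration in Python (the fuzzy set "tall" and its parameters are made up), not something taken from the cited literature.

```python
def triangular(a, b, c):
    """Membership function of a triangular fuzzy set with support (a, c) and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

mu_tall = triangular(150, 180, 210)               # hypothetical fuzzy set "tall" (height in cm)
print(mu_tall(170), mu_tall(180), mu_tall(200))   # 0.666..., 1.0, 0.333...
```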

2.2 Complementarity of fuzzy sets and rough sets

In fuzzy sets, the membership function is generally determined from the experience and knowledge of experts or from the results of some statistical data, so it is highly subjective and lacks a certain objectivity; this is a fundamental flaw of fuzzy sets. The upper and lower approximations of rough sets are determined by the objectively existing knowledge base of known objects, without any prior assumptions, and are therefore strongly objective. In real life, however, there is much prior knowledge that is already known and needs no further judgment; if such knowledge could be used directly to solve problems, it would bring high efficiency, and this is precisely what rough sets lack. Thus the characteristics of rough sets and fuzzy sets are highly complementary, and combining them to solve problems is usually more effective than using either alone. Research in this area has made great progress and produced many concrete applications; rough fuzzy sets and fuzzy rough sets [29] are two of the important research results.

A rough fuzzy set mainly uses the lower and upper approximation method of rough sets to describe the membership function of a fuzzy set, in order to enhance the objectivity of fuzzy sets when dealing with problems. It integrates the characteristics of the lower and upper rough approximations into the membership function of the fuzzy set, expanding the membership function of a fuzzy concept into a lower-approximation membership function and an upper-approximation membership function; the two membership values determined by these functions form an interval. This interval describes the range of possibility with which an element belongs to a fuzzy set, instead of the previous one-to-one correspondence between an element and its membership degree: the membership of x ∈ A is no longer a single value μ_A(x) ∈ [0, 1], but lies in the interval [lower-approximation membership, upper-approximation membership]. The basic definition of a rough fuzzy set is as follows:
Definition 7 Rough fuzzy set. Let U be a universe, R an equivalence relation on U, A a fuzzy set on U, and μ_A(x) the membership function of A. Let R_(A) and R̄(A) denote the lower and upper approximations of A; their corresponding membership functions are:

a) lower-approximation membership function: μ_R_(A)([x]_R) = inf{ μ_A(x) | x ∈ [x]_R }, ∀x ∈ U;


b) upper-approximation membership function: μ_R̄(A)([x]_R) = sup{ μ_A(x) | x ∈ [x]_R }, ∀x ∈ U.
The pair (R_(A), R̄(A)) is called a rough fuzzy set.
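Stated operationally, for each equivalence class the lower-approximation membership is the infimum of μ_A over the class and the upper-approximation membership is the supremum. A minimal Python sketch under this reading (the membership values and partition are made up for illustration):

```python
def rough_fuzzy_approx(partition, mu):
    """For each class [x]_R, return (inf, sup) of the fuzzy membership mu over the class."""
    return {frozenset(cls): (min(mu[x] for x in cls), max(mu[x] for x in cls))
            for cls in partition}

# hypothetical fuzzy set A on U = {1,...,5} and partition U/R
mu_A = {1: 0.9, 2: 0.7, 3: 0.4, 4: 0.1, 5: 0.0}
U_over_R = [{1, 2}, {3, 4}, {5}]
for cls, (lo, hi) in rough_fuzzy_approx(U_over_R, mu_A).items():
    print(sorted(cls), "membership interval:", [lo, hi])
```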

A fuzzy rough set applies the membership function of fuzzy sets to rough sets: an equivalence relation of the rough set is determined according to the membership function of the fuzzy set, i.e. elements with the same membership degree obtained from the membership function belong to the same equivalence class, resulting in a partition of the universe U. In effect, knowledge that is already known in the fuzzy set and needs no further judgment is turned into an equivalence relation of the rough set, yielding a family of equivalence classes and improving the efficiency with which rough sets handle problems. The basic definition of a fuzzy rough set is as follows:
Definition 8 Fuzzy rough set. Given a universe U, let A be a fuzzy set on U and μ_A(x) its membership function. Let R_A be an equivalence relation on U satisfying, for all x, y ∈ U, x R_A y ⇔ μ_A(x) = μ_A(y). Let [x]_{R_A} denote the equivalence class of element x. If X ⊆ U and X ≠ ∅, then the lower and upper approximations of X with respect to R_A are, respectively,

Lower approximation: R_A(X) = ∪{ [x]_{R_A} | [x]_{R_A} ⊆ X }

Upper approximation: R̄_A(X) = ∪{ [x]_{R_A} | [x]_{R_A} ∩ X ≠ ∅ }


If R_A(X) = R̄_A(X), then X is called an R_A-definable set; if R_A(X) ≠ R̄_A(X), then X is called an R_A-fuzzy rough set.
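Under this definition the equivalence classes of R_A are simply the level sets of μ_A (objects with equal membership values), after which the ordinary approximations apply. A brief, self-contained Python sketch of this reading with made-up membership values:

```python
def partition_by_membership(mu):
    """Group objects whose fuzzy membership values are equal (the relation R_A)."""
    classes = {}
    for x, m in mu.items():
        classes.setdefault(m, set()).add(x)
    return list(classes.values())

def approx(partition, X):
    """(lower, upper) approximation of X under the given partition."""
    X = set(X)
    lower = set().union(*(set(c) for c in partition if set(c) <= X))
    upper = set().union(*(set(c) for c in partition if set(c) & X))
    return lower, upper

mu_A = {1: 0.9, 2: 0.9, 3: 0.4, 4: 0.4, 5: 0.0}            # hypothetical memberships
print(approx(partition_by_membership(mu_A), {1, 2, 3}))    # ({1, 2}, {1, 2, 3, 4})
```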

Rough fuzzy sets and fuzzy rough sets complement fuzzy sets and rough sets well; they have been applied in many fields [30-33] and have achieved good results. Many scholars have carried out further comparative studies of them [34-37] and made some improvements and extensions. The literature [38] combines covering rough sets with fuzzy sets, introduces the concept of fuzziness for covering generalized rough sets, gives a method for computing the fuzziness, and proves some important properties of the fuzziness; the literature [39] proposes the concept of a fuzzy indiscernibility relation to enhance the ability of fuzzy rough sets to handle fuzzy-valued attributes.

3 Rough sets and neural networks

Neural networks are network systems, developed on the basis of the results of modern neurobiology, that imitate the information processing mechanisms of the human brain. They have the ability to learn from input data in supervised or unsupervised settings and are widely used in data mining [40-42], pattern recognition [43-47], signal processing [48, 49], prediction [50, 51], and other fields.

3.1 Neural Network Fundamentals

A neural network [52] is a massively parallel distributed processor made up of simple processing units, which has a natural propensity for storing experiential knowledge and making it available for use. Neurons are the basic information processing units of a neural network; they can receive and transmit information. A neural network is composed of many neurons, each of which receives input from other neurons and from the outside. A neural network is usually organized in layers, typically containing an input layer, any number of hidden layers, and an output layer, each consisting of a number of neurons. Its basic principle is that the neurons of the input layer receive information from the external environment, the hidden layer neurons pass the information on through the hidden units to the output layer, and the output layer neurons output the information to the outside world. According to whether feedback exists among the neurons, neural networks are divided into feedforward neural networks and recurrent neural networks.
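As a small illustration of the layered, feedforward computation described above (a toy Python sketch with made-up weights, not tied to any of the cited systems), the following fragment propagates an input vector through one hidden layer and an output layer with sigmoid activations:

```python
import math

def forward(x, weights_hidden, weights_out):
    """One forward pass of a tiny feedforward net: input layer -> hidden layer -> output layer."""
    act = lambda u: 1.0 / (1.0 + math.exp(-u))                      # sigmoid activation
    hidden = [act(sum(w * xi for w, xi in zip(row, x))) for row in weights_hidden]
    return [act(sum(w * h for w, h in zip(row, hidden))) for row in weights_out]

# made-up weights: 2 inputs, 2 hidden neurons, 1 output neuron
print(forward([0.5, 1.0], [[0.3, -0.2], [0.8, 0.1]], [[1.0, -1.5]]))
```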

3.2 Connections between rough sets and neural networks
Rough sets recognize and judge things on the basis of indiscernibility relations on the universe. They require no prior information: the significance of each attribute can be obtained from the system's own description of the objects, so rough sets can not only reduce attributes but also grasp the main features of things and improve recognition ability. Rough sets can perform knowledge reduction on information systems, remove redundant information, reduce the dimensionality of the input information space, and improve processing efficiency. However, rough sets have poor resistance to interference; they are rather sensitive to noise and perform unsatisfactorily in noisy environments.

The characteristic of neural networks is that, through training and learning, they produce a nonlinear mapping that simulates the way people think. They have good adaptability, support both supervised and unsupervised learning, and can process information in parallel; at the same time they are very good at suppressing noise. Neural networks also have obvious shortcomings, however: they cannot judge whether the input information is redundant and cannot simplify the input information, which makes them inefficient and difficult to apply when the dimensionality of the information space is large.

The respective strengths and weaknesses of rough sets and neural networks show that they complement each other well. Moreover, from the viewpoint of simulating human thinking, the rough set method simulates human abstract logical thinking, while neural networks simulate human intuitive, image-based thinking. Combining the two, using the characteristics of rough sets to make up for the weakness of neural networks on high-dimensional data and using the strong noise resistance of neural networks to make up for the noise sensitivity of rough sets, and thus combining the simulation of abstract thinking with that of intuitive thinking, yields better results. Research in this area has become an important research direction.

3.3 Combination of rough sets and neural networks

The two most common ways of combining rough sets and neural networks are: a) using rough sets as a front-end processor for the neural network [53], i.e. first using rough sets to reduce the attributes and attribute values of the original information, removing redundant information and reducing the dimensionality of the information space so as to provide a simpler training set, and then building and training the neural network. This kind of combination not only reduces the learning and training time of the neural network and improves the response speed of the system, but also gives full play to the noise resistance and fault tolerance of neural networks, improving the overall performance. b) Introducing rough neurons into the neural network, so that ordinary neurons and rough neurons together constitute a hybrid rough neural network.

The rough neuron was designed by Lingras [54]; it consists of a pair of overlapping ordinary neurons, an upper-bound neuron r̄ and a lower-bound neuron r_, as shown in the figure. The upper-bound and lower-bound neurons are treated as a whole as the rough neuron r, and the connections between neurons represent the exchange of information. Figures 2-4 show, respectively, the three common types of connection between rough neurons r and s: full connection, inhibitory connection, and excitatory connection. The output of a rough neuron is a pair of numbers, an upper and a lower approximation value, while an ordinary neuron has only a single output value. The input of an upper-bound or lower-bound neuron i is calculated according to the following formula:

input_i = Σ_{j=1}^{n} w_ji × output_j

where w_ji is the weight of the connection from neuron j to neuron i, and n is the number of connections entering neuron i.

If f(u) is the activation function of a neuron, then the output values of the upper-bound and lower-bound neurons are, respectively,

output_r̄ = max(f(input_r̄), f(input_r_))

output_r_ = min(f(input_r̄), f(input_r_))

The output value of an ordinary single neuron i is calculated by the formula:

output_i = f(input_i)

The function f is a sigmoid-type function, defined as follows:

f(u) = 1 / (1 + e^(−gain × u))

where gain is the gain factor that determines the slope and is set by the system designer. A sigmoid-type transfer function is used for f(u) because this type of transfer function produces continuous outputs between 0 and 1.
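Putting the formulas above together, a single rough neuron pair can be simulated in a few lines. This is an illustrative Python sketch of the computation as stated (weights and inputs are made up), not an implementation from [54]:

```python
import math

def sigmoid(u, gain=1.0):
    """f(u) = 1 / (1 + exp(-gain * u))."""
    return 1.0 / (1.0 + math.exp(-gain * u))

def rough_neuron(inputs, w_upper, w_lower, gain=1.0):
    """Outputs of the upper-bound/lower-bound neuron pair of one rough neuron."""
    net_upper = sum(w * x for w, x in zip(w_upper, inputs))
    net_lower = sum(w * x for w, x in zip(w_lower, inputs))
    out_upper = max(sigmoid(net_upper, gain), sigmoid(net_lower, gain))
    out_lower = min(sigmoid(net_upper, gain), sigmoid(net_lower, gain))
    return out_upper, out_lower

# hypothetical incoming signals and weights
print(rough_neuron(inputs=[0.2, 0.8], w_upper=[1.5, 0.5], w_lower=[0.5, 1.0], gain=2.0))
```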

Besides these combinations, other scholars have proposed some new ways of combining rough sets and neural networks, such as the strongly coupled integration approach [55], which provides an easily implemented new idea for determining the number of hidden layers, the number of hidden nodes, and the initial weights in neural network design, and for giving the network semantics. With the continuous development and innovation of the various theories and techniques of soft computing, such as the strengthened integration of neural networks with evolutionary algorithms, concept lattices, evidence theory, and chaos science, we believe even more exciting achievements will be obtained.

4 Rough sets and genetic algorithms

The genetic algorithm [56] is a computational model of natural evolutionary systems and a general adaptive search method for solving optimization problems. It is characterized by a population-based search strategy and simple genetic operators, and it is currently the most important evolutionary algorithm, widely used in artificial intelligence, data mining, automatic control, commercial applications, and other areas.

4.1 Basic principles of genetic algorithms

By simulating natural selection and genetic mechanisms, the genetic algorithm iteratively performs fitness evaluation, selection, recombination, and mutation on a target population until the population satisfies a predetermined requirement or the maximum number of iterations is reached, thus obtaining the desired optimal solution. The key issues of a genetic algorithm are the choice of encoding for individuals in the problem space, the determination of the fitness function, the choice of genetic strategy, the three genetic operators (selection, crossover, mutation), and the determination of genetic parameters such as the selection probability p_s, crossover probability p_c, and mutation probability p_m. The standard genetic algorithm is described below [56]:

begin (iteration): t = 0

initialize: P(0) = {a_1(0), a_2(0), ..., a_N(0)}

evaluate: P(0): {f(a_1(0)), ..., f(a_N(0))}

while T(P(t)) ≠ true do

    select: P'(t) = s(P(t), p_s)

    crossover: P''(t) = c(P'(t), p_c)

    mutate: P'''(t) = m(P''(t), p_m)

    new generation: P(t+1) = P'''(t), t = t + 1

    evaluate: P(t+1) = {f(a_1(t+1)), ..., f(a_N(t+1))}

end do

4.2 Combination of rough sets and genetic algorithms

Rough sets and genetic algorithms are used together mainly for attribute reduction [57-59], data mining [60], and so on. For attribute reduction with rough sets, heuristic algorithms are commonly used, such as reduction based on Pawlak attribute significance and reduction based on the discernibility matrix. These approaches are effective for problems within a certain size range, but as the problem size grows, the difficulty of finding a minimal reduct increases substantially. The genetic approach is, simply put, an algorithm for finding a minimal reduct or a relative minimal reduct of an information system, where a minimal reduct (or relative minimal reduct) is the reduct (or relative reduct) containing the fewest attributes. Because the genetic algorithm is a global optimization search method that is parallel and very robust, it can avoid getting trapped in local optima and is better suited to large-scale reduction problems.
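As a sketch of how a genetic algorithm can search for a small reduct (a hypothetical Python illustration, not the algorithm of [57] or [61]): each chromosome is a bit mask over the condition attributes, and the fitness rewards masks that preserve the partition induced by the full attribute set while using as few attributes as possible.

```python
import random

def ind_partition(objects, attrs):
    """Partition induced by identical values on the selected attributes."""
    classes = {}
    for name, vals in objects.items():
        classes.setdefault(tuple(vals[a] for a in attrs), set()).add(name)
    return set(map(frozenset, classes.values()))

def fitness(mask, objects, all_attrs, full_partition):
    attrs = [a for a, bit in zip(all_attrs, mask) if bit]
    if not attrs:
        return 0.0
    consistent = ind_partition(objects, attrs) == full_partition
    # reward consistency strongly, then prefer fewer attributes
    return (2.0 if consistent else 0.0) + (len(all_attrs) - len(attrs)) / len(all_attrs)

def ga_reduct(objects, all_attrs, pop_size=20, generations=50, p_c=0.8, p_m=0.1):
    full = ind_partition(objects, all_attrs)
    pop = [[random.randint(0, 1) for _ in all_attrs] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda m: fitness(m, objects, all_attrs, full), reverse=True)
        nxt = scored[:2]                                   # elitism: keep the two best masks
        while len(nxt) < pop_size:
            p1, p2 = random.sample(scored[:10], 2)         # select parents from the better half
            cut = random.randrange(1, len(all_attrs)) if random.random() < p_c else 0
            child = p1[:cut] + p2[cut:]                    # one-point crossover
            child = [1 - b if random.random() < p_m else b for b in child]   # bit-flip mutation
            nxt.append(child)
        pop = nxt
    best = max(pop, key=lambda m: fitness(m, objects, all_attrs, full))
    return [a for a, bit in zip(all_attrs, best) if bit]

# hypothetical information system: object -> {attribute: value}
objs = {"x1": {"a": 0, "b": 1, "c": 1}, "x2": {"a": 0, "b": 1, "c": 0},
        "x3": {"a": 1, "b": 0, "c": 0}, "x4": {"a": 1, "b": 1, "c": 1}}
print(ga_reduct(objs, ["a", "b", "c"]))                    # typically ['a', 'c']
```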

The literature [57] uses a lower-triangular discernibility matrix together with a genetic algorithm and proposes a rough set knowledge reduction algorithm based on genetic algorithms, which can not only obtain correct reducts but also solve some problems that rough set heuristics cannot; the literature [61] introduces an information-theoretic measure of attribute significance into the genetic algorithm as heuristic information and constructs a new operator, modifypop(t+1), to repair the population, which both guarantees global optimization and improves the convergence speed. In data mining, the literature [60] combines rough sets and genetic algorithms and proposes an approach for obtaining decision rules from large data tables: the method uses the ideas of rough set attribute significance and the core to obtain an attribute reduct, and then uses a genetic algorithm to find an optimal solution.

In addition, the discretization of continuous attributes is an important issue in rough sets. The key to attribute discretization is to select appropriate cut points on the condition attributes to divide the space, so as to reduce the search space. To solve this problem, the literature [62] uses a genetic algorithm with the minimal cut-point set as the optimization goal and constructs a new operator to ensure that the selected cut points preserve the indiscernibility of the original decision system.

5 Rough sets and concept lattices

Concept lattice theory, also called formal concept analysis, was put forward by the German mathematician Wille. Based on a mathematical expression of concepts and concept hierarchies [63], it is very effective for data analysis and rule extraction and is now widely used in machine learning [64], software engineering [65], and other fields.

5.1 Concept lattice theory basics


Definition 9 [66] Formal context. (U, A, I) is called a formal context, where U = {x_1, x_2, ..., x_n} is a set of objects, each x_i (i ≤ n) being called an object; A = {a_1, a_2, ..., a_m} is a set of attributes, each a_j (j ≤ m) being called an attribute; and I is a binary relation between U and A, I ⊆ U × A. If (x, a) ∈ I, we say that x has attribute a, written xIa.

In a formal context (U, A, I), for a subset of objects X ⊆ U and a subset of attributes B ⊆ A, the operators X* and B* are defined as


X* = { a | a ∈ A, ∀x ∈ X, xIa }

B* = { x | x ∈ U, ∀a ∈ B, xIa }

Here X* is the set of attributes common to all objects in X, and B* is the set of objects that possess all attributes in B.
Definition 10 Formal concept. Let (U, A, I) be a formal context. If a pair (X, B) satisfies X* = B and B* = X, then (X, B) is called a formal concept, or simply a concept, where X is called the extent of the concept and B its intent.

Definition 11 [67] Child concept, parent concept. If (X_1, B_1) ≤ (X_2, B_2) and there is no other concept (Y, C) between them satisfying (X_1, B_1) ≤ (Y, C) ≤ (X_2, B_2), then (X_1, B_1) is called a child concept of (X_2, B_2), and (X_2, B_2) a parent concept of (X_1, B_1).
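The two derivation operators can be written in Python directly from their definitions. The toy formal context below is made up purely for illustration, and the helper names are our own.

```python
def derive_objects(X, context, attrs):
    """X*: attributes possessed by every object in X (X* = attrs when X is empty)."""
    return set(attrs) if not X else set.intersection(*(set(context[x]) for x in X))

def derive_attrs(B, context):
    """B*: objects that possess every attribute in B."""
    return {x for x, have in context.items() if set(B) <= set(have)}

def is_formal_concept(X, B, context, attrs):
    """(X, B) is a formal concept iff X* = B and B* = X."""
    return derive_objects(X, context, attrs) == set(B) and derive_attrs(B, context) == set(X)

# hypothetical formal context: object -> set of attributes it has (the relation xIa)
ctx = {"x1": {"a", "b"}, "x2": {"a", "b", "c"}, "x3": {"c"}}
attrs = {"a", "b", "c"}
print(derive_objects({"x1", "x2"}, ctx, attrs))                 # {'a', 'b'}
print(is_formal_concept({"x1", "x2"}, {"a", "b"}, ctx, attrs))  # True
```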


5.2 Connections between rough sets and concept lattices

Both rough sets and concept lattices carry out their studies on the basis of binary relations in data tables. Rough set theory achieves a partition of the universe, producing a number of equivalence classes, based on an indiscernibility relation on the universe. Concept lattices are based on formal concepts and discuss the concept hierarchy by combining lattice theory and order theory. Each concept in a concept lattice is a maximal set of objects sharing common attributes, which is very similar to the equivalence classes of rough sets on a formal context, the extent being determined by the intent. Therefore some notions of rough sets, including equivalence classes and the upper and lower approximations, can be described through concepts; at the same time, the special structure of the concept lattice can express functional dependencies, so concept lattices can be used to visualize condition attribute reduction.

The similarity between rough sets and concept lattices links the two theories closely, and many scholars have studied their combination. Wei Ling et al. [67] studied the relationships between formal concept analysis and the equivalence classes and partitions of rough sets and concluded that rough set theory and concept lattices can be converted into each other; the literature [68] introduced the attribute reduction and discernibility matrix ideas of rough set theory into formal concept analysis and realized the reduction of knowledge redundancy in formal contexts; Yao [69, 70], based on object-oriented concept lattices, discussed the correspondence between concept lattices and rough set theory, introduced the lower and upper approximation ideas of rough set theory into formal concept analysis, and discussed several approximation operators in formal concept analysis; the literature [71] introduced inclusion degree and partially ordered sets into formal concept analysis and represented some of its basic notions using inclusion degree and posets; the literature [72] used the nominal scale and plain scaling concepts of formal concept analysis to demonstrate that core notions of rough set theory such as the upper and lower approximations and attribute dependency can be derived in the corresponding contexts, and showed that the notion of scaling can be used to extend rough set theory, providing a theoretical platform for the integration of the two; the literature [73], based on a study of rough set theory and concept lattices, gave binary operations between elements of the set of concepts of a formal context and generalized ordinary concept lattices to concept lattices with operators.

6 Rough sets and evidence theory


Evidence theory [74], often called D-S (Dempster-Shafer) theory, is a theory that uses a pair of set functions to handle uncertainty. The evidence studied in evidence theory refers to the properties of an object or the experience of experts.

6.1 Basics of evidence theory
Let Θ denote the set of all possible answers to a given problem, in which the answers are mutually exclusive; Θ is called the frame of discernment.
Definition 12 [75] Basic probability assignment. Let Θ be a frame of discernment. If a set function m: 2^Θ → [0, 1] satisfies m(∅) = 0 and Σ_{A⊆Θ} m(A) = 1, then m is a basic probability assignment on Θ; for any A ⊆ Θ, m(A) is called the basic belief of A.

Based on Definition 12, three measure functions are defined on the power set 2^Θ of Θ:


a) belief function Bel: Bel(X) = Σ_{A⊆X} m(A), ∀X ⊆ Θ;

b) plausibility function pl: pl(X) = Σ_{A∩X≠∅} m(A);

c) commonality function Q: Q(X) = Σ_{X⊆A} m(A).
Here the belief function Bel(X) expresses the total belief in each proposition X; the plausibility function pl(X) expresses the degree to which proposition X is not doubted; and the commonality function Q(X) reflects the total basic belief of all sets containing X.
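These three functions can be computed directly from a basic probability assignment; the following Python sketch uses a made-up assignment on a three-element frame purely to illustrate the formulas.

```python
def bel(m, X):
    """Bel(X) = sum of m(A) over all focal sets A contained in X."""
    return sum(v for A, v in m.items() if A <= X)

def pl(m, X):
    """pl(X) = sum of m(A) over all focal sets A intersecting X."""
    return sum(v for A, v in m.items() if A & X)

def q(m, X):
    """Q(X) = sum of m(A) over all focal sets A containing X."""
    return sum(v for A, v in m.items() if X <= A)

# hypothetical basic probability assignment on the frame {h1, h2, h3}
m = {frozenset({"h1"}): 0.4,
     frozenset({"h1", "h2"}): 0.3,
     frozenset({"h1", "h2", "h3"}): 0.3}
X = frozenset({"h1", "h2"})
print(bel(m, X), pl(m, X), q(m, X))   # 0.7 1.0 0.6
```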

6.2 Connections between rough sets and evidence theory
Evidence theory defines the belief function and the plausibility function from a basic probability assignment and uses these functions to estimate and evaluate evidence under given hypotheses. The evidence in evidence theory is mainly the known properties of things, expert knowledge, and other prior knowledge, which makes evidential reasoning strongly subjective and limits its scope of use. These characteristics of evidence theory and rough sets are clearly complementary and similar: rough sets solve problems on the basis of a pair of objective approximation operators and are strongly objective, while the lower and upper approximations of rough set theory also bear a certain formal similarity to the belief and plausibility functions of evidence theory. Studying the combination of the two, exploiting both their complementary and their similar natures, has therefore become an important direction in this area.

The literature [76, 77] studied the similarity between rough set theory and evidence theory on a random approximation space and concluded that the belief function and plausibility function of evidence theory can be described by the lower and upper approximations of rough sets:

Bel(X) = |R_(X)| / |U|,  pl(X) = |R̄(X)| / |U|
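The correspondence can be checked numerically: with the basic probability assignment that gives each equivalence class Y the mass |Y|/|U|, Bel and pl reduce to the ratios above. A minimal Python sketch with a made-up approximation space:

```python
def rough_belief(partition, X):
    """(Bel(X), pl(X)) obtained from the lower/upper approximations of X."""
    U = set().union(*map(set, partition))
    X = set(X)
    lower = set().union(*(set(c) for c in partition if set(c) <= X))
    upper = set().union(*(set(c) for c in partition if set(c) & X))
    return len(lower) / len(U), len(upper) / len(U)

U_over_R = [{1, 2}, {3, 4, 5}, {6}]        # hypothetical approximation space U/R
print(rough_belief(U_over_R, {1, 2, 3}))   # (0.333..., 0.833...)
```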

The literature [78] carried out further research on the relationship between rough set theory and evidence theory, showing that different frames of discernment and the lower and upper approximations of various rough approximation spaces are closely linked, and that this link can be used to interpret the belief and plausibility functions on a frame of discernment, thereby deepening the understanding of both theories.

7 Conclusion

Technological development leads people to expect modern tools for living, learning, and research to be automated, convenient, intelligent, and fast, while the reality is that the data people obtain and need to process are not only huge and complicated but mostly uncertain, incomplete, or not entirely true. How to extract the information people need effectively and quickly has become a serious problem. The emergence of soft computing theories has helped people make great achievements in this respect, and the rapid development of rough set theory has provided strong support for and expansion of applied soft computing research. With the deepening of research on soft computing, it has been found that any single soft computing theory has shortcomings of one kind or another in both theory and application, and the strong complementarity between these theories can compensate for these shortcomings; therefore, combining different soft computing theories has become a consensus in the current academic community.

This paper has described the current research on combining rough set theory, which has developed rapidly in recent years and has very novel features, with some of the other soft computing theories. From this it can be seen that such combinations have achieved remarkable results in artificial intelligence, data mining, knowledge discovery, attribute reduction, automatic control, medicine, and other areas. Moreover, computing with words [79] has become a hot research topic in artificial intelligence. Computing with words takes words or linguistic terms, rather than numerical values, as the objects of computation, and words themselves carry uncertain meanings, which closely resembles the way rough sets describe problems; therefore, the combination of rough sets and computing with words will also be part of the future development of rough sets. We believe that, as research on combining rough sets with other soft computing theories deepens, even more gratifying successes will be seen.

At present, studies combining soft computing theories are generally confined to combinations of two particular theories. In our own work we have also found that even these pairwise combinations are far from perfect and leave much room for improvement. Future studies should therefore bring more soft computing theories together so that they can learn from and complement each other, raising the level of research in this field.

References:

[1] ZADEH L A.Fuzzy logic, neural networks and soft computing [J]. Communications of the ACM, 1994,37 (3) :77-84.
[2] PAWLAK Z.Rough sets [J]. International Journal of Computer and Information Sciences, 1982,11 (5) :341-356.

[3] ALVATORE G, BENTTOM, ROMAN S.Rough set theory for multi criteria decision analysis [J]. European Journal of Operational Research, 2001,129 (1) :1-47.
[4] An Liping, CHEN Zeng-qiang, YUAN Zhu-zhi based on rough set theory of multi-attribute decision analysis [J]. Control and Decision, 2005,20 (3) :294-298.

[5] Li Yongmin, Zhushan Jun, Chen Xianghui, etc. Based on Rough Set Theory Data Mining Model [J]. Tsinghua University: Natural Science Edition, 1999 (1) :111-114.

[6] Liu Qing, Huang Zhaohua, Liu Shaohui, etc. Operators with Rough decision rules and data mining in soft computing [J]. Computer Research and Development, 1999,36 (7) :33-37.

[7] Wen-Yu Chang, XUE Huifeng, Zhang, et al. Rough sets in data mining classification rules Application Research [J]. Northwestern Polytechnical University, 2002,20 (3) :430-434.

[8] Tao multi-show, LV Yue, DENG Chun-yan based on rough set of multi-dimensional association rules mining method [J]. Journal of Computer Applications, 2009,29 (5) :1405-1408.

[9] MIAO Duo Qian, LI Dao-Guo set theory, algorithms and applications [M]. Beijing: Tsinghua University Press, 2008.

[10] Hu Qinghua, Yu Daren, Xie Zongxia based on neighborhood granulation and rough approximation of numerical attributes reduction [J]. Journal of Software, 2008,19 (3) :640-649.

[11] Hu Qinghua, Zhao Hui, Yu Daren based on neighborhood rough set of symbolic and numeric attributes rapid reduction algorithm [J]. Pattern Recognition and Artificial Intelligence, 2008,21 (6) :732-738.

[12] Manufacturers Lin, Wan Qiong Yao Wang Shu, etc. A continuous-valued attribute reduction method ReCA [J]. Computer Research and Development, 2005,42 (7) :1217-1224.

[13] YAO Yi-yu.Three-way decisions with probabilistic rough sets [J]. Information Sciences, 2010,180 (3) :341-353.

[14] CHEN Yu-min, MIAO Duo-qian, WANG Rui-zhi.A rough set approach to feature selection based on ant colony optimization [J]. Pattern Recognition Letters, 2010,31 (3) :226-233.
[15] LIANG Ji-ye, WANG Jun-hong, QIAN Yu-hua.A new measure of uncertainty based on knowledge granulation for rough sets [J]. Information Sciences, 2009,179 (4) :458-470.
[16] QIAN Yu-hua, LIANG Ji-ye.Combination entropy and combination granulation in rough set theory [J]. International Journal of Uncertainty, Fuzziness and Knowlege-based Systems, 2008,16 (2) :179-193.

[17]BASZCZYNSKI J,GRECO S,SOWINSKI R,et al.Monotonic variable consistency rough set approaches[J].International Journal of Approximate Reasoning,2009,50(7):979-999.

[18]ZHU W,WANG Fei-yue.A new type of covering rough set[C]//Proc of the 3rd International IEEE Conference on Intelligent Systems.2006:444-449.
[19]ZHU W,WANG Fei-yue.Reduction and axiomization of covering ?generalized rough sets[J].Information Sciences,2003,152(1):?217-230.
[20]ZHU W.Topological approaches to covering rough sets[J].Information Sciences,2007,177(6):1499-1508.

[21]LIU Gui-long,SAI Ying.A comparison of two types of rough sets induced by coverings[J].International Journal of Approximate Reasoning,2009,50(3):521-528.
[22]SLEZAK D,ZIARKO W.Variable precision Bayesian rough set model[C]//Proc of the 9th International Conference on Rough Sets, Fuzzy Sets, Data Mining, and Granular Comuting.Berlin:Springer-Verlag,2003:312-315.
[23]祝峰,何华灿.粗集的公理化[J].计算机学报,2000,23(3):330-333.

[24]王珏,刘三阳,王建新.粗糙集理论的扩展模型研究[J].同济大学学报:自然科学版,2006,34(9):1251-1255.

[25]秦克云,高岩.决策表的正域约简及核的计算[J].西南交通大学学报,2007,42(1):125-128.

[26]张文修,米据生,吴伟志.不协调目标信息系统的知识约简[J].计算机学报,2003,26(1):12-18.

[27]张海云,梁吉业,钱宇华.基于划分的信息系统属性约简[J].计算机应用,2006,26(12):2961-2963.

[28]王国胤,张清华.不同知识粒度下粗糙集的不确定性研究[J]. 计算机学报, 2008,31(9):1588-1598.

[29]DUBOIS D,PRADE H.Rough fuzzy set and fuzzy rough sets[J].International Journal of General Systems,1990(17):191-209.

[30]邱卫根.基于随机模糊集的不完全信息系统粗集模型[J].模式识别与人工智能,2009,22(1):53-59.

[31]HONG T,TSENG L,CHIEN B.Mining from incomplete quantitative data by fuzzy rough sets[J].Expert Systems with Applications,2010,37(3):2644-2653.

[32]LI Jiang-ping,PAN Bao-chang,WEI Yu-ke.Tongue image segmentation based on fuzzy rough sets[C]//Proc ofInternational Conference on Environmental Science and Information Application Technology.Washington DC:IEEE Computer Society,2009:367-369.

[33]PETROSINO A,FERONE A.Rough fuzzy set-based image compression[J].Fuzzy Sets and Systems,2009,160(10):1485-1506.

[34]YAO Yi-yu.Combination of rough and fuzzy sets based on @-level sets[M]//Rough Sets and Data Mining:Analysis for Imprecise Data.Boston:Kluwer Academic Publishers, 1997:301-321.

[35]YAO Yi-yu.A comparative study of fuzzy sets and rough sets[J].Information Science,1998,109(1-4):227-242.

[36]HU Qing-hua,YU Da-ren,WU Cong-xin.Fuzzy preference relation rough sets[C]//Proc ofIEEE International Conference on Granular Computing.2008:300-305.

[37]WU Wei-zhi.Fuzzy rough sets determined by fuzzy implication operators[C]//Proc ofIEEE International Conference on Granular Computing.Washington DC:IEEE Computer Society,2009:596-601.

[38]徐伟华,张文修.覆盖广义粗糙集的模糊性[J].模糊系统与数学,2006,20(6):115-121.

[39]王熙照,赵素云,王静红.基于Rough集理论的模糊值属性信息表简化方法[J].计算机研究与发展,2004,41(11):1974-1981.

[40]宋擒豹,沈钧毅.神经网络数据挖掘方法中的数据准备问题[J].计算机工程与应用,2000,36(12):102-104.

[41]徐建军.医学影像数据挖掘中的人工神经网络方法研究[J].实用放射学杂志, 2006,22(11):1416-1418.

[42]周序生,王志明.粗糙集和神经网络方法在数据挖掘中的应用[J].计算机工程与应用,2009,45(7):146-149.

[43]刘政凯,章杨清.利用分维向量改进神经网络在遥感模式识别中的分类精度[J]. 遥感学报, 1994,9(1):68-72.

[44]周洪宝,闵珍,宫宁生. 基于粗糙集的神经网络在模式识别中的应用[J].计算机工程与设计,2007,28(22):5464-5467.

[45]王守觉,李卫军,赵顾良,等.模式识别专用神经网络计算机系统及应用方法:北京,CN1700250[P].2005:11-23.

[46]周志华,皇甫杰,张宏江,等.基于神经网络集成的多视角人脸识别[J].计算机研究与发展,2001,38(10):1204-1210.

[47]周志华,李宁,杨育彬,等.基于神经网络集成的肺癌早期诊断[J].计算机研究与发展,2002,39(10):1248-1253.

[48]黄春琳,邱玲,沈振康.数字调制信号的神经网络识别方法[J].国防科技大学学报,1999,21(2):61-64.

[49]游荣义,陈忠.基于小波变换的盲信号分离的神经网络方法[J].仪器仪表学报,2005,26(4):415-418.

[50]XIAO Zhi,YE Shi-jie,ZHONG Bo,et al.BP neural network with rough set for short term load forecasting[J].Expert Systems with Applications,2009,36(1):273-279.

[51]LIU Hui,KONG Wei,QIU Tian-shuang,et al.A neural network based on rough set(RSNN) for prediction of solitary pulmonary nodules[C]//Proc ofInternational Joint Conference on Bioinformatics, Systems Biology and Intelligent Computing.Washington DC:IEEE Computer Society,2009:135-138.

[52]SIMON H.神经网络原理[M].叶世伟,史忠植,译.北京:机械工业出版社,2004.

[53]JELONEK J,KRAWIEC K,SLOWINSKI R.Rough set reduction of attributes and their domains for neural networks[J].Computational Intelligence,1995,11(2):339-347.

[54]LINGRAS P J.Rough neural networks[C]//Proc of the 6th International Conference on Information Processing and Management of Uncertainty in Knowledge-based Systems.1996:1445-1450.

[55]张东波,王耀南,易灵芝.粗集神经网络及其在智能信息处理领域的应用[J].控制与决策, 2005,20(2):121-126.

[56]李敏强,寇纪淞,林丹,等.遗传算法的基本理论与应用[M].北京:科学出版社,2002.

[57]王文辉,周东华.基于遗传算法的一种粗糙集知识约简算法[J].系统仿真学报,2001,13(Z1):91-93.

[58]赵敏,罗可,廖喜讯.基于免疫遗传算法的粗糙集属性约简算法[J].计算机工程与应用,2007,43(23):171-173.

[59]王萍,王学峰,吴谷丰.基于遗传算法的粗糙集属性约简算法[J].计算机应用与软件,2008,27(5):42-44.

[60]胡彧,张亦军,杨冬梅.粗糙集结合遗传算法在数据挖掘中的应用[J].计算机应用,2006,26(1):98-99.

[61]代建华,李元香.粗集中属性约简的一种启发式遗传算法[J].西安交通大学学报,2002,36(12):1286-1290.

[62]代建华,李元香,刘群.粗糙集理论中基于遗传算法的离散化方法[J].计算机工程与应用, 2003,39(8):13-14.

[63]苗夺谦,王国胤,刘清,等.粒计算:过去、现在与展望[M].北京:科学出版社,2007.

[64]ZUPANA B,BOHANEC M,DEMAR J,et al.Learning by discovering concept hierarchies[J].Artificial Intelligence,1999,109(1):211-242.

[65]DEKEL U,GIL Y.Revealing class structure with concept lattices[C]//Proc of the 10th Working Conference on Reverse Engineering.Washington DC:IEEE Computer Society,2003:353.

[66]GANTER B,WILLE R.Formal concept analysis:mathematical foundations[M].Berlin:Springer,1999.

[67]魏玲,祁建军,张文修.概念格与粗糙集的关系研究[J].计算机科学,2006(3):18-21.

[68]ZHANG Wen-xiu,WEI Ling,QI Jian-jun.Attribute reduction in concept lattice based on discernibility matrix[C]//Proc of the 10th International Conference on Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing.Berlin:Springer,2005:157-165.

[69]YAO Yi-yu.Concept lattices in rough set theory[C]//Proc of Annual Meeting of the North American Fuzzy Information Processing Society.2004:796-801.

[70]YAO Yi-yu.A comparative study of formal concept analysis and rough set theory in data analysis[C]//Proc of the 4th International Confe-?rence on Rough Sets and Current Trends in Computing.Berlin:Sprin-?ger,2004:59-68.

[71]曲开社,翟岩慧.偏序集、包含度与形式概念分析[J].计算机学报,2006,29(2):219-226.

[72]曲开社,翟岩慧,梁吉业,等. 形式概念分析对粗糙集理论的表示及扩展[J].软件学报,2007,18(9):2174-2182.

[73]梁吉业.基于粗糙集与概念格的智能数据分析方法研究[D]. 北京: 中国科学院计算技术研究所,2004.

[74]SHAFER G.A mathematical theory of evidence[M].Princeton:Princeton University Press,1976.

[75]单渊达,倪明.证据理论及其应用[J].电力系统自动化,1996(3):76-80.

[76]SKOWRON A.The relationship between the rough set theory and evidence theory[J].Bulletin of the Polish Academy of Sciences Mathematics,1989,37(1):87-90.

[77]FAGIN R,HALPERN J Y.Uncertainty,belief,and probability[J].Computational Intelligence,1991(7):160-173.

[78]WU Wei-zhi,LEUNG Y,ZHANG Wen-xiu.Connections between rough set theory and Dempster-Shafer theory of evidence[J].International Journal of General Systems,2002,31(4):405-430.

[79]WANG Fei-yue.On the abstraction of conventional dynamic systems:from numerical analysis to linguistic analysis[J].Information ?Sciences,2005,171(1-3):233-259.

