https://doi.org/10.31449/inf.v45i1.3199 Informatica 45 (2021) 143–148

Study of Fuzzy Distance Measure and Its Application to Medical Diagnosis

Taruna and H. D. Arora
Department of Mathematics, Amity University, Noida, India
E-mail: hdarora@amity.edu

Vijay Kumar
Manav Rachna International Institute of Research & Studies, Faridabad, India

Keywords: fuzzy sets, directed divergence, fuzzy relative information measures, multi-criteria decision making

Received: June 12, 2020

Ambiguity plays an important part in the contradictory observations we make about the peripheral world. Entropy, first introduced by Shannon (1948) to measure the degree of randomness in a probability distribution, is fundamental for quantifying uncertain information. Fuzzy information measures have been applied widely in decision making, and the Jensen–Shannon divergence is a useful distance measure on the space of probability distributions. In the present communication, we propose a way of measuring the difference between two fuzzy sets by means of a function called a divergence. In addition, its properties are studied in detail to establish its validity. The newly developed fuzzy divergence measure is applied to optimal decision making based on the weights of alternatives, and a numerical example illustrates the proposed method for solving an optimal decision-making problem in a fuzzy environment.

Povzetek: Za probleme medicinske diagnostike je narejena študija nedorečenosti in entropije. (In English: A study of uncertainty and entropy is carried out for medical diagnosis problems.)

1 Introduction

Information theory grew out of mathematical studies of problems linked with the communication, storage, and transmission of messages. It originated from the fundamental paper "The Mathematical Theory of Communication" published by Shannon [1]. Shannon developed mathematical schemes for quantitatively defining the notion of information and proved several celebrated results with far-reaching consequences.
Various generalizations of Shannon entropy were studied by Renyi [2], Arimoto [3], Sharma and Taneja [4], De Luca and Termini [5], Kaufmann [6] and Peerzada et al. [7]. Uncertainty and fuzziness are intrinsic to human thinking and to many real-world problems. Fuzziness is found in our decisions, in our language and in the way we process information. The fundamental use of information is to remove uncertainty and fuzziness. In fact, we measure the information furnished by an experiment through the amount of probabilistic uncertainty it removes, and the amount of uncertainty removed is also called a measure of information, while the degree of fuzziness measures the vagueness and ambiguity of uncertainties. The theory of fuzzy sets (FSs) was developed by Zadeh [8], as a generalization of classical set theory, for representing vague and indistinct phenomena. It serves as an effective tool for understanding the behaviour of humanistic systems in which human judgment, perceptions and emotions play a critical role. In fuzzy set theory, entropy is defined as a degree of fuzziness, expressing the amount of ambiguity or difficulty in deciding whether an element belongs to a set or not. Bhandari and Pal [9] extended the probabilistic exponential entropy concept of Pal and Pal [10] to the fuzzy setting. Kapur [11] discussed fuzzy measures of uncertainty due to the fuzziness of information. In the fuzzy context, several measures have been proposed to quantify the degree of difference between two fuzzy sets; such a measure of difference is called a fuzzy divergence measure. Similarity measures are important tools that can be used in decision-making problems to deal with uncertainty through intuitionistic fuzzy set (IFS) theory. Various distance measures have been proposed by different researchers.
It has been observed that different distance measures produce different values when measuring the degree of distance between two IFSs. Moreover, existing distance measures are sometimes unable to give an appropriate and convenient result for a pair of IFSs. For this reason, it is always worthwhile to derive improved measures for better decision making. To describe the distinction between fuzzy sets, the distance measure was set up and regarded as the dual of the correspondence (similarity) measure. Many researchers, such as Yager [12], Kosko [13] and Kaufmann [6], used distance measures to define fuzzy entropy. Several methods of generating fuzzy entropy from distance measures, and properties of distance measures, were developed by Fan et al. [14]. The distances between two fuzzy subsets, viewed as a fuzzy subset of R+, were characterized by Dubois and Prade [15]; thus the set of distances between two fuzzy sets was generalized, whereas the shortest distance between two crisp sets was not. The shortest distance between two fuzzy sets, described as a density function on the non-negative reals, was given by Rosenfeld [16]. Related to the Kullback and Leibler [17] probabilistic measure of divergence, a measure of fuzzy directed divergence was initiated by Bhandari and Pal [9]. Montes et al. [18] proposed an axiomatic framework for measuring the difference between fuzzy sets and studied in detail the case of local divergences. Luo and Zhao [19] gave algorithms for pattern recognition and used them to solve medical diagnosis problems. Gupta and Tiwari [20] proposed cosine similarity measures for intuitionistic and interval-valued intuitionistic fuzzy sets, and Dutta and Goala [21] applied an advanced distance measure on intuitionistic fuzzy sets to fuzzy decision making in medical diagnosis.

2 Preliminaries

The notion of entropy was introduced to provide a numerical measure of ambiguity.
Shannon [1] defined the quantity

H(P) = -\sum_{i=1}^{n} p_i \log p_i    (1)

for the uncertainty of a probability distribution P = (p_1, p_2, \ldots, p_n) and called it entropy. A fuzzy set \tilde{A} in a finite universe of discourse X = (x_1, x_2, \ldots, x_n) is given by

\tilde{A} = \{ \langle x, \mu_{\tilde{A}}(x) \rangle \mid x \in X \}    (2)

where \mu_{\tilde{A}} : X \to [0,1] is the membership function of \tilde{A}; the number \mu_{\tilde{A}}(x) describes the degree of belongingness of x \in X in \tilde{A}. De Luca and Termini [5] defined the fuzzy entropy of a fuzzy set \tilde{A}, corresponding to Shannon entropy (1), as

H(\tilde{A}) = -\frac{1}{n} \sum_{i=1}^{n} \left[ \mu_{\tilde{A}}(x_i) \log \mu_{\tilde{A}}(x_i) + \left(1 - \mu_{\tilde{A}}(x_i)\right) \log \left(1 - \mu_{\tilde{A}}(x_i)\right) \right]    (3)

Motivated by the fundamental properties of directed divergence, Kapur [11] described fuzzy directed divergence as follows: the directed divergence of a fuzzy set A from a fuzzy set B is a function D(A; B) that satisfies the following conditions:

1. D(A; B) \geq 0
2. D(A; B) = 0 iff A = B
3. D(A; B) is a convex function in (0, 1)
4. D(A; B) should not change when \mu_A(x_i) is changed to 1 - \mu_A(x_i) and \mu_B(x_i) is changed to 1 - \mu_B(x_i).

Corresponding to the Kullback–Leibler [17] measure of divergence, Bhandari and Pal [9] proposed a fuzzy divergence measure between A and B given by

D(A; B) = \frac{1}{n} \sum_{i=1}^{n} \left[ \mu_A(x_i) \log \frac{\mu_A(x_i)}{\mu_B(x_i)} + \left(1 - \mu_A(x_i)\right) \log \frac{1 - \mu_A(x_i)}{1 - \mu_B(x_i)} \right]    (4)

Later, Shang and Jiang [22] pointed out that expression (4) has a limitation: if \mu_B(x_i) approaches 0 or 1, its value tends to \infty.
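Measures (3) and (4) translate directly into code. The following is a minimal illustrative sketch (the function names are mine, and natural logarithms are assumed; the base of the logarithm merely rescales both measures). It also exhibits the limitation noted by Shang and Jiang [22]: as any \mu_B(x_i) approaches 0, measure (4) grows without bound.

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy, Eq. (3), with 0*log(0) taken as 0."""
    def h(p):
        return -p * math.log(p) if p > 0 else 0.0
    return sum(h(m) + h(1 - m) for m in mu) / len(mu)

def bhandari_pal_divergence(mu_a, mu_b):
    """Bhandari-Pal fuzzy divergence, Eq. (4); requires 0 < mu_b(x_i) < 1."""
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        if a > 0:
            total += a * math.log(a / b)
        if a < 1:
            total += (1 - a) * math.log((1 - a) / (1 - b))
    return total / len(mu_a)
```

A maximally fuzzy set (all grades 0.5) attains the maximum entropy log 2, a crisp set has entropy 0, and the divergence (4) becomes arbitrarily large as a membership grade of B approaches 0 or 1.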
Therefore they proposed a modified version of fuzzy divergence measure (4), given as

J(A; B) = \sum_{i=1}^{n} \left[ \mu_A(x_i) \log \frac{\mu_A(x_i)}{\frac{\mu_A(x_i) + \mu_B(x_i)}{2}} + \left(1 - \mu_A(x_i)\right) \log \frac{1 - \mu_A(x_i)}{1 - \frac{\mu_A(x_i) + \mu_B(x_i)}{2}} \right]    (5)

Corresponding to the Kerridge [23] inaccuracy measure, Verma and Sharma [24] defined a measure of inaccuracy of fuzzy set B with respect to fuzzy set A as

I(A; B) = -\frac{1}{n} \sum_{i=1}^{n} \left[ \mu_A(x_i) \log \mu_B(x_i) + \left(1 - \mu_A(x_i)\right) \log \left(1 - \mu_B(x_i)\right) \right]    (6)

Ohlan [25] proposed a parametric generalized measure of divergence between two fuzzy sets A and B, corresponding to Taneja [26], as

L_t(A, B) = \sum_{i=1}^{n} \frac{\left(\mu_A(x_i) + \mu_B(x_i)\right)^2}{2^t} \times \left[ \frac{\left(\mu_A(x_i) + \mu_B(x_i)\right)^t}{\left(\sqrt{\mu_A(x_i)\,\mu_B(x_i)}\right)^{t+1}} + \frac{\left(2 - \mu_A(x_i) - \mu_B(x_i)\right)^t}{\left(\sqrt{\left(1 - \mu_A(x_i)\right)\left(1 - \mu_B(x_i)\right)}\right)^{t+1}} \right]    (7)

for t = 0, 1, 2, \ldots. The generalized measure of fuzzy directed divergence of order \alpha and type \beta was given by Arora and Dhiman [27] as

D_\alpha^\beta(A : B) = \frac{1}{(1-\alpha)\beta} \sum_{i=1}^{n} \left[ \left\{ \frac{\mu_A(x_i)^{\alpha \mu_A(x_i)}}{\mu_B(x_i)^{\alpha \mu_B(x_i)}} + \frac{\left(1 - \mu_A(x_i)\right)^{\alpha (1 - \mu_A(x_i))}}{\left(1 - \mu_B(x_i)\right)^{\alpha (1 - \mu_B(x_i))}} \right\}^\beta - 2^\beta \right]    (8)

where \alpha > 0, \alpha \neq 1, \beta \neq 0. Parkash and Kumar [28] proposed a fuzzy divergence measure of fuzzy set B with respect to fuzzy set A as follows:

K(A, B) = -\log \left( \frac{1 + \frac{1}{n} \sum_{i=1}^{n} \left[ \sqrt{\mu_A(x_i)\,\mu_B(x_i)} + \sqrt{\left(1 - \mu_A(x_i)\right)\left(1 - \mu_B(x_i)\right)} \right]}{2} \right)    (9)

Kumari et al. [29] proposed a weighted fuzzy exponential J-divergence as

H(A; W) = \frac{1}{n\left(\sqrt{e} - 1\right)} \sum_{i=0}^{M-1} \sum_{j=0}^{M-1} w_{ij} \left[ \mu_A(f_{ij})\, e^{1 - \mu_A(f_{ij})} + \left(1 - \mu_A(f_{ij})\right) e^{\mu_A(f_{ij})} - 1 \right]    (10)

where \mu_A(f_{ij}) is the membership value of the pixels in the image and f_{ij} is the (i, j)-th pixel of image A. Tiwari and Gupta [30] proposed entropy measures and derived relations between distance, entropy and similarity measures for interval-valued intuitionistic fuzzy sets (IvIFSs).
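For contrast with measure (4), the Shang–Jiang measure (5) stays finite at the extremes because each membership grade is compared against the midpoint (\mu_A + \mu_B)/2 rather than against \mu_B directly. A minimal sketch (the function name is mine; natural logarithms assumed):

```python
import math

def shang_jiang_divergence(mu_a, mu_b):
    """Shang-Jiang modified fuzzy divergence, Eq. (5).

    Comparing against the midpoint keeps every log argument positive,
    so the measure is finite even for membership grades of 0 or 1.
    """
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        m = (a + b) / 2
        if a > 0:
            total += a * math.log(a / m)
        if a < 1:
            total += (1 - a) * math.log((1 - a) / (1 - m))
    return total
```

For fully crisp, opposite grades (\mu_A = 1, \mu_B = 0) the measure evaluates to log 2 per element, exactly where measure (4) diverges.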
3 Proposed Fuzzy Distance Measure

Let X = (x_1, x_2, \ldots, x_n) be the universe of discourse, and let A = \{ \langle x_i, \mu_A(x_i) \rangle \mid x_i \in X \} and B = \{ \langle x_i, \mu_B(x_i) \rangle \mid x_i \in X \} be two fuzzy sets. We propose the following new distance measure:

D(A; B) = \frac{2}{n} \sum_{i=1}^{n} \frac{\sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}}{1 + \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}}    (11)

Theorem 3.1. The fuzzy distance measure D(A; B) defined in equation (11) is a valid fuzzy distance measure.

Proof. We show that the new measure satisfies the four conditions required of a distance measure:

(P1) 0 \leq D(A; B) \leq 1.
(P2) D(A; B) = 0 if and only if \mu_A(x_i) = \mu_B(x_i) for all i.
(P3) D(A; B) = D(B; A).
(P4) If A, B and C are three fuzzy sets, then the measure satisfies the triangle inequality, i.e., D(A; C) \leq D(A; B) + D(B; C).

We prove these conditions one by one.

(P1) Since A = \{ \langle x_i, \mu_A(x_i) \rangle \mid x_i \in X \} and B = \{ \langle x_i, \mu_B(x_i) \rangle \mid x_i \in X \} have membership degrees in [0, 1], we have

0 \leq \left| \mu_A(x_i) - \mu_B(x_i) \right| \leq 1
\Rightarrow 0 \leq \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \leq \frac{\pi}{2}
\Rightarrow 0 \leq \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\} \leq 1    (12)
\Rightarrow 1 \leq 1 + \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\} \leq 2    (13)

From (12) and (13), we have

0 \leq \frac{2 \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}}{1 + \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}} \leq 1
\Rightarrow 0 \leq \frac{2}{n} \sum_{i=1}^{n} \frac{\sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}}{1 + \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}} \leq 1
\Rightarrow 0 \leq D(A; B) \leq 1.

(P2) Since each summand in (11) is non-negative,

D(A; B) = 0
\Leftrightarrow \frac{\sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}}{1 + \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}} = 0 \text{ for every } i
\Leftrightarrow \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\} = 0
\Leftrightarrow \left| \mu_A(x_i) - \mu_B(x_i) \right| = 0
\Leftrightarrow \mu_A(x_i) = \mu_B(x_i) \Leftrightarrow A = B.

Therefore, D(A; B) = 0 if and only if \mu_A(x_i) = \mu_B(x_i) for all i.
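The proposed measure (11), together with the boundary properties (P1) and (P2), can be checked numerically. A minimal sketch (the function name is mine):

```python
import math

def fuzzy_distance(mu_a, mu_b):
    """Proposed fuzzy distance measure, Eq. (11)."""
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        s = math.sin(math.pi / 2 * abs(a - b))
        total += s / (1 + s)
    return 2 * total / len(mu_a)

# (P1): the maximum value 1 is attained for crisp, complementary sets.
assert abs(fuzzy_distance([1.0, 0.0], [0.0, 1.0]) - 1.0) < 1e-12
# (P2): the measure vanishes exactly when all membership grades coincide.
assert fuzzy_distance([0.2, 0.7], [0.2, 0.7]) == 0.0
```

Each summand s/(1+s) with s \in [0, 1] lies in [0, 1/2], so after multiplying by 2/n the measure indeed stays in [0, 1].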
(P3) Since

D(A; B) = \frac{2}{n} \sum_{i=1}^{n} \frac{\sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}}{1 + \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\}} = \frac{2}{n} \sum_{i=1}^{n} \frac{\sin \left\{ \frac{\pi}{2} \left| \mu_B(x_i) - \mu_A(x_i) \right| \right\}}{1 + \sin \left\{ \frac{\pi}{2} \left| \mu_B(x_i) - \mu_A(x_i) \right| \right\}} = D(B; A),

the measure is symmetric.

(P4) To prove the triangle inequality D(A; C) \leq D(A; B) + D(B; C), we first establish the inequality \sin(A + B) \leq \sin(A) + \sin(B) for acute angles A and B. Equivalently, we show that

\sin(A) + \sin(B) - \sin(A + B) \geq 0
\Rightarrow \sin A + \sin B - \sin A \cos B - \cos A \sin B \geq 0
\Rightarrow \sin A (1 - \cos B) + \sin B (1 - \cos A) \geq 0.

Since A and B are acute angles, \sin A, (1 - \cos B), \sin B and (1 - \cos A) are all non-negative, and hence the inequality holds.

Now let A = \{ \langle x, \mu_A(x) \rangle \mid x \in X \}, B = \{ \langle x, \mu_B(x) \rangle \mid x \in X \} and C = \{ \langle x, \mu_C(x) \rangle \mid x \in X \} be three fuzzy sets. By the triangle inequality for real numbers,

\left| \mu_A(x_i) - \mu_C(x_i) \right| \leq \left| \mu_A(x_i) - \mu_B(x_i) \right| + \left| \mu_B(x_i) - \mu_C(x_i) \right|
\Rightarrow \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_C(x_i) \right| \right\} \leq \sin \left\{ \frac{\pi}{2} \left[ \left| \mu_A(x_i) - \mu_B(x_i) \right| + \left| \mu_B(x_i) - \mu_C(x_i) \right| \right] \right\}
\Rightarrow \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_C(x_i) \right| \right\} \leq \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\} + \sin \left\{ \frac{\pi}{2} \left| \mu_B(x_i) - \mu_C(x_i) \right| \right\}.

Also, writing s_{AC} = \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_C(x_i) \right| \right\}, s_{AB} = \sin \left\{ \frac{\pi}{2} \left| \mu_A(x_i) - \mu_B(x_i) \right| \right\} and s_{BC} = \sin \left\{ \frac{\pi}{2} \left| \mu_B(x_i) - \mu_C(x_i) \right| \right\}, we obtain

1 + s_{AC} \leq 1 + s_{AB} + s_{BC}
\Rightarrow \frac{1}{1 + s_{AC}} \geq \frac{1}{1 + s_{AB} + s_{BC}}
\Rightarrow 1 - \frac{1}{1 + s_{AC}} \leq 1 - \frac{1}{1 + s_{AB} + s_{BC}}
\Rightarrow \frac{s_{AC}}{1 + s_{AC}} \leq \frac{s_{AB} + s_{BC}}{1 + s_{AB} + s_{BC}} = \frac{s_{AB}}{1 + s_{AB} + s_{BC}} + \frac{s_{BC}}{1 + s_{AB} + s_{BC}} \leq \frac{s_{AB}}{1 + s_{AB}} + \frac{s_{BC}}{1 + s_{BC}},

where the last step holds because dropping a non-negative term from each denominator can only increase the fractions. Summing over i and multiplying by 2/n gives

\frac{2}{n} \sum_{i=1}^{n} \frac{s_{AC}}{1 + s_{AC}} \leq \frac{2}{n} \sum_{i=1}^{n} \frac{s_{AB}}{1 + s_{AB}} + \frac{2}{n} \sum_{i=1}^{n} \frac{s_{BC}}{1 + s_{BC}}
\Rightarrow D(A; C) \leq D(A; B) + D(B; C).

Hence the proposed distance measure satisfies all the required properties.

4 Application of the Proposed Fuzzy Measure to Medical Diagnosis

In a classical problem of medical diagnosis, assume that a doctor needs to diagnose a set of patients P = \{Alex, Chris, James, Mike, Shawn\} against a set of diagnoses D = \{Viral fever (VF), Malaria (M), Typhoid (T), Stomach problem (SP), Chest problem (CP)\} on the basis of a set of symptoms S = \{Temperature (Temp.), Headache (H), Stomach pain (S. Pain), Cough (C), Chest pain (CP)\}.
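As a numerical sanity check on Theorem 3.1 before applying the measure, the triangle inequality (P4) can be tested on random triples of fuzzy sets. A minimal sketch (the function name, seed and set sizes are arbitrary choices of mine):

```python
import math
import random

def fuzzy_distance(mu_a, mu_b):
    # Proposed fuzzy distance measure, Eq. (11).
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        s = math.sin(math.pi / 2 * abs(a - b))
        total += s / (1 + s)
    return 2 * total / len(mu_a)

random.seed(42)
for _ in range(10_000):
    A, B, C = ([random.random() for _ in range(5)] for _ in range(3))
    # Triangle inequality (P4), with a tiny slack for floating-point rounding.
    assert fuzzy_distance(A, C) <= fuzzy_distance(A, B) + fuzzy_distance(B, C) + 1e-12
```

No random counterexample appears, as expected from the proof above.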
The following tables (Table 1 and Table 2) provide the data for the proposed computational application.

Table 1: Fuzzy membership values for diseases and their symptoms.

        Temp.   H     S. Pain  C     CP
VF      0.4     0.4   0.3      0.1   0.6
M       0.4     0.2   0.5      0.6   0.7
T       0.3     0.6   0.1      0.5   0.7
SP      0.2     0.3   0.7      0.4   0.3
CP      0.4     0.6   0.5      0.4   0.6

Table 2: Fuzzy membership values for patients and related symptoms.

        Temp.   H     S. Pain  C     CP
Alex    0.8     0.1   0.7      0.6   0.4
Chris   0.3     0.6   0.8      0.4   0.7
James   0.7     0.7   0.6      0.6   0.4
Mike    0.5     0.4   0.5      0.4   0.6
Shawn   0.5     0.6   0.3      0.2   0.9

Table 3: Values of the proposed fuzzy distance measure between the patients and the likely diseases.

        Alex      Chris     James     Mike      Shawn
VF      0.680965  0.493065  0.634603  0.273332  0.327418
M       0.421368  0.421368  0.469442  0.296955  0.538941
T       0.689085  0.24253   0.546709  0.444991  0.40803
SP      0.42172   0.381062  0.516246  0.398187  0.670988
CP      0.596858  0.233026  0.42181   0.148478  0.367724

In view of Table 3, it is concluded that Alex is suffering from Malaria; Chris, James and Mike are suffering from a Chest problem; and Shawn is suffering from Viral fever. This is because, for each patient, the smallest value of the distance measure indicates the highest likelihood of having the corresponding disease.

5 Comparative Study

Jain and Kumar [31] proposed the intuitionistic fuzzy trigonometric entropy

E_{IF}(A) = \frac{1}{n} \sum_{i=1}^{n} \cos \left[ \frac{\pi}{2} \left| \mu_A^2(x_i) - \nu_A^2(x_i) \right| \right].

The fuzzy version of this measure, with the non-membership grade replaced by the membership grade of the second set, is

E(A, B) = \frac{1}{n} \sum_{i=1}^{n} \cos \left[ \frac{\pi}{2} \left| \mu_A^2(x_i) - \mu_B^2(x_i) \right| \right].

For this measure, the largest value in each column of Table 4 is the decision value. Wei et al. [32] proposed the generalized fuzzy entropy

H(A) = \frac{1}{n} \sum_{i=1}^{n} \left[ 1 - \cos \left( \frac{\pi \left( \mu_A(x_i) - \nu_A(x_i) \right)}{4} \right) \right] \times \frac{1}{\sqrt{2} - 1}.

For this measure, the smallest value in each column of Table 5 is the decision value.
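The whole diagnosis procedure is easy to reproduce. The sketch below encodes Tables 1 and 2, evaluates the proposed measure (11) for every patient–disease pair, and selects, for each patient, the disease at minimum distance; the computed values agree with Table 3 up to rounding.

```python
import math

def fuzzy_distance(mu_a, mu_b):
    # Proposed fuzzy distance measure, Eq. (11).
    total = 0.0
    for a, b in zip(mu_a, mu_b):
        s = math.sin(math.pi / 2 * abs(a - b))
        total += s / (1 + s)
    return 2 * total / len(mu_a)

# Symptom order: Temp., H, S. Pain, C, CP (Tables 1 and 2).
diseases = {
    "VF": [0.4, 0.4, 0.3, 0.1, 0.6],
    "M":  [0.4, 0.2, 0.5, 0.6, 0.7],
    "T":  [0.3, 0.6, 0.1, 0.5, 0.7],
    "SP": [0.2, 0.3, 0.7, 0.4, 0.3],
    "CP": [0.4, 0.6, 0.5, 0.4, 0.6],
}
patients = {
    "Alex":  [0.8, 0.1, 0.7, 0.6, 0.4],
    "Chris": [0.3, 0.6, 0.8, 0.4, 0.7],
    "James": [0.7, 0.7, 0.6, 0.6, 0.4],
    "Mike":  [0.5, 0.4, 0.5, 0.4, 0.6],
    "Shawn": [0.5, 0.6, 0.3, 0.2, 0.9],
}

# For each patient, the diagnosis is the disease at minimum distance (Table 3).
diagnosis = {name: min(diseases, key=lambda d: fuzzy_distance(grades, diseases[d]))
             for name, grades in patients.items()}
print(diagnosis)
# → {'Alex': 'M', 'Chris': 'CP', 'James': 'CP', 'Mike': 'CP', 'Shawn': 'VF'}
```

Because a smaller distance means greater similarity between symptom profiles, taking the argmin over diseases implements exactly the decision rule stated above.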
6 Conclusions

In this paper, we have proposed a distance measure for fuzzy sets. Its validity is established by proving the required properties, and some of its essential features are studied. The measure has been observed to be more flexible than previously derived measures. Its application to medical diagnosis is also studied to check its practical usefulness. Moreover, from Tables 4 and 5 it is concluded that the results obtained with the proposed measure agree with those of the existing entropy measures, which supports the claim that the proposed measure is valid and has applications across disciplines.

References

[1] C.E. Shannon, "A Mathematical Theory of Communication", Bell Syst. Tech. Journal, vol. 27, pp. 379-423, 623-656, 1948. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
[2] A. Renyi, "On measures of entropy and information", Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, vol. 1, pp. 547-561, 1961.
[3] S.C. Arimoto, "Information-Theoretic Considerations on Estimation Problems", Information and Control, vol. 9, pp. 181-190, 1971. https://doi.org/10.1016/S0019-9958(71)90065-9
[4] B.D. Sharma and I.J. Taneja, "Entropies of Type α, β and Other Generalized Measures of Information Theory", Mathematika, vol. 22, pp. 202-215, 1975. https://doi.org/10.1007/BF01899728
[5] A. De Luca and S. Termini, "A Definition of a Non-Probabilistic Entropy in the Setting of Fuzzy Sets Theory", Information and Control, vol. 20, pp. 301-312, 1972. https://doi.org/10.1016/S0019-9958(72)90199-4
[6] A. Kaufmann, "Fuzzy Subsets: Fundamental Theoretical Elements", Academic Press, New York, vol. 3, 1980. https://doi.org/10.1109/TSMC.1977.4309751
[7] S. Peerzada, S.M. Sofi and R. Nisa, "A New Generalized Fuzzy Information Measure and its Properties", International Journal of Advance Research in Science and Engineering, vol. 6, no. 12, pp.
1647-1654, 2017.
[8] L.A. Zadeh, "Fuzzy Sets", Information and Control, vol. 8, pp. 338-353, 1965. https://doi.org/10.1016/S0019-9958(65)90241-X
[9] D. Bhandari and N.R. Pal, "Some New Information Measures for Fuzzy Sets", Information Sciences, vol. 67, pp. 204-228, 1993. https://doi.org/10.1016/0020-0255(93)90073-U
[10] N.R. Pal and S.K. Pal, "Object Background Segmentation Using New Definition of Entropy", Proc. Inst. Elec. Eng., vol. 136, pp. 284-295, 1989. https://doi.org/10.1049/ip-e.1989.0039
[11] J.N. Kapur, "Measures of Fuzzy Information", Mathematical Science Trust Society, vol. 2, no. 2, pp. 73-76, 1997.
[12] R.R. Yager, "On the Measure of Fuzziness and Negation, Part I: Membership in the Unit Interval", International Journal of General Systems, vol. 5, no. 4, pp. 221-229, 1979. https://doi.org/10.1080/03081077908547452
[13] B. Kosko, "Fuzziness vs. probability", International Journal of General Systems, vol. 17, pp. 211-240, 1990. https://doi.org/10.1080/03081079008935108
[14] J.L. Fan, Y.L. Ma and W.X. Xie, "On some properties of distance measure", Fuzzy Sets and Systems, vol. 117, pp. 355-361, 2001. https://doi.org/10.1016/S0165-0114(98)00387-X

Table 4: Values of the fuzzy entropy measure of Jain and Kumar (2020).

        P1        P2        P3        P4        P5
D1      0.862945  0.909299  0.890581  0.986209  0.956661
D2      0.905346  0.927966  0.896657  0.980538  0.917587
D3      0.817137  0.907895  0.899024  0.963769  0.940134
D4      0.905073  0.938003  0.898832  0.956289  0.815088
D5      0.882851  0.958308  0.947068  0.988228  0.940318

Table 5: Values of the fuzzy entropy measure of Wei et al. (2012).

        P1        P2        P3        P4        P5
D1      0.103199  0.058957  0.082638  0.020753  0.02224
D2      0.044361  0.044368  0.064859  0.014851  0.06058
D3      0.140823  0.072568  0.076627  0.038459  0.031143
D4      0.065938  0.039909  0.069238  0.03408   0.108776
D5      0.078141  0.016301  0.028178  0.007425  0.026691

[15] D. Dubois and H. Prade, "On distances between fuzzy points and their use for plausible reasoning", Proc. IEEE Int. Conf.
on Cybernetics and Society, Bombay, New Delhi, pp. 300-303, 1993.
[16] A. Rosenfeld, "Distance between fuzzy sets", Pattern Recognition Letters, vol. 3, pp. 229-231, 1985. https://doi.org/10.1016/0167-8655(85)90002-9
[17] S. Kullback and R.A. Leibler, "On Information and Sufficiency", Ann. Math. Stat., vol. 22, pp. 79-86, 1951. https://doi.org/10.1214/aoms/1177729694
[18] S. Montes, I. Couso, P. Gil and C. Bertoluzza, "Divergence measure between fuzzy sets", Int. J. of Approximate Reasoning, vol. 30, pp. 91-105, 2002. https://doi.org/10.1016/S0888-613X(02)00063-4
[19] M. Luo and R. Zhao, "A distance measure between intuitionistic fuzzy sets and its application in medical diagnosis", Artificial Intelligence in Medicine, vol. 89, pp. 34-39, 2018. https://doi.org/10.1016/j.artmed.2018.05.002
[20] P. Gupta and P. Tiwari, "Measures of cosine similarity intended for fuzzy sets, intuitionistic and interval-valued intuitionistic fuzzy sets with application in medical diagnoses", 2016 3rd International Conference on Computing for Sustainable Global Development (INDIACom), pp. 1846-1849, 2016.
[21] P. Dutta and S. Goala, "Fuzzy decision making in medical diagnosis using an advanced distance measure on intuitionistic fuzzy sets", The Open Cybernetics & Systemics Journal, vol. 12, no. 1, pp. 136-149, 2018. https://doi.org/10.2174/1874110X01812010136
[22] X. Shang and G. Jiang, "A note on fuzzy information measures", Pattern Recognition Letters, vol. 18, no. 5, pp. 425-432, 1997. https://doi.org/10.1016/S0167-8655(97)00028-7
[23] D.E.F. Kerridge, "Inaccuracy and inference", J. Royal Statistical Society B, vol. 23, no. 1, pp. 184-194, 1961. https://doi.org/10.1111/j.2517-6161.1961.tb00404.x
[24] R.K. Verma and B.D. Sharma, "A measure of inaccuracy between two fuzzy sets", Cybernetics and Information Technologies, vol. 11, no. 2, pp. 13-23, 2011.
[25] A. Ohlan, "A new generalized fuzzy divergence measure and applications", Fuzzy Inf. Eng., vol. 7, pp. 507-523, 2015.
https://doi.org/10.1016/j.fiae.2015.11.007
[26] I.J. Taneja, "Seven means, generalized triangular discrimination and generating divergence measures", Information, vol. 4, no. 2, pp. 198-239, 2013. https://doi.org/10.3390/info4020198
[27] H.D. Arora and A. Dhiman, "On some generalized information measure of fuzzy directed divergence and decision making", Int. J. Computing Sc. and Math., vol. 7, no. 3, pp. 3931-3940, 2016.
[28] O. Parkash and R. Kumar, "Optimal Decision-Making Method Using Interval Valued Intuitionistic Fuzzy Divergence Measure Based on the Weights of Alternatives", Int. J. Engg. Sc. Invention, vol. 7, no. 3, pp. 82-94, 2018.
[29] S. Kumari, P. Tiwari and P. Gupta, "Application of Weighted Fuzzy Exponential J-Divergence Measure in Angiography", Int. J. Applied Engg. Research, vol. 14, no. 13, pp. 2984-2988, 2019.
[30] P. Tiwari and P. Gupta, "Entropy, Distance and Similarity Measures under Interval Valued Intuitionistic Fuzzy Environment", Informatica, vol. 42, no. 4, pp. 617-628, 2018. https://doi.org/10.31449/inf.v42i4.1303
[31] S. Jain and V. Kumar, "Trigonometric Entropy on Intuitionistic Fuzzy Sets", Int. J. Adv. Sci. Tech., vol. 29, no. 3, pp. 12234-12243, 2020.
[32] C. Wei, Z. Gao and T. Guo, "An intuitionistic fuzzy entropy measure based on trigonometric function", Control and Decision, vol. 27, pp. 571-574, 2012.