In mathematics, probabilistic metric spaces are a generalization of metric spaces where the distance no longer takes values in the non-negative real numbers R ≥ 0, but in distribution functions.[1]
Let D+ be the set of all probability distribution functions F such that F(0) = 0; that is, F is a nondecreasing, left-continuous mapping from ℝ into [0, 1] such that sup F = 1.
Then, given a non-empty set S and a function F: S × S → D+, where we denote F(p, q) by Fp,q for every (p, q) ∈ S × S, the ordered pair (S, F) is said to be a probabilistic metric space if:

- For all u and v in S, u = v if and only if Fu,v(x) = 1 for all x > 0.
- For all u and v in S, Fu,v = Fv,u.
- For all u, v and w in S, Fu,v(x) = 1 and Fv,w(y) = 1 imply Fu,w(x + y) = 1 for all x, y > 0.
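As an illustrative sketch (the code and names below are mine, not from the article): every ordinary metric space embeds into a probabilistic metric space by letting Fp,q be the step distribution that jumps from 0 to 1 at d(p, q). The snippet checks the three axioms above on sample points.

```python
# Illustrative sketch: embed an ordinary metric space into a
# probabilistic metric space via F_{p,q}(x) = 1 if x > d(p, q), else 0.
import math

def euclid(p, q):
    """Ordinary Euclidean distance between 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def F(p, q):
    """Left-continuous distribution function F_{p,q}: jumps to 1 just past d(p, q)."""
    d = euclid(p, q)
    return lambda x: 1.0 if x > d else 0.0

u, v, w = (0.0, 0.0), (3.0, 4.0), (6.0, 8.0)

# Identity: F_{u,u}(x) = 1 for all x > 0.
assert all(F(u, u)(x) == 1.0 for x in (0.1, 1.0, 10.0))
# Symmetry: F_{u,v} = F_{v,u} pointwise.
assert all(F(u, v)(x) == F(v, u)(x) for x in (1.0, 5.0, 6.0))
# Triangle-type condition: F_{u,v}(x) = 1 and F_{v,w}(y) = 1
# imply F_{u,w}(x + y) = 1.
x, y = 5.5, 5.5  # just past d(u, v) = d(v, w) = 5
assert F(u, v)(x) == 1.0 and F(v, w)(y) == 1.0 and F(u, w)(x + y) == 1.0
print("all axioms hold on these samples")
```

The step-function embedding works because the ordinary triangle inequality d(u, w) ≤ d(u, v) + d(v, w) makes the third condition hold automatically.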
Probabilistic metric spaces were first introduced by Menger, who termed them statistical metrics.[3] Shortly afterwards, Wald criticized the generalized triangle inequality and proposed an alternative one.[4] However, both authors came to the conclusion that in some respects the Wald inequality was too stringent a requirement to impose on all probabilistic metric spaces; this view is partly reflected in the work of Schweizer and Sklar.[5] Later, probabilistic metric spaces were found to be well suited for use with fuzzy sets,[6] and were subsequently called fuzzy metric spaces.[7]
A probability metric D between two random variables X and Y may be defined, for example, as

    D(X, Y) = ∫∫ |x − y| F(x, y) dx dy,

where F(x, y) denotes the joint probability density function of the random variables X and Y, and both integrals are taken over ℝ. If X and Y are independent from each other, then the equation above reduces to

    D(X, Y) = ∫∫ |x − y| f(x) g(y) dx dy,

where f(x) and g(y) are the probability density functions of X and Y respectively.
One may easily show that such probability metrics do not satisfy the first metric axiom; they satisfy it if, and only if, both arguments X and Y are certain events described by Dirac delta density functions f(x) = δ(x − μX) and g(y) = δ(y − μY). In this case the probability metric simply reduces to the metric between the expected values μX, μY of the variables X and Y:

    D(X, Y) = |μX − μY|.
For all other random variables X, Y the probability metric does not satisfy the identity of indiscernibles condition required of the metric of a metric space, that is,

    D(X, X) > 0.
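The two cases above can be checked numerically. A hedged sketch (function names are mine): a Monte Carlo estimate of D(X, Y) = E|X − Y| for independent X and Y, showing that degenerate (Dirac-delta) variables reduce to |E[X] − E[Y]|, while identically distributed non-degenerate variables give a strictly positive "self-distance".

```python
# Monte Carlo estimate of D(X, Y) = E|X - Y| for independent X and Y.
import math
import random

random.seed(0)

def prob_metric_mc(sample_x, sample_y, n=200_000):
    """Average of |X - Y| over n independent draws."""
    return sum(abs(sample_x() - sample_y()) for _ in range(n)) / n

# Degenerate (Dirac-delta) variables: D reduces to |E[X] - E[Y]| = 3.
d_dirac = prob_metric_mc(lambda: 2.0, lambda: 5.0)
assert abs(d_dirac - 3.0) < 1e-12

# Identically distributed non-degenerate variables: D(X, X') > 0,
# so the identity of indiscernibles fails.
d_same = prob_metric_mc(lambda: random.gauss(0.0, 1.0),
                        lambda: random.gauss(0.0, 1.0))
assert d_same > 1.0  # true value is 2/sqrt(pi) ~ 1.128
print(f"D(X, X') ~ {d_same:.3f}")
```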

For example, if both probability distribution functions of the random variables X and Y are normal distributions (N) having the same standard deviation σ, integrating yields

    DNN(X, Y) = μxy + (2σ/√π) exp(−μxy²/(4σ²)) − μxy erfc(μxy/(2σ)),

where μxy = |μX − μY| and erfc(·) is the complementary error function.
In this case:

    lim (μxy → 0) DNN(X, Y) = 2σ/√π > 0.
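A sketch comparing this closed form against a direct Monte Carlo estimate of E|X − Y| (variable names mu_x, mu_y, sigma are mine, chosen for illustration):

```python
# Check the closed-form D_NN for same-sigma normals against Monte Carlo.
import math
import random

random.seed(1)

def d_nn(mu_x, mu_y, sigma):
    """Closed form for E|X - Y|, X ~ N(mu_x, sigma^2), Y ~ N(mu_y, sigma^2)."""
    mu = abs(mu_x - mu_y)
    return (mu
            + (2.0 * sigma / math.sqrt(math.pi)) * math.exp(-mu**2 / (4.0 * sigma**2))
            - mu * math.erfc(mu / (2.0 * sigma)))

mu_x, mu_y, sigma = 1.0, 2.5, 0.8
closed = d_nn(mu_x, mu_y, sigma)
mc = sum(abs(random.gauss(mu_x, sigma) - random.gauss(mu_y, sigma))
         for _ in range(200_000)) / 200_000
assert abs(closed - mc) < 0.02  # MC error is ~0.002 here

# As mu_xy -> 0 the metric tends to 2*sigma/sqrt(pi) > 0, not to 0.
assert abs(d_nn(0.0, 0.0, sigma) - 2.0 * sigma / math.sqrt(math.pi)) < 1e-12
```

Note that the closed form always exceeds |μX − μY|, with equality only in the limit σ → 0, consistent with the Dirac-delta case above.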
The probability metric of random variables may be extended into a metric D(X, Y) of random vectors X, Y by substituting |x − y| with any metric operator d(x, y):

    D(X, Y) = ∫∫ d(x, y) F(x, y) dx dy,

where F(x, y) is the joint probability density function of the random vectors X and Y. For example, substituting d(x, y) with the Euclidean metric and assuming the vectors X and Y to be mutually independent yields

    DE(X, Y) = ∫∫ √(Σi (xi − yi)²) f(x) g(y) dx dy.
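The vector extension can be sketched the same way (an illustrative Monte Carlo estimate with my own function names, not a definitive implementation): for independent Gaussian random vectors, the Euclidean version DE(X, Y) = E‖X − Y‖ again stays strictly positive even when X and Y are identically distributed.

```python
# Monte Carlo estimate of D_E(X, Y) = E||X - Y||_2 for independent
# 2-D Gaussian random vectors with independent components.
import math
import random

random.seed(2)

def sample_vec(mu, sigma):
    """Gaussian vector with mean vector mu and per-component std sigma."""
    return [random.gauss(m, sigma) for m in mu]

def d_euclid(mu_x, mu_y, sigma, n=100_000):
    """Average Euclidean distance between independent draws of X and Y."""
    total = 0.0
    for _ in range(n):
        x = sample_vec(mu_x, sigma)
        y = sample_vec(mu_y, sigma)
        total += math.dist(x, y)
    return total / n

# Identically distributed vectors still have positive "distance":
# X - Y ~ N(0, 2 I), so E||X - Y|| = sqrt(pi) ~ 1.77 for sigma = 1.
d = d_euclid([0.0, 0.0], [0.0, 0.0], 1.0)
assert d > 0.5
print(f"D_E(X, X') ~ {d:.3f}")
```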