A new weighted metric: the relative metric I

Journal of Mathematical Analysis and Applications, 2005

In the first part of this investigation we generalized a weighted distance function of R.-C. Li and found necessary and sufficient conditions for it to be a metric. In this paper we establish some properties of this so-called M-relative metric; in particular, we derive isometry and quasiconvexity results. We also illustrate connections between our approach and generalizations of the hyperbolic metric.
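As a small illustration of the kind of weighted distance discussed here, the sketch below assumes the M-relative form rho_M(x, y) = |x − y| / M(|x|, |y|) on the punctured real line, with M = max; the function names are ours, and the numeric check of the triangle inequality on a few sample points is a spot-check, not a proof.

```python
def m_relative_dist(x: float, y: float, M=max) -> float:
    """Sketch of an M-relative distance on R \\ {0}:
    rho_M(x, y) = |x - y| / M(|x|, |y|).  (Illustrative form;
    the paper's exact definition may differ.)"""
    return abs(x - y) / M(abs(x), abs(y))

# Numeric spot-check of the triangle inequality for M = max
# on a handful of nonzero sample points (not a proof).
pts = [0.5, -1.0, 2.0, 3.5, -0.25]
ok = all(
    m_relative_dist(a, c) <= m_relative_dist(a, b) + m_relative_dist(b, c) + 1e-12
    for a in pts for b in pts for c in pts
)
print(ok)  # → True
```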

Inequalities of relative weighted metrics

2002

In this paper we present inequalities between two generalizations of the hyperbolic metric and the j_G metric. We also prove inequalities between generalized versions of the j_G metric and Seittenranta's metric.
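For readers unfamiliar with the j_G metric mentioned above, the sketch below computes it for the upper half-plane under one common normalization, j_G(x, y) = log(1 + |x − y| / min(d(x, ∂G), d(y, ∂G))); other normalizations (e.g. with a factor 1/2) appear in the literature, and the function name is ours.

```python
import math

def j_metric_halfplane(x, y):
    """Sketch of the j_G metric (one common normalization) for the
    upper half-plane G = {(a, b) : b > 0}, where the distance from a
    point to the boundary is simply its second coordinate:
    j_G(x, y) = log(1 + |x - y| / min(d(x, bdry G), d(y, bdry G)))."""
    euclid = math.hypot(x[0] - y[0], x[1] - y[1])
    return math.log(1 + euclid / min(x[1], y[1]))

# Two points on the same vertical line, one twice as high as the other.
print(round(j_metric_halfplane((0.0, 1.0), (0.0, 2.0)), 4))  # → 0.6931
```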

On the similarity metric and the distance metric

Theoretical Computer Science, 2009

Similarity and dissimilarity measures are widely used in many research areas and applications. When a dissimilarity measure is used, it is normally required to be a distance metric; when a similarity measure is used, however, there is no comparable formal requirement. This article makes three contributions. First, we give a formal definition of a similarity metric. Second, we show the relationship between similarity metrics and distance metrics. Third, we present general methods for normalizing a given similarity metric or distance metric.
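One frequently used similarity-to-dissimilarity transformation is d = 1 − s; whether it is among this paper's "general solutions" is not stated here, and the resulting d is not a metric in general. A minimal sketch using cosine similarity (function names are ours):

```python
def cosine_similarity(u, v):
    """Cosine similarity of two non-zero vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def cosine_dissimilarity(u, v):
    """A common normalization: d = 1 - s, mapping similarity in [-1, 1]
    to a dissimilarity in [0, 2] (not a metric in general)."""
    return 1.0 - cosine_similarity(u, v)

print(round(cosine_dissimilarity([1, 0], [0, 1]), 4))  # orthogonal → 1.0
```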

A metric on the space of weighted graphs

2009

In this paper we propose a metric, similar to graph edit distance, that measures the distance between two (possibly infinite) weighted graphs of finite norm (we define the norm of a graph as the sum of the absolute values of its edge weights). The main result is the completeness of the resulting space. Some other analytical properties of this space are also investigated. The introduced metric could have applications in pattern recognition and face recognition methods.
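A minimal sketch of the stated norm and a natural edge-wise distance built from it, assuming graphs are represented as dicts from edges to weights (the representation and function names are our assumptions, not the paper's):

```python
def graph_norm(g: dict) -> float:
    """Norm of a weighted graph, per the abstract: the sum of the
    absolute values of its edge weights.  Edges are dict keys,
    weights the values (representation is our assumption)."""
    return sum(abs(w) for w in g.values())

def graph_dist(g1: dict, g2: dict) -> float:
    """Sketch of an edit-distance-like metric: the norm of the
    edge-wise difference, with absent edges treated as weight 0."""
    edges = set(g1) | set(g2)
    return sum(abs(g1.get(e, 0.0) - g2.get(e, 0.0)) for e in edges)

g1 = {("a", "b"): 2.0, ("b", "c"): -1.0}
g2 = {("a", "b"): 0.5, ("c", "d"): 1.0}
print(graph_norm(g1), graph_dist(g1, g2))  # → 3.0 3.5
```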

Analysis on comparison of distances derived by one-norm and two-norm with weight functions

Applied Mathematics and Computation, 2013

Julian, Hung and Lin recently published a paper in Pattern Recognition Letters 33, 1219-1223, discussing similarity measures of intuitionistic fuzzy sets. After careful analysis, however, we have found that the theorem in their paper is incomplete. In this paper, we first point out the incomplete theorem. Second, we provide a patch to prove that the distance with the weight function derived from the one-norm is less than that derived from the two-norm. Third, we show that their original problem is only a corollary of Jensen's inequality (1906). Our findings will help researchers develop more reasonable similarity measures and apply them to pattern recognition in an intuitionistic fuzzy set environment.
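The one-norm vs. two-norm claim is exactly the Jensen (power-mean) inequality: with weights summing to 1, the weighted 1-norm distance never exceeds the weighted 2-norm distance. A numeric illustration under that assumption (function names are ours; the paper's exact weight functions may differ):

```python
import math, random

def weighted_one_norm(w, x, y):
    """Weighted 1-norm distance: sum_i w_i * |x_i - y_i|."""
    return sum(wi * abs(a - b) for wi, a, b in zip(w, x, y))

def weighted_two_norm(w, x, y):
    """Weighted 2-norm distance: sqrt(sum_i w_i * (x_i - y_i)^2)."""
    return math.sqrt(sum(wi * (a - b) ** 2 for wi, a, b in zip(w, x, y)))

random.seed(0)
ok = True
for _ in range(1000):
    raw = [random.random() for _ in range(5)]
    s = sum(raw)
    w = [r / s for r in raw]  # weights normalized to sum to 1
    x = [random.uniform(-1, 1) for _ in range(5)]
    y = [random.uniform(-1, 1) for _ in range(5)]
    # Jensen's inequality: the 1-norm value is bounded by the 2-norm value.
    ok = ok and weighted_one_norm(w, x, y) <= weighted_two_norm(w, x, y) + 1e-12
print(ok)  # → True
```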

A common generalization of the postman, radial, and river metrics

2012

ABSTRACT. By using a metric d on a set X, a function ϕ of X to itself, a metric ρ on the range of ϕ, and a suitable relation Γ of X^2 to X, we construct a metric d_ρϕΓ on X. This compound metric includes the postman, radial, and river metrics as very particular cases. Our construction closely follows a former one of M. Borkowski, D. Bugajewski, and H. Przybycień. Moreover, it may also be compared to that of A. G. Aksoy and B. Maurizi. However, instead of a metric projection and a collinearity relation we use the above-mentioned ϕ and Γ.

KEY WORDS. Generalized metrics and collinearity relations; postman, radial, and river metrics.

From the paper's preliminaries: identifying X = R^2 with the complex numbers, |z| = (|z_1|^2 + |z_2|^2)^(1/2) is a norm on X, and thus d(x, y) = |x − y| is a metric on X, called the Euclidean metric. More generally, for any x, y ∈ X and p ∈ [1, ∞], one may define d_p(x, y) = |x − y|_p, where |z|_p = (|z_1|^p + |z_2|^p)^(1/p) if p < ∞, and |z|_∞ = max(|z_1|, |z_2|).
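Of the three special cases named in the title, the river metric is perhaps the easiest to make concrete. The sketch below implements its classical form on R^2 (the function name is ours):

```python
def river_metric(p, q):
    """Sketch of the classical river metric on R^2: travel is allowed
    only along vertical lines and along the 'river' (the x-axis).
    Points on the same vertical line are joined directly; otherwise
    the path goes down to the river, along it, and back up."""
    (x1, y1), (x2, y2) = p, q
    if x1 == x2:
        return abs(y1 - y2)
    return abs(y1) + abs(x1 - x2) + abs(y2)

# Same vertical line vs. a detour through the river.
print(river_metric((0.0, 2.0), (0.0, 5.0)),
      river_metric((0.0, 2.0), (3.0, 1.0)))  # → 3.0 6.0
```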

Metric and Euclidean properties of dissimilarity coefficients

Journal of Classification, 1986

We assemble here properties of certain dissimilarity coefficients and are especially concerned with their metric and Euclidean status. No attempt is made to be exhaustive as far as coefficients are concerned, but certain mathematical results that we have found useful are presented and should help establish similar properties for other coefficients. The response to different types of data is investigated, leading to guidance on the choice of an appropriate coefficient.

On Some Metric Inequalities and Applications

Journal of Function Spaces

We derive a new inequality in metric spaces and provide its geometric interpretation. Some applications of our result are given, including metric inequalities in Lebesgue spaces, matrix inequalities, multiplicative metric inequalities, and partial metric inequalities. Our main result is a generalization of a result obtained by Dragomir and Gosa.

From Similarity to Distance: Axiom Set, Monotonic Transformations and Metric Determinacy

Journal of Siberian Federal University. Mathematics & Physics, 2018

How can a similarity metric be normalized to yield a metric space suitable for clustering? A new system of axioms describes known generalizations of distance metrics and similarity metrics, including the Pearson correlation coefficient and the cosine metric. Equivalent definitions of order-preserving transformations of metrics (both monotonic and pivot-monotonic) are given in various terms. The metric definiteness of convex metric subspaces of R^n and Z under pivot-monotonic transformations is proved. Faster formulas for the monotonic normalization of metrics are discussed.
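A standard example of a monotonic normalization of a metric is t → t / (1 + t), which maps any metric to a topologically equivalent metric bounded by 1; this is a well-known fact, and whether it coincides with the paper's "faster formulas" is not claimed here. A sketch with a numeric spot-check:

```python
def normalize(d: float) -> float:
    """Monotonic normalization t -> t / (1 + t): maps any metric to a
    bounded, topologically equivalent metric (a standard fact)."""
    return d / (1.0 + d)

# Spot-check the triangle inequality after normalization, using the
# absolute-value metric on a few sample reals (not a proof).
pts = [-3.0, -0.5, 0.0, 1.25, 4.0]
dist = lambda a, b: normalize(abs(a - b))
ok = all(dist(a, c) <= dist(a, b) + dist(b, c) + 1e-12
         for a in pts for b in pts for c in pts)
print(ok)  # → True
```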