Reconstructing Distances among Objects from Their Discriminability

Abstract

We describe a principled way of imposing a metric representing dissimilarities on any discrete set of stimuli (symbols, handwritings, consumer products, X-ray films, etc.), given the probabilities with which they are discriminated from each other by a perceiving system, such as an organism, person, group of experts, neuronal structure, technical device, or even an abstract computational algorithm. In this procedure one does not have to assume that discrimination probabilities are monotonically related to distances, or that the distances belong to a predefined class of metrics, such as Minkowski. Discrimination probabilities do not have to be symmetric, the probability of discriminating an object from itself need not be a constant, and discrimination probabilities are allowed to be 0’s and 1’s. The only requirement that has to be satisfied is Regular Minimality, a principle we consider the defining property of discrimination: for ordered stimulus pairs (a,b), b is least frequently discriminated from a if and only if a is least frequently discriminated from b. Regular Minimality generalizes one of the weak consequences of the assumption that discrimination probabilities are monotonically related to distances: the probability of discriminating a from a should be less than that of discriminating a from any other object. This special form of Regular Minimality also underlies such traditional analyses of discrimination probabilities as Multidimensional Scaling and Cluster Analysis.
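
The Regular Minimality condition can be checked directly on any finite matrix of discrimination probabilities. The sketch below is illustrative and not part of the article: it assumes a square NumPy array psi in which psi[a, b] is the probability of discriminating stimulus b from stimulus a, and it treats the uniqueness of the row and column minima as part of the condition; the function name and the example matrix are hypothetical.

```python
import numpy as np

def satisfies_regular_minimality(psi):
    """Check Regular Minimality for a square matrix psi, where psi[a, b]
    is the probability that stimulus b (second position) is judged
    different from stimulus a (first position).

    Following the abstract's statement: b is least frequently
    discriminated from a (unique row minimum) if and only if a is least
    frequently discriminated from b (unique column minimum), and the two
    correspondences must agree.
    """
    psi = np.asarray(psi, dtype=float)
    n = psi.shape[0]
    row_argmin = psi.argmin(axis=1)   # for each a, the b minimizing psi[a, :]
    col_argmin = psi.argmin(axis=0)   # for each b, the a minimizing psi[:, b]

    # Minima must be unique in every row and every column.
    for a in range(n):
        if (psi[a] == psi[a, row_argmin[a]]).sum() > 1:
            return False
    for b in range(n):
        if (psi[:, b] == psi[col_argmin[b], b]).sum() > 1:
            return False

    # The row-wise and column-wise minimizers must be mutual:
    # row_argmin maps a -> b, and col_argmin must map that b back to a.
    return all(col_argmin[row_argmin[a]] == a for a in range(n))


# A 3x3 example in canonical form: each diagonal entry is the unique
# minimum of its row and its column, yet the matrix is asymmetric and
# the "self-discrimination" probabilities differ across stimuli.
psi_example = np.array([
    [0.10, 0.60, 0.80],
    [0.55, 0.20, 0.70],
    [0.90, 0.65, 0.05],
])
print(satisfies_regular_minimality(psi_example))  # True
```

The example matrix is asymmetric and its diagonal entries are not constant, both of which the procedure permits; Regular Minimality is the only requirement.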

Author information

Corresponding author

Correspondence to Ehtibar N. Dzhafarov.

Additional information

This research was supported by NSF grant SES 0318010 (E.D.), a Humboldt Research Award (E.D.), Humboldt Foundation grant DEU/1038348 (H.C. & E.D.), and DFG grant Co 94/5 (H.C.).

About this article

Cite this article

Dzhafarov, E.N., Colonius, H. Reconstructing Distances among Objects from Their Discriminability. Psychometrika 71, 365–386 (2006). https://doi.org/10.1007/s11336-003-1126-9
