Abstract
The development of personal technologies has recently shifted from devices that seek to capture user attention to those that aim to improve user well-being. Digital wellness technologies use the same attractive qualities as other persuasive apps to motivate users towards behaviors that are personally and socially valuable, such as exercise, wealth management, and meaningful communication. While these aims are certainly an improvement over the market-driven motivations of earlier technologies, they retain their predecessors’ focus on influencing user behavior as a primary metric of success. Digital wellness technologies are still persuasive technologies, and they do not evade concerns over whether their influence on users is ethically justified. In this paper, we describe several ethical frameworks with which to assess the justification of digital wellness technologies’ influence on users. We propose that while some technologies help users to complete tasks and satisfy immediate preferences, other technologies encourage users to reflect on the values underlying their habits and teach them to evaluate their lives’ competing demands. While the former approach to digital wellness technology is not unethical, we propose that the latter approach is more likely to lead to skillful user engagement with technology.
Notes
Persuasive technologies may be conceptually distinguished from technologies aimed at behavior change (Smids 2018). In this article, we use the two terms interchangeably in order to address both types of technological influence.
While maternalism carries gendered connotations of mothering as opposed to fathering, we do not intend to import essentialist assumptions about gender into our analysis. Rather, just as “paternalism” as a concept has become decoupled from gender in philosophical analysis, so we aim to take the same approach with maternalism.
References
Andras, P., Esterle, L., Guckert, M., Han, T. A., Lewis, P. R., Milanovic, K., et al. (2018). Trusting intelligent machines: deepening trust within socio-technical systems. IEEE Technology and Society Magazine, 37, 76–83.
Begon, J. (2016). Paternalism. Analysis, 76(3), 355–373.
Burr, C., Cristianini, N., & Ladyman, J. (2018). An analysis of the interaction between intelligent software agents and human users. Minds and Machines. https://doi.org/10.1007/s11023-018-9479-0.
Byrnes, N. (2015). Technology and persuasion. MIT Technology Review, March 23, 2015. Available at: https://www.technologyreview.com/s/535826/technology-and-persuasion/.
Christman, J. (2014). Relational autonomy and the social dynamics of paternalism. Ethical Theory and Moral Practice, 17, 369–382.
Conly, S. (2013). Against autonomy: Justifying coercive paternalism. Cambridge: Cambridge University Press.
Draper, N. A., & Turow, J. (2019). The corporate cultivation of digital resignation. New Media & Society, 51, 1–16.
Dworkin, G. (1972). Paternalism. Monist, 56, 64–84.
Dworkin, G. (2013). Defining paternalism. In C. Coons & M. Weber (Eds.), Paternalism: Theory and practice. Cambridge: Cambridge University Press.
Fogg, B. J. (2009). Creating persuasive technologies: an eight-step design process. Persuasive ’09, April 26–29, Claremont, California, USA.
Groll, D. (2012). Paternalism, respect, and the will. Ethics, 122, 692–720.
Jennings, B., Wertz, F., & Morrissey, M. B. (2016). Nudging for health and the predicament of agency: the relational ecology of autonomy and care. Journal of Theoretical and Philosophical Psychology, 36(2), 81–99.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Lanzing, M. (2018). “Strongly Recommended” revisiting decisional privacy to judge hypernudging in self-tracking technologies. Philosophy & Technology. https://doi.org/10.1007/s13347-018-0316-4.
Larson, J. (2014). The invisible, manipulative power of persuasive technology. Pacific Standard, May 14, 2014. Available at: https://psmag.com/the-invisible-manipulative-power-of-persuasive-technology-df61a9883cc7#.tg29gpms4.
Mackenzie, C., & Stoljar, N. (Eds.). (2000). Relational autonomy: feminist perspectives on autonomy, agency, and the social self. New York: Oxford University Press.
Meyers, D. T. (1989). Self, society, and personal choice. New York: Columbia University Press.
Mitchell, G. (2005). Libertarian paternalism is an oxymoron. Northwestern University Law Review, 99(3).
Morozov, E. (2014). To save everything, click here: the folly of technological solutionism. New York: Public Affairs.
Nagel, S. K., Hrincu, V., & Reiner, P. (2016). Algorithm anxiety: do decision-making algorithms pose a threat to autonomy? IEEE Ethics, 2016 May 13-14, Vancouver, Canada.
Owens, J., & Cribb, A. (2017). “My Fitbit Thinks I Can Do Better!” Do health promoting wearable technologies support personal autonomy? Philosophy & Technology. https://doi.org/10.1007/s13347-017-0266-2.
Shiffrin, S. (2000). Paternalism, unconscionability doctrine, and accommodation. Philosophy & Public Affairs, 29(3), 205–250.
Smids, J. (2018). Persuasive technology, allocation of control, and mobility: an ethical analysis. Eindhoven: Technische Universiteit Eindhoven.
Spahn, A. (2012). And lead us (not) into persuasion…? Persuasive technology and the ethics of communication. Science and Engineering Ethics, 18, 633–650.
Specker Sullivan, L., & Niker, F. (2018). Relational autonomy, paternalism, and maternalism. Ethical Theory and Moral Practice. https://doi.org/10.1007/s10677-018-9900-z.
Sunstein, C. R., & Thaler, R. (2003). Libertarian paternalism is not an oxymoron. The University of Chicago Law Review, 70(4), 1159–1202.
Thaler, R. H., & Sunstein, C. R. (2009). Nudge. Penguin.
Thaler, R. H., Sunstein, C. R., & Balz, J. P. (2010). Choice architecture. Available at SSRN: http://ssrn.com/abstract=1583509 or https://doi.org/10.2139/ssrn.1583509.
Tripathi, P. (2018). People trust Apple more than Google and Facebook. DazeInfo, April 12, 2018. Accessed online at: https://dazeinfo.com/2018/04/12/apple-google-microsoft-facebook-most-trusted-company/.
Verbeek, P.-P. (2009). Ambient intelligence and persuasive technology: the blurring boundaries between human and technology. Neuroethics, 3, 231–242.
Wagner, N.-F. (2018). Doing away with the agential bias: agency and patiency in health monitoring applications. Philosophy & Technology. https://doi.org/10.1007/s13347-018-0313-7.
Wolf, S. (1997). Happiness and meaning: two aspects of the good life. Social Philosophy and Policy, 14(1), 207.
Zuckerman, E. (2018). Facebook only cares about Facebook. The Atlantic, January 27, 2018. Accessed online: https://www.theatlantic.com/technology/archive/2018/01/facebook-doesnt-care/551684/.
Cite this article
Specker Sullivan, L., Reiner, P. Digital Wellness and Persuasive Technologies. Philos. Technol. 34, 413–424 (2021). https://doi.org/10.1007/s13347-019-00376-5