Core Competency: Ethics

Ethics: A deep understanding of what it means to be ethically and socially responsible, both as individuals and as leaders; to use cross-cultural skills to understand the potential for different interpretations of what it means to be ethical in the current and future climate.


Context

Course Information: DIGC3600: Applications of Digital Culture

Creation Date: July 10th, 2023

Description of Artifact:

The technological ethics research project How To Live Forever: The Ethics of Grieftech explores the ethical implications of emerging technologies designed to digitize human essence for posthumous interaction (aka grieftech). Through field research and online examination, it examines the psychological and societal impacts of preserving digital identities after death, questioning the ethical ownership and privacy concerns of such data. The series of papers emphasizes a need for responsible development and governance in grieftech to prevent unintended consequences, such as prolonged grief or exploitation of personal narrative. It also highlights the importance of considering cultural perspectives and individual consent in the creation and use of these technologies. Ultimately, the series calls for a balance between technological innovation and ethical responsibility to honor the memories of the deceased while protecting the well-being of the living.

The project can be viewed at: https://www.digitalethicsportfolio.com/

Skills, Knowledge, or Abilities Demonstrated

Specific Skills/Knowledge:

Ethical Analysis: Critically assessing the ethical implications of technologies which manipulate or preserve human essence, reflecting a deep and curious understanding of ethical theory, human rights, and privacy concerns.

Technological Awareness: Knowledge of the latest advancements in digital technologies, especially those which involve artificial intelligence, data storage, and posthumous interaction, demonstrating a grasp of how these innovations intersect with society.

Interdisciplinary Knowledge: Expertise in blending multiple fields, such as technology, psychology, sociology, and law, to understand the complexities of grieftech and the impact it may have on individuals, communities, and one’s sense of faith.

Cultural Sensitivity: The ability to understand and navigate diverse cultural perspectives on death, grief, and the afterlife, and how these views may or may not shape the acceptance and regulation of grieftech.

Policy and Governance Understanding: An awareness of the necessity for policies, laws, and ethical guidelines to govern the development and use of grieftech, ensuring it serves societal needs while minimizing harm or exploitation, and of how that awareness distills into a call to action for grieftech developers to assume ethical responsibility.

Connection to the Core Competency

Relevance:

Ethical Responsibility in Technology Development: The artifacts demonstrate a deep understanding of the ethical responsibilities associated with developing and using technologies which impact human lives, such as grieftech. They emphasize the need for careful consideration of the consequences and potential harms of such technologies, aligning with the core competency of being ethically and socially responsible as both individuals and leaders.

Balancing Innovation with Ethical Considerations: The research project illustrates how leaders in technology and ethics must navigate a growing tension between technological advancement and the potential for unintended ethical dilemmas. I believe this ability to balance progress with responsibility is central to understanding the importance of ethical leadership in the digital age.

Cross-Cultural Awareness and Sensitivity: The artifacts highlight the significance of understanding diverse cultural and ethical perspectives on death, grief, and memory, emphasizing that different societies interpret these concepts differently. This cross-cultural understanding is vital to ensuring that ethical considerations in the design and use of grieftech are strongly inclusive and respectful of global diversity.

Societal Impact and Leadership: The work demonstrates the role of leadership in shaping societal norms and ethical frameworks around emerging technologies. Leaders must be equipped to guide their teams and communities in navigating the moral implications of new technologies, particularly in contexts where societal values may shift, and truth becomes a matter of opinion.

Future Ethical Climate Considerations: The artifacts showcase a visual, forward-thinking approach to ethics, acknowledging that future technologies like grieftech will continually challenge current ethical standards. I believe the ability to anticipate and prepare for these ethical challenges is essential for responsible leadership in a rapidly evolving technological landscape.

Challenges and Solutions

Challenge: Defining Ethical Boundaries for Emerging Technology: One of the key challenges was determining the ethical boundaries for a technology like grieftech, which interacts with sensitive aspects of human life: death, memory, and identity. This required careful consideration of how far technology should go in preserving or replicating a person’s essence and what limits should be in place to avoid exploitation or harm.

Solution: Extensive Ethical Research and Consultation: The solution involved conducting thorough research into various ethical frameworks, including privacy, consent, issues of faith and cultural views on death, as well as consulting with ethicists, technologists, faith leaders and sociologists to gain a multi-faceted understanding of the issue.

Challenge: Navigating Cultural Differences in Ethical Perspectives: Another challenge was understanding the different cultural interpretations of what it means to be ethical in contexts involving death and remembrance. Views on grief and technology vary widely across cultures, which added complexity in crafting an ethical framework that could be universally applicable.

Solution: Incorporating Cross-Cultural Sensitivity: The solution involved integrating cross-cultural perspectives into the ethical considerations, ensuring that the potential for different interpretations of ethical issues was respected and addressed. This helped to create a more inclusive and adaptable framework for the use of grieftech.

Challenge: Predicting Future Ethical Dilemmas: A significant challenge was predicting potential future ethical dilemmas which may arise as grieftech and similar technologies evolve. It was difficult, and continues to be difficult, to anticipate all the ways in which these technologies could impact society, especially in the long term.

Solution: Forward-Thinking Ethical Frameworks: The solution was to develop flexible ethical guidelines which could evolve as new challenges arise. This approach allowed for adaptive solutions to emerging issues, ensuring that the ethical framework remained relevant as technology and society progress.

Impact on Professional and Academic Growth

Professional Growth:

Creating this artifact has significantly contributed to my professional growth by deepening my understanding of the intersection between technology and ethics. By far the largest and most ambitious project of my undergraduate work, it has sharpened my ability to think critically about the societal implications of emerging technologies, particularly in sensitive areas like grieftech, and to do so in a way which manages a large volume of research. The process has also enhanced my skills in cross-cultural analysis, as I navigated diverse perspectives on death, memory, and ethics. Additionally, I’ve developed a stronger capacity for forward-thinking, anticipating future ethical dilemmas, weighing them against my own sense of faith, and preparing flexible frameworks. This experience has strengthened my ability to lead with ethical responsibility and to consider the broader societal impact of technology.

Academic Growth:

This artifact has significantly contributed to my academic growth by enhancing my understanding of digital ethics and its real-world applications, and how to handle a large body of material. It required me to integrate knowledge from multiple disciplines, such as technology, philosophy, religion and sociology, to address complex ethical challenges. The process strengthened my ability to critically analyze emerging technologies and anticipate their societal implications. Additionally, I refined my research and writing skills, as I engaged with diverse ethical frameworks and perspectives. Overall, this experience has deepened my ability to approach academic problems with a more interdisciplinary and forward-thinking mindset, and while the project continues to have a life beyond class, it also serves to prepare me for future academic endeavors.

Self-Evaluation

Rubric Score You Think You Earned For This Submission: 20

My responses effectively reflect the core competency of ethics by demonstrating a deep understanding of ethical responsibility in the contentious and ever-shifting context of emerging technologies like grieftech. They incorporate cross-cultural perspectives and address the implications of these technologies in a global context, showcasing an awareness of the diverse and non-linear interpretations of ethics. My reflections also synthesize academic and life experiences, illustrating how they shape a forward-thinking ethical stance. The content of the artifact continues to be relevant and compelling, with well-developed insights into ethical leadership and societal impact, while the language remains clear and precise, supporting the overall conclusions drawn from the inquiry. In particular, the project remains live online and continues to attract a digital audience long after the class has ended, which I am proud to see.

Survey
This artifact helped me demonstrate growth within the competency of Ethics by critically exploring the societal and cultural implications of emerging grieftech technologies. By engaging with diverse ethical frameworks and cross-cultural perspectives, I deepened my understanding of the balance between technological innovation and ethical responsibility. The process of anticipating future dilemmas and creating flexible, inclusive guidelines strengthened my ability to approach complex ethical challenges with a forward-thinking and interdisciplinary mindset. This project reinforced the importance of ethical leadership in navigating sensitive technological advancements, preparing me to address similar challenges in both academic and professional settings.


Artifact:

Executive Summary:

Grief is an inescapable part of loss. Grieftech describes emergent digital products which seek to preserve a person’s essence after death through the extraction of human stories. These stories can then be interactively recalled through chatbot experiences powered by artificial intelligence. Grieftech positions itself as remembrance reinvented, but there are deep ethical considerations around the currently unknown psychological impacts of synthetically prolonged sorrow, the changes in the ways we relate to the dead, legacy data privacy, and the commercial rights to likeness left to survivors. As custodians of remembrance beyond what’s stored on a product’s servers, it is essential that we build the deeply human dimensions of responsibility, care and accountability into the decision-making processes which shape not just these products’ development, but also their distribution. Grieftech developers bear the same ethical responsibilities to their future users as they do to their current content creators.

Grieftech platforms must be held accountable for their own endurance, ensuring that existential issues such as large language model collapse, algorithmic bias, and commercial risk are actively mitigated. These are products explicitly intended for an unknown future, and they bear deeply human custodial responsibilities to operate in ways which sustain them into that future. Grieftech runs on the extraction of the oil of human essence: the story. As investors, you control the means by which many of these personal experiences achieve an audience in the future. In your hands you hold both great power and great responsibility. Exercise it.

Dear Investors,

Over the past year, many of us have engaged in an artificial intelligence arms race to integrate generative tools into our products in commercial and brand-driven efforts to appear innovative. Those choosing not to participate, or those proceeding with caution, risk being left behind as generative tools vacuum up audience and attention. Prompting has entered our vocabulary, and it’s getting harder for any of us to determine what’s real. But what is real for many is the fear of the future. Of what these generative tools are going to do to us as citizens. Many have been vocal about job loss and human disintermediation. Some are calling for a period of developmental pause. Legislation is already years behind. Our stewardship of the present poses very real risks for the future.

Artificial intelligence is reshaping social norms, issues of identity and citizenship, privacy, and the boundaries we establish with others. As it reshapes who we are as individuals, and who we are to each other, it is even reframing our understanding of the space between life and death. In the emerging field of grieftech, our capacity to digitize the storytelling of human experience, reducing it to a series of prompted responses stored on a remote server for recall in the future, raises deep ethical questions of faith, prolonged bereavement, and the masking of pain as a fundamental part of human experience. It surfaces deeply problematic, culturally nuanced questions about the difference between can and should, especially when the implementation of these technologies carries the often unintended consequence of reinforcing bias, discrimination, exclusion and hierarchy.

This letter is a plea to ensure the endurance of your users’ futures through responsible custodianship of the present.

Grief is a critical part of authentic human experience. But learning to live with digital facsimiles masking sorrow arrests the opportunity to go deeper into our relationship with what we believe happens next. It arrests a pain necessary to embracing the reality of a loved one lost, and this rich relational aspect of human experience cannot, and should not, be reduced to a set of recalled responses. Pain isn’t good in itself, but it’s a necessary part of human experience, because ultimately pain comes from love. If we try to fill our time with something that’s really just a coping mechanism, we lose the opportunity to move on. Humans are deeply communal beings. We crave interaction with others, and the friendships we carry through life are more than an exchange of information. Too often in digital spaces of innovation we have grown accustomed to numbing our pain with scrolling serotonin rather than embracing it as part of what it means to be alive. These are all-too-human problems which should not be masked by synthetic digital solutions, however immediately intoxicating they may appear to be.

We are not advocating for a world where technological innovation does not seek to help humans with deeply emotional experiences. We believe technology has much to offer in grief counseling and in our relationship with those no longer with us. But we are advocating for increased responsibility towards the necessary human emotional needs which are masked or prolonged by the use of such products. We seek kindness and care, not faster, optimized empathy. We believe that the pain of loss comes from embracing a slower feeling of love; that humans are not reducible to exchanges of information or the digitization of their stories; and that facsimiles of loved ones bear a deep responsibility, like the medical field, to do no harm.

Our research across four papers lays out a case for this responsibility across the dimensions of citizenship, extraction, obligation and consequence. Grieftech experiences transcend our historical understanding of what it means to be alive. They preserve human experience for interactive recall by those left behind at some point in the future. Neither those creating the content nor those distributing it are the end users.

The ethical issues surfaced by the fear of the future are nothing new, especially concerning dystopian ideas of artificial intelligence accelerating beyond our control. A discomfort with the very idea of digital memory and preservation beyond death taps into the very real human problems we already have with our own mortality. Resurfacing memory, feelings of nostalgia, and the ability to digitally recall the events of our lives and those we love are powerful motivators of engagement and attention. The means by which we create meaning from our individual experiences of the world surface issues of what’s possible, what’s culturally defined as ethical, and ultimately what’s legal. Individuals may lack the language to articulate this concern, but we, as both users and creators, still possess the agency and responsibility to be curious about the consequences of equitable exchange. All of these collide inside grieftech experiences, which, at least for now, are often constructed from a Western, individualistic and affluent perspective.

In the United States, existing legislation determines that an individual’s privacy is of financial value, and is ultimately negotiable. So while privacy is for sale, it also comes bundled with large ethical issues of ownership. Who owns an individual’s data, likeness, engagement history or network? Does your product bequeath the eternal use of another’s likeness to their descendants, or to others? Legal protections already do little to prevent abuse of populations that we agree should be protected, especially when those populations can’t represent themselves. It’s often the case that we willingly say yes to digital identity harvesting because we value the exchange, at least in the terms that we understand it, even if we don’t have the language to express our consent. But even where someone has given up their privacy in advance, and that consent is present, within grieftech the person who has died is not the end user. Proposed legislation from the IEEE recommends that an individual has the right to access services allowing them to create a trusted identity to control the safe, specific, and finite exchange of their data. But it omits considerations of posthumous use, and of the consent to do so, whether by surviving loved ones or by third-party providers.

What are those creating these experiences obligated to do? The synthetic feeling of keeping someone alive inside a period of grief may have the noble aspiration of easing the pain of loss, but it also carries the ethical risk of prolonging grief itself. Grieftech runs on the personal disclosure of intimate life stories and thoughts, preserved for survivors, and is extremely empathic in substance and intention towards surviving loved ones. Grieftech aspires to endure. Therein lies the obligation and responsibility as custodians of remembrance well beyond the recurring monthly subscription payment. The legacy of the deceased and the voices of those who survive must be prioritized in matters of ethical conflict and permissions negotiation. The voice of the end user, in this case an end user in perpetuity, is the one which must be protected. This must be built into these experiences as a matter of critical importance.

With any digital product explicitly intended for long-term use, the future is often unclear, and the propensity towards unintended consequence high. Consequences may include the psychological impact of synthetically prolonged sorrow, changes in the ways in which we relate to the dead, legacy data privacy and the rights to likeness left to survivors, and the right to commercialize the stories we leave behind for others. Responsible AI must be deeply ingrained not only into these products themselves, but also into the economic mechanics which continue to fund them. Sustainable processes, much like the existing frameworks for accessibility or translation that investors will already be familiar with, need to be wrapped around the development process itself. Ethically responsible governance in the distribution and responsiveness of artificial intelligence services is deeply shaped by the all-too-human problems of unintended consequence, fear of the future, bias and flawed, culturally nuanced decision-making.

So what happens next? Armed with these ethical insights, grieftech platforms have a deep responsibility to endure. As digital custodians of human lives they must ensure that emergent existential issues such as large language model collapse are held at a distance. Use this research to guide your decision-making and to shape your roadmaps. These are products intended for the future, not the present, and they bear a deeply human responsibility to operate in ways which sustain them into that future. As investors you control the means by which many of these experiences get developed. In your hands you hold both great power and great responsibility. Exercise it.

In hope,
Matt Shadbolt, The University of Pennsylvania



