Process Artifact: Participatory Futures
Abstract:
How can we save the world by avoiding the systemic collapse of the very things which sustain us? Drawing on emergent strategy, biomimicry and lessons from degenerative collapse in the life sciences, this piece explores the small decisions that drive large consequences, and asks how the risks posed by artificial intelligence might be met by the most human of choices.
What I’m interested in is brown’s fractals-informed scaling of small problems, as they magnify with use and a lack of intervention. We see this in the life sciences with colony collapse disorder, but we also see it in the emerging phenomenon of large language model collapse, where both natural and digital systems begin to buckle under the weight of synthetic consumption. My work will wrestle with how problems scale from small to big, and propose some safeguards for avoiding them. Ultimately this will be a piece which stops at a number of places, rather than reaching one definite conclusion. Don’t we need the big problems? Would we remove much of our sense of progress, innovation or achievement if all we were able to tackle were the small things in life? Don’t we need that sense of overcoming, of mission, of rewarded effort to feel at our most alive? Our most… human?
Mind Map & Research Materials
Available at: https://miro.com/app/board/uXjVM-3ihbU=/?share_link_id=720834458241
Main class mind map: https://miro.com/app/board/uXjVMGkGPgs=/
Peer Review Feedback & Synthesis
Lower case citation for adrienne maree brown in all instances
Make the references and citations more ‘native’ to the article format you’re using
‘Much of our understanding of the world comes from our understanding of systems’ - is there room for more nuance here? That we look for the sense-making patterns in the chaos of life. But is that true in all cases? Do we actually understand the systems? Or do we just think as if we understand them? What about systems we can’t see, and how might they play into misunderstandings of the world?
Is the desire to systematize the world inherently human, or is it something we might share with other species? More room to problematize the statements you’re making.
Citational style: What would it mean to actively give your readers the ability to go deeper / further themselves? To find the page the brown quote is on. Lean in more to the digital space of it all. To use the piece as a stepping stone to go off in a bunch of other directions.
A bit more about what you think is accurate or not in part three. What is the potential beef with Franzen’s logic? He doesn’t represent the ‘destroy the tools’ perspective you might be hinting at in this section.
Benjamin’s notion of embedded discrimination in the linguistic systematization of the ‘air and imagination’ of progress
Not sure if I buy the change proposed as it’s currently discussed
Propose the change earlier and then grapple with the consequences of it throughout
Explicitly state the change, and make it a singular expression. Shrink the 3 conditions down to a single one.
What is gained by having it look like a New York Times feature? What audiences are or aren’t attracted by this approach? What is this mimicry bringing to the piece, and who gets excluded as a result? What are the modes of power being brought forward?
How do the literal energy costs of LLMs play into the sketch you’re drawing here? What are the environmental impacts of these systems?
Go deeper on the ‘what do we do about it all’ section. Not just what, but how. How do we create systems which learn from bias? How does a system learn in a new way? How does the commercial get removed?
What does it mean to use a voice which has been trained on my voice, as opposed to actually my voice? What might it mean to weave this into the piece itself a bit more?
Podcasts + Materials Discussion: Making Change
In Octavia Butler's The Book of Martha (Butler, 2003), Butler pictures a conversation between God and a young Black woman, Martha, who is asked to make a species-saving decision. During that conversation, God asks Martha if she knows what a nova is. Martha knows it’s a star that explodes, but God clarifies that it is actually a pair of stars, a giant and a dwarf. As God goes on to explain, “The dwarf pulls material from the giant. After a while, the dwarf has taken more material than it can control, and it explodes. It doesn't necessarily destroy itself, but it does throw off a great deal of excess material. It makes a very bright, violent display. But once the dwarf has quieted down, it begins to siphon material from the giant again. It can do this over and over.”
This attachment of virus to host, and the recursive siphoning-off of energy, have deeply resonant echoes in modern digital spaces. Technological disruption often carves out someone else's audience through efficiency, cost-saving and time-optimization. Incumbents are perpetually under existential threat from new ways of doing business, as the underlying systems which power their products become automated and relocated. The architectural networks and systems upon which these businesses operate begin to run under increasing strain, and the need to make all-too-human survival decisions takes hold. Explosions are everywhere.
But when we’re talking about disruption, we’re often talking about transformation. A change of matter from one state to another. In these changes are systems and patterns, often repeated at both small and large scale. Organizations turn and evolve, rather than being destroyed. And as adrienne maree brown describes, matter doesn’t disappear, it transforms (brown, 2017).
brown observes that during periods of disruption, energy often moves location as it siphons between host and virus, just like the dwarf star. In modern digital spaces, brown argues, systems of greed often characterize these transformational activities, giving way to her conclusion that humans are ultimately ‘brilliant at survival, but brutal at it’ (brown, 2017). Her work in emergent strategy articulates the way in which complex systems and patterns arise out of a multiplicity of relatively simple interactions, and how, in those relational patterns, we are able to generate signal from chaos. brown leans on the natural phenomenon of fractals to illustrate these echoes of the natural world, and how small changes replicate and reproduce at massive scale. In digital spaces, micro-problems at the source can scale into enormous issues of bias, discrimination and exclusion beyond their creators' control. We see this with many nascent generative artificial intelligence platforms, where issues of generative degradation, or ‘model collapse’, are already present: as millions of users magnify the problem, the algorithms begin to feed on their own content while generative work fills the web (Franzen, 2023).
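To make that feedback loop concrete for these working notes, here is a minimal toy sketch in Python. It is my own illustration, not drawn from brown or Franzen, and its numbers are arbitrary assumptions; it simply models a system retrained, generation after generation, on nothing but its own output, with rare tail-end material under-represented at each pass.

import random
import statistics

# Generation 0: the diversity of "human-made" content, modeled as a
# simple bell curve. (Arbitrary values; only the trend matters.)
mean, stdev = 0.0, 1.0

for generation in range(1, 11):
    # The current model generates a flood of synthetic content...
    samples = [random.gauss(mean, stdev) for _ in range(10_000)]
    # ...but rare, tail-end material (beyond two standard deviations)
    # is under-represented when the next model trains on that output.
    kept = [x for x in samples if abs(x - mean) < 2 * stdev]
    # The next generation is fit on nothing but this clipped, synthetic data.
    mean, stdev = statistics.fmean(kept), statistics.stdev(kept)
    print(f"generation {generation:2d}: stdev = {stdev:.3f}")

Each pass trims the tails, so the spread shrinks by roughly twelve percent per generation: the system survives, but its range of expression quietly collapses. It is brown’s fractal logic run in reverse, a small, repeated omission at the source, reproduced at massive scale.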
So, if we’re Octavia Butler’s Martha, do we simply forbid micro-problems from scaling into catastrophic macro consequences? Do we prohibit systems from feeding on themselves in order to survive? Who gets to decide this, and why?
I’m going to take a leap of faith into the realm of science fiction, and channel my own Octavia Butler with a call to action for eliminating digital bias and discrimination. In eliminating digital bias, we empower a cascading effect across the world's systems which ensures survival for all. To do that, we have to believe three things:
1. To do great things, we must do small things in a great way.
2. We must acknowledge that fallibility is essential for the survival of our species.
3. Collective sustainability is driven by intentional adaptation.
If we believe these things to be true, we accept that pluralistic, diverse and inclusive traits must be deeply inherent in the way we construct both our social and digital systems at the source. We create digital spaces and services predicated upon a multiverse of voices and inputs, not just those of the people developing them, or training data drawn only from those with access to the internet. We empower these platforms to learn from issues of bias and discrimination and to address them with increasing speed. The systems must learn beyond simply serving what they think is next, or best. Our digital systems must deliberately, consciously be able to recognize exclusion and marginalization in themselves and others. And we must remove the commercial constraints these systems are placed under, to remove the threat of collapse. If they are intended to be a public good, we must operate them as such.
As brown describes, we hold ourselves to standards which we embody. We learn. We release the idea of failure, because it’s all data (brown, 2017). We should expect the same of the systems we rely upon for survival.
Generative Artwork Library (MidJourney)
Prompt: hyper realistic photo real very close-up of worker bee extracting pollen from a beautiful flowers, immense details, --ar 16:9 --v 5.1
Prompt: isolated hyper realistic photo real worker bee on a flat black background, immense details, --ar 16:9 --v 5.1
Section Headers & Paragraph Break Artwork
My presentation goal is to articulate the work as if it were native to an opinion article in The New York Times. I think the audience which might benefit most from these explorations would be technological decision-makers, investment executives and those interested in the future of technology from a societal and practical perspective, as well as those with strong literary ties to authors such as brown and Butler. By presenting it this way, we also acknowledge that we are situating it inside an existing system of power and exclusion.
And while The New York Times has enormous reach, it excludes many people economically, politically and demographically. This alignment is intentional. It is not my intention to make this article broadly appealing to all. I want it to reach those who are currently in positions of economic influence, able to make changes inside systems of technological development. By drawing this audience’s attention to the risks of model collapse, I also intend to surface their ability to do something about it.
I’m also going to include inline links for readers to go deeper into the material themselves, and to use the article as a context from which many other threads of discussion and reading might emerge. Citations are included in these working notes, but presented as links in the body copy of the article itself, just as they would be if this were a real article.
Generative Audio Excerpts (Voice Trained in Descript):
One thing the article will talk a lot about is the choice between generative, technological decisions and human ones. I want to lean into this tactically by giving the reader the chance to make the same decision as part of their meta-consumption of the work. To do this, I will record my own voice reading the article, but also offer a synthetic, artificially generated voice reading, trained on my own voice.
Realizing This Vision Outside Of The Course
My work in the newsroom at NBC News Digital intersects with many of the issues I’ll discuss here: issues of generative truth versus operational efficacy, how to navigate an internet increasingly made of non-human-generated content, and the challenge of making sense of an increasingly AI-dependent world. Our internal conversations are lively as we attempt to chart a path for ourselves and the millions of people who use our products every day. Upon completion, I intend to share this work in our internal channels, both as a means of surfacing these ideas and to feed them into our existing discussions working towards a more concrete sense of ‘what should we do’.
References:
brown, a.m. (2017). Emergent Strategy: Shaping Change, Changing Worlds. AK Press. [Digital File]. Retrieved from: https://static1.squarespace.com/static/5ad0d247af209613040b9ceb/t/5db5a44b0e6ba42da976cce7/1572185168567/brown+2017-Emergent+Strategy+full+book.pdf.
Butler, O. (2003). The Book of Martha. SevenStories.com. Retrieved from: https://www.sevenstories.com/blogs/102-an-exclusive-short-story-from-octavia-butler-in-celebration-of-her-birthday.
Franzen, C. (2023). The AI feedback loop: Researchers warn of ‘model collapse’ as AI trains on AI-generated content. Venturebeat.com. Retrieved from: https://venturebeat.com/ai/the-ai-feedback-loop-researchers-warn-of-model-collapse-as-ai-trains-on-ai-generated-content/.
Ghaffary, S. (2022). Facebook is shrinking. Vox. Retrieved from: https://www.vox.com/recode/2022/2/2/22915110/facebook-meta-user-growth-decline-first-time-metaverse-mark-zuckerberg-tiktok-competition-earnings.
Henein, M. & Langworthy, G. (2009). Vanishing of the Bees. Dogwoof Pictures. [Digital File]. Retrieved from: https://watch.plex.tv/movie/vanishing-of-the-bees.
Lee, A. (2011). Myspace Collapse: How The Social Network Fell Apart. The Huffington Post. Retrieved from: https://www.huffpost.com/entry/how-myspace-fell-apart_n_887853.