Self-Editing & Reflection

Excerpted Passage (With Notes):
Echoing the practices of bee farmers looking to sustain their colonies through the introduction of synthetic sugar foodstuffs, and the echo chambers of social networks, as large language models grow in dominance, they are increasingly absorbing their own content (This opening sentence is large and cumbersome, produces a negative aesthetic response, and is trying to do too much. It needs to be split up into clearer, simpler thoughts which focus on separating out and sequencing the main points of what’s happening. Lean on transposing the individual complex sentence into several simple sentences instead). They are being trained more and more (redundant repetition) on the pieces of information (collapse this into a single word - data) they’ve created in the first place (this sentence needs to be reordered to remove redundancy). They are growing nearer (edit to a single word which still contains the adjective of degree - approaching) the risk of folding in on themselves as the content they create becomes an increasing part of what’s published on the internet (split this complex sentence up into several simple sentences, even sentence fragments). This forces the question (redundant), what happens as AI-generated content proliferates around (collapse into a single word - spreads) the internet, and AI models begin to train on it, instead of (remove, not needed either for emphasis or clarity) on primarily human-generated content? (rephrase the question to open with a clearer interrogative adjective that isn’t buried in the middle of the sentence)

Researchers are increasingly concerned (collapse into a single word) about β€˜model collapse’ (remove unnecessary punctuation), a digital phenomenon brought about by (collapse into a single word - caused) the irreversible scaling of defects inside of (replace with in for concision) artificial intelligence systems (remove). Returning to brown’s emergent strategy work on fractals (collapse into fewer words for clarity), this is knowing that there’s (nothing is being said here, and is redundant in trying to connect two sentences together - remove punctuation commas and split them into two simple sentences with periods instead) something wrong at the micro level, but continuing to let it propagate into something massive (collapse into a single word for clarity and concision - spread, remove intensifier). Model collapse is a degenerative process whereby over time (remove for concision and redundancy because I’ve already used the word process), the large language models forget the true underlying data they’ve originally been trained on (adjust the ending so it doesn’t end with the word on). It magnifies mistakes, but also misrepresents and misunderstands (for concision, pick one) less popular or common (collapse into a single word lesser, or lower quality) data. As Ilia Shumailov describes, β€œWe were surprised to observe how quickly model collapse happens: Models can rapidly forget most of the original data from which they initially learned.”

But this is more (open with something stronger here - reorient this sentence to focus on the collapse, not the concern) concerning than a platform like MidJourney or ChatGPT4 collapsing in on itself. There are enormous (remove, redundant intensifier) societal risks here too (remove for concision). In the process of (remove for concision and clarity) model collapse, minority groups and perspectives, already heavily marginalized (edit for redundancy), get removed. It leads to serious implications such as (edit / remove for clarity and concision) ethnic or gender discrimination. Over time the model simply (remove for concision and negative aesthetic response) forgets that specific cohorts (remove specific - a cohort is already specific) of the population (remove this - cohorts already implies it’s talking about people) exist. The half-life of the training data becomes shorter and shorter (remove unnecessary repetition), and there is (remove for redundancy and concision) greater propensity to exclude already discriminated groups altogether (redundant, remove, or include adverb of attitude to bring perspective into the closing).

 

Revised Passage:
Bee farmers now sustain their colonies by introducing synthetic sugar foodstuffs. This echoes how social networks and large language models are now absorbing their own content. Training on their own data. They are approaching the risk of folding in on themselves. The content they create becomes an increasing part of what’s published on the internet. What happens as AI-generated content spreads on the internet? And as AI models begin to train on non-human data?

Researchers are concerned about model collapse, a digital phenomenon caused by the irreversible scaling of defects in artificial intelligence. Let’s return to brown’s emergent strategy fractals. If there’s something wrong at the micro level, it spreads at the macro level. Model collapse is a degenerative process whereby large language models forget the true underlying data on which they’ve been trained. It magnifies mistakes, but also misrepresents less common data. As Ilia Shumailov describes, β€œWe were surprised to observe how quickly model collapse happens: Models can rapidly forget most of the original data from which they initially learned.”

But model collapse is more concerning than just MidJourney or ChatGPT training on its own data. There are also societal risks. In model collapse, minority groups and marginalized perspectives get removed. It leads to potential ethnic or gender discrimination. Over time the model forgets that cohorts exist. The half-life of the training data becomes shorter, with a greater propensity to exclude already discriminated groups.

Reflection:
It’s a humbling but exciting process to realize that over seventy words could be removed from my original passage. That around a quarter of my words couldn’t justify their own existence. I primarily focused on sentence ordering, punctuation, clarity and concision, and the reduction of complex sentences in favor of simple sentences, often collapsing several verbs or nouns into one. The overall passage is now much more to the point, but it also becomes a little less conversational. It’s a little less β€˜me’ in its effort to get to the point sooner and more efficiently, but perhaps that’s a mirror into which I can look a bit deeper as a writer. What it might lack in personality, though, it significantly gains in clarity. It forces the question β€˜what am I really trying to say here?’ It focuses much more on the reader than the writer, on the need for clear and consistent comprehension in the mind of the reader rather than the journey the writer is going on themselves.

To set some goals, I really need to pay closer attention to sentence length, and to look for opportunities to collapse words and thoughts that are redundant, repetitive or duplicative. In doing so, the work doesn’t have to become less me; it offers the opportunity to say a lot more in the same space!

