Programming and ChatGPT
We can either end up having an AI write all our code, or we can do the unexpected: realize that if an AI can do it, then the code we’re writing is too repetitive to begin with.
Will ChatGPT replace good writing? I think it’ll make it more obvious how hard good writing really is. If you like the suggestions, it’s because it’s easy to see where your writing is going. So if ChatGPT can complete your thoughts, maybe you’re not saying anything interesting.
But imagine this. Some reader is going through your writing and they have no clue what’s going on. Each sentence is completely unpredictable. In fact, they’re not sure what they’re reading. Whoever wrote it made sure their writing was so unique that ChatGPT couldn’t predict a single word.
Imagine a tool for readers that tells them how easy it is for an AI to predict the next word. Writing can be scored between 0 and 100. If it’s 0, it means the AI couldn’t predict a single word. At 100, the AI could have written the same thing. In practice, even a post written entirely by an AI wouldn’t be entirely predictable. And 0 would also be nearly impossible.
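As a toy version of that tool, here’s a minimal sketch in Python. It stands in for the AI with a naive bigram model: learn the most common follower of each word from some training text, then score a passage by the fraction of words the model guesses correctly, scaled to 0–100. The function name and the scoring scheme are hypothetical, invented for illustration; a real tool would use an actual language model.

```python
from collections import Counter, defaultdict

def predictability_score(training_text, text):
    """Hypothetical 0-100 score: how many words a naive
    bigram model (a toy stand-in for an AI) guesses right."""
    # Learn the most common follower of each word.
    follows = defaultdict(Counter)
    train = training_text.lower().split()
    for a, b in zip(train, train[1:]):
        follows[a][b] += 1
    best = {w: c.most_common(1)[0][0] for w, c in follows.items()}

    # Score: fraction of next-words the model predicts correctly.
    words = text.lower().split()
    hits = sum(1 for a, b in zip(words, words[1:]) if best.get(a) == b)
    return round(100 * hits / max(len(words) - 1, 1))

corpus = "the cat sat on the mat and the cat sat on the chair"
print(predictability_score(corpus, "the cat sat on the mat"))          # → 80
print(predictability_score(corpus, "quantum zebras negotiate tuesday"))  # → 0
```

Even this crude predictor illustrates the spectrum: formulaic text scores high, while text with no repeated patterns scores near zero.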
Whether it’s source code or an essay, we can represent it as a series of bits.
If you have a series of random bits, these bits are entirely unpredictable, but also entirely meaningless. If you have encrypted information, then it’s unpredictable and meaningful. The digits of pi are predictable if you know how they’re generated, and they’re meaningful.
Some limited redundancy is important in writing. Computers can use [Hamming codes](https://en.wikipedia.org/wiki/Hamming_code) to correct single bit flips. In writing, repetition can serve a similar role. Think about the story of the three little pigs, for example, or the thumping beat of a song.
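To make the bit-flip idea concrete, here’s a small sketch of the classic Hamming(7,4) code in Python: four data bits get three parity bits, and the parity syndrome points directly at any single flipped bit. The function names are my own; the encoding itself is the standard scheme.

```python
def encode(d):
    """Encode 4 data bits into a 7-bit Hamming(7,4) codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4           # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4           # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4           # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Locate and fix a single flipped bit via the parity syndrome."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3      # syndrome = error position; 0 = no error
    if pos:
        c[pos - 1] ^= 1
    return c

word = encode([1, 0, 1, 1])
flipped = list(word)
flipped[4] ^= 1                 # simulate a single bit flip in transit
assert correct(flipped) == word  # the redundancy recovers the original
```

The three extra bits are pure redundancy, yet they’re what lets the reader (here, the decoder) recover from a garbled spot, much like a repeated motif lets a human reader recover from a confusing sentence.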