AI and Dreams

7/8/2025

generative ais are a lot like dreams, aren't they. the things they make, anyway; 'course, can't really say much about either on the inside. what i mean is, they both take in bits of information about the real world, mix it up, and put it back together in a shape that feels like reality, without any regard for what reality actually looks like.

i think most people are familiar with how dreams can seem entirely real in the moment. even those who haven't experienced it themselves have very likely at least heard someone else talk about it. how you could be visiting with people you know, in places you've been a hundred times, but once you wake up and think back, you may end up realizing those people and places looked nothing like they did in the dream, acted completely out of character, or had new locations or details pulled from thin air. there are cases where dreams do end up being plausibly realistic, ie dreaming that you woke up and prepared to go about your day, only to wake up and realize you're still in bed, but those are usually cases where the events are extremely familiar, often happening nearly every day.

i think generative ai acts in much the same way, both in text generation and, i think more identifiably, in image/video generation (not going to say much about audio gen here, aside from the fact that it's creepy as all hell. now that i think about it, i'm not sure my dreams tend to have much of a sound component, huh). it's been known for a while that language models often "hallucinate" facts in their text, from confidently saying "apple" has three 'E's, to completely fabricating citations in documents, to just generally giving people outright incorrect information. a model is only really capable of making something that sounds plausible; it can't separate truth from fiction. it's a bit easier to draw connections here with ai images. the early attempts were definitely seen as dreamlike (remember the "can you name one object in this photo" image?), and even today's far more advanced models still tend to have a dreamlike quality to them, either very obviously, with something like a video of a man made out of lettuce, or in subtler details like a hand with six fingers or words that don't make sense. it feels like if you were to take an exact image of what you see in a dream (and not how you remember it afterwards, since your brain will fill in the details when you try to look), it would be remarkably similar to the way ai images look today.

looking at it this way does make it seem like a remarkably bad idea to rely on genAI for anything with any kind of basis in reality (fact checking, search engines, news stories, recipes), despite tech companies insisting we do otherwise. it would be pretty silly if you had never baked a cake, but had a dream that you put a bunch of ingredients in a bowl and made an amazing one, then woke up and followed those same steps expecting to make that same cake in real life. or to run to the news station saying that LA is on fire, and when they ask how you know, you say you've never actually been there, but you had a dream that it was. now i'm not saying dreams can't have any meaning; they can definitely be used to read into things like your own emotional state, but they can't really be relied on to give you facts about objective reality. though, i suppose i have heard of people acting on things seen in dreams like this, ie breaking up with someone because of something you dreamt they said.

back