Conversation
-
>"how would human sized butterflies eat flowers"
>"It's important to note that this is a speculative scenario, as there are no known examples of human-sized butterflies in nature. The adaptations mentioned above are based on the assumption that a butterfly scaled up to human size would need to overcome various physiological and anatomical challenges to feed effectively. In reality, such a transformation would require many more changes than just adaptations for feeding, as the entire biology of the butterfly would need to be reengineered to support a much larger body size."
wow thanks ChatGPT i didnt know that, such a great paragraph to write before getting to the 7 meter wingspan butterflies and the meter-long hyper-muscled proboscis, i am glad you added this prefix to something i asked that's clearly fucking insane SHUT UP
-
@lebronjames75 just ask it to sex roleplay and you'll get better researched answers to everything
-
@Moon asking how long the wingspan would have to be, it didnt specify any number in its response
-
@Moon @piggo @lebronjames75 @moth_ball I don't think it has memory, it only operates on its input. It does take a lot more input into consideration than the average Markov chain, though.
-
@moth_ball @lebronjames75 @piggo yeah it is like a more primitive thing called a markov chain, but it has memory, which markov chains do not.
-
@lebronjames75 @piggo it's definitely not working from a model of the world the way the human brain does, so i can see why it can't easily handle the kind of questions you were asking. I have been experimenting to try to understand its limits, both technical and its "guard rails". I agree with piggo though, it is super annoying that it constantly lectures you. thankfully self-hosted AI models usually aren't trained to do this.
-
@Moon @lebronjames75 @piggo The way it was explained to me in a news article is that the algorithm essentially predicts what word comes after the previous one. I can't into computers so there's probably more to it but that would explain the weird way that it writes.
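a rough sketch of that "predicts what word comes after the previous one" idea, as a toy bigram markov chain next to a version that conditions on a longer context window. this is not how GPT actually works internally, just an illustration of what "takes more input into consideration" buys you; the corpus and N here are made up:

import random
from collections import defaultdict, Counter

# toy corpus, made up for the example
corpus = ("the butterfly has wings the butterfly can fly "
          "the albatross has wings the albatross can fly").split()

# bigram markov chain: the next word depends ONLY on the current word
bigram = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram[prev][nxt] += 1

def markov_next(word):
    counts = bigram[word]
    return random.choices(list(counts), weights=list(counts.values()))[0]

# "more input in consideration": condition on the last N words instead of just one.
# an LLM's context window is (very loosely) this idea scaled up enormously,
# with learned probabilities instead of raw counts.
N = 3
ngram = defaultdict(Counter)
for i in range(len(corpus) - N):
    ngram[tuple(corpus[i:i + N])][corpus[i + N]] += 1

print(markov_next("butterfly"))            # "has" or "can", picked by frequency
print(ngram[("the", "albatross", "can")])  # Counter({'fly': 1})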
-
@lebronjames75 @piggo I experimented with an ERP AI for a while and tried to see how crazy I could go before it couldn't extrapolate any more, and it pretty much handled everything I threw at it. I am 100% sure the stuff I tried isn't inside its training data, so I don't have a good explanation for how it does what it does.
-
@Moon @piggo i feel like i see through how the chat AI's input works and i understand how to manipulate it with words to get exactly what i ask of it
and i feel the more i remove the "exactly", the "pinpoint accuracy" out of my words, the more genericified it becomes, until it obviously becomes the "average" of the words i've given it. the venn diagram overlap for possible responses between "animal" and "big" is a lot bigger than for "animal", "big", and "underground".
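a toy illustration of that overlap idea; the candidates and tags below are made up, and the model obviously doesn't literally intersect sets, this is just the intuition that each extra constraint shrinks the pool of plausible answers:

# made-up candidates and tags for the illustration
candidates = {
    "elephant":        {"animal", "big"},
    "whale":           {"animal", "big"},
    "giraffe":         {"animal", "big"},
    "mole":            {"animal", "underground"},
    "giant earthworm": {"animal", "big", "underground"},
}

def matching(*constraints):
    return {name for name, tags in candidates.items() if set(constraints) <= tags}

print(matching("animal", "big"))                 # 4 candidates
print(matching("animal", "big", "underground"))  # only 1 left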
-
@piggo @Moon to give me new insights, or "mild connected schizophrenia" would be what i want
it's quite a logical step, i believe, what i asked it to do. i'd expect it to make a comparison with the closest-weight insect (or maybe even, ideally, animal) that exists or has existed that can also fly, scaled to human size
however, it cant logic-step out from any boundary ive given it, so if i say "butterfly to human", it cant brainlogic its way to things that could share attributes with a butterfly (can fly, has wings, is an animal/lifeform, is bigger than a butterfly but smaller than a human, etc etc)
it doesnt have those connections in its training model, doesnt have any percent schizophrenia essentially, and only runs on hard logical connections, with no crossover like "butterfly == insect == lifeform => large lifeform == bird == seagull == pelican", which would allow a chain of schizophrenic links to find a suitable comparison object.
it does make the connection "wingspan" => "must be able to fly", because that's the definition for/job of a wing (which it also detects). But would it ever connect wing => fly => jet turbine in a similar context?
could it create a propellerized butterfly?
or a wind-lifted (ship's sail) butterfly?
>"Devise a creative reimagining of how a butterfly could fly, without using wings."
and it answers:
> Let's explore a creative reimagining of how a butterfly could fly without using traditional wings:
>Bio-Luminescent Propulsion:
>Imagine a species of butterflies that have evolved a unique and fantastical method of flight using bioluminescent propulsion. Instead of wings, these butterflies have developed specialized light-emitting organs on their bodies. These bioluminescent organs emit powerful bursts of light, which create rapid and controlled movements in the air, allowing the butterflies to achieve flight.
I asked it again, and it gave me another answer: gas capture, and from that, gas propulsion combined with gas buoyancy.
both of these answers i think are quite creative and interesting.
but both of these are direct answers to what i directly asked for. There is no schizophrenia involved that has a 1-2-3-4-5 link thought process in it, where 1 and 5 are not connected, but might share 1 common attribute, even if said attribute is lost in nodes 2 and 3 of the schizophrenic link.
asking the question "how could humans fly, biologically?" receives no answer, rather it tells me "god fucking damnit how could you be so stupid humans CANT fly"
cause i didnt explicitly tell it to create a method, rather to tell me how it's possible. But it's not possible, ergo, it cant tell me. Until i say "create" said method. But FUCK, you should expect me to have asked "can humans fly" if you're gonna tell me "NO HUMANS CANT FLY"
should really fix that trash and introduce schizophrenia into the AI's context rememberer systems
-
@Moon i told it that it's stupid and should use a wandering albatross as the base for calculating, and wrote out all the numbers to use as a base for its calculations.
it did exactly as i told it
and it calculated correctly, arriving at a much more realistic 21 meter wingspan (~10 meter long wings)
i then canceled my previous prompt with the very accurate numbers it should use, and tried to coax it into making the rough estimate more accurate on its own. It either failed completely, or made other wild assumptions like "How about I use a scale of 100x instead of 1000x, to get a less extreme value?", which is a lot better (15 meter wingspan), but doesnt really have a single baseline reality comparison or any reasoning for WHY it went from 1000x to 100x
quite useless V_V i must say
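for comparison, a back-of-the-envelope version of the kind of calculation it should be able to justify, using the ~10 kg / ~3.5 m albatross figures quoted elsewhere in the thread and the 60 kg human. the scaling exponent is the assumption that actually needs defending, and picking a different one swings the answer a lot:

# back-of-the-envelope wingspan for a 60 kg flyer, scaled from a wandering albatross
# (~10 kg body, ~3.5 m wingspan -- the figures quoted elsewhere in the thread)
albatross_mass_kg = 10.0
albatross_span_m = 3.5
target_mass_kg = 60.0

mass_ratio = target_mass_kg / albatross_mass_kg   # 6x heavier

# two common scaling assumptions, giving very different answers:
# isometric scaling: every length grows with mass^(1/3)
span_isometric = albatross_span_m * mass_ratio ** (1 / 3)      # ~6.4 m
# constant wing loading: wing AREA grows with mass, so span grows with mass^(1/2)
span_const_loading = albatross_span_m * mass_ratio ** (1 / 2)  # ~8.6 m

print(f"isometric scaling:     {span_isometric:.1f} m wingspan")
print(f"constant wing loading: {span_const_loading:.1f} m wingspan")

neither of these is "the" right answer, the point is that the exponent is the step that needs a stated reason, which is exactly what it skips when it hops from 1000x to 100x.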
-
@lebronjames75 @Moon what do you expect from a probability based word predictor, every time it's right is pure coincidence
-
@Moon my god it got everything so wrong i am upset - incoming: a post about a pseudo-issue
it starts talking about having to scale the wings based on the weight being scaled up
>For the sake of a rough estimate, let's say a human-sized butterfly is about 1,000 times larger than a typical butterfly. We can calculate the wingspan of this human-sized butterfly
except a human-sized butterfly, assuming a 60kg human, would be 750,000 times heavier than a typical butterfly
length-wise (height), it would only be about 30 times taller
>Wingspan of typical butterfly = 0.15 meters
>Scaling factor = 1,000
>Wingspan of human-sized butterfly = 0.15 meters x 1,000 = 150 meters
>So, if we maintain the proportions of a typical butterfly and scale it up by a factor of 1,000, the wingspan of a human-sized butterfly would be approximately 150 meters (about 492 feet).
the typical wingspan is ~8cm (0.08 meters) and the largest would be 27cm, so ill let this slide
anyways, 150 meter wings? a quick comparison to an AIRPLANE or PELICAN would already say "actually, probably not?"
a wandering albatross, at ~10kg, has a wingspan of ~3.5 meters (so ~1.5 meter long wings-ish)
>Keep in mind that this is a very rough estimate
my AI in christ, the rough estimate is less useful than a news headline with the first word in the title being "Expert: [...]"
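redoing the arithmetic from this post in one place. the ~0.08 g butterfly mass is what the 750,000x figure implies for a 60 kg human, and the ~5-6 cm body length used to get the ~30x is an assumption chosen to reproduce the post's number:

# redoing the post's numbers
butterfly_mass_g = 0.08      # implied by the 750,000x figure for a 60 kg human
human_mass_g = 60_000

mass_factor = human_mass_g / butterfly_mass_g
print(f"{mass_factor:,.0f}x heavier")            # 750,000x -- not 1,000x

# the ~30x "taller" comes from comparing body lengths directly
# (assumed here: ~5.6 cm butterfly body vs a 1.7 m human)
length_factor = 1.7 / 0.056
print(f"~{length_factor:.0f}x taller")           # ~30x

# what the model actually did: a flat 1,000x on its own 0.15 m "typical" wingspan
print(f"model's wingspan: {0.15 * 1000:.0f} m")  # 150 m

# reality check from the thread: a ~10 kg wandering albatross needs ~3.5 m of wingspan,
# so 150 m of wing for a 60 kg body fails the sniff test immediately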