Thank you, Donna!
Enyaw. Oh yes, it took a lot of finessing to get what I wanted. It's not a professional enough program where I can just dive in, point, and do something myself. Eventually I got something I liked, between my suggestions and it trying to understand me.
I actually have it go through old photographs and improve orange, badly lit photos from the 70s and 80s, and blurry ones from the 90s. When it gets it right, it's amazing. But only one in ten photos turns out well. It does amazing things, but many times people just don't look like themselves. And with AI you have to be careful. You can tell it not to do something and it does it anyway. It gives a lot of wrong answers too. It often says it has done something when it's actually done the opposite, or hasn't done anything at all, including an image it said it created! I'm glad I'm of a generation that doesn't automatically trust technology to get it right. I have a general idea when answers are wonky. They have programmed it to be very positive, so it's absolutely sure of itself and keeps trying to tell you something's right when it's completely wrong.
