“Do not hallucinate”: Testers find prompts meant to keep Apple Intelligence on the rails
Long lists of instructions show how Apple is trying to navigate AI pitfalls.
@arstechnica I'm sure that will be every bit as effective as saying "Look, just concentrate OK?" to someone with unmedicated ADHD would be.
@arstechnica It's... tricky. The problem is that LLMs both lie and hallucinate (yes, I differentiate). Lying is when they know they don't know something; that can be fought against somewhat easily. Hallucinating is when they don't know they don't know something.
I was messing around with a doctor character card and found that, *to some extent*, adding "if they are less familiar with something they will refer the patient to a specialist" has some effect. But you have to recognize when the model actually does this.
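To make that concrete, here's a minimal sketch of that kind of card wired up as a system prompt, assuming an OpenAI-style chat API; the client, model name, and exact wording are placeholders of mine, not the original card or anything Apple ships. The only point is the "refer them to a specialist" escape hatch.

from openai import OpenAI  # assumes the official OpenAI Python client is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The "card" here is just a system prompt. The refer-to-a-specialist clause
# is the hedging instruction described above.
DOCTOR_CARD = (
    "You are a general practitioner. Answer questions about common conditions. "
    "If the patient asks about something you are less familiar with, do not guess: "
    "say you are unsure and refer them to the appropriate specialist."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": DOCTOR_CARD},
        {"role": "user", "content": "What's the current protocol for a rare mitochondrial disorder?"},
    ],
)
print(response.choices[0].message.content)

Whether the model actually takes the escape hatch instead of confabulating an answer is exactly the part you have to watch for.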