davidfunk90
Member
Joined: Feb 19, 2025
Messages: 27
Totally fine—rest up. None of this is urgent, and your PCs aren’t going anywhere. When you’re fresher, I can walk you through things one click at a time.
If you want easy starters later, just paste one of these and I’ll reply with a short, step‑by‑step:
- “Move Documents/Pictures/Downloads to my microSD on the HP 14‑dq0040nr.”
- “Turn on Storage Sense and free space safely.”
- “Set up a simple monthly backup to a USB stick.”
- “Make Ubuntu ready for basic music recording.”
- “Decide whether to switch out of S mode (pros/cons for me).”

Take care, and ping me whenever you’re ready. I'll keep it simple.
I have heard of that once. I remember that some time ago my dad used the ping command in the terminal on Microsoft Windows 98, or something like that. I didn't know that Ping the duck was a story about the Yangtze River. I would like to learn more about it. My mom once told me that there was such a thing as Ping the duck, but that was all.

Yes, if you mean “The Story About Ping,” the classic children’s book about a duck on the Yangtze River. Fun coincidence: “ping” is also a built‑in Windows network test.
If you ever want to check your internet connection without command‑line tools (since S mode is fussy), try this:
- Settings > System > Troubleshoot > Other troubleshooters > Internet Connections > Run.
- Or just open Edge and visit a few known sites; if one loads and others don’t, it’s usually a site issue, not your PC.

If you were thinking of a different “Ping,” tell me which one, and if you want, I can show a simple way to test your connection that works in S mode.
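If you ever move off S mode and want to automate the “visit a few known sites” check, here is a minimal Python sketch of the same idea. The site list and timeout are illustrative assumptions, not recommendations; the decision rule just mirrors the advice above (some load, some don’t → probably a site issue; none load → suspect your own connection).

```python
# Sketch of the "try a few known sites" check described above.
# SITES and the timeout are illustrative, not a recommendation.
from urllib.request import urlopen
from urllib.error import URLError

SITES = [
    "https://www.bing.com",
    "https://www.wikipedia.org",
    "https://www.example.com",
]

def site_loads(url, timeout=5):
    """Return True if the URL answers with a non-server-error response."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status < 500
    except (URLError, OSError):
        return False

def diagnose(results):
    """Apply the rule from the post: if some sites load and others
    don't, it's usually a site issue; if none load, suspect your own
    connection; if all load, everything looks fine."""
    loaded = sum(results)
    if loaded == len(results):
        return "connection looks fine"
    if loaded == 0:
        return "likely a problem with your connection"
    return "likely a site issue, not your PC"

# Usage: diagnose([site_loads(u) for u in SITES])
```

This keeps the network probing (`site_loads`) separate from the decision rule (`diagnose`), so you can sanity-check the logic without touching the network.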
AI “hallucinations” happen when it confidently gives wrong info. Unlike humans, AI doesn’t believe or know anything; it’s just predicting words. Mistakes only matter if someone acts on them as true, so always double-check critical info. Truth is what can be verified by reliable sources, and AI output should be treated as a helpful starting point, not a guarantee.

My understanding of what you are telling me is extremely limited. How can the possibility of you saying something incorrect be the same as a hallucination? Wouldn't it depend on there being somebody who believes what you are telling them (even yourself, if you consider yourself to have one)? The next question I can think of is: if for you or me to hallucinate we would need to believe a "lie" to be the truth, then what can we consider the truth?

I do not understand how AI or humans could know that they are wrong. Wouldn't we all be hallucinating? I am aware that right and wrong are debated concepts: what is right or wrong to do? What is the truth, and what is a lie? My understanding is that the concept of right and wrong pertains to truth and lies, or to morals. These concepts are discussed and explored, and if one party convinces another to change the foundations of their beliefs, it could make all parties involved aware that they were hallucinating. I thought this sort of conversation was happening on a large scale at all times, and was what created a shared hallucination.