• 0 Posts
  • 61 Comments
Joined 2 years ago
Cake day: July 6th, 2023

  • With regard to sugar: when I started counting calories I discovered that the actual amounts of calories in certain foods were not what I intuitively assumed. Some foods turned out to be much less unhealthy than I thought. For example, I can eat almost three pints of ice cream a day and not gain weight (as long as I don’t eat anything else). So sometimes instead of eating a normal dinner, I want to eat a whole pint of ice cream and I can do so guilt-free.

    Likewise, I use both AI and a microwave. My energy use from AI in a day is apparently less than the energy I use to reheat a cup of tea, so the conclusion that I can use AI as much as I want without significantly affecting my environmental impact is the correct one.



  • I haven’t noticed this behavior coming from scientists particularly frequently. The ones I’ve talked to generally accept that consciousness is somehow the product of the human brain, that the human brain performs computation and obeys physical law, and that therefore every aspect of the human brain, including the currently unknown mechanism that creates consciousness, can in principle be modeled arbitrarily accurately using a computer. They see this as fairly straightforward, but they have no desire to convince the public of it.

    This does lead to some counterintuitive results. If you have a digital AI, does a stored copy of it have subjective experience despite the fact that its state is not changing over time? If not, does a series of stored copies representing, losslessly, a series of consecutive states of that AI? If not, does a computer currently in one of those states and awaiting an instruction to either compute the next state or load it from the series of stored copies? If not (or if the answer depends on whether it computes the state or loads it) then is the presence or absence of subjective experience determined by factors outside the simulation, e.g. something supernatural from the perspective of the AI? I don’t think such speculation is useful except as entertainment - we simply don’t know enough yet to even ask the right questions, let alone answer them.


  • This isn’t the Cthulhu universe. There isn’t some horrible truth ChatGPT can reveal to you which will literally drive you insane. Some people use ChatGPT a lot, some people have psychotic episodes, and there’s going to be enough overlap to write sensationalist stories even if there’s no causative relationship.

    I suppose ChatGPT might be harmful to someone who is already delusional by (after pressure) expressing agreement, but I’m not sure about that because as far as I know, you can’t talk a person into or out of psychosis.


  • Yes, the first step to determining that AI has no capability for cognition is apparently to admit that neither you nor anyone else has any real understanding of what cognition* is or how it can possibly arise from purely mechanistic computation (either with carbon or with silicon).

    Given the paramount importance of the human senses and emotion for consciousness to “happen”

    Given? Given by what? Fiction in which robots can’t comprehend the human concept called “love”?

    *Or “sentience” or whatever other term is used to describe the same concept.


  • I thought I could see piles of debris at the bottoms of some slopes in the after pictures which weren’t there in the before pictures, but now that I’m looking at them again, I’m no longer sure that what I’m seeing isn’t just a difference in the shadows. (Presumably the pictures were taken at different times of the day.) I’m going to edit my original statement.





  • I think many people learned the wrong lesson from GWB’s Iraq War. It was presented as (among other things) a way to stop an enemy of the USA from obtaining nuclear weapons, and it was a mistake, so they conclude that using force to stop enemies of the USA from obtaining nuclear weapons is a mistake. However, using force (if necessary) for that purpose is a prudent idea; the problem with that Iraq War was that it was not actually fought for that purpose. GWB was the boy who cried wolf, but real wolves still exist.