I’m not defending AI consciousness/sentience (however we define them), but IMHO your arguments aren’t particularly relevant/convincing. Why couldn’t consciousness exist in brief flashes? Isn’t the “would it even be the same consciousness” question the same as any sci-fi “clone with identical memories” thought experiment? If an LLM were conscious, then each conversation history would be a “self”. And yes, it would be in some vague state of death/hibernation/non-existence when not prompted.