Depends.
Your session is unique to you, and apart from intentional training by OpenAI, the models do not learn from your conversations. The session context is limited to 8000 tokens (a word is roughly 1 to 2 tokens), and that context window is the full extent of its short-term memory.
I don't know what temperature OpenAI has ChatGPT set to, but if it's anywhere north of 0, there's a certain amount of random wiggle room in which token gets picked from the probability distribution, so the same prompt can produce different answers.
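To illustrate what temperature does, here's a minimal sketch of temperature-scaled sampling. This is not OpenAI's actual implementation; the function name and the toy logits are made up for the example. The idea is standard: divide the model's raw scores (logits) by the temperature before converting them to probabilities, so higher temperatures flatten the distribution and add more randomness, while a temperature of 0 collapses to always picking the top token.

```python
import math
import random

def sample_with_temperature(logits, temperature):
    """Pick a token index from raw logits, scaled by temperature.

    temperature == 0 means greedy decoding: always the highest-scoring
    token, no randomness. Higher temperatures flatten the distribution,
    giving lower-probability tokens more of a chance.
    """
    if temperature == 0:
        # Greedy: deterministic, same output every time.
        return max(range(len(logits)), key=lambda i: logits[i])

    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Weighted random draw over the token indices.
    return random.choices(range(len(logits)), weights=probs)[0]

# Toy logits for a 3-token vocabulary (hypothetical values).
logits = [1.0, 5.0, 2.0]
print(sample_with_temperature(logits, 0))    # always index 1 (greedy)
print(sample_with_temperature(logits, 1.0))  # usually 1, sometimes 0 or 2
```

At temperature 0 the output never varies; at higher temperatures you'd occasionally see the other indices, which is exactly the "wiggle room" that makes ChatGPT give different answers to the same prompt.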