Cache loop and memory loss in GPT – a user-side fix (tested with GPT itself)

I’m a Korean high school student currently preparing for the CSAT (college entrance exam), and I noticed some persistent cache-loop behavior while using GPT for document-heavy tasks.

Repeated PDF-processing failures seemed to cause token overload and session slowdowns. So I tried manually analyzing the session, tracking token counts, and testing some user-side “optimizations”: auto-removing failed outputs and cleaning up redundant duplicate versions (a rough sketch of the idea is below).
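
For anyone curious what that cleanup could look like in code, here is a minimal Python sketch of the idea, not my actual workflow. It assumes messages are kept client-side as simple dicts, with a hypothetical ok flag marking failed PDF attempts, and uses the tiktoken library for token counting.

    # Minimal sketch of the user-side cleanup idea, not the author's actual script.
    # Assumes messages are stored client-side as dicts like
    #   {"role": "assistant", "text": "...", "ok": False}
    # where ok=False is a hypothetical marker for a failed PDF attempt.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")  # encoding used by recent GPT models

    def count_tokens(text: str) -> int:
        """Token count for one message body."""
        return len(enc.encode(text))

    def prune_session(messages, budget=8000):
        """Drop failed outputs and exact duplicates, then report token usage."""
        seen = set()
        kept = []
        for m in messages:
            if not m.get("ok", True):       # auto-remove failed outputs
                continue
            key = (m["role"], m["text"])
            if key in seen:                 # clean redundant duplicate versions
                continue
            seen.add(key)
            kept.append(m)

        used = sum(count_tokens(m["text"]) for m in kept)
        print(f"kept {len(kept)}/{len(messages)} messages, "
              f"{used}/{budget} tokens in use")
        return kept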

I used GPT itself to help write the report and interpret the data. It was a mix of curiosity, frustration, and… maybe procrastination. But it turned into a fun experiment.

I’ve been exploring GitHub and ChatGPT for less than a month, so there’s still a lot I’m unfamiliar with.

If there’s anything I’ve overlooked or could improve, I’d really appreciate your feedback.