So this is how my Dumb‑Agent is learning the Otherside map @OthersideMeta

I've been building a "Dumb‑Agent" that's basically doing "Roomba‑Mapping" in Otherside. No AI API credits. No vision model. No "analyze this screenshot" calls. Just a local Python app being annoying, deterministic, and relentless. And it's working — you can literally see the map taking the shape of the Nexus in a top‑down pixel grid.

- The trick: stop "imagining" the map, just measure it

The agent doesn't *guess* layouts. It just continuously answers:
- Where am I?
- Did I move?
- Did I teleport / rail / respawn?
- What does that imply about this tile, or the tile I was just at?

Everything gets written into a grid‑map (JSON) keyed by coordinates like "cx,cy", with simple labels for what's been learned. I can also label a grid cell while I play, as an [interesting note to analyse here]. You end up with "fog of war" getting peeled back, one cell at a time.

- How it knows where it is (two sources)

It tracks coordinates using two signals:
- UE game logs (heartbeat): the game is code, it writes logs, the bot reads the logs and parses the player heartbeat position. This is the "trusted" stream.
- Tab‑map OCR: opens the in‑game map, screenshots just the coordinate region, OCRs it locally, closes the map. It also rejects obvious OCR outliers so one cursed read doesn't wreck the map....