183 points by paulgb 6 days ago | 52 comments
Hey HN! We started Jamsocket a few years ago as a way to run ephemeral servers that last as long as a WebSocket connection. Those servers were sandboxed, so with the rise of LLMs we started to see people use them for arbitrary code execution.

While this works, it was clunkier than we wanted a first-principles code execution product to be. We built ForeverVM from scratch to be that product.

In particular, it felt clunky for app developers to have to think about sandboxes starting and stopping, so the core tenet of ForeverVM is using memory snapshotting to create the abstraction of a Python REPL that lives forever.
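The snapshotting itself happens below the Python level, but the developer-facing effect can be sketched with a toy model. Here, pickling the interpreter's namespace stands in for a real memory snapshot; this is an illustration of the abstraction, not ForeverVM's actual implementation:

```python
import pickle

def run(code, namespace):
    """One toy REPL step: execute code against a persistent namespace."""
    exec(code, namespace)

ns = {"__builtins__": __builtins__}
run("x = 41", ns)

# "Snapshot" the session: the real system freezes process memory, but
# serializing the namespace captures the same idea at the Python level.
snapshot = pickle.dumps({k: v for k, v in ns.items() if k != "__builtins__"})

# The machine behind the REPL can go away entirely. Later, restore the
# snapshot and pick up exactly where the session left off.
restored = {"__builtins__": __builtins__}
restored.update(pickle.loads(snapshot))
run("x = x + 1", restored)
print(restored["x"])  # 42
```

The point is that the caller never sees "start sandbox" or "stop sandbox" steps; the REPL just behaves as if it had been running the whole time.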

When you visit our site, you're given a live Python REPL; try it out!

---

Edit: here's a bit more about why/when/how this can be used:

LLMs are often given extra abilities through "tools", which are generally wrappers around API calls. For a lot of tasks (sending an email, fetching data from well-known sources), the LLM already knows how to write Python code that accomplishes the same thing.

Any time the LLM needs to do a specific calculation or process data in a loop, we find it is better to generate code than to attempt the computation in the LLM itself.
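For instance, a compound-interest question is the kind of arithmetic an LLM is prone to fumble token-by-token, but gets exactly right by generating a short loop. The numbers below are just an illustration:

```python
# "What is $1,000 at 5% annual interest, compounded monthly, after 10 years?"
# Running the loop is exact; doing the arithmetic "in the head" of an LLM is not.
principal, rate, years = 1_000.0, 0.05, 10

balance = principal
for _ in range(years * 12):          # one iteration per month
    balance *= 1 + rate / 12         # apply the monthly interest

print(round(balance, 2))
```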

We have an integration with Anthropic's Model Context Protocol (MCP), which is also supported by IDEs like Cursor and Windsurf. One surprising thing we've found: once it's installed and we ask a question about Python, the LLM will see that ForeverVM is available as a tool and verify its answer by running code automatically! So we cut down on hallucinations that way.
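In MCP terms, the client discovers the server's tools and routes the model's tool calls to them. A minimal sketch of that dispatch loop is below; the tool name `run_python`, its argument shape, and the local `exec` backend are all hypothetical stand-ins, not ForeverVM's actual API:

```python
import contextlib
import io

def run_python(code: str) -> str:
    """Hypothetical code-execution tool: run code, return captured stdout."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})
    return buf.getvalue()

# The registry an MCP client might hold after tool discovery.
TOOLS = {"run_python": run_python}

def handle_tool_call(name: str, arguments: dict) -> str:
    """Route a model-issued tool call to the matching tool."""
    return TOOLS[name](**arguments)

# Instead of asserting what a snippet prints, the model checks by running it.
out = handle_tool_call("run_python", {"code": "print(sum(range(10)))"})
print(out)  # 45
```

The hallucination reduction falls out of this loop: a claim about Python behavior becomes an executed check rather than a guess.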

