LLM code execution
PhariaEngine offers a CodeInterpreter, which lets you execute code that an LLM has produced.
Security
Because the Python interpreter that executes this code runs in WASM, there are tight controls on what the code can do. The interpreter provides a secure, sandboxed environment for executing code, ensuring that the code cannot access sensitive data or perform unauthorised actions.
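The effect of sandboxing can be illustrated with a plain-Python sketch. Note that this is purely conceptual: PhariaEngine isolates code at the WASM level, not with a restricted `exec`, and the `run_sandboxed` helper below is hypothetical, not part of any SDK.

```python
# Conceptual sketch only: PhariaEngine isolates code via WASM, not exec().
# Here we mimic a sandbox by exposing only a minimal set of builtins
# before running untrusted (e.g. LLM-generated) code.

SAFE_BUILTINS = {"abs": abs, "len": len, "range": range, "sum": sum}

def run_sandboxed(code: str) -> dict:
    """Execute untrusted code with a minimal builtin set; return its globals."""
    env = {"__builtins__": SAFE_BUILTINS}
    exec(code, env)
    return env

# Allowed: pure computation succeeds.
result = run_sandboxed("total = sum(range(10))")
assert result["total"] == 45

# Blocked: file access fails because `open` is not exposed to the sandbox.
blocked = False
try:
    run_sandboxed("open('/etc/passwd')")
except NameError:
    blocked = True
assert blocked
```

The real WASM sandbox is far stricter than this sketch, since it isolates the entire interpreter rather than filtering individual builtins.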
Limitations
Outbound HTTP requests are currently not supported in PhariaEngine. This means tools that need to make HTTP requests can only be executed locally with the DevCsi class and cannot be deployed to PhariaEngine.
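Conceptually, code running in the sandbox cannot even load an HTTP client. The sketch below illustrates this with a hypothetical restricted `exec` (again, not PhariaEngine's actual WASM mechanism): with no `__import__` builtin available, an attempt to import `urllib` fails before any request could be made.

```python
# Illustrative only: PhariaEngine enforces network isolation at the WASM level.
def run_sandboxed(code: str) -> dict:
    """Execute untrusted code with no builtins at all; return its globals."""
    env = {"__builtins__": {}}  # no __import__, so no module loading
    exec(code, env)
    return env

# Attempting to import an HTTP client raises ImportError inside the sandbox.
network_blocked = False
try:
    run_sandboxed("import urllib.request")
except ImportError:
    network_blocked = True
assert network_blocked
```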