A critical vulnerability in LangChainGo allows attackers to read sensitive files through server-side template injection (SSTI) in the Gonja template engine. Because exploitation requires only the ability to influence the prompt, LLM chatbots built on the library are a potential entry point. Updating to the latest version, which restricts the template engine's filesystem access, is crucial to mitigating this risk.
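As a minimal sketch of a defense-in-depth check (not the library's actual fix, which restricts filesystem access), an application could refuse to render user input that contains Jinja-style template delimiters before it ever reaches the template engine. The function name and the example payload below are illustrative assumptions:

```go
package main

import (
	"fmt"
	"strings"
)

// containsTemplateSyntax reports whether user-supplied text includes
// Jinja-style delimiters that a template engine such as Gonja would
// evaluate: {{ ... }}, {% ... %}, or {# ... #}.
func containsTemplateSyntax(input string) bool {
	for _, marker := range []string{"{{", "{%", "{#"} {
		if strings.Contains(input, marker) {
			return true
		}
	}
	return false
}

func main() {
	// Hypothetical SSTI payload attempting a file read via a template directive.
	payload := "{% include '/etc/passwd' %}"
	benign := "Summarize this document for me."

	fmt.Println(containsTemplateSyntax(payload)) // true
	fmt.Println(containsTemplateSyntax(benign))  // false
}
```

A denylist like this is a coarse stopgap; the robust fix is to treat user input strictly as data (template context values) rather than interpolating it into the template string itself.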
Date mentioned: 09-15