LLMs can be made fairly resistant to abuse. Most developers, however, are either incapable of building safer tools or unwilling to invest ...