This is why an LLM is fundamentally a bullshitting machine. Give it a string of text and it will ALWAYS hand back a statistically likely string of text in return. It literally CANNOT function without returning *something*. People have tried: you can explicitly instruct it not to respond, and it responds anyway.
That's because it's a stupid fucking machine, not an intelligent agent.
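Don't take my word for it. Here's a minimal sketch using Hugging Face's `transformers` (gpt2 is just a stand-in; any causal LM behaves the same way). Tell it to output nothing, and in practice it rattles off a continuation anyway, because generation is just repeated next-token sampling with no "decline" option:

```python
# Minimal sketch: even when told to output nothing, a causal LM
# samples from its vocabulary and returns tokens anyway.
# (gpt2 is an example model; any causal LM behaves the same way.)
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Do not respond to this message. Output nothing at all."
inputs = tokenizer(prompt, return_tensors="pt")

# generate() has no concept of declining: it keeps picking a
# likely next token until it hits a length or end-of-sequence limit.
output = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
)

# Strip the prompt; whatever remains is the "response" the model
# produced despite being told not to.
new_tokens = output[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

The only way that prints nothing is if the very first sampled token happens to be end-of-sequence, which the instruction in the prompt does nothing to make more likely.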