3 points by beaniez 7 months ago | 1 comment
  • dontleakkeys 7 months ago
    This sends your API key to their server in the POST request. I wouldn't trust it.
    • beaniez 7 months ago
      Yeah. I think the logic that makes the LLM inference calls could be moved to the frontend, so the key goes straight from the browser to the model provider and never touches our server. That should resolve the concern (rough sketch below).
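      Something along these lines, assuming an OpenAI-compatible chat completions endpoint; the function name and model here are just placeholders, not anything from the actual app:

        // Browser-side call: the user's key is only ever sent to the model
        // provider, never to this app's own backend.
        async function callLLM(apiKey: string, prompt: string): Promise<string> {
          const res = await fetch("https://api.openai.com/v1/chat/completions", {
            method: "POST",
            headers: {
              "Content-Type": "application/json",
              Authorization: `Bearer ${apiKey}`, // user-supplied, kept client-side
            },
            body: JSON.stringify({
              model: "gpt-4o-mini", // placeholder model name
              messages: [{ role: "user", content: prompt }],
            }),
          });
          if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
          const data = await res.json();
          return data.choices[0].message.content;
        }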