3 points by beaniez 4 days ago | 1 comment
  • dontleakkeys 4 days ago
    This sends your API key to their server in the POST request. I wouldn't trust it.
    • beaniez 3 days ago
      Yeah. The logic that makes the LLM inference calls could be moved to the frontend, so the API key never gets sent to our server and stays in the user's browser.
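      A minimal sketch of what that frontend-only approach might look like: the browser builds the request to the LLM provider directly, so the key only ever travels between the user's browser and the provider. The function name, model, and endpoint here are illustrative assumptions, not the app's actual code.

      ```javascript
      // Hypothetical sketch: build an inference request entirely client-side.
      // The API key is read from local storage and placed in the request the
      // browser sends to the provider -- the app's own backend never sees it.
      function buildInferenceRequest(apiKey, prompt) {
        return {
          // Provider endpoint (assumed OpenAI-compatible), not the app's server.
          url: "https://api.openai.com/v1/chat/completions",
          options: {
            method: "POST",
            headers: {
              "Content-Type": "application/json",
              // Key goes straight into the browser-to-provider request.
              "Authorization": `Bearer ${apiKey}`,
            },
            body: JSON.stringify({
              model: "gpt-4o-mini", // illustrative model name
              messages: [{ role: "user", content: prompt }],
            }),
          },
        };
      }

      // Usage in the browser (localStorage assumed as the key store):
      //   const { url, options } =
      //     buildInferenceRequest(localStorage.getItem("apiKey"), "hello");
      //   fetch(url, options).then(res => res.json()).then(console.log);
      ```

      The tradeoff is that the key now lives in the browser (localStorage is readable by any script on the page), so this swaps a server-trust concern for an XSS-exposure concern.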