22 points | by anujsharmax | 3 days ago | 4 comments
  • gdevillers | 2 days ago
    Will we see people being paid to host small single-GPU servers in their homes? I guess that would require redesigning the training system, because data transfer would be much slower and have far higher latency. Maybe that isn't even compatible with LLM training?
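    A rough back-of-envelope sketch of the bandwidth problem, assuming a 7B-parameter model and a 100 Mbit/s home uplink (illustrative numbers, not from the comment):

      PARAMS = 7e9                    # assumed model size: 7B parameters
      BYTES_PER_GRAD = 2              # assumed fp16/bf16 gradients
      UPLINK_BYTES_PER_S = 100e6 / 8  # assumed 100 Mbit/s uplink, in bytes/s

      grad_bytes = PARAMS * BYTES_PER_GRAD          # ~14 GB per full sync
      sync_seconds = grad_bytes / UPLINK_BYTES_PER_S
      print(f"one full gradient exchange: ~{sync_seconds / 60:.0f} minutes")
      # -> ~19 minutes per optimizer step, vs. sub-second over NVLink/InfiniBand

    Under those assumptions a single data-parallel sync takes on the order of twenty minutes, which is why naive distributed training over home connections doesn't work without heavy gradient compression or much less frequent synchronization.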
  • thenthenthen | 2 days ago
    Don't forget sugar cane: https://aisupplychain.vercel.app/
  • 7777777phil | 2 days ago
    128 to 210 WEEKS of lead time for power transformers. You can fab more GPUs in 18 months, but the electrical infrastructure to run them takes 3-4 years, and there are maybe a handful of manufacturers. Feels like the stories from the telecom fiber overbuild of 2000, except this time the supply chain can't even overbuild if it wanted to.
  • ekropotin | 2 days ago
    I miss the good ol' pre-LLM days so much.