I’m trying to learn about licensing. Why do you like AGPL-3.0 compared to others?
Which image? I’ve seen a few WireGuard options on Docker Hub
They responded to me. It is indeed the RTX 4070 SUPER 12GB (not 16GB). Going to cancel the order
The listing will appear correctly again on 1/24/2024
Should have spent more time…you’re right.
According to some articles, you can self-host smaller-parameter LLMs and/or quantized versions, for example the 7B models. Recommendations were 16GB+ of VRAM; some even pulled it off with less
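The VRAM numbers above follow from simple arithmetic: weight memory is roughly parameter count times bytes per parameter. A minimal sketch of that estimate, assuming weights-only and ignoring KV cache, activations, and framework overhead (so it's a floor, not a full requirement):

```python
# Rough weights-only VRAM estimate for an LLM.
# Ignores KV cache, activations, and runtime overhead, so real usage is higher.

def weights_gb(n_params_billion: float, bits_per_param: int) -> float:
    """Approximate model weight size in GB (1 GB = 1e9 bytes)."""
    return n_params_billion * 1e9 * bits_per_param / 8 / 1e9

# A 7B model at FP16 needs ~14 GB just for weights,
# while a 4-bit quantized copy fits in ~3.5 GB.
print(weights_gb(7, 16))  # 14.0
print(weights_gb(7, 4))   # 3.5
```

This is why a 4-bit quantized 7B model is the usual recommendation for cards in the 8–16GB range: FP16 weights alone already exceed 12GB.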
I’m primarily hoping whatever GPU does arrive has 16GB VRAM. That was a main requirement I was trying to satisfy
Weird thing is, the product page doesn’t exist anymore. I was able to locate the Google cached version of it though (text-only version seems to work): https://webcache.googleusercontent.com/search?q=cache:cgLuNDy4rb0J:https://www.newegg.com/p/3D5-0007-00HV4
Ollama has been great for self-hosting, but also check out vLLM, as it’s the new shiny self-hosting toy
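For reference, getting started with either is only a couple of commands. A hedged sketch, assuming Ollama is installed and you want a quantized 7B model (the exact model tags and the vLLM model name here are illustrative, not prescriptive):

```shell
# Ollama: pull and chat with a 7B model (Ollama serves quantized builds by default)
ollama pull llama2:7b
ollama run llama2:7b

# vLLM: install and expose an OpenAI-compatible API server
# (model name is an example; substitute whatever fits your VRAM)
pip install vllm
python -m vllm.entrypoints.openai.api_server --model meta-llama/Llama-2-7b-chat-hf
```

Ollama is the easier on-ramp for a single user; vLLM is geared toward throughput and serving multiple concurrent requests.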