Recommend a small hardware server

… I really want to run an AI model

Hardware server?
Are you looking to run local AI models on your computer?
Or do you mean you want your own hardware components to build a server yourself?
Do you have any specific hardware requirements?

I hope I’ve understood you correctly.

yeah, build a server
a desktop server
it can be big

This is a complex question. Do you know what model you want to use? Llama can run on anything from a fairly low-cost machine to a very expensive setup, depending on the variant you choose.

GPU Requirement Guide for Llama 3 (All Variants)

The model's requirements will determine how much you need to spend, and can also increase the “size” of the computer's footprint.
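As a very rough sanity check, you can estimate memory needs from the parameter count. This is a rule of thumb I'm assuming here (weights plus ~20% overhead for KV cache and activations), not an official figure from any guide:

```python
def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weights plus ~20% for KV cache/activations."""
    return params_billion * bytes_per_param * overhead

# Llama 3 8B: fp16 (2 bytes/param) vs. 4-bit quantized (0.5 bytes/param)
print(estimate_vram_gb(8, 2.0))   # ~19 GB -> needs a 24 GB card
print(estimate_vram_gb(8, 0.5))   # ~5 GB  -> fits on a modest GPU
```

So the same model can land on very different hardware depending on quantization, which is why the budget range is so wide.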

If you want a cheap local LLM, I’ve seen Jeff Geerling run an AI assistant on a Raspberry Pi. It’s not going to be smart or fast, but at least you have one.

It’s not just the RPi 5, of course; it also uses an eGPU dock and an AMD RX 6700 XT…

… but it is a server.


Framework also has an upcoming “not-a-desktop” motherboard that can run bigger LLMs, but it will cost you at least USD 1999 plus taxes and shipping. In return you get an ungodly 128 GB of RAM shared between the system and the GPU, so you’ll have a good time tinkering with AI models. Do note that the RAM is soldered and non-upgradable, so what you buy is what you’ll have for the lifetime of the motherboard.
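To put that 128 GB in perspective, here’s a back-of-the-envelope fit check (my own rough figures for weight sizes, not Framework’s or Meta’s numbers):

```python
def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate size of the model weights alone, ignoring overhead."""
    return params_billion * bytes_per_param

BUDGET_GB = 128  # the board's shared system+video RAM

# Llama 3 70B: 4-bit quantized weights are ~35 GB and fit comfortably;
# full fp16 weights are ~140 GB and would not fit even here.
print(weights_gb(70, 0.5) <= BUDGET_GB)  # True
print(weights_gb(70, 2.0) <= BUDGET_GB)  # False
```

That’s the real draw of big shared-memory boards: quantized 70B-class models become practical without a multi-GPU rig.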


Nvidia Jetson?

Based on your previous posts and replies:
If you’re in China, just search for servers on the usual online shopping sites and buy one directly.
You plan to run AI models, right? There’s no need to hide that from the government.
Buying servers from overseas websites is quite a hassle. You can’t purchase them anonymously anyway, and trying to would look really suspicious.

First, you should learn about server deployment and how AI models work.
Then you can think about buying one. Keep in mind that running AI models locally requires a powerful GPU.
