Small AI Inference Workstation Overview
This Small AI Inference Workstation is built around the i5-13600K paired with an RTX 4070 Ti Super 16GB video card. This combination delivers strong CPU and GPU performance for local inference and handles large datasets comfortably.
This system can handle models up to 13B parameters. If you need more GPU compute or VRAM, we recommend the RTX 4090 24GB.
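The 13B figure above follows from a rough VRAM budget: weights times bytes per parameter, plus headroom for the KV cache and activations. A minimal sketch of that arithmetic (the `vram_gb` helper and the 1.2x overhead factor are illustrative assumptions, not vendor-published sizing rules):

```python
# Hypothetical sizing helper: approximate VRAM needed for LLM inference.
def vram_gb(params_billion: float, bytes_per_param: float,
            overhead: float = 1.2) -> float:
    """Weights (GB) times an assumed ~1.2x factor for KV cache/activations."""
    return params_billion * bytes_per_param * overhead

# A 13B model at 4-bit quantization (0.5 bytes/param) vs FP16 (2 bytes/param):
print(round(vram_gb(13, 0.5), 1))  # ~7.8 GB: fits the 16GB card
print(round(vram_gb(13, 2.0), 1))  # ~31.2 GB: exceeds even the 24GB RTX 4090
```

Under these assumptions, a quantized 13B model fits in 16GB with room to spare, while running the same model at full FP16 precision is what pushes you toward larger cards.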
This computer does not include a Windows license and is built to run Linux; however, the choice of OS is ultimately up to the customer.
Small AI Inference Workstation Limitations
This platform is currently limited to 32 threads, one GPU, and 192GB of memory. Please let us know if you need a more robust platform.