Large AI Inference Rack Workstation

This Large AI Inference Rack Workstation is built around the i9-13900K with an RTX A6000 48GB video card.

Description

Large AI Inference Rack Workstation Overview

Built around the Intel Core i9-13900K with an NVIDIA RTX A6000 48GB video card, this workstation provides a very powerful solution for computation on very large datasets. A second RTX A6000 can be easily added for a total of 96GB of VRAM.

This generative AI rack workstation can handle LLMs up to roughly 70B parameters, or 8x7B mixture-of-experts models. If you need more GPU compute while maintaining 48GB of VRAM per GPU, you'll need to move to a Xeon or Threadripper build with twin RTX 6000 Ada cards; note that the Ada generation drops NVLink, so the two GPUs communicate over the PCIe bus.
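The 70B figure above follows from simple arithmetic: a 70B-parameter model quantized to 4 bits per weight needs about 35GB for weights alone, leaving headroom in 48GB for KV cache and activations. A minimal sketch of that estimate (the 20% overhead factor is an assumption and varies with context length and inference engine):

```python
def vram_estimate_gb(params_billion: float,
                     bits_per_weight: int = 4,
                     overhead: float = 1.2) -> float:
    """Rough VRAM needed to run an LLM: weight storage plus an
    assumed ~20% overhead for KV cache and activations."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb * overhead

# 70B model at 4-bit quantization: ~42 GB, fits on one 48GB RTX A6000
print(round(vram_estimate_gb(70), 1))        # 42.0
# Same model at 8-bit would need ~84 GB, i.e. the dual-GPU configuration
print(round(vram_estimate_gb(70, bits_per_weight=8), 1))  # 84.0
```

This is a back-of-the-envelope check, not a guarantee; real requirements depend on the serving stack and context length.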

This computer does not include a Windows license and is configured to run Linux; however, the final OS choice is up to the customer.

If you need more than 192GB of RAM, we recommend a build based on Threadripper 7000, which also supports ECC memory. 384GB is the practical maximum on non-Pro Threadripper without resorting to extremely expensive 128GB or 256GB DIMMs; 768GB is the practical maximum on Threadripper Pro for the same reason.

Large AI Inference Rack Workstation Limitations

This platform is currently limited to 32 threads, 2 GPUs, and 192GB of memory. Please let us know if you need a more robust platform.

Builds In This Set

Small AI Inference | Medium AI Inference | Large AI Inference | Xeon AI Training

Threadripper Pro Scientific Computing | Xeon Scientific Computing

Threadripper Pro Data Science | Xeon Data Science | Xeon Max Data Science

What You Need To Know

i9-13900K
(24 core/32 thread, 5.8 GHz single-core turbo)
RTX A6000 48GB
(very high compute, max VRAM, onboard NVLink)
64GB DDR5-6000 memory
(max of 4 DIMMs/192GB)
1TB + 4TB Gen4 NVMe dual-drive setup
(OS/programs, main storage)
High-airflow rack chassis (rails included), 1200W Platinum power supply, 360mm CPU liquid cooler

Top Flight Computers is a SolidWorks Certified Solution Partner

See some of our customer build videos and live streams on our YouTube channel!

Pricing

This Large AI Inference Rack Workstation is $9500 as currently designed, and includes a 1 year extended warranty.

Online pricing defaults to credit card payment (ACH and check are also available); any applicable sales tax and shipping costs are extra.

Please contact us if you found one of our previous builds that you think would suit your needs well.

B2B financing is available up to 60 months, $5000 minimum spend.

Example Systems

Interested In Buying This Rack Workstation? Contact Us!

See our brief update on the status of components, including pricing and lead time