
GPU server for AI development

2026/2/12 TEGARA Co., Ltd. Research workstation, Artificial intelligence, R & D PC configuration example (Tegsys)

The client contacted us about building a workstation for generative AI that could run large language models locally for in-house use. The software under consideration included Ollama, LM Studio, Dify, and Python.

Specifically, the customer requested a CPU with a sufficient number of cores and a memory configuration of at least 1TB, preferably expandable to 2-4TB.
For GPUs, they wanted to install as many NVIDIA RTX PRO 6000 Blackwell cards as possible, assuming the chassis could accommodate multiple cards.
The OS had to be Ubuntu, and the power supply had to operate in a 100V environment.
For storage, they asked for the maximum number of NVMe SSDs at the highest possible capacity, providing high overall processing performance and room for expansion.
The client wanted a configuration that met these requirements within a budget of approximately 20 million yen.

Taking the above into consideration, we proposed the following configuration:

CPU: Intel Xeon 6515P 2.30GHz (TB 3.80GHz), 16C/32T
Memory: 4TB total, DDR5-6400 REG ECC (128GB x 32)
Storage 1: 7.68TB U.2 NVMe SSD
Storage 2: 15.36TB U.2 NVMe SSD x 5 (RAID 5)
GPU: NVIDIA RTX PRO 6000 Blackwell Server Edition x 4
Network: 10GbE RJ45 2-port network card
Chassis/PSU: 4U rackmount enclosure, 3200W/200V redundant power supply (3+1)
OS: Ubuntu 24.04
Other: Broadcom MegaRAID RAID card; rail kit; 3-year send-back warranty (1-year standard + 2-year extended)

CPU/memory configuration optimized for LLM inference and generative AI processing

This configuration prioritizes the CPU core count and memory bandwidth required to run generative AI and local LLMs. Processing large-scale models efficiently depends not only on GPU performance but also on multi-core CPU performance and the ability to access large amounts of memory.

To that end, we adopted the latest generation of server CPUs, aiming for both multi-core processing power and large memory capacity. The customer initially envisioned a 1TB memory configuration, but in anticipation of larger future models and RAG workloads, we proposed the maximum possible capacity of 4TB.
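As a rough back-of-the-envelope check (not from the original article; the model size and precisions are hypothetical), the memory needed just to hold model weights can be estimated as parameter count times bytes per parameter:

```python
def weight_memory_gib(params_billions: float, bytes_per_param: float) -> float:
    """Approximate memory footprint of model weights alone.

    Excludes KV cache, activations, and OS overhead, so real usage is higher.
    """
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A hypothetical 70B-parameter model at common precisions:
print(round(weight_memory_gib(70, 2.0)))  # fp16 -> ~130 GiB
print(round(weight_memory_gib(70, 1.0)))  # int8 -> ~65 GiB
print(round(weight_memory_gib(70, 0.5)))  # int4 -> ~33 GiB
```

With 4TB of host memory, even very large models leave ample room for RAG indexes and caches alongside the weights.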

High-density GPU configuration featuring RTX Pro 6000 Blackwell

Because the customer wanted to install multiple NVIDIA RTX Pro 6000 Blackwell GPUs, we chose a 4U rack-mount chassis and a motherboard configuration that would enable stable operation of the maximum number of GPUs.

Installing multiple RTX PRO 6000-class GPUs imposes significant constraints on PCIe lane count, heat dissipation, and power supply capacity. This configuration satisfies all of these requirements, delivering stable AI compute as a high-density GPU server.
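As an illustrative calculation (the per-card VRAM figure is our assumption; verify it against the exact SKU), total GPU memory scales linearly with card count:

```python
VRAM_PER_CARD_GB = 96  # assumed VRAM per RTX PRO 6000 Blackwell card; verify for the exact SKU
NUM_CARDS = 4

total_vram_gb = VRAM_PER_CARD_GB * NUM_CARDS
print(total_vram_gb)  # 384

# At 4-bit quantization (~0.5 bytes/param), weights for very large models can
# be sharded across the four cards, before accounting for KV cache and
# runtime overhead.
```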

High-capacity NVMe storage and RAID configuration

In response to the request for the maximum number of NVMe SSDs at the highest possible capacity, we proposed a configuration that combines read/write speed with redundancy by installing multiple NVMe SSDs in a RAID array alongside the system SSD.

This lets research data and embedding caches be handled at high speed, delivering strong performance in local inference setups such as RAG pipelines and Dify/Ollama.
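For reference, usable RAID 5 capacity is one disk short of the raw total, since one disk's worth of space holds parity. A minimal sketch:

```python
def raid5_usable_tb(disks: int, capacity_tb: float) -> float:
    """Usable capacity of a RAID 5 array: (n - 1) disks' worth; one disk's worth holds parity."""
    if disks < 3:
        raise ValueError("RAID 5 requires at least 3 disks")
    return (disks - 1) * capacity_tb

# Five 15.36TB SSDs in RAID 5, as in this configuration:
print(raid5_usable_tb(5, 15.36))  # 61.44
```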

Power supply environment (100V preferred → changed to 200V)

The client initially requested operation in a 100V environment, but because 100V cannot supply enough power for a configuration with this many GPUs, they opted for a 200V environment instead.

This allows a server-grade power supply unit to be used, ensuring the stability of the entire machine and leaving headroom for future expansion.
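The numbers behind that decision can be sketched as follows; the circuit ratings are typical values we are assuming, not measurements of the customer's site:

```python
# Typical circuit ratings in Japan (assumptions for illustration):
CIRCUIT_100V_W = 100 * 15   # a standard 100V/15A outlet tops out at 1500W
CIRCUIT_200V_W = 200 * 16   # a common 200V/16A circuit provides 3200W
PSU_RATING_W = 3200         # this configuration's redundant PSU rating

print(PSU_RATING_W > CIRCUIT_100V_W)   # True: a single 100V outlet cannot feed it
print(PSU_RATING_W <= CIRCUIT_200V_W)  # True: 200V makes the full PSU rating usable
```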

Keywords

・What is Ollama?
Ollama is an open-source platform for running large language models (LLMs) quickly in a local environment. It is lightweight, GPU-optimized, and easy to integrate with Python and web applications. Because inference stays local, it can be used in research and corporate settings that require privacy protection, for example to automate mass spectrometry data processing, assist with peptide identification, and generate analysis scripts.
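As a sketch of how a workstation like this might be queried from Python once Ollama is running, the snippet below targets Ollama's REST API on its default port (11434); the model name is a placeholder, and only the standard library is used:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> dict:
    """Build a non-streaming generation request for Ollama's REST API."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send the request to a locally running Ollama server (not called here)."""
    data = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

payload = build_request("llama3", "Summarize this analysis report in one sentence.")
print(payload["stream"])  # False
```

Calling generate() requires a local Ollama server with the named model already pulled.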

Reference: Ollama – Run LLMs locally

・What is LM Studio?
LM Studio is an integrated desktop application for managing and running large language models locally on a PC. Model downloads, inference, and prompt operations are all handled through a GUI, and it also helps with generating research scripts and building mass spectrometry (MS/MS) data analysis workflows. It is well suited to engineers who need a secure, locally run AI environment.

Reference: LM Studio – Local AI Desktop Application

・What is Dify?
Dify is an integrated platform for building AI applications with no-code/low-code tools. It facilitates RAG (Retrieval-Augmented Generation), workflow automation, model switching, and more, and can be used to generate reports at research sites, prototype tools that support mass spectrometry data interpretation, and build peptide-identification helper AIs. It supports both cloud and local deployment.

Reference: Dify – AI app development platform

・What is Python?
Python is a general-purpose programming language used in a wide range of applications, from scientific computing to AI and machine learning, mass spectrometry data analysis, and the development of peptide identification algorithms. Its rich ecosystem of libraries, including NumPy, SciPy, pandas, and pyteomics, makes it easy to automate glycopeptide analysis and MS/MS spectrum processing. It is an essential tool for researchers and engineers.
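As a trivial example of the MS/MS spectrum processing mentioned above, the snippet below performs base-peak normalization with NumPy (the peak list is made-up data):

```python
import numpy as np

# Hypothetical MS/MS peak list: m/z values and raw intensities (made-up data).
mz = np.array([175.12, 286.14, 399.23, 512.30])
intensity = np.array([1200.0, 4800.0, 2400.0, 600.0])

# Base-peak normalization: express each intensity as a percent of the strongest peak.
relative = intensity / intensity.max() * 100
print(relative.tolist())  # [25.0, 100.0, 50.0, 12.5]
```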

Reference: Python official website

Feel free to request a quote based on your usage and budget via Tegsys' simple inquiry form.

■ For details and inquiries about this PC case, click here:
GPU server for AI development

* Please enter the name of the case or your desired conditions.

  • Ubuntu
  • Deep learning
  • Machine learning
  • Generative AI
  • Large-scale language models
