
A customer contacted us about a Deep Learning machine for medical image analysis. Because training takes a long time in their current environment, they wanted to switch to a machine with higher specifications. They planned to transfer the NVIDIA RTX A6000 48GB installed in the current machine into the new one.
The customer's budget was 1.5 million yen, and they provided the following desired specifications.
| CPU | Intel or AMD (performance not critical) |
| Memory | 256GB or more (4 or 8 channels) |
| Storage | 2TB M.2 SSD for the system drive |
| Video | RTX A6000 (transferred from the current machine) |
| Power supply | Must run in a 100V power environment |
| OS | Ubuntu 22.04 |
Additionally, they plan to add more GPUs and storage in the future, and would like a configuration that takes scalability into consideration.
Taking these conditions into consideration, we have provided the following specifications:
| CPU | Intel Xeon W7-3465X (2.50GHz, 28 cores) |
| Memory | 256GB REG ECC (32GB x 8) |
| Storage | 1TB SSD M.2 NVMe Gen4 |
| Video | NVIDIA RTX A6000 48GB (supplied by customer) |
| Network | Onboard (1GbE x1 / 10GbE x1) |
| Chassis + power supply | Tower chassis + 1600W |
| OS | Ubuntu 22.04 |
Configuration incorporating supplied items
To fit the budget, we chose the Xeon W7-3465X (28 cores), the latest CPU in this class as of August 2024.
The total memory capacity is 256GB (32GB x 8). Eight slots remain free, so memory can be added at a later date.
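As a rough sketch of the expansion headroom, the arithmetic above can be checked as follows. This assumes a 16-DIMM board (8 channels x 2 slots per channel), which is typical for this CPU class but is an assumption, not a confirmed detail of the configuration:

```python
# Hedged sketch: installed memory and expansion headroom.
# The 16-slot board layout is an assumption, not a confirmed spec.
TOTAL_SLOTS = 16   # assumed: 8 channels x 2 DIMMs per channel
POPULATED = 8      # one 32GB module per channel
MODULE_GB = 32

current_gb = POPULATED * MODULE_GB             # installed capacity
free_slots = TOTAL_SLOTS - POPULATED           # slots left for expansion
max_gb = current_gb + free_slots * MODULE_GB   # if filled with 32GB modules

print(current_gb, free_slots, max_gb)  # 256 8 512
```

Populating one module per channel keeps all 8 channels active, so memory bandwidth is not sacrificed for the sake of future headroom.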
The supplied GPU, the RTX A6000, will be packed and shipped to us by the customer.
Configuration that allows for future expansion after purchase
When we asked the customer about their needs, they told us they would like to add a second RTX A6000 card after purchasing the machine.
For this reason, we specified a generous 1600W power supply, allowing a total of two RTX A6000s to be installed and used without any problems. The internal cables required for adding a GPU are routed so that they are easy to reach during the upgrade.
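The headroom for a second GPU can be sanity-checked with a rough power budget. The wattage figures below are assumptions based on typical published specs (RTX A6000 around 300W board power, this CPU class around 350W), not measured values for this machine:

```python
# Hedged power-budget sketch; component wattages are assumptions
# based on typical published specs, not measurements.
PSU_W = 1600
GPU_W = 300    # RTX A6000 board power (approx. published spec)
CPU_W = 350    # Xeon W7-3465X max turbo power (approx.)
BASE_W = 150   # motherboard, memory, storage, fans (rough allowance)

def total_draw(num_gpus: int) -> int:
    """Rough worst-case system draw with num_gpus GPUs installed."""
    return CPU_W + BASE_W + num_gpus * GPU_W

for n in (1, 2):
    draw = total_draw(n)
    # Aim to stay under ~80% of PSU capacity for sustained loads.
    print(n, draw, draw <= PSU_W * 0.8)
```

Under these assumptions, two GPUs draw roughly 1100W at peak, which stays comfortably under the ~1280W mark that leaves 20% headroom on a 1600W unit.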
The configuration in this case study is based on the conditions given by the customer.
Please feel free to contact us even if you are considering different conditions from what is posted.
■ Keyword: What is TensorFlow 2? An open-source machine learning library developed by Google for research and development in deep learning and neural networks. By operating on multidimensional arrays called tensors, it can efficiently execute a wide range of machine learning tasks such as image recognition, speech recognition, and natural language processing. Tools are available for deploying models developed at the research stage into a production environment, where they can serve inference. In addition, TensorFlow Lite enables inference on mobile and embedded devices.
■ Contact us for details and inquiries about this case study. * Please include the name of the case study or your desired conditions.



