Machine with WSL2 and Docker setup


Customer case study: AI learning (No. PC-11641B)

After reading the case study article, the customer asked us to carry out the setup described in it.
Specifically, they requested that Ubuntu be run on Windows, with CUDA Toolkit 11.8, TensorFlow, PyTorch, and Docker available within that Ubuntu environment.

Based on the information provided, we proposed the following configuration:

CPU: Intel Core i7-14700, 2.10 GHz (8C/16T) + 1.50 GHz (12C/12T)
Memory: 32 GB total, DDR5-5600 (16 GB x 2)
Storage: 2 TB SATA SSD
Video: NVIDIA GeForce RTX 4090 24GB
Network: Onboard 2.5GBase-T x 1, Wi-Fi 6E x 1
Chassis + PSU: Mid-tower chassis, 1000 W 80 PLUS Platinum
OS: Microsoft Windows 11 Professional 64-bit
Other: WSL2 (Ubuntu 22.04); CUDA Toolkit 11.8 installed on WSL2; TensorFlow / PyTorch / Docker (all set up on WSL2)

This machine was configured to meet the customer's needs, based on the parts used in case No. PC-11641B above.

This configuration pairs an Intel Core i7-14700 with an RTX 4090, making it capable of handling large-scale deep learning and GPU-accelerated analysis.

As a development environment, WSL2 has been enabled and an Ubuntu environment set up on Windows.
In addition, CUDA Toolkit, TensorFlow, PyTorch, and Docker are pre-installed in that Ubuntu environment, so AI development and data analysis can begin immediately on arrival.

If you would like to install WSL2 and Docker yourself, please see this guide page for detailed instructions.

reference:
Easy even for beginners! How to install Ubuntu on Windows
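For reference, the overall flow looks roughly like the following. This is a sketch only: commands may differ by Windows and Ubuntu version, and the CUDA Toolkit step should follow NVIDIA's WSL-specific installation instructions rather than a plain apt install.

```
# In Windows PowerShell (run as administrator): install WSL2 with Ubuntu 22.04
wsl --install -d Ubuntu-22.04

# Inside the Ubuntu shell: update packages, then install Docker from Ubuntu's repository
sudo apt update && sudo apt upgrade -y
sudo apt install -y docker.io
sudo usermod -aG docker $USER    # run docker without sudo (re-login required)
```

TensorFlow and PyTorch can then be installed with pip inside the same Ubuntu environment.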

Recommended for those active in these fields

  • Machine learning
  • AI Development
  • Numerical calculation
  • Software Development

We also offer configuration suggestions and setup tailored to your research content and operational infrastructure.
Please feel free to contact us with any requests that are not included in the contents listed here.

Keywords

・What is WSL2?

WSL2 is version 2 of the Windows Subsystem for Linux, which runs a full Linux kernel on Windows inside a lightweight virtual machine.
It offers better performance and system-call compatibility than WSL1, and lets you use a Linux environment almost as if it were native.
It is used to run Linux-specific tools in research and development.

reference: Microsoft Learn WSL *Jumps to an external site
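One quick way to confirm you are inside a WSL2 distribution is to inspect the kernel release string, which on WSL2 typically contains "microsoft" (the exact string varies by WSL release):

```shell
# Print the running kernel release; on WSL2 this usually looks like
# "5.15.x-microsoft-standard-WSL2". On other Linux systems it will differ.
uname -r
```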

・What is CUDA Toolkit?

CUDA Toolkit is a GPU computing development environment provided by NVIDIA.
It enables GPU programming in C/C++ and Fortran, allowing high-speed computation in deep learning, numerical analysis, and more.
It is widely used in AI development and simulation fields.

reference: NVIDIA CUDA Toolkit *Jumps to an external site
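After installing the CUDA Toolkit, a simple sanity check is to confirm that `nvcc` (the CUDA compiler driver) is reachable on your PATH. A minimal probe using only the Python standard library:

```python
import shutil

# shutil.which returns the full path to nvcc if the CUDA Toolkit's bin
# directory is on PATH, or None if it cannot be found.
nvcc_path = shutil.which("nvcc")
if nvcc_path:
    print("CUDA Toolkit found:", nvcc_path)
else:
    print("nvcc not found; check that the CUDA Toolkit is installed and on PATH")
```

On WSL2, `nvidia-smi` can also be run inside Ubuntu to confirm the driver sees the GPU.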

・What is TensorFlow?

TensorFlow is a machine learning library developed by Google.
It makes neural network construction and model training more efficient, and is used in a wide range of AI development, including image recognition and natural language processing.
It also supports high-speed calculations using GPUs.

reference: TensorFlow official website *Jumps to an external site
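A minimal check that TensorFlow can see the GPU, written defensively so it also runs in environments where TensorFlow is not installed (a sketch, not a full verification):

```python
import importlib.util

gpus = None
# Only import TensorFlow if it is actually installed in this environment.
if importlib.util.find_spec("tensorflow") is not None:
    import tensorflow as tf
    # Lists physical GPU devices; an empty list means TensorFlow sees no GPU.
    gpus = tf.config.list_physical_devices("GPU")
    print("GPUs visible to TensorFlow:", gpus)
else:
    print("TensorFlow is not installed in this environment")
```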

・What is PyTorch?

PyTorch is a deep learning library developed by Meta (formerly Facebook).
It is characterized by flexible model construction using dynamic computational graphs, and is widely used from research and development to commercial services.
It also supports high-speed calculations using GPUs.

reference: PyTorch official website *Jumps to an external site
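Similarly, a small defensive check that PyTorch can use CUDA, falling back to CPU where no GPU is present (a sketch only):

```python
import importlib.util

cuda_ok = None
# Only import PyTorch if it is actually installed in this environment.
if importlib.util.find_spec("torch") is not None:
    import torch
    # True when PyTorch was built with CUDA support and a usable GPU is present.
    cuda_ok = torch.cuda.is_available()
    device = "cuda" if cuda_ok else "cpu"
    # A tiny tensor operation on whichever device is available.
    x = torch.rand(2, 2, device=device)
    print("CUDA available:", cuda_ok, "| sample sum:", float(x.sum()))
else:
    print("PyTorch is not installed in this environment")
```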

・What is Docker?

Docker is a container-based virtualization technology that runs applications in isolated units called containers.
Containers are lightweight and start quickly, which simplifies environment setup and deployment.

reference: Docker official *Jumps to an external site
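As an illustration, a GPU-ready container can be built on NVIDIA's official CUDA base images. The image tag and package versions below are examples only; choose ones matching your CUDA version:

```dockerfile
# Example only: an NVIDIA CUDA 11.8 runtime image on Ubuntu 22.04.
FROM nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04

# Install Python and pip, then PyTorch (versions are illustrative).
RUN apt-get update && apt-get install -y python3 python3-pip && \
    pip3 install torch
```

To expose the GPU to a container at runtime, pass `--gpus all` to `docker run` (this requires the NVIDIA container toolkit integration that ships with recent Docker/WSL2 setups).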