
NVIDIA AI Workbench Simplifies GPU Utilization on Windows

Rebeca Moen Aug 26, 2024 16:27

NVIDIA's AI Workbench streamlines data science, ML, and AI projects across PCs, workstations, datacenters, and cloud environments.

NVIDIA has introduced the AI Workbench, a free and user-friendly development environment manager designed to streamline data science, machine learning (ML), and artificial intelligence (AI) projects across various systems including PCs, workstations, datacenters, and cloud environments, according to the NVIDIA Technical Blog.

Streamlined Development Environment

The AI Workbench lets developers create, test, and prototype projects on multiple operating systems, including Windows, macOS, and Ubuntu. Development environments and computational workloads can be moved seamlessly between local and remote systems, so work can run wherever cost, availability, and scale are most favorable.

This tool focuses on enhancing the developer experience while allowing significant customization, particularly through containers, which are essential for GPU-accelerated work. NVIDIA also works with ecosystem partners to improve the experience; for instance, a partnership with Canonical provides the Ubuntu WSL distribution used for AI Workbench installations on Windows.

Managed Docker Desktop Installation

One of the key features of the latest AI Workbench release is its managed installation of Docker Desktop, a container runtime recommended for both Windows and macOS. Previously, setting up Docker required manual steps, but the new collaboration with Docker enables a seamless installation process directly from the AI Workbench interface.

NVIDIA's AI Workbench now automates several tasks, including:

  • Installing Docker Desktop: This eliminates the need to exit the AI Workbench installer to install Docker manually.
  • Configuring Docker Desktop on Windows: The AI Workbench now automatically configures Docker Desktop to use its own WSL distribution, NVIDIA-Workbench (a quick connectivity check is sketched below).
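Once the managed installation completes, the Docker daemon should be reachable from code as well as from the Workbench UI. The following is a minimal sanity-check sketch, assuming the Docker SDK for Python (the `docker` package) is installed; it is not part of the AI Workbench itself.

```python
# Minimal sanity check that the managed Docker Desktop install is reachable.
# Assumes the Docker SDK for Python is installed: pip install docker
import docker

client = docker.from_env()  # connect via the default Docker socket / named pipe
print("Docker daemon reachable:", client.ping())            # True if the daemon responds
print("Docker server version:", client.version()["Version"])
```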

New AI Workbench Projects

The latest release also includes new example projects designed to assist developers. These projects are structured as Git repositories that define containerized development environments, supporting IDEs like Jupyter and Visual Studio Code.

One notable example is the Hybrid-RAG project on GitHub, which allows users to clone the project and run the RAG application with just a few clicks. The project can utilize either local GPUs or cloud endpoints for inference.
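The "hybrid" part refers to where inference runs. As a rough sketch of the idea (not the project's actual code), the same OpenAI-compatible client can be pointed at a local GPU endpoint or at a hosted cloud endpoint; the base URLs, API key, and model name below are placeholders to adapt.

```python
# Illustrative only: switch inference between a local GPU endpoint and a cloud endpoint.
from openai import OpenAI

USE_LOCAL = True  # flip to False to route requests to the cloud endpoint instead

client = OpenAI(
    base_url="http://localhost:8000/v1" if USE_LOCAL else "https://integrate.api.nvidia.com/v1",
    api_key="not-needed-locally" if USE_LOCAL else "YOUR_API_KEY",  # placeholders
)

resp = client.chat.completions.create(
    model="meta/llama3-8b-instruct",  # assumed model name; match whatever your endpoint serves
    messages=[{"role": "user", "content": "Summarize the uploaded documents."}],
)
print(resp.choices[0].message.content)
```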

Agentic RAG

The Agentic RAG project adds AI agents with web-search tool calling to RAG pipelines, dynamically retrieving new documents from the web to better answer queries. It features a customizable Gradio Chat app for running inference against various endpoints, including cloud and self-hosted microservices.
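For orientation, a Gradio chat app in its simplest form looks roughly like the sketch below; the project's actual app layers endpoint selection, tool calling, and web search on top of this pattern. The endpoint URL and model name here are assumptions.

```python
# Bare-bones Gradio chat sketch against an assumed OpenAI-compatible endpoint.
import gradio as gr
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")  # assumed endpoint

def respond(message, history):
    # Forward the latest user message to the endpoint and return the reply text.
    resp = client.chat.completions.create(
        model="local-model",  # placeholder
        messages=[{"role": "user", "content": message}],
    )
    return resp.choices[0].message.content

gr.ChatInterface(fn=respond, title="RAG chat (sketch)").launch()
```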

NIM Anywhere

NIM Anywhere is another significant project that includes a preconfigured RAG chatbot and Docker automation for running services like NIM, Milvus, and Redis. It also offers a customizable frontend for extending projects and building new use cases.
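Once the project's Docker automation has the supporting services up, they can be reached from application code in the usual way. Below is a minimal connectivity sketch, assuming Milvus and Redis are listening on their default ports; adjust hosts and ports to your setup.

```python
# Connectivity check for the supporting services; assumes default ports.
# Requires: pip install pymilvus redis
import redis
from pymilvus import MilvusClient

milvus = MilvusClient(uri="http://localhost:19530")   # Milvus default port
print("Milvus collections:", milvus.list_collections())

cache = redis.Redis(host="localhost", port=6379)      # Redis default port
print("Redis reachable:", cache.ping())
```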

Fine-tuning Projects

The release introduces several fine-tuning workflows and tools for new models, including:

  • Mixtral 8x7B: Demonstrates fine-tuning a mixture of experts (MoE) model.
  • Llama 3 8B: Showcases supervised full fine-tuning and Direct Preference Optimization (DPO).
  • Phi-3 Mini: A highly accessible fine-tuning example thanks to its small model size and quantization support (see the sketch after this list).
  • RTX AI Toolkit: Provides an end-to-end workflow for Windows application developers, supporting NVIDIA GPUs from RTX PCs to the cloud.
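The exact training code lives in the respective example projects. As a generic illustration of the Phi-3 Mini point (small model plus quantization), the sketch below loads the model in 4-bit and attaches a LoRA adapter with Hugging Face transformers and peft; it is not NVIDIA's workflow, and the checkpoint and module names are assumptions to verify.

```python
# Illustrative parameter-efficient fine-tuning setup: 4-bit base model + LoRA adapter.
# Requires: pip install transformers peft bitsandbytes accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed checkpoint name
quant_cfg = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=quant_cfg, device_map="auto"
)

lora_cfg = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["qkv_proj", "o_proj"],  # attention projections; names vary by model
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only the small LoRA matrices are trainable
```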

Other New Features

Based on user feedback, the latest AI Workbench release includes:

  • SSH Agent: Adds support for password-protected SSH keys for enterprise users.
  • Ubuntu 24.04: Expands support to include the latest Ubuntu distribution.
  • Logging: Introduces a support command in the AI Workbench CLI to export metadata and logs into a zip file for easier troubleshooting.

Future Developments

Looking ahead, NVIDIA plans to introduce app sharing and streamline multi-container support in future releases of AI Workbench. These features aim to further enhance collaboration and simplify complex workflows for developers.

For more details about the AI Workbench and to get started, visit the official NVIDIA Technical Blog.

Image source: Shutterstock