As of Tuesday, Nvidia has something new for RTX GPU owners: Chat with RTX, an application that lets users run a personal AI chatbot on their own computer. You can feed the chatbot a wide range of content, from YouTube videos to documents, and it will summarize your data and answer questions about it. Most importantly, all of this processing happens locally on your PC; the only requirement is an RTX 30- or 40-series GPU with at least 8GB of VRAM.
A new era in data research with Nvidia Chat with RTX
Chat with RTX handles YouTube videos effectively: enter a URL and you can search for specific mentions or summarize the entire video. It is also very good at searching your local documents.
One big downside to Chat with RTX is that it really feels like an early developer demo. Chat with RTX installs a web server and a Python instance on your PC, then uses Mistral or Llama 2 models to query your data, with Nvidia's Tensor cores accelerating the queries.
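The setup described above is essentially retrieval-augmented generation: index your local files, pull out the passages most relevant to a question, and hand them to a locally hosted model as context. A minimal sketch of the retrieval step (the scoring method, function names, and sample documents here are purely illustrative, not Nvidia's actual implementation):

```python
# Illustrative retrieval-augmented-generation sketch (NOT Nvidia's
# actual implementation). Scores local documents by keyword overlap
# with the question, then builds a prompt for a locally hosted model
# such as Mistral or Llama 2.

def score(question: str, doc: str) -> int:
    """Count how many words from the question appear in the document."""
    words = set(question.lower().split())
    return sum(1 for w in words if w in doc.lower())

def build_prompt(question: str, docs: dict, top_k: int = 1) -> str:
    """Pick the top_k most relevant documents and embed them as context."""
    ranked = sorted(docs, key=lambda name: score(question, docs[name]), reverse=True)
    context = "\n".join(f"[{name}] {docs[name]}" for name in ranked[:top_k])
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Hypothetical local files standing in for a user's documents.
docs = {
    "gpu_notes.txt": "Chat with RTX requires an RTX 30 or 40 series GPU with 8GB VRAM.",
    "recipes.txt": "Combine flour, sugar, and eggs, then bake at 180 degrees.",
}
prompt = build_prompt("What GPU does Chat with RTX require?", docs)
# In the real app, a prompt like this would go to the model served by
# the bundled web server, with inference accelerated on Tensor cores.
print(prompt)
```

Real implementations use embedding-based similarity rather than keyword overlap, but the flow is the same: the relevant local file ends up in the prompt, and unrelated files stay out, which is what keeps the whole exchange on your own machine.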
Nvidia doesn’t present this as a polished app that all RTX owners should download and install immediately. There are a few known issues and limitations, such as source attribution not always being accurate. But Nvidia has delivered a good tech demo of what an AI chatbot could do in the future: running natively on your PC and analyzing your personal files, with no need for a subscription like Copilot Pro or ChatGPT Plus.