Docker, Inc. and partners Neo4j, LangChain, and Ollama recently launched a new GenAI Stack that eliminates the need to search for, combine, and configure technologies from different sources, letting developers produce working generative AI applications in minutes.
The new GenAI Stack is designed to give developers a running start with pre-configured, ready-to-code, and secure large language models (LLMs) from Ollama, a vector and graph database from Neo4j, and the LangChain framework. Docker also announced its first AI-powered product, named Docker AI.
“Developers are excited by the possibilities of GenAI, but the rate of change, number of vendors, and wide variation in technology stacks makes it challenging to know where and how to start,” said Docker CEO Scott Johnston. “Today’s announcement eliminates this dilemma by enabling developers to get started quickly and safely using the Docker tools, content, and services they already know and love, together with partner technologies on the cutting edge of GenAI app development.”
Demonstrated at DockerCon, the GenAI Stack addresses popular developer GenAI use cases and is among a range of new AI/ML capabilities, content, and partnerships announced by Docker, aimed at promoting the quick and safe uptake of AI/ML in applications.
Its components include pre-configured open source LLMs such as Llama 2, Code Llama, and Mistral, alongside support for private models such as OpenAI’s GPT-3.5 and GPT-4. Neo4j serves as the default database for graph and native vector search, uncovering explicit and implicit patterns and relationships in data; this makes AI/ML models faster and more accurate while acting as long-term memory for them. Neo4j knowledge graphs can serve as the basis for more accurate GenAI predictions and outcomes, while LangChain orchestrates the LLM, application, and database, providing a framework for developing context-aware reasoning applications driven by LLMs. A range of supporting tools, templates, and GenAI best practices has also been baked into the solution.
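The division of labor described above, a graph for explicit relationships and a vector index for semantic similarity, can be illustrated with a self-contained toy. This is only a sketch of the two ideas; the data, the hand-made "embeddings", and the helper names are invented for illustration, and a real deployment would use Neo4j and an actual embedding model instead:

```python
from math import sqrt

# Toy knowledge graph: explicit relationships stored as adjacency lists.
graph = {
    "Alice": ["ACME"],          # Alice works at ACME
    "Bob": ["ACME"],            # Bob works at ACME
    "ACME": ["Alice", "Bob"],
}

def implicit_neighbors(node):
    """Surface an implicit relationship: nodes two hops away (e.g. coworkers)."""
    return {n for mid in graph.get(node, []) for n in graph.get(mid, []) if n != node}

# Toy vector index: each document mapped to a hand-made 2-d "embedding".
index = {
    "Alice fixed the login bug": (0.9, 0.1),
    "Bob wrote the billing service": (0.2, 0.8),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def search(query_vec):
    """Return documents ranked by cosine similarity to the query vector."""
    return sorted(index, key=lambda doc: cosine(index[doc], query_vec), reverse=True)

print(implicit_neighbors("Alice"))   # coworkers of Alice, reached via ACME
print(search((1.0, 0.0))[0])         # document closest to the query vector
```

The graph query answers "who is related to whom and how", while the vector search answers "what is semantically closest to this query"; combining the two is what lets the stack act as long-term, relationship-aware memory for an LLM.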
According to Emil Eifrem, co-founder and CEO of Neo4j: “The driving force uniting our collective efforts was the shared mission to empower developers, making it very easy for them to build GenAI-backed applications and add GenAI features to existing applications. We’re immensely excited about the possibilities this unlocks for millions of developers.”
Harrison Chase, co-founder and CEO of LangChain, added: “We’re all here to help teams close the gap between the magical user experience GenAI enables and the work it requires to actually get there. This is a fantastic step in that direction,” while Ollama creator Jeffrey Morgan said: “Docker changed how applications are developed and deployed. I’m excited to work with Docker’s community of developers to build the next generation of applications focused on AI.”
The easy out-of-the-box setup gives developers the benefits of effortless data loading and vector index population, enabling them to seamlessly import data, create dynamic indices, embed questions and answers, and store them within the vector index. Tasks such as enhanced querying and result enrichment, which can both summarize data and showcase knowledge retrieval, become a breeze. With the GenAI Stack, developers can also generate responses across various formats and compare the collected results.
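That load-embed-store-query loop can be sketched in a few lines. This is a library-free approximation of the idea only: `embed` here is a stand-in bag-of-words function invented for illustration, where the real stack would call an embedding model and store vectors in Neo4j’s vector index:

```python
from collections import Counter
from math import sqrt

def embed(text):
    """Stand-in embedding: bag-of-words counts (a real stack calls an LLM embedder)."""
    return Counter(text.lower().replace("?", "").split())

def cosine(a, b):
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# 1. Data loading and index population: embed each Q&A pair and store it.
vector_index = []
for question, answer in [
    ("How do I reset my password?", "Use the account settings page."),
    ("How do I delete my account?", "Contact support to delete it."),
]:
    vector_index.append((embed(question), question, answer))

# 2. Querying: embed the user's question and retrieve the closest stored answer.
def retrieve(query):
    q = embed(query)
    vec, question, answer = max(vector_index, key=lambda item: cosine(item[0], q))
    return answer

print(retrieve("reset password"))
```

In the actual stack the retrieved answer would then be handed to the LLM as context, which is where the response generation and result enrichment described above come in.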