Introduction

To get you started, we have created a simple Chatbot interface. When you first get the project up and running, it will simply repeat your message.

But as you progress through the course, you will build the functionality to turn the Chatbot into an intelligent movie recommendation assistant. It will use the data from a Neo4j database to improve the responses generated by an LLM.

First, let’s take a look at the technologies we have chosen.

An introduction to Next.js

We have chosen to implement the chatbot using Next.js. You may be familiar with Next.js as a leading framework that extends the latest React features to enable developers to build full-stack applications.

The repository was created with create-next-app, and we have added TailwindCSS for presentation.

The front-end UI code creates a chatbot interface that uses a React hook to send an HTTP POST request to a route handler.

The route handler then calls the call() function in the src/modules/agent/index.ts file. This function accepts the user input plus a sessionId assigned by the framework to identify the user, and it is responsible for generating a response to the user.
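To make the request flow concrete, here is a minimal sketch of what such a route handler could look like. This is an illustration, not the repository's actual code: the file path and request body shape are assumptions, and the echo-style call() stands in for the real agent module.

```typescript
// Hypothetical sketch of a Next.js App Router route handler
// (e.g. src/app/api/chat/route.ts -- the real path may differ).
// Route handlers export functions named after HTTP verbs and work
// with the standard web Request/Response objects.

// Stand-in for the agent's call() function described above
async function call(input: string, sessionId: string): Promise<string> {
  return input; // echoes the input, like the placeholder agent
}

export async function POST(request: Request): Promise<Response> {
  // Assumed request body shape: { message, sessionId }
  const { message, sessionId } = await request.json();
  const output = await call(message, sessionId);

  return new Response(JSON.stringify({ message: output }), {
    headers: { "Content-Type": "application/json" },
  });
}
```

The React hook on the front end would POST the user's message to this endpoint and render the JSON response it receives.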

The call() function mimics a call to an LLM by using a setTimeout()-based sleep() helper to wait two seconds before returning the input back to the user.

```typescript
// Placeholder Agent
// sleep() is a helper that wraps setTimeout() in a Promise,
// resolving after the given number of milliseconds.
const sleep = (ms: number) =>
  new Promise((resolve) => setTimeout(resolve, ms));

export async function call(input: string, sessionId: string): Promise<string> {
  // TODO: Replace this code with an agent
  await sleep(2000);
  return input;
}
```

LLM Integration with LangChain

As you progress through the course, you will replace the call() function above with an agent capable of deciphering which tool to use and generating a response with the help of an LLM.

If you have completed the Neo4j & LLM Fundamentals course, you will be familiar with LangChain and the concept of agents. LangChain is an open-source framework designed to accelerate the development of LLM applications. We have chosen LangChain because it provides a flexible base for testing LLMs and out-of-the-box chains for performing complex tasks.
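To illustrate what "deciphering which tool to use" means, here is a deliberately simplified, LLM-free sketch of tool routing. The tool names and matching rules are invented for illustration; a real LangChain agent delegates the selection decision to an LLM rather than to regular expressions.

```typescript
// Toy illustration of agent-style tool routing (not LangChain code).
interface Tool {
  name: string;
  matches: (input: string) => boolean;
  run: (input: string) => string;
}

const tools: Tool[] = [
  {
    // Hypothetical tool that would query the Neo4j movie database
    name: "movie-search",
    matches: (input) => /movie|recommend/i.test(input),
    run: (input) => `movie-search: ${input}`,
  },
  {
    // Fallback for anything the specialised tools don't handle
    name: "general-chat",
    matches: () => true,
    run: (input) => `general-chat: ${input}`,
  },
];

export function route(input: string): string {
  // Pick the first tool whose rule matches; the fallback always matches
  const tool = tools.find((t) => t.matches(input))!;
  return tool.run(input);
}
```

An LLM-backed agent replaces the matches() heuristics with a model that reads each tool's description and chooses based on the conversation so far.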

Although we have chosen a specific framework, the course focuses on the LLM integration details, which should be transferable to your framework of choice.

Although the Neo4j & LLM Fundamentals course covers the Python library, and the code samples may therefore differ, the overall concepts used in the TypeScript library are identical.

LLMs from OpenAI

We have included instructions to integrate the Chatbot with OpenAI’s Large Language Models.

OpenAI has gained prominence through its Generative Pretrained Transformer (GPT) series. GPT models are trained on vast datasets to generate text that resembles human language. The release of GPT-3 and GPT-4 showcased language understanding and generation improvements, increasing their application in various industries. The practical utility of these models in tasks like writing assistance, programming, and language translation has led to widespread adoption and attention.

You are by no means restricted to OpenAI, however. The hands-on challenges in this course are LLM-agnostic, and you are free to use one of the 60+ supported LLMs.

Open-Source Alternatives

If you are looking for an open-source alternative, we recommend you take a look at the GenAI Stack. The GenAI Stack consists of LangChain applications connecting to LLMs served by Ollama, run within Docker containers and backed by a Neo4j database.

Complete the course locally or online

In the next lesson, you will set up your project. You can clone or download the repository and complete the exercises locally or use the Open in Gitpod buttons to attempt the challenges using an Online IDE.

Summary

In this lesson, we introduced you to the objectives of the course.

In the next lesson, you will set the environment variables required to get the project running.