
LangchainJS Workshop

Updated: 5 Dec 2023

Repository for the workshop "Unlocking the power of AI: Private conversations with your docs using Langchain JS" by @pattyneta

Overview

The LangchainJS Workshop is a demo code repository that allows users to run a NodeJS application that connects with a locally running Ollama server. The application utilizes LangchainJS for communication with the LLM and SvelteKit for the API and frontend.
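Under the hood, Ollama exposes a plain HTTP API on port 11434, which LangchainJS wraps for you. As a sketch of what that communication looks like, the following builds a request for Ollama's /api/generate endpoint (the model name "llama2" is just an assumption — use whichever model you pulled):

```javascript
// Build a request for Ollama's /api/generate endpoint.
// A locally running Ollama server listens on http://localhost:11434 by default.
function buildOllamaRequest(model, prompt, baseUrl = "http://localhost:11434") {
  return {
    url: `${baseUrl}/api/generate`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false asks Ollama for a single JSON response
      // instead of a stream of partial tokens.
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

// With a local Ollama server running, this would send a prompt to the model:
// const { url, options } = buildOllamaRequest("llama2", "Hello!");
// const res = await fetch(url, options);
// const { response } = await res.json();
```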

Features

  • NodeJS Application: Users can run a NodeJS application that interacts with the Ollama server.
  • LangchainJS Integration: The application is built using LangchainJS for seamless communication with the LLM.
  • SvelteKit API & Frontend: SvelteKit is used for the API and frontend of the application, providing a smooth user experience.
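On the web side, SvelteKit serves both the API and the frontend. A minimal sketch of what an API route in such an app might look like — the route path (src/routes/api/chat/+server.js) and request shape are assumptions, and the LLM call is stubbed out:

```javascript
// Sketch of a SvelteKit API endpoint, e.g. src/routes/api/chat/+server.js.
// In a real SvelteKit route this would be written `export async function POST`.
async function POST({ request }) {
  const { question } = await request.json();

  // In the workshop app, this is where LangchainJS would forward the
  // question to the Ollama-served LLM; here the answer is stubbed.
  const answer = `Received question: ${question}`;

  return new Response(JSON.stringify({ answer }), {
    headers: { "Content-Type": "application/json" },
  });
}
```

The frontend can then call this endpoint with fetch("/api/chat", { method: "POST", ... }) and render the returned answer.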

Installation

To install and run the LangchainJS Workshop application, follow the steps below:

  1. Download Ollama from the official website and install it on your computer.

  2. Open the Ollama application, which will show you the command for running a model locally. Copy that command.

  3. Paste the command into your terminal to download the model and start it running locally.

  4. Clone the LangchainJS Workshop repository.

  5. Run the npm install command in your terminal to install the necessary dependencies.

  6. Finally, run the npm run dev command in your terminal to start the application. The app will be accessible at http://localhost:5173.
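The steps above can be recapped as a terminal session. The repository URL is a placeholder, and llama2 stands in for whichever model the Ollama app tells you to run:

```shell
# Download and run a model with Ollama
# (use the exact command shown by the Ollama app)
ollama run llama2

# Clone the workshop repository (substitute the actual repo URL)
git clone <repo-url>
cd <repo-directory>

# Install dependencies and start the dev server,
# then open http://localhost:5173 in your browser
npm install
npm run dev
```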

Summary

The LangchainJS Workshop is a demo code repository for running a NodeJS application against a locally running Ollama server, using LangchainJS to communicate with the LLM and SvelteKit for the API and frontend. By following the installation steps above, users can quickly set up and run the application on their local machines.