
Lab: Building an AI Product Reviewer

Build a complete feedback analysis pipeline for a fictional AI product called Jarvis. You'll write Node.js code that runs local LLMs and embedding models via Docker Model Runner — no API keys, no cloud subscriptions, no data leaving your machine.

Launch the lab

  1. Start the labspace:

    $ docker compose -f oci://dockersamples/labspace-creating-ai-product-reviewer up -d
    
  2. Open your browser to http://localhost:3030.

What you'll learn

By the end of this Labspace, you will be able to:

  • Run LLMs locally via Docker Model Runner's OpenAI-compatible API
  • Connect a Node.js app to Docker Model Runner using the OpenAI SDK and the Compose models: integration
  • Perform sentiment analysis using low-temperature LLM classification
  • Use embeddings and cosine similarity to cluster semantically related feedback
  • Extract structured data from an LLM using response_format: { type: 'json_object' }
  • Generate context-aware responses to reviews informed by extracted product features
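The Compose models: integration mentioned above can look roughly like the sketch below. This is an illustrative fragment, not the lab's actual Compose file: the service name, model name, and environment variable names are assumptions, and the exact syntax depends on your Compose version, so check the Docker documentation and the lab's own compose.yaml.

```yaml
services:
  app:
    build: .
    models:
      # References the top-level model below; Compose injects connection
      # details into the container via the named environment variables.
      llm:
        endpoint_var: OPENAI_BASE_URL   # illustrative variable name
        model_var: OPENAI_MODEL         # illustrative variable name

# Top-level models element: tells Docker Model Runner which model to pull and serve.
models:
  llm:
    model: ai/smollm2   # illustrative model name
```

With this wiring, the Node.js app can read the injected variables and pass them to the OpenAI SDK as its base URL and model name.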
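As a taste of the clustering step, cosine similarity between two embedding vectors is just a few lines of plain JavaScript. The vectors below are made-up three-dimensional toys (real embedding models return hundreds of dimensions), so only the shape of the computation carries over to the lab:

```javascript
// Cosine similarity: dot(a, b) / (|a| * |b|).
// Returns a value near 1 for vectors pointing the same way (semantically
// similar reviews) and near 0 for unrelated ones.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy "embeddings": two similar praise reviews and one complaint.
const praiseA = [0.9, 0.1, 0.0];
const praiseB = [0.8, 0.2, 0.1];
const complaint = [0.0, 0.2, 0.9];

console.log(cosineSimilarity(praiseA, praiseB));   // close to 1: same cluster
console.log(cosineSimilarity(praiseA, complaint)); // close to 0: different cluster
```

Clustering then reduces to grouping reviews whose pairwise similarity exceeds a threshold, which is the approach Module 5 builds on.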

Modules

| # | Module | Description |
|---|--------|-------------|
| 1 | Introduction | Overview of the pipeline and Docker Model Runner setup |
| 2 | Project Setup & Docker Model Runner | Explore the starter project and wire up the Compose model integration |
| 3 | Generating Synthetic Feedback | Use the LLM to generate realistic product reviews as test data |
| 4 | Sentiment Analysis | Classify reviews as positive, negative, or neutral with low-temperature generation |
| 5 | Embeddings & Semantic Clustering | Group related reviews using vector embeddings and cosine similarity |
| 6 | Features & Responses | Extract actionable features and generate context-aware review responses |
| 7 | Wrap-up | Summary of techniques and ideas for extending the pipeline |