From df3940b25856c83284c3f1ac3fa78c9794b09daa Mon Sep 17 00:00:00 2001
From: Chris Proctor
Date: Wed, 8 Apr 2026 10:49:23 -0400
Subject: [PATCH] Update questions

---
 questions.md | 13 +++++++++----
 1 file changed, 9 insertions(+), 4 deletions(-)

diff --git a/questions.md b/questions.md
index 584171d..9b3363e 100644
--- a/questions.md
+++ b/questions.md
@@ -2,19 +2,24 @@
 
 Answer each of the following questions.
 
+## Model quality
+
+1. How well did the model you used work in this lab? Describe one problem you encountered
+   with the model's responses, and explain how you changed the system prompt to address it.
+
 ## Memory and models
 
-1. How much memory is available on your system?
+2. How much memory is available on your system?
 
-2. List two models from Hugging Face or Ollama's model library that could run on your system. For each one, explain
+3. List two models from Hugging Face or Ollama's model library that could run on your system. For each one, explain
    what it is good at.
 
-3. Choose an interesting-looking model that *cannot* run on your system. What is this model
+4. Choose an interesting-looking model that *cannot* run on your system. What is this model
    good at doing? Imagine an AI-powered app you could create with this model and describe it.
 
 ## Local vs. remote models
 
-4. Ollama can easily be configured to use a remotely-hosted model; you just provide the URL
+5. Ollama can easily be configured to use a remotely-hosted model; you just provide the URL
    where the model is hosted. List some of the advantages of locally-hosted models, and some
    of the advantages of remotely-hosted models. What kinds of apps would be most suitable for
    locally-hosted models? What kinds of apps would be most suitable for remotely-hosted models?
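
The memory and remote-hosting questions in the patch above have concrete starting points. A minimal shell sketch, assuming a Linux system (it falls back to a placeholder value where `/proc/meminfo` is absent) and using the rough rule of thumb that a 4-bit-quantized model needs about half a gigabyte of RAM per billion parameters; the remote hostname shown is a placeholder, not a real endpoint:

```shell
#!/bin/sh
# Read total RAM from /proc/meminfo (Linux); assume 16 GiB elsewhere.
if [ -r /proc/meminfo ]; then
    mem_kib=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
else
    mem_kib=$((16 * 1024 * 1024))   # placeholder when /proc is absent
fi
mem_gib=$(( mem_kib / 1024 / 1024 ))

# Rule of thumb: a 4-bit-quantized model uses ~0.5 GiB per billion
# parameters (0.5 bytes each), so roughly 2B parameters fit per GiB,
# before accounting for context and OS overhead.
max_params_b=$(( mem_gib * 2 ))
echo "Total memory: ${mem_gib} GiB"
echo "Rough ceiling: a ~${max_params_b}B-parameter model at 4-bit quantization"

# For the local-vs-remote question: the Ollama CLI talks to whichever
# server OLLAMA_HOST names (default is localhost:11434), so using a
# remotely-hosted model is one environment variable. Placeholder host:
#   OLLAMA_HOST=http://models.example.com:11434 ollama run llama3.2
```

In practice the ceiling should be treated conservatively: the KV cache for the context window and everything else running on the machine also need RAM, so leaving a few GiB of headroom is sensible.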