CHAT WITH MISTRAL 7B! 🚀🤖


A CHAT APP BUILT WITH MISTRAL AND CHAINLIT

REQUIREMENTS.TXT

text

chainlit
ctransformers
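
The GGUF weights are fetched from the Hugging Face Hub the first time the model is loaded, which can take a while. If you want to warm the cache beforehand, here is a minimal sketch; it assumes huggingface_hub is available (ctransformers pulls it in as a dependency) and that ctransformers reads from the same Hugging Face cache, so treat it as an optional convenience:

python

from huggingface_hub import hf_hub_download

# Pre-download the quantized model file so the first request does not
# block on a multi-gigabyte download. The repo id and file name match
# the ones used in app.py below.
path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    filename="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
)
print("Model cached at:", path)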


APP.PY

python

import os
import chainlit as cl

from ctransformers import AutoModelForCausalLM

# Load the Mistral 7B Instruct model (quantized GGUF build, CPU-only)
llm = AutoModelForCausalLM.from_pretrained("TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
                                            model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
                                            model_type="mistral",
                                            temperature=0.7,
                                            gpu_layers=0,
                                            stream=True,
                                            threads=int(os.cpu_count() / 2),
                                            max_new_tokens=10000)

# Rolling list that stores the last 3 user messages (used as context)
recent_messages = []

@cl.on_chat_start
def start():
    # Store the shared model handle in the user session
    cl.user_session.set("llm", llm)


@cl.on_message
async def main(message: cl.Message):
    global recent_messages
    llm = cl.user_session.get("llm")

    msg = cl.Message(
        content="",
    )

    # Append the newest user message to the history
    recent_messages.append(message.content)

    # Keep only the last 3 messages
    recent_messages = recent_messages[-3:]

    # Build the prompt context from the last 3 messages
    context = " ".join(recent_messages)

    # Generate a response based on the context. The system prompt (in
    # Brazilian Portuguese) tells the model to act as an internet-search
    # expert and to answer in at most 100 words of pt-BR.
    prompt = f"[INST]<<SYS>>Você é especialista em busca na internet. Responda de forma reduzida com até 100 palavras em português brasileiro.<</SYS>>{context}[/INST]"
    for text in llm(prompt=prompt):
        await msg.stream_token(text)

    await msg.send()
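
Model loading and generation do not depend on the Chainlit UI, so the same ctransformers call can be smoke-tested from a plain Python shell before wiring up the chat. A minimal sketch (the prompt here is just an example):

python

from ctransformers import AutoModelForCausalLM

# Same model configuration as app.py, used without Chainlit to verify
# that the GGUF file loads and that tokens stream on this machine.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
    model_type="mistral",
    gpu_layers=0,
)

prompt = "[INST]In one sentence, what is Mistral 7B?[/INST]"
for token in llm(prompt, stream=True, max_new_tokens=64):
    print(token, end="", flush=True)
print()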


DOCKERFILE

dockerfile

FROM python:3.11

# Copy the project files and install the runtime dependencies
COPY . .
RUN pip install ctransformers chainlit

# Serve the Chainlit app headless on all interfaces, port 8000
ENTRYPOINT ["chainlit", "run", "app.py", "--host=0.0.0.0", "--port=8000", "--headless"]


DOCKER-COMPOSE.YML

yaml

version: '3'
services:
  chat-with-mistral:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8000:8000"


Build and start the stack:

sh

docker-compose up --build --force-recreate --remove-orphans
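
The first start takes a while because the model weights are downloaded inside the container. Once the service is up, you can run a quick reachability check against the published port; a minimal sketch assuming the "8000:8000" mapping above:

python

import urllib.request

# Confirms that the Chainlit server behind docker-compose is answering
# on the host port published in docker-compose.yml.
with urllib.request.urlopen("http://localhost:8000", timeout=10) as resp:
    print("Chainlit responded with HTTP status", resp.status)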

