Groq API error #401

@PhoenixAlpha23

Description

Unexpected Behaviour

While trying to access the llama-3.3-70b-versatile model via Groq, I encountered this error.

Minimal reproducible example

import os

from dotenv import load_dotenv
from groq import Groq
from langchain_groq import ChatGroq
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

load_dotenv()

GROQ_API_KEY = os.getenv("GROQ_API_KEY")
client = Groq(api_key=GROQ_API_KEY)  # direct client; not used by the chain below

MODEL_NAME = "llama-3.3-70b-versatile"

# prompt template
prompt_template = PromptTemplate(
    input_variables=["question"],
    template="""
You are a helpful assistant. Answer the following question clearly and concisely:

Question: {question}

Answer:"""
)

# Initialize the LLaMA model via Groq
llm = ChatGroq(
    api_key=GROQ_API_KEY,
    model=MODEL_NAME,
    temperature=0.2,
    max_tokens=2048
)

# Wrap it into an LLMChain with the prompt
llm_chain = LLMChain(
    llm=llm,
    prompt=prompt_template,
    verbose=True  
)

# Example usage
user_question = "What is the capital of Maharashtra?"
response = llm_chain.run(user_question)

print("Response:")
print(response)

Output

[Screenshot of the Groq API error traceback]
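
For triage, the same request can be sent straight through the groq SDK, bypassing LangChain, to show whether the key itself is being rejected. This is a debugging sketch, not part of the original report; the error classes are the ones documented by the groq Python SDK:

import os

import groq
from groq import Groq

client = Groq(api_key=os.getenv("GROQ_API_KEY"))

try:
    completion = client.chat.completions.create(
        model="llama-3.3-70b-versatile",
        messages=[{"role": "user", "content": "What is the capital of Maharashtra?"}],
        temperature=0.2,
        max_tokens=64,
    )
    print(completion.choices[0].message.content)
except groq.AuthenticationError as e:
    # 401: the API key is invalid, expired, or revoked
    print("Authentication failed:", e.status_code, str(e))
except groq.APIStatusError as e:
    # Any other non-2xx response from the API
    print("API error:", e.status_code, str(e))

If this direct call also returns a 401, the problem lies with the key or account rather than with the ChatGroq/LLMChain wrapper.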

Runtime Environment

  • Model: llama-3.3-70b-versatile
  • Using via huggingface?: No
  • Platform: Streamlit (see the note after this list)
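
Since the platform is Streamlit, one common cause of a 401 is worth ruling out: a local .env file is not deployed to Streamlit Community Cloud, so os.getenv("GROQ_API_KEY") can silently return None and the request goes out without a valid key. A minimal sketch of a fallback to Streamlit's secrets store, assuming the key is saved there under the same name:

import os

import streamlit as st

# .env files are not deployed to Streamlit Cloud; secrets live in st.secrets.
# A missing key sent to the Groq API produces exactly this kind of 401.
GROQ_API_KEY = os.getenv("GROQ_API_KEY") or st.secrets.get("GROQ_API_KEY")
if not GROQ_API_KEY:
    raise RuntimeError("GROQ_API_KEY not found in the environment or st.secrets")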

Additional context
The llama-3.3-70b-versatile model was being used via the Groq API (free tier). The error occurred between 12:00 and 12:30 IST on Friday, 13 June 2025. Mistakes in the outputs were also noticed, such as Chinese characters appearing in the middle of a Marathi-language answer.
A custom PromptTemplate was used during inference and was followed accurately until this error showed up. The quality of the results seems to have declined since then.
