What is an AI Context Limit (aka Context Window)?

All AI chatbots have the ability to understand the context of a conversation. However, each model can only “remember” a limited amount, and this limit is known as the “Context Length” or “Context Window.”

In other words, each AI Model can only process so much of your conversation before it starts to “forget” things.

For example, OpenAI’s GPT-3.5 model has a context length of 4,096 tokens, which is roughly equivalent to 3,000 words. This means it can only actively process the last 3,000 words of a conversation.

If a conversation extends beyond this 3,000-word context length, GPT-3.5 will start “forgetting” the earlier parts of the exchange and may produce irrelevant or confusing responses.
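The “forgetting” behavior can be pictured as a sliding window over the conversation. Here is a minimal sketch (not Magai’s actual implementation) that keeps only the most recent messages that fit within a token budget, using the rough rule of thumb that 1 word ≈ 1.33 tokens (so 4,096 tokens ≈ 3,000 words); real models count tokens with a tokenizer, not word counts.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: 1 word is about 1.33 tokens.
    return round(len(text.split()) * 4 / 3)

def fit_to_window(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages whose combined size fits the window."""
    kept, used = [], 0
    for message in reversed(messages):  # walk from newest to oldest
        cost = estimate_tokens(message)
        if used + cost > max_tokens:
            break  # everything earlier is effectively "forgotten"
        kept.append(message)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

With a small budget, only the tail of a long conversation survives, which is exactly why early details drop out of the model’s responses.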

Here is a breakdown of each of Magai’s available AI Models and their context length:

  • GPT-3.5: 3,000 words (or 4,096 tokens)
  • GPT-4: 6,000 words (or 8,192 tokens)
  • Claude Instant: 75,000 words (or 100,000 tokens)
  • Claude: 75,000 words (or 100,000 tokens)

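To make the table above concrete, here is a small sketch that checks whether a conversation of a given word count still fits a model’s window, using the token limits listed above and the same approximate 1.33 tokens-per-word heuristic (the conversion is an estimate, not an exact count):

```python
# Token limits from the list above (Claude figures are approximate).
CONTEXT_LIMITS = {
    "GPT-3.5": 4_096,
    "GPT-4": 8_192,
    "Claude Instant": 100_000,
    "Claude": 100_000,
}

def fits_in_window(word_count: int, model: str) -> bool:
    """Estimate whether a conversation fits the model's context window."""
    estimated_tokens = round(word_count * 4 / 3)
    return estimated_tokens <= CONTEXT_LIMITS[model]
```

For example, a 3,000-word conversation fits GPT-3.5’s window, but a 7,000-word one exceeds GPT-4’s.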
Most AI assistants do not notify the user when the context length is exceeded, which can lead to a poor experience.

To address this, our team was careful to craft a subtle and transparent way of indicating where a conversation’s context is cut off.

To avoid reaching the context limit and to have more in-depth conversations, we recommend using our Claude AI model, which has a context length of 75,000 words.

Updated on December 28, 2023