OpenAI has officially changed the GPT-4 usage limit from 40 messages per 3 hours to a variable quota, causing frustration among some users.
The new "flexible" limit is supposed to dynamically adjust to supply and demand, allowing users to make more or fewer GPT-4 requests depending on how busy the system is.
In practice, however, the new quota can be significantly lower than the previous 40 messages: according to reports on Reddit, some users hit the limit after only 10 to 17 requests, while others say they never reach it despite heavy usage.
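OpenAI has not disclosed how the variable quota is calculated. Purely as an illustration of what a demand-dependent cap could look like, the following sketch scales the allowance between the old 40-message ceiling and the roughly 10-message floor some users report, based on a hypothetical load metric; the formula and the load values are assumptions, not anything OpenAI has confirmed.

```python
# Illustrative sketch only -- not OpenAI's actual implementation.
# Assumption: a single "load" value between 0.0 (idle) and 1.0 (saturated)
# summarizes current demand for GPT-4 capacity.

BASE_QUOTA = 40   # the old fixed cap: 40 messages per 3 hours
MIN_QUOTA = 10    # lower end of what some users report under the new scheme

def dynamic_quota(current_load: float) -> int:
    """Return a hypothetical number of GPT-4 messages allowed per 3-hour window."""
    load = min(max(current_load, 0.0), 1.0)  # clamp to [0, 1]
    # Interpolate linearly between the full quota (idle) and the floor (saturated).
    return round(BASE_QUOTA - (BASE_QUOTA - MIN_QUOTA) * load)

print(dynamic_quota(0.0))  # 40 -> quiet period, the old limit applies
print(dynamic_quota(0.8))  # 16 -> heavy load, the quota shrinks sharply
```

Under such a scheme the cap would drift with overall demand, which matches the wide spread of limits users describe, but again, the actual mechanism remains undisclosed.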
The main point of criticism is the lack of transparency: OpenAI does not communicate how many GPT-4 messages are currently available or how many have already been used, making it difficult for users to plan their use of the AI chatbot. Especially during periods of high demand, when the limit is likely to be low, users could unexpectedly find themselves without access to GPT-4.
Some users suspect that OpenAI is reacting to ChatGPT recently becoming usable for free without an account, which has increased overall usage. For OpenAI, the variable quota means more flexibility in allocating scarce and expensive computing resources for the top-of-the-line GPT-4.
The situation has led some users to consider switching to competing services such as LibreChat or Anthropic's Claude 3, which offer clearer quotas or allow payment per message.
In other news, OpenAI is currently rolling out a series of updates to ChatGPT, most recently adding image editing functionality to ChatGPT's built-in DALL-E 3 image generator, allowing elements in an image to be changed, removed, or added via prompt.