Shopify CEO Tobi Lütke and former Tesla and OpenAI researcher Andrej Karpathy say "context engineering" is a more useful term than "prompt engineering" when working with large language models. Lütke calls it a "core skill," while Karpathy describes it as the "delicate art and science of filling the context window with just the right information for the next step."
"Too little or of the wrong form and the LLM doesn't have the right context for optimal performance. Too much or too irrelevant and the LLM costs might go up and performance might come down. Doing this well is highly non-trivial."
— Andrej Karpathy
This matters even with large context windows, as model performance drops with overly long and noisy inputs.
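To make the idea concrete, here is a minimal sketch of one context-engineering step: selecting only the most relevant pieces of information that fit a fixed token budget before building the prompt. The names (Snippet, build_context, the rough 4-characters-per-token estimate) are illustrative assumptions, not any specific library's API.

```python
# Minimal sketch: keep only the highest-relevance snippets that fit a
# token budget, so the context window stays short and on-topic.
# All names and the token heuristic here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Snippet:
    text: str
    relevance: float  # higher means more relevant to the next step


def estimate_tokens(text: str) -> int:
    """Rough token estimate (~4 characters per token for English text)."""
    return max(1, len(text) // 4)


def build_context(snippets: list[Snippet], budget_tokens: int) -> str:
    """Greedily keep the most relevant snippets that fit the budget."""
    chosen: list[str] = []
    used = 0
    for s in sorted(snippets, key=lambda s: s.relevance, reverse=True):
        cost = estimate_tokens(s.text)
        if used + cost > budget_tokens:
            continue  # skip anything that would overflow the window
        chosen.append(s.text)
        used += cost
    return "\n\n".join(chosen)


if __name__ == "__main__":
    docs = [
        Snippet("Order #123 was shipped on June 2.", relevance=0.9),
        Snippet("Our returns policy allows 30 days.", relevance=0.7),
        Snippet("Company picnic is next Friday.", relevance=0.1),
    ]
    # Only the order status and returns policy survive a tight budget;
    # the irrelevant picnic note is dropped.
    print(build_context(docs, budget_tokens=20))
```

In practice the relevance scores would come from a retrieval or ranking step, and the token count from the model's own tokenizer; the point is simply that what goes into the window is chosen deliberately rather than concatenated wholesale.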