Response.prompt_cache_retention Input should be 'in-memory' or '24h'

WARNING livekit.agents failed to generate LLM completion: Connection error.
ERROR livekit.agents pydantic_core._pydantic_core.ValidationError: 1 validation error for ResponseCreatedEvent
response.prompt_cache_retention
  Input should be 'in-memory' or '24h' [type=literal_error, input_value='in_memory', input_type=str]
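For context, this is a Pydantic literal-value mismatch: the response payload carries prompt_cache_retention set to 'in_memory' (underscore), while the SDK's schema only accepts 'in-memory' or '24h'. A minimal plain-Python sketch of that kind of check (the names here are illustrative, not LiveKit's actual code):

```python
from typing import Literal, get_args

# Allowed values, mirroring the schema in the error message
Retention = Literal["in-memory", "24h"]
ALLOWED = get_args(Retention)

def validate_retention(value: str) -> str:
    # Mimics the literal_error Pydantic raises when the API returns
    # "in_memory" (underscore) instead of the expected "in-memory"
    if value not in ALLOWED:
        raise ValueError(
            f"Input should be 'in-memory' or '24h' "
            f"[type=literal_error, input_value={value!r}, input_type=str]"
        )
    return value

validate_retention("24h")  # accepted
```

So any value outside the declared literals, even a near-miss like 'in_memory', fails validation before the event is handed to the agent.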

Current requirements.txt file:
livekit-agents[openai,cartesia,deepgram,silero,turn-detector,sarvam,google,smallestai,soniox]>=1.5.4
livekit-plugins-noise-cancellation~=0.2
python-dotenv==1.0.0
httpx>=0.27,<1
langchain-core>=0.3,<2
jinja2>=3.1,<4
pytest>=8,<9
boto3>=1.34,<2

Please help me out.

I'm still trying to understand where that error is coming from, but in the meantime you can use OpenAI through LiveKit Inference: agent-starter-python/src/agent.py at main · livekit-examples/agent-starter-python · GitHub

Engineering reports that this issue should no longer occur in the latest Agents release, 1.5.5.
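If you want to pick up the fix, bumping the version floor in your requirements.txt should be enough (same extras as in your file above, only the minimum version changes):

```
livekit-agents[openai,cartesia,deepgram,silero,turn-detector,sarvam,google,smallestai,soniox]>=1.5.5
```

Then reinstall dependencies so the new release is actually pulled in.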

PR associated with the change:

(openai responses): drop prompt_cache_retention in received responses by tinalenguyen · Pull Request #5502 · livekit/agents · GitHub