LangChain for LLM Application Development
https://learn.deeplearning.ai/langchain
Models, Prompts, and Parsers
LangChain provides convenience wrappers over OpenAI models. It lets you wrap your prompts as templates so that even complex prompts can be reused. Parsers let you define the format of the output you want to extract from the OpenAI response.
Memory
LLMs are stateless.
Each transaction is independent.
The entire conversation must be provided as context every single time.
ConversationBufferMemory stores the entire conversation; every new message is appended to the buffer.
It is possible to use ConversationBufferWindowMemory instead. This way only the last k exchanges are kept in the context memory.
There is also ConversationTokenBufferMemory, which limits the buffer by token count. Since pricing is based on tokens, this helps control spending directly.
There is also ConversationSummaryBufferMemory, which maintains a summary of the conversation over time: recent turns are kept verbatim, and older turns are folded into an LLM-written summary once a token limit is reached.
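A sketch of the two token-aware memories, assuming the classic `langchain` package and an `OPENAI_API_KEY` in the environment. Both need an `llm` handle: the token buffer uses it to count tokens, and the summary buffer additionally calls it to write the summary, so this is not runnable offline:

```python
from langchain.chat_models import ChatOpenAI
from langchain.memory import (
    ConversationTokenBufferMemory,
    ConversationSummaryBufferMemory,
)

llm = ChatOpenAI(temperature=0.0)  # assumes OPENAI_API_KEY is set

# Keep at most ~50 tokens of raw history; older turns are dropped.
token_memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=50)

# Keep recent turns verbatim; once over the limit, older turns are
# folded into an LLM-generated summary (this makes an API call).
summary_memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=100)
summary_memory.save_context({"input": "Hi, I'm planning a trip."},
                            {"output": "Great! Where would you like to go?"})
print(summary_memory.load_memory_variables({}))
```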
Chains
Satyajeet Jadhav