LangChain was developed in 2022 to provide a framework that simplifies working with LLMs. As the APIs of the various LLM providers have converged, the need for such a framework has decreased. The people behind LangChain have since released [[LangGraph]], which can be especially useful for agentic systems.
LangChain is useful for [[Retrieval Augmented Generation|RAG]] applications.
LangChain provides abstractions for LLMs, Retrievers, and Memory. With these three abstractions, creating a RAG pipeline is straightforward.
```python
from langchain.chains import ConversationalRetrievalChain
from langchain.memory import ConversationBufferMemory
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(temperature=0.7)
memory = ConversationBufferMemory(memory_key='chat_history', return_messages=True)
retriever = vectorstore.as_retriever()  # vectorstore is assumed to exist already
conversation_chain = ConversationalRetrievalChain.from_llm(
    llm=llm, retriever=retriever, memory=memory
)
```
## LangChain Expression Language
LangChain Expression Language (LCEL) is a declarative way of defining workflows by composing components (Runnables) with the `|` pipe operator in Python, rather than wiring up chain classes by hand.
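A minimal sketch of an LCEL chain, assuming an OpenAI chat model; the prompt text and variable names are illustrative:
```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

# Compose a chain with the pipe operator: prompt -> model -> output parser
prompt = ChatPromptTemplate.from_template("Summarize the following text:\n\n{text}")
llm = ChatOpenAI(temperature=0)
chain = prompt | llm | StrOutputParser()

summary = chain.invoke({"text": "LangChain is a framework for working with LLMs."})
```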