LLM-Related Python Packages You Should Know
Several packages that could improve your LLM workflow
🔥Reading Time: 3 Minutes🔥
🔥Benefit Time: A lot of Time🔥
Large Language Models, or LLMs, have taken the world by storm. Everyone knows about them, and many people are already using them in everyday life.
Casual users often just call LLMs “AI,” as if the two were the same thing, even though they are not. In technical terms, an LLM is a subset of AI focused on natural language processing.
LLMs excel at natural language understanding and generation. They can power many applications, such as virtual assistants, routine task automation, and coding assistance.
As LLMs prove useful in so many applications, there are many efforts to improve how we work with them. That’s why many Python packages have been developed specifically for LLMs.
In this newsletter, we will explore several of these packages. So, let’s jump into it.
1. LiteLLM
LLM applications became famous to the public thanks to OpenAI’s development of ChatGPT.
As one of the best-known LLM providers, OpenAI has an API used by many data scientists and developers. But what if you want to use another LLM while keeping the OpenAI API format?
This is where LiteLLM comes in. It lets you standardize your LLM workflow by calling every LLM API through the OpenAI format.
LLMs from other providers, such as Bedrock, Hugging Face, VertexAI, TogetherAI, and Azure, can all be called through one consistent interface with consistent output.
LiteLLM also adds functionality for monitoring and logging your deployments and for setting budgets to limit spending per project. It’s a versatile package that you should try out; a minimal call looks like the sketch below.
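Here is a minimal sketch of what that looks like, assuming the litellm package is installed and an OpenAI key is set; the model names are only examples, so swap in whichever providers and credentials you actually use.

```python
import os
from litellm import completion

# LiteLLM reads provider credentials from environment variables
os.environ["OPENAI_API_KEY"] = "your-openai-key"

messages = [{"role": "user", "content": "Explain LiteLLM in one sentence."}]

# An OpenAI model, called through the OpenAI-style completion() interface
response = completion(model="gpt-3.5-turbo", messages=messages)
print(response.choices[0].message.content)

# The same call shape works for other providers, e.g. an Anthropic model
# (needs ANTHROPIC_API_KEY set in the environment):
# response = completion(model="claude-3-haiku-20240307", messages=messages)
```

Because every response comes back in the OpenAI format, you can switch providers without rewriting the downstream code.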
2. LLMFlex
LLMFlex is a versatile Python package that offers prompt engineering through a simple interface. It’s designed so developers can work with many kinds of LLMs as well.
The package favors free, local resources over closed-source, paid APIs. This makes it a good fit for developing local and private LLM-powered applications.
The package mainly provides classes and functions for accessing LLMs, embedding models, and vector databases with your own prompts and RAG techniques. They are all easy to use and versatile enough for many situations.
Try it yourself to see how powerful LLMFlex is.
3. GPTCache
If your LLM-powered application becomes increasingly popular, its traffic will increase significantly.
Higher engagement means the expenses related to LLM API calls can become substantial, especially if you use paid APIs such as OpenAI’s. Moreover, higher call volume means slower response times.
That’s where GPTCache can help. The Python package lets you build a cache that stores LLM responses.
Caches are commonly used online to store frequently accessed data and reduce retrieval time. While a traditional cache uses exact matching to decide whether a requested query is already stored, GPTCache can use semantic caching strategies with embeddings to retrieve similar queries.
Use GPTCache if you know your application will require better traffic management.
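Below is a rough sketch based on the project’s quickstart, assuming the older openai client that GPTCache’s adapter wraps; the default cache.init() uses exact matching, and semantic caching requires configuring an embedding model and similarity evaluation in that init call.

```python
from gptcache import cache
from gptcache.adapter import openai  # drop-in wrapper around the openai client

# Initialize the cache (exact-match by default) and pass the OpenAI key through
cache.init()
cache.set_openai_key()  # reads OPENAI_API_KEY from the environment

question = "What is semantic caching?"

# The first call hits the OpenAI API; a repeated question is served from the cache
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": question}],
)
print(response["choices"][0]["message"]["content"])
```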
4. Instructor
LLM results can vary if we don’t control the output format, especially when the query is generic and there is no standard for the result.
Instructor is a Python package that lets you standardize the output and return it as structured JSON.
The package makes it easy to get JSON data from various LLMs such as GPT, Ollama, Mistral, and many others.
Additionally, the package is available in many other languages, such as Ruby, Go, TypeScript, and Elixir.
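Here is a minimal sketch of the idea with the OpenAI client, assuming a recent Instructor release; older versions expose instructor.patch instead of instructor.from_openai, so check which one your installed version provides.

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel


# The Pydantic model defines the JSON structure we want back
class UserInfo(BaseModel):
    name: str
    age: int


# Wrap the regular OpenAI client so it accepts a response_model argument
client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-3.5-turbo",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)

print(user.name, user.age)  # validated fields instead of free-form text
```

Instead of parsing raw text yourself, you get a validated Pydantic object, and Instructor can retry the call when the model’s output doesn’t fit the schema.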
5. Scikit-LLM
Leveraging the LLM, it’s now possible for users to extend the LLM power to enhance text analysis.
Scikit-LLM is a Python package that is designed to use LLM like GPT to perform text analysis such as text classification, translation, summarization, and many others via API similar to Scikit-Learn.
With how much LLM development has progressed, the Scikit-LLM would certainly take advantage of it to provide reliable and insightful output.
Try the package yourself if you want to improve your NLP tasks; a small example follows below.
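Here is a small sketch of zero-shot text classification with Scikit-LLM, assuming an OpenAI key; the import path and the model argument name have shifted between releases, so check the docs for the version you install.

```python
from skllm.config import SKLLMConfig
from skllm.models.gpt.classification.zero_shot import ZeroShotGPTClassifier

# Scikit-LLM forwards the classification calls to the OpenAI API
SKLLMConfig.set_openai_key("your-openai-key")

X = [
    "The movie was fantastic, I loved every minute.",
    "The delivery was late and the package arrived damaged.",
]
y = ["positive", "negative"]  # labels double as the candidate classes

clf = ZeroShotGPTClassifier(model="gpt-3.5-turbo")
clf.fit(X, y)  # zero-shot: fit mainly records the candidate labels
print(clf.predict(["I would absolutely buy this again."]))
```

The familiar fit/predict interface means you can slot an LLM-backed classifier into an existing Scikit-Learn-style workflow with very little new code.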
That’s all for now! I hope you enjoyed my latest newsletter and that it sparked something in you.
If you need help or want me to write about a topic that interests you, leave a comment or contact me on my social networks. Or, even better, use the chat!
Articles to Read
Here are some of my latest articles you might have missed this week.
5 Tips for Getting Started with Deep Learning in Machine Learning Mastery.
Optimizing Scikit-learn Models for Better Performance in Statology.
Streamlining Your Machine Learning Workflow with Scikit-learn and Joblib in Statology.
Using Scikit-learn’s Manifold Learning for Non-linear Dimensionality Reduction in Statology.
Evaluating and Improving Model Robustness Using Scikit-learn in Statology.
How to Merge Large DataFrames Efficiently with Pandas in KDNuggets.