GLLM Inference v0.4 to v0.5
As you may have noticed, several legacy modules in GLLM Inference have been marked as deprecated for a while. If your application is still using them, you should have received warning logs.
Backward compatibility will be removed in the upcoming minor version v0.5.0. Please review this migration guide to ensure a smooth transition.
Part 1: Prompt Builder
The following modules are deprecated:
- `AgnosticPromptBuilder`
- `HuggingFacePromptBuilder`
- `LlamaPromptBuilder`
- `MistralPromptBuilder`
- `OpenAIPromptBuilder`
- `MultimodalPromptBuilder`
All capability to build prompts consumable by an LM invoker will be replaced by the new `PromptBuilder` class. `LMRP` will only support this class. The prompt builder no longer has an interface and subclasses.

If you want the `format_as_string` capability previously provided by the deprecated subclasses, please use the new `PromptFormatter` modules. However, you shouldn't need this to work with an LM invoker and `LMRP`.
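For reference, a minimal before/after sketch is shown below. The constructor arguments (`system`, `user`) and the `format` call are assumptions for illustration, not the library's confirmed API; check the `PromptBuilder` reference for the exact signature.

```python
# Hypothetical migration sketch; argument and method names are assumed.

# Before (deprecated, v0.4):
# from gllm_inference.prompt_builder import OpenAIPromptBuilder
# builder = OpenAIPromptBuilder(system="You are a helpful assistant.", user="Summarize: {text}")

# After (v0.5):
from gllm_inference.prompt_builder import PromptBuilder  # import path assumed

builder = PromptBuilder(
    system="You are a helpful assistant.",   # keyword names assumed
    user="Summarize: {text}",
)
prompt = builder.format(text="GLLM Inference migration guide")  # method name assumed
```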
Part 2: LM Invoker
All `...MultimodalLMInvoker` classes are deprecated and are replaced by their `...LMInvoker` counterparts. For example, `OpenAIMultimodalLMInvoker` is replaced by `OpenAILMInvoker`.
Google classes have been merged into a single class: `GoogleLMInvoker`.

All LM invokers have been migrated to:
- Use the native provider SDKs instead of the LangChain implementation.
- Support multimodality, tool calling, structured output, analytics, retry, timeout, and some provider-specific features.

The `bind_tool_params` param is deprecated and is replaced by the `tools` param. The `with_structured_output_params` param is deprecated and is replaced by the `response_schema` param.
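A hedged sketch of the renamed parameters is shown below. Only the class rename and the `tools` / `response_schema` parameter names come from this guide; the constructor keyword `model_name`, the schema style, and the shape of the old `bind_tool_params` / `with_structured_output_params` values are assumptions for illustration.

```python
# Hypothetical migration sketch; everything except the class rename and the
# `tools` / `response_schema` parameter names is assumed for illustration.
from pydantic import BaseModel

from gllm_inference.lm_invoker import OpenAILMInvoker  # import path assumed


class Answer(BaseModel):
    summary: str
    confidence: float


def get_weather(city: str) -> str:
    """Toy tool used only for illustration."""
    return f"Sunny in {city}"


# Before (deprecated, v0.4):
# invoker = OpenAIMultimodalLMInvoker(
#     model_name="gpt-4.1-nano",
#     bind_tool_params={"tools": [get_weather]},           # old value shape assumed
#     with_structured_output_params={"schema": Answer},    # old value shape assumed
# )

# After (v0.5):
invoker = OpenAILMInvoker(
    model_name="gpt-4.1-nano",   # keyword name assumed
    tools=[get_weather],         # replaces bind_tool_params
    response_schema=Answer,      # replaces with_structured_output_params
)
```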
Part 3: EM Invoker
All `...MultimodalEMInvoker` classes are deprecated and are replaced by their `...EMInvoker` counterparts. For example, `TwelveLabsMultimodalEMInvoker` is replaced by `TwelveLabsEMInvoker`.
Google classes have been merged into a single class: `GoogleEMInvoker`.

All EM invokers have been migrated to use the native provider SDKs instead of the LangChain implementation.
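The rename follows the same pattern as the LM invokers; a minimal sketch is below, with the import path, model name, and constructor keyword assumed for illustration.

```python
# Hypothetical EM invoker rename sketch; import path and arguments are assumed.

# Before (deprecated, v0.4):
# from gllm_inference.em_invoker import TwelveLabsMultimodalEMInvoker
# em_invoker = TwelveLabsMultimodalEMInvoker(model_name="example-embedding-model")

# After (v0.5):
from gllm_inference.em_invoker import TwelveLabsEMInvoker  # import path assumed

em_invoker = TwelveLabsEMInvoker(model_name="example-embedding-model")  # keyword name assumed
```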
Part 4: Catalog
Catalogs have been simplified.
The prompt builder catalog will only require these columns:
- `name`
- `system`
- `user`
The LMRP catalog will only require these columns:
- `name`
- `system_template`
- `user_template`
- `model_id`, e.g. `openai/gpt-4.1-nano`
- `credentials`, loaded from an env var, e.g. `OPENAI_API_KEY`
- `config`, a dictionary passed to the LM invoker, e.g. `{"default_hyperparameters": {"temperature": 0.7}}`
- `output_parser_type`, either `none` or `json`
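As an illustration, a single LMRP catalog entry with these columns might look like the following. The entry is expressed as a Python dict only for readability; the actual storage format and the example values (prompt text, entry name) are assumptions, not part of the guide.

```python
# Hypothetical LMRP catalog entry; column names follow the guide,
# the values and the dict representation are for illustration only.
lmrp_catalog_entry = {
    "name": "summarizer",
    "system_template": "You are a concise summarizer.",
    "user_template": "Summarize: {text}",
    "model_id": "openai/gpt-4.1-nano",
    "credentials": "OPENAI_API_KEY",  # credentials are loaded from this env var
    "config": {"default_hyperparameters": {"temperature": 0.7}},  # passed to the LM invoker
    "output_parser_type": "json",  # either "none" or "json"
}
```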
Part 5: Miscellaneous Stuff
- `ModelId` and `ModelProvider` should now be imported from `gllm_inference.schema` instead of `gllm_inference.builder`.
- `retry` and `RetryConfig` should now be imported from `gllm_core.utils.retry` instead of `gllm_inference.utils.retry`.
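The updated import statements look like this, assuming the names are imported directly from the modules listed above:

```python
# Before (deprecated, v0.4):
# from gllm_inference.builder import ModelId, ModelProvider
# from gllm_inference.utils.retry import retry, RetryConfig

# After (v0.5):
from gllm_inference.schema import ModelId, ModelProvider
from gllm_core.utils.retry import retry, RetryConfig
```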