Microsoft Guidance puts ChatGPT and Co. on a (desired) leash

Language models tend to interpret user requests, expressed in the form of prompts, relatively freely. You can rein the models in with long prompts, but this is often inefficient. Programs that are meant to build on AI output, however, require reproducible results. With Guidance, Microsoft has now published a project that aims to tailor AI language models (Large Language Models, LLMs) to user requirements better and more efficiently. It is a dedicated AI templating language with its own syntax that imposes rules on the models. It is aimed primarily at development teams that depend on reliable results when using AI.

With Guidance, programs can be written that implement text generation, prompting, and plausibility checks of the AI models' results in a single step. External tools can even be integrated for the latter. Guidance's syntax is based on the Handlebars web templating language. Unlike ordinary template languages, however, Guidance scripts follow a predefined order that is aligned with the token processing order of the AI models. With the gen command, AI responses can be generated and output at predefined points, or the next logic-based steps can be initiated from there.
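The following minimal sketch illustrates this interleaving of fixed template text and generated output. It is modeled loosely on the examples in the Guidance README at the time of release; the exact API surface, the chosen OpenAI model, and the field names are assumptions for illustration, not guaranteed details:

```python
import guidance

# Select the backing model (an OpenAI completion model is assumed here;
# an API key is expected in the environment).
guidance.llm = guidance.llms.OpenAI("text-davinci-003")

# Handlebars-style template: the fixed text constrains the model,
# while {{gen ...}} marks the points where the LLM may generate output.
program = guidance("""Answer in strict JSON.
{
    "city": "{{city}}",
    "country": "{{gen 'country' stop='"'}}",
    "population_estimate": "{{gen 'population' stop='"'}}"
}""")

# Execute the program; generated values are exposed as named variables.
result = program(city="Karlsruhe")
print(result["country"], result["population"])
```

Because the template dictates where and how much the model may write, the output keeps its structure even across repeated runs, which is the reproducibility the project is aiming for.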

The templating language works with the language models from OpenAI and Hugging Face as well as with Facebook's leaked open-source model Llama, which many self-hosters use. Microsoft provides Guidance in a GitHub repository, along with Jupyter notebooks for trying out the functions yourself.
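Switching between these backends is, going by the README examples, a single assignment; the lines below are a sketch, and the concrete model identifiers (a small Hugging Face model and a local Llama checkpoint path) are placeholders:

```python
import guidance

# Hosted OpenAI model (API key expected in the environment).
guidance.llm = guidance.llms.OpenAI("text-davinci-003")

# Alternatively: a Hugging Face transformers model running locally.
guidance.llm = guidance.llms.Transformers("gpt2")

# Or a self-hosted Llama checkpoint (hypothetical local path).
guidance.llm = guidance.llms.Transformers("/models/llama-7b")
```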



(jvo)
