semantickernel-localLLMs

Sample showing how to run an LLM using LM Studio and interact with the model using Semantic Kernel. (by elbruno)

semantickernel-localLLMs Alternatives

Similar projects and alternatives to semantickernel-localLLMs

  • OllamaSharp

    Ollama API bindings for .NET

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. A higher number therefore means a better semantickernel-localLLMs alternative or higher similarity.

semantickernel-localLLMs reviews and mentions

Posts with mentions or reviews of semantickernel-localLLMs. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-01.
  • 📎 Extending #SemanticKernel using OllamaSharp for chat and text completion
    2 projects | dev.to | 1 Apr 2024
    Hi! In previous posts I shared how to host and chat with a Llama 2 model running locally with Ollama (view post). I also found OllamaSharp (NuGet package and repo): OllamaSharp is a .NET binding for the Ollama API, making it easy to interact with Ollama using your favorite .NET languages. So I decided to try it and create Chat Completion and Text Generation implementations for Semantic Kernel using this library. The full test is a console app using both services with Semantic Kernel.

    Text Generation Service. The Text Generation Service is the easy one: just implement the interface Microsoft.SemanticKernel.TextGeneration.ITextGenerationService, and the generated code looks like this:

    Chat Completion Service. The chat completion requires implementing the interface IChatCompletionService. The code looks like this:

    Test Chat Completion and Text Generation Services. With both services implemented, we can now write code with Semantic Kernel to access them. The following code: creates the two services, text and chat, both with the OllamaSharp implementation; creates a Semantic Kernel builder, registers both services, and builds a kernel; and uses the kernel to run a text generation sample and then a chat history sample. The chat sample also uses a System Message to define the chat behavior for the conversation. This is a test; there are a lot of improvements that can be made here.

    The full code is available here: https://github.com/elbruno/semantickernel-localLLMs. The main readme of the repo also needs to be updated. Happy coding! Greetings, El Bruno. More posts on my blog, ElBruno.com. More info at https://beacons.ai/elbruno.
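The excerpt above describes the two custom services and the kernel wiring but omits the actual code. A minimal sketch of the idea is below, assuming Semantic Kernel 1.x; the class name `OllamaTextGenerationService` and the `CallOllamaAsync` helper are placeholders (the real OllamaSharp call is elided because its API has changed across versions — see the repo for the author's implementation):

```csharp
using System.Runtime.CompilerServices;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.TextGeneration;

// Minimal ITextGenerationService sketch. A real implementation would call
// the local Ollama endpoint (e.g. via OllamaSharp) inside CallOllamaAsync.
public sealed class OllamaTextGenerationService : ITextGenerationService
{
    public IReadOnlyDictionary<string, object?> Attributes { get; } =
        new Dictionary<string, object?>();

    public async Task<IReadOnlyList<TextContent>> GetTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        string completion = await CallOllamaAsync(prompt, cancellationToken);
        return new[] { new TextContent(completion) };
    }

    public async IAsyncEnumerable<StreamingTextContent> GetStreamingTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // Sketch only: yields the whole completion as one streaming chunk.
        string completion = await CallOllamaAsync(prompt, cancellationToken);
        yield return new StreamingTextContent(completion);
    }

    // Placeholder for the OllamaSharp call against http://localhost:11434.
    private static Task<string> CallOllamaAsync(string prompt, CancellationToken ct) =>
        Task.FromResult("(response from the local model)");
}

public static class Demo
{
    public static async Task RunAsync()
    {
        // Register the custom service and build a kernel, then run a prompt.
        var builder = Kernel.CreateBuilder();
        builder.Services.AddSingleton<ITextGenerationService>(
            new OllamaTextGenerationService());
        var kernel = builder.Build();

        var result = await kernel.InvokePromptAsync("Write a haiku about local LLMs.");
        Console.WriteLine(result);
    }
}
```

The chat completion side follows the same pattern: implement `IChatCompletionService`, register it with `builder.Services.AddSingleton<IChatCompletionService>(...)`, and drive it with a `ChatHistory` seeded by `AddSystemMessage` to set the conversation's behavior.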

Stats

Basic semantickernel-localLLMs repo stats
  • Mentions: 1
  • Stars: 35
  • Activity: 6.4
  • Last commit: about 1 month ago

elbruno/semantickernel-localLLMs is an open-source project licensed under the MIT License, which is an OSI-approved license.

The primary programming language of semantickernel-localLLMs is C#.

