Now, that implies you know what you're doing. If you want to get started by running the sample notebooks and the many console examples, you'll want to clone the repository and build it in your dev environment. For C#, that typically means Visual Studio, JetBrains Rider, or VS Code with the Polyglot extension to run the C# notebooks (they make use of the NuGet package), plus the C# and vscode-solution extensions to build the source code and console examples the way you would in Visual Studio or Rider.
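The clone-and-build step above can be sketched roughly as follows, assuming the upstream microsoft/semantic-kernel repository and the .NET SDK's `dotnet` CLI; the sample project path in the last command is illustrative, so check the repository for the actual layout:

```shell
# Clone the upstream repository and move into the C# solution folder
git clone https://github.com/microsoft/semantic-kernel.git
cd semantic-kernel/dotnet

# Restore dependencies and build the solution (requires the .NET SDK)
dotnet build

# Run one of the console examples -- the project name below is a
# placeholder; browse the samples folder for the real project paths
dotnet run --project samples/KernelSyntaxExamples
```

VS Code users can get the same result through the C# and vscode-solution extensions instead of invoking `dotnet` by hand.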
Now, that also implies you will use one of the existing connectors, for instance providing an OpenAI API key to leverage ChatGPT or davinci models (instructions for doing that are provided). If you wish to use your own local LLM hosted on oobabooga, since my pull request hasn't been merged yet, you will need to clone or fork my own fork of the repository, where the new connector is currently available. The connector project to build is located there, and the first step is to verify it works by running the corresponding integration test, which is also located there. Note that you will need to enable the oobabooga API with the appropriate blocking and streaming ports (the integration test uses the default ones).
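A rough sketch of that workflow is below. The fork URL and the test filter are placeholders, since the actual fork and test names live in my repository; the `--api` flag is the usual way to enable the API in oobabooga's text-generation-webui, but the exact flags and default ports may vary with your version, so check its documentation:

```shell
# Clone the fork containing the oobabooga connector
# (the fork URL below is a placeholder -- substitute the actual fork)
git clone https://github.com/<my-fork>/semantic-kernel.git
cd semantic-kernel/dotnet

# Before running the tests, start oobabooga with its API enabled,
# e.g. from the text-generation-webui folder (settings may vary):
#   python server.py --api

# Run only the oobabooga integration tests; the filter is illustrative
dotnet test IntegrationTests --filter "FullyQualifiedName~Oobabooga"
```

If the tests pass, the connector can talk to your local oobabooga instance on the default blocking and streaming ports, and you can reference the connector project from your own app.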