playground
semantic-kernel
| | playground | semantic-kernel |
|---|---|---|
| Mentions | 16 | 47 |
| Stars | 11,674 | 18,111 |
| Growth | 1.1% | 6.4% |
| Activity | 0.0 | 9.9 |
| Latest commit | 3 months ago | 4 days ago |
| Language | TypeScript | C# |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
playground
-
Why do tree-based models still outperform deep learning on tabular data? (2022)
Not the parent, but NNs typically work better when you can't linearize your data. For classification, that means a space in which hyperplanes separate classes, and for regression a space in which a linear approximation is good.
For example, take the circle dataset here: https://playground.tensorflow.org
That doesn't look immediately linearly separable, but since it is 2D we have the insight that parameterizing by radius would do the trick. Now try doing that in 1000 dimensions. Sometimes you can, sometimes you can't or don't want to bother.
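That radius trick is easy to verify numerically. Here is a minimal sketch (my illustration, not from the thread) using scikit-learn's LogisticRegression as the stand-in linear classifier: on the raw (x1, x2) coordinates it sits near chance on two concentric classes, but adding the radius sqrt(x1^2 + x2^2) as a third feature makes a separating hyperplane trivial.

```python
# Sketch: a "circle" dataset like the one on playground.tensorflow.org,
# before and after adding the radius as an explicit feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Inner class near the origin, outer class on a surrounding ring.
r = np.concatenate([rng.uniform(0.0, 1.0, n), rng.uniform(2.0, 3.0, n)])
theta = rng.uniform(0.0, 2 * np.pi, 2 * n)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
y = np.concatenate([np.zeros(n), np.ones(n)])

# A linear model on raw coordinates cannot separate concentric classes.
raw_acc = LogisticRegression().fit(X, y).score(X, y)

# Parameterize by radius: one extra feature, and a hyperplane suffices.
X_radius = np.column_stack([X, np.hypot(X[:, 0], X[:, 1])])
radius_acc = LogisticRegression().fit(X_radius, y).score(X_radius, y)

print(f"raw features:     {raw_acc:.2f}")     # roughly chance level, ~0.5
print(f"+ radius feature: {radius_acc:.2f}")  # ~1.0
```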
-
Introduction to TensorFlow for Deep Learning
For visualisation and some fun: http://playground.tensorflow.org/
- TensorFlow Playground – Tinker with a NN in the Browser
-
Visualization of Common Algorithms
https://seeing-theory.brown.edu/
https://www.3blue1brown.com/
https://playground.tensorflow.org/
-
Stanford A.I. Courses
There’s an interactive neural network you can train here, which can give some intuition on wider vs. deeper networks:
https://mlu-explain.github.io/neural-networks/
See also here:
http://playground.tensorflow.org/
-
Let's revolutionize the CPU together!
This site is worth playing around with to get a feel for neural networks, and to some extent ML in general. There are lots of strategies for statistical learning, and neural nets are only one of them, but they essentially always boil down to figuring out how to build a “classifier” that tries to sort data points into whatever category they best belong in.
-
Curious about Inputs for neural network
I don’t know how much experimenting you’ve done, but many repeated small-scale experiments might give you better intuition at least. I highly recommend this online tool for playing with different variables, even if you’re comfortable coding up your own experiments: http://playground.tensorflow.org
-
Intel Announces Aurora genAI, Generative AI Model With 1 Trillion Parameters
Even if you can’t code, play around with this tool: https://playground.tensorflow.org — you can adjust the shape of the NN and watch how well it classifies the data. Model size obviously matters.
-
Where have all the hackers gone?
I don't think so. You can easily play around in the browser, using JavaScript, or on https://processing.org/, https://playground.tensorflow.org/, https://scratch.mit.edu/, etc.
If anything the problem is that today's kids have too many options. And sure, some are commercial.
-
[Discussion] Questions about linear regression, polynomial features and multilayer NN.
Well, there is no point in using a multilayer linear neural network, because a cascade of linear transformations can be reduced to a single linear transformation, so you can only approximate linear functions. However, if you have prior knowledge about the non-linearity of your data, say you know it is a linear combination of polynomials up to a certain degree, you can expand your input space with an explicit non-linear transformation. For instance, a 1D linear regression can be modeled by 2 input neurons and 1 output neuron whose activation is the identity: the input neuron x_0 takes the constant input 1, the second input neuron x_1 takes your data x, and the output is y = w_0*1 + w_1*x, which is just y = w_0 + w_1*x.

Now say your data follows a polynomial form. The idea is to add input neurons and expand the input to, for instance, X = [1, x, x^2]; you then have 3 input neurons, where the third is an explicit non-linear function of the input, so y = w_0 + w_1*x + w_2*x^2. The general idea is to find a space where the problem becomes linear. In real-life examples these spaces are non-trivial, and the power of neural networks is that they can find such a space by optimization, without the non-linearities being explicitly encoded. Try playing around with https://playground.tensorflow.org/ and you can get an intuition about your question.
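The expansion described above can be checked in a few lines. This is a small sketch (my illustration, with made-up true weights) that fits y = w_0 + w_1*x + w_2*x^2 by ordinary least squares on the expanded input X = [1, x, x^2]; the model stays linear in its weights even though it represents a quadratic function of x.

```python
# Sketch: linear regression on explicitly expanded polynomial features.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 200)
# Synthetic data with true weights w_0=2.0, w_1=-0.5, w_2=1.5 plus noise.
y = 2.0 - 0.5 * x + 1.5 * x**2 + rng.normal(0, 0.1, x.size)

# Explicit non-linear expansion of the input space: X = [1, x, x^2].
X = np.column_stack([np.ones_like(x), x, x**2])

# Ordinary least squares on the expanded features; the model is linear in w.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # close to [2.0, -0.5, 1.5]
```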
semantic-kernel
-
#SemanticKernel – 📎Chat Service demo running Phi-2 LLM locally with #LMStudio
There is an amazing sample on how to create your own LLM Service class to be used in Semantic Kernel. You can view the Sample here: https://github.com/microsoft/semantic-kernel/blob/3451a4ebbc9db0d049f48804c12791c681a326cb/dotnet/samples/KernelSyntaxExamples/Example16_CustomLLM.cs
-
Semantic Tests for SemanticKernel Plugins using skUnit
This week, I had the chance to explore the SemanticKernel code base, particularly the core plugins. SemanticKernel comes equipped with these built-in plugins:
- FLaNK Stack for 04 December 2023
- Semantic Kernel
-
Getting Started with Semantic Kernel and C#
In this article we'll look at the high-level capabilities for building AI orchestration systems in C# with Semantic Kernel, a rapidly maturing open-source AI orchestration framework.
-
Agency: Pure Go LangChain Alternative
I'm using Semantic Kernel (https://github.com/microsoft/semantic-kernel) and it's really nice. Makes building more complex workflows really simple without sacrificing control.
A bunch of examples (https://github.com/microsoft/semantic-kernel/blob/main/dotne...) for how to handle just about anything you need to do with OAI with a lot less boilerplate.
-
New: LangChain templates – fastest way to build a production-ready LLM app
I haven't tried it but there's Microsoft semantic-kernel.
https://github.com/microsoft/semantic-kernel
-
Overview: AI Assembly Architectures
Semantic Kernel github.com/microsoft/semantic-kernel
-
Automated Routing of Tasks to Optimal Models: A PR for Semantic-Kernel
The need for efficient model routing has been a point of discussion in the community. Addressing this, I've submitted a pull request to Semantic-Kernel that introduces an automated multi-model connector.
What are some alternatives?
clip-interrogator - Image to prompt with BLIP and CLIP
langchain - ⚡ Building applications with LLMs through composability ⚡ [Moved to: https://github.com/langchain-ai/langchain]
dspy - DSPy: The framework for programming—not prompting—foundation models
langchain - 🦜🔗 Build context-aware reasoning applications
nvim-treesitter - Nvim Treesitter configurations and abstraction layer
guidance - A guidance language for controlling large language models.
pyllama - LLaMA: Open and Efficient Foundation Language Models
guidance - A guidance language for controlling large language models. [Moved to: https://github.com/guidance-ai/guidance]
lake.nvim - A simplified ocean color scheme with treesitter support
autogen - A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
developer - the first library to let you embed a developer agent in your own app!
private-gpt - Interact with your documents using the power of GPT, 100% privately, no data leaks