playground
nvim-treesitter
| | playground | nvim-treesitter |
|---|---|---|
| Mentions | 16 | 300 |
| Stars | 11,674 | 9,426 |
| Growth | 1.1% | 4.8% |
| Activity | 0.0 | 9.9 |
| Latest commit | 3 months ago | 6 days ago |
| Language | TypeScript | Scheme |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
playground
-
Why do tree-based models still outperform deep learning on tabular data? (2022)
Not the parent, but NNs typically work better when you can't linearize your data. For classification, that means a space in which hyperplanes separate classes, and for regression a space in which a linear approximation is good.
For example, take the circle dataset here: https://playground.tensorflow.org
That doesn't look immediately linearly separable, but since it is 2D we have the insight that parameterizing by radius would do the trick. Now try doing that in 1000 dimensions. Sometimes you can, sometimes you can't or don't want to bother.
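The radius trick above can be sketched in a few lines of numpy. The dataset here is a synthetic stand-in for the playground's circle dataset (the radii and threshold are illustrative choices, not the playground's actual parameters): the raw (x1, x2) points are not linearly separable, but a single engineered feature, the radius, separates them with a 1-D threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "circle" dataset: class 0 inside radius 1,
# class 1 in an annulus between radius 2 and 3.
n = 200
r_inner = rng.uniform(0.0, 1.0, n)
r_outer = rng.uniform(2.0, 3.0, n)
theta = rng.uniform(0.0, 2 * np.pi, 2 * n)
r = np.concatenate([r_inner, r_outer])
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])  # raw 2-D features
y = np.concatenate([np.zeros(n), np.ones(n)])

# No single line through (x1, x2) separates the two classes, but the
# engineered feature radius = sqrt(x1^2 + x2^2) makes a 1-D threshold work.
radius = np.sqrt((X ** 2).sum(axis=1))
pred = (radius > 1.5).astype(float)
accuracy = (pred == y).mean()
print(accuracy)  # 1.0 by construction
```

In 1000 dimensions there is rarely such an obvious hand-crafted feature, which is where letting a network learn the transformation earns its keep.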
-
Introduction to TensorFlow for Deep Learning
For visualisation and some fun: http://playground.tensorflow.org/
- TensorFlow Playground – Tinker with a NN in the Browser
-
Visualization of Common Algorithms
https://seeing-theory.brown.edu/
https://www.3blue1brown.com/
https://playground.tensorflow.org/
-
Stanford A.I. Courses
There’s an interactive neural network you can train here, which can give some intuition on wider vs larger networks:
https://mlu-explain.github.io/neural-networks/
See also here:
http://playground.tensorflow.org/
-
Let's revolutionize the CPU together!
This site is worth playing around with to get a feel for neural networks, and somewhat about ML in general. There are lots of strategies for statistical learning, and neural nets are only one of them, but they essentially always boil down into figuring out how to build a “classifier”, to try to classify data points into whatever category they best belong in.
-
Curious about Inputs for neural network
I don’t know how much experimenting you’ve done, but many repeated small-scale experiments might give you a better intuition at least. I highly recommend this online tool for playing with different environmental variables, even if you’re comfortable coding up your own experiments: http://playground.tensorflow.org
-
Intel Announces Aurora genAI, Generative AI Model With 1 Trillion Parameters
Even if you can’t code, play around with this tool: https://playground.tensorflow.org — you can adjust the shape of the NN and watch how well it classifies the data. Model size obviously matters.
-
Where have all the hackers gone?
I don't think so. You can easily play around in the browser, using Javascript, or on https://processing.org/, https://playground.tensorflow.org/, https://scratch.mit.edu/, etc.
If anything the problem is that today's kids have too many options. And sure, some are commercial.
-
[Discussion] Questions about linear regression, polynomial features and multilayer NN.
Well, there is no point in using a multilayer *linear* neural network, because a cascade of linear transformations can be reduced to a single linear transformation, so you can only approximate linear functions. However, if you have prior knowledge about the non-linearity of your data (say you know it is a linear combination of polynomials up to a certain degree), you can expand your input space by explicitly applying non-linear transformations.

For instance, a 1-D linear regression can be modeled by 2 input neurons and 1 output neuron whose activation is the identity. The input neuron x0 takes a constant input of 1 and the second input neuron x1 takes your data x, so the output is y = w0 * 1 + w1 * x = w0 + w1 * x. If your data follows a polynomial form, the idea is to add input neurons and expand the input to, for instance, X = [1, x, x²]; with 3 input neurons, where the third is an explicit non-linear function of the input, you get y = w0 + w1 * x + w2 * x².

The general idea is to find a space in which the problem becomes linear. In real-life examples these spaces are non-trivial; the power of neural networks is that they can find such a space by optimization, without the non-linearities being encoded explicitly. Try playing around with https://playground.tensorflow.org/ to get an intuition about your question.
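The expansion described above, X = [1, x, x²] with weights fit linearly, can be checked directly with numpy. This is a minimal sketch under assumed data (a noiseless quadratic, chosen so the recovered weights are exact); ordinary least squares plays the role of the trained output neuron.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data generated from a known quadratic: y = 2 + 3x - x^2 (no noise,
# so the fit should recover the coefficients exactly).
x = rng.uniform(-2.0, 2.0, 50)
y = 2.0 + 3.0 * x - x ** 2

# Expand the 1-D input to X = [1, x, x^2]. The model
# y = w0 + w1*x + w2*x^2 is linear in the weights w, so
# ordinary least squares solves it in closed form.
X = np.column_stack([np.ones_like(x), x, x ** 2])
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 6))  # approximately [2, 3, -1]
```

The point is exactly the one made above: the problem is non-linear in x but linear in the expanded feature space, so a "linear" model suffices once the right space is found.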
nvim-treesitter
-
JetBrains' unremovable AI assistant meets irresistible outcry
I suggest looking for blog posts about this; you're going to want to pick out a plugin manager and related tooling. It's kind of like a package manager for Neovim. You can install everything manually, but usually you manually install only the plugin manager and it gives you commands to manage the rest of your plugins.
These two plugins are the bare minimum in my view.
https://github.com/nvim-treesitter/nvim-treesitter
Treesitter gives you much better syntax highlighting based on a parser for a given language.
https://github.com/neovim/nvim-lspconfig
This plugin helps you connect to a given language LSP quickly with sensible defaults. You more or less pick your language from here and copy paste a snippet, and then install the relevant LSP:
https://github.com/neovim/nvim-lspconfig/blob/master/doc/ser...
For Python you'll want pylsp. For JavaScript it will depend on what frontend framework you're using, I probably can't help you there.
pylsp itself takes some plugins and you'll probably want them. https://github.com/python-lsp/python-lsp-server
Best of luck! Happy hacking.
-
Help needed with Treesitter sql injection
It was changed in https://github.com/nvim-treesitter/nvim-treesitter/commit/78b54eb
-
Do I need NeoVIM?
https://github.com/hrsh7th/nvim-cmp : an autocompletion engine.
https://github.com/nvim-treesitter/nvim-treesitter : allows Neovim to install parsers so it can do things like code highlighting.
https://github.com/williamboman/mason.nvim : not strictly necessary, but gives you access to a repo of LSP servers and lets you install and configure them without actively messing about in config files.
https://github.com/neovim/nvim-lspconfig : also not strictly necessary, but vastly simplifies LSP setup.
https://github.com/williamboman/mason-lspconfig.nvim : lets the above two plugins talk to each other more easily.
- Problem with highlighting when attempting to create own treesitter parser
-
neorg problem, all other plugins deactivate when added to init.lua
```lua
vim.opt.rtp:prepend(lazypath)
require('lazy').setup({
  {
    "nvim-neorg/neorg",
    build = ":Neorg sync-parsers",
    opts = {
      load = {
        ["core.defaults"] = {},  -- Loads default behaviour
        ["core.concealer"] = {}, -- Adds pretty icons to your documents
        ["core.dirman"] = {      -- Manages Neorg workspaces
          config = {
            workspaces = {
              notes = "~/notes",
            },
            default_workspace = "notes",
          },
        },
      },
    },
    dependencies = {
      { "nvim-lua/plenary.nvim" },
      {
        -- YOU ALMOST CERTAINLY WANT A MORE ROBUST nvim-treesitter SETUP
        -- see https://github.com/nvim-treesitter/nvim-treesitter
        "nvim-treesitter/nvim-treesitter",
        opts = {
          auto_install = true,
          highlight = {
            enable = true,
            additional_vim_regex_highlighting = false,
          },
        },
        config = function(_, opts)
          require('nvim-treesitter.configs').setup(opts)
        end,
      },
      {
        "folke/tokyonight.nvim",
        config = function()
          vim.cmd.colorscheme "tokyonight-storm"
        end,
      },
    },
  },
})
require 'plugins'
```
-
Getting Treesitter to work for Windows 10
Change the compiler to 'llvm' and install the Visual Studio Build Tools command-line tooling; at least that is what worked for me without problems. If you are using C++ then I would assume you have Visual Studio installed already. If you need more info, follow the treesitter Windows support guide.
-
Just come back up out of the rabbit hole - TS unsets syntax variable by design!
After a lot of time spent yesterday I took a fresh look today and then thought to myself - what if this is what TS does by design? A few clicks later and I found this https://github.com/nvim-treesitter/nvim-treesitter/issues/1327
- What is this color scheme
-
nvim-treesitter erroring on Windows 11 Pro
I've followed the official guide for nvim-treesitter support on Windows, but I'm having problems making it work. I keep getting a compilation error for any parser I try to install using TSInstall. If instead I use TSInstallSync I don't get errors but the parser is not correctly installed. My setup uses lazyvim and I installed LLVM using winget to have a C compiler.
-
Neovim can't find C compiler
I have read that gcc on Windows doesn't always provide the support treesitter needs; I have seen people prefer clang over gcc on Windows. Please also see the Windows support section in treesitter's repo. Unfortunately I cannot help further as I don't use Windows for coding, but I hope you can deduce something to solve your problem from the above link (if you haven't already read through it).
What are some alternatives?
clip-interrogator - Image to prompt with BLIP and CLIP
coc.nvim - Nodejs extension host for vim & neovim, load extensions like VSCode and host language servers.
dspy - DSPy: The framework for programming—not prompting—foundation models
nvim-lspconfig - Quickstart configs for Nvim LSP
pyllama - LLaMA: Open and Efficient Foundation Language Models
vim-polyglot - A solid language pack for Vim.
lake.nvim - A simplified ocean color scheme with treesitter support
vim-python-pep8-indent - A nicer Python indentation style for vim.
developer - the first library to let you embed a developer agent in your own app!
packer.nvim - A use-package inspired plugin manager for Neovim. Uses native packages, supports Luarocks dependencies, written in Lua, allows for expressive config
machine-learning-specialization-andrew-ng - A collection of notes and implementations of machine learning algorithms from Andrew Ng's machine learning specialization.
tree-sitter - An incremental parsing system for programming tools