| | playground | packer.nvim |
|---|---|---|
| Mentions | 16 | 180 |
| Stars | 11,707 | 7,603 |
| Growth | 0.8% | - |
| Activity | 0.0 | 3.4 |
| Last commit | 3 months ago | 2 months ago |
| Language | TypeScript | Lua |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
playground
-
Why do tree-based models still outperform deep learning on tabular data? (2022)
Not the parent, but NNs typically work better when you can't linearize your data. For classification, that means a space in which hyperplanes separate classes, and for regression a space in which a linear approximation is good.
For example, take the circle dataset here: https://playground.tensorflow.org
That doesn't look immediately linearly separable, but since it is 2D we have the insight that parameterizing by radius would do the trick. Now try doing that in 1000 dimensions. Sometimes you can, sometimes you can't or don't want to bother.
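A quick sketch of that insight in Python. The dataset below is a hand-rolled stand-in for the Playground's circle data (inner disk vs. outer ring), not the real thing:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Inner disk (class 0) and outer ring (class 1), in polar coordinates.
r = np.concatenate([rng.uniform(0.0, 1.0, n // 2),
                    rng.uniform(2.0, 3.0, n // 2)])
theta = rng.uniform(0.0, 2.0 * np.pi, n)
x, y = r * np.cos(theta), r * np.sin(theta)
labels = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])

# Best accuracy of any axis-aligned threshold on x (a crude linear classifier):
best_linear = max(
    max((labels == (x > c)).mean(), (labels == (x <= c)).mean())
    for c in np.linspace(-3.0, 3.0, 61)
)

# After re-parameterizing by radius, one threshold separates the classes perfectly:
radius = np.sqrt(x**2 + y**2)
radial_acc = (labels == (radius > 1.5)).mean()
```

A linear split on the raw coordinates stays well below perfect, while a single threshold on the radius nails it; the hard part in high dimensions is knowing which transformation plays the role of the radius.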
-
Introduction to TensorFlow for Deep Learning
For visualisation and some fun: http://playground.tensorflow.org/
- TensorFlow Playground – Tinker with a NN in the Browser
-
Visualization of Common Algorithms
https://seeing-theory.brown.edu/
https://www.3blue1brown.com/
https://playground.tensorflow.org/
-
Stanford A.I. Courses
There’s an interactive neural network you can train here, which can give some intuition on wider vs larger networks:
https://mlu-explain.github.io/neural-networks/
See also here:
http://playground.tensorflow.org/
-
Let's revolutionize the CPU together!
This site is worth playing around with to get a feel for neural networks, and somewhat about ML in general. There are lots of strategies for statistical learning, and neural nets are only one of them, but they essentially always boil down into figuring out how to build a “classifier”, to try to classify data points into whatever category they best belong in.
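For a concrete (toy) sense of what "building a classifier" means, here is a nearest-centroid classifier, one of the simplest statistical-learning strategies; the data and all names below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated Gaussian blobs, labeled 0 and 1.
a = rng.normal(loc=[-2.0, 0.0], scale=0.5, size=(100, 2))
b = rng.normal(loc=[2.0, 0.0], scale=0.5, size=(100, 2))
points = np.vstack([a, b])
labels = np.array([0] * 100 + [1] * 100)

# "Training": summarize each class by its centroid.
centroids = np.array([points[labels == k].mean(axis=0) for k in (0, 1)])

def classify(p):
    """Assign p to the class whose centroid is nearest."""
    return int(np.argmin(np.linalg.norm(centroids - p, axis=1)))

preds = np.array([classify(p) for p in points])
accuracy = (preds == labels).mean()
```

Every strategy, neural nets included, is a fancier version of this loop: summarize the training data somehow, then assign new points to the best-fitting category.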
-
Curious about Inputs for neural network
I don’t know how much experimenting you’ve done, but many repeated small-scale experiments might give you a better intuition at least. I highly recommend this online tool for playing with different environmental variables, even if you’re comfortable coding up your own experiments: http://playground.tensorflow.org
-
Intel Announces Aurora genAI, Generative AI Model With 1 Trillion Parameters
Even if you can’t code, play around with this tool: https://playground.tensorflow.org — you can adjust the shape of the NN and watch how well it classifies the data. Model size obviously matters.
-
Where have all the hackers gone?
I don't think so. You can easily play around in the browser, using Javascript, or on https://processing.org/, https://playground.tensorflow.org/, https://scratch.mit.edu/, etc.
If anything the problem is that today's kids have too many options. And sure, some are commercial.
-
[Discussion] Questions about linear regression, polynomial features and multilayer NN.
Well, there is no point in using a multilayer linear neural network, because a cascade of linear transformations reduces to a single linear transformation, so you can only approximate linear functions.

However, if you have prior knowledge about the non-linearity of your data, say you know it is a linear combination of polynomials up to a certain degree, you can expand your input space with an explicit non-linear transformation. For instance, a 1D linear regression can be modeled by 2 input neurons and 1 output neuron whose activation is the identity. The input neuron x0 takes a constant input, namely 1, and the second input neuron x1 takes your data x. The output neuron computes y = w_0 * 1 + w_1 * x, which equals y = w_0 + w_1 * x. Now say your data follows a polynomial form: the idea is to add input neurons and expand the input to, for instance, X = [1, x, x^2]. In this case you have 3 input neurons, where the third is an explicit non-linear function of the input, so y = w_0 + w_1 x + w_2 x^2.

The general idea is to find a space in which the problem becomes linear. In real-life examples these spaces are non-trivial; the power of neural networks is that they can find such a space by optimization, without these non-linearities being encoded explicitly. Try playing around with https://playground.tensorflow.org/ to get an intuition about your question.
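The worked example above can be checked in a few lines of Python: fitting the weights on the expanded inputs X = [1, x, x^2] is ordinary linear least squares, even though the fitted curve is a parabola (the target polynomial here is made up for illustration):

```python
import numpy as np

# Target: y = 2 + 3x - x^2, a function that is non-linear in x.
x = np.linspace(-2.0, 2.0, 50)
y = 2 + 3 * x - x**2

# Expand the input space with an explicit non-linear feature, X = [1, x, x^2].
X = np.column_stack([np.ones_like(x), x, x**2])

# Plain linear least squares on the expanded inputs recovers [w0, w1, w2].
w, *_ = np.linalg.lstsq(X, y, rcond=None)
```

`lstsq` recovers w ≈ [2, 3, -1], because the model is linear in the weights even though it is non-linear in x.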
packer.nvim
-
thethethe.nvim - neovim friendly autocorrect plugin
packer
-
Help Enabling Powerline Font for Lightline in Kitty NeoVim
```
-- Check if packer.nvim is already installed
local ensure_packer = function()
  local fn = vim.fn
  local install_path = fn.stdpath('data') .. '/site/pack/packer/start/packer.nvim'
  if fn.empty(fn.glob(install_path)) > 0 then
    -- If not installed, clone it from GitHub
    fn.system({ 'git', 'clone', '--depth', '1', 'https://github.com/wbthomason/packer.nvim', install_path })
    -- Load packer.nvim
    vim.cmd [[packadd packer.nvim]]
    return true
  end
  return false
end
```
-
Installing neovim on windows 10 does not work (no really, it doesn't)
```
local ensure_packer = function()
  local fn = vim.fn
  local install_path = fn.stdpath('data')..'/site/pack/packer/start/packer.nvim'
  if fn.empty(fn.glob(install_path)) > 0 then
    fn.system({ 'git', 'clone', '--depth', '1', 'https://github.com/wbthomason/packer.nvim', install_path })
    vim.cmd [[packadd packer.nvim]]
    return true
  end
  return false
end
```
-
Issue with treesitter highlights, disappears after 5 seconds each time
```
local fn = vim.fn

-- Automatically install packer
local install_path = fn.stdpath("data") .. "/site/pack/packer/start/packer.nvim"
if fn.empty(fn.glob(install_path)) > 0 then
  PACKER_BOOTSTRAP = fn.system({
    "git", "clone", "--depth", "1",
    "https://github.com/wbthomason/packer.nvim",
    install_path,
  })
  print("Installing packer, close and reopen Neovim...")
  vim.cmd([[packadd packer.nvim]])
end

-- Autocommand that reloads neovim whenever you save the plugins.lua file
vim.cmd([[
  augroup packer_user_config
    autocmd!
    autocmd BufWritePost plugins.lua source <afile> | PackerSync
  augroup end
]])

-- Use a protected call so we don't error out on first use
local status_ok, packer = pcall(require, "packer")
if not status_ok then
  return
end

-- Have packer use a popup window
packer.init({
  display = {
    open_fn = function()
      return require("packer.util").float({ border = "rounded" })
    end,
  },
})

-- Install your plugins here
return packer.startup(function(use)
  -- My plugins here
  use({ "wbthomason/packer.nvim" }) -- Have packer manage itself
  use({ "nvim-lua/plenary.nvim" }) -- Useful lua functions used by lots of plugins
  use({ "windwp/nvim-autopairs" }) -- Autopairs, integrates with both cmp and treesitter
  use({ "numToStr/Comment.nvim" })
  use({ "JoosepAlviste/nvim-ts-context-commentstring" })
  use({ "kyazdani42/nvim-web-devicons" })
  use({ "akinsho/bufferline.nvim" })
  use({ "moll/vim-bbye" })
  use({ "nvim-lualine/lualine.nvim" })
  use({ "akinsho/toggleterm.nvim" })
  use({ "ahmedkhalf/project.nvim" })
  use({ "lewis6991/impatient.nvim" })
  use({ "lukas-reineke/indent-blankline.nvim" })
  use({ "goolord/alpha-nvim" })
  use("folke/which-key.nvim")

  -- Colorschemes
  use({ "folke/tokyonight.nvim" })
  use("lunarvim/darkplus.nvim")
  use("bluz71/vim-moonfly-colors")
  use("fcpg/vim-fahrenheit")
  use("rainglow/vim")
  use("wojciechkepka/vim-github-dark")
  use("gavinok/spaceway.vim")
  use({ "mcchrish/zenbones.nvim", requires = "rktjmp/lush.nvim" })
  use({ "ellisonleao/gruvbox.nvim" }) -- Gruvbox theme

  -- LSP
  use({ "neovim/nvim-lspconfig" }) -- enable LSP
  use({ "williamboman/nvim-lsp-installer" }) -- simple to use language server installer
  use({ "jose-elias-alvarez/null-ls.nvim" }) -- for formatters and linters

  -- Telescope
  use({ "nvim-telescope/telescope.nvim" })

  -- Treesitter
  use({ "nvim-treesitter/nvim-treesitter", run = ":TSUpdate" })
  use("nvim-treesitter/nvim-treesitter-context")
  use("nvim-treesitter/playground")

  -- Editor plugins
  use({ "karb94/neoscroll.nvim" })

  -- Git
  use({ "lewis6991/gitsigns.nvim" })

  -- LSP Zero
  use({
    "VonHeikemen/lsp-zero.nvim",
    requires = {
      -- LSP Support
      { "neovim/nvim-lspconfig" }, -- Required
      { "williamboman/mason.nvim" }, -- Optional
      { "williamboman/mason-lspconfig.nvim" }, -- Optional
      -- Autocompletion
      { "hrsh7th/nvim-cmp" }, -- Required
      { "hrsh7th/cmp-buffer" },
      { "hrsh7th/cmp-path" },
      { "hrsh7th/cmp-nvim-lua" },
      { "hrsh7th/cmp-nvim-lsp" }, -- Required
      { "L3MON4D3/LuaSnip" }, -- Required
      { "rafamadriz/friendly-snippets" },
    },
  })

  -- Automatically set up your configuration after cloning packer.nvim
  -- Put this at the end after all plugins
  if PACKER_BOOTSTRAP then
    require("packer").sync()
  end
end)
```
-
Editing init.lua with lua_ls on gives "Undefined global : vim" ?
```
require('packer').startup(function(use)
  use 'https://github.com/wbthomason/packer.nvim'
  use 'https://github.com/neovim/nvim-lspconfig'
end)
```
- error message whenever I write a file
-
[Help] Packer.nvim
```
git clone --depth 1 https://github.com/wbthomason/packer.nvim \
  ~/.local/share/nvim/site/pack/packer/start/packer.nvim
```
-
Pyright Won't Let me Quit Python Files
```
vim.g.maplocalleader = " "
vim.g.mapleader = " "

local ensure_packer = function()
  local fn = vim.fn
  local install_path = fn.stdpath("data") .. "/site/pack/packer/start/packer.nvim"
  if fn.empty(fn.glob(install_path)) > 0 then
    fn.system({ "git", "clone", "--depth", "1", "https://github.com/wbthomason/packer.nvim", install_path })
    vim.cmd([[packadd packer.nvim]])
    return true
  end
  return false
end
```
- [Neovim] A Lua-based package manager
What are some alternatives?
clip-interrogator - Image to prompt with BLIP and CLIP
vim-plug - :hibiscus: Minimalist Vim Plugin Manager
dspy - DSPy: The framework for programming—not prompting—foundation models
lazy.nvim - 💤 A modern plugin manager for Neovim
nvim-treesitter - Nvim Treesitter configurations and abstraction layer
pyllama - LLaMA: Open and Efficient Foundation Language Models
nvim-lspconfig - Quickstart configs for Nvim LSP
lake.nvim - A simplified ocean color scheme with treesitter support
paq-nvim - 🌚 Neovim package manager
developer - the first library to let you embed a developer agent in your own app!
gruvbox.nvim - Lua port of the most famous vim colorscheme