XMem Alternatives
Similar projects and alternatives to XMem
- segment-anything: Discontinued. Provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks showing how to use the model.
- DeepSpeed: A deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- hivemind: Decentralized deep learning in PyTorch, built to train models across thousands of volunteers around the world.
- yolov7: Implementation of the paper "YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors".
- openpose: OpenPose, a real-time multi-person keypoint detection library for body, face, hands, and foot estimation.
- Track-Anything: A flexible and interactive tool for video object tracking and segmentation, based on Segment Anything, XMem, and E2FGVI.
- deeplab2: A TensorFlow library for deep labeling, aiming to provide a unified, state-of-the-art codebase for dense pixel labeling tasks.
- multiface: Hosts the Multiface dataset, a multi-view dataset of multiple identities performing a sequence of facial expressions.
- Cream: Discontinued. A collection of NAS and Vision Transformer work. [Moved to: https://github.com/microsoft/AutoML]
- EfficientZero: Open-source codebase for EfficientZero, from "Mastering Atari Games with Limited Data" (NeurIPS 2021).
- msn: Discontinued. Masked Siamese Networks for Label-Efficient Learning (https://arxiv.org/abs/2204.07141).
XMem discussion
XMem reviews and mentions
- [D] Which open source models can replicate Wonder Dynamics's drag-and-drop CG characters? "Use a segmentation model (SAM) combined with an inpainting model (E2FGVI) and XMem to cut out the live-action subject."
- Track-Anything: a flexible and interactive tool for video object tracking and segmentation, based on Segment Anything and XMem. "Nvm, just found the occlusion video on https://github.com/hkchengrex/XMem. Holy shit."
- XMem: Long-Term Video Object Segmentation with an Atkinson-Shiffrin Memory Model
- [D] Most important AI papers this year so far, in my opinion (+ Proto-AGI speculation at the end): XMem: Long-Term Video Object Segmentation with an Atkinson-Shiffrin Memory Model (included for its Atkinson-Shiffrin memory model). Paper: https://arxiv.org/abs/2207.07115 | GitHub: https://github.com/hkchengrex/XMem
- [D] Most Popular AI Research July 2022 pt. 2 - Ranked Based On GitHub Stars
- I trained a neural net to watch Super Smash Bros: "Yeah, MiVOS would speed up your tagging a lot. I was also curious if you saw XMem, which just came out. I found that worked really well too."
- University of Illinois Researchers Develop XMem: A Long-Term Video Object Segmentation Architecture Inspired by the Atkinson-Shiffrin Memory Model
- [R] Unicorn 🦄: Towards Grand Unification of Object Tracking (video demo): "Have you checked XMem?"
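Several mentions above describe the same recipe: segment the subject on the first frame with SAM, propagate the mask through the rest of the video with XMem, then inpaint the vacated region with E2FGVI. A minimal sketch of that data flow, where every function is a hypothetical NumPy stub standing in for the real model (the actual projects expose their own, more involved inference APIs):

```python
import numpy as np

def sam_segment_first_frame(frame):
    """Stub for SAM: return a binary subject mask for one frame."""
    h, w, _ = frame.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4] = True  # toy region
    return mask

def xmem_propagate(frames, first_mask):
    """Stub for XMem: propagate the first-frame mask through the video."""
    return [first_mask.copy() for _ in frames]  # toy: mask never moves

def e2fgvi_inpaint(frame, mask):
    """Stub for E2FGVI: fill the masked region (here: zero it out)."""
    out = frame.copy()
    out[mask] = 0
    return out

def cut_out_subject(frames):
    """Segment on frame 0, propagate the mask, inpaint each frame."""
    first_mask = sam_segment_first_frame(frames[0])
    masks = xmem_propagate(frames, first_mask)
    return [e2fgvi_inpaint(f, m) for f, m in zip(frames, masks)]

# Toy "video": three all-white 8x8 RGB frames.
video = [np.full((8, 8, 3), 255, dtype=np.uint8) for _ in range(3)]
result = cut_out_subject(video)
```

The point is the ordering, not the stubs: segmentation happens once, mask propagation is per-frame, and inpainting consumes frame-mask pairs, which is how Track-Anything wires these three models together.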
Stats
hkchengrex/XMem is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of XMem is Python.