DNABERT

DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome (by jerryji1993)

DNABERT Alternatives

Similar projects and alternatives to DNABERT

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular DNABERT alternative or a project with higher similarity.

DNABERT reviews and mentions

Posts with mentions or reviews of DNABERT. We have used some of these posts to build our list of alternatives and similar projects.
  • [D] New to DNABERT
    1 project | /r/MachineLearning | 3 Nov 2023
    If I want to get started, they said it's optional to pre-train (so you can skip to step 3). This is where I got tripped up: "Note that the sequences are in kmer format, so you will need to convert your sequences into that." From what I understand, you need to do this so that all of the sequences are the same length? So kmer=6 means all of the sequences are length 6? Someone suggested that I take the first nucleotide in the promoter and grab 3 nucleotides before and 3 nucleotides after (+/-3 bases). I don't think that's how the kmer thing works though? I tried replicating how I think it works down below (I got confused on the last row of the 'after' df). Please correct me if I'm wrong!
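To clear up the confusion in the post above: k-mer conversion does not truncate sequences to length k, and it is not a window around a single position. DNABERT tokenizes a sequence by sliding a window of length k across it with stride 1, so a sequence of length L becomes L − k + 1 overlapping k-mer tokens joined by spaces; the sequences themselves can be any length. A minimal sketch of that conversion (modeled on the `seq2kmer` helper in the DNABERT repo; the function name here is illustrative):

```python
def seq2kmer(seq, k):
    """Convert a DNA sequence into overlapping k-mers (sliding window,
    stride 1), space-joined -- the input format DNABERT expects."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    return " ".join(kmers)

# A length-8 sequence with k=6 yields 8 - 6 + 1 = 3 overlapping tokens:
print(seq2kmer("ATCGTTCA", 6))  # -> "ATCGTT TCGTTC CGTTCA"
```

So for k=6, each token is 6 nucleotides long, but the tokenized sequence as a whole keeps (almost) the full length of the original, with adjacent tokens overlapping by 5 bases.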

Stats

Basic DNABERT repo stats
  • Mentions: 1
  • Stars: 543
  • Activity: 3.1
  • Last commit: about 2 months ago

jerryji1993/DNABERT is an open source project licensed under the Apache License 2.0, which is an OSI-approved license.

The primary programming language of DNABERT is Python.
