DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome
Transcription factor binding site prediction for novel DNA sequences, aiding mutation identification and drug discovery
This project implements a hybrid machine learning approach for classifying breast cancer from DNA sequences using bidirectional embeddings generated by DNABERT. The study processes over 46 million high-quality DNA sequences to distinguish between cancerous and non-cancerous genomic material.
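The sketch below illustrates the general embedding-then-classify pattern this kind of project describes: DNA sequences are split into overlapping 6-mers, encoded with a pre-trained DNABERT checkpoint, and the resulting embeddings are fed to a conventional classifier. The checkpoint name "zhihan1996/DNA_bert_6", the toy sequences, and the logistic-regression head are assumptions for illustration, not the project's actual pipeline.

```python
# Minimal sketch of embedding-based DNA sequence classification with DNABERT.
# Assumes the `transformers` and `scikit-learn` packages are installed and that the
# 6-mer DNABERT checkpoint "zhihan1996/DNA_bert_6" is available on the Hugging Face Hub.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

def to_kmers(seq: str, k: int = 6) -> str:
    """Split a DNA sequence into overlapping k-mers (the tokenization DNABERT expects)."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

tokenizer = AutoTokenizer.from_pretrained("zhihan1996/DNA_bert_6")  # assumed checkpoint name
model = AutoModel.from_pretrained("zhihan1996/DNA_bert_6")
model.eval()

def embed(seq: str) -> torch.Tensor:
    """Return the [CLS] embedding of a DNA sequence."""
    inputs = tokenizer(to_kmers(seq), return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state[:, 0, :].squeeze(0)  # [CLS] token representation

# Toy labelled data (placeholders, not real cancer genomics data).
sequences = ["ATGCGTACGTTAGC" * 4, "GGGCCCAAATTTGG" * 4]
labels = [1, 0]

# Train a simple downstream classifier on the frozen DNABERT embeddings.
X = torch.stack([embed(s) for s in sequences]).numpy()
clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```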
Promoter detection of DNA sequences using Transformers from scratch and DNABERT