# tokenizer

A grammar describes the syntax of a programming language, and might be defined in Backus-Naur form (BNF). A lexer performs lexical analysis, turning text into tokens. A parser takes tokens and builds a data structure like an abstract syntax tree (AST). The parser is concerned with context: does the sequence of tokens fit the grammar? A compiler's front end combines a lexer and parser built for a specific grammar; a full compiler adds later stages such as semantic analysis and code generation.
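To illustrate the lexing step described above, here is a minimal sketch for a toy arithmetic grammar. The token names and the `tokenize` function are assumptions for illustration, not drawn from any repository listed on this page; a parser would then consume the resulting token stream and check it against the grammar.

```typescript
type TokenType = "NUMBER" | "PLUS" | "MINUS" | "STAR" | "SLASH" | "LPAREN" | "RPAREN";

interface Token {
  type: TokenType;
  value: string;
}

function tokenize(source: string): Token[] {
  const tokens: Token[] = [];
  // Single-character tokens of the toy grammar.
  const single: Record<string, TokenType> = {
    "+": "PLUS", "-": "MINUS", "*": "STAR", "/": "SLASH", "(": "LPAREN", ")": "RPAREN",
  };
  let i = 0;
  while (i < source.length) {
    const ch = source[i];
    if (/\s/.test(ch)) { i++; continue; }                            // skip whitespace
    if (single[ch]) { tokens.push({ type: single[ch], value: ch }); i++; continue; }
    if (/[0-9]/.test(ch)) {                                          // consume a run of digits as one NUMBER
      const start = i;
      while (i < source.length && /[0-9]/.test(source[i])) i++;
      tokens.push({ type: "NUMBER", value: source.slice(start, i) });
      continue;
    }
    throw new Error(`Unexpected character '${ch}' at position ${i}`);
  }
  return tokens;
}

console.log(tokenize("12 + 3 * (4 - 5)"));
// [ { type: 'NUMBER', value: '12' }, { type: 'PLUS', value: '+' }, ... ]
```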

Here are 111 public repositories matching this topic...

mern-advanced-authentication-authorization

This project is a full-stack authentication system built using the MERN stack. It offers user registration, login with email or Google OAuth, profile management, and token-based authentication. Key features include error handling, password hashing, and JWT implementation. The frontend is developed with React and Redux for state management. (A generic sketch of the JWT sign/verify flow appears after this listing.)

  • Updated Feb 8, 2024
  • JavaScript
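The token-based authentication this description refers to typically follows a hash, sign, and verify pattern. Below is a minimal, hypothetical sketch (not this repository's code) using the widely used jsonwebtoken and bcryptjs packages; the `JWT_SECRET` variable, payload shape, and function names are assumptions for illustration.

```typescript
import jwt from "jsonwebtoken";
import bcrypt from "bcryptjs";

// Assumed environment variable name; never hard-code a real secret.
const JWT_SECRET = process.env.JWT_SECRET ?? "dev-only-secret";

// Hash a password before storing it (cost factor 10 is an assumed default).
async function hashPassword(plain: string): Promise<string> {
  return bcrypt.hash(plain, 10);
}

// Issue a short-lived signed token once the password matches the stored hash.
async function login(email: string, plain: string, storedHash: string): Promise<string> {
  const ok = await bcrypt.compare(plain, storedHash);
  if (!ok) throw new Error("Invalid credentials");
  return jwt.sign({ sub: email }, JWT_SECRET, { expiresIn: "1h" });
}

// Verify a token presented on later requests; throws if expired or tampered with.
function authenticate(token: string): string {
  const payload = jwt.verify(token, JWT_SECRET) as { sub: string };
  return payload.sub;
}
```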