# tokenizer

A grammar describes the syntax of a programming language, and might be defined in Backus-Naur form (BNF). A lexer performs lexical analysis, turning text into tokens. A parser takes tokens and builds a data structure like an abstract syntax tree (AST). The parser is concerned with structure: does the sequence of tokens fit the grammar? A compiler's front end combines a lexer and a parser built for a specific grammar; later stages then perform semantic analysis and code generation.
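
As a concrete illustration of the lexing step described above, here is a minimal sketch in C (not drawn from any repository on this page). It splits an input string into number, identifier, and single-character operator tokens; the token kind names and the 32-character lexeme limit are arbitrary choices for the example.

```c
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Illustrative token kinds; a real lexer defines its own set. */
typedef enum { TOK_NUMBER, TOK_IDENT, TOK_OP, TOK_END } TokenKind;

typedef struct {
    TokenKind kind;
    char text[32];   /* token lexeme, truncated if very long */
} Token;

/* Reads one token starting at *src and advances the cursor. */
static Token next_token(const char **src)
{
    const char *p = *src;
    Token tok = { TOK_END, "" };

    while (isspace((unsigned char)*p))   /* skip whitespace between tokens */
        p++;

    if (*p == '\0') {
        *src = p;
        return tok;                      /* end of input */
    }

    size_t len = 0;
    if (isdigit((unsigned char)p[0])) {
        tok.kind = TOK_NUMBER;
        while (isdigit((unsigned char)p[len]))
            len++;
    } else if (isalpha((unsigned char)p[0]) || p[0] == '_') {
        tok.kind = TOK_IDENT;
        while (isalnum((unsigned char)p[len]) || p[len] == '_')
            len++;
    } else {
        tok.kind = TOK_OP;               /* single-character operator */
        len = 1;
    }

    size_t copy = len < sizeof tok.text - 1 ? len : sizeof tok.text - 1;
    memcpy(tok.text, p, copy);
    tok.text[copy] = '\0';
    *src = p + len;
    return tok;
}

int main(void)
{
    const char *input = "count + 42 * offset";
    const char *cursor = input;

    for (Token t = next_token(&cursor); t.kind != TOK_END; t = next_token(&cursor))
        printf("kind=%d text=\"%s\"\n", t.kind, t.text);
    return 0;
}
```

A parser would then consume this token stream and check it against the grammar, building an AST rather than printing each token.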

Here are 75 public repositories matching this topic...

🐚 A fully functional mini shell written in C as part of the 42 school curriculum. Implements key shell features like built-in commands, pipelines, redirections (<, >, >>, <<), and environment variable expansion. Designed to mimic basic Bash behavior while exploring process creation, parsing, file descriptors, and terminal signal handling. A minimal sketch of this kind of operator tokenizing follows this entry.

  • Updated Jul 3, 2025
  • C
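
The repository's actual parser is not reproduced here; as a rough sketch of the first stage such a shell needs, the following hypothetical tokenizer splits a command line into words and the operators |, <, >, >> and <<. Quoting and variable expansion are deliberately omitted.

```c
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Prints each token of a shell-like command line on its own line.
   Operators handled: |, <, >, >>, <<; everything else is a word. */
static void tokenize_line(const char *line)
{
    const char *p = line;

    while (*p) {
        while (isspace((unsigned char)*p))      /* skip blanks between tokens */
            p++;
        if (*p == '\0')
            break;

        if (*p == '|') {
            puts("OP   |");
            p++;
        } else if (*p == '<' || *p == '>') {
            /* Doubled character means heredoc (<<) or append (>>). */
            if (p[1] == p[0]) {
                printf("OP   %c%c\n", p[0], p[1]);
                p += 2;
            } else {
                printf("OP   %c\n", p[0]);
                p++;
            }
        } else {
            const char *start = p;
            while (*p && !isspace((unsigned char)*p)
                   && *p != '|' && *p != '<' && *p != '>')
                p++;
            printf("WORD %.*s\n", (int)(p - start), start);
        }
    }
}

int main(void)
{
    tokenize_line("cat < infile | grep foo >> out.txt");
    return 0;
}
```

A real shell would collect these tokens into a command table, expand environment variables in the words, and only then create the processes and wire up the file descriptors for pipes and redirections.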