Hangman AI: Transformer-Based Game Solver

An AI agent for the Hangman word game that integrates a custom character-level transformer with heuristic decision layers, outperforming both frequency baselines (≈18% solve rate) and pure neural approaches.

  • Model: 6-layer transformer encoder (512 hidden units, multi-head attention) trained with masked character prediction to infer missing letters; a minimal sketch follows this list.
  • Data Expansion: Morphological augmentation via WordNet expands the vocabulary from ~250k words to 750k+ variants for broader generalization (see the WordNet sketch below).
  • Strategy Engine: The MasterGuesser module combines neural likelihood scores, dynamic opening-letter ordering, and endgame pruning logic (a blended-scoring sketch appears after this list).
  • Filtering: On-the-fly candidate dictionary reduction using pattern constraints plus positional frequency recalculation, as shown in the filtering sketch below.
  • Performance: The full pipeline (transformer + augmentation + heuristics) sustains a >60% solve rate across 1,000+ randomized games.
  • Artifacts: Includes training notebooks, inference pipeline, pretrained weights, and evaluation scripts for reproducibility.
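
The masked character prediction objective works like BERT-style masked language modeling, but at the character level: some letters are replaced with a mask token and the model is trained to recover them. Below is a minimal sketch of that setup; the class name, mask-token ID, and any hyperparameters beyond the stated 6 layers / 512 hidden size are illustrative assumptions, not the repository's actual code.

```python
# Minimal sketch of character-level masked prediction (assumed setup,
# not the repo's code): mask ~30% of letters, predict them back.
import torch
import torch.nn as nn

VOCAB_SIZE = 28          # 26 letters + pad + mask (assumed layout)
MASK_ID = 27
MAX_LEN = 32

class CharTransformer(nn.Module):
    """6-layer, 512-dim encoder matching the spec above."""
    def __init__(self, d_model=512, n_heads=8, n_layers=6):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, d_model)
        self.pos = nn.Embedding(MAX_LEN, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, VOCAB_SIZE)

    def forward(self, ids):                       # ids: (batch, seq)
        pos = torch.arange(ids.size(1), device=ids.device)
        h = self.embed(ids) + self.pos(pos)
        return self.head(self.encoder(h))         # per-position letter logits

model = CharTransformer()
ids = torch.randint(0, 26, (4, 12))               # toy batch of "words"
masked = ids.clone()
mask = torch.rand_like(ids, dtype=torch.float) < 0.3
masked[mask] = MASK_ID
logits = model(masked)
loss = nn.functional.cross_entropy(logits[mask], ids[mask])  # masked slots only
loss.backward()
```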
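
The WordNet expansion can be illustrated with NLTK's derivationally related forms, which map a lemma to its morphological relatives (e.g. "compute" to "computer", "computation"). The `expand` helper below is a hypothetical sketch of the mechanism; the repository's actual augmentation rules may differ.

```python
# Hedged sketch of WordNet-based morphological expansion via NLTK.
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)

def expand(word):
    """Return the word plus derivationally related forms from WordNet."""
    variants = {word}
    for synset in wn.synsets(word):
        for lemma in synset.lemmas():
            if lemma.name().lower() == word:
                for related in lemma.derivationally_related_forms():
                    variants.add(related.name().lower())
    # keep only plain alphabetic forms usable as Hangman words
    return {v for v in variants if v.isalpha()}

print(expand("compute"))   # e.g. {'compute', 'computer', 'computation', ...}
```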
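
The filtering and guess-selection logic might look like the following sketch: apply the revealed pattern and wrong-guess constraints to shrink the dictionary, recompute letter frequencies over the still-unknown positions, then blend those with neural letter probabilities. The `pick_guess` signature, the `alpha` blend weight, and the toy dictionary are assumptions, not the actual MasterGuesser interface.

```python
# Sketch of pattern filtering + positional frequency + score blending
# (assumed interfaces, not the repo's MasterGuesser).
import re
from collections import Counter

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def filter_candidates(words, pattern, wrong):
    """Keep words matching the revealed pattern ('_' = unknown) with no
    known-wrong letters. A revealed letter can't hide in an unknown slot,
    since Hangman exposes every occurrence of a hit."""
    regex = re.compile("^" + pattern.replace("_", ".") + "$")
    revealed = set(pattern) - {"_"}
    return [
        w for w in words
        if len(w) == len(pattern)
        and regex.match(w)
        and not (set(w) & wrong)
        and not any(w[i] in revealed
                    for i, c in enumerate(pattern) if c == "_")
    ]

def positional_freq(candidates, pattern):
    """Normalized letter counts over the still-unknown positions only."""
    counts = Counter(
        w[i] for w in candidates
        for i, c in enumerate(pattern) if c == "_"
    )
    total = max(sum(counts.values()), 1)
    return {ch: n / total for ch, n in counts.items()}

def pick_guess(candidates, pattern, guessed, neural_probs, alpha=0.5):
    """Blend (hypothetical) neural letter probabilities with positional
    frequency; return the best unguessed letter."""
    freq = positional_freq(candidates, pattern)
    scores = {
        ch: alpha * neural_probs.get(ch, 0.0) + (1 - alpha) * freq.get(ch, 0.0)
        for ch in ALPHABET if ch not in guessed
    }
    return max(scores, key=scores.get)

# Toy run: after guessing h, n (hits) and e (miss) on a 3-word dictionary.
words = ["hangman", "hangout", "handgun"]
cands = filter_candidates(words, "h_n___n", wrong={"e"})   # hangman, handgun
print(pick_guess(cands, "h_n___n", guessed={"h", "n", "e"},
                 neural_probs={"a": 0.5, "g": 0.2}))        # -> 'a'
```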

Tech: Python, PyTorch, NLTK, Transformers, NumPy.

The evaluation harness records per-turn entropy reduction and guess-efficiency metrics to guide model iteration.
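
One simple way to define per-turn entropy reduction is the drop in log2 of the candidate-set size, treating the remaining words as equally likely; the harness's exact metric definition is not shown here, so the helper below is an assumption.

```python
# Assumed metric: bits of uncertainty removed by one guess, with a
# uniform distribution over the remaining candidate words.
import math

def entropy_reduction(cands_before, cands_after):
    h = lambda c: math.log2(len(c)) if c else 0.0
    return h(cands_before) - h(cands_after)

# A guess that halves the candidate set removes exactly 1 bit:
print(entropy_reduction(["ab", "ac", "ad", "ae"], ["ab", "ac"]))  # 1.0
```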

View on GitHub →