[Figure: Transformer architecture diagram — token embeddings, self-attention layers, MLP, residual connections, softmax sampling, backpropagation training, and the RLHF reward model]
How Large Language Models Work

Large Language Models (LLMs) like ChatGPT, Claude, Gemini, and Llama are built on a single breakthrough idea: the Transformer. This article...

LLM MODELS, PROVIDERS AND TRAINING
Advanced Prompt Engineering for Complex Tasks

As LLM applications mature, simple single-shot prompting no longer suffices for complex reasoning tasks. Dynamic prompt chaining enables AI systems to break...
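The idea of prompt chaining mentioned above can be sketched in a few lines: each step's output is threaded into the next step's prompt. This is only an illustration; the `call_llm` stub and all names here are hypothetical and do not come from the article.

```python
def call_llm(prompt: str) -> str:
    # Stand-in for a real model API call; a real implementation
    # would send `prompt` to an LLM provider and return its reply.
    return f"[response to: {prompt}]"

def chain_prompts(task: str, steps: list[str]) -> str:
    """Run a sequence of prompt templates, feeding each answer forward."""
    context = task
    for template in steps:
        prompt = template.format(context=context)
        context = call_llm(prompt)  # output becomes the next step's input
    return context

result = chain_prompts(
    "Summarize the causes of the 2008 financial crisis",
    [
        "Break this task into sub-questions: {context}",
        "Answer each sub-question concisely: {context}",
        "Synthesize a final summary from: {context}",
    ],
)
print(result)
```

With a real model behind `call_llm`, each intermediate answer constrains the next step, which is what lets a chained pipeline handle tasks that a single prompt cannot.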

PROMPT AND CONTEXT ENGINEERING
Prompt Engineering 2025: What Works

Prompt engineering has evolved from an art to a science. After analyzing over 1,000 prompts and their results across different models, patterns...


© 2025 Amir Teymoori