“Dynamic programming” does not refer to “computer programming”

Multiple meanings and historical origin

  • Commenters note that “dynamic programming” means different things in:
    • Competitive programming / LeetCode (table-based optimization for overlapping subproblems).
    • Reinforcement learning and control theory (Bellman equations, Hamilton–Jacobi–Bellman, dynamical systems).
  • Several point out that both strands trace back to Bellman’s work.
  • The article’s story—“dynamic” chosen for its positive, impressive sound and “programming” meaning planning/scheduling—matches others’ recollections, though some cite disputes about details of the anecdote.
  • “Programming” is linked to “linear/integer/constraint programming” and to scheduling in operations research, not to writing code.

What dynamic programming is (according to the thread)

  • Many comments emphasize: DP is fundamentally about decomposing an optimization problem into overlapping subproblems with optimal substructure, not about any particular implementation.
  • In classical math/OR/control, DP is framed as backward induction on time-dependent systems, not recursion+arrays.
  • Several argue that in “true” DP, the table or value function and Bellman-type recurrences are the core, and memoized recursion is just one computational technique.
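The backward-induction framing from the thread can be sketched concretely. Below is a minimal, illustrative example (not from the thread itself): a finite-horizon problem where the value function V[t][s] satisfies a Bellman-type recurrence, V[t][s] = min over actions a of cost(t, s, a) + V[t+1][next(t, s, a)], computed backward in time with no recursion at all. The two-state toy instance (`"low"`/`"high"`, `"stay"`/`"switch"`) is invented purely for demonstration.

```python
def backward_induction(horizon, states, actions, cost, transition):
    """Compute the optimal value function and policy by working backward in time."""
    # Terminal values: no future cost at the final stage.
    V = {(horizon, s): 0.0 for s in states}
    policy = {}
    for t in range(horizon - 1, -1, -1):
        for s in states:
            best = None
            for a in actions:
                # Bellman recurrence: immediate cost + value of the successor state.
                v = cost(t, s, a) + V[(t + 1, transition(t, s, a))]
                if best is None or v < best:
                    best, policy[(t, s)] = v, a
            V[(t, s)] = best
    return V, policy

# Toy instance: "high" is expensive to sit in; "switch" pays 2 now to move states.
states = ["low", "high"]
actions = ["stay", "switch"]

def cost(t, s, a):
    running = 1.0 if s == "low" else 3.0
    return running + (2.0 if a == "switch" else 0.0)

def transition(t, s, a):
    if a == "switch":
        return "low" if s == "high" else "high"
    return s

V, policy = backward_induction(horizon=4, states=states, actions=actions,
                               cost=cost, transition=transition)
# Starting in "high", it is optimal to pay for a switch immediately.
```

Note that this is tabulation over time stages: the "table" here is the value function itself, which is the point several commenters make about classical DP.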

Memoization, caching, recursion debate

  • A large subthread argues over whether DP is “just cached recursion.”
    • One camp: practical DP = recursion + memoization; that’s what most CS learners see.
    • Other camp: this is reductive; DP is a problem-structuring method, caching is only one way to exploit it.
  • Distinctions are drawn between:
    • Memoization vs general caching (determinism, scope, global shared state).
    • Top-down (memoized recursion) vs bottom-up tabulation and “minimal-memory memoization.”
  • Some insist that conflating memoization with generic caching leads to bad designs and bugs.
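The three variants the subthread distinguishes can be shown side by side. This is a sketch using Fibonacci as the standard minimal example (the thread does not prescribe a specific problem): top-down memoized recursion, bottom-up tabulation, and the "minimal-memory" form where the table collapses to two variables because only the last two entries are ever read.

```python
from functools import lru_cache

# Top-down: memoized recursion. The cache is scoped to one deterministic
# function and keyed on its arguments -- narrower than a general-purpose cache.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up: tabulation fills the table in dependency order; no recursion.
def fib_table(n):
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

# Minimal-memory variant: only the two most recent table entries are needed,
# so the table shrinks to a pair of rolling variables.
def fib_rolling(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

All three exploit the same subproblem structure; they differ only in how (and whether) intermediate results are stored, which is the distinction the second camp insists on.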

Applications and contest culture

  • Multiple IOI/ICPC anecdotes:
    • Early contestants often failed problems by writing plain recursion, learning only later that DP was needed.
    • DP knowledge evolved from medal-winning edge to “table stakes” as resources improved.
  • Examples mentioned: SQL join-order optimization (e.g., DuckDB), Emacs redisplay, shortest/longest paths, edit distance, BLAST approximating Smith–Waterman, routing (Bellman–Ford), kinship/relatedness computations, query optimizers.
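Of the examples listed, edit distance is perhaps the most compact illustration. A minimal sketch of the standard Levenshtein tabulation (not tied to any particular commenter's code): `dist[i][j]` holds the cost of turning the first `i` characters of `a` into the first `j` characters of `b`, and each cell depends only on three already-filled neighbors.

```python
def edit_distance(a, b):
    """Levenshtein distance via bottom-up tabulation."""
    m, n = len(a), len(b)
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i          # turn a[:i] into "" by i deletions
    for j in range(n + 1):
        dist[0][j] = j          # turn "" into b[:j] by j insertions
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # deletion
                             dist[i][j - 1] + 1,        # insertion
                             dist[i - 1][j - 1] + sub)  # substitution / match
    return dist[m][n]
```

Smith–Waterman (which the thread notes BLAST approximates) fills a similar table with alignment scores instead of edit costs.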

Naming, marketing, and confusion

  • Many dislike the term: “dynamic” feels vacuous, “programming” misleading; some prefer terms like “tabulation,” “optimization,” or “bottom-up memoization.”
  • Others broaden the criticism to “linear programming,” “extreme programming,” “wave function collapse,” “extreme learning machines,” etc., as examples of marketing-oriented or opaque names.
  • Several say such names made the concept seem harder than it is and delayed their understanding; at least one commenter still feels unclear even after reading the article and discussion.