All elementary functions from a single binary operator

Basic idea and example constructions

  • EML is defined as eml(x, y) = exp(x) - ln(y); together with the constant 1, it suffices to build all elementary functions.
  • Simple derived forms:
    • exp(x) = eml(x, 1)
    • ln(x) = eml(1, eml(eml(1, x), 1))
  • From exp and ln:
    • Subtraction: x - y = eml(ln x, exp y), valid where ln x is defined (x > 0)
    • Addition via the identity x + y = ln(exp(x) * exp(y)); since this presupposes multiplication, addition can also be reduced to subtraction and negation as x + y = x - (0 - y)
    • Multiplication, division, powers, roots, trig and hyperbolic functions are then composed using standard identities.
  • Expanded EML trees become large; e.g., multiplication can require depth-8 trees with 40+ leaves.
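The derived forms above can be checked numerically. A minimal sketch (the helper names are mine, not the paper's), assuming real arguments in the obvious domains:

```python
import math

def eml(x, y):
    """eml(x, y) = exp(x) - ln(y); assumes y > 0."""
    return math.exp(x) - math.log(y)

def eml_exp(x):
    # exp(x) = eml(x, 1), since ln(1) = 0
    return eml(x, 1.0)

def eml_ln(x):
    # ln(x) = eml(1, eml(eml(1, x), 1)); assumes x > 0
    return eml(1.0, eml(eml(1.0, x), 1.0))

def eml_sub(x, y):
    # x - y = eml(ln x, exp y); assumes x > 0
    return eml(eml_ln(x), eml_exp(y))

print(eml_ln(2.0))         # ≈ 0.6931 (ln 2)
print(eml_sub(3.0, 1.25))  # ≈ 1.75
```

Unwinding eml_ln shows why it works: eml(1, x) = e - ln x, exponentiating via eml(·, 1) gives e^e / x, and the outer eml(1, ·) recovers e - (e - ln x) = ln x.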

Expressiveness, math context, and edge conventions

  • The result is likened to NAND/NOR functional completeness, but for continuous/elementary functions rather than Boolean logic.
  • Some note that hypergeometric or multi-argument “selector” functions already encode many functions; the novelty here is a binary operation plus one constant.
  • The completeness proof sometimes relies on extended-real conventions such as ln(0) = -∞ and exp(-∞) = 0; the paper calls this out explicitly, and commenters debate it:
    • Some see this as a non-standard caveat.
    • Others argue it is standard when working over the extended reals and IEEE‑754 behavior.
  • There is discussion of domain issues (e.g., log is not single-valued over ℂ) and of constructions that pass through ln(0) or infinities.
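One concrete reading of the extended-real convention can be demonstrated in floating point. The sketch below is my own illustration, not the paper's construction; note that C's libm returns -inf for log(0.0) per IEEE-754 recommendations, while Python's math.log raises, so a wrapper is needed:

```python
import math

def xlog(y):
    # Extended-real convention: ln(0) = -inf
    # (C libm's log(0.0) behaves this way; Python's math.log raises instead)
    return -math.inf if y == 0.0 else math.log(y)

def eml(x, y):
    # exp(-inf) = 0.0 in IEEE-754 floating point, so the
    # infinity propagates back to a finite result
    return math.exp(x) - xlog(y)

# The constant 0 falls out of the convention:
# eml(ln 0, 1) = exp(-inf) - ln(1) = 0
zero = eml(xlog(0.0), 1.0)

# ... which in turn gives negation: -y = 0 - y = eml(ln 0, exp y)
def eml_neg(y):
    return eml(xlog(0.0), math.exp(y))
```

This is exactly the point under debate: whether routing a construction through -∞ is a non-standard trick or ordinary extended-real/IEEE-754 arithmetic.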

Practicality, efficiency, and hardware

  • Consensus: this is mainly a theoretical/symbolic result, not a better way to numerically compute basic functions.
  • Using EML to express simple operations like + or * is far more complex, and far less efficient, than using the standard primitives directly.
  • Analogies are drawn to:
    • NAND/NOR as universal logical bases, which are likewise rarely used directly in optimized designs.
    • Lambda calculus/Iota as minimal universal formalisms with little direct practical use.
  • Some speculate on:
    • EML-based symbolic regression and function discovery, potentially using gradient descent on EML trees.
    • Specialized EML coprocessors or analog EML circuits, though others doubt performance benefits versus existing FPUs and polynomial/rational approximations.

Verification, tooling, and reactions

  • Several participants reconstruct or verify EML expressions (e.g., using SymPy or small interpreters) and confirm correctness for many constants and operations.
  • Others propose using EML as a benchmark challenge for LLMs (“express 2x+y or sin(x)/x in EML+1”).
  • Overall tone mixes excitement at the conceptual elegance with skepticism about real-world impact or novelty relative to existing analytic frameworks.
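A minimal interpreter of the kind commenters describe is easy to sketch; the tuple encoding below is an assumption of mine, not taken from the thread:

```python
import math

# An EML term is the constant 1, the symbol "x", or a pair (a, b)
# meaning eml(a, b) = exp(a) - ln(b).
LN_X = (1, ((1, "x"), 1))   # encodes ln(x) = eml(1, eml(eml(1, x), 1))

def evaluate(term, x):
    if term == "x":
        return x
    if isinstance(term, tuple):
        a, b = term
        return math.exp(evaluate(a, x)) - math.log(evaluate(b, x))
    return float(term)      # the constant 1

def leaves(term):
    # Leaf count of the expression tree, for gauging the blow-up
    # noted above for fully expanded constructions.
    if isinstance(term, tuple):
        return leaves(term[0]) + leaves(term[1])
    return 1

for x in (0.5, 1.0, 2.0, 10.0):
    assert abs(evaluate(LN_X, x) - math.log(x)) < 1e-9
print(leaves(LN_X))  # 4 leaves for ln alone
```

An interpreter like this is also the natural starting point for the benchmark idea above: generate a candidate EML tree for a target such as 2x + y, then check it pointwise.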