In Defense of Matlab Code

Julia, Python, and performance tradeoffs

  • Several comments argue that Julia “already solves” most of MATLAB's problems, with cleaner semantics (e.g., explicit broadcasting, fewer shape footguns) and math-like syntax comparable to MATLAB's.
  • Others contend Julia adds problems of its own: JIT warmup makes it awkward for short scripts, its tooling (IDEs, plugins) is immature compared with Python's, and its ecosystem is thinner.
  • A long performance anecdote describes a heavily optimized Python pipeline (mostly glue over C) that was ported to Julia in ~2 weeks and ran ~14× faster; the same commenter notes that in interview take-homes, the fastest submissions were consistently in Julia rather than C++.
  • Counterarguments claim the C++ code must have been suboptimal and maintain that truly critical parts “should be C++ anyway”; supporters reply that Julia’s productivity, profiling tools, and generic programming let you reach high performance faster than typical C++/Rust in practice.

MATLAB’s strengths, weaknesses, and ecosystem

  • Strong points repeatedly cited: Simulink, auto‑code generation for embedded targets, excellent documentation, plotting quality, and the breadth of MathWorks toolboxes plus professional support.
  • Some say MATLAB's unique value is as a single, cohesive environment spanning numerics, GUIs, model-based design, hardware- and software-in-the-loop (HIL/SIL) testing, visualization, and more, which no open alternative fully replicates.
  • Major complaints: licensing complexity and cost (especially post‑academia), lock‑in to proprietary toolboxes, license servers, and difficulty integrating into flexible, many‑machine workflows.
  • Technical pain points: a language over-optimized for matrix math, awkward string handling and OOP, trouble with very large datasets, and fragile or opaque behavior on non-matrix tasks.

Array semantics, readability, and NumPy friction

  • Many agree MATLAB/Julia-style code maps more directly from “whiteboard math” than NumPy does; the article's original example is criticized for using needlessly contorted NumPy idioms.
  • Debate over MATLAB's implicit expansion, e.g. [1 2 3] + [1;2;3]: some call it a footgun, others find it a concise, powerful idiom for computing all pairwise sums or differences at once; a NumPy equivalent is sketched after this list.
  • NumPy’s 1D arrays, reshaping, and np.newaxis/None tricks are seen as conceptually noisy for non-programmers; others prefer NumPy’s clear separation of vectors vs matrices and lack of forced row/column choice.
  • Julia’s explicit broadcasting (.+) is praised for making shape errors more visible and allowing clean distinction between matrix ops (e.g., exp(M)) and elementwise ones (exp.(M)).
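
A minimal NumPy sketch of the shapes under discussion: the outer sum that MATLAB's implicit expansion produces from [1 2 3] + [1;2;3], and the np.newaxis reshaping NumPy needs to express the same result (standard NumPy only; variable names are illustrative).

    import numpy as np

    # A plain 1D array has shape (3,): neither a row nor a column vector.
    a = np.array([1, 2, 3])
    b = np.array([1, 2, 3])

    # 1D + 1D broadcasts elementwise, not pairwise.
    elementwise = a + b                     # array([2, 4, 6]), shape (3,)

    # To reproduce MATLAB's [1 2 3] + [1;2;3], the shapes must be made
    # explicit: a (1, 3) row plus a (3, 1) column broadcasts to (3, 3).
    pairwise = a[np.newaxis, :] + b[:, np.newaxis]
    # array([[2, 3, 4],
    #        [3, 4, 5],
    #        [4, 5, 6]])

    # The same trick gives all pairwise differences.
    diffs = a[np.newaxis, :] - b[:, np.newaxis]

This is the trade-off both camps describe: MATLAB produces the (3, 3) result with no ceremony, while NumPy makes the row/column choice explicit at the cost of the newaxis noise that non-programmers find opaque.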

Octave, RunMat, and other MATLAB-like tools

  • Octave is widely mentioned as a free MATLAB‑compatible option used in courses and research, but repeatedly noted as much slower and missing many MATLAB functions/toolboxes.
  • Other clones mentioned: Scilab, FreeMat (stagnant), and Nelson; none is seen as matching MATLAB's breadth, especially the toolboxes and Simulink.
  • RunMat (the article's project) is presented as a new Rust-based, open-source MATLAB runtime built around aggressive operation fusion and transparent CPU/GPU execution, aiming to be “the fastest way to run math”; a generic sketch of what fusion means follows this list.
  • Some ask why not simply extend Octave; the RunMat author cites architectural constraints and the need for a new execution model. Concerns remain about replicating MATLAB's specialized toolboxes, which embody expensive domain expertise.
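
The thread does not describe RunMat's internals beyond “aggressive fusion”, but the general technique is easy to illustrate: instead of evaluating an elementwise expression as a chain of whole-array operations that each allocate a temporary, a fusing runtime compiles the expression into a single pass over the data. Below is a hand-written Python sketch of the idea, not RunMat's implementation; in a real runtime the fused loop would be generated as native or GPU code rather than interpreted Python.

    import numpy as np

    def unfused(a, b, c):
        # How a naive array interpreter evaluates a .* b + sqrt(c):
        # each operation sweeps the whole array and allocates a temporary.
        t1 = a * b           # temporary 1
        t2 = np.sqrt(c)      # temporary 2
        return t1 + t2       # result

    def fused(a, b, c):
        # A fused evaluation: one loop, one output buffer, no temporaries.
        out = np.empty_like(a)
        for i in range(a.size):
            out[i] = a[i] * b[i] + c[i] ** 0.5
        return out

    a, b, c = np.random.rand(3, 1000)
    assert np.allclose(unfused(a, b, c), fused(a, b, c))

Fusion matters because memory traffic, not arithmetic, typically dominates elementwise kernels; touching each element once instead of once per operation is where most of the speedup comes from.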

Adoption, reliability, and perceptions

  • Some organizations have deliberately migrated from MATLAB to Python/R/Julia, reporting happier users and fewer licensing headaches.
  • Others stress that in certain industries (e.g., aerospace, control, some neuroscience historically), MATLAB + Simulink remain de facto standards, partly for reproducibility and consistent results across platforms.
  • Several comments are strongly negative on MATLAB: a poor general-purpose language, closed implementations of core functions such as fft, and little culture of testing or version control.
  • Multiple readers suspect the blog post itself is AI‑assisted and partly a marketing vehicle for RunMat, which reduces their trust in its MATLAB claims.