Determinism Scales Better Than Discovery
Thesis: when dependencies are knowable (as they often are in pricing and risk), explicit orchestration plus explicit caching scales better than runtime discovery plus database-as-memoisation, i.e. the pattern where the persistence layer quietly becomes both the cache and the dependency catalogue.

This is not an argument against graphs. It is an argument about where you pay complexity:

- Universal runtime dependency substrate: node semantics, invalidation, and distributed coherence become the "product".
- Compiled plans plus explicit artefacts: versioned artefacts, explicit reuse boundaries, and rule-driven invalidation inside a known plan envelope.

Some of the most successful analytics platforms in finance are graph-centric. They deliver reuse, incrementality, and lineage, often extremely well, under the constraints they were optimised for. The problem starts when the graph becomes a universal runtime substrate that tries to unify market construction, calibration, valuation, lifecycle, aggregation, reporting, and sometimes operations under a single dependency model. ...
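To make the "compiled plan plus explicit artefacts" side concrete, here is a minimal sketch in Python. Everything in it is illustrative and hypothetical, not taken from any real platform: dependencies are declared up front (the plan envelope is known before execution), cache keys are explicit artefact identifiers derived from node name, code version, and inputs, and invalidation is a rule you control by bumping a version rather than an emergent property of a runtime graph.

```python
import hashlib
import json

class Plan:
    """Hypothetical compiled plan: dependencies declared up front, caching explicit."""

    def __init__(self):
        self.nodes = {}   # name -> (deps, version, fn)
        self.cache = {}   # artefact key -> value
        self.runs = []    # names of nodes actually recomputed (for the demo)

    def node(self, name, deps=(), version="1"):
        def register(fn):
            self.nodes[name] = (tuple(deps), version, fn)
            return fn
        return register

    def key(self, name, inputs):
        # Explicit cache key: node name + code version + hash of inputs.
        # Bumping `version` is the rule-driven invalidation boundary.
        _, version, _ = self.nodes[name]
        payload = json.dumps({"v": version, "in": inputs}, sort_keys=True)
        return f"{name}:{hashlib.sha256(payload.encode()).hexdigest()[:12]}"

    def run(self, name, inputs):
        deps, _, fn = self.nodes[name]
        dep_values = {d: self.run(d, inputs) for d in deps}
        k = self.key(name, inputs)  # simplification: keyed on the full input set
        if k not in self.cache:
            self.runs.append(name)
            self.cache[k] = fn(inputs, dep_values)
        return self.cache[k]

plan = Plan()

@plan.node("curve", version="1")
def build_curve(inputs, deps):
    # Toy one-point discount curve from a flat rate.
    return {"discount_1y": 1.0 / (1.0 + inputs["rate"])}

@plan.node("pv", deps=("curve",), version="1")
def price(inputs, deps):
    return inputs["notional"] * deps["curve"]["discount_1y"]

pv1 = plan.run("pv", {"rate": 0.05, "notional": 100.0})
pv2 = plan.run("pv", {"rate": 0.05, "notional": 100.0})  # served from cache
```

The second call recomputes nothing: both artefact keys already exist, so `plan.runs` still reads `["curve", "pv"]`. The point of the sketch is where the complexity lives: reuse and invalidation are visible rules inside a known plan, not behaviour to be reverse-engineered from whatever the persistence layer happened to memoise.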
Why Large Quantitative Analytics Platforms Rarely Fail All at Once
Why do large quantitative analytics platforms fail? The unsatisfying but honest answer is: it depends. There is rarely a single root cause. Platforms almost never collapse because of one flawed model, one architectural mistake, or a single ill-judged rewrite.

Most platforms do not break suddenly. They slowly lose coherence. They continue to run, produce numbers, and deliver outputs, yet over time the relationship between assumptions, inputs, and results becomes increasingly difficult to explain.

This distinction matters: there is a difference between a system that stops working and a system that no longer makes sense. ...