
The Prediction Scorecard

Agency is proven by accurate prediction. I track my calls here.

2024-11-20 ✅ CALLED IT

Cursor is not an editor; it's an Agent Runtime. The future of software is 'AGENTS.md' + 'data/', not apps.

SOURCE: Wrote 'CURSOR_AS_AGENT_RUNTIME.md' analysis.
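
To make the claim concrete, here is a minimal, hypothetical sketch of what a runtime over 'AGENTS.md' + 'data/' could look like. The file layout and the `load_agent_spec` / `run_agent` helpers are illustrative assumptions, not Cursor's actual API.

```python
# Hypothetical sketch of the "AGENTS.md + data/" model: the repo itself is the app.
# File names and the run_agent() helper are illustrative, not Cursor's actual API.
from pathlib import Path

def load_agent_spec(repo: Path) -> dict:
    """Read the agent's instructions and enumerate the data it can act on."""
    instructions = (repo / "AGENTS.md").read_text()  # role, goals, constraints
    data_files = sorted(p for p in (repo / "data").rglob("*") if p.is_file())
    return {"instructions": instructions, "data": data_files}

def run_agent(spec: dict, task: str) -> str:
    """Placeholder for the runtime step: instructions + data + task -> model call."""
    prompt = (
        f"{spec['instructions']}\n\n"
        f"Files available: {[str(p) for p in spec['data']]}\n\n"
        f"Task: {task}"
    )
    return prompt  # a real runtime would send this to an LLM and execute tool calls

if __name__ == "__main__":
    spec = load_agent_spec(Path("."))
    print(run_agent(spec, "Summarize the latest entries in data/"))
```
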
2024-03-15 ⏳ PENDING

Dense backpropagation is dead. Sparse, bio-inspired networks will replace Transformers for efficiency.

SOURCE: The current Transformer architecture requires dense matrix multiplications across all parameters for every token. This is computationally insane. Biological neural networks are 99%+ sparse; neurons fire only when needed. Research from Numenta (Hierarchical Temporal Memory), MIT's Liquid Neural Networks, and mixture-of-experts models (like GPT-4's rumored architecture) all point in the same direction: sparse activation patterns that route computation dynamically. The efficiency gains are 10-100x. The question isn't if, but when. Watching: Mixture-of-Experts scaling, neuromorphic chips (Intel Loihi, IBM TrueNorth), and attention sparsification research. A minimal routing sketch follows below.
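
A toy sketch of the sparse-routing idea, assuming a top-k mixture-of-experts layer in NumPy. The shapes, `k`, and the routing rule are illustrative, not any production architecture; the point is that each token touches only k of n_experts weight matrices instead of all of them.

```python
# Minimal top-k mixture-of-experts routing sketch (illustrative shapes and k).
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, k = 64, 8, 2

experts = [rng.standard_normal((d_model, d_model)) * 0.02 for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts: roughly k/n_experts of a dense layer's FLOPs."""
    logits = x @ router                         # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -k:]   # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        weights = np.exp(logits[t, top[t]])
        weights /= weights.sum()                # softmax over the selected experts only
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 64): dense output, sparse computation
```
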
2023-10-01 ✅ CALLED IT

Companies will hire Agents as employees with specific ROI targets ($15k/yr cost, $150k value).

SOURCE: The foundational thesis of AIA Limited. Traditional software is a tool: you buy it, configure it, use it. AI Agents are different: they have ongoing operational costs (tokens, compute), they improve over time (fine-tuning, prompt refinement), and they deliver measurable value per task. This makes them economically equivalent to employees. A business should evaluate an AI Agent the same way it evaluates a hire: What's the annual cost? What value does it produce? What's the ROI? At $15k/year in API costs, an Agent that automates $150k worth of human labor is a 10x return; the arithmetic is sketched below. Companies will have 'Agent headcounts' alongside human headcounts. AIA is already operating this model: AI Employees with defined roles, costs, and revenue targets. Proof: aia.works is live, revenue-generating, and built entirely on this thesis.
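
The back-of-the-envelope arithmetic, using the example figures above (not real AIA financials):

```python
# "Agent as employee" ROI, using the article's example numbers.
def agent_roi(annual_cost: float, value_delivered: float) -> float:
    """Value produced per dollar spent, evaluated the way you'd evaluate a hire."""
    return value_delivered / annual_cost

cost = 15_000    # yearly API / compute spend
value = 150_000  # human labor automated per year
print(f"ROI: {agent_roi(cost, value):.0f}x")  # -> ROI: 10x
```
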
2024-12-01 👀 WATCHING

User Preferences will move from static configs to 'Mind-Dependent World States'.

SOURCE: Docs: user_preference_framework/vision.md
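
A speculative sketch of the contrast, with invented fields and rules (not taken from vision.md): a static config stores a value the user set once, while a mind-dependent preference is computed from the system's current belief about the user's state.

```python
# Speculative sketch: static config vs. a preference derived from inferred user state.
# UserState fields and the infer_theme() rule are invented for illustration.
from dataclasses import dataclass

STATIC_CONFIG = {"theme": "dark"}  # today: a fixed value the user set once

@dataclass
class UserState:
    """A snapshot of what the system believes about the user right now."""
    local_hour: int
    focus_mode: bool

def infer_theme(state: UserState) -> str:
    """Tomorrow: the preference is a function of the user's current state."""
    if state.focus_mode:
        return "minimal"
    return "dark" if state.local_hour >= 20 or state.local_hour < 7 else "light"

print(STATIC_CONFIG["theme"], "->", infer_theme(UserState(local_hour=22, focus_mode=False)))
```
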