notable mental methane vents

  • SoC (n.): system on a chip. Descendant of the microcontroller. What we now call just 'a computer', but integrating a whole board onto one chip was an enormous deal, a revolution within the digital revolution.

  • UUOC (n.): Useless Use Of Cat (Award). Surprisingly mean retort to Stack Overflow answers which use the UNIX tool cat where a redirect or a plain filename argument would do.
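A minimal sketch of the anti-pattern and its fixes (the file `demo.txt` is just for the demonstration):

```shell
# Create a throwaway file for the demonstration.
printf 'alpha\nbeta\n' > demo.txt

# Useless Use Of Cat: an extra process exists only to feed grep's stdin.
cat demo.txt | grep alpha    # -> alpha

# Same output, one process fewer: grep takes the filename directly.
grep alpha demo.txt          # -> alpha

# For tools that really do read only stdin, redirect instead of piping cat.
grep alpha < demo.txt        # -> alpha
```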

  • abience (n.): the urge to withdraw. Usually used to mean pathological avoidance, but to me it is also the plain, sacred joy of missing out.

  • hardtack (n.): A very basic cracker, just baked flour and water. Staple of navies and Tudor explorers.

  • HARKing (v.): Hypothesizing After the Results are Known. A particular problem in social science, where pre-registered studies remain a tiny minority of the work.

  • merchantable (UK legal adj.): Good enough to be sold.

  • technical steer (n.): Input from expert staff, AKA 'knowledge'.

  • whitespace damage (n.): subtle but breaking changes to source code introduced by ordinary text processors, e.g. line wrapping, hidden characters, smart quotes. This phrase is a shibboleth for being A Very Serious Person, e.g. a kernel dev.
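A sketch of one way to hunt for such damage, using a hypothetical `damaged.sh`; forcing the C locale makes grep flag any byte outside printable ASCII (the smuggled-in curly quotes):

```shell
# Simulate damage: a word processor has swapped straight quotes for curly ones
# and left a trailing space. (\342\200\234 etc. are the UTF-8 curly quotes in octal.)
printf 'echo \342\200\234hello\342\200\235 \n' > damaged.sh

# Curly quotes are not shell quotes, so the script no longer means what it says.
# 1. Find bytes outside printable ASCII:
LC_ALL=C grep -n '[^ -~]' damaged.sh

# 2. Find trailing whitespace at end of line:
grep -n ' $' damaged.sh
```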

  • moving up the value chain (phrase): performing work further away from physical extraction, processing, and manufacturing. Supposedly insulates you from competition because your outputs are less easily evaluated as they become less physical. Economic abstraction.

  • FANG: Facebook / Amazon / Netflix / Google, particularly when their stocks are used as a bellwether.

  • DGP: data generating process. This took bloody ages to google.

  • rebranding (n.): "a euphemism for 'euphemism'" - Jonathan Meades

machines inside

PSA: It took me many years to internalise the formal methods I know now.*

I use "internalise" as distinct from "learn" because, let's face it: we all "learn" statistics at uni, in the sense of briefly knowing a tiny set of teacher passwords, of knowing what a mode is, of knowing how to dumbly apply two canned tests of inference.** But almost no-one with that badge on their resume actually remembers it, actually uses it, was actually changed by contact with it: the driest and most nutritious of methods.

My measure of internalisation is whether you use the method, unprompted by school or advisor, in your own investigations. Internalisation requires some understanding, but I'm not claiming any deep grasp of these things. I just appreciate their power, and use them as well as I can, where I can.

  • First contact with algebra: 2000.
    Internalised algebra: 2012.

  • First contact with analysis: 2003.
    Internalised analysis: Not yet.

  • First contact with formal logic: 2008.
    Internalised first-order logic: 2010. (pic above)

  • First contact with proof: 2010.
    Internalised proof: Not yet.

  • First contact with statistical inference: 2011.**
    Internalised statistical inference: 2017.

  • First contact with Bayesian / cognitive / decision science: 2010.
    Internalised decision science: Not yet.

  • First contact with full-blown probability theory***: 2012.
    Internalised probability theory: 2017.

  • First contact with (imperative) computational thinking: 2014.
    Internalised computations: 2015.

  • First contact with functional programming: 2016.
    Internalised functional programming: Not yet.

  • First contact with machine learning: 2015.
    Internalised ML: Not yet.

* This strikes me as worth stating, because the rigorous fields are so demoralising to tackle alone, and take so long for even very intelligent people to get comfortable with. In a standard mathematical education, we don't get to see the cockups or the thousands of fruitless hours that Jacobi or Germain had to put in, to win as they won. (The painstaking labours of Wiles and Zhang are at least a bit more available to us.)

** I have tried to learn statistics (that is, higher statistics, data analysis and inference) four times in my short life:

  • 1) 2010: in the standard, cursory Research Methods module in undergraduate economics (I find myself not guilty).
  • 2) 2011: to catch up with discussions on LessWrong.
  • 3) 2013: through a formal stats degree.
  • 4) 2016: on the job.

Only the last is sticking.^ But I was extremely lucky to get a statistically demanding job without credentials in the first place; I snuck in because my profession confuses people and my programming ability was so far beyond the spec that they halo-effected me in.

So where does everyone else actually learn stats?

^ Again, I'm using a strict definition of "learn": in the others I learned many terms, followed many formal derivations, and clicked many buttons in many statistical packages, but I did not actually gain the statistical mindset, the modelling ability.

*** Not even measure theory...