Friday Blast #41
Probit regression (2018) - this is an alternative to the more famous logistic regression. They're both generalized linear models, and the difference lies in the link function: for the former it is the so-called probit function, for the latter the logistic one. The probit function is the inverse of the standard normal distribution's CDF. It looks very much like the well-known logistic link function / neural activation function, and in practice the two methods are basically the same. But sometimes it's easier to analyze with one than with the other.
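A quick numerical check of that "basically the same" claim, using only the standard library: the logistic curve, after rescaling its input by the classic constant 1.702, sits almost on top of the standard normal CDF (whose inverse is the probit link). The constant and the toy grid below are my own illustration, not from the article.

```python
import math

def norm_cdf(x):
    """CDF of the standard normal; the probit link is the inverse of this."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def logistic(x):
    """The link function used by logistic regression."""
    return 1.0 / (1.0 + math.exp(-x))

# With the input rescaled by ~1.702 the two curves nearly coincide,
# which is why probit and logistic regression behave so similarly.
grid = [i / 100.0 for i in range(-500, 501)]
max_gap = max(abs(norm_cdf(x) - logistic(1.702 * x)) for x in grid)
print(max_gap)  # on the order of 0.01 at worst
```

The fitted coefficients of the two models differ by roughly that same scale factor, which is worth remembering when comparing results across papers.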
Introduction to decision tree learning (2018) - covers what the model looks like as well as a sketch of a learning algorithm (ID3 in this case).
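The core of ID3 fits in a short sketch: compute the entropy of the labels, greedily split on the attribute with the highest information gain, and recurse. The toy dataset below is my own made-up example, not from the article.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def best_attribute(rows, labels, attributes):
    """ID3's greedy choice: the attribute with the highest information gain."""
    base = entropy(labels)
    def gain(attr):
        remainder = 0.0
        for value in set(row[attr] for row in rows):
            subset = [l for row, l in zip(rows, labels) if row[attr] == value]
            remainder += len(subset) / len(labels) * entropy(subset)
        return base - remainder
    return max(attributes, key=gain)

def id3(rows, labels, attributes):
    """Build a decision tree as nested dicts; leaves are class labels."""
    if len(set(labels)) == 1:          # pure node: nothing left to split
        return labels[0]
    if not attributes:                  # out of attributes: majority vote
        return Counter(labels).most_common(1)[0][0]
    attr = best_attribute(rows, labels, attributes)
    node = {}
    for value in set(row[attr] for row in rows):
        idx = [i for i, row in enumerate(rows) if row[attr] == value]
        node[value] = id3([rows[i] for i in idx], [labels[i] for i in idx],
                          [a for a in attributes if a != attr])
    return {attr: node}

# Hypothetical "should we play outside?" data: wind decides, outlook doesn't.
rows = [{"outlook": "sunny", "windy": "no"},
        {"outlook": "sunny", "windy": "yes"},
        {"outlook": "rain",  "windy": "no"},
        {"outlook": "rain",  "windy": "yes"}]
labels = ["yes", "no", "yes", "no"]
print(id3(rows, labels, ["outlook", "windy"]))
# {'windy': {'no': 'yes', 'yes': 'no'}}
```

Note the greedy, one-attribute-at-a-time structure: ID3 never revisits a split, which keeps it fast but means it can miss globally better trees.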
Oracle plans to dump risky Java serialization (2018) - I've heard a lot of good stuff from the Java world lately, and it seems the language is moving faster than ever. Dropping serialization is a big move, and it'd be interesting to see it play out, though I've since moved on to other pastures.
The headers we don't want (2018) - an analysis done by Fastly of the common headers they encounter. It turns out there are a lot of popular but otherwise useless ones. The author singles out vanity headers, deprecated standards, debug headers, and plain old misused headers for special attention. There's an example of VCL, Fastly's small language for request transformations, which is pretty neat.
Malloc never fails (2012) - yes it does, but on Linux you'll typically be killed by the kernel's OOM killer before you get a chance to react to a NULL return. So in most situations it still doesn't make sense to try to handle this particular failure case. What's interesting is that even when malloc succeeds you're not guaranteed the memory: because of overcommit, physical pages are only committed when you first touch them. So you can allocate far more than the machine could actually back, and you'll only crash once you start using it.
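You can see the lazy-commit behavior from Python with an anonymous `mmap` (which is what `malloc` uses under the hood for large allocations). A minimal sketch, assuming default Linux overcommit settings; the 1 GiB size is just an illustration:

```python
import mmap

# Reserve 1 GiB of anonymous virtual memory. On Linux this succeeds
# even if 1 GiB of physical RAM is not free: the kernel only hands
# out address space here, not pages.
size = 1 << 30
m = mmap.mmap(-1, size)

m[0] = 1           # the first page is actually faulted in only now
m[size - 1] = 2    # ...and the last page only now

print(len(m))      # the full 1073741824-byte reservation
m.close()
```

Watching RSS in `top` while a program like this runs makes the point vividly: resident memory grows as pages are touched, not when the allocation call returns.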