Category: Technology

Home >> Archive by category "Technology"

OOM-killer Aircraft Analogy

A very amusing analogy by Andries Brouwer about the Linux OOM-killer! When available memory is exhausted, Linux invokes the Out Of Memory (OOM) killer, which must decide which process or processes to terminate to free memory. This is not a precise science: the heuristic algorithm cannot satisfy everyone. Enter Andries' take on it:

"An aircraft company discovered that it was cheaper to fly its planes with less fuel on board. The planes would be lighter and use less fuel and money was saved. On rare occasions however the amount of fuel was insufficient, and the plane would crash. This problem was solved by the engineers of the company by the development of a special OOF (out-of-fuel) mechanism. In emergency cases a passenger was selected and thrown out of the plane. (When necessary, the procedure was repeated.) A large body of theory was developed and many publications were devoted to the problem of properly selecting the victim to be ejected. Should the victim be chosen at random? Or should one choose the heaviest person? Or the oldest? Should passengers pay in order not to be ejected, so that the victim would be the poorest on board? And if for example the heaviest person was chosen, should there be a special exception in case that was the pilot? Should first class passengers be exempted? Now that the OOF mechanism existed, it would be activated every now and then, and eject passengers even when there was no fuel shortage. The engineers are still studying precisely how this malfunction is caused."
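The victim-selection dilemma the analogy pokes fun at can be sketched as a toy scoring loop. This is only an illustration of the *shape* of the idea, not the kernel's actual badness heuristic; the process table, the weighting, and the helper names (`badness`, `select_victims`) are all invented for this example. The real kernel does expose per-process knobs like `/proc/<pid>/oom_score_adj`, which the sketch mimics.

```python
# Toy OOM-killer victim selection: score each process, then "kill" the
# highest scorers until enough memory has been freed. Invented data and
# weighting -- not the kernel's real heuristic.

processes = [
    {"name": "browser", "mem_mb": 2048, "oom_score_adj": 0},
    {"name": "database", "mem_mb": 4096, "oom_score_adj": -500},
    {"name": "batch_job", "mem_mb": 3072, "oom_score_adj": 200},
]

def badness(p):
    # Larger memory use -> higher score; a negative adjustment shields
    # a process (the database), a positive one sacrifices it first.
    return p["mem_mb"] + p["oom_score_adj"] * 4

def select_victims(procs, needed_mb):
    victims, freed = [], 0
    for p in sorted(procs, key=badness, reverse=True):
        if freed >= needed_mb:
            break
        victims.append(p["name"])
        freed += p["mem_mb"]
    return victims

print(select_victims(processes, needed_mb=3000))
```

Note how the adjustment changes everything: without it, the 4 GB database would be the first passenger out of the plane.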

KubeCon Highlight: Saving the Planet from Zombies!

Highlight of #kubecon #cloudnativecon 2020 so far!! Climate keynote by Holly Cummins: "One of the problems we need to think about in 2020 is zombies destroying the climate"! 🧟‍♂️ 🌍 A very important topic, and so great to see it addressed in a tech conference keynote! De-zombify your K8s clusters, everyone! Also worth viewing:

The Battle of the Neighborfoods

1. Introduction

1.1 Background

"UK restaurant market facing fastest decline in seven years": a headline from last year[i], prior to the coronavirus. MCA’s UK Restaurant Market Report 2019[ii] indicated that "large falls in the sales value and outlet volumes of independent restaurants is the cause of the overall decline of the UK restaurant market. It attributes this to a “perfect storm” of rising costs, over-supply, and weakening consumer demand." London's restaurant scene changes week on week, with openings and closures happening all the time; it must be hard to keep up. The hyper-competitiveness of London's restaurant scene makes it one of the toughest cities in the world in which to launch a new venture. "With business rates up and footfall down, a winning formula is worth its weight in gold and although first-rate food is inevitably the focus, other factors can also affect a restaurant's success. Atmosphere is frequently cited in customer surveys as second only to food in an enjoyable restaurant visit and getting the vibe right is crucial."[iii] Due to the coronavirus, most businesses have suffered even greater losses. As restrictions lift, businesses will be looking for ways to make up for lost time and earnings. Reopening a restaurant once lockdown is over is one thing; knowing what to put on the menu when you haven't been in contact with a punter in months is another. "Are there any grounds for hope? A wild optimist might point to some encouraging data about the overperformance of small chains while everyone else loses their shirts; a realist might make coughing noises about small sample sizes and growth from a low base. The queues snaking out of Soho’s recently opened Pastaio suggest one genuinely viable route to salvation – concepts may need to follow its lead and amp up the comfort food factor while dialling down prices. 
And while home delivery is a source of confidence for some parties (Deliveroo, for instance, recently listed its shares on the stock market) it may well end up a false friend: the increased volume of so-called “dark kitchens” presage a sinister vision of the future, where restaurants don’t exist to serve customers onsite at all, but just pump out takeaway meals for us to consume on our sofas. A little far-fetched, perhaps, but with lights going out at a faster rate than many can remember, it can’t be too long before whole tranches of…

Max Welling on the Future of Machine Learning (TDS Podcast)

Max Welling is a former physicist and currently VP Technologies at Qualcomm. He is also an ML researcher affiliated with UC Irvine, CIFAR and the University of Amsterdam. Max has just shared some great insights about the current state of research in ML, and the future direction of the field: “Computations cost energy, and drain phone batteries quickly, so machine learning engineers and chipmakers need to come up with clever ways to reduce the computational cost of running deep learning algorithms. One way this is achieved is by compressing neural networks, or identifying neurons that can be removed with minimal consequences for performance, and another is to reduce the number of bits used to represent each network parameter (sometimes all the way down to one bit!). These strategies tend to be used together, and they’re related in some fairly profound ways.” “Currently, machine learning models are trained on very specific problems (like classifying images into a few hundred categories, or translating from one language to another), and they immediately fail if they’re applied even slightly outside of the domain they were trained for. A computer vision model trained to recognize facial expressions on a dataset featuring people with darker skin will underperform when tested on a different dataset featuring people with lighter skin, for example. Life experience teaches humans that skin tone shouldn’t affect interpretations of facial features, yet this minor difference is enough to throw off even cutting-edge algorithms today.” “So the real challenge is generalizability — something that humans still do much better than machines. But how can we train machine learning algorithms to generalize? 
Max believes that the answer has to do with the way humans learn: unlike machines, our brains seem to focus on learning physical principles, like “when I take one thing and throw it at another thing, those things bounce off each other.” This reasoning is somewhat independent of what those two things are. By contrast, machines tend to learn in the other direction, reasoning not in terms of universal patterns or laws, but rather in terms of patterns that hold for a very particular problem class.” “For that reason, Max feels that the most promising future areas of progress in machine learning will concentrate on learning logical and physical laws, rather than specific applications of those laws or principles.” (Jeremy Harris, Towards Data Science, Jun 3 2020.) Hear the full discussion on Spotify:
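The two compression ideas Max mentions, removing weights and shrinking the bits per parameter, can be sketched in a few lines. This is a minimal illustration, not how any production toolchain does it: the weight values are made up, pruning is simple magnitude thresholding, and the 1-bit quantization keeps only each surviving weight's sign, scaled by the mean magnitude.

```python
# Minimal sketch of two compression ideas: magnitude pruning (drop
# near-zero weights) and 1-bit quantization (keep only each weight's
# sign, scaled by the mean magnitude). Weights are invented.

weights = [0.8, -0.05, 0.6, 0.01, -0.9, 0.02, 0.4, -0.7]

# Pruning: zero out weights whose magnitude falls below a threshold.
threshold = 0.1
pruned = [w if abs(w) >= threshold else 0.0 for w in weights]

# 1-bit quantization of the survivors: every nonzero weight becomes
# +scale or -scale, so one sign bit per weight (plus one shared scale).
kept = [w for w in pruned if w != 0.0]
scale = sum(abs(w) for w in kept) / len(kept)
quantized = [scale if w > 0 else -scale if w < 0 else 0.0 for w in pruned]

print("pruned:   ", pruned)
print("quantized:", [round(q, 3) for q in quantized])
```

The two steps compose naturally, which hints at the relationship Max alludes to: pruning decides *which* parameters survive, quantization decides *how coarsely* the survivors are stored.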

Word of the Month: Heteroscedasticity (and Homoscedasticity)

In linear regression residual analysis, heteroscedastic results mean that the variance of the errors is not constant (see: Graphs 1 and 2). A good linear regression model should instead show a random scattering of residuals with no particular pattern; this is called homoscedasticity (see: Graph 3).

[Graph 1, Graph 2, Graph 3: residual scatter plots]

If your residual analysis looks like Graphs 1 and 2, then the model is not a good fit. To fix this, one could transform the data, or add a variable to the model to help account for the relationship between the errors and the input values. In the tipping example behind Graphs 1 and 2, this could be the number of people at a table or the time of day — since larger groups sometimes tip less because everyone assumes someone else will tip, and people are more generous later in the day, after some vino in the evening! But remember: "Essentially, all models are wrong, but some are useful" (George Box). Now that’s what I call statistical bombasticity!
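A quick way to see heteroscedasticity in code: simulate tips whose noise grows with the bill, fit ordinary least squares, and compare the residual spread for small versus large bills. This is a pure-Python sketch with invented data; the "compare two halves" check is a crude eyeball substitute for a formal test such as Breusch–Pagan.

```python
import random
import statistics

random.seed(42)

# Simulated tips: noise grows with bill size, i.e. heteroscedastic.
x = [random.uniform(5, 100) for _ in range(500)]
y = [0.15 * xi + random.gauss(0, 0.05 * xi) for xi in x]

# Ordinary least squares for y = a + b*x (closed form).
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
    / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx
residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Crude heteroscedasticity check: residual spread in the lower vs the
# upper half of the x range. Similar spreads would suggest
# homoscedasticity; here the upper half fans out much wider.
pairs = sorted(zip(x, residuals))
half = n // 2
low = statistics.stdev(r for _, r in pairs[:half])
high = statistics.stdev(r for _, r in pairs[half:])
print(f"residual stdev, small bills: {low:.3f}")
print(f"residual stdev, large bills: {high:.3f}")
```

Re-running the fit on log-transformed `y` (one of the fixes mentioned above) brings the two spreads much closer together.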