People, you couldn’t trust any of these three guys to go down to the corner grocery for a pack of cigarettes. Stallman would bring you tiny peat-pots of baby tobacco plants, then tell you to grow your own. Assange would buy the cigarettes, but smoke them all himself while coding up something unworkable. And Ed would set fire to himself, to prove to an innocent mankind that tobacco is a monstrous and cancerous evil that must be exposed at all costs.
-M specifies the monitoring port autossh uses to keep the session alive, and -t forces pseudo-terminal allocation, which is necessary to persuade screen/tmux to attach to your local terminal. I use the trailing comment to help track which session is which, and it lets me use bash’s history search (ctrl-r) to find the right command string.
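A sketch of the sort of invocation described, assuming tmux; the monitoring port, hostname, and session name are placeholders, not values from the original post:

```shell
# -M 20000  monitoring port autossh uses to detect a dead connection and restart ssh
# -t        force pseudo-terminal allocation so tmux can attach interactively
# Attach to the "work" session if it exists, otherwise create it.
# The trailing bash comment is the searchable label for ctrl-r.
autossh -M 20000 -t user@example.com 'tmux attach -t work || tmux new -s work' # work-session
```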
The visual representation of data has gone through a number of phases, with its goals switching back and forth between analysis and presentation over time. Many introductions to visualization tend to portray historical examples as all being done for the same purpose. That, I argue in this short, incomplete, and cherry-picked history, is not true.
Early to Mid–1800s: Playfair, Nightingale, Snow, Minard
The first uses of graphics to represent data, interestingly, were very bare and abstract, and at the same time were mostly tools for communication. The abstract nature of these early charts is surprising when you consider the amount of ornamentation and decoration that was common with even simple household objects in the early to middle of the 19th century. John Snow’s and Charles Minard’s maps were downright stark compared with many maps drawn at the time.
Those charts were drawn to communicate, not to analyze. Snow’s cholera map often wrongly serves as an example of visual analysis, when it was drawn to convince. Similarly, Florence Nightingale’s chart of deaths in the Crimean War was used to illustrate her argument that improvements in hygiene would save many lives, and William Playfair illustrated the trade balance between England and its trade partners.
In the first thirty years of the 20th century, Otto Neurath designed a visual language, the ISOTYPE, that not only showed numbers in ways that were easy to read, but that also communicated what they meant. Want to show statistics about workers and factories? Show them as little worker and factory icons. Each icon represents a certain number of the respective object, making it easy to compare and to read off numbers.
Neurath wanted to educate people about the world: ISOTYPE is the acronym for International System of TYpographic Picture Education. His illustrations were meant to stand on their own, without somebody there to explain. Nightingale and Playfair worked under the assumption that there would be explanatory text or they themselves would be there to make the argument, supported by the graphic. Neurath aimed to make his images self-contained and self-explanatory.
1960–70s: Bertin and Tukey
Thirty years later, the focus shifted from presentation to analysis, and the explanatory parts of the graphics disappeared again. John Tukey in particular was interested in what he could learn from the graphical depiction of the data, not whether it would work as a good presentation device. Bertin used graphical means both for analysis and communication, though his more presentational graphics were mostly maps.
Tukey invented a number of different plot types, among them the box plot and the stem-and-leaf display. Bertin, in addition to his seminal theoretical work, created the reorderable matrix, a simple yet powerful tool for finding clusters in data. It represents one of the first uses of interaction in visualization.
The late 1970s and early 1980s saw a new development: the elaborate information graphic, which had existed for a while, was starting to be used to communicate numbers. Nigel Holmes is perhaps the most prominent designer of this kind of visualization.
Holmes actually uses the term explanation graphic, which is not only less misused but also more clearly describes the goal: to explain the data and its context. In addition, Holmes clearly wanted to draw the reader’s attention and entertain. The result was information graphics that were very elaborate and unique, but always based on actual, real data.
In stark contrast to Holmes, Edward Tufte advocates a minimal and unembellished style, with a strong emphasis on displaying the data and just the data. While Tufte keeps talking about showing information, his focus is clearly on displaying data for analysis. What sets his perspective apart from current information visualization research is that he almost entirely talks about static representations (often on paper, for its high potential information density), which the user can examine and use to explore the data and answer questions.
Tufte’s influence is clearly felt in the visualization field today, and his name is often invoked when elaborate information graphics are criticized. Tufte and Holmes represent the two extremes of the embellishment spectrum, and while Tufte’s end has been explored quite well by the scientific community, there is still work to be done on the Holmes side.
2000s: Infographics vs. Visualization
Today’s deluge of infographics is a mix of many different styles, with the loud and crazy ones unfortunately sticking out (and perhaps being the most common). Often, they are used to attract eyeballs and links to otherwise mundane articles, which is not an issue in principle; Holmes’s work partly served a similar purpose.
What many of these infographics lack, unfortunately, is accuracy and depth. While the information graphics of the 1980s were generally useful for understanding the context of the data, many of today’s infographics just add eye candy that is of little practical use, while playing fast and loose with the data.
At the same time, the academic visualization community focuses almost entirely on the analysis and exploration of data, ignoring information and explanatory graphics. There is clearly value in the work that is being done, but I also feel that a huge opportunity is being missed.
In its roughly 200 years of history, our idea of visualization has changed considerably, and work has been done for different purposes. Visual representations are very malleable, and can often serve different purposes reasonably (or even equally) well. To properly understand why things were done a certain way, we have to look at the work based on what we know about its creator’s goals and ideas. If we ignore this context, we do a disservice to the people we inherit from, and limit our own ability to build on their work.
Bret is a man on a crusade to change programming. His essays and talks have all been insightful and great food for thought. This one is a critique of Khan Academy’s computer science courses, which aim to teach programming by getting learners to experiment with code in their browser using a graphics library called Processing, allowing them to see the results of their code as soon as they make a change.
The first half of the article shows the sort of thing he’d have done, with short videos of prototypes for each improvement he has to suggest. These are encouraging, yet they only cover fairly basic programming activities, and seem pretty tied to the realm of graphics programming. The second part (from “Language” onwards) looks at the bigger picture and gives several references and big ideas, but is notably missing the prototypes that he’d knocked up for the easier code examples. At the end, he makes some pretty bold statements:
A frequent question about the sort of techniques presented here is, “How does this scale to real-world programming?” This is a reasonable question, but it’s somewhat like asking how the internal combustion engine will benefit horses.
He may have a point, and it certainly all sounds very good, but unless the next thing he publishes is a working IDE or language that supports his theories (preferably mostly built in its own environment after some bootstrapping), I’m afraid it’ll turn out he’s not on the cusp of a Kuhnian revolution, but on the slow descent to becoming another crank.
Fascinating debate where one guy argues Obama is complicit in making regressive policies a reality and the other that he’s as progressive as America can expect to get in office. Not the sort of argument you tend to hear out of America.
A relatively short point & click adventure game tailored for touchscreen devices.
Except it also fuses the game with a luscious soundtrack we’re constantly reminded was written by Jim Guthrie, which is integral to some of the puzzles and otherwise infuses the game with a rich atmosphere, invoking woodland clearings at one point and a decrepit mountaintop barrow at another.
It’s also tapped into the heart of a post-modern pop culture ethic, referencing Twin Peaks and Videodrome, featuring an “Archetype” whose shadowy behavior provides the connection between the player and the protagonist, a (female!) warrior mage known as the Scythian. It’s completely aware of itself as a game, as a digital artefact, as a completely self-aware digital gaming artefact.
This game screams punk, it roars metal, and they have a methodology that explains why. If The Sword made a video game, it would be a lot like this. It is seriously cool, tricky enough to feel like a challenge, but not so much that it’s a chore to complete. Indeed, the game is split into “sessions”, each of which takes about an hour to work through, after which you can go to a dubstep gig and huff some bath salts or whatever it is the kids are doing these days.
This is joining Psychonauts and Portal in that class of games everyone will be sick of being told to play. It’s more Firefly than The Wire, but it’s clearly rubbing shoulders with them. It’s a delightful experience.