Ever go to see a movie based on seeing a great trailer for it, only to discover that the trailer was much better than the movie? This new trailer for Kubrick's The Shining [.mov file] is as good as the movie, only it's apparently of a very different movie. Nice work.
This is pretty far beyond the sort of things I usually post, but I thought this alleged quote from Fury.com today was amusing:
We realize you had a choice between several bankrupt airlines to fly today, and we thank you for choosing our bankrupt airline.
-- my Delta Airlines pilot this morning
From Overheard in the Office:
Patron: Can you please tell me where I can find post-modern American fiction?
Librarian: Post-modern? That would be in the future, there's no such thing.
Patron: Uh, okay. Can you tell me where science fiction is?
James Nash has been remixing Flickr CC-licensed photos to create new images. The one above includes an image I posted to Flickr a month or two back (I think he primarily used some of the USB cables from this image, as well as several other images you can see at the first link). One of the great parts about Nash's remix is that my original image was majorly dull--just a shot of all the cables running out of the left side of my computer, which intrigued me, but the shot itself wasn't all that interesting from a photographic perspective (major issues with lighting and contrast, among other things). Now it's part of something much cooler.
SorryGottaGo.com offers .wav files you can use when you really need an excuse to hang up during a phone conversation. Sounds range from "There's my ride" (a honking horn) and "The smoke alarm is going off" to "My other phone is ringing" and "The ambulance is here" (I guess you can also use this one if the smoke alarm doesn't do the trick).
I wasn't going to post this site since I hate using the phone, and these would come in handy. Then I realized my current solution--just plain not answering the phone--is pretty foolproof.
I cribbed the title from Slashdot since it was so funny. Many many news sources last spring carried the story of research on IM, chat, and email as intellect-wasting habits. News today (not likely to be so widely covered) that the research behind those headlines wasn't quite so damning. One weblog that criticized the early press coverage now has a more in-depth analysis (and apology), including some comments from Glenn Wilson, one of the primary researchers covered extensively in the original story.
This "infomania study" has been the bane of my life. I was hired by H-P for one day to advise on a PR project and had no anticipation of the extent to which it (and my responsibility for it) would get over-hyped in the media.
There were two parts to their "research" (1) a Gallup-type survey of around 1000 people who admitted mis-using their technology in various ways (e.g. answering e-mails and phone calls while in meetings with other people), and (2) a small in-house experiment with 8 subjects (within-S design) showing that their problem solving ability (on matrices type problems) was seriously impaired by incoming e-mails (flashing on their computer screen) and their own mobile phone ringing intermittently (both of which they were instructed to ignore) by comparison with a quiet control condition. This, as you say, is a temporary distraction effect - not a permanent loss of IQ. The equivalences with smoking pot and losing sleep were made by others, against my counsel, and 8 S[ubject]s somehow became "80 clinical trials".
Since then, I've been asked these same questions about 20 times per day and it is driving me bonkers.
This tendency in science reporting has always bothered me. Although there have always been limitations to the scientific method, one thing it tries to be good at is decontextualizing issues in order to study them. Ok, there's a lot of value to that--it's limited, but if you can understand science within those limitations, it can be useful. But the first thing science reporting does, in major news media, is to blindly recontextualize the research, then make it a universal condition, then use it to apply blanket statements to extraordinarily wide contexts: IM is evil.
Even apart from Wilson's comments, I rarely saw anyone ask how it was that solving matrix problems while working in a busy environment was supposed to be generalizable to anyone except for, I guess, people who solve matrix manipulations at work. I know how to do matrix manipulations (an artifact of an undergrad major in math and computer science), and it's basically carrying around an enormous amount of information in short-term memory while you perform successive, detailed calculations on it. Exactly the sort of thing that interruptions play havoc with. But it's obviously not a day-to-day activity for most people. I'm sure those people are out there, and I'm sure they're working hard in quiet little cubicles, but it's not a focal point or part of the job description for most people.
(I hope I haven't offended anyone who does matrix manipulations for a living.)
Doc Mara (of SurfnPoetry) has some great comments (via email) on the earlier post on how hypertext history has largely overlooked Paul Otlet's contributions, as well as some questions about comments being turned off on Datacloud (more about which follows his note).
Why no comments feature on your blog? Just wanted to let you know I mentioned Otlet in an article I have floating around. "As We May Think" was published about 10 years after Bush started working on the essay, so Bush may have dreamed up the memex first. Bush and Otlet were kicking the idea around at about the same time. Bush gets the cred, while Otlet and others did lots of the work (except in a few sources--and Alex Wright is hardly the first guy to notice this).
WHY Bush gets the cred (Lessig calling Bush "father of hypertext" etc.) and Otlet shows up on academic blogs as "the guy nobody remembers" is the subject of my essay. I probably give a crap answer, but I think it's the discussion we should be having.
Thanks for the post and put me down for a "datacloud should have comments" vote. Of course, there is no democracy on the datacloud, right?
All of which are good points. Our writings of history tend to valorize individual geniuses, unfortunately, and once one hero (like Vannevar Bush) gains prominence, everyone else falls away. But maybe there's room for Otlet (and others) as the history of hypertext continues to be (re)written.
And about the comment feature on Datacloud: Datacloud never gets a huge number of comments, partly because I'm usually just pointing to other things, I guess. At some point over the last few months, comment spambots began hitting the weblog really hard. And although I was able to set up tight enough filters and blacklists to keep most of the comments from ever getting to the public face of Datacloud, the filters and blacklists are Perl scripts that run each time comments.cgi is launched. The filters and blacklists did what they were intended to do, but launching the cgi for every hit--not just a few hits, but tens of thousands of hits a day--was creating some pretty heavy loads on Clarkson's web server (to the point that I was getting periodic visits from the IT support staff). I'm hoping that our move to linux next fall will let me run the weblog on MySQL rather than the current BerkeleyDB to speed things up, but in the interim, I've had to leave the comments turned off....
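For the curious, the kind of filtering involved can be sketched in a few lines. This is a hypothetical Python sketch, not the actual Perl scripts, and the blacklist patterns are invented; the point is the cost model, not the patterns:

```python
import re

# Hypothetical blacklist patterns -- illustrative only, not the real filters.
BLACKLIST = [
    r"(?i)cheap\s+(viagra|pills)",
    r"(?i)casino",
    r"https?://[^\s]+\.(ru|cn)/",   # links to frequently-abused domains
]

def is_spam(comment_text: str) -> bool:
    """Return True if the comment matches any blacklist pattern.

    In a CGI setup, everything here (interpreter startup, pattern
    compilation, the check itself) runs once per request -- which is why
    tens of thousands of spam hits a day can load a server even when
    every single comment gets rejected before reaching the public page.
    """
    return any(re.search(pat, comment_text) for pat in BLACKLIST)

print(is_spam("Buy cheap viagra now!"))    # True
print(is_spam("Great post about Otlet."))  # False
```

The filtering itself is cheap; the per-request process launch is what adds up.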
An interesting historical piece at the information architecture site Boxes and Arrows on Paul Otlet's 1934 plans for a mechanical database built with index cards.
... Paul Otlet envisioned a new kind of scholar’s workstation: a moving desk shaped like a wheel, powered by a network of hinged spokes beneath a series of moving surfaces. The machine would let users search, read and write their way through a vast mechanical database stored on millions of 3x5 index cards.
This new research environment would do more than just let users retrieve documents; it would also let them annotate the relationships between one another, “the connections each [document] has with all other [documents], forming from them what might be called the Universal Book.”
[via 43 Folders]
c|net has posted "Intelligence in the Internet Age," a look at shifting definitions of what it means to be intelligent in the age of ubiquitous information. Lots of useful perspectives from (intelligent) people like Doug Engelbart, Vint Cerf, and Jeff Hawkins (and Seneca, who, as you might guess, comes off as sort of a Luddite).
"It's true we don't remember anything anymore, but we don't need to," said Hawkins, the co-founder of Palm Computing and author of a book called "On Intelligence."
"We might one day sit around and reminisce about having to remember phone numbers, but it's not a bad thing. It frees us up to think about other things. The brain has a limited capacity, if you give it high-level tools, it will work on high-level problems," he said.
Given my own brain's extremely limited capacity, I've moved myself firmly into the "I know everything; I just keep it out there on the Web so I can recall it as needed" camp.
[via Boing Boing]
Cleaning out my download folder, I found SoundOfAnImage, an OS X app that converts images to audio. You can customize the three instruments used in playback (from various built-in OS X synths), change key and tempo, save the audio as MIDI, and toggle between conversion by line or by pixel. Cool.
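I haven't looked at how SoundOfAnImage actually does its conversion, but the general technique--scanning pixels and mapping their values onto notes--can be sketched like this. The brightness-to-pitch mapping here is my own guess, not the app's algorithm:

```python
# Toy pixel-to-note mapping: scan a grayscale image row by row ("by line")
# and turn each pixel's brightness into a MIDI note number (0-127).
# This is a guess at the general technique, not SoundOfAnImage's code.

def image_to_notes(pixels):
    """pixels: list of rows, each a list of 0-255 grayscale values."""
    notes = []
    for row in pixels:
        for value in row:
            notes.append(round(value * 127 / 255))  # brightness -> pitch
    return notes

image = [[0, 128, 255],
         [64, 192, 32]]
print(image_to_notes(image))  # [0, 64, 127, 32, 96, 16]
```

Conversion "by pixel" would presumably just walk the image in a different order; the mapping idea stays the same.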
I haven't seen any confirmation on this (and Blizzard's website doesn't mention it), but there's a report at ShackNews on a virtual plague spreading in the way-popular MMORPG World of Warcraft (spelling errors retained for the cinema verite, breaking-news feel):
Heres the skinny: Blizzard adds in a new instance, Zul'Gurub. Inside is the god of blood, Hakkar. Well, when you fight him he has a debuff called Corrputed Blood. It does like 250-350 damage to palyers and affects nearby players. The amazing thing is SOME PLAYERS have brought this disease (and it is a disease) back to the towns, outside of the instance. It starts spreading amongst the genral population including npcs, who can out generate the damage. Some servers have gotten so bad that you can't go into the major cities without getting the plague (and anyone less than like level 50 nearly immediately die).
GM's even tried quarantining players in certain areas, but the players kept escaping the quarentine and infect other players.
The term "computer virus" comes full circle, I guess.
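For the simulation-minded: the mechanic in the report--an effect that damages its carrier and jumps to nearby characters--is basically a toy epidemic model. A rough sketch (every number here is invented; nothing comes from Blizzard's actual game logic):

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

def simulate(population=100, initial_infected=1, contact_rate=3,
             transmit_chance=0.5, ticks=10):
    """Toy spread model: each tick, every infected character contacts a few
    random others and may pass the debuff along. All parameters invented."""
    infected = set(range(initial_infected))
    for _ in range(ticks):
        newly = set()
        for _carrier in infected:
            for _ in range(contact_rate):
                target = random.randrange(population)
                if target not in infected and random.random() < transmit_chance:
                    newly.add(target)
        infected |= newly
    return len(infected)

print(simulate())  # prints the final infected count (deterministic given the seed)
```

Because each carrier seeds several new carriers per tick, the count grows exponentially until the "city" saturates, which is roughly what the reports from the game describe.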
OS X app Onlife visually maps your work on a computer:
Onlife is an application for the Mac OS X that observes your every interaction with software applications such as Safari, Mail and iChat and then creates a personal shoebox of all the web pages you visit, emails you read, documents you write and much more. Onlife then indexes the contents of your shoebox, makes it searchable and displays all the interactions between you and your favorite apps over time.
[via information aesthetics]
Linotype has just released their free font-management program, FontExplorer X, for Mac OS X. (Previous versions, including one for Windows, are available via a link at that page as well.) [via Macintouch]
D. Keith Robinson at Asterisk was asked (actually, Eric asked D.) one of those desert island questions. "If you were on a desert island and you could only listen to five bands for the rest of your life, who would they be?" Those sort of questions are only interesting if they're really hard to answer, and this one is.
I keep going back and forth, trying to come up with musicians I really like, and that I don't think will get old after repeated listening. But I'm also trying to get some diversity, which makes it harder.
I figure this will give me a pretty broad range of sounds. I'm going to bend the rules a bit here and claim other music that key figures from those bands were also involved in, which will let me include Uncle Tupelo and Golden Smog in with Wilco (let me really stretch it and include Jay Farrar and Son Volt), and Robert Pollard and Tobin Sprout solo material in with GBV. Even so, I'm already revising in my head.
Wait, where's Sonic Youth? What about some Hank Williams Sr. or Kitty Wells? If I include Hank Williams Sr., can I keep bending the rules and include Hank III? (I can live with skipping Hank Williams, Jr.)
Oh, some rap and turntablism: Public Enemy? Peanut Butter Wolf? Grandmaster Flash? DJ Shadow? DJ Logic? And punk. Clash? Rancid? Pennywise? Sex Pistols? (Maybe not enough stuff there.) And where the hell is Johnny Cash?
Damn, this is hard. Luckily, I'm not going to be stuck on a desert island any time soon, so I have time to revise.
From CNN: a British politician is bitten by MS Word's Track Changes feature, which was used to reveal deleted text in a document. (I guess this is why I always toggle "Track Changes While Editing" when I use Track Changes, since that way the stuff is still on-screen instead of being invisible.)
[via CNET News.com]
Mike Davidson discusses (and deconstructs) the difficulty of originality in logo design:
"Never waste a stroke."
That’s the best piece of advice you’ll ever get in logo design. However, it’s also advice that can inadvertently get you in trouble. Draw a blue circle on the screen and you’ve just stolen the Blaupunkt logo. Draw a yellow line and you’re copying Visa. Draw a black swoosh and you’re ripping off Nike. The less intricacies involved in creating your masterpiece, the more likely it is that someone has already created it.
Davidson's post also includes extensive comments from readers.
I'm not quite sure how to interpret this cluster of headlines from Google News, allegedly about some hidden link between Google's new Blog Search service and Gillette's new five-blade razor.
Is it some sort of SAT sample problem (Google Blogs are to the Web as Gillette razors are to a three-day growth of beard?)? Or maybe an as-yet-unreleased new Google tech (Google Bathroom Appliance: Now With Wi-Fi)?
An inspired group of fifth graders from a school in Minnesota put together their own music video for Devo's "Whip It" [.mov link].
Damn. When I was in fifth grade, we thought we were lucky because we got to sing CCR's "Jeremiah Was a Bullfrog" at our school concert.
And in an additional example of borrowing in design, consider this discussion of the iPod, particularly the original white iPod and the new white Nano iPods: Luke Williams of frog design writes about the apparent borrowing of "clean" aspects of bathroom design--think smooth and white, like porcelain bathtubs--in the design of the iPod.
Historically, designers and manufacturers have made interesting use of conventions in design to alter the way people perceive products. The public once thought electricity was dangerous and expensive, so to change this perception, the electricity industry sought to project the image of electricity as a modern and progressive source of energy. To symbolize these qualities, designers used the conventions associated with ‘technological futurism’—chrome plating and streamlining. In 1955, industrial designer Henry Dreyfuss wrote that changes in the design of the modern kitchen had been brought about ‘by two things that had nothing to do with cooking a meal—the automobile and the airplane.’
Although the symbolism has changed, the iPod also uses conventions to appear ahead of its time. Its surfaces are seamless and have no moving parts— two conventions that have often been used in science and science-fiction to connote advanced technology. Remember the seamless, molten-metal bad guy in Terminator 2? Or how about the perfectly seamless, black monolith in 2001: A Space Odyssey?
At some level, I admit this seemed to stretch influence a bit, like an academic paper deconstructing a physical design (which isn't bad, but sort of handwaves around--deconstructs, actually--issues of intention). But as Williams points out, Jonathan Ive, the designer of the iPod, has designed an extensive range of products that includes bathroom fixtures. So while he may not have thought, Let's make it look like a bathtub!, he also clearly understands what meanings people will make from various textures, colors, and shapes.
[via Anne and Gizmodo]
BadDesignKills has some examples of logo designs from a company they claim publishes "rip-off artwork." Examples include both the original logo designs and the allegedly plagiarized design. Like the other design examples I discussed earlier, these might be useful in talking to students (and to each other) about what constitutes creativity and originality. Graphic design, like architecture and other design fields (and writing), frequently borrows from earlier work; the trick is to re-use small portions or visual quotes in creative ways.
The designs BDK compares do, on one hand, situate their borrowings in new contexts and frequently modify the borrowed portions. But in most of the cases, the creativity involved and the modifications seem minimal (transformations such as changing color or flipping an element's orientation). In addition, the overall designs, given that they are logos, are themselves relatively abstract and simple, with the allegedly copied portions constituting the larger part of the overall design.
The Vancouver International Digital Festival (VIDFEST) has selected winners for this year's interactive design competition. Interesting projects in business, education, entertainment, and art/experimental categories (includes descriptions of top 3 in each category, comments from judges, and links to the projects as well).
Salon has a compendium of clips showing the news media actually asking some hard-hitting questions about the government response to Katrina. (You can use the minor "day pass" option and watch a ten-second Flash ad to get free access.)
[via Boing Boing]
Celtx is an open-source multi-platform (win, os x, linux) application for script writing and pre-production (location and talent scouting), including facilities for collaboration during development. I haven't had a chance to use it much yet, but my first impression is that it's really, really cool. Check out the overview and three-minute walkthrough.
On this date in either 1945 or 1947 (depending on which history you believe), the first computer bug was discovered. In diagnosing malfunctions in the Mark II computer at Harvard, a technician tracked the problem down to a dead moth caught in the relays. The operators described the error and even taped the insect into the log. See a description of the events at Wikipedia, including a picture of the relevant section of the notebook. (Contrary to popular belief, Grace Hopper didn't actually discover the bug, but she was on the Mark II team and popularized the story.)
These questions are even trickier in architecture—or painting, or sculpture, for that matter—than on the written page. A paragraph in a biography of George Washington, for example, that reads the same word for word as a paragraph from a previous biography of the first president is solid evidence of theft. But if a building has a torqued (that is, corkscrew-like) façade and thus bears a passing resemblance to an older, similarly torqued design, is that also solid evidence? On the one hand, you could say yes, and make the case that because the torque is so integral that it defines the building, it is therefore even more egregious to copy such an important element than to lift a single paragraph from a book.
But most people would go the other way, because architecture is fundamentally different from writing. A book’s value is decided in large part by the accumulated impressions gained from reading it; therefore, if part of the book was written by someone else, its author has rigged the reader’s appreciation of their work. But architectural appreciation works differently, more holistically. The vast majority of people, inside and out of the profession, judge a building by the sum of its parts to the near exclusion of its individual elements. What is important is not so much the torque, but how the torque works with, say, the building’s base or crown. To be sure, the difference isn’t completely distinct—a book’s worth is obviously determined in part by how well all its arguments and characters and whatnot go together, and an otherwise well-done building can be marred by a particular element, such as a poorly executed entrance. But by and large, we look at the two in fundamentally different ways.
Such issues are increasingly important for a couple of key reasons. First, "creativity" (in my view at least) is being redefined so that it focuses on the manipulation--connection, juxtaposition, filtering, arrangement--as much as on unique production of artifacts from the solitary mind of a genius. Second, plagiarism for most people has historically been defined in terms of written, verbal texts. But today people are not simply writing texts, but designing things, and design involves a different set of ground rules. In text production, quotations are marked literally with quotation marks and explicit citations. There aren't quote marks in design.
Attempting to make finer-grained distinctions than his editors agree with, Douglas Rushkoff explores the differences between "commoditization" and "commodification." (For the record, I agree with him--language changes over time, and the publishing industry usually lags far behind both popular culture and theory; NetNewsWire's spellcheck doesn't like either, but it also flagged NetNewsWire, so that's not saying much.) Oddly, this is the second inquiry I've seen this week about editors balking at terms they don't seem to quite get; the first was from a friend struggling to get her editors to allow the word "appropriated" in a textbook about mix culture. After a long discussion, here's what Rushkoff ends up with:
'Commoditization' is a newer and undocumented word (except in WIKI) referring specifically to the way that goods that used to be distinguishable in terms of attributes end up becoming mere commodities in the eyes of the market or consumers. 'The collapse of Marlboro's brand value in the early 1990's convinced cigarette manufacturers that their products had become commoditized.' or 'Unless Intel comes up with a new kind of computer memory chip, Japanese equivalents will commoditize RAM.' The problem with commoditization is that the only thing left to distinguish one brand from another is price, so margins shrink.
Commodification is more of a crime of the market against humanity, while commoditization is more of a market problem for the manufacturers of branded goods.