I call back, and speak to a tech support woman. She says: "press Tab." I explain that I can't without saying I've read and agreed to documents I don't have. She says "press page down". Same problem. She says "scroll down". I explain it's not a Windows screen. She says "insert any Dell-shipped CD". I exlpain [sic] the problem of opening the CD packaging.
She insists I have to press a key. I ask her if she really means that I have to agree to the licenses before it's at all possible that I've read them. She says "yes". I explain that that's not acceptable, and ask for her supervisor.
(Via Dan Gillmor's eJournal)
In the beginning, it all started out as an experiment. "What would happen if I wore a nametag all the time?" I wondered. Maybe people would be friendlier. Maybe people would say hello to me. Maybe people would stare at me and think I was a complete weirdo!

The numbers page provides some enlightening statistics on how Scott's been received:
98 percent of the population thinks I'm a pretty weird guy
16 people have ripped my nametag off
16 spare nametags I took out from my wallet that were reapplied
7 people who have tried to beat me up

I'm not sure what sort of person is provoked to violence by the sight of a nametag. In fact, I'm pretty sure I don't want to know.
An interesting response to Tufte's widely circulated condemnation of PowerPoint, from ex-Talking Heads leader David Byrne:
Although I began by making fun of the medium, I soon realized I could actually create things that were beautiful. I could bend the program to my own whim and use it as an artistic agent. The pieces became like short films: Some were sweet, some were scary, and some were mysterioso. I discovered that even without text, I could make works that were "about" something, something beyond themselves, and that they could even have emotional resonance. What had I stumbled upon? Surely some techie or computer artist was already using this dumb program as an artistic medium. I couldn't really have this territory all to myself--or could I?
PowerPoint's gotten an undeservedly bad rap. Sure, the program allows--in some ways, encourages--bad presentation design. But other aspects of the program provide useful structures for developing presentations: the outline structure, the landscape orientation (compared to, say, default document creation in Word), etc.
I wonder what PowerPoint template this was from?
More to the point, though, Tufte's comments position users as passive sheep--their work automatically structured (tainted) by the technology. But while technologies do lend force to particular types of work (encouraging some, discouraging others), it doesn't make sense to lay blame with the technology. Why not ask the larger question: Why do we assume that technologies operate under simple cause and effect patterns? Do we really believe those Microsoft and Apple advertisements that promise us perfection as long as we stay on the upgrade plan?
Why not, instead, understand technology use as a complicated interplay among users, technologies, and contexts? Why not help users figure out how to design effective presentations and other communications?
The short answer, of course, is that it's easier to sell magic than work.
Remember that joker in high school who, when he was working under the hood of his car, would tell you to turn the ignition key? Then he'd start screaming like his hand was being chewed off by the fan belt? And your heart would leap out of your chest and you'd yell an involuntary "Gah!" while he was rolling on the ground and laughing?
This is the technical communication equivalent of that guy. (From a tutorial on using vi, a text editor; author's name withheld.)
6. Is your cursor placed before the word? Place it after the word by hitting the "w" key. Now, hit the "b" key. Pretty neat, isn't it? Play around for a minute and then bring yourself back to the beginning of the word "at."
7. Now, press "dd". Oops! We erased the whole line! Type "u". It is fixed!
A Slashdot piece (w/extensive hyperlinks) on crumbling corporate support for SCO's ongoing conference.
[...] Intel and HP seem to be backing swiftly away from their sponsorship of SCO's in-progress Las Vegas conference (an eWeek article suggests that "Intel Corp. was recently billed as one of the lead sponsors of SCO's Forum 2003 conference here this week, but then suddenly disappeared from all marketing and press material for the forum. It appears that Hewlett-Packard Co. also got cold feet. As late as last week, SCO was telling attendees that HP would be giving a partner keynote at the forum on Tuesday morning. But on Sunday the schedule of events given to attendees when they registered makes no mention of an HP keynote...")

Tragic.
Related to Gillmor's piece (below): A Wired article on aggregators by Ryan Singel.
"I'm subscribed to 200 feeds," said Luke Hutteman, who designed SharpReader. "Last year, I didn't even know what an aggregator was."
Though newsreaders have been around almost as long as Usenet and the Internet, some prominent bloggers and programmers argue that better syndication standards and more sophisticated readers herald the next big leap for life on the Internet.
"I'm a voracious reader and I built the software because I couldn't stand the Web without it," said Brent Simmons, who says the sales of his Macintosh-based aggregator, NetNewsWire, is now in five figures. "The demand for aggregators is just going to tip over at some point and go wild."
Piece by Dan Gillmor in the Mercury News about news aggregators and working with (and in) information:
Every morning I learn the latest from a variety of news organizations, Weblogs, newsletters and other online information sources. But I don't use my e-mail program or go surfing from Web site to Web site.

Here's a shot of NetNewsWire, which includes tools for posting to weblogs (I'm posting this to MovableType from NetNewsWire):
Instead, I use a piece of software called a news aggregator or newsreader -- in my case, it's called NetNewsWire and runs on Mac OS X -- to scoop up headlines and summaries, along with links to the places where they originated.[...]
One fan is Mitchell Kertzman, a friend who has run several Silicon Valley companies and is now a venture capitalist with the firm Hummer Winblad in San Francisco. When he saw an RSS newsreader for the first time, he says, ``I had the same instinctive feeling I had years ago when I saw my first primitive Web browser -- a feeling of amazing and unlimited possibilities.''
[larger version (128k) here]
Aggregators are important tools because they provide a method for re-shaping information space. Browsing the web is, in general, a relatively slow and linear process. (Even if you jump around, you're still moving primarily in a line, albeit a crooked one, for the most part.) But news aggregators condense that space, folding the linear threads of websites up accordion-style, stacking one site up against the next to provide a smaller, condensed space, sort of a neutron star version of the web.
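Under the hood, an aggregator's job is mostly mechanical: fetch each subscribed site's RSS feed and scoop out the item titles and links. Here's a minimal sketch using only Python's standard library (the sample feed and its URLs are made up for illustration; a real aggregator would fetch feeds over HTTP and handle several RSS and syndication dialects):

```python
import xml.etree.ElementTree as ET

# A tiny, made-up RSS 2.0 feed. A real newsreader like NetNewsWire
# would download one of these from each subscribed site.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def headlines(feed_xml):
    """Return (title, link) pairs for each item in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in headlines(SAMPLE_FEED):
    print(title, "->", link)
```

Stack up the results from a few hundred of these feeds in one window and you get the condensed, accordion-folded view described above.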
I AM MR. DARL MCBRIDE CURRENTLY SERVING AS THE PRESIDENT AND CHIEF EXECUTIVE OFFICER OF THE SCO GROUP, FORMERLY KNOWN AS CALDERA SYSTEMS INTERNATIONAL, IN LINDON, UTAH, UNITED STATES OF AMERICA. I KNOW THIS LETTER MIGHT SURPRISE YOUR BECAUSE WE HAVE HAD NO PREVIOUS COMMUNICATIONS OR BUSINESS DEALINGS BEFORE NOW.
MY ASSOCIATES HAVE RECENTLY MADE CLAIM TO COMPUTER SOFTWARES WORTH AN ESTIMATED $1 BILLION U.S. DOLLARS. I AM WRITING TO YOU IN CONFIDENCE BECAUSE WE URGENTLY REQUIRE YOUR ASSISTANCE TO OBTAIN THESE FUNDS.
In a wide-ranging piece for PBS, Robert X. Cringely offers some opinions about IT policies.
Why aren't Apple Macintosh computers more popular in large mainstream organizations? Whatever the gigahertz numbers say, Macintoshes are comparable in performance to Windows or Linux machines. Whatever the conventional wisdom or the Microsoft marketing message, Macs aren't dramatically more expensive to buy and on a Total Cost of Ownership basis they are probably cheaper. Nobody would argue that Macs are harder to use. Clearly, they are easier to use, especially on a network. So what's the problem? Why do Macs seem to exist only in media outfits?

I'm not convinced that this is a conspiracy, if by conspiracy we mean intentional and secretive activity. But if we step back and look at some of the arguments made against Macs by sysadmins and Windows users, we can start to sketch the outlines of some of these policy decisions.
I used to think it came down to nerd ego. Macs were easy to use, so they didn't get the respect of nerds who measured their testosterone levels by how fluently they could navigate a command line interface. Now, I think differently. Now, I think Macs threaten the livelihood of IT staffs. If you recommend purchasing a computer that requires only half the support of the machine it is replacing, aren't you putting your job in danger? Exactly.
Ideally, the IT department ought to recommend the best computer for the job, but more often than not, they recommend the best computer for the IT department's job.
First, that Macs don't allow users to control the system in deep and powerful ways. Let's leave aside the fact that this isn't true at all--many geeks hack their systems down to machine code levels. The point of the Mac, though, is to facilitate use without requiring in-depth tweaking (at least in theory). This philosophy is absolutely opposed to that of many Windows users--and just as importantly, opposed to the tendencies of many people who go into IT careers.
It's not a conspiracy, but an attitude. One that probably isn't examined seriously by those holding it.
Second, that (as Cringely points out elsewhere in the article) Macs are more expensive. But Total Cost of Ownership studies show, over and over again, that when the cost of troubleshooting and training is taken into account, Macs are on par with or cheaper than Windows systems. But here again we see another unexamined tendency: IT staffs tend not to look at TCO because of the conflict of interest: it's difficult to ask someone to consider every hour of their work as a negative effect on the bottom line (which is what TCO studies ask for). This isn't, I think, a conscious decision, but it's still striking.
From a CNET report, SCO's unproven but loudly shouted claims to own key Linux IP are paying off:
The Lindon, Utah-based company said it earned $3.1 million, or 19 cents per share, during its fiscal third quarter, compared with a loss of $4.5 million, or 35 cents a share, in the year-ago quarter. The company said third-quarter revenue was $20.1 million, up from $15.4 million a year ago.
The analogy doesn't completely hold, because in order to work properly it would also involve a seedy man holding a gun to each player's head, and informing them in low, certain tones that they should put their $100 down and walk away without ever finding out which shell the ball is under. If they decline the exit option and pick the wrong shell, the gun goes off. Or something like that.
See this Open Source analysis for a decidedly anti-SCO (but well researched) version of the situation:
Through phrases like “misusing and misappropriating SCO's proprietary software”, and through the enumeration of five categories of rights in paragraph 68, SCO/Caldera's complaint implies the existence of relevant intellectual-property rights based on patent, copyright, trade-secret, and trademark law as a background to the explicit matter of its licensing dispute with IBM over Linux.
It is notable that the complaint does so without ever actually stating what those claims are.
[From LockerGnome Bytes]
To use Google's built-in calculator function, simply enter the expression you'd like evaluated in the search box and hit the Enter key or click the Google Search button. The calculator can evaluate mathematical expressions involving basic arithmetic (5+2*2 or 2^20), more complicated math (sine(30 degrees) or e^(i pi)+1), units of measure and conversions (100 miles in kilometers or 160 pounds * 4000 feet in Calories), and physical constants (1 a.u./c or G*mass of earth/radius of earth^2). You can also experiment with other numbering systems, including hexadecimal and binary.
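For comparison, most of those expressions can be checked in a few lines of ordinary code. A quick sketch in Python using only the standard library (Google's parser handles the natural-language forms itself; the mile-to-kilometer factor is the standard definition, 1 mile = 1.609344 km):

```python
import math
import cmath

# Basic arithmetic: 5+2*2 and 2^20 (Google's ^ is Python's **)
print(5 + 2 * 2)    # 9
print(2 ** 20)      # 1048576

# sine(30 degrees): math.sin expects radians, so convert first
print(math.sin(math.radians(30)))   # approximately 0.5

# e^(i pi) + 1 -- Euler's identity; the result is effectively zero
print(cmath.exp(1j * math.pi) + 1)

# 100 miles in kilometers
print(100 * 1.609344)   # approximately 160.93
```

The calories-from-potential-energy and gravitational-acceleration examples work the same way, just with more unit bookkeeping, which is exactly what Google's calculator does for you.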
Fox News has filed a lawsuit against comedian Al Franken to prevent him from using the phrase "Fair and Balanced" in the title of his upcoming book, Lies and the Lying Liars Who Tell Them: A Fair and Balanced Look at the Right. Fox claims that Franken infringes on their trademark, filed in 1998. As the Washington Times notes,
Fox also complains that the cover of Franken's book closely resembles the cover of Fox News celebrity Bill O'Reilly's book The O'Reilly Factor.
In its fair and balanced way, Fox News refers in its suit to Franken as an "unstable" and "shrill" "C-level commentator" who is "not a well-respected voice in American politics."
The attorneys do concede that Franken "achieved some renown as a comedy writer in the 1970s when he worked for the television program 'Saturday Night Live' " but add he since "has attempted to remake himself into a political commentator" and "is neither a journalist nor a television news personality." (Note the distinction being made between "journalist" and "television news personality.")
"His views lack any serious depth or insight," Fox News sniffed for good measure.
All of which suggests, of course, that Franken's book--his use of the term, the cover, etc.--is a satirical look at Fox News reporting tactics. Such parodies are part of the Fair Use provisions in US copyright law and, as such, are protected forms of speech. Stanford's Fair Use site provides this definition:
A parody is a work that ridicules another, usually well-known work, by imitating it in a comic way. Judges understand that by its nature, parody demands some taking from the original work being parodied. Unlike other forms of fair use, a fairly extensive use of the original work is permitted in a parody in order to "conjure up" the original.

Franken undoubtedly will appreciate the irony of the lawsuit and get some comedic mileage from it.
A Slashdot article reports that the Joint Committee of the Higher Education and Entertainment Communities has published "Background Discussion of Copyright Law and Potential Liability for Students Engaged in P2P File Sharing on University Networks".
Colleges and universities are under no legal obligation to defend, or accept responsibility for, the illegal actions of their students in the P2P context. A major step toward ending unlawful P2P activities on college and university campuses lies in education.
The report is available as a 1689k PDF file.
From Boxes and Arrows
Semiotics: A Primer for Designers by Challis Hodge
Meaning arises, as Saussure frames it, not from some inherent meaning in objects, but out of differences among signs in complex symbol systems. That's not to say (as some postmodernists do) that contingency equals complete breakdown. As cultural theorist Stuart Hall says, just because objects don't automatically correspond to words doesn't mean that objects never correspond to words. The correspondences exist because we, as cultures, perpetuate them--we teach them to each other in everyday practice, in schools and homes and workplaces.
A "computer" used to be, in the nineteenth century, someone who performed calculations. There was no necessary connection between the word and the occupation. But the connection was upheld in some communities by active use, by defining "computer" as something different from "boss" or "slave" or "clock".
But the contingency of that connection provides a deconstructive hinge for change--slippage in the system of signification. So after several decades of active development and education, a "computer" referred to an extremely large piece of technology, run by highly trained specialists for very specific, often esoteric purposes. With developments in computing during the last quarter of the twentieth century, "computer" came to refer to an increasingly general-purpose, small, common object. Things change.
Why is this important to designers? Because we have to acknowledge that the meaning of our communications--web pages, novels, interactive videos, or even consumer technologies like mice--is contingent, created in the interaction between specific users and specific communications with specific contexts. It's literally out of our control.
But, avoiding the yawning maw of postmodernism, we keep this from falling completely into chaos because we involve ourselves in the communities that we develop for--through usability testing, participatory design, contextual inquiry, and a myriad of other methods for understanding tendencies in meaning. This concern for the user is what sets good design apart from bad design in many cases.
In addition, we need to be careful when we make proclamations about users--there's no such thing as a universal user. There are tendencies, there are trends and rules of thumb. But these are only ever guesses, no matter how accurate they seem in practice. We have to keep ourselves always open to the idea that those guesses might be wrong.
Although I've just shown why there's no such thing as the One True Meaning, theoretical deconstruction (as opposed to simply taking something apart) is about finding the contingency within an object or term and teasing it open, showing how that term actually contains within itself its own negation. The concept of "usability," for example, necessarily contains within it some form of difficulty, because without at least some fleck of resistance or tension, usability would disappear into non-action. In other words, usability has as its goal its own disappearance.
This might seem to be merely so much wordplay, but it also suggests (as I have in several other places) that usability theory frequently places so much emphasis on the user as an idiot that it forgets sometimes users have to think. Learning isn't passive, it's active. And it requires work. Without that work, there's nothing to learn.
LITTLE ROCK, Ark. (AP) -- A computer hacker gained access to private files at Acxiom Corp., one of the world's largest consumer database companies, and was able to download sensitive information about some customers of the company's clients, the company said Thursday.
But before you panic, Jennifer Barrett, Acxiom's Chief Privacy Officer, provided this comforting note:
"We view the risk of it at this point as very low," she said. "We also were notified that data ... hasn't been accessed by any other parties or used for any other fraudulent purposes. I can say this about the data, much of it was nonsensitive information." (emphasis added)
Instant Messaging, over the last year or so, has begun an inexorable shift from a disruptive, underground technology -- banned at many corporate and academic sites -- into a sanctioned (and much hyped) communication tool. C|Net (among other sites) reports that Microsoft has released specs on their corporate instant messaging platform. Sporting corporate-friendly capabilities like higher security and "management tools", IM is poised to begin bridging the gap between email and telephone.
The rise of IM is in no way a "natural" movement, but it's a common one: a disruptive technology emerges, is taken up by users, and flourishes under the radar. IM (and before it, hypertext, web pages, etc.) frequently (although not necessarily) operates as a subversive technology. Or perhaps transversive -- such technologies operate not out of a simple, reactive resistance to corporate structures, but without regard to those structures, sometimes alongside them and sometimes against them, but within the spaces purportedly "owned" by corporations.
The postmodernization of communication in IM -- the breaking down of hierarchy, the insistence on circulation of information -- is seen at first as disruptive. Eventually, though, the technology is articulated as a more efficient tool in a postmodern economy, an easier way to move information about in fluid, deterritorialized spaces. In that recognition, the fluid spaces are reterritorialized, but only in limited ways: information still flows freely, but is now subject to observation, measuring, benchmarking.
I'm not saying this is a bad thing. IT happens. Or so the story goes.
The breakdown and loose recombination of spaces mirrors an overall cycle in postmodern capitalism: Fragment organic wholes, then pick up the pieces (or, even better, get the pieces to self-organize in smaller, more fluid structures). The flattening of corporate hierarchies and the corresponding move to "empower" workers brings with it a certain type of freedom: freedom to do work more efficiently. So the carefully controlled breakdown, especially when its results and processes are closely monitored by management, harnesses subversive and transversive power and articulates it in line with existing corporate needs.
A week or two back, Bruce Tognazzini argued in an Ask Tog article (correctly, I think) that people who work in the current disparate areas that might be collected under "interaction architect" lack a strong professional focus, status, and -- most importantly -- the ability to drive key decisions about how software, hardware, and users (including their contexts) all interact. Often, crucial decisions about the design of software and hardware are left up to programmers, managers, database designers, and others who, despite their clear expertise in their chosen fields, do not have knowledge about how computers actually get used (or misused).
We've been complaining bitterly, these last 25 years, that we get no respect, that we are thought of as nothing more than decorators, if we are thought of at all. Guess what? We have no one to blame but ourselves. We have sat on the sidelines, perpetually powerless, whining, instead of changing up the game so we can win.
Somehow "user experience practitioner" doesn't roll off the tongue so easily. Hence the inevitable effort for UX-types to name what it is they do: at conferences and in newsletters, for years, I've seen the endless discussions. Should it be "usability professional"? "Information designer"? "Interaction architect"? Some other permutation?
Here's my proposal - easy to pronounce, easy to understand, just two easy words: "Who cares?"
The problem of visibility and status in the professions is not a new one--every profession goes through a period of struggle for recognition and stability. I've written several articles on it over the last ten years -- it's still a pressing issue, though.
Some professions make it and some don't. Carolyn Marvin, some time ago, documented the ways that electrical engineering struggled for professional recognition, emerging during the rise of electrical development, innovation, and deployment. Tactics such as publicity campaigns, certification, professional journals and conferences were part of articulating numerous different strands until they came to represent "electrical engineer".
So what's the big deal with "interaction architect" (the title Tog recommends)? Not much from one perspective. Hurst argues that to be truly effective, interface/interaction/usability people should be invisible. Hurst misses some key points though, in his effort to be witty and provocative. First, when this group argues for increased status, they're not simply being self-serving. In order to do their jobs, interaction architects (or whatever we call them) absolutely have to be visible in corporate contexts. If Hurst is in an organization where he can quietly do his job and money just rolls in, I want to work there too. In order to "facilitate" users, interaction architects need to have an extremely high profile in the company: without that profile, key decisions end up being made by the people in marketing, in programming, in management -- groups that actually know very little about how users think, live, and communicate. (Marketing might be close, but in the long run, marketing's main focus is on the sale. Period. Take a look at the most recent version of an MS Office app to see what marketing can do for interface design. There are some key usable innovations there, but the amount of irresponsible garbage is overwhelming.)
Or perhaps Hurst meant (but didn't actually say) that interaction architects should never be seen by users, like wizards behind curtains. Even if this were the case, that invisibility wouldn't prevent interaction designers from having high corporate presence.
But it's not the case: There's no reason that the profession of interaction architect should be a veiled mystery to users. Indeed, think of all the other professions that we might call "facilitative": Architect (Tog's choice), physician, coach, teacher.
And it's not even the case that software and the interface should be transparent to users. There are times when the interface should disappear, but there are also lots of times when effective use requires difficulty. It'd be great if website design were so simple that all we had to do was push a button to create innovative, unique, attractive, and useful websites. But that's a myth, at least for anything but trivial websites. Websites often need to challenge us to think about ourselves, about others. They often need to push us to learn new ways of working and living. They often need to argue with us, entering into public debate.
Becoming invisible isn't the first step toward facilitation, it's the first step toward death.
[Thanks to Bill Hart-Davidson for pointing out the Hurst article after I posted some comments and a link to Tog's piece on the Association for Teachers of Technical Writing discussion list.]
I recently purchased a Sharp Zaurus PDA. Or maybe it's a laptop. Or palmtop. It's difficult to tell. Running QTopia's version of Linux, I can use it for a much broader range of activities than I could my previous PDAs (going all the way back to an Apple Newton and up to the Handspring the Zaurus replaces). In fact, for the most part, I'm using the Zaurus for 90% of what I'd use a desktop or laptop for. It won't run MS Office, for example, but HancomWord, which is included, reads and writes .doc files. And I can't run Adobe Photoshop, but it does run simple paint programs (and if I upgrade to the Debian Linux installation that's available, I could run GIMP or another full-power image editor). Here's a shot of the Zaurus in relation to the 15" PowerBook (the Zaurus is displaying Google News).
Along with wireless, the Zaurus is beginning to change my ideas about how these technologies are integrated into my worklife (and life in general). I'm still in the middle of the transition period, but I find myself relying on my PowerBook less (and the several Dell Dimension towers have been relegated to serving various applications or running specialized financial software that won't run under OS X or Linux). And the portability of the device makes it much less of an issue to carry around--sure, I can bring the PowerBook around with me, but the Zaurus is an order of magnitude simpler, sort of like the leap from early portable computers like the Osborne to the modern, four- to six-pound laptop.
There's an odd, almost contradictory impulse in the development of these technologies: at the same time that I'm shifting large portions of my work onto a device slightly larger than a pack of cards, I'm also getting used to the idea that some of my work requires different support. So while for years I struggled to use the PowerBook as a desktop replacement--and it was, for the most part--I've realized that different types of work should rely on different types of technology. The middle ground won't always cut it. So while I can use the Web, do email, and edit documents on the Zaurus, the tiny screen (even at 640x480 resolution) won't cut it for work requiring large amounts of information on the display. For that sort of work, the PowerBook setup makes more sense (particularly when I'm running in dual-monitor mode to add the pixel real estate from the 21-inch monitor on my desk to the 15" LCD of the laptop). And, despite the ease of use and portability of the PowerBook, I sometimes need more CPU speed than it can deliver and use one of the various desktop machines in my office or my lab.
(All of this, I realize, is partially to rationalize my irrational need to buy more technology.)
In a Linux Journal interview, Brian Kernighan, computer programming pioneer, identified the two main challenges facing computing:
There are only two real problems in computing: computers are too hard to use and too hard to program.
Much less work, though, goes into that crucial oscillation between chaos and order--into working within and across information spaces. Anyone who has worked on a complex problem knows that creative work absolutely requires frustration and misunderstanding, absolutely needs a messy space in which to play. It's not either/or.
See, for example, Eastgate's Tinderbox (at left--click to enlarge) or Ranchero's NetNewsWire as spaces (the former) and tools (the latter) for working productively within complex information spaces without reducing that complexity to something that needs to be removed or easily managed. (There are many others, but not enough.)
Researchers at UC Berkeley have developed the world's smallest motor, approximately 500 nanometers across.
Obviously this is more of a proof-of-concept project than a productive technology, but it marks significant progress in realizing the wild-assed early claims for nanotech, like cellular-level medical treatment. This is an extension of the buckyballs idea, which was a passive method for delivering medical compounds in extremely tiny doses to targeted locations.
More fanciful portrayals at the Foresight Institute's Nanomedicine Art Gallery.
About freakin' time...
On Monday, leading Linux distributor Red Hat Inc. jumped into the fray with a lawsuit that asks a court to decide if any of its software infringes on SCO's intellectual property. It also seeks an order that would bar SCO from making "unfair, untrue and deceptive" claims.
"We're seeking a resolution ... to all the rhetoric as fast as possible," said Matthew Szulik, Red Hat's chief executive.
I'm starting the slow and clumsy process of migrating my website over to MovableType. Lots of bugs, although none too major (from a user's perspective at least). The "home" links on second-level subpages are broken for now--use the "back" button/command in your browser. And other issues are sort of clunky. And I miss the wrecked cars of the previous version, so there will likely be an interface makeover.
But, on the plus side, MovableType provides weblogging facilities plus discussion.
Check the ongoing production blog of The Invisible Revolution, a one-hour TV documentary on Douglas Engelbart. The sixties computer pioneer's innovations included the mouse, windowed interfaces, hypertext, and more.
(Took the industry a while to catch on....)
Just testing to see if NetNewsWire will post to the MovableType blog.
I've managed to get MovableType up and running. I'll spend a little time getting familiar with the interface. The next stage will be to develop a template that mirrors the current (static) homepage of my site (at least consistent in style) and convert the main page to MovableType.