DITCHWALK

A Road Less Traveled


Copyright © 2002-2022 Mark Barrett 

Archives for 2015

Publishing is for Professionals

July 14, 2015 By Mark Leave a Comment

So today is the day that Harper Lee’s new ‘novel’ goes on sale. Far be it from me to question the motives of the titans of cultural responsibility at HarperCollins, but if the early returns are any indication this is not a glorious day in the history of literature:

“Watchman”’s portrayal of the older Finch as a man who has attended a Ku Klux Klan meeting and opposes racial desegregation has already grabbed headlines because of the stark contrast to the noble lawyer in “Mockingbird” who defends a black man wrongly accused of raping a white woman.

The Wall Street Journal’s Sam Sacks described “Watchman” as “a distressing book, one that delivers a startling rebuttal to the shining idealism of ‘To Kill a Mockingbird.’ This story is of the toppling of idols; its major theme is disillusion.”

Several reviewers found fault with the new book on artistic grounds.

David L. Ulin of the Los Angeles Times called it “an apprentice effort (that) falls apart in the second half” and Julia Teller at the Chicago Tribune said it was “almost unbearably clunky” in parts.

It’s quite clear that until very recently Harper Lee never intended this ‘novel’ to be published, and that until the death of her sister, who was her primary caretaker, that wish was respected. Now, amazingly, at exactly the moment when Lee is alone and also quite aged and infirm, it turns out that the kindly cultural stewards at HarperCollins have been able to convince Lee otherwise. It’s a miracle — and in particular a miracle that has absolutely nothing to do with money.

But there’s a problem, of course, and the problem is how to see this new ‘novel’ in the context of Lee’s less-infamous novel, To Kill a Mockingbird. Or rather it’s a problem for some, but not for anyone who has ever written, because what’s being sold as a new novel from Harper Lee is almost certainly an early exploratory draft that held great meaning for Lee not because of what it was, but because of what it led to.

When you’re a writer, and particularly when you work in long form, you learn that your initial work is not always on the mark. Sometimes you get help from others, sometimes you see a better way yourself, but in any case you try something, it doesn’t work, so you try something else. There is nothing new in this. It is the way authors have always written, even as many authors themselves prefer to cling to the self-aggrandizing (and coincidentally salable) lie that great works emerge wholly formed, without typos.

In the graphic-novel genre Lee’s new ‘novel’ would simply be considered an alternate history and discussed in that context, but Mockingbird is sainted literature. Sainted literature that may now be indelibly stained by the noble and benevolent actions of a giant corporation acting only in the best interest of its author and readers. Because many of the critics who bless literature with sainthood are themselves culturally unable to comprehend Lee’s new ‘novel’ as a work product, as opposed to a statement of some kind, the myth will be perpetuated that this new work is in fact a separate work, which it almost certainly is not.

Whatever becomes of Lee and her legacy, the lesson for other writers is clear. If you’ve got an early exploratory draft, and you don’t want someone coming along later and misrepresenting that draft as a separate work, then you need to burn or delete that draft. At which point the academics will accuse you of having stolen or appropriated the final product, because they will find no evidence of how you got there on your own.

— Mark Barrett

Filed Under: Publishing Tagged With: HarperCollins, Jonathan Burnham, professionals

VR, Drones and Autonomous Vehicles

July 5, 2015 By Mark 2 Comments

As you are probably aware due to the unending stream of utopian press reports emanating from Silicon Valley, three new technologies bankrolled by three of the biggest names in tech are poised to change your life for the better. Just as the computer and internet have been nothing but a positive in the lives of all people everywhere, so too will virtual reality, drones and self-driving vehicles liberate human beings from the tedium of, respectively, sensing the real world, delivering packages, and driving.

Still, in the wee hours of the night, and admittedly afflicted by the kind of doubt that will forever keep human beings from reaching the computational certitude of computers, I find myself thinking that VR, drones and autonomous vehicles sound nice in the vacuum of public relations and venture-capital funding, but may experience or even provoke real-world problems upon deployment. In fact, I can’t keep my storytelling reflex from filling in all the utopian backdrops and can’t-miss financial windfalls with scenarios in which these technologies fail or are repurposed to darker intents.

Filed Under: Non Sequiturs Tagged With: Facebook, Google

Authors, Artists and the Internet

June 16, 2015 By Mark Leave a Comment

Because it’s easy to become overwhelmed by tech minutia, particularly if you hail from the arts, I thought it might be useful to step back from the discussion of SEO in the previous post and consider the internet in broader context. If you’re not into technology most tech-speak probably sounds like gibberish, but you probably also have faith that it all makes sense to someone somewhere. If the internet is a mystery to you as an artist or author, you trust that the smart, wonderful, benevolent people who created the internet in order to help you reach both your intended audience and your creative potential really do understand what it’s all about.

The internet is an amazing creation, and has come to dominate our lives in an amazingly short amount of time. Backed by hundreds of billions of dollars in investment, infrastructure and advertising, the internet is clearly the place to be, at least according to the internet. Beyond making a lot of people rich, however, the internet as a method of communication has democratized conversations that were previously controlled by self-interested if not bigoted gatekeepers, meaning voices that were perpetually overlooked or muted can now be heard on issues of critical importance. In every way the internet imitates life, and at times even imitates art.

The problem with that feel-good appraisal is that it ignores another fundamental truth about the internet, which is that it is completely insane. And in saying that I do not mean the internet is exasperating or wildly avant-garde, nor am I being hyperbolic or pejorative. Rather, I mean that as a cold, clinical appraisal. If you are an author or artist the maze of technologies driving the internet may make it hard to perceive the systemic dysfunction emanating from your screen (though the phrase virtual reality is itself a shrill clue), but you are in fact better positioned than most to understand it. All you need to do is recast your conception of the internet in familiar terms.

If you’re a writer, think of the internet as having been authored by Joseph Heller or Kurt Vonnegut. If you’re an artist, think of the internet as a work by Salvador Dalí or René Magritte. Which is to say that the internet is not simply the sum of its technologies and techniques, but a construct, space, and experience informed and distorted by human perception and imagination.

Filed Under: ~ Tangents Tagged With: platform

Authors, Artists and SEO

May 17, 2015 By Mark 1 Comment

Over the past month or so I’ve been catching up on a lot of old to-do items, including a year’s worth of site maintenance that I kept putting off until I switched ISPs, which, oddly enough, I kept putting off until a couple of months ago. Anyway, while researching something or other I ran across a truly useful article on the subject of search engine optimization (SEO), which had the refreshing candor to acknowledge that SEO advocates speak gibberish:

As you sit down with your new SEO consultant it starts out well, but soon he says “We’ll need to implement a good 301 redirect plan so that you don’t lose organic rankings and traffic.” Then he says something about title tags, which you’ve heard of although you’re not quite sure exactly what they are or what they do, or why it’s important to update them as your consultant is recommending, although it all sounds good. Then he starts using other jargon like “indexing,” “link equity,” and “canonicalization,” and with every word you feel your grasp on reality slipping and the need to take a nap.

The entire piece contains an excellent glossary of terms that come up again and again in SEO articles. Unfortunately, the very fact that SEO is such an enigma confuses the question of how sole proprietors — particularly independent artists and authors — might best make use of SEO without themselves becoming confused or lapsing into gibberish.

So here’s the truth about SEO if you’re an artist. For the most part SEO is not something you need to be concerned about. Whatever time you might put into SEO, or whatever time you might convert into money in order to pay someone else to worry about SEO, can usually be more profitably spent creating whatever it is that you create.
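That said, if you’re curious, the “301 redirect” jargon quoted above is less mysterious than it sounds. A 301 is just an HTTP response that tells browsers and search engines a page has moved permanently, so old links and rankings follow it to the new address. Here is a minimal sketch using Python’s standard library — the /old-post/ and /new-post/ paths are made-up examples, not anything from a real site:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of retired URLs to their new homes.
REDIRECTS = {"/old-post/": "/new-post/"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            # 301 = "Moved Permanently": browsers and search-engine
            # crawlers carry the old link's standing over to the new URL.
            self.send_response(301)
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

# To run it for real (blocks until interrupted):
# HTTPServer(("127.0.0.1", 8301), RedirectHandler).serve_forever()
```

In practice a blogger would configure this in the web server itself — Apache’s `Redirect 301` directive or an nginx `return 301` — rather than in code, but the mechanics are the same either way.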

Filed Under: ~ Tangents Tagged With: SEO

Google is the New Microsoft

May 14, 2015 By Mark Leave a Comment

Two days ago I went to log into Gmail and found that the login screen I had been using for, what — an entire decade? — was suddenly behaving differently. Now, as a longtime web user I’ve been taught that any time something seems phishy I should make sure that what I’m seeing is actually what it purports to be. That is in fact the lesson all large web companies preach — be vigilant!

The problem, of course, is that the level of criminal sophistication perpetrating such deceptions keeps growing, to the point that almost anything seems possible. How do I know that someone hasn’t figured out a way to show me the appropriate URL, then redirect my traffic or keystrokes to a hostile server? I mean, I’m a reasonably sophisticated web user, but that only means I’m that much more aware of what I don’t know.

As it turns out, the change to my Gmail login ritual was not only initiated by Google, it was rolled out on the sly without, ironically, so much as an email that a change was coming. Meaning I had to get on the internet to find out that other users around the country and around the world were being confronted by that same autocratic change before I knew it was safe to log into my Gmail account.

Somewhere in the high-tech bowels of Google a group of very highly paid people got together and decided that they would roll out a new login scheme which requires twice as many clicks as the old scheme, that they would do so without giving notice to anyone who used that scheme, and that they would give no reason for doing so. That is exactly how the world ended up with Windows 8, and a whole host of other Microsoft initiatives to win market share and own technology spaces in complete disregard for its customers.

I suspect that the Gmail change has something to do with Google’s recognition that the world is going mobile, but the real story here is the contempt with which Google views its users. That is in fact the signature moment in any tech company’s life cycle — the one where current users are considered to be, at best, nothing more than a population to be exploited, and at worst, a hindrance to corporate goals that have completely diverged from the products and services being offered and utilized.

In terms of righteous indignation this barely qualifies as a 2, so I’m not suggesting anyone leave Gmail, but simply that you take a step back and get your mind around the contempt that any company would have to have in order to suddenly change the portal to your email account. Because those are the same people who have said they are not reading your hosted emails, or personally identifying your web traffic, or doing anything else you wouldn’t want them to do because they promised they wouldn’t be evil.

Update: It occurred to me last night that the new Gmail requirement that users click on two separate screens in order to log in, instead of only one as before, may have been initiated as a means of encouraging people to stay logged in all the time. While presenting as an initial annoyance, once users gave in and complied it would strengthen Google’s brand association with email products and the user’s reliance on same, preventing people from migrating to other platforms for chat and video, etc. The downside, obviously, is that it would actually make Gmail accounts significantly less secure if an always-logged-in device fell into the wrong hands.

— Mark Barrett

Filed Under: Non Sequiturs Tagged With: Google

The Best Monitor / Display for Text

May 10, 2015 By Mark 59 Comments

A few weeks ago, just before my keyboard died, my monitor momentarily flickered ever so subtly between displaying white as full white and white as soft pink. It happened so quickly, and the change was so faint, that at first I thought my eyes were playing tricks on me. Fortunately, a day or two later the same thing happened, allowing me to determine that the monitor itself was hinky.

While I have no qualms about opening up a keyboard to see if I can rectify a problem, or just about any other gadget you could name, I draw the line at messing around inside devices that contain potentially lethal capacitors. Combine that reticence with the flickering I had seen, and the low, staticky hum that had been building up in my monitor for the past year or two, and it suddenly seemed prudent to once again peruse the state-of-the-art display offerings available in the market before the very device I would need to rely on to do so failed completely.

(There are all kinds of things that can go wrong with a computer, and the most maddening aspect of all of them is that those issues immediately make it impossible to access the internet, which is where all the solutions are. If your operating system locks up you need to access another computer to research the problem. If your monitor dies you need to have another display on hand in order to order a replacement, which you would not need if you already had one on hand. Speaking of which, even if you use an add-on graphics card, the motherboard in your computer should always have its own graphics chip for exactly that reason. If your card dies — and graphics cards are always dying, or freaking out — you can still drive your monitor and access the web.)

As was the case with my venerable old keyboard, I was not at all surprised that my monitor might be at the end of its useful life. In fact — and you will no doubt find this amusing or absurd — I am still using a second-hand CRT that I bought in the mid-aughts for the lofty price of twenty-five dollars. While that in itself is comical, the real scream is that the monitor was manufactured in 2001, meaning it’s close to fifteen years old. Yet until a couple of weeks ago it had been working flawlessly all that time.

The monitor is a 19″ Viewsonic A90, and I can’t say I’ve ever had a single complaint about it. It replaced my beloved old Sony Trinitron G400, which borked one day without the slightest hint that something might be amiss. Scrambling to get myself back in freelance mode I scanned Craigslist and found a used monitor that would allow me to limp along until I found a better permanent solution. Eight or so years later here I am, still using the same A90. (In internet time, of course, those eight years are more like eight hundred. Not only were LCDs and, later, LEDs pricey back then, and quite raw in terms of performance, but you could also reasonably expect to use Craigslist without being murdered.)

Between then and now I have kept track of changes in the price, size, functionality and technology of flat-panel monitors, and more than once researched display ratings with the thought that I might join the twenty-first century. Each time, however, three issues kept me from pulling the trigger.

First, while all that snazzy new technology was indeed snazzy and new, relative to CRT technology it was still immature, requiring compromises I was not willing to make in terms of display quality and potential effects on my eyes. Having always been sensitive to flickering monitors, I was not eager to throw money at a problem I did not have — or worse, buy myself a problem I did not want. (As a general rule, putting off any tech purchase as long as possible pays off twice, because what you end up with later is almost always better and cheaper than what you can purchase today.)

Second, at the time I was primarily freelancing in the interactive industry, which meant I was working with a lot of beta-version software that had not been fully tested with every conceivable display technology. Using lagging tech at both the graphics and display level meant I could be reasonably confident that whatever I was working on would draw to my screen, at least sufficiently to allow me to do my part.

Third — and this relates somewhat to the second point — one advantage CRTs had and still have over LCD/LED displays is that they do not have a native resolution:

The native resolution of a LCD, LCoS or other flat panel display refers to its single fixed resolution. As an LCD display consists of a fixed raster, it cannot change resolution to match the signal being displayed as a CRT monitor can, meaning that optimal display quality can be reached only when the signal input matches the native resolution.

Whether you run a CRT at 1024×768 or 1600×1200 you’re going to get pretty much the same image quality, albeit at different scales. The fact that I could switch my A90 to any resolution was a boon while working in the games biz, because I could adjust my monitor to fit whatever was best for any game while still preserving the detail and clarity of whatever documents I was working on.

While imagery is and always has been the lusty focus of monitor reviews, there is almost nothing more difficult to clearly render using pixels of light than the sharply delineated, high-contrast symbols we call text. Because LCD/LED monitors have a native resolution, attempting to scale text (or anything else) introduces another problem:

While CRT monitors can usually display images at various resolutions, an LCD monitor has to rely on interpolation (scaling of the image), which causes a loss of image quality. An LCD has to scale up a smaller image to fit into the area of the native resolution. This is the same principle as taking a smaller image in an image editing program and enlarging it; the smaller image loses its sharpness when it is expanded.

The key word there is interpolation. If you run your LCD/LED at anything other than its native resolution what you see on your screen will almost inevitably be less sharp. While that may not matter when you’re watching a DVD or playing a game, interpolating text is one of the more difficult things to do well. Particularly in early flat panels the degradation from interpolation was considerable, making anything other than the native resolution ill-suited for word processing.
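The blur is easy to demonstrate. Below is a tiny, self-contained sketch — not how a monitor’s scaler is actually implemented, just the arithmetic of linear interpolation — showing what happens to a hard black-to-white edge, like the stroke of a glyph, when a row of pixels is resampled to a non-native width:

```python
def bilinear_resample(row, new_width):
    """Linearly interpolate a row of pixel values (0-255) to new_width
    samples. Assumes new_width >= 2."""
    old_width = len(row)
    out = []
    for i in range(new_width):
        # Map each output position back into the source row's coordinates.
        pos = i * (old_width - 1) / (new_width - 1)
        lo = int(pos)
        hi = min(lo + 1, old_width - 1)
        frac = pos - lo
        # Blend the two nearest source pixels in proportion to distance.
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

# A hard black-to-white edge, like one side of a letterform's stroke.
edge = [0, 0, 0, 255, 255, 255]

# Resampled to a width that isn't an integer multiple of the original,
# the edge picks up intermediate grays -- the softness you see as blurry
# text on a panel running outside its native resolution.
print(bilinear_resample(edge, 8))
```

A CRT sidesteps this entirely because its beam simply redraws the whole raster at whatever resolution the signal specifies; there is no fixed pixel grid to blend against.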

Filed Under: ~ Tangents Tagged With: tools

Requiem for a Keyboard

May 3, 2015 By Mark 1 Comment

A week or so ago I was bashing away at my keyboard when I suddenly began producing g’s that looked like this — ‘g — and h’s that looked like this: -h. Having spent a fair amount of time working with computers I knew there were various reasons why my keyboard might suddenly become possessed, none of them even remotely exciting. Because those multi-character glitches were crippling my ability to type, however, I quickly set about diagnosing the cause.

Occasionally, when my mind is firing faster than my fingers can move, I’ll accidentally enter a key combination that performs some automated feat I did not intend to perform. Worse, because I don’t know which keys I hit in which order, I won’t know if some preexisting configuration was altered that will trip me up later. If I’m lucky the result will be something obvious, as happens from time to time with Google Mail, which seems to delight in auto-sending messages before I’m through with them. And of course there’s the perpetually irritating StickyKeys feature, which, as far as I know, exists only to remind me that my left pinky is loitering on the shift key while I’m thinking about what I want to type.

Because there are so many default key combinations that come with any word processing software, to say nothing of the keyboard software itself, and because it’s possible to invoke such macros by accident — including changing the output of specific keys — my first diagnostic act was to uninstall both IntelliType and the keyboard driver to see if that solved the problem. Which it did not.

My next concern — which grew rapidly — was that the errant behavior I was seeing was the result of a virus or malware. After running scans for both, however, my machine was deemed clean, which meant, almost certainly, that I was having a hardware problem.

My keyboard of choice is the Microsoft Natural Ergonomic Keyboard 4000 (MNEK4K), which also has, I believe, the longest name in the history of keyboards. The model flaking out on my desk was the most recent incarnation of the same split-board design I had been using for close to twenty years, since just after Microsoft’s original Natural Keyboard debuted. It was at least six or seven years old, had seen regular use almost daily over that time, and given how loose and clicky the keys had become I wasn’t particularly surprised that it might have reached the end of its useful life. (Conservatively, total key presses on that board easily topped ten million, with the bulk taking place on the most-used letter keys. Speaking of which, years ago the A, S, D and arrow keys lost their labels due to repeated use, and the M key only displayed the upper-left corner of that letter.)

As with any keyboard, from time to time a key had gotten stuck, so that was my first thought in terms of mechanical failures. Close inspection turned up nothing obvious, however, so my next idea was that six or seven years worth of wayward hairs, lint and food debris might be causing trouble I could not see. And that meant I would have to open up the board.

Now, if you’re not used to taking things apart, the idea of opening up a keyboard may seem fraught with risk. Fortunately, I had two things going for me. First, I’ve taken a lot of things apart over the years, from computer tech to automobile engines, so I have some familiarity with the procedures and practices involved. Second, my keyboard had been out of warranty for years, meaning I could hardly make things worse. If I didn’t open it up I would have to buy a new one, and if I broke it by opening it I would have to buy a new one, so there was literally nothing to lose other than time and a little DIY dignity.

Filed Under: ~ Tangents Tagged With: tools

Graphics and Interactive Storytelling

April 30, 2015 By Mark 3 Comments

In the mid-nineties I became fascinated by the storytelling potential of interactive entertainment. My interest peaked in the early aughts, during what I now think of as the second great wave of interactive storytelling mania. While the potential of interactive storytelling seems obvious to everyone, the mechanisms — the actual techniques — by which interactive stories might be told are complex and at times counterintuitive.

After finding my way into the interactive industry and meeting with some professional success, I was asked in 2000 to write an article for SIGGRAPH’s Computer Graphics magazine about the future of interactive storytelling. While great effort was being put into replicating techniques from passive mediums, including, particularly, film, it seemed to me that such an imitative approach had everything exactly backwards.

Recently, while conducting periodic maintenance on my computer and sprucing up Ditchwalk, I ran across that article, which for some reason I had never gotten around to adding to the Docs page on this site. That omission now stands corrected.

The title of the article is Graphics — the Language of Interactive Storytelling. Coming from someone who primarily made a living with words that may seem odd, but the title and the accompanying text go to the heart of the interactive storytelling problem, and why so little progress has been made. In fact, the only thing that’s changed is that we no longer worry about having enough processing power to do what we want — yet today’s enviably high hardware ceiling is still rarely used to facilitate aspects of interaction that might truly drive emotional involvement.

Fifteen years on, during the fourth great wave of interactive storytelling mania now taking place in the industry, little has changed. Another generation of eager developers is grappling with the same questions, reaching the same inherently limiting conclusions, attempting to once again adapt non-interactive techniques from passive mediums, and confusing the revelation of pre-designed outcomes with choices that determine outcomes.

— Mark Barrett

Filed Under: Ditchwalk.com, Interactive Tagged With: interactive storytelling

Site Seeing: Daniel Menaker

April 26, 2015 By Mark Leave a Comment

One additional nugget I managed to recover while fixing broken links was a post on the Barnes & Noble site, written by Daniel Menaker. Who is Daniel Menaker? Well, at the time I knew almost nothing about him, to the point that I described him — hilariously in retrospect — as “another dirt-dishing voice” in the publishing industry. (Saving me somewhat, I also noted that he was a former Editor-in-Chief at Random House and Fiction Editor at The New Yorker.)

Re-reading the B&N post after five years, however, I found myself more curious about Mr. Menaker than about publishing. A quick search led me to a memoir he’d written, titled My Mistake, which was published in 2013. Interestingly, in reading that book I found that the context of Mr. Menaker’s life gave more weight to the views he expressed in the B&N post, as well as those in that book and in other writings I discovered.

Now, it may be that confirmation bias played a part in my reaction because much of what Mr. Menaker had to say jibed with my own conclusions, but I don’t think that’s the case. Not only do I think he would disagree with some of my grousing here on Ditchwalk, but my interest in understanding the publishing industry has decreased so much in the past five years that I now consider such questions moot at best. (For example, five years ago I would have deemed this story important. Today it seems meaningless.)

Still, corroboration is useful when you’re assessing any human endeavor as an outsider, to say nothing of doing so from the relative orbit of, say, Neptune. In reading My Mistake I found a fair bit of corroboration for conclusions I’d previously reached, yet after I finished the book I also decided to see what others had to say as a hedge against my own potential bias. That impetus quickly led to this review in The New York Times, which caused me to stare agape at my screen as I read what seemed to be a bizarro-world take on the same text I’d just digested:

Make no mistake, this is an angry book. Menaker is angry at himself for his character flaws (a flippant one-upmanship that alienates others), and he is thin-skinned, remembering every slight. As a former executive editor in chief of Random House, he is proud to have nurtured writers who went on to win literary acclaim (the Pulitzer Prize winner Elizabeth Strout, the National Book Award winner Colum McCann). Menaker is understandably upset over being ousted from that job in 2007, but what seems to truly infuriate him is being shunned by the publisher, Gina Centrello, during a transition period.

I honestly don’t know what that reviewer is talking about. My Mistake is not an angry book, unless your definition of anger includes expressing an opinion. And no, Mr. Menaker is not infuriated about being shunned by anyone — or at least not anyone in the publishing biz. If anything, he’s infuriated by his own serial incapacity to connect with other human beings in his life, though over time — and particularly in the writing and structure of My Mistake — I think he belatedly squares things with his departed father.

Then again, that’s the publishing industry in a nutshell. You can spend a year or two writing a book, yet when it’s reviewed — in this case, by no less than the self-anointed consensus cultural steward of commercial literary criticism — you can still end up being cleaved by a reviewer with an axe to grind, or mischaracterized because of a reviewer’s blind spots or personal acidity. (If you also worked in publishing for a time you might even be the recipient of some score settling.)

Filed Under: Publishing Tagged With: site seeing

Site Seeing: Laura Resnick

April 24, 2015 By Mark Leave a Comment

Speaking of reclaiming busted links, one benefit I didn’t anticipate was that chasing down lost pages put me back in touch with information and sources I previously found valuable. For example, while I was ultimately thwarted in my ability to recover an excellent post by Laura Resnick concerning cover design, digging around on the web for that missing content led to two informative discoveries.

First, I eventually found what I think is a more recent discussion of the same subject here. (The first link at the bottom of that interview is the same busted link I was trying to track down.) Second, when I went to Laura’s new site I found a great resource page that every independent author should bookmark and peruse.

Sure, the fact that I don’t have a resources page suddenly makes me look very bad in comparison, but that’s all the more reason to visit Laura’s site and check it out.

— Mark Barrett

Filed Under: Publishing, Writing Tagged With: cover design, site seeing
