A Remembrance of Webs Past

A lot of the recent talk about Ello, an upstart social media site, has been about how the nascent community reminds members of the early days of the internet. You know, before native advertising, privacy invasions and Buzzfeed.

Ello's appealing manifesto recalls the early days and first communities of the Web.

And that got me thinking about what the early Web was like. Many people, and a lot of my students, think the internet began in 1991, with the birth of the World Wide Web. And, while it's true that the Web of intertwingled, typographically varied documents started then, the internet, as a network of computers, had been around for decades. Networks like FidoNet and CompuServe offered a rich array of documents and online communities, and logging into FTP servers opened up even more digital doors.

And, in the early 90s, when you browsed the wondrous Web with Mosaic, you could actually log on and scan the new sites that had sprung up overnight. Honestly, in those first days I did just that, spending a half hour to see what new resources had popped up. It was that small and that fresh.

The sites I found all sported generic grey backgrounds, ungainly fonts and designs that reveled in rotating ovoids, "under construction" graphics and ungodly colour schemes. What pictures there were got painted across the screen at the pace of a dot matrix printer and were tiny, 256-colour blocky smudges. There was little sound, almost no video and it was a Land Before Flash.

To create the pages you either hand-coded HTML or used a clunky tool like PageMill, which, when I first saw it demonstrated at Macworld in 1995, was the first software program I lusted after.

The online communities back then were either walled gardens, like AOL or Prodigy, or text-only playgrounds for nerds. Some of my favourites were Community Nets, geo-centric sites that had sprung up in cities like Cleveland and Denver and acted as virtual city hubs. I even helped build one in Hamilton - CompuSpec, an online community started by the Hamilton Spectator and killed a few years later by editors who thought the internet was probably a fad.

It's hard not to be nostalgic about those years. Facebook didn't exist, nor did Google, or cookies, or botnets, malware or popups. It was clear sailing to the far side of the world.

Now the web has become its own world of wonders, beyond what we dreamed of back then. I'm writing this on a cloud-based word processor, with a wireless keyboard paired to a phone with 128 gigs of memory (so much, so small) on a moving bus while connected to a Web so fast I could stream a hi-res movie and have a video conference with ten friends. And, yet.

So, I understand the nostalgia Ello spawns. We can't live there anymore. But sometimes it's nice to visit. 

The MacPhail-safe Guide to Better Smartphone Photos

In the last few weeks new smartphones from a variety of manufacturers have sported improved cameras. I'm thinking here of the HTC One M8, the Moto X and the new iPhones.

Each, in its own way, has made it easier to take better pictures using phones with lenses and sensors that are, compared to other photographic gear on the market, Lilliputian.

So, I thought it was a good time for me to present the MacPhail-safe Guide to Mobile Photography. Or, Five Tips for Taking Better Pictures with That Phone in Your Hand.

Here we go.

Tip#1 - Clean Your Lens

I know this sounds stupid and obvious. But, trust me, it isn't. Take a look at the lens of your smartphone right now. See all that dust and that dull sheen on the surface? That's like a sheet of waxed paper on your lens. If you shoot bright windows or streetlights you'll see the effect immediately - greasy streaks of light and halos. But, even when you aren't shooting light sources, the grunge that is causing those light smears is there, ruining the sharpness, colour saturation and contrast of every picture you take. So, take the microfibre cloth from your glasses case and give the phone lens a little love. 

Tip#2 - Brace Yourself

The human body is just a skin bag of levers and fulcrums. The farther the end of a lever (say, an arm) is from the fulcrum (say, a shoulder joint), the more it can fanny about in space. So, when you take a mobile phone picture holding that phone out in front of you at arm's length, you're putting it in about the most unstable place possible. A better way is to bring your elbows to your sides and gently press them to your ribcage. Now, brace your core, like you've been told in Pilates, and don't lock your knees. Now, breathe in and, halfway through a gentle exhale, take your picture with a delicate touch of the screen. There. That's a picture that's less likely to suffer from the camera shake that can ruin a shot, especially one taken in low light. Want more stability? Brace your phone on a chair, a table, in a doorway or on anything solid. You'll be amazed how much sharper your images will appear. You could also use a tripod, but few people actually ever do that with a mobile phone, so this will get you by.

Tip#3 - Better Panos

This one will make you do a forehead slap. I see lots of vacationers taking panoramic shots with their phones by arcing their aforementioned extended arms in space in front of them, as if presenting the vista to a visiting guest. Wrong. Think about it. When you swing your camera like that, the lens at the end of the arc is a good three feet to the right of the lens at the beginning of the shot. That yard introduces parallax (think of how different a scene looks when you close the left then the right eye). And, that parallax introduces distortion that can make a pano look wonky. A better solution is to rotate the phone itself on the axis that runs through the centre of the lens. There's way less distortion that way, since the lens is travelling a minimum distance through space.
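If you like numbers, here's a rough back-of-the-envelope sketch of that parallax in Python. The distances are invented but plausible; treat it as an illustration, not anyone's spec:

    # Hypothetical figures: a swept-arm pano moves the lens ~0.9 m (3 ft).
    import math

    baseline_m = 0.9   # how far the lens travels during the sweep
    subject_m = 3.0    # a foreground subject about 3 m away

    # Angular shift of the subject against the distant background:
    parallax_deg = math.degrees(math.atan(baseline_m / subject_m))
    print(f"{parallax_deg:.1f} degrees")  # ~16.7 degrees of disparity

    # Pivot on the lens axis instead and baseline_m drops to nearly zero,
    # so the parallax all but vanishes.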

Tip#4 - Baby Your Sensor

The sensor on an expensive DSLR is the size of a Triscuit. The sensor in most smartphones is the size of a Triscuit crumb. Actually, a little bigger, the size of an infant's fingernail. It's tiny. That means it can't display the range of light to dark a big sensor can. So, don't expect it to give you great shots of a bald-headed man in the noon sun. It can't capture the detail in the shadow under his nose as well as the sweaty pores in his sun-drenched brow. When you can, shoot in environments with less dynamic range, or fill shadows using natural reflectors like the sides of buildings, white interior walls or anything that will help your infant-fingernail sensor cope.

Tip#5 - Come in Close

War photographer Robert Capa once said, "If your pictures aren't good enough, you're not close enough". One of the first things I do when I see beginning photographers' work is to suggest they crop in on their images to remove distractions. Why do they need to? Maybe because they're shy, or maybe because they haven't trained themselves to see distractions on the edges, but beginners often just don't get close enough to a subject. The problem is, when you crop the image from a tiny sensor you're tossing away information captured by sensor pixels you could have put to better use. So, your image is degraded. Better to crop at the scene, rather than after the fact.
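To put a hypothetical number on what cropping costs, assume a typical 12-megapixel phone sensor (the figures below are illustrative, not any particular phone's):

    # A 12 MP sensor: 4000 x 3000 pixels.
    width, height = 4000, 3000

    crop = 0.5  # keep only the middle 50% of each dimension
    kept_mp = (width * crop) * (height * crop) / 1e6
    print(f"{kept_mp:.0f} MP")  # 3 MP - a quarter of what you paid for

Crop that hard and three quarters of your pixels land on the cutting room floor. Walk closer and you keep them all.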

There you go. Now, go outside and take better pictures.

Hanging Out in Meet Space

Hanging out with other humans is what humans like to do best. And, we have hundreds of ways and places to do it. So, it is strange that so many organizations have dreadful intranets that make hanging out as easy as a shoe fitting at a mermaid convention.

For many companies, intranets are nothing more than PDF graveyards. Even organizations that pride themselves on collaboration have internal web spaces with no place for discussion, sharing or creating together.

Sometimes that's because IT departments are primarily concerned with managing the haphazard files spit out by what they see as an unruly band of cubicle monkeys pounding away at Microsoft Office. And between that, solving the same monkeys' mundane hardware issues, dealing with legal and government compliance and sweating the details on security, there's no time to think about something as thorny and organic as discussion forums or wikis.

Sometimes it's because the organization is afraid that conversation amongst staff will foment or fuel unrest, or that the moderation of discussion forums will take time away from HR, or communications, or IT, or whoever is charged with that oversight.

And, sometimes it’s just because managers don’t know that there is a wide world out there beyond resource management structures and SharePoint. But there is.

I’m a big fan of Basecamp, a great online product that has a short, easy learning curve and lets teams share text documents, files, discussions, calendars and to-dos. It has very powerful mobile apps and a robust tagging system that makes finding files natural and simple.

My favourite feature is the ease with which plain text documents can be collaboratively created. If you’re using documents for print, online and for web apps all at the same time, having a plaintext, canonical document that lives in one place is a godsend.

I’m also a fan of wikis. My favourite is a made-in-Canada wiki called Project Forum. A wiki is a simple collaborative workspace in which team members can read and write web pages. The spaces created in wikis like Project Forum are generally organic and the team creating them is self-sufficient. That is, they can edit, create, comment on and erase pages and whole fora, without IT support. They tend not to be the prettiest of group spaces, but they are the most flexible.

Then, there is a fascinating tool that is the darling of Silicon Valley startups, Slack. Slack is less a collaborative space than a powerful chat tool with robust file archiving and searching baked into it. For small, creative teams, it’s a great alternative to a product like Basecamp.

And, there is a suite of new collaborative spaces with a sparse, modern feel. They combine the flexibility of wikis with the good looks of contemporary apps. I’m thinking of products like Asana, and the Canadian-made Igloo. Asana has more of a project management focus, while Igloo is not only a great way to set up collaborative spaces for a variety of teams inside a company, but is also a terrific extranet tool. So, a company can use it to work on projects with remote suppliers, consultants and partners.

Or, you could just head back into the dank archives of your long dead PDFs and blow the dust off a few. But, if that’s your company’s idea of collaboration, you’re riding in the wrong rowboat, friend.

The Digital Crown Affair

As Apple CEO Tim Cook pointed out at the unveiling of his company's new watch, every new computing platform has its unique input device. The mainframe had a terminal, the Mac a mouse, the iPod a click wheel, the laptop a trackpad, the iPhone multitouch gestures. And, now, the Apple Watch has the little metal windey thing that sticks out of its side.

On an analog watch that thing is a crown and it's used to set the time. Apple calls its windey thing a digital crown and it's used to scroll through menus, zoom in on photos and change settings on the high-resolution screen of what has to be the most stylish smartwatch on the planet. While that's a bit like calling it the snappiest dresser in Math Club, it really is a fashion plate by any Swiss standard.

But it was the digital crown that most impressed me. It's a perfectly Apple piece of industrial design. It echoes the functionality of a crown in an analog watch, looks like it belongs and yet is the perfect input device for a tiny screen your fingers would obscure if you tried to use multitouch. 

And, it was only one of a handful of design details that demonstrate how much Jony Ive and his design team care about user experience.

Another is the rounding of the glass on the surface of the new iPhone 6 and 6 Plus. That gentle curve not only picks up the curved back edges and carries them around to the front, but also means that when you swipe left and right, the tactile experience trails off with a lovely fade instead of a sharp edge.

On the subject of the new phones, I think the new 5.5-inch size is the real winner here. It seems to have all the functionality of an iPad mini but in pocketable form. And the new keyboard layout with dedicated cut and paste keys means it could be an ideal two-thumbed writing slate. Plus, it's going to be the only iOS device with an optically stabilized camera for video and stills. Since it also now shoots 1080p video at up to 60 frames per second, it could be the only recording and writing device a field reporter would need. It is, I think, the new iPad Mini.

I suspect the next iPad lineup will feature a 12-inch model and the iPad Mini will disappear.

But then, I thought the big white mystery cube at the event would be a house, not just a big demo area, so, what do I know?

The Mystery of the Big White Box

In about a week’s time Apple will announce its new iPhone or iPhones. It might even tease, or release, its long-rumoured wearable. Rumours about all of these theoretical devices have been swirling for months. And, as usual, Apple has had nothing to say. But, they have been building something.

Last week photos appeared of a three-storey, white edifice. The company is erecting the mystery cube near the Flint Center for Performing Arts in Cupertino, California. It’s just south of the same building in which Steve Jobs first unpacked the Macintosh for the public back in 1984.

Apple already has its own small event hall. And its executives have made frequent use of two spaces in downtown San Francisco: the Yerba Buena Center (750 seats) and Moscone West (1,500 seats). The Flint Center itself has over 2,000 seats. So, it’s unlikely the mystery building is an auditorium. I mean, it’s as big as a house. Which is, I’m betting, exactly what it is.

Back in 2006 Apple built a multi-roomed demo space to show off its underwhelming iPod Hi-Fi. It has also modelled full-sized Apple stores months prior to their installation around the world.

And we also know that iOS 8 will feature both HomeKit and HealthKit apps. The former will be the lingua franca for a host of home automation devices and appliances. The latter will serve as a hub for health and fitness data. Not coincidentally, on Tuesday Elgato announced the Eve home sensor kit that works with HomeKit.

And, there is now a strong rumour that, after a long delay, the iPhone 6 will have a Near Field Communication (NFC) chip. That means that combined with the underused Passbook, the fingerprint scanner and the millions of credit cards Apple already has on file, NFC could turn the iPhone 6 into a secure electronic wallet. All the pieces just have to snap together.

Now imagine a wearable with health sensors, NFC and access to iOS 8’s improved notifications.

In 2001 Apple introduced the idea that the iMac could be the hub of your digital life. I think that 30 years after the Macintosh was introduced, in the same place, and 13 years after the “digital hub” was launched, Apple will announce two new products that will make the “digital hub” look quaint. Apple is now aiming to be at the centre of our lives, and it's going to invite us into its idea of our home to show us how it’s going to work.

Either that or reporters will be led into a big white room and finally discover where all the Jony Ive videos have been made.

Interactive Graphics. The New Kid in Town.

As any good reporter knows, drama drubs data. You want to convince someone of something, sell someone something or compel someone to action? A great story beats a spreadsheet, every time. 

This is a lesson all journalists learn early and many scientists have trouble getting into their large craniums. Unless you can make your audience engage with a story, all your numbers are just like the muted trombone sounds in a Charlie Brown special when the teacher is talking. Waah, waah, waah.

You think Jenny McCarthy has a following because she knows how to use Excel? 

Stories about income tax almost always start with an anecdotal lead: “Josh and Anita Roseman are going to have a little more to save for their daughter’s university education this year …” and Ms McCarthy can spout rubbish about her son, autism and vaccinations and all the data becomes so much number gumbo. 

But, telling a human story isn't the only way to bring integers to life.

Good infographics can serve to illustrate a narrative hidden in data. Using carefully chosen data points, charts, text and typography, they can brighten dull numbers and bring clarity to buried insights and patterns. They tell a compelling story with pictures. The best don’t dumb down data, they focus it. Besides, really, nothing is as dumb as presenting a mass of grey numbers and expecting anyone to care.

Human interest stories and great infographics are old tools in a communicator’s utility belt.

But, there's a new kid in town. Interactive infographics. 

Software tools like Qlik and Tableau make it possible for organizations to take reams of data and turn them into pie charts, scatter graphs and maps that you can publish to the Web or even embed on your own site. The magic of these new tools is that these infographics allow your users to play with the data themselves. You could, for example, publish an infographic that shows major pollution sources across Canada. But then your users, by just highlighting certain segments of the charts, could zoom in on a province or, if you had the data to support it, their own city. Or you could publish epidemiological data and then users could drill down on just, say, women aged 19-34 in Alberta, and all the pie charts, bar graphs and so on would change in response to that specific data set.
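Under the hood, that drill-down is just filtering a data set and redrawing every chart from the filtered view. Here's a minimal sketch in Python (pandas and matplotlib, with a tiny invented data set and made-up column names), nothing like Qlik's or Tableau's actual internals:

    import matplotlib.pyplot as plt
    import pandas as pd

    # A toy epidemiological table (entirely fabricated).
    df = pd.DataFrame({
        "province": ["AB", "AB", "NS", "ON"],
        "sex": ["F", "F", "F", "M"],
        "age": [22, 30, 27, 45],
        "vaccinated": [True, False, False, True],
    })

    # The user's selection: women aged 19-34 in Alberta.
    view = df[(df.sex == "F") & df.age.between(19, 34) & (df.province == "AB")]

    # Every chart is then redrawn from the filtered view.
    view["vaccinated"].value_counts().plot.pie(autopct="%1.0f%%")
    plt.title("Toy data: HPV vaccination, women 19-34, AB")
    plt.show()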

Both Qlik and Tableau offer free software trials, and both are relatively easy to use. Qlik is more powerful if you are looking to unpack relationships between multiple data sets. Tableau, however, is much easier to use if you want to get quick interactive graphics up online. 

What this means, in terms of storytelling, is that users can find their own drama in the data, discover stories hidden in the numbers or even identify outliers that might have new stories to tell. "Why are so few women in Nova Scotia getting HPV vaccinations?" for example. 

Since we are in the midst of a Big Data sea, these kinds of tools that allow the visual display of quantitative information will be an asset to organizations, activists and media outlets who want to crowdsource the discovery of the narratives in the numbers.

As with all tools like this, there may be some blowback within your organization from the folks who are the keepers of the data, and/or who see their ownership as a base of power. In the end, I don't think they'll be a real barrier. Like any kind of shift to crowdsourcing (folksonomic tagging, user-suggested stories, comments, shared document analysis) this is an unstoppable force that will soon be the way much data is presented online.

The sooner you learn the tool the sooner you can let your audience discover their own drama in what used to be the drab data that, it turns out, cleans up real nice.  

I, For One, Scoff at Our Robot Overlords

Apart from vampires and the rain-soaked kiss of true love, there is no more persistent cinematic trope than computers ruling the world. They are the Skynet of the Terminator movies, the HAL of 2001, The Matrix of The Matrix and the Johnny Depp of Transcendence. In these films computer intelligence has moved far beyond the cranial capabilities of mere humans. The machines can outthink us, outfight us and can even invade other machines and reproduce themselves. Humans are merely slaves or, at best, fresh batteries.

It was hard not to think of these films after reading a couple of news items in the last month. First, the team that built Siri is now working on Viv, an artificially intelligent personal assistant that makes Siri look like a Speak & Spell. Second, IBM announced that it has created a "neurosynaptic computing chip". The novel chip simulates the brain's neurons, synapses and axons. It can learn to look for and flag "interesting" objects in a video stream. Cut to scene of naked, hungry survivors being hunted like rabbits by metal overlords.

Oh, and one final news item to add some verisimilitude to that last money shot. This week a startup, One Codex, announced it wants to make it easy to search the petabytes of data about the genetic information of living creatures, including humans. Know your enemy indeed.

But, let us, please, put away our foil hats as we consider all this.

It is true that we have, collectively, created a diffuse digital database of the lives, loves, inventions and art of the human race online. For good or ill, it is a wacky fun house mirror held up to the mundane traipse of flawed bipeds over the last 30 years. We can search and we can find, using a device the size of a used bar of soap, just about anything we need to know. Go ahead, ask Siri: "What planes are overhead right now?" if you don't believe me.

It is also true that virtual assistants like Google Now, Siri and Cortana are getting smarter all the time. They will soon be so clever that actually entering a search will seem as quaint as inputting DOS commands on a screen of green text. Search, the way Google makes money at it, will seem like a commodity offering, or even a public utility.

And, yes, robots that can walk, run, skitter and swarm are already among us.

But does all that add up to a future that plays out like a sci-fi blockbuster? I don't think so.

I have every faith that in the next ten years the Web will be a natural language query away and that our vacation plans will be a matter of our digital valet networking with virtual valets worldwide. Maybe that will happen through wearables, maybe via implants. Probably our brains will be augmented as invisibly as our eyes are when we get LASIK eye surgery. I have no doubt manufacturing and service jobs will be passed to robots and the care and feeding of those robots to other robots.

But that is a far cry from a robot revolution, a sentient über mind and a master metal race. The more we understand the human brain, using the tools we've invented to do that, the more we appreciate how little we know about the complex electrochemical dance that gives rise to consciousness. I think that complexity will yield more complexity that in turn unpacks a fractal depth of unfathomable richness. To compete with that using the constraints of Moore's law is like expecting a snail to win a race against a rifle shot.

The Miracle of Materials

Who, three decades ago, would have imagined that the materials that would change consumer electronics would be glass, ABS plastic, sapphire, graphite and aluminum?

Back then folks might have said silicon, tin and polypropylene, since that’s what made up the majority of cheap computers, cassette decks and mobile phones. Back then most consumer electronics weren't cheap, they just looked it. High end materials and elegant industrial design went into luxury cars and watches, not video consoles and desktop computers. 

Sapphire? That was for the faces of Swiss timepieces. Toughened glass, the forerunner of today's Gorilla Glass, was installed in jet aircraft. So was milled and carefully-welded aluminum, which also formed the frames of Cannondale's top-of-the-line mountain bikes.

Now artificial sapphire is poised to become the tough, thin protective layer over the screens of our mobile devices. Beneath it, another form of toughened glass, Willow, will have displays embedded into its thin flexible surface. And, thanks to Apple's industrial milling capacity, aluminum is forming the unibody casing on the most popular laptop on the planet. 

Thirty years ago, graphite was in pencils and dry cells. The nanoscale advances that yielded materials like buckminsterfullerene (buckyballs) and graphene hadn't been developed yet. Graphene, a two-dimensional array of carbon atoms, holds promise as the substrate for a new generation of transparent electronic circuits.

ABS plastic was used in Lego bricks, car trim and as a colourant in tattoo ink. Now it's the go-to material for 3D printers. 

It could well be that in ten years many of our most coveted devices will be translucent, laminated sapphire wafers with nearly invisible graphene-based circuitry. And the cases, if we need them, will be 3D-printed at home. 

The point of this is simple. The technology of the future is never just an extension of the past or present. Revolution sometimes comes in leaps, not baby steps. Yesterday's pencil lead can become tomorrow's circuit board. The gemstone of the past is the screen of today. And the Lego brick of 30 years ago the building block of an industrial revolution. 

Sometimes all it takes is an economy to scale, a breakthrough to cascade, or a single device to set the world on its ear. Sometimes it's just a solitary "what if?" that becomes the Bolivian butterfly wing that sparks the tornado in Kansas that sends us all to the land of Oz. 

On occasion we know it when it happens. The iPhone changed interface design and it was obvious on first sight. WiFi was a clear win. Netflix reconfigured our relationship to television and you knew it the first time you saw it. 

Sometimes the leaps are more stealthy. That's especially true with materials. With chemistry it takes a critical mass to spark a chain reaction.

The Illusion of the Frozen Moment

What's a photograph? If you answered: "a moment in time captured on film or as a digital image" your answer would only be right for the last hundred years or so. Back in 1839, when the Daguerreotype process was announced to the world, an exposure on a silvered plate would take 20-30 minutes. When, two years later, Henry Fox Talbot introduced his calotype method of creating a paper negative, the exposures were shorter, but still measured in minutes, not seconds or fractions of a second.

So, with either method, what was captured was the accretion of time stacked, chemical reaction by chemical reaction, on an exposed plate. Early photographs are a hearty slice of time, not a unique, frozen sliver. The images they catch never really existed as we see them now. They are collapsed movies.

By 1900, with the introduction of the Kodak Brownie, the idea of capturing a moment in time became more real. The faster film in the boxy, cardboard cameras meant that shutter speeds could be as fast as 1/50th of a second. An instant in anyone's books.

But, the Brownie is well over 100 years old. And, today's digital cameras, from smartphones to DSLRs, have more in common with the earliest cameras, and the human eye, than they do with the square little Kodak snapshooter.

Like daguerreotypes and calotypes, modern sensors capture a swath of time, not a discrete moment. When you shoot a high dynamic range (HDR) image with a smartphone, the device's camera is actually taking two or more pictures at different exposures over time. Those images are then combined to make a single image with more detail in the dark and light areas than could be obtained by taking a single shot.
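For the curious, here's a minimal sketch of that idea in Python with OpenCV, using exposure fusion, one common way to blend a bracketed stack. The file names are placeholders, and phone makers each have their own secret pipelines:

    import cv2

    # Three frames of the same scene, captured over time at different exposures.
    frames = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

    # Mertens exposure fusion keeps the best-exposed parts of each frame,
    # recovering shadow and highlight detail no single shot holds.
    merged = cv2.createMergeMertens().process(frames)

    # The result is a float image in [0, 1]; scale back to 8-bit and save.
    cv2.imwrite("hdr.jpg", (merged * 255).clip(0, 255).astype("uint8"))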

When an iPhone 5S takes an indoor image in low light it combines the sharpest parts of multiple images. A variety of low light photography apps (or settings) on smartphones and DSLRs will conflate a few shots to average out the low light noise that plagues some sensors.
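That averaging trick fits in a few lines. Here's a toy simulation (synthetic frames, made-up noise figures) showing why it works: stacking N frames cuts random sensor noise by roughly the square root of N:

    import numpy as np

    # Simulate four noisy exposures of the same flat grey scene.
    rng = np.random.default_rng(0)
    scene = np.full((100, 100), 128.0)
    shots = [scene + rng.normal(0, 10, scene.shape) for _ in range(4)]

    stacked = np.mean(shots, axis=0)
    print(np.std(shots[0] - scene))  # ~10: the noise in a single frame
    print(np.std(stacked - scene))   # ~5: halved by averaging four frames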

Even when you take a single shot with a smartphone, the sensor sweeps a scene like a radar scope. To see that, shoot a picture from a fast-moving car. The "jelly" effect you see is the result.

The frozen moment, it turns out, is an illusion, and maybe one as old as vision itself.

Why? Because today's camera sensors work much like our own eye/brain partnership when it comes to making sense of the world.

We only imagine we take in a scene in the blink of an eye. In fact, our retinas have only a very small part, called the fovea, that can see coloured images in high definition. The rest of the retina can discern only gross shapes and shades.

To see our world in detail, we have to flit our eyes around a scene so rapidly we're not even aware of what's going on. These flits, called saccades, happen several times a second. After each flit our fovea falls on a tiny section of a scene for about three hundred milliseconds. So, what you think is a simple glance at a scene is our brain assembling a jigsaw made of little, sharp images sampled over time.

And, even then, our brains have to route that image to the parietal and temporal lobes and elsewhere in our grey matter to sort out shape, face and context.

Likewise, in modern cameras, the data that hits the sensor is processed nine ways from Sunday by the imaging silicon in the camera. That software will enhance edges, guess at colour balance, sort out exposure data, sweeten contrast and even toss away data it doesn't think your eye needs to still see a great-looking image. It's your camera's equivalent of the foveal/grey matter dance in your head.

So, a moment frozen in time? That's so very 1900.

We Can't Get No Satisfication

Human beings are odd creatures. Example: we satisfice. We exhibit that tendency when we settle on a way of doing things that is, however awkward and convoluted, strangely comforting over time.

Relatives who get to the google.com home page by searching Yahoo for Google are satisficing. No matter how often you explain you could just enter the URL directly, they will continue with their familiar routine. We are creatures of habit, no matter how wonky and convoluted that habit might be. But, and here again is evidence of our oddness, we route around the complexity imposed by others.

If someone introduces a process at work that requires us to click on a different button or save a file a new way, we route around those new complexities like water curving to either side of a boulder in a stream.

In both cases we are craving simplicity. It's just that, strange creatures that we are, we are blind to self-created complexity - in other words, ingrained familiar patterns, even convoluted ones, become worn smooth and frictionless by the grit of repetition.

We are also irrepressibly social beings. With the exception of the psychopaths among us, we, like Blanche DuBois, depend upon the kindness of strangers. When we are stressed our first reaction is to seek social comfort and comparison. When we are excited we want to share. When we fail or do wrong we worry about social opprobrium. We crave (and I say this as an introvert) community, conversation and collaboration.

So, these two realities - humans are social and crave simplicity - should be key guideposts for any kind of computer system or process that hopes to attract the interest and adoption of its users.

But, oddly, many organizations create IT processes and adopt software solutions that turn their backs on these key characteristics of human nature.

And what do the odd social, satisficing creatures in the organizations do? They route around the complexity. Give staff a complex, unreliable and dated web conferencing tool that requires a login and they'll fire up Skype or a Google Hangout instead. Require staff to use a secure but complicated file sharing process and they'll download Dropbox. Require that employees check in and check out Word documents in order to work on them and teams will fire up Google Docs and gang edit a document simultaneously. Create a collaborative space that is unfriendly to social media tools and is little more than a Word document graveyard and groups, yearning to collaborate, will start a wiki.

In other words, organizations that create mechanistic, security-centric and integrated solutions will, ironically, end up with two parallel worlds.

One will be the artificial world where everything is secure, integrated and as efficient as a librarian's filing cabinet.

The other is the real world where security, order and integration are sacrificed by employees who use simple, free tools to actually get work done in a social, frictionless way.

As we've seen in government after government, when the way people want to live and work diverges sufficiently from the way a ruling class tries to make them work, we get a revolution. Ironically, the very tools the people adopt to live their social lives make it easy for them to start a revolution.

Organizations that subordinate the needs of users to the needs of legislation, security and IT infrastructure should prepare themselves for placards in the cafeteria. Not that: "What do we want?" "Tools for us!" "When do we want them?" "Let me search yahoo.com to Google that for you!" is much of a rallying cry.

What Comes Around

Boy, what a difference 30 years makes. In 1984 Apple released its famous “1984” advertisement. In the ad a blonde-haired athlete hurls a hammer through a giant screen. On that screen a “Big Brother” character drones about “information purification directives”. The target of the hammer, and the ad? IBM.

Just before the preview of the ad, Steve Jobs told a crowd at a 1983 keynote:

“It is now 1984. It appears IBM wants it all. Apple is perceived to be the only hope to offer IBM a run for its money. Dealers initially welcoming IBM with open arms now fear an IBM dominated and controlled future. They are increasingly turning back to Apple as the only force that can ensure their future freedom. IBM wants it all and is aiming its guns on its last obstacle to industry control: Apple. Will Big Blue dominate the entire computer industry? The entire information age? Was George Orwell right about 1984?"

Well, just last week Tim Cook announced a surprising deal. IBM is not only going to sell iPads and iPhones to enterprise clients. It’s also going to develop over 100 apps that will allow those clients to dive into big data using their mobile devices. Was George Orwell right about 2014?

Meanwhile, early the same week, Microsoft’s relatively new CEO, Satya Nadella, in a long, biz-geek speak memo, announced a new focus on the cloud and mobile and hinted at layoffs. By week’s end the other shoe, and 18,000 jobs, dropped in Redmond. That's 14 per cent of Microsoft's workforce, and the biggest layoff in the company's history. Microsoft executives are claiming they’re still bullish on hardware (including Surface tablets, the Xbox One and smartphones). But, industry observers expect the company to give up on hardware and focus on the business software and cloud side. Just like IBM does.

And, of course, on a Canadian note, none of this is any good for Blackberry, which saw its stock take a four per cent hit on news of the IBM/Apple deal.

So, what’s going on in this topsy-turvy world? Microsoft is suffering on the PC and hardware side, and is tossing aside the devices and services religion of Gates and Ballmer. Apple’s in bed with arch-rival IBM and stands to become the purveyor of mobile devices that can see deep into some of the richest data mines in the world.

First, Microsoft. The Redmond-based company tied its software to cheap PCs. But, its software alone failed to offer a value-added difference to those PCs. So, manufacturers started a race to the bottom in order to carve out some kind of margin from what had become a commoditized business. The result was cheap, creaky, plastic junk boxes all running the same OS. Meanwhile Apple had OS X, an operating system people were willing to pay more for. As a result, MacBooks could be the solid, beautiful devices PC manufacturers could only dream of emulating.

Then, Microsoft introduced Windows 8, a dreadfully marketed, bifurcated and confusing operating system that enterprise customers ignored almost as thoroughly as average consumers and smartphone shoppers did. Which is why, of course, a majority of the layoffs will come from the beleaguered Nokia division Microsoft picked up in April for just over $7 billion. It was a bad acquisition Ballmer made as a lame duck CEO on his way out.

Microsoft left itself, in the consumer space, with cheap junky laptops running an OS few customers wanted. And it was squeezed at both ends. Apple owned the high end market and upstart Chromebooks, which educators fell in love with, ate the sub-$400 space for breakfast. Plus, on the mobile side, it came to the table too late with a confusing collection of handsets and a half-baked mobile OS.

Meanwhile, over at Apple, Tim Cook was bringing his supply chain genius inside Apple and making the Cupertino company run like a well-oiled, fully-integrated Swiss watch churning out some of the best mobile devices in the world like clockwork. Just the kind of partner IBM needed.

So, Microsoft is now going to find itself in a bit of an enterprise catfight with Apple and IBM on the other side of the table. We just get to sit back and watch the hammers fly.

Scaling the Rich Media Mountain

The PDF was invented 21 years ago. That means a vast majority of the content that companies put online would have been right at home back when "I Will Always Love You" was in the Billboard Top Ten. It’s like two decades of technology never existed. Why?

One of the main reasons is a lack of time. It's far faster to generate a PDF of a report than to make the effort to produce rich, interactive media more appropriate for online and mobile devices. 

So, clearly companies need to think through how their rich media production is scalable and sustainable given the organization's capacity. Either that or keep producing content as if the advances of the last 20 years never happened. 

Let's tackle scalability and capacity separately. Scalability simply means that you begin as you mean to go on. It involves using tools, workflows and templates that make it possible to quickly and repeatedly put out quality rich media without having it suck your resources like a black hole in a Dyson. 

The biggest barrier to scalability is captured in the old adage: "great is the enemy of good". If you begin creating rich media (say, a video) and try to make it an "all-singing-all-dancing" high production value piece it will take you so much time and drain so many resources it will be a long time before you have the energy or budget to do it again. And, if it happens to be the first video you produce, you’ve set the bar so high that you’ve effectively destroyed your ability to scale. You can’t repeat the feat on a regular basis. 

The solution? Avoid great. It makes more sense to most organizations to produce good but not great video in a way that you can do week-in and week-out without each video taking two days or more to produce. You do that by keeping the videos under two minutes, using simple intro and extro clips (Keynote is great for this) and keeping the b-roll to a minimum so that your post production time is decreased. Better that you get a good video a week online than a great video online once every six months.

Scalability is all about being pro enough to sacrifice the ideal for the doable. 

That doesn't mean you can't have great-looking videos. You can - just keep a budget to outsource those occasional gems to professional videographers.

Next, capacity. In order to produce rich media in sufficient volume, you need to remove production bottlenecks. If only one person in your organization can shoot video (let’s call her Jenni), then Jenni will be your bottleneck. Your capacity can be no greater than Jenni’s capacity. And, even if your communications team can all shoot video but Jenni is the only one who can edit it, then, once again, Jenni’s capacity is the entire company’s capacity.

You don’t want to give Jenni all that power. It will just go to her head.

It makes more sense to train your entire communications team to shoot and edit good, but not great, video. They won’t all do it, all the time. But, every time they do, you’ve increased your capacity.

Now, if you increase your scalability and your capacity, you’re in a better position to create good rich media on a regular basis.

If your next question is: “How are we supposed to do all this rich media stuff along with all the other things we do?” You’re asking the wrong question. The better question is: “What are we doing now we wouldn’t have to do if we were all producing good rich media?”

Ask Jenni, she seems to be on top of this.

Wrapping the Future Around Your Wrist

Okay, I think I’ve got a sense of what Apple might do for an iWatch. This has taken me a while. But, now we’ve seen a majority of the smartwatches other companies like Samsung, LG, Pebble and Basis have put out. And, Apple's most recent major hire is a dead giveaway.

The recent Android-based smartwatches are the most interesting here, but they are all clunky, notification-based and meant for a male wrist. None of them would tempt a non-nerd, non-watch-wearing consumer to change their habits. The Moto 360, with its round face, comes closest, but it has the thickish, oversized look of geek apparel.

On the fitness side, Jawbone, Shine and Fitbit have compelling products, but none of them replaces a watch; they just occupy the other wrist.

And we’ve also got a sense, from its hires and ads, about where Apple’s interests lie. Clearly biometrics are important, as the recent announcement of the HealthKit app, the recent “Powerful” ad series and the hiring of medical experts on biometric measurement demonstrate.

But, most interesting was the announcement this week that Apple is now the employer of Patrick Pruniaux, formerly Tag Heuer's vice president of sales and retail. It seems clear that Pruniaux will help launch the iWatch.

And, before Pruniaux, Apple snagged Angela Ahrendts, the former CEO of Burberry, to head up its retail division. Ahrendts turned around the fortunes of the formerly-stuffy Burberry by engineering a shift into upmarket and trendy.

And, here’s the final piece of the puzzle for me: Jony Ive and his love of miniaturization, Swiss watches and precision metal machining.

I’ve just finished reading Leander Kahney’s biography, Jony Ive: The Genius Behind Apple's Greatest Products. It’s clear that Ive loves the challenge of getting technology into the smallest possible space. And, his position at Apple, even after Steve Jobs's death, is such that he can push the company’s engineers to breaking and beyond to serve design.

He was also the champion of making use of computer numerical control (CNC) milling machines to carve complex housings for iPhones and MacBooks out of single slabs of aluminum. It was a "unibody" technique he perfected after observing how high-end Swiss watch manufacturers used CNC milling to produce precision casings for their exquisitely-detailed, limited edition chronometers. You know, companies like Tag Heuer.

So, here’s my theory. In order for Apple to seriously consider making an iWatch, it would need to be the same kind of product the Macintosh, the iPod and the iPhone were - category changers. None of the existing smartwatches come near that. They don’t reimagine the watch, and they aren’t fashionable enough to appeal to a broad group of users.

I think the iWatch will be. I am hopeful it will be as revolutionary as the iPhone. I think with Ive’s design genius, the deep, cross-device integration now possible in iOS, the advances in chip fabrication, battery life and CNC milling, Ahrendts’ retail smarts and Pruniaux’s ability to sell high-end fashion masterpieces, we will see the birth of a beautiful wrist computer. It will be both a biometric miracle and a piece of jewelry men and women will covet. It might be to the iPhone what the iPhone was to the laptop.

And, I can’t imagine it being a single device. I think we should think instead of a line of Apple iWatches, of different sizes, features and casings. Watches that will give both Swatch and Tag Heuer a run for their money.

I could be totally wrong, but my gut sense tells me Apple plans to knock this out of the park. If they do, Samsung, Motorola and Google will all look like punters who reached for the early geek win instead of taking hold of the general public’s wrist and wrapping the future around it.

Facebook and the Bald-headed Kid

There’s this cartoon you probably know. It features a big-headed, bald kid named Charlie and a girl named Lucy. Again and again in this amusing comic, Lucy promises not to move the football she’s holding while the Charlie kid runs to kick it. And, again and again, to great bittersweet hilarity, she does, and little Charlie goes flying, landing on his back, chagrined but, we understand, still trusting in Lucy’s innate goodness. We love that adorable sap, Charlie Brown.

And, when it comes to Facebook, aren’t we all just a dumb pack of Charlie Browns? Again and again Mark Zuckerberg yanks the ball of privacy rules, data gathering or social graph algorithms away from us, promises not to next time, and then, whoops, we’re on our backs, chagrined but willing to give the good ol’ social network one more try.

But, this time, it’s a bit different.

This time Facebook conducted a psychology experiment on nearly 700,000 unsuspecting users. The Princeton researchers who undertook the study with Facebook wanted to see if emotional contagion could spread in a social network the way it spreads in real life. In other words, can being subjected to positive or negative status updates affect your mood the way being around positive and negative people can? In order to test that, Facebook diddled the newsfeeds of the unsuspecting users to show more positive or negative posts. Then the researchers monitored those naive users' posts to see if they, in turn, reflected the mood of the jiggered newsfeeds. All this without the explicit and informed consent of the 700,000 users.

In other words, to use another Peanuts reference, the doctor was in, but the patients didn’t know they were being examined.

Facebook argues that its long-winded terms of service allow for the use of user data for research. But, the experimental protocol for this kind of research requires that researchers “obtain the informed consent of the individual or individuals using language that is reasonably understandable to that person”. Come on. People don't even read terms of service, and if they did, they'd find no mention of this sort of experiment, which doesn't just study data, it alters feeds. And, if a subject is being duped, it is the researcher’s duty to inform the subject as soon as “reasonably possible”. In this case, there seems to be no evidence that users were told that they were part of an experiment or that their feeds had been manipulated.

Facebook also argues that the newsfeed is manipulated all the time by a “secret sauce” algorithm. And, besides, what they did was no different than standard A/B testing that websites do all the time. The latter argument is just willful stupidity. If you can't tell a Pepsi Challenge from a psych test, you shouldn't be allowed sharp objects.

And, what did the research uncover? A tiny bit of evidence that emotional contagion could happen on Facebook, and that when users got more bland posts in their newsfeeds they were disinclined to post much.

So, what’s the problem with all this?

To go back to Charlie Brown and Lucy, it’s just more evidence that Facebook doesn’t give a flying, flubbed football kick about its users. It cynically toys with privacy, always to its advantage, lies about doing it, buries privacy controls and generally acts, always, in the best interests of its real customers, advertisers. We're just so much product.

None of that is new. But here we see Facebook secretly toying with people’s emotions (albeit with little effect, which is, really, beside the point, since the outcome wasn’t known in advance). And we see them probing how best to get folks to post more often. Plus, unless we really are trusting bald-headed kids, we have to ask what other experiments Facebook is carrying out, and for whom. Mood control is nothing to take lightly.

The lead researcher on the study, Adam Kramer, now suggests maybe the research wasn’t such a good idea after all. Yeah, I think that’s about right. Because sometime Charlie Brown’s foot is going to bloody Lucy’s nose, and nobody will care about the football anymore.

Or, maybe I’m just the bald-headed kid who thinks that this time people really will catch on.

Of Phablets, Pop Tarts and Sweet Spots

What is up with phablets? Everywhere I look I see folks with big ass phones these days. Phones the size of Pop Tarts, or single-serving styrofoam trays. Phones that make other phones want to buy red sports cars and get blonde trophy phones to make up for a lack of diagonal screen dimension.

First off, phablet? Let’s start with a definition. Phablet is an awkward contraction of phone and tablet. Most tech folks say large smartphones, or phablets, have a screen that’s 5.5 inches on the diagonal or bigger. The poster child for phablets is the Galaxy Note, which weighs in at 5.7 inches. When the Note first appeared it was mocked because you’d look like you were taking calls by putting a box of chocolates on your face. But it wasn’t really a phone. A phablet isn’t so much a phone as a small tablet, the baby brother to a Nexus 7 or an iPad Mini. But, it has all the cellular connectivity of a smartphone. That means it has an almost-always-on connection to the web. In developing markets like Brazil, Russia, India and China, shipments of over-5-inch phones rose 389 per cent from 2013 to 2014.

A report from Opera Mediaworks suggests that phablets, wherever they’re purchased, are heavily used around town for map checking, emailing, texting and social media check-ins. Then, when users get home, they’re opting for a phablet over a tablet for social media couch surfing.

And a recent survey of UK mobile device users by Ofcom indicates that 47 per cent of the respondents aged 16-24 would rather give up their computer, laptop, tablet, radio or print publications before they’d let anyone tear their smartphone away from them.

So, why are phablets so hot now? I think it’s just taken a few years for the virtuous circle of consumer needs, industrial design and innovation to spiral in on the right size for a mobile computer. The first smartphones were the size they were because big, hi-res screens were expensive. And, they were aping small devices called mobile phones. But, as anyone who uses a smartphone knows, the phone is the least interesting part of a smartphone.

Few people with phablets use the device much as a phone. They’re texting, or Snapchatting or Instagramming instead. And, sometimes emailing. Users were finding that a sub-five-inch screen wasn’t ideal for typing on a virtual keyboard, which is what you do when you have a pocket computer, not a portable phone, in your hand.

Now, even Apple, which has resisted the trend to phablets, is rumoured to be getting on board. We should see an Apple phablet sometime this year.

So, despite the goofy name, phablets may wind up being the ideal mobile computer for a generation that talks with their thumbs.

It’s fascinating to me that it’s taken a half decade for the smartphone to circle and land on an ideal size. But, as science fiction author William Gibson wrote back in 1982, “the street finds its own uses for things”. And, the street also finds the right size for the things it needs.

Magazines, Bound for Digital

Last week I shot video and stills for Magazines Canada at its annual MagNet 2014 conference. The event brought together just over a thousand magazine publishers, editors, writers and business folks to discuss what they’d been up to, drink at sponsor-hosted mixers and speculate about what digital means for an industry awash in glossy stock and discount mail rates. I got to eavesdrop on a lot of the buzz and biz talk.

Here’s one of the hot topics: smartphones and tablets. Panelists in a session shared what Rogers, The Walrus and St. Joseph Media have learned about putting their publications on tablets. A couple of things jumped out at me here. The Google Play store (for Android devices) wasn’t a great place to sell magazines. The real moneymaker was the Apple Newsstand and App Store. This is interesting because, in Canada, Android devices are starting to overtake Apple phones and tablets. But other app developers have seen the same thing: Android users tend towards free apps. Apple users open their wallets.

And, after the panel, Larry Wyatt of St. Joseph Media told me that if he were developing mobile magazine experiences right now he’d probably skip tablets altogether and go after smartphones. That’s a fascinating observation because we’re starting to see more and more people using large phones (over 5.5 inches) not only as a replacement for laptops, but as a substitute for tablets. In fact, a recent Ofcom survey found that the device most users aged 34 or younger would miss most is their smartphone, not their tablet or laptop.

This is going to be tricky for magazine publishers since the “big ass” phone market, to date, has been the domain of every other smartphone maker but Apple. That means that the most profitable digital newsstand is owned by the one manufacturer who doesn’t make a phone the size that is dominating the user base. If Apple does release a large phone later this year, that will make things a bit easier but the dichotomy illustrates how fickle the market is and how hard it is for publishers to find a “holy grail” that will deliver a solid, profitable product to a wide mobile audience.

Second, brand extensions. Most magazines started life as print-only, editorial products. But, at the conference, publication after publication proudly announced itself as a business that sees a magazine as only one revenue stream, and not necessarily the most important one. In fact, Matthew Blackett of Spacing used the conference to announce that the magazine is pairing its editorial offices with a storefront. The storefront (at 401 Richmond in Toronto) will sell Spacing-branded, Toronto-centric items like the magazine’s popular subway station buttons, t-shirts and other urban design products. Imagine, even a few years ago, a newspaper or magazine putting its editors in the same room with a store. Back then it would have made about as much sense as putting them in the same room as a free bar.

Spacing also makes a good deal of its revenue from hosting events. So do Toronto Life and The Walrus. In fact, Shelley Ambrose, the executive director and co-publisher of The Walrus, said that the Walrus Foundation makes about a third of its revenue from events. And Toronto Life editor-in-chief Sarah Fulford works hand-in-hand with event planning to make sure the Toronto Life brand is properly represented and extended.

And finally, video. M. Scott Havens, the senior vice president of Digital at Time Inc., kicked off the conference with an unambiguous call for magazines to embrace mobile and to disrupt the $75 billion video advertising market. “So, we all need to get over there, those of us who have a brand that can use video,” he told his audience. He said that Time Inc. has 15 recurring shows in production now, with 30 additional ones under consideration.

Prior to taking the digital reins at Time Inc., Havens headed the transformation of The Atlantic from a magazine with a 156-year history into an online brand that also makes great use of video. And, it’s no slouch in the live event arena either.

So, smartphones, brand extensions, storefronts and video. Odd stuff for magazine folks to be discussing. But it was clear that the magazines that really want to survive understand they’re in a business that puts out a magazine and a lot more. The ones who cling to their tree-based roots? They won’t be putting out a magazine for much longer.

Solar Roadways - Just Porkpie in the Sky

This week, on Facebook, I joked that I'm going to launch an Indiegogo campaign for growing all of Canada’s vegetables in gardens atop porkpie hats. I was reacting to the depressing news that the Solar Roadways project got more than $1.8 million in donations from thousands of individuals. What were they supporting? The unicorns-and-Skittles dream of Julie and Scott Brusaw to replace the highways, parking lots and side streets of America with smart, hexagonal solar glass tiles. The couple believe their solar roadways can decrease our reliance on fossil fuels, power the U.S. with the clean energy of the sun, melt ice and snow, embed road signs and make U.S. roads safer.

I think the Idaho couple are sincere and well-meaning, but their plan makes as much sense, well, as growing all our vegetables on the tops of our heads.

Let’s break it down a bit. Solar panels work by converting solar energy into electricity. They only do that, of course, when the sun is out. And even good panels convert only about 15 per cent of solar energy into electricity. To be at their best, those panels need to be away from shadows, in areas with lots of sun, and kept clean and transparent. You know where a really, really bad place for a solar cell is? Under a car on a dirty highway. Where’s a dumb place to put a solar panel during the day? On the surface of a parking lot covered with opaque cars. And where’s an odd place to put solar panels at night, when there are no cars left to block a sun that has gone down? A parking lot.
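To put some rough numbers on that, here’s a back-of-the-envelope sketch in Python. Every figure in it is my own illustrative assumption - the insolation, the dirt, the shading - not a spec from the Solar Roadways project:

```python
# Back-of-the-envelope: daily output of one square metre of road-embedded
# solar panel. All numbers are illustrative assumptions, not project specs.

insolation = 4.0    # kWh per m^2 per day on a horizontal surface (sunny-ish climate)
efficiency = 0.15   # ~15% conversion, per the figure above
soiling = 0.7       # assumed loss to dirt, tire grime and textured glass
shading = 0.7       # assumed loss to cars and trucks sitting on the panel

kwh_per_m2 = insolation * efficiency * soiling * shading
print(f"{kwh_per_m2:.2f} kWh per square metre per day on the road")  # ~0.29 kWh

# For scale: a clean, tilted rooftop panel that nothing parks on
rooftop = 5.0 * 0.15  # tilted panels catch more sun, stay unshaded
print(f"{rooftop:.2f} kWh per square metre per day on a roof")  # ~0.75 kWh
```

Under those assumptions, the road panel delivers well under half of what the same panel would produce on a boring old rooftop.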

That sound you hear is the loud thump of a million civil engineers all hitting their foreheads against their desks at once.

The couple claim that heating elements in the glass cells can melt ice and snow on their solar roadways, so snowploughs will be a thing of the past. They appear to have no real idea how much energy it takes to melt snow (hint: a metric buttload). And snow and ice are real issues in northern climates (where solar energy is low) and in the winter (when the days are short). Using solar cells to melt the ice sitting on top of them is about as practical as using a drinking straw and an asthmatic six-year-old to drain an Olympic-sized swimming pool.
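How big a buttload? Here’s a rough sketch of the energy budget for one modest snowfall on a single lane-kilometre. Again, the figures are my own assumptions, except the latent heat of melting ice, which is just physics:

```python
# Rough energy budget: melting one 5 cm snowfall on one lane-kilometre.
# All figures except the latent heat are my illustrative assumptions.

lane_area = 3.5 * 1000   # m^2: one 3.5 m lane, 1 km long
snow_depth = 0.05        # m: a modest 5 cm snowfall
snow_density = 100       # kg/m^3: fresh, fluffy snow
latent_heat = 334_000    # J/kg to melt ice at 0 C (physical constant)

snow_mass = lane_area * snow_depth * snow_density      # ~17,500 kg
melt_energy_kwh = snow_mass * latent_heat / 3.6e6      # joules -> kWh
print(f"{melt_energy_kwh:,.0f} kWh just to melt it")   # ~1,600 kWh

# Compare with what that lane's own panels might produce on a short, grey
# winter day (assumed 1.5 kWh/m^2/day insolation, 15% efficiency, 50% derating):
winter_output_kwh = lane_area * 1.5 * 0.15 * 0.5
print(f"{winter_output_kwh:,.0f} kWh of winter output per day")  # ~390 kWh
```

So melting a single light snowfall eats roughly four full days of the road’s own winter output - and that ignores the heat lost to the air and the roadbed, and the awkward fact that it’s dark and overcast while the snow is actually falling.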

The solar panels are hexagonal with spacing between them. Do you know why roads aren’t already paved with tiles? Because the uneven weight on them as cars and trucks pass over works them loose and breaks down the subsurface. It's like when you were a kid and you wiggled a loose tooth free with your tongue, except that your tongue has eighteen wheels and is carrying ingots. 

A promotional video for the project shows the couple loading crushed, coloured glass into a wheelbarrow, as if they are going to make their clear, tempered glass panels from the multi-hued beads. While I do think the couple are well-intentioned, that scene made me mad and left me wondering whether they are scam artists after all.

It is ridiculous to think that the chemical and thermal processes needed to turn coloured junk glass into high-strength, clear panels would be in any way economically feasible. And let’s not even start on the end-to-end carbon footprint of “green” panels made from recycled glass. Because, remember, the energy to melt the glass at high temperatures has to come from somewhere.

But the final nail in this wacky notion’s coffin is that the video fails to mention anything about energy storage and transmission. Energy from solar cells is typically low-voltage direct current, which is notoriously hard to transmit over long distances. That’s why Edison’s electric schemes for the U.S., which used direct current, were defeated by the alternating current systems championed by Westinghouse.
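The arithmetic behind that is simple and brutal: for a fixed amount of power, the current in a line is P/V, the heat lost in the wires is I²R, so losses scale with 1/V². A quick sketch, with made-up but plausible numbers:

```python
# Why low-voltage transmission is a loser: for fixed power P and wire
# resistance R, current I = P/V and line loss = I^2 * R, so loss ~ 1/V^2.
# The numbers below are illustrative assumptions, not grid specs.

def line_loss(power_w: float, volts: float, resistance_ohms: float) -> float:
    current = power_w / volts
    return current**2 * resistance_ohms

P = 10_000  # 10 kW to move
R = 1.0     # ohms of wire between panel and load

for v in (48, 240, 10_000):
    loss = line_loss(P, v, R)
    print(f"{v:>6} V: {loss:>9,.1f} W lost ({100 * loss / P:.2f}% of the power)")
```

In this idealized arithmetic, at 48 volts the wire eats more than four times the power you’re shipping - meaning you simply can’t do it - while at 10 kV the same wire loses a rounding error. That’s why real grids step voltage up, which panel-level DC, historically, couldn’t do easily.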

So, you’d need to convert the D.C. to A.C., store power for night-time use (requiring millions of batteries) and then rig up an electrical transmission system that runs along all the highways. The cost of all that is astronomical. Plus, all the workers who now earn a living laying hot asphalt would be replaced by electricians, who are, you know, really inexpensive by the hour.

Worst of all, more practical alternatives sit literally a few metres away: solar panels on the roofs of car parks, or panels tilted at efficient angles in arrays alongside the highways themselves. Which is exactly what Oregon has done.

So, given the blissful naivety of this doomed and benighted scheme, why did thousands of us share the videos on Reddit, Facebook and other social networks? And why, for the love of sweet baby Jesus on a Triscuit, did so many give it money? I think there are two reasons.

First, I don’t think most folks who shared the video know enough of the practical science to deconstruct the claims. But, more importantly, sharing it wasn’t about supporting a practical idea; it was about supporting the belief that one simple, out-there idea could solve all manner of social ills. A bullet as silver as a herbal cancer cure or a miracle diet.

It wasn’t about the practical. It was about the magical. In fact, many of the posts I saw about the scheme said, “Share this if you believe in it.”

But belief has nothing to do with success when success runs headlong into material science, thermodynamics and civil engineering. We may all believe that clapping our hands loud enough will make the tarsands disappear, or that a drum circle will collapse the Harper regime. But magical thinking isn’t helpful. And magical ideas aren’t worth spreading, funding or defending. We should all keep our powder dry for the ideas that can really make a difference.

And on that note: porkpie-hat vegetable gardens - now there’s an idea you should all get behind.

 

Why Science Isn't a Steam Table

It's been an interesting couple of weeks for science and belief on the Internet. Let's start with the John Oliver clip that made clever mincemeat of climate change deniers. Oliver called out the key fallacy in how mainstream media treat folks with fringe ideas about global warming: the notion that the unsupported ideas of a slim minority actually balance out a mountain of scientific evidence against their claims. And, in a clarion call for common sense, he told news outlets who carry out polls on opinions about climate change: “Who gives a shit? You don’t need people’s opinion on a fact. You might as well have a poll asking: ‘Which number is bigger, 15 or 5?’ or ‘Do owls exist?’ or ‘Are there hats?’”

Millions of people, sick of the Fox-News-fuelled, right-wing talking points, viewed and shared the clip gleefully. It was heartening to see so many people passing around a clear, evidence-based counter to a common media-borne notion. And it was a guerrilla propaganda masterpiece for the skeptics movement. Why? Because one of the most important lessons you can learn about the process of science is that evidence trumps opinion - the rantings of a few don’t balance the scale against the careful, repeatable research and experimentation of the many. So Oliver not only called bullshit on he-said/she-said journalism and the lazy notion of false balance, he also called out the conceit that opinion trumps data. Nearly three million people got the message or, at least, amplified it on social media.

But then something odd happened. Peter Gibson, a professor of gastroenterology at Monash University, has done work on gluten sensitivity. In fact, his 2011 research helped strap a jetpack to the back of the gluten-free movement. But last year Gibson carried out a further, more rigorous study. The results? He determined that he was probably wrong about gluten sensitivity.

He could find no evidence of gluten intolerance in his 35 test subjects, whom he screened to make certain they did not suffer from celiac disease. The subjects were fed a series of diets without knowing which contained gluten, and Gibson could not associate any of their reported symptoms with the presence, or absence, of gluten. In short, gluten sensitivity in folks without celiac disease is probably a myth, or a social construct. This is a big deal, given that gluten-free diet products are a multi-billion dollar industry and that about 30 per cent of the population thinks reducing gluten is good for their health. The reality seems to be that it’s not. News of Gibson's freshly published research broke just after the John Oliver clip went viral.

So, you would think that the folks who shared the global warming clip would be equally keen to embrace this remarkable news. Because, you know, lesson learned: your opinion about gluten sensitivity isn't equal to the results of a carefully done experiment.

But, no. The buzz about the study was the opposite. A lot of the online chatter in my corner of the web claimed the study was wrong or flawed. One of the most interesting responses was: "a study of 35 people really doesn't conclude anything … People know their own bodies." That’s a strange, but common, response. Strange, because you know what a really small study is? You reporting on yourself. That’s just a lousy experiment with a cohort of one. So, if that’s your counter to a rigorous, double-blind, controlled experiment conducted by a gastroenterologist, you lose.
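If you want to see just how lousy a cohort of one is, here’s a toy simulation. The numbers are invented and the "gluten effect" is set to exactly zero, so anything the experiment "finds" is pure noise:

```python
# Toy simulation (my invented numbers, not Gibson's): symptom scores are
# pure noise, unrelated to gluten. How big a "gluten effect" appears by chance?
import numpy as np

rng = np.random.default_rng(42)
days, trials = 14, 10_000  # 7 "gluten" days, 7 "placebo" days per subject

def spurious_effect(n_subjects: int) -> float:
    # average size of the apparent gluten-vs-placebo difference
    scores = rng.normal(0, 1, size=(trials, n_subjects, days))
    diff = scores[:, :, :7].mean(axis=2) - scores[:, :, 7:].mean(axis=2)
    return float(np.abs(diff.mean(axis=1)).mean())

print(f"cohort of 1:  {spurious_effect(1):.2f} (in units of day-to-day noise)")
print(f"cohort of 35: {spurious_effect(35):.2f}")
# The one-person "experiment" routinely reports an effect roughly six times
# larger than the 35-subject average - and all of it is imaginary.
```

That’s just the law of large numbers doing its job: averaging over 35 people shrinks random flukes toward zero, while a single person’s good and bad days can look like a pattern all on their own.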

So, scorecard time: lesson learned about climate change, ignored about gluten.

So, what's going on? Why would millions of people share a video that demonstrates the illogic of being opinionated about a fact, and that exposes the false balance of lining up fringe opinion against overwhelming countervailing evidence, and then, within two weeks, fall for those very same fallacies like lemmings off a cliff?

I think part of the answer is, of course, that global warming fits with their ecological paradigm, so of course they’re more apt to approve of, and share, a parody that mocks the right. But when the same lessons are applied to something more personal, something that challenges the rationality of their own beliefs, it’s a whole new game.

But it is also true that most people treat science like the steam tables at the Mandarin - picking and choosing the choicest, tastiest bits. But here's the thing. You can't just pig out on a giant, delicious plate of global warming and ignore the less appealing side dish of imagined gluten sensitivity. Science isn’t a buffet. You either apply its principles consistently or you don't. And when you don't, you’re just passing on belief and opinion. And three million of us shared what we think of opinion. Right?

 

The NYT, Innovation and the Pampered Newsroom

Like just about everyone else keenly interested in online journalism, I have devoured the New York Times report, Innovation. The document is 1) a situational analysis of the NYT in comparison to its online competitors, 2) a deconstruction of the NYT’s online efforts, wins and shortfalls to date, 3) a sly dissection of NYT culture and 4) a sketch of the way forward for the Grey Lady, who seems to be wandering about the online cocktail party not really sure who to talk to or if she’s dressed appropriately.

I’ve read the complete report, twice now, and I think it's a tough and thorough piece of work. It should be required reading in every j-school that cares about what comes next. But, here’s what surprised me most about the document: I was gobsmacked by how coddled NYT journalists are when it comes to dealing with the issues of online journalism.

There are, the report reveals, Consumer Insight, Technology, Digital Design, R&D and Product teams at the NYT that all, in one way or another, wrestle with connecting with online audiences. They design new NYT-branded products and create user experiences that make sense for an increasingly mobile audience. As I count it, those teams are 630 people strong.

And it’s clear from Innovation that all of these units do their work in the face of grudging help, ignorance, rebuffs or outright fear from the newsroom and its leaders (known in the document as “the masthead”). The report also reveals that there are editors at the paper who know little about the Web and, my words here, wouldn’t know a decent interactive feature if it leapt up and bit them on the ass. They also let other folks handle parts of their social media outreach, spend a remarkable amount of time and energy on the contents of Page One (and peg their success metrics to it), are obsessed with the home page of their website and remain culturally isolated from the rest of the company by a “Chinese Wall” that separates “church” and “state”.

This is remarkable, really. Here we are, more than two decades into the World Wide Web, and editors at one of the finest papers in the world are still behaving like they are in a community theatre production of “The Front Page”.

Of course, the historical reason for “The Wall”, as the report calls it, made sense. It was to keep a paper’s journalism free of influence from its moneymaker, the ad department. But the departments the NYT journalists have eschewed include the above-mentioned Consumer Insight, Technology, Digital Design, R&D and Product teams. That makes no sense at all, unless the rank-and-file NYT journalists really couldn’t give a smoking rat’s butt about online - which seems to be the between-the-lines message of the report.

In contrast, Pierre Omidyar’s First Look Media brings all those teams together to create a digitally native publication that the Innovation report admits is one of the young startups kicking the Grey Lady’s ass online. Other feisty young’uns the paper fears include Buzzfeed, Circa and Vox Media. Digest that, conservative fans of the Times.

Innovation wraps up with a series of suggestions for making the NYT a truly digital-first publication. I think they are smart, well-intentioned, and doomed by generations of baggage and tradition. Plus, since more than 75 per cent of the NYT's revenue still comes from print (in part due to its shortsighted online paywall), there is little incentive for the paper to execute on the recommendations.

Add to that the recent defection of Aron Pilhofer, the managing editor for digital strategy at the New York Times, who just became the executive editor of digital for the Guardian. Pilhofer forged the digital team at the Times. His departure was preceded by the firing of Jill Abramson, the now-former executive editor of the paper. Both were mentioned as key players in the move to a digital-first strategy.

And, finally, the report was especially sad reading when coupled with another report from last week, the unfortunately-named The Goat Must Be Fed. That report, from Duke University’s Reporters’ Lab, shows that the vast majority of U.S. newspapers are doing little, if any, online innovation, relying on papers like the New York Times to lead the way. This is known in the news business as bitter irony.

 

In Praise of Looking Down

Last week a video called Look Up made the social media rounds. It features an earnest young spoken word poet, Gary Turk, extolling the virtues of ignoring the screen in your hand and enjoying the world, and the people, around you. It's a well-made video but is as subtle as a sack of hammers to the nose. And, I beg to differ.

Of course I agree that we should acknowledge the people and places in the moment. But I disagree that using a handheld computer prevents that. And, I think the poet generalizes about the value of face-to-face contact and makes sentimental mush out of contact with random strangers.

But, let's start with the straw man that somehow today's technology has created a wholesale shift in the way humans interact. Smartphones as Ground Zero for downcast zombies.

Here's a lovely Punch cartoon from 1906 that shows that back then folks had the same concerns about sweethearts, the telegram and looking down in public. And here's a photo from 1946 showing commuters buried in their newspapers. Technology making anti-social drones of us all - it's a timeless, easy trope. I'm sure there were Edwardian and Cold War poets who were fretting about Morse code and newsprint.

Next, let's deconstruct the notion that interacting with other human beings in the flesh trumps all other forms of social engagement. Some of it does, certainly. Like the comfort of a friend at a funeral or the affectionate hug given to a sister on the birth of her first child. But I've ridden enough commuter trains to know that a good deal of human interaction is loud, endless conversation about Dancing With the Stars or how it took forever for Bill to get the washer in the bathroom sink changed. And, honestly, when I hear that junk-food chatter I wish the speakers would bury themselves in their phone screens and shut up.

As an introvert I often feel a lot of face-to-face conversation is overrated and that I'm trapped in an extrovert’s world where even the smallest verbalized brainfart, inane observation or compulsive oversharing is valued more highly than even 30 seconds of companionable silence. So, no, Mr. Turk, I don’t relish the idea of striking up conversations with strangers who are just as apt to be tedious, babbling bores as former prima ballerinas now in their golden years and willing to share bon mots about their halcyon days in Russia.

And, finally, the extroverted, idealistic Mr. Turks of the world have no idea why I’m looking down at my phone, anyway. Perhaps I'm recording the ambient sounds around me, which I could do better if everyone would just keep their opinions about Miley Cyrus' baby bump to themselves. Maybe I'm sharing a photo of a lovely detail, or the play of shadow on a face, or the interaction of colour and line that is there, and then gone.

Maybe I'm making social contact via my smartphone in a way that is more comfortable for me and my personality - which is about as far from gregarious and sunny as China is from North Bay. In short, maybe I’m actually paying more attention and having more valuable social engagement than the “be here now” chatterboxes around me. Or, maybe I’m not.

Maybe I’m just making my sunflower spit seeds at a zombie wearing a road safety cone. But, you know, that’s my choice in that moment, in that now, and I’m doing no one any harm. And I’m certainly not being a self-righteous, anti-technology Pollyanna who imagines that if we all looked up from our phones life would turn into one big group-hug Coke commercial.

There are lots of ways to be social, and some of them involve a smartphone. And there are lots of ways to be antisocial, none of which involve technology. So, Mr. Turk, if you don’t make me spark up a forced conversation with someone I’ve never met, I won’t make you share a photo of something you didn’t see because you were too busy doing just that.