If a dumb shmo like me sitting in faraway Australia got it so right about the iPad in 2010, why didn’t those smarties on Wall Street and in Silicon Valley see the future too? And still can’t…

Three years ago this weekend, the iPad was released for sale in the USA. In Australia, it came in July, with the 3G configuration.

I went back to what I wrote about the iPad after its announcement at an Apple keynote in January 2010, and then in the days before its release. It makes for edifying reading, given the tech pundits who got it so, so wrong, as documented by Asymco’s Horace Dediu here.

Blog entry of March 21, 2010:

How I know the iPad will be a success – unusual sources of evidence: potential users

Snippet:

“The second thing: there is one game displayed, about 21 seconds into the video, that looks like a jigsaw in which pieces of yellow cheese are assembled into one piece. Here’s a still shot:

When I first saw this clip, I was reminded of a widely-used IQ test, known as the WISC (Wechsler Intelligence Scale for Children), which contains a series of tests, some timed, that measure both verbal and non-verbal aspects of intelligence.

The equipment comes in a box and costs several thousand dollars. Several of the tests use jigsaw-like elements, asking the child to assemble the elements into a gestalt. At first, the child is told what the final assembly looks like. More challenging elements see the child merely told to assemble the pieces without knowing what the final assembly is.

There are also small individual squares (3″ x 3″) containing elements of an illustrated story for which there is one best way to place them in order. The child starts with just two pictures and it’s very easy to place them in linear order of occurrence. The stories get more complicated and take longer as the child progresses.

As I was showing the Redfish video a second time, I asked the audience to consider how the WISC could be almost entirely performed on the iPad, together with a stopwatch function which could automatically time and enter scores without the manual being needed for norming the results. There is one block test that requires small blocks to be laid out according to an illustration that may not be doable on the iPad, but if imagination is allowed to reign, the next generation of children could be tested with a WISC specifically designed to be performed on the iPad, with a new block test normed for a new generation. Readers should bear in mind that psychologists don’t simply gather scores, but also look keenly at how the child goes about the task, how he or she deals with frustration or failure or success – things that aren’t normed but are important clinical indicators nonetheless.”

But all this is a digression for certain interested readers, away from the point I wish to make: this was a natural small experiment into the appeal of the iPad for certain groups. The first, with concerns about adopting new technologies, understood my enthusiasm for the iPad and where I think it fits in professionally; the second, of course, were the young boys who ignored the DVD which had so occupied them to stare gobsmacked at the iPad Redfish video. You could almost see them aching to get their hands on one and play the same game, one of many Redfish will be releasing for educational purposes.

I have every confidence their excitement is the tip of the iceberg, and naysayers will be looking very glum in a year’s time for their shallow prognostications.”

UPDATE: The huge publishing house Pearson, with whom Apple has done deals for textbooks on the iPad (see here), has the rights to the WISC and the adult version, the WAIS. I am now beta testing iPad versions of these tests, which utilise two iPad 4s that, through Bluetooth, allow the clinician to see what the test subject is doing. Pearson “got it” when others didn’t.

Blog entry of March 25, 2010:

Thinking about the iPad in a professional psychology practice – in response to a fifteen-year-old’s dissing of it as useless

Snippet:

“So here are my thoughts, without yet getting my hands on it, as to how I might use the iPad in a professional psychology setting, as well as (as an addendum to be added to once I actually see how Keynote works) how to use it as a Presentation tool.

1. Intake: Patients waiting to meet me can fill in questionnaires or biographical information (much of it radio buttons or tick boxes), typing where needed on the built-in or outrigger keyboard. In the future, a pen device for handwriting might become available.

2. Billing: As of now, many patients make direct payments using their internet banking, or via PayPal if using credit cards. Just as with iccpay.com, I imagine we’ll see similar instant credit card payment systems evolve for the iPad.

3. Patient database management, using an evolved form of Bento or a specific Numbers template which is easily transferred back to the Macbook Pro.

4. Showing educational movies, either on the iPad itself or via a USB or wireless connection to a TV or data projector.

5. Testing: I can see a number of specialist psychological testing outfits developing normed tests for use with children and adults on the iPad.

6. Distractor for children: Sometimes, a child in a session needs to be kept occupied when parents are the subject of interview, and the iPad with its games will be great for this. The last thing I want to give them is my Macbook Pro.

7. Information to read about their disorder or malady, which can then be printed out at will. Yes, it can be done on the Macbook Pro, but it’s always hooked up to monitors, backup drives and my iPhone and isn’t moveable during a session. Much easier simply to give a patient the iPad to read.

8. Make audio recordings of the session. I record all sessions (patients remember about 10% of a session’s content) and from the iPad the AAC or mp3 file is emailed to them. Again, it can be done on the Macbook Pro using an external microphone like a Blue Snowball.

9. Specialised measuring tools, such as biofeedback devices like the emWave I now use to monitor heart rate variability, useful in stress management and arousal modulation. If patients get their own iPads with the software (possibly in development now) and practise the techniques I’ve taught, their data can hopefully be transferred to my main database for comparison, exposing improvements over time.

10. A miniature whiteboard using Keynote to highlight ideas and demonstrate concepts.

These are just a few ideas thrown together without too much effort. Once the first of a new generation of apps is released, no doubt surprising us with their look, feel and innovation, the ball will start rolling and the penny dropping. For myself, I can see workshops ahead for using the iPad in professional health consulting, and hopefully hooking up with developers with a psychology interest to create new apps.”

Current Status: My accuracy record for all these predictions, before even getting my hands on the iPad:

100%

I am particularly proud of #9, the emWave heart rate monitoring device. After Macworld 2009, I visited the developers of this program in Santa Cruz, and implored them to develop for the iPhone instead of their dinky portable device. They refused, citing exorbitant development costs. I told them flat out they were wrong: the iPhone would surge in sales as more developers came on board, and if they didn’t develop for it, someone else would.

In March this year, the emWave for iPad and iPhone was released, known as Inner Balance (review coming soon). The US Navy, in an effort to increase mental toughness and reduce PTSD in its ranks, is experimenting with the iPad and such a system.

#10 was achieved through another app I am a beta tester for, Doceri. It has now been taken up by UPenn for 1,000 seats in its lecture theatres.

Blog entry for March 28, 2010:

While the 3 year old will yelp with delight when they discover the iPad’s games, the 80 year old will quietly say, “I get it. This is what computing’s about.”

Snippet:

“Seated at the same dinner table last Friday were students who entered the course after I had completed my studies, and whom I’ve met at other functions organised by this very social graduate group. One, Winston, works for a very large car manufacturing company whose world headquarters are in Detroit, and was in receipt of bailout money in recent months. The company has been part of the Australian manufacturing sector since the 1940s, and their vehicles remain very popular with Australians.

Somehow, the discussion moved to the iPad, perhaps after I had excused myself from the table to answer my iPhone, and on my return Winston mentioned he was interested in getting an iPhone too. I suggested he wait a little while, perhaps until June or July, when a new model might become available, and from there a discussion took place about the iPhone’s place in business now that Microsoft Exchange could work with it. It was a quick skip to speculation about the iPad.

Winston put me on the spot to pronounce why the iPad was a better choice than a netbook, which in Australia would be half the price and pack more features, such as a camera, “real” keyboard, iPhone tethering, and the full Microsoft Office suite.

My response was to suggest that the iPad is better considered not as a computer in the common use of the term, i.e. a notebook or desktop device, but as a knowledge management tool in its own right, and I rattled off the sort of apps it would inherit from the iPhone, as well as those likely to be designed to take advantage of its speed and screen size.

I suggested to Winston that the iPad would have limited initial appeal to computer wonks who wanted merely a smaller form factor for Windows-based computing. It would fail their needs. But I then suggested that there would be huge numbers of ordinary people with very limited knowledge of computer innards and workings – that is, the vast bulk of the Australian population – for whom the iPad would elicit the spontaneous remark:

So this is what computing should be!

No menu bars, no operating system to fiddle with, instant on and ready to use at the simple touch of one button, yet also offering powerful business applications such as iWork and Bento and Evernote should this group of users work its way up the skill and learning curve.

When Winston said he had elderly parents who had never touched a computer but had expressed interest in what their use might bring to their lives, I asked him in all honesty which he would buy them: a $400 netbook running Windows XP (then add the cost of Microsoft Office 2007) or a $650 iPad plus $50 for iWork and Bento?

The picture of 75-year-old mum and dad sitting on their couch wrestling with a netbook, with its tiny keyboard and poor resolution screen, was enough to make Winston momentarily pause in his tracks to reconsider his options. Yes, for him, with his background in engineering, a netbook was a no-brainer: a good match for the problems he wished to solve.”

Current Status: Totally nailed it!

Blog entry for March 31, 2010:

Where to go to find people using iPads this weekend? In all sorts of interesting places!

Snippet:

We’re just a few days away from the iPad falling into users’ hands, in time for Spring Break, Easter, and Passover.

So where might you go and find iPads if you weren’t lucky enough to order one for yourself? Well, as the video below shows, a whole variety of places, perhaps even at the White House Passover seder hosted by Barack and Steve themselves!

Enjoy!

A question for you: If a dumb shmo like me sitting in faraway Australia can get it right, why didn’t those smarties on Wall Street and in Silicon Valley see it too? And still can’t…

Updating the Shaking book effect – better or worse than the original?

I’ve had sufficient comments and time on my hands to play a little with this opening slide for most of my presentations.

I altered the video’s outline to be less “ripped”, made it tumble rather than pop out of the book, and gave it a landing “splash” using the Anvil build (can you work out how I did that?) What do you prefer – the original, or this modification?

The Shaking Book effect in Keynote

Thank you to all those who’ve come by to visit my website following the podcast with David and Katie over at the Mac Power Users’ site.

I thought as a reward I would post a video of the “Shaking Book” effect as I call it, which I discussed in the podcast. I start most of my workshops, no matter the subject, with it. It follows my first slide which is usually just the title of the presentation du jour. The point is to inform the audience that no matter what they may learn on the day, I’m hoping they walk out happy they attended, and this is indeed what I actually say.

But the other unspoken message of showing the shaking book slide, right up front, is several-fold:

1. This will not be your usual dull, disengaging Powerpoint.

2. Even if you’re an old hand at presenting, and have attended lots of such trainings, you ain’t seen nothing yet = raising expectations (contrary to Barry Schwartz’s message)

3. It stamps my authority as an expert Keynote user, since the effect is not one you can merely select but must create yourself, thus displaying a depth of knowledge of what Keynote can accomplish.

So, here’s the video, and beneath it, the Keynote slide and the Inspector so you can go figure out how it was done.

Now, there is a little more to this video than first meets the eye. Go back and have a second look. Note that the video seems to come between the open pages of the book, not from behind. Your mission, Mr. Phelps, is to figure out how that was achieved.

And here’s the Inspector:

There is a little more to it in terms of what all these elements in the Inspector achieve but I’m sure you can figure out some of the magic for yourself!

PS. I will not be returning to Macworld 2013 next February. Perhaps in 2014 depending on the direction Macworld heads.

Presentation Magic interviewed about presentation workflow, Keynote and helpful presentation equipment on the 5×5 podcast: Mac Power Users

I had an opportunity yesterday to be interviewed by lawyers, bloggers and Mac power users David Sparks and Katie Floyd for their Mac Power Users podcast on the 5 x 5 podcast network. Our subject was presentation skills and workflow, and Keynote gets a good mention, as do some of the other tools I use.

You can hear the podcast and find links to items mentioned here: http://5by5.tv/mpu/111

The podcast is more than 90 mins and I hope you enjoy it. Feedback and questions welcomed. You’ll hear us occasionally walk over each other, and me do more than my customary “ums” and “ahs”, due to the nature of Skype audio (I’m in Australia and they’re in the US on the west and east coasts).

Aside

Earlier this year, I wrote a simple headline (below) suggesting that if Apple could stream Paul McCartney over its AppleTV arrangement, why not return us to the 1990s and stream its keynotes?


Well, perhaps someone at Apple was listening! Because for the first time, Apple will be using its AppleTV service to live stream the October 23 special event. It won’t be at 5am local time in Australia, but 4am… Ah, decisions, decisions!

Here is a photo of my LG monitor display showing the announcement, and Other Events going back to June 2011. Let’s hope this becomes a permanent arrangement. That “hobby” of Apple’s is sure starting to take on new life!


PayPal Here, a card reader and app for taking payment on your iPhone or iPad finally comes to Australia, getting in before Square and Apple’s likely credit card system.

Some time back, I introduced a video from Square in my “IT for Psychologists” workshops which I offer in Australia. At the time, Square was just beyond being a startup.

Square offered a small hardware device which plugged into your iPhone headphone jack and communicated with its own app so you could accept credit card payments. The Square card reader “swiped” the card, and the money would soon enough be transferred to your nominated bank account.

The video still excites many of my colleagues even though it was made several years ago, and the service is yet to be made available in Australia. It’s been reported that Square has since enjoyed an investment from VISA, after initial warnings of “security issues” from Verifone, a maker of card readers. More recently, Square has expanded its wares to include iPad-based point-of-sale abilities for businesses.

There has been talk of Square coming to Australia, where many psychologists use those old bank-leased card readers, which usually means you also need to have a landline, or a more expensive 3G model. Some like to use them because one bank has arrangements with Australia’s Medicare national medical health scheme, allowing payments to be made directly into the psychologist’s nominated account.

But for the many solo psychologists who work in multiple locations, these solutions are not particularly cost-effective, since one pays a monthly fee for the lease, as well as a percentage per transaction, a 30c fee, and other ancillary setup fees.

It’s essentially a monopoly situation, which I have rejected in my own practice.

Those who want to pay me by credit card receive, ahead of their appointment, advice that credit card payments can be made via PayPal, with a surcharge of $5 to cover the fees I am charged (a little under 3%). This keeps the banks in their place.
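
To make the arithmetic concrete, here’s a minimal sketch of how a flat $5 surcharge stacks up against a percentage fee. The session fee and fee rate below are hypothetical figures for illustration only, not my actual billing numbers or PayPal’s exact schedule:

```python
# Rough check that a flat $5 surcharge covers PayPal's percentage fee.
# Both figures below are assumptions for illustration, not actual rates.
session_fee = 165.00   # hypothetical session fee in AUD
fee_rate = 0.029       # "a little under 3%"
surcharge = 5.00

# The percentage fee applies to the whole amount received, surcharge included.
paypal_fee = (session_fee + surcharge) * fee_rate
print(f"PayPal fee on ${session_fee + surcharge:.2f}: ${paypal_fee:.2f}")  # ~$4.93
print(f"$5 surcharge covers the fee: {surcharge >= paypal_fee}")           # True
```

On these assumed numbers the flat surcharge just covers the percentage fee; a dearer session would need a slightly higher surcharge to break even.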

The other electronic alternative (cash is certainly accepted; cheques are a dying monetary exchange system in Australia) is direct debit, where the patient can make an online payment from their bank account. Neither they nor I incur any costs for this transaction. For that reason, I have the apps of the most popular banks installed on my iPad, so patients can either pay at the end of the session using my iPad, or pay at home or their office. Many now use their own iPhone during the session to make their payment. Once they have used my bank details, their banks’ websites usually keep me as a preferred payee, and those details don’t need to be entered a second time.

The case for an iPhone or iPad enabled payment system is a no-brainer. One wonders, what with the Samsung vs Apple trial currently under way, if Apple ever thought its iDevices would be used this way. After all, even in its own Apple Stores it formerly used Windows CE-based portable card scanning devices.

These were replaced, perhaps with a big sigh of relief, with iPod touch units and an EasyPay system, as it’s called. At left is an Apple Store employee in Perth, Western Australia, using one when I visited in June 2012.

The units incorporate both a card reader and a barcode scanner, and after taking your money, the system emails a receipt to your nominated address. If you’ve purchased before, your details will quickly come up from the server and speed up the process.

What’s more, you can now use an EasyPay app on your own iPhone to make your purchase without even sighting an Apple Store employee!

Apple’s modded iPod touch showing the card and barcode reader in action, purchasing an AppleTV remote

Apple EasyPay app at work in the Apple Store

Returning momentarily to Square, some have mooted it might make an excellent acquisition for Apple, as reported in the New York Times recently, above.

What makes the NYT report incorrect is its claim that the Square service is “a unique electronic payment service through iPhones and iPads”.

There are several competitors, not all of whom use a card reader. One is ICCPay, which I also mention in my IT workshops, below. (Shame on the NYT for not doing its homework.)

The app needs to be used with a gateway linked to banks, and for some people these extra steps may prove to be a hurdle.

With the next iPhone not very far away, others have suggested Apple has its own plans for an iPhone-based payment system, using Near Field Communications (NFC). With iOS 6 coming with a coupon and ticket app called Passbook, it may also be the case that Apple will later allow iPhones to act as credit card terminals, perhaps utilising technologies from its Apple Store EasyPay setup.

While all this is in the not-too-distant future, Australian and US iDevice owners have another system just coming onto the market from PayPal, called PayPal Here. Below, the US website announcing its availability.

The Australian PayPal site shows a “Notify me when it’s ready” sign, but some time back, when it was first mooted as coming to Australia, I applied to go on PayPal’s waiting list. This was back in April. In recent weeks, I was notified things were on the move. Last week, apparently in preparation, my PayPal account was suspended, pending receipt of documents pertaining to security questions, including whether I was at all politically connected to anyone in the public eye. Seriously. I had to fax or email documents containing my photo ID and birthdate, as well as documents showing my name and current address, such as a utility bill. I used my US passport business VISA for the former.

This was accepted eventually, and I was in business, even though the small triangular card reader was yet to arrive. The free app was available to use in the meantime. Doing this has drawbacks, though.

1. Entering card data manually, with a purchaser’s finger signature and three-digit CVV, attracts a higher percentage commission (about 3%), and

2. It takes 21 days to clear into your PayPal account, plus a few more after that to go into your nominated bank account.

Fortunately, my card reader arrived by courier yesterday. PayPal Australia had notified me to expect it in 3-5 business days, but it actually came just a day or two after that email, complete with courier tracking details.

Here’s what the container box looks like:

The box includes an adhesive label you can place at your business entrance

Here’s the physical unit – in imperial terms, it measures approximately 1.5″ x 1.5″ x 2″

Here are a number of shots I took with my iPhone showing the packaging of the unit. This might not quite match the unboxing of a new Mac, but still…

It’s nicely done, complete with the adhesive label to place on your front door, and PayPal’s local support number. I haven’t tried to reach support yet, but would think that anyone trying to compare various payment systems ought to incorporate a comparison of support parameters, such as time spent waiting, quality of answers, followup, etc.

Using the unit requires you to download the free PayPal Here app from the App Store. It is set for the iPhone, but will happily expand to 2x appearance on the iPad without much pixelation.

There can be a problem with iPhone cases that make full insertion of the card reader difficult, something I had already discovered with certain cables.

The card reader, unlike the Square unit, has a triangular cover which partially rotates and “grabs” the iPhone, preventing the unit from swivelling. Again, with a thick iPhone case, this feature just gets in the way. I have swivelled the unit and held it for a successful card swipe, but clearly if one is to use it frequently, it might require the temporary removal of the case.

The app, by the way, allows you to practise swiping without incurring fees, as a way to test the functionality of the card reader.

Let’s now go look at the app itself, which has a number of interesting features.

The first is the “Open for Business” screen, above, which connects you to PayPal to show balances, as well as the means to input items and information about your business.

There are a variety of settings, above, where you can list your inventory, give each item a sale price, and keep a running total of items being purchased. It might be useful for a bar or restaurant, for instance, collecting table orders.

Each item can have its own amount and description, above, quite useful for those working garage sales or flea market transactions.

The app also includes a handy calculator, above, to keep a running total going…

The app also includes the option, above, of putting in your business information, including either your mapped place of business and/or your correspondence address, as in a post box.

There is also the opportunity to track sales figures, both past and pending, above.

Items sold may include not just physical goods, but services too, which can be itemised. For goods, there is also an area to display a picture of the item next to its description and cost. The picture can be imported from the Photos app, as you can see below.

The option to take a picture on the fly could turn out to be very useful in some circumstances. There also exists the option, like Apple’s EasyPay system, of emailing a receipt to the purchaser. Handy to develop a marketing list of email addresses…

There are some questions that remain, however.

Will the card reader work with all cards, and what of worn cards? How well will it work with the next iPhone, if rumours of its headphone jack heading south to the bottom of the unit prove to be true? And what is its future should Apple release its own credit card system in the near future?

The beauty of the PayPal system is the small size and thus portability of the reader, the data safety via encryption offered as standard by PayPal, the well-thought-out version 1 of its app, and the public’s general awareness of the PayPal brand.

It’s also relatively easy to use and navigate through the various app screens, and the costs are very good when compared to what’s already out there. While it’s currently limited to VISA and Mastercard credit and debit cards, the lack of AMEX and Discover might be a concern for some. Mind you, I can’t recall the last time a patient tried to pay for a session with AMEX.

I’ll update this blog entry once I start to make some “sales” with the unit, and report back on its actual usefulness.

Dear Technology Section Editor: Ten ways you know your tech journalists should be switched from covering Apple Inc. to say, Microsoft or RIM

Dear Technology Section Editor,
Mainstream Media Publication,
Anytown, Anywhere

Dear Sir/Madam,

After many years of observing your publication in operation as it attempts to make the transition to a digital news flow, may I offer the following reasons why you might shift the fields of interest of some of your syndicated, featured, or freelance writers, be they journalists or bloggers or members of the kommentariat at large. Or:

Ten ways you know your tech journalists should be switched from covering Apple Inc. to say, Microsoft or RIM:

1. They refer to any success Apple enjoys as being due to its legions of “iSheep”, “fanbois” or cult believers who will indiscriminately buy anything Apple due to Apple’s vast marketing budget and prowess. They will perhaps give a very brief mention to design and production qualities, but keep the focus on slavish followers.

2. They damn Apple for not having the courage to enter the enterprise market and compete head to head with Microsoft, thus revealing they haven’t seen or heard Steve Jobs’ metaphors of trucks and cars, and a post-PC world, nor do they understand the term “race to the bottom”.

3. They rabbit on about “market share” and how low Apple’s is with respect to the desktop OS, while conveniently ignoring Apple’s quarterly profits, growth and customer satisfaction surveys. Oh, and its market share with respect to iOS-powered devices.

4. They hold up examples of failed Apple products as reasons why Apple might fail with its next rumoured product… “Remember the Pippin, the Newton, the Cube? See, Apple doesn’t always get it right…”

5. They admonish Apple for releasing (or for rumours it will “soon” release) a product which Steve Jobs said Apple would never do, using this as an example of Apple’s lack of trustworthiness, even bald-faced lying. iPod Video 5th gen., anyone?

6. They report on how worried Apple should be because they really believe RIM is about to turn the corner and blow the tech world out of the water with the next Blackberry with its new OS. Or Microsoft will do it with Windows 8, or Nokia will… you get the picture.

7. They do “exclusive” product review “showdowns” between vapourware products no one has been able to put side by side, e.g. “Who will win? We compare Microsoft’s Surface RT versus Apple’s 7-inch iPad.”

8. In predicting Apple’s future, they can’t help themselves from referring to Microsoft “saving” Apple from oblivion at the time of Steve Jobs’ return in 1997, with an investment of $150 million in non-voting stock, thus perpetuating demonstrably untrue folklore.

9. They include current quotes from Steve Wozniak about contemporary Apple issues like design, functionality or competitiveness, things he would have been best to leave alone for, oh… the past 20 years, and the next 20 to come.

10. They continually present you with articles about Apple which are lists of ten things Apple could do differently, should be doing, is not doing, is doing worse than anyone else, etc., etc. And they spread all ten over 5 pages to demonstrate how they are truly clickwhores, which reflects badly on your publication.

These are my ten. Dear Reader, I’m sure I’ve missed a few… can you assist with your own, and assist Dear Editor out of this dilemma?

Aside

I took a recent opportunity to drop into my nearest Apple store at Chadstone, a major shopping centre in southeastern Melbourne. It was the first Apple store to open in Melbourne and even at 11am midweek it was buzzing with new purchasers, one on one training, and sales of accessories.

I hadn’t stepped more than a metre or so into the store when I was welcomed by an Apple employee, to whom I said I was just looking, before proceeding to track down the new Retina display Macbook Pro.

I’m a year off updating my April 2011 Macbook Pro 15″, having just given it a big speed boost by removing the standard 500GB hard drive (albeit the 7200RPM model) and replacing it with an OWC 120GB solid state drive. I didn’t stop there: I removed the optical drive, placed it in an OWC USB-powered case, and used an OWC Data Doubler cage to install a 750GB Seagate Momentus hard drive in its place. The SSD contains my operating system and applications, while the Seagate holds my documents and other files, as well as serving as a scratch disk. How fast is the system now? Well, Microsoft Word now opens with one bounce, as does iPhoto with 3500 pictures.

More importantly it shuts down and boots up much more quickly, and I’m estimating the Macbook’s battery endurance has also increased significantly.

It will do fine for another year.

Equally important, it gives Keynote – my presentation application which I discuss frequently on this blog – a speed boost too, and it has a snappier feel to it when I’m in the process of creating new presentations.

Which all brings me to the future of Keynote and its brethren apps making up the iWork suite, which has not seen a significant update for more than three years. Meanwhile, Powerpoint for Mac and Windows has seen major version updates, Prezi is growing in popularity, and the iWork apps for the iPad have seen several significant improvements, bringing them closer to the capabilities of the desktop versions.

Keynote users, who we can guess are growing in number to judge by the sales figures Apple publishes on the App Store (it’s currently in the top three paid apps), are asking the following questions, mystified by Apple’s seeming neglect of their favourite app:

1. Is an update – or more plainly – a significant version improvement due some time this decade?

2. Will it have the same look and feel as the current version, or will Apple switch it to the “professional look” seen in apps such as Final Cut Pro X, Motion, Aperture, etc.?

3. Will Keynote fully utilise AirPlay in Mountain Lion such that in either Presentation or Mirror mode a true wireless data projector connection may be made, either with the latest wifi-equipped projectors, or by connecting an AppleTV to an HDMI-equipped setup?

4. Will Keynote instead be dumbed down so as to provide greater compatibility with the iPad version?

5. If not, what new features will Keynote emphasise? Clearly, new transitions or build styles will come along, but is this sufficient to sustain interest in Keynote, or are new usability features to be the name of the game in 2012?

6. Will Keynote make it easier for third party developers to come to the party, not just with new themes, but this time with new transitions and builds?

So what new features are the most desired and have any recent official Apple keynotes given a hint of Apple’s thinking about presentations?

Having used Keynote for almost a decade, since the days when it was almost featureless compared to Powerpoint 2003, experienced users have developed their own workarounds for the application’s deficiencies, even if they must do so with clenched teeth.

The lack of a useful timeline remains for me the most glaring need to be fixed. Currently, rather complex slideshows, which Keynote begs you to create given its cinematic capabilities, are hobbled. The workaround is to create Quicktime movies of complex single slides, perhaps using iMovie, Motion or Final Cut to manage exact timings and mixtures of images, video, and sound.

Grouping images into a single image file is still troublesome, as Apple has yet to find a way to name each group on the one slide with its own name, rather than a generic, “Group”. Moving these groups forward or back with respect to each slide item, something novice users are unaware of, is part of elevating presentations to another level. I don’t think I’ve ever seen it taken advantage of in scientific presentations featuring Powerpoint. In Keynote currently, it’s particularly kludgy, and Powerpoint for Mac has a 3D view which Apple could think about improving upon. Using layers within a presentation slide raises presentations from ho-hum to something special.

The most recently observed Apple keynotes haven’t shown any easily detectable new features. For the next section, I’ll refer to the WWDC 2012 keynote of June 11 where new Macbooks were shown together with previews of Mountain Lion and iOS 6.

There were a couple of very neat effects displayed which can be reproduced using the existing feature set, but which require several steps rather than built-in effect generators. Let’s start early in the WWDC keynote, when CEO Tim Cook describes how many more countries will soon be able to access Apple’s App Store.

If you download the video via the iTunes podcast feature, you’ll see a world map at 05:50 where Cook has several new countries “fly” in, overlaying in a different shade of blue those countries new to the App Store service. I was able to reproduce this on one slide using Keynote’s move and scale build, but to do it over 30 countries is a pain.

It means utilising the shape feature to outline then “cut” the country, using Adjust Image to shift the shade of blue, placing the image off the slide, then creating a combined “move and scale” build to slide the new image in over the country. Lots of work, but quite an effective visual. Here’s Greenland (circled) flying in:


The hope we have is that the next Keynote will make such efforts much less intensive, requiring much less mousing about. It’s easier of course if the map is built of separate country images, but the cut method I refer to is how I created the display above, as a proof of concept that the fly-in effect can be achieved in the current Keynote.

The next feature I want to highlight is just that – highlighting. While the map sequence shows elements flying onto the slide, an important feature of contemporary presenting is doing away with those awful laser pointers so favoured by those who won’t put the effort into preparing both their stories and slides ahead of time, preferring to appear “spontaneous” while wiggling green or red lights in dizzying circles.

When I visited Apple’s Keynote team in Pittsburgh a few years ago, I made it a special point to discuss the need for the team to understand the importance of “call outs”, ways of highlighting elements on the slide. This could be a set of cells in a Numbers spreadsheet, an element of a photograph, or a line of text from a scholarly publication – something you want to stand out from the rest of the slide as requiring the audience’s attention while you speak to it. By enlarging it, shadowing it, or somehow calling it out, you still allow the audience to see where it belongs on the slide. This adds to your authenticity by showing the source of the quote, rather than merely typing it onto a slide.

Take a look at the slide (below) I created to get the idea visually:


Notice a few things: I have enlarged the main graphic which is taken from a PDF of a journal article I am discussing with the live audience. It’s intentionally pixelated and thus hard to read, because I don’t want the audience to race ahead and read it. It’s purposely difficult because within a moment of its appearance I am overlaying a much clearer image of the paragraph I am going to read to the audience.

What, you say, read a slide?

As I tell Presentation Magic audiences, the only time I read a slide is when it is a direct quote from a source I am displaying – never my own words on a constructed slide unless it’s a single word or phrase, but never sentences.

Remember – once you display a slide with words on it, your audience will read it whether you ask them to or not… so any time you put words on a slide, it’s because you want them to read the words in preference to listening to you, or because they read while you say the same words. Vision and sound work together to make it more memorable.

In the slide above there are two effects: as the main page appears, it immediately goes to pixelated form while the paragraph leaps off the page (notice the shadow effect) to grab attention. And as I read the slide, I say “and here’s the main point I want to make with this slide” and use the red shadowed box to highlight the sentences in the highlighted paragraph. This double handling takes some effort to create on the slide, but in the presentation it runs seamlessly and produces an engaging sequence. It shows your audience you’ve really put thought and effort in for their erudition.

As much as I tried to show the Pittsburgh-based Keynote team the importance of callouts, I reinforced this a few years ago at a Macworld Presentation Magic workshop, when two senior members of the Keynote team (one a new hire specialising in interface design) attended my training. I didn’t announce their presence to the other attendees, as I wanted them to sit in as regular attendees. During the workshop, I spent considerable time discussing callouts, why I think they’re important in contemporary presenting, and some of the techniques I use to create various call out effects. I know the Keynote team members were very interested in what I did, and went away thinking about how to create these effects as part of the Keynote app’s attributes, rather than creating them with a multitude of keystrokes and mousing about.

I am pleased to say, if I may judge from the WWDC keynote, that at least the concept of the call out has made its way into Apple’s keynotes. I have no idea if we’re watching a new Keynote which makes callouts easy to create, or the current one using the same or similar techniques I am using. Let’s take a look at these features from the WWDC:


Discussing the screen resolution of the new Macbook Pro, Apple Senior VP Marketing Phil Schiller uses a magnifying callout to highlight the resolution improvements in Apple default apps, such as Mail.

This is not the first time this call out has been used, but it’s certainly the largest. Nor is it a built-in shape; the annulus is a 3D graphic selected to emulate a magnifying glass without the handle. Apple included such a glass with Keynote 1.0 as part of a selection of bundled clip art, not to be confused with the chintzy art included with Powerpoint (until the most recent Powerpoint for the Mac, when Microsoft included high-res photo images). I learnt it was Steve Jobs who stopped the inclusion of clip art.

Later in the keynote, Apple SVP of software engineering Craig Federighi demoes various new features of the upcoming Mountain Lion, due in a few weeks.

Notice below how he calls out a feature of new Safari, leaving unneeded areas greyed out compared to where he wants the audience to look:


This call out is repeated several times, as more features are displayed. This is easily done by transiting over several slides and seeming to animate the move simply by going from one slide to the next. This sequence occurs at 54:30. Here’s what the next slide highlighting the same feature set looks like:


If you go to the keynote to see the sequence, you’ll see it’s a rather plain slide to slide transition.

I’ve taken another section of his talk which features a pulldown, and I’ve shown how to animatedly highlight various sections. Can you guess (below) how I did it?

Craig shows us a few more call out variations, which furthers my belief that if we are watching the new Keynote in action there will be a new call out feature in the next Inspector – watch:

This is the area of his presentation where Craig is demoing the iCloud sharing of Safari tabs across platforms, from Mac to iPhone to iPad. Alongside the URL entry area is a sharing icon and an iCloud icon. In the next illustration below, the call out of the iCloud icon begins:

I’ve circled the cloud icon beginning its enlargement, part of the call out sequence, which concludes below:

While it may be a variation of the magnifying glass previously demoed, the former was not animated, so I’m tempted to think this is a build effect, which of course one could do now with the available feature set. But I’d like to be optimistic in thinking the Keynote team have really thought more deeply about the importance of call outs to give it a place in the next Inspector.

(Funny aside: Craig demoes a car racing game to show AirPlay in action in Mountain Lion. From his previous keynote appearances, he has developed quite a following, it seems, for his lush abundance of hair. Note in the highlighted picture below his racing nickname:)

More evidence is available when Apple SVP for iOS, Scott Forstall, demoes iOS 6.

We start with Scott and two images of iPhones. Note I’ve captured Scott looking down at his confidence or vanity monitor, a presentation skill he has yet to master (at least compared to the much missed S. Jobs).

Next, out pops a callout of an area of interest featuring Facebook. No grayed out areas, but an enlargement which pops. Again, one can do this with current build styles, but I’d like to think one could outline an area, and a build option would give you choices as to how it would be called out:

Now, another sequence to show this same call out style in action:

This is Scott demoing the new “Do Not Disturb” feature. Notice how the effect is to float the panel above the iPhone. Apple loves 3D!

This series of call outs in this year’s WWDC really highlights the feature, so if I may connect all the dots mentioned so far in this blog entry, I remain hopeful Keynote is about to be refreshed, perhaps soon after Mountain Lion is released in a few weeks.

There is more information to consider however, not all of it good.

Early in his demoing of the new Retina Macbook, Phil Schiller mentions how the system apps have been updated to take advantage of the extra pixels, such as Mail, Address Book and so on.

He then goes on to show how a select group of Apple’s “professional” apps have also been updated, and we see Aperture and Final Cut Pro X. We even hear that Adobe apps are due to be updated for the Retina display too, as well as Autodesk’s.

Adobe Photoshop on the Retina Macbook

Autodesk on the Retina Macbook

But where is the mention of Apple’s other “professional” apps, like iWork?

Let’s not give up hope, however. When I went to the Apple store in Chadstone, I opened up Keynote on the Retina Macbook. I took a picture with my iPhone of the Theme Chooser, and I’ve overlayed it on Keynote on my current Macbook Pro. You’ll need to take my word that Keynote on the Retina Macbook is quite observably pixelated, much like an iPhone app blown up 2x on the iPad is pixelated:

In conclusion, I wonder how long Apple can live with itself allowing Keynote to look so… impoverished and uncinematic on its premier Mac. Let’s hope not for long, especially if Retina display iMacs and other Macbooks are allegedly not far away.

UPDATE: I’ll be in the USA (New York City) and Canada (Toronto) in late August/early July if you’d like to set up a Presentation Magic training day or seminar. Email me at les(at)lesposen.com or tweet @lesposen

Pilots, presentation skills and preparation: The crash of Air France 447 into the Atlantic – The official French investigative report shows what pilot training and presentation skills have in common

If you’ve attended a workshop or seminar of mine, whether about presentation skills or technology or health, you’ll know I sooner or later introduce something about commercial aviation.

I have a passion for this, and partly earn my living from working with patients who suffer fear of flying, which affects a significant number of people, and cuts across profession, gender, intelligence and age, amongst other notable variables.

Above, you will see the cover of a just-published technical report into the causes of the total loss of an Air France Airbus A330 a little over three years ago, while making its way from Rio de Janeiro to Paris. There were no survivors when the aircraft was lost over the Atlantic after entering a quite severe weather pattern.

Very modern and ultra-reliable aircraft like the A330 don’t just “fall out of the sky”, and its mysterious loss was compounded by the difficulty of locating the tell-tale cockpit voice recorder (CVR) and the “black box” flight data recorder, which records the total flight experience with respect to the aircraft’s performance. These two devices, when recovered, usually enable investigators to piece together likely causative factors, together with the recovery of as many aircraft parts (and human remains, for that matter) as possible, in order to put together a plausible story.

Sometimes, mechanical failure is the total reason for catastrophic failures. But it is rare when there is no human involvement in such failures, whether it be on the flight deck, in a radar facility, in an engineering maintenance facility, in the original aircraft supplier’s building facility, including all the sub-contractors supplying parts, or an airline’s navigation and flight planning department, all of whom can contribute to the ultimate loss of the aircraft.

Each of these involves humans, and so the field of Human Factors is an important one in the prevention of incidents, the training of personnel, and helping to explain when all seems to fail, as it did with Air France 447 in 2009. Total hull losses of modern aircraft are very rare when pitched against the total number of flights undertaken each year. Indeed, as I show my patients on my iPad (projected onto my wall in a 6ft x 6ft picture via an AppleTV), there are thousands of aircraft in operation at any one time, 24/7. The app I use is FlightRadar24 ($2.99), which shows traffic, airport info, aircraft routes, and various aircraft parameters in real time. You can see what it looks like below:

Notice in the very bottom right corner it says: “Showing 72 of 2063”.

This is because the app has detected 2063 flights worldwide, at 3:30pm AEST, which is early morning in Europe. I usually say to my patients that there are on average 7000 commercial aircraft flying between departure and destination points at any one point in time. Essentially, commercial aviation is the safest form of mass transport after elevators and escalators.

For some patients, this is comforting; for others it’s not where the action is. But when events like AF447 occur, everyone in commercial aviation sits up and pays attention, especially airlines which operate the same aircraft type. It’s an anxious time, especially when it’s the first total loss of an aircraft type, for fear some design fault or build issue has finally shown its ugly face, potentially meaning the entire fleet of such aircraft across all airlines needs to be grounded.

I recall working with Qantas pilots in Sydney in the months after AF447, and later visiting their A330 simulators, where I learnt the world’s airlines, including Qantas, were attempting to simulate the known events of the Air France loss, to see the potential contribution of pilot versus aircraft systems. Here’s me at the simulation centre near Sydney International airport:

This was about a year after the loss, and only speculations were being entertained as no recovery of information systems had occurred.

Ultimately, using very advanced and expensive equipment, recovery of parts and telltale recorders miles beneath the Atlantic surface occurred, and investigators began to meticulously piece together the contributing events to AF447’s final moments.

I’ve downloaded the French investigators’ (BEA) report, and it is a very detailed, technical report which owners of A330s will be poring over this weekend.

One of the regular aviation blog sites (Flightblogger) I read captured my attention with its current entry, reporting on the investigative outcome:

Two short paragraphs of the Air France AF447 investigation report offer an (sic) curious insight into the brain’s response to aural alarm signals – and might go some way to explain not just the crew’s failure to recognise the A330’s stall but why terrain-warning systems sometimes seem to bark at pilots to ‘pull up’ in vain.

Stall warnings on the ill-fated Airbus sounded continuously for 54 seconds. But the inquiry report, sourcing seven different research papers, states that aural warnings demand the use of cognitive resources already engaged during periods of high workload.

“The ability to turn one’s attention to this [aural] information is very wasteful,” the analysis says, adding that the rarity – and even “aggressive nature” – of such warnings might lead to their being ignored.

Studies on visual-auditory conflict, it states, show a “natural tendency” to favour visual over auditory perception when information acquired by both senses appears to be contradictory.

“Piloting, calling heavily on visual activity, could lead pilots to a type of auditory insensitivity to the appearance of aural warnings that are rare and in contradiction with cockpit information,” the analysis adds. Visual-auditory conflict during heavy workload translates into “attention selectivity” which accepts visual information but disregards critical aural warnings.

Those of you who have been to a Presentation Magic workshop will acknowledge almost instantly why this sub-section leapt off the page at me.

In the course of the workshop, attendees learn about the multimedia theory of persuasive presenting, using research from the field of affective neuroscience to promote this understanding.

It follows a model of Don Norman, formerly an Apple Fellow in Apple’s early days, a psychologist and engineer now part of the Nielsen Norman Group, who speaks of our emotional relationships with technology.

He reminds us that we have at least three ways of interacting with and understanding the world outside of ourselves. Here is the final slide I use, having built it up discussing in turn each of the three elements you see below:

It is Norman’s plea to industrial and software designers (link to his 2003 TED talk) to take all three into account when designing everyday things for humans to use. Those technologies that seem to attend to these elements become indispensable and beloved by their owners, such as the iPhone and iPad.

When we drop down a level from thinking and planning – a top of the brain phenomenon, literally – we use our senses to make sense of the world. For humans, the sense we primarily use is visual, and between 40% and 60% of brain real estate is devoted to processing visual cues. Think of all the things we do with vision. We detect:

1. Size and difference in size between objects

2. Distance – is one object further away or closer than another

3. Speed – how fast is an object travelling, a combination of 1. and 2. above

4. Colour

5. Motion – coming closer, or moving away from us. In aviation, this is helped by colour, because a silhouetted aircraft can fool us in terms of its direction. So the left (port) wing tip has a red light and the right (starboard) has a green light, remembered by the mnemonic, “There is no red port left.”

6. Transparency – we can see the spatial orientation of objects behind or in front of each other.

7. Texture – our eyes pick up edges, smooth areas, folds, etc., and our brains can assign meaning to these elements in a haptic fashion, meaning we can assume what the object will feel like when we run our hands over it.

8. Sameness or likeness, such as with faces. There is a region of the brain, the fusiform gyrus, whose task it is to recognise faces. See neuroscientist Vilayanur Ramachandran’s fascinating TED talk for what happens when this area is damaged.

9. Balance – our visual system works intimately with our vestibular system so we know what’s up or down, or where we are in space at any one moment. Disagreements between the two senses, which is what happens when you’re momentarily weightless – rollercoaster, fast moving elevator, sitting in the back seat of a car with limited vision, on a boat in heavy swells – will usually have you feel very uncomfortable and nauseous.

Our other senses – hearing, taste, touch, and smell – are all important too, but not at the level of the visual sense. On the other hand, in terms of priorities, your dog makes sense of its world in this order: smell, hearing, vision.

Dogs have nostrils which smell in stereo, so they can detect the location of a smell in very small concentrations. In working dogs, the outer ears, the pinnae, can swivel independently to act as funnels, giving extra location detection too.

If you go back to the Air France investigative report, it confirms what I’ve been teaching in Presentation Magic talks: give priority to the visual display of information in a timely manner, building up a complex story so as not to overload the viewer, and keep words on the slide to a useful minimum. Audiences don’t just read words (more quickly than you can physically say them) but they sub-vocalise them so they “hear” the words. This can put their automatic actions in competition with the visual sense, and a mixed or diluted message is perceived.

Here’s what the Air France report said (p.105):

In addition, studies on the visual-auditory conflict show a natural tendency to favour visual to auditory perception when information that is contradictory and conflicting, or seen as such, of both senses is presented [4, 5, and 6]. Piloting, calling heavily on visual activity, could lead pilots to a type of auditory insensitivity to the appearance of aural warnings that are rare and in contradiction with cockpit information. A recent study in electrophysiology on a piloting task seems to confirm that the appearance of such visual-auditory conflicts in a heavy workload situation translates into an attention selectivity mechanism that favours visual information and leads to disregarding critical aural warnings [7].

If we generalise this to presentations – or if only the pilots and their trainers had attended some presentation training which featured sections on affective neuroscience – it reminds us once more to stop piling words on slides, or too many pictures for that matter, because we unwittingly ask our audiences to engage in cognitive overload. A narrative flow of ideas, using both spoken words and images consistent with those words, minimises overload and allows for greater information management.

In aviation, that means you stay on task managing what’s called “situational awareness”, while in presentations it means your audience stays engaged, curious and likely to eventually “connect the dots” in a meaningful way.

Personal connection

I first learnt of the primacy of human factors in commercial aviation following the total loss of an Air New Zealand DC-10 on the Antarctic plateau, when in 1979 this most sophisticated aircraft of its day crashed into a snow-covered volcano.

This despite the presence of a very experienced crew and commentators on board with significant experience of the region.

This total loss has proved significant in many ways. The New Zealand Royal Commission, led by Justice Mahon, became a model for such investigations. The initial findings of the nation’s Chief Investigator, whose experience was limited to lighter aircraft and which placed the blame principally on the flight crew, were overturned by the Royal Commission, which (to make a complex story rather simple) found that the correction of a long-standing navigation error was itself an error, placing the aircraft on a direct path to the 12,450 foot mountain, covered in snow.

(Aside – some have held Mahon’s investigation in high regard as a worldwide turning point in hull-loss investigations that include human factors. The Erebus disaster also shifted the field of traumatology for rescue workers, looking at how such workers can be helped to recover from the awful sights and smells they witnessed. The key psychologist who investigated this is New Zealander Tony Taylor.)

Unaware of their true situation – the pilots likely believed they were 20 miles from the mountain, as per their briefing and charts – the crew flew the aircraft down its plotted course. The trained observers looking out the flight deck windows saw nothing out of the ordinary, an example of confirmation bias: we see what we expect to see…

But two questions ought to leap out at you if you do not know the story of this crash on Mt. Erebus in 1979.

1. Why didn’t the pilots see the mountain?

2. Why didn’t the radar warn them?

To answer 1., again rather simply, it appears a polar visual phenomenon called whiteout was present at the time, creating an illusion of cloud ahead and obscuring the mountain. Unless they knew a 12,450 foot mountain was dead ahead, the flight crew would have been deceived by this polar optical illusion into believing all was safe and that they remained on a track 20 miles from the mountain. Again, they saw what they expected to see. This is because our eyes are merely data detectors: the data is sorted into useful information in our brains, where it is compared to known past events and sense is made of it.

The Royal Commission called numerous experts to give evidence, but the evidence on whiteout was given by Ross Day, who was my psychology professor in my undergraduate days. He was an experimental psychologist with an interest in perceptual illusions, and his evidence was convincing. When Justice Mahon visited the crash site on the first anniversary of the incident, he serendipitously experienced the whiteout phenomenon himself on board a Hercules aircraft, confirming the illusion Professor Day had described. In 1974, when I was Professor Day’s first-year student participating in his lectures and experiments, I had no idea that what he was teaching me – dry experimental science – would one day become a subset of what I teach others in Presentation Magic workshops.

To answer 2., one needs to know that the Bendix weather radar on board the DC-10 had two modes. In the “mapping” mode, with the nose-mounted radar aimed downwards, it scans the terrain for features. At the DC-10’s 6,000 foot cruising altitude over Antarctica, it would more likely have been set to locate weather activity directly ahead, looking for the water droplets that indicate rain and thunderstorms, something to be avoided for the comfort of passengers.

So why didn’t the weather radar detect the snow on Mt. Erebus? Because the dense, settled snow would have absorbed the radar signals: compared to the individual moving particles of water in rain, hail or falling snow, the mountain was effectively a dry target and returned no tell-tale echo. Mt. Erebus would not have been perceived ahead unless the flight crew knew to look for it directly in front of them. Eventually, as the aircraft approached the mountain at significant speed, another instrument – the ground proximity warning system – detected low ground clearance, and an automated voice called out “Pull up! Pull up!”. (See the section from the French BEA report, above.)

From the recovery teams’ findings, the crew responded as per their training (judging from the full-power “go around” settings discovered), but to no avail.

With aircraft becoming more and more automated, the tasks of flight crew continue to change, and human factors become ever more important: conditions such as attention, engagement, activation and rehearsal. These bear an uncanny resemblance to the skills presenters need if they are to conduct presentations which engage and make a difference in their audience’s lives.

How the New York Times technology blog, Bits, perpetuated the myth of a mental illness due to mobile phone use: Or, Follow the money

You may not have noticed it, but lately the New York Times has been running a series of long feature articles taking Apple Inc. to task. Whether about working conditions in its manufacturing plants in China or the welfare of employees in its US-based retail stores, the NYT has been casting a stern eye over Apple’s operations.

The Times also maintains its own set of technology blogs and writers, some producing rather hip columns about all things tech and gadgetry. A couple of days ago, one of its writers whose Twitter account I follow, Nick Bilton (@nickbilton), tweeted the following, which caught my psychologist’s eye:

Clicking through to the article Nick cites reveals a piece in the Times Bits technology blog for June 21, 2012:

The picture you see is of a rather contented Jerry Hall in the Broadway production of The Graduate where she plays Mrs. Robinson.

The article, by Nicole Perlroth, focusses on how smartphones, such as Apple’s iPhone or Samsung’s Galaxy S III to name the two most popular examples, are changing certain behaviours previously taken for granted.

She cites a number of published surveys looking at how Americans’ relationships with their technologies are changing, such as the Harris survey conducted for the online security company Lookout, one of whose products is an app you can use to find your lost cellphone. Here’s the opening paragraph which sets the stage for the survey findings:

Perlroth, in her opening remarks, states:

“Americans are clearly addicted to their smartphones.”

To non-Americans, it seems Americans are addicted to labelling anything they perceive to be outside the range of “normal” behaviour an “addiction”, with all the stigma and negative stereotyping that goes with it, along with the eventual 12-step program modelled on Alcoholics Anonymous.

I get enquiries from local Australian media about such “addictions” and related dangers on a regular basis, being as I am a “go-to” person for journalists seeking local flavour for internationally-sourced articles such as this one from the Times.

Perlroth, to her credit, reminds us that Harris-type surveys are not scientific in nature, citing Harris’ own spokesperson. Later in her article she introduces us to a new psychological concept, the fear of losing one’s cellphone:

Psychologists recently tried to coin a name for the fear of losing a cellphone. It’s called nomophobia – which stands for “no-mobile-phone phobia” – and apparently it’s on the rise. The vast majority of survey respondents – 94 percent – said they lived in a perpetual state of fear that they might lose their cellphones. A similar study of cellphone owners four years ago found that only 53 percent felt that way.

The first line of the quote caught my attention, because it was the first I’d read of my professional colleagues being at it again, constructing neologisms. This matters at the moment because vigorous debate continues in the world of mental health as the American Psychiatric Association prepares to publish the fifth edition of its Diagnostic and Statistical Manual (DSM). The Manual is frequently cited in courts of law and by insurers, as it defines illnesses which can explain illegal actions, provide or restrict workers’ compensation, and allow for admission to hospitals, voluntary or involuntary. These are some of the purposes to which it is expected to be put, as well as its use in research to ensure experimental subjects meet the criteria for inclusion in studies which might, say, compare various treatment regimes. Thus the multibillion-dollar pharmaceutical industry is an interested stakeholder, and some of the misgivings being voiced have centred on those developing the DSM and their connections to “Big Pharma”, as it’s often referred to.

So, I wondered who these psychologists might be who are “trying” to come up with a new diagnostic term.

To find out, one does what one does in mid-2012: you Google the term and see what comes up first in the search results. Here they are:

The two returns that caught my eye as I cast my view down the list were the Wikipedia entry, and the entry of February 20 from the local Sydney Morning Herald: Nomophobia – fear of being without your phone – on the rise.

Local entries for me are important to follow up on, to see if any colleagues I know have been sourced, or if something I have blogged about has been cited.

What we find is in fact a reprinted Los Angeles Times article, and my journey around the internet echo chamber begins.

The term Nomophobia seems to owe its most recent mention to an online survey conducted by yet another online security company, SecurEnvoy from the UK. Like US-based Lookout, which commissioned Harris, SecurEnvoy commissioned OnePoll to do its online survey, replicating an earlier survey conducted in 2008, when Nomophobia was apparently first “identified”. We’ll come to that original article in a moment.

So far, no mention yet of any psychologists giving the term any credence, although the LA Times piece offers the following:

So this is how you go from a fun and light piece, where we can all have a little laugh at ourselves and our relationships with technology, to a real and treatable illness possibly requiring medication.

If you go to SecurEnvoy’s website, where its CTO and co-founder Andy Kemshall discusses the survey results, the impetus for the polling becomes clear by virtue of his mention of the company’s product:

And once more showing the echo chamber in full force, here is a section of allaboutcounseling.com’s article on nomophobia:

The original survey says nothing about “symptoms”, but now we have a “real illness” with symptoms not unlike other well-known phobias, such as the ones I treat in my psychology practice. The conclusion that “53% of cell phone users, based on the Securenvoy polling will suffer from nomophobia” is plainly laughable.

So what was the original 2008 article which first mentioned Nomophobia?

Well, it seems to come from the UK Post Office, which also commissioned a study, by YouGov, of more than 2,000 mobile phone users.

Its “telecom expert”, Stewart Fox-Mills, said:

“Nomophobia is all too real for many people.

“We’re all familiar with the stressful situations of everyday life such as moving house, break-ups and organising a family Christmas.

“But it seems that being out of mobile contact may be the 21st century’s latest contribution to our already hectic lives.”

So this is the inventor of the term, it seems, as even Wikipedia shows no earlier citation than this 2008 UK survey.

And my brief research into Mr. Fox-Mills suggests he isn’t a psychologist, but the Head of Marketing for the UK Postal Service. (I tried to locate the original research at the Royal Mail’s website, but the link was dead.)

So it seems that every few years, as smartphones continue to evolve and become the standard communication device, we will see the same surveys dribbling out, the mainstream media echo chamber leaping onto them, dial-a-quote researchers (and, dare I say it, psychologists) asked for their opinion to give the fluff piece gravitas, and more members of the public either thinking they have an illness or poking fun at psychologists for making up illnesses to drum up business.

What does seem true is that the old saw “follow the money” holds for Nomophobia, and it’s a pity the Newspaper of Record, the New York Times, got itself caught perpetuating nonsense when a little research could have steered the story toward something more useful and interesting, such as the history of moral panics that accompany each generation’s adaptation to new technology.

UPDATE (June 26, 2012):

I located the ZOMM Bluetooth-powered device while googling around for Bluetooth 4.0 specifications for an upcoming blog post. I like this easy and inexpensive potential solution for lost iPhone and Android devices (as well as iPads and more) so much that I thought I’d include a promotional YouTube video below. Enjoy!