Learning from Steve Jobs

Steve Jobs' fashion choices over the years

Understandably, technology news over the past week has been dominated by Steve Jobs’ resignation as Chief Executive of Apple. While he will stay on as Chairman, Tim Cook – former Chief Operating Officer – will take the helm.

There have been many wonderful pieces on Jobs (though some do read like obituaries) – these from Josh Bernoff and John Gruber being but two – which cover many angles – whether personal, professional, industry or other. I’m neither placed nor qualified to add anything new but I have enjoyed synthesising the various perspectives. Yet invariably, the person who said it best was Jobs himself:

  • He knew what he wanted – “Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking” (Stanford commencement speech)
  • He felt he knew better than anyone else – “The only problem with Microsoft is they just have no taste. They have absolutely no taste. And I don’t mean that in a small way, I mean that in a big way, in the sense that they don’t think of original ideas, and they don’t bring much culture into their products.” (Triumph of the Nerds)
  • He, along with empowered colleagues, relentlessly pursued this – “You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.” (Stanford commencement speech)
  • He was a perfectionist – “When you’re a carpenter making a beautiful chest of drawers, you’re not going to use a piece of plywood on the back, even though it faces the wall and nobody will ever see it. You’ll know it’s there, so you’re going to use a beautiful piece of wood on the back. For you to sleep well at night, the aesthetic, the quality, has to be carried all the way through.” (Playboy)

NB: The quotes above were taken from this Wall Street Journal article.

In Gruber’s words “Jobs’s greatest creation isn’t any Apple product. It is Apple itself.”

In 14 years he took Apple from near-bankruptcy to – briefly – the biggest company in the world by market capitalisation. He has been enormously successful. And while possibly unique – his methods run counter to textbook advice on how to run an organisation – a lot can be learned from him.

The thing I have taken most from this is Jobs’ uncompromising nature. If people weren’t on board with him, then to hell with them. This of course led to his dismissal from Apple in 1985. And his dogged focus on his preferences has informed his fashion choices over the years, as the above picture illustrates.

It might seem strange for a market researcher to take this away, particularly since research is stereotyped as decision-making by committee – something which Jobs despised:

  • “We think the Mac will sell zillions, but we didn’t build the Mac for anybody else. We built it for ourselves. We were the group of people who were going to judge whether it was great or not. We weren’t going to go out and do market research. We just wanted to build the best thing we could build.” (Playboy)
  • “For something this complicated, it’s really hard to design products by focus groups. A lot of times, people don’t know what they want until you show it to them.” (BusinessWeek)

Unfortunately, this stereotype is often true, and I have been guilty of perpetuating it on occasion.

One example was when trying to get a project up and running (on a far smaller scale than rescuing Apple, admittedly). With a lot of stakeholders, I tried to include as many of their wishes and requests as possible. The end result was bloated, incoherent, unfocused and over deadline. It wasn’t one of my finer moments.

Rather than bolt everything on, I should have appraised all the input and only included that which remained pertinent to the core objective. I lost authorship of the project, and it suffered.

While there will be counter-arguments, many public failures do seem to be the result of committee-made decisions. Two bloated, incoherent examples that immediately spring to mind are Microsoft Office 2003 and the Nokia N96. Conversely, there are many examples of visionary micro-managing leaders that have driven a company to success – Walt Disney, Ralph Lauren and Ron Dennis to name but three.

I am a researcher rather than a consultant, and so don’t intend to fully adopt this approach. However, it appears that there is a greater chance of success when primary research or stakeholder input informs, rather than dictates, the final decision.

Steve Jobs knew this. His flagship products weren’t revolutionary (IBM, Microsoft, Nokia and the like were the primary innovators). But his genius was in refining a variety of inputs and stimuli, and moulding them into an expertly designed final product.

And that is something to aspire to.

sk

Overhauling the agency pricing model

Agencies are potentially losing out on beneficial and worthwhile commissions due to a fundamentally flawed approach to pricing their work.

(Note: My experience with pricing is almost exclusively tied to research agencies but I think this is broadly applicable to all industries).

Projects are commissioned when there is agreement between what an agency is willing to offer, and what a client is willing to pay.

My issue is that both of these components are based on cost.

Instead, they should be based on value.

£1 price tag

The agency side

The current model

Looking at the agency side first, it is clear that the focus upon cost makes the process far more transactional than it should be.

Using a dodgy equation (channelling John V. Willshire, who does this sort of thing far better):

P = d + αi + βt + p, where P ≤ B

In English: Price = direct costs + a proportion of indirect costs/overheads + an estimate of the time spent + profit, where price is less than or equal to the client budget.

(The alpha sign has arbitrarily been assigned to mean a proportion, and beta an estimate)

d + αi + βt can be simplified to C for costs. Thus:

P = C + p, where P ≤ B

Explaining the equation (this can be skipped if you trust me)

Of course, this is an oversimplification (though if agencies don’t use timesheets then the equation will lose the time segment and become even simpler) but it does explain the majority of the considerations.

Competitor pricing will be a factor. Market rates are to an extent set by those that have offered the service – an agency will seek to match, undercut or add a premium to this depending on its relative positioning. This is reflected in the equation through time (premium agencies will generally spend longer on the delivery) and in desired profit.

An agency’s price will miraculously match the stated client budget (or in some instances, come in £500 under – which I don’t understand, since a) I thought psychological pricing had been phased out and b) that spare £500 is not going to cover any contingencies, expenses or VAT that aren’t included in the cost).

However, there are (at least) two things that aren’t yet factored in:

  • Opportunity cost – the cost in terms of alternatives foregone. This isn’t included since the only time you can really be sure that new requests for proposals appear is at the end of the financial year. Otherwise – for ad-hoc project work at least – there is no way to accurately predict the flow of work.
  • Competitive bidding – where profit is multiplied by the expected success rate to give expected profit. While guesses can be informed by previous success rates, I don’t rate this approach since a) closed bidding processes mean competitor bidding strategies are unknown and b) perceived favourites are just that – perceptions (for instance, an incumbent may be secretly detested)

So what does this mean?

Ultimately, an agency will only submit a proposal if they think the profit they will make is worthwhile. The above equation can be reframed to reflect this:

p = P – C, where P ≤ B

Or profit is price minus cost.
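
As a minimal sketch of this cost-plus logic (the function names, variable names and figures below are my own illustrations, not anything prescribed above):

```python
# A minimal sketch of the cost-plus model: P = d + αi + βt + p, viable only if P <= B.
# All names and numbers are illustrative assumptions, not taken from the post.

def cost_plus_price(direct_costs, overhead_share, time_estimate, profit, budget):
    costs = direct_costs + overhead_share + time_estimate  # C = d + αi + βt
    price = costs + profit                                  # P = C + p
    return price if price <= budget else None               # only submitted if P <= B

# Reframed, the profit available from a fixed budget is p = P - C
print(cost_plus_price(direct_costs=10_000, overhead_share=3_000,
                      time_estimate=5_000, profit=4_000, budget=25_000))  # 22000
```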

And this is where my main problem is with agency pricing. Profit is expressed purely financially.

Undoubtedly, finance is crucial. An agency requires cashflow to operate; it cannot survive solely on kudos. But it shouldn’t be the sole consideration.

What I think should be included

Value should be added to the equation.

An agency should think not only about the financial margin, but about the business margin.

In addition to revenue, an agency can receive:

  • Knowledge – will the project increase knowledge of markets, industries, processes or methodologies that can be applied to other projects in future? This can be used to improve the relevance of business proposals, or be incorporated into frameworks of implementation
  • Skills – is the process repeatable, which can create future efficiencies? Does the project offer opportunities for junior staff to train on the job? If so, savings in training and innovation can be made
  • Reputation – will the results of the project be shared publicly – in testimonials, trade press, the conference circuit or otherwise? If the agency is fully credited, there is PR value in terms of profile and attracting new business
  • Follow-up sales – will the project lead to additional work, either repeating the process for another aspect of the business or in up-selling follow-on work? Again, this can save on business development and can offer some future financial assurances (which will influence the amount of money borrowed and subsequent interest paid)
  • Social good – perhaps not as relevant for those in commercial sectors, but will the project create real and tangible benefits for a community – referencing Michael Porter’s concept of shared value

Thus, project gains are far more than financial. These intangible benefits should be applied as a discount to financial profit.

Dodgy algebra (this can be skipped unless you want to pick holes in my logic)

The net gain from a project would be:

N = p + β(k+s+r+f+g)

That is, the net gain from a project is profit plus estimated gains in knowledge, skills, reputation, follow-up sales and social good (note that these factors can be negative or zero as well as positive). These can be simplified as intangibles:

N = p + I

These intangibles offer alternatives to financial profit. Increasing the amount that can be gained effectively increases the budget:

P = C + p, where P ≤ B + I

Assuming that an agency won’t offer psychological pricing, we can set P = B. The equation then becomes:

B + I = C + p

Substituting price back in for budget, and rearranging, gives:

P = C + p – I

However, this assumes that the entire surplus is passed on to the client. Obviously, this shouldn’t be the case, but equally the agency shouldn’t keep all of the surplus. Instead, I propose that a proportion of the benefit is passed on to the client via a discount (in order to make the agency more competitive and improve its chances of success).

Value is therefore a function of profit and discounted intangible gain:

V = fn(p – ɣI), where gamma (ɣ) is the proportion of the intangible gain passed on as a discount

What this means – the conclusions bit

All of this long-winded (and probably incorrect) algebra effectively means the equation

P = C + p

becomes

P = C + V

Value is substituted for financial profit.

I believe that the price an agency charges should be a reflection of their costs and the overall value that is received from the profit – both in tangible revenue and intangible benefits. Some of these benefits should be passed on to the client in the form of a price reduction, in order to make the bid more competitive and improve chances of success.

This also works in the converse. If there is a project that an agency isn’t enthusiastic about – it might be laborious or for an undesirable client – then the intangibles are negative and so profit needs to increase in order to make the project worth undertaking (in a purely financial equation, this means costs will need to fall within a fixed price/budget).
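
As a rough sketch of how this value adjustment might work in practice (the gamma value, intangible categories and figures below are arbitrary illustrations, not a prescribed method):

```python
# Sketch of the value-based price: V = p - γI, so P = C + V = C + p - γI.
# Gamma is the proportion of intangible gain passed on to the client as a discount.
# All categories and figures are illustrative assumptions.

def value_based_price(costs, target_profit, intangibles, gamma=0.5):
    # intangibles: estimated gains in knowledge, skills, reputation, follow-up
    # sales, social good etc. – each can be positive, zero or negative
    total_intangibles = sum(intangibles.values())       # I
    value = target_profit - gamma * total_intangibles   # V = p - γI
    return costs + value                                 # P = C + V

# Attractive project: positive intangibles discount the bid, making it more competitive.
print(value_based_price(18_000, 4_000, {"knowledge": 2_000, "reputation": 1_500}))  # 20250.0

# Undesirable project: negative intangibles push the required price up.
print(value_based_price(18_000, 4_000, {"laborious work": -3_000}))  # 23500.0
```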

I should also make it explicit that I am not advocating a purely price-driven approach to bidding. Other factors – communicable skills and expertise, vision and so forth – are still vital. The reality is that markets are highly competitive, and price (or more accurately, the volume of work that can be delivered within a fixed budget) will be a large factor on scorecards used to rate bids.

The client side

This section doesn’t require algebra (fortunately).

My main issue with client budgeting is that it concentrates only on purchasing outputs. While tangible, these outputs (at least in research) are a means to an end. A client may want eight groups and transcripts, or a survey and a set of data tables, but the client doesn’t want these for the sake of it. They are purchased to provide evidence to validate or iterate a business process.

Therefore, I believe the client budget should be split into two.

  • The project budget – the amount that a client is willing to pay for the tangibles – the process required to complete the delivery of the project. These outputs are outcome-independent.
  • The implementation budget – which is outcome-dependent. The complexity or implications of a project are often unknown until completion. A project could close immediately, or it could impact critical business decisions in nuanced ways. If the latter, additional resource should be assigned to ensure the business can best face any challenges identified.

The majority of costs are incurred in the project, but the real value to the client comes in the implementation. This needs to be properly reflected; it currently isn’t.

Effectively, I propose a client should commission an “agency” to manage the project and a “consultancy” to manage the implementation. These could be the same organisation, but they needn’t be.

Wrapping up

There are undoubtedly things I have overlooked, and I’m pretty sure my algebra is faulty.

However, I believe my underlying hypothesis is valid. The current agency pricing model is flawed and needs overhauling because

  • Agencies ignore non-financial benefits
  • Clients ignore implementation requirements

Both of these are easily correctable, and these corrections can only improve the process.

sk

Image credit: http://www.flickr.com/photos/chrisinplymouth/3222190781

Google Firestarters #2 – Design Thinking

The second Firestarters event, hosted by Google and curated by Neil Perkin, was an excellent evening – probably even better than the first. There were lots of interesting people to speak to and debate with in the break-out session and afterwards, while the Google catering is unrivalled. I’m amazed the staff aren’t twice the size they are, given the volume of cupcakes around.

The primary reason for the quality of the event is the speakers. Both were very interesting.

Tom Hulme (IDEO)

Tom talked about design thinking as a set of beliefs. He advocated it as a form of divergent thinking. Strong companies that perform well tend to be good at optimising and being efficient in their areas of expertise. Creativity in opening up new avenues can bring new aspects into a business, which they can subsequently optimise, renewing the cycle. Traditionally these would have been consecutive, but with things moving so quickly they should now be concurrent.

Tom’s 8 steps for design thinking are

  1. Challenge the question
  2. Be user-centred (and do so in context. Focus groups are not the place to introduce ideas)
  3. Look to extremes
  4. Messages or experiences? The answer is both – they are coherent.
  5. Be holistic – the business model and marketing model are now indistinct from one another
  6. Value diversity
  7. Launch to learn – prototyping is now redundant as it is so cheap to launch and run A/B tests
  8. Stay in beta

Tom is a very charismatic speaker and came up with wonderful examples – from Sneakerpedia as an example of message and experience combining, to Steve Jobs’ calligraphy course as an example of diversity, to his open document containing useful tips for start-ups.

He also ended with a great quote: “Looking at why people really hate stuff is wonderful inspiration to come up with new ideas”

John V Willshire (PHD)

John is well-known for his unique analogies, and he didn’t disappoint with a seamless weaving of Bad Religion and Adam Smith.

John was a counterpoint to Tom, in that he argued the case against process. Channelling Bruce Nussbaum, he said that companies are only comfortable with design theory when it is packaged as a process. And then they are principally purchasing the process, rather than the idea or outputs themselves. Real work, in other words.

Process might make bad things good, but it also makes great things merely good. It levels things out into mediocrity.

When Adam Smith discussed the division of labour, he noted that the benefits to industry would be in dexterity, time and technology. However, he noted that this process wasn’t applicable to agriculture due to its unpredictability and variety. As John noted with regard to marketing agencies, “The sell is industrial. The work is agricultural”.

sk

Image credit: http://www.flickr.com/photos/dunechaser/3339729380

Dynamic Knowledge Creation Model

The Dynamic Knowledge Creation Model was created by Nancy Dixon, building on the work of Ikujiro Nonaka. It refers explicitly to how organisations deal with knowledge, though other academics have noted its relevance in other fields.

Nonaka posited that there are four processes of knowledge creation that link across tacit and explicit knowledge. These are illustrated below.

SECI model diagram.

This shows that the four processes are

  • Tacit to tacit knowledge – acquired through conversation and socialisation. It may not be the primary subject of the conversation, but new data points can be joined up in new ways to create additional meaning
  • Tacit to explicit knowledge – this can be again acquired through conversation or another form of communication, but in this instance the transference is intentional
  • Explicit to explicit knowledge – where multiple data sources are combined in intended ways, to create additional understanding that can be greater than the sum of their parts
  • Explicit to tacit knowledge – where individuals take things they have learnt and apply them to their thinking and actions

In Rachel Bodle’s article, she combines this with Dixon’s thinking to come up with the composite diagram below.

A Model of Dynamic Knowledge Creation

The diagram shows that there are four types of knowledge assets within an organisation (or individual)

  • Routine knowledge (explicit to tacit) – learning by doing
  • Experiential knowledge (tacit to tacit) – judgement of individuals
  • Conceptual knowledge (tacit to explicit) – frameworks and models to utilise
  • Systemic knowledge (explicit to explicit) – editing and synthesising multiple sources

Market research agencies traditionally reside in the conceptual sphere – they take the tacit knowledge of stakeholders and the target audience and convert it into meaningful, actionable recommendations and frameworks. The best agencies will frame their solution in such a way that makes it transferable beyond the confines of the specific brief.

However, there are also opportunities for agencies to assist organisations in the other areas

  • Routine knowledge – research may not necessarily help people or departments do their jobs better. But in certain circumstances, research tools extend into these areas. Workshop debriefs can walk through the practical implications of implementing the findings, ideally in a real situation. An example of this would be in processing and responding to consumer feedback.
  • Experiential knowledge – debriefs shouldn’t be reserved for the immediate stakeholder. By inviting everyone within an organisation, those inquisitive minds with a gap in their schedule can listen to the findings. There may not be any obvious, explicit benefit, but the opportunity for serendipity arises.
  • Systemic knowledge – this is traditionally the preserve of the client but, with resources increasingly stretched, some are looking to outsource it. Good research agencies should already be doing this – surveys and focus groups don’t reside within a black box. Secondary data collection and bricolage solutions using cost-effective online tools (the precise ones depend on the nature of the brief) should be prerequisites in complementing the core research offering.

I’ve only recently become aware of these models, but I’ve already found them extremely useful in reframing the nature of my projects. Organisations thrive on knowledge. It can only be a good thing if I can identify additional means of helping them harness and apply it.

sk

Legacy effects

Earlier this week Seth Godin blogged about legacy issues. He stated that “The faster your industry moves, the more likely others are willing to live without the legacy stuff and create a solution that’s going to eclipse what you’ve got, legacies and all.”

That might be true, but legacy effects are just as prevalent on the consumer side as the production side, and they should be recognised and incorporated as far as possible.

For instance, early digital cameras didn’t contain a shutter sound. After all, it doesn’t need one – the noise was merely a byproduct of the analogue mechanism. Nevertheless, early users felt a disconnect – the noise had let them know when their photo had been taken. Hence digital cameras all now have the option for the shutter sound to be incorporated.

Legacy effects are also present in our naming conventions – records, films and so on. I suspect this may also soon apply to the device we carry around in our pockets and handbags.

Our contracts and pay as you go credits are currently with phone companies, and so the “mobile phone” name still makes sense, even when on smartphones the phone is “just another app” (and not a regularly used one at that). But with Google looking at unlocked handsets, and the introduction of cashless payments through NFC, the business models may soon be changing. I suspect that if Visa starts selling devices that allow you to make payments as well as contact people, they will initially call it a “mobile phone” rather than a “mobile wallet”.

Behaviours are also subject to legacy effects – our habitual purchases that we continue to make without consideration. Some companies (like AOL) benefit from it, while others can suffer. For instance, I have only recently purchased a Spotify subscription and am considering a Love Film trial. From a purely economic standpoint I should have done this a long time ago, but I’ve been wedded to the idea of needing to own something tangible. Digital distribution means this isn’t necessarily the best option anymore (I type this as I look at shelves full of DVDs that I will need to transport when moving flat).

Consumers on the business-to-business side aren’t immune from this either – witness the continued reliance on focus groups or the thirty-second spot. These are undoubtedly still effective in the right circumstances, but some budget holders can be extremely reticent to leave traditional, tried-and-trusted methods even when faced with reliable evidence that an alternative could prove more effective.

So while some companies can benefit from removing their legacy attributes early, doing so too early may be counterproductive. The comfort of sticking with what one knows can be very powerful, no matter how irrational it can seem.

sk

The importance of evaluation

The control element is a vital stage in project management, occupying a core position in frameworks such as APIC (analysis, planning, implementation, control). Broadly, it covers two distinct elements – monitoring and evaluation. From my perspective, the latter of these has been grossly overlooked.

To some extent, monitoring is the easier of the two, as it focuses a project manager on visible outcomes that link to key performance indicators. At the basic level, assets (principally time and money) are monitored, and performance (output, sales etc) is assessed to ensure a project is on track, and that the iron triangle is in balance.

So far, so good.

A project evaluation should cover not only this but far more. Unfortunately, it seems that evaluations rarely go beyond the additional measurement of some outcomes or intangibles (satisfaction, brand reputation etc).

A proper evaluation should not only measure the what, but strive to understand the why.

Specifically, project managers need to go beyond the self-serving bias. A project manager shouldn’t take the credit for all the success, and attribute the blame externally in the case of failure.

A full project evaluation is crucial irrespective of the outcome, whether success, failure or indeterminable (and the latter shouldn’t exist).

If a project is a success, laurels shouldn’t be rested upon. The recent HBR article on Why Leaders don’t learn from success is fascinating in this regard. All aspects of a project should be critically assessed – was success down to luck, competitor failure/inaction, or were the critical success factors actually internal? Furthermore, a project will never be without issues – these should be identified, and remedies put in place to stop them recurring.

Likewise, failure shouldn’t be a blame game. A project is rarely an unmitigated failure. As Seth Godin writes in Poke The Box, failure should be celebrated at some level – it’s better to attempt a risk than to do nothing. After all, you can only win the lottery by playing it.

Obviously, celebrating success is a morale booster and this should continue. But a bit of critical thinking is vital to long-term development. By learning as much from the past as we can, we can better reshape the future.

sk

Image credit: http://www.flickr.com/photos/paulk/5131407407

The Beyond Bullet Points guide to presentations

To ease myself into my goal of doing more stuff, I’ve read my first book for a little while – Beyond Bullet Points by Cliff Atkinson.

Overview

cover for Beyond Bullet Points, written by Cliff Atkinson and published by Microsoft

The book’s tagline is Using PowerPoint to Create Presentations that Inform, Motivate, & Inspire. Given that the book’s publisher is Microsoft, it is part guide to structuring a presentation and part PowerPoint manual. The book almost explicitly delineates itself into those two sections. Of the two, the latter half is pretty weak – even beginners don’t need a half-dozen pages on the virtues of Clip Art.

Nevertheless, I did pick up some useful PowerPoint tips. Aside from relatively minor things (for instance, I didn’t know that you could hold shift when resizing an image to retain the proportions), the main thing I took was that I should make more of the three separate views in PowerPoint. I normally stick to the normal view, but it is true that the note layout can be used to convert the document to a handout, and the slide sorter layout can equally function as an executive summary.

To be fair, the book was worth reading for that realisation alone. Furthermore, the first three chapters – on structuring a story – were also very interesting. Atkinson is a big proponent of the rule of three, and thus it is apt that his style has three primary influences.

Influence 1 – Hollywood

The Hollywood influence is that an output requires a process. In this instance, the three key milestones all correspond to the three key PowerPoint views.

  1. The script – the script of the piece is written out in long-hand, including stage directions. A presentation is slightly different in that detailed notes aren’t as important as headings. These summarise and navigate the content. The remaining components of a presentation (the flow, notes and graphics) stem from the headings.
  2. The storyboard – the scenes are stitched together. Headlines are sorted and resorted to give the optimal flow. A presentation should have consistent pacing – multiple slides rather than builds should be used in order to manage the pace and keep notes distinct.
  3. The production – only when the individual components are planned can the production fully commence. Likewise, the slides and visuals should be the last aspect of a presentation that is completed. Within this, there are three further points to bear in mind
  • If revealing or teasing the answer in the introduction (which Atkinson advocates), then always start with the most important point as cascading conclusions require strong justification
  • Constantly remind the audience of the purpose of the presentation, and use active and personal language to assist in persuasion
  • Use consistency and repetition throughout the presentation, including variations on a theme

Influence 2 – Aristotle

As part of the power of three, Atkinson obviously refers to Aristotle’s three act structure. He has embellished this slightly, and in fact has created a quite useful template that you can download from his website.

  1. Act 1 is an appeal to emotion, whereby the story – the setting, protagonist, imbalance, balance and resolution – is set up
  2. Act 2 turns to reason, and justification for the solution. Within this, there should be three key points of descending importance. The depth and detail of each point depends on the length of the presentation. This section is a dynamic interplay between the questions of “how” and “why”, with one answering the other and vice versa.
  3. Act 3 ties the previous two acts together, framing the reasoning for the reiteration of the crisis, the solution, the climax and resolution

Influence 3 – Mayer

Richard E. Mayer has written extensively on multimedia learning theory, and ten of his principles are outlined in the book to justify why slides should be visuals and headlines, with the spoken details in the notes pages:

  1. Multimedia principle – people learn better with words and pictures than words alone
  2. Redundancy principle – people understand better when words are presented as verbal narration alone, instead of both spoken and on screen
  3. Segmentation principle – people learn better in bite-sized chunks
  4. Signalling principle – people learn better when information is presented using clear outlines and headings
  5. Personalisation principle – people learn better when conversational rather than formal
  6. Spatial contiguity principle – people learn better when words are near pictures
  7. Coherence principle – people learn better when extraneous information is removed
  8. Modality principle – people learn better from animation and voiceover than animation and text
  9. Temporal contiguity principle – people learn better when animation and narration are simultaneous rather than successive
  10. Individual differences principle – people learn better when prior knowledge, visual literacy and spatial aptitude are taken into account

Conclusion

I’d recommend this book with a caveat – understand what it is (and who published it) before deciding whether you want to read it. Around 60% of the book is pretty basic PowerPoint advice, and the style of presentation is much more American than European (I don’t think the sailing motif would work so well in London). However, I found the explanation of Atkinson’s structure to be very clear and useful and the chapters on storytelling are certainly worth reading.

sk

FYI, I haven’t applied these principles to this blog post, since this is evidently a different medium.

A little less information, a little more action

My New Year’s resolution was to cut the current – to step away from the real-time information flow so that I can spend more time thinking and reflecting.

The first part of this has gone very well. The second part hasn’t – though I have reflected enough to realise that a third, related, aspect should have been included in the resolution.

I’ve successfully stepped away from the real-time more out of necessity than choice – my schedule has been unrelenting for several months now. I’m hoping that this will soon change, and that I can spend more time on both reflection and the overlooked aspect.

Before I divulge that, a brief review of three months of being more distanced.

On the whole, I’m happy with the decision. I may be less active on social media nowadays, but I’d argue I’m more efficient (albeit starting from a low base).

Despite potential benefits around phatic communication, the online signal-to-noise ratio problem is well known.

Arguably a deeper problem is in filtering the signal strength – not all useful or relevant information is equal. What seems meaningful or resonant at the time can quickly turn out to be transitory or inconsequential.

I sincerely doubt that I’m now more discerning or incisive in my reading choice, but I do feel like my filtering of priority information has improved.

To give an example, I have a broad interest in technology and social media. As a consumer researcher, I need to understand trends, and ideally identify them before they reach critical mass.

But realistically, Austin is so far removed from the Home Counties that the information is largely superfluous. Beyond a basic knowledge of what the likes of Beluga, Color, Path, Groupme, Quora, Instagram et al are providing (not least to see if they would be relevant to my atypical needs), I don’t need to know any more about them.

At least not yet. Do you know the proportion of the UK population that has heard of Foursquare? Not used, but heard of. How about Quora? The figures are 5% and 1% respectively (data comes from the digital media tracker I run).

They may morph into the next Twitter, but they may not. Furthermore, it isn’t the products or technologies I’m interested in, but the behaviours – Kevin Kelly has a nice diagram of benefits vs. company. And consumer behaviour (let alone attitude) is pretty slow to shift.

Wired’s list of the top 10 tech start-ups of 2008 doesn’t fill me with confidence. Only LinkedIn (21% UK awareness) and AdMob are relevant to me. That’s a 20% success rate from a small sample size – it would be much lower if you counted every company on Wired’s radar.

The slow speed of shifting attitudes and behaviours is why so many of the “classic texts” – Ogilvy, Ries, Drucker, Peters, Collins, Covey, Pink, Gladwell etc – are still relevant.

Shamefully, I’ve read very few of these. This will hopefully be rectified as I make better use of the time spent away from the firehose.

Once I improve upon this, I can move to the next piece of the puzzle.

Doing.

It is good to improve upon my sources of reading, but it is also a very limited ambition. In the same way that innovation builds upon invention, I should seek to create a practical outlet for my reading. Ideas are good, execution is great.

Given that I deal with knowledge and information, my definition of “doing” is going to be far narrower than that which Neil Perkin has been excellently espousing. But the likes of Noah Brier, Neil Charles and Rich Shaw have shown that it is possible to merge technical proficiency with clarity of thought.

My short-term goals are going to be small-scale – I haven’t managed to port my blog over to a .com address (admittedly, procrastination has been the main obstacle) so I’m not going to be coding any apps.

But even a better understanding of Microsoft Office will help me improve as a researcher – both through more efficient uses of what I already know and the introduction of new functionality (macros?). Reading informs me of the overt or already discovered trends or approaches, but merging reading and doing widens the scope to not only think of something new, but to actually implement it.

This entry also acts as a good excuse to repost this Dolph Lundgren video.

Should pitching be a classical recital or a jazz improv?

I’ll dispel any lingering suspense from the title by saying that it is a false dichotomy. Both can be suitable in different circumstances, though I lean more towards the latter.

Over the past few months I’ve been involved in quite a few pitches – sitting on both sides of the table.

The obvious thing that all pitches need is preparation. Lots of it. But there seem to be two broad approaches (note the emphasis: the rest of the post contains exaggeration).

1. The orchestral recital

This treats the outcome as fixed. Overt preparation goes into perfecting a repeatable performance.

This can be fine if you know exactly what your audience wants, and your audience knows exactly what it is getting. But it can also be a bit obvious. Perfectly pleasant, but not inspiring. It is not necessarily one-note, but it is one performance.

In a business sense, it could be a face-to-face pitch that follows a written proposal. But unlike a concert, the ticket isn’t bought and the relationship isn’t cemented – thus the dangerous assumption that you know exactly what your prospective client wants could backfire if there is a miscommunication along the way.

2. The jazz improv

The opposite end of the false spectrum is improv riffing. Here the preparation is more covert. All the pieces and mechanics are meticulously prepared, but there is no set way to put them together.

This enables a flexible performance to adjust and adapt to the mood of the room. But it still requires a fulcrum or groove to maintain structure and avoid obfuscating the issue.

This approach is more applicable to business development meetings. There may not be a set agenda, so the seller has to adapt to the needs of the prospective client. The challenge is to make the covert preparation overt where applicable, through the introduction of easily digestible and memorable products or concepts.

The combination

Clearly, the optimal solution will be a combination of the two approaches – the relative weight depending on the specific circumstances. Across these, there are a few key things to remember.

  1. Prepare. And do lots of it.
  2. Create a skeleton structure that can be expanded or contracted to fill the available space. There may not be a need to talk at someone for 30 minutes, but you should be able to fill any empty space if required
  3. Don’t plan to communicate everything that is prepared – always leave things behind that can be brought to the fore if the conversation moves that way
  4. If you can’t answer, at least respond – there is always the possibility of an intentionally tricky question. Acknowledge it but deftly segue into a related area that can more comfortably be answered.
  5. Prepare multiple scenarios – don’t plan for a single performance, plan for a residency

sk

Image credit: http://www.flickr.com/photos/joelwashing/3108694945/

Predictions for 2011

In the grand tradition of December blog posts, here are seven predictions for 2011:

<sarcasm filter>

  • A niche technology will grow
  • Businesses will focus less on the short-term bottom line and more on consumer needs for a long-term sustainable relationship
  • Traditional media/methods will take several more steps closer to their death
  • Social media will become more important within organisations
  • Companies will banish silo thinking and restructure around a holistic vision with multi-skilled visionaries at the helm
  • The product will be the only marketing needed
  • A company will launch a new product with these precise specifications…

</sarcasm filter>

1999 A.D. / Predictions From 1967

I think the tone and style of my predictions are about right. They run the spectrum from bland tautology to wild guesswork with plenty of jargon and generalisation thrown in.

Given how utterly useless predictions are, why do people persist? I presume they pander to people’s love of lists while gambling on their inherent laziness in not checking the accuracy of previous predictions, and hoping that, as with horoscopes, people read their own truths into open statements.

I’ve had the displeasure of running across numerous offenders in the past month. I won’t name check them all but, unsurprisingly perhaps, the tech blogs are the worst offenders. This example from Read Write Web and these two examples from Mashable are particularly mind-numbing in both their blandness and unlikeliness.

Living on the bleeding edge can massively skew perspective. I’m sure Cuil (remember them?), Bebo and Minidiscs have all featured in predictions of game-changing technology. In other past predictions, you can probably swap “virtual reality” for “augmented reality” or “geo-location”, or Google for Facebook or Twitter, and recycle old predictions for different time periods.

The basic truth is that the future is unpredictable. We are micro participants trying to define macro trends. A reliance on logical step-progression completely overlooks the serendipity and unanticipated innovation that characterises long-term trends, which constantly ebb and flow as tastes change and rebound against the status quo.

Take popular music as an illustration. The most popular acts of one year don’t predict the most popular acts of the following year. Tastes evolve (and revolve) with pop, rock, urban (I intensely dislike that word but can’t think of a better one), electronic and dance being in the ascendency at different points in the past twenty years.

With honourable exceptions, business and technological breakthroughs are revolutionary rather than evolutionary (note I have quite a wide definition of revolutionary). To give some examples

  • 2 years ago, how many people would have predicted that an online coupon site would be one of the fastest-growing companies of all time?
  • 5 years ago, how many people would have predicted that a social network would be the most visited website in the UK?
  • 7 years ago, how many people would have predicted that company firewalls would be rendered obsolete by internet-enabled phones?
  • 10 years ago, how many people would have predicted that Apple would change the way mobile phones are perceived?
  • 15 years ago, how many people would have predicted that a search engine would dominate advertising revenues?
  • 20 years ago, how many people would have predicted that every business would need a presence on the internet?

Undoubtedly, some people would have made these predictions. But to use the well-worn cliché, even a stopped clock is right twice a day.

Despite my negativity, I recognise that there are some benefits to offering predictions. It opens up debate around nascent movements and trends and adds to their momentum, and provides a forum for authors to say where they’d like things to be in addition to where they think things will be.

If only so many weren’t so badly written.

(NB: I recognise that by saying this I open myself up to accusations of poor writing, to which I fully admit)

sk

Image credit: http://www.flickr.com/photos/blile59/4707767185/