Wednesday, 17 December 2008

v2.0 of Market Research 2.0

Another year has gone by and 2009 is around the corner.  
Colin Stein, Director of Marketing at ResponseTek, made me aware of the fact that whenever you google "market research 2.0", my post of a year ago still pops up first.  So he asked, quite rightly, in a comment: 
...has the Market Research 2.0 discussion not been followed up by further discussion? 
Hasn't it? 
He continues in his comment: 
... The legacy of market research is one of batch-based results, organizational silos of information, and executive analysis with little or no connection to the front-line. Your definition opens up the idea of research to the community, and the use of contemporary tools to enable real-time insight collection and knowledge sharing...
So what has changed over the past 12 months?  Well, Colin does give some insight into the answer:
 ... customer experience management and enterprise feedback management software [...] has forced an acknowledgement of the validity of customer feedback as primary data.  
He's quite right!  And much to my delight he adds a prediction for 2009: 
... I suspect large organizations will start to see that there actually is something called Market Research 2.0 to embrace and endorse. 
In my original post of  November 2007  I expressed my disappointment about how, at the time, Wikipedia decided that the definition I had added had to be removed as an "unremarkable neologism".  Colin's comment almost made me repost it, but he ends by asking the question which actually needs no answer: do we need Wikipedia's permission first...?
Let me, in a next post, dig into what 2009 may bring us.  Until then, send me an email or post a comment if you have ideas on what 2009 will bring to the market research industry.

Tuesday, 2 December 2008

Trend for 2009: Mapamania

Yep, it's the end of the year again, time to start thinking about what is ahead.  A post listing half a dozen trends for 2009 has just appeared, including what they call MapaMania.   The post wonders whether 2009 will be the year in which all things ‘contextual’, ‘app’, ‘local’, ‘urban’, 'tags', 'lidar', ‘smartphone’, ‘convenience’, 'Cell ID', ‘spontaneity’, ‘infolust’, and ‘GPS’ come together in one orgasmic celebration of map-based tracking, finding, knowing and connecting? 
This brings me back to my previous posts on Location Based Surveys and on the iPhone and market research.   Will 2009 be the year in which market research actually starts working towards location-based surveys?   If we think a bit further about how this may impact our industry, remember that a future version of the iPhone will most definitely include a Global Positioning System (GPS) receiver. This will offer researchers the opportunity to use the respondent's physical location and link it to mobile survey data. That, in turn, offers researchers an opportunity to conduct “point of exposure” data collection centred on event tracking: track the respondent's proximity to outdoor advertising, allowing effectiveness research, probing for recall, etc. The accuracy and precision will only become much greater in the next few years.
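To make the “point of exposure” idea concrete, here is a minimal sketch of a geofence check: fire a recall survey when a respondent's GPS fix falls within a given radius of an outdoor ad site. All coordinates, names and the 50-metre radius are illustrative assumptions, not taken from any real survey platform:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def near_billboard(fix, billboard, radius_m=50):
    """True when the respondent's GPS fix is within radius_m of the ad site,
    i.e. the moment a 'point of exposure' survey could be triggered."""
    return haversine_m(fix[0], fix[1], billboard[0], billboard[1]) <= radius_m

# Illustrative coordinates only: one billboard, two respondent fixes.
billboard = (51.5074, -0.1278)
print(near_billboard((51.5076, -0.1280), billboard))  # fix ~26 m away -> True
print(near_billboard((51.5174, -0.1278), billboard))  # fix ~1.1 km away -> False
```

In practice the fix would come from the handset's location API rather than hard-coded tuples, and the radius would depend on the GPS accuracy the device reports.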
Embraced by eager consumer masses who will flock to anything from friend-finders to lowest-gas-price-locators? Aided by services that already know which street users are on?
Nokia expects half of its handsets to be GPS-enabled by 2010-2012.  As the MapQuests, Navteqs, and TomToms of this world continue to build the necessary infrastructure, devices and apps, any market research company would be stupid not to be partnering or experimenting with these map-based services. 
Why? Geography is about everything that is (literally) close to consumers, and it's a universally familiar method of organizing, finding and tracking relevant information on objects, events and people. And now that superior geographical information is accessible on-the-go, from in-car navigation to iPhones, the sky is the limit.
So, to conclude: the future of market research will undoubtedly bring a new reality:
  • in which the “portable Internet” will provide researchers with more timely, comprehensive and accurate recall of consumer experience, and
  • in which the combination of consumer and product data with occasion-based event information will provide a new way of data collection.

Monday, 1 December 2008

Are you fighting the internal Community battle?

If you work at a market research firm with proprietary online panels, you may very well be working on an internal project right now to move your online consumer panels to a more sustainable Web 2.0 environment: towards research communities, not plain panels.
You may encounter some internal resistance, or perhaps road blocks, and you may need a lot of energy and time to convert those who are not as up to speed on Web 2.0 as you are.  If this applies to you, or if you are simply working on a business case for research communities, I found a great training.  Today I came across a presentation by Joshua Rosh, VP of O'Reilly's InPractice, in which he offers hands-on advice on how to start up social technologies in your company.  His webcast is basically a field guide to bringing social technologies into any organization. It explores how to: 
  1. Make the case: how to bring Web 2.0 concepts into your organization (including convincing upper management)
  2. Fail Forward Fast: how to create effective pilot programs without losing your head (or your job)
  3. Spread the gospel: The key ingredients that make a successful Web 2.0 evangelist
Based on direct consulting experience, and with plenty of hands-on examples, Joshua needs 40 minutes to share the do's and don'ts for all of you who are thinking of introducing social community services to your panels.  It's not focussed on our industry, but I think you'll get the idea and will easily be able to take the general best practices and apply them to your research company.   After the 40-minute presentation, there is a 20-minute Q&A session.