
Knowledge bank

Showing 3 of 3 papers

Human factors in B2B Research over the Internet
26/02/2008
The Internet promises to become a commonplace feature of doing business in the near future. In a recent telephone study conducted by DVL Smith Ltd among a representative sample of UK small and medium-sized businesses, over two-thirds of all businesses had Internet access. The US-based Boston Consulting Group predicts that by 2003 one quarter of the world's business-to-business purchases will be made online, an annual growth rate of 33% over the five years from 1998 to 2003.

This paper explores the problems and challenges surrounding the conduct of research via the Internet among business audiences. It highlights the great potential that exists for business-to-business research over the net, identifies some of the main obstacles to researching in this way, examines the factors that cause them, and shares the authors' experience of a number of methods they have successfully employed to overcome them.
Read paper >       Download PDF >
IF IT AIN'T BROKE FIX IT ANYWAY - Effectively incorporating usability studies into market research programmes
25/02/2008
This paper examines the increasing use of usability studies to aid the development of new media products and services (predominantly websites) and asks whether the growing distinction of usability as a discipline in its own right is counterproductive to today's requirement to provide more holistic and integrated marketing research services. Specifically, we look at how usability research is conducted both by ourselves and by other agencies, and at the consequences of sacrificing broader user-testing studies and NPD research in order to focus on ensuring the perfect functionality of a product through isolated usability testing.
Read paper >       Download PDF >
On-line surveys - respondent quality assessment
10/08/2009 | By Andrew Elder, Illuminas Austin
This paper was first presented at the 2009 Sawtooth Software conference in Delray Beach, Florida, by Andrew Elder, Vice President of Marketing Sciences for Illuminas in Austin, TX, and Terry Pan, Illuminas Austin MS Manager. The paper was called Survey Quality and MaxDiff: An Assessment of Who Fails, and Why.

The authors use MaxDiff (a derivative of conjoint analysis) as a tool to assess respondent quality, since it implicitly measures consistency across multiple comparisons. Because this consistency can be standardised across topics and audiences, a meta-analysis of 8 MaxDiff studies helps to paint a more specific picture of the low-quality respondent. Based on this analysis, "speeding" through a survey and "straight-lining" ratings questions are shown to have the most significant negative relationship with MaxDiff performance.

When poor performers in each task are overlaid, it becomes clear that there is a hierarchy of quality in which people who fail at multiple question types are objectively providing the lowest data quality. However, individuals who fail at a single task (e.g. those who "speed" without "straight-lining" or showing poor MaxDiff consistency) perform relatively well in the remaining survey components. These insights serve as a cautionary tale against the overzealous rejection of questionable respondents, as this risks biasing results toward those who answer questions in a particular way. By using a variety of question types, researchers are less likely to exclude individuals who favour one response style over another. Overall, the authors found that between 1% and 4% of individuals justify rejection, varying by study.

You can download the pdf presentation below. For more information please contact
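The two quality checks discussed above, speeding and straight-lining, can be sketched in a few lines of code. This is a minimal illustrative sketch only, not the authors' actual method: the function names, the median-based speed cut-off, and the sample data are all assumptions introduced here for clarity.

```python
# Illustrative respondent-quality checks (assumed heuristics, not the paper's method).
from statistics import median

def flag_speeders(durations, fraction=0.5):
    """Flag respondents whose completion time falls below `fraction`
    of the median survey duration (an assumed cut-off)."""
    cutoff = median(durations) * fraction
    return [d < cutoff for d in durations]

def flag_straightliners(grids):
    """Flag respondents who give the identical answer to every
    item in a ratings battery (grid question)."""
    return [len(set(grid)) == 1 for grid in grids]

# Hypothetical data: completion times in seconds, and one ratings grid per respondent.
durations = [300, 90, 280, 310, 295]
grids = [[3, 4, 2, 5], [5, 5, 5, 5], [1, 2, 1, 3], [2, 3, 4, 2], [4, 1, 3, 5]]

speeders = flag_speeders(durations)
liners = flag_straightliners(grids)
# Mirroring the paper's "overlay" idea: only respondents who fail
# multiple checks are treated as rejection candidates.
reject = [s and l for s, l in zip(speeders, liners)]
```

Here only the second respondent fails both checks, echoing the paper's point that failing a single check alone is weak grounds for rejection.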
Read paper >       Download PDF >