Abstract: Survey scripting and software development have a lot in common, and we should bring testing techniques into survey design. To that end, we have improved Random Data Generation and created a new Tools module called “Verification Scripts”.

SurveyMonkey, Google Consumer Surveys and other disruptive DIY technologies have changed the Market Research industry. Any marketing director can put together an online survey, get sample from a number of panel providers and have answers to their strategic questions within hours.

But Askia software is not designed for marketing directors. It has been conceived for survey specialists, scripters and data processors who design and analyse complex surveys – sometimes long, sometimes algorithmically challenging, run over long periods of time and eventually collecting millions of records. With our target audience in mind, we are continually improving our range of software. We want any design to be achievable – any layout, any number of records. It would be an exaggeration to call it Big Data, but let’s say we specialise in “Medium Data”.

On the subject of interview data, I will only mention that in the last two years we have completely overhauled the way we store data in SQL Server (5.3.3) and introduced a new compressed inverted data format (5.3.4). But I am digressing: the subject of this article is managing complexity methodically.

Managing complexity with Askia's survey software

The challenge with complexity is that it invariably leads to human errors: their number grows exponentially with size, they become harder to spot, and they are often found too late. The thing is, we as programmers know about complexity. Askia software is made up of millions of lines of code and, as some of you may have fleetingly experienced, it sometimes breaks. And, believe it or not, we coders have an aspiration to perfection: we constantly try new methodological or technical ways of testing our software so that it works smoothly the second we release it. But any program that does anything more than sorting three numbers is bound to break, and we have to live with the fact that we will always deliver short of what we wished for – but hopefully learn from past mistakes.

Survey scripting is programming – unfortunately Market Research tools are a little behind (yes, we are aware of our responsibility there). Our first version of Design, in 1994, attempted to mimic the revolution Visual Basic had brought to the programming world in 1991. All basic functionality was available in a Graphical User Interface. We made the layout WYSIWYG while still allowing programming through event-driven scripts hidden from the interface. Our AskiaScript still bears traces of this ancestry, with variables defined with Dim and For…Next loops – I’ll admit that not everybody at Askia thinks that’s a good thing, but that’s the price to pay for backwards compatibility.

Reusability & object-oriented programming by Askia

Reusability is the key to decreasing development time and increasing reliability. For programmers, reusability generally means Object-Oriented Programming. In all of our software, we have tried to include reusable objects: generation settings and Internet options in askiadesign, tab-templates and clones in askiaanalyse, survey inheritance in askiasurf, libraries everywhere. Last but certainly not least, we have created Askia Design Controls (ADCs): they let (advanced) users generate the perfect HTML / JavaScript for each PC / tablet / mobile target, whatever the browser, its version or its operating system. ADCs encapsulate data, they are polymorphic (you can use them on different types of questions and browsers) and, because they are open source, it’s up to you to give them inheritance.

There is another part of programming that we would like to bring to the Market Research industry: testing – unit testing, integration testing, system testing. For the development of AskiaScript 2.0 we designed the tests before we wrote a single line of code – this is called test-driven development (TDD). The number of bugs was minimal for a development of that size. Each time we found a fault, we added it to the list of tests to make sure it would never surface again in subsequent versions.
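
To make the cycle concrete, here is a minimal sketch of TDD in Python – the age-banding function and its expected bands are invented for the example and have nothing to do with AskiaScript itself. The test is written first, it fails, and only then is the code written to make it pass:

    import unittest

    # Step 1: write the test first - it describes the behaviour we want
    # before any implementation exists (and fails until one does).
    class TestAgeBanding(unittest.TestCase):
        def test_respondents_fall_into_the_right_band(self):
            self.assertEqual(age_band(17), "Under 18")
            self.assertEqual(age_band(18), "18-34")
            self.assertEqual(age_band(35), "35-54")
            self.assertEqual(age_band(55), "55+")

    # Step 2: write the simplest code that makes the test pass.
    def age_band(age):
        if age < 18:
            return "Under 18"
        if age < 35:
            return "18-34"
        if age < 55:
            return "35-54"
        return "55+"

    if __name__ == "__main__":
        unittest.main()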

Test-driven development in survey design by Askia

Along with the spec of a survey, there should be a list of tests. These tests should be run by someone other than the survey scripter – and the tester should not peek into the routing code. Different people think differently, which ensures your tests cover more defects. We have put together a non-exhaustive list of tests (a small sketch of the first and third categories follows the list):

  • Interview-level testing: data presence for mandatory questions, skip/routing testing, coherence between questions, testing links and response visibility.
  • Usability testing: testing each screen on every platform.
  • Aggregated testing: making sure quotas are respected, rotations are balanced, multiple questions have multiple responses.
  • System testing: ensuring the survey runs well on the server and that the data you produce is usable.
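
As an illustration of the first and third categories, here is a small Python sketch that runs an interview-level coherence check and an aggregated quota check over a few mock interview records – the question names, quota targets and data layout are invented for the example and do not reflect an Askia export format:

    from collections import Counter

    # Mock interview records - the field names are purely illustrative.
    interviews = [
        {"id": 1, "gender": "Male", "has_bank": True, "credit_cards": ["Visa"]},
        {"id": 2, "gender": "Female", "has_bank": True, "credit_cards": None},
        {"id": 3, "gender": "Female", "has_bank": False, "credit_cards": None},
    ]

    # Interview-level check: anyone who mentioned a bank must have been asked
    # (i.e. have a non-None answer to) the credit-card question.
    for itw in interviews:
        if itw["has_bank"] and itw["credit_cards"] is None:
            print(f"Interview {itw['id']}: credit-card question was skipped")

    # Aggregated check: compare achieved counts against (invented) quota targets.
    targets = {"Male": 2, "Female": 1}
    achieved = Counter(itw["gender"] for itw in interviews)
    for cell, target in targets.items():
        if achieved[cell] > target:
            print(f"Quota exceeded for {cell}: {achieved[cell]} > {target}")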

Long before considering a soft launch, the simplest way to see whether your survey runs correctly is to generate random data. You have two ways of doing so: either by using askiatools’ random data generator or by using a JavaScript simulator (see here). The JS simulator is a great way to achieve system testing.

System testing can also be achieved by exporting test interview data as .dat files and looking at the size of the individual .dat files: this tells you the memory load each interview will incur. Multiply this by the number of concurrent interviews you expect and you will have an idea of the specifications you need on your server(s). Additionally, looking at the size of a .QES file – or, preferably, of the tables generated in SQL Server – will indicate how much hard drive space you will need.
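
As a back-of-the-envelope illustration of that arithmetic – every figure below is invented and should be replaced with the sizes you actually measure – a quick Python sketch:

    # Hypothetical figures - replace them with your own measurements.
    dat_size_kb = 40             # size of one exported .dat file
    concurrent_interviews = 500  # peak concurrent interviews expected
    total_records = 200_000      # records expected over the life of the survey
    record_size_kb = 25          # average storage per record in SQL Server

    memory_mb = dat_size_kb * concurrent_interviews / 1024
    disk_gb = record_size_kb * total_records / 1024 / 1024

    print(f"Rough peak memory for interview data: {memory_mb:.0f} MB")
    print(f"Rough disk space for stored records:  {disk_gb:.1f} GB")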

Random Data Generation in askiatools

We have recently added a lot of features to the Tools random data generator: you can define routings that are only run during random generation (for validating the screening, for instance), you can specify the behaviour when blocking error messages are displayed and, more importantly, you can import your quota settings and take them into account in your generation (all available in 5.3.3). Quota code is often complex, and going over quota can be an expensive mistake.
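
The idea behind quota-aware generation can be pictured with a simplified Python sketch – the quota cells and targets are invented and this is not the algorithm used by Tools, just the principle of only drawing responses from cells that are still open:

    import random

    # Invented quota cells and targets.
    quota_targets = {"Male": 50, "Female": 50}
    achieved = {cell: 0 for cell in quota_targets}

    def pick_gender():
        """Pick a random response among the quota cells that are still open."""
        open_cells = [c for c, t in quota_targets.items() if achieved[c] < t]
        if not open_cells:
            return None  # every quota cell is full: stop generating
        choice = random.choice(open_cells)
        achieved[choice] += 1
        return choice

    generated = [pick_gender() for _ in range(120)]
    completed = [g for g in generated if g is not None]
    print(f"Generated {len(completed)} interviews, achieved: {achieved}")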

We have also created a brand new module in Tools 5.3.5 called “Verification Scripts” (see here for more details). It allows a tester – remember, not the survey scripter – to create checks in AskiaScript that will be run on each interview. So you can verify that the question about credit cards has been asked if the interviewee mentioned banks in another question. You simply write a check like this:

Assert.Check(Banks <> {} and CreditCards.HasNA, "Interviewee should have been asked the question about credit cards")

The scripts can be as long as you like; we have added If/Else conditions and Goto to help you create complex code that you can keep in one single text file. And you can write it within an environment – the Askia visual studio – where you get help and documentation on any object, method or keyword. You can run this on your randomly generated data, on your soft launch or on your full data set – each time you get a detailed report of how many checks have failed. At the time of writing this is not released yet, but contact us if you want to try a beta.
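
To picture what the reporting side of such a run could look like, here is a generic check runner sketched in Python – it is not the Verification Scripts engine, and the check names, predicates and interview data are all made up:

    # Hypothetical checks: each is a name plus a predicate over one interview.
    checks = [
        ("mandatory age answered", lambda itw: itw.get("age") is not None),
        ("age within plausible range",
         lambda itw: itw.get("age") is None or 14 <= itw["age"] <= 99),
    ]

    interviews = [{"age": 25}, {"age": None}, {"age": 130}]

    # Run every check on every interview and build a summary report.
    failures = {name: 0 for name, _ in checks}
    for itw in interviews:
        for name, predicate in checks:
            if not predicate(itw):
                failures[name] += 1

    for name, count in failures.items():
        print(f"{name}: {count} failure(s) out of {len(interviews)} interviews")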

In these scripts, we also want to have access to aggregated data… this will allow one script to run both interview-level testing and aggregated testing. We might want to test whether an interview took less than 10% of the average length, or whether a given response to a question falls outside a percentile. In other words, you might want to compare an interview’s responses not with another single interview but with all other interviews. The script grammar for this will be described in a forthcoming article – we are still passionately discussing it internally.
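
The speeder check mentioned above could, for instance, look something like this – a Python sketch on invented interview durations; the eventual script grammar may be quite different:

    # Invented interview durations, in seconds.
    durations = [620, 540, 710, 45, 580, 660]

    average = sum(durations) / len(durations)
    threshold = 0.10 * average  # flag interviews shorter than 10% of the average

    for i, length in enumerate(durations, start=1):
        if length < threshold:
            print(f"Interview {i} flagged as a speeder: {length}s < {threshold:.0f}s")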

Usability testing in survey design by Askia

We have not covered usability testing here – not that we do not think it is important: we are constantly talking about it internally. We are putting together a range of tools for designing ADCs (so far codenamed ADCUtil – yes, we need something catchier), and we have added ways of visualising your HTML in other browsers in Design. But we need to understand when a display no longer works because of screen size, measure the bias introduced when JavaScript is not available, and do a head count of Internet Explorer 5 users – and there again we need your input and your ideas so we can automate these tasks.

In the meantime, I leave you with these great quotes:

“The act of maintaining software necessarily degrades it.” – Alain April

“It’s harder to read code than to write it.” – Joel Spolsky

“If you can’t measure it, you can’t improve it.” – Peter Drucker