This is part 2 of a series of articles on automating tasks in Analyse.

According to the new Askia tagline, you can now automate insights. The claim is bold and possibly premature. But for years now, we have been thinking of ways to reduce the gap between questionnaire design and analysis. Of course, producing insights is hard on a new topic, but it shouldn’t be that difficult in a more targeted environment. If your survey is generic enough (say an ad test or package testing), you should be able to create it in minutes (check our interactive libraries), get the data in hours, analyse it automatically, compare it with benchmarks and get a result with hardly any human intervention.

But we have been thinking about ad-hoc work as well. You know how to analyse NPS; you do it all the time. The system should pick up the NPS questions and analyse them accordingly. How often do you see a lifestyle grid? Again, you always do the same thing: pick every statement individually, cross it by the demographics banner and produce one summary table. This should and must be automated.

The challenge is detecting the question types that you know how to analyse. Ideally, they should be tagged, which is why we introduced tags in 5.5.2. This becomes even easier if you re-use blocks coming from a library.

Born in Analyse 5.5.2 and supercharged in 5.5.3, automation scripts are a new and powerful paradigm in Askia’s approach to building analyses. You can now execute scripts that instantly build your portfolios. The idea is that the scripts are generic, so they can be used across different surveys, both manually and from the command line.

Instant, ready-made analyses are now an exciting facet of Askia’s extensive automation possibilities and a new jewel in the crown of Askia Analyse!

This knowledge base article contains several ready-to-use examples, accompanied by explanations that demonstrate the concepts and language of this new efficiency-boosting feature. We hope you enjoy it and, as always, comments & feedback are welcome!

The examples in this blog post can be tested on the .qes file here. It has all the question types and tags required to run them. In order to run the same scripts on your own data files, you’ll need to add tags to your .qes or .qew. If the tags you add have different names, you’ll need to update the tag references in the scripts provided.

All these examples can be run from the command line using the /"automate:" parameter, e.g.:

Analyse.exe "c:\Qes\ex.qes" /"automate:c:\Scripts\myScript.txt"
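
If you want to trigger this from a scheduler or a data-delivery hook rather than by hand, a minimal Python wrapper might look like the sketch below. The install and file paths are placeholders; only the /"automate:" parameter itself comes from the command line above.

    import subprocess

    # Hypothetical paths - adjust to your installation and project layout
    analyse = r"C:\Program Files\Askia\Analyse.exe"
    qes = r"C:\Qes\ex.qes"
    script = r"C:\Scripts\myScript.txt"

    # Pass the command as a single string so the /"automate:" quoting
    # reaches Analyse exactly as written on the command line above
    command = f'"{analyse}" "{qes}" /"automate:{script}"'
    subprocess.run(command, check=True)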

Script files and .bat (command line) files can be downloaded by clicking on the images or animations below. Each example folder contains the relevant script and .bat files; after downloading, place the ‘Applications’ & ‘Clones’ folders within each of the example folders A, B & C.

Why Automation scripts?

Below are some examples of regular and repetitive analyses that Data Processing (DP) staff might be expected to run. Each shows how they can be achieved from the command line, with no need to create a portfolio yourself!


  • The new project has gone live; please produce a hole-count for data checking…

Steps: Drag all the variables into the rows (but not the hidden ones), put the demographics and panel provider variables in the columns, set the correct tab template, save and export…

    • The script shows how we use tags, defined in Design, to isolate the demographics and sample provider questions, and how question type keywords are used to fine-tune what appears in the rows and columns (the selection logic is sketched below).
    • We then run the .bat file to generate the portfolio from scratch and export the Excel output from this new portfolio.
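
To make the idea concrete, here is a hypothetical sketch, in Python rather than the actual automation script language, of the selection logic described above. The variable metadata is invented for illustration.

    # Invented metadata standing in for the survey's variables
    variables = [
        {"shortcut": "Age",    "tags": ["demographics"],    "hidden": False},
        {"shortcut": "Region", "tags": ["demographics"],    "hidden": False},
        {"shortcut": "Panel",  "tags": ["sample provider"], "hidden": False},
        {"shortcut": "Q1",     "tags": [],                  "hidden": False},
        {"shortcut": "SysVar", "tags": [],                  "hidden": True},
    ]

    # Rows: every visible, untagged variable; columns: the tagged ones
    rows = [v["shortcut"] for v in variables
            if not v["hidden"] and not v["tags"]]
    columns = [v["shortcut"] for v in variables
               if "demographics" in v["tags"] or "sample provider" in v["tags"]]

    print("rows:", rows)        # ['Q1']
    print("columns:", columns)  # ['Age', 'Region', 'Panel']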

  • A new SPSS .sav file has arrived from the data collection agency; please convert it to a .qes file and produce the data summary tables with nets and means.

Steps: Import the .sav file in Askia Tools, create loops & multiples where required, add all questions to the rows and any wave questions to the columns. Add nets and NPS calculations, plus scaled values to produce mean scores.

    • The command line has 5 parts:
      1. Import the .sav file. This creates a ‘flattened’ structure for all multiples and loops
      2. Transform into multiple variables using regex
      3. Transform into loops using regex
      4. Open the resulting file in Analyse and run the automation script
      5. Open the resulting portfolio in Analyse and export to Excel
    • Since the .qes file is generated afresh as part of the process, adding tags in Design is replaced by checking question response counts in the script to see which clones we can apply. We also check for any shortcut containing “wave” and add it to the columns. The regex grouping behind steps 2 and 3 is sketched below.
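
The regex grouping in steps 2 and 3 works roughly like this hypothetical Python sketch. The shortcut names are invented, and Askia Tools performs the real transformation; this only illustrates the pattern-matching idea.

    import re
    from collections import defaultdict

    # Invented flattened shortcuts, one column per response, as a .sav
    # import typically produces them
    shortcuts = ["Q1", "Q5_1", "Q5_2", "Q5_3",
                 "Q7_1_1", "Q7_1_2", "Q7_2_1", "Q7_2_2"]

    # Capture the stem before a trailing "_<number>" suffix
    pattern = re.compile(r"^(.+)_(\d+)$")

    groups = defaultdict(list)
    for name in shortcuts:
        m = pattern.match(name)
        if m:
            groups[m.group(1)].append(name)

    # Stems with several columns are candidate multiples (or loop levels)
    for stem, cols in groups.items():
        if len(cols) > 1:
            print(stem, "->", cols)

Running this prints Q5 and the two Q7 iterations as groups; applying the same pattern again to the stems (Q7_1, Q7_2 → Q7) is what turns a flattened structure into a loop.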

  • New analysis spec received for the large ad-hoc project that just closed; I’d better set aside at least two days to put it all together…

Steps: Create full tabs for each section of the survey with demographics and brand groupings in the banner, add all the loop summary tables at the end and export. Then create a copy of the main portfolio filtered by each of the four segments and six countries.

    • The command line has 5 parts:
      1. Generate the full tabs portfolio with the (C2) script. It uses a recursive function to put all variables in their own tab definition, organised in levels of chapters matching the structure of the survey design (sketched after this list).
      2. Generate a summary table for each loop of the survey using script (C1). Any loop with more than one level has those levels added to the edges, which we can show as nested in the tab.
      3. Merge these two portfolios using script (C3).
      4. Open the resulting portfolio in Analyse and export the full tabs to Excel.
      5. Using script (C4), open the full portfolio and filter it by 10 different sub-populations in turn, producing a new portfolio for each.
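
The recursion in step 1 can be pictured with this hypothetical Python sketch. The survey tree is invented; the real (C2) script works on the survey structure exposed to automation scripts.

    # Chapters are dicts, questions are leaves in a list
    survey = {
        "Screener": ["Age", "Gender"],
        "Brand section": {
            "Awareness": ["Spontaneous", "Prompted"],
            "Usage": ["Frequency"],
        },
    }

    def walk(node, path=()):
        if isinstance(node, dict):          # a chapter: recurse into it
            for name, child in node.items():
                walk(child, path + (name,))
        else:                               # a list of questions: emit tabs
            for question in node:
                # The real script would create a tab definition here
                print(" / ".join(path), "->", question)

    walk(survey)

Each printed line corresponds to one tab definition, filed under chapter headings that mirror the questionnaire.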

To conclude, we have shown ways you can create one (or many) portfolios out of the blue from a generic survey. But portfolios have their uses, usually exports to Excel and to PowerPoint. That’s not entirely groundbreaking because, as Tom De Ruyck nicely put it: “PowerPoint does not speak”. But you know what speaks? Dashboards (and Vista)! You can measure usage and understand which piece of information is requested by which user. And that is where the future lies: machine automation that monitors what people look at and offers that information automatically.

To improve the software, we need to monitor usage, and that’s why all of this has to go online. That’s what we are working on next.
