Preparing Multi-Language and Multi-Environment Test Sets

When designing a chatbot test strategy from scratch, there are often test case requirements like:

  • We want to test the chatbot in multiple languages, but we don’t want to replicate the whole test set, as the conversations are basically the same in all our supported languages
  • We want to run the tests in multiple environments - dev, test and production - and there are different user tokens to be used in each environment

If you can see yourself in one of the above points, then read on. This article will present a best practice technique to prepare Botium Box for those requirements.

Features and Techniques Overview

For setting up Botium Box we will use several techniques that are actually independent of each other, but in combination they are incredibly powerful.

Convos and Utterances

These really are the basics of Botium - if you do not know what Convos and Utterances mean in the Botium context, then please head over here. In short:

  • With Convos you are describing the conversation structure of your tests, including user input and expected responses
  • With Utterances you are describing lists of words or sentences
  • Convos can reference Utterances to separate between the conversation structure and the actual phrasing

We will use this concept to handle the multi-language requirements.

Scripting Memory

With the Botium Scripting Memory it is possible to inject dynamically generated or static values into your test cases.

We will use this concept to set different conversation parameters for each environment your tests should run against.

Test Set Dependencies

In Botium Box it is possible to define dependencies between test sets and combine them into a single test set. We will use this technique to separate the different requirements into individual test sets and combine them as needed.

Resulting Test Sets

In the end there will be several test sets in Botium Box:

  • There will be one test set holding the convo files for all conversations that should be available in all languages
  • For each supported language, there will be a language-specific test set holding the utterances and any language-specific test cases
  • For each environment, there will be a test set holding only the scripting memory
  • For each language + environment combination you have to run your tests in, there will be one test set combining the partial test sets from above:
    • the convo files
    • the language-specific utterance files
    • the environment-specific scripting memory

![](/attachments/1537703937/1541111811.png)

Step By Step

Now comes the interesting part - follow these steps to set up the basic structure in Botium Box.

1. Create a Shared Convos Test Set

Create a test set named Shared Convos in Botium Box. Add some first Convos in the Visual Convo Designer. The convos should map the conversation structure, but they should be free from any language-specific content. We have here a convo named TC_HELLO, which sends a default greeting to the chatbot, and expects a default greeting back - we will use utterance codes for it instead of literal phrases:

Here is the corresponding BotiumScript (for Copy&Paste):




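As a sketch, assuming the utterance list names used throughout this article (UTT_GREETING_DEFAULT for the user greeting, B_GREETING_DEFAULT for the expected bot response), the convo file could look like this:

```
TC_HELLO

#me
UTT_GREETING_DEFAULT

#bot
B_GREETING_DEFAULT
```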
In another convo we send some kind of special greeting:

And again the BotiumScript:




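A sketch of the special-greeting convo - the convo name TC_HELLO_SPECIAL is a placeholder; it references UTT_GREETING_SPECIAL on the user side and again expects B_GREETING_DEFAULT as response:

```
TC_HELLO_SPECIAL

#me
UTT_GREETING_SPECIAL

#bot
B_GREETING_DEFAULT
```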
2. Create Language-Specific Utterance Test Sets

For each supported language, create a language-specific test set, named for example Utterances EN and Utterances DE. To complete the shared convos from above, we need to define three utterance lists:

  • UTT_GREETING_DEFAULT and UTT_GREETING_SPECIAL for the user input
  • B_GREETING_DEFAULT for the expected bot response

These language-specific utterances should also be free of environment-specific phrases, so we are already using the scripting memory in UTT_GREETING_DEFAULT:

Chatid $chatid_default {"testId":"QA-$testcasename","testType":"regressionTest"}

Here we are using a scripting memory variable that will be different for each environment, $chatid_default - and another one, $testcasename, which lets Botium fill in the test case name on execution (see the Botium Docs).

Similarly, in UTT_GREETING_SPECIAL we will use another variable, $chatid_special:

Chatid $chatid_special {"testId":"QA-$testcasename","testType":"regressionTest"}

In the utterance representing the chatbot response, B_GREETING_DEFAULT, we will use the variable $name_default, as the user names in the supported environments will be different - using a test user account in the test environment and a real user account (created specifically for testing) in the production environment is a common setting:

Hello $name_default, How can I help ?
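To illustrate what Botium asserts at runtime: with $name_default resolved from the environment-specific scripting memory (for example Johnny in the Params DEV test set created in step 3), the expected bot response becomes:

```
Hello Johnny, How can I help ?
```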

Do the same for the other supported languages. In German, this could look like this:

Hallo $name_default, wie kann ich helfen ?
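Putting it together, each language-specific test set then holds the three lists; for Utterances EN this could look like the following (shown in one block for brevity - in Botium Box each list is its own utterances file, with the list name on the first line):

```
UTT_GREETING_DEFAULT
Chatid $chatid_default {"testId":"QA-$testcasename","testType":"regressionTest"}

UTT_GREETING_SPECIAL
Chatid $chatid_special {"testId":"QA-$testcasename","testType":"regressionTest"}

B_GREETING_DEFAULT
Hello $name_default, How can I help ?
```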

3. Create Environment-Specific Scripting Memory Test Sets

The environment-specific parameters will be saved in Scripting Memory files. Create a test set named Params DEV and add a YAML file named Scripting Memory:


Scripting Memory can be defined in other file formats as well - plain text, JSON, Excel - but we found that the YAML representation is the most concise one: easy to write and to understand.

In this file we define the variables that we used in the utterances above:

  - header:
      name: dev
    values:
      $chatid_special: 4542343434343
      $name_default: Johnny

For the test set Params PROD this can look roughly the same, but with different variable values:

  - header:
      name: prod
    values:
      $chatid_special: 56565656565
      $name_default: ProdTest

Enable the Scripting Memory for these test sets in the Settings / Scripting Settings section:

  • Enable the switch Enable Scripting Memory
  • Enable the switch Enable Test Parameter Store

4. Combine Test Sets into Language-Environment-Specific Test Set

Now create a test set EN DEV and set the Test Set Dependencies in Settings / Test Set Settings:

  • Add the Shared Convos test set
  • Add the Utterances EN test set
  • Add the Params DEV test set

In a similar way you can now create the test sets EN PROD, DE DEV and DE PROD by combining the language- and environment-specific test sets.

You can see a nice graphical representation of the combined structure in the Flow section of the test set:

5. Create Test Projects

As a final step, use the Quickstart Wizard to create the test projects in Botium Box:

  • Select the English-language development version of the chatbot instance with the EN DEV test set
  • Select the German-language production version of the chatbot instance with the DE PROD test set
  • … and so on …

Everything is ready now for running your language- and environment-specific test sets.

Advanced: Connect to Git Repository

We clearly recommend connecting Botium Box to your own Git repository to set up a Continuous Testing pipeline. This principle can be applied to the structure established above as well.

1. Set Content Selection Type

In all of the Shared Convos, Utterances XX and Params XX test sets created above, set the Content Selection Type to Use Local Repository only, so that the test sets you are developing within Botium Box do not interfere with the Git repository content.

2. Connect Test Sets to Git Repository

For all of your Shared Convos, Utterances XX and Params XX test sets, now head over to the Linked Test Case Repositories section and register your Git Repository. You will need:

  • Git Clone Url
  • Username and password
  • Git Branch (if not yet existing, don’t worry, you can create it later)
  • And for each test set, use a separate folder in the Relative Path in Repository field
    • utterances_en for the Utterances EN test set
    • utterances_de for the Utterances DE test set
    • shared_convos for the Shared Convos test set
    • params_dev for the Params DEV test set
    • … and so on …


If you create the Git branch and the folder structure up front in Git, you can use the Git branch dropdown selector and the Git directory browser in Botium Box instead of entering everything manually.
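Assuming the folder names suggested above (with params_prod following the same pattern), the connected Git repository will end up with a layout roughly like this on the chosen branch:

```
<repository root>
├── shared_convos/
├── utterances_en/
├── utterances_de/
├── params_dev/
└── params_prod/
```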

3. Export Test Set Content to Git Repository

For all of the connected test sets you can now export the content to the Git repository by using the Export Local Repository button in the Test Case Repository section.

Select the Git repository and the Git branch to export the content to, enter a meaningful Git check-in comment and click the OK button. Botium Box will now export the local repository content to the Git repository.

When doing this for the first few test sets, you will end up with some new directories in your Git repository.

You now have to repeat this export every time you think it is required - once per day, once per sprint, on every change - that's totally up to you. Remember that Git is there for tracking changes, so you should do this often enough to get a meaningful tracking history in your Git repository.

4. Use the Git Repository for Running Tests

Now create a test set named EN DEV (Git) - the English-language test set for the dev environment, loaded from the Git repository - and head over to the Settings.

  • In the Content Selection Type, choose Use Remote Repository only
  • Enable the Scripting Memory and the Test Parameter Store, as done above for the Params XX test sets

Afterwards, register the Git repository three times, each time pointing to one of the three required folders:

  • params_dev
  • utterances_en
  • shared_convos


In a future Botium Box version it will be possible to select multiple directories for one single Git repository connection.

To verify the content, now head over to the Test Set Dashboard and click the Update Test Set Statistics & Insights button. This will make Botium Box download all of the content from the Git repository and build up statistics and the flow chart for it.

In the flow chart you can now see the combined test set - the shared convos and the utterance examples.

![](/attachments/1537703937/1541865488.png)

5. Create Test Projects

As a final step, use the Quickstart Wizard to create the test projects in Botium Box.

  • Select the English-language development version of the chatbot instance with the EN DEV (Git) test set
  • Select the German-language production version of the chatbot instance with the DE PROD (Git) test set
  • … and so on …

Everything is ready now for running your language- and environment-specific test sets from a Git repository connected to Botium Box.