Performance Tips - keeping Testpad fast from build to build

Testpad has a very fast interface for script editing that is JavaScript-powered and communicates with the server asynchronously in the background. However, there are limits, and if scripts and result grids are allowed to get too big, the interface can get laggy, especially on older computers.

Most examples I see of scripts getting too big come from unintentional misuse of the interface. So this is the first post in a series of three sharing tips on how best to use Testpad and, in doing so, keep scripts to a useful and sensible size.

In summary, these tips boil down to:

  • use Retests for retesting new builds as part of the same release (the subject of this post)
  • copy Scripts (or whole Folders) when preparing for new releases
  • and refactor scripts where needed, joining scripts that are too small or splitting apart those that are too big

Retesting New Builds

First up is the idea of "retesting" test runs (columns). When a Test Run in Testpad is completed, whether marked as such manually in the Test Run Details dialog, or through setting results for every test, the Test Run Details dialog offers the "Start a Retest" button.

Retests are intended for retesting e.g. a new build. You've run through your tests a first time and found a number of test fails. The developers have fixed these and issued a new build that needs another test. So you come back to Testpad to make another run through your tests.

The thing not to do at this point is click "New Test Run" to make a new column in addition to the columns already present. This is the path to the dark side: it leads to an ever-accumulating number of columns, eventually slowing down the browser's responsiveness as you get north of 100 columns (sooner if you have >1000 rows).

Instead, you want to be using the "Start a Retest" button, offered in the Test Run Details dialog for runs marked Completed.

A Retest makes a new column that takes the place of the old column. To start with, the old column is displayed beside it, but greyed out. After a page reload, or when you next return to the script, the old column is not displayed at all. (You can get old columns back on screen with a right-click on the run header, selecting "Show Old Runs").

Retests are numbered like version numbers. Test Run 1.1 is the first retest of Test Run 1. Test Run 2.4 is the fourth retest of Test Run 2, and so on.

You can optionally inherit the results from last time. Inherited results are shown slightly lighter than new results, and are useful if, for example, you only want to retest the problems from last time.

A Retest also takes over the contribution to the progress statistics for the script. If you keep making new test runs (the bad way!), the progress statistics are always the sum of all runs ever conducted, and so can never approach a 100% pass rate with successive new builds. But use retests, and each retest can iterate the results for that run and allow them to reach 100% when everything is passing.
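To see why this matters for the stats, here's a minimal sketch (purely illustrative, not Testpad's actual code) of the two counting behaviours:

```python
# Illustrative model of progress stats: "new run every build" vs "retest".

def progress_accumulating(runs):
    """'New Test Run' every build: every run ever made stays in the stats."""
    total = sum(len(run) for run in runs)
    passed = sum(run.count("pass") for run in runs)
    return passed / total

def progress_with_retests(runs):
    """Retests: each retest replaces its predecessor, so only the latest counts."""
    latest = runs[-1]
    return latest.count("pass") / len(latest)

build_1 = ["pass", "fail", "fail"]   # first run: two bugs found
build_2 = ["pass", "pass", "pass"]   # new build: fixes verified

# Accumulating runs: 4 passes out of the 6 results ever recorded, so the
# script can never show 100%. With retests, only build_2 counts: 3/3, i.e. 100%.
```

The earlier failures aren't lost with retests, they just stop dragging the headline figure down: old columns remain available via "Show Old Runs".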

That's it for tip #1. Use the Retest feature when retesting new builds and don't keep making more and more columns, stretching the page off to the right.

Any questions, please email - always keen to help you get the most out of Testpad.

New Release: Testpad 4

New Look and Feel

Testpad just got a major update to its look and feel. The whole design of the app, and the branding of the marketing website, has been updated with a fresh look.

The functionality remains nearly identical, with only a few menu options moving around. Everything you could do before should still be possible in the new version.

New Pricing

At the same time, Testpad also updated its pricing. This is the first time prices have changed (yes, risen) since Testpad launched in 2011, so a change was well overdue. The four existing plans have been replaced with three user-limited plans (for up to 3, 10 and 25 users respectively), along with a new kind of bespoke plan for enterprise customers.

Existing customers who were paying automatically either monthly or annually may continue on their current plans at their current pricing.

If you have any questions, or find problems with the new version, please don't hesitate to contact

Writing BDD Test Scenarios with Gherkin Syntax

Testpad now has built-in syntax highlighting for writing tests in the "Given, When, Then" style.

Behavior-driven development (BDD)

Behavior Driven Development is a software development process based around the desired behaviors of a system. It defines a framework for collaboration that is precise enough for developers and QA, yet human-readable enough for business stakeholders and non-technical team members.

Requirements are framed as user-focused scenarios (effectively acceptance criteria), composed in plain English with a descriptive grammar in the form: Given [some initial conditions], when [something happens], then [something should happen].

Scenarios often use one or more concrete example values to help define the behavior.
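As an illustration, here's what a scenario might look like, using a made-up shopping-basket feature (the feature and values are hypothetical, just to show the shape):

```gherkin
Feature: Shopping basket shipping

  Scenario: Order qualifies for free shipping
    Given a basket containing items worth £55
    And a free-shipping threshold of £50
    When the customer proceeds to checkout
    Then shipping is offered free of charge
```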

BDD automation frameworks, e.g. Cucumber, JBehave

The controlled structure of BDD requirements lends itself to test automation. Frameworks like Cucumber and JBehave start with Scenarios defined in this plain "business-readable" language and then structure the test code in and around them.

As the developers of Testpad, a tool to help you manage your manual testing, it might seem surprising that we promote the idea of automating as much of your testing as is humanly/reasonably possible. If you're not automating any testing yet, stop reading and get on it!

But you can't automate everything, and probably have lots of tests that you haven't got around to automating yet... in which case, despite wanting to automate everything, you have plenty of manual BDD testing to organise. Which leads us nicely onto the new support in Testpad for writing test plans as collections of scenarios in the Gherkin syntax...

Gherkin syntax - Given/When/Then - a business-readable domain-specific language

The Gherkin language is a formalisation of the Given/When/Then style of test definition, used most prominently by the Cucumber automation framework, amongst others.

Testpad now automatically looks for this formatting and colors the keywords if it finds them. If Testpad doesn't recognise your way of formatting Given/When/Then, you can force the syntax highlighting ON. Equally, if Testpad thinks you're using Gherkin when you're not, you can force it OFF.

Formatting options to suit your style of Gherkin Syntax

By default, every row in Testpad that isn't a title row is a "test" that will collect a result, i.e. Pass, Fail, Blocked etc, along with optional Comments, Screenshots and Bug Numbers.

When importing (or writing) BDD tests, this means rows that begin GIVEN and WHEN will also be regarded as tests looking for a result.

For lots of customers, this is just fine, and recording a "Pass" against a GIVEN statement can just be interpreted as "yep, set that up, no problems", and a "Pass" against a WHEN statement can mean "yep, made that happen, no problems".

For other customers, there is a preference to only record results against the THEN statements that define what should (and therefore might not) happen. To this end, Testpad offers a couple of options:

  • Use the new comment prefix -- which acts just like the // comment prefix, making a row a non-test row, except that the -- characters are hidden for a prettier display
  • Use cascaded indentation to make title rows out of every non-test statement
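For example, a hypothetical scenario laid out as Testpad rows using the -- prefix might look like this (only the THEN rows would collect results, and the -- characters themselves are hidden in the display):

```
Scenario: User requests a password reset
  -- GIVEN a registered user on the login page
  -- WHEN they click "Forgot password" and submit their email address
  THEN a reset email arrives within a minute
  THEN the reset link expires after 24 hours
```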

Combining Exploratory Testing with BDD/Gherkin

Whilst Testpad's free-form layout can be used for almost any test process, Testpad excels at writing plans that serve as guides (whether high-level or detailed) for Exploratory Testing. The idea of exploratory testing is that you don't follow prescriptive instructions for what to test, but instead leave the human brain free to be inquisitive and tenacious in hunting down unexpected behavior. Test plans for exploratory testing are therefore "guides": checklists of features that you don't want to forget to take a good look at.

BDD/Gherkin tests, on the other hand, are more prescriptive, defining in significant detail what needs to happen when this or that occurs. This is of course great for Acceptance Testing: proving that the stated requirements have been met. But alone, it is less useful for gaining confidence that the system is bug free.

So why not have your cake and eat it?

Copy/paste your Gherkin tests into Testpad, tweak the formatting if required, and then extend the tests with ideas, details and edge-cases to explore in and around the given scenario.

Test plans in Testpad are a free-form checklist document. So whilst you can structure your requirements using Gherkin syntax, there's nothing to stop you adding further lists (sub-checklists) to each BDD Scenario.

If bugs crop up in the field after Release 1.0, go back to your Scripts (checklists) in Testpad and add tests/ideas/notes to the relevant scenarios to protect against regressions next time.

Any problems, questions or further suggestions to improve these ideas, please email

Folders for Simpler Release Management

With the recent success of the new folders feature, here's a usage pattern that lots of customers have adopted to handle releases. In essence, this pattern avoids using Templates and instead uses the previous release as an on-the-fly template for the new one:

    Duplicate the previous release

  1. Right-click on the folder for the previous release and select Duplicate
  2. Edit the new folder name to reflect the new release version number

    Update the tests

  3. Edit the new copies of the Scripts to catch up with the latest features in the product, adding new Scripts as required and deleting any that are no longer needed

    Archive the old release

  4. (optional, depending on how tidy you like your Project view) Right-click the old release folder and select Archive; this puts the old folder in the Project's Archive tab. Note that this is different to archiving a whole Project, which you do by right-clicking on the Project's name in the list on the left.

    Do the testing

  5. When the product is ready for testing, make the first test runs within each Script in the new release folder, collecting lots of passes (hopefully) and a few fails (inevitably)

    Share the progress

  6. Start sharing the folder report with stakeholders: from the Project view, right-click on the folder name and select View Report. This opens the report in a new window. The contents and verbosity of the report can be configured using the button in the top-right corner, which is also where you can find the Enable Sharing button to get a share link. Alternatively, the whole report is easy to Save As, because it is a single self-contained HTML document that can be archived as-is in your own systems (file sharing, wiki attachments, emails etc).

    Retest new builds

  7. When a new build arrives, create Retests of the completed Test Runs. Prepare Retest columns using the prompt when a run is first completed, or by opening the Test Run Details dialog (hover over a Test Run header and click on the Edit icon that appears), sliding the slider to Complete if it isn't already, and then clicking "Start a Retest".
  8. A Retest is different to simply pressing "New Test Run" in that the new column takes the place of the previous run, both visually and in the progress stats. Thus you don't clutter the screen with more and more old test runs (unless you like that!), and the progress bar for the Script (and in turn, the Folder it is part of) can approach a 100% pass rate as the fails get retested as working.

    You can get old runs back on the screen with a right-click on the header of the latest test run.

  9. Repeat step 7 as many times as new builds need retesting to ship the release.

    Ship it!

Any questions, problems or feedback, just email

Attaching Images, Screenshots and Other Files

Testpad now supports uploading files and images in support of tests and results. Files are uploaded using drag'n'drop onto areas in the dialog boxes for test details and test running.

Images for Test Descriptions

Test descriptions can have images and files attached by opening the Test Details dialog and dragging files onto the Attachments area.

The Test Details dialog is available for each row in a Script and is accessed by any of:

  • double-clicking on the row ID
  • typing the shortcut Alt-T
  • clicking on the small triangle icon at the end of the row of text
  • if files are already attached, clicking on the file thumbnail at the end of the row of text

For Chrome users, files can also be Pasted (Ctrl-V) from the clipboard when the Attachments area is focussed and highlighted.

Files and images can be viewed in a File Viewer dialog by clicking (or right-clicking) on the thumbnails in the Test Details dialog.

The File Viewer will stay open during testing, and will auto-update to display the files/images associated with the current test.

Right-click on thumbnails (in the Test Details dialog) for more options such as download and delete.

Images for Test Results

Test results can have images (presumably screenshots) and files attached by dragging files onto the Attachments area of the Test Run dialog. The Test Run dialog includes the Pass/Fail buttons and is displayed during a test run.

Again, for Chrome users, images can be Pasted (Ctrl-V) from the clipboard. This makes it very easy to attach screenshots taken with system shortcuts such as Cmd-Ctrl-Shift-4 on macOS, which copies the selected area straight to the clipboard.

When images are attached to results, they are represented in the results grid as Comments and displayed when the mouse hovers over the footnote dagger icon beside the relevant result.

Attached images are displayed as clickable thumbnails in the Comments section (below each result grid) of reports.

Storage Limitations

Individual files cannot be larger than 6MB.

Files are not part of the account/project/script exports and cannot be bulk downloaded from Testpad. Testpad is not a general purpose file storage facility!

The general idea is that uploaded files are simply copies of originals already in the user's possession and can be uploaded for the convenience of running tests or recording results.

As ever, any questions or problems, please email

New Features - Images, Folders and Reporting

This weekend sees the release of several updates to Testpad. These will be described in more detail in subsequent blog posts, but can be summarised as follows. Also scroll down for a detailed list of UI changes that could require minor changes to your current process.

Image Attachments

Images and files can be attached to test descriptions and to test results by drag and drop.

Files in support of test descriptions go in the Test Details dialog, which is accessed by any of: double-clicking on the row ID, clicking on the triangle icon at the end of the row of text or typing the shortcut Alt-T.

Screenshots etc in support of test results go in the Test Run dialog which is displayed during a test run.


Folders

Scripts can be organised into folders for improved release management and general grouping of tests.

Scripts and Folders themselves can be organised by drag'n'drop, with CTRL held down to make copies instead of moving items.

Right-click on Script names and Folder names for more options, including accessing reports per folder.


Reporting

Reports are now all presented using the all-in-one printable HTML page and displayed in a new tab/window.

Reports can be accessed using Folders (right-click on the folder name), or via the "view report" button that is displayed when a project has no folders.

Configure the contents and sharing of reports by opening the report and then clicking on the settings button in the top right-hand corner.

Tag Colors

Tags can be displayed using custom colors, configured in the Project Settings tab. For now, custom tag colors need to be configured per project. If there is demand, an account-level default could be added, similar to how the Bug Link setting works.

User Sorting

The Users view now offers some sort options to help customers with their long lists of users.

Emails case-insensitive

Emails for login etc are finally case-insensitive, as they should have been all along. Some customers own multiple accounts and use different spellings (patterns of upper and lower case) of their email addresses to identify these different accounts. For backwards compatibility, these different spellings will continue to work as before, but new accounts will not be able to use this trick.

UI Changes

Most of the changes in this update are additions to existing functionality and as such should not present any process problems. However, there are some minor changes, summarised below:

  • Reporting is now accessed via the View Report button (for existing projects without folders) or by right-clicking on folders and selecting the 'View Report' option. The 'browse' report (also called 'live' in some places) has been deprecated as it was both redundant and under-used.
  • Sharing of reports is now managed in the report itself. Open the report, click on the Report Settings button in the top right-hand corner, and use the Enable/Revoke Guest Access button.
  • Folder reports do not have the Report Comments field that Projects used to have. Existing projects (without folders) continue to have a Report Comments field (but moved to the Settings Tab) for backwards compatibility. For newer reports, based on Folders, there is a new feature to insert Notes within the list of Scripts. These Notes can be used for small comments and are included in the summary sections of reports.
  • The Show All/Hide All Details button in the project view has been moved into the SCRIPTS menu item in the header bar. The FOLDERS menu has a similar option for expanding/collapsing all folders.
  • The project progress bar is now only displayed for existing projects that do not contain folders. As soon as folders are added, progress is displayed per folder.
  • Project level reports are still available via the View Report button, but this is only available until Folders are used, after which each Folder should be used to access reports within that Project. If this becomes a problem, please contact for a workaround.
  • The sort order of scripts used to be alphanumeric and this was updated automatically when new Scripts were created or renamed. This automatic sorting is now replaced with manual sorting via drag'n'drop. If you would still like to sort Scripts alphanumerically, there is an option in the right-click menu for Folders to sort their contents.
  • Templates can be used to make scripts by drag'n'drop onto a Project name. Either single Templates or whole Folders at a time can be dragged onto a Project to make one or more Scripts. There's no longer a menu option in the Project view to create a script from a template.

With the number of changes to Testpad there are bound to be some early issues. If you have any problems or questions, please do not hesitate to email

Coming soon: image uploads and folders for scripts

The blog has been too quiet for a couple of years, so here's a quick update to say hello.

Testpad has been live since 2011 and has enjoyed steady growth year on year, now with thousands of users who have collectively recorded over 250,000 test runs totalling millions of pass/fail results.

Servers and infrastructure have been upgraded in the last year to keep ahead of the growing demand and development continues on new features. I'm hoping to make significant updates to Testpad this Summer introducing several features that have long been asked for...

  • folders to organise scripts within projects, with drag'n'drop organisation for simpler release management
  • file attachments to tests for images, PDFs, docs etc in support of what to test
  • file attachments in results for screenshots etc in support of describing problems

More when they're ready. As ever, any questions, please email me at