Ruth Kusterer is a technical writer for BlazeMeter.

Jun 24 2021

Moving from Manual to Automated Testing with BlazeMeter

This story was written based on our experience working with multiple QA team leads across many industries. It does not depict a real person or testimonial.

 

As the QA architecture lead for a bank's web portal, I realised I had reached the point where my team needed to spend more time on automation, and less on manual testing. In my last blog about my test automation journey, I wrote how I was poring over expert blogs to formulate my decision criteria, and how I arrived at giving BlazeMeter a try. In this blog, I'll describe my evaluation process and first impressions of BlazeMeter.

 

The Chrome Recorder Extension

 

To evaluate BlazeMeter, I registered a free account and logged in (you'll get 1 workspace and 1 project for free). A "workspace" is where I store my library of web element locators that I will want to share with my team across several projects. If I need more workspaces or more projects, I can upgrade to an enterprise account later, but for now, I only need one project.

 

First, I went to the trusted Chrome store and installed the Recorder extension. One advantage of the extension is that the Recorder automatically updates itself every time the BlazeMeter cloud is updated. For us, this means we don't have to install any scripting software or spend time checking versions for compatibility, which saves us a lot of effort.

 

After the Recorder appeared in my Chrome toolbar, I was immediately able to start recording my first test. Since I was already logged in to the blazemeter.com portal, the extension greeted me with "Hi Inna" at the top of the plugin window. I liked that logging in was so seamless.

 

(In the same week, one of my team members was suddenly unable to save or select our project. Because the greeting was missing, we quickly figured out that project selection was disabled because their session had expired. That actually makes sense.)

 

 

Next, I clicked Default Project and verified that my account, workspace, and project were selected. The default project is where recorded test scripts will be uploaded. As the QA architecture lead, I'll be maintaining multiple test projects in the future, so I want to get into the habit of checking which target is selected.

 

You'll see that the Chrome Recorder is quite user-friendly: there's a big red "Start Recording" button, and next to it, "Stop Recording". Usability is important to me because I want to get my junior testers up to speed as quickly as possible.

 

Start Recording!

 

Accordingly, my first evaluation criterion is how easy it is to record a test script. As always, I start by closing all browser tabs that are unrelated to the application under test. In my case, the application is a bank's web portal. I open the portal and quickly make sure my tester account is logged out.

 

I click Start Recording, and as usual, I begin each test by visiting the portal home page, and by logging in with my tester account. Even though no one has access to the stored password inside the recorded test, I prefer using a throw-away tester account.

 

(Logging on is likely the most common thing I do in all my tests. I saw that I can save a series of test steps as a reusable group -- I'll deep dive into that handy tip in another blog later.)

 

A Structure for My Scenario

 

Looking back on my first few long-winded recordings, I recommend getting into the habit of structuring your recorded scripts on the fly: while recording, before the next set of related actions, such as logging on or filling in a form, I now click Add Step to insert labels. It's like adding chapter headings: an easy way to improve readability in the editor and in reports later.

 

 

After I have completed all the steps of my test case, I click Stop Recording, and BlazeMeter generates the test script. All I have to do is click Run... > GUI Functional (Selenium) to upload and run the script in the BlazeMeter cloud. 

 

 

Run the Test

 

The test runs and displays a report of its results (Pass or Fail). The report also shows that this test ran on two different browsers, Firefox and Chrome.

 

 

Reports and Playback

 

I really liked the video included on the Details tab. I could scroll through the scenario and see the outcomes step by step in the video. If a step fails, I'll be able to see what the screen looked like at that very moment! 

 

 

To sum everything up, that's how we got started:

  1. I registered and logged in to BlazeMeter.com.
  2. I installed the Chrome Recorder extension.
  3. I invited my team members to the workspace.

 

Then we recorded and ran our first tests to get the hang of it. Here's my checklist if you need it:

  1. In the Recorder extension, verify the default account, workspace, and project.
  2. Click Start Recording, and open the application under test.
  3. Perform test steps and label scenario sections.
  4. Click Stop Recording.
  5. Click Run… > GUI Functional (Selenium).
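
To give a rough idea of the result, BlazeMeter GUI Functional tests can be expressed as Taurus YAML scenarios. The fragment below is a hypothetical sketch only: the URL, labels, and element IDs are invented for illustration and are not taken from a real recording, and the exact action syntax may differ between Taurus versions.

```yaml
# Hypothetical sketch of a recorded login scenario in Taurus YAML.
# The URL, labels, and element IDs below are made up for illustration.
execution:
- executor: selenium
  scenario: portal-login          # refers to the scenario defined below

scenarios:
  portal-login:
    requests:
    - label: Log on               # a "chapter heading" added via Add Step
      actions:
      - go(https://portal.example.com)     # open the portal home page
      - typeByID(username): tester01       # throwaway tester account
      - typeByID(password): changeme
      - clickByID(loginButton)
```

Labels like "Log on" correspond to the steps I insert while recording, which is why they pay off later: they become the section headings you see in the editor and in the reports.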

 

If you are curious what a test recording looks like in BlazeMeter, check out my upcoming blogs.

 

   