Pesto PDF
Assignment No. 3
Contents
Summary
1. Introduction
2. DOM-Based & Visual Web Testing
2.1. Web testing tools
2.2. The Page Object and Page Factory patterns
2.3. DOM-based Web testing
2.4. Visual Web testing
3. Challenging aspects of the Transformation from DOM to Visual
3.1. Generating visual locators for repeated Web elements
3.2. Web elements changing their state
3.3. Web elements with complex visual interaction
4. PESTO: The Page Object Transformation Tool
4.1. Visual locators generator (module 1)
4.2. Test suite transformer (module 2)
5. PESTO Improvement Process
5.1. Considered Web applications
5.2. Training test suite development
5.3. PESTO Improvement
5.4. Summary of the results on the training test suites
6. PESTO Evaluation
6.1. Web Applications
6.2. Test suite development
6.3. Experimental procedure
6.4. Experimental results
6.5. Discussion
6.6. Generalizability of the results
7. Related Work
8. Conclusion and Future Work
Strengths
Weaknesses
PESTO: Automated migration of
DOM-based Web Tests towards the Visual
Approach
Summary
1. Introduction
Testing web applications is a continually evolving task due to the fast pace of web
application development. For this reason, end-to-end (E2E) testing systems, which rely on
scripts that can automatically fill data fields and check assertions, are widely popular. E2E
testing tools have undergone multiple generations over the years.
First Generation: Test cases were produced on the basis of screen coordinates. Locating web
elements by coordinates makes these tests fragile.
Second Generation: Test cases are produced on the basis of DOM properties, an easier and
more robust approach than first-generation testing.
Third Generation: Visual testing is the most popular in the contemporary era due to its ability
to handle complex applications like Google Docs and Google Maps. In such cases, DOM-based
tools do not suffice because the DOM of the application may not include all user interface controls.
Nowadays, a combination of the DOM-based and visual approaches is popular, but it is unclear
which approach will prevail in the future, because migrating test cases between them is a
cumbersome task.
The research paper deals with automatically migrating old test cases to new ones, a task that
done manually proves to be complicated and error prone.
DOM-Based
These are second-generation testing tools; a widely used example is Selenium. They rely on
locating web elements by their DOM properties, such as id and name. XPath expressions can be
used when DOM attributes are not available.
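The id/name/XPath priority described above can be sketched as a small pure function. This is an illustrative sketch, not the Selenium API; the attribute names and helper are assumptions:

```python
# Sketch of the DOM-locator priority: prefer stable attributes
# (id, then name) and fall back to an XPath expression.
def choose_locator(attrs, xpath):
    """Return a (strategy, value) pair for a web element."""
    if attrs.get("id"):
        return ("id", attrs["id"])        # most stable locator
    if attrs.get("name"):
        return ("name", attrs["name"])    # next-best attribute
    return ("xpath", xpath)               # last resort: structural locator

print(choose_locator({"id": "login"}, "//form/input[1]"))  # ('id', 'login')
print(choose_locator({}, "//form/input[1]"))               # ('xpath', '//form/input[1]')
```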
Visual
Such tools rely on image processing techniques to identify the web elements used in test cases.
DOM-based tools allow users to specify fine-grained assertions, while visual tools check that an
image representing the visual appearance of an element is present in the web page as rendered at
runtime by the browser. The DOM-based approach can locate elements regardless of their
position, while visual tools depend on the elements' rendered appearance on screen.
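At its core, visual lookup is template matching: searching a page image for a sub-image. The naive exact-match sketch below, over 2-D lists standing in for pixel grids, only illustrates the idea; real tools such as Sikuli use far more robust image processing:

```python
def find_template(page, tmpl):
    """Return (row, col) of the first exact occurrence of tmpl in page, else None."""
    ph, pw = len(page), len(page[0])
    th, tw = len(tmpl), len(tmpl[0])
    for r in range(ph - th + 1):
        for c in range(pw - tw + 1):
            # compare every template pixel against the page window at (r, c)
            if all(page[r + i][c + j] == tmpl[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)
    return None

page = [[0, 0, 0, 0],
        [0, 1, 2, 0],
        [0, 3, 4, 0]]
tmpl = [[1, 2],
        [3, 4]]
print(find_template(page, tmpl))  # (1, 1)
```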
3. Challenging aspects of the Transformation from DOM to Visual
This process requires solving several problems: generating visual locators for repeated
elements, handling elements that change their state, and managing elements with complex
visual interaction.
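For the repeated-element problem, a common remedy is to enlarge the captured region until it matches uniquely on the page. The sketch below illustrates this idea in one dimension, using a string in place of a screenshot; it is not PESTO's actual algorithm, and all names are illustrative:

```python
def unique_locator(page, start, end):
    """Grow the captured region [start:end) until it occurs exactly once in page.
    Returns the unique snippet, or None if the whole page is still ambiguous."""
    while page.count(page[start:end]) > 1:
        if start > 0:
            start -= 1          # include context on the left
        elif end < len(page):
            end += 1            # otherwise grow to the right
        else:
            return None         # element repeated even at full-page scope
    return page[start:end]

# Two identical "OK" buttons: the raw capture "OK" is ambiguous, so the
# region grows leftwards until the neighbouring label makes it unique.
page = "Save[OK] Delete[OK]"
print(unique_locator(page, 5, 7))  # 've[OK'
```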
PESTO's Visual Locator Generator generates a set of visual locators for each web element
located by a DOM-based locator, mapping each DOM-based locator to a visual one. The
DOM-based suite is thus transformed into a visual test suite.
For each web element, the DOM-to-Visual locator mapping is stored as a triple.
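The exact fields of that triple are not reproduced in this summary. Purely as an illustration, such a mapping entry could be represented as below; the field names are assumptions, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class LocatorMapping:
    """Hypothetical shape of one DOM-to-Visual mapping entry."""
    page_object: str                 # page object class owning the element (assumed)
    dom_locator: str                 # original DOM-based locator, e.g. "id=username"
    visual_locators: list = field(default_factory=list)  # one image per element state

m = LocatorMapping("LoginPage", "id=username", ["username_default.png"])
print(m.dom_locator)  # id=username
```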
Of the 942 command calls, 189 turned out to be fairly common. Overall, PESTO was able to
manage almost all of the command calls used in the four training test suites. For the remaining
calls, it copied the commands from the original test suites, producing hybrid test suites. The
resulting test suites could be compiled and executed without requiring any manual intervention.
The migrated test suites contain a total of about 99% visual command calls and just about 1%
untransformed DOM-based command calls. When run without intervention, they produced the
same results as the original DOM-based training test suites.
6. PESTO Evaluation
To evaluate our tool without introducing any author bias, we recruited a professional tester
(referred to from now on as the Tester). The following research questions were answered in this phase:
1. Automation: What’s the required amount of intervention that goes into Module 1 & 2?
2. Correctness: How many migrated visual test suites have issues when locating,
interacting with, and asserting on the Web elements under test?
3. Multistate: How often are multiple visual locators, associated with a single Web
element, required?
4. Applicability: What percentage of the original DOM-based command calls is left
un-transformed by PESTO?
1. DOM-based test suite development: The tester developed a DOM-based test suite
2. DOM-based test suite transformation: PESTO was used for this step; problems were noted
and fixed.
3. Visual test suite evaluation: The newly generated visual test suite was executed to
check for its correctness.
Automation: To answer this, we need to consider (1) the issues faced by PESTO's Visual
Locator Generator: elements in odd positions, such as close to the page border, were
problematic because a visual locator could not be created for them; fortunately, they were few
and appeared at the end of the test suite. (2) The compilation and execution issues faced by
Module 2: this phase required only minor manual refactoring to transform the test suite's syntax
into one supported by PESTO.
Correctness: Apart from minor problems due to Sikuli's search for web elements during page
transitions, every test case worked as expected.
Multistate: For 192 Web elements, 449 visual locators were used. 51% of these locators
corresponded to single-state elements, 20% to double-state elements, and 29% to triple-state
elements. Thus, the more test cases are produced, the higher the probability that a test case
interacts with a given Web element in more than one state.
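Handling multiple visual locators for one element can be done by trying each state's image in turn until one matches. The sketch below is a toy illustration of that lookup, with a string standing in for the screen and a substring matcher standing in for image search; all names are assumptions:

```python
def locate_multistate(screen, visual_locators, find):
    """Try each state image in turn; return (image, position) of the first match, else None.
    `find` is any matcher returning a position or None."""
    for image in visual_locators:
        pos = find(screen, image)
        if pos is not None:
            return image, pos
    return None

# Toy matcher: substring search over a string "screen".
find = lambda screen, img: screen.find(img) if img in screen else None
screen = "[x] Remember me"
# The unchecked-state image misses; the checked-state image matches.
print(locate_multistate(screen, ["[ ]", "[x]"], find))  # ('[x]', 0)
```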
Applicability: Many untransformable commands were expected; however, no such commands
were used by the Tester.
6.5. Discussion
Advantages: PESTO reduces migration time by at least 50%, since it generates visual locators
automatically and removes duplicated locators.
Machine time required for migration: Given the hardware specs, PESTO's average time for
migrating a test suite is comparable to that of the typical manual process.
Execution time of the visual test cases created by PESTO: This time hinges on the underlying
image processing algorithms and the loading time of web pages. PESTO reduces the
computational overhead by removing duplicate locators, so processing is quicker because fewer
images need to be searched. PESTO also introduces small delays to cope with page-loading
problems.
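The small-delay idea can be sketched as a poll-until-timeout loop around the visual search. This is illustrative only, not PESTO's code; the matcher and names are assumptions:

```python
import time

def wait_for_image(find, image, timeout=10.0, poll=0.5):
    """Poll the matcher until the image appears or the timeout expires."""
    deadline = time.monotonic() + timeout
    while True:
        pos = find(image)
        if pos is not None:
            return pos                    # element rendered: return its position
        if time.monotonic() >= deadline:
            raise TimeoutError(f"{image} not found within {timeout}s")
        time.sleep(poll)                  # small delay while the page loads

# Toy matcher that "loads" only after two failed polls.
calls = {"n": 0}
def find(image):
    calls["n"] += 1
    return (10, 20) if calls["n"] >= 3 else None

print(wait_for_image(find, "submit.png", timeout=5, poll=0.01))  # (10, 20)
```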
Visual locators readability: The visual locators generator algorithm of PESTO has been
designed to mimic the visual selection strategy of a human tester and is thus easier to read.
WebDriver commands used in practice: We can conclude that PESTO's potential coverage of
the WebDriver commands used in practice is high, so the tool is easy to learn.
PESTO's limitations: A few commands are not covered by PESTO, such as accept() and
selectByValue(), and elements at the page borders are difficult for PESTO to cover. These
limitations, however, could be addressed in future work.
7. Related Work
The only work that faced challenges similar to PESTO's is the one by Alégroth et al., who
migrated existing automated component-based GUI test cases (tool GUITAR [57]) to the visual
approach (tool VGT GUITAR). Since PESTO migrates the test code while keeping it
functionally the same, it can be regarded as a refactoring tool.
Van Deursen et al. described bad smells in test code. In another work, the same authors study
the interplay between testing and refactoring. Chu et al. proposed a plugin tool to guide test case
refactoring after applying well-established pattern-based code refactoring. PESTO differs
from these works since it aims at refactoring the test code to make it executable by visual Web
testing tools, without modifying the application code.
In future, we intend to evaluate PESTO on some industrial case studies. Moreover, we need to
investigate different visual locator generation strategies and to study how they impact the locator
comprehensibility. Currently, PESTO does not generate a visual test suite portable across
different platform-browser combinations. A possible extension of PESTO could take into
account the possibility of merging two or more visual test suites generated by PESTO in order to
obtain a single, platform-browser portable visual test suite. Finally, we plan to investigate
whether starting from PESTO, it would be possible to derive a more general and tool
independent transformational framework.
Strengths
1. The procedure of transformation from DOM-based to 3rd-generation tests is well explained.
2. The logic behind the transformation is clearly portrayed.
3. Tool experimentation results are well cited.
4. Aids a 2nd-generation tester in understanding 3rd-generation tests.
5. Sample code is provided to aid application of the tool in industry.
6. References are well cited.
7. The need for the tool is well explained.
8. Advantages are well explained.
9. Related work is well explained.
10. Refers to the latest applications.
Weaknesses
1. Explicit implementation of functions like sendKeys() is required.
2. Requires more locators than the DOM-based approach.
3. More complex than the DOM-based approach.
4. Sometimes a visual locator cannot be created.
5. May require modification of the test suite to match PESTO's supported syntax.
6. Visual test suites take more time to execute than DOM-based ones.
7. Some test suites may not follow the Page Object and Page Factory design patterns.
8. Cannot generate a visual locator for accept().
9. Cannot migrate selectByValue().
10. A visual locator may not exist if the target element is repeated on the web page.
11. Visual locators may be understandable only to the specific tester who created them.
12. Requires a lot of domain logic.