Testlink User Manual

Version: 1.15
Status: Updated
2004 - 2009 TestLink Community. Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. The license is available on the "GNU Free Documentation License" homepage.
Table of Contents
1 General information
  1.1 Overall structure
  1.2 Basic Terminology
  1.3 Example of TestLink simple work-flow
  1.4 Short-cuts
2 Test Projects
  2.1 Creating new Test Projects
  2.2 Edit and delete Test Projects
3 Test Specification
  3.1 Test Suites
  3.2 Test Cases
  3.3 Keywords
  3.4 Generate Test Specification document
4 Requirement based testing
  4.1 Availability
  4.2 Requirements Specification Document
  4.3 Requirements
5 Test Plans
  5.1 Create and delete Test Plan
  5.2 Builds
  5.3 Test Set - adding Test Cases to Test Plan
  5.4 Removing Test Cases from Test Set
  5.5 Test execution assignment
  5.6 Prioritize testing
  5.7 Milestones
6 Test Execution
  6.1 General
  6.2 Navigation and settings
  6.3 Test execution
7 Test Reports and Metrics
  7.1 General Test Plan Metrics
  7.2 Query Metrics
  7.3 Blocked, Failed, and Not Run Test Case Reports
  7.4 Test Report
  7.5 Charts
  7.6 Total Bugs For Each Test Case
  7.7 Requirements based report
  7.8 How to add a new report
8 Administration
  8.1 User account settings
  8.2 Role Permissions
  8.3 Test Plan assignment to users
  8.4 Custom Fields
9 Import and Export data
  9.1 Export/Import Test Project
  9.2 Import/Export Test suite
  9.3 Just one Test Case
  9.4 All Test Cases in test suite
  9.5 Import/Export Keywords
  9.6 Import/Export Software Requirements
  9.7 Results import
  9.8 Import Test Cases from Excel via XML
1 General information
TestLink is a web-based test management system. This manual serves as a source for users to understand the processes, terms and organization of work with TestLink. See the Installation manual for more information about system requirements, installation steps and configuration. The latest documentation is available on www.teamst.org or testlink.sourceforge.net. Please use our forum if you have questions that the manual doesn't answer. TestLink code often moves faster than the documentation, and some contributions lack an appropriate description; for this reason, some enhancements are not described here. Thank you for notifying us via the tracker.
Test Suite: organizes Test Cases into logical parts of the Test Specification. Test Suite replaces the components and categories of TL 1.6 and earlier.
Test Plan: created when you'd like to execute Test Cases. Test Plans can be made up of the Test Cases from the current Test Project. A Test Plan includes Builds, Milestones, user assignment and Test Results.
Test Project: something that will exist forever in TestLink. A Test Project will undergo many different versions throughout its lifetime. A Test Project includes the Test Specification with Test Cases, Requirements and Keywords. Users within the project have defined roles. Test Project was called Product in TL 1.6 and earlier.
User: each TestLink user has a Role that defines the available TestLink features. See more in the chapter User Administration. Illustration 2 shows common activities according to user roles.
9. Later, the developers finally add the Chips functionality as well. Adam creates a Test Plan Fish & Chips 2. He can reuse the first Test Plan as a template; all Fish Test Cases and roles will be added automatically. He creates a new Build Fish 1.1 and links all Chips Test Cases to this Test Plan too.
10. Now testing starts as usual.
11. Later, the Admin creates a new Test Project for another product, Hot Dog. But this is another test team and a different story.
1.4 Short-cuts
TestLink has shortcuts to speed up navigation. Use ALT (+ Shift) + key. The globally available shortcuts are:
[h] Home
[s] Test Specification
[e] Test execution
[r] Reports
[i] Personal
[q] Logout/quit
[u] Administration (admin only)
The Create Test Case shortcut [t] is available on the View Test Suite page.
Note: IE only selects the link; the Enter key must then be pressed.
Browser support: the accesskey attribute used here is currently supported by the following web browsers:
Firefox
Internet Explorer 4+ (Windows), 5+ (Mac)
Mozilla (Windows + Linux)
Netscape 6.1+ (Windows)
Opera 7+ (Windows + Linux)
FCKeditor shortcuts
The following shortcuts are supported in the edit text mode:

Shortcut | Action
Ctrl + A | Select All
Ctrl + C, Ctrl + Ins | Copy to clipboard
Ctrl + V, Shift + Ins | Paste from clipboard
Ctrl + X, Shift + Del | Cut
Ctrl + Z | Undo
Ctrl + Y | Redo
Ctrl + K | Insert Link
Ctrl + W | Insert Link without popup of the dialog
Ctrl + B | Bold
Ctrl + I | Italic
Ctrl + U | Underline
Ctrl + Shift + S | Save
Ctrl + Alt + Enter | Fit Window
Ctrl + L | Justify Left
Ctrl + R | Justify Right
Ctrl + E | Justify Center
Ctrl + J | Justify Block
Ctrl + [1-5] | Header 1-5 Format
Ctrl + 0, Ctrl + N | Paragraph Format
Ctrl + Shift + L | Toggles between Bulleted List, Numbered List and Paragraph
Tab, Shift + Tab | Increases and decreases (with Shift) indent. If the cursor is inside the text and FCKConfig.DekiTabSpaces > 0, inserts the spaces. Inside tables the cursor jumps to the next/previous cell, or inserts a new line if there is no next cell. If the cursor is in a title or header, it jumps to the next paragraph.
Shift + Space | Non-break space
2 Test Projects
Test Projects are the basic organisational unit of TestLink. A Test Project could be a product or solution of your company that may change its features and functionality over time but for the most part remains the same. A Test Project includes requirements documentation, Test Specification, Test Plans and specific user rights. Test Projects are independent and do not share data. Consider using just one Test Project for one test team and/or one product. For example: there are two test teams for one product, system testing and integration testing, and these teams will share some test cases. You should create one project for the product. The work of both teams will be structured in two or more trees in the Test Specification, and testing results will be collected in different Test Plans.
Deleting Test Projects from the system is not recommended, as this either orphans a large number of Test Cases or deletes the Test Cases from the system. Test Plans represent the testing of a Test Project at a certain point in time; consequently, Test Plans are created from Test Project Test Cases. We do not recommend creating separate Test Projects for versions of one product. TestLink supports importing XML or CSV data into a Test Project. This is explained further in the Import section, below.
3 Test Specification
TestLink breaks down the Test Specification structure into Test Suites and Test Cases. These levels are persisted throughout the application. One Test Project has just one Test Specification. Future consideration: we plan to implement the following abilities: test patterns, a review process and status for Test Cases, and a variable Test Case structure (for example, step-by-step design).
Illustration 3: Test Specification is structured by Test Suites
Creating one or more Test Suites is one of the first steps when creating your Test Project. Users (with edit rights) can create, delete, copy, move, export and import both nested Test Suites and Test Cases. The title and description can be modified too. You can reorder both Test Cases and child Test Suites via drag & drop of items on the navigation tree (this requires the EXT-JS tree component, which is the default).
Practice: Consider the structure of your Test Specification. You could divide tests by functional/non-functional testing, particular features, or components. You can of course change the structure later and move Test Suites without losing any information.
Practice: Later versions of your product could make some features obsolete. You can create a special Test Suite 'Obsolete' or 'Your product 0.1' and move Test Cases there. Deleting a Test Case causes earlier test results to be deleted too.
Attachments with external documents or pictures can be added to particular Test Suites.
Note: The attachment functionality must be enabled via the TestLink configuration. More information about import and export is in the appendix.
FAQ: Can I copy/move Test Suites between projects? Not directly in this version; Test Projects are designed to be fully independent of each other. Use import/export as a workaround.
Title: can include either a short description or an abbreviation (e.g. TL-USER-LOGIN).
Summary: should be really short; just for overview, introduction and references.
Steps: describe the test scenario (input actions); precondition and clean-up information can also be included here.
FAQ: Our test steps are pretty detailed and use a lot of formatting coming from Word. Use the "Paste from msword" feature of FCKeditor; it cleans up the garbage. (Configure the toolbar if it is not available.)
Expected results: describe checkpoints and expected behaviour of a tested product or system.
Attachments: can be added if the configuration allows it.
Importance: the test designer can set the importance of the test [HIGH, MEDIUM or LOW]. The value is used to compute priority in a Test Plan.5
Execution type: the test designer can set the automation support of the test [MANUAL/AUTOMATED].5
Custom fields: the Administrator can define custom parameters to enhance Test Case description or categorization. See 8.4 for more. Large custom fields (more than 250 characters) are not possible, but such information can be added to the parent Test Suite and referred to via custom fields. For example, you can describe configurations 'standard', 'performance', 'standard_2' and refer to these labels via a custom field.
Note: Development is planned to make the Test Case structure more flexible in the future.
Test Case - Active attribute
If several versions of a Test Case exist, the Active/Inactive attribute can be used in this way:
- Every Test Case version is created ACTIVE.
- An inactive Test Case version will not be available in "Add Test Cases to Test Plan". This can be useful for test designers: they can edit or change the Test Case version and, only when they decide it is completed, change the status to ACTIVE so it becomes available for use in a Test Plan.
- Once a Test Case version has been linked to a Test Plan and has results, it can't be turned INACTIVE.
4 TestLink 1.7 and earlier versions had this ID as a global counter independent of the Test Project. TestLink 1.8 still uses this unique ID for API calls.
5 The feature must be enabled on Test project management.
Spell checker
You can use your web browser's spell-checking ability. Firefox, for example, has add-ons such as the British English Dictionary and a number of dictionaries for other languages. FCKeditor also supports its own solution; see its Developers Guide for more.
Illustration 5: What you see when trying to add Test Cases to a Test Plan
As you can see, the number near the Test Project name (in this example: toaster_xl5) is 2, but the Test Project has 3 Test Cases. Test Case TC1 is not counted because it is inactive.
Removing Test Cases
Test Cases and Test Suites may be removed from a Test Plan by users with lead permissions. This operation may be useful when first creating a Test Plan, since there are no results yet. However, removing Test Cases will cause the loss of all results associated with them. Therefore, extreme caution is recommended when using this functionality.
Requirements relation
Test Cases can be related to software/system requirements as an n-to-n relation. The functionality must be enabled for a Test Project. Users can assign Test Cases to Requirements via the Assign Requirements link in the main screen.
Test Plan relation
Test Cases can be assigned to particular Test Plans for execution. The Test Leader uses the Add / Remove Test Cases link on the main page to select the appropriate Test Set. The list of related Test Plans is shown on the View Test Case page under the title Test Plan usage. See the screenshot below.
Illustration 6: The list of related Test Plans shown on the "View Test Case" page
3.3 Keywords
Keywords were created to give users another level of depth when categorizing Test Cases. Keywords serve as a means of grouping Test Cases that share some attribute within a Test Specification. For example, you can use them to define:
- Reviewed Test Cases
- A set of Test Cases valid for one platform
Keyword Creation
At this time keywords can only be created by users with mgt_modify_key rights. These rights are currently held only by leaders. Once a keyword or group of keywords has been created, users may assign them to Test Cases.
Assigning Keywords
Keywords may be assigned to Test Cases either from the assign keyword screen (in batch) or via Test Case management (individually).
Filter by Keyword
Users have the ability to filter by keywords to:
- Search Test Cases in the Test Specification.
- Add groups of Test Cases to a Test Set (Test Plan).
- Filter the Execute tests screen.
Illustration 7: Options for the generated Test Specification
The user can choose the document format (HTML, OpenOffice text document or MS Word document), TOC, Test Suite data, Test Case summary, steps + expected results, and the list of related keywords and/or requirements.
- Linking risks and requirements will reveal vague or missing requirements. This is especially interesting for risks with a high priority. Testing can be focused on the most important parts of an information system first: covering the risks with the highest priority.
- Communicating in the same language as the client and the stakeholders makes it easier to report on the status of the Test Project. A better-founded decision can then be made on whether to invest more in testing or take the risk.
- The risks and their priority make negotiating on the Test Project in times of pressure easier: which risks have to be covered within this Test Project and which ones can be postponed.
Risk and requirement-based testing results in a better controlled Test Project. The communication with the client and the stakeholders is improved. The test manager begins testing with the risks with the highest priority. The process is streamlined and the end result is higher quality.
4.1 Availability
The functionality is available at the Test Project level, i.e. an Administrator must enable it for a specific Test Project (Edit Test Project link in the Main window); otherwise the links are not shown. There are two user levels for this feature: most roles can view requirements but not modify them. Refer to the User section for more details.
Illustration 9: Manage a Requirements Specification
Create a document with Requirements:
1. Click Requirements Specification in the Main window. The list of Requirement Specifications is shown.
2. Press the Create button to create a document.
3. Adjust Title, Scope and optionally the Count of Test Cases. The last parameter is used for statistics; use it only if you have a valid requirement document but not all requirements are available in TestLink at the moment. The default value '0' means that the current count of requirements in the specification is used.
4. Press the Create button to add the data to the database. You can see the title of your newly created document in the table listing the Requirement Specifications.
5. Click the title of the document to continue working with it. The Requirement Specification window is shown.
Each Requirement Specification has its own statistics and report related to the included data. All Specifications can be printed using the "Print" button in the "Requirement Specification" window. The Administrator can define company, copyright and confidentiality text via the configuration files.
Warning: TestLink 1.8 brings support for an n-depth tree structure for requirements (it must be enabled via configuration). However, some related features (for example import/export and document generation) are not compliant with this setting yet. These features behave as if only direct child requirements exist, and all inner sub-folders are ignored.
4.3 Requirements
Each requirement has a Title, Scope (HTML format) and Status. The Title must be unique and has a maximum of 100 characters. The Scope parameter is text in HTML format. The Status can have the values VALID or NOT_TESTABLE. A NOT_TESTABLE requirement is not counted in metrics. Requirements can be created, modified or deleted manually via the TestLink interface, or imported from a CSV file.
Import requirements
TestLink supports two types of CSV. The first, 'simple', consists of a title and a scope in each row. The second, 'Export from Doors', tries to detect the header and choose the correct fields. The import compares titles and tries to resolve conflicts. There are three ways to do this: update, create requirements with the same title, or skip adding the conflicting ones.
Requirements to Test Case relation
Test Cases are related to software/system requirements as n to n, i.e. you can assign one or more Test Cases to one requirement, and one or more requirements can be covered by one Test Case. Users can assign Requirements to Test Cases via the Assign Requirements link in the Main window. The coverage of the Test Specification can be viewed by pressing the Analyse button in the Requirement Specification window.
Requirement based Report
Navigate to the Reports and Metrics menu; there is a Requirements based Report link. Requirements in the current Requirement Specification and Test Plan are analysed for this report. The latest results of the Test Cases (available in the Test Plan) are processed for each requirement, and the result with the highest priority is applied to the requirement. Priorities, from highest to lowest, are: Failed, Blocked, Not Run and Passed.
Example of requirement coverage
A requirement is covered by three Test Cases. Two of them are included in the current Test Plan. One passed and one was not tested in Build 1, so the requirement has an overall result of Not Run. The second Test Case was tested with Build 2 and passed, so the requirement passed too.
5 Test Plans
A Test Plan is a record of the test planning process detailing the degree of tester involvement, the test environment, the Test Case design techniques and test measurement techniques to be used, and the rationale for their choice. Test Plans are the basis for the test execution activity. A Test Plan contains a name, a description, a collection of chosen Test Cases, Builds, Test Results, milestones, tester assignments and a priority definition. Each Test Plan is related to the current Test Project.
A Test Plan could, for example, document:
- Summary/Scope
- Features to be tested
- Features not to be tested
- Test criteria (to pass the tested product)
- Test environment, infrastructure
- Test tools
- Risks
- References (product plan or change request, quality document(s), etc.)
Test Plans are made up of Test Cases imported from a Test Specification at a specific point of time. Test Plans may be created from other Test Plans. This allows users to create Test Plans from Test Cases that exist at a desired point in time. This may be necessary when creating a Test Plan for a patch. In order for a user to see a Test Plan they must have the proper rights. Rights may be assigned (by leads) in the define User/Project Rights section. This is an important thing to remember when users tell you they can't see the project they are working on. Test Plans may be deleted by users with lead privileges. Deleting Test Plans permanently deletes both the Test Plan and all of its corresponding data, including Test Cases (not in Test Specification), results, etc. This should be reserved only for special cases. Alternatively, Test Plans may be deactivated on the same page, which suppresses display on selection menus in the main and Execute pages.
5.2 Builds
A user with lead privileges can follow the Build management link on the main page. Builds are a specific release of software. Each project in a company is most likely made up of many different Builds. In TestLink, execution is made up of both Builds and Test Cases. If there are no Builds created for a project, the execution screen will not allow you to execute, and the metrics screen will be completely blank.
Illustration 10: Build management
Each Build is identified by its title. It includes a description (HTML format) and two states:
- Active / Inactive defines whether the Build is available for TestLink functionality. An inactive Build is not listed on either the execution or the reports pages.
- Opened / Closed defines whether Test Results can be modified for the Build.
Builds can be edited (via the link under a Build title) and deleted (by clicking the appropriate bin icon) in the table of existing Builds.
Users can choose Test Cases via check-boxes and hit the Add selected button to define the Test Set. Click on the 'check' icon to select all Test Cases in a Test Suite. All Test Cases can also be added by clicking the label under the Test Suite title. A specific version of a Test Case is assigned to a Test Plan; the user can update the version for later testing if the Test Case has been updated. The content of an executed Test Case version cannot be modified.
Note: There is a basic rule: one Test Plan has just one set of Test Cases. You can add more new Test Cases during testing, but this affects the metrics of older Builds. In most cases this is not an important difference. If you do care about it, you have two possibilities:
- Create a new Test Plan for the second round.
- Use Keywords to distinguish the phases of testing within one Test Plan.
Illustration 12: Frame for adding Test Cases into a Test Plan
Users can reorder Test Cases within the Test Set. This modification has no influence on the Test Specification. Modify the numbers in the Execution order column and press the Save order button.
Illustration 13: Frame for modifying content of test cases within Test Plan
Using the tree menu in the left frame, you can choose a whole Test Suite, which will then be displayed in detail in the right frame.
To assign a Test Case to a user, the following steps must be followed:
1. Select the Test Case, using the check-box to the left of the Test Case ID.
2. Choose the user, using the combo box in the column labelled 'Assign to'.
3. Use the 'Save' button to write the selection to the database.
To simplify this task when you need to assign a whole Test Suite to a user, bulk user assignment can be used. Users can notify the assigned testers by selecting the Send mail notification to tester check-box.
Bulk user assignment
Bulk user assignment is not really different from normal assignment. Using the clickable images you can toggle the state of multiple check-boxes with one click. In the image displayed above, you can see the following help when you put the cursor over the image to the left of the Test Suite name.
Clicking on this image will result in all Test Cases present in the Transportation Test Suite and its child Test Suites being selected, as displayed in the following image:
The next step is to choose the user; this is done using the bulk assignment section under the Test Suite name. Then, using the 'Do' button, the combo boxes of all checked check-boxes will be set to 'admin'.
Bulk assignment can also be done only on the Test Cases belonging to a Test Suite, without affecting child Test Suites. For example, if you want to change the assignment only for the Test Cases that are direct children of the HyperSpace Test Suite, instead of using the image to the left of the Test Suite name:
5.7 Milestones
The test leader can define a milestone for a certain date with a goal of an expected percentage of finished tests. This goal can be defined for three levels of priority if test prioritization is enabled. See the Test Plan metrics report to check whether these goals have been met.
Illustration 14: Test leader can define one or more milestones for a Test Plan
6 Test Execution
6.1 General
Test execution is available after:
1. A Test Specification is written.
2. A Test Plan is created.
3. Test Cases are added to the Test Plan.
4. At least one Build is created.
5. Testers have the appropriate execution rights to work with this Test Plan.
Select the Execute link in the top menu or the Execute Tests link on the Main page to navigate to the Test Execution window. The left pane allows navigation through the Test Cases via a tree menu, and holds settings and filters. The execution window shows the relevant information and allows adding test results.
Define a tested Build
Users should specify one of the active Builds to add results against; the latest Build is set by default. The Build label specifies the exact package of the product under test, for tracking purposes. Each Test Case may be run several times per Build; however, it is common practice that just one testing round is executed against a Build for a Test Case. Builds can be created by the Test Leader using the Create New Build page.
Search a Test Case
Users can specify the exact Test Case identifier for faster navigation.
Filtering Test Cases
This table allows the user to filter Test Cases for smart navigation before they are executed. You must hit the Apply filter button to propagate new filter settings.
- Tester: users can filter Test Cases by their tester. There is also a check-box to see both the chosen tester's and unassigned Test Cases.
- Keyword: users can filter Test Cases by keyword. See 3.3 Keywords.
- Result: users can filter Test Cases by result. Results are what happened to a Test Case during a particular Build. Test Cases can pass, fail, be blocked, or not be run.
- Test priority: users can filter Test Cases by priority.
Tree menu
The tree menu in the navigation pane shows the chosen list of Test Cases in the Test Plan. It allows opening the appropriate Test Case(s) for test execution in the right frame. Test Suites in the menu are enhanced with a short test status overview behind the title (count of Test Cases, plus passed, failed, blocked and not-executed counters) coloured by result. The tree menu can be coloured only for some types of the menu component used. By default the tree is coloured by the results for the Build chosen from the drop-down box.
Example - TC coloured according to the Build: the user selects Build 2 from the drop-down box and does not check the "most current" check box. All Test Cases will be shown with their status from Build 2. So, if Test Case 1 passed in Build 2, it will be coloured green.
A second possibility is that the menu is coloured according to the latest Test Result.
Example - TC coloured according to the latest result: the user selects Build 2 from the drop-down box and this time checks the "most current" check box. All Test Cases will be shown with their most current status. So, if Test Case 1 passed in Build 3, even though the user has also selected Build 2, it will be coloured green.
Illustration 16: User can select to print only the last result
Illustration 17: Only the last result can be printed
The indication that a Test Case was updated or deleted in the Test Specification is not supported after version 1.5. Updated Test Cases: TL 1.0.4 had an indication by flag that is missing since version 1.6. If users have the proper rights, they can go to the Update modified test case page through the link on the main page. It is not necessary for users to update Test Cases if there has been a change (a newer version or a deletion).
Results by top level Test Suites
Lists the results of each top level suite. Totals for each status are listed: passed, failed, blocked, not run, and percent completed. A completed Test Case is one that has been marked pass, fail, or blocked. Results for top level suites include all child suites.
Results by Build
Lists the execution results for every Build. For each Build, the total Test Cases, total pass, % pass, total fail, % fail, blocked, % blocked, not run, and % not run are displayed. If a Test Case has been executed twice on the same Build, the most recent execution is taken into account.
Results by Keyword
Lists all keywords that are assigned to cases in the current Test Plan, and the results associated with them. Example:

Keyword | Total | Passed | Failed | Blocked | Not run | Completed [%]
P3 | 1128 | 346 | 47 | 55 | 680 | 39.72
P2 | 585 | 372 | 25 | 31 | 157 | 73.16
P1 | 328 | 257 | 6 | 51 | 14 | 95.73
Results by owner
Lists each owner that has Test Cases assigned to them in the current Test Plan. Test Cases which are not assigned are tallied under the unassigned heading. Example:

Tester | Completed [%]
Dominika | 51.47
Mohammad | 37.80
unassigned | 57.14
Ken | 45.67
Mallik | 67.91
Ali | 72.25
Mike | 91.67
Alex | 70.59
Query Form Page: The user is presented with a query page with 4 controls. Each control is set to a default which maximizes the number of Test Cases and Builds the query should be performed against. Altering the controls allows the user to filter the results and generate specific reports for specific owner, keyword, suite, and Build combinations.
- keyword: 0 to 1 keywords can be selected. By default no keyword is selected. If a keyword is not selected, then all Test Cases will be considered regardless of keyword assignments. Keywords are assigned in the Test Specification or Keyword Management pages. Keywords assigned to Test Cases span all Test Plans, and span across all versions of a Test Case. If you are interested in the results for a specific keyword you would alter this control.
- owner: 0 to 1 owners can be selected. By default no owner is selected. If an owner is not selected, then all Test Cases will be considered regardless of owner assignment. Currently there is no functionality to search for unassigned Test Cases. Ownership is assigned through the Assign Test Case execution page, and is done on a per Test Plan basis. If you are interested in the work done by a specific tester you would alter this control.
- top level suite: 0 to n top level suites can be selected. By default all suites are selected. Only suites that are selected will be queried for result metrics. If you are only interested in the results for a specific suite you would alter this control.
- Builds: 1 to n Builds can be selected. By default all Builds are selected. Only executions performed on the selected Builds will be taken into account when producing metrics. For example, if you wanted to see how many Test Cases were executed on the last 3 Builds, you would alter this control.
Keyword, owner, and top level suite selections dictate the number of Test Cases from your Test Plan that are used to compute per-suite and per-Test-Plan metrics. For example, if you select owner=Greg, Keyword=Priority 1, and all available test suites, only Priority 1 Test Cases assigned to Greg will be considered. The number of Test Cases totals you will see on the report will be influenced by these 3 controls. Build selections will influence whether a case is considered pass, fail, blocked, or not run. Please refer to the Last Test Result rules as they appear above. Press the submit button to proceed with the query and display the output page.
Illustration 18: Query metrics - Input parameters
Query Report Page: The report page will display:
1. the query parameters used to create the report,
2. totals for the entire Test Plan,
3. a per-suite breakdown of totals (sum / pass / fail / blocked / not run) and all executions performed on that suite.
If a Test Case has been executed more than once on multiple Builds, all executions recorded against the selected Builds will be displayed. However, the summary for that suite will only include the Last Test Result for the selected Builds.
7.5 Charts
This report page requires your browser to have a Flash plug-in. The Last Test Result logic is used for all 4 charts that you will see. The graphs are animated to help the user visualize the metrics from the current Test Plan. The four charts provided are:
1. Pie chart of overall pass / fail / blocked / not run Test Cases
2. Bar chart of Results by Keyword
3. Bar chart of Results by Owner
4. Bar chart of Results by Top Level Suite
The bars in the bar charts are coloured such that the user can identify the approximate number of pass, fail, blocked, and not run cases. It uses flash technology provided by http://www.maani.us to display results in a graphical format.
The report shows the following metrics:
- Total number of requirements
- Requirements within TestLink
- Requirements covered by Test Cases
- Requirements not covered by Test Cases
- Requirements not covered or not tested
- Requirements not tested
Requirements are divided into four sections. Each requirement is listed together with all related Test Cases (coloured according to Test Case result):
8 Administration
8.1 User account settings
Any visitor can create their own account on the login page. This auto-create feature can be disabled by configuration. There is also a configurable default role parameter ('guest' by default). Every user on the system is able to edit their own information via the Account settings window ('Personal' link in the menu bar): they can modify their own name, email address and password, and generate an API key (if enabled). TestLink allows users with administrator rights to create, edit, and delete users within the system. However, TestLink does not allow administrators to view or edit users' passwords. If users forget their password, there is a link on the login screen that will mail them their password, based upon the user name and email address they enter.
- Guest: a guest only has permission to view Test Cases, reports and metrics. He cannot modify anything.
- Test Executor: a tester has permission to see and run the tests allocated to them.
- Test Designer: a user who can fully work with (view and modify) the Test Specification and Requirements.
- Test Analyst: a tester who can view, create, edit, and delete Test Cases as well as execute them. Testers lack the permissions to manage Test Plans, manage Test Projects, create milestones, or assign rights (initially Senior tester).
- Test Leader: a lead has all of the same permissions as a tester but also gains the ability to manage Test Plans, assign rights, create milestones, and manage keywords.
- Administrator: an admin has all possible permissions (leader plus the ability to manage Test Projects and users).
Changing these rights is handled via the user administration link, which is accessible by administrators. Admins can add new custom roles and modify existing ones via the GUI. See the Developer's Guide to understand more. If you view the table you will see rows for each of the permission levels (guest, tester, senior tester, leader, admin). The second column holds all of the different rights levels, which are defined below. These levels have been determined as a standard for use, but they can be edited to define new roles (for experienced administrators). The user table contains a foreign key that points to the appropriate permission level in the rights table.
Note: Test Plan related features also require a Test Plan to be assigned in order to be available. See Test Plan Assignment.
Case study - restrict access by default
Situation: In our organization we would like to restrict access to projects unless it is specifically granted. We currently have about 150 users with just over 90 different projects. Many of the users are not QA, but are business analysts, and in some cases end users doing UAT. From what I can tell, all rights are inherited based on how the user was set up. But we don't want a business analyst who is working on only one project to have access to all 90.
Solution: Set these users to the global role <No rights> and grant an appropriate role on a per Test Project or Test Plan basis. In custom_config.inc.php you can also set the default role id to <No rights>: $tlCfg->default_roleid = TL_ROLES_NO_RIGHTS; You might also want to rename the No rights role to something more polite.
Each custom field is defined by:
- name
- type (possible values are listed below)
- value constraints (possible values, default value, regular expression, minimum length, maximum length)
- availability/display control (where the field will show up and can be filled in)
All fields are compared in length to be greater than or equal to the minimum length, and less than or equal to the maximum length, unless these values are 0; if the values are 0, the check is skipped. All fields are also compared against the regular expression; if the value matches the expression, then the value is stored. For example, the expression "/^-?([0-9])*$/" can be used to constrain an integer. There is data format verification for the types numeric, float and email. On submit (when creating a Test Case or while executing), custom fields of these types are checked. An empty value is OK, but if the value is not empty and does not conform to the regular expression (not user defined, but coded), the submit will be blocked with a message on screen. The list below describes the field types and the value constraints.

- String: a text string up to 255 characters.
- Numeric: an integer.
- Float: a floating point number.
- Email: an email address string up to 255 characters. When displayed, the value will also be encapsulated in a mailto: reference.
- Checkbox: zero or more of a list of text strings. Enter the list of text strings separated by "|" (pipe character) in the Possible Values field; the Default value should match one of these strings as well. This will be displayed as a list of text strings with a checkbox beside each.
- List: one of a list of text strings. Enter the list of text strings separated by "|" (pipe character) in the Possible Values field; the Default value should match one of these strings as well. This will be displayed as a dropdown menu.
- Multiselection List: zero or more of a list of text strings. Enter the list of text strings separated by "|" (pipe character) in the Possible Values field; the Default value should match one of these strings as well. This will be displayed as a multi-line dropdown menu.
- Date: a date. This is displayed as a set of dropdown menus for day, month, and year. Defaults should be defined in yyyy-mm-dd format.

Table 1: Custom field types
The display entries are used as follows:
Entry | Meaning
Display on test specification | If checked, the field will be shown in the Test Specification tree
Enable on test specification | If checked, the field will be editable (assign/change) on the Test Specification
Display on test execution | If checked, the field will be shown on Test Execution
Enable on test execution | If checked, the field will be editable on Test Execution
Available for | Test case, Test plan or Test suite

Table 2: Custom fields: Availability/Display Attributes

Example 1. Custom Field: Additional Notes
Type: string, applicable to Test Suites, to be edited ONLY during Test Case specification, but useful to be seen during test execution.
show on design = YES
enable on design = YES
show on execution = YES
enable on execution = NO

Example 2. Custom Field: Operating System
Type: list, applicable to Test Cases, to be edited ONLY during Test Case EXECUTION, unused during Test Case DESIGN.
show on design = NO
enable on design = NO
show on execution = YES
enable on execution = YES
The custom field feature has been implemented using a mix of functionality from the Mantis (http://www.mantisbt.org/) and dotProject (http://www.dotproject.net/) models.
Custom fields: Estimated and Actual execution time
Estimated Execution Time and Actual Execution Time custom fields can be added to any project, and TestLink will automatically summarize those fields during the reporting process.
Step 1 - Create the custom fields
NOTE: The user must have permissions (i.e. Custom Field Management must be enabled for the user's role) to define and assign custom fields. First select Define custom fields from the Test Project group as seen below.
Next select the Create button. It is most important that the Name field is exactly CF_ESTIMATED_EXEC_TIME, due to the built-in queries designed against this field. The Type field must be either Numeric, String or Float, as the built-in queries will only work on numerical data. The Float type is suggested, as it allows more flexibility.
Once completed, click Add to add the new custom field to the system. Next, repeat the process for CF_EXEC_TIME as seen below.
The new custom fields will be available for assignment to ANY project in TestLink. Make sure the Test Project is set to the project that you want before assigning the custom fields.
Step 2 - Assign custom fields
Select Assign custom fields from the Test Project group as seen below. Select each of the custom fields and click Assign. Now each of the custom fields has been assigned to the current project.
When you create a new test case (or edit a previous entered test case), above the Keywords section you will notice the custom fields.
Add the estimated time and click Create (or Save on an edit). Localization: use . (dot) as the decimal separator, otherwise the sum will be wrong.
Step 3 - Add the test cases to the Test Plan
See chapter 5.3 for details.
Step 4 - Summation of the estimations
Select Test reports and Metrics from the Test Execution group as seen below. Select Test Report from the menu, then check the Metrics option. Finally select the Test Suite or Test Case for which you wish to create the report. At the bottom of the report will be a summary of the estimated execution time for all of the test cases.
NOTE: The time entered into the custom fields doesn't have to be in units of minutes; hours can also be used. The $TLS_estimated_time_hours and $TLS_estimated_time_min strings are predefined strings that can be selected based on your preference.
Item | File format | Import | Export | Notes
Test project | XML | X | X | All test suites and test cases. You can choose if export also assigned keywords.
Test suite | XML | X | X | Test suite details, all test cases and child test suites and their test cases. You can choose if export assigned keywords.
Test case | XML | X | X | Two types of exports can be done: just one test case, or all test cases in a test suite. You can choose if export assigned keywords. Custom Fields assigned are exported. Requirements assigned are exported.
Test case | XLS | X | | Keywords import is NOT supported.
Keyword | CSV, XML | X | X | All test project keywords.
Requirement | CSV, XML | X | X |
Requirement | CSV DOORS, DocBook | X | |
Results | XML | X | |

Table 3: Items that can be exported/imported
Limitation: Attached files and custom fields are not imported/exported. Table format (CSV) is not directly supported in some cases; you should convert it into XML before import. See below for more.
Requirements for Internal and External ID
Every object has its internal ID; this ID is the value of the ID column in the corresponding database table. Test Cases are a special case because they have both an internal and an external ID. Every time you see the keyword ID in the XML format, it indicates the INTERNAL ID.
Warning: You can reach the server memory limit for larger amounts of Test Cases.
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="">
  <details><![CDATA[]]></details>
  <testsuite name="Communications">
    <details><![CDATA[<p>Communication Systems of all types</p>]]></details>
    <testsuite name="Hand-held devices">
      <details><![CDATA[]]></details>
      <testcase name="10 G shock">
        <summary><![CDATA[]]></summary>
        <steps><![CDATA[]]></steps>
        <expectedresults><![CDATA[]]></expectedresults>
      </testcase>
      <testcase name="Gamma Ray Storm">
        <summary><![CDATA[]]></summary>
        <steps><![CDATA[]]></steps>
        <expectedresults><![CDATA[]]></expectedresults>
      </testcase>
    </testsuite>
    <testsuite name="Subspace channels">
      <details><![CDATA[<p>Only basic subspace features</p>]]></details>
      <testcase name="Black hole test">
        <summary><![CDATA[]]></summary>
        <steps><![CDATA[]]></steps>
        <expectedresults><![CDATA[]]></expectedresults>
      </testcase>
    </testsuite>
  </testsuite>
  <testsuite name="Holodeck">
    <details><![CDATA[]]></details>
    <testcase name="Light settings">
      <summary><![CDATA[]]></summary>
      <steps><![CDATA[]]></steps>
      <expectedresults><![CDATA[]]></expectedresults>
    </testcase>
  </testsuite>
  <testsuite name="Propulsion Systems">
    <details><![CDATA[]]></details>
    <testsuite name="Main engine">
      <details><![CDATA[]]></details>
      <testcase name="Emergency stop">
        <summary><![CDATA[]]></summary>
        <steps><![CDATA[]]></steps>
        <expectedresults><![CDATA[]]></expectedresults>
      </testcase>
    </testsuite>
  </testsuite>
</testsuite>
    <expectedresults><![CDATA[]]></expectedresults>
  </testcase>
  <testcase name="Gamma Ray Storm">
    <summary><![CDATA[]]></summary>
    <steps><![CDATA[]]></steps>
    <expectedresults><![CDATA[]]></expectedresults>
  </testcase>
</testsuite>
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="Hand-held devices">
  <details><![CDATA[]]></details>
  <testcase name="10 G shock">
    <summary><![CDATA[]]></summary>
    <steps><![CDATA[]]></steps>
    <expectedresults><![CDATA[]]></expectedresults>
    <keywords>
      <keyword name="Klyngon">
        <notes><![CDATA[Klyngon keyword notes]]></notes>
      </keyword>
    </keywords>
  </testcase>
  <testcase name="Gamma Ray Storm">
    <summary><![CDATA[]]></summary>
    <steps><![CDATA[]]></steps>
    <expectedresults><![CDATA[]]></expectedresults>
    <keywords>
      <keyword name="Klyngon">
        <notes><![CDATA[Klyngon keyword notes]]></notes>
      </keyword>
      <keyword name="Moon rocks">
        <notes><![CDATA[Moon rocks keyword notes]]></notes>
      </keyword>
    </keywords>
  </testcase>
</testsuite>
<steps><![CDATA[
  <p>Preset bias to 0</p>
  <p>Enable <strong>long range</strong> communications control</p>
  <p>Simulate black hole interference</p>]]>
</steps>
<expectedresults><![CDATA[
  <table width="200" cellspacing="1" cellpadding="1" border="1">
    <caption>Main Results</caption>
    <tbody>
      <tr><td>Spin value</td><td>9.9</td></tr>
      <tr><td>Opposite Angle</td><td>18 rad</td></tr>
      <tr><td> </td><td> </td></tr>
    </tbody>
  </table>]]>
</expectedresults>
<custom_fields>
  <custom_field>
    <name><![CDATA[CF_SKILLS_NEEDED]]></name>
    <value><![CDATA[QA Engineer]]></value>
  </custom_field>
  <custom_field>
    <name><![CDATA[CF_ESTIMATED_EXEC_TIME]]></name>
    <value><![CDATA[12]]></value>
  </custom_field>
</custom_fields>
</testcase>
</testcases>
    </requirement>
  </requirements>
</testcase>
<testcase internalid="12648" name="Jump start">
  <node_order><![CDATA[0]]></node_order>
  <externalid><![CDATA[184]]></externalid>
  <summary><![CDATA[]]></summary>
  <steps><![CDATA[]]></steps>
  <expectedresults><![CDATA[]]></expectedresults>
  <requirements>
    <requirement>
      <req_spec_title><![CDATA[RSPEC-001]]></req_spec_title>
      <doc_id><![CDATA[ENG-0002]]></doc_id>
      <title><![CDATA[Main Deflector]]></title>
    </requirement>
    <requirement>
      <req_spec_title><![CDATA[RSPEC-001]]></req_spec_title>
      <doc_id><![CDATA[DOC-009]]></doc_id>
      <title><![CDATA[James Bond]]></title>
    </requirement>
  </requirements>
</testcase>
</testcases>
<?xml version="1.0" encoding="UTF-8"?>
<testcases>
  <testcase name="10 G shock">
    <summary><![CDATA[]]></summary>
    <steps><![CDATA[]]></steps>
    <expectedresults><![CDATA[]]></expectedresults>
  </testcase>
  <testcase name="Gamma Ray Storm">
    <summary><![CDATA[]]></summary>
    <steps><![CDATA[]]></steps>
    <expectedresults><![CDATA[]]></expectedresults>
  </testcase>
</testcases>
Test cases in XLS format
Every row must have four columns:

Column number | Contents
1 | Test case name
2 | Summary
3 | Steps
4 | Expected results
The first row will be skipped, because it is supposed to contain the column descriptions. Example:

Name | Summary | Steps | Expected results
Engine fast start up | Start up in 5 seconds | Bla, bla, bla | Bla, bla
Engine stop | xxx | 1. Unlock panic button 2. Press panic button 3. Press confirm | Engine must stop right now
Illustration 20: The Keywords frame includes buttons for import and export
Example of CSV (Keyword;Notes):
Klyngon;Klyngon keyword notes
Moon rocks;Moon rocks keyword notes
Example of XML with keywords:
<?xml version="1.0" encoding="UTF-8"?>
<keywords>
  <keyword name="Klyngon">
    <notes><![CDATA[Klyngon keyword notes]]></notes>
  </keyword>
  <keyword name="Moon rocks">
    <notes><![CDATA[Moon rocks keyword notes]]></notes>
  </keyword>
</keywords>
The CSV file includes the identifier of the document, the title and the description. Example of a CSV file:
ENG-0001,Terrestrial Propulsor, ENG-0002,Main Deflector,"<p>Main deflector bla, bla, bla.</p>"
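Requirements can also be imported and exported as XML (see Table 3). No standalone requirement XML example appears in this manual, so the following is only a hypothetical sketch: the element names are inferred from the requirement fragments embedded in the Test Case export above and may differ in your TestLink version. Export an existing Requirement Specification first and copy its structure before building an import file.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sketch: element names inferred, not confirmed by this manual. -->
<requirements>
  <requirement>
    <doc_id><![CDATA[ENG-0001]]></doc_id>
    <title><![CDATA[Terrestrial Propulsor]]></title>
    <description><![CDATA[]]></description>
  </requirement>
  <requirement>
    <doc_id><![CDATA[ENG-0002]]></doc_id>
    <title><![CDATA[Main Deflector]]></title>
    <description><![CDATA[<p>Main deflector bla, bla, bla.</p>]]></description>
  </requirement>
</requirements>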
Import rich text format requirements via DocBook
There is limited support for importing documents in formats such as MS Word or OpenOffice. You can export the original document as DocBook (tested with OpenOffice 2 and 3). Choose the import button for your SRS in TestLink and select the type DocBook. The exported file is XML. The basic element for the default settings could be the following:
...
<sect3>
  <title>Title</title>
  ...
  <para>Description</para>
  ...
  <orderedlist>
    <listitem>Item</listitem>
    ...
  </orderedlist>
  ...
  <informaltable>
    <tgroup>
      <thead>
        <row>
          ...
          <entry></entry>
          ...
        </row>
      </thead>
      <tbody>
        <row>
          ...
          <entry></entry>
          ...
        </row>
      </tbody>
    </tgroup>
  </informaltable>
  ...
</sect3>
TestLink uses such an element as the data for just one requirement. This element is defined via the constant DOCBOOK_REQUIREMENT (check the code), i.e. <sect3> is the default but can be modified. Each requirement's content is handled the following way:
- title: receives the text in the <title> tag.
- req_doc_id: parses the title for the first two words and adds a counter. You can modify the regular expression directly in the code; the default is "[ a-zA-Z_]*[0-9]*".
- description: parses the elements following the title (<para>, <orderedlist>, <informaltable>, etc.). DocBook elements are converted to HTML tags; unknown ones are omitted.
Warning: the original code may need to be modified to fit your DocBook structure. Check requirement.inc.php (function importReqDataFromDocBook($fileName)) and related constants.
Warning: the generated REQ_DOC_ID is dangerous in the case of an update, because it is generated from the file content without any relation to existing TestLink data.
  <result>f</result>
  <notes>functionality works great KIMI</notes>
</testcase>
<testcase external_id="1256"> <!-- Using INTERNAL ID -->
  <result>f</result>
  <notes>Using INTERNAL ID as link</notes>
</testcase>
</results>
You can import several execution results using a single XML file.
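Putting the fragment above together, a complete results import file might look like the sketch below. Only the <results>, <testcase external_id="...">, <result> and <notes> elements and the 'f' (failed) result code appear in this manual; the 'p' (passed) code and the overall layout are assumptions, so check a file accepted by your TestLink version before importing.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of a results import file; the 'p' result code is an assumption. -->
<results>
  <testcase external_id="1255">
    <result>p</result>
    <notes>Passed without remarks.</notes>
  </testcase>
  <testcase external_id="1256">
    <result>f</result>
    <notes>functionality works great KIMI</notes>
  </testcase>
</results>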
Step 4. You will then get a dialog box asking "Where do you want to put the data?"
Step 5. Choose the first option, "XML list in existing worksheet", with the first cell $A$1.
Step 6. You will be able to see the following columns: name, summary, steps & expected results.
Step 7. Copy your data into this file accordingly & save the file in XML Data (*.xml) format.
Step 8. Check your XML file for correctness by opening it in Internet Explorer.
Importing the XML file into TestLink
Step 1. Log in to TestLink > select your project in the drop-down list.
Step 2. Click on Specification > Create New Suite > Select Suite > Click on Import Test Cases.
Step 3. Browse for the XML file, submit it, and you are done with the import.
Revision History (# / Description / Date / Author):
- 0.x | Documents for TL 1.5 and update for TL 1.6 | 2005 | M. Havlat, A. Morsing, F. Mancardi
- Converted to OO2 format; minor update; FIX 372, 352
- Updated as draft for TL 1.7; removed TL 1.6 terms; added initial information about Custom Fields
- 1.4 | Added content and updated Francisco's jumpstart_manual and tl_file_format; general style clean-up and update | 2007/09/06 | M. Havlat
- General update and restructuring; added Test Suite chapter; requirements report
- Overall language review
- Minor update; added section Import Test Cases from Excel via XML (prepared by Prem)
- Update to TL 1.8 draft; update some new features
- 1.8, 1.9 | M. Havlat | Update for 1.8 RC3; update for 1.8 RC5; update for 1.8.0; Import/Export; update according to issues; update according to issues (TL 1.8.2); updated chapter Custom fields: Estimated and Actual execution time by qvii; added page numbers