Important Instructions to Examiners: (2 marks for each definition)
MAHARASHTRA STATE BOARD OF TECHNICAL EDUCATION
(Autonomous)
(ISO/IEC - 27001 - 2005 Certified)
SUMMER – 13 EXAMINATION
Model Answer
4. Testing and debugging are different activities, but debugging must be accommodated in any testing
strategy.
5. A strategy for software testing must accommodate low-level tests that are necessary to verify that a
small source code segment has been correctly implemented as well as high-level tests that validate
major system functions against customer requirements.
Defined responsibilities. Every task that is scheduled should be assigned to a specific team member.
Defined outcomes. Every task that is scheduled should have a defined outcome.
Defined milestones. Every task or group of tasks should be associated with a project milestone.
Program evaluation and review technique (PERT) and critical path method (CPM) are two project scheduling
methods that can be applied to software development.
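As an illustration of the critical path idea behind CPM, a minimal sketch follows; the task names, durations, and dependencies are hypothetical, invented only for the example.

```python
# Minimal CPM (critical path method) sketch over a hypothetical task graph.
# Durations are in days; "preds" lists the predecessor tasks in the project network.
tasks = {
    "design":  {"duration": 5, "preds": []},
    "code":    {"duration": 8, "preds": ["design"]},
    "test":    {"duration": 4, "preds": ["code"]},
    "docs":    {"duration": 3, "preds": ["design"]},
    "release": {"duration": 1, "preds": ["test", "docs"]},
}

earliest_finish = {}   # earliest finish time of each task
critical_pred = {}     # the predecessor that determines that finish time

def finish(name):
    """Earliest finish = own duration + latest earliest-finish among predecessors."""
    if name in earliest_finish:
        return earliest_finish[name]
    start = 0
    for p in tasks[name]["preds"]:
        if finish(p) > start:
            start = finish(p)
            critical_pred[name] = p
    earliest_finish[name] = start + tasks[name]["duration"]
    return earliest_finish[name]

project_end = max(finish(t) for t in tasks)

# Walk back along the chain of critical predecessors to recover the critical path.
last = max(tasks, key=lambda t: earliest_finish[t])
path = [last]
while last in critical_pred:
    last = critical_pred[last]
    path.append(last)

print("Project duration:", project_end, "days")          # 18 days
print("Critical path:", " -> ".join(reversed(path)))     # design -> code -> test -> release
```

The same forward pass underlies PERT, which additionally uses optimistic, most likely, and pessimistic duration estimates for each task instead of a single fixed duration.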
PM = person month
a) Explain the rules that should be followed for creating the analysis model.
(6 marks for all rules)
Answer:
The following are the rules for creating the analysis model:
1. The information domain of a problem must be represented and understood.
2. The functions that the software is to perform must be defined.
3. The behavior of the software (as a consequence of external events) must be represented.
4. The models that depict information, function, and behavior must be partitioned in a manner that
uncovers detail in a layered (or hierarchical) fashion.
5. The analysis process should move from essential information toward implementation detail.
By applying these principles, the analyst approaches a problem systematically. The information domain is
examined so that function may be understood more completely.
3. A design should contain distinct representations of data, architecture, interfaces, and components
(modules).
4. A design should lead to data structures that are appropriate for the objects to be implemented and are
drawn from recognizable data patterns.
5. A design should lead to components that exhibit independent functional characteristics.
6. A design should lead to interfaces that reduce the complexity of connections between modules and with
the external environment.
7. A design should be derived using a repeatable method that is driven by information obtained during
software requirements analysis.
b) Explain briefly:
(2 marks for each method)
Answer:
i) Unit Testing
(1 mark for definition, 1 mark for explanation, fig is optional)
Unit testing focuses verification effort on the smallest unit of software design—the software
component or module. Using the component-level design description as a guide, important control
paths are tested to uncover errors within the boundary of the module. The relative complexity of tests
and uncovered errors is limited by the constrained scope established for unit testing. The unit test is
white-box oriented, and the step can be conducted in parallel for multiple components.
The module interface is tested to ensure that information properly flows into and out of the program
unit under test. The local data structure is examined to ensure that data stored temporarily maintains
its integrity during all steps in an algorithm's execution. Boundary conditions are tested to ensure that
the module operates properly at boundaries established to limit or restrict processing. All independent
paths (basis paths) through the control structure are exercised to ensure that all statements in a
module have been executed at least once. And finally, all error handling paths are tested.
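A minimal sketch of a unit test in this spirit is shown below; the module under test (a hypothetical compute_grade function) and the choice of Python's unittest framework are assumptions made for the example. It exercises a normal control path through the module interface, boundary conditions, and an error-handling path, as described above.

```python
import unittest

def compute_grade(marks):
    """Hypothetical module under test: maps marks (0-100) to a grade."""
    if not 0 <= marks <= 100:
        raise ValueError("marks must be between 0 and 100")
    if marks >= 60:
        return "first class"
    if marks >= 40:
        return "pass"
    return "fail"

class ComputeGradeTest(unittest.TestCase):
    def test_normal_path(self):
        # information flows correctly into and out of the unit under test
        self.assertEqual(compute_grade(75), "first class")

    def test_boundary_conditions(self):
        # boundaries established to limit or restrict processing
        self.assertEqual(compute_grade(0), "fail")
        self.assertEqual(compute_grade(40), "pass")
        self.assertEqual(compute_grade(100), "first class")

    def test_error_handling_path(self):
        # the error-handling path is exercised explicitly
        with self.assertRaises(ValueError):
            compute_grade(101)

if __name__ == "__main__":
    unittest.main()
```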
SQA Activities –
These activities are performed (or facilitated) by an independent SQA group that:
1. Prepares an SQA plan for a project.
2. Participates in the development of the project’s software process description.
3. Reviews software engineering activities to verify compliance with the defined software process.
4. Audits designated software work products to verify compliance with those defined as part of the
software process.
5. Ensures that deviations in software work and work products are documented and handled according
to a documented procedure.
6. Records any noncompliance and reports to senior management.
The bedrock that supports software engineering is a quality focus. The foundation for software engineering is
the process layer. Software engineering process is the glue that holds the technology layers together and
enables rational and timely development of computer software. Process defines a framework for a set of key
process areas (KPAs) that must be established for effective delivery of software engineering technology. The
key process areas form the basis for management control of software projects and establish the context in
which technical methods are applied, work products (models, documents, data, reports, forms, etc.) are
produced, milestones are established, quality is ensured, and change is properly managed.
Software engineering methods provide the technical how-to's for building software. Methods encompass a
broad array of tasks that include requirements analysis, design, program construction, testing, and support.
Software engineering methods rely on a set of basic principles that govern each area of the technology and
include modeling activities and other descriptive techniques.
Software engineering tools provide automated or semi-automated support for the process and the methods.
When tools are integrated so that information created by one tool can be used by another, a system for the
support of software development, called computer-aided software engineering, is established. CASE
combines software, hardware, and a software engineering database (a repository containing important
information about analysis, design, program construction, and testing) to create a software engineering
environment analogous to CAD/CAE (computer-aided design/engineering) for hardware.
Object: Customer
Attributes:
name
company name
status of contact
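A minimal code sketch of this object with the attributes listed above is given below; the field types and the example values are assumptions made only for illustration.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    """Object 'Customer' with the attributes listed above."""
    name: str
    company_name: str
    status_of_contact: str   # e.g. "contacted", "follow-up pending" (assumed values)

# Example instance (hypothetical data)
c = Customer(name="A. Kulkarni", company_name="ABC Pvt. Ltd.", status_of_contact="contacted")
print(c)
```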
1) A common process framework is established by defining a small number of framework activities that
are applicable to all software projects, regardless of their size or complexity.
2) A number of task sets—each a collection of software engineering work tasks, project milestones,
work products, and quality assurance points—enable the framework activities to be adapted to the
characteristics of the software project and the requirements of the project team.
3) Umbrella activities—such as software quality assurance, software configuration management, and
measurement—overlay the process model.
4) Umbrella activities are independent of any one framework activity and occur throughout the process.
5) Software Engineering Umbrella Activities (Software project tracking and control, Formal technical
reviews, Software quality assurance, Software configuration management, Document preparation and
production, Reusability management, Measurement, Risk management).
Ans:
Software design sits at the technical kernel of software engineering and is applied regardless of the
software process model that is used. Beginning once software requirements have been analyzed and
specified, software design is the first of three technical activities—design, code generation, and test—that
are required to build and verify the software. Each activity transforms information in a manner that
ultimately results in validated computer software.
Each of the elements of the analysis model provides information that is necessary to create the four
design models required for a complete specification of design. The flow of information during software
design is illustrated in the figure above.
Software requirements, manifested by the data, functional, and behavioral models, feed the design
task. Using one of a number of design methods (discussed in later chapters), the design task produces a
data design, an architectural design, an interface design, and a component design.
The data design transforms the information domain model created during analysis into the data
structures that will be required to implement the software. The data objects and relationships defined in
the entity relationship diagram and the detailed data content depicted in the data dictionary provide the
basis for the data design activity.
Part of data design may occur in conjunction with the design of software architecture. More detailed
data design occurs as each software component is designed. The architectural design defines the
relationship between major structural elements of the software, the "design patterns" that can be used to
achieve the requirements that have been defined for the system, and the constraints that affect the way in
which architectural design patterns can be applied.
The architectural design representation—the framework of a computer-based system—can be derived
from the system specification, the analysis model, and the interaction of subsystems defined within the
analysis model. The interface design describes how the software communicates within itself, with
systems that interoperate with it, and with humans who use it. An interface implies a flow of information
(e.g., data and/or control) and a specific type of behavior. Therefore, data and control flow diagrams
provide much of the information required for interface design.
The component-level design transforms structural elements of the software architecture into a
procedural description of software components. Information obtained from the PSPEC, CSPEC, and STD
serves as the basis for component design.
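As a hedged illustration of the data design activity described above, the sketch below turns two hypothetical data objects from an entity relationship diagram (Customer and Order, related one-to-many) into the data structures that would implement them; the entities, attributes, and method are invented for the example.

```python
from dataclasses import dataclass, field
from typing import List

# Data design sketch: entities and relationships from a hypothetical ER diagram
# are transformed into the data structures used to implement the software.

@dataclass
class Order:
    order_id: int
    amount: float

@dataclass
class Customer:
    customer_id: int
    name: str
    orders: List[Order] = field(default_factory=list)  # one Customer places many Orders

    def total_business(self) -> float:
        """Component-level detail: a small procedure layered on the data design."""
        return sum(o.amount for o in self.orders)

c = Customer(customer_id=1, name="Example Customer")
c.orders.append(Order(order_id=101, amount=2500.0))
c.orders.append(Order(order_id=102, amount=1200.0))
print(c.total_business())   # 3700.0
```

Here the one-to-many relationship from the ER diagram becomes the orders list, and component-level detail such as total_business() is built on top of that data design.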
Answer:
The clean room approach makes use of a specialized version of the incremental software model. A
pipeline of software increments is developed by small independent software engineering teams. As each
increment is certified, it is integrated in the whole. Hence, functionality of the system grows with time.
Overall system or product requirements are developed using the system engineering methods. Once
functionality has been assigned to the software element of the system, the pipeline of clean room
increments is initiated. The following tasks occur:
Increment Planning: A project plan that adopts the incremental strategy is developed. The functionality
of each increment, its projected size, and a clean room development schedule are created. Special care
must be taken to ensure that certified increments will be integrated in a timely manner.
Box Structure Specification: A specification method that makes use of box structures is used to
describe the functional specification. Conforming to the operational analysis principles, box structures
isolate and separate the creative definition of behavior, data, and procedures at each level of refinement.
Formal Design: Using the box structure approach, clean room design is a natural and seamless
extension of specification. Although it is possible to make a clear distinction between the two activities,
specifications (called black boxes) are iteratively refined (within an increment) to become analogous to
architectural and component-level designs (called state boxes and clear boxes, respectively).
Correctness Verification: The clean room team conducts a series of rigorous correctness verification
activities on the design and then the code. Verification begins with the highest-level box structure
(specification) and moves toward design detail and code. The first level of correctness verification occurs
by applying a set of correctness questions. If these do not demonstrate that the specification is correct,
more formal (mathematical) methods for verification are used.
Code Generation, Inspection, And Verification: The box structure specifications, represented in a
specialized language, are translated into the appropriate programming language. Standard walkthrough or
inspection techniques are then used to ensure semantic conformance of the code and box structures and
syntactic correctness of the code. Then correctness verification is conducted for the source code.
Statistical Test Planning: The projected usage of the software is analyzed and a suite of test cases that
exercise a probability distribution of usage is planned and designed. This clean room activity is conducted
in parallel with specification, verification, and code generation.
Statistical Use Testing: Recalling that exhaustive testing of computer software is impossible, it is always
necessary to design a finite number of test cases. Statistical use techniques execute a series of tests
derived from a statistical sample (the probability distribution noted earlier) of all possible program
executions by all users from a targeted population; a small illustrative sketch is given after the
Certification step below.
Certification: Once verification, inspection, and usage testing have been completed (and all errors are
corrected), the increment is certified as ready for integration.
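The sketch referred to under Statistical Use Testing is given below: test cases are drawn from an assumed operational usage profile rather than enumerated exhaustively. The operations and their probabilities are hypothetical, invented only for the example.

```python
import random

# Hypothetical usage profile: each operation of the increment and the
# probability with which end users are expected to invoke it.
usage_profile = {
    "add_record":    0.60,
    "search_record": 0.30,
    "delete_record": 0.10,
}

def draw_test_cases(profile, n, seed=13):
    """Draw n test cases from the usage probability distribution."""
    rng = random.Random(seed)
    operations = list(profile.keys())
    weights = list(profile.values())
    return rng.choices(operations, weights=weights, k=n)

# A statistically sampled test suite of 20 cases, distributed roughly
# in proportion to expected field usage.
for case in draw_test_cases(usage_profile, 20):
    print("execute test for:", case)
```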
Answer:
RAD Model
Rapid application development (RAD) is an incremental software development process model that
emphasizes an extremely short development cycle. The RAD model is a "high-speed" adaptation of the
linear sequential model in which rapid development is achieved by using component-based construction.
If requirements are well understood and project scope is constrained, the RAD process enables a
development team to create a "fully functional system" within very short time periods (e.g., 60 to 90
days). Used primarily for information systems applications, the RAD approach encompasses the
following phases:
Business modeling: - The information flow among business functions is modeled in a way that answers the
following questions: What information drives the business process? What information is generated? Who
generates it? Where does the information go? Who processes it?
Data modeling: - The information flow defined as part of the business modeling phase is refined into a set of
data objects that are needed to support the business. The characteristics (called attributes) of each object are
identified and the relationships between these objects are defined.
Process modeling: - The data objects defined in the data modeling phase are transformed to achieve the
information flow necessary to implement a business function. Processing descriptions are created for adding,
modifying, deleting, or retrieving a data object.
Application generation: - RAD assumes the use of fourth generation techniques (Section 2.10). Rather than
creating software using conventional third generation programming languages, the RAD process works to
reuse existing program components (when possible) or create reusable components (when necessary). In all
cases, automated tools are used to facilitate construction of the software.
Testing and turnover: - Since the RAD process emphasizes reuse, many of the program components have
already been tested. This reduces overall testing time. However, new components must be tested and all
interfaces must be fully exercised.
mechanisms. The data structure and processing detail required to build the interface are contained within a
library of reusable components for interface construction.
Answer
The risk components are defined in the following manner:
1. Performance risk—the degree of uncertainty that the product will meet its requirements and be fit for
its intended use.
2. Cost risk—the degree of uncertainty that the project budget will be maintained.
3. Support risk—the degree of uncertainty that the resultant software will be easy to correct, adapt, and
enhance.
4. Schedule risk—the degree of uncertainty that the project schedule will be maintained and that the
product will be delivered on time.
Answer:
Alpha Testing: -The alpha test is conducted at the developer's site by a customer. The software is used in
a natural setting with the developer "looking over the shoulder" of the user and recording errors and
usage problems. Alpha tests are conducted in a controlled environment.
Beta Testing: - The beta test is conducted at one or more customer sites by the end-user of the software.
Unlike alpha testing, the developer is generally not present. Therefore, the beta test is a "live" application
of the software in an environment that cannot be controlled by the developer. The customer records all
problems (real or imagined) that are encountered during beta testing and reports these to the developer at
regular intervals. As a result of problems reported during beta tests, software engineers make
modifications and then prepare for release of the software product to the entire customer base.
Answer
There are three different debugging strategies, as follows:
(1) Brute force,
(2) Backtracking, and
(3) Cause elimination.
The brute force category of debugging is probably the most common and least efficient method for
isolating the cause of a software error. Brute force debugging methods are applied when all else fails. Using a
"let the computer find the error" philosophy, memory dumps are taken, run-time traces are invoked, and the
program is loaded with WRITE statements. In the morass of information that is produced a clue is found that
can lead us to the cause of an error. Although the mass of information produced may ultimately lead to
success, it more frequently leads to wasted effort and time. Thought must be expended first.
Backtracking is a fairly common debugging strategy that can be used successfully in small
programs. Beginning at the site where a symptom has been uncovered, the source code is traced backward
(manually) until the site of the cause is found. Unfortunately, as the number of source lines increases, the
number of potential backward paths may become unmanageably large.
The third debugging strategy—cause elimination—is manifested by induction or deduction and
introduces the concept of binary partitioning. Data related to the error occurrence are organized to isolate
potential causes. A "cause hypothesis" is devised and the aforementioned data are used to prove or disprove
the hypothesis. Alternatively, a list of all possible causes is developed and tests are conducted to eliminate
each. If initial tests indicate that a particular cause hypothesis shows promise, data are refined in an attempt
to isolate the bug.
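A minimal sketch of the binary partitioning idea is given below; the failing-input check and the candidate list are invented stand-ins for running the real program and observing the error symptom.

```python
def failing(inputs):
    """Hypothetical stand-in for running the program on a batch of inputs
    and observing whether the error symptom appears."""
    return any(x == 37 for x in inputs)   # pretend input 37 triggers the bug

def isolate_cause(inputs):
    """Binary partitioning: repeatedly discard the half in which the failure does not appear."""
    while len(inputs) > 1:
        mid = len(inputs) // 2
        first, second = inputs[:mid], inputs[mid:]
        inputs = first if failing(first) else second
    return inputs[0]

candidates = list(range(100))          # candidate inputs (or commits, or data records)
print("isolated cause:", isolate_cause(candidates))   # -> 37
```

Each iteration halves the suspect region, so a single offending input among n candidates is isolated in roughly log2(n) runs.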