Integrated Assets Management System
CERTIFICATE
This is to certify that the project work has been carried out by Anamitrra Vijay (Reg. No. 426707219), a 6th Semester MCA student of Jorhat Engineering College, Jorhat, for
the partial fulfillment of the requirement for the award of the degree of Master of Computer
Application under Assam Science and Technology University(ASTU) from Jorhat Engineering
College, Jorhat.
I wish him success in all future endeavors.
CERTIFICATE
College, Jorhat, for the partial fulfillment of the requirement for the award of the degree of
Master of Computer Application under Assam Science and Technology University from
Jorhat Engineering College, Jorhat.
I wish him success in all future endeavors.
(External Examiner)
Date:
JORHAT ENGINEERING COLLEGE, JORHAT
DEPARTMENT OF COMPUTER APPLICATION
ASSAM SCIENCE AND TECHNOLOGY UNIVERSITY (ASTU)
CERTIFICATE
College, Jorhat, for the partial fulfillment of the requirement for the award of the degree of
Master of Computer Application under Assam Science and Technology University(ASTU)
from Jorhat Engineering College, Jorhat under my guidance and the project has been
successfully completed.
Chandrani Borah
Assistant Professor
Internal Co-Guide
Department of Computer Application,
Jorhat Engineering College, Jorhat, Assam
DECLARATION BY THE CANDIDATE
I, Anamitrra Vijay, a 6th Semester MCA student of Jorhat Engineering College, Jorhat,
hereby declare that the project work entitled “INTEGRATED ASSETS MANAGEMENT
SYSTEM (IAMS)”, a management system for the National Informatics Centre (NIC), Golaghat,
Assam, is an authentic work carried out by me at the National Informatics Centre (NIC), Golaghat,
Assam, under the guidance of Mr. Abhijeet Kakoty, Scientist-C (NIC), for the partial
fulfillment and award of the degree of Master of Computer Application(MCA). This project has
not been submitted anywhere else for the award of any other degree/Diploma. Where other
sources of information have been used, they have been acknowledged.
Anamitrra Vijay
MCA 6th Semester
Jorhat Engineering College
Jorhat, Assam
ACKNOWLEDGEMENT
The willingness to help, and the patience to exercise diligence and provide support, is a quality
possessed by very few. No job in this world, however trivial or tough, can be accomplished without the
assistance of others. I hereby take the opportunity to express my indebtedness to the people who have
helped me accomplish this task. These lines of acknowledgement are not a formality but an honest word
of appreciation for the support I genuinely felt during my project.
First and foremost, I would like to thank Mr. Abhijeet Kakoty, Scientist-C, National
Informatics Centre (NIC), Golaghat, for giving me the opportunity and allowing me to do the project at NIC,
Golaghat.
Then I would like to thank Mr. Rituraj Borgohain, District Information Technology
Manager, NIC, Golaghat, Assam, for providing unwavering support and the opportunity to work in this
organization.
I convey my sincere thanks to Dr. Rupam Baruah, Principal and Head of the Department of
Computer Application, Jorhat Engineering College, for his help in arranging my project.
I express my deep sense of gratitude to my internal project guide, Dr. Dhrubajyoti Baruah
(Associate Professor, Department of Computer Application, JEC), for his deep interest in the
development of the project and his constant reminders for updates.
Last but not the least, I would like to thank my co-guide, Miss Chandrani Borah (Assistant
Professor, MCA), for providing continuous support and help.
I also take this opportunity to express my indebtedness to my respected parents and all my respected
teachers of JEC, Jorhat for their kind consent, expert guidance, valuable suggestions and affectionate
encouragement.
Anamitrra Vijay
MCA 6th Semester
Jorhat Engineering College
Jorhat, Assam
PREFACE
Excellence is an attitude that the whole of the human race is born with. It is the environment that
determines whether the result of this attitude becomes visible or not. Well planned, properly executed and
evaluated industrial training helps a lot in inculcating a good work culture. It provides a linkage between the
student and the industry in order to develop an awareness of the industrial approach to problem solving, based
on a broad understanding of the processes and mode of operation of an organization.
During this period, students get their first real hands-on experience of working in an actual
environment. Most of the theoretical knowledge that they have gained during the course of their studies is put
to the test here. Apart from this, students get the opportunity to learn the latest technologies, which immensely
helps them in their careers. This also benefits the organization, as many students doing their projects perform
very well.
I had the opportunity to gain real practical experience, which has increased my sphere of knowledge
to a great extent. Now I am better equipped to handle real work than someone who has not undergone
any such training. During the training period, I learned how an actual project progresses, what sort of problems
actually occur during the development of such large projects, how to produce quality products, and so on. I also
learned how to share one’s knowledge to bring out a cumulative effect, and how to solve problems in a modular
way. I learned the strategy of divide and conquer with an application-level approach, and the methodical cycle
of software development. Being in such a reputed organization, I had the best possible exposure.
Anamitrra Vijay
MCA 6th Semester
Jorhat Engineering College
CONTENTS
Chapter 1: About the organization
1.1 Introduction
1.2 Infrastructure
Chapter 2: Project Overview
2.1 Project Title
Chapter 3: Feasibility Study
3.1 Introduction
Chapter 4: Technology Used
4.1 Software Specifications
4.1.1 LARAVEL
4.1.3 REACT JS
Chapter 5: System Analysis
5.1 Introduction
5.2.1 Modelling
5.2.2 Functional Modelling
Chapter 6: System Design
6.1 System Introduction
Chapter 7: Testing
7.1 Introduction
7.2 Testing
Chapter 8: Implementation and Maintenance
8.1 Implementation
8.2 Maintenance
Chapter 9: Screenshots
NIC has been closely associated with the Government in different aspects of
governance. Besides establishing a nationwide state-of-the-art ICT infrastructure, it has also
built a large number of digital solutions to support the government at various levels, making the
last-mile delivery of government services to the citizens a reality.
* Capacity Building
The ICT infrastructure of NIC, viz. NICNET, NKN, LANs, Mini Data Centres, video
conference studios, messaging services and webcast facilities, forms the key constituent of NIC
services across all 37 States/UTs and 720+ districts.
1.2 Infrastructure
Network: The core of the NICNET backbone is fully upgraded to multiple 10 Gbps capacity with sufficient
redundancy. States are connected through multiple 1/10 Gbps links and districts through 34/100 Mbps links,
with redundancy built into the State and District links. Last-mile redundancy for NICNET has been extended
to more districts, with the primary link from BSNL and secondary links from RailTel/PGCIL. Most of the
Bhawan links at Delhi which were on 34 Mbps have been upgraded to 100 Mbps, and those on 100 Mbps
have been upgraded to 1 Gbps.
Direct peering of NICNET with BSNL, PGCIL and RailTel has been completed at Delhi and Hyderabad,
saving Internet bandwidth and providing faster access to each other's network and data centres. Peering with
the Google, Microsoft and Akamai content delivery networks has facilitated faster access to Google services
and other important international web sites. Restructuring of the videoconferencing network has made it
possible to minimize delay and handle large-scale, important video conferences such as PRAGATI of the
Hon'ble PM and GST Council Meetings of the Hon'ble FM. High-speed Internet services are provided to the
national data centres to ensure that the applications hosted there are accessible to users across the globe with
minimum latency. Capacity planning and upgradation of the Internet gateway at regular intervals has been
undertaken to provide smooth Internet access to all NICNET users throughout the country. To maintain
accurate timing and synchronization of all network elements and servers on the network, Stratum-1 clocks
are installed at Delhi and Hyderabad.
*NKN: NKN empowers Digital India, as it is the primary backbone for all e-Governance initiatives in the
country. It is the only network globally that carries R&E, Internet and e-Governance traffic as independent
verticals under one umbrella. NKN has multiple 10G links that together provide a core bandwidth of close to
1000G, offering secure and highly resilient connectivity across major institutions for research, education
and e-Governance.
NKN has strong backbone connectivity with 31 Points of Presence (POPs) in various State Capitals and 92
core links connected in a meshed topology. Moreover, over 700 Gigabits of data (reaching a peak of 5
Petabytes) currently flows within the NKN backbone every day. Over 40 links (premium institutes, SDCs
(State Data Centres) and the SWANs of many states) have been upgraded to 10 Gbps. NKN has also
established high-capacity SCPC VSAT connectivity at Kavaratti, Lakshadweep and Port Blair, Andaman &
Nicobar Islands.
National Cloud Infrastructure: NIC launched National Cloud Services in the year 2014 under MeghRaj, the
Government of India Cloud Initiative. NIC Cloud Services are provided from multiple locations of the
National Data Centres at Bhubaneswar, Delhi, Hyderabad, and Pune. Various new services are now offered
on the Cloud, including an Application Programme Monitoring (APM) Service, Data Analytics (DA) Service,
Resource Monitoring (RM) Service and Container Service. In order to cater to the projects envisioned
under the Digital India Programme and the growing requirements of existing projects, over 18,000 virtual
servers have been provisioned and allocated to over 1100 Ministries/Departments for e-Governance projects.
Network Security: The Network Security Division is in relentless pursuit of achieving CIA
(Confidentiality, Integrity, and Availability) of ICT assets in NICNET through deployment of expert
manpower, appropriate tools, and state-of-the-art technologies.
The Network Security Division (NSD) of NIC is engaged in assessment, planning, deployment and
management of security devices and solutions across the NICNET in general and the Data Centres in
particular. The security span of NSD comprises all National and State Data Centres, over 1000 LANs of
Govt. offices and MPLS networks, more than 2 lakh endpoints, and a series of networking devices deployed
across the country. A dedicated team actively monitors real-time attacks on a 24×7 basis.
Application Security: NIC formulates and updates the security policies for NICNET as and when
required. Security audits of web applications/websites, penetration testing, vulnerability analysis,
SSL compliance testing and version detection for the application hosting environment, with infrastructure
compliance checks, are also carried out as per user requirements. Critical web applications are secured
through a Web Application Firewall (WAF) to counter application-layer threats; this covers management and
administration of the deployed WAF solutions, configuration of critical sites including CMF (Drupal) based
portals, WAF service support at NIU Hyderabad for non-compliant web applications, and a 24x7 monitoring
service.
The centre provides incident handling and malware analysis, sanitization of security controls based
on analysis results, and the issuing of advisories to NICNET users.
Videoconference: Videoconferencing facilitates direct interaction with the concerned stakeholders and saves
time and money. Videoconferencing services are being used for the monitoring of various Government
projects, schemes, public grievances and law and order, hearings of RTI cases, tele-education, tele-medicine,
the launching of new schemes, etc. NIC's VC services are being extensively used by the Hon'ble Prime
Minister, Union Ministers, Governors, Chief Ministers of states, the Cabinet Secretary and Chief Secretaries,
the Chief Information Commissioner and various other senior officials across the country. NIC is also
providing web-based desktop videoconferencing services to users of various departments of the central and
state governments.
Webcast: NIC has been providing live/on-demand webcast services to Central and State Government for
important National, International and regional, educational events and conferences. Live webcast services
are provided for government TV channels such as Lok Sabha TV, Rajya Sabha TV, Doordarshan News, DD
Kisan, UGC CEC higher educational channel, DD Punjabi on 24×7 basis. Important events such as Union
Budget speech, the President's address to the nation, the Prime Minister's Mann Ki Baat and other speeches,
Independence and Republic Day celebrations at New Delhi, Air Force Day, Dance and cultural Festivals,
PIB press conferences, NIC Knowledge sharing, NKN events, proceedings of state assemblies, other
national and international events/conferences like Make in India, Skill India, Start-up India, Digital India and
International Yoga Day were covered.
Chapter: 2
2. PROJECT OVERVIEW
2.1 PROJECT TITLE
3. FEASIBILITY STUDY
3.1 INTRODUCTION
There are three aspects in the feasibility study portion of the preliminary investigation. They are:
1. Technical feasibility.
2. Economic feasibility.
3. Operational feasibility.
TECHNICAL FEASIBILITY:
Technical feasibility asks whether the work for the project can be done with the current equipment,
existing software technology and available personnel. If new technology is needed, what is the likelihood
that it can be developed? There are a number of technical issues which are generally raised during the
feasibility stage of the investigation. They are as follows:
2. Does the proposed equipment have the technical capability to hold the data required to
use the new system?
So, regarding the technical feasibility of this project: the project can be done with the current
equipment, existing software technology and available personnel. The proposed equipment has the
technical capability to hold the data required to use the new system, and there are technical guarantees of
accuracy, reliability, ease of access and data security.
Front-End selection:
• It must have a graphical user interface that assists employees who are not from an IT
background.
• Scalability and extensibility.
• Flexibility.
• Robustness.
• According to the requirements of the organization and the culture.
• Must provide excellent reporting features with good printing supports.
• Platform independent.
• Easy to debug and maintain.
• Event driven programming facility.
• Front end must support some popular back end like MS SQL server.
Back-End selection:
The technical feasibility is frequently the most difficult area encountered at this stage. It is
essential that the process of analysis and definition be conducted in parallel with an assessment of
technical feasibility. It centres on the existing computer system (hardware, software, etc.) and to what
extent it can support the proposed system.
ECONOMIC FEASIBILITY:
Economic feasibility means an evaluation of development cost weighed against the income or
benefit derived from the developed project. Economic feasibility determines whether there are
sufficient benefits in creating the system to make its cost acceptable, or whether the cost of not creating
the system is so great that it is advisable to undertake the project. The analyst raises various financial and
economic questions during the preliminary investigation to estimate the following:
The economic feasibility of this system is that there are sufficient benefits in creating the system to
make its cost acceptable. As the existing system is operated manually, there is always a tendency to
lose money and time.
OPERATIONAL FEASIBILITY:
Operational feasibility asks whether the system will be used if it is developed and implemented.
Proposed projects are beneficial only if they can be turned into information systems that will meet
the operational requirements of the organization. This test of feasibility asks if the system will work
when it is developed and installed, and whether there are major barriers to implementation. Some of the
important questions that are useful for assessing the operational feasibility of a project are given below:
➢ Is there sufficient support for the project from the management? If the present system
is well liked and used to the extent that persons will not see reasons for a change,
there may be resistance.
➢ Are current business methods acceptable to the users? If they are not, users may
welcome a change that will bring about a more operationally useful system.
➢ Have the users been involved in the planning and development of the project? If they are
involved at the earlier stages of project development, the chance of resistance can
possibly be reduced.
➢ Will the proposed system cause harm? Will it produce poorer results in any case?
Will the performance of staff members fall after implementation?
Issues that appear to be quite minor at the early stages can grow into major problems after
implementation. Therefore, it is always advisable to consider operational aspects carefully.
The operational feasibility of this project is that it is user-friendly software, so there is no
difficulty in training the users on the software. Users can benefit from the system by saving
time and money.
Chapter: 4
4. TECHNOLOGY USED
4.1 SOFTWARE SPECIFICATIONS
The project was built using the following software tools. These tools were chosen
taking into account the need for future enhancements, system longevity and
maintainability.
Languages/Scripts: PHP, Laravel, JSX (JavaScript XML)
Application server: Apache (XAMPP Server)
GUI Design: React JS, Cascading Style Sheets (CSS)
Browsers: Mozilla Firefox, Google Chrome or JavaScript supported browser
Database: MYSQL
PHP code can be simply mixed with HTML code, or it can be used in combination with various
templating engines and web frameworks. PHP code is usually processed by a PHP interpreter,
which is usually implemented as a web server's native module or a Common Gateway Interface
(CGI) executable. After the PHP code is interpreted and executed, the web server sends the resulting
output to its client, usually in the form of part of the generated web page; for example, PHP code can
generate a web page's HTML code, an image, or some other data. PHP has also evolved to include a
command-line interface (CLI) capability and can be used in standalone graphical applications.
PHP syntax:
The following “Hello User” program is written in PHP code embedded in an HTML document:
<!DOCTYPE html>
<html>
<head>
<title>PHP Test</title>
</head>
<body>
<?php echo '<p>Hello User</p>'; ?>
</body>
</html>
The PHP interpreter only executes PHP code within its delimiters. Anything outside its
delimiters is not processed by PHP (although non-PHP text is still subject to control structures
described in PHP code). The most common delimiters are <?php to open and ?> to close PHP
sections. There are also the shortened forms <? or <?= (which is used to echo back a string or
variable) and ?>. Short delimiters make script files less portable, since support for them can be
disabled in the local PHP configuration, and they are therefore discouraged. The purpose of all these
delimiters is to separate PHP code from non-PHP code, including HTML.
The first form of delimiters, <?php and ?>, in XHTML and other XML documents, creates
correctly formed XML "processing instructions". This means that the resulting mixture of PHP code
and other markup in the server-side file is itself well-formed XML. Variables are prefixed with a
dollar symbol, and a type does not need to be specified in advance. PHP 5 introduced type hinting
that allows functions to force their parameters to be objects of a specific class, arrays, interfaces or
callback functions. However, before PHP 7.0, type hints could not be used with scalar types such as
integer or string. Unlike function and class names, variable names are case sensitive. Both double-
quoted ("") and heredoc strings provide the ability to interpolate a variable's value into the string.
PHP treats newlines as whitespace in the manner of a free-form language, and statements are
terminated by a semicolon. PHP has three types of comment syntax: /* */ marks block and inline
comments; // as well as # are used for one-line comments. The echo statement is one of several
facilities PHP provides to output text, e.g., to a web browser.
In terms of keywords and language syntax, PHP is similar to most high level languages that
follow the C style syntax. if conditions, for and while loops, and function returns are similar in syntax
to languages such as C, C++, C#, Java and Perl.
LARAVEL:
Laravel is a PHP-based web framework for building high-end
web applications using its expressive and graceful syntax. It comes with a robust collection of
tools and provides an application architecture. Moreover, it incorporates various characteristics of
technologies like ASP.NET MVC, CodeIgniter, Ruby on Rails, and a lot more. Laravel is
an open-source framework. It saves developers a huge amount of time and reduces the
thinking and planning required to develop an entire website from scratch. Along with that, Laravel
also takes care of the security of the application. Hence all its features can boost the pace of web
development for you. Anyone familiar with the basics of PHP, along with some
intermediate PHP scripting, can use Laravel to work more efficiently.
XAMPP is a free and open source cross-platform web server solution stack package
developed by Apache Friends, consisting mainly of the Apache HTTP Server, MariaDB Database,
and interpreters for scripts written in the PHP and Perl programming languages. XAMPP stands for
Cross-Platform (X), Apache (A), MariaDB (M), PHP (P), and Perl (P). It is a simple, lightweight
Apache distribution that makes it extremely easy for developers to create a local web server for testing
and development purposes. Everything needed to set up a web server (the server application Apache, the
database MariaDB and the scripting language PHP) is included in an extractable file. XAMPP is also
cross-platform, which means it works equally well on Linux, Mac and Windows. Since most actual
web server deployments use the same components as XAMPP, it makes transitioning from a local
test server to a live server extremely easy as well.
XAMPP Features: XAMPP is regularly updated to the latest releases of Apache, MariaDB, PHP and
Perl. It also comes with a number of other modules including OpenSSL, phpMyAdmin, MediaWiki,
Joomla, WordPress and more. Being self-contained, multiple instances of XAMPP can exist on a single
computer, and any given instance can be copied from one computer to another. XAMPP is offered in
both a full and a standard version.
Usage of XAMPP: Officially, XAMPP’s designers intended it for use only as a development tool, to
allow website designers and programmers to test their work on their own computers without any
access to the internet. To make this as easy as possible, many important security features are disabled
by default. In practice, however, XAMPP is sometimes used to serve web pages on the World Wide
Web, and a special tool is provided to password-protect the most important parts of the package.
XAMPP also provides support for creating and manipulating databases in MariaDB and
SQLite, among others.
Once XAMPP is installed, it is possible to treat a localhost like a remote host by connecting
using an FTP client. Using a program like FileZilla has many advantages when installing a content
management system (CMS) like Joomla or WordPress. It is also possible to connect to localhost via
FTP with an HTML editor.
REACT JS:
React.js is an open-source JavaScript library that is used for building user interfaces
specifically for single-page applications. It’s used for handling the view layer for web and mobile
apps. React also allows us to create reusable UI components. React was first created by Jordan Walke,
a software engineer working at Facebook. React was first deployed on Facebook's news feed in 2011
and on Instagram.com in 2012.
React allows developers to create large web applications that can change data, without
reloading the page. The main purpose of React is to be fast, scalable, and simple. It works only on
user interfaces in the application. This corresponds to the view in the MVC template. It can be used
with a combination of other JavaScript libraries or frameworks, such as Angular JS in MVC.
JSX
In React, instead of using regular JavaScript for templating, we use JSX. JSX is a simple
JavaScript extension that allows HTML quoting and uses this HTML tag syntax to render subcomponents.
The HTML syntax is processed into JavaScript calls of the React framework. We can also write in
plain old JavaScript.
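To make the "JSX is processed into JavaScript calls" point concrete, here is a small runnable sketch. The createElement function below is a simplified stand-in written for this illustration, not the real React API, and the element shown is hypothetical:

```javascript
// Simplified stand-in for React.createElement (illustration only;
// the real React API returns a more elaborate element object).
function createElement(type, props, ...children) {
  return { type, props, children };
}

// In JSX one would write:  const el = <h1 className="greet">Hello</h1>;
// A JSX transform (e.g. Babel) conceptually turns that into:
const el = createElement('h1', { className: 'greet' }, 'Hello');

console.log(el.type);            // 'h1'
console.log(el.props.className); // 'greet'
console.log(el.children[0]);     // 'Hello'
```

This is why JSX is optional: the function-call form on the right is what actually runs, and it can be written by hand in plain JavaScript.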
NodeJS:
Node.js also provides a rich library of various JavaScript modules which simplifies the
development of web applications using Node.js to a great extent. It brings plenty of advantages to
the table, making it a better choice than other server-side platforms like Java or PHP.
CSS stands for Cascading Style Sheets. Styling can be added to HTML elements in 3 ways:
• Inline - using the style attribute in HTML elements.
• Internal - using a <style> element in the <head> section.
• External - using one or more external CSS files. The most common way to add styling is to
keep the styles in separate CSS files.
A CSS rule is written as element { property: value; }, where the element is an HTML element name,
the property is a CSS property and the value is a CSS value. Multiple styles are separated with
semicolons.
Inline Styling (Inline CSS):
Inline styling is useful for applying a unique style to a single HTML element. Inline styling uses the
style attribute. For example, this inline styling changes the text colour of a single heading:
<h1 style="color:blue">This is a heading</h1>
Internal Styling (Internal CSS):
An internal style sheet can be used to define a common style for all HTML elements on a page.
Internal styling is defined in the <head> section of an HTML page, using a <style> element.
Example:
<!DOCTYPE html>
<html>
<head>
<style>
body {background-color:lightgrey}
h1 {color:blue}
p {color:green}
</style>
</head>
<body>
<h1>This is a heading</h1>
<p>This is a paragraph.</p>
</body>
</html>
External Styling (External CSS): External style sheets are ideal when the style is applied to many
pages. With external style sheets, we can change the look of an entire website by changing one file.
External styles are defined in an external CSS file, which is then linked to in the <head>
section of an HTML page.
Example:
<!DOCTYPE html>
<html>
<head>
<link rel="stylesheet" href="styles.css">
</head>
<body>
<h1>This is a heading</h1>
<p>This is a paragraph.</p>
</body>
</html>
CSS Fonts:
• The CSS color property defines the text color to be used for the HTML element.
• The CSS font-family property defines the font to be used for the HTML element.
• The CSS font-size property defines the text size to be used for the HTML element.
Every HTML element has a box around it, even if we cannot see it. The CSS border property defines
a visible border around an HTML element.
The CSS padding property defines a padding (space) inside the border.
Example: p {
padding:10px;
}
The CSS margin property defines a margin (space) outside the border.
Example: p {
padding:10px;
margin:30px;
}
The id Attribute:
All the examples above use CSS to style HTML elements in a general way. To define a special style
for one special element, first add an id attribute to the element (e.g. <p id="p01">); a style for that
one element can then be defined with an id selector such as #p01 { color: blue; }.
The class Attribute:
To define a style for a special type (class) of elements, add a class attribute to the element
(e.g. <p class="note">). Now we can define a different style for all elements with the specified class,
using a class selector such as .note { color: green; }.
MYSQL DATABASE
✓ Products
✓ Customers
✓ Orders
Database queries: A query is a question or a request. We can query a database for specific
information and have a record set returned. Consider the following query:
SELECT LastName FROM employees
The query above selects all the data in the “LastName” column from the “employees” table.
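Without a live database at hand, the effect of such a query can be mimicked in plain JavaScript (an analogy only, with invented rows): selecting one column is like mapping each row of the table to a single field.

```javascript
// Hypothetical in-memory stand-in for the "employees" table.
const employees = [
  { firstName: 'Ravi',  lastName: 'Sharma' },
  { firstName: 'Anita', lastName: 'Das' },
];

// Conceptual equivalent of: SELECT LastName FROM employees
const lastNames = employees.map(row => row.lastName);

console.log(lastNames); // [ 'Sharma', 'Das' ]
```

In the real system MySQL performs this selection on the server; the analogy only shows what the returned record set contains.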
5. SYSTEM ANALYSIS
5.1 INTRODUCTION
System Analysis:
Software requirement specification (SRS) is the starting point of the software development
activity. The objective of analysis of the problem is to answer the question: Exactly what must the
system do? During system analysis the analyst attempts to develop a complete functional
understanding of the proposed system. The document identifies a number of processes or functions
that must be performed by the system.
There are mainly two parts of this phase:
1. Problem Analysis or Requirement Analysis
2. Requirement Specifications and Review
Modelling: We create models to gain a better understanding of the actual entity to be built. Here the
entity to be built is software, so the model must be capable of representing the information that the
software transforms, the functions and sub-functions that enable the transformation to occur, and the
behaviour of the system as the transformation takes place.
Functional Modelling: Software transforms information, and in order to accomplish this it must
perform at least three generic functions: input, processing, and output. The functional model begins
with a single context-level model (i.e. the name of the software to be built). Over a series of iterations,
more and more functional detail is provided, until a thorough delineation of all system functionality
is represented.
Data Modelling: Data modelling defines the primary data objects, the composition of each data object,
the attributes of the objects, and the relationships between the objects and between the objects and
the processes.
Tools of Structured Analysis: The structured tools include the data flow diagram, data dictionary,
structured English, decision trees and decision tables. The objective is to build a new document called
system specifications that provides the basis for design and implementation. The Data Flow
Diagrams (DFDs) of the project are shown next.
Data flow diagrams illustrate how data is processed by a system in terms of inputs and outputs.
Data Flow Diagramming is a means of representing a system at any level of detail with a graphic
network of symbols showing data flows, data stores, data processes, and data sources/ destinations.
The Data Flow Diagram is analogous to a road map. It is a network of all possibilities, with different
detail shown on different hierarchical levels. The process of representing different detail levels is
called levelling or partitioning by some data flow diagram advocates. Like a road map, there is no
start or stop point and no time, timing, or steps to get somewhere; we just know that the data path must
exist because at some point it will be needed, just as a road map shows all existing or planned roads
because each road will be needed at some point. Detail that is not shown on the different levels of the
data flow diagram, such as volumes, timing, frequency, etc., is shown on supplementary diagrams or
in the data dictionary.
A DFD shows the flow of data through a system. It views a system as a function that
transforms the inputs into desired outputs. Any complex system will not perform this transformation
in a "single step", and data will typically undergo a series of transformations before it becomes the
output. The DFD aims to capture the transformations that take place within a system to the input data
so that eventually the output data is produced. The agent that performs the transformation of data
from one state to another is called a process (or a bubble). So a DFD shows the movement of data
through the different transformations or processes in the system.
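The idea of a process as a transformation of data can be sketched in a few lines of JavaScript (a conceptual illustration with invented data, not project code):

```javascript
// Each "bubble" in a DFD transforms data from one state to another.
// Hypothetical two-step pipeline: validate an asset record, then format it.
const validate = rec => ({ ...rec, valid: rec.name.length > 0 });
const format   = rec => `${rec.id}: ${rec.name}`;

// Input data undergoes a series of transformations before becoming output.
const input  = { id: 101, name: 'Printer' };
const output = format(validate(input));

console.log(output); // '101: Printer'
```

Chaining the functions mirrors how a DFD chains processes: the output of one bubble is the data flow into the next.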
DFDs are basically of two types: physical and logical.
Physical DFD:
The physical Data Flow Diagram (DFD) reveals the actual devices and people that perform the
functions. It shows the physical components of a system. The emphasis of this type of DFD is on the
physical characteristics of a system, and it depicts the various people doing jobs in an organization. It
is used in the analysis phase to study the functioning of the current system.
Logical DFD:
A logical DFD shows the ongoing activities of the system. It does not show how these
tasks are done or who does them. It is used in the design phase for depicting the flow of data in the
proposed system.
PURPOSE/OBJECTIVE OF DFD:
1. Graphical, eliminating thousands of words;
2. Logical representations, modelling WHAT a system does rather than physical models showing
HOW it does it;
3. Hierarchical, showing systems at any level of detail; and
4. Allowing user understanding and reviewing.
Creating a DFD:
Step 1: Plan the Solution
1. Identify the inputs, outputs and external entities for the system.
2. Identify the top-level processes in the system.
3. Identify the detailed processes of the system.
Step 2: Implement the Solution
1. Draw the Context Analysis Diagram (CAD).
2. Draw the Top Level DFD.
3. Draw the detailed Logical DFD.
Step 3: Verify the Solution
Get approval of the design from your client.
An External Entity
An external entity can be either a source or a destination of data in the system being designed. It lies outside the context of the system. It is represented by a solid square. If an entity needs to be represented more than once, both instances of the entity are drawn as follows:
EXTERNAL ENTITY
A Process
A process indicates the work that is performed on data. It transforms data from one form to
another. A circle represents a process.
Process No.
Process Name
A Data Flow
A data flow takes place between the various components of the system. In a Data Flow Diagram, the data flow is represented as a thin line pointing in the direction in which the data is flowing.
Data Store:
A data store is a repository for data. While making a logical design, if data needs to be stored, a data store is used. A data store is represented by an open rectangle. It also has a number and a name.
6. SYSTEM DESIGN
6.1 SYSTEM INTRODUCTION
Good design is the key to effective engineering. However, it is not possible to formalize
the design process in any engineering discipline. Design is a creative process requiring insight and
flair on the part of the designer. It must be practiced and learnt by experience and study of existing
systems. Any design problem must be tackled in three stages:
• Study and understand the problem: Without this understanding, effective software design is impossible. The problem should be examined from a number of different angles or viewpoints, as these provide different insights into the design requirements.
• Identify gross features of at least one possible solution: It is often useful to identify a number
of solutions and to evaluate them all. The choice of solution depends on the designer’s
experience, the availability of reusable components, and the simplicity of the derived
solutions. Designers usually prefer familiar solutions even if these are not optimal, as they
understand their advantages and disadvantages.
• Describe each abstraction used in the solution: Before creating formal documentation, the
designer may write an informal design description. This may be analyzed by developing it in
detail. Errors and omissions in the high-level design will probably be discovered during this
analysis. These are corrected before the design is documented. There is no general agreement
on the notion of a ‘good’ design. Apart from the obvious criteria that a design should correctly
implement a specification, a good design might be a design that allows efficient code to be
produced; it might be a minimal design where the implementation is as compact as possible;
or it might be the most maintainable design.
Database design creates a model for data that is represented at a high level of abstraction.
The data design transforms the information domain model created during analysis into the data
structures that will be required to implement the software. The structure of the data has always been
an important part of the design. Like other software engineering activities, data design (sometimes
referred to as data architecting) creates a model of data and/or information that is represented at a
high level of abstraction (the customer/user’s view of data). This data model is then refined into
progressively more implementation-specific representations that can be processed by the computer-
based system. In many software applications, the architecture of the data will have a profound
influence on the architecture of the system that must process it.
The structure of data has always been an important part of design. At the program
component level, the design of data structures and the associated algorithms required to manipulate
them is essential to the creation of high-quality applications. At the application level, the translation
of a data model (derived as part of requirements engineering) into a database is pivotal to achieving
the business objectives of a system. At the business level, the collection of information stored in
disparate databases and reorganized into a “data warehouse” enables data mining or knowledge
discovery that can have an impact on the success of the business itself. In every case, data design
plays an important role. The general objective of the database is to make information access easy, quick, inexpensive, and flexible for the user. Tables are maintained on the remote server.
DATA DICTIONARY
1. categories
2. subcategories
3. items
4. assettypes
5. ordermasters
6. orderitems
7. stockmasters
8. branchmaster
9. branchmanager
10. rolemaster
11. designations
12. users
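The report names these tables but does not give their columns. Purely as an illustration of how two of them might relate, the following SQLite sketch assumes every column name shown; none of them comes from the actual system.

```python
import sqlite3

# Hypothetical sketch of two of the tables listed in the data dictionary.
# Table names (categories, items) come from the report; every column
# here is an assumption for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE categories (
    category_id   INTEGER PRIMARY KEY,
    category_name TEXT NOT NULL
);
CREATE TABLE items (
    item_id     INTEGER PRIMARY KEY,
    item_name   TEXT NOT NULL,
    category_id INTEGER REFERENCES categories(category_id)
);
""")
conn.execute("INSERT INTO categories VALUES (1, 'Furniture')")
conn.execute("INSERT INTO items VALUES (1, 'Office Chair', 1)")
# Join an item back to its category through the foreign key.
row = conn.execute(
    "SELECT i.item_name, c.category_name FROM items i "
    "JOIN categories c ON i.category_id = c.category_id"
).fetchone()
print(row)  # ('Office Chair', 'Furniture')
```

The same foreign-key pattern would repeat for the remaining tables, e.g. subcategories referencing categories.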
ENTITY-RELATIONSHIP DIAGRAM
An entity is a piece of data: an object or concept about which data is stored.
A relationship is how the data is shared between entities. There are three types of
relationships between entities:
i. One-to-one: one instance of an entity (A) is associated with one other instance of another entity
(B).
ii. One-to-many: one instance of an entity (A) is associated with zero, one or many instances of
another entity (B)
iii. Many-to-many: one instance of an entity (A) is associated with one, zero or many instances of
another entity (B), and one instance of entity (B) is associated with one, zero or many instances of
entity (A).
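A many-to-many relationship cannot be stored directly in one relational table; it is resolved with a junction table. Assuming (this is not stated in the report) that orderitems acts as the junction between ordermasters and items, the pattern can be sketched as:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
CREATE TABLE ordermasters (order_id INTEGER PRIMARY KEY);
CREATE TABLE items        (item_id  INTEGER PRIMARY KEY, name TEXT);
-- Junction table: one row per (order, item) pair resolves the
-- many-to-many relationship between orders and items.
CREATE TABLE orderitems (
    order_id INTEGER REFERENCES ordermasters(order_id),
    item_id  INTEGER REFERENCES items(item_id),
    quantity INTEGER NOT NULL,
    PRIMARY KEY (order_id, item_id)
);
""")
conn.execute("INSERT INTO ordermasters VALUES (1)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(1, "Printer"), (2, "Toner")])
# One order contains many items; each item may appear in many orders.
conn.executemany("INSERT INTO orderitems VALUES (?, ?, ?)",
                 [(1, 1, 2), (1, 2, 5)])
count = conn.execute(
    "SELECT COUNT(*) FROM orderitems WHERE order_id = 1").fetchone()[0]
print(count)  # 2
```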
SYMBOLS
Entity
Attribute
Relationship
Primary Key Attribute
Foreign Key Attribute
Chapter: 7
7. SYSTEM TESTING
7.1 INTRODUCTION
Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design, and coding. The purpose of product testing is to verify and validate the various work products, viz. units, integrated units, and the final product, to ensure that they meet their respective requirements. This has two parts:
• Planning: This involves writing and reviewing unit, integration, functional, validation and
acceptance test plans.
• Execution: This involves executing these test plans, measuring, collecting data, and verifying whether the product meets the quality criteria set in the quality plan. The data collected is used to make appropriate changes in the plans related to development and testing.
7.2 TESTING
Testing Objectives:
The quality of a product or item can be achieved by ensuring that the product meets the
requirements by planning and conducting the following tests at various stages:
➢ Unit Tests: at unit level, conducted by development team, to verify individual standalone
units.
➢ Integration Tests: after two or more product units are integrated, conducted by the development team to test the interfaces between the integrated units.
➢ Functional Test: prior to the release to validation manager, designed and conducted by the
team independent of designers and coders, to ensure the functionality provided against the
customer requirement specifications.
➢ Acceptance Tests: prior to the release to the validation manager, conducted by the development team using acceptance test plans, if any, supplied by the customer.
➢ Validation Tests: prior to release to the customer, conducted by the validation team to validate the product against the customer requirement specifications and the user documentation.
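To make the unit-test level above concrete, here is a minimal Python unittest sketch. The function under test, update_stock, is a hypothetical helper invented for illustration; neither its name nor its behaviour comes from the actual system.

```python
import unittest

def update_stock(current, ordered):
    """Hypothetical helper: add an approved order quantity to stock."""
    if ordered < 0:
        raise ValueError("ordered quantity cannot be negative")
    return current + ordered

class UpdateStockTest(unittest.TestCase):
    # Each test exercises one standalone behaviour of the unit.
    def test_normal_update(self):
        self.assertEqual(update_stock(10, 5), 15)

    def test_zero_order(self):
        self.assertEqual(update_stock(10, 0), 10)

    def test_negative_order_rejected(self):
        with self.assertRaises(ValueError):
            update_stock(10, -1)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Integration tests would then combine such units (e.g. stock update plus order approval) and exercise the interface between them.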
Black box testing is designed to uncover errors. It is used to demonstrate that software functions are operational; that input is properly accepted and output is correctly produced; and that the integrity of external information (e.g., data files) is maintained. A black box test examines some fundamental aspects of a system with little regard for the internal logical structure of the software.
White box testing of software is predicated on close examination of procedural detail. Test cases that exercise specific sets of conditions and loops are provided to test logical paths through the software. The "state of the program" may be examined at various points to determine whether the expected or asserted status corresponds to the actual status.
The module interface is tested to ensure that information properly flows into and out of the program unit under test. Local data structures are examined to ensure that data stored temporarily maintains its integrity during all steps in the execution of an algorithm. Boundary conditions are tested to ensure that the module operates properly at the boundaries established to limit or restrict processing. All independent paths through the control structure are exercised to ensure that all statements in the module are executed at least once. Finally, all error-handling paths are tested.
Test cases uncover errors like:
1. Comparison of different data types.
2. Incorrect logical operators or precedence.
3. Expectation of equality when precision errors make equality unlikely.
4. Incorrect comparison of variables.
5. Improper or non-existent loop termination.
6. Failure to exit when divergent iteration is encountered.
7. Improperly modified loop variables.
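Errors 5 and 7 above often show up as off-by-one bugs at loop boundaries, exactly what white-box boundary testing targets. The sketch below uses a hypothetical function (summing the first n stock quantities; not part of the actual system) to show such a bug next to its fix:

```python
# An off-by-one loop-termination error of the kind white-box
# boundary test cases are designed to catch.

def total_of_first_n_buggy(quantities, n):
    total = 0
    i = 0
    while i < n - 1:        # bug: loop stops one element early
        total += quantities[i]
        i += 1
    return total

def total_of_first_n(quantities, n):
    total = 0
    for i in range(n):      # correct: exactly n iterations
        total += quantities[i]
    return total

qs = [3, 1, 4, 1, 5]
print(total_of_first_n_buggy(qs, 3))  # 4  (misses the third element)
print(total_of_first_n(qs, 3))        # 8
```

A test case at the boundary n = len(quantities) would also expose the buggy version immediately, which is why boundary values are chosen deliberately rather than at random.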
Chapter: 8
8. SYSTEM IMPLEMENTATION
A crucial phase in the system development life cycle is the successful implementation of the new system design. Implementation simply means converting the new system design into operation. This is the moment of truth: the first question that strikes everyone's mind is whether the system will be able to give all the desired results expected of it. The implementation phase is concerned with user training and file conversion. When the candidate system is linked to remote terminals or remote sites, the telecommunication network and tests of the network along with the system are also included under implementation. During the final testing, user acceptance is tested, followed by user training. Depending on the nature of the system, extensive user training may be required. Conversion usually takes place at about the same time the user is trained, or later.
System testing checks the readiness and accuracy of the system to access, update, and retrieve data from the new files. Once the program becomes available, test data are read into the computer and processed against the files provided for testing. If successful, the program is then run with live data. Otherwise, a diagnostic procedure is used to locate and correct the errors in the program. In most conversions, a parallel run is conducted, where the new system runs simultaneously with the old system. This method, though costly, provides added assurance against errors in the candidate system and also gives the user staff an opportunity to gain experience through operation. In some cases, however, parallel processing is not practical.
The term implementation has different meanings, ranging from the conversion of a basic application to a complete replacement of a computer system. Implementation is used here to mean the process of converting a new or revised system design into an operational one. Conversion is one aspect of implementation; the other aspects are the post-implementation review and software maintenance.
Corrective: Corrective maintenance of a software product may be necessary either to rectify bugs observed while the system is in use, or to enhance the performance of the system.
Adaptive: A software product might need maintenance when the customer needs the product to run on new platforms or new operating systems, or when the product must interface with new hardware or software.
Perfective: A software product needs maintenance to support new features that the users want, or to change different functionalities of the system according to customer demands.
The activities involved in a software maintenance project are not unique and depend on several factors.
9. SCREENSHOTS
1. Nazarat Dashboard:
4. Subcategory View:
5. Item Entry Form:
6. Pending Order:
7. Billing Details Entry (Order Approve)
10. CONCLUSION
A proper record will be maintained so that if any controversy arises it can be resolved by referring to such records. The branch managers and other users will get a better platform to keep proper track of each and every item that is present or has been ordered.
There is also an auto-generated report/bill for every order placed by the branch managers, so that they can get the proper expenses or costs along with the item details.
Coding was done as per the standards, and general coding guidelines were decided so as to have a consistent coding standard across different modules. While coding, emphasis was given to efficiency, optimum use of system resources, object naming conventions, documentation, indentation, and function and module headers.
Finally, exhaustive testing and debugging were done so that the software is as error-free as possible. Thus the main objective, developing a system that is reliable, efficient, and meets all the requirements, was achieved. Care was also taken that the modules can be easily maintained and modified.
There is always scope for further development in any software product, and the same is true for this project. The end user will decide its scope in future applications. This project is flexible enough to incorporate any requirement changes.