Integrated Assets Management System


JORHAT ENGINEERING COLLEGE, JORHAT

DEPARTMENT OF COMPUTER APPLICATION


ASSAM SCIENCE AND TECHNOLOGY UNIVERSITY

CERTIFICATE

This is to certify that the project work entitled “Integrated Assets Management System (IAMS)”, a management system for the National Informatics Centre (NIC), Golaghat, is an approved work done by Mr. Anamitrra Vijay (Roll No. 190720014005, Reg. No. 426707219), a 6th Semester MCA student of Jorhat Engineering College, Jorhat, in partial fulfillment of the requirement for the award of the degree of Master of Computer Application under Assam Science and Technology University (ASTU) from Jorhat Engineering College, Jorhat.
I wish him success in all future endeavors.

Dr. Rupam Baruah


Principal and Head of the Department
Department of Computer Application
Jorhat Engineering College
Jorhat, Assam
JORHAT ENGINEERING COLLEGE, JORHAT
DEPARTMENT OF COMPUTER APPLICATION
ASSAM SCIENCE AND TECHNOLOGY UNIVERSITY (ASTU)

CERTIFICATE

This is to certify that the project work entitled “Integrated Assets Management System (IAMS)”, a management system for the National Informatics Centre (NIC), Golaghat, Assam, is an approved work done by Mr. Anamitrra Vijay (Roll No. 190720014005, Reg. No. 426707219), a 6th Semester MCA student of Jorhat Engineering College, Jorhat, in partial fulfillment of the requirement for the award of the degree of Master of Computer Application under Assam Science and Technology University from Jorhat Engineering College, Jorhat.
I wish him success in all future endeavors.

(External Examiner)
Date:
JORHAT ENGINEERING COLLEGE, JORHAT
DEPARTMENT OF COMPUTER APPLICATION
ASSAM SCIENCE AND TECHNOLOGY UNIVERSITY (ASTU)

CERTIFICATE

This is to certify that the project work entitled “Integrated Assets Management System (IAMS)”, a management system for the National Informatics Centre, Golaghat, Assam, is an approved work done by Mr. Anamitrra Vijay (Roll No. 190720014005, Reg. No. 426707219), a 6th Semester MCA student of Jorhat Engineering College, Jorhat, in partial fulfillment of the requirement for the award of the degree of Master of Computer Application under Assam Science and Technology University (ASTU) from Jorhat Engineering College, Jorhat, under my guidance, and the project has been successfully completed.
I wish him success in all future endeavors.

Dr. Dhrubajyoti Baruah


Associate Professor
Internal Guide
Department of Computer Application,
Jorhat Engineering College, Jorhat, Assam
JORHAT ENGINEERING COLLEGE, JORHAT
DEPARTMENT OF COMPUTER APPLICATION
ASSAM SCIENCE AND TECHNOLOGY UNIVERSITY (ASTU)

CERTIFICATE

This is to certify that the project work entitled “Integrated Assets Management System (IAMS)”, a management system for the National Informatics Centre (NIC), Golaghat, Assam, is an approved work done by Mr. Anamitrra Vijay (Roll No. 190720014005, Reg. No. 426707219), a 6th Semester MCA student of Jorhat Engineering College, Jorhat, in partial fulfillment of the requirement for the award of the degree of Master of Computer Application under Assam Science and Technology University (ASTU) from Jorhat Engineering College, Jorhat, under my guidance, and the project has been successfully completed.

I wish him success in all future endeavors.

Chandrani Borah
Assistant Professor
Internal Co-Guide
Department of Computer Application,
Jorhat Engineering College, Jorhat, Assam
DECLARATION BY THE CANDIDATE

I, Anamitrra Vijay, a 6th Semester MCA student of Jorhat Engineering College, Jorhat, hereby declare that the project work entitled “INTEGRATED ASSETS MANAGEMENT SYSTEM (IAMS)”, a management system for the National Informatics Centre (NIC), Golaghat, Assam, is an authentic work carried out by me at the National Informatics Centre (NIC), Golaghat, Assam, under the guidance of Mr. Abhijeet Kakoty, Scientist-C (NIC), in partial fulfillment of the requirement for the award of the degree of Master of Computer Application (MCA). This project has not been submitted anywhere else for the award of any other degree or diploma. Where other sources of information have been used, they have been acknowledged.

Anamitrra Vijay
MCA 6th Semester
Jorhat Engineering College
Jorhat, Assam
ACKNOWLEDGEMENT

The ability to help, and the patience to exercise diligence and provide support, is a quality possessed by very few. No job in this world, however trivial or tough, can be accomplished without the assistance of others. I hereby take the opportunity to express my indebtedness to the people who have helped me accomplish this task. These lines of acknowledgement are not a formality but an honest word of appreciation for what I have genuinely felt during my project.

First and foremost, I would like to thank Mr. Abhijeet Kakoti, Scientist-C, National Informatics Centre (NIC), Golaghat, for giving me the opportunity and allowing me to do the project at NIC, Golaghat.

I would also like to thank Mr. Rituraj Borgohain, District Information Technology Manager, NIC, Golaghat, Assam, for providing unwavering support and the opportunity to work in this organization.

I convey my sincere thanks to Dr. Rupam Baruah, Principal and HOD of the Department of Computer Application, Jorhat Engineering College, for his help in getting me this project.

I express my deep sense of gratitude to my internal project guide, Dr. Dhrubajyoti Baruah (Associate Professor, Dept. of Computer Application, JEC), for his deep interest in the development of the project and his constant reminders for updates.

Last but not least, I would like to thank my co-guide, Miss Chandrani Borah (Assistant Professor, MCA), for providing continuous support and help.

I also take this opportunity to express my indebtedness to my respected parents and all my respected
teachers of JEC, Jorhat for their kind consent, expert guidance, valuable suggestions and affectionate
encouragement.

Anamitrra Vijay
MCA 6th Semester
Jorhat Engineering College
Jorhat, Assam
PREFACE

Excellence is an attitude that the whole of the human race is born with. It is the environment that determines whether the result of this attitude becomes visible or not. Well planned, properly executed and evaluated industrial training helps a lot in inculcating a good work culture. It provides a linkage between the student and the industry in order to develop an awareness of the industrial approach to problem solving, based on a broad understanding of the processes and mode of operation of an organization.

During this period, the students get their first real hands-on experience of working in an actual environment. Most of the theoretical knowledge gained during the course of their studies is put to the test here. Apart from this, the students get the opportunity to learn the latest technology, which helps them immensely in their careers. This also benefits the organization, as many students doing their projects perform very well.

I had the opportunity to gain real practical experience, which has increased my sphere of knowledge to a great extent. I am now better equipped to handle real work than someone who has not undergone such training. During the training period, I learned how an actual project progresses, what sort of problems actually occur during the development of such large projects, how to produce quality products, and so on. I also learned how to share one's knowledge to bring out a cumulative effect, and how to solve problems in a modular way. I learned the strategy of divide and conquer with an application-level approach and the methodical cycle of software development. Being in such a reputed organization, I had the best possible exposure.

Anamitrra Vijay
MCA 6th Semester
Jorhat Engineering College
CONTENTS
Chapter 1: About the organization
1.1 Introduction

1.2 Infrastructure

Chapter 2: Project Overview

2.1 Project Title

2.2 Proposed System

2.2.1 Objectives of the System

2.2.2 Description of the System

Chapter 3: Feasibility Study

3.1 Introduction

3.1.1 Technical Feasibility

3.1.2 Economical Feasibility

3.1.3 Operational Feasibility

Chapter 4: Technology Used

4.1 Software Specifications

4.1.1 LARAVEL

4.1.2 JSX (JavaScript XML)

4.1.3 REACT JS

4.1.4 APACHE (XAMPP Server)

4.1.5 MySQL Database

4.2 Hardware Specifications

Chapter 5: System Analysis

5.1 Introduction

5.2 System Analysis Elements

5.2.1 Modelling
5.2.2 Functional Modelling

5.2.3 Data Modelling

5.2.4 Data Objects, Attributes, Relationships

5.3 Structured Analysis

5.3.1 Dataflow Diagrams

5.3.2 Purpose/Objective of DFD

5.3.3 DFD Elements

Chapter 6: System Design

6.1 System Introduction

6.2 Database Design

6.2.1 Data Dictionary

6.2.2 Entity-Relationship Diagram

6.2.3 E-R Diagram of the project

Chapter 7: System Testing

7.1 Introduction

7.2 Testing

7.2.1 Testing Objectives

7.2.2 Test Plan

7.3 Level of Testing

7.3.1 Black Box Testing

7.3.2 White Box Testing

Chapter 8: System Implementation and Maintenance

8.1 Implementation

8.2 Maintenance

Chapter 9: Screenshots

Chapter 10: Conclusion


Chapter: 1

1. ABOUT THE ORGANIZATION


1.1 INTRODUCTION
National Informatics Centre (NIC) was established in 1976 and has rich experience in providing ICT and eGovernance support to the Government for the last four decades, bridging the digital divide. It has emerged as a promoter of digital opportunities for sustainable development. NIC spearheaded "Informatics-Led Development" by implementing ICT applications in social and public administration, and facilitates electronic delivery of services to government (G2G), business (G2B), citizens (G2C) and government employees (G2E). NIC, through its ICT network, "NICNET", has institutional linkages with all the Ministries/Departments of the Central Government, 37 State Governments/Union Territories, and about 720+ District Administrations of India.

NIC has been closely associated with the Government in different aspects of governance. Besides establishing a nationwide state-of-the-art ICT infrastructure, it has also built a large number of digital solutions to support the government at various levels, making the last-mile delivery of government services to the citizens a reality.

The following major activities are being undertaken:

• Setting up of ICT Infrastructure

• Implementation of National and State Level e-Governance Projects/Products

• Consultancy to the Government departments

• Research & Development

• Capacity Building

National Informatics Centre (NIC) provides nationwide ICT infrastructure to support e-Governance services and various initiatives of Digital India. NIC has been associated with the design and development of software for improving the delivery of services undertaken by government departments at the State and District level.

The ICT infrastructure of NIC, viz. NICNET, NKN, LANs, Mini Data Centres, video conference studios, messaging services and webcast facilities, forms the key constituents of NIC services across all 37 States/UTs and 720+ Districts.

For the design, development and implementation of various eGovernance initiatives and the Digital India programme, NIC State Centres along with their respective District Centres are continually engaged in automating and accelerating eGovernance processes in close interaction with Government Departments. New ICT initiatives at the behest of District Magistrates are also being undertaken for design and development under their guidance, with the technical support of the District Informatics Officer (DIO) and the Additional District Informatics Officer (ADIO).

The District Administration, with support from NIC, is executing and implementing e-Governance and Digital India initiatives down to the grass-roots level, achieving transparent, efficient and responsive governance through ICT-led developments, e.g. NICNET and NKN connectivity, video conferencing, project implementation, capacity building, e-mail and SMS services, ICT implementation in districts including software development, technical support to VVIP events, and DeitY programmes such as Digital India, CSCs, DISHA, e-Governance Society, etc.

1.2 INFRASTRUCTURE

Network: The core of the NICNET backbone has been fully upgraded to multiple 10 Gbps capacity with sufficient redundancy. States are connected through multiple 1/10 Gbps links and districts through 34/100 Mbps links, with redundancy built into the State and District links. Last-mile redundancy for NICNET has been extended to more districts, with the primary link from BSNL and secondary links from Railtel/PGCIL. Most of the Bhawan links at Delhi which were on 34 Mbps have been upgraded to 100 Mbps, and those on 100 Mbps have been upgraded to 1 Gbps.

Direct peering of NICNET with BSNL, PGCIL and Railtel has been completed at Delhi and Hyderabad, saving Internet bandwidth and providing faster access to each other's network and data centres. Peering with Google, Microsoft and the Akamai Content Delivery Network has facilitated faster access to Google services and other important international websites. Restructuring of the videoconferencing network has made it possible to minimize delay and handle large-scale, important video conferences such as PRAGATI of the Hon'ble PM and GST Council Meetings chaired by the Hon'ble FM. High-speed Internet services are provided to the national data centres to ensure that the applications hosted are accessible to users across the globe with minimum latency. Capacity planning and upgradation of the Internet Gateway at regular intervals have been undertaken to provide smooth Internet access to all NICNET users throughout the country. To maintain accurate timing and synchronization of all network elements and servers on the network, Stratum-1 clocks are installed at Delhi and Hyderabad.

NKN: NKN empowers Digital India, as it is the primary backbone for all e-Governance initiatives in the country. It is the only network globally that carries R&E, Internet and e-Governance traffic as independent verticals under one umbrella. NKN has multiple 10G links combining to a core bandwidth of close to 1000G, providing secured and highly resilient connectivity across major institutions for research, education and e-Governance.

NKN has strong backbone connectivity with 31 Points of Presence (PoPs) in various State Capitals and 92 core links connected in a meshed topology. Moreover, currently over 700 Gigabits (reaching a peak of 5 Petabytes) of data flows within the NKN backbone every day. Over 40 links (premium institutes, SDCs (State Data Centres) and SWANs of many states) have been upgraded to 10 Gbps. NKN has also established high-capacity SCPC VSAT connectivity at Kavaratti, Lakshadweep and Port Blair, Andaman & Nicobar Islands.

National Cloud Infrastructure: NIC launched National Cloud Services in the year 2014 under MeghRaj, the Government of India Cloud Initiative. NIC Cloud Services are being provided from multiple locations of the National Data Centres at Bhubaneswar, Delhi, Hyderabad, and Pune. Various new services are now offered on the Cloud, including Application Programme Monitoring (APM) Service, Data Analytics (DA) Service, Resource Monitoring (RM) Service and Container Service. In order to cater to the projects envisioned under the Digital India Programme and the growing requirements of existing projects, over 18,000 virtual servers have been provisioned and allocated to over 1100 Ministries/Departments for e-Governance projects.

Network Security: The Network Security Division is in relentless pursuit of achieving CIA
(Confidentiality, Integrity, and Availability) of ICT assets in NICNET through deployment of expert
manpower, appropriate tools, and state-of-the-art technologies.

The Network Security Division (NSD) of NIC is engaged in the assessment, planning, deployment and management of security devices and solutions across NICNET in general and the Data Centres in particular. The security span of NSD comprises all National and State Data Centres, over 1000 LANs of Govt. offices and MPLS networks, more than 2 lakh endpoints and a series of networking devices deployed across the country. A dedicated team actively monitors real-time attacks on a 24×7 basis.

Application Security: NIC formulates and updates the security policies for NICNET as and when required. Security audits of web applications/websites, penetration testing and vulnerability analysis, SSL compliance testing, and version detection for the application hosting environment with infrastructure compliance checks are also done as per user requirements. Critical web applications are secured through a Web Application Firewall (WAF) to counter application-layer threats. Management and administration of deployed WAF solutions, configuration of critical sites including CMF (Drupal) based portals, WAF service support at NIC Hyderabad for non-compliant web applications, and 24x7 monitoring services are also provided.

The centre also provides incident handling and malware analysis, sanitization of security controls based on analysis results, and the issuing of advisories to NICNET users.

Videoconference: Videoconferencing facilitates direct interaction with the concerned stakeholders and saves time and money. Videoconferencing services are being used for the monitoring of various Government projects, schemes and public grievances, monitoring of law and order, hearings of RTI cases, tele-education, tele-medicine, the launching of new schemes, etc. NIC's VC services are being extensively used by the Hon'ble Prime Minister, Union Ministers, Governors, Chief Ministers of states, the Cabinet Secretary and Chief Secretaries, the Chief Information Commissioner and various other senior officials across the country. NIC also provides web-based desktop videoconferencing services to users of various departments of the central and state governments.

Webcast: NIC has been providing live/on-demand webcast services to the Central and State Governments for important national, international, regional and educational events and conferences. Live webcast services are provided for government TV channels such as Lok Sabha TV, Rajya Sabha TV, Doordarshan News, DD Kisan, the UGC CEC higher educational channel and DD Punjabi on a 24×7 basis. Important events such as the Union Budget speech, the President's address to the nation, the Prime Minister's Mann Ki Baat and other speeches, Independence and Republic Day celebrations at New Delhi, Air Force Day, dance and cultural festivals, PIB press conferences, NIC knowledge sharing, NKN events, proceedings of state assemblies, and other national and international events/conferences like Make in India, Skill India, Start-up India, Digital India and International Yoga Day have been covered.
Chapter: 2

2. PROJECT OVERVIEW
2.1 PROJECT TITLE

The project is titled “INTEGRATED ASSETS MANAGEMENT SYSTEM (IAMS)”, developed for the National Informatics Centre (NIC), Golaghat.

2.2 OBJECTIVES OF THE SYSTEM

• Build a digital Inventory for Assets Management.


• Friendly UI/UX for inventory creation and Report generation.
• Provide a facility to track any items or belongings of the DC office.
• Help maintain assets by providing reports/alerts for AMC contracts.
• Help Branch/Department to generate their requirements online.
• Facilitate automatic workflow management where HOD may grant
item purchase orders online.
• Provide various reports of logistics and items for better decision
making.

2.3 DESCRIPTION OF THE SYSTEM

The Integrated Assets Management System is a digital inventory for asset management. This system will prove to be a systematic and convenient replacement for the traditional asset management maintained through registers and paper formats. Moreover, this system will provide facilities to track any items or belongings of the Golaghat DC office, which was not possible in the traditional system. The system will have a user-friendly UI/UX for inventory creation and report generation. The system will help maintain assets by providing reports/alerts for AMC contracts. As in today's world everything is moving to digital platforms, this system will also help every Branch/Department to generate their requirements online. The system will have automatic workflow management where the HOD may grant the purchase of items ordered online. The system will also provide various reports of logistics and items for better decision making. A proper record will be maintained so that, if any controversy arises, it can be resolved by referring to such records. The branch managers and other users will get a better platform to keep proper track of each and every item which is present or has been ordered. There is also an auto-generated report/bill for every order placed by the branch managers, so that they can get the proper expenses or cost along with the item details.
Chapter: 3

3. FEASIBILITY STUDY
3.1 INTRODUCTION

An important outcome of the preliminary investigation is the determination of whether the proposed request to develop an existing system or a new system is feasible or not. It is necessary to evaluate the feasibility of a project at the earliest possible time. Months or years of effort, thousands or millions of dollars, and untold embarrassment can be averted if an ill-conceived system is recognized early in the definition phase.

The steps involved in the feasibility analysis are:

• Form a project team and appoint a project leader.
• Enumerate potential proposed systems.
• Define and identify the characteristics of the proposed systems.
• Determine and evaluate the performance and cost effectiveness of each proposed system.
• Weigh system performance and cost data.
• Select the best proposed system.
• Prepare and report the final project directive to management.

There are three aspects in the feasibility study portion of the preliminary investigation. They are:

1. Technical feasibility.

2. Economic feasibility.

3. Operational feasibility.

TECHNICAL FEASIBILITY:

Technical feasibility asks whether the work for the project can be done with the current equipment, existing software technology and available personnel. If new technology is needed, what is the likelihood that it can be developed? A number of technical issues are generally raised during the feasibility stage of the investigation. They are as follows:

1. Does the necessary technology exist to do what is suggested?

2. Does the proposed equipment have the technical capability to hold the data required to
use the new system?

3. Can the system be upgraded if developed?


4. Are there technical guarantees of accuracy, reliability, ease of access and data security?

So, the technical feasibility of this project is that the project can be done with the current equipment, existing software technology and available personnel. The proposed equipment has the technical capability to hold the data required to use the new system. There are technical guarantees of accuracy, reliability, ease of access and data security.

Front-End selection:

• It must have a graphical user interface that assists employees who are not from an IT background.
• Scalability and extensibility.
• Flexibility.
• Robustness.
• According to the requirements of the organization and the culture.
• Must provide excellent reporting features with good printing supports.
• Platform independent.
• Easy to debug and maintain.
• Event driven programming facility.
• Front end must support some popular back end like MS SQL server.

Back-End selection:

• Multiple user support.
• Efficient data handling.
• Provides inherent features for security.
• Efficient data retrieval and maintenance.
• Stored procedures.
• Popularity.
• Operating system compatibility.
• Easy to install.
• Various drivers must be available.
• Easy to integrate with the front end.

Technical feasibility is frequently the most difficult area encountered at this stage. It is essential that the process of analysis and definition be conducted in parallel with an assessment of technical feasibility. It centres on the existing computer system (hardware, software, etc.) and to what extent it can support the proposed system.
ECONOMIC FEASIBILITY:

Economic feasibility means an evaluation of the development cost weighed against the income or benefit derived from the developed project. Economic feasibility determines whether there are sufficient benefits in creating the system to make its cost acceptable, or whether the cost of not creating the system is so great that it is advisable to undertake the project. The analyst raises various financial and economic questions during the preliminary investigation to estimate the following:

• The cost of conducting a full system investigation.
• The cost of hardware and software for the class of application being considered.
• The cost if nothing changes (i.e. the proposed system is not developed).

The economic feasibility of this system is that there are sufficient benefits in creating the system to make its cost acceptable. As the existing system is maintained manually, there is always a tendency to lose money and time.

OPERATIONAL FEASIBILITY:

Operational feasibility asks whether the system will be used if it is developed and implemented. Proposed projects are beneficial only if they can be turned into information systems that will meet the operational requirements of the organization. This test of feasibility asks whether the system will work when it is developed and installed, and whether there are major barriers to implementation. Some of the important questions that are useful in assessing the operational feasibility of a project are given below:

➢ Is there sufficient support for the project from the management? If the present system is well liked and used to the extent that people will not be able to see reasons for a change, there may be resistance.
➢ Are current business methods acceptable to the users? If they are not, users may welcome a change that will bring about a more operationally useful system.
➢ Have the users been involved in the planning and development of the project? If they are involved from the earlier stages of project development, the chances of resistance can be reduced.
➢ Will the proposed system cause harm? Will it produce poorer results in any case? Will the performance of staff members fall after implementation?

Issues that appear to be quite minor at the early stage can grow into major problems after implementation. Therefore, it is always advisable to consider the operational aspects carefully.

The operational feasibility of this project is that it is user-friendly software, so there is no difficulty in training the users on the software. Users can benefit from the system by saving time and money.
Chapter: 4

4. TECHNOLOGY USED
4.1 SOFTWARE SPECIFICATIONS

The project is developed using the following software tools. These tools were chosen taking into account the need for future enhancements, system longevity and maintainability.
Languages/Scripts: PHP, Laravel, JSX (JavaScript XML)
Application server: Apache (XAMPP Server)
GUI Design: React JS, Cascading Style Sheets (CSS)
Browsers: Mozilla Firefox, Google Chrome or JavaScript supported browser
Database: MySQL

PHP: HYPERTEXT PRE-PROCESSOR

PHP, or Hypertext Pre-processor, is a widely used, general-purpose scripting language that was originally designed for web development to produce dynamic web pages. It can be embedded into HTML and generally runs on a web server, which needs to be configured to process PHP code and create web page content from it. It can be deployed on most web servers and almost every operating system and platform free of charge. PHP is installed on over 20 million websites and 1 million web servers.

PHP code can simply be mixed with HTML code, or it can be used in combination with various templating engines and web frameworks. PHP code is usually processed by a PHP interpreter, which is usually implemented as a web server's native module or a Common Gateway Interface (CGI) executable. After the PHP code is interpreted and executed, the web server sends the resulting output to its client, usually in the form of a part of the generated web page; for example, PHP code can generate a web page's HTML code, an image, or some other data. PHP has also evolved to include a command-line interface (CLI) capability and can be used in standalone graphical applications.
PHP syntax:
The following “Hello User” program is written in PHP code embedded in an HTML document:
<!DOCTYPE html>
<html>
<head>
<title>PHP Test</title>
</head>
<body>
<?php echo '<p>Hello User</p>'; ?>
</body>
</html>
The PHP interpreter only executes PHP code within its delimiters. Anything outside its
delimiters is not processed by PHP (although non-PHP text is still subject to control structures
described in PHP code). The most common delimiters are <?php to open and ?> to close PHP
sections. There are also the shortened forms <? or <?= (which is used to echo back a string or
variable) and ?>. Short delimiters make script files less portable, since support for them can be
disabled in the local PHP configuration, and they are therefore discouraged. The purpose of all these
delimiters is to separate PHP code from non-PHP code, including HTML.
The first form of delimiters, <?php and ?>, in XHTML and other XML documents, creates
correctly formed XML "processing instructions". This means that the resulting mixture of PHP code
and other markup in the server-side file is itself well-formed XML. Variables are prefixed with a
dollar symbol, and a type does not need to be specified in advance. PHP 5 introduced type hinting
that allows functions to force their parameters to be objects of a specific class, arrays, interfaces or
callback functions. However, before PHP 7.0, type hints could not be used with scalar types such as
integer or string. Unlike function and class names, variable names are case sensitive. Both double-
quoted ("") and heredoc strings provide the ability to interpolate a variable's value into the string.
PHP treats newlines as whitespace in the manner of a free-form language, and statements are
terminated by a semicolon. PHP has three types of comment syntax: /* */ marks block and inline
comments; // as well as # are used for one-line comments. The echo statement is one of several
facilities PHP provides to output text, e.g., to a web browser.
In terms of keywords and language syntax, PHP is similar to most high level languages that
follow the C style syntax. if conditions, for and while loops, and function returns are similar in syntax
to languages such as C, C++, C#, Java and Perl.
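
To illustrate these points, a short standalone snippet (not taken from the project code; the function and variable names are illustrative) might look like this:

<?php
// One-line comments use // or #
/* Block comments use this form */

// A scalar type declaration (available from PHP 7.0 onwards)
function greet(string $name): string
{
    // Double-quoted strings interpolate the variable's value
    return "Hello, $name";
}

$user = 'User';     // variables are prefixed with a dollar sign
echo greet($user);  // prints: Hello, User
?>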

LARAVEL:
Laravel is a PHP-based web framework for building high-end web applications using its expressive and graceful syntax. It comes with a robust collection of tools and provides a well-defined application architecture. Moreover, it incorporates various features of technologies like ASP.NET MVC, CodeIgniter, Ruby on Rails, and more. Laravel is an open-source framework. It facilitates developers by saving a huge amount of time and reduces the thinking and planning needed to develop an entire website from scratch. Along with that, Laravel also takes care of the security of the application. Hence, all of its features can boost the pace of web development. If one is familiar with the basics of PHP along with some intermediate PHP scripting, then Laravel can make the work more efficient. A minimal routing and controller sketch is given after the feature list below.

Some essential features provided by Laravel are:


• Routing controllers
• Configuration management
• Testability
• Authentication and authorization of users
• Modularity
• ORM (Object Relational Mapper) features
• Provides template engine
• Building schemas
• E-mailing facilities
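
The following is a minimal sketch of how routing and a controller typically look in Laravel; the route, controller and model names are assumptions made for illustration and are not the project's actual code.

<?php
// routes/web.php (illustrative only)

use Illuminate\Support\Facades\Route;
use App\Http\Controllers\ItemController;   // hypothetical controller

Route::get('/items', [ItemController::class, 'index']);   // list items
Route::post('/items', [ItemController::class, 'store']);  // create an item

<?php
// app/Http/Controllers/ItemController.php (illustrative only)

namespace App\Http\Controllers;

use App\Models\Item;            // hypothetical Eloquent model
use Illuminate\Http\Request;

class ItemController extends Controller
{
    // Return all items as JSON
    public function index()
    {
        return Item::all();
    }

    // Validate the request and create a new item
    public function store(Request $request)
    {
        $data = $request->validate([
            'name'     => 'required|string',
            'quantity' => 'required|integer|min:0',
        ]);

        return Item::create($data);
    }
}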
APACHE (XAMPP SERVER)

XAMPP is a free and open source cross-platform web server solution stack package
developed by Apache Friends, consisting mainly of the Apache HTTP Server, MariaDB Database,
and interpreters for scripts written in the PHP and Perl programming Languages. XAMPP stands for
Cross-Platform(X), Apache (A), MariaDB (M), PHP (P), and Perl (P). It is a simple, lightweight
Apache distribution that makes it extremely easy for developers to create a local web server for testing
and development purposes. Everything needed to set up a web server-server application (Apache),
database (MariaDB) and Scripting language (PHP)-is included in an extractable file. XAMPP is also
cross-platform, which means it works equally well on Linux, Mac and Windows. Since most actual
web server deployments use the same components as XAMPP, it makes transitioning from a local
test server to a live server extremely easy as well.

XAMPP Features: XAMPP is regularly updated to the latest releases of Apache, MariaDB, PHP and
Perl. It also comes with a number of other modules including OpenSSL, phpMyAdmin, MediaWiki,
Joomla, WordPress and more. Self-contained, multiple instances of XAMPP can exist on a single
computer, and any given instance can be copied from one computer to another. XAMPP is offered in
both a full and a standard version.

Usage of XAMPP: Officially, XAMPP's designers intended it for use only as a development tool, to allow website designers and programmers to test their work on their own computers without any access to the internet. To make this as easy as possible, many important security features are disabled by default. XAMPP nevertheless has the ability to serve web pages on the World Wide Web, and a special tool is provided to password-protect the most important parts of the package.

XAMPP also provides supports for creating and manipulating databases in MariaDB and
SQLite among others.

Once XAMPP is installed, it is possible to treat a localhost like a remote host by connecting
using an FTP client. Using a program like FileZilla has many advantages when installing a content
management system (CMS) like Joomla or WordPress. It is also possible to connect to localhost via
FTP with an HTML editor.

REACT JS:

React.js is an open-source JavaScript library that is used for building user interfaces, specifically for single-page applications. It is used for handling the view layer for web and mobile apps. React also allows us to create reusable UI components. React was first created by Jordan Walke, a software engineer working for Facebook. It was first deployed on Facebook's news feed in 2011 and on Instagram.com in 2012.

React allows developers to create large web applications that can change data without reloading the page. The main purpose of React is to be fast, scalable, and simple. It works only on the user interface of the application; this corresponds to the view in the MVC pattern. It can be used in combination with other JavaScript libraries or frameworks, such as AngularJS.

JSX

In React, instead of using regular JavaScript for templating, JSX is used. JSX is a JavaScript syntax extension that allows HTML quoting and uses HTML-tag syntax to render subcomponents. The HTML syntax is processed into JavaScript calls of the React framework. We can also write in plain old JavaScript.

NodeJS:

Node.js is an open source, cross-platform runtime environment for developing server-side


and networking applications. Node.js applications are written in JavaScript, and can be run within
the Node.js runtime on OS X, Microsoft Windows, and Linux.

Node.js also provides a rich library of various JavaScript modules which simplifies the
development of web applications using Node.js to a great extent. It brings plenty of advantages to
the table, making it a better choice than other server-side platforms like Java or PHP.

HTML STYLES-CASCADING STYLE SHEETS (CSS)

CSS stands for Cascading Style Sheets. Styling can be added to HTML elements in 3 ways:

• Inline - using a style attribute in HTML elements

• Internal - using a <style> element in the HTML <head> section

• External - using one or more external CSS files. The most common way to add styling is to keep the styles in separate CSS files.

CSS Syntax: CSS styling has the following syntax:

element { property:value; property:value }

The element is an HTML element name. The property is a CSS property. The value is a CSS value. Multiple styles are separated with semicolons.
Inline Styling (Inline CSS):
Inline styling is useful for applying a unique style to a single HTML element. Inline styling uses the style attribute. This inline styling changes the text colour of a single heading:

Example: <h1 style="color:blue">This is a Blue Heading</h1>

Internal Styling (Internal CSS):

An internal style sheet can be used to define a common style for all HTML elements on a page.
Internal styling is defined in the <head> section of an HTML page, using a <style> element.

Example:

<!DOCTYPE html>

<html>

<head>

<style>

body {background‐color:lightgrey}

h1 {color:blue}

p {color:green}

</style>

</head>

<body>

<h1>This is a heading</h1>

<p>This is a paragraph. </p>

</body>

</html>

External Styling (External CSS): External style sheets are ideal when the same style is applied to many pages. With external style sheets, we can change the look of an entire website by changing one file. External styles are defined in an external CSS file, which is then linked to in the <head> section of an HTML page.

Example:

<!DOCTYPE html>

<html>
<head>

<link rel="stylesheet" href="styles.css">

</head>

<body>

<h1>This is a heading</h1>

<p>This is a paragraph.</p>

</body>

</html>

CSS Fonts:

• The CSS color property defines the text color to be used for the HTML element.

• The CSS font-family property defines the font to be used for the HTML element.

• The CSS font-size property defines the text size to be used for the HTML element.

The CSS Box Model:

Every HTML element has a box around it, even if we cannot see it. The CSS border property defines a visible border around an HTML element.

Example: p { border:1px solid black; }

The CSS padding property defines a padding (space) inside the border.

Example: p { border:1px solid black; padding:10px; }

The CSS margin property defines a margin (space) outside the border.

Example: p { border:1px solid black; padding:10px; margin:30px; }

The id Attribute:
All the examples above use CSS to style HTML elements in a general way. To define a special style for one special element, first add an id attribute to the element:

Example: <p id="p01">I am different</p>

then define a different style for the (identified) element:

Example: p#p01 { color:blue; }

The class Attribute:

To define a style for a special type (class) of elements, add a class attribute to the element:

Example: <p class="error">I am different</p>

Now we can define a different style for all elements with the specified class:

Example: p.error { color:red; }

Use id to address single elements. Use class to address groups of elements.

MYSQL DATABASE

MySQL is the most popular database system used with PHP.

• MySQL is a database system used on the web.

• MySQL is a database system that runs on a server.

• MySQL is ideal for both small and large applications.

• MySQL is very fast, reliable and easy to use.

• MySQL uses standard SQL.

• MySQL compiles on a number of platforms.

• MySQL is free to download and use.

• MySQL is developed, distributed and supported by Oracle Corporation.


The data in a MySQL database is stored in tables. A table is a collection of related data, and it consists of columns and rows. Databases are useful for storing information categorically. A company may have a database with the following tables:
✓ Employees

✓ Products

✓ Customers

✓ Orders

Database queries: A query is a question or a request. We can query a database for specific
information and have a record set returned.

Example: the following query (using standard SQL):

select lastname from employees;

The query above selects all the data in the “lastname” column from the “employees” table.
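
In PHP, such a query can be executed with the PDO extension. The sketch below assumes a local MySQL server, a database named company and the example employees table used above; the connection details are illustrative only.

<?php
// Connect to a local MySQL database using PDO (credentials are illustrative)
$pdo = new PDO('mysql:host=localhost;dbname=company', 'root', '', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Run the example query and print each last name on its own line
$statement = $pdo->query('SELECT lastname FROM employees');

foreach ($statement as $row) {
    echo $row['lastname'], PHP_EOL;
}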

4.2 HARDWARE SPECIFICATIONS

Hardware used for developing the system:

• Processor: Intel core i5

• Memory: 4GB RAM

• Hard disk: 500GB


Chapter: 5

5. SYSTEM ANALYSIS
5.1 INTRODUCTION

System Analysis:
The Software Requirement Specification (SRS) is the starting point of the software development activity. The objective of the analysis of the problem is to answer the question: exactly what must the system do? During system analysis, the analyst attempts to develop a complete functional understanding of the proposed system. The document identifies a number of processes or functions that must be performed by the system.
There are mainly two parts of this phase:
1. Problem Analysis or Requirement Analysis
2. Requirement Specifications and Review

5.2 SYSTEM ANALYSIS ELEMENTS

Modelling: We create models to gain a better understanding of the actual entity to be built. Here the entity to be built is software, so the model must be capable of representing the information that the software transforms, the functions and sub-functions that enable the transformation to occur, and the behaviour of the system as the transformation takes place.

Functional Modelling: Software transforms information, and in order to accomplish this it must perform at least three generic functions: input, processing, and output. The functional model begins with a single context-level model (i.e. the name of the software to be built). Over a series of iterations, more and more functional detail is provided, until a thorough delineation of all system functionality is represented.

Data Modelling: Data modelling defines the primary data objects, the composition of each data object, the attributes of the objects, and the relationships between the objects and between the objects and the processes.

Data Objects, Attributes, Relationships:


The data model consists of three interrelated pieces of information: the data objects, the attributes that describe the data objects, and the relationships that connect the data objects to one another.
Data Objects: A data object is a representation of almost any composite information that must be understood by the software. By composite information, we mean something that has a number of different properties or attributes.
Attributes: Attributes define the properties of a data object.
Relationships: Data objects are connected to one another in a variety of different ways. We can define a set of object-relationship pairs that define the relevant relationships. Object-relationship pairs are bi-directional.
5.3 STRUCTURED ANALYSIS

Structured analysis is a data-oriented approach to conceptual modelling. It is a set of techniques and graphical tools that allow the analyst to develop a new kind of system specification that is easily understandable to the user.

Tools of Structured Analysis: The structured tools include the data flow diagram, the data dictionary, structured English, decision trees and decision tables. The objective is to build a new document, called the system specification, that provides the basis for design and implementation. The Data Flow Diagrams (DFDs) of the project are shown next.

DATA FLOW DIAGRAMS:

Data flow diagrams illustrate how data is processed by a system in terms of inputs and outputs. Data flow diagramming is a means of representing a system at any level of detail with a graphic network of symbols showing data flows, data stores, data processes, and data sources/destinations. The data flow diagram is analogous to a road map: it is a network of all possibilities, with different detail shown on different hierarchical levels. The process of representing different detail levels is called levelling or partitioning by some data flow diagram advocates. Like a road map, there is no start or stop point, no time or timing, and no steps to get somewhere; we just know that the data path must exist because at some point it will be needed, much as a road map shows all existing or planned roads because a road will be needed at some point. Detail that is not shown on the different levels of the data flow diagram, such as volumes, timing, frequency, etc., is shown on supplementary diagrams or in the data dictionary.

A DFD shows the flow of data through a system. It views a system as a function that transforms the inputs into desired outputs. Any complex system will not perform this transformation in a single step, and data will typically undergo a series of transformations before it becomes the output. The DFD aims to capture the transformations that take place within a system on the input data so that eventually the output data is produced. The agent that performs the transformation of data from one state to another is called a process (or a bubble). So a DFD shows the movement of data through the different transformations or processes in the system.
DFDs are basically of two types: physical and logical.

Physical DFD:
The Physical Data Flow Diagram (DFD) reveals the actual devices and people that perform the functions. It shows the physical components of a system. The emphasis of this type of DFD is on the physical characteristics of the system. It depicts the various people doing jobs in an organization. It is used in the analysis phase to study the functioning of the current system.

Logical DFD:
A Logical DFD shows the ongoing activities of the system. It does not show how these tasks are done or who does them. It is used in the design phase for depicting the flow of data in the proposed system.
PURPOSE/OBJECTIVE OF DFD:
1. Graphical, eliminating thousands of words; logical representations model WHAT a system does, rather than physical models showing HOW it does it.
2. Hierarchical, showing systems at any level of detail.
3. Allowing user understanding and review.

Creating a DFD:
Step 1: Plan the Solution
1. Identify the inputs, outputs and external entities of the system.
2. Identify the top-level processes in the system.
3. Identify the detailed processes of the system.
Step 2: Implement the Solution
1. Draw the Context Analysis Diagram (CAD).
2. Draw the Top Level DFD.
3. Draw the detailed Logical DFD.
Step 3: Verify the Solution
Get approval of the design from the client.

DATA FLOW DIAGRAM (DFD) ELEMENTS:


The following four elements are used in the Data Flow Diagrams.
1. An External Entity
2. A Data Flow
3. A Process
4. A Data store
Data Flow Diagrams are composed of the four basic symbols shown below.

An External Entity
An external entity can be either a source or a destination of data in the system design being constructed. It lies outside the context of the system. It is represented by a solid square. If an entity needs to be represented more than once, then both instances of the entity are represented as follows:

EXTERNAL ENTITY

A Process
A process indicates the work that is performed on data. It transforms data from one form to
another. A circle represents a process.

Process No.

Process
Name
A Data Flow
A data flow takes place between the various components of the system. In Data Flow Diagram
the data flow is represented as the thin line pointing in the direction in which the data is flowing.

Data Store:
A data store is a repository for data. While making a logical design, if it is required to store data, a data store is used. A data store is represented by an open rectangle. It also has a number and a name.

Data Store Name


Chapter: 6

6. SYSTEM DESIGN
6.1 SYSTEM INTRODUCTION

Good design is the key to effective engineering. However, it is not possible to formalize
the design process in any engineering discipline. Design is a creative process requiring insight and
flair on the part of the designer. It must be practiced and learnt by experience and study of existing
systems. Any design problem must be tackled in three stages:

• Study and understand the problem: Without this understanding, effective software design is
impossible. The problem should be examined from a number of different angles or viewpoints
as these provide different insights into the design requirements.

• Identify gross features of at least one possible solution: It is often useful to identify a number
of solutions and to evaluate them all. The choice of solution depends on the designer’s
experience, the availability of reusable components, and the simplicity of the derived
solutions. Designers usually prefer familiar solutions even if these are not optimal, as they
understand their advantages and disadvantages.

• Describe each abstraction used in the solution: Before creating formal documentation, the
designer may write an informal design description. This may be analyzed by developing it in
detail. Errors and omissions in the high-level design will probably be discovered during this
analysis. These are corrected before the design is documented. There is no general agreement
on the notion of a ‘good’ design. Apart from the obvious criteria that a design should correctly
implement a specification, a good design might be a design that allows efficient code to be
produced; it might be a minimal design where the implementation is as compact as possible;
or it might be the most maintainable design.

6.2 DATABASE DESIGN

Database Design creates a model for data that are represented at a high level of abstraction.
The data design transforms the information domain model created during analysis into the data
structures that will be required to implement the software. The structure of the data has always been
an important part of the design. Like other software engineering activities, data design (sometimes
referred to as data architecting) creates a model of data and/or information that is represented at a
high level of abstraction (the customer/user's view of data). This data model is then refined into progressively more implementation-specific representations that can be processed by the computer-based system. In many software applications, the architecture of the data will have a profound influence on the architecture of the system that must process it.
The structure of data has always been an important part of design. At the program
component level, the design of data structures and the associated algorithms required to manipulate
them is essential to the creation of high-quality applications. At the application level, the translation
of a data model (derived as part of requirements engineering) into a database is pivotal to achieving
the business objectives of a system. At the business level, the collection of information stored in disparate databases and reorganized into a "data warehouse" enables data mining or knowledge discovery that can have an impact on the success of the business itself. In every case, data design plays an important role. The general objective of the database is to make information access easy, quick, inexpensive and flexible for the user. Tables are maintained on the remote server.

DATA DICTIONARY

A data dictionary, or metadata repository, as defined in the IBM Dictionary of Computing, is


a centralized repository of information about data such as meaning, relationships to other data, origin,
usage, and format. The term may have one of several closely related meanings pertaining
to databases and Database Management Systems (DBMS):
• a document describing a database or collection of databases
• an integral component of a DBMS that is required to determine its structure
• a piece of middleware that extends or supplants the native data dictionary of a DBMS
For this project, the following relations are in the database (a sketch of one of these relations as a Laravel migration follows the list).

1. Categories

2. Subcategories

3. items

4. assettypes

5. ordermasters

6. Orderitems

7. stockmasters

8. branchmaster

9. branchmanager

10. rolemaster

11. designations

12. users
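
As a sketch of how one of these relations can be defined in Laravel, a migration for the Categories table might look as follows; the column names (other than the primary key and timestamps) are assumptions, since the full data dictionary is not reproduced here.

<?php
// database/migrations/create_categories_table.php (illustrative only;
// the anonymous migration class style requires Laravel 8.37 or later)

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    // Create the categories table
    public function up(): void
    {
        Schema::create('categories', function (Blueprint $table) {
            $table->id();                               // primary key
            $table->string('name');                     // assumed column
            $table->string('description')->nullable();  // assumed column
            $table->timestamps();                       // created_at / updated_at
        });
    }

    // Drop the table on rollback
    public function down(): void
    {
        Schema::dropIfExists('categories');
    }
};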
ENTITY-RELATIONSHIP DIAGRAM

The entity-relationship (ER) diagram, also known as the entity-relationship model, is a specialized graphic that illustrates the interrelationships between entities in a database. ER diagrams often use symbols to represent three different types of information:

i. Boxes are commonly used to represent entities.

ii. Diamonds are normally used to represent relationships.

iii. Ovals are used to represent attributes.

An entity is a piece of data: an object or concept about which data is stored.

A relationship is how the data is shared between entities. There are three types of
relationships between entities:

i. One-to-one: one instance of an entity (A) is associated with one other instance of another entity
(B).

ii. One-to-many: one instance of an entity (A) is associated with zero, one or many instances of
another entity (B)

iii. Many-to-many: one instance of an entity (A) is associated with one, zero or many instances of
another entity (B), and one instance of entity (B) is associated with one, zero or many instances of
entity (A).

The Building Blocks: Entities, Relationships, and Attributes

Entity: An entity may be defined as a thing which is recognized as being capable of an


independent existence and which can be uniquely identified. An entity is an abstraction from the
complexities of some domain. An entity may be a physical object such as a house or a car, an event
such as a house sale or a car service, or a concept such as a customer transaction or order. An entity-
type is a category. There are usually many instances of an entity-type. Entities can be thought of as nouns: a computer, an employee, a song, a mathematical theorem, etc.
Relationships: A relationship captures how two or more entities are related to one another. Relationships can be thought of as verbs, linking two or more nouns. Examples: an owns relationship between a company and a computer, a supervises relationship between an employee and a department, a performs relationship between an artist and a song, and a proved relationship between a mathematician and a theorem.
Attributes: An Attribute is a specification that defines a property of an object, element, or
file. It may also refer to or set the specific value for a given instance of such. An attribute of an object
usually consists of a name and a value; of an element, a type or class name; of a file, a name and
extension.
Entities and relationships can both have attributes. Examples: an employee entity might have
a Social Security Number (SSN) attribute; the proved relationship may have a date attribute. Every
entity (unless it is a weak entity) must have a minimal set of uniquely identifying attributes, which is
called the entity's primary key, and a set of attributes that are used to refer to other entities, which are
called the entity’s foreign keys.
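
In Laravel's Eloquent ORM, a one-to-many relationship of this kind is expressed directly on the models. The sketch below uses the Categories and Subcategories relations from the data dictionary; the method names, and the assumption that the subcategories table carries a category_id foreign key, are illustrative only.

<?php
// app/Models/Category.php and Subcategory.php (illustrative only;
// each class normally lives in its own file)

namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class Category extends Model
{
    // One category has many subcategories (one-to-many)
    public function subcategories()
    {
        return $this->hasMany(Subcategory::class);   // assumes subcategories.category_id
    }
}

class Subcategory extends Model
{
    // Each subcategory belongs to exactly one category via the foreign key
    public function category()
    {
        return $this->belongsTo(Category::class);
    }
}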

The Major components of an E-R diagram are:

SYMBOLS

Entity

Attribute

Relationship

Attribute
Primary Key

Attribute
Foreign Key
Chapter: 7

7. SYSTEM TESTING
7.1 INTRODUCTION

Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding. The purpose of product testing is to verify and validate the various work products, viz. units, integrated units, and the final product, to ensure that they meet their respective requirements. This has two parts:

• Planning: This involves writing and reviewing unit, integration, functional, validation and
acceptance test plans.
• Execution: This involves executing these test plans, measuring and collecting data, and verifying whether the product meets the quality criteria set in the quality plan. The data collected is used to make appropriate changes in the plans related to development and testing.

7.2 TESTING

Testing Objectives:

➢ Testing is a process of executing a program with the intent of finding an error.


➢ A good test case is one that has a high probability of finding an as-yet undiscovered error.
➢ A successful test is one that uncovers an as-yet undiscovered error.
Test Plan:

The quality of a product or item can be achieved by ensuring that the product meets its requirements by planning and conducting the following tests at various stages (a small unit-test sketch follows this list):

➢ Unit Tests: at the unit level, conducted by the development team, to verify individual standalone units.
➢ Integration Tests: after two or more product units are integrated, conducted by the development team to test the interfaces between the integrated units.
➢ Functional Tests: prior to the release to the validation manager, designed and conducted by a team independent of the designers and coders, to verify the functionality provided against the customer requirement specifications.
➢ Acceptance Tests: prior to the release to the validation manager, conducted by the development team using acceptance criteria, if any, supplied by the customer.
➢ Validation Tests: prior to release to the customer, conducted by the validation team to validate the product against the customer requirement specifications and the user documentation.
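
As an illustration of what a unit test at this level could look like in a PHP project, the following PHPUnit sketch tests a small hypothetical helper class; it is not part of the project's actual test suite.

<?php
// tests/Unit/StockCalculatorTest.php (hypothetical example)

use PHPUnit\Framework\TestCase;

// A small hypothetical helper used only for this illustration
class StockCalculator
{
    public function remaining(int $inStock, int $ordered): int
    {
        return max(0, $inStock - $ordered);   // stock never goes negative
    }
}

class StockCalculatorTest extends TestCase
{
    public function testRemainingStockIsReduced(): void
    {
        $calc = new StockCalculator();
        $this->assertSame(7, $calc->remaining(10, 3));
    }

    public function testRemainingStockNeverGoesNegative(): void
    {
        $calc = new StockCalculator();
        $this->assertSame(0, $calc->remaining(2, 5));
    }
}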

7.3 LEVEL OF TESTING

Any software product can be tested in one of two ways:


Knowing the specific functions that a product has been designed to perform, tests can be conducted to demonstrate that each function is fully operational, while at the same time searching for errors in each function. This approach is known as black-box testing. Knowing the internal workings of a product, tests can be conducted to ensure that internal operations perform according to specification and that all internal components have been adequately exercised. This approach is known as white-box testing.

Black-box testing is designed to uncover errors. It is used to demonstrate that software functions are operational; that input is properly accepted and output is correctly produced; and that the integrity of external information (e.g. data files) is maintained. A black-box test examines some fundamental aspects of a system with little regard for the internal logical structure of the software.
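As a small assumed example of such a black-box check (save_item and load_item are invented stand-ins for a public save/read interface, and the record format is made up for illustration), the integrity of external information can be verified through a simple round trip that looks only at inputs and outputs:

import json
import os
import tempfile

def save_item(path, item):
    # Hypothetical public interface: persist an item record as JSON.
    with open(path, "w") as f:
        json.dump(item, f)

def load_item(path):
    # Hypothetical public interface: read an item record back.
    with open(path) as f:
        return json.load(f)

# Black-box round trip: only the externally visible behaviour is examined.
item = {"code": "ITM-001", "name": "Printer cartridge", "quantity": 4}
path = os.path.join(tempfile.mkdtemp(), "item.json")
save_item(path, item)
assert load_item(path) == item   # external data is preserved intact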

White-box testing of software is predicated on close examination of procedural detail. Logical paths through the software are tested by providing test cases that exercise specific sets of conditions and loops. The "state of the program" may be examined at various points to determine whether the expected or asserted status corresponds to the actual status.
The module interface is tested to ensure that information properly flows into and out of the program unit under test. Local data structures are examined to ensure that data stored temporarily maintains its integrity during all steps in the execution of the algorithm. Boundary conditions are tested to ensure that the module operates properly at the boundaries established to limit or restrict processing. All independent paths through the control structure are exercised to ensure that all statements in the module are executed at least once. Finally, all error-handling paths are tested.
Test cases uncover errors such as:
1. Comparison of different data types.
2. Incorrect logical operators or precedence.
3. Expectation of equality when precision errors make equality unlikely.
4. Incorrect comparison of variables.
5. Improper or non-existent loop termination.
6. Failure to exit when divergent iteration is encountered.
7. Improperly modified loop variables.
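Two of these pitfalls can be made concrete with a short, assumed Python fragment: comparing floating-point totals with == (item 3) is unreliable, and a loop whose exit test relies on such a comparison (item 5) may never terminate, so a tolerance-based comparison is used instead:

import math

# Item 3: exact equality fails because of floating-point rounding.
total = 0.1 + 0.2
print(total == 0.3)              # False
print(math.isclose(total, 0.3))  # True: compare with a tolerance instead

# Item 5: an exit test written as "balance == 0.0" might never become true,
# so the loop below uses a tolerance plus a safety limit on iterations.
balance = 1.0
steps = 0
while not math.isclose(balance, 0.0, abs_tol=1e-9) and steps < 100:
    balance -= 0.1               # accumulates rounding error
    steps += 1
print(steps)                     # terminates safely after 10 steps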
Chapter: 8

8. SYSTEM IMPLEMENTATION AND MAINTENANCE
8.1 IMPLEMENTATION

A crucial phase in the system development life cycle is the successful implementation of the new system design. Implementation simply means converting the new system design into operation. This is the moment of truth: the first question that strikes everyone's mind is whether the system will be able to give all the desired results expected of it. The implementation phase is concerned with user training and file conversion. When the candidate system is linked to remote terminals or remote sites, the telecommunication network and tests of the network along with the system are also included under implementation. During final testing, user acceptance is tested, followed by user training. Depending on the nature of the system, extensive user training may be required. Conversion usually takes place at about the same time the user is trained, or later.

System testing checks the readiness and accuracy of the system to access, update and retrieve data from the new files. Once the program becomes available, test data are read into the computer and processed against the files provided for testing. If successful, the program is then run with live data. Otherwise, a diagnostic procedure is used to locate and correct the errors in the program. In most conversions, a parallel run is conducted, in which the new system runs simultaneously with the old system. This method, though costly, provides added assurance against errors in the candidate system and also gives the user staff an opportunity to gain experience through operation. In some cases, however, parallel processing is not practical.
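The parallel run described above can be sketched as follows; legacy_total and new_total are assumed stand-ins for the figures produced by the old register-based process and the new system respectively (neither is taken from the actual project), and the point is only to show how results from both would be reconciled during conversion:

def legacy_total(order):
    # Stand-in for the amount worked out under the old, register-based process.
    return sum(line["qty"] * line["rate"] for line in order["lines"])

def new_total(order):
    # Stand-in for the amount computed by the candidate (new) system.
    return sum(line["qty"] * line["rate"] for line in order["lines"])

def parallel_run(orders):
    # Process the same live data through both systems and report mismatches.
    mismatches = []
    for order in orders:
        old, new = legacy_total(order), new_total(order)
        if old != new:
            mismatches.append((order["id"], old, new))
    return mismatches

orders = [{"id": "ORD-1", "lines": [{"qty": 2, "rate": 150}, {"qty": 1, "rate": 500}]}]
print(parallel_run(orders))   # an empty list means the two systems agree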

The term implementation has different meanings, ranging from the conversion of a basic application to a complete replacement of a computer system. Implementation is used here to mean the process of converting a new or revised system design into an operational one. Conversion is one aspect of implementation; the other aspects are the post-implementation review and software maintenance.

There are three types of implementation:

Implementation of a computer system to replace a manual system.

Implementation of a new computer system to replace an existing one.

Implementation of a modified application to replace an existing one.


8.2 MAINTENANCE

Software maintenance traditionally denotes the process of modifying a software product after it has been delivered to the customer. Maintenance is inevitable for almost any kind of product. Most software products need maintenance on account of the following three main reasons:

Corrective: Corrective maintenance of a software product may be necessary either to rectify some bugs observed while the system is in use, or to enhance the performance of the system.

Adaptive: A software product might need maintenance when the customer needs the product to run on new platforms or new operating systems, or when they need the product to interface with new hardware or software.

Perfective: A software product needs maintenance to support the new features that the users want or to change different functionalities of the system according to customer demands.

Software maintenance is an important activity of every organization, as the rate of hardware obsolescence is very high. There is also a demand from the user community to see existing software products run on newer platforms, in newer environments, and with enhanced features. Whenever the hardware changes, and if your software performs some low-level functions, maintenance is necessary.

The activities involved in a software maintenance project are not unique and depend on several factors such as:

➢ The extent of modification required to the product.
➢ The resources available to the maintenance team.
➢ The condition of the existing product (e.g. how structured it is, how well documented it is, etc.).
➢ The expected project risk.
Chapter: 9

9. SCREENSHOTS
1. Nazarat Dashboard
2. Category Entry Form
3. Category List
4. Subcategory View
5. Item Entry Form
6. Pending Order
7. Billing Details Entry (Order Approve)
8. View Approved Order
9. Pending Stock Entry
10. Branch User Dashboard
11. Requisition
12. Ordered Item
13. Order Bill
14. Approved Orders
Chapter: 10

10. CONCLUSION

The project “Integrated Assets Management System (IAMS), National Informatics Centre (NIC), Golaghat” is developed to enhance and improve the management of asset records that are presently kept in physical form. Since record-keeping is currently done through registers and paper formats, it is difficult to maintain a proper track record. This system will facilitate an improved record system between the various branch managers and other users.

A proper record will be maintained so that if any controversy arises, it can be resolved by referring to such records. The branch managers and other users will get a better platform to keep proper track of each and every item that is present or has been ordered.

There is also an auto-generated report/bill for every order placed by the branch managers, so that they can get the proper expenses or cost along with the item details.

Coding was done as per the standards, and general coding guidelines were decided so as to have a consistent coding standard across different modules. While coding, emphasis was given to efficiency, optimum use of system resources, object naming conventions, documentation, indentation, and function and module headers.

Finally, exhaustive testing and debugging was done so that the software is as error-free as possible. Thus the main objective, developing a system that is reliable, efficient and meets all the requirements, was achieved. Care was also taken that the modules can be easily maintained and modified.

There is always scope for further development in any software product, and the same is true for this project as well. The end user will decide its scope in future applications. This project is flexible enough to incorporate any requirement changes.
