Online Shopping Project Synopsis
Project Synopsis
On
Online Shopping

SUBMITTED FOR THE PARTIAL FULFILMENT OF THE REQUIREMENT FOR THE DEGREE OF

SUPERVISED BY
Faculty Name Here
(Faculty)

Submitted by:
Operational Structure
R-World will start its operations in Mangalore city. It will commission 8 retail outlets in the important residential localities of the city.
Two stock depots, East and West, will service R-World's stock distribution. Each stock depot will service 4 outlets. Inter-depot stock transfers will also happen to maintain a balanced stock position.
Within each outlet, two kinds of stores are maintained: main stores and shelf inventory. Main stores are the outlet's stores from which items are moved to the shelves. Shelf inventory reflects the items in stock on the various shelves in the outlet. Therefore, the total stock of an item is the main stores stock plus the shelf inventory stock.
Minimum total stock (MTS) positions are defined for each item. Re-order levels (ROL) are also defined for each item. Whenever total stock falls to the re-order level, a stock indent is placed with the nearest stock depot. Stock depots deliver items twice every day, in the morning and in the afternoon.
Minimum shelf quantities (MSQ) are maintained for each item within the outlet. Whenever the shelf stock of a specific item falls to its MSQ, the outlet manager instructs replenishment of stock to the shelf.
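The re-order and shelf-replenishment rules above amount to two simple threshold checks. The following is a minimal sketch in C#; the ItemStock type, its field names and the way thresholds are stored are hypothetical illustrations, not part of the specification.

```csharp
// Minimal sketch of the MTS/ROL/MSQ rules; all names are hypothetical.
class ItemStock
{
    public int MainStoreQty;    // quantity in the outlet's main stores
    public int ShelfQty;        // quantity on the outlet's shelves
    public int ReorderLevel;    // ROL defined for the item
    public int MinShelfQty;     // MSQ defined for the item

    // Total stock of an item = main stores stock + shelf inventory stock.
    public int TotalStock { get { return MainStoreQty + ShelfQty; } }

    // A stock indent is placed with the nearest depot when total stock falls to the ROL.
    public bool NeedsStockIndent() { return TotalStock <= ReorderLevel; }

    // The outlet manager orders shelf replenishment when shelf stock falls to the MSQ.
    public bool NeedsShelfReplenishment() { return ShelfQty <= MinShelfQty; }
}
```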
The Dispatch Manager compiles door delivery orders three times during the day. The cut-off time for packing for the 1.00 pm delivery is 10.30 am. At 10.30 am, a customer-wise packing list is generated, together with an item-wise total quantity list. Packers assemble the items into bags, tick off the list, and present the list and the consignment to the Dispatch Clerk for billing. Two copies of the bill and the consignment list are stapled to each consignment box. Boxes are loaded into the delivery trucks after the driver verifies the consignment list against the delivery instructions. The credit feasibility of each customer is indicated in the bill. Customers may pay by cash, cheque or card, or opt for credit.
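The two lists generated at the cut-off time are simple groupings of the pending door delivery order lines, one by customer and one by item. The sketch below illustrates this with LINQ; the OrderLine type and the sample data are hypothetical.

```csharp
// Minimal sketch of compiling the customer-wise packing list and the
// item-wise total quantity list; names and data are hypothetical.
using System;
using System.Collections.Generic;
using System.Linq;

class OrderLine
{
    public string Customer;
    public string Item;
    public int Quantity;
}

class DispatchLists
{
    static void Main()
    {
        var lines = new List<OrderLine>
        {
            new OrderLine { Customer = "A101", Item = "Rice 5kg",  Quantity = 2 },
            new OrderLine { Customer = "A102", Item = "Rice 5kg",  Quantity = 1 },
            new OrderLine { Customer = "A101", Item = "Sugar 1kg", Quantity = 3 },
        };

        // Customer-wise packing list: one group of order lines per customer.
        foreach (var byCustomer in lines.GroupBy(l => l.Customer))
            Console.WriteLine("{0}: {1} line item(s)", byCustomer.Key, byCustomer.Count());

        // Item-wise total quantity list: total quantity to be picked per item.
        foreach (var byItem in lines.GroupBy(l => l.Item))
            Console.WriteLine("{0}: {1}", byItem.Key, byItem.Sum(l => l.Quantity));
    }
}
```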
Delivery trucks are loaded with standard-sized consignment boxes, which come in three sizes: small, medium and big. The total space of a delivery truck is fixed; therefore, it should be possible to calculate the space available in a truck based on pending door delivery order volumes. Unless specified otherwise, customer orders are loaded onto the delivery truck on a first-come, first-served basis. Currently, there is only one truck. This will be increased as order volumes increase.
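Since consignment boxes come in fixed sizes, the space still available in the truck can be worked out from the pending consignments. A minimal sketch follows; the truck volume and the per-box volumes are assumed figures, not values from the specification.

```csharp
// Minimal sketch of the truck space-availability calculation; all figures are assumptions.
class TruckCapacity
{
    const double TruckVolume = 12.0;                               // usable space, cubic metres
    const double SmallBox = 0.05, MediumBox = 0.10, BigBox = 0.20; // volume per box size

    // Space remaining after the pending door-delivery consignments are loaded.
    static double SpaceAvailable(int smallCount, int mediumCount, int bigCount)
    {
        double used = smallCount * SmallBox + mediumCount * MediumBox + bigCount * BigBox;
        return TruckVolume - used;
    }

    static void Main()
    {
        // 40 small + 25 medium + 10 big boxes occupy 6.5 of the 12.0 cubic metres.
        System.Console.WriteLine(SpaceAvailable(40, 25, 10));   // prints 5.5
    }
}
```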
Customers
Customers are divided into two categories: walk-in customers and account customers. Account customers are provided with the R-World card; they may use that card while billing at the counter or, alternatively, quote the card number while ordering stock over the Net.
Account customers can order stock over the Web. The minimum value for door delivery orders is Rs. 250. This amount may be revised later. Customers need to be given a choice of item categories to choose from, item lists with prices, and the status of item availability in stock.
Based on the delivery schedule and the status of space availability within each delivery, the earliest delivery date/time will be intimated to the customer.
Account customers may also walk in to the store, buy items and ask for door delivery.
Customers may return items to the store and, based on the situation, the store may accept the items back. Either money will be returned or the customer may buy another item and settle the difference (either way).
Volume discounts are defined for each item. Discount rates are defined for quantity slabs. There may be any number of quantity slabs for an individual item; some items may have two slabs and others up to five. This can change in the future.
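Because discount rates vary with quantity slabs, picking the rate for an order is a lookup over the item's slabs. The sketch below assumes a simple list of slabs per item; the slab boundaries and rates are hypothetical.

```csharp
// Minimal sketch of slab-based volume discounts; boundaries and rates are hypothetical.
using System.Collections.Generic;

class QuantitySlab
{
    public int MinQty;               // slab applies from this quantity upward
    public double DiscountPercent;   // discount rate for the slab
}

class VolumeDiscount
{
    // Slabs for one item, ordered by MinQty; an item may have two or up to five slabs.
    static readonly List<QuantitySlab> Slabs = new List<QuantitySlab>
    {
        new QuantitySlab { MinQty = 10,  DiscountPercent = 2 },
        new QuantitySlab { MinQty = 50,  DiscountPercent = 5 },
        new QuantitySlab { MinQty = 100, DiscountPercent = 8 },
    };

    // Returns the discount rate for an ordered quantity: the highest slab reached.
    public static double DiscountFor(int quantity)
    {
        double discount = 0;
        foreach (QuantitySlab slab in Slabs)
            if (quantity >= slab.MinQty)
                discount = slab.DiscountPercent;
        return discount;
    }
}
```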
The project has been planned around a distributed architecture with centralized storage of the database. Data storage has been planned using the constructs of MS SQL Server 2000, and all the user interfaces have been designed using ASP.NET technologies. The database connectivity is planned using the "SQL Connection" methodology. Security standards and data-protection mechanisms have been given due importance. The application takes care of the different modules and their associated reports, which are produced as per the applicable strategies and standards put forward by the administrative staff.
Chapter 2
Project Synopsis
The entire project has been developed keeping the distributed client-server computing technology in mind. The specifications have been normalized up to 3NF to eliminate the anomalies that may arise from the database transactions executed by general users and the organizational administration. The user interfaces are browser based to give distributed accessibility to the overall system. The internal database has been selected as MS SQL Server 2000. The basic constructs of tablespaces, clusters and indexes have been exploited to provide higher consistency and reliability for the data storage. MS SQL Server 2000 was chosen as it provides high-level reliability and security. The entire front end was developed using ASP.NET technologies. At all appropriate levels, care was taken to ensure that the system maintains data consistency through proper business rules and validations. The database connectivity was planned using the "SQL Connection" technology provided by Microsoft Corporation. Authentication and authorization are cross-checked at all the relevant stages. User-level accessibility has been restricted to two zones, namely the administrative zone and the normal user zone.
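The "SQL Connection" based connectivity mentioned above corresponds to ADO.NET's SqlConnection and SqlCommand classes. The following is a minimal sketch only; the connection string, table and column names are hypothetical and would differ in the actual system.

```csharp
// Minimal sketch of SqlConnection-based data access; all names are hypothetical.
using System;
using System.Data.SqlClient;

class ItemAvailability
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Server=localhost;Database=RWorld;Integrated Security=SSPI"))
        {
            conn.Open();
            // A parameterized query acts as a basic validation/business-rule safeguard.
            SqlCommand cmd = new SqlCommand(
                "SELECT ItemName, StockQty FROM Item WHERE ItemId = @id", conn);
            cmd.Parameters.AddWithValue("@id", 101);

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}: {1} in stock",
                        reader["ItemName"], reader["StockQty"]);
            }
        }
    }
}
```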
States at a glance
The existing sales process, at a glance: the customer physically visits the retail outlet and selects the required products of his choice; the sales person provides the products as per demand; the bill clerk raises the bill; the customer cross-verifies the items against the bill and leaves, with no reference being recorded.
The system, at any point of time, can provide information related to all the existing retail outlets and their operations.
The system, at any point of time, can provide the list of items and their stock availability.
The system, at any point of time, can help customers in raising their orders.
The user components are designed to handle the transactional states that arise upon the system whenever a general employee within the organization visits the user interface to enquire for the required data. The normal user interfaces are associated with the environment mostly for the sake of report standardization. The user components accept parametric information from the users as the system requires.
GUIs
For flexibility of use, the interface has been developed with a graphical concept in mind, accessed through a browser interface. The GUIs at the top level have been categorized as the administrative user interface and the operational or generic user interface.
The operational or generic user interface helps the users of the system in carrying out transactions on the existing data and required services. The operational user interface also helps ordinary users in managing their own information in a customized manner, as per the provided flexibility.
Number of Modules
After careful analysis, the system has been identified as being presented with the following modules:
Document Conventions:
The overall documentation for this project follows the modeling standards recognized across the software industry.
Introduction
Overview
The original code was inherited from Sybase and designed for eight-megabyte Unix systems in 1983. The new on-disk formats in SQL Server 7.0 improve manageability and scalability and allow the server to scale easily from low-end to high-end systems, improving performance and manageability.
Benefits
Overview
Files
SQL Server 7.0 creates a database using a set of operating system
files, with a separate file used for each database. Multiple
databases can no longer share the same file. There are several
important benefits to this simplification. Files can now grow and
shrink, and space management is greatly simplified. All data and
objects in the database, such as tables, stored procedures,
triggers, and views, are stored only within these operating system
files:
Primary data file: Every database has exactly one primary data file, which is the starting point of the database and points to the rest of the files in the database.
Secondary data files: These files are optional and can hold all data and objects that are not on the primary data file. Some databases may not have any secondary data files, while others have multiple secondary data files.
Log files: These files hold all of the transaction log information used to recover the database. Every database has at least one log file.
A database now consists of one or more data files and one or more
log files. The data files can be grouped together into user-defined
filegroups. Tables and indexes can then be mapped to different
filegroups to control data placement on physical disks. Filegroups
are a convenient unit of administration, greatly improving
flexibility. SQL Server 7.0 will allow you to back up a different
portion of the database each night on a rotating schedule by
choosing which filegroups to back up. Filegroups work well for
sophisticated users who know where they want to place indexes
and tables. SQL Server 7.0 can work quite effectively without
filegroups.
Log files are never part of a filegroup. Log space is managed separately from data space.
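As an illustration of filegroups, a database can be created with a user-defined filegroup and tables later placed on it. The sketch below issues the T-SQL from C#; the database name, filegroup name and file paths are hypothetical.

```csharp
// Minimal sketch: creating a database with a user-defined filegroup; names and paths are hypothetical.
using System.Data.SqlClient;

class CreateRWorldDatabase
{
    static void Main()
    {
        const string createDb = @"
            CREATE DATABASE RWorld
            ON PRIMARY
                (NAME = RWorld_data,   FILENAME = 'C:\MSSQL7\DATA\RWorld_data.mdf'),
            FILEGROUP OrdersFG
                (NAME = RWorld_orders, FILENAME = 'C:\MSSQL7\DATA\RWorld_orders.ndf')
            LOG ON
                (NAME = RWorld_log,    FILENAME = 'C:\MSSQL7\DATA\RWorld_log.ldf')";

        using (SqlConnection conn = new SqlConnection(
            "Server=localhost;Database=master;Integrated Security=SSPI"))
        {
            conn.Open();
            // Tables and indexes can later be mapped to OrdersFG with
            // CREATE TABLE ... ON OrdersFG to control physical data placement.
            new SqlCommand(createDb, conn).ExecuteNonQuery();
        }
    }
}
```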
Space Management
File Shrink
SQL Server shrinks files by moving rows from pages at the end of
the file to pages allocated earlier in the file. In an index, nodes
are moved from the end of the file to pages at the beginning of the
file. In both cases pages are freed at the end of files and then
returned to the file system. Databases can only be shrunk to the
point that no free space is remaining; there is no data
compression.
File Grow
There are seven types of pages in the data files of a SQL Server 7.0 database.
A torn page occurs when only part of an 8-KB page is written to disk, for example during a power failure. There are several ways to deal with this. One way is to use battery-backed, cached I/O devices that guarantee all-or-nothing I/O. If you have one of these systems, torn page detection is unnecessary.
In SQL Server 7.0, you can enable torn page detection for a
particular database by turning on a database option.
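For example, the option can be switched on from client code through the sp_dboption system procedure used in SQL Server 7.0. This is a minimal sketch; the connection string and the database name are hypothetical.

```csharp
// Minimal sketch: enabling torn page detection for one database; names are hypothetical.
using System.Data.SqlClient;

class EnableTornPageDetection
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Server=localhost;Database=master;Integrated Security=SSPI"))
        {
            conn.Open();
            // sp_dboption is the SQL Server 7.0-era way of turning a database option on or off.
            new SqlCommand(
                "EXEC sp_dboption 'RWorld', 'torn page detection', 'true'", conn)
                .ExecuteNonQuery();
        }
    }
}
```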
Locking Enhancements
Row-Level Locking
Dynamic Locking
Lock Modes
Overview
Table Organization
The data for each table is now stored in a collection of 8-KB data
pages. Each data page has a 96-byte header containing system
information such as the ID of the table that owns the page and
pointers to the next and previous pages for pages linked in a list.
A row-offset table is at the end of the page. Data rows fill the rest
of the page.
SQL Server 7.0 tables use one of two methods to organize their
data pages:
Table Indexes
Clustered Indexes
Nonclustered Indexes
Unicode Data
The new data types that support Unicode are ntext, nchar, and
nvarchar. They are the same as text, char, and varchar, except for
the wider range of characters supported and the increased storage
space used.
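The choice between the Unicode and non-Unicode types is made per column. The sketch below contrasts the two; the table and column names are hypothetical.

```csharp
// Minimal sketch contrasting Unicode and non-Unicode column types; names are hypothetical.
using System.Data.SqlClient;

class CreateCustomerTable
{
    static void Main()
    {
        const string createTable = @"
            CREATE TABLE Customer (
                CustomerId  int          NOT NULL PRIMARY KEY,
                Name        nvarchar(50) NOT NULL,  -- Unicode: two bytes per character
                PinCode     varchar(10)  NULL       -- non-Unicode: one byte per character
            )";

        using (SqlConnection conn = new SqlConnection(
            "Server=localhost;Database=RWorld;Integrated Security=SSPI"))
        {
            conn.Open();
            new SqlCommand(createTable, conn).ExecuteNonQuery();
        }
    }
}
```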
Normalization
SQL
The common type system defines how types are declared, used,
and managed in the runtime, and is also an important part of
the runtime's support for cross-language integration. The
common type system performs the following functions:
Type Members
Value Types
Classes
Delegates
Arrays
Interfaces
Pointers
Cross-Language Interoperability
In This Section
Language Interoperability
Console applications.
ASP.NET applications.
Windows services.
Choosing a Compiler
Compiling translates your source code into MSIL and generates the
required metadata.
Assemblies Overview
If you develop and publish your own XML Web service, the .NET
Framework provides a set of classes that conform to all the
underlying communication standards, such as SOAP, WSDL, and
XML. Using those classes enables you to focus on the logic of your
service, without concerning yourself with the communications
infrastructure required by distributed software development.
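A minimal XML Web service built on those classes can look like the sketch below; the service class, namespace URI and method are hypothetical, and the framework handles the SOAP/WSDL plumbing.

```csharp
// Minimal sketch of an ASP.NET (.asmx) XML Web service; names are hypothetical.
using System.Web.Services;

[WebService(Namespace = "http://example.com/rworld/")]
public class ItemService : WebService
{
    // Exposed over SOAP and described by WSDL automatically; only business logic is written here.
    [WebMethod]
    public decimal GetItemPrice(int itemId)
    {
        // A real implementation would query the database; a constant stands in for the sketch.
        return 42.50m;
    }
}
```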
Developing Components
Processing Transactions
Securing Applications
Serializing Objects
Threading
Introduction to ASP.NET
Building Applications
Deploying Applications
Shows how to use the .NET Framework and the common
language runtime to create self-described, self-contained
applications.
Configuring Applications
Enabling Profiling
When you create Web Forms pages, you can use these types of controls: HTML server controls, Web server controls, validation controls, and user controls.
You can use all types of controls on the same page. The following sections provide more detail about ASP.NET server controls. For more information about validation controls, see Web Forms Validation; for information about user controls, see Introduction to Web User Controls.
HTML server controls offer the following features:
The object model for HTML server controls maps closely to that of the corresponding HTML elements. For example, HTML attributes are exposed in HTML server controls as properties.
A set of events for which you can write event handlers in much the same way you would in a client-based form, except that the event is handled in server code.
Support for HTML 4.0 styles if the Web Forms page is displayed in a browser that supports cascading style sheets.
Pass-through of custom attributes. You can add any attributes you need to an HTML server control, and the page framework will read them and render them without any change in functionality. This allows you to add browser-specific attributes to your controls.
For details about how to convert an HTML element to an HTML server control, see Adding HTML Server Controls to a Web Forms Page.
Web server controls offer all of the features described above for
HTML server controls (except one-to-one mapping to HTML
elements) and these additional features:
When the Web Forms page runs, the Web server control is
rendered on the page using appropriate HTML, which often
depends not only on the browser type but also on settings that
you have made for the control. For example, a Textbox control
might render as an <INPUT> tag or a <TEXTAREA> tag,
depending on its properties.
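The rendering behaviour described above can be seen by changing a single property. The sketch below assumes a Web Forms code-behind page and a TextBox declared in the markup; the page and control names are hypothetical.

```csharp
// Minimal sketch: the same Web server control renders as <input> or <textarea>
// depending on its TextMode property. Names are hypothetical.
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public class OrderRemarksPage : Page
{
    // Declared in the .aspx markup as <asp:TextBox id="Remarks" runat="server" />.
    protected TextBox Remarks;

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);
        // Default SingleLine mode renders <input type="text">; MultiLine renders <textarea>.
        Remarks.TextMode = TextBoxMode.MultiLine;
        Remarks.Rows = 4;
    }
}
```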
Chapter 5
Design Document
In this chapter, the structural and behavioral aspects of the environment in which the system is to be implemented are represented.
Databases used

High-level Diagram
Customer
Use cases: login information, customer registration, query for existing items, raising an order.
Sales staff
These are internal actors within the system; they execute the sales process, specifically for the orders raised by the customers. Use cases: login information, query for customer orders, query for items inventory, generate bill.
Internal Administrators: These are the actors who have overall control of, and responsibility for, the data maintenance of the system. They are in charge of all consistent data transactions that are executed upon the system. Use cases: login information, register outlets, register stock depots, register items, maintain stores inventory.
Elaborated diagram for Customer
Request for raising an order uses: generate the order number, validate the retail outlet ID, validate the data fields, display, store.
Elaborated diagram for Sales Staff
Query for customer orders uses: enter the order number, validate the fields, display.
Bill generation uses: select the customer order number, generate the bill number, check all the ordered items, generate the bill, display.
Elaborated diagram for Internal Administrator
Request for outlet registration uses: generate the outlet ID, enter the required data fields, validate, store.
Request for item registration uses: generate the item ID, enter the required data fields, validate, store.
Stores inventory uses: enter the inventory ID, validate the field, display.
Class Collaboration for: Retail outlet, Major stores inventory, Shelf inventory and Customer orders
Category master
Attributes: item-category-id : number; item-category-name : varchar2; item-category-description : varchar2
Operations: Insert(), Delete(), Update(), Search()
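The Category master class above translates directly into a class definition. A minimal C# sketch is given below; the persistence calls are placeholders only.

```csharp
// Minimal sketch of the Category master class; method bodies are placeholders.
class CategoryMaster
{
    public int    ItemCategoryId;           // item-category-id : number
    public string ItemCategoryName;         // item-category-name : varchar2
    public string ItemCategoryDescription;  // item-category-description : varchar2

    public void Insert() { /* insert this category into the category master table */ }
    public void Delete() { /* delete this category */ }
    public void Update() { /* update this category's details */ }
    public void Search() { /* look up a category by id or name */ }
}
```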
Customer Bill Generation collaboration
Login sequence: enter login name(), enter password(), validate name(), validate password(), display.
Customer account registration sequence: request for customer account registration, Insert(), generate Cust-Acc-NO(), accept fields(), validate data fields(), commit.
Customer Item order sequence
Request for an item order: Insert(), generate item order NO(), validate retail outlet ID(), accept fields(), validate data fields(), commit().
Bill generation sequence
Request for bill generation: Insert(), generate Bill No(), validate custord No(), validate sales person id(), validate Discount ID(), accept data fields(), commit().
Chapter 6
Coding
Program Design Language
Psychology of Testing
The aim of testing is often taken to be demonstrating that a program works by showing that it has no errors. However, the basic purpose of the testing phase is to detect the errors that may be present in the program. Hence one should not start testing with the intent of showing that a program works; the intent should be to show that a program does not work. Testing is the process of executing a program with the intent of finding errors.
Testing Objectives
The main objective of testing is to uncover a host of errors, systematically and with minimum effort and time. Stating formally, we can say:
Testing is a process of executing a program with the intent of finding an error.
A good test case is one that has a high probability of finding an error, if it exists.
A successful test helps establish that the software conforms to its quality and reliability standards.
Levels of Testing
In order to uncover the errors present in different phases we have the
concept of levels of testing. The basic levels of testing are as shown below…
Requirements  -  System Testing
Design        -  Integration Testing
Code          -  Unit Testing
System Testing
The philosophy behind testing is to find errors. Test cases are devised with this in mind. A strategy employed for system testing is code testing.
Code Testing:
This strategy examines the logic of the program. To follow this method, we developed test data that resulted in executing every instruction in the program and module, i.e. every path is tested. Systems are not designed as entire units, nor are they tested as single systems. To ensure that the coding is perfect, two types of testing are performed on all systems.
Types Of Testing
Unit Testing
Link Testing
Unit Testing
Unit testing focuses verification effort on the smallest unit of software, i.e. the module. Using the detailed design and the process specifications, testing is done to uncover errors within the boundary of the module. All modules must pass the unit test before integration testing begins.
In this project, each service can be thought of as a module. There are several modules, such as Login, HWAdmin, MasterAdmin, Normal User and PManager. Each module has been tested by giving different sets of inputs, both while developing the module and after finishing its development, so that each module works without any error. The inputs are validated when accepted from the user.
Link Testing
Link testing does not test the software itself but rather the integration of each module into the system. The primary concern is the compatibility of the modules. The programmer tests cases where modules are designed with different parameters, lengths, types, etc.
Integration Testing
After unit testing, we have to perform integration testing. The goal here is to see if the modules can be integrated properly, the emphasis being on testing the interfaces between modules. This testing activity can be considered as testing the design, and hence the emphasis is on testing module interactions.
In this project, integrating all the modules forms the main system. When integrating all the modules, I have checked whether the integration affects the working of any of the services, by giving different combinations of inputs with which the services ran perfectly before integration.
System Testing
Here the entire software system is tested. The reference document for this process is the requirements document, and the goal is to see if the software meets its requirements.
Here the entire system has been tested against the requirements of the project, and it has been checked whether all the requirements of the project have been satisfied or not.
Acceptance Testing
Acceptance Test is performed with realistic data of the client to demonstrate that
the software is working satisfactorily. Testing here is focused on external behavior
of the system; the internal logic of program is not emphasized.
White Box Testing
In this method the internal logic of the code is tested thoroughly at a statement level to find the maximum possible errors. I tested step-wise every piece of code, taking care that every statement in the code is executed at least once. White box testing is also called glass box testing.
I have generated a list of test cases and sample data, which are used to check all possible combinations of execution paths through the code at every module level.
Black Box Testing
This testing method considers a module as a single unit and checks the unit at its interface and in its communication with other modules, rather than getting into details at the statement level. Here the module is treated as a black box that takes some input and generates output. Outputs for a given set of input combinations are forwarded to other modules.