The Role of Information Systems
Information systems collect, store, organize, and distribute data throughout the organization.
In fact, we might say that one of the roles of information systems is to take data and turn it into
information, and then transform that into organizational knowledge. As technology has
developed, this role has evolved into the backbone of the organization. To get a full appreciation
of the role information systems play, we will review how they have changed over the years.
From the late 1950s through the 1960s, computers were seen as a way to more efficiently do
calculations. These first business computers were room-sized monsters, with several refrigerator-
sized machines linked together. The primary work of these devices was to organize and store
large volumes of information that were tedious to manage by hand. Only large businesses,
universities, and government agencies could afford them, and they took a crew of specialized
personnel and specialized facilities to maintain. These devices served dozens to hundreds of
users at a time through a process called time-sharing. Typical functions included scientific
calculations and accounting, under the broader umbrella of “data processing.”
In the late 1960s, Material Requirements Planning (MRP) systems were introduced. This
software, running on a mainframe computer, gave companies the ability to manage the
manufacturing process, making it more efficient. From tracking inventory to creating bills of
materials to scheduling production, the MRP systems (and later the MRP II systems) gave more
businesses a reason to want to integrate computing into their processes. IBM became the
dominant mainframe company. Nicknamed “Big Blue,” the company became synonymous
with business computing. Continued improvement in software and the availability of cheaper
hardware eventually brought mainframe computers (and their little sibling, the minicomputer)
into most large businesses.
The PC Revolution
In 1975, the first microcomputer was announced on the cover of Popular Electronics: the Altair
8800. Its immediate popularity sparked the imagination of entrepreneurs everywhere, and there
were quickly dozens of companies making these “personal computers.” Though at first just a
niche product for computer hobbyists, improvements in usability and the availability of practical
software led to growing sales. The most prominent of these early personal computer makers was
a little company known as Apple Computer, headed by Steve Jobs and Steve Wozniak, with the
hugely successful "Apple II." Not wanting to be left out of the revolution, in 1981 IBM (teaming
with a little company called Microsoft for its operating-system software) hurriedly released its
own version of the personal computer, simply called the "PC." Businesses, which had used
IBM mainframes for years to run their operations, finally had the permission they needed to
bring personal computers into their companies, and the IBM PC took off. Time magazine
named the computer its 1982 "Machine of the Year."
Because of the IBM PC’s open architecture, it was easy for other companies to copy, or "clone,"
it. During the 1980s, many new computer companies sprang up, offering less expensive versions
of the PC. This drove prices down and spurred innovation. Microsoft developed its Windows
operating system and made the PC even easier to use. Common uses for the PC during this
period included word processing, spreadsheets, and databases. These early PCs were not
connected to any sort of network; for the most part they stood alone as islands of innovation
within the larger organization.
Client-Server
In the mid-1980s, businesses began to see the need to connect their computers together to
collaborate and share resources. This networking architecture was referred to as “client-server”
because users would log in to the local area network (LAN) from their PC (the “client”) by
connecting to a powerful computer called a “server,” which would then grant them rights to
different resources on the network (such as shared file areas and a printer). Software companies
began developing applications that allowed multiple users to access the same data at the same
time. This evolved into software applications for communicating, with electronic mail seeing its
first widespread use at this time.
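To make the client-server idea concrete, here is a minimal sketch (ours, not a description of any actual system from this era) using Python's standard socket library: one server process holds a shared resource, and client programs connect over the network to request it. The host address, port, and messages are hypothetical.

import socket

HOST, PORT = "127.0.0.1", 9000  # hypothetical address for a local demonstration

def run_server():
    """Act as the 'server': hold a shared resource and answer client requests."""
    with socket.create_server((HOST, PORT)) as server:
        while True:
            conn, addr = server.accept()           # wait for a client PC to connect
            with conn:
                request = conn.recv(1024).decode()
                print(f"Client {addr} requested: {request!r}")
                conn.sendall(b"contents of the shared file area")

def run_client():
    """Act as a 'client' PC: connect to the server and request the shared resource."""
    with socket.create_connection((HOST, PORT)) as conn:
        conn.sendall(b"GET shared-file")
        print("Server replied:", conn.recv(1024).decode())

Running run_server() in one process and run_client() in another mimics, in miniature, a PC on the local area network requesting a file from the departmental server.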
This networking and data sharing all stayed within the confines of each business, for the most
part. While there was sharing of electronic data between companies, this was a very specialized
function. Computers were now seen as tools to collaborate internally, within an organization. In
fact, these networks of computers were becoming so powerful that they were replacing many of
the functions previously performed by the larger mainframe computers at a fraction of the cost. It
was during this era that the first Enterprise Resource Planning (ERP) systems were developed
and run on the client-server architecture. An ERP system is a software application with a
centralized database that can be used to run a company’s entire business. With separate modules
for accounting, finance, inventory, human resources, and many, many more, ERP systems,
with Germany’s SAP leading the way, represented the state of the art in information systems
integration. We will discuss ERP systems as part of the chapter on process (chapter 9).
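As a toy illustration (not drawn from any actual ERP product) of the "separate modules, one centralized database" idea, the Python sketch below has an inventory module and an accounting module working against the same SQLite database; the table and function names are invented for the example.

import sqlite3

db = sqlite3.connect(":memory:")  # the single, centralized database all modules share
db.execute("CREATE TABLE items (sku TEXT PRIMARY KEY, qty INTEGER, unit_cost REAL)")

def inventory_receive(sku, qty, unit_cost):
    """Inventory module: record stock arriving at the warehouse."""
    db.execute(
        "INSERT INTO items VALUES (?, ?, ?) "
        "ON CONFLICT(sku) DO UPDATE SET qty = qty + excluded.qty",
        (sku, qty, unit_cost),
    )

def accounting_inventory_value():
    """Accounting module: value that same stock, read straight from the shared tables."""
    (value,) = db.execute("SELECT SUM(qty * unit_cost) FROM items").fetchone()
    return value or 0.0

inventory_receive("WIDGET-1", 100, 2.50)
print(accounting_inventory_value())  # 250.0 -- both modules see the same data

Because every module reads and writes the same tables, a transaction recorded by one part of the business is immediately visible to accounting, inventory, and the rest, which is the integration that made ERP systems so valuable.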
The World Wide Web and E-Commerce
Invented in 1969, the Internet was confined to use by universities, government agencies,
and researchers for many years. Its rather arcane commands and user applications made it
unsuitable for mainstream use in business. One exception to this was the ability to expand
electronic mail outside the confines of a single organization. While the first e-mail messages on
the Internet were sent in the early 1970s, companies that wanted to extend their LAN-based
e-mail beyond their own walls began connecting their internal networks to the Internet in the
1980s, allowing their employees to communicate with employees at other companies. It was
with these early Internet connections that the computer
truly began to evolve from a computational device to a communications device.
In 1989, Tim Berners-Lee developed a simpler way for researchers to share information over the
network at CERN laboratories, a concept he called the World Wide Web.[4] This invention
became the launching point of the growth of the Internet as a way for businesses to share
information about themselves. As web browsers and Internet connections became the norm,
companies rushed to grab domain names and create websites.
In 1991, the National Science Foundation, which governed how the Internet was used, lifted
restrictions on its commercial use. Amazon.com was founded in 1994 and eBay in 1995, two
true pioneers in the use of the new digital marketplace. A mad rush of investment in
Internet-based businesses led to the dot-com boom through the late 1990s, and then to the
dot-com bust in 2000. While much can be learned from the speculation and crazy economic
theories espoused during that bubble, one important outcome for businesses was that thousands
of miles of Internet connections were laid around the world during that time. The
world became truly “wired” heading into the new millennium, ushering in the era of
globalization, which we will discuss in chapter 11.
As it became more expected for companies to be connected to the Internet, the digital world also
became a more dangerous place. Computer viruses and worms, once slowly propagated through
the sharing of computer disks, could now grow with tremendous speed via the Internet. Software
written for a disconnected world found it exceedingly difficult to defend against these sorts of
threats. A whole new industry of computer and Internet security arose. We will study
information security in chapter 6.
Web 2.0
As the world recovered from the dot-com bust, the use of technology in business continued to
evolve at a frantic pace. Websites became interactive; instead of just visiting a site to find out
about a business and purchase its products, customers wanted to be able to customize their
experience and interact with the business. This new type of interactive website, where you did
not have to know how to create a web page or do any programming in order to put information
online, became known as Web 2.0. Web 2.0 is exemplified by blogging, social networking, and
interactive comments on many websites. This new Web 2.0 world, in which
online interaction became expected, had a big impact on many businesses and even whole
industries. Some industries, such as bookstores, found themselves relegated to a niche status.
Others, such as video rental chains and travel agencies, simply began going out of business as
they were replaced by online technologies. This process of technology replacing a middleman in
a transaction is called disintermediation.
As the world became more connected, new questions arose. Should access to the Internet be
considered a right? Can I copy a song that I downloaded from the Internet? How can I keep
information that I have put on a website private? What information is acceptable to collect from
children? Technology moved so fast that policymakers did not have enough time to enact
appropriate laws, making for a Wild West–type atmosphere. Ethical issues surrounding
information systems will be covered in chapter 12.
The Post-PC World
After thirty years as the primary computing device used in most businesses, the PC is now
seeing its sales decline as sales of tablets and smartphones take off. Like the mainframe
before it, the PC will continue to play a key role in business but will no longer be the
primary way that people interact and do business. The limited storage and processing power of
these mobile devices is being offset by a move to "cloud" computing, which allows for storage, sharing,
and backup of information on a massive scale. This will require new rounds of thinking and
innovation on the part of businesses as technology continues to advance.