
Evolution of Information System Function

An information system is a combination of processes, hardware, trained personnel, software, infrastructure and standards designed to create, modify, store, manage and distribute information in support of new business strategies and products. It enables efficient work practices and effective communication, leading to better decisions in an organization. The Information System function has evolved significantly over the past few decades.

The evolution of Information System function can be summarized as follows:

| Period | IS Role | Description | Who it helps |
| --- | --- | --- | --- |
| 1950 – 1960 | Data Processing | Collects, stores, modifies and retrieves the day-to-day transactions of an organization | Workers |
| 1960 – 1970 | Management Reporting | Pre-specified reports and displays to support business decision-making | Middle managers |
| 1970 – 1980 | Decision Support | Interactive ad-hoc support for the decision-making process | Senior managers |
| 1980 – 1990 | Executive Support | Both internal and external information relevant to the strategic goals of the organization | Executives |
| 1990 – 2000 | Knowledge Management | Supports the creation, organization and dissemination of business knowledge | Enterprise-wide |
| 2000 – Present | E-Business | Greater connectivity, higher level of integration across applications | Global e-business |


1950 – 1960: Electronic Data Processing, Transaction Processing System

During this period, the role of IS was mostly to perform activities like transaction processing, recordkeeping and accounting. IS was mainly used for electronic data processing (EDP).

EDP is described as the use of computers in recording, classifying, manipulating, and summarizing data. It is also called information processing or automatic data processing.

Transaction Processing Systems (TPS) were the first computerized systems developed to process business data. TPS were mainly aimed at the clerical staff of an organisation. Early TPS used batch processing: data was accumulated over a period, and all transactions were processed afterward.

A TPS collects, stores, modifies and retrieves the day-to-day transactions of an organization. Usually, a TPS computerizes or automates an existing manual process to allow for faster processing, improved customer service and reduced clerical costs. Examples of TPS are cash deposit, automated teller machine (ATM), payment order and accounting systems. TPS that handle transactions as they occur are also described as real-time processing systems.
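The batch style of an early TPS can be sketched as follows; the account names and amounts are hypothetical. Transactions accumulate over the period, then a single batch run applies them all, rather than processing each one as it occurs:

```python
# Minimal sketch of batch-style transaction processing (hypothetical data).

def run_batch(balances, transactions):
    """Apply a period's accumulated transactions to account balances."""
    for account, amount in transactions:
        balances[account] = balances.get(account, 0) + amount
    return balances

# Accumulated during the day, processed afterward in one run (batch),
# not one by one as they occur (real time).
day_batch = [("alice", 200), ("bob", -50), ("alice", -75)]
balances = run_batch({"alice": 100, "bob": 500}, day_batch)
print(balances)  # {'alice': 225, 'bob': 450}
```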

1960 to 1970: Management Information Systems

During this era, the role of IS evolved from TPS to Management Information Systems (MIS). MIS process data into useful, informative reports and provide managers with the tools to organize, evaluate and efficiently manage departments within an organization. MIS deliver information in the form of displays and pre-specified reports to support business decision-making. Examples of MIS output are cost trend, sales analysis and production performance reporting systems.

Usually, MIS generates three basic types of information which are:

  • Detailed information reports typically confirm transaction-processing activities. A detailed order report is an example.
  • Summary information consolidates data into a format that an individual can review quickly and easily.
  • Exception information filters data to report only the exceptions; an exception inventory report is an example. Exception reports help managers save time because they do not have to search through a detailed report for exceptions.
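The three report types can be illustrated with a short sketch over hypothetical order records; the field names and the low-stock threshold are assumptions for illustration only:

```python
# Sketch of the three MIS report types over hypothetical order records.
orders = [
    {"id": 1, "item": "widget", "qty": 120, "stock": 300},
    {"id": 2, "item": "gear",   "qty": 40,  "stock": 5},
    {"id": 3, "item": "bolt",   "qty": 900, "stock": 0},
]

# Detailed report: every transaction, confirming processing activity.
detailed = orders

# Summary report: data consolidated for a quick review.
summary = {"orders": len(orders), "total_qty": sum(o["qty"] for o in orders)}

# Exception report: only records outside a normal condition (low stock here).
exceptions = [o for o in orders if o["stock"] < 10]

print(summary)     # {'orders': 3, 'total_qty': 1060}
print(exceptions)  # only the low-stock records (ids 2 and 3)
```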


This period also marked the point when the focus of organizations shifted slowly from merely automating basic business processes to consolidating control within the data-processing function.

1970 to 1980: Decision Support Systems

A major advancement of this era was the introduction of the personal computer (PC), which distributed computing and processing power across the organization. The IS function became associated more strongly with management than with a purely technical approach, and its role focused on “interactive computer-based systems” to aid decision-makers in solving problems.

This new role of information systems, providing interactive ad-hoc support for the decision-making process to managers and other business professionals, is called Decision Support Systems (DSS). DSS serve the planning, management and operations levels of an organization, usually senior management.

DSS use data from internal and/or external sources. Internal sources might include inventory, sales, manufacturing or financial data from an organization’s database. External sources could include pricing, interest rates, or population trends. Managers use a DSS to manipulate the data to support decisions. Examples of DSS are projected revenue figures based on new product sales assumptions, product pricing and risk analysis systems.
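The “projected revenue figures based on new product sales assumptions” example can be sketched as a simple what-if calculation; all figures and the growth-rate assumption are hypothetical:

```python
# Sketch of DSS-style what-if analysis: projected revenue under
# adjustable sales assumptions (all figures hypothetical).

def projected_revenue(units, price, growth_rate, years):
    """Project yearly revenue given a unit-sales growth assumption."""
    revenue = []
    for _ in range(years):
        revenue.append(units * price)
        units *= 1 + growth_rate  # apply the growth assumption each year
    return revenue

# Managers vary the assumptions interactively and compare the outcomes.
optimistic = projected_revenue(units=1000, price=50, growth_rate=0.20, years=3)
pessimistic = projected_revenue(units=1000, price=50, growth_rate=0.05, years=3)
print(optimistic)   # 20% annual growth scenario
print(pessimistic)  # 5% annual growth scenario
```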

1980 to 1990: Executive Information Systems

This period gave rise to departmental computing, as many organisations purchased their own hardware and software to suit their departmental needs. Instead of waiting for the indirect support of a centralized corporate service department, employees could use their own resources to support their job requirements. This trend led to new challenges of data incompatibility, integrity and connectivity across departments. Further, top executives were using neither DSS nor MIS, so executive information systems (EIS), also called executive support systems (ESS), were developed.

EIS offer decision-making facilities to executives by providing both internal and external information relevant to the strategic goals of the organization. They are sometimes considered a specific form of DSS. Examples of EIS are systems providing easy access to competitors’ actions, economic developments to support strategic planning, and analysis of business performance.

1990 to 2000: Knowledge Management Systems

During this era, the rapid growth of intranets, extranets, the Internet and other interconnected global networks dramatically changed the capabilities of IS in business. It became possible to circulate knowledge to different parts of the world irrespective of time and space.

This period also saw the emergence of enterprise resource planning (ERP) systems. ERP is an organization-specific form of strategic information system that incorporates all components of an organisation, including manufacturing, sales, resource management, human resource planning and marketing.

Moreover, there was a breakthrough in the development and application of artificial intelligence (AI) techniques to business information systems. Expert systems (ES) and knowledge management systems (KMS) became interconnected.

An expert system (ES) is a computer system that mimics the decision-making ability of human experts, for example systems that make financial forecasts, diagnose human illnesses or schedule routes for delivery vehicles. A knowledge management system (KMS) is an IT system that stores and retrieves knowledge to support the creation, organization and dissemination of business knowledge within the enterprise. Examples of KMS are feedback databases and helpdesk systems.

An ES can use data from a KMS to generate the information system’s desired output, for example in a loan application approval system.
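A minimal sketch of such a loan approval system follows; the rules, thresholds and applicant fields are entirely hypothetical, standing in for knowledge that would be captured and maintained in a KMS:

```python
# Sketch of a rule-based expert system for loan approval (hypothetical rules).
# The "knowledge base" stands in for rules captured in a KMS.

KNOWLEDGE_BASE = [
    # (condition, verdict) pairs: the first matching rule decides.
    (lambda a: a["credit_score"] < 600,       "reject"),
    (lambda a: a["debt"] > 0.4 * a["income"], "refer to analyst"),
    (lambda a: True,                          "approve"),
]

def evaluate(applicant):
    """Mimic an expert's decision by firing the first matching rule."""
    for condition, verdict in KNOWLEDGE_BASE:
        if condition(applicant):
            return verdict

print(evaluate({"credit_score": 720, "income": 50000, "debt": 10000}))  # approve
print(evaluate({"credit_score": 550, "income": 50000, "debt": 10000}))  # reject
```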

2000 – present: E-Business

The Internet and related technologies and applications changed the way businesses operate and people work. The functions of information systems in this period are much the same as 50 years ago: record keeping, management reporting, transaction processing, management support and managing organizational processes. Information systems are still used to support business processes, decision making and competitive advantage.

The difference is greater connectivity across similar and dissimilar system components. There is better network infrastructure, a higher level of integration of functions across applications, and more powerful machines with higher storage capacity. Many businesses use Internet technologies and web-enabled business processes to create innovative e-business applications. E-business is simply conducting business processes over the Internet.

Evolution of Information Technology

1940s – 1950s: UNIVAC Computer

On June 30, 1945, John von Neumann published the First Draft of a Report on the EDVAC, the first documented discussion of the stored-program concept and the blueprint for computer architecture. Computing in this period used a direct-access architecture with no operating system or remote access and was used mainly for scientific computing. There were no software applications.

1960s- 1970s: Mainframe Computer

In this era, computing used a centralized architecture with the operating system built into the hardware. Computers were mostly mainframes and minicomputers. Remote access was available from client terminals. Applications and data were centralized, and applications were designed to function as silos.

The role of the Chief Information Officer (CIO) was that of operational manager of a specialist function. IT roles included performing printer backups, conducting system upgrades via lengthy procedures, manually running user batch tasks, keeping terminals stocked with paper and swapping out blown tubes.

IT staff worked in separate rooms from other employees, and system interconnectivity was minimal at the time. The desire to bridge these gaps was part of the motivation behind ARPANET.

1980s – 1990s: Personal Computer

This period saw the introduction of the Personal Computer (PC) and a client/server architecture with the operating system separated from the hardware. Applications and data were distributed, with processing shared between clients and distributed servers. Applications were shared throughout the organisation.

This generation of IT staff worked in cubicles onsite, often sharing space with the users they supported. Most employees used PCs with the Windows operating system.

The role of the CIO became that of an organizational designer or technology advisor. Typical IT roles at that time consisted of installing and maintaining file and print servers to automate data storage, retrieval and printing, and installing and upgrading DOS on PCs. IT support handled network maintenance, PC email support, Windows and Microsoft Office installations, and adding memory or graphics cards.

Toward the end of the 1990s, Internet connectivity became the most requested computing resource among growing businesses. Employers worried about productivity and often limited Web access.

Connecting people in a vast, distributed network of computers not only increased the amount of data generated but also led to numerous new ways of getting value out of it through new enterprise applications. Data mining helped analyse data from different perspectives and summarize it into useful information.

IT staff believed in building their own IT infrastructures from components sold by focused, specialized IT vendors such as Intel in semiconductors, Microsoft in operating systems, Oracle in databases, Cisco in networking, Dell in PCs and EMC in storage.

2000s – present: Mobile

This era uses a web-services architecture with virtualized operating systems. While computer networks took IT from the accounting department to all corners of the enterprise, the World Wide Web took IT to all corners of the globe, connecting millions of people. The Web led to a proliferation of new applications that were no longer limited to enterprise-related activities but digitized almost any activity in people’s lives. It greatly facilitated the creation and sharing of information by anyone with access to the Internet, which in turn increased the amount of data created, stored, moved and consumed.

Today applications, data and services are distributed. Processing is shared and utilized intelligently within grid-computing or P2P application. Applications are shared between clients, suppliers and external partners.

Mobile and cloud computing now run much of the infrastructure. Cloud computing enables convenient, on-demand network access to a shared pool of computing resources such as applications, networks, servers, storage and services that can be provisioned quickly and released with minimal management effort or service provider interaction.

Big data and its analysis are becoming an organisation’s competitive edge. Big data is a collection of data from traditional and digital sources inside and outside of an organization that represents a source for ongoing discovery and analysis.

The role of the CIO is now that of a business visionary and master outsourcer. Today’s IT job roles include support for Bring Your Own Device (BYOD), introduction of social media for sales and marketing (and the blocking of its access at work for personal use), constant security patching and DevOps automation.

Service Level Agreement and its role

A Service Level Agreement (SLA) is an agreement between an IT service provider and an IT customer or supplier. It may be a legally binding formal contract or an informal “contract” (for example, between internal departments). SLAs are important for continuous improvement and vital in moving towards partnership relations. Corporate IT organizations, particularly those that have embraced IT Service Management (ITSM), enter SLAs with their in-house customers (users in other departments within the enterprise). An IT department creates an SLA so that its services can be measured, justified and sometimes compared with those of outsourcing vendors.

SLAs measure the service provider’s performance and quality in a number of ways. Some metrics that SLAs may specify include:

  • Availability and uptime — the percentage of the time services will be available
  • The number of concurrent users that can be served
  • Specific performance benchmarks to which actual performance will be periodically compared
  • Application response time
  • The schedule for notification in advance of network changes that may affect users
  • Help desk response time for various classes of problems
  • Usage statistics that will be provided.
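The availability/uptime metric above can be checked with a short calculation; the 99.9% target and the downtime figure are hypothetical examples, not values from any particular SLA:

```python
# Sketch of checking an availability/uptime SLA metric (hypothetical figures).

def availability(total_minutes, downtime_minutes):
    """Percentage of time the service was available."""
    return 100 * (total_minutes - downtime_minutes) / total_minutes

# A 30-day month has 43,200 minutes; a 99.9% target allows ~43 minutes down.
month = 30 * 24 * 60
measured = availability(month, downtime_minutes=50)
print(round(measured, 3), "meets 99.9% target:", measured >= 99.9)
```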

In addition to establishing performance metrics, an SLA may include a plan for addressing downtime and documentation of how the service provider will compensate customers in the event of a contract breach. Once established, SLAs should be periodically reviewed and updated to reflect changes in technology and the impact of any new regulatory directives.

The value of an SLA is that it helps to facilitate a “service culture” within which quality-control standards can operate. In effect, it unifies the aims of the provider and the user. It commits the user to forecasting volumes and other operating conditions, in return for which the provider commits to an agreed level of service, quality and cost.

Business Process Management

Business Process Management includes features that manage person-to-person process steps, system-to-system process steps, and those processes that include a combination of both.

  • BPM tools include process modeling, simulation, code generation, process execution, monitoring, and integration capabilities for both company-based and web-based systems.
  • The tools allow an organization to actively manage and improve its processes from beginning to end.
  • BPM systems are a way to build, execute, and monitor automated processes that span organizational boundaries.

The goal of BPM is to reduce human error and miscommunication and focus stakeholders on the requirements of their roles. BPM is a subset of infrastructure management, an administrative area concerned with maintaining and optimizing an organization’s equipment and core operations.
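A minimal sketch of the build-execute-monitor idea follows; the process and step names are hypothetical, and a real BPM suite adds modeling, simulation and integration on top of this:

```python
# Sketch of a BPM-style engine: build, execute, and monitor a process whose
# steps may be automated (system-to-system) or manual (person-to-person).
# All process and step names are hypothetical.

def process(name, steps):
    """Run steps in order, recording a monitoring log entry for each one."""
    log = []
    for step_name, action in steps:
        action()  # a real engine would dispatch to a system or a person here
        log.append(f"{name}: completed '{step_name}'")
    return log

invoice_process = [
    ("validate order",   lambda: None),  # system-to-system step
    ("manager sign-off", lambda: None),  # person-to-person step
    ("send invoice",     lambda: None),
]
for entry in process("invoicing", invoice_process):
    print(entry)
```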

BPM is often a point of connection within a company between the line of business (LOB) and the IT department. Business Process Execution Language (BPEL) and Business Process Model and Notation (BPMN) were both created to facilitate communication between IT and the LOB.

There are three kinds of BPM frameworks available in the market today. Horizontal frameworks deal with the design and development of business processes and are generally focused on technology and reuse. Vertical frameworks focus on a specific set of coordinated tasks and have pre-built templates that can be readily configured and deployed. Full BPM software suites and other technology help to standardize and accelerate daily operations such as invoicing and shipping, making them faster and more accurate.

Business process management is significantly different from ERP. BPM is usually applied to the daily operations needed to run the business, whereas ERP consolidates the system components of business processes under one umbrella. ERP covers financials, possibly customer relationship management, inventory control, human resources or human capital management, and payroll, to name a few.

How to deal with legacy systems?

A legacy system you have to deal with is one where the costs and risks are higher than the value. However, many IT leaders don’t make those calculations until they run into problems. Subsequently, they find they’re facing one of three problems:

  • The system costs too much to run
  • It no longer supports business needs
  • It relies on technology for which skills are hard to find

Moving from outdated applications and systems to newer models is frequently a complex, high-risk exercise but one required for gaining the agility, cost savings and improved user experiences that IT leaders are expected to deliver. Application modernization can be a time-consuming and costly initiative, but ultimately it can also slash expenses and streamline tasks. Integration problems and mismatched skills are typical of the challenges legacy infrastructure can present. Companies are turning to agile development techniques in order to increase efficiency and bring new products to market faster. They’re leaving non-relational databases behind, using data warehousing to get more intelligence from core applications.

Many companies are now turning to cloud, big data and collaboration. Companies are looking for something new to increase competitive advantage and save money.

Benefits of collaborative work

Collaboration is at the heart of business today. Information technology (IT) advancements now parallel organizational structures: flatter management with greater emphasis on teams and on collaboration across disciplines, time and space. Technological tools for communication and interaction, problem-solving, and knowledge management, together with IT-based collaboration, take work to the next level. Collaboration changes the process by altering who can participate, how they participate and even the nature of the work itself.

Key Benefits

  • More efficient and cost-effective way to provide access to company information
  • New tools – Infrastructure for collaboration
  • Fostering a sense of belonging

Intranets and other enterprise-wide collaborative platforms are evolving into important enterprise structures. They typically host:

  • Corporate mission and values
  • Internal forms, rules and processes
  • Internal news (which can be interactive)

Intranets can provide the foundation for creating corporate culture and climate by giving a means for communication and creating communities.

How the IS function can help to provide the right support

The IS function can introduce the facilities and processes of Knowledge Asset Management (KAM) to aid decision support and collaboration. The CIO’s job is to provide the technology to support online communities and collaboration.

The use of technology to support decision making covers a variety of functions, including alerts, recommendations, or decision making itself. IS managers must comprehend the opportunities and constraints of these technologies:

  • Real-time data and real-time performance metrics: focus on high-value-added data, and identify the key activities and performance indicators that are needed in real time.
  • Technology readiness: substantial computing resources and an integrated, seamless system capable of selecting, filtering and compiling data and sending it in real time to designated users on demand.

