Before discussing the merits of the approaches, some simple definitions are required. Much debate and system-based argument centres on the approach to data, so forgive me the time taken to place definitions on the terms before discussing the merits, advantages and disadvantages of the processes. An understanding of what each of these terms actually means, and of what can be expected as a result, is critical.
Definition: Real-Time Data
Real-time data is defined as access to data as and when it is updated. In a transactional environment, this is the split second after the transaction is committed and processed. In a time series environment, it is the split second after the passing of a time segment in the series.
Example: Transactional
An order entry clerk receives a telephonic order from a customer. While he is capturing the order, the transaction is in essence dormant; only once he presses the save button are the transaction(s) committed and updated to the system. Immediately the stock picking slip is generated; once the stock is drawn, a delivery note is processed; an invoice is issued and updated into the Debtors, General Ledger and stock systems. This is real-time access to data. A real-time system would be able to report immediately after each event.
Example: Time Series
A PLC (Programmable Logic Controller) on a
manufacturing plant monitors the flow and temperature of chemicals
through the plant. This information is fed to analogue dials and is
simultaneously displayed on a graph display in the controller’s office.
This real-time system updates and displays the information immediately
after each sample is taken.
Definition: Real-Time Access to Information
Real-time access to information can be defined as any process which allows the user to interface with information via a direct request-and-response query or refresh mechanism. The information being refreshed may come from a real-time data source, or from other processes which accumulate, interrogate, update and correlate the information.
Example: Transactional
The daily sales at a retail outlet are uploaded to a central server overnight. The sales manager can at any point draw reports and graphs from this central server. The manager gains real-time access to information: he may draw any information stored on the server at any point in time.
Example: Time Series
The quality control department in the
manufacturing plant uploads statistical data from multiple sources in
the plant on an hourly basis. This statistical data is analyzed through
a labyrinth of statistical algorithms and gives a statistical “health
report”. The various elements of the health report are stored in a
central database and can be further analyzed for variances by the
Quality Control team.
Definition: Non-Real-Time Data or Information
Non-real-time data or information is data or information that is produced periodically through processes over which the user has little or no interaction or control. A simple example of this is a monthly management report produced by the finance department. The user can neither cause the report to be generated, nor alter its contents, without going through an intermediary.
Data versus Information
In layman’s terms, data can be defined as a large quantity of seemingly uncorrelated facts that, in and of themselves, may or may not be useful. Information is distilled data: it extracts the essence of the data without overwhelming the viewer.
A simple example:
A detailed breakdown of the approximate number and quantity of different insect species, with their respective scientific names, living around the world by region, is data. The fact that there are three cockroaches in your sink is information. If you are an entomologist (a scientist who studies insects), the data may be of some use to you, but you would probably still have to analyze it further for it to be of real use. For the rest of us, three cockroaches is useful information, on which we can act.
A five-second time series of the fuel consumption of your motor vehicle would be data. It may prove interesting that between 7:30 and 8:55 your fuel consumption increased from 8 litres per 100 kilometres to 12 litres per 100 kilometres. But it would be far more useful to be told that your average fuel consumption between month one and month two had risen from 7.67 litres/100 km to 9.35 litres/100 km, which would indicate either excessive loads or that your vehicle needs a service. The data may be useful to an aircraft engineer, but to the average man in the street…
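The distillation step this example describes can be sketched in a few lines of Python. This is only an illustration, not any particular system's implementation: the raw samples below are invented, chosen so that the monthly averages come out at the figures quoted above.

```python
# Hypothetical sketch: distilling raw fuel-consumption samples (data)
# into a monthly average (information). All sample values are invented.
from statistics import mean

# Five-second consumption samples in litres per 100 km, keyed by month.
samples = {
    "month_1": [7.5, 7.8, 7.6, 7.9, 7.55],
    "month_2": [9.1, 9.4, 9.5, 9.3, 9.45],
}

# One average per month: the distilled, actionable information.
monthly_average = {
    month: round(mean(values), 2) for month, values in samples.items()
}
# monthly_average is {"month_1": 7.67, "month_2": 9.35}
```

The raw series may hold thousands of points; the two averages are what the driver can actually act on.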
I define the 21st Century Myth as follows: modern man is pelted and snowed under by information. Everywhere we turn we are quite literally pummeled with information. Therefore the modern executive needs systems and processes to simplify this information into manageable chunks. The myth is simple: are we pummeled by information, or by data? Have we, through our technology, succeeded in simplifying our world or in complicating it further? Have we substituted accuracy and/or volume for understanding?
The simplest rule of scientific study is: until analyzed, it is only data. Until the essence has been extracted and checked for aberrations (variations which alter the data in a non-meaningful way), all you have is data: useful data perhaps, enlightening perhaps, but just data.
Information Cycles like Business Cycles go through phases, and tend to
over correct at the peak of each cycle. With the rapid increase in both
processing power and communication (both personal and computer
networking) over the last twenty-odd years, we have witnessed an exponential growth in access to data; the swell became a wave, became a tsunami of data, washing all before it. As data proliferates and the
average businessman attempts to maintain a balance between
paralysis-by-analysis and proactive response, the screams have
slowly risen above the noise of the tsunami. The screams have been
heard, and the technology sector has swung into action.
The response to too much data: simplify it, COMPLETELY. “Don’t worry, Mr Executive. We will simplify it and present you with a dashboard, like your motor car’s.” And if you want to take it further, we have a three-letter acronym (TLA) which we can implement: KPIs. Key Performance Indicators through which you can manage your business; that way you won’t have to deal with any of the messy data.
In truth, KPIs are a useful tool, as are dashboards. But they are not the answer to every ill, just as too much data wasn’t the solution. There is a simple caveat to both dashboards and KPIs: they are only as useful as the instruments on the dashboard of your car. If you do not know what is “normal”, they may prove more frustrating than useful. Simply put: if you have just started your car, the temperature gauge reads cold; in normal operation it sits around 90 degrees Celsius; under very hot or heavy-load conditions it may creep higher. So knowing what is normal is crucial.
I have seen very few implementations which take this into account. Most indicators compare actual to budget, or to forecast. The reason is the enormous complexity of acquiring useful data: to display comparisons to the norm, one has to first define which norm (daily, weekly, monthly or year-to-date) is relevant. Once this has been defined, one has to consider how to create realistic comparisons to that normal.
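One way to sketch the idea of comparing an actual figure to a derived norm, rather than to budget, is shown below. This is a minimal illustration under the assumption that “normal” is taken as the average of a rolling history of comparable past periods; all figures are invented.

```python
# Hypothetical sketch: deriving "normal" for a KPI from its own history
# instead of comparing only to budget or forecast. Figures are invented.
from statistics import mean, stdev

# Daily sales for the same weekday over the past eight weeks.
history = [102.0, 98.5, 101.0, 99.0, 103.5, 97.0, 100.5, 98.5]
actual = 112.0  # today's figure

norm = mean(history)      # what "normal" looks like for this indicator
spread = stdev(history)   # how much normal variation to expect

# Flag the reading only when it strays well outside normal variation.
is_abnormal = abs(actual - norm) > 2 * spread
```

The choice of history window (daily, weekly, monthly or year-to-date) and of threshold is exactly the definitional work the paragraph above describes; the code only makes the shape of that comparison concrete.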
The conclusion of this tirade is clear: to create manageable and meaningful information structures that enhance decision making, a combination of the data models is required. Real-time data feeds present the here and now, while real-time access to information provides the analytical backdrop against which the real-time feed becomes meaningful and useful. Finally, non-real-time information and data provide the final element with which decision making is aided, and possibly the key element: analysis.
Nicholas Campleman is Group CEO of the Dream Catcher Group of Companies
© Dream Catchers 2006