Log/Mate Esp Assistant - A Knowledge-Based System For Log Analysis A Progress Report
A PROGRESS REPORT
E. R. (Ross) Crain, P.Eng.
D&S Petroleum Consulting Group / Alberta Research Council Joint Venture
Calgary, Alberta
403 845 2527 [email protected]
Distributed internally to D&S/ARC Sep 1986. This electronic version created Jan 2005.
Author's Note: The LOG/MATE ESP ASSISTANT joint venture research project with the Alberta
Research Council was suspended shortly after this was written and my involvement ceased. The
project was resumed later in 1987 and a deliverable product was completed in 1988, based on the
foundation described here. The earlier papers on LOG/MATE ESP and expert systems research
leading up to this work are available from the Publications section of this website.
This paper was never intended for external publication. It contained proprietary trade secrets and
details of management, planning, and financing problems that were not for public consumption.
However, 20 years have passed and the experiences recounted may be instructive today. ERC Jan
2005.
ABSTRACT
This paper reviews the plans and progress achieved to date on research, design, and implementation
of a prototype knowledge-based (expert) system for log analysis, being developed by the author and
his colleagues, under the terms of a joint venture agreement between D&S Petroleum Consulting
Group Ltd. and the Alberta Research Council in Calgary, Alberta, Canada.
6. The program should be able to learn new knowledge, by observation and by asking questions.
(Most systems do not remember this new knowledge for use on the next problem, and it must be
specifically entered into the data or rule bases for later use.)
7. The program's knowledge should be easily modifiable, to account for new knowledge or methods.
Our decisions about the tools and methods to use were predicated on these seven principles, plus
the engineer's more pragmatic KISS principle (keep it simple and small). Obviously, there are many
ways in which to create such a system, and still honour these rules. We have spent considerable time
and effort evaluating these factors and have made a number of arbitrary but pragmatic decisions
which naturally flavour the result.
as the physical properties of rocks and fluids, and log analysis parameters previously used in the
area.
Our second goal was thus to provide the textbook, parameter, and procedural data for Western
Canada, and at least text book data for other areas. The historical data base will be augmented by the
user at each installed site, so that empty areas grow, or learn, from use of the system. With support
from users, the parameter data for a wide area of the world could be shared.
A historical data base for Western Canada has been prepared. This data represents the log data,
results, methods, and parameters used by analysts to solve the standard log analysis algorithms in
this area, sorted by formation name and locality. It includes approximately 900 model zones, drawn
from our previous analyses. These wells all have good ground truth data, so that analysis parameters
and methods can be validated.
This database will learn from an expert's use of the system, and could be called a teachable database.
It will not learn everything, but only those things we wish it to learn. The learning function will be
provided by an application program which will update the historical data base upon user command.
This update facility will add parameter values used successfully by the analyst since the last update,
provide a mapping facility for data evaluation, and an editing feature to remove or correct
inconsistent data. Coding of this function has not yet begun, but is planned for this year.
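Since coding of the update facility has not yet begun, the following C sketch is illustrative only: the record layout, field names, and limits are assumptions for the purpose of discussion, not the production design. It shows the core behaviour described above, adding a parameter record for a zone and replacing any existing entry for the same formation and locality rather than duplicating it.

```c
/* Illustrative sketch of the historical data base update facility.
 * Record layout and names are assumptions, not the actual
 * LOG/MATE ESP code. */
#include <string.h>

#define MAX_ZONES 2000

typedef struct {
    char   formation[32];  /* formation name, e.g. "Viking"      */
    char   locality[32];   /* township/range or area code        */
    double rw;             /* formation water resistivity, ohm-m */
    double phi_max;        /* maximum porosity used              */
    int    validated;      /* 1 if confirmed by ground truth     */
} zone_rec;

static zone_rec hist[MAX_ZONES];
static int n_hist = 0;

/* Add a zone analysed since the last update; overwrite an existing
 * entry for the same formation/locality instead of duplicating it.
 * Returns the slot index, or -1 if the data base is full. */
int update_history(const zone_rec *z)
{
    int i;
    for (i = 0; i < n_hist; i++) {
        if (strcmp(hist[i].formation, z->formation) == 0 &&
            strcmp(hist[i].locality,  z->locality)  == 0) {
            hist[i] = *z;      /* replace with the newer parameters */
            return i;
        }
    }
    if (n_hist >= MAX_ZONES)
        return -1;             /* full; the editing feature must prune */
    hist[n_hist] = *z;
    return n_hist++;
}
```

The editing and mapping features described above would operate on the same table; only the replace-or-append behaviour is sketched here.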
It would thus be possible for experts to share local knowledge among many users, and to provide
less experienced users with a good starting point for their analyses. It also serves as the perfect
memory for both advanced and novice users.
Systems sold locally could contain a considerable amount of data since it would be readily available
from our own files. Those sold internationally would likely be delivered with an empty database,
except for universally accepted rock and fluid properties and rational default values for all other
parameters. These would be updated by the software as analyses are run, preferably under the
control of knowledgeable analysts.
An integral part of this enhancement will be a parameter picking feature, so that parameter values can
be extracted from the historical data base, as well as from depth plots and crossplots of current data,
for use in analysing the current well.
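In outline, parameter picking from the historical data base reduces to a keyed lookup with a fall-back to a rational default when no entry exists for the area, exactly the behaviour planned for internationally delivered systems with empty data bases. The names and values below are illustrative assumptions only.

```c
/* Illustrative sketch of parameter picking: look up a stored value
 * by formation name, falling back to a rational default for empty
 * areas. Names and values are assumptions, not production code. */
#include <string.h>

typedef struct {
    const char *formation;
    double      rw;          /* stored water resistivity, ohm-m */
} hist_rec;

static const hist_rec hist[] = {
    { "Viking",  0.25 },
    { "Cardium", 0.18 },
};

#define RW_DEFAULT 0.05      /* rational default, ohm-m */

double pick_rw(const char *formation)
{
    size_t i;
    for (i = 0; i < sizeof hist / sizeof hist[0]; i++)
        if (strcmp(hist[i].formation, formation) == 0)
            return hist[i].rw;         /* historical value found */
    return RW_DEFAULT;                 /* empty area: use default */
}
```

Picking from depth plots and crossplots of current data would feed the same parameter slot through the graphics modules rather than this table.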
Use of the commercially available relational database was found to be the most efficient mechanism
to aid our conversion and portability, and satisfy client demand for a database which could be read by
other programs they may already own. RDS is available on a large variety of target delivery vehicles,
is maintained and updated by others, and has a very broad user base and acceptance level. Use of
such a tool has reduced our programming effort considerably, although we elected to create our own
internal data representation structure in C language. This was done so as not to be tied directly to the
RDS human interface, which requires more user knowledge than we would wish upon our casual
users.
We have decided to utilize the GKS graphics kernel to rewrite our interactive graphics modules. A
commercially available package (Visual:GKS) provides GKS primitives and a library of device
dependent drivers. The device driver library is itself an expert system, making a total of six that will be
embedded in our system. This reduces our effort in attempting to be machine independent. As well,
this package uses the RDS data base and is available on a large variety of our target delivery vehicles.
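The machine-independence idea can be pictured as a table of device drivers selected at run time, so that application code never names a specific device. This is only a plain-C illustration of the principle; the real drivers are supplied by the Visual:GKS library, and none of these names are from that package.

```c
/* Illustration of device independence via a driver dispatch table.
 * These names are hypothetical; the actual drivers come from the
 * Visual:GKS device library. */
#include <stdio.h>

static int last_x, last_y;   /* record last CRT position, for checking */

typedef struct {
    const char *name;
    void (*move)(int x, int y);
    void (*draw)(int x, int y);
} gfx_driver;

static void crt_move(int x, int y)
{ last_x = x; last_y = y; printf("CRT move %d %d\n", x, y); }
static void crt_draw(int x, int y)
{ last_x = x; last_y = y; printf("CRT draw %d %d\n", x, y); }
static void pen_move(int x, int y)
{ printf("PEN move %d %d\n", x, y); }
static void pen_draw(int x, int y)
{ printf("PEN draw %d %d\n", x, y); }

static gfx_driver drivers[] = {
    { "crt",     crt_move, crt_draw },
    { "plotter", pen_move, pen_draw },
};

/* Application code calls through the selected driver only. */
void plot_segment(const gfx_driver *d, int x0, int y0, int x1, int y1)
{
    d->move(x0, y0);
    d->draw(x1, y1);
}
```

Adding a new device then means adding one table entry, not touching the plotting code, which is the effort reduction referred to above.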
We have decided also to use Rulemaster, a decision table approach to expert system building, to
create and run the rule bases in our prototype system. However, we are continuing research (as time
and resources permit) on data representation and rule processing on the Symbolics 3600 computer
using the more powerful KnowledgeCraft expert system environment. Future releases of the
LOG/MATE ESP ASSISTANT may use a delivery version of this kind of expert system shell.
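The decision table approach can be shown in miniature. The sketch below is plain C, not Rulemaster's RADIAL syntax, and the rules themselves are invented for illustration: each row maps observable conditions to a recommended conclusion, with a wildcard for conditions that do not matter.

```c
/* Plain-C illustration of the decision table idea, not RADIAL
 * syntax. The rules here are invented examples. */
#include <string.h>

typedef struct {
    const char *lithology;   /* condition 1                   */
    int         shaly;       /* condition 2 (0/1, -1 = any)   */
    const char *method;      /* conclusion: analysis method   */
} rule;

static const rule table[] = {
    { "sandstone", 1,  "dual-water"    },
    { "sandstone", 0,  "archie"        },
    { "carbonate", -1, "dual-porosity" },
};

/* Scan the table top to bottom; first matching row wins. */
const char *select_method(const char *lith, int shaly)
{
    size_t i;
    for (i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (strcmp(table[i].lithology, lith) == 0 &&
            (table[i].shaly == -1 || table[i].shaly == shaly))
            return table[i].method;
    }
    return "default";        /* no rule fired: rational default */
}
```

A high level user modifying rules for his own area edits only the table rows, which is why the rule description interface is easy to teach.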
Multiple active windows, using commercially available window management packages, have yet to be
added to the screen handler of LOG/MATE ESP, to improve the human interface. However, we are now
able to move the contents of a data element in one screen to another, reducing the need for this
feature in the near term.
The C language conversion is approximately 70% complete and all modules converted to date have
been thoroughly tested. Approximately 50 lines of assembly language code are needed for rapid
screen handling for each new system configuration.
Integration of the Rulemaster, RDS database, and GKS kernel will begin in October, 1986.
transmission to IBM mainframes. Transmission to DEC and other mainframes can be adequately
handled by the RS-232 and Ethernet protocols.
Since the data base will be in RDS format it could be accessed directly from foreign programs without
the necessity of going through LOG/MATE ESP.
These data communications features are widely used on our present systems, and will be integrated
into the knowledge-based system as they are completed. The data comm work is not part of the D&S
joint venture with ARC, but is being undertaken by D&S to provide greater marketability of the new
product.
planned. Also, we intend to have a larger involvement of D&S marketing efforts sooner than expected
to raise the chances of commercial success.
We have been criticized for not using more sophisticated tools and for our emphasis on commercial
aspects at the apparent expense of more elaborate research. Our rationale is based on the fact that
the success or failure of the project will be judged primarily on its performance in the marketplace,
and not on the elegance, cost, or ultimate capabilities of the inference engine.
No one should be ashamed of the calibre of the original plan, for the problems stated above are quite
realistic results when engaged in leading-edge research with a commercial goal, a strict budget, and a
deadline to meet. However, these obvious facts should be a warning to others not to expect instant
results, clearly defined paths, or even unanimous agreement as to methodology or goals in a project
of this type. The same warning should be heeded as we prepare plans and budget for the coming
year.
We would certainly like to do more research and development on the KnowledgeCraft system. Indeed,
we would be foolhardy in not planning further work on several fronts. Lynn Sutherland, one of the
project members, concludes her report with the following: "Traditional environments cannot easily
support the process of reasoning, flexible programs, or friendly human interfaces. Perhaps the
established commercial and professional world is not yet ready to accept the ease of use provided by
non-traditional environments such as Macintosh and Lisp machines, but professional graduates now
are computer literate and will demand flexible, integrated, and compatible software. We have the
people and the equipment in place to research a sophisticated expert system problem. We may feel at
this time that we are ahead of commercial acceptance of such a development. We probably are, but
just a little. Eventually commercial competition will force the decision to accept more powerful, non-traditional tools."
Frankly, the current joint venture does not have sufficient resources to bring about the goal
suggested by Lynn. The joint venture participants should therefore seriously consider a further two
year research and development program to pilot test a KnowledgeCraft version of LOG/MATE ESP
ASSISTANT for the high end user. The timing would be right, as the oil industry slump should
rebound over this period, and our efforts to date on the KnowledgeCraft version show considerable
promise.
However, the full ramifications of trying to mount the existing and planned capabilities of LOG/MATE
ESP have not been investigated thoroughly enough as yet to write a plan for the complete program.
We need to know a great deal more about our results to date on KnowledgeCraft. Deeper investigation
of a relational data base interface to the Lisp workspace, conversion of bit-mapped pixel location to
scaled data values, and the massive list of ESP features to be re-designed and re-coded is required.
This work could be done under the terms of the present joint venture, but should only be undertaken
if the two year extension is considered highly probable.
CONCLUSIONS
We are confident that the last twelve months' effort by our team of eight full time staff and two part
time advisors has generated a workable, achievable design. We have tested many of the components
and are satisfied with most of the results to date. We have set a target date of December 1986 for our
first commercially releasable HP-300 based version and January 1987 for the first IBM/PC version.
They will lead log analysis and related geological and engineering data processing into new realms of
capability, and still be friendly and easy to use. At the same time, we are continuing to investigate
more powerful tools to represent our knowledge and to process it in new ways which may enhance
productivity even more on future releases.
An investigation of appropriate rule-managing tools, or inference engines, such as Rulemaster, KEE,
KnowledgeCraft, and hard-coded LISP has been undertaken.
A prototype inference engine was developed operating on the Symbolics 3600 using the
KnowledgeCraft shell, both specially acquired for the task. This work has been temporarily
suspended, pending a detailed review of what we have learned to date. A second prototype, using
Rulemaster as an inference engine, has been tested with a more extensive rule set than that used in
KnowledgeCraft.
We have successfully implemented usage and parameter selection rules in both Rulemaster and
KnowledgeCraft. Rulemaster will be used in the initial prototypes because it is available on the lower
cost workstations we plan to market, and the rule description interface is easy to teach to high level
users who may wish to modify the actual rules for their own areas. KnowledgeCraft has a much richer
data representation methodology, but the human interface requires considerably more work by our
development staff before it could be used by potential customers. It is also 60 times more expensive,
and is not yet available on our target machines, although this, as well as a lower delivery
version price, has been promised.
We have experimented with iteration rules with KnowledgeCraft, and have determined that if a rule can
be defined by an expert, it can be represented and invoked. Similar rules can be phrased in
Rulemaster, but at the time of writing, they have been tested only on paper.
Conflict, completeness, and consistency issues are still being resolved manually, as no available
tools cover these problems adequately. We will have to trust the initial expert and subsequent users
to behave rationally, or to be smart enough to find their errors and correct them. This is similar in
many ways to debugging problems in conventional programming.
We have also looked at ART and KEE, two other high end expert system tools, which promise
execution versions on at least HP and VAX hardware as well as LISP machines. We have also tested
material from Xerox using Interlisp-D and Notecards as a data base. We have modified the proprietary
software of one of our team for experimental purposes. Because of this extensive investigation, we
feel confident in our choice of Rulemaster for our early prototypes, but have left our options open for
future enhancements with more powerful tools. Since we have limited research resources, we must
produce working software in evolutionary steps, and cannot wait for the perfect solution.
Numerous other research activities were carried out and are covered in separate reports provided by
the individual researchers. These reports are appended to this progress report. Some of the topics
include details on database and graphics on the Symbolics, relational data bases on LOG/MATE and
Rulemaster, IBM-PC environment, Rulemaster and RADIAL environment, KnowledgeCraft
environment, UNIX environment, and integration issues. In addition a large library of technical papers
on expert systems and many related topics has been gathered and is available for review by
interested parties.