ESs have been successful largely because they restrict the field of
interest to a narrowly defined area that can be naturally described
by explicit verbal rules.
OPERATION OF THE SYSTEM
• Knowledge acquisition
• Consultation
• Explanation
KNOWLEDGE ACQUISITION
The designer of the system must liaise with acknowledged experts in
the appropriate area of activity, for example physicians, lawyers or
investment analysts, in order to gain their knowledge. The knowledge
engineer acts as an intermediary between the human expert and the
expert system. Typical of the information that must be gleaned is
vocabulary or jargon, general concepts and facts, problems that
commonly arise, the solutions to those problems, and skills for
solving particular problems. This process of picking the brain of an
expert is a specialised form of data capture and makes use of
interview techniques. Having acquired the information, the knowledge
engineer is also responsible for the self-consistency of the data,
and a number of specific tests have to be performed to ensure that
the conclusions reached are sensible; one such test is sketched below.
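This purely illustrative Python sketch (the rule encoding and the
predicates are my own, not from any real system) shows one such
consistency test: detecting circular rule chains, which would send
the deduction process described later into an infinite loop.

# A minimal consistency check over a rule base: detect circular rule
# chains (e.g. IF a THEN b, IF b THEN a). Rules are (premise,
# conclusion) pairs; the predicates are illustrative placeholders.

rules = [
    ("frog", "hops"),    # IF frog(x) THEN hops(x)
    ("green", "frog"),   # IF green(x) THEN frog(x)
    ("hops", "green"),   # a deliberately circular rule for the demo
]

def find_cycle(rules):
    """Return a predicate lying on a circular rule chain, or None."""
    graph = {}
    for premise, conclusion in rules:
        graph.setdefault(premise, []).append(conclusion)

    def visit(node, path):
        if node in path:
            return node                      # cycle detected
        for successor in graph.get(node, []):
            found = visit(successor, path | {node})
            if found:
                return found
        return None

    for start in graph:
        found = visit(start, set())
        if found:
            return found
    return None

print(find_cycle(rules))   # -> a predicate on the cycle, e.g. 'frog'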
CONSULTATION
The system is in this mode when a user is interacting with it. The
user interacts by entering data in English, and the system responds,
using a backward chaining (deductive reasoning) process to derive an
answer to the questions posed by the user. As explained earlier, the
user may during this time be asked for information that can be used
to support the system’s hypothesis, with appropriate backtracking if
evidence contradicting that hypothesis is found.
EXPLANATION
This mode allows the system to explain its conclusions and its
reasoning process. This ability comes from the AND/OR trees
created during the deduction process. As a result, most expert
systems can answer 'how' questions ('how did you reach that
conclusion?') and 'why' questions ('why are you asking me for that
information?'), as sketched below.
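Here is a minimal Python sketch of answering a 'how' question from a
recorded proof tree; the node layout is my own assumption, and the
facts anticipate the 'Fritz' example given later.

# A proof tree recorded during deduction: each node stores a
# conclusion together with the rule that established it and the
# sub-proofs of that rule's premises. Leaf nodes are facts taken
# directly from the knowledge base. (Illustrative data only.)

proof = {
    "conclusion": "hops(Fritz)",
    "rule": "IF frog(x) THEN hops(x)",
    "premises": [{
        "conclusion": "frog(Fritz)",
        "rule": "IF green(x) THEN frog(x)",
        "premises": [{
            "conclusion": "green(Fritz)",
            "rule": None,        # a recorded fact, not a deduction
            "premises": [],
        }],
    }],
}

def explain_how(node, depth=0):
    """Answer 'how?' by walking the proof tree top-down."""
    pad = "  " * depth
    if node["rule"] is None:
        print(f"{pad}{node['conclusion']} is a recorded fact.")
    else:
        print(f"{pad}{node['conclusion']} follows by {node['rule']}:")
        for premise in node["premises"]:
            explain_how(premise, depth + 1)

explain_how(proof)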
STRUCTURE
EXTERNALLY:
[Diagram: the external structure of an expert system and its main modules]
The inference engine is not explicitly shown; however, it controls
the flow of information around the above modules, retrieving
appropriate facts and rules during the reasoning process.
A simple example of deductive reasoning
Goal: establish whether 'hops(Fritz)' is true. The knowledge base
contains the single recorded fact:
green(Fritz).
Step 1
Knowledge base is examined to see if 'hops(Fritz)' is a
recorded fact. It’s not.
Step 2
Rule base is examined to see if there’s a rule of the form
IF A THEN hops(x); x=Fritz.
There is, with A=frog(x); x=Fritz. But is the premise
'frog(Fritz)' actually true?
Step 3
Knowledge base is examined to see if 'frog(Fritz)' is a
recorded fact. As with 'hops(Fritz),' it’s not, so it’s again
necessary to look instead for an appropriate rule.
Step 4
Rule base is examined to see if there’s a rule of the form
IF A THEN frog(x); x=Fritz.
Again, there is a suitable rule, this time with
A=green(x); x=Fritz. But now is 'green(Fritz)' true?
Step 5
Knowledge base is yet again examined, this time to see if
'green(Fritz)' is a recorded fact, and yes -- this time the
premise is directly known to be true.
One can therefore finally conclude that the original assertion
'hops(Fritz)' was also true.
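The whole deduction can be reproduced in a few lines of Python. This
is a minimal sketch of a backward chainer, not a production inference
engine; for simplicity the variable x has been pre-instantiated to
Fritz, and the fact and rule encodings are my own illustrative
choices.

# Minimal backward chaining over the Fritz example. Facts are strings;
# each rule is a (premise, conclusion) pair, since every rule in the
# example has a single premise.

facts = {"green(Fritz)"}
rules = [
    ("frog(Fritz)", "hops(Fritz)"),    # IF frog(x) THEN hops(x); x=Fritz
    ("green(Fritz)", "frog(Fritz)"),   # IF green(x) THEN frog(x); x=Fritz
]

def prove(goal, depth=0):
    """Try to establish goal from the facts, chaining backwards."""
    print("  " * depth + f"Is '{goal}' a recorded fact?")
    if goal in facts:
        print("  " * depth + "Yes.")
        return True
    # Not a recorded fact: look for a rule of the form IF A THEN goal
    # and try to prove its premise A instead.
    for premise, conclusion in rules:
        if conclusion == goal:
            print("  " * depth + f"No, but IF {premise} THEN {goal}.")
            if prove(premise, depth + 1):
                return True
    return False          # no matching fact and no applicable rule

print(prove("hops(Fritz)"))    # -> True, via Steps 1-5 above
print(prove("flies(Fritz)"))   # -> False: the query fails

The second call also illustrates the failure case discussed next: no
fact matches and no rule concludes 'flies(Fritz)', so the query fails.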
Failure of a query
If at some step neither a matching fact nor an applicable rule can be
found, the query fails.
NOTE: a failed query means only that the assertion could not be
proved from the recorded facts and rules; it does not establish that
the assertion is false.
IMPLEMENTATION OF EXPERT SYSTEMS
Some areas of financial application of ESs are:
• Audit planning:
• Tax planning:
Note that these two areas are also ones to which subsymbolic
systems, neural networks in particular, have been applied. It
would be very interesting to see a benchmark comparison of a
neural network and an ES solution, but to my knowledge no such
studies have been published. Benchmarking studies are
unfortunately rare in AI applications, even ones of a more
limited scope which compare the ability of, say, neural networks
trained using different methodologies.
FUZZY LOGIC
One major criticism of traditional expert systems has been that the
rules they use are in many cases too precise; they don’t capture
the 'shades of grey' of everyday life. Given a crisp rule such as
IF the purchase amount is under £5000 AND the customer’s credit
rating is GOOD THEN approve the sale, a sale would not be made if
the intended purchase were for exactly £5000, no matter how good
the customer’s credit rating might be. And what, in any case, does
'GOOD' mean? Is it reasonable to divide the whole population into
just two classes, those with GOOD and those with BAD credit
ratings? The contrast is sketched below.
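In this Python sketch the membership function and its breakpoints
are illustrative assumptions, not taken from any real credit-scoring
system: instead of a binary GOOD/BAD split, the fuzzy version
assigns every customer a degree of 'goodness' between 0 and 1.

# Crisp rule: a hard threshold, so a score of 599 is treated exactly
# like a score of 300.
def crisp_good(score):
    return score >= 600

# Fuzzy alternative: the degree of membership in GOOD rises smoothly
# between two breakpoints (400 and 800 are assumed values).
def fuzzy_good(score):
    if score <= 400:
        return 0.0
    if score >= 800:
        return 1.0
    return (score - 400) / 400   # linear ramp between the breakpoints

for score in (300, 599, 601, 750):
    print(score, crisp_good(score), round(fuzzy_good(score), 2))
# 599 is BAD under the crisp rule, but GOOD to degree ~0.5 fuzzily.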
Combining fuzzy assertions via 'fuzzy logic operators'
In the standard (Zadeh) formulation, fuzzy AND takes the minimum of
the membership degrees being combined, fuzzy OR takes the maximum,
and fuzzy NOT subtracts the degree from 1.
Example: see the sketch below.
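A minimal Python sketch of these operators, with made-up membership
degrees:

# Zadeh's fuzzy logic operators.
def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)

def fuzzy_not(a):
    return 1.0 - a

# Illustrative degrees: someone is OLD to degree 0.56 and has a GOOD
# credit rating to degree 0.7.
old, good = 0.56, 0.7
print(fuzzy_and(old, good))       # OLD AND GOOD -> 0.56
print(fuzzy_or(old, good))        # OLD OR GOOD  -> 0.7
print(round(fuzzy_not(old), 2))   # NOT OLD      -> 0.44

Note that NOT OLD comes out as 0.44, which matches the YOUNG degree
used in the age-47 example below.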
To date, expert systems are the major application area for
fuzzy-logic-based technologies, which have been applied successfully
in a wide range of fields including linear and nonlinear control,
pattern recognition, financial systems, operations research and data
analysis.
In a fuzzy expert system, all relevant rules are 'fired'. So if, for
example, there are separate rules pertaining to OLD and YOUNG
people, then someone of age 47 would have both rules applied to
them, since according to fuzzy logic they are both OLD (to the
degree 0.56) and YOUNG (to the degree 0.44).
Because of this feature, a final decision from a fuzzy expert system
requires defuzzification, a process that can take a number of
forms. The choice of method depends most strongly on whether it
makes sense to blend the conclusions of several jointly fired rules
in some way (for example if the system was being used as a
controller), or whether a 'crisp' output is needed which selects just
one conclusion from the range of candidates (as for example in a
legal expert system).
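Both options can be sketched in a few lines of Python. The two rules
and their numeric conclusions are illustrative assumptions (a
hypothetical premium recommendation), reusing the firing degrees
from the age-47 example above.

# Two illustrative rules fired on someone aged 47:
#   IF OLD   THEN recommended premium = 900   (fires to degree 0.56)
#   IF YOUNG THEN recommended premium = 500   (fires to degree 0.44)
fired = [(0.56, 900.0),   # (firing degree, rule conclusion)
         (0.44, 500.0)]

# Blending defuzzification (e.g. for a controller): average the
# conclusions, weighted by how strongly each rule fired.
blended = sum(w * v for w, v in fired) / sum(w for w, _ in fired)
print(round(blended, 1))   # -> 724.0

# Crisp defuzzification (e.g. for a legal expert system): keep only
# the conclusion of the most strongly fired rule.
crisp = max(fired)[1]
print(crisp)               # -> 900.0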
HYBRID SYSTEMS
Fuzzy logic gives some flexibility to rule-based AI, but it does not
give it the ability to create its own rules. For this, a subsymbolic
system such as a neural network or genetic algorithm is still
required.
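As a toy illustration of that division of labour, the sketch below
uses a bare-bones evolutionary search (a stand-in for a full genetic
algorithm) to learn the threshold of a rule from labelled examples,
something the rule-based machinery cannot do by itself. All data and
parameters are invented.

import random

# Labelled examples: (credit score, was the sale actually approved?)
examples = [(350, False), (450, False), (620, True), (700, True), (820, True)]

def accuracy(threshold):
    """Fitness: how many examples does the rule
    IF score >= threshold THEN approve get right?"""
    return sum((score >= threshold) == label for score, label in examples)

random.seed(0)
population = [random.uniform(300, 900) for _ in range(20)]
for generation in range(50):
    population.sort(key=accuracy, reverse=True)
    parents = population[:5]                      # keep the fittest
    # Each parent produces three mutated offspring.
    population = parents + [p + random.gauss(0, 30)
                            for p in parents for _ in range(3)]

best = max(population, key=accuracy)
print(round(best), accuracy(best))   # any threshold in (450, 620] scores 5/5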
Symbolic vs. subsymbolic AI: final thoughts...
GE’s expert system by its nature could never be better than the
human expert from whom its knowledge and rule bases were derived.
The experience with the 'axle-tapper' neural net, however,
demonstrates that the skills developed by such machines during a
training process can exceed our own. What will happen when neural
networks, based on an improved knowledge of the brain, are built
that have not only pattern recognition skills but reasoning skills?
Will such machines be able to think -- in a true, broad sense --
better than we can? Could they be more creative than us? And
what might we (and, possibly, they) feel about that?