ML Lab Manual


1. Candidate Elimination Algorithm

Aim:

To implement the candidate elimination algorithm using Python code.

Algorithm:

Step 1: Load the data set.

Step 2: Initialize the general hypothesis (G) and the specific hypothesis (S).

Step 3: For each training example:

Step 4: If the example is positive:

if attribute_value == hypothesis_value:

Do nothing

else:

replace the attribute value in S with '?' (generalizing it)

Step 5: If the example is negative:

Make the general hypothesis more specific (a sketch of both update rules follows these steps).
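As a minimal sketch of Steps 4 and 5 (an illustration, not part of the original listing), the two update rules for a single training example can be written as one standalone function; S and G here are assumed to have the same shapes as specific_h and general_h in the program below:

# Sketch: one candidate-elimination update for a single example.
# S is the specific hypothesis (a list of attribute values);
# G is the general boundary (a list of hypotheses shaped like S).
def update(S, G, example, label):
    if label == "Yes":                      # positive example: generalize S
        for i, value in enumerate(example):
            if S[i] != value:
                S[i] = '?'                  # drop the conflicting constraint
                G[i][i] = '?'
    else:                                   # negative example: specialize G
        for i, value in enumerate(example):
            if S[i] != value:
                G[i][i] = S[i]              # keep a constraint that rejects it
            else:
                G[i][i] = '?'
    return S, G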

CSV file: trainingdata.csv

sky,airTemp,humidity,wind,water,forecast,enjoySport

Sunny,Warm,Normal,Strong,Warm,Same,Yes

Sunny,Warm,High,Strong,Warm,Same,Yes

Rainy,Cold,High,Strong,Warm,Change,No

Sunny,Warm,High,Strong,Cool,Change,Yes
Program:

import numpy as np
import pandas as pd

# Loading data from a CSV file
data = pd.DataFrame(data=pd.read_csv('trainingdata.csv'))
print(data)

# Separating concept features from the target
concepts = np.array(data.iloc[:, 0:-1])
print(concepts)

# Copying the last column into a separate target array
target = np.array(data.iloc[:, -1])
print(target)

def learn(concepts, target):
    '''
    learn() implements the learning method of the candidate elimination algorithm.

    Arguments:
        concepts - an array with all the feature columns
        target   - an array with the corresponding output values
    '''
    # Initialise S with the first instance from concepts.
    # .copy() makes sure a new array is created instead of a reference
    # to the same memory location.
    specific_h = concepts[0].copy()
    print("\nInitialization of specific_h and general_h")
    print(specific_h)

    general_h = [["?" for i in range(len(specific_h))] for i in range(len(specific_h))]
    print(general_h)

    # The learning iterations
    for i, h in enumerate(concepts):
        # Positive example: generalize the specific hypothesis
        if target[i] == "Yes":
            for x in range(len(specific_h)):
                # Change values in S & G only if the values differ
                if h[x] != specific_h[x]:
                    specific_h[x] = '?'
                    general_h[x][x] = '?'
        # Negative example: specialize the general hypothesis
        if target[i] == "No":
            for x in range(len(specific_h)):
                if h[x] != specific_h[x]:
                    general_h[x][x] = specific_h[x]
                else:
                    general_h[x][x] = '?'
        print("\nSteps of Candidate Elimination Algorithm", i + 1)
        print(specific_h)
        print(general_h)

    # Find indices of unchanged (fully general) rows
    indices = [i for i, val in enumerate(general_h) if val == ['?', '?', '?', '?', '?', '?']]
    for i in indices:
        # Remove those rows from general_h
        general_h.remove(['?', '?', '?', '?', '?', '?'])

    # Return final values
    return specific_h, general_h

s_final, g_final = learn(concepts, target)
print("\nFinal Specific_h:", s_final, sep="\n")
print("\nFinal General_h:", g_final, sep="\n")


Output:

sky airTemp humidity wind water forecast enjoySport

0 Sunny Warm Normal Strong Warm Same Yes

1 Sunny Warm High Strong Warm Same Yes

2 Rainy Cold High Strong Warm Change No

3 Sunny Warm High Strong Cool Change Yes

[['Sunny' 'Warm' 'Normal' 'Strong' 'Warm' 'Same']

['Sunny' 'Warm' 'High' 'Strong' 'Warm' 'Same']

['Rainy' 'Cold' 'High' 'Strong' 'Warm' 'Change']

['Sunny' 'Warm' 'High' 'Strong' 'Cool' 'Change']]

['Yes' 'Yes' 'No' 'Yes']

Initialization of specific_h and general_h

['Sunny' 'Warm' 'Normal' 'Strong' 'Warm' 'Same']

[['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?']]

Steps of Candidate Elimination Algorithm 1

['Sunny' 'Warm' 'Normal' 'Strong' 'Warm' 'Same']

[['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?']]

Steps of Candidate Elimination Algorithm 2

['Sunny' 'Warm' '?' 'Strong' 'Warm' 'Same']

[['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?']]

Steps of Candidate Elimination Algorithm 3

['Sunny' 'Warm' '?' 'Strong' 'Warm' 'Same']

[['Sunny', '?', '?', '?', '?', '?'], ['?', 'Warm', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', 'Same']]

Steps of Candidate Elimination Algorithm 4

['Sunny' 'Warm' '?' 'Strong' '?' '?']

[['Sunny', '?', '?', '?', '?', '?'], ['?', 'Warm', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?'], ['?', '?', '?', '?', '?', '?']]

Final Specific_h:

['Sunny' 'Warm' '?' 'Strong' '?' '?']

Final General_h:

[['Sunny', '?', '?', '?', '?', '?'], ['?', 'Warm', '?', '?', '?', '?']]

Result:

Thus the program is executed successfully.


2. ID3 Algorithm

Aim:

To implement the ID3 algorithm using Python code.

Algorithm:

Step 1: Select the root node (S) based on the lowest entropy and highest information gain.

Step 2: On each iteration, the algorithm calculates the entropy and information gain of every attribute not yet used.

Step 3: Select the attribute with the lowest entropy or the highest information gain.

Step 4: Split the set S on the selected attribute to produce subsets of the data.

Step 5: The algorithm continues to recur on each subset, making sure the attributes considered are fresh, and builds the decision tree. (A worked entropy calculation follows these steps.)
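As a worked sketch of Steps 1 and 2 (not printed in the original manual), the entropy of the PlayTennis column of tennisdata.csv, which holds 9 Yes and 5 No values, can be computed by hand:

import math

# Entropy of the PlayTennis column: 9 "Yes" and 5 "No" out of 14 rows.
p_yes, p_no = 9 / 14, 5 / 14
entropy_s = -p_yes * math.log(p_yes, 2) - p_no * math.log(p_no, 2)
print(round(entropy_s, 3))   # 0.940

Splitting on Outlook leaves subsets with a weighted entropy of about 0.694, so its information gain is about 0.940 - 0.694 = 0.246, the highest of the four attributes; that is why Outlook becomes the root of the tree in the output below.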

CSV file: tennisdata.csv

Outlook,Temperature,Humidity,Windy,PlayTennis

Sunny,Hot,High,False,No

Sunny,Hot,High,True,No

Overcast,Hot,High,False,Yes

Rainy,Mild,High,False,Yes

Rainy,Cool,Normal,False,Yes

Rainy,Cool,Normal,True,No

Overcast,Cool,Normal,True,Yes

Sunny,Mild,High,False,No

Sunny,Cool,Normal,False,Yes

Rainy,Mild,Normal,False,Yes
Sunny,Mild,Normal,True,Yes

Overcast,Mild,High,True,Yes

Overcast,Hot,Normal,False,Yes

Rainy,Mild,High,True,No

Program:

import numpy as np
import math
import csv

def read_data(filename):
    with open(filename, 'r') as csvfile:
        datareader = csv.reader(csvfile, delimiter=',')
        headers = next(datareader)
        metadata = []
        traindata = []
        for name in headers:
            metadata.append(name)
        for row in datareader:
            traindata.append(row)
    return (metadata, traindata)

class Node:
    def __init__(self, attribute):
        self.attribute = attribute
        self.children = []
        self.answer = ""

    def __str__(self):
        return self.attribute

def subtables(data, col, delete):
    dict = {}
    items = np.unique(data[:, col])
    count = np.zeros((items.shape[0], 1), dtype=np.int32)

    for x in range(items.shape[0]):
        for y in range(data.shape[0]):
            if data[y, col] == items[x]:
                count[x] += 1

    for x in range(items.shape[0]):
        dict[items[x]] = np.empty((int(count[x]), data.shape[1]), dtype="|S32")
        pos = 0
        for y in range(data.shape[0]):
            if data[y, col] == items[x]:
                dict[items[x]][pos] = data[y]
                pos += 1
        if delete:
            dict[items[x]] = np.delete(dict[items[x]], col, 1)

    return items, dict

def entropy(S):
    items = np.unique(S)
    if items.size == 1:
        return 0

    counts = np.zeros((items.shape[0], 1))
    sums = 0

    for x in range(items.shape[0]):
        counts[x] = sum(S == items[x]) / (S.size * 1.0)

    for count in counts:
        sums += -1 * count * math.log(count, 2)

    return sums

def gain_ratio(data, col):
    items, dict = subtables(data, col, delete=False)
    total_size = data.shape[0]
    entropies = np.zeros((items.shape[0], 1))
    intrinsic = np.zeros((items.shape[0], 1))

    for x in range(items.shape[0]):
        ratio = dict[items[x]].shape[0] / (total_size * 1.0)
        entropies[x] = ratio * entropy(dict[items[x]][:, -1])
        intrinsic[x] = ratio * math.log(ratio, 2)

    total_entropy = entropy(data[:, -1])
    iv = -1 * sum(intrinsic)

    for x in range(entropies.shape[0]):
        total_entropy -= entropies[x]

    return total_entropy / iv

def create_node(data, metadata):
    # If all rows have the same class, return a leaf node
    if (np.unique(data[:, -1])).shape[0] == 1:
        node = Node("")
        node.answer = np.unique(data[:, -1])[0]
        return node

    gains = np.zeros((data.shape[1] - 1, 1))
    for col in range(data.shape[1] - 1):
        gains[col] = gain_ratio(data, col)

    split = np.argmax(gains)
    node = Node(metadata[split])
    metadata = np.delete(metadata, split, 0)

    items, dict = subtables(data, split, delete=True)
    for x in range(items.shape[0]):
        child = create_node(dict[items[x]], metadata)
        node.children.append((items[x], child))

    return node

def empty(size):
    s = ""
    for x in range(size):
        s += " "
    return s

def print_tree(node, level):
    if node.answer != "":
        print(empty(level), node.answer)
        return
    print(empty(level), node.attribute)
    for value, n in node.children:
        print(empty(level + 1), value)
        print_tree(n, level + 2)

metadata, traindata = read_data("tennisdata.csv")
data = np.array(traindata)
node = create_node(data, metadata)
print_tree(node, 0)

Output:

Outlook

Overcast

b'Yes'

Rainy

Windy

b'False'

b'Yes'

b'True'

b'No'

Sunny

Humidity

b'High'

b'No'

b'Normal'

b'Yes'
Ex:no:3 Backpropagation Algorithm

Aim:

To implement Back Propagation using Python code.

Introduction:

This part is broken down into two sections.

Transfer Derivative.

Error Backpropagation.

3.1. Transfer Derivative

Step 1: Given an output value from a neuron, we need to calculate its slope.

We are using the sigmoid transfer function, the derivative of which can be calculated as follows:

derivative = output * (1.0 - output)

Below is a function named transfer_derivative() that implements this equation.

# Calculate the derivative of a neuron output
def transfer_derivative(output):
    return output * (1.0 - output)
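For reference (a standard identity, not spelled out in the manual), this formula comes from differentiating the sigmoid directly:

sigma(x) = 1 / (1 + e^(-x))
sigma'(x) = e^(-x) / (1 + e^(-x))^2 = sigma(x) * (1 - sigma(x))

so with output = sigma(x), derivative = output * (1.0 - output).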

Now, let’s see how this can be used.

3.2. Error Backpropagation

The first step is to calculate the error for each output neuron; this will give us our error signal (input) to propagate backwards through the network.

The error for a given neuron can be calculated as follows:

error = (output - expected) * transfer_derivative(output)

Where expected is the expected output value for the neuron, output is the output value for the neuron
and transfer_derivative() calculates the slope of the neuron’s output value, as shown above.
This error calculation is used for neurons in the output layer. The expected value is the class value itself.
In the hidden layer, things are a little more complicated.

The error signal for a neuron in the hidden layer is calculated as the weighted error of each neuron in
the output layer. Think of the error traveling back along the weights of the output layer to the neurons
in the hidden layer.

The back-propagated error signal is accumulated and then used to determine the error for the neuron in
the hidden layer, as follows:

error = (weight_k * error_j) * transfer_derivative(output)

Where error_j is the error signal from the jth neuron in the output layer, weight_k is the weight that
connects the kth neuron to the current neuron and output is the output for the current neuron.

Below is a function named backward_propagate_error() that implements this procedure.

You can see that the error signal calculated for each neuron is stored with the name ‘delta’. You can see
that the layers of the network are iterated in reverse order, starting at the output and working
backwards. This ensures that the neurons in the output layer have ‘delta’ values calculated first that
neurons in the hidden layer can use in the subsequent iteration. I chose the name ‘delta’ to reflect the
change the error implies on the neuron (e.g. the weight delta).

You can see that the error signal for neurons in the hidden layer is accumulated from neurons in the
output layer where the hidden neuron number j is also the index of the neuron’s weight in the output
layer neuron[‘weights’][j].

# Backpropagate error and store in neurons
def backward_propagate_error(network, expected):
    for i in reversed(range(len(network))):
        layer = network[i]
        errors = list()
        if i != len(network)-1:
            for j in range(len(layer)):
                error = 0.0
                for neuron in network[i + 1]:
                    error += (neuron['weights'][j] * neuron['delta'])
                errors.append(error)
        else:
            for j in range(len(layer)):
                neuron = layer[j]
                errors.append(neuron['output'] - expected[j])
        for j in range(len(layer)):
            neuron = layer[j]
            neuron['delta'] = errors[j] * transfer_derivative(neuron['output'])

Let’s put all of the pieces together and see how it works.

We define a fixed neural network with output values and backpropagate an expected output pattern.
The complete example is listed below.

# Calculate the derivative of a neuron output
def transfer_derivative(output):
    return output * (1.0 - output)

# Backpropagate error and store in neurons
def backward_propagate_error(network, expected):
    for i in reversed(range(len(network))):
        layer = network[i]
        errors = list()
        if i != len(network)-1:
            for j in range(len(layer)):
                error = 0.0
                for neuron in network[i + 1]:
                    error += (neuron['weights'][j] * neuron['delta'])
                errors.append(error)
        else:
            for j in range(len(layer)):
                neuron = layer[j]
                errors.append(neuron['output'] - expected[j])
        for j in range(len(layer)):
            neuron = layer[j]
            neuron['delta'] = errors[j] * transfer_derivative(neuron['output'])

# test backpropagation of error
network = [[{'output': 0.7105668883115941, 'weights': [0.13436424411240122, 0.8474337369372327, 0.763774618976614]}],
           [{'output': 0.6213859615555266, 'weights': [0.2550690257394217, 0.49543508709194095]},
            {'output': 0.6573693455986976, 'weights': [0.4494910647887381, 0.651592972722763]}]]
expected = [0, 1]
backward_propagate_error(network, expected)
for layer in network:
    print(layer)

Running the example prints the network after the backpropagation of error is complete. You can see
that error values are calculated and stored in the neurons for the output layer and the hidden layer.

[{'output': 0.7105668883115941, 'weights': [0.13436424411240122, 0.8474337369372327, 0.763774618976614], 'delta': 0.0005348048046610517}]
[{'output': 0.6213859615555266, 'weights': [0.2550690257394217, 0.49543508709194095], 'delta': 0.14619064683582808}, {'output': 0.6573693455986976, 'weights': [0.4494910647887381, 0.651592972722763], 'delta': -0.0771723774346327}]

Now let’s use the backpropagation of error to train the network.

4. Train Network

The network is trained using stochastic gradient descent.


This involves multiple iterations of exposing a training dataset to the network and, for each row of data, forward propagating the inputs, back propagating the error, and updating the network weights.

This part is broken down into two sections:

Update Weights.

Train Network.

4.1. Update Weights

Once errors are calculated for each neuron in the network via the back propagation method above, they
can be used to update weights.

Network weights are updated as follows:

weight = weight - learning_rate * error * input

Where weight is a given weight, learning_rate is a parameter that you must specify, error is the error
calculated by the backpropagation procedure for the neuron and input is the input value that caused
the error.

The same procedure can be used for updating the bias weight, except there is no input term, or input is
the fixed value of 1.0.

The learning rate controls how much to change the weight to correct for the error. For example, a value of 0.1 will update the weight by 10% of the amount that it possibly could be updated. Small learning rates are preferred because they cause slower learning over a large number of training iterations. This increases the likelihood of the network finding a good set of weights across all layers, rather than the fastest set of weights that minimizes error (called premature convergence).

Below is a function named update_weights() that updates the weights for a network given an input row of data and a learning rate, assuming that forward and backward propagation have already been performed.

Remember that the input for the output layer is a collection of outputs from the hidden layer.

# Update network weights with error
def update_weights(network, row, l_rate):
    for i in range(len(network)):
        inputs = row[:-1]
        if i != 0:
            inputs = [neuron['output'] for neuron in network[i - 1]]
        for neuron in network[i]:
            for j in range(len(inputs)):
                neuron['weights'][j] -= l_rate * neuron['delta'] * inputs[j]
            neuron['weights'][-1] -= l_rate * neuron['delta']

Now that we know how to update network weights, let's see how to do it repeatedly.

4.2. Train Network

As mentioned, the network is updated using stochastic gradient descent.

This involves first looping for a fixed number of epochs and within each epoch updating the network for
each row in the training dataset.

Because updates are made for each training pattern, this type of learning is called online learning. If
errors were accumulated across an epoch before updating the weights, this is called batch learning or
batch gradient descent.
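As a sketch of that distinction (hypothetical code, not part of the original manual), a batch variant would accumulate the per-row weight changes over an epoch and apply them once, in contrast with the per-row updates that train_network() below performs; it reuses forward_propagate() and backward_propagate_error() from the program that follows:

# Hypothetical batch-gradient-descent variant: accumulate the weight
# updates over a whole epoch, then apply them once.
def train_network_batch(network, train, l_rate, n_epoch, n_outputs):
    for epoch in range(n_epoch):
        # one accumulator per weight, all starting at zero
        acc = [[[0.0] * len(neuron['weights']) for neuron in layer] for layer in network]
        for row in train:
            forward_propagate(network, row)
            expected = [0 for i in range(n_outputs)]
            expected[row[-1]] = 1
            backward_propagate_error(network, expected)
            for i in range(len(network)):
                inputs = row[:-1] if i == 0 else [n['output'] for n in network[i - 1]]
                for k, neuron in enumerate(network[i]):
                    for j in range(len(inputs)):
                        acc[i][k][j] += neuron['delta'] * inputs[j]
                    acc[i][k][-1] += neuron['delta']    # bias term
        # apply the accumulated updates once per epoch
        for i in range(len(network)):
            for k, neuron in enumerate(network[i]):
                for j in range(len(neuron['weights'])):
                    neuron['weights'][j] -= l_rate * acc[i][k][j]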

Algorithm:

Step 1: Initialize the network.

Step 2: Calculate the neuron activation for an input.

Step 3: Transfer the neuron activation.

Step 4: Forward propagate an input to a network output.

Step 5: Calculate the derivative of the neuron output.

Step 6: Backpropagate the error and store it in the neurons.

Step 7: Update the network weights with the error.

Step 8: Train the network for a fixed number of epochs.

Step 9: Test the training backprop algorithm.

Program:

from math import exp
from random import seed
from random import random

# Initialize a network
def initialize_network(n_inputs, n_hidden, n_outputs):
    network = list()
    hidden_layer = [{'weights': [random() for i in range(n_inputs + 1)]} for i in range(n_hidden)]
    network.append(hidden_layer)
    output_layer = [{'weights': [random() for i in range(n_hidden + 1)]} for i in range(n_outputs)]
    network.append(output_layer)
    return network

# Calculate neuron activation for an input
def activate(weights, inputs):
    activation = weights[-1]
    for i in range(len(weights)-1):
        activation += weights[i] * inputs[i]
    return activation

# Transfer neuron activation
def transfer(activation):
    return 1.0 / (1.0 + exp(-activation))

# Forward propagate input to a network output
def forward_propagate(network, row):
    inputs = row
    for layer in network:
        new_inputs = []
        for neuron in layer:
            activation = activate(neuron['weights'], inputs)
            neuron['output'] = transfer(activation)
            new_inputs.append(neuron['output'])
        inputs = new_inputs
    return inputs

# Calculate the derivative of a neuron output
def transfer_derivative(output):
    return output * (1.0 - output)

# Backpropagate error and store in neurons
def backward_propagate_error(network, expected):
    for i in reversed(range(len(network))):
        layer = network[i]
        errors = list()
        if i != len(network)-1:
            for j in range(len(layer)):
                error = 0.0
                for neuron in network[i + 1]:
                    error += (neuron['weights'][j] * neuron['delta'])
                errors.append(error)
        else:
            for j in range(len(layer)):
                neuron = layer[j]
                errors.append(neuron['output'] - expected[j])
        for j in range(len(layer)):
            neuron = layer[j]
            neuron['delta'] = errors[j] * transfer_derivative(neuron['output'])

# Update network weights with error
def update_weights(network, row, l_rate):
    for i in range(len(network)):
        inputs = row[:-1]
        if i != 0:
            inputs = [neuron['output'] for neuron in network[i - 1]]
        for neuron in network[i]:
            for j in range(len(inputs)):
                neuron['weights'][j] -= l_rate * neuron['delta'] * inputs[j]
            neuron['weights'][-1] -= l_rate * neuron['delta']

# Train a network for a fixed number of epochs
def train_network(network, train, l_rate, n_epoch, n_outputs):
    for epoch in range(n_epoch):
        sum_error = 0
        for row in train:
            outputs = forward_propagate(network, row)
            expected = [0 for i in range(n_outputs)]
            expected[row[-1]] = 1
            sum_error += sum([(expected[i]-outputs[i])**2 for i in range(len(expected))])
            backward_propagate_error(network, expected)
            update_weights(network, row, l_rate)
        print('>epoch=%d, lrate=%.3f, error=%.3f' % (epoch, l_rate, sum_error))

# Test training backprop algorithm
seed(1)
dataset = [[2.7810836,2.550537003,0],
           [1.465489372,2.362125076,0],
           [3.396561688,4.400293529,0],
           [1.38807019,1.850220317,0],
           [3.06407232,3.005305973,0],
           [7.627531214,2.759262235,1],
           [5.332441248,2.088626775,1],
           [6.922596716,1.77106367,1],
           [8.675418651,-0.242068655,1],
           [7.673756466,3.508563011,1]]
n_inputs = len(dataset[0]) - 1
n_outputs = len(set([row[-1] for row in dataset]))
network = initialize_network(n_inputs, 2, n_outputs)
train_network(network, dataset, 0.5, 20, n_outputs)
for layer in network:
    print(layer)

Output:

>epoch=0, lrate=0.500, error=6.350

>epoch=1, lrate=0.500, error=5.531

>epoch=2, lrate=0.500, error=5.221

>epoch=3, lrate=0.500, error=4.951

>epoch=4, lrate=0.500, error=4.519

>epoch=5, lrate=0.500, error=4.173

>epoch=6, lrate=0.500, error=3.835

>epoch=7, lrate=0.500, error=3.506


>epoch=8, lrate=0.500, error=3.192

>epoch=9, lrate=0.500, error=2.898

>epoch=10, lrate=0.500, error=2.626

>epoch=11, lrate=0.500, error=2.377

>epoch=12, lrate=0.500, error=2.153

>epoch=13, lrate=0.500, error=1.953

>epoch=14, lrate=0.500, error=1.774

>epoch=15, lrate=0.500, error=1.614

>epoch=16, lrate=0.500, error=1.472

>epoch=17, lrate=0.500, error=1.346

>epoch=18, lrate=0.500, error=1.233

>epoch=19, lrate=0.500, error=1.132

[{'weights': [-1.4688375095432327, 1.850887325439514, 1.0858178629550297], 'output': 0.029980305604426185, 'delta': 0.0059546604162323625}, {'weights': [0.37711098142462157, -0.0625909894552989, 0.2765123702642716], 'output': 0.9456229000211323, 'delta': -0.0026279652850863837}]
[{'weights': [2.515394649397849, -0.3391927502445985, -0.9671565426390275], 'output': 0.23648794202357587, 'delta': 0.04270059278364587}, {'weights': [-2.5584149848484263, 1.0036422106209202, 0.42383086467582715], 'output': 0.7790535202438367, 'delta': -0.03803132596437354}]

Ex:no:9 Locally Weighted Regression Algorithm

Aim:
To implement the non-parametric Locally Weighted Regression algorithm in Python in order to fit data points.

Regression:

- Regression is a technique from statistics that is used to predict values of a desired target quantity when the target quantity is continuous.
- In regression, we seek to identify (or estimate) a continuous variable y associated with a given input vector x.
- y is called the dependent variable.
- x is called the independent variable.

Loess/Lowess Regression:

Loess regression is a nonparametric technique that uses local weighted regression to fit a
smooth curve through points in a scatter plot.
Lowess Algorithm:

Locally weighted regression is a very powerful nonparametric model used in statistical learning.

Given a dataset X, y, we attempt to find the model parameters β(x) that minimize the residual sum of weighted squared errors.

The weights are given by a kernel function (k or w), which can be chosen arbitrarily; one standard formulation is written out below.
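Explicitly (a standard formulation consistent with the program below, not printed in the original), the weighted objective, the Gaussian kernel, and the closed-form solution are:

beta(x0) = argmin over beta of  sum_i w_i(x0) * (y_i - x_i^T beta)^2,
w_i(x0)  = exp(-(x_i - x0)^2 / (2*tau))

beta(x0) = (X^T W X)^(-1) X^T W y,   prediction = x0^T beta(x0)

(The program divides by 2*tau, as written here; the more common textbook kernel divides by 2*tau^2.)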

Algorithm

1. Read the given data sample into X and the curve (linear or non-linear) into Y.

2. Set the value of the smoothening parameter (free parameter) τ.

3. Set the point of interest x0, drawn from the domain of X.

4. Determine the weight matrix using: W(i, i) = exp(-(x_i - x0)^2 / (2τ))

5. Determine the value of the model term parameter β using: β = (XᵀWX)⁻¹ XᵀWy

6. Prediction = x0 · β
Python Program

import numpy as np
import matplotlib.pyplot as plt

def local_regression(x0, X, Y, tau):
    # prepend a bias term to the query point and to every data point
    x0 = [1, x0]
    X = [[1, i] for i in X]
    X = np.asarray(X)
    # weighted design matrix: each column of X.T scaled by its kernel weight
    xw = (X.T) * np.exp(np.sum((X - x0) ** 2, axis=1) / (-2 * tau))
    # (X^T W X)^+ X^T W Y gives beta; the final dot with x0 yields the
    # prediction at x0, which is what this function returns
    beta = np.linalg.pinv(xw @ X) @ xw @ Y @ x0
    return beta

def draw(tau):
    prediction = [local_regression(x0, X, Y, tau) for x0 in domain]
    plt.plot(X, Y, 'o', color='black')
    plt.plot(domain, prediction, color='red')
    plt.show()

X = np.linspace(-3, 3, num=1000)
domain = X
Y = np.log(np.abs(X ** 2 - 1) + .5)

draw(10)

Output: (a scatter plot of the data points in black with the locally weighted regression curve drawn in red)

Ex:no:8 KNN

Aim:

To implement k-Nearest Neighbour algorithm to classify the iris data set. Print both correct and wrong
predictions.

Algorithm:

Step 1: Selecting the optimal value of K

K represents the number of nearest neighbors that need to be considered while making a prediction.

Step 2: Calculating distance

To measure the similarity between the target and training data points, the Euclidean distance is used. The distance is calculated between each of the data points in the dataset and the target point.

Step 3: Finding Nearest Neighbors

The k data points with the smallest distances to the target point are the nearest neighbors.

Step 4: Voting for Classification or Taking Average for Regression

In the classification problem, the class labels are determined by performing majority voting. The class with the most occurrences among the neighbors becomes the predicted class for the target data point.

In the regression problem, the predicted value is calculated by taking the average of the target values of the K nearest neighbors. The calculated average becomes the predicted output for the target data point.

Let X be the training dataset with n data points, where each data point is represented by a d-dimensional feature vector, and let Y be the corresponding labels or values for each data point in X. Given a new data point x, the algorithm calculates the distance between x and each data point x_i in X using a distance metric, such as the Euclidean distance:

d(x, x_i) = sqrt((x_1 - x_i1)^2 + (x_2 - x_i2)^2 + ... + (x_d - x_id)^2)

The algorithm selects the K data points from X that have the shortest distances to x. For classification
tasks, the algorithm assigns the label y that is most frequent among the K nearest neighbors to x. For
regression tasks, the algorithm calculates the average or weighted average of the values y of the K
nearest neighbors and assigns it as the predicted value for x.
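A minimal from-scratch sketch of these steps (illustrative only; the lab program below uses scikit-learn's KNeighborsClassifier instead, and the helper name here is hypothetical):

import math
from collections import Counter

# Sketch: classify one query point by majority vote among its k nearest
# training points, using Euclidean distance.
def knn_predict(train_X, train_y, query, k):
    distances = []
    for xi, yi in zip(train_X, train_y):
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(query, xi)))
        distances.append((d, yi))
    distances.sort(key=lambda pair: pair[0])       # nearest first
    top_k_labels = [yi for _, yi in distances[:k]]
    return Counter(top_k_labels).most_common(1)[0][0]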

Program:

from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn import datasets

iris = datasets.load_iris()
print("Iris Data set loaded...")

x_train, x_test, y_train, y_test = train_test_split(iris.data, iris.target, test_size=0.1)
#random_state=0

for i in range(len(iris.target_names)):
    print("Label", i, "-", str(iris.target_names[i]))

classifier = KNeighborsClassifier(n_neighbors=2)
classifier.fit(x_train, y_train)
y_pred = classifier.predict(x_test)

print("Results of Classification using K-nn with K=2")
for r in range(0, len(x_test)):
    print(" Sample:", str(x_test[r]), " Actual-label:", str(y_test[r]), " Predicted-label:", str(y_pred[r]))
print("Classification Accuracy :", classifier.score(x_test, y_test))

Output:

Iris Data set loaded...
Label 0 - setosa
Label 1 - versicolor
Label 2 - virginica
Results of Classification using K-nn with K=2
 Sample: [5. 3.6 1.4 0.2] Actual-label: 0 Predicted-label: 0
 Sample: [4.5 2.3 1.3 0.3] Actual-label: 0 Predicted-label: 0
 Sample: [5.1 3.5 1.4 0.3] Actual-label: 0 Predicted-label: 0
 Sample: [6.1 2.6 5.6 1.4] Actual-label: 2 Predicted-label: 1
 Sample: [4.4 2.9 1.4 0.2] Actual-label: 0 Predicted-label: 0
 Sample: [5.2 3.5 1.5 0.2] Actual-label: 0 Predicted-label: 0
 Sample: [6.2 3.4 5.4 2.3] Actual-label: 2 Predicted-label: 2
 Sample: [4.8 3.4 1.9 0.2] Actual-label: 0 Predicted-label: 0
 Sample: [6.9 3.1 5.4 2.1] Actual-label: 2 Predicted-label: 2
 Sample: [5.6 3. 4.1 1.3] Actual-label: 1 Predicted-label: 1
 Sample: [4.7 3.2 1.6 0.2] Actual-label: 0 Predicted-label: 0
 Sample: [6.3 2.3 4.4 1.3] Actual-label: 1 Predicted-label: 1
 Sample: [5.1 3.4 1.5 0.2] Actual-label: 0 Predicted-label: 0
 Sample: [6. 2.9 4.5 1.5] Actual-label: 1 Predicted-label: 1
 Sample: [5.4 3.9 1.3 0.4] Actual-label: 0 Predicted-label: 0
Classification Accuracy : 0.9333333333333333

Ex:no:7 EM Algorithm and k-Means Clustering

Aim:

To implement EM algorithm to cluster a set of data stored in a .CSV file. Use the same data set for
clustering using the k-Means algorithm. Compare the results of these two algorithms.

Algorithm

1. Import some data to play with.
2. Build the K-Means model.
3. Visualise the clustering results.
4. Plot the original classifications using the petal features.
5. Plot the model's classifications.
6. Run general EM for a GMM.
7. Transform the data so that its distribution has a mean value of 0 and a standard deviation of 1. (A sketch of how the two clusterings can be scored numerically follows this list.)
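Since the aim asks for a comparison, one way to quantify it (a sketch, not part of the original program; it assumes the fitted model, gmm, xs, and y from the program further below) is scikit-learn's adjusted Rand index, which scores each clustering against the true species labels:

from sklearn.metrics import adjusted_rand_score

# Compare each clustering against the true labels; 1.0 is a perfect match.
km_score = adjusted_rand_score(y.Targets, model.labels_)
gmm_score = adjusted_rand_score(y.Targets, gmm.predict(xs))
print('ARI  K-Means: %.3f   GMM: %.3f' % (km_score, gmm_score))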

CSV file:

Id,SepalLengthCm,SepalWidthCm,PetalLengthCm,PetalWidthCm,Species

1,5.1,3.5,1.4,0.2,Iris-setosa

2,4.9,3.0,1.4,0.2,Iris-setosa

3,4.7,3.2,1.3,0.2,Iris-setosa

4,4.6,3.1,1.5,0.2,Iris-setosa

5,5.0,3.6,1.4,0.2,Iris-setosa
6,5.4,3.9,1.7,0.4,Iris-setosa

7,4.6,3.4,1.4,0.3,Iris-setosa

8,5.0,3.4,1.5,0.2,Iris-setosa

9,4.4,2.9,1.4,0.2,Iris-setosa

10,4.9,3.1,1.5,0.1,Iris-setosa

11,5.4,3.7,1.5,0.2,Iris-setosa

12,4.8,3.4,1.6,0.2,Iris-setosa

13,4.8,3.0,1.4,0.1,Iris-setosa

14,4.3,3.0,1.1,0.1,Iris-setosa

15,5.8,4.0,1.2,0.2,Iris-setosa

16,5.7,4.4,1.5,0.4,Iris-setosa

17,5.4,3.9,1.3,0.4,Iris-setosa

18,5.1,3.5,1.4,0.3,Iris-setosa

19,5.7,3.8,1.7,0.3,Iris-setosa

20,5.1,3.8,1.5,0.3,Iris-setosa

21,5.4,3.4,1.7,0.2,Iris-setosa

22,5.1,3.7,1.5,0.4,Iris-setosa

23,4.6,3.6,1.0,0.2,Iris-setosa

24,5.1,3.3,1.7,0.5,Iris-setosa

25,4.8,3.4,1.9,0.2,Iris-setosa

26,5.0,3.0,1.6,0.2,Iris-setosa

27,5.0,3.4,1.6,0.4,Iris-setosa

28,5.2,3.5,1.5,0.2,Iris-setosa

29,5.2,3.4,1.4,0.2,Iris-setosa

30,4.7,3.2,1.6,0.2,Iris-setosa
31,4.8,3.1,1.6,0.2,Iris-setosa

32,5.4,3.4,1.5,0.4,Iris-setosa

33,5.2,4.1,1.5,0.1,Iris-setosa

34,5.5,4.2,1.4,0.2,Iris-setosa

35,4.9,3.1,1.5,0.1,Iris-setosa

36,5.0,3.2,1.2,0.2,Iris-setosa

37,5.5,3.5,1.3,0.2,Iris-setosa

38,4.9,3.1,1.5,0.1,Iris-setosa

39,4.4,3.0,1.3,0.2,Iris-setosa

40,5.1,3.4,1.5,0.2,Iris-setosa

41,5.0,3.5,1.3,0.3,Iris-setosa

42,4.5,2.3,1.3,0.3,Iris-setosa

43,4.4,3.2,1.3,0.2,Iris-setosa

44,5.0,3.5,1.6,0.6,Iris-setosa

45,5.1,3.8,1.9,0.4,Iris-setosa

46,4.8,3.0,1.4,0.3,Iris-setosa

47,5.1,3.8,1.6,0.2,Iris-setosa

48,4.6,3.2,1.4,0.2,Iris-setosa

49,5.3,3.7,1.5,0.2,Iris-setosa

50,5.0,3.3,1.4,0.2,Iris-setosa

51,7.0,3.2,4.7,1.4,Iris-versicolor

52,6.4,3.2,4.5,1.5,Iris-versicolor

53,6.9,3.1,4.9,1.5,Iris-versicolor

54,5.5,2.3,4.0,1.3,Iris-versicolor

55,6.5,2.8,4.6,1.5,Iris-versicolor
56,5.7,2.8,4.5,1.3,Iris-versicolor

57,6.3,3.3,4.7,1.6,Iris-versicolor

58,4.9,2.4,3.3,1.0,Iris-versicolor

59,6.6,2.9,4.6,1.3,Iris-versicolor

60,5.2,2.7,3.9,1.4,Iris-versicolor

61,5.0,2.0,3.5,1.0,Iris-versicolor

62,5.9,3.0,4.2,1.5,Iris-versicolor

63,6.0,2.2,4.0,1.0,Iris-versicolor

64,6.1,2.9,4.7,1.4,Iris-versicolor

65,5.6,2.9,3.6,1.3,Iris-versicolor

66,6.7,3.1,4.4,1.4,Iris-versicolor

67,5.6,3.0,4.5,1.5,Iris-versicolor

68,5.8,2.7,4.1,1.0,Iris-versicolor

69,6.2,2.2,4.5,1.5,Iris-versicolor

70,5.6,2.5,3.9,1.1,Iris-versicolor

71,5.9,3.2,4.8,1.8,Iris-versicolor

72,6.1,2.8,4.0,1.3,Iris-versicolor

73,6.3,2.5,4.9,1.5,Iris-versicolor

74,6.1,2.8,4.7,1.2,Iris-versicolor

75,6.4,2.9,4.3,1.3,Iris-versicolor

76,6.6,3.0,4.4,1.4,Iris-versicolor

77,6.8,2.8,4.8,1.4,Iris-versicolor

78,6.7,3.0,5.0,1.7,Iris-versicolor

79,6.0,2.9,4.5,1.5,Iris-versicolor

80,5.7,2.6,3.5,1.0,Iris-versicolor
81,5.5,2.4,3.8,1.1,Iris-versicolor

82,5.5,2.4,3.7,1.0,Iris-versicolor

83,5.8,2.7,3.9,1.2,Iris-versicolor

84,6.0,2.7,5.1,1.6,Iris-versicolor

85,5.4,3.0,4.5,1.5,Iris-versicolor

86,6.0,3.4,4.5,1.6,Iris-versicolor

87,6.7,3.1,4.7,1.5,Iris-versicolor

88,6.3,2.3,4.4,1.3,Iris-versicolor

89,5.6,3.0,4.1,1.3,Iris-versicolor

90,5.5,2.5,4.0,1.3,Iris-versicolor

91,5.5,2.6,4.4,1.2,Iris-versicolor

92,6.1,3.0,4.6,1.4,Iris-versicolor

93,5.8,2.6,4.0,1.2,Iris-versicolor

94,5.0,2.3,3.3,1.0,Iris-versicolor

95,5.6,2.7,4.2,1.3,Iris-versicolor

96,5.7,3.0,4.2,1.2,Iris-versicolor

97,5.7,2.9,4.2,1.3,Iris-versicolor

98,6.2,2.9,4.3,1.3,Iris-versicolor

99,5.1,2.5,3.0,1.1,Iris-versicolor

100,5.7,2.8,4.1,1.3,Iris-versicolor

101,6.3,3.3,6.0,2.5,Iris-virginica

102,5.8,2.7,5.1,1.9,Iris-virginica

103,7.1,3.0,5.9,2.1,Iris-virginica

104,6.3,2.9,5.6,1.8,Iris-virginica

105,6.5,3.0,5.8,2.2,Iris-virginica
106,7.6,3.0,6.6,2.1,Iris-virginica

107,4.9,2.5,4.5,1.7,Iris-virginica

108,7.3,2.9,6.3,1.8,Iris-virginica

109,6.7,2.5,5.8,1.8,Iris-virginica

110,7.2,3.6,6.1,2.5,Iris-virginica

111,6.5,3.2,5.1,2.0,Iris-virginica

112,6.4,2.7,5.3,1.9,Iris-virginica

113,6.8,3.0,5.5,2.1,Iris-virginica

114,5.7,2.5,5.0,2.0,Iris-virginica

115,5.8,2.8,5.1,2.4,Iris-virginica

116,6.4,3.2,5.3,2.3,Iris-virginica

117,6.5,3.0,5.5,1.8,Iris-virginica

118,7.7,3.8,6.7,2.2,Iris-virginica

119,7.7,2.6,6.9,2.3,Iris-virginica

120,6.0,2.2,5.0,1.5,Iris-virginica

121,6.9,3.2,5.7,2.3,Iris-virginica

122,5.6,2.8,4.9,2.0,Iris-virginica

123,7.7,2.8,6.7,2.0,Iris-virginica

124,6.3,2.7,4.9,1.8,Iris-virginica

125,6.7,3.3,5.7,2.1,Iris-virginica

126,7.2,3.2,6.0,1.8,Iris-virginica

127,6.2,2.8,4.8,1.8,Iris-virginica

128,6.1,3.0,4.9,1.8,Iris-virginica

129,6.4,2.8,5.6,2.1,Iris-virginica

130,7.2,3.0,5.8,1.6,Iris-virginica
131,7.4,2.8,6.1,1.9,Iris-virginica

132,7.9,3.8,6.4,2.0,Iris-virginica

133,6.4,2.8,5.6,2.2,Iris-virginica

134,6.3,2.8,5.1,1.5,Iris-virginica

135,6.1,2.6,5.6,1.4,Iris-virginica

136,7.7,3.0,6.1,2.3,Iris-virginica

137,6.3,3.4,5.6,2.4,Iris-virginica

138,6.4,3.1,5.5,1.8,Iris-virginica

139,6.0,3.0,4.8,1.8,Iris-virginica

140,6.9,3.1,5.4,2.1,Iris-virginica

141,6.7,3.1,5.6,2.4,Iris-virginica

142,6.9,3.1,5.1,2.3,Iris-virginica

143,5.8,2.7,5.1,1.9,Iris-virginica

144,6.8,3.2,5.9,2.3,Iris-virginica

145,6.7,3.3,5.7,2.5,Iris-virginica

146,6.7,3.0,5.2,2.3,Iris-virginica

147,6.3,2.5,5.0,1.9,Iris-virginica

148,6.5,3.0,5.2,2.0,Iris-virginica

149,6.2,3.4,5.4,2.3,Iris-virginica

150,5.9,3.0,5.1,1.8,Iris-virginica

Program:

import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.cluster import KMeans
import pandas as pd
import numpy as np

# import some data to play with
iris = datasets.load_iris()
X = pd.DataFrame(iris.data)
X.columns = ['Sepal_Length','Sepal_Width','Petal_Length','Petal_Width']
y = pd.DataFrame(iris.target)
y.columns = ['Targets']

# Build the K Means Model
model = KMeans(n_clusters=3)
model.fit(X)  # model.labels_ gives the cluster number each sample belongs to

# Visualise the clustering results
plt.figure(figsize=(14,7))
colormap = np.array(['red', 'lime', 'black'])

# Plot the Original Classifications using Petal features
plt.subplot(1, 3, 1)
plt.scatter(X.Petal_Length, X.Petal_Width, c=colormap[y.Targets], s=40)
plt.title('Real Clusters')
plt.xlabel('Petal Length')
plt.ylabel('Petal Width')

# Plot the Models Classifications
plt.subplot(1, 3, 2)
plt.scatter(X.Petal_Length, X.Petal_Width, c=colormap[model.labels_], s=40)
plt.title('K-Means Clustering')
plt.xlabel('Petal Length')
plt.ylabel('Petal Width')

# General EM for GMM
from sklearn import preprocessing

# transform the data so that its distribution has a mean value of 0
# and a standard deviation of 1
scaler = preprocessing.StandardScaler()
scaler.fit(X)
xsa = scaler.transform(X)
xs = pd.DataFrame(xsa, columns=X.columns)

from sklearn.mixture import GaussianMixture
gmm = GaussianMixture(n_components=3)  # one mixture component per species
gmm.fit(xs)
plt.subplot(1, 3, 3)
plt.scatter(X.Petal_Length, X.Petal_Width, c=colormap[gmm.predict(xs)], s=40)
plt.title('GMM Clustering')
plt.xlabel('Petal Length')
plt.ylabel('Petal Width')

print('Observation: The GMM using EM algorithm based clustering matched the true labels more closely than the Kmeans.')
Output: (three scatter subplots on the petal features: Real Clusters, K-Means Clustering, and GMM Clustering)

Ex:no:5 Naïve Bayesian Classifier

Aim:

To implement a naïve Bayesian classifier model to classify a set of documents and measure the accuracy, precision, and recall.

Algorithm

1. Import the necessary libraries.

2. Load the data from the CSV file.

3. Convert the string values to numbers.

4. Perform classification using GaussianNB().

5. Display the result. (A minimal library-based sketch of these steps follows.)
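For orientation (a sketch only; the lab program below builds the classifier from scratch, whereas this version leans on scikit-learn's GaussianNB), the five steps above could be carried out as:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, precision_score, recall_score

# naivedata.csv: 8 numeric feature columns, 0/1 class label in the last column
data = np.loadtxt('naivedata.csv', delimiter=',')
X, y = data[:, :-1], data[:, -1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

model = GaussianNB()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

print('Accuracy :', accuracy_score(y_test, y_pred))
print('Precision:', precision_score(y_test, y_pred))
print('Recall   :', recall_score(y_test, y_pred))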

CSV file: naivedata.csv

6,148,72,35,0,33.6,0.627,50,1

1,85,66,29,0,26.6,0.351,31,0

8,183,64,0,0,23.3,0.672,32,1

1,89,66,23,94,28.1,0.167,21,0
0,137,40,35,168,43.1,2.288,33,1

5,116,74,0,0,25.6,0.201,30,0

3,78,50,32,88,31.0,0.248,26,1

10,115,0,0,0,35.3,0.134,29,0

2,197,70,45,543,30.5,0.158,53,1

8,125,96,0,0,0.0,0.232,54,1

4,110,92,0,0,37.6,0.191,30,0

10,168,74,0,0,38.0,0.537,34,1

10,139,80,0,0,27.1,1.441,57,0

1,189,60,23,846,30.1,0.398,59,1

5,166,72,19,175,25.8,0.587,51,1

7,100,0,0,0,30.0,0.484,32,1

0,118,84,47,230,45.8,0.551,31,1

7,107,74,0,0,29.6,0.254,31,1

1,103,30,38,83,43.3,0.183,33,0

1,115,70,30,96,34.6,0.529,32,1

3,126,88,41,235,39.3,0.704,27,0

8,99,84,0,0,35.4,0.388,50,0

7,196,90,0,0,39.8,0.451,41,1

9,119,80,35,0,29.0,0.263,29,1

11,143,94,33,146,36.6,0.254,51,1

10,125,70,26,115,31.1,0.205,41,1

7,147,76,0,0,39.4,0.257,43,1

1,97,66,15,140,23.2,0.487,22,0

13,145,82,19,110,22.2,0.245,57,0
5,117,92,0,0,34.1,0.337,38,0

5,109,75,26,0,36.0,0.546,60,0

3,158,76,36,245,31.6,0.851,28,1

3,88,58,11,54,24.8,0.267,22,0

6,92,92,0,0,19.9,0.188,28,0

10,122,78,31,0,27.6,0.512,45,0

4,103,60,33,192,24.0,0.966,33,0

11,138,76,0,0,33.2,0.420,35,0

9,102,76,37,0,32.9,0.665,46,1

2,90,68,42,0,38.2,0.503,27,1

4,111,72,47,207,37.1,1.390,56,1

3,180,64,25,70,34.0,0.271,26,0

7,133,84,0,0,40.2,0.696,37,0

7,106,92,18,0,22.7,0.235,48,0

9,171,110,24,240,45.4,0.721,54,1

7,159,64,0,0,27.4,0.294,40,0

0,180,66,39,0,42.0,1.893,25,1

1,146,56,0,0,29.7,0.564,29,0

2,71,70,27,0,28.0,0.586,22,0

7,103,66,32,0,39.1,0.344,31,1

7,105,0,0,0,0.0,0.305,24,0

1,103,80,11,82,19.4,0.491,22,0

1,101,50,15,36,24.2,0.526,26,0

5,88,66,21,23,24.4,0.342,30,0

8,176,90,34,300,33.7,0.467,58,1
7,150,66,42,342,34.7,0.718,42,0

1,73,50,10,0,23.0,0.248,21,0

7,187,68,39,304,37.7,0.254,41,1

0,100,88,60,110,46.8,0.962,31,0

0,146,82,0,0,40.5,1.781,44,0

0,105,64,41,142,41.5,0.173,22,0

2,84,0,0,0,0.0,0.304,21,0

8,133,72,0,0,32.9,0.270,39,1

5,44,62,0,0,25.0,0.587,36,0

2,141,58,34,128,25.4,0.699,24,0

7,114,66,0,0,32.8,0.258,42,1

5,99,74,27,0,29.0,0.203,32,0

0,109,88,30,0,32.5,0.855,38,1

2,109,92,0,0,42.7,0.845,54,0

1,95,66,13,38,19.6,0.334,25,0

4,146,85,27,100,28.9,0.189,27,0

2,100,66,20,90,32.9,0.867,28,1

5,139,64,35,140,28.6,0.411,26,0

13,126,90,0,0,43.4,0.583,42,1

4,129,86,20,270,35.1,0.231,23,0

1,79,75,30,0,32.0,0.396,22,0

1,0,48,20,0,24.7,0.140,22,0

7,62,78,0,0,32.6,0.391,41,0

5,95,72,33,0,37.7,0.370,27,0

0,131,0,0,0,43.2,0.270,26,1
2,112,66,22,0,25.0,0.307,24,0

3,113,44,13,0,22.4,0.140,22,0

2,74,0,0,0,0.0,0.102,22,0

7,83,78,26,71,29.3,0.767,36,0

0,101,65,28,0,24.6,0.237,22,0

5,137,108,0,0,48.8,0.227,37,1

2,110,74,29,125,32.4,0.698,27,0

13,106,72,54,0,36.6,0.178,45,0

2,100,68,25,71,38.5,0.324,26,0

15,136,70,32,110,37.1,0.153,43,1

1,107,68,19,0,26.5,0.165,24,0

1,80,55,0,0,19.1,0.258,21,0

4,123,80,15,176,32.0,0.443,34,0

7,81,78,40,48,46.7,0.261,42,0

4,134,72,0,0,23.8,0.277,60,1

2,142,82,18,64,24.7,0.761,21,0

6,144,72,27,228,33.9,0.255,40,0

2,92,62,28,0,31.6,0.130,24,0

1,71,48,18,76,20.4,0.323,22,0

6,93,50,30,64,28.7,0.356,23,0

1,122,90,51,220,49.7,0.325,31,1

1,163,72,0,0,39.0,1.222,33,1

1,151,60,0,0,26.1,0.179,22,0

0,125,96,0,0,22.5,0.262,21,0

1,81,72,18,40,26.6,0.283,24,0
2,85,65,0,0,39.6,0.930,27,0

1,126,56,29,152,28.7,0.801,21,0

1,96,122,0,0,22.4,0.207,27,0

4,144,58,28,140,29.5,0.287,37,0

3,83,58,31,18,34.3,0.336,25,0

0,95,85,25,36,37.4,0.247,24,1

3,171,72,33,135,33.3,0.199,24,1

8,155,62,26,495,34.0,0.543,46,1

1,89,76,34,37,31.2,0.192,23,0

4,76,62,0,0,34.0,0.391,25,0

7,160,54,32,175,30.5,0.588,39,1

4,146,92,0,0,31.2,0.539,61,1

5,124,74,0,0,34.0,0.220,38,1

5,78,48,0,0,33.7,0.654,25,0

4,97,60,23,0,28.2,0.443,22,0

4,99,76,15,51,23.2,0.223,21,0

0,162,76,56,100,53.2,0.759,25,1

6,111,64,39,0,34.2,0.260,24,0

2,107,74,30,100,33.6,0.404,23,0

5,132,80,0,0,26.8,0.186,69,0

0,113,76,0,0,33.3,0.278,23,1

1,88,30,42,99,55.0,0.496,26,1

3,120,70,30,135,42.9,0.452,30,0

1,118,58,36,94,33.3,0.261,23,0

1,117,88,24,145,34.5,0.403,40,1
0,105,84,0,0,27.9,0.741,62,1

4,173,70,14,168,29.7,0.361,33,1

9,122,56,0,0,33.3,1.114,33,1

3,170,64,37,225,34.5,0.356,30,1

8,84,74,31,0,38.3,0.457,39,0

2,96,68,13,49,21.1,0.647,26,0

2,125,60,20,140,33.8,0.088,31,0

0,100,70,26,50,30.8,0.597,21,0

0,93,60,25,92,28.7,0.532,22,0

0,129,80,0,0,31.2,0.703,29,0

5,105,72,29,325,36.9,0.159,28,0

3,128,78,0,0,21.1,0.268,55,0

5,106,82,30,0,39.5,0.286,38,0

2,108,52,26,63,32.5,0.318,22,0

10,108,66,0,0,32.4,0.272,42,1

4,154,62,31,284,32.8,0.237,23,0

0,102,75,23,0,0.0,0.572,21,0

9,57,80,37,0,32.8,0.096,41,0

2,106,64,35,119,30.5,1.400,34,0

5,147,78,0,0,33.7,0.218,65,0

2,90,70,17,0,27.3,0.085,22,0

1,136,74,50,204,37.4,0.399,24,0

4,114,65,0,0,21.9,0.432,37,0

9,156,86,28,155,34.3,1.189,42,1

1,153,82,42,485,40.6,0.687,23,0
8,188,78,0,0,47.9,0.137,43,1

7,152,88,44,0,50.0,0.337,36,1

2,99,52,15,94,24.6,0.637,21,0

1,109,56,21,135,25.2,0.833,23,0

2,88,74,19,53,29.0,0.229,22,0

17,163,72,41,114,40.9,0.817,47,1

4,151,90,38,0,29.7,0.294,36,0

7,102,74,40,105,37.2,0.204,45,0

0,114,80,34,285,44.2,0.167,27,0

2,100,64,23,0,29.7,0.368,21,0

0,131,88,0,0,31.6,0.743,32,1

6,104,74,18,156,29.9,0.722,41,1

3,148,66,25,0,32.5,0.256,22,0

4,120,68,0,0,29.6,0.709,34,0

4,110,66,0,0,31.9,0.471,29,0

3,111,90,12,78,28.4,0.495,29,0

6,102,82,0,0,30.8,0.180,36,1

6,134,70,23,130,35.4,0.542,29,1

2,87,0,23,0,28.9,0.773,25,0

1,79,60,42,48,43.5,0.678,23,0

2,75,64,24,55,29.7,0.370,33,0

8,179,72,42,130,32.7,0.719,36,1

6,85,78,0,0,31.2,0.382,42,0

0,129,110,46,130,67.1,0.319,26,1

5,143,78,0,0,45.0,0.190,47,0
5,130,82,0,0,39.1,0.956,37,1

6,87,80,0,0,23.2,0.084,32,0

0,119,64,18,92,34.9,0.725,23,0

1,0,74,20,23,27.7,0.299,21,0

5,73,60,0,0,26.8,0.268,27,0

4,141,74,0,0,27.6,0.244,40,0

7,194,68,28,0,35.9,0.745,41,1

8,181,68,36,495,30.1,0.615,60,1

1,128,98,41,58,32.0,1.321,33,1

8,109,76,39,114,27.9,0.640,31,1

5,139,80,35,160,31.6,0.361,25,1

3,111,62,0,0,22.6,0.142,21,0

9,123,70,44,94,33.1,0.374,40,0

7,159,66,0,0,30.4,0.383,36,1

11,135,0,0,0,52.3,0.578,40,1

8,85,55,20,0,24.4,0.136,42,0

5,158,84,41,210,39.4,0.395,29,1

1,105,58,0,0,24.3,0.187,21,0

3,107,62,13,48,22.9,0.678,23,1

4,109,64,44,99,34.8,0.905,26,1

4,148,60,27,318,30.9,0.150,29,1

0,113,80,16,0,31.0,0.874,21,0

1,138,82,0,0,40.1,0.236,28,0

0,108,68,20,0,27.3,0.787,32,0

2,99,70,16,44,20.4,0.235,27,0
6,103,72,32,190,37.7,0.324,55,0

5,111,72,28,0,23.9,0.407,27,0

8,196,76,29,280,37.5,0.605,57,1

5,162,104,0,0,37.7,0.151,52,1

1,96,64,27,87,33.2,0.289,21,0

7,184,84,33,0,35.5,0.355,41,1

2,81,60,22,0,27.7,0.290,25,0

0,147,85,54,0,42.8,0.375,24,0

7,179,95,31,0,34.2,0.164,60,0

0,140,65,26,130,42.6,0.431,24,1

9,112,82,32,175,34.2,0.260,36,1

12,151,70,40,271,41.8,0.742,38,1

5,109,62,41,129,35.8,0.514,25,1

6,125,68,30,120,30.0,0.464,32,0

5,85,74,22,0,29.0,1.224,32,1

5,112,66,0,0,37.8,0.261,41,1

0,177,60,29,478,34.6,1.072,21,1

2,158,90,0,0,31.6,0.805,66,1

7,119,0,0,0,25.2,0.209,37,0

7,142,60,33,190,28.8,0.687,61,0

1,100,66,15,56,23.6,0.666,26,0

1,87,78,27,32,34.6,0.101,22,0

0,101,76,0,0,35.7,0.198,26,0

3,162,52,38,0,37.2,0.652,24,1

4,197,70,39,744,36.7,2.329,31,0
0,117,80,31,53,45.2,0.089,24,0

4,142,86,0,0,44.0,0.645,22,1

6,134,80,37,370,46.2,0.238,46,1

1,79,80,25,37,25.4,0.583,22,0

4,122,68,0,0,35.0,0.394,29,0

3,74,68,28,45,29.7,0.293,23,0

4,171,72,0,0,43.6,0.479,26,1

7,181,84,21,192,35.9,0.586,51,1

0,179,90,27,0,44.1,0.686,23,1

9,164,84,21,0,30.8,0.831,32,1

0,104,76,0,0,18.4,0.582,27,0

1,91,64,24,0,29.2,0.192,21,0

4,91,70,32,88,33.1,0.446,22,0

3,139,54,0,0,25.6,0.402,22,1

6,119,50,22,176,27.1,1.318,33,1

2,146,76,35,194,38.2,0.329,29,0

9,184,85,15,0,30.0,1.213,49,1

10,122,68,0,0,31.2,0.258,41,0

0,165,90,33,680,52.3,0.427,23,0

9,124,70,33,402,35.4,0.282,34,0

1,111,86,19,0,30.1,0.143,23,0

9,106,52,0,0,31.2,0.380,42,0

2,129,84,0,0,28.0,0.284,27,0

2,90,80,14,55,24.4,0.249,24,0

0,86,68,32,0,35.8,0.238,25,0
12,92,62,7,258,27.6,0.926,44,1

1,113,64,35,0,33.6,0.543,21,1

3,111,56,39,0,30.1,0.557,30,0

2,114,68,22,0,28.7,0.092,25,0

1,193,50,16,375,25.9,0.655,24,0

11,155,76,28,150,33.3,1.353,51,1

3,191,68,15,130,30.9,0.299,34,0

3,141,0,0,0,30.0,0.761,27,1

4,95,70,32,0,32.1,0.612,24,0

3,142,80,15,0,32.4,0.200,63,0

4,123,62,0,0,32.0,0.226,35,1

5,96,74,18,67,33.6,0.997,43,0

0,138,0,0,0,36.3,0.933,25,1

2,128,64,42,0,40.0,1.101,24,0

0,102,52,0,0,25.1,0.078,21,0

2,146,0,0,0,27.5,0.240,28,1

10,101,86,37,0,45.6,1.136,38,1

2,108,62,32,56,25.2,0.128,21,0

3,122,78,0,0,23.0,0.254,40,0

1,71,78,50,45,33.2,0.422,21,0

13,106,70,0,0,34.2,0.251,52,0

2,100,70,52,57,40.5,0.677,25,0

7,106,60,24,0,26.5,0.296,29,1

0,104,64,23,116,27.8,0.454,23,0

5,114,74,0,0,24.9,0.744,57,0
2,108,62,10,278,25.3,0.881,22,0

0,146,70,0,0,37.9,0.334,28,1

10,129,76,28,122,35.9,0.280,39,0

7,133,88,15,155,32.4,0.262,37,0

7,161,86,0,0,30.4,0.165,47,1

2,108,80,0,0,27.0,0.259,52,1

7,136,74,26,135,26.0,0.647,51,0

5,155,84,44,545,38.7,0.619,34,0

1,119,86,39,220,45.6,0.808,29,1

4,96,56,17,49,20.8,0.340,26,0

5,108,72,43,75,36.1,0.263,33,0

0,78,88,29,40,36.9,0.434,21,0

0,107,62,30,74,36.6,0.757,25,1

2,128,78,37,182,43.3,1.224,31,1

1,128,48,45,194,40.5,0.613,24,1

0,161,50,0,0,21.9,0.254,65,0

6,151,62,31,120,35.5,0.692,28,0

2,146,70,38,360,28.0,0.337,29,1

0,126,84,29,215,30.7,0.520,24,0

14,100,78,25,184,36.6,0.412,46,1

8,112,72,0,0,23.6,0.840,58,0

0,167,0,0,0,32.3,0.839,30,1

2,144,58,33,135,31.6,0.422,25,1

5,77,82,41,42,35.8,0.156,35,0

5,115,98,0,0,52.9,0.209,28,1
3,150,76,0,0,21.0,0.207,37,0

2,120,76,37,105,39.7,0.215,29,0

10,161,68,23,132,25.5,0.326,47,1

0,137,68,14,148,24.8,0.143,21,0

0,128,68,19,180,30.5,1.391,25,1

2,124,68,28,205,32.9,0.875,30,1

6,80,66,30,0,26.2,0.313,41,0

0,106,70,37,148,39.4,0.605,22,0

2,155,74,17,96,26.6,0.433,27,1

3,113,50,10,85,29.5,0.626,25,0

7,109,80,31,0,35.9,1.127,43,1

2,112,68,22,94,34.1,0.315,26,0

3,99,80,11,64,19.3,0.284,30,0

3,182,74,0,0,30.5,0.345,29,1

3,115,66,39,140,38.1,0.150,28,0

6,194,78,0,0,23.5,0.129,59,1

4,129,60,12,231,27.5,0.527,31,0

3,112,74,30,0,31.6,0.197,25,1

0,124,70,20,0,27.4,0.254,36,1

13,152,90,33,29,26.8,0.731,43,1

2,112,75,32,0,35.7,0.148,21,0

1,157,72,21,168,25.6,0.123,24,0

1,122,64,32,156,35.1,0.692,30,1

10,179,70,0,0,35.1,0.200,37,0

2,102,86,36,120,45.5,0.127,23,1
6,105,70,32,68,30.8,0.122,37,0

8,118,72,19,0,23.1,1.476,46,0

2,87,58,16,52,32.7,0.166,25,0

1,180,0,0,0,43.3,0.282,41,1

12,106,80,0,0,23.6,0.137,44,0

1,95,60,18,58,23.9,0.260,22,0

0,165,76,43,255,47.9,0.259,26,0

0,117,0,0,0,33.8,0.932,44,0

5,115,76,0,0,31.2,0.343,44,1

9,152,78,34,171,34.2,0.893,33,1

7,178,84,0,0,39.9,0.331,41,1

1,130,70,13,105,25.9,0.472,22,0

1,95,74,21,73,25.9,0.673,36,0

1,0,68,35,0,32.0,0.389,22,0

5,122,86,0,0,34.7,0.290,33,0

8,95,72,0,0,36.8,0.485,57,0

8,126,88,36,108,38.5,0.349,49,0

1,139,46,19,83,28.7,0.654,22,0

3,116,0,0,0,23.5,0.187,23,0

3,99,62,19,74,21.8,0.279,26,0

5,0,80,32,0,41.0,0.346,37,1

4,92,80,0,0,42.2,0.237,29,0

4,137,84,0,0,31.2,0.252,30,0

3,61,82,28,0,34.4,0.243,46,0

1,90,62,12,43,27.2,0.580,24,0
3,90,78,0,0,42.7,0.559,21,0

9,165,88,0,0,30.4,0.302,49,1

1,125,50,40,167,33.3,0.962,28,1

13,129,0,30,0,39.9,0.569,44,1

12,88,74,40,54,35.3,0.378,48,0

1,196,76,36,249,36.5,0.875,29,1

5,189,64,33,325,31.2,0.583,29,1

5,158,70,0,0,29.8,0.207,63,0

5,103,108,37,0,39.2,0.305,65,0

4,146,78,0,0,38.5,0.520,67,1

4,147,74,25,293,34.9,0.385,30,0

5,99,54,28,83,34.0,0.499,30,0

6,124,72,0,0,27.6,0.368,29,1

0,101,64,17,0,21.0,0.252,21,0

3,81,86,16,66,27.5,0.306,22,0

1,133,102,28,140,32.8,0.234,45,1

3,173,82,48,465,38.4,2.137,25,1

0,118,64,23,89,0.0,1.731,21,0

0,84,64,22,66,35.8,0.545,21,0

2,105,58,40,94,34.9,0.225,25,0

2,122,52,43,158,36.2,0.816,28,0

12,140,82,43,325,39.2,0.528,58,1

0,98,82,15,84,25.2,0.299,22,0

1,87,60,37,75,37.2,0.509,22,0

4,156,75,0,0,48.3,0.238,32,1
0,93,100,39,72,43.4,1.021,35,0

1,107,72,30,82,30.8,0.821,24,0

0,105,68,22,0,20.0,0.236,22,0

1,109,60,8,182,25.4,0.947,21,0

1,90,62,18,59,25.1,1.268,25,0

1,125,70,24,110,24.3,0.221,25,0

1,119,54,13,50,22.3,0.205,24,0

5,116,74,29,0,32.3,0.660,35,1

8,105,100,36,0,43.3,0.239,45,1

5,144,82,26,285,32.0,0.452,58,1

3,100,68,23,81,31.6,0.949,28,0

1,100,66,29,196,32.0,0.444,42,0

5,166,76,0,0,45.7,0.340,27,1

1,131,64,14,415,23.7,0.389,21,0

4,116,72,12,87,22.1,0.463,37,0

4,158,78,0,0,32.9,0.803,31,1

2,127,58,24,275,27.7,1.600,25,0

3,96,56,34,115,24.7,0.944,39,0

0,131,66,40,0,34.3,0.196,22,1

3,82,70,0,0,21.1,0.389,25,0

3,193,70,31,0,34.9,0.241,25,1

4,95,64,0,0,32.0,0.161,31,1

6,137,61,0,0,24.2,0.151,55,0

5,136,84,41,88,35.0,0.286,35,1

9,72,78,25,0,31.6,0.280,38,0
5,168,64,0,0,32.9,0.135,41,1

2,123,48,32,165,42.1,0.520,26,0

4,115,72,0,0,28.9,0.376,46,1

0,101,62,0,0,21.9,0.336,25,0

8,197,74,0,0,25.9,1.191,39,1

1,172,68,49,579,42.4,0.702,28,1

6,102,90,39,0,35.7,0.674,28,0

1,112,72,30,176,34.4,0.528,25,0

1,143,84,23,310,42.4,1.076,22,0

1,143,74,22,61,26.2,0.256,21,0

0,138,60,35,167,34.6,0.534,21,1

3,173,84,33,474,35.7,0.258,22,1

1,97,68,21,0,27.2,1.095,22,0

4,144,82,32,0,38.5,0.554,37,1

1,83,68,0,0,18.2,0.624,27,0

3,129,64,29,115,26.4,0.219,28,1

1,119,88,41,170,45.3,0.507,26,0

2,94,68,18,76,26.0,0.561,21,0

0,102,64,46,78,40.6,0.496,21,0

2,115,64,22,0,30.8,0.421,21,0

8,151,78,32,210,42.9,0.516,36,1

4,184,78,39,277,37.0,0.264,31,1

0,94,0,0,0,0.0,0.256,25,0

1,181,64,30,180,34.1,0.328,38,1

0,135,94,46,145,40.6,0.284,26,0
1,95,82,25,180,35.0,0.233,43,1

2,99,0,0,0,22.2,0.108,23,0

3,89,74,16,85,30.4,0.551,38,0

1,80,74,11,60,30.0,0.527,22,0

2,139,75,0,0,25.6,0.167,29,0

1,90,68,8,0,24.5,1.138,36,0

0,141,0,0,0,42.4,0.205,29,1

12,140,85,33,0,37.4,0.244,41,0

5,147,75,0,0,29.9,0.434,28,0

1,97,70,15,0,18.2,0.147,21,0

6,107,88,0,0,36.8,0.727,31,0

0,189,104,25,0,34.3,0.435,41,1

2,83,66,23,50,32.2,0.497,22,0

4,117,64,27,120,33.2,0.230,24,0

8,108,70,0,0,30.5,0.955,33,1

4,117,62,12,0,29.7,0.380,30,1

0,180,78,63,14,59.4,2.420,25,1

1,100,72,12,70,25.3,0.658,28,0

0,95,80,45,92,36.5,0.330,26,0

0,104,64,37,64,33.6,0.510,22,1

0,120,74,18,63,30.5,0.285,26,0

1,82,64,13,95,21.2,0.415,23,0

2,134,70,0,0,28.9,0.542,23,1

0,91,68,32,210,39.9,0.381,25,0

2,119,0,0,0,19.6,0.832,72,0
2,100,54,28,105,37.8,0.498,24,0

14,175,62,30,0,33.6,0.212,38,1

1,135,54,0,0,26.7,0.687,62,0

5,86,68,28,71,30.2,0.364,24,0

10,148,84,48,237,37.6,1.001,51,1

9,134,74,33,60,25.9,0.460,81,0

9,120,72,22,56,20.8,0.733,48,0

1,71,62,0,0,21.8,0.416,26,0

8,74,70,40,49,35.3,0.705,39,0

5,88,78,30,0,27.6,0.258,37,0

10,115,98,0,0,24.0,1.022,34,0

0,124,56,13,105,21.8,0.452,21,0

0,74,52,10,36,27.8,0.269,22,0

0,97,64,36,100,36.8,0.600,25,0

8,120,0,0,0,30.0,0.183,38,1

6,154,78,41,140,46.1,0.571,27,0

1,144,82,40,0,41.3,0.607,28,0

0,137,70,38,0,33.2,0.170,22,0

0,119,66,27,0,38.8,0.259,22,0

7,136,90,0,0,29.9,0.210,50,0

4,114,64,0,0,28.9,0.126,24,0

0,137,84,27,0,27.3,0.231,59,0

2,105,80,45,191,33.7,0.711,29,1

7,114,76,17,110,23.8,0.466,31,0

8,126,74,38,75,25.9,0.162,39,0
4,132,86,31,0,28.0,0.419,63,0

3,158,70,30,328,35.5,0.344,35,1

0,123,88,37,0,35.2,0.197,29,0

4,85,58,22,49,27.8,0.306,28,0

0,84,82,31,125,38.2,0.233,23,0

0,145,0,0,0,44.2,0.630,31,1

0,135,68,42,250,42.3,0.365,24,1

1,139,62,41,480,40.7,0.536,21,0

0,173,78,32,265,46.5,1.159,58,0

4,99,72,17,0,25.6,0.294,28,0

8,194,80,0,0,26.1,0.551,67,0

2,83,65,28,66,36.8,0.629,24,0

2,89,90,30,0,33.5,0.292,42,0

4,99,68,38,0,32.8,0.145,33,0

4,125,70,18,122,28.9,1.144,45,1

3,80,0,0,0,0.0,0.174,22,0

6,166,74,0,0,26.6,0.304,66,0

5,110,68,0,0,26.0,0.292,30,0

2,81,72,15,76,30.1,0.547,25,0

7,195,70,33,145,25.1,0.163,55,1

6,154,74,32,193,29.3,0.839,39,0

2,117,90,19,71,25.2,0.313,21,0

3,84,72,32,0,37.2,0.267,28,0

6,0,68,41,0,39.0,0.727,41,1

7,94,64,25,79,33.3,0.738,41,0
3,96,78,39,0,37.3,0.238,40,0

10,75,82,0,0,33.3,0.263,38,0

0,180,90,26,90,36.5,0.314,35,1

1,130,60,23,170,28.6,0.692,21,0

2,84,50,23,76,30.4,0.968,21,0

8,120,78,0,0,25.0,0.409,64,0

12,84,72,31,0,29.7,0.297,46,1

0,139,62,17,210,22.1,0.207,21,0

9,91,68,0,0,24.2,0.200,58,0

2,91,62,0,0,27.3,0.525,22,0

3,99,54,19,86,25.6,0.154,24,0

3,163,70,18,105,31.6,0.268,28,1

9,145,88,34,165,30.3,0.771,53,1

7,125,86,0,0,37.6,0.304,51,0

13,76,60,0,0,32.8,0.180,41,0

6,129,90,7,326,19.6,0.582,60,0

2,68,70,32,66,25.0,0.187,25,0

3,124,80,33,130,33.2,0.305,26,0

6,114,0,0,0,0.0,0.189,26,0

9,130,70,0,0,34.2,0.652,45,1

3,125,58,0,0,31.6,0.151,24,0

3,87,60,18,0,21.8,0.444,21,0

1,97,64,19,82,18.2,0.299,21,0

3,116,74,15,105,26.3,0.107,24,0

0,117,66,31,188,30.8,0.493,22,0
0,111,65,0,0,24.6,0.660,31,0

2,122,60,18,106,29.8,0.717,22,0

0,107,76,0,0,45.3,0.686,24,0

1,86,66,52,65,41.3,0.917,29,0

6,91,0,0,0,29.8,0.501,31,0

1,77,56,30,56,33.3,1.251,24,0

4,132,0,0,0,32.9,0.302,23,1

0,105,90,0,0,29.6,0.197,46,0

0,57,60,0,0,21.7,0.735,67,0

0,127,80,37,210,36.3,0.804,23,0

3,129,92,49,155,36.4,0.968,32,1

8,100,74,40,215,39.4,0.661,43,1

3,128,72,25,190,32.4,0.549,27,1

10,90,85,32,0,34.9,0.825,56,1

4,84,90,23,56,39.5,0.159,25,0

1,88,78,29,76,32.0,0.365,29,0

8,186,90,35,225,34.5,0.423,37,1

5,187,76,27,207,43.6,1.034,53,1

4,131,68,21,166,33.1,0.160,28,0

1,164,82,43,67,32.8,0.341,50,0

4,189,110,31,0,28.5,0.680,37,0

1,116,70,28,0,27.4,0.204,21,0

3,84,68,30,106,31.9,0.591,25,0

6,114,88,0,0,27.8,0.247,66,0

1,88,62,24,44,29.9,0.422,23,0
1,84,64,23,115,36.9,0.471,28,0

7,124,70,33,215,25.5,0.161,37,0

1,97,70,40,0,38.1,0.218,30,0

8,110,76,0,0,27.8,0.237,58,0

11,103,68,40,0,46.2,0.126,42,0

11,85,74,0,0,30.1,0.300,35,0

6,125,76,0,0,33.8,0.121,54,1

0,198,66,32,274,41.3,0.502,28,1

1,87,68,34,77,37.6,0.401,24,0

6,99,60,19,54,26.9,0.497,32,0

0,91,80,0,0,32.4,0.601,27,0

2,95,54,14,88,26.1,0.748,22,0

1,99,72,30,18,38.6,0.412,21,0

6,92,62,32,126,32.0,0.085,46,0

4,154,72,29,126,31.3,0.338,37,0

0,121,66,30,165,34.3,0.203,33,1

3,78,70,0,0,32.5,0.270,39,0

2,130,96,0,0,22.6,0.268,21,0

3,111,58,31,44,29.5,0.430,22,0

2,98,60,17,120,34.7,0.198,22,0

1,143,86,30,330,30.1,0.892,23,0

1,119,44,47,63,35.5,0.280,25,0

6,108,44,20,130,24.0,0.813,35,0

2,118,80,0,0,42.9,0.693,21,1

10,133,68,0,0,27.0,0.245,36,0
2,197,70,99,0,34.7,0.575,62,1

0,151,90,46,0,42.1,0.371,21,1

6,109,60,27,0,25.0,0.206,27,0

12,121,78,17,0,26.5,0.259,62,0

8,100,76,0,0,38.7,0.190,42,0

8,124,76,24,600,28.7,0.687,52,1

1,93,56,11,0,22.5,0.417,22,0

8,143,66,0,0,34.9,0.129,41,1

6,103,66,0,0,24.3,0.249,29,0

3,176,86,27,156,33.3,1.154,52,1

0,73,0,0,0,21.1,0.342,25,0

11,111,84,40,0,46.8,0.925,45,1

2,112,78,50,140,39.4,0.175,24,0

3,132,80,0,0,34.4,0.402,44,1

2,82,52,22,115,28.5,1.699,25,0

6,123,72,45,230,33.6,0.733,34,0

0,188,82,14,185,32.0,0.682,22,1

0,67,76,0,0,45.3,0.194,46,0

1,89,24,19,25,27.8,0.559,21,0

1,173,74,0,0,36.8,0.088,38,1

1,109,38,18,120,23.1,0.407,26,0

1,108,88,19,0,27.1,0.400,24,0

6,96,0,0,0,23.7,0.190,28,0

1,124,74,36,0,27.8,0.100,30,0

7,150,78,29,126,35.2,0.692,54,1
4,183,0,0,0,28.4,0.212,36,1

1,124,60,32,0,35.8,0.514,21,0

1,181,78,42,293,40.0,1.258,22,1

1,92,62,25,41,19.5,0.482,25,0

0,152,82,39,272,41.5,0.270,27,0

1,111,62,13,182,24.0,0.138,23,0

3,106,54,21,158,30.9,0.292,24,0

3,174,58,22,194,32.9,0.593,36,1

7,168,88,42,321,38.2,0.787,40,1

6,105,80,28,0,32.5,0.878,26,0

11,138,74,26,144,36.1,0.557,50,1

3,106,72,0,0,25.8,0.207,27,0

6,117,96,0,0,28.7,0.157,30,0

2,68,62,13,15,20.1,0.257,23,0

9,112,82,24,0,28.2,1.282,50,1

0,119,0,0,0,32.4,0.141,24,1

2,112,86,42,160,38.4,0.246,28,0

2,92,76,20,0,24.2,1.698,28,0

6,183,94,0,0,40.8,1.461,45,0

0,94,70,27,115,43.5,0.347,21,0

2,108,64,0,0,30.8,0.158,21,0

4,90,88,47,54,37.7,0.362,29,0

0,125,68,0,0,24.7,0.206,21,0

0,132,78,0,0,32.4,0.393,21,0

5,128,80,0,0,34.6,0.144,45,0
4,94,65,22,0,24.7,0.148,21,0

7,114,64,0,0,27.4,0.732,34,1

0,102,78,40,90,34.5,0.238,24,0

2,111,60,0,0,26.2,0.343,23,0

1,128,82,17,183,27.5,0.115,22,0

10,92,62,0,0,25.9,0.167,31,0

13,104,72,0,0,31.2,0.465,38,1

5,104,74,0,0,28.8,0.153,48,0

2,94,76,18,66,31.6,0.649,23,0

7,97,76,32,91,40.9,0.871,32,1

1,100,74,12,46,19.5,0.149,28,0

0,102,86,17,105,29.3,0.695,27,0

4,128,70,0,0,34.3,0.303,24,0

6,147,80,0,0,29.5,0.178,50,1

4,90,0,0,0,28.0,0.610,31,0

3,103,72,30,152,27.6,0.730,27,0

2,157,74,35,440,39.4,0.134,30,0

1,167,74,17,144,23.4,0.447,33,1

0,179,50,36,159,37.8,0.455,22,1

11,136,84,35,130,28.3,0.260,42,1

0,107,60,25,0,26.4,0.133,23,0

1,91,54,25,100,25.2,0.234,23,0

1,117,60,23,106,33.8,0.466,27,0

5,123,74,40,77,34.1,0.269,28,0

2,120,54,0,0,26.8,0.455,27,0
1,106,70,28,135,34.2,0.142,22,0

2,155,52,27,540,38.7,0.240,25,1

2,101,58,35,90,21.8,0.155,22,0

1,120,80,48,200,38.9,1.162,41,0

11,127,106,0,0,39.0,0.190,51,0

3,80,82,31,70,34.2,1.292,27,1

10,162,84,0,0,27.7,0.182,54,0

1,199,76,43,0,42.9,1.394,22,1

8,167,106,46,231,37.6,0.165,43,1

9,145,80,46,130,37.9,0.637,40,1

6,115,60,39,0,33.7,0.245,40,1

1,112,80,45,132,34.8,0.217,24,0

4,145,82,18,0,32.5,0.235,70,1

10,111,70,27,0,27.5,0.141,40,1

6,98,58,33,190,34.0,0.430,43,0

9,154,78,30,100,30.9,0.164,45,0

6,165,68,26,168,33.6,0.631,49,0

1,99,58,10,0,25.4,0.551,21,0

10,68,106,23,49,35.5,0.285,47,0

3,123,100,35,240,57.3,0.880,22,0

8,91,82,0,0,35.6,0.587,68,0

6,195,70,0,0,30.9,0.328,31,1

9,156,86,0,0,24.8,0.230,53,1

0,93,60,0,0,35.3,0.263,25,0

3,121,52,0,0,36.0,0.127,25,1
2,101,58,17,265,24.2,0.614,23,0

2,56,56,28,45,24.2,0.332,22,0

0,162,76,36,0,49.6,0.364,26,1

0,95,64,39,105,44.6,0.366,22,0

4,125,80,0,0,32.3,0.536,27,1

5,136,82,0,0,0.0,0.640,69,0

2,129,74,26,205,33.2,0.591,25,0

3,130,64,0,0,23.1,0.314,22,0

1,107,50,19,0,28.3,0.181,29,0

1,140,74,26,180,24.1,0.828,23,0

1,144,82,46,180,46.1,0.335,46,1

8,107,80,0,0,24.6,0.856,34,0

13,158,114,0,0,42.3,0.257,44,1

2,121,70,32,95,39.1,0.886,23,0

7,129,68,49,125,38.5,0.439,43,1

2,90,60,0,0,23.5,0.191,25,0

7,142,90,24,480,30.4,0.128,43,1

3,169,74,19,125,29.9,0.268,31,1

0,99,0,0,0,25.0,0.253,22,0

4,127,88,11,155,34.5,0.598,28,0

4,118,70,0,0,44.5,0.904,26,0

2,122,76,27,200,35.9,0.483,26,0

6,125,78,31,0,27.6,0.565,49,1

1,168,88,29,0,35.0,0.905,52,1

2,129,0,0,0,38.5,0.304,41,0
4,110,76,20,100,28.4,0.118,27,0

6,80,80,36,0,39.8,0.177,28,0

10,115,0,0,0,0.0,0.261,30,1

2,127,46,21,335,34.4,0.176,22,0

9,164,78,0,0,32.8,0.148,45,1

2,93,64,32,160,38.0,0.674,23,1

3,158,64,13,387,31.2,0.295,24,0

5,126,78,27,22,29.6,0.439,40,0

10,129,62,36,0,41.2,0.441,38,1

0,134,58,20,291,26.4,0.352,21,0

3,102,74,0,0,29.5,0.121,32,0

7,187,50,33,392,33.9,0.826,34,1

3,173,78,39,185,33.8,0.970,31,1

10,94,72,18,0,23.1,0.595,56,0

1,108,60,46,178,35.5,0.415,24,0

5,97,76,27,0,35.6,0.378,52,1

4,83,86,19,0,29.3,0.317,34,0

1,114,66,36,200,38.1,0.289,21,0

1,149,68,29,127,29.3,0.349,42,1

5,117,86,30,105,39.1,0.251,42,0

1,111,94,0,0,32.8,0.265,45,0

4,112,78,40,0,39.4,0.236,38,0

1,116,78,29,180,36.1,0.496,25,0

0,141,84,26,0,32.4,0.433,22,0

2,175,88,0,0,22.9,0.326,22,0
2,92,52,0,0,30.1,0.141,22,0

3,130,78,23,79,28.4,0.323,34,1

8,120,86,0,0,28.4,0.259,22,1

2,174,88,37,120,44.5,0.646,24,1

2,106,56,27,165,29.0,0.426,22,0

2,105,75,0,0,23.3,0.560,53,0

4,95,60,32,0,35.4,0.284,28,0

0,126,86,27,120,27.4,0.515,21,0

8,65,72,23,0,32.0,0.600,42,0

2,99,60,17,160,36.6,0.453,21,0

1,102,74,0,0,39.5,0.293,42,1

11,120,80,37,150,42.3,0.785,48,1

3,102,44,20,94,30.8,0.400,26,0

1,109,58,18,116,28.5,0.219,22,0

9,140,94,0,0,32.7,0.734,45,1

13,153,88,37,140,40.6,1.174,39,0

12,100,84,33,105,30.0,0.488,46,0

1,147,94,41,0,49.3,0.358,27,1

1,81,74,41,57,46.3,1.096,32,0

3,187,70,22,200,36.4,0.408,36,1

6,162,62,0,0,24.3,0.178,50,1

4,136,70,0,0,31.2,1.182,22,1

1,121,78,39,74,39.0,0.261,28,0

3,108,62,24,0,26.0,0.223,25,0

0,181,88,44,510,43.3,0.222,26,1
8,154,78,32,0,32.4,0.443,45,1

1,128,88,39,110,36.5,1.057,37,1

7,137,90,41,0,32.0,0.391,39,0

0,123,72,0,0,36.3,0.258,52,1

1,106,76,0,0,37.5,0.197,26,0

6,190,92,0,0,35.5,0.278,66,1

2,88,58,26,16,28.4,0.766,22,0

9,170,74,31,0,44.0,0.403,43,1

9,89,62,0,0,22.5,0.142,33,0

10,101,76,48,180,32.9,0.171,63,0

2,122,70,27,0,36.8,0.340,27,0

5,121,72,23,112,26.2,0.245,30,0

1,126,60,0,0,30.1,0.349,47,1

Program:

import csv
import random
import math

def loadcsv(filename):
    lines = csv.reader(open(filename, "r"))
    dataset = list(lines)
    for i in range(len(dataset)):
        # converting strings into numbers for processing
        dataset[i] = [float(x) for x in dataset[i]]
    return dataset

def splitdataset(dataset, splitratio):
    # 67% training size
    trainsize = int(len(dataset) * splitratio)
    trainset = []
    copy = list(dataset)
    while len(trainset) < trainsize:
        # pick random indices from the copy to move rows into the training set
        index = random.randrange(len(copy))
        trainset.append(copy.pop(index))
    return [trainset, copy]

def separatebyclass(dataset):
    # builds a dictionary of classes 1 and 0 where the values are
    # the instances belonging to each class
    separated = {}
    for i in range(len(dataset)):
        vector = dataset[i]
        if vector[-1] not in separated:
            separated[vector[-1]] = []
        separated[vector[-1]].append(vector)
    return separated

def mean(numbers):
    return sum(numbers) / float(len(numbers))

def stdev(numbers):
    avg = mean(numbers)
    variance = sum([pow(x - avg, 2) for x in numbers]) / float(len(numbers) - 1)
    return math.sqrt(variance)

def summarize(dataset):
    # (mean, stdev) pair for every attribute column
    summaries = [(mean(attribute), stdev(attribute)) for attribute in zip(*dataset)]
    del summaries[-1]  # exclude the class label column
    return summaries

def summarizebyclass(dataset):
    separated = separatebyclass(dataset)
    summaries = {}
    for classvalue, instances in separated.items():
        # summaries is a dict of (mean, stdev) tuples per attribute for each class value
        summaries[classvalue] = summarize(instances)
    return summaries

def calculateprobability(x, mean, stdev):
    # Gaussian probability density: (1 / (sqrt(2*pi)*sigma)) * exp(-(x-mu)^2 / (2*sigma^2))
    exponent = math.exp(-(math.pow(x - mean, 2) / (2 * math.pow(stdev, 2))))
    return (1 / (math.sqrt(2 * math.pi) * stdev)) * exponent

def calculateclassprobabilities(summaries, inputvector):
    # probabilities holds the likelihood of the test vector under every class
    probabilities = {}
    for classvalue, classsummaries in summaries.items():
        probabilities[classvalue] = 1
        for i in range(len(classsummaries)):
            # mean and sd of attribute i, taken separately for class 0 and class 1
            mean, stdev = classsummaries[i]
            x = inputvector[i]  # test vector's i-th attribute
            probabilities[classvalue] *= calculateprobability(x, mean, stdev)  # normal dist
    return probabilities

def predict(summaries, inputvector):
    probabilities = calculateclassprobabilities(summaries, inputvector)
    bestLabel, bestProb = None, -1
    for classvalue, probability in probabilities.items():
        # assign the class with the highest probability
        if bestLabel is None or probability > bestProb:
            bestProb = probability
            bestLabel = classvalue
    return bestLabel

def getpredictions(summaries, testset):
    predictions = []
    for i in range(len(testset)):
        result = predict(summaries, testset[i])
        predictions.append(result)
    return predictions

def getaccuracy(testset, predictions):
    correct = 0
    for i in range(len(testset)):
        if testset[i][-1] == predictions[i]:
            correct += 1
    return (correct / float(len(testset))) * 100.0

def main():
    filename = 'naivedata.csv'
    splitratio = 0.67
    dataset = loadcsv(filename)
    trainingset, testset = splitdataset(dataset, splitratio)
    print('Split {0} rows into train={1} and test={2} rows'.format(len(dataset), len(trainingset), len(testset)))
    # prepare model
    summaries = summarizebyclass(trainingset)
    # test model: predict the test data using summaries from the training data
    predictions = getpredictions(summaries, testset)
    accuracy = getaccuracy(testset, predictions)
    print('Accuracy of the classifier is : {0}%'.format(accuracy))

main()

Output:

Split 767 rows into train=513 and test=254 rows

Accuracy of the classifier is : 70.47244094488188%
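The accuracy varies from run to run because the train/test split is random. As an optional cross-check, the same data can be classified with scikit-learn's built-in Gaussian Naive Bayes; this is a minimal sketch, assuming scikit-learn is installed and that naivedata.csv has eight feature columns followed by the class label, as above:

# Optional cross-check with scikit-learn's GaussianNB (assumption: sklearn is installed)
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

data = np.loadtxt('naivedata.csv', delimiter=',')
X, y = data[:, :-1], data[:, -1]  # eight features; last column is the label
Xtr, Xte, ytr, yte = train_test_split(X, y, train_size=0.67)
model = GaussianNB().fit(Xtr, ytr)
print('sklearn GaussianNB accuracy: {0}%'.format(model.score(Xte, yte) * 100))

The two accuracies will not match exactly, since each run draws a different random split, but they should be in the same range.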

Ex. No. 6: Bayesian Network for Heart Disease Diagnosis

Aim:

To construct a Bayesian network to diagnose heart disease using the standard WHO data set.

Attribute Information:

age: age in years

sex: sex (1 = male; 0 = female)

cp: chest pain type

Value 1: typical angina

Value 2: atypical angina

Value 3: non-anginal pain

Value 4: asymptomatic

trestbps: resting blood pressure (in mm Hg on admission to the hospital)

chol: serum cholesterol in mg/dl

fbs: (fasting blood sugar > 120 mg/dl) (1 = true; 0 = false)

restecg: resting electrocardiographic results

Value 0: normal

Value 1: having ST-T wave abnormality (T wave inversions and/or ST elevation or depression of > 0.05 mV)

Value 2: showing probable or definite left ventricular hypertrophy by Estes’ criteria


thalach: maximum heart rate achieved

exang: exercise induced angina (1 = yes; 0 = no)

oldpeak: ST depression induced by exercise relative to rest

slope: the slope of the peak exercise ST segment

Value 1: upsloping

Value 2: flat

Value 3: downsloping

ca: number of major vessels (0-3) colored by fluoroscopy

thal: 3 = normal; 6 = fixed defect; 7 = reversible defect

heartdisease: integer valued from 0 (no presence) to 4 (severe presence)

Csv:

age,sex,cp,trestbps,chol,fbs,restecg,thalach,exang,oldpeak,slope,ca,thal,heartdisease

63,1,1,145,233,1,2,150,0,2.3,3,0,6,0

67,1,4,160,286,0,2,108,1,1.5,2,3,3,2

67,1,4,120,229,0,2,129,1,2.6,2,2,7,1

37,1,3,130,250,0,0,187,0,3.5,3,0,3,0

41,0,2,130,204,0,2,172,0,1.4,1,0,3,0

56,1,2,120,236,0,0,178,0,0.8,1,0,3,0

62,0,4,140,268,0,2,160,0,3.6,3,2,3,3

57,0,4,120,354,0,0,163,1,0.6,1,0,3,0

63,1,4,130,254,0,2,147,0,1.4,2,1,7,2

53,1,4,140,203,1,2,155,1,3.1,3,0,7,1

57,1,4,140,192,0,0,148,0,0.4,2,0,6,0

56,0,2,140,294,0,2,153,0,1.3,2,0,3,0

56,1,3,130,256,1,2,142,1,0.6,2,1,6,2

44,1,2,120,263,0,0,173,0,0,1,0,7,0
52,1,3,172,199,1,0,162,0,0.5,1,0,7,0

57,1,3,150,168,0,0,174,0,1.6,1,0,3,0

48,1,2,110,229,0,0,168,0,1,3,0,7,1

54,1,4,140,239,0,0,160,0,1.2,1,0,3,0

48,0,3,130,275,0,0,139,0,0.2,1,0,3,0

49,1,2,130,266,0,0,171,0,0.6,1,0,3,0

64,1,1,110,211,0,2,144,1,1.8,2,0,3,0

58,0,1,150,283,1,2,162,0,1,1,0,3,0

58,1,2,120,284,0,2,160,0,1.8,2,0,3,1

58,1,3,132,224,0,2,173,0,3.2,1,2,7,3

60,1,4,130,206,0,2,132,1,2.4,2,2,7,4

50,0,3,120,219,0,0,158,0,1.6,2,0,3,0

58,0,3,120,340,0,0,172,0,0,1,0,3,0

66,0,1,150,226,0,0,114,0,2.6,3,0,3,0

43,1,4,150,247,0,0,171,0,1.5,1,0,3,0

40,1,4,110,167,0,2,114,1,2,2,0,7,3

69,0,1,140,239,0,0,151,0,1.8,1,2,3,0

60,1,4,117,230,1,0,160,1,1.4,1,2,7,2

64,1,3,140,335,0,0,158,0,0,1,0,3,1

59,1,4,135,234,0,0,161,0,0.5,2,0,7,0

44,1,3,130,233,0,0,179,1,0.4,1,0,3,0

42,1,4,140,226,0,0,178,0,0,1,0,3,0

43,1,4,120,177,0,2,120,1,2.5,2,0,7,3

57,1,4,150,276,0,2,112,1,0.6,2,1,6,1

55,1,4,132,353,0,0,132,1,1.2,2,1,7,3
61,1,3,150,243,1,0,137,1,1,2,0,3,0

65,0,4,150,225,0,2,114,0,1,2,3,7,4

40,1,1,140,199,0,0,178,1,1.4,1,0,7,0

71,0,2,160,302,0,0,162,0,0.4,1,2,3,0

59,1,3,150,212,1,0,157,0,1.6,1,0,3,0

61,0,4,130,330,0,2,169,0,0,1,0,3,1

58,1,3,112,230,0,2,165,0,2.5,2,1,7,4

51,1,3,110,175,0,0,123,0,0.6,1,0,3,0

50,1,4,150,243,0,2,128,0,2.6,2,0,7,4

65,0,3,140,417,1,2,157,0,0.8,1,1,3,0

53,1,3,130,197,1,2,152,0,1.2,3,0,3,0

41,0,2,105,198,0,0,168,0,0,1,1,3,0

65,1,4,120,177,0,0,140,0,0.4,1,0,7,0

44,1,4,112,290,0,2,153,0,0,1,1,3,2

44,1,2,130,219,0,2,188,0,0,1,0,3,0

60,1,4,130,253,0,0,144,1,1.4,1,1,7,1

54,1,4,124,266,0,2,109,1,2.2,2,1,7,1

50,1,3,140,233,0,0,163,0,0.6,2,1,7,1

41,1,4,110,172,0,2,158,0,0,1,0,7,1

54,1,3,125,273,0,2,152,0,0.5,3,1,3,0

51,1,1,125,213,0,2,125,1,1.4,1,1,3,0

51,0,4,130,305,0,0,142,1,1.2,2,0,7,2

46,0,3,142,177,0,2,160,1,1.4,3,0,3,0

58,1,4,128,216,0,2,131,1,2.2,2,3,7,1

54,0,3,135,304,1,0,170,0,0,1,0,3,0
54,1,4,120,188,0,0,113,0,1.4,2,1,7,2

60,1,4,145,282,0,2,142,1,2.8,2,2,7,2

60,1,3,140,185,0,2,155,0,3,2,0,3,1

54,1,3,150,232,0,2,165,0,1.6,1,0,7,0

59,1,4,170,326,0,2,140,1,3.4,3,0,7,2

46,1,3,150,231,0,0,147,0,3.6,2,0,3,1

65,0,3,155,269,0,0,148,0,0.8,1,0,3,0

67,1,4,125,254,1,0,163,0,0.2,2,2,7,3

62,1,4,120,267,0,0,99,1,1.8,2,2,7,1

65,1,4,110,248,0,2,158,0,0.6,1,2,6,1

44,1,4,110,197,0,2,177,0,0,1,1,3,1

65,0,3,160,360,0,2,151,0,0.8,1,0,3,0

60,1,4,125,258,0,2,141,1,2.8,2,1,7,1

51,0,3,140,308,0,2,142,0,1.5,1,1,3,0

48,1,2,130,245,0,2,180,0,0.2,2,0,3,0

58,1,4,150,270,0,2,111,1,0.8,1,0,7,3

45,1,4,104,208,0,2,148,1,3,2,0,3,0

53,0,4,130,264,0,2,143,0,0.4,2,0,3,0

39,1,3,140,321,0,2,182,0,0,1,0,3,0

68,1,3,180,274,1,2,150,1,1.6,2,0,7,3

52,1,2,120,325,0,0,172,0,0.2,1,0,3,0

44,1,3,140,235,0,2,180,0,0,1,0,3,0

47,1,3,138,257,0,2,156,0,0,1,0,3,0

53,0,3,128,216,0,2,115,0,0,1,0,?,0

53,0,4,138,234,0,2,160,0,0,1,0,3,0
51,0,3,130,256,0,2,149,0,0.5,1,0,3,0

66,1,4,120,302,0,2,151,0,0.4,2,0,3,0

62,0,4,160,164,0,2,145,0,6.2,3,3,7,3

62,1,3,130,231,0,0,146,0,1.8,2,3,7,0

44,0,3,108,141,0,0,175,0,0.6,2,0,3,0

63,0,3,135,252,0,2,172,0,0,1,0,3,0

52,1,4,128,255,0,0,161,1,0,1,1,7,1

59,1,4,110,239,0,2,142,1,1.2,2,1,7,2

60,0,4,150,258,0,2,157,0,2.6,2,2,7,3

52,1,2,134,201,0,0,158,0,0.8,1,1,3,0

48,1,4,122,222,0,2,186,0,0,1,0,3,0

45,1,4,115,260,0,2,185,0,0,1,0,3,0

34,1,1,118,182,0,2,174,0,0,1,0,3,0

57,0,4,128,303,0,2,159,0,0,1,1,3,0

71,0,3,110,265,1,2,130,0,0,1,1,3,0

49,1,3,120,188,0,0,139,0,2,2,3,7,3

54,1,2,108,309,0,0,156,0,0,1,0,7,0

59,1,4,140,177,0,0,162,1,0,1,1,7,2

57,1,3,128,229,0,2,150,0,0.4,2,1,7,1

61,1,4,120,260,0,0,140,1,3.6,2,1,7,2

39,1,4,118,219,0,0,140,0,1.2,2,0,7,3

61,0,4,145,307,0,2,146,1,1,2,0,7,1

56,1,4,125,249,1,2,144,1,1.2,2,1,3,1

52,1,1,118,186,0,2,190,0,0,2,0,6,0

43,0,4,132,341,1,2,136,1,3,2,0,7,2
62,0,3,130,263,0,0,97,0,1.2,2,1,7,2

41,1,2,135,203,0,0,132,0,0,2,0,6,0

58,1,3,140,211,1,2,165,0,0,1,0,3,0

35,0,4,138,183,0,0,182,0,1.4,1,0,3,0

63,1,4,130,330,1,2,132,1,1.8,1,3,7,3

65,1,4,135,254,0,2,127,0,2.8,2,1,7,2

48,1,4,130,256,1,2,150,1,0,1,2,7,3

63,0,4,150,407,0,2,154,0,4,2,3,7,4

51,1,3,100,222,0,0,143,1,1.2,2,0,3,0

55,1,4,140,217,0,0,111,1,5.6,3,0,7,3

65,1,1,138,282,1,2,174,0,1.4,2,1,3,1

45,0,2,130,234,0,2,175,0,0.6,2,0,3,0

56,0,4,200,288,1,2,133,1,4,3,2,7,3

54,1,4,110,239,0,0,126,1,2.8,2,1,7,3

44,1,2,120,220,0,0,170,0,0,1,0,3,0

62,0,4,124,209,0,0,163,0,0,1,0,3,0

54,1,3,120,258,0,2,147,0,0.4,2,0,7,0

51,1,3,94,227,0,0,154,1,0,1,1,7,0

29,1,2,130,204,0,2,202,0,0,1,0,3,0

51,1,4,140,261,0,2,186,1,0,1,0,3,0

43,0,3,122,213,0,0,165,0,0.2,2,0,3,0

55,0,2,135,250,0,2,161,0,1.4,2,0,3,0

70,1,4,145,174,0,0,125,1,2.6,3,0,7,4

62,1,2,120,281,0,2,103,0,1.4,2,1,7,3

35,1,4,120,198,0,0,130,1,1.6,2,0,7,1
51,1,3,125,245,1,2,166,0,2.4,2,0,3,0

59,1,2,140,221,0,0,164,1,0,1,0,3,0

59,1,1,170,288,0,2,159,0,0.2,2,0,7,1

52,1,2,128,205,1,0,184,0,0,1,0,3,0

64,1,3,125,309,0,0,131,1,1.8,2,0,7,1

58,1,3,105,240,0,2,154,1,0.6,2,0,7,0

47,1,3,108,243,0,0,152,0,0,1,0,3,1

57,1,4,165,289,1,2,124,0,1,2,3,7,4

41,1,3,112,250,0,0,179,0,0,1,0,3,0

45,1,2,128,308,0,2,170,0,0,1,0,3,0

60,0,3,102,318,0,0,160,0,0,1,1,3,0

52,1,1,152,298,1,0,178,0,1.2,2,0,7,0

42,0,4,102,265,0,2,122,0,0.6,2,0,3,0

67,0,3,115,564,0,2,160,0,1.6,2,0,7,0

55,1,4,160,289,0,2,145,1,0.8,2,1,7,4

64,1,4,120,246,0,2,96,1,2.2,3,1,3,3

70,1,4,130,322,0,2,109,0,2.4,2,3,3,1

51,1,4,140,299,0,0,173,1,1.6,1,0,7,1

58,1,4,125,300,0,2,171,0,0,1,2,7,1

60,1,4,140,293,0,2,170,0,1.2,2,2,7,2

68,1,3,118,277,0,0,151,0,1,1,1,7,0

46,1,2,101,197,1,0,156,0,0,1,0,7,0

77,1,4,125,304,0,2,162,1,0,1,3,3,4

54,0,3,110,214,0,0,158,0,1.6,2,0,3,0

58,0,4,100,248,0,2,122,0,1,2,0,3,0
48,1,3,124,255,1,0,175,0,0,1,2,3,0

57,1,4,132,207,0,0,168,1,0,1,0,7,0

52,1,3,138,223,0,0,169,0,0,1,?,3,0

54,0,2,132,288,1,2,159,1,0,1,1,3,0

35,1,4,126,282,0,2,156,1,0,1,0,7,1

45,0,2,112,160,0,0,138,0,0,2,0,3,0

70,1,3,160,269,0,0,112,1,2.9,2,1,7,3

53,1,4,142,226,0,2,111,1,0,1,0,7,0

59,0,4,174,249,0,0,143,1,0,2,0,3,1

62,0,4,140,394,0,2,157,0,1.2,2,0,3,0

64,1,4,145,212,0,2,132,0,2,2,2,6,4

57,1,4,152,274,0,0,88,1,1.2,2,1,7,1

52,1,4,108,233,1,0,147,0,0.1,1,3,7,0

56,1,4,132,184,0,2,105,1,2.1,2,1,6,1

43,1,3,130,315,0,0,162,0,1.9,1,1,3,0

53,1,3,130,246,1,2,173,0,0,1,3,3,0

48,1,4,124,274,0,2,166,0,0.5,2,0,7,3

56,0,4,134,409,0,2,150,1,1.9,2,2,7,2

42,1,1,148,244,0,2,178,0,0.8,1,2,3,0

59,1,1,178,270,0,2,145,0,4.2,3,0,7,0

60,0,4,158,305,0,2,161,0,0,1,0,3,1

63,0,2,140,195,0,0,179,0,0,1,2,3,0

42,1,3,120,240,1,0,194,0,0.8,3,0,7,0

66,1,2,160,246,0,0,120,1,0,2,3,6,2

54,1,2,192,283,0,2,195,0,0,1,1,7,1
69,1,3,140,254,0,2,146,0,2,2,3,7,2

50,1,3,129,196,0,0,163,0,0,1,0,3,0

51,1,4,140,298,0,0,122,1,4.2,2,3,7,3

43,1,4,132,247,1,2,143,1,0.1,2,?,7,1

62,0,4,138,294,1,0,106,0,1.9,2,3,3,2

68,0,3,120,211,0,2,115,0,1.5,2,0,3,0

67,1,4,100,299,0,2,125,1,0.9,2,2,3,3

69,1,1,160,234,1,2,131,0,0.1,2,1,3,0

45,0,4,138,236,0,2,152,1,0.2,2,0,3,0

50,0,2,120,244,0,0,162,0,1.1,1,0,3,0

59,1,1,160,273,0,2,125,0,0,1,0,3,1

50,0,4,110,254,0,2,159,0,0,1,0,3,0

64,0,4,180,325,0,0,154,1,0,1,0,3,0

57,1,3,150,126,1,0,173,0,0.2,1,1,7,0

64,0,3,140,313,0,0,133,0,0.2,1,0,7,0

43,1,4,110,211,0,0,161,0,0,1,0,7,0

45,1,4,142,309,0,2,147,1,0,2,3,7,3

58,1,4,128,259,0,2,130,1,3,2,2,7,3

50,1,4,144,200,0,2,126,1,0.9,2,0,7,3

55,1,2,130,262,0,0,155,0,0,1,0,3,0

62,0,4,150,244,0,0,154,1,1.4,2,0,3,1

37,0,3,120,215,0,0,170,0,0,1,0,3,0

38,1,1,120,231,0,0,182,1,3.8,2,0,7,4

41,1,3,130,214,0,2,168,0,2,2,0,3,0

66,0,4,178,228,1,0,165,1,1,2,2,7,3
52,1,4,112,230,0,0,160,0,0,1,1,3,1

56,1,1,120,193,0,2,162,0,1.9,2,0,7,0

46,0,2,105,204,0,0,172,0,0,1,0,3,0

46,0,4,138,243,0,2,152,1,0,2,0,3,0

64,0,4,130,303,0,0,122,0,2,2,2,3,0

59,1,4,138,271,0,2,182,0,0,1,0,3,0

41,0,3,112,268,0,2,172,1,0,1,0,3,0

54,0,3,108,267,0,2,167,0,0,1,0,3,0

39,0,3,94,199,0,0,179,0,0,1,0,3,0

53,1,4,123,282,0,0,95,1,2,2,2,7,3

63,0,4,108,269,0,0,169,1,1.8,2,2,3,1

34,0,2,118,210,0,0,192,0,0.7,1,0,3,0

47,1,4,112,204,0,0,143,0,0.1,1,0,3,0

67,0,3,152,277,0,0,172,0,0,1,1,3,0

54,1,4,110,206,0,2,108,1,0,2,1,3,3

66,1,4,112,212,0,2,132,1,0.1,1,1,3,2

52,0,3,136,196,0,2,169,0,0.1,2,0,3,0

55,0,4,180,327,0,1,117,1,3.4,2,0,3,2

49,1,3,118,149,0,2,126,0,0.8,1,3,3,1

74,0,2,120,269,0,2,121,1,0.2,1,1,3,0

54,0,3,160,201,0,0,163,0,0,1,1,3,0

54,1,4,122,286,0,2,116,1,3.2,2,2,3,3

56,1,4,130,283,1,2,103,1,1.6,3,0,7,2

46,1,4,120,249,0,2,144,0,0.8,1,0,7,1

49,0,2,134,271,0,0,162,0,0,2,0,3,0
42,1,2,120,295,0,0,162,0,0,1,0,3,0

41,1,2,110,235,0,0,153,0,0,1,0,3,0

41,0,2,126,306,0,0,163,0,0,1,0,3,0

49,0,4,130,269,0,0,163,0,0,1,0,3,0

61,1,1,134,234,0,0,145,0,2.6,2,2,3,2

60,0,3,120,178,1,0,96,0,0,1,0,3,0

67,1,4,120,237,0,0,71,0,1,2,0,3,2

58,1,4,100,234,0,0,156,0,0.1,1,1,7,2

47,1,4,110,275,0,2,118,1,1,2,1,3,1

52,1,4,125,212,0,0,168,0,1,1,2,7,3

62,1,2,128,208,1,2,140,0,0,1,0,3,0

57,1,4,110,201,0,0,126,1,1.5,2,0,6,0

58,1,4,146,218,0,0,105,0,2,2,1,7,1

64,1,4,128,263,0,0,105,1,0.2,2,1,7,0

51,0,3,120,295,0,2,157,0,0.6,1,0,3,0

43,1,4,115,303,0,0,181,0,1.2,2,0,3,0

42,0,3,120,209,0,0,173,0,0,2,0,3,0

67,0,4,106,223,0,0,142,0,0.3,1,2,3,0

76,0,3,140,197,0,1,116,0,1.1,2,0,3,0

70,1,2,156,245,0,2,143,0,0,1,0,3,0

57,1,2,124,261,0,0,141,0,0.3,1,0,7,1

44,0,3,118,242,0,0,149,0,0.3,2,1,3,0

58,0,2,136,319,1,2,152,0,0,1,2,3,3

60,0,1,150,240,0,0,171,0,0.9,1,0,3,0

44,1,3,120,226,0,0,169,0,0,1,0,3,0
61,1,4,138,166,0,2,125,1,3.6,2,1,3,4

42,1,4,136,315,0,0,125,1,1.8,2,0,6,2

52,1,4,128,204,1,0,156,1,1,2,0,?,2

59,1,3,126,218,1,0,134,0,2.2,2,1,6,2

40,1,4,152,223,0,0,181,0,0,1,0,7,1

42,1,3,130,180,0,0,150,0,0,1,0,3,0

61,1,4,140,207,0,2,138,1,1.9,1,1,7,1

66,1,4,160,228,0,2,138,0,2.3,1,0,6,0

46,1,4,140,311,0,0,120,1,1.8,2,2,7,2

71,0,4,112,149,0,0,125,0,1.6,2,0,3,0

59,1,1,134,204,0,0,162,0,0.8,1,2,3,1

64,1,1,170,227,0,2,155,0,0.6,2,0,7,0

66,0,3,146,278,0,2,152,0,0,2,1,3,0

39,0,3,138,220,0,0,152,0,0,2,0,3,0

57,1,2,154,232,0,2,164,0,0,1,1,3,1

58,0,4,130,197,0,0,131,0,0.6,2,0,3,0

57,1,4,110,335,0,0,143,1,3,2,1,7,2

47,1,3,130,253,0,0,179,0,0,1,0,3,0

55,0,4,128,205,0,1,130,1,2,2,1,7,3

35,1,2,122,192,0,0,174,0,0,1,0,3,0

61,1,4,148,203,0,0,161,0,0,1,1,7,2

58,1,4,114,318,0,1,140,0,4.4,3,3,6,4

58,0,4,170,225,1,2,146,1,2.8,2,2,6,2

58,1,2,125,220,0,0,144,0,0.4,2,?,7,0

56,1,2,130,221,0,2,163,0,0,1,0,7,0
56,1,2,120,240,0,0,169,0,0,3,0,3,0

67,1,3,152,212,0,2,150,0,0.8,2,0,7,1

55,0,2,132,342,0,0,166,0,1.2,1,0,3,0

44,1,4,120,169,0,0,144,1,2.8,3,0,6,2

63,1,4,140,187,0,2,144,1,4,1,2,7,2

63,0,4,124,197,0,0,136,1,0,2,0,3,1

41,1,2,120,157,0,0,182,0,0,1,0,3,0

59,1,4,164,176,1,2,90,0,1,2,2,6,3

57,0,4,140,241,0,0,123,1,0.2,2,0,7,1

45,1,1,110,264,0,0,132,0,1.2,2,0,7,1

68,1,4,144,193,1,0,141,0,3.4,2,2,7,2

57,1,4,130,131,0,0,115,1,1.2,2,1,7,3

57,0,2,130,236,0,2,174,0,0,2,1,3,1

38,1,3,138,175,0,0,173,0,0,1,?,3,0
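A few rows above contain '?' in the ca and thal columns, marking missing values; the program below replaces them with NaN before fitting. To see how many placeholders each column holds, a small pandas sketch (assuming the rows above are saved as heart.csv):

import pandas as pd

heartDisease = pd.read_csv('heart.csv')
print((heartDisease == '?').sum())  # count of '?' placeholders per column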

Program:

import numpy as np
import pandas as pd
from pgmpy.estimators import MaximumLikelihoodEstimator
from pgmpy.models import BayesianModel  # note: renamed BayesianNetwork in recent pgmpy releases
from pgmpy.inference import VariableElimination

# read the heart disease data and mark '?' entries as missing
heartDisease = pd.read_csv('heart.csv')
heartDisease = heartDisease.replace('?', np.nan)

print('Sample instances from the dataset are given below')
print(heartDisease.head())

print('\n Attributes and datatypes')
print(heartDisease.dtypes)

# network structure: age, sex, exang and cp are parents of heartdisease,
# which in turn influences restecg and chol
model = BayesianModel([('age', 'heartdisease'), ('sex', 'heartdisease'),
                       ('exang', 'heartdisease'), ('cp', 'heartdisease'),
                       ('heartdisease', 'restecg'), ('heartdisease', 'chol')])

print('\nLearning CPD using Maximum likelihood estimators')
model.fit(heartDisease, estimator=MaximumLikelihoodEstimator)

print('\n Inferencing with Bayesian Network:')
HeartDiseasetest_infer = VariableElimination(model)

print('\n 1. Probability of HeartDisease given evidence= restecg')
q1 = HeartDiseasetest_infer.query(variables=['heartdisease'], evidence={'restecg': 1})
print(q1)

print('\n 2. Probability of HeartDisease given evidence= cp')
q2 = HeartDiseasetest_infer.query(variables=['heartdisease'], evidence={'cp': 2})
print(q2)
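Further queries follow the same pattern; as a sketch, evidence on several variables can be combined in a single call to the same pgmpy query API used above:

print('\n 3. Probability of HeartDisease given evidence= restecg and cp')
q3 = HeartDiseasetest_infer.query(variables=['heartdisease'], evidence={'restecg': 1, 'cp': 2})
print(q3)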

Output:
