
DWDM Experiment 3

3) Load the weather.nominal, Iris, and Glass datasets into Weka and run the Apriori algorithm with different support and confidence values.

Loading WEATHER.NOMINAL dataset

1. Select the WEATHER.NOMINAL dataset from the available datasets in the Preprocess tab.

2. Apply the Apriori algorithm by selecting it in the Associate tab and click the Start button. (Clicking the scheme name beside the Choose button opens a parameter editor where the minimum support and confidence values required by this experiment can be varied.)

3. The Associator output displays the following result.

=== Run information ===

Scheme: weka.associations.Apriori -N 10 -T 0 -C 0.9 -D 0.05 -U 1.0 -M 0.1 -S -1.0 -c -1

Relation: weather.symbolic

Instances: 14

Attributes: 5

 outlook

 temperature

 humidity

 windy

 play

=== Associator model (full training set) ===

Apriori

=======

Minimum support: 0.15 (2 instances)

Minimum metric <confidence>: 0.9

Number of cycles performed: 17

Generated sets of large itemsets:

Size of set of large itemsets L(1): 12

Size of set of large itemsets L(2): 47

Size of set of large itemsets L(3): 39

Size of set of large itemsets L(4): 6

Best rules found:

1. outlook=overcast 4 ==> play=yes 4 <conf:(1)> lift:(1.56) lev:(0.1) [1] conv:(1.43)

2. temperature=cool 4 ==> humidity=normal 4 <conf:(1)> lift:(2) lev:(0.14) [2] conv:(2)

3. humidity=normal windy=FALSE 4 ==> play=yes 4 <conf:(1)> lift:(1.56) lev:(0.1) [1] conv:(1.43)

4. outlook=sunny play=no 3 ==> humidity=high 3 <conf:(1)> lift:(2) lev:(0.11) [1] conv:(1.5)

5. outlook=sunny humidity=high 3 ==> play=no 3 <conf:(1)> lift:(2.8) lev:(0.14) [1] conv:(1.93)

6. outlook=rainy play=yes 3 ==> windy=FALSE 3 <conf:(1)> lift:(1.75) lev:(0.09) [1] conv:(1.29)

7. outlook=rainy windy=FALSE 3 ==> play=yes 3 <conf:(1)> lift:(1.56) lev:(0.08) [1] conv:(1.07)

8. temperature=cool play=yes 3 ==> humidity=normal 3 <conf:(1)> lift:(2) lev:(0.11) [1] conv:(1.5)

9. outlook=sunny temperature=hot 2 ==> humidity=high 2 <conf:(1)> lift:(2) lev:(0.07) [1] conv:(1)

10. temperature=hot play=no 2 ==> outlook=sunny 2 <conf:(1)> lift:(2.8)
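Reading the output: in rule 1, the premise outlook=overcast covers 4 instances and all 4 of them also have play=yes, so confidence is 4/4 = 1. Because play=yes holds for 9 of the 14 instances, the lift is 1 / (9/14) ≈ 1.56, exactly as reported; leverage and conviction likewise compare the rule against what chance alone would produce.

Besides the parameter editor mentioned in step 2, the run can be repeated with different support and confidence values programmatically. The following is a minimal Java sketch, assuming weka.jar is on the classpath; the file path and class name are illustrative placeholders:

import weka.associations.Apriori;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class AprioriWeatherDemo {
    public static void main(String[] args) throws Exception {
        // Load the nominal weather data (path is an assumption; adjust as needed).
        Instances data = DataSource.read("weather.nominal.arff");

        Apriori apriori = new Apriori();
        apriori.setLowerBoundMinSupport(0.3); // raise minimum support above the 0.1 default (-M)
        apriori.setMinMetric(0.8);            // lower minimum confidence below the 0.9 default (-C)
        apriori.setNumRules(10);              // report the 10 best rules, as in the run above (-N)
        apriori.buildAssociations(data);      // mine the association rules
        System.out.println(apriori);          // prints the same report as the Associator output panel
    }
}

Raising the minimum support prunes itemsets that cover few instances, while lowering the minimum confidence admits weaker rules, so a few such runs show how the rule list responds to both thresholds.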


Loading IRIS dataset

1. Select the IRIS dataset from the available datasets in the Preprocess tab. Because Apriori requires nominal attributes, the numeric Iris attributes are first discretized with the unsupervised Discretize filter (10 bins), as the relation name in the output below reflects.

2. Apply the Apriori algorithm by selecting it in the Associate tab and click the Start button.

3. The Associator output displays the following result.

=== Run information ===

Scheme: weka.associations.Apriori -N 10 -T 0 -C 0.9 -D 0.05 -U 1.0 -M 0.1 -S -1.0 -c -1

Relation: iris-weka.filters.unsupervised.attribute.Discretize-B10-M-1.0-Rfirst-last-precision6

Instances: 150

Attributes: 5

 sepallength

 sepalwidth

 petallength

 petalwidth

 class

=== Associator model (full training set) ===

Apriori

=======

Minimum support: 0.1 (15 instances)

Minimum metric <confidence>: 0.9

Number of cycles performed: 18

Generated sets of large itemsets:

Size of set of large itemsets L(1): 20

Size of set of large itemsets L(2): 15

Size of set of large itemsets L(3): 3

Best rules found:

1. petalwidth='(-inf-0.34]' 41 ==> class=Iris-setosa 41 <conf:(1)> lift:(3)

2. petallength='(-inf-1.59]' 37 ==> class=Iris-setosa 37 <conf:(1)> lift:(3) lev:(0.16) [24] conv:(24.67)

3. petallength='(-inf-1.59]' petalwidth='(-inf-0.34]' 33 ==> class=Iris-setosa 33 <conf:(1)> lift:(3) lev:(0.15) [22] conv:(22)

4. petalwidth='(1.06-1.3]' 21 ==> class=Iris-versicolor 21 <conf:(1)> lift:(3) lev:(0.09) [14] conv:(14)

5. petallength='(5.13-5.72]' 18 ==> class=Iris-virginica 18 <conf:(1)> lift:(3) lev:(0.08) [12] conv:(12)

6. sepallength='(4.66-5.02]' petalwidth='(-inf-0.34]' 17 ==> class=Iris-setosa 17 <conf:(1)> lift:(3) lev:(0.08) [11] conv:(11.33)

7. sepalwidth='(2.96-3.2]' class=Iris-setosa 16 ==> petalwidth='(-inf-0.34]' 16 <conf:(1)> lift:(3.66) lev:(0.08) [11] conv:(11.63)

8. sepalwidth='(2.96-3.2]' petalwidth='(-inf-0.34]' 16 ==> class=Iris-setosa 16 <conf:(1)> lift:(3) lev:(0.07) [10] conv:(10.67)

9. petallength='(3.95-4.54]' 26 ==> class=Iris-versicolor 25 <conf:(0.96)> lift:(2.88) lev:(0.11) [16] conv:(8.67)

10. petalwidth='(1.78-2.02]' 23 ==> class=Iris-virginica 22 <conf:(0.96)> lift:(2.87) lev:(0.1) [14] conv:(7.67)
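The relation name above shows that the numeric Iris attributes were put through the unsupervised Discretize filter with 10 bins (-B10) before mining, since Apriori works only on nominal attributes. The same preprocessing can be scripted; here is a minimal Java sketch under the same assumptions as before (weka.jar on the classpath, illustrative file path and class name):

import weka.associations.Apriori;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.Discretize;

public class AprioriIrisDemo {
    public static void main(String[] args) throws Exception {
        // Load the raw Iris data (path is an assumption; adjust as needed).
        Instances raw = DataSource.read("iris.arff");

        // Apriori needs nominal attributes, so bin each numeric attribute into
        // 10 equal-width intervals, matching -B10 in the relation name above.
        Discretize disc = new Discretize();
        disc.setBins(10);
        disc.setInputFormat(raw);
        Instances data = Filter.useFilter(raw, disc);

        Apriori apriori = new Apriori(); // defaults: -C 0.9, -M 0.1, as in the run above
        apriori.buildAssociations(data);
        System.out.println(apriori);
    }
}

Note that with 150 instances the minimum support of 0.1 corresponds to 15 instances, which is why intervals covering fewer than 15 flowers never reach the rule list.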
