5. Consider the following data set for a binary class problem.

   A   B   Class Label
   T   F   +
   T   T   +
   T   T   +
   T   F   -
   T   T   +
   F   F   -
   F   F   -
   F   F   -
   T   T   -
   T   F   -

a. Calculate the information gain when splitting on A and B. Which attribute would the decision tree induction algorithm choose?
b. Calculate the gain in the Gini index when splitting on A and B. Which attribute would the decision tree induction algorithm choose?
c. Figure 3.11 shows that entropy and the Gini index are both monotonically increasing on the range [0, 0.5] and they are both monotonically decreasing on the range [0.5, 1]. Is it possible that information gain and the gain in the Gini index favor different attributes? Explain.

7. Consider the following set of training examples.

   X   Y   Z   No. of Class C1 Examples   No. of Class C2 Examples
   0   0   0    5                         40
   0   0   1    0                         15
   0   1   0   10                          5
   0   1   1   45                          0
   1   0   0   10                          5
   1   0   1   25                          0
   1   1   0    5                         20
   1   1   1    0                         15

a. Compute a two-level decision tree using the greedy approach described in this chapter. Use the classification error rate as the criterion for splitting. What is the overall error rate of the induced tree?
b. Repeat part (a) using X as the first splitting attribute and then choose the best remaining attribute for splitting at each of the two successor nodes. What is the error rate of the induced tree?
c. Compare the results of parts (a) and (b). Comment on the suitability of the greedy heuristic used for splitting attribute selection.

8. The following table summarizes a data set with three attributes A, B, C and two class labels + and -. Build a two-level decision tree.

   A   B   C   Number of + Instances   Number of - Instances
   T   T   T    5                       0
   F   T   T    0                      20
   T   F   T   20                       0
   F   F   T    0                       5
   T   T   F    0                       0
   F   T   F   25                       0
   T   F   F    0                       0
   F   F   F    0                      25

a. According to the classification error rate, which attribute would be chosen as the first splitting attribute? For each attribute, show the contingency table and the gains in classification error rate.
b. Repeat for the two children of the root node.
c. How many instances are misclassified by the resulting decision tree?
d. Repeat parts (a), (b), and (c) using C as the splitting attribute.
e. Use the results in parts (c) and (d) to conclude about the greedy nature of the decision tree induction algorithm.
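For reference, the following is a minimal Python sketch showing how the three impurity measures named in these questions (entropy, the Gini index, and the classification error rate) can be computed for a candidate split. It is only an illustration of the formulas, not the textbook's algorithm or a worked solution, and the helper names entropy, gini, error_rate, and gain are hypothetical.

from collections import Counter
from math import log2

def entropy(labels):
    # Entropy of a collection of class labels: -sum of p * log2(p).
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    # Gini index: 1 - sum of p squared.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def error_rate(labels):
    # Classification error: 1 - max p (the splitting criterion in questions 7 and 8).
    return 1.0 - max(Counter(labels).values()) / len(labels)

def gain(rows, attr, labels, impurity):
    # Drop in impurity obtained by splitting the rows on one attribute.
    n = len(labels)
    parent = impurity(labels)
    children = 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        children += len(subset) / n * impurity(subset)
    return parent - children

# The ten records of question 5: attributes A and B, class label + or -.
data = [("T", "F", "+"), ("T", "T", "+"), ("T", "T", "+"), ("T", "F", "-"),
        ("T", "T", "+"), ("F", "F", "-"), ("F", "F", "-"), ("F", "F", "-"),
        ("T", "T", "-"), ("T", "F", "-")]
rows = [{"A": a, "B": b} for a, b, _ in data]
labels = [c for _, _, c in data]

for attr in ("A", "B"):
    print(attr,
          "information gain = %.4f" % gain(rows, attr, labels, entropy),
          "Gini gain = %.4f" % gain(rows, attr, labels, gini))

Comparing the printed gains for A and B is the comparison that parts (a) and (b) of question 5 ask for, and passing error_rate instead of entropy or gini gives the criterion used in questions 7 and 8.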

Our website has a team of professional writers who can help you with any of your homework. They will write your papers from scratch. We also have a team of editors who make sure all papers are of high quality and plagiarism free. To place an order, you only need to click Ask A Question and we will direct you to our order page at WriteDemy. Then fill in our order form with all your assignment instructions. Select your deadline and pay for your paper. You will get it a few hours before your set deadline.

Fill in all the assignment details required in the order form; the standard information is the page count, deadline, academic level, and type of paper. It is advisable to have this information at hand so that you can quickly complete the form and an essay writer can be assigned to your project immediately. Make payment for the custom essay order so that we can assign a suitable writer to it. Payments are made through PayPal on a secure billing page. Finally, sit back and relax.

Do you need an answer to this or any other question?

Get assignment help from the Aqhomework.com paper writing website and forget about your problems.

Aqhomework provides custom, affordable essay writing: 100% original, plagiarism-free essays, assignments, and dissertations.

With an exceptional team of professional academic experts in a wide range of subjects, we can guarantee you an unrivaled quality of custom-written papers.

Click Order now to access our order form, fill in your paper details correctly, select your paper deadline, and wait for our writers to send you a perfectly written assignment.

Chat with us today! We are always waiting to answer all your questions.