Lower-bound Estimate for Cost-sensitive Decision Trees
Our Price
₹2,500.00
10000 in stock
Support
Ready to Ship
Description
Decision trees are excellent tools for choosing between several courses of action. They provide an effective structure in which you can lay out options and investigate the possible outcomes of each, and they help you form a balanced picture of the risks and rewards associated with every course of action. Decision trees are among the most popular classification tools in machine learning, fault detection, and related areas, and they are widely used because of their simplicity.

When a decision tree is grown, however, computational cost, and in particular the cost of performing each test, can vary widely, so building the tree becomes an optimization problem. To address this, the project implements algorithms based on a lower-bound estimate of test cost and evaluates them against cost-sensitive heuristics on a range of standard data sets. The estimate is used to account for test cost during tree construction.

A decision tree is a decision-support tool that uses a tree-like graph or model of decisions and their possible consequences, including chance-event outcomes, resource costs, and utility. It is one way to display an algorithm, and it can also serve as a descriptive means for calculating conditional probabilities.

Decision trees implicitly perform variable screening or feature selection. This project explains why feature selection is important in analytics and introduces a few common techniques for performing it. When a decision tree is fitted to a training dataset, the top few nodes on which the tree splits are essentially the most important variables in the dataset, so feature selection is completed automatically. Decision trees also require relatively little data-preparation effort from users. The goal of this project is to estimate the cost of a tree before constructing it.
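As a rough illustration of how test cost can enter the split criterion, the sketch below combines information gain with a per-attribute test cost using an EG2-style score, (2^gain − 1)/(cost + 1). This is a minimal, self-contained example, not the project's actual implementation; the class and method names are hypothetical, and the score penalizes expensive tests so that, for equal gain, the cheaper test is preferred.

```java
public class CostSensitiveSplit {
    // Shannon entropy (in bits) of a binary class distribution.
    public static double entropy(int pos, int neg) {
        int n = pos + neg;
        if (n == 0 || pos == 0 || neg == 0) return 0.0;
        double p = (double) pos / n, q = (double) neg / n;
        return -(p * Math.log(p) + q * Math.log(q)) / Math.log(2);
    }

    // Information gain of a binary test that sends (posL, negL)
    // examples down the left branch and the rest down the right.
    public static double gain(int pos, int neg, int posL, int negL) {
        int n = pos + neg, nL = posL + negL, nR = n - nL;
        double weighted = (nL * entropy(posL, negL)
                         + nR * entropy(pos - posL, neg - negL)) / n;
        return entropy(pos, neg) - weighted;
    }

    // EG2-style cost-sensitive score: rewards gain, penalizes test cost.
    public static double eg2(double gain, double cost) {
        return (Math.pow(2, gain) - 1.0) / (cost + 1.0);
    }

    public static void main(String[] args) {
        // Two hypothetical tests with identical splits but different costs.
        double g = gain(10, 10, 8, 2);      // an informative split
        double cheap = eg2(g, 1.0);         // low-cost test
        double expensive = eg2(g, 10.0);    // high-cost test
        System.out.println(cheap > expensive);  // prints "true": cheaper test scores higher
    }
}
```

With this kind of score, the tree builder picks, at each node, the attribute maximizing the cost-adjusted gain rather than raw information gain, which is the essence of cost-sensitive tree construction.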
Tags: 2012, Data Mining Projects, Java