
Homework 2 for CSI 431


February 28, 2020


All homeworks are individual assignments. This means: write your own solutions and do not copy code/solutions from peers or online. Should academic dishonesty be detected, the proper reporting protocols will be invoked (see Syllabus for details).


Instructions: Submit two files. One should be a write-up of all solutions and observations, as Solution.pdf. The second should be an archive Code.zip containing code and any relevant results files.


Note: individual functions will be tested by a script, potentially with different train/test data. Do not change method names or parameters; simply provide implementations. Also, please do not add additional library imports.


1. Decision trees

As part of this question you will implement and compare the Information Gain, Gini Index, and CART evaluation measures for splits in decision tree construction. Let D = (X, y), |D| = n, be a dataset with n samples. The entropy of the dataset is defined as


H(D) = − Σ_{i=1}^{k} P(c_i | D) log₂ P(c_i | D),

where k is the number of classes and P(c_i | D) is the fraction of samples in D with class label c_i.
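As a concrete (unofficial) illustration of these split-evaluation measures, the sketch below computes the entropy defined above, along with the standard Gini index G(D) = 1 − Σ_i P(c_i | D)², from a vector of class labels. The function names and the NumPy import are assumptions for illustration only; your submission must use the method names, parameters, and imports the assignment provides.

```python
import numpy as np

def entropy(y):
    """H(D) = -sum_i P(c_i|D) * log2 P(c_i|D), computed from label vector y."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()              # class probabilities P(c_i | D)
    return float(-np.sum(p * np.log2(p)))

def gini(y):
    """Standard Gini index G(D) = 1 - sum_i P(c_i|D)^2."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return float(1.0 - np.sum(p ** 2))

# A binary dataset split evenly between two classes is maximally impure:
print(entropy([0, 0, 1, 1]))  # → 1.0
print(gini([0, 0, 1, 1]))     # → 0.5
```

Note that a pure dataset (all samples in one class) gives entropy 0 and Gini 0, which is the other extreme these measures are designed to distinguish.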
