ML-homework

Problem 1

image-20221118145407741

Background:

​ Matrix differentiation formulas: for a constant vector a and a constant matrix A,

​ ∂(aᵀx)/∂x = a,  ∂(xᵀAx)/∂x = (A + Aᵀ)x.

​ Then, setting the gradient of the ridge objective ‖y − Xw‖² + λ‖w‖² to zero gives the closed-form solution w* = (XᵀX + λI)⁻¹Xᵀy.
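The quadratic-form identity can be sanity-checked numerically with central finite differences; a minimal pure-Python sketch (the matrix and point below are arbitrary illustrative values):

```python
# Numerically verify the matrix-calculus identity
#   d/dx (x^T A x) = (A + A^T) x
# with central finite differences on a 2x2 example.

A = [[1.0, 2.0], [3.0, 4.0]]
x = [0.5, -1.5]
eps = 1e-6

def quad(v):
    # computes v^T A v
    return sum(v[i] * A[i][j] * v[j] for i in range(2) for j in range(2))

# analytic gradient: (A + A^T) x
grad_analytic = [sum((A[i][j] + A[j][i]) * x[j] for j in range(2)) for i in range(2)]

# numeric gradient via central differences
grad_numeric = []
for k in range(2):
    xp, xm = list(x), list(x)
    xp[k] += eps
    xm[k] -= eps
    grad_numeric.append((quad(xp) - quad(xm)) / (2 * eps))

print(grad_analytic, grad_numeric)  # the two should agree closely
```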

Solution:

image-20221121203615626

Supplementary notes:

  1. Regularization can be viewed as a penalty term added to the loss function.
  2. The norm of the ridge estimate is strictly smaller than the norm of the least-squares estimate; that is, adding the L2 penalty shortens the parameter vector. This is called shrinkage.
  3. Shrinkage drives the coefficients of less influential features toward 0 (L2 shrinks them continuously; an L1 penalty can set them exactly to 0), keeping mainly the important features, which reduces model complexity and overfitting.
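Point 2 can be seen directly in one dimension, where both estimators have closed forms: w_ols = Σxy/Σx² and w_ridge = Σxy/(Σx² + λ). A minimal sketch (the data values are invented for illustration):

```python
# 1-D illustration of shrinkage: for any lam > 0 the ridge estimate
# has strictly smaller magnitude than the least-squares estimate.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]   # roughly y = 2x

sxy = sum(x * y for x, y in zip(xs, ys))  # sum of x*y
sxx = sum(x * x for x in xs)              # sum of x^2

w_ols = sxy / sxx
for lam in [0.1, 1.0, 10.0]:
    w_ridge = sxy / (sxx + lam)
    print(lam, w_ols, w_ridge)  # |w_ridge| < |w_ols|, shrinking as lam grows
```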

Problem 2

image-20221118192313490

Background:

image-20221118194704936

image-20221118194620334

image-20221118195546732

Solution:

image-20221121203648506

Supplementary notes:

image-20221118203258011

Problem 3

image-20221118203820485

Background:

​ Evaluation metrics for classification:

image-20221118203938704

image-20221118203958051

image-20221118204052981

image-20221118204244369

image-20221118204505151
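Precision, recall, and F1 can be computed directly from counts of true positives, false positives, and false negatives; a minimal sketch (the labels below are invented for illustration):

```python
# Precision, recall, and F1 from predicted vs. true binary labels.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives

precision = tp / (tp + fp)                          # of predicted 1s, how many are real
recall = tp / (tp + fn)                             # of real 1s, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
print(precision, recall, f1)
```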

Logistic (log-odds) regression: using a regression function for a classification task, by passing the linear output through the sigmoid to obtain a class probability.

image-20221118204835296
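A minimal sketch of logistic regression trained by gradient descent on a tiny 1-D dataset (the data, learning rate, and iteration count are arbitrary illustrative choices):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# tiny 1-D dataset: class 0 near x=0, class 1 near x=3
xs = [0.0, 0.5, 1.0, 2.5, 3.0, 3.5]
ys = [0, 0, 0, 1, 1, 1]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    # gradient of the mean log-loss: (sigmoid(wx+b) - y) * x for w, (p - y) for b
    gw = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / len(xs)
    gb = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * gw
    b -= lr * gb

preds = [1 if sigmoid(w * x + b) > 0.5 else 0 for x in xs]
print(w, b, preds)
```

After training, the decision boundary -b/w lands in the gap between the two classes, so all six points are classified correctly.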

Solution:

image-20221121203726097

Problem 4

image-20221118225705375

Solution:

image-20221121203803784

image-20221121203829860

Supplementary notes:

The dual problem of the support vector machine:

https://zhuanlan.zhihu.com/p/39592364

https://blog.csdn.net/weixin_40859436/article/details/80647547

https://www.jianshu.com/p/de882f0fc434
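The hard-margin SVM dual discussed in the links above can be solved numerically on a toy example; a minimal pure-Python sketch using projected gradient ascent (the data, step size, and iteration count are illustrative assumptions, not part of the original solution):

```python
# Hard-margin SVM dual:
#   max_a  sum_i a_i - 1/2 sum_ij a_i a_j y_i y_j <x_i, x_j>
#   s.t.   a_i >= 0,  sum_i a_i y_i = 0
# solved by projected gradient ascent on a toy 2-D dataset.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

X = [(3, 1), (4, 1), (1, 1), (0, 1)]   # toy linearly separable points
y = [1, 1, -1, -1]                     # labels
n = len(X)
K = [[dot(X[i], X[j]) for j in range(n)] for i in range(n)]  # Gram matrix

a, lr = [0.0] * n, 0.005
for _ in range(20000):
    # ascent step on the dual objective
    grad = [1.0 - y[i] * sum(a[j] * y[j] * K[i][j] for j in range(n))
            for i in range(n)]
    a = [a[i] + lr * grad[i] for i in range(n)]
    # approximate projection: re-center onto sum a_i y_i = 0, then clip a_i >= 0
    s = sum(a[i] * y[i] for i in range(n)) / n
    a = [max(a[i] - s * y[i], 0.0) for i in range(n)]

# recover the primal solution from the KKT conditions
w = [sum(a[i] * y[i] * X[i][k] for i in range(n)) for k in range(2)]
sv = max(range(n), key=lambda i: a[i])   # a support vector (largest a_i)
b = y[sv] - dot(w, X[sv])
print(w, b, a)
```

For this dataset the support vectors are (3, 1) and (1, 1), and the recovered hyperplane separates the two classes with sign(w·x + b) matching the labels.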

Problem 5

image-20221119110557712

Solution:

​ The following script (a random search over small integer weights for a 2-2-1 ReLU network) was used:

#!/usr/bin/env python
# -*- encoding: utf-8 -*-
'''
# @Time : 2022/11/21 20:41:20
# @Author: wd-2711
'''

import random

def relu(z):
    return max(z, 0)

if __name__ == "__main__":
    # XOR truth table: (a, b) -> label
    data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    for _ in range(10000):
        # random integer weights/biases for a 2-2-1 ReLU network
        w11, w21, w12, w22, b1, b2, b3, w1, w2 = (random.randint(-2, 2) for _ in range(9))
        ok = True
        for (a, b), label in data:
            h1 = relu(w11 * a + w12 * b + b1)   # hidden unit 1
            h2 = relu(w21 * a + w22 * b + b2)   # hidden unit 2
            out = w1 * h1 + w2 * h2 + b3        # output unit
            if (out > 0) != (label == 1):       # classify by sign of the output
                ok = False
                break
        if ok:
            print([w11, w21, w12, w22, b1, b2, b3, w1, w2])

​ which eventually prints solutions such as:

image-20221121204158990

Supplementary notes:

The XOR problem in neural networks:

https://blog.csdn.net/comli_cn/article/details/109170804


2022-11-18

© 2024 wd-z711
