Solving the Classic Boston Housing Price Prediction Problem with a Neural Network (NN) Model in PyTorch

2023-11-08

This post uses a multi-layer neural network model in PyTorch to solve the classic Boston housing price prediction problem.

Introduction to the Boston Housing Dataset

The Boston housing dataset is a classic machine-learning dataset used to predict the median price of homes in the Boston area. It contains 506 samples, each with 13 features describing various town-level indicators, such as the crime rate, the proportion of residential land, and the proportion of non-retail business acres per town. The target variable is the median home price (in thousands of dollars).

The features of the Boston housing dataset are listed below:

CRIM: crime rate by town
ZN: proportion of residential land zoned for lots over 25,000 square feet
INDUS: proportion of non-retail business acres per town
CHAS: Charles River dummy variable (1 if the tract is near the river; 0 otherwise)
NOX: nitric oxide concentration (parts per 10 million)
RM: average number of rooms per dwelling
AGE: proportion of owner-occupied units built before 1940
DIS: weighted distances to five Boston employment centers
RAD: index of accessibility to radial highways
TAX: full-value property tax rate per $10,000
PTRATIO: pupil-teacher ratio by town
B: computed as 1000(Bk - 0.63)^2, where Bk is the proportion of Black residents by town
LSTAT: percentage of the low-income population
The Boston housing dataset is typically used for training and testing regression models whose goal is to predict the median home price. It is widely used in machine-learning and data-science teaching and practice to evaluate the performance of different algorithms and models.
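
As a quick check of the description above, the shapes and feature names can be inspected like this (a sketch that assumes scikit-learn < 1.2, where the deprecated load_boston loader is still available):

from sklearn.datasets import load_boston

data = load_boston()                 # deprecated; removed in scikit-learn 1.2
print(data['data'].shape)            # (506, 13) -> 506 samples, 13 features
print(data['target'].shape)          # (506,)    -> median price in $1000s
print(list(data['feature_names']))   # ['CRIM', 'ZN', ..., 'LSTAT']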

1. Import dependencies and modules

import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

2. Prepare the dataset

# Load the Boston housing dataset
data = load_boston()
X, y = data['data'], data['target']
C:\Users\Admin\AppData\Roaming\Python\Python37\site-packages\sklearn\utils\deprecation.py:87: FutureWarning: Function load_boston is deprecated; `load_boston` is deprecated in 1.0 and will be removed in 1.2.

    The Boston housing prices dataset has an ethical problem. You can refer to
    the documentation of this function for further details.

    The scikit-learn maintainers therefore strongly discourage the use of this
    dataset unless the purpose of the code is to study and educate about
    ethical issues in data science and machine learning.

    In this special case, you can fetch the dataset from the original
    source::

        import pandas as pd
        import numpy as np


        data_url = "http://lib.stat.cmu.edu/datasets/boston"
        raw_df = pd.read_csv(data_url, sep="\s+", skiprows=22, header=None)
        data = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])
        target = raw_df.values[1::2, 2]

    Alternative datasets include the California housing dataset (i.e.
    func:`~sklearn.datasets.fetch_california_housing`) and the Ames housing
    dataset. You can load the datasets as follows:

        from sklearn.datasets import fetch_california_housing
        housing = fetch_california_housing()

    for the California housing dataset and:

        from sklearn.datasets import fetch_openml
        housing = fetch_openml(name="house_prices", as_frame=True)

    for the Ames housing dataset.
    
  warnings.warn(msg, category=FutureWarning)

This warning can be ignored, or you can modify the loading code as the warning suggests.
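
If you prefer to follow the warning's suggestion, a minimal sketch (assuming network access to the CMU mirror referenced in the warning) assembles the same X and y arrays from the original source:

import pandas as pd
import numpy as np

# Fetch the raw Boston data from the original CMU source, as suggested by the warning
data_url = "http://lib.stat.cmu.edu/datasets/boston"
raw_df = pd.read_csv(data_url, sep="\s+", skiprows=22, header=None)

# Each record spans two rows: the feature columns are interleaved, and the third
# value of every second row is the target (median home value)
X = np.hstack([raw_df.values[::2, :], raw_df.values[1::2, :2]])
y = raw_df.values[1::2, 2]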

3. Split into training and test sets

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

4. Standardize the data

scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

5. Convert to PyTorch tensors

X_train = torch.tensor(X_train, dtype=torch.float32)
X_test = torch.tensor(X_test, dtype=torch.float32)
y_train = torch.tensor(y_train, dtype=torch.float32).view(-1, 1)
y_test = torch.tensor(y_test, dtype=torch.float32).view(-1, 1)

6. Define the neural network model

class FeedforwardNN(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim):
        super(FeedforwardNN, self).__init__()
        self.fc1 = nn.Linear(input_dim, hidden_dim)   # input layer -> hidden layer
        self.relu = nn.ReLU()                         # non-linear activation
        self.fc2 = nn.Linear(hidden_dim, output_dim)  # hidden layer -> predicted price

    def forward(self, x):
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        return x
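
As an optional sanity check (not part of the original post), you can pass a dummy batch through the network to confirm that 13 input features map to a single predicted value per sample:

# Illustrative check with made-up dimensions: 4 samples, 13 features each
check_model = FeedforwardNN(input_dim=13, hidden_dim=64, output_dim=1)
dummy_batch = torch.randn(4, 13)
print(check_model(dummy_batch).shape)  # expected: torch.Size([4, 1])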

7. Define the training and evaluation functions

def train(model, criterion, optimizer, X, y, num_epochs=100, batch_size=32):
    model.train()
    num_samples = X.shape[0]
    # Batches are taken in order; any samples left over after the last full batch are skipped
    num_batches = num_samples // batch_size

    for epoch in range(num_epochs):
        total_loss = 0
        for batch_idx in range(num_batches):
            start_idx = batch_idx * batch_size
            end_idx = start_idx + batch_size
            batch_X = X[start_idx:end_idx]
            batch_y = y[start_idx:end_idx]

            optimizer.zero_grad()               # clear gradients from the previous step
            outputs = model(batch_X)            # forward pass
            loss = criterion(outputs, batch_y)
            loss.backward()                     # backpropagation
            optimizer.step()                    # update parameters

            total_loss += loss.item()

        print(f"Epoch {epoch + 1}/{num_epochs}, Loss: {total_loss / num_batches:.4f}")

def evaluate(model, criterion, X, y):
    model.eval()
    with torch.no_grad():                       # no gradients needed during evaluation
        outputs = model(X)
        loss = criterion(outputs, y)            # MSE over the full set
        rmse = torch.sqrt(loss)
        mae = torch.mean(torch.abs(outputs - y))
    return loss.item(), rmse.item(), mae.item()
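
Note that the loop above visits the batches in a fixed order and drops the samples that do not fill the last batch. If shuffled mini-batches and full coverage are preferred, an alternative sketch using TensorDataset and DataLoader (not part of the original code) could look like this:

from torch.utils.data import TensorDataset, DataLoader

def train_with_loader(model, criterion, optimizer, X, y, num_epochs=100, batch_size=32):
    """Sketch of a training loop using DataLoader for shuffled mini-batches."""
    loader = DataLoader(TensorDataset(X, y), batch_size=batch_size, shuffle=True)
    model.train()
    for epoch in range(num_epochs):
        total_loss = 0.0
        for batch_X, batch_y in loader:
            optimizer.zero_grad()
            loss = criterion(model(batch_X), batch_y)
            loss.backward()
            optimizer.step()
            total_loss += loss.item()
        print(f"Epoch {epoch + 1}/{num_epochs}, Loss: {total_loss / len(loader):.4f}")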

8. Run training and evaluation

# Set model parameters
input_dim = X_train.shape[1]
hidden_dim = 64
output_dim = 1

9. Initialize the model

model = FeedforwardNN(input_dim, hidden_dim, output_dim)

10. Define the loss function and optimizer

criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.001)

11. Train the model

train(model, criterion, optimizer, X_train, y_train, num_epochs=500, batch_size=32)
Epoch 1/500, Loss: 7.6713
Epoch 2/500, Loss: 7.6533
Epoch 3/500, Loss: 7.6367
Epoch 4/500, Loss: 7.6191
Epoch 5/500, Loss: 7.6038
Epoch 6/500, Loss: 7.5861
Epoch 7/500, Loss: 7.5701
Epoch 8/500, Loss: 7.5533
Epoch 9/500, Loss: 7.5396
Epoch 10/500, Loss: 7.5219
Epoch 11/500, Loss: 7.5071
Epoch 12/500, Loss: 7.4910
Epoch 13/500, Loss: 7.4769
Epoch 14/500, Loss: 7.4604
Epoch 15/500, Loss: 7.4455
Epoch 16/500, Loss: 7.4301
Epoch 17/500, Loss: 7.4159
Epoch 18/500, Loss: 7.3997
Epoch 19/500, Loss: 7.3860
Epoch 20/500, Loss: 7.3718
Epoch 21/500, Loss: 7.3559
Epoch 22/500, Loss: 7.3419
Epoch 23/500, Loss: 7.3277
Epoch 24/500, Loss: 7.3155
Epoch 25/500, Loss: 7.3003
Epoch 26/500, Loss: 7.2862
Epoch 27/500, Loss: 7.2728
Epoch 28/500, Loss: 7.2588
Epoch 29/500, Loss: 7.2454
Epoch 30/500, Loss: 7.2323
Epoch 31/500, Loss: 7.2186
Epoch 32/500, Loss: 7.2040
Epoch 33/500, Loss: 7.1909
Epoch 34/500, Loss: 7.1771
Epoch 35/500, Loss: 7.1646
Epoch 36/500, Loss: 7.1500
Epoch 37/500, Loss: 7.1361
Epoch 38/500, Loss: 7.1248
Epoch 39/500, Loss: 7.1110
Epoch 40/500, Loss: 7.0965
Epoch 41/500, Loss: 7.0860
Epoch 42/500, Loss: 7.0732
Epoch 43/500, Loss: 7.0594
Epoch 44/500, Loss: 7.0482
Epoch 45/500, Loss: 7.0353
Epoch 46/500, Loss: 7.0239
Epoch 47/500, Loss: 7.0115
Epoch 48/500, Loss: 6.9981
Epoch 49/500, Loss: 6.9871
Epoch 50/500, Loss: 6.9741
Epoch 51/500, Loss: 6.9618
Epoch 52/500, Loss: 6.9509
Epoch 53/500, Loss: 6.9365
Epoch 54/500, Loss: 6.9262
Epoch 55/500, Loss: 6.9139
Epoch 56/500, Loss: 6.9012
Epoch 57/500, Loss: 6.8910
Epoch 58/500, Loss: 6.8762
Epoch 59/500, Loss: 6.8628
Epoch 60/500, Loss: 6.8507
Epoch 61/500, Loss: 6.8350
Epoch 62/500, Loss: 6.8210
Epoch 63/500, Loss: 6.8089
Epoch 64/500, Loss: 6.7953
Epoch 65/500, Loss: 6.7840
Epoch 66/500, Loss: 6.7698
Epoch 67/500, Loss: 6.7570
Epoch 68/500, Loss: 6.7478
Epoch 69/500, Loss: 6.7344
Epoch 70/500, Loss: 6.7222
Epoch 71/500, Loss: 6.7105
Epoch 72/500, Loss: 6.6986
Epoch 73/500, Loss: 6.6863
Epoch 74/500, Loss: 6.6732
Epoch 75/500, Loss: 6.6629
Epoch 76/500, Loss: 6.6499
Epoch 77/500, Loss: 6.6392
Epoch 78/500, Loss: 6.6261
Epoch 79/500, Loss: 6.6136
Epoch 80/500, Loss: 6.6037
Epoch 81/500, Loss: 6.5918
Epoch 82/500, Loss: 6.5786
Epoch 83/500, Loss: 6.5673
Epoch 84/500, Loss: 6.5566
Epoch 85/500, Loss: 6.5462
Epoch 86/500, Loss: 6.5339
Epoch 87/500, Loss: 6.5224
Epoch 88/500, Loss: 6.5124
Epoch 89/500, Loss: 6.5001
Epoch 90/500, Loss: 6.4900
Epoch 91/500, Loss: 6.4794
Epoch 92/500, Loss: 6.4690
Epoch 93/500, Loss: 6.4578
Epoch 94/500, Loss: 6.4471
Epoch 95/500, Loss: 6.4381
Epoch 96/500, Loss: 6.4284
Epoch 97/500, Loss: 6.4176
Epoch 98/500, Loss: 6.4070
Epoch 99/500, Loss: 6.3981
Epoch 100/500, Loss: 6.3892
Epoch 101/500, Loss: 6.3782
Epoch 102/500, Loss: 6.3686
Epoch 103/500, Loss: 6.3586
Epoch 104/500, Loss: 6.3521
Epoch 105/500, Loss: 6.3401
Epoch 106/500, Loss: 6.3315
Epoch 107/500, Loss: 6.3212
Epoch 108/500, Loss: 6.3127
Epoch 109/500, Loss: 6.3046
Epoch 110/500, Loss: 6.2946
Epoch 111/500, Loss: 6.2848
Epoch 112/500, Loss: 6.2760
Epoch 113/500, Loss: 6.2675
Epoch 114/500, Loss: 6.2588
Epoch 115/500, Loss: 6.2495
Epoch 116/500, Loss: 6.2413
Epoch 117/500, Loss: 6.2320
Epoch 118/500, Loss: 6.2219
Epoch 119/500, Loss: 6.2147
Epoch 120/500, Loss: 6.2047
Epoch 121/500, Loss: 6.1957
Epoch 122/500, Loss: 6.1858
Epoch 123/500, Loss: 6.1764
Epoch 124/500, Loss: 6.1671
Epoch 125/500, Loss: 6.1602
Epoch 126/500, Loss: 6.1496
Epoch 127/500, Loss: 6.1408
Epoch 128/500, Loss: 6.1315
Epoch 129/500, Loss: 6.1248
Epoch 130/500, Loss: 6.1140
Epoch 131/500, Loss: 6.1068
Epoch 132/500, Loss: 6.0980
Epoch 133/500, Loss: 6.0892
Epoch 134/500, Loss: 6.0806
Epoch 135/500, Loss: 6.0731
Epoch 136/500, Loss: 6.0651
Epoch 137/500, Loss: 6.0563
Epoch 138/500, Loss: 6.0487
Epoch 139/500, Loss: 6.0428
Epoch 140/500, Loss: 6.0331
Epoch 141/500, Loss: 6.0275
Epoch 142/500, Loss: 6.0188
Epoch 143/500, Loss: 6.0125
Epoch 144/500, Loss: 6.0041
Epoch 145/500, Loss: 5.9995
Epoch 146/500, Loss: 5.9901
Epoch 147/500, Loss: 5.9834
Epoch 148/500, Loss: 5.9781
Epoch 149/500, Loss: 5.9689
Epoch 150/500, Loss: 5.9638
Epoch 151/500, Loss: 5.9542
Epoch 152/500, Loss: 5.9498
Epoch 153/500, Loss: 5.9417
Epoch 154/500, Loss: 5.9355
Epoch 155/500, Loss: 5.9283
Epoch 156/500, Loss: 5.9228
Epoch 157/500, Loss: 5.9137
Epoch 158/500, Loss: 5.9079
Epoch 159/500, Loss: 5.8998
Epoch 160/500, Loss: 5.8935
Epoch 161/500, Loss: 5.8862
Epoch 162/500, Loss: 5.8799
Epoch 163/500, Loss: 5.8727
Epoch 164/500, Loss: 5.8673
Epoch 165/500, Loss: 5.8595
Epoch 166/500, Loss: 5.8540
Epoch 167/500, Loss: 5.8460
Epoch 168/500, Loss: 5.8405
Epoch 169/500, Loss: 5.8328
Epoch 170/500, Loss: 5.8278
Epoch 171/500, Loss: 5.8194
Epoch 172/500, Loss: 5.8159
Epoch 173/500, Loss: 5.8087
Epoch 174/500, Loss: 5.8011
Epoch 175/500, Loss: 5.7945
Epoch 176/500, Loss: 5.7897
Epoch 177/500, Loss: 5.7834
Epoch 178/500, Loss: 5.7748
Epoch 179/500, Loss: 5.7701
Epoch 180/500, Loss: 5.7621
Epoch 181/500, Loss: 5.7586
Epoch 182/500, Loss: 5.7515
Epoch 183/500, Loss: 5.7426
Epoch 184/500, Loss: 5.7382
Epoch 185/500, Loss: 5.7301
Epoch 186/500, Loss: 5.7249
Epoch 187/500, Loss: 5.7165
Epoch 188/500, Loss: 5.7118
Epoch 189/500, Loss: 5.7042
Epoch 190/500, Loss: 5.6969
Epoch 191/500, Loss: 5.6916
Epoch 192/500, Loss: 5.6836
Epoch 193/500, Loss: 5.6790
Epoch 194/500, Loss: 5.6699
Epoch 195/500, Loss: 5.6653
Epoch 196/500, Loss: 5.6584
Epoch 197/500, Loss: 5.6511
Epoch 198/500, Loss: 5.6476
Epoch 199/500, Loss: 5.6388
Epoch 200/500, Loss: 5.6354
Epoch 201/500, Loss: 5.6268
Epoch 202/500, Loss: 5.6211
Epoch 203/500, Loss: 5.6145
Epoch 204/500, Loss: 5.6094
Epoch 205/500, Loss: 5.6006
Epoch 206/500, Loss: 5.5967
Epoch 207/500, Loss: 5.5900
Epoch 208/500, Loss: 5.5822
Epoch 209/500, Loss: 5.5770
Epoch 210/500, Loss: 5.5698
Epoch 211/500, Loss: 5.5644
Epoch 212/500, Loss: 5.5561
Epoch 213/500, Loss: 5.5518
Epoch 214/500, Loss: 5.5444
Epoch 215/500, Loss: 5.5366
Epoch 216/500, Loss: 5.5314
Epoch 217/500, Loss: 5.5268
Epoch 218/500, Loss: 5.5187
Epoch 219/500, Loss: 5.5131
Epoch 220/500, Loss: 5.5068
Epoch 221/500, Loss: 5.5014
Epoch 222/500, Loss: 5.4941
Epoch 223/500, Loss: 5.4913
Epoch 224/500, Loss: 5.4829
Epoch 225/500, Loss: 5.4784
Epoch 226/500, Loss: 5.4715
Epoch 227/500, Loss: 5.4671
Epoch 228/500, Loss: 5.4601
Epoch 229/500, Loss: 5.4572
Epoch 230/500, Loss: 5.4490
Epoch 231/500, Loss: 5.4446
Epoch 232/500, Loss: 5.4384
Epoch 233/500, Loss: 5.4348
Epoch 234/500, Loss: 5.4285
Epoch 235/500, Loss: 5.4223
Epoch 236/500, Loss: 5.4176
Epoch 237/500, Loss: 5.4119
Epoch 238/500, Loss: 5.4079
Epoch 239/500, Loss: 5.4014
Epoch 240/500, Loss: 5.3977
Epoch 241/500, Loss: 5.3904
Epoch 242/500, Loss: 5.3862
Epoch 243/500, Loss: 5.3814
Epoch 244/500, Loss: 5.3757
Epoch 245/500, Loss: 5.3704
Epoch 246/500, Loss: 5.3649
Epoch 247/500, Loss: 5.3605
Epoch 248/500, Loss: 5.3544
Epoch 249/500, Loss: 5.3508
Epoch 250/500, Loss: 5.3437
Epoch 251/500, Loss: 5.3401
Epoch 252/500, Loss: 5.3322
Epoch 253/500, Loss: 5.3285
Epoch 254/500, Loss: 5.3220
Epoch 255/500, Loss: 5.3158
Epoch 256/500, Loss: 5.3105
Epoch 257/500, Loss: 5.3039
Epoch 258/500, Loss: 5.2994
Epoch 259/500, Loss: 5.2937
Epoch 260/500, Loss: 5.2889
Epoch 261/500, Loss: 5.2810
Epoch 262/500, Loss: 5.2789
Epoch 263/500, Loss: 5.2728
Epoch 264/500, Loss: 5.2654
Epoch 265/500, Loss: 5.2600
Epoch 266/500, Loss: 5.2539
Epoch 267/500, Loss: 5.2494
Epoch 268/500, Loss: 5.2418
Epoch 269/500, Loss: 5.2374
Epoch 270/500, Loss: 5.2297
Epoch 271/500, Loss: 5.2260
Epoch 272/500, Loss: 5.2195
Epoch 273/500, Loss: 5.2145
Epoch 274/500, Loss: 5.2074
Epoch 275/500, Loss: 5.2024
Epoch 276/500, Loss: 5.1976
Epoch 277/500, Loss: 5.1900
Epoch 278/500, Loss: 5.1856
Epoch 279/500, Loss: 5.1795
Epoch 280/500, Loss: 5.1757
Epoch 281/500, Loss: 5.1690
Epoch 282/500, Loss: 5.1647
Epoch 283/500, Loss: 5.1580
Epoch 284/500, Loss: 5.1540
Epoch 285/500, Loss: 5.1486
Epoch 286/500, Loss: 5.1452
Epoch 287/500, Loss: 5.1385
Epoch 288/500, Loss: 5.1349
Epoch 289/500, Loss: 5.1301
Epoch 290/500, Loss: 5.1254
Epoch 291/500, Loss: 5.1208
Epoch 292/500, Loss: 5.1149
Epoch 293/500, Loss: 5.1120
Epoch 294/500, Loss: 5.1068
Epoch 295/500, Loss: 5.1030
Epoch 296/500, Loss: 5.0981
Epoch 297/500, Loss: 5.0925
Epoch 298/500, Loss: 5.0896
Epoch 299/500, Loss: 5.0844
Epoch 300/500, Loss: 5.0810
Epoch 301/500, Loss: 5.0757
Epoch 302/500, Loss: 5.0706
Epoch 303/500, Loss: 5.0670
Epoch 304/500, Loss: 5.0618
Epoch 305/500, Loss: 5.0584
Epoch 306/500, Loss: 5.0533
Epoch 307/500, Loss: 5.0499
Epoch 308/500, Loss: 5.0440
Epoch 309/500, Loss: 5.0412
Epoch 310/500, Loss: 5.0359
Epoch 311/500, Loss: 5.0297
Epoch 312/500, Loss: 5.0271
Epoch 313/500, Loss: 5.0206
Epoch 314/500, Loss: 5.0179
Epoch 315/500, Loss: 5.0127
Epoch 316/500, Loss: 5.0063
Epoch 317/500, Loss: 5.0025
Epoch 318/500, Loss: 4.9961
Epoch 319/500, Loss: 4.9925
Epoch 320/500, Loss: 4.9870
Epoch 321/500, Loss: 4.9816
Epoch 322/500, Loss: 4.9774
Epoch 323/500, Loss: 4.9718
Epoch 324/500, Loss: 4.9690
Epoch 325/500, Loss: 4.9634
Epoch 326/500, Loss: 4.9600
Epoch 327/500, Loss: 4.9557
Epoch 328/500, Loss: 4.9497
Epoch 329/500, Loss: 4.9470
Epoch 330/500, Loss: 4.9420
Epoch 331/500, Loss: 4.9392
Epoch 332/500, Loss: 4.9343
Epoch 333/500, Loss: 4.9289
Epoch 334/500, Loss: 4.9265
Epoch 335/500, Loss: 4.9225
Epoch 336/500, Loss: 4.9191
Epoch 337/500, Loss: 4.9143
Epoch 338/500, Loss: 4.9098
Epoch 339/500, Loss: 4.9061
Epoch 340/500, Loss: 4.9012
Epoch 341/500, Loss: 4.8987
Epoch 342/500, Loss: 4.8925
Epoch 343/500, Loss: 4.8909
Epoch 344/500, Loss: 4.8861
Epoch 345/500, Loss: 4.8809
Epoch 346/500, Loss: 4.8776
Epoch 347/500, Loss: 4.8720
Epoch 348/500, Loss: 4.8688
Epoch 349/500, Loss: 4.8648
Epoch 350/500, Loss: 4.8588
Epoch 351/500, Loss: 4.8551
Epoch 352/500, Loss: 4.8507
Epoch 353/500, Loss: 4.8480
Epoch 354/500, Loss: 4.8435
Epoch 355/500, Loss: 4.8379
Epoch 356/500, Loss: 4.8354
Epoch 357/500, Loss: 4.8316
Epoch 358/500, Loss: 4.8261
Epoch 359/500, Loss: 4.8241
Epoch 360/500, Loss: 4.8184
Epoch 361/500, Loss: 4.8157
Epoch 362/500, Loss: 4.8125
Epoch 363/500, Loss: 4.8074
Epoch 364/500, Loss: 4.8043
Epoch 365/500, Loss: 4.7990
Epoch 366/500, Loss: 4.7977
Epoch 367/500, Loss: 4.7932
Epoch 368/500, Loss: 4.7878
Epoch 369/500, Loss: 4.7859
Epoch 370/500, Loss: 4.7827
Epoch 371/500, Loss: 4.7775
Epoch 372/500, Loss: 4.7755
Epoch 373/500, Loss: 4.7704
Epoch 374/500, Loss: 4.7683
Epoch 375/500, Loss: 4.7643
Epoch 376/500, Loss: 4.7599
Epoch 377/500, Loss: 4.7584
Epoch 378/500, Loss: 4.7536
Epoch 379/500, Loss: 4.7489
Epoch 380/500, Loss: 4.7471
Epoch 381/500, Loss: 4.7434
Epoch 382/500, Loss: 4.7381
Epoch 383/500, Loss: 4.7366
Epoch 384/500, Loss: 4.7324
Epoch 385/500, Loss: 4.7282
Epoch 386/500, Loss: 4.7255
Epoch 387/500, Loss: 4.7224
Epoch 388/500, Loss: 4.7180
Epoch 389/500, Loss: 4.7157
Epoch 390/500, Loss: 4.7106
Epoch 391/500, Loss: 4.7096
Epoch 392/500, Loss: 4.7055
Epoch 393/500, Loss: 4.7005
Epoch 394/500, Loss: 4.6988
Epoch 395/500, Loss: 4.6958
Epoch 396/500, Loss: 4.6909
Epoch 397/500, Loss: 4.6896
Epoch 398/500, Loss: 4.6861
Epoch 399/500, Loss: 4.6805
Epoch 400/500, Loss: 4.6785
Epoch 401/500, Loss: 4.6765
Epoch 402/500, Loss: 4.6718
Epoch 403/500, Loss: 4.6694
Epoch 404/500, Loss: 4.6659
Epoch 405/500, Loss: 4.6616
Epoch 406/500, Loss: 4.6601
Epoch 407/500, Loss: 4.6558
Epoch 408/500, Loss: 4.6520
Epoch 409/500, Loss: 4.6503
Epoch 410/500, Loss: 4.6458
Epoch 411/500, Loss: 4.6415
Epoch 412/500, Loss: 4.6393
Epoch 413/500, Loss: 4.6360
Epoch 414/500, Loss: 4.6319
Epoch 415/500, Loss: 4.6295
Epoch 416/500, Loss: 4.6258
Epoch 417/500, Loss: 4.6210
Epoch 418/500, Loss: 4.6195
Epoch 419/500, Loss: 4.6164
Epoch 420/500, Loss: 4.6110
Epoch 421/500, Loss: 4.6090
Epoch 422/500, Loss: 4.6056
Epoch 423/500, Loss: 4.6016
Epoch 424/500, Loss: 4.5987
Epoch 425/500, Loss: 4.5957
Epoch 426/500, Loss: 4.5912
Epoch 427/500, Loss: 4.5901
Epoch 428/500, Loss: 4.5860
Epoch 429/500, Loss: 4.5819
Epoch 430/500, Loss: 4.5790
Epoch 431/500, Loss: 4.5764
Epoch 432/500, Loss: 4.5725
Epoch 433/500, Loss: 4.5704
Epoch 434/500, Loss: 4.5670
Epoch 435/500, Loss: 4.5631
Epoch 436/500, Loss: 4.5614
Epoch 437/500, Loss: 4.5584
Epoch 438/500, Loss: 4.5543
Epoch 439/500, Loss: 4.5525
Epoch 440/500, Loss: 4.5480
Epoch 441/500, Loss: 4.5468
Epoch 442/500, Loss: 4.5425
Epoch 443/500, Loss: 4.5391
Epoch 444/500, Loss: 4.5377
Epoch 445/500, Loss: 4.5347
Epoch 446/500, Loss: 4.5304
Epoch 447/500, Loss: 4.5291
Epoch 448/500, Loss: 4.5257
Epoch 449/500, Loss: 4.5212
Epoch 450/500, Loss: 4.5206
Epoch 451/500, Loss: 4.5174
Epoch 452/500, Loss: 4.5135
Epoch 453/500, Loss: 4.5117
Epoch 454/500, Loss: 4.5086
Epoch 455/500, Loss: 4.5052
Epoch 456/500, Loss: 4.5027
Epoch 457/500, Loss: 4.4995
Epoch 458/500, Loss: 4.4963
Epoch 459/500, Loss: 4.4940
Epoch 460/500, Loss: 4.4895
Epoch 461/500, Loss: 4.4880
Epoch 462/500, Loss: 4.4848
Epoch 463/500, Loss: 4.4806
Epoch 464/500, Loss: 4.4787
Epoch 465/500, Loss: 4.4740
Epoch 466/500, Loss: 4.4733
Epoch 467/500, Loss: 4.4693
Epoch 468/500, Loss: 4.4659
Epoch 469/500, Loss: 4.4641
Epoch 470/500, Loss: 4.4593
Epoch 471/500, Loss: 4.4580
Epoch 472/500, Loss: 4.4546
Epoch 473/500, Loss: 4.4510
Epoch 474/500, Loss: 4.4487
Epoch 475/500, Loss: 4.4448
Epoch 476/500, Loss: 4.4432
Epoch 477/500, Loss: 4.4394
Epoch 478/500, Loss: 4.4366
Epoch 479/500, Loss: 4.4344
Epoch 480/500, Loss: 4.4296
Epoch 481/500, Loss: 4.4279
Epoch 482/500, Loss: 4.4243
Epoch 483/500, Loss: 4.4202
Epoch 484/500, Loss: 4.4172
Epoch 485/500, Loss: 4.4134
Epoch 486/500, Loss: 4.4121
Epoch 487/500, Loss: 4.4074
Epoch 488/500, Loss: 4.4039
Epoch 489/500, Loss: 4.4017
Epoch 490/500, Loss: 4.3980
Epoch 491/500, Loss: 4.3955
Epoch 492/500, Loss: 4.3911
Epoch 493/500, Loss: 4.3900
Epoch 494/500, Loss: 4.3859
Epoch 495/500, Loss: 4.3812
Epoch 496/500, Loss: 4.3805
Epoch 497/500, Loss: 4.3768
Epoch 498/500, Loss: 4.3754
Epoch 499/500, Loss: 4.3711
Epoch 500/500, Loss: 4.3695

12. Evaluate the model

test_loss, test_rmse, test_mae = evaluate(model, criterion, X_test, y_test)
print(f"Test Loss: {test_loss:.4f}, Test RMSE: {test_rmse:.4f}, Test MAE: {test_mae:.4f}")

Test Loss: 12.0768, Test RMSE: 3.4752, Test MAE: 2.2279
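
To make a prediction on new data, the input must go through the same scaler before being converted to a tensor. A minimal, illustrative sketch using the first test sample (the variable names here are assumptions, not from the original post):

model.eval()
with torch.no_grad():
    sample = X_test[:1]                     # already standardized and converted to a tensor above
    predicted_price = model(sample).item()  # price in thousands of dollars
print(f"Predicted: {predicted_price:.2f}, actual: {y_test[0].item():.2f}")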
