2023-04-10 test

# Module-level hyperparameters (values for this run, per the summary below):
# SEQ_LENGTH = 10, EMBEDDING_DIM = 800, EMBEDDING_DIM_2 = 1600; MAX_NB_WORDS caps the vocabulary size.
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam


def model_lstm(X_train, X_val, y_train, y_val, word_to_index):
    input_shape = (SEQ_LENGTH,)
    x_train_in = Input(input_shape, dtype='int32', name="x_train")

    # word_to_index holds the index mapping for the whole vocabulary
    nb_words = min(MAX_NB_WORDS, len(word_to_index))
    embedding_layer = Embedding(nb_words, EMBEDDING_DIM, input_length=SEQ_LENGTH)(x_train_in)
    print("embedding layer is::", embedding_layer)
    print("build model.....")

    # return_sequences=True makes LSTM_1 emit the full output sequence, which the stacked
    # LSTM_2 needs as input; if the next layer were not an LSTM it could be omitted
    lstm_1 = LSTM(EMBEDDING_DIM, name="LSTM_1", return_sequences=True)(embedding_layer)
    lstm_2 = LSTM(EMBEDDING_DIM_2, name="LSTM_2")(lstm_1)
    dense = Dense(nb_words, activation="softmax", name="Dense_1")(lstm_2)

    model = Model(inputs=x_train_in, outputs=dense)
    model.summary()  # summary() prints the table itself and returns None

    adam = Adam(learning_rate=0.0001, beta_1=0.9, beta_2=0.99, epsilon=1e-08)  # `lr` is deprecated
    model.compile(loss='categorical_crossentropy',
                  optimizer=adam,
                  metrics=['accuracy'])
    return model
embedding layer is:: KerasTensor(type_spec=TensorSpec(shape=(None, 10, 800), dtype=tf.float32, name=None), name='embedding/embedding_lookup/Identity_1:0', description="created by layer 'embedding'")
build model.....
Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 x_train (InputLayer)        [(None, 10)]              0         
                                                                 
 embedding (Embedding)       (None, 10, 800)           1948800   
                                                                 
 LSTM_1 (LSTM)               (None, 10, 800)           5123200   
                                                                 
 LSTM_2 (LSTM)               (None, 1600)              15366400  
                                                                 
 Dense_1 (Dense)             (None, 2436)              3900036   
                                                                 
=================================================================
Total params: 26,338,436
Trainable params: 26,338,436
Non-trainable params: 0
_________________________________________________________________
None
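
As a quick sanity check, the parameter counts above can be reproduced by hand (a Keras LSTM with n units on m-dim input has 4 × ((m + n) × n + n) weights):
  • embedding: 2436 words × 800 dims = 1,948,800
  • LSTM_1 (800 units, 800-dim input): 4 × ((800 + 800) × 800 + 800) = 5,123,200
  • LSTM_2 (1600 units, 800-dim input): 4 × ((800 + 1600) × 1600 + 1600) = 15,366,400
  • Dense_1: 1600 × 2436 + 2436 = 3,900,036
which sums to 26,338,436, matching "Total params".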
F:\anaconda3\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:110: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super(Adam, self).__init__(name, **kwargs)
WARNING:tensorflow:`period` argument is deprecated. Please use `save_freq` to specify the frequency in number of batches seen.
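
The training log that follows comes from a fit call with a ModelCheckpoint callback; the `period` warning above suggests the original code also passed the deprecated `period=...` argument. A minimal sketch of what that call plausibly looks like (BATCH_SIZE is a placeholder; the source only shows the resulting log):

from tensorflow.keras.callbacks import ModelCheckpoint

# Keep only the best weights as judged by validation loss, which produces the
# "val_loss improved ... saving model" / "did not improve" lines below
checkpoint = ModelCheckpoint(
    r".\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5",  # filename taken from the log
    monitor="val_loss",
    save_best_only=True,
    verbose=1,
)

print("Train....")
model.fit(X_train, y_train,                  # y_* one-hot encoded, matching categorical_crossentropy
          batch_size=BATCH_SIZE,             # placeholder; the log shows 732 batches per epoch
          epochs=50,
          validation_data=(X_val, y_val),
          callbacks=[checkpoint])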
Train....
Epoch 1/50
732/732 [==============================] - ETA: 0s - loss: 6.2225 - accuracy: 0.1076
Epoch 1: val_loss improved from inf to 6.08880, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 499s 667ms/step - loss: 6.2225 - accuracy: 0.1076 - val_loss: 6.0888 - val_accuracy: 0.1149
Epoch 2/50
732/732 [==============================] - ETA: 0s - loss: 5.9625 - accuracy: 0.1208
Epoch 2: val_loss improved from 6.08880 to 6.01894, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 514s 702ms/step - loss: 5.9625 - accuracy: 0.1208 - val_loss: 6.0189 - val_accuracy: 0.1165
Epoch 3/50
732/732 [==============================] - ETA: 0s - loss: 5.7897 - accuracy: 0.1296
Epoch 3: val_loss improved from 6.01894 to 5.87968, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 503s 687ms/step - loss: 5.7897 - accuracy: 0.1296 - val_loss: 5.8797 - val_accuracy: 0.1231
Epoch 4/50
732/732 [==============================] - ETA: 0s - loss: 5.5915 - accuracy: 0.1372
Epoch 4: val_loss improved from 5.87968 to 5.74615, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 496s 677ms/step - loss: 5.5915 - accuracy: 0.1372 - val_loss: 5.7462 - val_accuracy: 0.1287
Epoch 5/50
732/732 [==============================] - ETA: 0s - loss: 5.3582 - accuracy: 0.1474
Epoch 5: val_loss improved from 5.74615 to 5.60099, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 508s 694ms/step - loss: 5.3582 - accuracy: 0.1474 - val_loss: 5.6010 - val_accuracy: 0.1401
Epoch 6/50
732/732 [==============================] - ETA: 0s - loss: 5.0717 - accuracy: 0.1608
Epoch 6: val_loss improved from 5.60099 to 5.47120, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 480s 656ms/step - loss: 5.0717 - accuracy: 0.1608 - val_loss: 5.4712 - val_accuracy: 0.1444
Epoch 7/50
732/732 [==============================] - ETA: 0s - loss: 4.7379 - accuracy: 0.1846
Epoch 7: val_loss improved from 5.47120 to 5.28951, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 513s 701ms/step - loss: 4.7379 - accuracy: 0.1846 - val_loss: 5.2895 - val_accuracy: 0.1627
Epoch 8/50
732/732 [==============================] - ETA: 0s - loss: 4.3856 - accuracy: 0.2226
Epoch 8: val_loss improved from 5.28951 to 5.10057, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 490s 669ms/step - loss: 4.3856 - accuracy: 0.2226 - val_loss: 5.1006 - val_accuracy: 0.1844
Epoch 9/50
732/732 [==============================] - ETA: 0s - loss: 4.0185 - accuracy: 0.2718
Epoch 9: val_loss improved from 5.10057 to 4.93234, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 523s 714ms/step - loss: 4.0185 - accuracy: 0.2718 - val_loss: 4.9323 - val_accuracy: 0.2067
Epoch 10/50
732/732 [==============================] - ETA: 0s - loss: 3.6519 - accuracy: 0.3300
Epoch 10: val_loss improved from 4.93234 to 4.76022, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 492s 672ms/step - loss: 3.6519 - accuracy: 0.3300 - val_loss: 4.7602 - val_accuracy: 0.2352
Epoch 11/50
732/732 [==============================] - ETA: 0s - loss: 3.2911 - accuracy: 0.3934
Epoch 11: val_loss improved from 4.76022 to 4.54920, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 522s 714ms/step - loss: 3.2911 - accuracy: 0.3934 - val_loss: 4.5492 - val_accuracy: 0.2758
Epoch 12/50
732/732 [==============================] - ETA: 0s - loss: 2.9327 - accuracy: 0.4617
Epoch 12: val_loss improved from 4.54920 to 4.35707, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 497s 679ms/step - loss: 2.9327 - accuracy: 0.4617 - val_loss: 4.3571 - val_accuracy: 0.3135
Epoch 13/50
732/732 [==============================] - ETA: 0s - loss: 2.5890 - accuracy: 0.5304
Epoch 13: val_loss improved from 4.35707 to 4.19432, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 520s 711ms/step - loss: 2.5890 - accuracy: 0.5304 - val_loss: 4.1943 - val_accuracy: 0.3563
Epoch 14/50
732/732 [==============================] - ETA: 0s - loss: 2.2645 - accuracy: 0.5939
Epoch 14: val_loss improved from 4.19432 to 4.01978, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 496s 678ms/step - loss: 2.2645 - accuracy: 0.5939 - val_loss: 4.0198 - val_accuracy: 0.3830
Epoch 15/50
732/732 [==============================] - ETA: 0s - loss: 1.9605 - accuracy: 0.6559
Epoch 15: val_loss improved from 4.01978 to 3.85684, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 515s 703ms/step - loss: 1.9605 - accuracy: 0.6559 - val_loss: 3.8568 - val_accuracy: 0.4217
Epoch 16/50
732/732 [==============================] - ETA: 0s - loss: 1.6771 - accuracy: 0.7090
Epoch 16: val_loss improved from 3.85684 to 3.67159, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 506s 692ms/step - loss: 1.6771 - accuracy: 0.7090 - val_loss: 3.6716 - val_accuracy: 0.4705
Epoch 17/50
732/732 [==============================] - ETA: 0s - loss: 1.4196 - accuracy: 0.7565
Epoch 17: val_loss improved from 3.67159 to 3.54158, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 516s 706ms/step - loss: 1.4196 - accuracy: 0.7565 - val_loss: 3.5416 - val_accuracy: 0.4999
Epoch 18/50
732/732 [==============================] - ETA: 0s - loss: 1.1845 - accuracy: 0.8012
Epoch 18: val_loss improved from 3.54158 to 3.41239, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 513s 700ms/step - loss: 1.1845 - accuracy: 0.8012 - val_loss: 3.4124 - val_accuracy: 0.5406
Epoch 19/50
732/732 [==============================] - ETA: 0s - loss: 0.9755 - accuracy: 0.8447
Epoch 19: val_loss improved from 3.41239 to 3.30893, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 512s 700ms/step - loss: 0.9755 - accuracy: 0.8447 - val_loss: 3.3089 - val_accuracy: 0.5650
Epoch 20/50
732/732 [==============================] - ETA: 0s - loss: 0.7897 - accuracy: 0.8836
Epoch 20: val_loss improved from 3.30893 to 3.19777, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 493s 674ms/step - loss: 0.7897 - accuracy: 0.8836 - val_loss: 3.1978 - val_accuracy: 0.5944
Epoch 21/50
732/732 [==============================] - ETA: 0s - loss: 0.6258 - accuracy: 0.9167
Epoch 21: val_loss improved from 3.19777 to 3.11253, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 514s 702ms/step - loss: 0.6258 - accuracy: 0.9167 - val_loss: 3.1125 - val_accuracy: 0.6162
Epoch 22/50
732/732 [==============================] - ETA: 0s - loss: 0.4878 - accuracy: 0.9439
Epoch 22: val_loss improved from 3.11253 to 3.04473, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 492s 673ms/step - loss: 0.4878 - accuracy: 0.9439 - val_loss: 3.0447 - val_accuracy: 0.6273
Epoch 23/50
732/732 [==============================] - ETA: 0s - loss: 0.3731 - accuracy: 0.9611
Epoch 23: val_loss improved from 3.04473 to 3.02193, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 520s 711ms/step - loss: 0.3731 - accuracy: 0.9611 - val_loss: 3.0219 - val_accuracy: 0.6311
Epoch 24/50
732/732 [==============================] - ETA: 0s - loss: 0.2829 - accuracy: 0.9724
Epoch 24: val_loss improved from 3.02193 to 3.00552, saving model to .\model1_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
732/732 [==============================] - 507s 693ms/step - loss: 0.2829 - accuracy: 0.9724 - val_loss: 3.0055 - val_accuracy: 0.6339
Epoch 25/50
732/732 [==============================] - ETA: 0s - loss: 0.2212 - accuracy: 0.9760
Epoch 25: val_loss did not improve from 3.00552
732/732 [==============================] - 515s 704ms/step - loss: 0.2212 - accuracy: 0.9760 - val_loss: 3.0245 - val_accuracy: 0.6371
Epoch 26/50
732/732 [==============================] - ETA: 0s - loss: 0.1743 - accuracy: 0.9784
Epoch 26: val_loss did not improve from 3.00552
732/732 [==============================] - 486s 664ms/step - loss: 0.1743 - accuracy: 0.9784 - val_loss: 3.0246 - val_accuracy: 0.6368
Epoch 27/50
732/732 [==============================] - ETA: 0s - loss: 0.1448 - accuracy: 0.9799
Epoch 27: val_loss did not improve from 3.00552
732/732 [==============================] - 513s 701ms/step - loss: 0.1448 - accuracy: 0.9799 - val_loss: 3.0409 - val_accuracy: 0.6376
Epoch 28/50
732/732 [==============================] - ETA: 0s - loss: 0.1268 - accuracy: 0.9800
Epoch 28: val_loss did not improve from 3.00552
732/732 [==============================] - 494s 675ms/step - loss: 0.1268 - accuracy: 0.9800 - val_loss: 3.0424 - val_accuracy: 0.6390
Epoch 29/50
732/732 [==============================] - ETA: 0s - loss: 0.1097 - accuracy: 0.9805
Epoch 29: val_loss did not improve from 3.00552
732/732 [==============================] - 511s 698ms/step - loss: 0.1097 - accuracy: 0.9805 - val_loss: 3.1073 - val_accuracy: 0.6362
Epoch 30/50
732/732 [==============================] - ETA: 0s - loss: 0.1002 - accuracy: 0.9814
Epoch 30: val_loss did not improve from 3.00552
732/732 [==============================] - 489s 667ms/step - loss: 0.1002 - accuracy: 0.9814 - val_loss: 3.0818 - val_accuracy: 0.6372
Epoch 31/50
732/732 [==============================] - ETA: 0s - loss: 0.0904 - accuracy: 0.9823
Epoch 31: val_loss did not improve from 3.00552
732/732 [==============================] - 490s 670ms/step - loss: 0.0904 - accuracy: 0.9823 - val_loss: 3.1299 - val_accuracy: 0.6391
Epoch 32/50
732/732 [==============================] - ETA: 0s - loss: 0.0847 - accuracy: 0.9826
Epoch 32: val_loss did not improve from 3.00552
732/732 [==============================] - 486s 664ms/step - loss: 0.0847 - accuracy: 0.9826 - val_loss: 3.1279 - val_accuracy: 0.6388
Epoch 33/50
732/732 [==============================] - ETA: 0s - loss: 0.0783 - accuracy: 0.9830
Epoch 33: val_loss did not improve from 3.00552
732/732 [==============================] - 490s 669ms/step - loss: 0.0783 - accuracy: 0.9830 - val_loss: 3.1476 - val_accuracy: 0.6406
Epoch 34/50
732/732 [==============================] - ETA: 0s - loss: 0.0755 - accuracy: 0.9833
Epoch 34: val_loss did not improve from 3.00552
732/732 [==============================] - 496s 677ms/step - loss: 0.0755 - accuracy: 0.9833 - val_loss: 3.1682 - val_accuracy: 0.6392
Epoch 35/50
732/732 [==============================] - ETA: 0s - loss: 0.0707 - accuracy: 0.9835
Epoch 35: val_loss did not improve from 3.00552
732/732 [==============================] - 501s 685ms/step - loss: 0.0707 - accuracy: 0.9835 - val_loss: 3.1914 - val_accuracy: 0.6402
Epoch 36/50
732/732 [==============================] - ETA: 0s - loss: 0.0672 - accuracy: 0.9846
Epoch 36: val_loss did not improve from 3.00552
732/732 [==============================] - 515s 704ms/step - loss: 0.0672 - accuracy: 0.9846 - val_loss: 3.1692 - val_accuracy: 0.6415
Epoch 37/50
732/732 [==============================] - ETA: 0s - loss: 0.0636 - accuracy: 0.9845
Epoch 37: val_loss did not improve from 3.00552
732/732 [==============================] - 503s 687ms/step - loss: 0.0636 - accuracy: 0.9845 - val_loss: 3.1902 - val_accuracy: 0.6396
Epoch 38/50
732/732 [==============================] - ETA: 0s - loss: 0.0608 - accuracy: 0.9848
Epoch 38: val_loss did not improve from 3.00552
732/732 [==============================] - 500s 684ms/step - loss: 0.0608 - accuracy: 0.9848 - val_loss: 3.2331 - val_accuracy: 0.6398
Epoch 39/50
732/732 [==============================] - ETA: 0s - loss: 0.0594 - accuracy: 0.9850
Epoch 39: val_loss did not improve from 3.00552
732/732 [==============================] - 509s 695ms/step - loss: 0.0594 - accuracy: 0.9850 - val_loss: 3.2590 - val_accuracy: 0.6411
Epoch 40/50
732/732 [==============================] - ETA: 0s - loss: 0.0577 - accuracy: 0.9855
Epoch 40: val_loss did not improve from 3.00552
732/732 [==============================] - 499s 682ms/step - loss: 0.0577 - accuracy: 0.9855 - val_loss: 3.2430 - val_accuracy: 0.6405
Epoch 41/50
732/732 [==============================] - ETA: 0s - loss: 0.0567 - accuracy: 0.9852
Epoch 41: val_loss did not improve from 3.00552
732/732 [==============================] - 484s 661ms/step - loss: 0.0567 - accuracy: 0.9852 - val_loss: 3.2784 - val_accuracy: 0.6393
Epoch 42/50
732/732 [==============================] - ETA: 0s - loss: 0.0555 - accuracy: 0.9849
Epoch 42: val_loss did not improve from 3.00552
732/732 [==============================] - 496s 677ms/step - loss: 0.0555 - accuracy: 0.9849 - val_loss: 3.2869 - val_accuracy: 0.6407
Epoch 43/50
732/732 [==============================] - ETA: 0s - loss: 0.0535 - accuracy: 0.9856
Epoch 43: val_loss did not improve from 3.00552
732/732 [==============================] - 501s 685ms/step - loss: 0.0535 - accuracy: 0.9856 - val_loss: 3.2955 - val_accuracy: 0.6413
Epoch 44/50
732/732 [==============================] - ETA: 0s - loss: 0.0519 - accuracy: 0.9854
Epoch 44: val_loss did not improve from 3.00552
732/732 [==============================] - 510s 697ms/step - loss: 0.0519 - accuracy: 0.9854 - val_loss: 3.3141 - val_accuracy: 0.6429
Epoch 45/50
732/732 [==============================] - ETA: 0s - loss: 0.0505 - accuracy: 0.9854
Epoch 45: val_loss did not improve from 3.00552
732/732 [==============================] - 498s 680ms/step - loss: 0.0505 - accuracy: 0.9854 - val_loss: 3.3240 - val_accuracy: 0.6395
Epoch 46/50
732/732 [==============================] - ETA: 0s - loss: 0.0505 - accuracy: 0.9855
Epoch 46: val_loss did not improve from 3.00552
732/732 [==============================] - 505s 690ms/step - loss: 0.0505 - accuracy: 0.9855 - val_loss: 3.3422 - val_accuracy: 0.6393
Epoch 47/50
732/732 [==============================] - ETA: 0s - loss: 0.0486 - accuracy: 0.9859
Epoch 47: val_loss did not improve from 3.00552
732/732 [==============================] - 500s 684ms/step - loss: 0.0486 - accuracy: 0.9859 - val_loss: 3.3550 - val_accuracy: 0.6389
Epoch 48/50
732/732 [==============================] - ETA: 0s - loss: 0.0475 - accuracy: 0.9864
Epoch 48: val_loss did not improve from 3.00552
732/732 [==============================] - 491s 671ms/step - loss: 0.0475 - accuracy: 0.9864 - val_loss: 3.3685 - val_accuracy: 0.6409
Epoch 49/50
732/732 [==============================] - ETA: 0s - loss: 0.0466 - accuracy: 0.9863
Epoch 49: val_loss did not improve from 3.00552
732/732 [==============================] - 502s 686ms/step - loss: 0.0466 - accuracy: 0.9863 - val_loss: 3.3602 - val_accuracy: 0.6406
Epoch 50/50
732/732 [==============================] - ETA: 0s - loss: 0.0459 - accuracy: 0.9867
Epoch 50: val_loss did not improve from 3.00552
732/732 [==============================] - 520s 710ms/step - loss: 0.0459 - accuracy: 0.9867 - val_loss: 3.4052 - val_accuracy: 0.6399
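
Each sample below starts from a seed phrase (the later runs prompt for one with 输入任意作为生成歌词的起始, i.e. "enter any text as the seed for the generated lyrics") and extends it one token at a time from the softmax output. The source doesn't show the loop, or whether it samples or takes an argmax; a minimal sketch that samples with a temperature, assuming a hypothetical index_to_word inverse of word_to_index:

import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def generate(model, seed, word_to_index, index_to_word, length=100, temperature=1.0):
    result = list(seed)
    for _ in range(length):
        # Encode the last SEQ_LENGTH tokens and left-pad to the fixed input window
        ids = [word_to_index[w] for w in result[-SEQ_LENGTH:] if w in word_to_index]
        x = pad_sequences([ids], maxlen=SEQ_LENGTH)
        probs = model.predict(x, verbose=0)[0]
        # Temperature sampling rather than a greedy argmax, to keep the output varied
        probs = np.log(probs + 1e-9) / temperature
        probs = np.exp(probs) / np.sum(np.exp(probs))
        next_id = int(np.random.choice(len(probs), p=probs))
        result.append(index_to_word[next_id])
    return "".join(result)

print(generate(model, "喜欢", word_to_index, index_to_word))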

2023-04-16 10:02:50.415089: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 10:02:50.532383: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
痴情
如如任一点一眼
你是没有那美单
等有你还格我用
我的秋分分是我 出别怎象
如结向 什一然 有人把雨落
让是人着你
和你真开的表表
那个人的歌不要最当早还穿解
o代一住真把桥开
人受明叶曲烟时
爱解轻同的又人太许
o清清
相枪唤醒言的前里
我想不独进进关海提在高子
受留着一失唱以日繁在星些“间
飞的让那国在落
是些如些名感甘
我在紫眼 落个此此的是无在
这生的流候是远果
如想想流一年非
你是还

2023-04-16 10:10:40.953140: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 10:10:40.955064: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
喜欢
她一如为谁似对年雪
唱明成风
我深了不走上
你在哪的急
虹过无温道  如进风物一生
想一难等 一个难难好分
我没是象乐
你在香错的”我
幻照当说了 新亮送
你到曾愿
我还几笑
心走下义其忆
哪色是人心是事泪已个
你说你想谁我有脸为我
春非你是如今变 为怎拍还你
我过后的很解 我在再 一苦平高里出结陪我的化角
飘乐就谁  我看说你
红个下下 我在曾领你你
深留去开话
再南哪气的时命这多非在刚刚出的晃光
你走人暖终
他装着在深我
日轻花步们的多愁
以一世慢慢就就多曾曾要要现的样
看果感地 人我再自结
心 第你留乘后决视叹少你着伤嘴
如想有多会那起
我还一回话
 我说会你一点 提下 也什么回过

2023-04-16 10:17:08.619933: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 10:17:08.792872: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
喜欢
一一如梦之难意
是爱的梦 听是是关非
和你走没说 我没人 我得经认微过
每正我 爱的一巧 对就我一绝用在出上气头
你的神尽像如笑 如经是一夜一切
你的白明已喜转如如在了你的白与梦
是个年里我实的碎越头
在如我们都你的眼a
有不了好 我没没象得
看续里我回走的热笑
随一起不面我爱摆
以果回轻 认场可能一b人
所结古 这人自定
你 爱喜你不了然世风
我寻爱欢的世过
公名界事是别边为飞儿
那场时上的多边
是湖小歌泣
你若未缘快
靠头独谱
漫放起烛
天干万清色
 11音失 深 无像我着落
我的头随
一些大流 天心的眉途
在碎眼 它铺忽
你寻梦   世日还不买
我想意在把然角边月的真一角一黄得
飞

Process finished with exit code 0



2023-04-16 10:22:19.223633: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 10:22:19.258500: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
喜欢
晨言一城晚片
高怎头的泛梦
南h修五里不花不
又上踏言烈深灯
如持着三月声裙
上生静毕不紧一家住是悠久的泛烁
烁胖太光破  ?花更夕对
有什的红空气线
否品见就越里模理
曾心的颗古
盛明 街心道
又聊年执你
你海化观天
站幻算一缺白过的向只
信穿你的如次悄里
最为我了心的雨样
表立着记明该年
泛影了马出界记
看有我还说着抱音
你唱在回楼
情启温缘 齐具一缘
还天登要心电
热脑片r舞甲乱
我持量的故影里扣的洁很打起
碑情意 绚月如暗的重形灯雄微
飞书的微孤漫再在紧车的美
你里美方多本余沫
我上白和信经窗三想你 看走而潮心深
还有是球尘人给让有气更提
像那感或信欢的象道
电~开院横物
分越三

2023-04-16 10:26:38.504694: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 10:26:38.506895: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
喜欢着静女
她忙o着失到然然眼偷是风念里而过起
我们着过乐始耳躲的诗么听
每笑一喉热冶湖命的歌
在头沉淡忽关动藏
吹要人纯滚身冬盛
倒停上带着都我想任浪心
努实正到看尽流亮
感笑于注了与泪着
尽一底人他法等阴窗
少叹要一近后后
我这实戴你言经有我的雨
 请我还心们的满单B和那常微常的墙
我领完完着很待字
最州的歌无薯堤了已穿手一照任热头的奖
当是发浅来流里于孤旁
你还化头的是算雨情
我回明绕里们诺么边
花从好非了 这情来哭和和花坏
拍关经深子你
别是我的飞念
你只我好忆我的期定
愿爱在奏面毯
待偶九种失跑 纸饮阴斜北惯的贵
我爱该人 别为用身
自己还命 你声台乏缘宙
功去对阳淡续门 穿怕时若就

Process finished with exit code 0

 

  • The generated text contains stray symbols and digits, so the dataset was re-processed to remove them and the model retrained.
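
One plausible cleaning pass keeps only CJK characters and whitespace (a sketch; raw_lines stands in for the loaded lyric lines, and the author's exact filtering rules aren't shown). The vocabulary shrinking from 2436 to 2359 words in the rerun below is consistent with a filter of this kind:

import re

def clean_line(line):
    # Drop everything outside the CJK ideograph range and whitespace
    # (digits, Latin letters, punctuation), which is what leaked into the samples
    return re.sub(r"[^\u4e00-\u9fff\s]", "", line)

corpus = [clean_line(l) for l in raw_lines if clean_line(l).strip()]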
2023-04-16 12:48:52.373053: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 12:48:52.566464: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
2023-04-16 12:49:04.313108: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
embedding layer is:: KerasTensor(type_spec=TensorSpec(shape=(None, 10, 800), dtype=tf.float32, name=None), name='embedding/embedding_lookup/Identity_1:0', description="created by layer 'embedding'")
build model.....
Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 x_train (InputLayer)        [(None, 10)]              0         
                                                                 
 embedding (Embedding)       (None, 10, 800)           1887200   
                                                                 
 LSTM_1 (LSTM)               (None, 10, 800)           5123200   
                                                                 
 LSTM_2 (LSTM)               (None, 1600)              15366400  
                                                                 
 Dense_1 (Dense)             (None, 2359)              3776759   
                                                                 
=================================================================
Total params: 26,153,559
Trainable params: 26,153,559
Non-trainable params: 0
_________________________________________________________________
None
F:\anaconda3\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:110: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super(Adam, self).__init__(name, **kwargs)
Train....
WARNING:tensorflow:`period` argument is deprecated. Please use `save_freq` to specify the frequency in number of batches seen.
Epoch 1/50
724/724 [==============================] - ETA: 0s - loss: 6.1890 - accuracy: 0.1090
Epoch 1: val_loss improved from inf to 6.08992, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 553s 746ms/step - loss: 6.1890 - accuracy: 0.1090 - val_loss: 6.0899 - val_accuracy: 0.1099
Epoch 2/50
724/724 [==============================] - ETA: 0s - loss: 5.9348 - accuracy: 0.1207
Epoch 2: val_loss improved from 6.08992 to 5.99014, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 536s 741ms/step - loss: 5.9348 - accuracy: 0.1207 - val_loss: 5.9901 - val_accuracy: 0.1200
Epoch 3/50
724/724 [==============================] - ETA: 0s - loss: 5.7773 - accuracy: 0.1285
Epoch 3: val_loss improved from 5.99014 to 5.89635, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 733s 1s/step - loss: 5.7773 - accuracy: 0.1285 - val_loss: 5.8964 - val_accuracy: 0.1264
Epoch 4/50
724/724 [==============================] - ETA: 0s - loss: 5.5887 - accuracy: 0.1364
Epoch 4: val_loss improved from 5.89635 to 5.77642, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 547s 756ms/step - loss: 5.5887 - accuracy: 0.1364 - val_loss: 5.7764 - val_accuracy: 0.1314
Epoch 5/50
724/724 [==============================] - ETA: 0s - loss: 5.3723 - accuracy: 0.1444
Epoch 5: val_loss improved from 5.77642 to 5.68260, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 543s 750ms/step - loss: 5.3723 - accuracy: 0.1444 - val_loss: 5.6826 - val_accuracy: 0.1362
Epoch 6/50
724/724 [==============================] - ETA: 0s - loss: 5.0972 - accuracy: 0.1560
Epoch 6: val_loss improved from 5.68260 to 5.55087, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 632s 873ms/step - loss: 5.0972 - accuracy: 0.1560 - val_loss: 5.5509 - val_accuracy: 0.1433
Epoch 7/50
724/724 [==============================] - ETA: 0s - loss: 4.7730 - accuracy: 0.1783
Epoch 7: val_loss improved from 5.55087 to 5.38638, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 670s 904ms/step - loss: 4.7730 - accuracy: 0.1783 - val_loss: 5.3864 - val_accuracy: 0.1570
Epoch 8/50
724/724 [==============================] - ETA: 0s - loss: 4.4211 - accuracy: 0.2154
Epoch 8: val_loss improved from 5.38638 to 5.23084, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 716s 990ms/step - loss: 4.4211 - accuracy: 0.2154 - val_loss: 5.2308 - val_accuracy: 0.1764
Epoch 9/50
724/724 [==============================] - ETA: 0s - loss: 4.0669 - accuracy: 0.2610
Epoch 9: val_loss improved from 5.23084 to 5.04806, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 544s 751ms/step - loss: 4.0669 - accuracy: 0.2610 - val_loss: 5.0481 - val_accuracy: 0.2017
Epoch 10/50
724/724 [==============================] - ETA: 0s - loss: 3.7084 - accuracy: 0.3184
Epoch 10: val_loss improved from 5.04806 to 4.87938, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 504s 696ms/step - loss: 3.7084 - accuracy: 0.3184 - val_loss: 4.8794 - val_accuracy: 0.2295
Epoch 11/50
724/724 [==============================] - ETA: 0s - loss: 3.3505 - accuracy: 0.3786
Epoch 11: val_loss improved from 4.87938 to 4.70261, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 501s 692ms/step - loss: 3.3505 - accuracy: 0.3786 - val_loss: 4.7026 - val_accuracy: 0.2582
Epoch 12/50
724/724 [==============================] - ETA: 0s - loss: 3.0024 - accuracy: 0.4459
Epoch 12: val_loss improved from 4.70261 to 4.52160, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 474s 654ms/step - loss: 3.0024 - accuracy: 0.4459 - val_loss: 4.5216 - val_accuracy: 0.2924
Epoch 13/50
724/724 [==============================] - ETA: 0s - loss: 2.6653 - accuracy: 0.5139
Epoch 13: val_loss improved from 4.52160 to 4.35553, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 505s 697ms/step - loss: 2.6653 - accuracy: 0.5139 - val_loss: 4.3555 - val_accuracy: 0.3275
Epoch 14/50
724/724 [==============================] - ETA: 0s - loss: 2.3434 - accuracy: 0.5783
Epoch 14: val_loss improved from 4.35553 to 4.18917, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 472s 652ms/step - loss: 2.3434 - accuracy: 0.5783 - val_loss: 4.1892 - val_accuracy: 0.3682
Epoch 15/50
724/724 [==============================] - ETA: 0s - loss: 2.0391 - accuracy: 0.6353
Epoch 15: val_loss improved from 4.18917 to 4.04068, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 510s 704ms/step - loss: 2.0391 - accuracy: 0.6353 - val_loss: 4.0407 - val_accuracy: 0.4064
Epoch 16/50
724/724 [==============================] - ETA: 0s - loss: 1.7560 - accuracy: 0.6924
Epoch 16: val_loss improved from 4.04068 to 3.87016, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 480s 664ms/step - loss: 1.7560 - accuracy: 0.6924 - val_loss: 3.8702 - val_accuracy: 0.4477
Epoch 17/50
724/724 [==============================] - ETA: 0s - loss: 1.4943 - accuracy: 0.7416
Epoch 17: val_loss improved from 3.87016 to 3.74787, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 513s 709ms/step - loss: 1.4943 - accuracy: 0.7416 - val_loss: 3.7479 - val_accuracy: 0.4739
Epoch 18/50
724/724 [==============================] - ETA: 0s - loss: 1.2571 - accuracy: 0.7888
Epoch 18: val_loss improved from 3.74787 to 3.61463, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 492s 679ms/step - loss: 1.2571 - accuracy: 0.7888 - val_loss: 3.6146 - val_accuracy: 0.5113
Epoch 19/50
724/724 [==============================] - ETA: 0s - loss: 1.0431 - accuracy: 0.8310
Epoch 19: val_loss improved from 3.61463 to 3.48974, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 497s 687ms/step - loss: 1.0431 - accuracy: 0.8310 - val_loss: 3.4897 - val_accuracy: 0.5410
Epoch 20/50
724/724 [==============================] - ETA: 0s - loss: 0.8561 - accuracy: 0.8702
Epoch 20: val_loss improved from 3.48974 to 3.39216, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 492s 679ms/step - loss: 0.8561 - accuracy: 0.8702 - val_loss: 3.3922 - val_accuracy: 0.5707
Epoch 21/50
724/724 [==============================] - ETA: 0s - loss: 0.6875 - accuracy: 0.9031
Epoch 21: val_loss improved from 3.39216 to 3.32759, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 488s 673ms/step - loss: 0.6875 - accuracy: 0.9031 - val_loss: 3.3276 - val_accuracy: 0.5906
Epoch 22/50
724/724 [==============================] - ETA: 0s - loss: 0.5420 - accuracy: 0.9328
Epoch 22: val_loss improved from 3.32759 to 3.27522, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 486s 672ms/step - loss: 0.5420 - accuracy: 0.9328 - val_loss: 3.2752 - val_accuracy: 0.6037
Epoch 23/50
724/724 [==============================] - ETA: 0s - loss: 0.4179 - accuracy: 0.9533
Epoch 23: val_loss improved from 3.27522 to 3.21136, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 505s 698ms/step - loss: 0.4179 - accuracy: 0.9533 - val_loss: 3.2114 - val_accuracy: 0.6143
Epoch 24/50
724/724 [==============================] - ETA: 0s - loss: 0.3216 - accuracy: 0.9670
Epoch 24: val_loss improved from 3.21136 to 3.19814, saving model to .\model2_epoch50_2lstm_1dense_seq50_phrase_based_best.h5
724/724 [==============================] - 495s 684ms/step - loss: 0.3216 - accuracy: 0.9670 - val_loss: 3.1981 - val_accuracy: 0.6167
Epoch 25/50
724/724 [==============================] - ETA: 0s - loss: 0.2464 - accuracy: 0.9747
Epoch 25: val_loss did not improve from 3.19814
724/724 [==============================] - 490s 676ms/step - loss: 0.2464 - accuracy: 0.9747 - val_loss: 3.2126 - val_accuracy: 0.6171
Epoch 26/50
724/724 [==============================] - ETA: 0s - loss: 0.1926 - accuracy: 0.9781
Epoch 26: val_loss did not improve from 3.19814
724/724 [==============================] - 489s 676ms/step - loss: 0.1926 - accuracy: 0.9781 - val_loss: 3.2302 - val_accuracy: 0.6161
Epoch 27/50
724/724 [==============================] - ETA: 0s - loss: 0.1584 - accuracy: 0.9790
Epoch 27: val_loss did not improve from 3.19814
724/724 [==============================] - 481s 664ms/step - loss: 0.1584 - accuracy: 0.9790 - val_loss: 3.2447 - val_accuracy: 0.6183
Epoch 28/50
724/724 [==============================] - ETA: 0s - loss: 0.1363 - accuracy: 0.9794
Epoch 28: val_loss did not improve from 3.19814
724/724 [==============================] - 491s 678ms/step - loss: 0.1363 - accuracy: 0.9794 - val_loss: 3.2659 - val_accuracy: 0.6182
Epoch 29/50
724/724 [==============================] - ETA: 0s - loss: 0.1170 - accuracy: 0.9807
Epoch 29: val_loss did not improve from 3.19814
724/724 [==============================] - 479s 662ms/step - loss: 0.1170 - accuracy: 0.9807 - val_loss: 3.2872 - val_accuracy: 0.6178
Epoch 30/50
724/724 [==============================] - ETA: 0s - loss: 0.1066 - accuracy: 0.9807
Epoch 30: val_loss did not improve from 3.19814
724/724 [==============================] - 489s 676ms/step - loss: 0.1066 - accuracy: 0.9807 - val_loss: 3.3080 - val_accuracy: 0.6219
Epoch 31/50
724/724 [==============================] - ETA: 0s - loss: 0.0951 - accuracy: 0.9817
Epoch 31: val_loss did not improve from 3.19814
724/724 [==============================] - 486s 672ms/step - loss: 0.0951 - accuracy: 0.9817 - val_loss: 3.3242 - val_accuracy: 0.6189
Epoch 32/50
724/724 [==============================] - ETA: 0s - loss: 0.0885 - accuracy: 0.9817
Epoch 32: val_loss did not improve from 3.19814
724/724 [==============================] - 497s 687ms/step - loss: 0.0885 - accuracy: 0.9817 - val_loss: 3.3471 - val_accuracy: 0.6206
Epoch 33/50
724/724 [==============================] - ETA: 0s - loss: 0.0846 - accuracy: 0.9820
Epoch 33: val_loss did not improve from 3.19814
724/724 [==============================] - 491s 678ms/step - loss: 0.0846 - accuracy: 0.9820 - val_loss: 3.3512 - val_accuracy: 0.6194
Epoch 34/50
724/724 [==============================] - ETA: 0s - loss: 0.0782 - accuracy: 0.9827
Epoch 34: val_loss did not improve from 3.19814
724/724 [==============================] - 494s 683ms/step - loss: 0.0782 - accuracy: 0.9827 - val_loss: 3.4060 - val_accuracy: 0.6209
Epoch 35/50
724/724 [==============================] - ETA: 0s - loss: 0.0730 - accuracy: 0.9838
Epoch 35: val_loss did not improve from 3.19814
724/724 [==============================] - 479s 662ms/step - loss: 0.0730 - accuracy: 0.9838 - val_loss: 3.3856 - val_accuracy: 0.6207
Epoch 36/50
724/724 [==============================] - ETA: 0s - loss: 0.0698 - accuracy: 0.9838
Epoch 36: val_loss did not improve from 3.19814
724/724 [==============================] - 491s 679ms/step - loss: 0.0698 - accuracy: 0.9838 - val_loss: 3.4165 - val_accuracy: 0.6238
Epoch 37/50
724/724 [==============================] - ETA: 0s - loss: 0.0682 - accuracy: 0.9838
Epoch 37: val_loss did not improve from 3.19814
724/724 [==============================] - 491s 678ms/step - loss: 0.0682 - accuracy: 0.9838 - val_loss: 3.4286 - val_accuracy: 0.6227
Epoch 38/50
724/724 [==============================] - ETA: 0s - loss: 0.0627 - accuracy: 0.9848
Epoch 38: val_loss did not improve from 3.19814
724/724 [==============================] - 489s 675ms/step - loss: 0.0627 - accuracy: 0.9848 - val_loss: 3.4581 - val_accuracy: 0.6219
Epoch 39/50
724/724 [==============================] - ETA: 0s - loss: 0.0613 - accuracy: 0.9847
Epoch 39: val_loss did not improve from 3.19814
724/724 [==============================] - 491s 678ms/step - loss: 0.0613 - accuracy: 0.9847 - val_loss: 3.4872 - val_accuracy: 0.6225
Epoch 40/50
724/724 [==============================] - ETA: 0s - loss: 0.0581 - accuracy: 0.9857
Epoch 40: val_loss did not improve from 3.19814
724/724 [==============================] - 498s 688ms/step - loss: 0.0581 - accuracy: 0.9857 - val_loss: 3.4986 - val_accuracy: 0.6199
Epoch 41/50
724/724 [==============================] - ETA: 0s - loss: 0.0574 - accuracy: 0.9854
Epoch 41: val_loss did not improve from 3.19814
724/724 [==============================] - 476s 658ms/step - loss: 0.0574 - accuracy: 0.9854 - val_loss: 3.4774 - val_accuracy: 0.6212
Epoch 42/50
724/724 [==============================] - ETA: 0s - loss: 0.0561 - accuracy: 0.9851
Epoch 42: val_loss did not improve from 3.19814
724/724 [==============================] - 479s 661ms/step - loss: 0.0561 - accuracy: 0.9851 - val_loss: 3.5298 - val_accuracy: 0.6215
Epoch 43/50
724/724 [==============================] - ETA: 0s - loss: 0.0521 - accuracy: 0.9864
Epoch 43: val_loss did not improve from 3.19814
724/724 [==============================] - 490s 677ms/step - loss: 0.0521 - accuracy: 0.9864 - val_loss: 3.5399 - val_accuracy: 0.6225
Epoch 44/50
724/724 [==============================] - ETA: 0s - loss: 0.0538 - accuracy: 0.9855
Epoch 44: val_loss did not improve from 3.19814
724/724 [==============================] - 479s 661ms/step - loss: 0.0538 - accuracy: 0.9855 - val_loss: 3.5258 - val_accuracy: 0.6219
Epoch 45/50
724/724 [==============================] - ETA: 0s - loss: 0.0511 - accuracy: 0.9859
Epoch 45: val_loss did not improve from 3.19814
724/724 [==============================] - 490s 677ms/step - loss: 0.0511 - accuracy: 0.9859 - val_loss: 3.5453 - val_accuracy: 0.6241
Epoch 46/50
724/724 [==============================] - ETA: 0s - loss: 0.0498 - accuracy: 0.9862
Epoch 46: val_loss did not improve from 3.19814
724/724 [==============================] - 479s 662ms/step - loss: 0.0498 - accuracy: 0.9862 - val_loss: 3.5661 - val_accuracy: 0.6234
Epoch 47/50
724/724 [==============================] - ETA: 0s - loss: 0.0489 - accuracy: 0.9866
Epoch 47: val_loss did not improve from 3.19814
724/724 [==============================] - 507s 700ms/step - loss: 0.0489 - accuracy: 0.9866 - val_loss: 3.5884 - val_accuracy: 0.6232
Epoch 48/50
724/724 [==============================] - ETA: 0s - loss: 0.0477 - accuracy: 0.9864
Epoch 48: val_loss did not improve from 3.19814
724/724 [==============================] - 508s 702ms/step - loss: 0.0477 - accuracy: 0.9864 - val_loss: 3.5940 - val_accuracy: 0.6211
Epoch 49/50
724/724 [==============================] - ETA: 0s - loss: 0.0452 - accuracy: 0.9870
Epoch 49: val_loss did not improve from 3.19814
724/724 [==============================] - 514s 710ms/step - loss: 0.0452 - accuracy: 0.9870 - val_loss: 3.6373 - val_accuracy: 0.6241
Epoch 50/50
724/724 [==============================] - ETA: 0s - loss: 0.0458 - accuracy: 0.9870
Epoch 50: val_loss did not improve from 3.19814
724/724 [==============================] - 511s 706ms/step - loss: 0.0458 - accuracy: 0.9870 - val_loss: 3.6021 - val_accuracy: 0.6230
2023-04-16 20:17:42.808287: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 20:17:42.810103: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
喜欢离马的孤
她我 机唱期对没候
我情独地
只怀会爱 你还笑话
我最万问它命水大后
当光天你我前
天用我化化
这笑转最
你是带么爱的心成
我们的地传却生生的乎
当奈知男何诉会希如若乱下的错
他就以眼深作 生俗坚了
都是世泪着你的时座
好流谈 独始抓了拥海
遥绝江书白不海小夜
老写墙的不好力念
当远在天远
你的心肆
蝴追心着领流
和笑脚脱了鲜去
你鹤戒变到流了见谁
摆她的定面 机散无以波听
每些人的吃世酒 今们没人心狂
等颠成人
外就动当 已袋寂漪在心
堆古迷最散都看会洒必下
我打想陪我们在想行

2023-04-16 20:20:01.516025: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 20:20:01.517944: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
喜欢着你的腾影
灵风着这窗的恤号
我轻时的深涌
假说 人 此如隐住当样
你的时就雨来年 数尽第绝散场手 水倚宣了然沙梦
有虽天尘完曳 迷解你
这一梦灯歌的夏座流早飞勺
最完了有经行 出若每告来
神关气中白又天色
心星如如经你走素
没为命头有有个经有有一常生有有感的大花
我的人翼你不想全天
一色去风风风
三开角天坠时一天
生指已定气
当是和你干而的纯难
 离离方和 深不爱
雨门
可飞终吗
 记所以就不折变
让时我 只想干花约发
我想你还很己解
等情唱一个歌隔伤星美
江个红影雨 梦尘你斜晨在转了流手

Process finished with exit code 0
2023-04-16 21:27:07.969038: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 21:27:07.990591: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
输入任意作为生成歌词的起始:喜欢
喜欢出眼的孤晚
你星合带我我的一到
随当我的笑
红开月片带
生十放放雨线 挥了了得干
这将附拿为你年
迷得的行的初惑
当不紫的人 如人无遗情
冷繁我的一大喜畔 认雪里你着我
深转的梦
我眼可在爱下
你们最去里耳深新的南远
暖有人回很困味
温开摇载我的冷泪
他在不筋慢上 埋身入到那疼么全
你的多失是多对一多的火难
相必就想生只都完不
看能说也太容中
是有一人 许么眼始才活
是要人失初维玫就就今在要就全音
写不可回在不头最能头可走的失里
我是听的梦的喜单来都有眼锁了她
摆脱衙了夜窗都了孤心
仿她天能

F:\anaconda3\python.exe "E:/PyCharm Community Edition 2021.2.3/train_lstm/train_lstm_word_based.py"
2023-04-16 23:00:00.611644: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-16 23:00:00.773888: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
2023-04-16 23:00:02.191776: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
embedding layer is:: KerasTensor(type_spec=TensorSpec(shape=(None, 6, 800), dtype=tf.float32, name=None), name='embedding/embedding_lookup/Identity_1:0', description="created by layer 'embedding'")
build model.....
Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 x_train (InputLayer)        [(None, 6)]               0         
                                                                 
 embedding (Embedding)       (None, 6, 800)            1865600   
                                                                 
 LSTM_1 (LSTM)               (None, 6, 800)            5123200   
                                                                 
 LSTM_2 (LSTM)               (None, 1600)              15366400  
                                                                 
 Dense_1 (Dense)             (None, 2332)              3733532   
                                                                 
=================================================================
Total params: 26,088,732
Trainable params: 26,088,732
Non-trainable params: 0
_________________________________________________________________
None
F:\anaconda3\lib\site-packages\keras\optimizers\optimizer_v2\adam.py:110: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super(Adam, self).__init__(name, **kwargs)
Train....
WARNING:tensorflow:`period` argument is deprecated. Please use `save_freq` to specify the frequency in number of batches seen.
Epoch 1/30
691/691 [==============================] - ETA: 0s - loss: 6.2159 - accuracy: 0.1085
Epoch 1: val_loss improved from inf to 6.04871, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 329s 462ms/step - loss: 6.2159 - accuracy: 0.1085 - val_loss: 6.0487 - val_accuracy: 0.1160
Epoch 2/30
691/691 [==============================] - ETA: 0s - loss: 5.9450 - accuracy: 0.1200
Epoch 2: val_loss improved from 6.04871 to 5.99066, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 329s 476ms/step - loss: 5.9450 - accuracy: 0.1200 - val_loss: 5.9907 - val_accuracy: 0.1220
Epoch 3/30
691/691 [==============================] - ETA: 0s - loss: 5.8189 - accuracy: 0.1268
Epoch 3: val_loss improved from 5.99066 to 5.89026, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 306s 443ms/step - loss: 5.8189 - accuracy: 0.1268 - val_loss: 5.8903 - val_accuracy: 0.1257
Epoch 4/30
691/691 [==============================] - ETA: 0s - loss: 5.6723 - accuracy: 0.1319
Epoch 4: val_loss improved from 5.89026 to 5.79656, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 316s 457ms/step - loss: 5.6723 - accuracy: 0.1319 - val_loss: 5.7966 - val_accuracy: 0.1293
Epoch 5/30
691/691 [==============================] - ETA: 0s - loss: 5.4981 - accuracy: 0.1396
Epoch 5: val_loss improved from 5.79656 to 5.67159, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 327s 473ms/step - loss: 5.4981 - accuracy: 0.1396 - val_loss: 5.6716 - val_accuracy: 0.1346
Epoch 6/30
691/691 [==============================] - ETA: 0s - loss: 5.2895 - accuracy: 0.1474
Epoch 6: val_loss improved from 5.67159 to 5.55534, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 315s 456ms/step - loss: 5.2895 - accuracy: 0.1474 - val_loss: 5.5553 - val_accuracy: 0.1371
Epoch 7/30
691/691 [==============================] - ETA: 0s - loss: 5.0373 - accuracy: 0.1606
Epoch 7: val_loss improved from 5.55534 to 5.43768, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 323s 468ms/step - loss: 5.0373 - accuracy: 0.1606 - val_loss: 5.4377 - val_accuracy: 0.1431
Epoch 8/30
691/691 [==============================] - ETA: 0s - loss: 4.7511 - accuracy: 0.1786
Epoch 8: val_loss improved from 5.43768 to 5.28905, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 312s 452ms/step - loss: 4.7511 - accuracy: 0.1786 - val_loss: 5.2891 - val_accuracy: 0.1590
Epoch 9/30
691/691 [==============================] - ETA: 0s - loss: 4.4522 - accuracy: 0.2075
Epoch 9: val_loss improved from 5.28905 to 5.12466, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 325s 470ms/step - loss: 4.4522 - accuracy: 0.2075 - val_loss: 5.1247 - val_accuracy: 0.1775
Epoch 10/30
691/691 [==============================] - ETA: 0s - loss: 4.1476 - accuracy: 0.2482
Epoch 10: val_loss improved from 5.12466 to 5.00481, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 310s 449ms/step - loss: 4.1476 - accuracy: 0.2482 - val_loss: 5.0048 - val_accuracy: 0.1955
Epoch 11/30
691/691 [==============================] - ETA: 0s - loss: 3.8423 - accuracy: 0.2941
Epoch 11: val_loss improved from 5.00481 to 4.81261, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 304s 439ms/step - loss: 3.8423 - accuracy: 0.2941 - val_loss: 4.8126 - val_accuracy: 0.2254
Epoch 12/30
691/691 [==============================] - ETA: 0s - loss: 3.5432 - accuracy: 0.3444
Epoch 12: val_loss improved from 4.81261 to 4.67752, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 318s 460ms/step - loss: 3.5432 - accuracy: 0.3444 - val_loss: 4.6775 - val_accuracy: 0.2536
Epoch 13/30
691/691 [==============================] - ETA: 0s - loss: 3.2465 - accuracy: 0.3979
Epoch 13: val_loss improved from 4.67752 to 4.53187, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 320s 463ms/step - loss: 3.2465 - accuracy: 0.3979 - val_loss: 4.5319 - val_accuracy: 0.2777
Epoch 14/30
691/691 [==============================] - ETA: 0s - loss: 2.9621 - accuracy: 0.4513
Epoch 14: val_loss improved from 4.53187 to 4.38059, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 305s 441ms/step - loss: 2.9621 - accuracy: 0.4513 - val_loss: 4.3806 - val_accuracy: 0.3122
Epoch 15/30
691/691 [==============================] - ETA: 0s - loss: 2.6872 - accuracy: 0.5041
Epoch 15: val_loss improved from 4.38059 to 4.21560, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 315s 456ms/step - loss: 2.6872 - accuracy: 0.5041 - val_loss: 4.2156 - val_accuracy: 0.3410
Epoch 16/30
691/691 [==============================] - ETA: 0s - loss: 2.4248 - accuracy: 0.5576
Epoch 16: val_loss improved from 4.21560 to 4.08963, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 303s 438ms/step - loss: 2.4248 - accuracy: 0.5576 - val_loss: 4.0896 - val_accuracy: 0.3705
Epoch 17/30
691/691 [==============================] - ETA: 0s - loss: 2.1754 - accuracy: 0.6073
Epoch 17: val_loss improved from 4.08963 to 3.93891, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 303s 439ms/step - loss: 2.1754 - accuracy: 0.6073 - val_loss: 3.9389 - val_accuracy: 0.4014
Epoch 18/30
691/691 [==============================] - ETA: 0s - loss: 1.9425 - accuracy: 0.6544 
Epoch 18: val_loss improved from 3.93891 to 3.82455, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 30438s 44s/step - loss: 1.9425 - accuracy: 0.6544 - val_loss: 3.8245 - val_accuracy: 0.4293
Epoch 19/30
691/691 [==============================] - ETA: 0s - loss: 1.7218 - accuracy: 0.6966
Epoch 19: val_loss improved from 3.82455 to 3.68174, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 340s 492ms/step - loss: 1.7218 - accuracy: 0.6966 - val_loss: 3.6817 - val_accuracy: 0.4658
Epoch 20/30
691/691 [==============================] - ETA: 0s - loss: 1.5157 - accuracy: 0.7379
Epoch 20: val_loss improved from 3.68174 to 3.56363, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 322s 466ms/step - loss: 1.5157 - accuracy: 0.7379 - val_loss: 3.5636 - val_accuracy: 0.4883
Epoch 21/30
691/691 [==============================] - ETA: 0s - loss: 1.3256 - accuracy: 0.7750
Epoch 21: val_loss improved from 3.56363 to 3.46436, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 337s 487ms/step - loss: 1.3256 - accuracy: 0.7750 - val_loss: 3.4644 - val_accuracy: 0.5176
Epoch 22/30
691/691 [==============================] - ETA: 0s - loss: 1.1541 - accuracy: 0.8065
Epoch 22: val_loss improved from 3.46436 to 3.36404, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 1441s 2s/step - loss: 1.1541 - accuracy: 0.8065 - val_loss: 3.3640 - val_accuracy: 0.5449
Epoch 23/30
691/691 [==============================] - ETA: 0s - loss: 0.9930 - accuracy: 0.8374
Epoch 23: val_loss improved from 3.36404 to 3.26704, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 317s 459ms/step - loss: 0.9930 - accuracy: 0.8374 - val_loss: 3.2670 - val_accuracy: 0.5724
Epoch 24/30
691/691 [==============================] - ETA: 0s - loss: 0.8485 - accuracy: 0.8685
Epoch 24: val_loss improved from 3.26704 to 3.17621, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 332s 480ms/step - loss: 0.8485 - accuracy: 0.8685 - val_loss: 3.1762 - val_accuracy: 0.5924
Epoch 25/30
691/691 [==============================] - ETA: 0s - loss: 0.7176 - accuracy: 0.8927
Epoch 25: val_loss improved from 3.17621 to 3.11386, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 540s 782ms/step - loss: 0.7176 - accuracy: 0.8927 - val_loss: 3.1139 - val_accuracy: 0.6126
Epoch 26/30
691/691 [==============================] - ETA: 0s - loss: 0.6013 - accuracy: 0.9182
Epoch 26: val_loss improved from 3.11386 to 3.04579, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 341s 493ms/step - loss: 0.6013 - accuracy: 0.9182 - val_loss: 3.0458 - val_accuracy: 0.6273
Epoch 27/30
691/691 [==============================] - ETA: 0s - loss: 0.4965 - accuracy: 0.9367
Epoch 27: val_loss improved from 3.04579 to 2.98705, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 323s 467ms/step - loss: 0.4965 - accuracy: 0.9367 - val_loss: 2.9871 - val_accuracy: 0.6416
Epoch 28/30
691/691 [==============================] - ETA: 0s - loss: 0.4084 - accuracy: 0.9527
Epoch 28: val_loss improved from 2.98705 to 2.96121, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 332s 480ms/step - loss: 0.4084 - accuracy: 0.9527 - val_loss: 2.9612 - val_accuracy: 0.6478
Epoch 29/30
691/691 [==============================] - ETA: 0s - loss: 0.3306 - accuracy: 0.9636 
Epoch 29: val_loss improved from 2.96121 to 2.95598, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 8987s 13s/step - loss: 0.3306 - accuracy: 0.9636 - val_loss: 2.9560 - val_accuracy: 0.6529
Epoch 30/30
691/691 [==============================] - ETA: 0s - loss: 0.2683 - accuracy: 0.9710
Epoch 30: val_loss improved from 2.95598 to 2.93468, saving model to .\model_epoch30_2lstm_1dense_seq6_phrase_based_best.h5
691/691 [==============================] - 327s 473ms/step - loss: 0.2683 - accuracy: 0.9710 - val_loss: 2.9347 - val_accuracy: 0.6548

Process finished with exit code 0


2023-04-17 12:57:24.666280: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-17 12:57:24.668160: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
输入任意作为生成歌词的起始:喜欢
喜欢回念的回
可怕万留残雨
缅缚要点年
我情始出许听错对那满松儿
脑流补的食
科雨手
外拍了去浮手
你说轻了了了什
你当不落有内
所用会在秒发上长地影
三个人字气去
错领少还多不以 你常息脸
如今这在你的霜 瞄怎楚
你的不味悬的半雨
表旋无才全少 归头步
那又你这够燃
老意界
也知情生去没然银逃容
让关林去十始伞的脚指
书情这变释 摆谁得黄 了些美
有零场里纷约沉默
消真已年到没幻意
追书你着能不痛走
完声也电间
又南间酒夜  那爱我的保念
点课张的歌途
只你的喜和每十够都文心


2023-04-17 18:16:37.570332: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-04-17 18:16:37.614413: I tensorflow/core/common_runtime/process_util.cc:146] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
乌鸦停在海岸荒露里节寒山
对把演张风短
终不不道沿途风复们
我们这经许此旋迟睡
孤爱的临落
月风造冷 局爱的信情
算梦成是口 难刻容过放座
爱摇 渐起们年来春雪 幸片的了十得让还是不么
家国困上悲化
通上为你间没有是够听
我抱相首残唤我摇道的揪
可怀搞自音各在在陪你的唱白
又来的风想
视时上了活在还情心的
那准城向的衙品驰
我着你的风景
点喝眉指染敌提深学岁快非着副脆弱
分真春满江音
没别么何好
你又绝孩开
我轻听尽首也起 就来像我的备
和妹月到环店
窗色了风中
掀角一样都会束会
番恍太路情自难

