Texture Synthesis Using Convolutional Neural Networks

@[TOC]

Original article

Paper: Texture Synthesis Using Convolutional Neural Networks
Chinese translation: https://blog.csdn.net/cicibabe/article/details/70991588

Main ideas





Implementation

"Texture Synthesis Using Convolutional Neural Networks" - TensorFlow implementation
In summary, we generate a texture by starting from a random noise image and optimizing it to match the statistics of a sample texture image.

Step 1: Preprocessing the input image.

Step 2: Computing the outputs of all layers for the input image.

Step 3: Defining and computing the loss function.

Step 4: Running the TensorFlow model to minimize the loss by optimizing the input noise variable.

Step 5: Post-processing and displaying the image.

Step 6: Automating the pipeline.

Step 7: Plotting the successful results.
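The repository implements these steps in TensorFlow, but the core of steps 2-3 is the paper's Gram-matrix texture loss, which can be illustrated framework-free. The NumPy sketch below is my own (function names are illustrative, not the repo's): for each layer, the channel-correlation Gram matrix of the generated image's features is compared with that of the sample texture, normalized by 1/(4·N²·M²) as in Gatys et al.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of one layer's feature map.
    features: array of shape (H, W, C).
    Returns a (C, C) matrix of channel correlations."""
    h, w, c = features.shape
    f = features.reshape(h * w, c)   # flatten spatial dimensions
    return f.T @ f                   # (C, C)

def texture_loss(gen_features, tex_features):
    """Sum over layers of the squared Gram difference, each term
    normalized by 1 / (4 * N^2 * M^2) as in the paper
    (N = channels, M = spatial positions)."""
    loss = 0.0
    for fg, ft in zip(gen_features, tex_features):
        h, w, c = fg.shape
        n, m = c, h * w
        diff = gram_matrix(fg) - gram_matrix(ft)
        loss += np.sum(diff ** 2) / (4.0 * n**2 * m**2)
    return loss
```

Identical feature maps give zero loss, so minimizing this objective drives the noise image's feature statistics toward those of the texture.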

Step 4: Running the TensorFlow model to minimize the loss by optimizing the input noise variable.

```
C:\Users\lenovo\Jupyter Notbook\Texture-Synthesis-Using-Convolutional-Neural-Networks-master\tensorflow_vgg\vgg16.npy
npy file loaded
build model started
build model finished: 0s
Epoch: 100/10000  Loss:  5529756400000000.0
Epoch: 200/10000  Loss:  1722265000000000.0
Epoch: 300/10000  Loss:  1083315040000000.0
Epoch: 400/10000  Loss:  790199800000000.0
Epoch: 500/10000  Loss:  570981500000000.0
Epoch: 600/10000  Loss:  410827380000000.0
Epoch: 700/10000  Loss:  297603500000000.0
Epoch: 800/10000  Loss:  220161130000000.0
Epoch: 900/10000  Loss:  168884270000000.0
Epoch: 1000/10000  Loss:  135689240000000.0
Epoch: 1100/10000  Loss:  114025070000000.0
Epoch: 1200/10000  Loss:  99368410000000.0
Epoch: 1300/10000  Loss:  88925840000000.0
Epoch: 1400/10000  Loss:  81028790000000.0
Epoch: 1500/10000  Loss:  74668535000000.0
Epoch: 1600/10000  Loss:  69297890000000.0
Epoch: 1700/10000  Loss:  64574640000000.0
Epoch: 1800/10000  Loss:  60283637000000.0
Epoch: 1900/10000  Loss:  56295533000000.0
Epoch: 2000/10000  Loss:  52524070000000.0
Epoch: 2100/10000  Loss:  48925063000000.0
Epoch: 2200/10000  Loss:  45459750000000.0
Epoch: 2300/10000  Loss:  42112372000000.0
Epoch: 2400/10000  Loss:  38868896000000.0
Epoch: 2500/10000  Loss:  35743810000000.0
Epoch: 2600/10000  Loss:  32742763000000.0
Epoch: 2700/10000  Loss:  29852806000000.0
Epoch: 2800/10000  Loss:  27081340000000.0
Epoch: 2900/10000  Loss:  24441560000000.0
Epoch: 3000/10000  Loss:  21946470000000.0
Epoch: 3100/10000  Loss:  19601977000000.0
Epoch: 3200/10000  Loss:  17414532000000.0
Epoch: 3300/10000  Loss:  15395135000000.0
Epoch: 3400/10000  Loss:  13546737000000.0
Epoch: 3500/10000  Loss:  11869900000000.0
Epoch: 3600/10000  Loss:  10365760000000.0
Epoch: 3700/10000  Loss:  9022828000000.0
Epoch: 3800/10000  Loss:  7828565000000.0
Epoch: 3900/10000  Loss:  6765571000000.0
Epoch: 4000/10000  Loss:  5823080000000.0
Epoch: 4100/10000  Loss:  4990639000000.0
Epoch: 4200/10000  Loss:  4259004500000.0
Epoch: 4300/10000  Loss:  3619611700000.0
Epoch: 4400/10000  Loss:  3064605200000.0
Epoch: 4500/10000  Loss:  2590721700000.0
Epoch: 4600/10000  Loss:  2192407400000.0
Epoch: 4700/10000  Loss:  1864649100000.0
Epoch: 4800/10000  Loss:  1597569000000.0
Epoch: 4900/10000  Loss:  1380381600000.0
Epoch: 5000/10000  Loss:  1202861100000.0
Epoch: 5100/10000  Loss:  1058127100000.0
Epoch: 5200/10000  Loss:  939163900000.0
Epoch: 5300/10000  Loss:  840591540000.0
Epoch: 5400/10000  Loss:  758178300000.0
Epoch: 5500/10000  Loss:  687998500000.0
Epoch: 5600/10000  Loss:  627689200000.0
Epoch: 5700/10000  Loss:  575582900000.0
Epoch: 5800/10000  Loss:  530246400000.0
Epoch: 5900/10000  Loss:  490397470000.0
Epoch: 6000/10000  Loss:  455180550000.0
Epoch: 6100/10000  Loss:  423790480000.0
Epoch: 6200/10000  Loss:  395681070000.0
Epoch: 6300/10000  Loss:  370407100000.0
Epoch: 6400/10000  Loss:  347494480000.0
Epoch: 6500/10000  Loss:  326646300000.0
Epoch: 6600/10000  Loss:  307627660000.0
Epoch: 6700/10000  Loss:  290129280000.0
Epoch: 6800/10000  Loss:  273938200000.0
Epoch: 6900/10000  Loss:  258948480000.0
Epoch: 7000/10000  Loss:  245060340000.0
Epoch: 7100/10000  Loss:  232265710000.0
Epoch: 7200/10000  Loss:  220370030000.0
Epoch: 7300/10000  Loss:  209295410000.0
Epoch: 7400/10000  Loss:  198971560000.0
Epoch: 7500/10000  Loss:  189266130000.0
Epoch: 7600/10000  Loss:  180154430000.0
Epoch: 7700/10000  Loss:  171579100000.0
Epoch: 7800/10000  Loss:  163506320000.0
Epoch: 7900/10000  Loss:  155893020000.0
Epoch: 8000/10000  Loss:  148692430000.0
Epoch: 8100/10000  Loss:  141869370000.0
Epoch: 8200/10000  Loss:  135439640000.0
Epoch: 8300/10000  Loss:  129361560000.0
Epoch: 8400/10000  Loss:  123593610000.0
Epoch: 8500/10000  Loss:  118108680000.0
Epoch: 8600/10000  Loss:  112880360000.0
Epoch: 8700/10000  Loss:  107913210000.0
Epoch: 8800/10000  Loss:  103188360000.0
Epoch: 8900/10000  Loss:  98686030000.0
Epoch: 9000/10000  Loss:  94398145000.0
Epoch: 9100/10000  Loss:  90303500000.0
Epoch: 9200/10000  Loss:  86406760000.0
Epoch: 9300/10000  Loss:  82687780000.0
Epoch: 9400/10000  Loss:  79144060000.0
Epoch: 9500/10000  Loss:  75771310000.0
Epoch: 9600/10000  Loss:  72547975000.0
Epoch: 9700/10000  Loss:  69478646000.0
Epoch: 9800/10000  Loss:  66521575000.0
Epoch: 9900/10000  Loss:  63701450000.0
Epoch: 10000/10000  Loss:  61003633000.0
```
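The steady fall of the logged loss is just iterative descent on the Gram objective. The repo drives a TensorFlow optimizer on the `input_noise` variable; as a toy illustration of the same dynamic, the sketch below runs plain gradient descent on random "features" (my own toy sizes and names, not real VGG activations), using the analytic gradient of the unnormalized Gram loss, 4·F·(G − Ĝ):

```python
import numpy as np

def gram(f):
    """Gram matrix of a flattened feature map f of shape (M, N):
    M spatial positions, N channels."""
    return f.T @ f

rng = np.random.default_rng(0)
M, N = 64, 8                                 # toy sizes, not VGG's
target = gram(rng.normal(size=(M, N)))       # "texture" statistics
x = rng.normal(size=(M, N))                  # random-noise start
lr, losses = 1e-4, []

for step in range(300):
    diff = gram(x) - target
    losses.append(float(np.sum(diff ** 2)))  # unnormalized Gram loss
    x_new = x - lr * (4.0 * x @ diff)        # analytic gradient step
    if np.sum((gram(x_new) - target) ** 2) > losses[-1]:
        lr *= 0.5                            # crude backtracking: reject
    else:                                    # steps that increase loss
        x = x_new
```

As in the log above, the loss drops by orders of magnitude early on and then decays ever more slowly as the Gram statistics are matched.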

Step 6: Automating the pipeline.

```
Configuration : 5 - Upto Pooling Layer 4
D:\LEARN\Deep Learning\3. Style Transfer Paper implementations\Texture generation from Image using CNN\tensorflow_vgg\vgg16.npy
npy file loaded
build model started
build model finished: 0s
[]
All layers' outputs have been computed sucessfully.
D:\LEARN\Deep Learning\3. Style Transfer Paper implementations\Texture generation from Image using CNN\tensorflow_vgg\vgg16.npy
npy file loaded
build model started
build model finished: 0s
[<tf.Variable 'input_noise:0' shape=(1, 256, 256, 3) dtype=float32_ref>]
Epoch: 1000/10000  Loss:  172597660000000.0
Epoch: 2000/10000  Loss:  58709074000000.0
Epoch: 3000/10000  Loss:  26250159000000.0
Epoch: 4000/10000  Loss:  7911461400000.0
Epoch: 5000/10000  Loss:  2571915500000.0
Epoch: 6000/10000  Loss:  1335424300000.0
Epoch: 7000/10000  Loss:  869748830000.0
Epoch: 8000/10000  Loss:  622193900000.0
Epoch: 9000/10000  Loss:  470882400000.0
Epoch: 10000/10000  Loss:  376950520000.0
```
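Step 6 amounts to wrapping steps 1-5 in a loop over layer configurations, where configuration k matches Gram statistics on a nested set of layers up to pooling layer k-1 (the log above shows "Configuration : 5 - Upto Pooling Layer 4"). A hypothetical sketch of such a driver, with made-up names (the repo's actual function and layer names may differ):

```python
# Hypothetical automation driver; names are illustrative, not the repo's.
VGG_LAYERS = ["conv1_1", "pool1", "pool2", "pool3", "pool4"]

def make_configs(layers):
    """Configuration k uses the first k layers (nested layer sets,
    as in the paper's comparison across network depths)."""
    return [layers[:k] for k in range(1, len(layers) + 1)]

def run_all(layers, synthesize):
    """Run one full synthesis (assumed to cover steps 1-5) per
    configuration and collect results under a descriptive key."""
    results = {}
    for i, cfg in enumerate(make_configs(layers), start=1):
        key = f"Configuration {i} - up to {cfg[-1]}"
        results[key] = synthesize(cfg)
    return results
```

Running all configurations this way also produces the per-configuration images that step 7 plots side by side.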