Preface

In the previous article, "TensorFlow case study: classifying clothing images", we followed the official documentation to preprocess data, build a model, and train it. For a hobbyist like me, however, training a model from scratch is genuinely hard. So why not stand on the shoulders of giants and use a mature model that has already been trained? This article is a short introduction to doing exactly that.

Using a pretrained model

Downloading the model

The official model repository for TensorFlow is TensorFlow Hub (access from mainland China requires a proxy). The model I chose is MobileNetV2; the download address is https://tfhub.dev/google/imagenet/mobilenet_v2_140_224/classification/5

About the model

MobileNetV2 is a lightweight convolutional neural network architecture developed by Google, designed specifically for image classification and object detection on mobile and embedded devices. It is the second version of the MobileNet family and strikes a better balance between accuracy and performance than its predecessor.

On the ImageNet classification task, MobileNetV2 reaches accuracy comparable to much larger and more complex models while being far smaller and cheaper to compute. This makes it an ideal choice for real-time classification and detection on resource-constrained devices.

In TensorFlow you can use the pretrained MobileNetV2 model directly, or fine-tune and retrain it for your own needs. TensorFlow provides the corresponding APIs and tools that make both straightforward.

Usage

Unzip the downloaded model into the project and load it. Complete code:

```python
# Import TensorFlow and NumPy
import tensorflow as tf
import numpy as np
# tensorflow_hub is an extension of TensorFlow that provides a simple interface
# for reusing parts of already-trained machine learning models
import tensorflow_hub as hub
# Font properties (for displaying Chinese labels)
from matplotlib.font_manager import FontProperties
# matplotlib is used for plotting and visualizing data
import matplotlib.pylab as plt
# For loading the JSON label file
import json

# Load the model
# You cannot load a single model file directly; you must load the model directory.
# Check the documentation for how a given model should be loaded --
# tf.keras.models.load_model kept failing here.
model = tf.keras.Sequential([
    hub.KerasLayer("imagenet_mobilenet_v2_140_224_classification_5")
])
print("Model info:", model)

# Preprocess the input data
# 1. MobileNet expects images of size 224 x 224
image = tf.keras.preprocessing.image.load_img("pics/dog.png", target_size=(224, 224))
# 2. Convert the image to an array (even though there is only one image)
image = tf.keras.preprocessing.image.img_to_array(image)
# 3. Add a batch dimension so the array matches the model's expected input
image = np.expand_dims(image, axis=0)
# 4. Use the preprocessing function provided for MobileNetV2; it handles
#    normalization, channel ordering, pixel-value scaling, and so on
image = tf.keras.applications.mobilenet_v2.preprocess_input(image)

# Predict
predictions = model.predict(image)
# Index of the class with the highest probability
predicted_index = np.argmax(predictions)
# Its probability value
confidence = np.max(predictions)
print("Index and confidence:", predicted_index, confidence)

# Load the label mapping file
with open("imagenet-classes.json", "r") as f:
    labels_dict = json.load(f)

# The class indices in the file are strings, so a small adjustment is needed.
# The -1 is because the official mapping adds an extra 0 ("background") at the
# front, while the mapping I found does not, so the index has to be shifted by one.
class_name = labels_dict[str(predicted_index - 1)]

# Visualize the result
font = FontProperties()
font.set_family('Microsoft YaHei')
plt.figure()            # create the figure window
plt.xticks([])
plt.yticks([])
plt.grid(False)         # no grid lines
plt.imshow(image[0])    # show the image
plt.xlabel(class_name, fontproperties=font)
plt.show()              # display the figure window
```

Things to watch out for

With the unnecessary code stripped away there is really very little of it, yet it is enough to classify images. Short as it is, though, I ran into quite a few problems.

Loading the model

Loading the model was the first stumbling block.

First, you need to load the whole model folder, not an individual file inside it:

```python
model = tf.keras.Sequential([
    hub.KerasLayer("imagenet_mobilenet_v2_140_224_classification_5")
])
```

The folder looks like this (screenshot omitted).

Second, the loading function: the first thing a web search turns up is tf.keras.models.load_model, but it does not apply here, and the model refused to load until I checked the official documentation and loaded it as shown above.

For this we need to install tensorflow_hub (hub in the code is its alias). tensorflow_hub is an extension of TensorFlow that provides a simple interface for reusing parts of already-trained machine learning models. Different models are loaded in different ways, so when using a model from the TensorFlow Hub site, pay attention to its usage notes.

```bash
pip install tensorflow_hub
```

Preprocessing the input data

Loading the model is not enough; you also have to prepare the data so that your image matches the model's input format, which is what this code does:

```python
# Preprocess the input data
# 1. MobileNet expects images of size 224 x 224
image = tf.keras.preprocessing.image.load_img("pics/dog.png", target_size=(224, 224))
# 2. Convert the image to an array (even though there is only one image)
image = tf.keras.preprocessing.image.img_to_array(image)
# 3. Add a batch dimension so the array matches the model's expected input
image = np.expand_dims(image, axis=0)
# 4. Use the preprocessing function provided for MobileNetV2; it handles
#    normalization, channel ordering, pixel-value scaling, and so on
image = tf.keras.applications.mobilenet_v2.preprocess_input(image)
```

The label mapping file

After the prediction you still need to know what the image actually is. The model was trained on the ImageNet (ILSVRC-2012-CLS) dataset, so you need the corresponding label mapping. I found one online, but with a small difference: the official mapping adds an extra entry 0 ("background") at the front, while the one I found does not. Renumbering the whole file would be far too much work, so the index is simply shifted by one when looking up the class: class_name = labels_dict[str(predicted_index - 1)].

The mapping file:

```
{-1:[background],0:[n01440764,tench],1:[n01443537,goldfish],2:[n01484850,great white shark],3:[n01491361,tiger
shark],4:[n01494475,hammerhead],5:[n01496331,electric ray],6:[n01498041,stingray],7:[n01514668,cock],8:[n01514859,hen],9:[n01518878,ostrich],10:[n01530575,brambling],11:[n01531178,goldfinch],12:[n01532829,house finch],13:[n01534433,junco],14:[n01537544,indigo bunting],15:[n01558993,robin],16:[n01560419,bulbul],17:[n01580077,jay],18:[n01582220,magpie],19:[n01592084,chickadee],20:[n01601694,water ouzel],21:[n01608432,kite],22:[n01614925,bald eagle],23:[n01616318,vulture],24:[n01622779,great grey owl],25:[n01629819,European fire salamander],26:[n01630670,common newt],27:[n01631663,eft],28:[n01632458,spotted salamander],29:[n01632777,axolotl],30:[n01641577,bullfrog],31:[n01644373,tree frog],32:[n01644900,tailed frog],33:[n01664065,loggerhead],34:[n01665541,leatherback turtle],35:[n01667114,mud turtle],36:[n01667778,terrapin],37:[n01669191,box turtle],38:[n01675722,banded gecko],39:[n01677366,common iguana],40:[n01682714,American chameleon],41:[n01685808,whiptail],42:[n01687978,agama],43:[n01688243,frilled lizard],44:[n01689811,alligator lizard],45:[n01692333,Gila monster],46:[n01693334,green lizard],47:[n01694178,African chameleon],48:[n01695060,Komodo dragon],49:[n01697457,African crocodile],50:[n01698640,American alligator],51:[n01704323,triceratops],52:[n01728572,thunder snake],53:[n01728920,ringneck snake],54:[n01729322,hognose snake],55:[n01729977,green snake],56:[n01734418,king snake],57:[n01735189,garter snake],58:[n01737021,water snake],59:[n01739381,vine snake],60:[n01740131,night snake],61:[n01742172,boa constrictor],62:[n01744401,rock python],63:[n01748264,Indian cobra],64:[n01749939,green mamba],65:[n01751748,sea snake],66:[n01753488,horned viper],67:[n01755581,diamondback],68:[n01756291,sidewinder],69:[n01768244,trilobite],70:[n01770081,harvestman],71:[n01770393,scorpion],72:[n01773157,black and gold garden spider],73:[n01773549,barn spider],74:[n01773797,garden spider],75:[n01774384,black widow],76:[n01774750,tarantula],77:[n01775062,wolf spider],78:[n01776313,tick],79:[n01784675,centipede],80:[n01795545,black grouse],81:[n01796340,ptarmigan],82:[n01797886,ruffed grouse],83:[n01798484,prairie chicken],84:[n01806143,peacock],85:[n01806567,quail],86:[n01807496,partridge],87:[n01817953,African grey],88:[n01818515,macaw],89:[n01819313,sulphur-crested cockatoo],90:[n01820546,lorikeet],91:[n01824575,coucal],92:[n01828970,bee eater],93:[n01829413,hornbill],94:[n01833805,hummingbird],95:[n01843065,jacamar],96:[n01843383,toucan],97:[n01847000,drake],98:[n01855032,red-breasted merganser],99:[n01855672,goose],100:[n01860187,black swan],101:[n01871265,tusker],102:[n01872401,echidna],103:[n01873310,platypus],104:[n01877812,wallaby],105:[n01882714,koala],106:[n01883070,wombat],107:[n01910747,jellyfish],108:[n01914609,sea anemone],109:[n01917289,brain coral],110:[n01924916,flatworm],111:[n01930112,nematode],112:[n01943899,conch],113:[n01944390,snail],114:[n01945685,slug],115:[n01950731,sea slug],116:[n01955084,chiton],117:[n01968897,chambered nautilus],118:[n01978287,Dungeness crab],119:[n01978455,rock crab],120:[n01980166,fiddler crab],121:[n01981276,king crab],122:[n01983481,American lobster],123:[n01984695,spiny lobster],124:[n01985128,crayfish],125:[n01986214,hermit crab],126:[n01990800,isopod],127:[n02002556,white stork],128:[n02002724,black stork],129:[n02006656,spoonbill],130:[n02007558,flamingo],131:[n02009229,little blue heron],132:[n02009912,American egret],133:[n02011460,bittern],134:[n02012849,crane],135:[n02013706,limpkin],136:[n02017213,European gallinule],137:[n02018207,American 
coot],138:[n02018795,bustard],139:[n02025239,ruddy turnstone],140:[n02027492,red-backed sandpiper],141:[n02028035,redshank],142:[n02033041,dowitcher],143:[n02037110,oystercatcher],144:[n02051845,pelican],145:[n02056570,king penguin],146:[n02058221,albatross],147:[n02066245,grey whale],148:[n02071294,killer whale],149:[n02074367,dugong],150:[n02077923,sea lion],151:[n02085620,Chihuahua],152:[n02085782,Japanese spaniel],153:[n02085936,Maltese dog],154:[n02086079,Pekinese],155:[n02086240,Shih-Tzu],156:[n02086646,Blenheim spaniel],157:[n02086910,papillon],158:[n02087046,toy terrier],159:[n02087394,Rhodesian ridgeback],160:[n02088094,Afghan hound],161:[n02088238,basset],162:[n02088364,beagle],163:[n02088466,bloodhound],164:[n02088632,bluetick],165:[n02089078,black-and-tan coonhound],166:[n02089867,Walker hound],167:[n02089973,English foxhound],168:[n02090379,redbone],169:[n02090622,borzoi],170:[n02090721,Irish wolfhound],171:[n02091032,Italian greyhound],172:[n02091134,whippet],173:[n02091244,Ibizan hound],174:[n02091467,Norwegian elkhound],175:[n02091635,otterhound],176:[n02091831,Saluki],177:[n02092002,Scottish deerhound],178:[n02092339,Weimaraner],179:[n02093256,Staffordshire bullterrier],180:[n02093428,American Staffordshire terrier],181:[n02093647,Bedlington terrier],182:[n02093754,Border terrier],183:[n02093859,Kerry blue terrier],184:[n02093991,Irish terrier],185:[n02094114,Norfolk terrier],186:[n02094258,Norwich terrier],187:[n02094433,Yorkshire terrier],188:[n02095314,wire-haired fox terrier],189:[n02095570,Lakeland terrier],190:[n02095889,Sealyham terrier],191:[n02096051,Airedale],192:[n02096177,cairn],193:[n02096294,Australian terrier],194:[n02096437,Dandie Dinmont],195:[n02096585,Boston bull],196:[n02097047,miniature schnauzer],197:[n02097130,giant schnauzer],198:[n02097209,standard schnauzer],199:[n02097298,Scotch terrier],200:[n02097474,Tibetan terrier],201:[n02097658,silky terrier],202:[n02098105,soft-coated wheaten terrier],203:[n02098286,West Highland white terrier],204:[n02098413,Lhasa],205:[n02099267,flat-coated retriever],206:[n02099429,curly-coated retriever],207:[n02099601,golden retriever],208:[n02099712,Labrador retriever],209:[n02099849,Chesapeake Bay retriever],210:[n02100236,German short-haired pointer],211:[n02100583,vizsla],212:[n02100735,English setter],213:[n02100877,Irish setter],214:[n02101006,Gordon setter],215:[n02101388,Brittany spaniel],216:[n02101556,clumber],217:[n02102040,English springer],218:[n02102177,Welsh springer spaniel],219:[n02102318,cocker spaniel],220:[n02102480,Sussex spaniel],221:[n02102973,Irish water spaniel],222:[n02104029,kuvasz],223:[n02104365,schipperke],224:[n02105056,groenendael],225:[n02105162,malinois],226:[n02105251,briard],227:[n02105412,kelpie],228:[n02105505,komondor],229:[n02105641,Old English sheepdog],230:[n02105855,Shetland sheepdog],231:[n02106030,collie],232:[n02106166,Border collie],233:[n02106382,Bouvier des Flandres],234:[n02106550,Rottweiler],235:[n02106662,German shepherd],236:[n02107142,Doberman],237:[n02107312,miniature pinscher],238:[n02107574,Greater Swiss Mountain dog],239:[n02107683,Bernese mountain dog],240:[n02107908,Appenzeller],241:[n02108000,EntleBucher],242:[n02108089,boxer],243:[n02108422,bull mastiff],244:[n02108551,Tibetan mastiff],245:[n02108915,French bulldog],246:[n02109047,Great Dane],247:[n02109525,Saint Bernard],248:[n02109961,Eskimo dog],249:[n02110063,malamute],250:[n02110185,Siberian 
husky],251:[n02110341,dalmatian],252:[n02110627,affenpinscher],253:[n02110806,basenji],254:[n02110958,pug],255:[n02111129,Leonberg],256:[n02111277,Newfoundland],257:[n02111500,Great Pyrenees],258:[n02111889,Samoyed],259:[n02112018,Pomeranian],260:[n02112137,chow],261:[n02112350,keeshond],262:[n02112706,Brabancon griffon],263:[n02113023,Pembroke],264:[n02113186,Cardigan],265:[n02113624,toy poodle],266:[n02113712,miniature poodle],267:[n02113799,standard poodle],268:[n02113978,Mexican hairless],269:[n02114367,timber wolf],270:[n02114548,white wolf],271:[n02114712,red wolf],272:[n02114855,coyote],273:[n02115641,dingo],274:[n02115913,dhole],275:[n02116738,African hunting dog],276:[n02117135,hyena],277:[n02119022,red fox],278:[n02119789,kit fox],279:[n02120079,Arctic fox],280:[n02120505,grey fox],281:[n02123045,tabby],282:[n02123159,tiger cat],283:[n02123394,Persian cat],284:[n02123597,Siamese cat],285:[n02124075,Egyptian cat],286:[n02125311,cougar],287:[n02127052,lynx],288:[n02128385,leopard],289:[n02128757,snow leopard],290:[n02128925,jaguar],291:[n02129165,lion],292:[n02129604,tiger],293:[n02130308,cheetah],294:[n02132136,brown bear],295:[n02133161,American black bear],296:[n02134084,ice bear],297:[n02134418,sloth bear],298:[n02137549,mongoose],299:[n02138441,meerkat],300:[n02165105,tiger beetle],301:[n02165456,ladybug],302:[n02167151,ground beetle],303:[n02168699,long-horned beetle],304:[n02169497,leaf beetle],305:[n02172182,dung beetle],306:[n02174001,rhinoceros beetle],307:[n02177972,weevil],308:[n02190166,fly],309:[n02206856,bee],310:[n02219486,ant],311:[n02226429,grasshopper],312:[n02229544,cricket],313:[n02231487,walking stick],314:[n02233338,cockroach],315:[n02236044,mantis],316:[n02256656,cicada],317:[n02259212,leafhopper],318:[n02264363,lacewing],319:[n02268443,dragonfly],320:[n02268853,damselfly],321:[n02276258,admiral],322:[n02277742,ringlet],323:[n02279972,monarch],324:[n02280649,cabbage butterfly],325:[n02281406,sulphur butterfly],326:[n02281787,lycaenid],327:[n02317335,starfish],328:[n02319095,sea urchin],329:[n02321529,sea cucumber],330:[n02325366,wood rabbit],331:[n02326432,hare],332:[n02328150,Angora],333:[n02342885,hamster],334:[n02346627,porcupine],335:[n02356798,fox squirrel],336:[n02361337,marmot],337:[n02363005,beaver],338:[n02364673,guinea pig],339:[n02389026,sorrel],340:[n02391049,zebra],341:[n02395406,hog],342:[n02396427,wild boar],343:[n02397096,warthog],344:[n02398521,hippopotamus],345:[n02403003,ox],346:[n02408429,water buffalo],347:[n02410509,bison],348:[n02412080,ram],349:[n02415577,bighorn],350:[n02417914,ibex],351:[n02422106,hartebeest],352:[n02422699,impala],353:[n02423022,gazelle],354:[n02437312,Arabian camel],355:[n02437616,llama],356:[n02441942,weasel],357:[n02442845,mink],358:[n02443114,polecat],359:[n02443484,black-footed ferret],360:[n02444819,otter],361:[n02445715,skunk],362:[n02447366,badger],363:[n02454379,armadillo],364:[n02457408,three-toed sloth],365:[n02480495,orangutan],366:[n02480855,gorilla],367:[n02481823,chimpanzee],368:[n02483362,gibbon],369:[n02483708,siamang],370:[n02484975,guenon],371:[n02486261,patas],372:[n02486410,baboon],373:[n02487347,macaque],374:[n02488291,langur],375:[n02488702,colobus],376:[n02489166,proboscis monkey],377:[n02490219,marmoset],378:[n02492035,capuchin],379:[n02492660,howler monkey],380:[n02493509,titi],381:[n02493793,spider monkey],382:[n02494079,squirrel monkey],383:[n02497673,Madagascar cat],384:[n02500267,indri],385:[n02504013,Indian elephant],386:[n02504458,African elephant],387:[n02509815,lesser 
panda],388:[n02510455,giant panda],389:[n02514041,barracouta],390:[n02526121,eel],391:[n02536864,coho],392:[n02606052,rock beauty],393:[n02607072,anemone fish],394:[n02640242,sturgeon],395:[n02641379,gar],396:[n02643566,lionfish],397:[n02655020,puffer],398:[n02666196,abacus],399:[n02667093,abaya],400:[n02669723,academic gown],401:[n02672831,accordion],402:[n02676566,acoustic guitar],403:[n02687172,aircraft carrier],404:[n02690373,airliner],405:[n02692877,airship],406:[n02699494,altar],407:[n02701002,ambulance],408:[n02704792,amphibian],409:[n02708093,analog clock],410:[n02727426,apiary],411:[n02730930,apron],412:[n02747177,ashcan],413:[n02749479,assault rifle],414:[n02769748,backpack],415:[n02776631,bakery],416:[n02777292,balance beam],417:[n02782093,balloon],418:[n02783161,ballpoint],419:[n02786058,Band Aid],420:[n02787622,banjo],421:[n02788148,bannister],422:[n02790996,barbell],423:[n02791124,barber chair],424:[n02791270,barbershop],425:[n02793495,barn],426:[n02794156,barometer],427:[n02795169,barrel],428:[n02797295,barrow],429:[n02799071,baseball],430:[n02802426,basketball],431:[n02804414,bassinet],432:[n02804610,bassoon],433:[n02807133,bathing cap],434:[n02808304,bath towel],435:[n02808440,bathtub],436:[n02814533,beach wagon],437:[n02814860,beacon],438:[n02815834,beaker],439:[n02817516,bearskin],440:[n02823428,beer bottle],441:[n02823750,beer glass],442:[n02825657,bell cote],443:[n02834397,bib],444:[n02835271,bicycle-built-for-two],445:[n02837789,bikini],446:[n02840245,binder],447:[n02841315,binoculars],448:[n02843684,birdhouse],449:[n02859443,boathouse],450:[n02860847,bobsled],451:[n02865351,bolo tie],452:[n02869837,bonnet],453:[n02870880,bookcase],454:[n02871525,bookshop],455:[n02877765,bottlecap],456:[n02879718,bow],457:[n02883205,bow tie],458:[n02892201,brass],459:[n02892767,brassiere],460:[n02894605,breakwater],461:[n02895154,breastplate],462:[n02906734,broom],463:[n02909870,bucket],464:[n02910353,buckle],465:[n02916936,bulletproof vest],466:[n02917067,bullet train],467:[n02927161,butcher shop],468:[n02930766,cab],469:[n02939185,caldron],470:[n02948072,candle],471:[n02950826,cannon],472:[n02951358,canoe],473:[n02951585,can opener],474:[n02963159,cardigan],475:[n02965783,car mirror],476:[n02966193,carousel],477:[n02966687,carpenters kit],478:[n02971356,carton],479:[n02974003,car wheel],480:[n02977058,cash machine],481:[n02978881,cassette],482:[n02979186,cassette player],483:[n02980441,castle],484:[n02981792,catamaran],485:[n02988304,CD player],486:[n02992211,cello],487:[n02992529,cellular telephone],488:[n02999410,chain],489:[n03000134,chainlink fence],490:[n03000247,chain mail],491:[n03000684,chain saw],492:[n03014705,chest],493:[n03016953,chiffonier],494:[n03017168,chime],495:[n03018349,china cabinet],496:[n03026506,Christmas stocking],497:[n03028079,church],498:[n03032252,cinema],499:[n03041632,cleaver],500:[n03042490,cliff dwelling],501:[n03045698,cloak],502:[n03047690,clog],503:[n03062245,cocktail shaker],504:[n03063599,coffee mug],505:[n03063689,coffeepot],506:[n03065424,coil],507:[n03075370,combination lock],508:[n03085013,computer keyboard],509:[n03089624,confectionery],510:[n03095699,container ship],511:[n03100240,convertible],512:[n03109150,corkscrew],513:[n03110669,cornet],514:[n03124043,cowboy boot],515:[n03124170,cowboy hat],516:[n03125729,cradle],517:[n03126707,crane],518:[n03127747,crash helmet],519:[n03127925,crate],520:[n03131574,crib],521:[n03133878,Crock Pot],522:[n03134739,croquet 
ball],523:[n03141823,crutch],524:[n03146219,cuirass],525:[n03160309,dam],526:[n03179701,desk],527:[n03180011,desktop computer],528:[n03187595,dial telephone],529:[n03188531,diaper],530:[n03196217,digital clock],531:[n03197337,digital watch],532:[n03201208,dining table],533:[n03207743,dishrag],534:[n03207941,dishwasher],535:[n03208938,disk brake],536:[n03216828,dock],537:[n03218198,dogsled],538:[n03220513,dome],539:[n03223299,doormat],540:[n03240683,drilling platform],541:[n03249569,drum],542:[n03250847,drumstick],543:[n03255030,dumbbell],544:[n03259280,Dutch oven],545:[n03271574,electric fan],546:[n03272010,electric guitar],547:[n03272562,electric locomotive],548:[n03290653,entertainment center],549:[n03291819,envelope],550:[n03297495,espresso maker],551:[n03314780,face powder],552:[n03325584,feather boa],553:[n03337140,file],554:[n03344393,fireboat],555:[n03345487,fire engine],556:[n03347037,fire screen],557:[n03355925,flagpole],558:[n03372029,flute],559:[n03376595,folding chair],560:[n03379051,football helmet],561:[n03384352,forklift],562:[n03388043,fountain],563:[n03388183,fountain pen],564:[n03388549,four-poster],565:[n03393912,freight car],566:[n03394916,French horn],567:[n03400231,frying pan],568:[n03404251,fur coat],569:[n03417042,garbage truck],570:[n03424325,gasmask],571:[n03425413,gas pump],572:[n03443371,goblet],573:[n03444034,go-kart],574:[n03445777,golf ball],575:[n03445924,golfcart],576:[n03447447,gondola],577:[n03447721,gong],578:[n03450230,gown],579:[n03452741,grand piano],580:[n03457902,greenhouse],581:[n03459775,grille],582:[n03461385,grocery store],583:[n03467068,guillotine],584:[n03476684,hair slide],585:[n03476991,hair spray],586:[n03478589,half track],587:[n03481172,hammer],588:[n03482405,hamper],589:[n03483316,hand blower],590:[n03485407,hand-held computer],591:[n03485794,handkerchief],592:[n03492542,hard disc],593:[n03494278,harmonica],594:[n03495258,harp],595:[n03496892,harvester],596:[n03498962,hatchet],597:[n03527444,holster],598:[n03529860,home theater],599:[n03530642,honeycomb],600:[n03532672,hook],601:[n03534580,hoopskirt],602:[n03535780,horizontal bar],603:[n03538406,horse cart],604:[n03544143,hourglass],605:[n03584254,iPod],606:[n03584829,iron],607:[n03590841,jack-o-lantern],608:[n03594734,jean],609:[n03594945,jeep],610:[n03595614,jersey],611:[n03598930,jigsaw puzzle],612:[n03599486,jinrikisha],613:[n03602883,joystick],614:[n03617480,kimono],615:[n03623198,knee pad],616:[n03627232,knot],617:[n03630383,lab coat],618:[n03633091,ladle],619:[n03637318,lampshade],620:[n03642806,laptop],621:[n03649909,lawn mower],622:[n03657121,lens cap],623:[n03658185,letter opener],624:[n03661043,library],625:[n03662601,lifeboat],626:[n03666591,lighter],627:[n03670208,limousine],628:[n03673027,liner],629:[n03676483,lipstick],630:[n03680355,Loafer],631:[n03690938,lotion],632:[n03691459,loudspeaker],633:[n03692522,loupe],634:[n03697007,lumbermill],635:[n03706229,magnetic compass],636:[n03709823,mailbag],637:[n03710193,mailbox],638:[n03710637,maillot],639:[n03710721,maillot],640:[n03717622,manhole cover],641:[n03720891,maraca],642:[n03721384,marimba],643:[n03724870,mask],644:[n03729826,matchstick],645:[n03733131,maypole],646:[n03733281,maze],647:[n03733805,measuring cup],648:[n03742115,medicine chest],649:[n03743016,megalith],650:[n03759954,microphone],651:[n03761084,microwave],652:[n03763968,military uniform],653:[n03764736,milk can],654:[n03769881,minibus],655:[n03770439,miniskirt],656:[n03770679,minivan],657:[n03773504,missile],658:[n03775071,mitten],659:[n03775546,mixing 
bowl],660:[n03776460,mobile home],661:[n03777568,Model T],662:[n03777754,modem],663:[n03781244,monastery],664:[n03782006,monitor],665:[n03785016,moped],666:[n03786901,mortar],667:[n03787032,mortarboard],668:[n03788195,mosque],669:[n03788365,mosquito net],670:[n03791053,motor scooter],671:[n03792782,mountain bike],672:[n03792972,mountain tent],673:[n03793489,mouse],674:[n03794056,mousetrap],675:[n03796401,moving van],676:[n03803284,muzzle],677:[n03804744,nail],678:[n03814639,neck brace],679:[n03814906,necklace],680:[n03825788,nipple],681:[n03832673,notebook],682:[n03837869,obelisk],683:[n03838899,oboe],684:[n03840681,ocarina],685:[n03841143,odometer],686:[n03843555,oil filter],687:[n03854065,organ],688:[n03857828,oscilloscope],689:[n03866082,overskirt],690:[n03868242,oxcart],691:[n03868863,oxygen mask],692:[n03871628,packet],693:[n03873416,paddle],694:[n03874293,paddlewheel],695:[n03874599,padlock],696:[n03876231,paintbrush],697:[n03877472,pajama],698:[n03877845,palace],699:[n03884397,panpipe],700:[n03887697,paper towel],701:[n03888257,parachute],702:[n03888605,parallel bars],703:[n03891251,park bench],704:[n03891332,parking meter],705:[n03895866,passenger car],706:[n03899768,patio],707:[n03902125,pay-phone],708:[n03903868,pedestal],709:[n03908618,pencil box],710:[n03908714,pencil sharpener],711:[n03916031,perfume],712:[n03920288,Petri dish],713:[n03924679,photocopier],714:[n03929660,pick],715:[n03929855,pickelhaube],716:[n03930313,picket fence],717:[n03930630,pickup],718:[n03933933,pier],719:[n03935335,piggy bank],720:[n03937543,pill bottle],721:[n03938244,pillow],722:[n03942813,ping-pong ball],723:[n03944341,pinwheel],724:[n03947888,pirate],725:[n03950228,pitcher],726:[n03954731,plane],727:[n03956157,planetarium],728:[n03958227,plastic bag],729:[n03961711,plate rack],730:[n03967562,plow],731:[n03970156,plunger],732:[n03976467,Polaroid camera],733:[n03976657,pole],734:[n03977966,police van],735:[n03980874,poncho],736:[n03982430,pool table],737:[n03983396,pop bottle],738:[n03991062,pot],739:[n03992509,potters wheel],740:[n03995372,power drill],741:[n03998194,prayer rug],742:[n04004767,printer],743:[n04005630,prison],744:[n04008634,projectile],745:[n04009552,projector],746:[n04019541,puck],747:[n04023962,punching bag],748:[n04026417,purse],749:[n04033901,quill],750:[n04033995,quilt],751:[n04037443,racer],752:[n04039381,racket],753:[n04040759,radiator],754:[n04041544,radio],755:[n04044716,radio telescope],756:[n04049303,rain barrel],757:[n04065272,recreational vehicle],758:[n04067472,reel],759:[n04069434,reflex camera],760:[n04070727,refrigerator],761:[n04074963,remote control],762:[n04081281,restaurant],763:[n04086273,revolver],764:[n04090263,rifle],765:[n04099969,rocking chair],766:[n04111531,rotisserie],767:[n04116512,rubber eraser],768:[n04118538,rugby ball],769:[n04118776,rule],770:[n04120489,running shoe],771:[n04125021,safe],772:[n04127249,safety pin],773:[n04131690,saltshaker],774:[n04133789,sandal],775:[n04136333,sarong],776:[n04141076,sax],777:[n04141327,scabbard],778:[n04141975,scale],779:[n04146614,school bus],780:[n04147183,schooner],781:[n04149813,scoreboard],782:[n04152593,screen],783:[n04153751,screw],784:[n04154565,screwdriver],785:[n04162706,seat belt],786:[n04179913,sewing machine],787:[n04192698,shield],788:[n04200800,shoe shop],789:[n04201297,shoji],790:[n04204238,shopping basket],791:[n04204347,shopping cart],792:[n04208210,shovel],793:[n04209133,shower cap],794:[n04209239,shower curtain],795:[n04228054,ski],796:[n04229816,ski mask],797:[n04235860,sleeping 
bag],798:[n04238763,slide rule],799:[n04239074,sliding door],800:[n04243546,slot],801:[n04251144,snorkel],802:[n04252077,snowmobile],803:[n04252225,snowplow],804:[n04254120,soap dispenser],805:[n04254680,soccer ball],806:[n04254777,sock],807:[n04258138,solar dish],808:[n04259630,sombrero],809:[n04263257,soup bowl],810:[n04264628,space bar],811:[n04265275,space heater],812:[n04266014,space shuttle],813:[n04270147,spatula],814:[n04273569,speedboat],815:[n04275548,spider web],816:[n04277352,spindle],817:[n04285008,sports car],818:[n04286575,spotlight],819:[n04296562,stage],820:[n04310018,steam locomotive],821:[n04311004,steel arch bridge],822:[n04311174,steel drum],823:[n04317175,stethoscope],824:[n04325704,stole],825:[n04326547,stone wall],826:[n04328186,stopwatch],827:[n04330267,stove],828:[n04332243,strainer],829:[n04335435,streetcar],830:[n04336792,stretcher],831:[n04344873,studio couch],832:[n04346328,stupa],833:[n04347754,submarine],834:[n04350905,suit],835:[n04355338,sundial],836:[n04355933,sunglass],837:[n04356056,sunglasses],838:[n04357314,sunscreen],839:[n04366367,suspension bridge],840:[n04367480,swab],841:[n04370456,sweatshirt],842:[n04371430,swimming trunks],843:[n04371774,swing],844:[n04372370,switch],845:[n04376876,syringe],846:[n04380533,table lamp],847:[n04389033,tank],848:[n04392985,tape player],849:[n04398044,teapot],850:[n04399382,teddy],851:[n04404412,television],852:[n04409515,tennis ball],853:[n04417672,thatch],854:[n04418357,theater curtain],855:[n04423845,thimble],856:[n04428191,thresher],857:[n04429376,throne],858:[n04435653,tile roof],859:[n04442312,toaster],860:[n04443257,tobacco shop],861:[n04447861,toilet seat],862:[n04456115,torch],863:[n04458633,totem pole],864:[n04461696,tow truck],865:[n04462240,toyshop],866:[n04465501,tractor],867:[n04467665,trailer truck],868:[n04476259,tray],869:[n04479046,trench coat],870:[n04482393,tricycle],871:[n04483307,trimaran],872:[n04485082,tripod],873:[n04486054,triumphal arch],874:[n04487081,trolleybus],875:[n04487394,trombone],876:[n04493381,tub],877:[n04501370,turnstile],878:[n04505470,typewriter keyboard],879:[n04507155,umbrella],880:[n04509417,unicycle],881:[n04515003,upright],882:[n04517823,vacuum],883:[n04522168,vase],884:[n04523525,vault],885:[n04525038,velvet],886:[n04525305,vending machine],887:[n04532106,vestment],888:[n04532670,viaduct],889:[n04536866,violin],890:[n04540053,volleyball],891:[n04542943,waffle iron],892:[n04548280,wall clock],893:[n04548362,wallet],894:[n04550184,wardrobe],895:[n04552348,warplane],896:[n04553703,washbasin],897:[n04554684,washer],898:[n04557648,water bottle],899:[n04560804,water jug],900:[n04562935,water tower],901:[n04579145,whiskey jug],902:[n04579432,whistle],903:[n04584207,wig],904:[n04589890,window screen],905:[n04590129,window shade],906:[n04591157,Windsor tie],907:[n04591713,wine bottle],908:[n04592741,wing],909:[n04596742,wok],910:[n04597913,wooden spoon],911:[n04599235,wool],912:[n04604644,worm fence],913:[n04606251,wreck],914:[n04612504,yawl],915:[n04613696,yurt],916:[n06359193,web site],917:[n06596364,comic book],918:[n06785654,crossword puzzle],919:[n06794110,street sign],920:[n06874185,traffic light],921:[n07248320,book jacket],922:[n07565083,menu],923:[n07579787,plate],924:[n07583066,guacamole],925:[n07584110,consomme],926:[n07590611,hot pot],927:[n07613480,trifle],928:[n07614500,ice cream],929:[n07615774,ice lolly],930:[n07684084,French loaf],931:[n07693725,bagel],932:[n07695742,pretzel],933:[n07697313,cheeseburger],934:[n07697537,hotdog],935:[n07711569,mashed 
potato],936:[n07714571,head cabbage],937:[n07714990,broccoli],938:[n07715103,cauliflower],939:[n07716358,zucchini],940:[n07716906,spaghetti squash],941:[n07717410,acorn squash],942:[n07717556,butternut squash],943:[n07718472,cucumber],944:[n07718747,artichoke],945:[n07720875,bell pepper],946:[n07730033,cardoon],947:[n07734744,mushroom],948:[n07742313,Granny Smith],949:[n07745940,strawberry],950:[n07747607,orange],951:[n07749582,lemon],952:[n07753113,fig],953:[n07753275,pineapple],954:[n07753592,banana],955:[n07754684,jackfruit],956:[n07760859,custard apple],957:[n07768694,pomegranate],958:[n07802026,hay],959:[n07831146,carbonara],960:[n07836838,chocolate sauce],961:[n07860988,dough],962:[n07871810,meat loaf],963:[n07873807,pizza],964:[n07875152,potpie],965:[n07880968,burrito],966:[n07892512,red wine],967:[n07920052,espresso],968:[n07930864,cup],969:[n07932039,eggnog],970:[n09193705,alp],971:[n09229709,bubble],972:[n09246464,cliff],973:[n09256479,coral reef],974:[n09288635,geyser],975:[n09332890,lakeside],976:[n09399592,promontory],977:[n09421951,sandbar],978:[n09428293,seashore],979:[n09468604,valley],980:[n09472597,volcano],981:[n09835506,ballplayer],982:[n10148035,groom],983:[n10565667,scuba diver],984:[n11879895,rapeseed],985:[n11939491,daisy],986:[n12057211,yellow ladys slipper],987:[n12144580,corn],988:[n12267677,acorn],989:[n12620546,hip],990:[n12768682,buckeye],991:[n12985857,coral fungus],992:[n12998815,agaric],993:[n13037406,gyromitra],994:[n13040303,stinkhorn],995:[n13044778,earthstar],996:[n13052670,hen-of-the-woods],997:[n13054560,bolete],998:[n13133613,ear],999:[n15075141,toilet tissue]}
```

Fine-tuning the model

The MobileNetV2 documentation says the model can be fine-tuned for your own needs, so how is that done? The official tutorial is "Transfer learning with TensorFlow Hub". Conveniently, that tutorial also uses MobileNetV2, so we can follow it to fine-tune the model.

Usage

The tutorial shows how to use MobileNetV2, and we can pick up a few other things along the way.

Predicting an image from the web

```python
import PIL.Image as Image

# 1. MobileNet expects images of size 224 x 224
# image = tf.keras.preprocessing.image.load_img("pics/dog.png", target_size=(224, 224))
image = tf.keras.utils.get_file(
    "bird.jpg",
    "https://scpic.chinaz.net/files/default/imgs/2023-08-29/7dc085b6d3291303.jpg"
)
image = Image.open(image).resize((224, 224))
```

Note that if you change the image URL, you must also change the corresponding file name; otherwise the previously cached image will be loaded again.
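That caching pitfall exists because tf.keras.utils.get_file keys the download on the file name, not on the URL. One way around it is to derive the cached file name from the URL itself, so every distinct URL gets its own cache entry. The sketch below is my own addition (the helper name url_to_image is not from the tutorial), just to illustrate the idea:

```python
import hashlib

import PIL.Image as Image
import tensorflow as tf


def url_to_image(url, size=(224, 224)):
    # Hash the URL so each distinct URL maps to a distinct cache file
    fname = hashlib.md5(url.encode("utf-8")).hexdigest() + ".jpg"
    path = tf.keras.utils.get_file(fname, url)
    return Image.open(path).resize(size)


# Usage: changing the URL now always fetches a fresh file instead of reusing the old cache
image = url_to_image("https://scpic.chinaz.net/files/default/imgs/2023-08-29/7dc085b6d3291303.jpg")
```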
Getting the labels

I had been worried about mismatched label sets, but the official tutorial provides a way to get them:

```python
labels_path = tf.keras.utils.get_file(
    "ImageNetLabels.txt",
    "https://storage.googleapis.com/download.tensorflow.org/data/ImageNetLabels.txt"
)
imagenet_labels = np.array(open(labels_path).read().splitlines())
```

The label file is plain text, and accessing it requires a proxy.

Transfer learning

How is it done? The official documentation describes three steps:

1. You need a dataset. The tutorial has one prepared; we will use it first and look into building our own dataset later.
2. Download a pretrained model from TensorFlow Hub -- we already have one.
3. Retrain the top (last) layer to recognize the classes in the custom dataset.

Data: the flowers dataset

```python
import pathlib

# Download and extract into the current directory
data_file = tf.keras.utils.get_file(
    'flower_photos.tgz',
    'https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz',
    cache_dir='.',
    extract=True
)
data_root = pathlib.Path(data_file).with_suffix('')
```

That is how the official documentation does it; here I simply downloaded the archive in the browser and extracted it into the project folder (a proxy is needed). The file is fairly large. The images themselves are ordinary photos, which is convenient when we later assemble our own training data -- just don't make the images too big.

Loading the data and splitting the dataset

```python
# 32 images per batch, resized to 224 x 224
batch_size = 32
img_height = 224
img_width = 224

# Load the image dataset and split it into training and validation sets (20% validation)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "flower_photos",                     # directory
    validation_split=0.2,                # 20% goes to validation
    subset="training",                   # this call returns the training split
    seed=123,                            # random seed used for the split
    image_size=(img_width, img_height),  # resize the images
    batch_size=batch_size                # images per batch
)
# Validation set
val_ds = tf.keras.utils.image_dataset_from_directory(
    "flower_photos",
    validation_split=0.2,
    subset="validation",                 # this call returns the validation split
    seed=123,
    image_size=(img_width, img_height),
    batch_size=batch_size
)
```

After running this, the console reports that 3670 images were found, grouped into 5 classes, of which 2936 are used for training. The 5 classes correspond exactly to the 5 different sub-directories of images.

Getting the flower classes

```python
# Flower classes
class_names = np.array(train_ds.class_names)
print("Flower classes:", class_names)
```

They match the names of the 5 folders.

Normalizing the training and validation images

The purpose of normalization was explained in another article, so I won't repeat it here:

```python
# Normalization: a Rescaling layer that scales pixel values into [0, 1];
# 1./255 is simply 1/255 written as a float
normalization_layer = tf.keras.layers.Rescaling(1./255)
train_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
val_ds = val_ds.map(lambda x, y: (normalization_layer(x), y))
```

Caching and prefetching the datasets

```python
# Use buffered prefetching to avoid blocking on I/O
AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)

# Verify that the data loads and is processed correctly
for image_batch, labels_batch in train_ds:
    print(image_batch.shape)
    print(labels_batch.shape)
    break
```

The output matches the official documentation, so we can continue. (32, 224, 224, 3) means one batch holds 32 images of size 224 x 224 with 3 colour channels (RGB).

Running the classifier on a batch of images

```python
# Run the classifier on a batch of images
result_batch = model.predict(train_ds)
# Load the mapping file, downloaded locally from
# https://storage.googleapis.com/download.tensorflow.org/data/ImageNetLabels.txt
imagenet_labels = np.array(open("labels.txt").read().splitlines())
# Index of the maximum value along the given axis of the tensor
predict_class_names = imagenet_labels[tf.math.argmax(result_batch, axis=-1)]
print("Predicted classes:", predict_class_names)

# Plot the predictions together with the images
plt.figure(figsize=(10, 9))
plt.subplots_adjust(hspace=0.5)
for n in range(30):
    plt.subplot(6, 5, n + 1)
    plt.imshow(image_batch[n])
    plt.title(predict_class_names[n])
    plt.axis('off')
_ = plt.suptitle("ImageNet predictions")
plt.show()
```

The result matches the official run, so we can keep going.

Downloading the headless model

The model used so far is a classifier; following the official tutorial, we now also need the corresponding feature-extraction model. For the classification model imagenet_mobilenet_v2_140_224_classification_5, the matching feature extractor is at https://tfhub.dev/google/imagenet/mobilenet_v2_140_224/feature_vector/5. Download it and extract it into the project:

```python
# Load the feature extractor
feature_extractor_layer = hub.KerasLayer(
    "imagenet_mobilenet_v2_140_224_feature_vector_5",  # pretrained model
    input_shape=(224, 224, 3),  # height, width and channels of the input images
    trainable=False             # do not update the extractor's weights during training
)
# The feature extractor returns one fixed-length feature vector per image
# (1280 in the official tutorial); the batch size here is still 32
feature_batch = feature_extractor_layer(image_batch)
print("Feature batch shape:", feature_batch.shape)
```

The printed shape is not quite the same as in the tutorial, but that is only because a different model variant is being used, so it is nothing to worry about.

Attaching a classification head

```python
# Attach a classification head
new_model = tf.keras.Sequential([
    feature_extractor_layer,
    tf.keras.layers.Dense(len(class_names))  # one output per class; the flowers have 5 classes
])
```

This runs fine, so on we go.

Training the model

The features are extracted; next comes training. (The compile step and the TensorBoard callback are set up in the complete code further down.)

```python
# Start training; just 10 epochs for now. `history` records the metrics gathered
# during training, which is handy for later analysis and visualization
history = new_model.fit(
    train_ds,                         # training dataset
    validation_data=val_ds,           # validation dataset, used to monitor performance during training
    epochs=10,                        # total number of epochs
    callbacks=[tensorboard_callback]  # callbacks run during training, e.g. for writing logs
)
```

From the training output you can see that over the 10 epochs the accuracy keeps improving, and the training logs appear in the project folder. You can also run tensorboard --logdir logs/fit in the console to start TensorBoard; it prints a URL where you can watch how the metrics change per epoch and track other scalar values.
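Because the Dense head above has no activation and the loss is built with from_logits=True, new_model outputs raw logits rather than probabilities. If you also want a human-readable confidence score when inspecting predictions, one option (my own addition, not part of the tutorial) is to push the logits through a softmax first. A minimal sketch, assuming new_model, class_names and an image_batch exist as defined above:

```python
import numpy as np
import tensorflow as tf

logits = new_model.predict(image_batch)          # raw logits, shape (batch, num_classes)
probs = tf.nn.softmax(logits, axis=-1).numpy()   # convert logits to probabilities

predicted_ids = np.argmax(probs, axis=-1)
for i, class_id in enumerate(predicted_ids[:5]):
    # Print the predicted class and its probability for the first few images
    print(f"image {i}: {class_names[class_id]} ({probs[i, class_id]:.2%})")
```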
Checking the predictions

That is basically everything; now for a quick prediction:

```python
predicted_batch = new_model.predict(image_batch)
predicted_id = tf.math.argmax(predicted_batch, axis=-1)
predicted_label_batch = class_names[predicted_id]
print("Flower classes:", predicted_label_batch)

plt.figure(figsize=(10, 9))
plt.subplots_adjust(hspace=0.5)

for n in range(30):
    plt.subplot(6, 5, n + 1)
    plt.imshow(image_batch[n])
    plt.title(predicted_label_batch[n].title())
    plt.axis('off')
_ = plt.suptitle("Model predictions")
plt.show()
```

The result matches the official documentation.

Exporting and reloading the model

Once the model is trained, it can be exported as a SavedModel so it can be reused later.

SavedModel is TensorFlow's model-saving format: a standardized way of writing a trained model to disk so it can be conveniently loaded and used again. A SavedModel can hold the model's architecture, its weights, the optimizer state, and other settings related to the model.

SavedModel is a cross-platform, cross-language format: it can be used across different TensorFlow versions and from different programming languages, which makes it possible to move a model from one environment to another or share it with other TensorFlow applications.

To export a model you can use the tf.saved_model.save() function, which takes the model object and a save path and writes the model's information into a folder at that path. For example:

```python
tf.saved_model.save(model, export_path)
```

Here model is the model object to save and export_path is the target path. After exporting, you can load the SavedModel elsewhere with tf.saved_model.load() and use it for prediction or anything else.

Export:

```python
# Export the trained model
export_path = "/tmp/saved_models/flower_model"
new_model.save(export_path)
```
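As a sanity check after exporting, it can be reassuring to reload the SavedModel and confirm it produces the same outputs as the in-memory model. The sketch below is my own addition, not from the original walkthrough; note that depending on the TensorFlow/Keras version, loading a model that contains a hub.KerasLayer may require passing custom_objects, so it is included here as a precaution:

```python
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

export_path = "tmp/saved_models/flower_model"

# Reload the exported model; custom_objects helps Keras resolve the hub layer if needed
reloaded = tf.keras.models.load_model(
    export_path, custom_objects={"KerasLayer": hub.KerasLayer}
)

# Compare predictions from the original and the reloaded model on one batch
result_batch = new_model.predict(image_batch)
reloaded_result_batch = reloaded.predict(image_batch)

# The maximum absolute difference should be (close to) zero
print("max abs diff:", np.max(np.abs(reloaded_result_batch - result_batch)))
```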
The complete code:

```python
# Import TensorFlow and NumPy
import tensorflow as tf
import numpy as np
# tensorflow_hub is an extension of TensorFlow that provides a simple interface
# for reusing parts of already-trained machine learning models
import tensorflow_hub as hub
# Font properties (for displaying Chinese labels)
from matplotlib.font_manager import FontProperties
# matplotlib is used for plotting and visualizing data
import matplotlib.pylab as plt

import datetime

# Load the model
# You cannot load a single model file directly; you must load the model directory
model = tf.keras.Sequential([
    hub.KerasLayer("imagenet_mobilenet_v2_140_224_classification_5")
])

# 32 images per batch, resized to 224 x 224
batch_size = 32
img_height = 224
img_width = 224

# Load the image dataset and split it into training and validation sets (20% validation)
train_ds = tf.keras.utils.image_dataset_from_directory(
    "flower_photos",                     # directory
    validation_split=0.2,                # 20% goes to validation
    subset="training",                   # this call returns the training split
    seed=123,                            # random seed used for the split
    image_size=(img_width, img_height),  # resize the images
    batch_size=batch_size                # images per batch
)
# Validation set
val_ds = tf.keras.utils.image_dataset_from_directory(
    "flower_photos",
    validation_split=0.2,
    subset="validation",                 # this call returns the validation split
    seed=123,
    image_size=(img_width, img_height),
    batch_size=batch_size
)

# Flower classes
class_names = np.array(train_ds.class_names)
print("Flower classes:", class_names)

# Normalization: a Rescaling layer that scales pixel values into [0, 1]; 1./255 is 1/255 as a float
normalization_layer = tf.keras.layers.Rescaling(1./255)
train_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
val_ds = val_ds.map(lambda x, y: (normalization_layer(x), y))

# Use buffered prefetching to avoid blocking on I/O
AUTOTUNE = tf.data.AUTOTUNE
train_ds = train_ds.cache().prefetch(buffer_size=AUTOTUNE)
val_ds = val_ds.cache().prefetch(buffer_size=AUTOTUNE)

# Verify that the data loads and is processed correctly
for image_batch, labels_batch in train_ds:
    print(image_batch.shape)
    print(labels_batch.shape)
    break

# Run the classifier on a batch of images
result_batch = model.predict(train_ds)
# Load the mapping file, downloaded locally
imagenet_labels = np.array(open("labels.txt").read().splitlines())
# Index of the maximum value along the given axis of the tensor
predict_class_names = imagenet_labels[tf.math.argmax(result_batch, axis=-1)]
print("Predicted classes:", predict_class_names)

# Plot the predictions together with the images
# plt.figure(figsize=(10, 9))
# plt.subplots_adjust(hspace=0.5)
# for n in range(30):
#     plt.subplot(6, 5, n + 1)
#     plt.imshow(image_batch[n])
#     plt.title(predict_class_names[n])
#     plt.axis('off')
# _ = plt.suptitle("ImageNet predictions")
# plt.show()

# Load the feature extractor
feature_extractor_layer = hub.KerasLayer(
    "imagenet_mobilenet_v2_140_224_feature_vector_5",  # pretrained model
    input_shape=(224, 224, 3),  # height, width and channels of the input images
    trainable=False             # do not update the extractor's weights during training
)
# The feature extractor returns one fixed-length feature vector per image; the batch size is still 32
feature_batch = feature_extractor_layer(image_batch)
print("Feature batch shape:", feature_batch.shape)

# Attach a classification head
new_model = tf.keras.Sequential([
    feature_extractor_layer,
    tf.keras.layers.Dense(len(class_names))  # one output per class; the flowers have 5 classes
])

# Train the model
new_model.compile(
    optimizer=tf.keras.optimizers.Adam(),  # Adam as the optimizer
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),  # sparse categorical cross-entropy loss
    metrics=['acc']  # track accuracy
)
# Training logs
log_dir = "logs/fit/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
# Collects model metrics and summaries during training and writes them to TensorBoard log files
tensorboard_callback = tf.keras.callbacks.TensorBoard(
    log_dir=log_dir,
    histogram_freq=1
)

# Start training; just 10 epochs for now. `history` records the metrics gathered
# during training, which is handy for later analysis and visualization
history = new_model.fit(
    train_ds,                         # training dataset
    validation_data=val_ds,           # validation dataset, used to monitor performance during training
    epochs=10,                        # total number of epochs
    callbacks=[tensorboard_callback]  # callbacks run during training, e.g. for writing logs
)

# Quick prediction
# predicted_batch = new_model.predict(image_batch)
# predicted_id = tf.math.argmax(predicted_batch, axis=-1)
# predicted_label_batch = class_names[predicted_id]
# print("Flower classes:", predicted_label_batch)

# plt.figure(figsize=(10, 9))
# plt.subplots_adjust(hspace=0.5)

# for n in range(30):
#     plt.subplot(6, 5, n + 1)
#     plt.imshow(image_batch[n])
#     plt.title(predicted_label_batch[n].title())
#     plt.axis('off')
# _ = plt.suptitle("Model predictions")
# plt.show()

# Export the trained model
export_path = "tmp/saved_models/flower_model"
new_model.save(export_path)
```

Using the fine-tuned model

The model has been saved locally; now let's use it:

```python
# Import TensorFlow and NumPy
import tensorflow as tf
import numpy as np
# matplotlib is used for plotting and visualizing data
import matplotlib.pylab as plt
import PIL.Image as Image
from matplotlib.font_manager import FontProperties

# Load the model
model = tf.keras.models.load_model("tmp/saved_models/flower_model")

# Preprocess the input data
image = tf.keras.utils.get_file(
    "sunflower.jpg",
    "https://scpic.chinaz.net/files/pic/pic9/202006/bpic20492.jpg"
)
image = Image.open(image).resize((224, 224))
image = tf.keras.preprocessing.image.img_to_array(image)
image = np.expand_dims(image, axis=0)
image = tf.keras.applications.mobilenet_v2.preprocess_input(image)

# Predict
predictions = model.predict(image)
predicted_index = np.argmax(predictions)

print("Index:", predicted_index)

class_names = ["雏菊", "蒲公英", "玫瑰", "向日葵", "郁金香"]  # daisy, dandelion, rose, sunflower, tulip

# Visualize the result
font = FontProperties()
font.set_family('Microsoft YaHei')
plt.figure()            # create the figure window
plt.xticks([])
plt.yticks([])
plt.grid(False)         # no grid lines
plt.imshow(image[0])    # show the image
plt.xlabel(class_names[predicted_index], fontproperties=font)
plt.show()              # display the figure window
```

It works very well.

Training the model with your own data

The code above already works nicely, but the data it uses is the official dataset. Here, following the same steps, we swap in our own dataset.

Data

I prepared a simple set of 45 cat photos: 15 white cats, 15 black cats and 15 black-and-white cats, arranged the same way as the flower dataset, one folder per class (see the sketch at the end of this post; the original screenshot of the folders is omitted). That is not nearly enough data, but it is enough for a quick test.

Training and exporting the model

The accuracy comes out a bit high, which is honestly a little unnerving.

Using it

The results are acceptable.
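For reference, image_dataset_from_directory infers the class names from the sub-folder names, so a custom dataset like the cat photos only needs to mirror the layout of flower_photos. A minimal sketch of the expected structure and loading call -- the folder and class names here are my own illustration, not taken from the post:

```python
import tensorflow as tf

# Expected layout: one sub-folder per class, each containing that class's images, e.g.
# cat_photos/
#   black_cat/        img001.jpg, img002.jpg, ...
#   black_white_cat/  ...
#   white_cat/        ...

train_ds = tf.keras.utils.image_dataset_from_directory(
    "cat_photos",
    validation_split=0.2,
    subset="training",
    seed=123,
    image_size=(224, 224),
    batch_size=32,
)
# Class names are taken from the folder names, sorted alphabetically
print("Classes:", train_ds.class_names)  # ['black_cat', 'black_white_cat', 'white_cat']
```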