While testing with Hugging Face's transformers library, a pile of model files was downloaded online into cache directories. They need to be reorganized for offline use.

import os
import re
import json
import shutil

cache_path = 'D:\\PYTORCH_TRANSFORMERS_CACHE\\'
bert_pretrain = 'D:\\bert_model\\'

for p, dirs, files in os.walk(cache_path):
    for file in files:
        # In the old transformers cache layout, each blob '<hash>' has a
        # metadata sidecar '<hash>.json' containing the original download URL.
        if re.split('[.]', file)[-1] == 'json':
            # Strip the '.json' suffix to get the blob file name.
            target_file = '.'.join(re.split('[.]', file)[0:-1])
            with open(os.path.join(p, file)) as fh:
                j = json.load(fh)
            print(j['url'])
            # e.g. https://huggingface.co/<repo>/resolve/main/<file>
            model_name = j['url'].split('/resolve/')[0].split('/')[-1]
            model_file_name = j['url'].split('/')[-1]
            print(model_name, model_file_name)
            target_path = os.path.join(bert_pretrain, model_name)
            if not os.path.exists(target_path):
                os.makedirs(target_path)
            src = os.path.join(p, target_file)
            dst = os.path.join(target_path, model_file_name)
            print(src)
            print(dst)
            # Use 'and not' here, not bitwise '& ~', which only works by accident on bools.
            if os.path.exists(src) and not os.path.exists(dst):
                shutil.copyfile(src, dst)

Sample output:
https://huggingface.co/xlnet-large-cased/resolve/main/pytorch_model.bin
xlnet-large-cased pytorch_model.bin
D:\PYTORCH_TRANSFORMERS_CACHE\0c2b00a768ca7c5b3534b75606a47a7e1125b10ce354b217022de5a12029859c.7fff7afe180c24f31dabdb196f95ca2e26a8aa357c1db6137f4fec6430db9776
D:\bert_model\xlnet-large-cased/pytorch_model.bin
https://huggingface.co/hfl/chinese-electra-180g-base-discriminator/resolve/main/pytorch_model.bin
chinese-electra-180g-base-discriminator pytorch_model.bin
D:\PYTORCH_TRANSFORMERS_CACHE\1b16640da935adb492a0d70765ccca3300c01a7f2ce15991ff648ecd34cd48ed.2b59522b202567cfa0c53424492aa0cfa90f6a5b34205e4f4b21b8b903994ed1
D:\bert_model\chinese-electra-180g-base-discriminator/pytorch_model.bin
https://huggingface.co/wptoux/albert-chinese-large-qa/resolve/main/pytorch_model.bin
albert-chinese-large-qa pytorch_model.bin
D:\PYTORCH_TRANSFORMERS_CACHE\1bf27d86a25788f69cbf644e74289ab3eff3fdeb3bb533bfc96e08aec2250d5c.f9736c09f45fd2d06cf369c230c49d986759f331cae064f5b1d203a1639a09f6
D:\bert_model\albert-chinese-large-qa/pytorch_model.bin
https://huggingface.co/xlnet-large-cased/resolve/main/config.json
xlnet-large-cased config.json
D:\PYTORCH_TRANSFORMERS_CACHE\1f0d5fc4143aa8fe332810bac98d442fed5483549adcd9656e5709cd470003a0.a0945cddd1ef8f9d9c40c35c36bad4908625533057baeeafc6d26a9550f18c60
D:\bert_model\xlnet-large-cased/config.json
https://huggingface.co/wptoux/albert-chinese-large-qa/resolve/main/special_tokens_map.json
albert-chinese-large-qa special_tokens_map.json
D:\PYTORCH_TRANSFORMERS_CACHE\1f536158f09966b7ddca1f2c06264b78535edb420d115380141a23361d136d78.dd8bd9bfd3664b530ea4e645105f557769387b3da9f79bdb55ed556bdd80611d
D:\bert_model\albert-chinese-large-qa/special_tokens_map.json
https://huggingface.co/hfl/chinese-electra-180g-base-discriminator/resolve/main/added_tokens.json
chinese-electra-180g-base-discriminator added_tokens.json
D:\PYTORCH_TRANSFORMERS_CACHE\215afa37f85d31a1dc760ee7ea2b40414d6f3b518a20ce319a66bdd9a79ca510.5cc6e825eb228a7a5cfd27cb4d7151e97a79fb962b31aaf1813aa102e746584b
D:\bert_model\chinese-electra-180g-base-discriminator/added_tokens.json
https://huggingface.co/hfl/chinese-electra-180g-base-discriminator/resolve/main/vocab.txt
chinese-electra-180g-base-discriminator vocab.txt
D:\PYTORCH_TRANSFORMERS_CACHE\23fd079e75a41fee404f2fe8a5b7899af4afbae037b66ad3b79f23023f1e50be.accd894ff58c6ff7bd4f3072890776c14f4ea34fcc08e79cd88c2d157756dceb
D:\bert_model\chinese-electra-180g-base-discriminator/vocab.txt
https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/pytorch_model.bin
ibert-roberta-base pytorch_model.bin
D:\PYTORCH_TRANSFORMERS_CACHE\27deb62eabe44acce2863c490a8dcbeb98053eeb8a0275c10326865020aaac43.c555a2e298657aa872e144ac44cbd115f4825d74d48414d00781f2fd00cbb3cd
D:\bert_model\ibert-roberta-base/pytorch_model.bin
https://huggingface.co/hfl/chinese-electra-180g-base-discriminator/resolve/main/tokenizer.json
chinese-electra-180g-base-discriminator tokenizer.json
D:\PYTORCH_TRANSFORMERS_CACHE\2c3d89a81a0b01080027739ef180adbee95025e6003a8992fa4980ef0a9463ec.660ed5c7513bf13d4607410502a84e0de517eb889ff8c401068a1688868e1ccb
D:\bert_model\chinese-electra-180g-base-discriminator/tokenizer.json
https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/tokenizer.json
ibert-roberta-base tokenizer.json
D:\PYTORCH_TRANSFORMERS_CACHE\34f3d5049e70c4f56a6bd42bd6a9e4f2599164010440a37895e99aafb084c114.fc9576039592f026ad76a1c231b89aee8668488c671dfbe6616bab2ed298d730
D:\bert_model\ibert-roberta-base/tokenizer.json
https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model
xlnet-large-cased spiece.model
D:\PYTORCH_TRANSFORMERS_CACHE\3af982b422f8bb8c510fdd1112afe6f5ec3f3219ef859edcf4c3826bec14832e.d93497120e3a865e2970f26abdf7bf375896f97fde8b874b70909592a6c785c9
D:\bert_model\xlnet-large-cased/spiece.model
https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/tokenizer_config.json
ibert-roberta-base tokenizer_config.json
D:\PYTORCH_TRANSFORMERS_CACHE\45e95c68c39ff43eaa320e2634280dc9be9fbeac498575241b162472b280a60f.e7fcf26aa8cb28b14292f4ba90abfc66937ea92782061f8552cbd61edc0f7c0a
D:\bert_model\ibert-roberta-base/tokenizer_config.json
https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/config.json
distilbert-base-uncased-finetuned-sst-2-english config.json
D:\PYTORCH_TRANSFORMERS_CACHE\4e60bb8efad3d4b7dc9969bf204947c185166a0a3cf37ddb6f481a876a3777b5.9f8326d0b7697c7fd57366cdde57032f46bc10e37ae81cb7eb564d66d23ec96b
D:\bert_model\distilbert-base-uncased-finetuned-sst-2-english/config.json
https://huggingface.co/roberta-base/resolve/main/pytorch_model.bin
roberta-base pytorch_model.bin
D:\PYTORCH_TRANSFORMERS_CACHE\51ba668f7ff34e7cdfa9561e8361747738113878850a7d717dbc69de8683aaad.c7efaa30a0d80b2958b876969faa180e485944a849deee4ad482332de65365a7
D:\bert_model\roberta-base/pytorch_model.bin
https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/merges.txt
ibert-roberta-base merges.txt
D:\PYTORCH_TRANSFORMERS_CACHE\5422ece434498216e797c6ef1ecef875e497cb88774ffb57341843b89567fa70.5d12962c5ee615a4c803841266e9c3be9a691a924f72d395d3a6c6c81157788b
D:\bert_model\ibert-roberta-base/merges.txt
https://huggingface.co/wptoux/albert-chinese-large-qa/resolve/main/config.json
albert-chinese-large-qa config.json
D:\PYTORCH_TRANSFORMERS_CACHE\61055dc1dfe98ea7e5028eb70dc8819823b10e8a13486f9176c5f28cb6c1336a.79acb9e30f2e23c7e73eb49d29e01e9fe406e2032f21e044939223d70207639e
D:\bert_model\albert-chinese-large-qa/config.json
https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/vocab.json
ibert-roberta-base vocab.json
D:\PYTORCH_TRANSFORMERS_CACHE\673a127a1efd88da2f306da00117064b72a5cc86bcca7be220d81c9c74369858.647b4548b6d9ea817e82e7a9231a320231a1c9ea24053cc9e758f3fe68216f05
D:\bert_model\ibert-roberta-base/vocab.json
https://huggingface.co/xlnet-large-cased/resolve/main/tokenizer.json
xlnet-large-cased tokenizer.json
D:\PYTORCH_TRANSFORMERS_CACHE\6a4afd4829edeea0c7fe7735eccea233e66e79729e574966cfd9ec47f81d269a.2a683f915238b4f560dab0c724066cf0a7de9a851e96b0fb3a1e7f0881552f53
D:\bert_model\xlnet-large-cased/tokenizer.json
https://huggingface.co/roberta-base/resolve/main/config.json
roberta-base config.json
D:\PYTORCH_TRANSFORMERS_CACHE\733bade19e5f0ce98e6531021dd5180994bb2f7b8bd7e80c7968805834ba351e.35205c6cfc956461d8515139f0f8dd5d207a2f336c0c3a83b4bc8dca3518e37b
D:\bert_model\roberta-base/config.json
https://huggingface.co/hfl/chinese-electra-180g-base-discriminator/resolve/main/special_tokens_map.json
chinese-electra-180g-base-discriminator special_tokens_map.json
D:\PYTORCH_TRANSFORMERS_CACHE\76aef35396bf3e77a19ce4972226da9d23226e3c3b6319cf3d232ec1ce3e9031.dd8bd9bfd3664b530ea4e645105f557769387b3da9f79bdb55ed556bdd80611d
D:\bert_model\chinese-electra-180g-base-discriminator/special_tokens_map.json
https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/config.json
distilbert-base-cased-distilled-squad config.json
D:\PYTORCH_TRANSFORMERS_CACHE\81e8dfe090123eff18dd06533ead3ae407b82e30834c50c7c82c2305ce3ace12.ca0305b1f128274fa0c6e4859d1c1477d0e34a20be25da95eea888b30ece9cf3
D:\bert_model\distilbert-base-cased-distilled-squad/config.json
https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/vocab.txt
distilbert-base-uncased-finetuned-sst-2-english vocab.txt
D:\PYTORCH_TRANSFORMERS_CACHE\83261b0c74c462e53d6367de0646b1fca07d0f15f1be045156b9cf8c71279cc9.d789d64ebfe299b0e416afc4a169632f903f693095b4629a7ea271d5a0cf2c99
D:\bert_model\distilbert-base-uncased-finetuned-sst-2-english/vocab.txt
https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/pytorch_model.bin
distilbert-base-uncased-finetuned-sst-2-english pytorch_model.bin
D:\PYTORCH_TRANSFORMERS_CACHE\8d04c767d9d4c14d929ce7ad8e067b80c74dbdb212ef4c3fb743db4ee109fae0.9d268a35da669ead745c44d369dc9948b408da5010c6bac414414a7e33d5748c
D:\bert_model\distilbert-base-uncased-finetuned-sst-2-english/pytorch_model.bin
https://huggingface.co/wptoux/albert-chinese-large-qa/resolve/main/vocab.txt
albert-chinese-large-qa vocab.txt
D:\PYTORCH_TRANSFORMERS_CACHE\b4e2bf6a6135d917a3ad4ddfe65184836aa367fa73d96161ba02f8f0a3cd07d4.accd894ff58c6ff7bd4f3072890776c14f4ea34fcc08e79cd88c2d157756dceb
D:\bert_model\albert-chinese-large-qa/vocab.txt
https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/special_tokens_map.json
ibert-roberta-base special_tokens_map.json
D:\PYTORCH_TRANSFORMERS_CACHE\c2fbe5b3fb8f721fc3f4d0e8e2b8d2ed77db69bef5aaf6779bbee917930b1c95.cb2244924ab24d706b02fd7fcedaea4531566537687a539ebb94db511fd122a0
D:\bert_model\ibert-roberta-base/special_tokens_map.json
https://huggingface.co/roberta-base/resolve/main/merges.txt
roberta-base merges.txt
D:\PYTORCH_TRANSFORMERS_CACHE\cafdecc90fcab17011e12ac813dd574b4b3fea39da6dd817813efa010262ff3f.5d12962c5ee615a4c803841266e9c3be9a691a924f72d395d3a6c6c81157788b
D:\bert_model\roberta-base/merges.txt
https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/config.json
ibert-roberta-base config.json
D:\PYTORCH_TRANSFORMERS_CACHE\cfb510f67e8b7caa315edb63cf273dcabea566cc7c79256c9279b9aabfabc1e2.6e328a8b48a360bcdc4fa4628970901425656415d87641bc286c517e3f274c05
D:\bert_model\ibert-roberta-base/config.json
https://huggingface.co/roberta-base/resolve/main/vocab.json
roberta-base vocab.json
D:\PYTORCH_TRANSFORMERS_CACHE\d3ccdbfeb9aaa747ef20432d4976c32ee3fa69663b379deb253ccfce2bb1fdc5.d67d6b367eb24ab43b08ad55e014cf254076934f71d832bbab9ad35644a375ab
D:\bert_model\roberta-base/vocab.json
https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/tokenizer_config.json
distilbert-base-uncased-finetuned-sst-2-english tokenizer_config.json
D:\PYTORCH_TRANSFORMERS_CACHE\d44ec0488a5f13d92b3934cb68cc5849bd74ce63ede2eea2bf3c675e1e57297c.627f9558061e7bc67ed0f516b2f7efc1351772cc8553101f08748d44aada8b11
D:\bert_model\distilbert-base-uncased-finetuned-sst-2-english/tokenizer_config.json
https://huggingface.co/roberta-base/resolve/main/tokenizer.json
roberta-base tokenizer.json
D:\PYTORCH_TRANSFORMERS_CACHE\d53fc0fa09b8342651efd4073d75e19617b3e51287c2a535becda5808a8db287.fc9576039592f026ad76a1c231b89aee8668488c671dfbe6616bab2ed298d730
D:\bert_model\roberta-base/tokenizer.json
https://huggingface.co/hfl/chinese-electra-180g-base-discriminator/resolve/main/config.json
chinese-electra-180g-base-discriminator config.json
D:\PYTORCH_TRANSFORMERS_CACHE\d820a4640dd0941f13efa75aaf0e618feaa452ae3dc1273aab0ccaf2518dad1a.a2041ee12db6173e5336e826e864a87202039f99cee50c6598f43bec207a9e10
D:\bert_model\chinese-electra-180g-base-discriminator/config.json
https://huggingface.co/hfl/chinese-electra-180g-base-discriminator/resolve/main/tokenizer_config.json
chinese-electra-180g-base-discriminator tokenizer_config.json
D:\PYTORCH_TRANSFORMERS_CACHE\e4a342dc9852a75edad532b36968addf2b6d0da241b6938a99de2a0a9e37667f.d23f50bbddc3fb34db5a76d47fa9bdd5d75bf4201ad2d49abbcca25629b3e562
D:\bert_model\chinese-electra-180g-base-discriminator/tokenizer_config.json
https://huggingface.co/wptoux/albert-chinese-large-qa/resolve/main/tokenizer_config.json
albert-chinese-large-qa tokenizer_config.json
D:\PYTORCH_TRANSFORMERS_CACHE\f09a2ad95ee81ac29d743d3147c78b8c2946c31331e7feb2735e09593e903aeb.56af104a95626511966a55b49534621585cf3139a6b17691620ba971664ac74b
D:\bert_model\albert-chinese-large-qa/tokenizer_config.json
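The key step in the script above is recovering the model name and file name from the `url` field of each cache metadata JSON. That parsing can be pulled out into a small helper; this is a sketch that assumes the `/resolve/<revision>/<file>` Hub URL layout seen in the output above:

```python
from urllib.parse import urlparse

def parse_hub_url(url):
    """Split a Hugging Face Hub 'resolve' URL into (model_name, file_name).

    e.g. .../hfl/chinese-electra-180g-base-discriminator/resolve/main/vocab.txt
         -> ('chinese-electra-180g-base-discriminator', 'vocab.txt')
    """
    path = urlparse(url).path                 # /hfl/<repo>/resolve/main/vocab.txt
    repo_part, _, file_part = path.partition('/resolve/')
    model_name = repo_part.rstrip('/').split('/')[-1]  # last repo path segment
    file_name = file_part.split('/')[-1]               # file after the revision
    return model_name, file_name

print(parse_hub_url('https://huggingface.co/xlnet-large-cased/resolve/main/pytorch_model.bin'))
# -> ('xlnet-large-cased', 'pytorch_model.bin')
```

Note that, like the script, this drops the organization prefix (`hfl/`, `wptoux/`, ...), which is harmless for local loading: once the files are organized, a model can be loaded fully offline by passing the directory path, e.g. `AutoModel.from_pretrained('D:\\bert_model\\roberta-base')`.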
