A few words up front:
The blogger is a software engineering student who uses this blog to record what he is learning, in the hope that it also helps others who are studying.
Everyone runs into all kinds of difficulty and hardship in life; running away solves nothing, and only an optimistic spirit can meet life's challenges.
"Youth grows old quickly and learning is hard won; not an inch of time should be taken lightly."
My favorite saying: finish today's work today.


The blogger has only just started working with crawlers, so please bear with any shortcomings; pointers are very welcome.


Series Contents

From a Python Crawler to Spark Data Preprocessing: A Real-World Requirement [Part 1]
From a Python Crawler to Spark Data Preprocessing: A Real-World Requirement [Part 2]
From a Python Crawler to Spark Data Preprocessing: A Real-World Requirement [Part 3]
From a Python Crawler to Spark Data Preprocessing: A Real-World Requirement [Part 4]
From a Python Crawler to Spark Data Preprocessing: A Real-World Requirement [Part 5]


Contents

  • Series Contents
  • Preface
  • Code
    • Spark Plugs
    • Engine Oil
    • Tires
    • Brake Pads
    • Additives
    • OEM Parts
  • Summary

Preface

This installment re-requests the records that failed to be fetched the first time around and updates the corresponding rows in the database.


Note: the body of the post follows; the examples below are for reference.
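All six scripts below follow the same skeleton: pull the rows whose detail fields are still "NULL" out of MySQL, re-crawl each product page, rebuild an UPDATE statement, and execute it. As a rough outline of that shared flow (a sketch only, using the spark-plug table from the first script; build_update stands in for whichever per-category getProduct function applies):

# Minimal sketch of the shared refresh flow; the column names skuid, brand,
# price and url are assumed from the UPDATE statements in the scripts below.
import pymysql

def refresh_missing_rows(build_update):
    conn = pymysql.connect(host='localhost', user='root', password='root',
                           db='jd_qipei', charset='utf8', autocommit=True)
    cur = conn.cursor()
    # 1. find the rows whose detail fields were never filled in
    cur.execute('SELECT skuid, brand, price, url FROM `xxuan_car_jd_hhs_product` '
                'WHERE name = "NULL"')
    for skuid, brand, price, url in cur.fetchall():
        # 2. re-crawl the product page and rebuild an UPDATE statement
        statement = build_update(url, brand, skuid, price)
        # 3. write the refreshed fields back
        cur.execute(statement)
    conn.close()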

Code

Spark Plugs

import requests
from fake_useragent import UserAgent
import pymysql
from bs4 import BeautifulSoup
def get_proxy():
    return requests.get('http://xxxxxxxxxxx/get/').json()['proxy']


def getHTML(url):
    proxy = get_proxy()  # fetch a proxy IP from the pool
    ua = UserAgent()  # random User-Agent generator
    # with a cookie copied from the browser, the request headers can be written as:
    cookie = '__jdu=577937999; areaId=15; ipLoc-djd=15-1213-3410-0; PCSYCityID=CN_330000_330100_330105; shshshfpa=bbe2e678-8333-005c-d01a-b070738f7860-1597809413; shshshfpb=pqSL0Bsl%2FLma%20U3QU6OB1xw%3D%3D; mt_xid=V2_52007VwcUVFVaVFIXQSldVWJWFwVVX05cGx0eQAAyVhRODQhWWQNJH1gEY1QWBwhcWwovShhfBHsCG05eWUNaG0IcVA5mACJQbVhiUh9IGV4MYgMbU1xfV14eQR1bAVcDFFZZ; user-key=68c44d85-8cac-4072-8369-c117f62d8eb3; cn=0; unpl=V2_ZzNtbURfFhZwXEEAKx4OVWJTElsSUUoUdQsRAHkbWgFmCkEKclRCFnQUR11nGl0UZwQZWEVcQxxFCEdkeBBVAWMDE1VGZxBFLV0CFSNGF1wjU00zQwBBQHcJFF0uSgwDYgcaDhFTQEJ2XBVQL0oMDDdRFAhyZ0AVRQhHZH8fWg1lBRpVSmdzEkU4dlN7EFQGZDMTbUNnAUEpCk5Weh5YSGMFFFVAUUsdfThHZHg%3d; __jdv=76161171|baidu-pinzhuan|t_288551095_baidupinzhuan|cpc|0f3d30c8dba7459bb52f2eb5eba8ac7d_0_79d24e6ea6ca4a17a78012fe337508bf|1597913171572; __jda=122270672.577937999.1597809411.1597906781.1597909689.7; __jdc=122270672; 3AB9D23F7A4B3C9B=3SJYGJUIMVOSXHMAT54Z7M54MSN7POALYPRYHXXL4OTIUAYWVYTBG6AFPA4L4Q5ED37GELWUAZFAMTA6KV6JQSFCHA; shshshfp=b65beee4eac3565989e568b588a5f619; shshshsID=1d65ff1a1d5a2ecc10479b3f2d1ce72b_39_1597913210747; __jdb=122270672.48.577937999|7.1597909689'
    headers = {"User-Agent": ua.random, 'Cookie': cookie}
    trytimes = 100  # number of retries
    for i in range(trytimes):
        try:
            response = requests.get(url, headers=headers,
                                    proxies={"http": "https://{}".format(proxy)}, timeout=1)
            # response = requests.get(url, headers=headers, timeout=3)
            # note: a 302 or other status code is also possible here
            if response.status_code == 200:
                break
        except Exception:
            # logdebug(f'requests failed {i} time')
            print(f'requests failed {i} time', 'URL being fetched:', url)
    return response.text


def getProduct(https_li_href, brand_name, product_sku, product_Price):
    db = "UPDATE `xxuan_car_jd_hhs_product` SET "
    sql = {'skuid': '', 'name': '', 'brand': '', 'price': '', 'url': '',
           'commodity_Name': '', 'image': '', 'sales': '', 'material': '',
           'type': '', 'ArticleNumbera': '', 'GrossWeight': ''}
    sql['url'] = https_li_href
    sql['brand'] = brand_name
    sql['price'] = product_Price
    sql['skuid'] = product_sku
    product_HTML = getHTML(https_li_href)
    produc_soup = BeautifulSoup(product_HTML, 'html.parser')
    # product title
    sku_name_wrap = produc_soup.find('div', attrs={'class': 'itemInfo-wrap'})
    if sku_name_wrap is not None:
        sku_name = sku_name_wrap.find('div', attrs={'class': 'sku-name'})
        if sku_name is not None:
            sql['commodity_Name'] = str(sku_name.text).strip()
    # product image
    spec_img = produc_soup.find('img', attrs={'id': 'spec-img'})
    if spec_img is None:
        spec_img = 'NULL'
    else:
        spec_img = spec_img['data-origin']
    imageURL = f"https:{spec_img}"
    if 'NULL' in imageURL:
        sql['image'] = "NULL"
    else:
        sql['image'] = imageURL
    # product specification list
    parameter_list = produc_soup.find('ul', attrs={'class': 'parameter2 p-parameter-list'})
    if parameter_list is not None:
        li_all_parameter = parameter_list.findAll('li')
        for li in li_all_parameter:
            text = str(li.text)
            if '商品名称:' in text:
                sql['name'] = text.replace('商品名称:', '')
            elif '销售规格:' in text:
                sql['sales'] = text.replace('销售规格:', '')
            elif '产品材质:' in text:
                sql['material'] = text.replace('产品材质:', '')
            elif '产品类型:' in text:
                sql['type'] = text.replace('产品类型:', '')
            elif '货号:' in text:
                sql['ArticleNumbera'] = text.replace('货号:', '')
            elif '商品毛重:' in text:
                sql['GrossWeight'] = text.replace('商品毛重:', '')
    # assemble the SET clause; the last key also appends the WHERE clause
    for i in sql:
        if len(str(sql[i])) == 0:
            sql[i] = 'NULL'
        if i != "GrossWeight":
            db += f"{i}='{sql[i]}',"
        else:
            db += (f"{i}='{sql[i]}' WHERE skuid='{product_sku}' AND brand='{brand_name}' "
                   f"AND price='{product_Price}' AND url='{https_li_href}';")
    # option 1: generate the statements first and load them later with `source`:
    # with open('E:\\xxuan_car_jd_mobil_product.txt', 'a', encoding='utf-8') as w:
    #     w.write(db + '\r')
    # option 2: hand the statement back to the caller, which executes it directly
    return db


def connectMysql():
    conn = pymysql.connect(
        host='localhost',
        user='root',
        password='root',
        db='jd_qipei',
        charset='utf8',
        autocommit=True,  # commit automatically; equivalent to calling conn.commit()
    )
    cur = conn.cursor()
    select_sql = 'select * from `xxuan_car_jd_hhs_product` where name ="NULL"'
    cur.execute(select_sql)
    ALL_NULL = cur.fetchall()
    for Null in ALL_NULL:
        fields = str(Null).split(',')
        sku = fields[1].replace("'", "").strip()
        brand = fields[3].replace("'", "").strip()
        price = fields[4].replace("'", "").strip()
        href = fields[5].replace("'", "").strip()
        db = getProduct(https_li_href=href, brand_name=brand,
                        product_sku=sku, product_Price=price)
        print(db)
        cur.execute(db)
        conn.commit()


if __name__ == '__main__':
    connectMysql()
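One fragile spot worth flagging: connectMysql recovers the row fields by stringifying the whole tuple and splitting on commas, which breaks as soon as any value contains a comma. A hedged alternative (reusing getProduct from the script above, with column names assumed from its UPDATE statement) is to let pymysql hand back each row as a dict:

# Sketch: DictCursor returns rows keyed by column name, so no string-splitting.
import pymysql

def connectMysql_dict():
    conn = pymysql.connect(host='localhost', user='root', password='root',
                           db='jd_qipei', charset='utf8', autocommit=True,
                           cursorclass=pymysql.cursors.DictCursor)
    cur = conn.cursor()
    cur.execute('select * from `xxuan_car_jd_hhs_product` where name ="NULL"')
    for row in cur.fetchall():
        db = getProduct(https_li_href=row['url'], brand_name=row['brand'],
                        product_sku=row['skuid'], product_Price=row['price'])
        cur.execute(db)
    conn.close()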

Engine Oil

import requests
from fake_useragent import UserAgent
import pymysql
from bs4 import BeautifulSoup
def get_proxy():
    return requests.get('http://xxxxxxxxxxx/get/').json()['proxy']


def getHTML(url):
    proxy = get_proxy()  # fetch a proxy IP from the pool
    ua = UserAgent()  # random User-Agent generator
    # with a cookie copied from the browser, the request headers can be written as:
    cookie = '__jdu=577937999; areaId=15; ipLoc-djd=15-1213-3410-0; PCSYCityID=CN_330000_330100_330105; shshshfpa=bbe2e678-8333-005c-d01a-b070738f7860-1597809413; shshshfpb=pqSL0Bsl%2FLma%20U3QU6OB1xw%3D%3D; mt_xid=V2_52007VwcUVFVaVFIXQSldVWJWFwVVX05cGx0eQAAyVhRODQhWWQNJH1gEY1QWBwhcWwovShhfBHsCG05eWUNaG0IcVA5mACJQbVhiUh9IGV4MYgMbU1xfV14eQR1bAVcDFFZZ; user-key=68c44d85-8cac-4072-8369-c117f62d8eb3; cn=0; unpl=V2_ZzNtbURfFhZwXEEAKx4OVWJTElsSUUoUdQsRAHkbWgFmCkEKclRCFnQUR11nGl0UZwQZWEVcQxxFCEdkeBBVAWMDE1VGZxBFLV0CFSNGF1wjU00zQwBBQHcJFF0uSgwDYgcaDhFTQEJ2XBVQL0oMDDdRFAhyZ0AVRQhHZH8fWg1lBRpVSmdzEkU4dlN7EFQGZDMTbUNnAUEpCk5Weh5YSGMFFFVAUUsdfThHZHg%3d; __jdv=76161171|baidu-pinzhuan|t_288551095_baidupinzhuan|cpc|0f3d30c8dba7459bb52f2eb5eba8ac7d_0_79d24e6ea6ca4a17a78012fe337508bf|1597913171572; __jda=122270672.577937999.1597809411.1597906781.1597909689.7; __jdc=122270672; 3AB9D23F7A4B3C9B=3SJYGJUIMVOSXHMAT54Z7M54MSN7POALYPRYHXXL4OTIUAYWVYTBG6AFPA4L4Q5ED37GELWUAZFAMTA6KV6JQSFCHA; shshshfp=b65beee4eac3565989e568b588a5f619; shshshsID=1d65ff1a1d5a2ecc10479b3f2d1ce72b_39_1597913210747; __jdb=122270672.48.577937999|7.1597909689'
    headers = {"User-Agent": ua.random, 'Cookie': cookie}
    trytimes = 1000  # number of retries
    for i in range(trytimes):
        try:
            response = requests.get(url, headers=headers,
                                    proxies={"http": "https://{}".format(proxy)}, timeout=1)
            # response = requests.get(url, headers=headers, timeout=3)
            # note: a 302 or other status code is also possible here
            if response.status_code == 200:
                break
        except Exception:
            # logdebug(f'requests failed {i} time')
            print(f'requests failed {i} time', 'URL being fetched:', url)
    return response.text


def getProduct(https_li_href, brand_name):
    db = "UPDATE `xxuan_car_jd_mobil_product` SET "
    sql = {'skuid': '', 'name': '', 'brand': '', 'type': '', 'url': '',
           'originplace': '', 'netweight': '', 'commodity_Name': '',
           'image': '', 'viscosity': '', 'volume': ''}
    sql['url'] = https_li_href
    sql['brand'] = brand_name
    product_HTML = getHTML(https_li_href)
    produc_soup = BeautifulSoup(product_HTML, 'html.parser')
    # product title
    sku_name_wrap = produc_soup.find('div', attrs={'class': 'itemInfo-wrap'})
    if sku_name_wrap is not None:
        sku_name = sku_name_wrap.find('div', attrs={'class': 'sku-name'})
        if sku_name is not None:
            sql['commodity_Name'] = str(sku_name.text).strip()
    # price scraping from an earlier version, disabled here:
    # summary_price = produc_soup.find('div', attrs={'class': 'summary-price J-summary-price'})
    # if summary_price != None:
    #     p_price = summary_price.find('div', attrs={'class': 'dd'}).find('span', attrs={'class': 'pricing'})
    #     if p_price != None:
    #         p_price = str(p_price.text).replace('[', '').replace(']', '').replace('¥', '')
    #     else:
    #         p_price = 'NULL'
    #     sql['price'] = p_price
    # else:
    #     sql['price'] = 'NULL'
    # product image
    spec_img = produc_soup.find('img', attrs={'id': 'spec-img'})
    if spec_img is None:
        spec_img = 'NULL'
    else:
        spec_img = spec_img['data-origin']
    imageURL = f"https:{spec_img}"
    if 'NULL' in imageURL:
        sql['image'] = "NULL"
    else:
        sql['image'] = imageURL
    # product specification list
    parameter_list = produc_soup.find('ul', attrs={'class': 'parameter2 p-parameter-list'})
    if parameter_list is not None:
        li_all_parameter = parameter_list.findAll('li')
        for li in li_all_parameter:
            text = str(li.text)
            if '商品名称:' in text:
                sql['name'] = text.replace('商品名称:', '')
            elif '商品编号:' in text:
                sql['skuid'] = text.replace('商品编号:', '')
            elif '商品毛重:' in text:
                sql['netweight'] = text.replace('商品毛重:', '')
            elif '商品产地:' in text:
                sql['originplace'] = text.replace('商品产地:', '')
            elif '粘度:' in text:
                sql['viscosity'] = text.replace('粘度:', '')
            elif '机油种类:' in text:
                sql['type'] = text.replace('机油种类:', '')
            elif '容量:' in text:
                sql['volume'] = text.replace('容量:', '')
    # assemble the SET clause; the last key also appends the WHERE clause
    for i in sql:
        if len(str(sql[i])) == 0:
            sql[i] = 'NULL'
        if i != "volume":
            db += f"{i}='{sql[i]}',"
        else:
            db += f"{i}='{sql[i]}' WHERE url='{https_li_href}' AND brand='{brand_name}'"
    # option 1: generate the statements first and load them later with `source`:
    # with open('E:\\xxuan_car_jd_mobil_product.txt', 'a', encoding='utf-8') as w:
    #     w.write(db + '\r')
    # option 2: hand the statement back to the caller, which executes it directly
    return db


def connectMysql():
    conn = pymysql.connect(
        host='localhost',
        user='root',
        password='root',
        db='jd_qipei',
        charset='utf8',
        autocommit=True,  # commit automatically; equivalent to calling conn.commit()
    )
    cur = conn.cursor()
    select_sql = 'select * from `xxuan_car_jd_mobil_product` where skuid ="NULL"'
    cur.execute(select_sql)
    ALL_NULL = cur.fetchall()
    for Null in ALL_NULL:
        fields = str(Null).split(',')
        brand = fields[3].replace("'", "").strip()
        href = fields[6].replace("'", "").strip()
        db = getProduct(https_li_href=href, brand_name=brand)
        print(db)
        cur.execute(db)
        conn.commit()


if __name__ == '__main__':
    connectMysql()
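A note on the proxies argument in getHTML: {"http": "https://{}".format(proxy)} only covers plain-HTTP requests and points them at an https:// proxy URL, so HTTPS product pages are fetched without the proxy at all. Assuming the pool hands back bare ip:port strings, a more conventional mapping looks like this (a sketch, not tested against this particular pool):

import requests

def fetch_via_proxy(url, headers, proxy):
    # proxy is assumed to be a bare "ip:port" string from the pool
    proxies = {
        "http": f"http://{proxy}",   # plain-HTTP requests go through the proxy
        "https": f"http://{proxy}",  # HTTPS requests go through the same endpoint
    }
    return requests.get(url, headers=headers, proxies=proxies, timeout=5)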

Tires

import requests
from fake_useragent import UserAgent
import pymysql
from bs4 import BeautifulSoup
def get_proxy():
    return requests.get('http://xxxxxxxxxxx/get/').json()['proxy']


def getHTML(url):
    proxy = get_proxy()  # fetch a proxy IP from the pool
    ua = UserAgent()  # random User-Agent generator
    # with a cookie copied from the browser, the request headers can be written as:
    cookie = '__jdu=577937999; areaId=15; ipLoc-djd=15-1213-3410-0; PCSYCityID=CN_330000_330100_330105; shshshfpa=bbe2e678-8333-005c-d01a-b070738f7860-1597809413; shshshfpb=pqSL0Bsl%2FLma%20U3QU6OB1xw%3D%3D; mt_xid=V2_52007VwcUVFVaVFIXQSldVWJWFwVVX05cGx0eQAAyVhRODQhWWQNJH1gEY1QWBwhcWwovShhfBHsCG05eWUNaG0IcVA5mACJQbVhiUh9IGV4MYgMbU1xfV14eQR1bAVcDFFZZ; user-key=68c44d85-8cac-4072-8369-c117f62d8eb3; cn=0; unpl=V2_ZzNtbURfFhZwXEEAKx4OVWJTElsSUUoUdQsRAHkbWgFmCkEKclRCFnQUR11nGl0UZwQZWEVcQxxFCEdkeBBVAWMDE1VGZxBFLV0CFSNGF1wjU00zQwBBQHcJFF0uSgwDYgcaDhFTQEJ2XBVQL0oMDDdRFAhyZ0AVRQhHZH8fWg1lBRpVSmdzEkU4dlN7EFQGZDMTbUNnAUEpCk5Weh5YSGMFFFVAUUsdfThHZHg%3d; __jdv=76161171|baidu-pinzhuan|t_288551095_baidupinzhuan|cpc|0f3d30c8dba7459bb52f2eb5eba8ac7d_0_79d24e6ea6ca4a17a78012fe337508bf|1597913171572; __jda=122270672.577937999.1597809411.1597906781.1597909689.7; __jdc=122270672; 3AB9D23F7A4B3C9B=3SJYGJUIMVOSXHMAT54Z7M54MSN7POALYPRYHXXL4OTIUAYWVYTBG6AFPA4L4Q5ED37GELWUAZFAMTA6KV6JQSFCHA; shshshfp=b65beee4eac3565989e568b588a5f619; shshshsID=1d65ff1a1d5a2ecc10479b3f2d1ce72b_39_1597913210747; __jdb=122270672.48.577937999|7.1597909689'
    headers = {"User-Agent": ua.random, 'Cookie': cookie}
    trytimes = 1000  # number of retries
    for i in range(trytimes):
        try:
            response = requests.get(url, headers=headers,
                                    proxies={"http": "https://{}".format(proxy)}, timeout=1)
            # response = requests.get(url, headers=headers, timeout=3)
            # note: a 302 or other status code is also possible here
            if response.status_code == 200:
                break
        except Exception:
            # logdebug(f'requests failed {i} time')
            print(f'requests failed {i} time', 'URL being fetched:', url)
    return response.text


def getProduct(https_li_href, brand_name, price):
    db = "UPDATE `xxuan_car_jd_lt_product` SET "
    sql = {'skuid': '', 'name': '', 'brand': '', 'url': '', 'price': '',
           'commodity_Name': '', 'image': '', 'netweight': '', 'originplace': '',
           'size': '', 'width': '', 'number': '', 'performance': '',
           'Flattening': '', 'characteristics': '', 'type': ''}
    sql['url'] = https_li_href
    sql['brand'] = brand_name
    sql['price'] = price
    product_HTML = getHTML(https_li_href)
    produc_soup = BeautifulSoup(product_HTML, 'html.parser')
    # product title
    sku_name_wrap = produc_soup.find('div', attrs={'class': 'itemInfo-wrap'})
    if sku_name_wrap is not None:
        sku_name = sku_name_wrap.find('div', attrs={'class': 'sku-name'})
        if sku_name is not None:
            sql['commodity_Name'] = str(sku_name.text).strip()
    # price scraping from an earlier version, disabled here:
    # summary_price = produc_soup.find('div', attrs={'class': 'summary-price J-summary-price'})
    # if summary_price != None:
    #     p_price = summary_price.find('div', attrs={'class': 'dd'}).find('span', attrs={'class': 'pricing'})
    #     if p_price != None:
    #         p_price = str(p_price.text).replace('[', '').replace(']', '').replace('¥', '')
    #     else:
    #         p_price = 'NULL'
    #     sql['price'] = p_price
    # else:
    #     sql['price'] = 'NULL'
    # product image
    spec_img = produc_soup.find('img', attrs={'id': 'spec-img'})
    if spec_img is None:
        spec_img = 'NULL'
    else:
        spec_img = spec_img['data-origin']
    imageURL = f"https:{spec_img}"
    if 'NULL' in imageURL:
        sql['image'] = "NULL"
    else:
        sql['image'] = imageURL
    # product specification list
    parameter_list = produc_soup.find('ul', attrs={'class': 'parameter2 p-parameter-list'})
    if parameter_list is not None:
        li_all_parameter = parameter_list.findAll('li')
        for li in li_all_parameter:
            text = str(li.text)
            if '商品名称:' in text:
                sql['name'] = text.replace('商品名称:', '')
            elif '商品编号:' in text:
                sql['skuid'] = text.replace('商品编号:', '')
            elif '商品毛重:' in text:
                sql['netweight'] = text.replace('商品毛重:', '')
            elif '商品产地:' in text:
                sql['originplace'] = text.replace('商品产地:', '')
            elif '尺寸:' in text:
                sql['size'] = text.replace('尺寸:', '')
            elif '胎面宽度:' in text:
                sql['width'] = text.replace('胎面宽度:', '')
            elif '扁平比:' in text:
                sql['Flattening'] = text.replace('扁平比:', '')
            elif '货号:' in text:
                sql['number'] = text.replace('货号:', '')
            elif '花纹性能:' in text:
                sql['performance'] = text.replace('花纹性能:', '')
            elif '轮胎特性:' in text:
                sql['characteristics'] = text.replace('轮胎特性:', '')
            elif '车型类别:' in text:
                sql['type'] = text.replace('车型类别:', '')
    # assemble the SET clause; the last key also appends the WHERE clause
    for i in sql:
        if len(str(sql[i])) == 0:
            sql[i] = 'NULL'
        if i != "type":
            db += f"{i}='{sql[i]}',"
        else:
            db += f"{i}='{sql[i]}' WHERE url='{https_li_href}' AND brand='{brand_name}'"
    # option 1: generate the statements first and load them later with `source`:
    # with open('E:\\xxuan_car_jd_mobil_product.txt', 'a', encoding='utf-8') as w:
    #     w.write(db + '\r')
    # option 2: hand the statement back to the caller, which executes it directly
    return db


def connectMysql():
    conn = pymysql.connect(
        host='localhost',
        user='root',
        password='root',
        db='jd_qipei',
        charset='utf8',
        autocommit=True,  # commit automatically; equivalent to calling conn.commit()
    )
    cur = conn.cursor()
    select_sql = 'select * from `xxuan_car_jd_lt_product` where skuid ="NULL"'
    cur.execute(select_sql)
    ALL_NULL = cur.fetchall()
    for Null in ALL_NULL:
        fields = str(Null).split(',')
        brand = fields[3].replace("'", "").strip()
        href = fields[4].replace("'", "").strip()
        price = fields[5].replace("'", "").strip()
        db = getProduct(https_li_href=href, brand_name=brand, price=price)
        print(db)
        cur.execute(db)
        conn.commit()


if __name__ == '__main__':
    connectMysql()
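Another caveat shared by every getHTML here: if all retries fail, response is never assigned and the final return response.text raises UnboundLocalError, and with trytimes = 1000 and a one-second timeout a dead URL can stall the run for a long time. A defensive sketch of the same loop (without the proxy and cookie plumbing):

import requests
from fake_useragent import UserAgent

def get_html_safe(url, tries=10):
    # Same retry loop, but returns None instead of crashing when every
    # attempt fails, and caps the retries at something survivable.
    ua = UserAgent()
    for i in range(tries):
        try:
            response = requests.get(url, headers={"User-Agent": ua.random},
                                    timeout=3)
            if response.status_code == 200:  # could also be 302 etc.
                return response.text
        except requests.RequestException:
            print(f'requests failed {i} time', 'URL being fetched:', url)
    return None  # callers must check for a dead URL explicitly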

Brake Pads

import requests
from fake_useragent import UserAgent
import pymysql
from bs4 import BeautifulSoup
def get_proxy():
    return requests.get('http://xxxxxxxxxxx/get/').json()['proxy']


def getHTML(url):
    proxy = get_proxy()  # fetch a proxy IP from the pool
    ua = UserAgent()  # random User-Agent generator
    # with a cookie copied from the browser, the request headers can be written as:
    cookie = '__jdu=577937999; areaId=15; ipLoc-djd=15-1213-3410-0; PCSYCityID=CN_330000_330100_330105; shshshfpa=bbe2e678-8333-005c-d01a-b070738f7860-1597809413; shshshfpb=pqSL0Bsl%2FLma%20U3QU6OB1xw%3D%3D; mt_xid=V2_52007VwcUVFVaVFIXQSldVWJWFwVVX05cGx0eQAAyVhRODQhWWQNJH1gEY1QWBwhcWwovShhfBHsCG05eWUNaG0IcVA5mACJQbVhiUh9IGV4MYgMbU1xfV14eQR1bAVcDFFZZ; user-key=68c44d85-8cac-4072-8369-c117f62d8eb3; cn=0; unpl=V2_ZzNtbURfFhZwXEEAKx4OVWJTElsSUUoUdQsRAHkbWgFmCkEKclRCFnQUR11nGl0UZwQZWEVcQxxFCEdkeBBVAWMDE1VGZxBFLV0CFSNGF1wjU00zQwBBQHcJFF0uSgwDYgcaDhFTQEJ2XBVQL0oMDDdRFAhyZ0AVRQhHZH8fWg1lBRpVSmdzEkU4dlN7EFQGZDMTbUNnAUEpCk5Weh5YSGMFFFVAUUsdfThHZHg%3d; __jdv=76161171|baidu-pinzhuan|t_288551095_baidupinzhuan|cpc|0f3d30c8dba7459bb52f2eb5eba8ac7d_0_79d24e6ea6ca4a17a78012fe337508bf|1597913171572; __jda=122270672.577937999.1597809411.1597906781.1597909689.7; __jdc=122270672; 3AB9D23F7A4B3C9B=3SJYGJUIMVOSXHMAT54Z7M54MSN7POALYPRYHXXL4OTIUAYWVYTBG6AFPA4L4Q5ED37GELWUAZFAMTA6KV6JQSFCHA; shshshfp=b65beee4eac3565989e568b588a5f619; shshshsID=1d65ff1a1d5a2ecc10479b3f2d1ce72b_39_1597913210747; __jdb=122270672.48.577937999|7.1597909689'
    headers = {"User-Agent": ua.random, 'Cookie': cookie}
    trytimes = 100  # number of retries
    for i in range(trytimes):
        try:
            response = requests.get(url, headers=headers,
                                    proxies={"http": "https://{}".format(proxy)}, timeout=1)
            # response = requests.get(url, headers=headers, timeout=3)
            # note: a 302 or other status code is also possible here
            if response.status_code == 200:
                break
        except Exception:
            # logdebug(f'requests failed {i} time')
            print(f'requests failed {i} time', 'URL being fetched:', url)
    return response.text


def getProduct(https_li_href, brand_name, product_Sku, product_Price):
    db = "UPDATE `xxuan_car_jd_scp_product` SET "
    sql = {'skuid': '', 'name': '', 'brand': '', 'price': '', 'url': '',
           'commodity_Name': '', 'image': '', 'Additivetype': '',
           'TypesOfAdditives': '', 'NetContent': '', 'ArticleNumber': '',
           'boiling': '', 'package': '', 'GrossWeight': '', 'CommodityOrigin': '',
           'process': '', 'Installation': '', 'type': '', 'texture': ''}
    sql['url'] = https_li_href
    sql['brand'] = brand_name
    sql['price'] = product_Price
    sql['skuid'] = product_Sku
    product_HTML = getHTML(https_li_href)
    produc_soup = BeautifulSoup(product_HTML, 'html.parser')
    # product title
    sku_name_wrap = produc_soup.find('div', attrs={'class': 'itemInfo-wrap'})
    if sku_name_wrap is not None:
        sku_name = sku_name_wrap.find('div', attrs={'class': 'sku-name'})
        if sku_name is not None:
            sql['commodity_Name'] = str(sku_name.text).strip()
    # product image
    spec_img = produc_soup.find('img', attrs={'id': 'spec-img'})
    if spec_img is None:
        spec_img = 'NULL'
    else:
        spec_img = spec_img['data-origin']
    imageURL = f"https:{spec_img}"
    if 'NULL' in imageURL:
        sql['image'] = "NULL"
    else:
        sql['image'] = imageURL
    # product specification list; note '产品类别:' must be tested before the
    # shorter '类别:', because the longer label contains the shorter one
    parameter_list = produc_soup.find('ul', attrs={'class': 'parameter2 p-parameter-list'})
    if parameter_list is not None:
        li_all_parameter = parameter_list.findAll('li')
        for li in li_all_parameter:
            text = str(li.text)
            if '商品名称:' in text:
                sql['name'] = text.replace('商品名称:', '')
            elif '商品编号:' in text:
                sql['skuid'] = text.replace('商品编号:', '')
            elif '产品类别:' in text:
                sql['type'] = text.replace('产品类别:', '')
            elif '包装规格:' in text:
                sql['package'] = text.replace('包装规格:', '')
            elif '干湿沸点:' in text:
                sql['boiling'] = text.replace('干湿沸点:', '')
            elif '货号:' in text:
                sql['ArticleNumber'] = text.replace('货号:', '')
            elif '商品毛重:' in text:
                sql['GrossWeight'] = text.replace('商品毛重:', '')
            elif '商品产地:' in text:
                sql['CommodityOrigin'] = text.replace('商品产地:', '')
            elif '产品工艺:' in text:
                sql['process'] = text.replace('产品工艺:', '')
            elif '安装位置:' in text:
                sql['Installation'] = text.replace('安装位置:', '')
            elif '类别:' in text:
                sql['type'] = text.replace('类别:', '')
            elif '材质:' in text:
                sql['texture'] = text.replace('材质:', '')
    # assemble the SET clause; the last key also appends the WHERE clause
    for i in sql:
        if len(str(sql[i])) == 0:
            sql[i] = 'NULL'
        if i != "texture":
            db += f"{i}='{sql[i]}',"
        else:
            db += (f"{i}='{sql[i]}' WHERE skuid='{product_Sku}' AND brand='{brand_name}' "
                   f"AND price='{product_Price}' AND url='{https_li_href}';")
    # option 1: generate the statements first and load them later with `source`:
    # with open('E:\\xxuan_car_jd_mobil_product.txt', 'a', encoding='utf-8') as w:
    #     w.write(db + '\r')
    # option 2: hand the statement back to the caller, which executes it directly
    return db


def connectMysql():
    conn = pymysql.connect(
        host='localhost',
        user='root',
        password='root',
        db='jd_qipei',
        charset='utf8',
        autocommit=True,  # commit automatically; equivalent to calling conn.commit()
    )
    cur = conn.cursor()
    select_sql = 'select * from `xxuan_car_jd_scp_product` where name ="NULL"'
    cur.execute(select_sql)
    ALL_NULL = cur.fetchall()
    for Null in ALL_NULL:
        fields = str(Null).split(',')
        sku = fields[1].replace("'", "").strip()
        brand = fields[3].replace("'", "").strip()
        price = fields[4].replace("'", "").strip()
        href = fields[5].replace("'", "").strip()
        print('sku:', sku, '----brand:', brand, '----href:', href, '----price:', price)
        db = getProduct(https_li_href=href, brand_name=brand,
                        product_Sku=sku, product_Price=price)
        print(db)
        cur.execute(db)
        conn.commit()


if __name__ == '__main__':
    connectMysql()
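The UPDATE statements are glued together with f-strings, so any scraped value containing a single quote will break, or worse, rewrite, the statement. A hedged sketch of the same update built with pymysql placeholders instead (table and column names taken from the brake-pad script above):

# Sketch: parameterized UPDATE; pymysql escapes the values itself.
def update_product(cur, sql, product_Sku, brand_name, product_Price, https_li_href):
    set_clause = ', '.join(f"{col}=%s" for col in sql)  # column names are trusted
    statement = (f"UPDATE `xxuan_car_jd_scp_product` SET {set_clause} "
                 "WHERE skuid=%s AND brand=%s AND price=%s AND url=%s")
    cur.execute(statement,
                list(sql.values()) + [product_Sku, brand_name, product_Price, https_li_href])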

Additives

import requests
from fake_useragent import UserAgent
import pymysql
from bs4 import BeautifulSoup
def get_proxy():
    return requests.get('http://xxxxxxxxxxx/get/').json()['proxy']


def getHTML(url):
    proxy = get_proxy()  # fetch a proxy IP from the pool
    ua = UserAgent()  # random User-Agent generator
    # with a cookie copied from the browser, the request headers can be written as:
    cookie = '__jdu=577937999; areaId=15; ipLoc-djd=15-1213-3410-0; PCSYCityID=CN_330000_330100_330105; shshshfpa=bbe2e678-8333-005c-d01a-b070738f7860-1597809413; shshshfpb=pqSL0Bsl%2FLma%20U3QU6OB1xw%3D%3D; mt_xid=V2_52007VwcUVFVaVFIXQSldVWJWFwVVX05cGx0eQAAyVhRODQhWWQNJH1gEY1QWBwhcWwovShhfBHsCG05eWUNaG0IcVA5mACJQbVhiUh9IGV4MYgMbU1xfV14eQR1bAVcDFFZZ; user-key=68c44d85-8cac-4072-8369-c117f62d8eb3; cn=0; unpl=V2_ZzNtbURfFhZwXEEAKx4OVWJTElsSUUoUdQsRAHkbWgFmCkEKclRCFnQUR11nGl0UZwQZWEVcQxxFCEdkeBBVAWMDE1VGZxBFLV0CFSNGF1wjU00zQwBBQHcJFF0uSgwDYgcaDhFTQEJ2XBVQL0oMDDdRFAhyZ0AVRQhHZH8fWg1lBRpVSmdzEkU4dlN7EFQGZDMTbUNnAUEpCk5Weh5YSGMFFFVAUUsdfThHZHg%3d; __jdv=76161171|baidu-pinzhuan|t_288551095_baidupinzhuan|cpc|0f3d30c8dba7459bb52f2eb5eba8ac7d_0_79d24e6ea6ca4a17a78012fe337508bf|1597913171572; __jda=122270672.577937999.1597809411.1597906781.1597909689.7; __jdc=122270672; 3AB9D23F7A4B3C9B=3SJYGJUIMVOSXHMAT54Z7M54MSN7POALYPRYHXXL4OTIUAYWVYTBG6AFPA4L4Q5ED37GELWUAZFAMTA6KV6JQSFCHA; shshshfp=b65beee4eac3565989e568b588a5f619; shshshsID=1d65ff1a1d5a2ecc10479b3f2d1ce72b_39_1597913210747; __jdb=122270672.48.577937999|7.1597909689'
    headers = {"User-Agent": ua.random, 'Cookie': cookie}
    trytimes = 100  # number of retries
    for i in range(trytimes):
        try:
            response = requests.get(url, headers=headers,
                                    proxies={"http": "https://{}".format(proxy)}, timeout=1)
            # response = requests.get(url, headers=headers, timeout=3)
            # note: a 302 or other status code is also possible here
            if response.status_code == 200:
                break
        except Exception:
            # logdebug(f'requests failed {i} time')
            print(f'requests failed {i} time', 'URL being fetched:', url)
    return response.text


def getProduct(https_li_href, brand_name, product_Price):
    db = "UPDATE `xxuan_car_jd_tjj_product` SET "
    sql = {'skuid': '', 'name': '', 'brand': '', 'price': '', 'url': '',
           'commodity_Name': '', 'image': '', 'Additivetype': '',
           'TypesOfAdditives': '', 'NetContent': '', 'ArticleNumber': '',
           'GrossWeight': '', 'CommodityOrigin': ''}
    sql['url'] = https_li_href
    sql['brand'] = brand_name
    sql['price'] = product_Price
    product_HTML = getHTML(https_li_href)
    produc_soup = BeautifulSoup(product_HTML, 'html.parser')
    # product title
    sku_name_wrap = produc_soup.find('div', attrs={'class': 'itemInfo-wrap'})
    if sku_name_wrap is not None:
        sku_name = sku_name_wrap.find('div', attrs={'class': 'sku-name'})
        if sku_name is not None:
            sql['commodity_Name'] = str(sku_name.text).strip()
    # product image
    spec_img = produc_soup.find('img', attrs={'id': 'spec-img'})
    if spec_img is None:
        spec_img = 'NULL'
    else:
        spec_img = spec_img['data-origin']
    imageURL = f"https:{spec_img}"
    if 'NULL' in imageURL:
        sql['image'] = "NULL"
    else:
        sql['image'] = imageURL
    # product specification list
    parameter_list = produc_soup.find('ul', attrs={'class': 'parameter2 p-parameter-list'})
    if parameter_list is not None:
        li_all_parameter = parameter_list.findAll('li')
        for li in li_all_parameter:
            text = str(li.text)
            if '商品名称:' in text:
                sql['name'] = text.replace('商品名称:', '')
            elif '商品编号:' in text:
                sql['skuid'] = text.replace('商品编号:', '')
            elif '添加剂类型:' in text:
                sql['Additivetype'] = text.replace('添加剂类型:', '')
            elif '添加剂种类:' in text:
                sql['TypesOfAdditives'] = text.replace('添加剂种类:', '')
            elif '净含量:' in text:
                sql['NetContent'] = text.replace('净含量:', '')
            elif '货号:' in text:
                sql['ArticleNumber'] = text.replace('货号:', '')
            elif '商品毛重:' in text:
                sql['GrossWeight'] = text.replace('商品毛重:', '')
            elif '商品产地:' in text:
                sql['CommodityOrigin'] = text.replace('商品产地:', '')
    # assemble the SET clause; the last key also appends the WHERE clause
    for i in sql:
        if len(str(sql[i])) == 0:
            sql[i] = 'NULL'
        if i != "CommodityOrigin":
            db += f"{i}='{sql[i]}',"
        else:
            db += (f"{i}='{sql[i]}' WHERE brand='{brand_name}' "
                   f"AND price='{product_Price}' AND url='{https_li_href}';")
    # option 1: generate the statements first and load them later with `source`:
    # with open('E:\\xxuan_car_jd_mobil_product.txt', 'a', encoding='utf-8') as w:
    #     w.write(db + '\r')
    # option 2: hand the statement back to the caller, which executes it directly
    return db


def connectMysql():
    conn = pymysql.connect(
        host='localhost',
        user='root',
        password='root',
        db='jd_qipei',
        charset='utf8',
        autocommit=True,  # commit automatically; equivalent to calling conn.commit()
    )
    cur = conn.cursor()
    select_sql = 'select * from `xxuan_car_jd_tjj_product` where skuid ="NULL"'
    cur.execute(select_sql)
    ALL_NULL = cur.fetchall()
    for Null in ALL_NULL:
        fields = str(Null).split(',')
        brand = fields[3].replace("'", "").strip()
        price = fields[4].replace("'", "").strip()
        href = fields[5].replace("'", "").strip()
        db = getProduct(https_li_href=href, brand_name=brand, product_Price=price)
        print(db)
        cur.execute(db)
        conn.commit()


if __name__ == '__main__':
    connectMysql()
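The label-to-column mapping inside every getProduct is one long if/elif chain; it can be collapsed into a lookup table. A sketch using the additive script's fields (this only works as written because no label here is a substring of another):

# Sketch: map JD spec-list labels to database columns via a dict.
FIELD_BY_LABEL = {
    '商品名称:': 'name',
    '商品编号:': 'skuid',
    '添加剂类型:': 'Additivetype',
    '添加剂种类:': 'TypesOfAdditives',
    '净含量:': 'NetContent',
    '货号:': 'ArticleNumber',
    '商品毛重:': 'GrossWeight',
    '商品产地:': 'CommodityOrigin',
}

def fill_fields(sql, li_all_parameter):
    for li in li_all_parameter:
        text = str(li.text)
        for label, key in FIELD_BY_LABEL.items():
            if label in text:
                sql[key] = text.replace(label, '')
                break  # one <li> carries at most one label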

OEM Parts

import requests
from fake_useragent import UserAgent
import pymysql
from bs4 import BeautifulSoup
def get_proxy():
    return requests.get('http://xxxxxxxxxxx/get/').json()['proxy']


def getHTML(url):
    proxy = get_proxy()  # fetch a proxy IP from the pool
    ua = UserAgent()  # random User-Agent generator
    # with a cookie copied from the browser, the request headers can be written as:
    cookie = '__jdu=577937999; areaId=15; ipLoc-djd=15-1213-3410-0; PCSYCityID=CN_330000_330100_330105; shshshfpa=bbe2e678-8333-005c-d01a-b070738f7860-1597809413; shshshfpb=pqSL0Bsl%2FLma%20U3QU6OB1xw%3D%3D; mt_xid=V2_52007VwcUVFVaVFIXQSldVWJWFwVVX05cGx0eQAAyVhRODQhWWQNJH1gEY1QWBwhcWwovShhfBHsCG05eWUNaG0IcVA5mACJQbVhiUh9IGV4MYgMbU1xfV14eQR1bAVcDFFZZ; user-key=68c44d85-8cac-4072-8369-c117f62d8eb3; cn=0; unpl=V2_ZzNtbURfFhZwXEEAKx4OVWJTElsSUUoUdQsRAHkbWgFmCkEKclRCFnQUR11nGl0UZwQZWEVcQxxFCEdkeBBVAWMDE1VGZxBFLV0CFSNGF1wjU00zQwBBQHcJFF0uSgwDYgcaDhFTQEJ2XBVQL0oMDDdRFAhyZ0AVRQhHZH8fWg1lBRpVSmdzEkU4dlN7EFQGZDMTbUNnAUEpCk5Weh5YSGMFFFVAUUsdfThHZHg%3d; __jdv=76161171|baidu-pinzhuan|t_288551095_baidupinzhuan|cpc|0f3d30c8dba7459bb52f2eb5eba8ac7d_0_79d24e6ea6ca4a17a78012fe337508bf|1597913171572; __jda=122270672.577937999.1597809411.1597906781.1597909689.7; __jdc=122270672; 3AB9D23F7A4B3C9B=3SJYGJUIMVOSXHMAT54Z7M54MSN7POALYPRYHXXL4OTIUAYWVYTBG6AFPA4L4Q5ED37GELWUAZFAMTA6KV6JQSFCHA; shshshfp=b65beee4eac3565989e568b588a5f619; shshshsID=1d65ff1a1d5a2ecc10479b3f2d1ce72b_39_1597913210747; __jdb=122270672.48.577937999|7.1597909689'
    headers = {"User-Agent": ua.random, 'Cookie': cookie}
    trytimes = 100  # number of retries
    for i in range(trytimes):
        try:
            response = requests.get(url, headers=headers,
                                    proxies={"http": "https://{}".format(proxy)}, timeout=1)
            # response = requests.get(url, headers=headers, timeout=3)
            # note: a 302 or other status code is also possible here
            if response.status_code == 200:
                break
        except Exception:
            # logdebug(f'requests failed {i} time')
            print(f'requests failed {i} time', 'URL being fetched:', url)
    return response.text


def getProduct(https_li_href, brand_name, product_sku, product_Price):
    db = "UPDATE `xxuan_car_jd_ycj_product` SET "
    sql = {'skuid': '', 'name': '', 'brand': '', 'freezing': '', 'url': '',
           'originplace': '', 'netweight': '', 'price': '', 'commodity_Name': '',
           'image': '', 'category': '', 'package': '', 'boiling': '',
           'sales': '', 'installation': '', 'transmission': ''}
    sql['url'] = https_li_href
    sql['brand'] = brand_name
    sql['skuid'] = product_sku
    sql['price'] = product_Price
    product_HTML = getHTML(https_li_href)
    produc_soup = BeautifulSoup(product_HTML, 'html.parser')
    # product title
    sku_name_wrap = produc_soup.find('div', attrs={'class': 'itemInfo-wrap'})
    if sku_name_wrap is not None:
        sku_name = sku_name_wrap.find('div', attrs={'class': 'sku-name'})
        if sku_name is not None:
            sql['commodity_Name'] = str(sku_name.text).strip()
    # product image
    spec_img = produc_soup.find('img', attrs={'id': 'spec-img'})
    if spec_img is None:
        spec_img = 'NULL'
    else:
        spec_img = spec_img['data-origin']
    imageURL = f"https:{spec_img}"
    if 'NULL' in imageURL:
        sql['image'] = "NULL"
    else:
        sql['image'] = imageURL
    # product specification list
    parameter_list = produc_soup.find('ul', attrs={'class': 'parameter2 p-parameter-list'})
    if parameter_list is not None:
        li_all_parameter = parameter_list.findAll('li')
        for li in li_all_parameter:
            text = str(li.text)
            if '商品名称:' in text:
                sql['name'] = text.replace('商品名称:', '')
            elif '商品编号:' in text:
                pass  # skuid is already taken from the database row
            elif '商品毛重:' in text:
                sql['netweight'] = text.replace('商品毛重:', '')
            elif '商品产地:' in text:
                sql['originplace'] = text.replace('商品产地:', '')
            elif '产品类别:' in text:
                sql['category'] = text.replace('产品类别:', '')
            elif '冰点:' in text:
                sql['freezing'] = text.replace('冰点:', '')
            elif '包装规格:' in text:
                sql['package'] = text.replace('包装规格:', '')
            elif '干湿沸点:' in text:
                sql['boiling'] = text.replace('干湿沸点:', '')
            elif '销售规格:' in text:
                sql['sales'] = text.replace('销售规格:', '')
            elif '安装位置:' in text:
                sql['installation'] = text.replace('安装位置:', '')
            elif '变速箱类型:' in text:
                sql['transmission'] = text.replace('变速箱类型:', '')
    print(sql)
    # assemble the SET clause; the last key also appends the WHERE clause
    for i in sql:
        if len(str(sql[i])) == 0:
            sql[i] = 'NULL'
        if i != "transmission":
            db += f"{i}='{sql[i]}',"
        else:
            db += (f"{i}='{sql[i]}' WHERE skuid='{product_sku}' AND brand='{brand_name}' "
                   f"AND price='{product_Price}' AND url='{https_li_href}';")
    # option 1: generate the statements first and load them later with `source`:
    # with open('E:\\xxuan_car_jd_mobil_product.txt', 'a', encoding='utf-8') as w:
    #     w.write(db + '\r')
    # option 2: hand the statement back to the caller, which executes it directly
    return db


def connectMysql():
    conn = pymysql.connect(
        host='localhost',
        user='root',
        password='root',
        db='jd_qipei',
        charset='utf8',
        autocommit=True,  # commit automatically; equivalent to calling conn.commit()
    )
    cur = conn.cursor()
    select_sql = 'select * from `xxuan_car_jd_ycj_product` where name ="NULL"'
    cur.execute(select_sql)
    ALL_NULL = cur.fetchall()
    for Null in ALL_NULL:
        fields = str(Null).split(',')
        sku = fields[1].replace("'", "").strip()
        brand = fields[3].replace("'", "").strip()
        href = fields[5].replace("'", "").strip()
        price = fields[8].replace("'", "").strip()
        db = getProduct(https_li_href=href, brand_name=brand,
                        product_sku=sku, product_Price=price)
        cur.execute(db)
        conn.commit()


if __name__ == '__main__':
    connectMysql()
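Every script also carries a commented-out branch that appends the generated statements to a text file, so the whole batch can be replayed later from the MySQL client with `source` instead of executing row by row. A sketch of that variant (the path is the one that appears in the comments):

# Sketch: collect the generated UPDATE statements in a file for batch replay.
def dump_statement(statement, path='E:\\xxuan_car_jd_mobil_product.txt'):
    with open(path, 'a', encoding='utf-8') as w:
        w.write(statement + '\n')

The file can then be loaded in one go from the mysql prompt, e.g. source E:/xxuan_car_jd_mobil_product.txt.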

Summary

I hope this helps everyone. Thank you!
