Python Web Scraping -- Using bs4 to Scrape Data and Save It to a SQL Database

Target URL: http://bang.dangdang.com/books/bestsellers/

• Import the packages

import pymysql
import requests
from bs4 import BeautifulSoup
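
If any of these imports fail, all three packages can be installed from PyPI (note that the bs4 module is provided by the beautifulsoup4 package):

pip install requests beautifulsoup4 pymysql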

• Set the target URL and fetch the page as text

url = 'http://bang.dangdang.com/books/bestsellers/'
html = requests.get(url).text
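
If the page comes back garbled or the request gets blocked, a common tweak is to send a browser-like User-Agent header and let requests fall back to its detected encoding. A sketch of that variant (the User-Agent string here is just an example placeholder):

headers = {'User-Agent': 'Mozilla/5.0'}  # any common browser UA string works
response = requests.get(url, headers=headers)
response.encoding = response.apparent_encoding  # use the detected encoding if the declared one is wrong
html = response.text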

• Connect to the database

db = pymysql.connect(host='localhost', user='root', password='123456', port=3306, db='xxx')
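
The titles being scraped are Chinese, so it is worth being explicit about the connection character set; pymysql accepts a charset argument for this. The same call with that added (credentials and database name unchanged from above):

db = pymysql.connect(host='localhost', user='root', password='123456',
                     port=3306, db='xxx', charset='utf8mb4')  # utf8mb4 stores Chinese text safely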

• Get a cursor

cursor = db.cursor()  # get a cursor

• Create the data table (only if it does not already exist)

sql = 'CREATE TABLE IF NOT EXISTS xxxxx(' \
          'title VARCHAR(255) NOT NULL ,' \
          'level1 VARCHAR(255) NOT NULL ,' \
          'smmm VARCHAR(255) NOT NULL )'
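
The table above has no primary key, so running the script repeatedly will simply append duplicate rows. If that matters, one option is to add an auto-increment id column, for example:

sql = ('CREATE TABLE IF NOT EXISTS xxxxx('
       'id INT AUTO_INCREMENT PRIMARY KEY,'
       'title VARCHAR(255) NOT NULL,'
       'level1 VARCHAR(255) NOT NULL,'
       'smmm VARCHAR(255) NOT NULL)')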

• Execute the statement

cursor.execute(sql)

• Scrape the data with bs4 and insert it into the table

soup = BeautifulSoup(html, 'html.parser')
div2 = soup.find('ul', class_='bang_list clearfix bang_list_mode')  # the bestseller list <ul>
div = div2.find_all('li')  # one <li> per book
for i in div:
    title = i.find('div', class_='name').text.strip()
    level1 = i.find('div', class_='star').text.strip()
    sm = i.find_all('div', class_='publisher_info')
    smm = sm[0].text.strip()   # first publisher_info block
    smmm = sm[1].text.strip()  # second publisher_info block
    # parameterized query, so quotes in the scraped text cannot break the SQL
    sql = 'INSERT INTO xxxxx(title, level1, smmm) VALUES (%s, %s, %s)'
    cursor.execute(sql, (title, level1, smmm))  # execute the SQL statement
    db.commit()  # commit the transaction
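
Committing inside the loop works, but another common pattern is to collect the rows first, insert them in one batch with executemany, commit once, and close the connection afterwards. A sketch of that variant, reusing the same soup, cursor, and column names as above:

rows = []
for i in soup.find('ul', class_='bang_list clearfix bang_list_mode').find_all('li'):
    sm = i.find_all('div', class_='publisher_info')
    rows.append((i.find('div', class_='name').text.strip(),
                 i.find('div', class_='star').text.strip(),
                 sm[1].text.strip()))

try:
    cursor.executemany('INSERT INTO xxxxx(title, level1, smmm) VALUES (%s, %s, %s)', rows)
    db.commit()  # one commit for the whole batch
finally:
    cursor.close()  # release the cursor and the connection when done
    db.close()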

Here is the complete code:

import pymysql
import requests
from bs4 import BeautifulSoup

url = 'http://bang.dangdang.com/books/bestsellers/'
html = requests.get(url).text

db = pymysql.connect(host='localhost', user='root', password='123456', port=3306, db='xxx')
cursor = db.cursor()  # get a cursor
sql = 'CREATE TABLE IF NOT EXISTS xxxxx(' \
          'title VARCHAR(255) NOT NULL ,' \
          'level1 VARCHAR(255) NOT NULL ,' \
          'smmm VARCHAR(255) NOT NULL )'

cursor.execute(sql)

soup = BeautifulSoup(html, 'html.parser')
div2 = soup.find('ul', class_='bang_list clearfix bang_list_mode')  # the bestseller list <ul>
div = div2.find_all('li')  # one <li> per book
for i in div:
    title = i.find('div', class_='name').text.strip()
    level1 = i.find('div', class_='star').text.strip()
    sm = i.find_all('div', class_='publisher_info')
    smm = sm[0].text.strip()   # first publisher_info block
    smmm = sm[1].text.strip()  # second publisher_info block
    # parameterized query, so quotes in the scraped text cannot break the SQL
    sql = 'INSERT INTO xxxxx(title, level1, smmm) VALUES (%s, %s, %s)'
    cursor.execute(sql, (title, level1, smmm))  # execute the SQL statement
    db.commit()  # commit the transaction

Run result:

[Figure 1: screenshot of the run result]


That's it! I hope this helps. If you have any questions or run into problems, feel free to leave a comment below~
