```
import requests
from bs4 import BeautifulSoup
from pprint import pprint

for page in range(5):
    url = "https://movie.douban.com/subject/30391186/comments?start={}&limit=20&sort=new_score&status=P".format(page * 20)
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3"
    }
    response = requests.get(url, headers=headers)
    soup = BeautifulSoup(response.text, "html.parser")
    comments = []
    for comment in soup.select(".comment-item"):
        content = comment.select_one(".comment-content").get_text().strip()
        author = comment.select_one(".comment-info a").get_text().strip()
        likes = comment.select_one(".votes").get_text().strip()
        time = comment.select_one(".comment-time").get_text().strip()
        location = comment.select_one(".comment-location").contents[-1].strip()
        comments.append({
            "content": content,
            "author": author,
            "likes": likes,
            "time": time,
            "location": location
        })
    print("Page {} data:".format(page + 1))
    pprint(comments)
```
I want to store this data in a database.
Time: 2024-04-03 22:35:36
Sure. You can use Python's built-in sqlite3 module to store the data in a local database. Follow these steps:
1. Import the sqlite3 module
```
import sqlite3
```
2. Create a database connection and open a cursor
```
conn = sqlite3.connect('comments.db')
cursor = conn.cursor()
```
3. Create the table
```
cursor.execute('''
    CREATE TABLE IF NOT EXISTS comments (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        content TEXT,
        author TEXT,
        likes TEXT,
        time TEXT,
        location TEXT
    )
''')
```
4. Insert the data
```
for comment in comments:
    cursor.execute('''
        INSERT INTO comments (content, author, likes, time, location)
        VALUES (?, ?, ?, ?, ?)
    ''', (comment['content'], comment['author'], comment['likes'],
          comment['time'], comment['location']))
```
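When a page yields many rows, `executemany` inserts the whole list in one call instead of looping in Python. A self-contained sketch against a throwaway in-memory database (the sample `comments` list here is made up for illustration):

```
import sqlite3

conn = sqlite3.connect(':memory:')  # in-memory DB just for this demo
cursor = conn.cursor()
cursor.execute('''
    CREATE TABLE IF NOT EXISTS comments (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        content TEXT, author TEXT, likes TEXT, time TEXT, location TEXT
    )
''')

# Illustrative stand-in for the list built by the scraper
comments = [
    {"content": "great film", "author": "user1", "likes": "12",
     "time": "2024-04-01", "location": "Beijing"},
    {"content": "so-so", "author": "user2", "likes": "3",
     "time": "2024-04-02", "location": "Shanghai"},
]

# executemany runs the INSERT once per parameter tuple
cursor.executemany('''
    INSERT INTO comments (content, author, likes, time, location)
    VALUES (?, ?, ?, ?, ?)
''', [(c['content'], c['author'], c['likes'], c['time'], c['location'])
      for c in comments])
conn.commit()

n = cursor.execute('SELECT COUNT(*) FROM comments').fetchone()[0]
print(n)  # 2
conn.close()
```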
5. Commit the changes and close the connection
```
conn.commit()
conn.close()
```
The comments will now be saved in a local SQLite database file named "comments.db".
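Since your scraper builds `comments` page by page inside a loop, the connection and CREATE TABLE should be set up once before the loop, with the inserts and a commit at the end of each page. The sketch below shows that shape; `fetch_comments` is a hypothetical stub standing in for your requests/BeautifulSoup code so the example runs offline:

```
import sqlite3

def fetch_comments(page):
    # Hypothetical stub: replace with the requests/BeautifulSoup
    # parsing loop from the question.
    return [{"content": "sample comment", "author": "user{}".format(page),
             "likes": "0", "time": "2024-04-03", "location": "Guangzhou"}]

conn = sqlite3.connect(':memory:')  # use 'comments.db' for an on-disk file
cursor = conn.cursor()
cursor.execute('''
    CREATE TABLE IF NOT EXISTS comments (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        content TEXT, author TEXT, likes TEXT, time TEXT, location TEXT
    )
''')

for page in range(5):
    for c in fetch_comments(page):
        cursor.execute('''
            INSERT INTO comments (content, author, likes, time, location)
            VALUES (?, ?, ?, ?, ?)
        ''', (c['content'], c['author'], c['likes'], c['time'], c['location']))
    conn.commit()  # commit once per scraped page

total = cursor.execute('SELECT COUNT(*) FROM comments').fetchone()[0]
print(total)  # 5 (one row per page in this stub)
conn.close()
```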