A crawler that scrapes 40-day weather data from http://www.weather.com.cn/weather40dn/101250804.shtml, covering the network request, HTML parsing, data cleaning, and saving the data to a local Excel file, with clear code comments
Posted: 2024-04-02 11:34:14
Sure, here is a Python implementation:
```python
import requests
from bs4 import BeautifulSoup
import pandas as pd

# Network request: fetch the page content
def get_html(url):
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3'}
    response = requests.get(url, headers=headers, timeout=10)
    if response.status_code == 200:
        response.encoding = 'utf-8'  # avoid garbled Chinese text
        return response.text
    return None

# Parse the weather data out of the HTML.
# Note: weather.com.cn renders the 40-day view with JavaScript, so the
# selector below actually matches the site's 7-day forecast markup
# (ul.t.clearfix); the 40-day page itself returns no such list in its
# static HTML.
def get_weather_data(html):
    soup = BeautifulSoup(html, 'html.parser')
    weather_data = []  # collected rows
    days = soup.find('ul', class_='t clearfix').find_all('li')
    for day in days:
        # Date
        date = day.find('h1').get_text(strip=True)
        paragraphs = day.find_all('p')
        # Weather description
        weather = paragraphs[0].get_text(strip=True)
        # Temperature, e.g. "25℃/15℃"
        temperature = paragraphs[1].get_text(strip=True)
        # Wind, e.g. "北风 3级"
        wind = paragraphs[2].get_text(strip=True)
        # Append one row per day
        weather_data.append({'日期': date, '天气': weather, '温度': temperature, '风力': wind})
    return weather_data

# Data cleaning: split combined fields into separate columns
def data_cleaning(weather_data):
    for data in weather_data:
        # Split "25℃/15℃" into high and low temperature
        temperature_range = data['温度'].split('/')
        data['最高温度'] = temperature_range[0].replace('℃', '')
        data['最低温度'] = temperature_range[1].replace('℃', '') if len(temperature_range) > 1 else ''
        del data['温度']
        # Split "北风 3级" into direction and strength
        wind_parts = data['风力'].split(' ')
        data['风向'] = wind_parts[0]
        data['风力等级'] = wind_parts[1] if len(wind_parts) > 1 else ''
        del data['风力']
    return weather_data

# Save the data to an Excel file
def save_to_excel(weather_data):
    # Build a DataFrame and sort by date
    df = pd.DataFrame(weather_data)
    df = df.sort_values(by='日期')
    # ExcelWriter.save() was removed in pandas 2.0;
    # to_excel() manages the file itself
    df.to_excel('weather_data.xlsx', index=False)

if __name__ == '__main__':
    url = 'http://www.weather.com.cn/weather40dn/101250804.shtml'
    html = get_html(url)
    if html:
        weather_data = get_weather_data(html)
        weather_data = data_cleaning(weather_data)
        save_to_excel(weather_data)
        print('数据已保存至本地!')
    else:
        print('网络请求失败!')
```
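The data-cleaning step is the only pure string manipulation in the pipeline, so it can be sanity-checked without a network request. A minimal sketch, assuming the site returns temperatures like "25℃/15℃" and wind like "北风 3级" (both formats are assumptions here, not verified against the live page):

```python
def split_temperature(temperature):
    """Split a "high/low" temperature string into two bare numbers."""
    high, _, low = temperature.partition('/')
    return high.replace('℃', ''), low.replace('℃', '')

def split_wind(wind):
    """Split a "direction level" wind string into its two parts."""
    direction, _, power = wind.partition(' ')
    return direction, power

# Sample values are hypothetical, matching the assumed page format
print(split_temperature('25℃/15℃'))  # ('25', '15')
print(split_wind('北风 3级'))         # ('北风', '3级')
```

Using `str.partition` instead of `split` means a row with a missing low temperature or wind level degrades to an empty string rather than raising `IndexError`.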
The code is commented throughout and falls into four parts:
1. Network request: use the requests library to fetch the page HTML.
2. Extracting weather data: use BeautifulSoup to parse the HTML document and pull out the daily entries.
3. Data cleaning: split the combined temperature and wind fields into separate columns.
4. Saving to Excel: use pandas to write the data to a local Excel file.
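One caveat on step 4: `sort_values(by='日期')` sorts the date column as strings, and since the page labels days with values like "2日" and "10日", lexicographic order puts "10日" before "2日". A small sketch of a numeric sort key (the sample dates are hypothetical):

```python
import re

dates = ['10日', '2日', '8日（今天）']

# Lexicographic order compares character by character: '1' < '2',
# so '10日' incorrectly sorts first
print(sorted(dates))  # ['10日', '2日', '8日（今天）']

def day_number(date):
    """Extract the leading day-of-month digits as an integer."""
    match = re.match(r'(\d+)', date)
    return int(match.group(1)) if match else 0

print(sorted(dates, key=day_number))  # ['2日', '8日（今天）', '10日']
```

In the DataFrame this would translate to something like `df.sort_values(by='日期', key=lambda s: s.map(day_number))`, which pandas supports via the `key` parameter of `sort_values`.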