Python: Analyzing Weibo Influencer Photos with a Face Recognition API
Foreword
Before diving in, a word about Weibo influencers. With the rise of social media, Weibo has become an essential platform for users to share their thoughts, ideas, and photos. In this article, we will scrape photos posted by Weibo influencers and run them through a face recognition API to score them.
Page Analysis
The first step is to collect data from the Weibo influencer feed. The page at https://weibo.com/a/hot/7549094253303809_1.html is a simple static page, so we can fetch it with requests and parse the HTML directly with the lxml library.
import requests
from lxml import etree
import re
headers = {'Cookie': ''} # Replace with your own cookie
url = 'https://weibo.com/a/hot/7549094253303809_1.html'
res = requests.get(url, headers=headers)
html = etree.HTML(res.text)
infos = html.xpath('//div[@class="UG_list_a"]')
Crawler Code
Based on the ideas discussed above, we can write the crawler to collect data and download the images.
import os

os.makedirs('row_img', exist_ok=True)  # Make sure the output folder exists

for info in infos:
    name = info.xpath('div[2]/a[2]/span/text()')[0]
    content = info.xpath('h3/text()')[0].strip()
    imgs = info.xpath('div[@class="list_nod clearfix"]/div/img/@src')
    print(name, content)
    i = 1
    for img in imgs:
        # Swap the thumbnail size (thumb180) for the full-size image (mw690)
        href = 'https:' + img.replace('thumb180', 'mw690')
        print(href)
        res_1 = requests.get(href, headers=headers)
        with open('row_img/' + name + '+' + content + '+' + str(i) + '.jpg', 'wb') as fp:
            fp.write(res_1.content)
        i += 1
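One caveat: the filename above is built from the post's author name and text, which may contain characters that are invalid in filenames (slashes, colons, and so on). A small sanitizer, sketched here as a hypothetical helper using the re module already imported above, can be applied to name and content before building the path:

```python
import re

def safe_filename(text, max_len=50):
    """Replace filesystem-unsafe characters with underscores and cap the length."""
    return re.sub(r'[\\/:*?"<>|\s]+', '_', text)[:max_len]

print(safe_filename('midnight snack: noodles/photos'))  # midnight_snack_noodles_photos
```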
Face Recognition API
Before we can use the face recognition API, we need to create an application on Baidu's AI platform at http://ai.baidu.com/tech/face. This gives us an API Key and a Secret Key, which we exchange for an access token.
import requests
ak = '' # Replace with your API Key
sk = '' # Replace with your Secret Key
host = 'https://aip.baidubce.com/oauth/2.0/token?grant_type=client_credentials&client_id={}&client_secret={}'.format(ak, sk)
res = requests.post(host)
print(res.text)
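The response is JSON, and the access_token field is what the later detect requests need. A minimal sketch of pulling it out, using an illustrative payload (not a real token) shaped like the OAuth response:

```python
import json

def extract_access_token(response_text):
    """Pull the access_token field out of the OAuth response JSON."""
    return json.loads(response_text).get('access_token')

# Illustrative payload shaped like the OAuth response (not a real token)
sample = '{"access_token": "24.example", "expires_in": 2592000}'
print(extract_access_token(sample))  # 24.example
```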
Integrated Use
Now that we have collected the influencer photos and obtained a token from the face recognition API, we can combine the two to score the images.
import requests
import os
import base64
import json
import time
def get_img_base(file):
    # Read an image file and return its Base64-encoded bytes
    with open(file, 'rb') as fp:
        return base64.b64encode(fp.read())
file_path = 'row_img'
list_paths = os.listdir(file_path)

token = '24.890f5b6340903be0642f9643559aa7a1.2592000.1557979582.282335-15797955'  # Replace with the access_token obtained above
request_url = 'https://aip.baidubce.com/rest/2.0/face/v3/detect' + '?access_token=' + token

for list_path in list_paths:
    img_path = file_path + '/' + list_path
    print(img_path)
    params = {'image': get_img_base(img_path), 'image_type': 'BASE64', 'face_field': 'age,beauty,gender'}
    res = requests.post(request_url, data=params)
    json_result = json.loads(res.text)
    code = json_result['error_code']
    if code == 222202:  # No face detected in this picture
        continue
    try:
        gender = json_result['result']['face_list'][0]['gender']['type']
        if gender == 'male':
            continue
        beauty = json_result['result']['face_list'][0]['beauty']
        new_beauty = round(beauty / 10, 1)  # Convert the 0-100 score to a 0-10 scale
        print(img_path, new_beauty)
        if new_beauty >= 8:
            os.rename(os.path.join(file_path, list_path), os.path.join('8 points', str(new_beauty) + '+' + list_path))
        elif new_beauty >= 7:
            os.rename(os.path.join(file_path, list_path), os.path.join('7 points', str(new_beauty) + '+' + list_path))
        elif new_beauty >= 6:
            os.rename(os.path.join(file_path, list_path), os.path.join('6 points', str(new_beauty) + '+' + list_path))
        elif new_beauty >= 5:
            os.rename(os.path.join(file_path, list_path), os.path.join('5 points', str(new_beauty) + '+' + list_path))
        else:
            os.rename(os.path.join(file_path, list_path), os.path.join('other points', str(new_beauty) + '+' + list_path))
    except (KeyError, TypeError):
        pass
    time.sleep(1)  # Stay under the API rate limit
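The chain of score comparisons that sorts images into folders could also be condensed into a small helper. This is a hypothetical refactor, not part of the original script; the folder names follow the buckets above:

```python
def score_folder(score):
    """Map a 0-10 score to the destination folder used for sorting."""
    for threshold, folder in ((8, '8 points'), (7, '7 points'),
                              (6, '6 points'), (5, '5 points')):
        if score >= threshold:
            return folder
    return 'other points'

print(score_folder(8.3), score_folder(4.9))  # 8 points other points
```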
This article has demonstrated how to scrape photos from Weibo influencer pages and integrate a face recognition API to score them. The code has been kept short, and the explanations simple and easy to follow.