Build a Python Bot That Sends Personalized Love Letters with GIFs via Email
This tutorial shows how to use Python to scrape GIF images, automatically generate romantic messages, obtain AI‑generated replies, and send the whole package as an email through a 163 SMTP server, providing a complete, runnable example.
Introduction
This article demonstrates a playful Python project that combines web scraping, text generation, AI chat, and email automation to simulate a virtual girlfriend sending personalized love letters with GIFs.
1. Scrape GIF Images
Three image URLs (hosted on Baidu; note they end in `.jpg` even though the files are saved with a `.gif` extension) are stored in a list; one is picked at random, downloaded with requests.get(url).content, and written to disk.
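Before the original function, here is a slightly more defensive sketch of the same download step (the helper name `download_random_gif` is mine, not from the original); it adds a request timeout and an HTTP status check, and returns the saved path instead of relying on a global index:

```python
import random
import requests

def download_random_gif(urls, prefix="wbb", timeout=10):
    """Download a randomly chosen URL to '<prefix><index>.gif' and return the path."""
    i = random.randrange(len(urls))            # index doubles as the filename suffix
    resp = requests.get(urls[i], timeout=timeout)
    resp.raise_for_status()                    # fail loudly instead of saving an error page
    path = f"{prefix}{i}.gif"
    with open(path, "wb") as f:
        f.write(resp.content)
    return path
```

Returning the path makes the function easier to reuse and test than the global-variable approach used below.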
```python
def getbb():
    w0 = 'https://img1.baidu.com/it/u=1762637264,598758602&fm=26&fmt=auto&gp=0.jpg'
    w1 = 'https://img1.baidu.com/it/u=2231058723,1803013600&fm=11&fmt=auto&gp=0.jpg'
    w2 = 'https://img0.baidu.com/it/u=3960011140,3634140813&fm=11&fmt=auto&gp=0.jpg'
    wlist = [w0, w1, w2]
    global i
    i = random.randint(0, 2)  # random GIF index, shared with sendemail() via the global
    url = wlist[i]
    req = requests.get(url).content
    with open(f'wbb{i}.gif', 'wb') as p:
        p.write(req)
```

2. Generate Romantic Phrases
The function fetches a random page from a Chinese poetry site, parses it with BeautifulSoup, extracts love phrases, and returns one randomly.
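The parsing step is easiest to see on a tiny inline document first. The sketch below mimics the page structure the scraper expects (the markup here is invented for illustration) and uses the stdlib `html.parser` so it runs without lxml:

```python
from bs4 import BeautifulSoup

# Minimal HTML mimicking the structure the scraper expects.
html = """
<div class="list-short-article">
  <a target="_blank"> Phrase one </a>
  <a target="_blank">Phrase two</a>
  <a href="#">not a phrase link</a>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
# Only anchors with target="_blank" inside the list container are phrases.
links = soup.find("div", class_="list-short-article").find_all("a", {"target": "_blank"})
phrases = [a.text.strip() for a in links]
print(phrases)  # ['Phrase one', 'Phrase two']
```

The real function below does exactly this, just against the live page.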
```python
def getwords():
    texts = []
    url = 'https://www.duanwenxue.com/huayu/lizhi/list_{}.html'.format(random.randint(1, 114))
    response = requests.get(url)
    texts.append(response.text)
    articles = []
    for text in texts:
        soup = BeautifulSoup(text, 'lxml')
        arttis = soup.find('div', class_='list-short-article').find_all('a', {'target': '_blank'})
        articles.extend([a.text.strip() for a in arttis])
    todaywords = random.choice(articles)  # pick one phrase at random
    return todaywords
```

3. AI Reply via Qingyunke
The selected phrase is sent to the free Qingyunke chatbot API, which returns an AI‑generated reply.
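One caveat: interpolating the phrase straight into the URL breaks when it contains spaces or other reserved characters. Letting requests build the query string handles the percent-encoding; this is a suggested variant (the name `qingyunke_safe` is mine), not the original code:

```python
import requests

def qingyunke_safe(msg):
    # requests percent-encodes each parameter value for us.
    resp = requests.get(
        "http://api.qingyunke.com/api.php",
        params={"key": "free", "appid": 0, "msg": msg},
        timeout=10,
    )
    return resp.json()["content"]

# The encoding can be inspected without hitting the network:
prepared = requests.Request(
    "GET",
    "http://api.qingyunke.com/api.php",
    params={"key": "free", "appid": 0, "msg": "hello world"},
).prepare()
print(prepared.url)
```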
```python
def qingyunke(msg):
    url = f'http://api.qingyunke.com/api.php?key=free&appid=0&msg={msg}'
    html = requests.get(url)
    return html.json()["content"]
```

4. Send Email with 163 SMTP
The script composes an HTML email that includes the love phrase, the AI reply, and the previously downloaded GIF, then sends it through the 163.com SMTP server. You need to fill in your 163 account name and the SMTP authorization code generated in the mailbox settings (not the login password).
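For the GIF to render inside the message body rather than appear as a bare attachment, the HTML must reference the image part through its Content-ID. A self-contained sketch of that linkage (dummy bytes stand in for the real GIF):

```python
from email.mime.image import MIMEImage
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

msg = MIMEMultipart()

# HTML body pointing at the image via the cid: scheme.
msg.attach(MIMEText('<p>Hello!</p><img src="cid:image1">', "html"))

# Image part whose Content-ID matches the cid reference (angle brackets required).
img = MIMEImage(b"GIF89a...", _subtype="gif")
img.add_header("Content-ID", "<image1>")
msg.attach(img)

print("cid:image1" in msg.as_string())  # True
```

The function below follows the same pattern.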
```python
def sendemail():
    msgword = getwords()
    res = qingyunke(msgword)
    xhx = 'your_163_email'            # 163 account name (the part before @163.com)
    pwd = 'your_authorization_code'   # SMTP authorization code, not the login password
    wy163list = [xhx]
    host_server = 'smtp.163.com'
    sender = f'{xhx}@163.com'
    receiver = f'{wy163list[0]}@163.com'
    mail_title = 'Ice Ice sends you a new email'
    # Reference the attached GIF via its Content-ID so it renders inline.
    mail_content = (f'Dear {wy163list[0]}, I am Ice Ice.'
                    f'<p>Your previous message: {msgword}</p>'
                    f'<p>My reply: {res}</p>'
                    '<p><img src="cid:image1"></p>')
    msg = MIMEMultipart()
    global i
    with open(f'wbb{i}.gif', 'rb') as f:
        msgImage = MIMEImage(f.read())
    msgImage.add_header('Content-ID', '<image1>')
    msg.attach(msgImage)
    msg['Subject'] = Header(mail_title, 'utf-8')
    msg['From'] = Header('Ice Ice', 'utf-8')
    msg['To'] = receiver
    msg.attach(MIMEText(mail_content, 'html'))
    try:
        smtp = SMTP_SSL(host_server)
        smtp.set_debuglevel(1)
        smtp.ehlo(host_server)
        smtp.login(sender, pwd)
        smtp.sendmail(sender, receiver, msg.as_string())
        smtp.quit()
        print('Email sent successfully')
    except smtplib.SMTPException:
        print('Failed to send email')
```

5. Full Script
The complete code combines the above functions and runs them when the script is executed.
```python
# -*- coding: utf-8 -*-
from bs4 import BeautifulSoup
import random, requests, smtplib
from smtplib import SMTP_SSL
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from email.header import Header
from email.mime.image import MIMEImage

# Functions getbb, getwords, qingyunke and sendemail are defined as above.

if __name__ == '__main__':
    getbb()
    sendemail()
```

Conclusion
The project showcases how Python can be used for web scraping, text processing, AI interaction, and email automation, providing a fun example that can be extended or adapted for other automated messaging tasks.
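As one possible extension, the bot can become a recurring sender with nothing but the standard library. The sketch below (the helper name and parameters are mine) reruns an arbitrary action on a fixed interval using `sched`:

```python
import sched
import time

def run_repeatedly(action, interval_seconds=24 * 60 * 60, runs=None):
    """Run `action` every `interval_seconds`; `runs` caps the iterations (None = forever)."""
    s = sched.scheduler(time.time, time.sleep)
    count = 0

    def tick():
        nonlocal count
        action()
        count += 1
        if runs is None or count < runs:
            s.enter(interval_seconds, 1, tick)  # reschedule the next run

    s.enter(0, 1, tick)
    s.run()  # blocks until the queue is empty
    return count
```

Calling `run_repeatedly(lambda: (getbb(), sendemail()))` would send one letter per day; the `runs` argument mainly exists to keep test runs finite.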
