Automate Daily Recipe Emails with Python Web Scraping

This tutorial shows how to build a Python script that scrapes daily recipe articles from a website, formats the content, and automatically sends it via email using SMTP, providing a practical example of web scraping and email automation.

Python Crawling & Data Mining

Hello, I'm a Python enthusiast.

Introduction

Recently a friend shared an interesting script that periodically emails recipe recommendations. The implementation is below.

Implementation Idea

The idea is simple: first use a Python web scraper to fetch daily articles from a website, store the content in a variable, then compose an email using a template and send it.
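The scrape-format-send pipeline can be sketched with the formatting step isolated as a pure function. The function name `format_entries` and the sample data are illustrative only, not part of the final script:

```python
# A minimal sketch of the "format" step: turn scraped (name, summary, url)
# tuples into one numbered plain-text email body. Names here are
# hypothetical, not from the final script.

def format_entries(entries):
    """Number each (name, summary, url) tuple and join into one body."""
    lines = []
    for num, (name, summary, url) in enumerate(entries, start=1):
        lines.append('%s. %s\n %s\n Link: %s' % (num, name, summary, url))
    return '\n'.join(lines)

# Example: two fake entries produce a numbered plain-text body.
body = format_entries([
    ('Braised pork', 'A weekend classic', 'https://example.com/1'),
    ('Tomato soup', 'Quick and light', 'https://example.com/2'),
])
print(body)
```

Keeping the formatting separate from the network code makes it easy to test without hitting the website.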

Implementation Process

Here is the full code:

import bs4
import requests
import schedule
import smtplib
import time
from email.header import Header
from email.mime.text import MIMEText

# Prompt for the sender account, SMTP password (authorization code)
# and recipient at startup.
account = input('Enter your email address: ')
password = input('Enter your email password (authorization code): ')
receiver = input("Enter the recipient's email address: ")

def recipe_spider():
    """Scrape the first 10 pages of top recipes and return them as text."""
    list_all = ''
    num = 0
    headers = {'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/71.0.3578.98 Safari/537.36'}
    for a in range(1, 11):
        url = 'https://home.meishichina.com/show-top-type-recipe-page-{0}.html'.format(a)
        res_foods = requests.get(url, headers=headers)
        bs_foods = bs4.BeautifulSoup(res_foods.text, 'html.parser')
        list_foods = bs_foods.find('div', class_='space_left')
        for food in list_foods.find_all('li'):
            num = num + 1
            name = food.find('h2').text.strip()
            foods = food.find('p', class_='subcontent').text.strip()
            url_food = food.find('a')['href'].strip()
            # Append a numbered entry: name, summary and link.
            food_info = '\n%s. %s\n %s\n Link: %s\n' % (num, name, foods, url_food)
            list_all = list_all + food_info
    return list_all

def send_email(list_all):
    """Send the collected recipes through QQ Mail's SMTP-over-SSL server."""
    global account, password, receiver
    mailhost = 'smtp.qq.com'
    qqmail = smtplib.SMTP_SSL(mailhost, 465)
    qqmail.login(account, password)
    content = "Dear friend, here are this week's most popular recipes:" + list_all
    message = MIMEText(content, 'plain', 'utf-8')
    subject = 'What to cook this weekend - Meishichina'
    message['Subject'] = Header(subject, 'utf-8')
    try:
        qqmail.sendmail(account, receiver, message.as_string())
        print('Email sent successfully')
    except smtplib.SMTPException:
        print('Failed to send email')
    qqmail.quit()

def job():
    print('Starting a task run')
    list_all = recipe_spider()
    send_email(list_all)
    print('Task finished')

if __name__ == '__main__':
    job()
    # To keep the script running and send on a schedule instead,
    # uncomment the lines below (0.05 minutes, i.e. every 3 seconds,
    # is only useful for testing):
    # schedule.every(0.05).minutes.do(job)
    # while True:
    #     schedule.run_pending()
    #     time.sleep(1)

After running the script, the recipient will receive an email containing the fetched recipes.
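If you run the script unattended, typing credentials at a prompt is impractical. One common alternative is reading them from environment variables; the variable names `MAIL_ACCOUNT`, `MAIL_PASSWORD`, and `MAIL_RECEIVER` below are hypothetical choices, not anything the script requires:

```python
import os

# Hypothetical environment variable names; set them in your shell first,
# e.g. export MAIL_ACCOUNT='you@example.com'.
# setdefault() supplies a demo value only so this sketch runs standalone.
os.environ.setdefault('MAIL_ACCOUNT', 'you@example.com')

account = os.environ['MAIL_ACCOUNT']
password = os.environ.get('MAIL_PASSWORD', '')
receiver = os.environ.get('MAIL_RECEIVER', account)
print(account)
```

This keeps the password out of the source file and out of your shell history.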

You can also schedule the script to send yourself weekly reminders.
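For a weekly reminder without the `schedule` library, the next send time can be computed with the standard-library datetime module. The function below is a sketch under that assumption; `next_weekly_run` is an illustrative name:

```python
from datetime import datetime, timedelta

def next_weekly_run(now, weekday=5, hour=9):
    """Return the next occurrence of the given weekday (0=Monday) at hour:00."""
    days_ahead = (weekday - now.weekday()) % 7
    candidate = (now + timedelta(days=days_ahead)).replace(
        hour=hour, minute=0, second=0, microsecond=0)
    if candidate <= now:
        # Already past this week's slot; move to next week.
        candidate += timedelta(days=7)
    return candidate

# Example: from Wednesday 2024-05-01 noon, the next Saturday-09:00 slot.
run_at = next_weekly_run(datetime(2024, 5, 1, 12, 0))
print(run_at)  # 2024-05-04 09:00:00
```

You could then sleep until `run_at` and call `job()`, or simply feed the same idea to `schedule` if you prefer the library.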

Conclusion

This article demonstrates a small project that combines Python web scraping and automated email sending to deliver daily recipe recommendations.
