Automate Daily Lyrics Emails with Python: Web Scraping + SMTP

This tutorial shows how to use Python to crawl song lyrics from a website, format them, and automatically email them on a schedule, including full source code and instructions for setting up a Windows Task Scheduler job.

Introduction

The author, a Python enthusiast, shares a handy script that periodically sends song lyrics to a specified email address.

Implementation Idea

The solution consists of two parts: a Python web crawler that fetches lyric information from an online source and stores it in a variable, and an email‑sending routine that builds a mail template and delivers the lyrics. Scheduling can be handled with Windows Task Scheduler.

Implementation Process

The complete Python code is provided below.

import json
import random
import smtplib
import requests
import parsel
from email.mime.text import MIMEText
from email.header import Header

account = input('Enter your email address: ')
password = input('Enter your email authorization code: ')
receiver = input("Enter the recipient's email address: ")

def recipe_spider():
    # The token/cookie pair below is session-specific and may need refreshing.
    headers = {
        'Cookie': 'kw_token=6A3S4588YMS',
        'csrf': '6A3S4588YMS'
    }
    # Top-30 chart from the Kuwo ranking API.
    url = 'http://www.kuwo.cn/api/www/bang/bang/musicList?bangId=93&pn=1&rn=30'
    resp = requests.get(url, headers=headers)
    text_dict = json.loads(resp.text)
    music_list = []
    for music in text_dict['data']['musicList']:
        music_list.append({
            'musicrid': music['musicrid'].split('_')[1],  # e.g. 'MUSIC_123' -> '123'
            'name': music['name']
        })
    # Pick one song at random from the chart.
    chosen = random.choice(music_list)
    url1 = 'http://www.kuwo.cn/play_detail/' + chosen['musicrid']
    html_data = requests.get(url=url1).text
    sel = parsel.Selector(html_data)
    name = sel.xpath('//*[@class="song_name flex_c"]/span/text()').get().strip()
    lyric = sel.xpath('//*[@id="lyric"]/div/p/text()').getall()
    return name, '\n'.join(lyric)

def send_email(name, lyric1):
    mailhost = 'smtp.qq.com'
    qqmail = smtplib.SMTP_SSL(mailhost, 465)
    qqmail.login(account, password)
    content = "Dear, today's song is: " + name + '\n\n' + lyric1
    message = MIMEText(content, 'plain', 'utf-8')
    message['Subject'] = Header('What to listen to today (lyrics attached)', 'utf-8')
    try:
        qqmail.sendmail(account, receiver, message.as_string())
        print('Email sent successfully')
    except smtplib.SMTPException:
        print('Failed to send email')
    qqmail.quit()

def job():
    print('Starting a run')
    name, lyric1 = recipe_spider()
    send_email(name, lyric1)
    print('Run complete')

if __name__ == '__main__':
    job()
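To make the parsing step concrete, here is a self-contained sketch of how the musicrid and name fields are pulled out of the ranking API's JSON. The payload below is an invented sample that only mirrors the structure of the real response:

```python
import json
import random

# Invented sample payload mirroring the structure of the Kuwo ranking API response.
sample = json.loads('''
{"data": {"musicList": [
    {"musicrid": "MUSIC_111", "name": "Song A"},
    {"musicrid": "MUSIC_222", "name": "Song B"},
    {"musicrid": "MUSIC_333", "name": "Song C"}
]}}
''')

# Strip the "MUSIC_" prefix to get the numeric id used in the play_detail URL.
music_list = [
    {'musicrid': m['musicrid'].split('_')[1], 'name': m['name']}
    for m in sample['data']['musicList']
]
chosen = random.choice(music_list)
print('http://www.kuwo.cn/play_detail/' + chosen['musicrid'])
```

Because the script only needs the numeric id and the title, the list comprehension keeps just those two fields per song.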

After entering your email address, authorization code, and the recipient’s address, running the script sends an email containing the song name and its lyrics.
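If you want to check the message formatting without contacting the SMTP server, you can build the MIMEText message offline first. This is a minimal sketch with invented placeholder content standing in for the crawled song name and lyrics:

```python
from email.mime.text import MIMEText
from email.header import Header

# Placeholder values; a real run uses the name and lyrics returned by the crawler.
content = "Dear, today's song is: Sample Song\n\nfirst line of lyrics\nsecond line of lyrics"
message = MIMEText(content, 'plain', 'utf-8')
message['Subject'] = Header('What to listen to today (lyrics attached)', 'utf-8')

# Inspect the wire format that sendmail() would transmit.
print(message.as_string())
```

The printed output shows the headers and the UTF-8-encoded body exactly as they would be sent.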

You can create a recurring task in Windows Task Scheduler to run the script automatically each day: open Task Scheduler, create a new task, set a daily trigger at your preferred time, and point the action at your Python interpreter with the script's path as its argument.

Conclusion

The article demonstrates a small project that combines Python web crawling and automated email sending to deliver daily lyrics, and shows how to schedule it with Windows Task Scheduler.

Written by

Python Crawling & Data Mining

Life's short, I code in Python. This channel shares Python web crawling, data mining, analysis, processing, visualization, automated testing, DevOps, big data, AI, cloud computing, machine learning tools, resources, news, technical articles, tutorial videos and learning materials. Join us!
