
Eight Ways to Implement Python Scheduled Tasks

This article presents a comprehensive guide to implementing periodic tasks in Python, covering eight approaches including simple while‑loop with sleep, Timeloop, threading.Timer, sched, schedule, APScheduler, Celery, and Apache Airflow, each with code examples and practical notes.


In daily work, periodic tasks come up constantly; Linux's crond can handle many of them, but Python offers more flexible alternatives. This guide details eight common Python scheduling methods.

Using while True + sleep()

The time.sleep(secs) function pauses the current thread for a specified number of seconds, allowing simple interval‑based scheduling via an infinite loop.

<code>import datetime
import time

def time_printer():
    now = datetime.datetime.now()
    ts = now.strftime('%Y-%m-%d %H:%M:%S')
    print('do func time :', ts)

def loop_monitor():
    while True:
        time_printer()
        time.sleep(5)  # pause 5 seconds

if __name__ == "__main__":
    loop_monitor()
</code>

Main drawbacks:

Only interval scheduling; cannot specify exact times like 08:00 daily.

sleep blocks the thread, preventing other operations during the pause.
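The first drawback can be worked around in plain stdlib code by computing how long to sleep until the next fixed clock time. A minimal sketch (the function names here are illustrative, not from a library):

```python
import datetime
import time

def seconds_until(hour, minute):
    """Seconds from now until the next occurrence of hour:minute."""
    now = datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += datetime.timedelta(days=1)  # time already passed today, use tomorrow
    return (target - now).total_seconds()

def run_daily_at(hour, minute, func):
    """Blockingly run func every day at hour:minute."""
    while True:
        time.sleep(seconds_until(hour, minute))
        func()
```

This still blocks the calling thread, so it shares the second drawback; it only shows that fixed-time scheduling is possible without extra dependencies.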

Using Timeloop library

Timeloop provides a decorator‑based API for running functions at fixed intervals.

<code>import time
from timeloop import Timeloop
from datetime import timedelta

tl = Timeloop()

@tl.job(interval=timedelta(seconds=2))
def sample_job_every_2s():
    print("2s job current time : {}".format(time.ctime()))

@tl.job(interval=timedelta(seconds=5))
def sample_job_every_5s():
    print("5s job current time : {}".format(time.ctime()))

@tl.job(interval=timedelta(seconds=10))
def sample_job_every_10s():
    print("10s job current time : {}".format(time.ctime()))

if __name__ == "__main__":
    tl.start(block=True)  # start the jobs and block the main thread
</code>

Using threading.Timer

threading.Timer(interval, function, args=None, kwargs=None) runs function once in a separate thread after interval seconds, so it does not block the caller; re-arming the timer inside the callback makes it periodic, and several timers can run concurrently.

<code># Example code omitted for brevity
</code>
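As a sketch of that pattern (the 5-second interval and function names are illustrative), the callback re-arms its own timer to get periodic behavior:

```python
import threading
import time

def time_printer():
    print('do func time :', time.ctime())
    # Timer fires only once, so re-arm it here to repeat every 5 seconds
    t = threading.Timer(5, time_printer)
    t.daemon = True  # don't keep the process alive just for the timer
    t.start()

if __name__ == "__main__":
    time_printer()   # first run; each run schedules the next
    time.sleep(12)   # keep the main thread alive to observe a few runs
```

Because each Timer runs in its own thread, the main thread stays free; the daemon flag ensures the chain of timers dies with the process.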

Using built‑in sched module

The sched.scheduler(timefunc, delayfunc) class implements a general‑purpose event scheduler: with time.time and time.sleep it delays function calls by a chosen number of seconds, and since Python 3.3 it is safe to use from multiple threads.

<code>import datetime
import time
import sched

s = sched.scheduler(time.time, time.sleep)  # create scheduler

def time_printer():
    now = datetime.datetime.now()
    ts = now.strftime('%Y-%m-%d %H:%M:%S')
    print('do func time :', ts)
    s.enter(5, 1, time_printer, ())  # re-schedule on the same scheduler instead of recursing

def loop_monitor():
    s.enter(5, 1, time_printer, ())  # delay 5s, priority 1
    s.run()  # blocks until the event queue is empty

if __name__ == "__main__":
    loop_monitor()
</code>

Using schedule library

Schedule is a lightweight third‑party module that allows human‑readable interval, daily, or custom time scheduling.

<code>import schedule
import time

def job():
    print("I'm working...")

schedule.every(10).seconds.do(job)        # every 10 seconds
schedule.every().day.at("10:30").do(job)  # every day at 10:30

while True:
    schedule.run_pending()  # run any jobs that are due
    time.sleep(1)
</code>

Using APScheduler

APScheduler (Advanced Python Scheduler) is a Quartz‑inspired framework offering date, interval, and cron‑style triggers, with optional job persistence. Its four building blocks are:

Trigger (scheduling logic)

Job store (in‑memory or external DB)

Executor (thread or process pool)

Scheduler (core component)

<code>from apscheduler.schedulers.blocking import BlockingScheduler
from datetime import datetime

def job():
    print(datetime.now().strftime("%Y-%m-%d %H:%M:%S"))

scheduler = BlockingScheduler()
scheduler.add_job(job, 'interval', seconds=5, id='my_job_id')  # run every 5 seconds
scheduler.start()  # blocks the current thread
</code>

Using Celery

Celery is a robust distributed task queue that can also handle scheduled jobs via Celery Beat, typically backed by RabbitMQ or Redis.

Producer creates tasks and pushes them to a broker.

Workers consume tasks and execute them.

Result backend stores execution outcomes.

Using Apache Airflow

Airflow defines workflows as DAGs (Directed Acyclic Graphs) with rich operators (BashOperator, PythonOperator, etc.) and supports complex dependencies, scheduling, and monitoring.

DAGs organize tasks and their execution order.

Operators encapsulate the actual work.

Executors (Sequential, Local, Celery, Kubernetes, etc.) run tasks.

Overall, these approaches range from simple stdlib loops to enterprise-grade orchestrators, so readers can choose the lightest tool that meets their scheduling, reliability, and monitoring needs.

Written by

Python Programming Learning Circle

A global community of Chinese Python developers offering technical articles, columns, original video tutorials, and problem sets. Topics include web full‑stack development, web scraping, data analysis, natural language processing, image processing, machine learning, automated testing, DevOps automation, and big data.
