
Daily Job Board Updates? There’s a Script for That!

  • Writer: IT_Nurse
  • Oct 23, 2024
  • 4 min read

Updated: Dec 19, 2024



Have you ever wished you could automate a repetitive task in your daily life? Recently, I embarked on a project to simplify how I monitor job postings in my area, and I learned some valuable technical skills along the way—skills that are useful not just for me but also for anyone curious about automating routine processes.


The Challenge

Keeping track of new job postings is time-consuming. Most of us don’t have the luxury of checking multiple job boards every single day. This inspired me to create a system that could gather information from a job board automatically, save the data in a structured format, and allow me to analyze it at my convenience.


The Solution: Automation with Python and Task Scheduler

I started by learning how to scrape information from the web using Python. I focused on the Government of New Brunswick’s job board, where postings are listed in a structured table. Python’s requests and BeautifulSoup libraries let me extract high-level information about each posting, including the job title, job number, department, location, and closing date.

Once the data was scraped, I used Python to save it to a CSV file. Each file is named with the current date and stored in a dedicated folder, creating a historical archive of job postings. This archive is incredibly useful because it lets me look back at past postings I might have missed.

Here’s a look at the Python script I developed:

import requests
from bs4 import BeautifulSoup
import csv
from datetime import datetime
import os

# Function to scrape job details from the website
def scrape_job_details():
    url = 'https://www.ere.gnb.ca/competition.aspx?strType=o&strSort=d&strIntraSort=t'
    response = requests.get(url)
    
    if response.status_code == 200:
        soup = BeautifulSoup(response.text, 'html.parser')
        job_table = soup.find('table', {'id': 'dgCompetition'})
        if job_table:
            rows = job_table.find_all('tr', {'class': 'Body_TxT_Black'})
            jobs = []
            for row in rows:
                columns = row.find_all('td')
                job_title = columns[0].get_text(strip=True)
                job_number = columns[1].get_text(strip=True)
                department = columns[2].get_text(strip=True)
                location = columns[3].get_text(strip=True)
                closing_date = columns[4].get_text(strip=True) if len(columns) > 4 else ''
                jobs.append([job_title, job_number, department, location, closing_date])
            return jobs
        else:
            print("Job table not found.")
            return []
    else:
        print(f"Failed to retrieve the page. Status code: {response.status_code}")
        return []

# Build a date-stamped file name and make sure the output folder exists
script_date = datetime.now().strftime('%Y-%m-%d')
folder_path = 'C:/Python/GNB'
if not os.path.exists(folder_path):
    os.makedirs(folder_path)
file_name = f"{script_date} - GNB Jobs.csv"
file_path = os.path.join(folder_path, file_name)

# Scrape today's postings and write them to the daily CSV archive
jobs = scrape_job_details()

with open(file_path, mode='w', newline='', encoding='utf-8') as file:
    writer = csv.writer(file)
    writer.writerow(['Job Title', 'Job Number', 'Department', 'Location', 'Closing Date'])
    for job in jobs:
        writer.writerow(job)

print(f"Job details saved to {file_path}")

But I didn’t stop there. Running this script manually every day would defeat the purpose of automation. That’s where Windows Task Scheduler came in. I wrote a small .bat file that runs the script, then set up a scheduled task to trigger it daily, ensuring the process runs like clockwork even when I’m not at my computer:

@echo off
cd C:\Python\GNB
python GNB.py
REM pause keeps the window open when testing by double-clicking;
REM it can be removed once the task runs unattended via Task Scheduler
pause

Building on the Basics

The Python script is just the first step. Once the data is saved, I use Power Query in Excel to aggregate these daily files into a single workbook. This lets me filter for job types I’m interested in—saving time and making the process much more efficient. The historical data also provides insights into job trends, such as which departments are hiring most frequently or what types of roles have been posted in the past.
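
For anyone who would rather keep this step in Python instead of Power Query, a rough pandas sketch of the same consolidation idea might look like this (the file pattern and output name are just examples, and pandas isn’t part of my original script):

import glob
import os

import pandas as pd

# Folder where the scraper writes the daily CSV files
folder_path = 'C:/Python/GNB'

# Read every daily file and tag each row with the file it came from
frames = []
for csv_path in glob.glob(os.path.join(folder_path, '* - GNB Jobs.csv')):
    daily = pd.read_csv(csv_path)
    daily['Source File'] = os.path.basename(csv_path)
    frames.append(daily)

if frames:
    # Stack the daily files into one table and drop repeated postings
    all_jobs = pd.concat(frames, ignore_index=True)
    all_jobs = all_jobs.drop_duplicates(subset=['Job Number', 'Closing Date'])
    # The combined file name below is only an example
    all_jobs.to_csv(os.path.join(folder_path, 'GNB Jobs - Combined.csv'), index=False)

Power Query remains the more convenient option if the analysis lives in Excel; the pandas version is simply an alternative for keeping everything in one script.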

Looking ahead, I plan to bring this data into Power BI. While Excel is great, Power BI offers a more user-friendly and interactive interface for exploring the data. With Power BI, I can build dashboards that visually track trends and filter postings in ways that are more intuitive than scrolling through rows in Excel.

Another goal is to expand the system to include job postings from additional employers or websites. By doing so, I could consolidate all relevant job opportunities into one central location, saving even more time and effort.
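
One way I could structure that expansion is to keep one scraper function per site and loop over them, tagging each row with its source. The sketch below reuses scrape_job_details() from the script above; the second scraper is purely a placeholder:

# Hypothetical structure for combining several job boards: each site
# gets its own scraper that returns rows in the same
# [title, number, department, location, closing date] shape.

def scrape_other_board():
    # Placeholder for a future scraper; the real version would parse
    # that site's own HTML layout.
    return []

scrapers = {
    'GNB': scrape_job_details,
    'Other Board': scrape_other_board,  # hypothetical second source
}

all_jobs = []
for source, scraper in scrapers.items():
    for job in scraper():
        all_jobs.append([source] + job)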


Challenges and Future Improvements

One challenge I’ve encountered is that the current script only captures high-level information from the job board. Details like job descriptions or specific qualifications aren’t included because they require additional steps to extract. In the future, I’d like to enhance the script to capture these details, giving me a more complete dataset.
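
As a rough idea of what that enhancement could look like: if each row’s title cell links to a detail page (I haven’t confirmed the exact page structure, so treat the selectors below as assumptions), the script could follow that link and pull the page text:

from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

BASE_URL = 'https://www.ere.gnb.ca/'

def scrape_job_description(row):
    # Assumes the title cell contains a link to a detail page;
    # this structure is a guess and would need to be verified.
    link = row.find('a')
    if not link or not link.get('href'):
        return ''
    response = requests.get(urljoin(BASE_URL, link['href']), timeout=30)
    if response.status_code != 200:
        return ''
    detail_soup = BeautifulSoup(response.text, 'html.parser')
    # Grab the visible page text as a starting point; a finished version
    # would target the specific element that holds the description.
    return detail_soup.get_text(separator=' ', strip=True)

Inside the loop in scrape_job_details(), each row could be passed to this function and the result appended alongside the other columns.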

Another area for improvement is error handling. For example, if the website changes its structure, the script might stop working. Adding more robust error messages and monitoring systems would help ensure the script runs smoothly over time.
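
A small first step in that direction might be a fetch helper with a timeout, a couple of retries, and a log file instead of bare print statements (the retry count and log path below are just placeholders):

import logging
import time

import requests

logging.basicConfig(
    filename='C:/Python/GNB/scraper.log',  # example log location
    level=logging.INFO,
    format='%(asctime)s %(levelname)s %(message)s',
)

def fetch_page(url, retries=3, delay=10):
    """Fetch a page with a timeout, a few retries, and logged failures."""
    for attempt in range(1, retries + 1):
        try:
            response = requests.get(url, timeout=30)
            response.raise_for_status()
            return response
        except requests.RequestException as exc:
            logging.warning("Attempt %s of %s failed: %s", attempt, retries, exc)
            time.sleep(delay)
    logging.error("Giving up on %s after %s attempts", url, retries)
    return None

The scraping function could then call fetch_page() instead of requests.get() directly and simply skip the run, with a logged error, whenever it returns None.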


What I Learned

This project has been a great learning experience. Not only did I improve my Python skills, but I also gained a better understanding of how automation tools like Windows Task Scheduler can simplify everyday tasks. By combining Python with Power Query and Power BI, I’ve created a system that’s practical, scalable, and adaptable to my needs.


Why It Matters

Automation isn’t just about saving time—it’s about creating opportunities to focus on what matters most. For me, that’s finding the right job opportunities without spending hours scrolling through job boards. For recruiters and employers, this project showcases my ability to identify problems, learn new tools, and implement solutions that make life easier.


If you’ve ever wanted to learn a technical skill or solve a repetitive problem, I encourage you to give it a try. You might be surprised at what you can achieve!

 
 
 
