
Using Python to Sniff Out Bailey's Best Daycare Moments

  • Writer: IT_Nurse
  • Aug 31, 2024
  • 4 min read

Updated: Sep 29, 2024



Ever since I discovered that I can use ChatGPT to help me create Python scripts, I've gotten a kick out of using Python to make my life easier. Here's my latest example:


When I'm out and about with my dog Bailey, I often get compliments on how well-behaved he is. Not that he's perfectly behaved all the time, but he is usually pretty chill and never jumps up on visitors. I give credit for this to his doggy daycare. He's been going a few days per week since he was a pup, and I think the combination of the training he gets there from the workers, being socialized with other dogs, and coming home exhausted helps him be laid back when he's at home.


We've taken him to several different daycares over the years, and each has been awesome in its own way. His current one has a web portal where they provide 'Report Cards' for every daycare visit, with notes about how his day went, what activities he participated in, and which other dogs he seemed to enjoy playing with. Occasionally there will also be a section for 'REPORT CARD PHOTOS' at the bottom.



He's been going to this particular daycare for several years now, and this week it occurred to me that, since I sometimes forget to check his report card, I may have missed photos. The web portal provides one page with a big table that lists out the links for each report card, so I started clicking through them one by one. I quickly realized that with 280 report cards to review, this process would take a while. For each report card I had to click on the link, visually inspect the report card when it appeared in its new browser tab, close the tab, and make sure I was correctly clicking on the next report card in the sequence.


Cue Python!


I wound up with the following process:

  • I opened up the web portal and navigated to the page that lists the links for each report card and did a copy/paste into Excel, then saved the file.


  • I asked ChatGPT to write me a Python script to meet the following requirements:

    1. Open the Excel file

    2. Loop through each row

    3. For each row, open the link in my web browser

    4. After every 20 records, prompt the user (in the terminal) to enter 'CONTINUE' to process the next 20 records, or 'EXIT' to exit the script.


This way, I could keep my mouse in the same position in my browser and repeatedly press the 'X' button to close each report card tab. Since there were almost 300 records to process, and I didn't want my browser to become overwhelmed with tabs, I figured breaking them into batches of 20 would make things more manageable. I also wound up adding the 'EXIT' clause, as it took me a few tries to get the code right and I wanted an easy way to terminate the script while I was troubleshooting.


Here's the code I wound up with:

# Import the openpyxl library for working with Excel files
import openpyxl

# Import the webbrowser module to open URLs in the default web browser
import webbrowser  

# Load the workbook and select the active worksheet
workbook = openpyxl.load_workbook('Bailey.xlsx')  
sheet = workbook.active  

# Initialize counter to keep track of how many records have been processed
counter = 0  

# Loop through each row in the second column of the worksheet
for i, row in enumerate(sheet.iter_rows(min_row=1, min_col=2, max_col=2, values_only=False)):
    cell = row[0]
    # Check if the cell contains a hyperlink
    if cell.hyperlink:
        link = cell.hyperlink.target  # Get the URL of the hyperlink
        print(f"Opening {link}")  # Print the URL being opened
        webbrowser.open(link)  # Open the URL in the default web browser
    else:
        # If no hyperlink, print a message with the row number
        print(f"Row {i+1}: No hyperlink found in this cell")
    
    # Increment the counter
    counter += 1  
    
    # Pause after every 20 records
    if counter % 20 == 0:  
        # Prompt the user to either continue or exit
        user_input = input("Processed 20 records. Type 'CONTINUE' to proceed or 'EXIT' to exit: ")
        if user_input.strip().upper() == 'EXIT':  # If the user types 'EXIT', end the loop and stop the script
            print("Exiting the script.")
            break  # Break the loop and stop processing further records

# Final message indicating that all hyperlinks have been processed
print("All hyperlinks have been processed.")

It wound up taking less than 5 minutes to go through all 280 report cards, and I did wind up finding (and saving) several photos I had missed over the years.


This definitely wasn't the most efficient process, as it still required me to visually inspect each report card and download the photos myself. However, I'm still learning Python, and this hit the sweet spot of being doable within the time I had to play around with Python today. In the future I would love to learn more about web scraping; it would have been great to have the script loop through each report card and download the photos on its own. And therein lies the fun of Python: there's always more to learn! 😄
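Just for fun, here's a rough sketch of what the parsing half of that future scraper might look like, using only the standard library's html.parser. To be clear, the HTML snippet and URLs below are made up for illustration; a real version would need to fetch each report card page (and probably handle the portal's login), then download whatever photo URLs it finds:

```python
# Hypothetical sketch: pull photo URLs out of a report card page's HTML.
# The tag structure and URLs here are assumptions for illustration --
# a real portal would need inspecting first, and likely authentication too.
from html.parser import HTMLParser

class PhotoFinder(HTMLParser):
    """Collect the src attribute of every <img> tag found in the page."""
    def __init__(self):
        super().__init__()
        self.photo_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.photo_urls.append(src)

# Stand-in for HTML fetched from a report card page
sample_html = """
<div class="report-card">
  <h2>REPORT CARD PHOTOS</h2>
  <img src="https://example.com/photos/bailey1.jpg">
  <img src="https://example.com/photos/bailey2.jpg">
</div>
"""

finder = PhotoFinder()
finder.feed(sample_html)
print(finder.photo_urls)
```

From there, each URL could be saved to disk with something like urllib.request.urlretrieve. But that's a project for another day!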


Have you used Python to solve a problem lately? If so I would love to hear your story in the comments.


Thanks for reading!


Lisa

