The URL Inspection Tool is one of the most useful reports in the newest version of Google Search Console. There is so much great information to unpack and to debug your presence in Google Search. But at a fundamental level, it answers two important questions: Has Google found a specific URL from my site?

The main problem is that gathering this information is a very manual process, as there is no "bulk option" and no API access. Additionally, there is a "daily request limit" per property, meaning that you won't be able to inspect more than 100 URLs per property, per day. Hence, if your sites are bigger than 100 URLs, you will need multiple days to extract this data, or you'll have to use other methods outside of GSC.

Although I don't use the URL Inspection tool every day, it definitely gives me a peek into how Google sees a small section of my sites. However, I didn't build this script to help me out with my workload. I built it because I wanted to improve another open source project from a person I respect deeply, but when he launched it I didn't have the coding abilities to do so. That person is the great Hamlet Batista, who unfortunately passed away only a few days ago (January 2021). I am not going to lie: this is mostly an exercise for me to channel my grief, to express my appreciation for the teacher and the man that Hamlet was, and, of course, to share with the SEO community, because I think that's what Hamlet wanted from us.

I've divided this post into four parts, so it's your choice to read it all or jump to the specific part you're interested in:

Back in April 2019, Hamlet Batista wrote an article in SEJ called "How to Automate the URL Inspection Tool with Python & JavaScript". I was fascinated by the tool, as I didn't know of any solution that could extract this data programmatically. But mostly I was impressed by how easily Hamlet explained the way he created the script.

After a few tests, I found a couple of potential bugs, and I wrote to Hamlet to see if he had encountered the same issues. Even though we didn't know each other before and I am a nobody in the SEO community, he was kind enough to reply:

Back then, I was starting to learn some JavaScript and had zero experience with Python. After trying for a while, I gave up trying to modify his script in Python because I didn't know what I was doing. But because he was using Puppeteer, a JavaScript library, I figured I could replicate his script using JavaScript. This seemingly arbitrary challenge made me focus all my energy on learning JavaScript so that one day I could improve the script he built and add my small grain of sand to his contribution.

Fast-forward to December 2020: Hamlet and I had a Zoom call to talk about collaborating on his #RSTwittorial series, and I had a few ideas I wanted to share with him, including the JS version of his script that I had built a long time ago but thought wasn't good enough to share.
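To make the quota math concrete: because the URL Inspection tool allows roughly 100 inspections per property per day, any bulk workflow has to spread a larger URL list across several days. Here is a minimal, hypothetical helper (not part of Hamlet's script or mine) that splits a URL list into daily batches under that limit:

```javascript
// Sketch: split a list of URLs into daily batches that respect the
// URL Inspection tool's quota of ~100 inspections per property per day.
// The batch size is configurable; 100 matches the limit described above.
function batchUrlsByDay(urls, dailyLimit = 100) {
  const batches = [];
  for (let i = 0; i < urls.length; i += dailyLimit) {
    batches.push(urls.slice(i, i + dailyLimit));
  }
  return batches;
}

// Example: 250 URLs need 3 days at 100 inspections per day.
// (example.com URLs are placeholders, not from the original post)
const urls = Array.from({ length: 250 }, (_, i) => `https://example.com/page-${i}`);
const batches = batchUrlsByDay(urls);
console.log(batches.length);    // 3 days of work
console.log(batches[2].length); // 50 URLs left on the final day
```

Each batch could then be fed to the inspection script on a separate day, keeping the property under its daily limit.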