I couldn’t be more excited to share this with you all.
I’ve spent the last few months adjusting to being a new dad, but in the last few weeks I got an itch to build something.
The video above walks through the details, but I’m releasing a completely free, open-source Streamlit web app for Google Search Console!
You can access and analyze your GSC data directly from the Google Search Console API. (If you want to check out the code or clone the repo, you’re welcome to do so here.)
Once you sign in with your Google account, you can access data from any web property your account has access to. (This means if you happen to work for an agency and use any sort of aggregator access account, you can see all of those properties on one screen.)
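If you’re curious what’s happening under the hood, here’s a minimal sketch of the same kind of pull using google-api-python-client directly (the app itself builds on gscwrapper); the credentials object, site URL, and dates are placeholders you’d supply yourself:

```python
# Minimal sketch of a Search Console API pull with google-api-python-client.
# Assumes you already have OAuth credentials (`creds`) for a Google account
# that can see the property; the site URL and dates below are placeholders.
from googleapiclient.discovery import build

def pull_gsc_rows(creds, site_url="https://www.example.com/"):
    service = build("searchconsole", "v1", credentials=creds)

    # List every property this account can access.
    sites = service.sites().list().execute()
    print([s["siteUrl"] for s in sites.get("siteEntry", [])])

    # Pull query/page performance for a fixed date range.
    body = {
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query", "page"],
        "rowLimit": 25000,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    return response.get("rows", [])
```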
Key Features Of The App
Authenticate securely with any Google account
Select from all your available sites in Google Search Console
Perform manual data pulls with customizable parameters
Apply advanced filters to refine your queries
View data grouped by page or in detailed format
Generate specialized reports, including:
Cannibalization Report, showing you any pages on your site that are ranking for the same keywords (a rough sketch of the idea appears after this list)
CTR Yield Curve, helping you understand your click-through rate by position
Pages to Audit, identifying any pages in your data that don’t meet your click/impression thresholds
And more to come! (Big shoutout to Antoine Eripret’s gscwrapper package for making these additional functions so easy to add.)
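To give a concrete sense of what the cannibalization report is doing, here’s a rough pandas sketch of the idea; it’s an illustration rather than the app’s exact logic, and it assumes a DataFrame of API rows with query, page, clicks, and impressions columns:

```python
# Illustrative pandas take on a cannibalization check (not necessarily the
# app's exact logic): flag queries where more than one page earns impressions.
# Assumes `df` has columns: query, page, clicks, impressions.
import pandas as pd

def cannibalization_report(df: pd.DataFrame) -> pd.DataFrame:
    # Aggregate clicks/impressions per query-page pair.
    grouped = df.groupby(["query", "page"], as_index=False)[["clicks", "impressions"]].sum()
    # Keep only queries where more than one page shows up.
    pages_per_query = grouped.groupby("query")["page"].transform("nunique")
    return grouped[pages_per_query > 1].sort_values(["query", "clicks"], ascending=[True, False])
```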
Why Did I Make This?
Put simply—the GSC UI obscures SO much data from you. As far back as 2022, GSC was anonymizing more than 50% of queries, and it’s only gotten worse, in my experience, since then.
Using the API doesn’t completely solve this problem, but you’ll often find a massive difference in the depth of query data available using the API instead of just the UI.
I wanted to democratize that data, making sure anyone can access it, whether they feel comfortable playing around with APIs and SQL databases or not.
In my experience, a lot of SEOs know about this problem, and they may even know a solution, but they aren’t comfortable accessing all this data on their own.
So, I built a basic UI that allows anyone to log in with Google and access this data.
What Are the Limitations?
There are a few things you should know about working with the API:
You are limited to 50,000 rows of data exported per day, per property. For smaller websites, you’re probably fine.
A “row” is any datapoint with the dimensions you select (so just “day” is one row per day, adding “page” means one row per page per day, adding “query” means one row per query, per page, per day, and so on).
So, for larger sites, it may be worth working with smaller date ranges or fewer dimensions if you want specific, detailed data.
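If you’re making API calls yourself, the request body’s rowLimit and startRow fields let you page through results in 25,000-row chunks; here’s a rough sketch of that loop, reusing the service object from the earlier example (the daily export limit above still applies no matter how you page):

```python
# Page through Search Analytics results in 25,000-row chunks using startRow.
# Assumes `service` is the searchconsole client from the earlier sketch;
# the daily per-property export limit still applies however you page.
def pull_all_rows(service, site_url, start_date, end_date, dimensions):
    rows, start_row = [], 0
    while True:
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": dimensions,
            "rowLimit": 25000,
            "startRow": start_row,
        }
        response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
        batch = response.get("rows", [])
        rows.extend(batch)
        if len(batch) < 25000:  # last page reached
            break
        start_row += len(batch)
    return rows
```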
If you need to get past this limitation, BigQuery is your only option. Antoine Eripret, the author of the gscwrapper package used in this project, has an excellent article on this topic over at Advanced Web Ranking.
Although you will get more data than the UI provides, you will still have lots of anonymized queries. However, they won’t just disappear from your data when you filter; you may simply have clicks on a page where the query is “none”.
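If you want a rough sense of how much of your traffic sits in that anonymized bucket, one illustrative approach is to compare a page-only pull against a page-plus-query pull; the two DataFrames here are hypothetical stand-ins for those pulls:

```python
# Rough way to estimate the share of clicks hidden behind anonymized queries:
# compare a page-only pull (complete totals) with a page+query pull (which
# omits anonymized queries). Both DataFrames here are hypothetical.
import pandas as pd

def anonymized_click_share(page_df: pd.DataFrame, page_query_df: pd.DataFrame) -> float:
    total_clicks = page_df["clicks"].sum()
    visible_clicks = page_query_df["clicks"].sum()
    return (total_clicks - visible_clicks) / total_clicks if total_clicks else 0.0
```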
What’s Next?
I plan on adding lots more features to this app eventually, but I wanted to get a working version out there as quickly as possible. This should allow most people to access any data they need.
If you have requests or ideas for features, leave them as a comment on this post or shoot me a message. Happy to take requests and work on expanding this into the most useful free GSC tool out there.
Thank you for your support, as always, and I hope you find this extremely helpful in your SEO journey.