Let’s say you have a lot of crawl budget assigned to your domain and you want to use it to benchmark your own Core Web Vitals metrics against your direct competitors. Using the Chrome UX Report APIs is OK-ish for domain-level metrics, but you obviously want more data.
Step 1: Crawl your competitor’s website to map all the URLs they have and get a full picture.
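A minimal sketch of that first pass, using only the standard library (the domain `competitor.example` is a placeholder). A real crawl would fetch each discovered URL and feed new links back into a queue until the set stops growing; here we just show the link-extraction and same-domain filtering step on a static snippet:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects same-domain links from one page of a competitor crawl."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.urls = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        absolute = urljoin(self.base_url, href)
        # Keep only URLs on the competitor's own domain, drop fragments
        if urlparse(absolute).netloc == urlparse(self.base_url).netloc:
            self.urls.add(absolute.split("#")[0])

# Static example; in practice you'd fetch pages with urllib/requests
html = '<a href="/pricing">Pricing</a> <a href="https://other.com/x">x</a>'
collector = LinkCollector("https://competitor.example/")
collector.feed(html)
print(sorted(collector.urls))  # ['https://competitor.example/pricing']
```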
Step 5: add an HTML sitemap containing all the URLs within the competitors folder so Google can actually find them organically via oldskool crawling. Buy some forum profile links, because yeah, we do oldskool SEO here.
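An HTML sitemap here is nothing fancy, just a plain list of anchor links that Googlebot can follow. A quick sketch (the `example.com` URLs are placeholders):

```python
from html import escape

def html_sitemap(urls):
    """Render a bare <ul> of links; that's all Googlebot needs to discover them."""
    items = "\n".join(
        f'  <li><a href="{escape(u)}">{escape(u)}</a></li>' for u in sorted(urls)
    )
    return f"<ul>\n{items}\n</ul>"

page = html_sitemap({"https://example.com/competitors/a",
                     "https://example.com/competitors/b"})
print(page)
```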
Step 6: add an XML sitemap containing all those URLs to the newly created GSC profile so you have some visibility on how slow Google can be with crawling and indexing shit.
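Generating that sitemap from the crawled URL list is a one-liner per URL following the sitemaps.org protocol. A minimal sketch with a placeholder URL:

```python
import xml.etree.ElementTree as ET

def xml_sitemap(urls):
    """Build a minimal sitemap.xml per the sitemaps.org 0.9 schema."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in sorted(urls):
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

sitemap = xml_sitemap(["https://example.com/competitors/page-1"])
print(sitemap)
```

Write the result to `sitemap.xml`, upload it, and submit it in the GSC profile’s Sitemaps report.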
Step 7: once you’ve had a burst of crawls from the ever-happy Googlebot, head over to your CWV dashboards:
Why does this work?
Because with temporary 302 redirects, Google attributes everything from the final destination URL back to the original URL. That works with on-page elements like the meta title and description, but also with the Core Web Vitals data reported in Google Search Console. 302 redirect hijacking is nothing new, so if you find it interesting, search some SEO blog archives for it.
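The redirect layer itself is trivial. A hedged sketch as a bare WSGI app (the `/competitors/` prefix and `competitor.example` domain are made-up names for illustration); the only thing that matters is the status code, because a permanent 301 would consolidate signals onto the destination, while a temporary 302 keeps them attributed to your URL:

```python
# Hypothetical layout: /competitors/<path> on your domain mirrors the
# crawled competitor URL at https://competitor.example/<path>.
TARGET_DOMAIN = "https://competitor.example"
PREFIX = "/competitors/"

def app(environ, start_response):
    """WSGI app that 302-redirects every /competitors/ URL to the real page."""
    path = environ["PATH_INFO"]
    if path.startswith(PREFIX):
        target = TARGET_DOMAIN + "/" + path[len(PREFIX):]
        # 302 Found = temporary: Google keeps reporting the ORIGINAL URL
        start_response("302 Found", [("Location", target)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

Serve it with any WSGI server (e.g. `wsgiref.simple_server`); the same two lines translate directly to a `Redirect 302` in Apache or a `return 302` in nginx.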
fine line between successful and disastrous implementations. Below I will share 10 tips to prevent SEO disasters from happening to your own or your clients’ sites.
Many of my (former) clients have great content teams working on optimizing websites. Sending them a spreadsheet of data with some fancy click-through ratios for keywords or URLs that perform better or worse than other data points will not get them enthusiastic at all. Sending them a fancy interactive graph in Data Studio will get them to work (not a 100% success ratio!) on boring things like optimizing meta descriptions and titles. So how can you spend 5 minutes of your time in Data Studio and make your life as a marketer much easier? Follow the instructions below.