Disclaimer: I'm an SEO professional, so my perspective is probably jaded and biased. Take the following info with a grain of salt. I hope this helps AppSumoers understand this tool better :)
TL;DR:
This tool will probably work best for sites with limited meta info set up. It loads quickly and is a fast way to add and manage meta info. For larger sites that already have a well-structured SEO strategy, I am still waiting to see a large positive impact after 30 days of testing. I'm looking forward to future product features and traffic results!
--
INITIAL IMPRESSION:
The script seems to work well and has a very small JavaScript payload of 9 KB, which is great for Core Web Vitals and maintaining high page speed. The platform is very intuitive to use, and the descriptions are easy to create and set up.
TESTING:
For context, I am 30 days into testing Nytro across several different WordPress ecommerce stores and other client sites. I am deploying Nytro through Google Tag Manager in a custom HTML tag. The oldest and largest test site launched in 2014 and has around 98K pages. All of the sites were previously optimized for SEO, with primary and secondary keywords in the meta titles and descriptions that match the on-page content strategies. These on-page strategy keywords were also uploaded into the Nytro platform.
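For anyone curious about the GTM deployment, a custom HTML tag that loads a third-party script usually looks something like the sketch below. The script URL here is a hypothetical placeholder, not Nytro's real snippet URL; use the one from your own Nytro dashboard.

```html
<!-- GTM Custom HTML tag: loads the optimization script asynchronously.
     The src below is a PLACEHOLDER, not the real Nytro URL. -->
<script>
  (function () {
    var s = document.createElement('script');
    s.src = 'https://cdn.example.com/nytro.js'; // placeholder URL
    s.async = true; // don't block page rendering
    document.head.appendChild(s);
  })();
</script>
```

Set the tag to fire on an All Pages trigger so the script runs site-wide.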
METADATA OPTIMIZATION RESULTS:
When comparing the original versus optimized metadata in the Nytro platform, most of the changes in the optimized descriptions slightly reword, rearrange, or add an extra keyword or two to the original description. It did find a few pages with no meta description (they were ecommerce tag pages) and generated one for each, which was helpful.
PROBLEMS ENCOUNTERED:
The sites used for testing run behind a CDN with a firewall, so I did encounter a crawler problem: the Nytro bot was rate limited for crawling pages too quickly. This happened a few times after processing around 10K pages. When it did, the Nytro crawler hit a "You Are Being Redirected" splash page and then started using that error page's meta title and description in the platform. The fix was to search for and reprocess those results in the dashboard. Nytro support also helped me whitelist the crawler's user agent. Great support!
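If your CDN or server lets you key rate limits off the user agent, the whitelist can be a small config change. Here is a hedged nginx-style sketch; "NytroBot" is a hypothetical placeholder user-agent string (confirm the real one with Nytro support), and your CDN's firewall UI will differ.

```nginx
# Hypothetical sketch: exempt the crawler from rate limiting.
# "NytroBot" is a PLACEHOLDER UA string; get the real one from support.
map $http_user_agent $limit_key {
    default        $binary_remote_addr;  # rate-limit everyone else by IP
    "~*NytroBot"   "";                   # empty key = exempt from the limit
}

limit_req_zone $limit_key zone=crawl:10m rate=10r/s;

server {
    location / {
        limit_req zone=crawl burst=20;
    }
}
```

In nginx, requests whose zone key evaluates to an empty string are not counted against the limit, which is what exempts the matched user agent.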
NEXT STEPS:
The part I am waiting to see is how the Nytro changes influence traffic trends. Right now I have Nytro set to auto-optimize all pages and have waited around 30 days with limited results. I'm looking for positive organic changes measured through Google Search Console and GA4, and I'm hoping to see upward trends soon.
ABOUT GOOGLE:
As for how Google treats meta changes, we know the meta info we set isn't the be-all and end-all. Google's developer documentation on 'controlling your snippets in search results' explains that "Google primarily uses the content on the page to automatically determine the appropriate snippet... Snippets are automatically created from page content... This means that Google Search might show different snippets for different searches... Google will sometimes use the <meta name="description"> tag from a page to generate a snippet in search results, if we think it gives users a more accurate description than would be possible purely from the on-page content."
The takeaway from Google is that the meta info we set for a webpage isn't always used, and Google often generates its own.
BOTTOM LINE:
I'll be keeping this tool even though results are coming in slowly. I really like how quickly Nytro optimizes things and how easy it is to use. I hope to see more platform features in the future! For example, it would be really cool to have an A/B testing feature that swaps between metadata versions once measured clicks in Google Search Console reach a statistically significant sample size. Also, being able to set a page limit per site or subdomain would be very helpful when working with really large sites.