Per-page Deny Indexing/Archiving problem

I can’t seem to selectively deny indexing on individual pages using the Search Engine Control page setting in Sparkle. The published pages don’t contain the appropriate robots meta tag contents. The pages in question (in fact, ALL my pages) contain a robots entry that looks like this…

<meta name="robots" content="max-image-preview:large">

and there are no other robots items. I’ve tried the Indexing:Deny and Archival:Deny settings, both with and without the Optimize for Search Engines setting. Still no "noindex, noarchive" in the output on the affected pages. I’m not sure where "max-image-preview:large" comes from, unless it’s from dual device size support.
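
For comparison, what I expected the denied pages to emit is something along these lines (my expectation of the output, since I’ve never seen Sparkle actually produce it):

<meta name="robots" content="noindex, noarchive">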

I can’t find a bug in this area. I tested by turning off the global search engine control and setting the deny options on individual pages.

If you deny indexing globally, the setting doesn’t go into the individual pages but into the robots.txt file, which takes precedence over the individual page settings, so perhaps that’s what’s confusing.
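
For illustration, a global deny lands in robots.txt as a blanket rule, something like this (the exact output may differ):

User-agent: *
Disallow: /

Once a page is blocked there, crawlers never fetch it at all, so any per-page robots meta tag is never seen - that’s why the file takes precedence.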

Very puzzling then. I have no global prohibition on indexing or archiving; only about 6 pages of a 100-page site are set to the Indexing:Deny and Archival:Deny Page settings. The rendered pages don’t include any "noindex" or "noarchive" values, and robots.txt has no such settings either - only "Disallow: /images/" and "Disallow: /download/". sitemap.xml does correctly omit those pages, along with a folder index.html page that references them (none of which are linked from any other page in the site). The Page settings seem to have no effect… Google’s Search Console reports the pages.
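
For reference, the published robots.txt amounts to just this (the Disallow lines are quoted above; the User-agent line is my assumption):

User-agent: *
Disallow: /images/
Disallow: /download/

Neither rule covers the six pages, so crawlers are free to fetch them, and with no noindex in the pages themselves there is nothing to keep them out of the index - which would explain why Search Console reports them.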