So, the Lord God Almighty, Google Search Console, has many of my personal subdomain's

(https://pattern.usefulcomponents.com/)

URLs sitting at 'Crawled - currently not indexed' status, some of which are short URLs such as...

 https://pattern.usefulcomponents.com/listing/1517830630 

...where the pattern "listings" sitemap 

( /sitemaps.xml?sitemap=listings&offset=0 )

specifies this item as...

https://pattern.usefulcomponents.com/listing/1517830630/variable-radio-tuning-capacitor-575pf
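
As an aside: if you want to see exactly what that sitemap hands Google, a minimal sketch like this will dump it, assuming it's ordinary sitemap-protocol XML (requests is the only third-party dependency; the URLs are the real ones from above):

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://pattern.usefulcomponents.com/sitemaps.xml?sitemap=listings&offset=0"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get(SITEMAP, timeout=30)
resp.raise_for_status()

# Print every <loc> entry, so you can see which form (short or slugged)
# each listing URL takes in the sitemap Google is actually being fed.
root = ET.fromstring(resp.content)
for loc in root.findall(".//sm:loc", NS):
    print(loc.text.strip())
```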

Those should appear as non-canonical duplicates, causing a bit of a crawl-budget problem, right?  They don't (there's a quick way to check, sketched below).  I can't control the pattern robots.txt file contents.  All I can do is manually, one by one, request removals for the short URLs.  Having done this, today GSC has decided that I've re-submitted the "pages" sitemap

( /sitemaps.xml?sitemap=pages&offset=0 )

on 06-AUG-2025, which I haven't.  Odd.  All very strange indeed.
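
Back to the duplicates question.  Here's the quick check mentioned above: a minimal sketch, assuming the listing pages emit an ordinary <link rel="canonical"> tag (requests and BeautifulSoup; both listing URLs are the real ones from above):

```python
import requests
from bs4 import BeautifulSoup

SHORT = "https://pattern.usefulcomponents.com/listing/1517830630"
SLUGGED = "https://pattern.usefulcomponents.com/listing/1517830630/variable-radio-tuning-capacitor-575pf"

def canonical_of(url):
    # Fetch the page and return the href of <link rel="canonical">, if present.
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag.get("href") if tag else None

for url in (SHORT, SLUGGED):
    print(url, "->", canonical_of(url))

# If the short URL's canonical points at the slugged URL, Google should fold
# it in as a duplicate; if it's missing or self-referencing, that would help
# explain the 'Crawled - currently not indexed' pile-up.
```

One manual removal request per short URL this is not.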

At some stage I may crack this and be able to put something here that's relevant to actual potential customers.  Meanwhile, if they can't see the darn listings, it's back to fathoming the URL gobbledygook and trying things at random.

Like making this post, resubmitting the "blogs" sitemap (see the sketch below), and seeing what happens.

( /sitemaps.xml?sitemap=blogs&offset=0 )
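
For what it's worth, the Search Console API can do the resubmission without all the clicking.  A rough sketch, with two loud assumptions on my part: you've already done the OAuth dance and saved a token with the webmasters scope, and the property is registered as the plain URL-prefix shown below:

```python
# pip install google-api-python-client google-auth
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumption: token.json holds stored OAuth user credentials with this scope.
creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/webmasters"]
)
service = build("searchconsole", "v1", credentials=creds)

SITE = "https://pattern.usefulcomponents.com/"  # assumption: URL-prefix property
FEED = "https://pattern.usefulcomponents.com/sitemaps.xml?sitemap=blogs&offset=0"

# Submit (or re-submit) the blogs sitemap for the property.
service.sitemaps().submit(siteUrl=SITE, feedpath=FEED).execute()
print("Submitted:", FEED)
```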

I really shouldn't have to become a full-time web monkey to make this work.

Funs!