Can someone tell me why my Pattern by Etsy site has a directive in its robots.txt file explicitly telling web crawlers such as Google to ignore all my listings?

Erm... That's kinda bad for business, isn't it?

User-agent: Spinn3r
Disallow: /

User-agent: *
Disallow: /api/
Disallow: /thanks
Disallow: */listing/*/similar

Sitemap: https://pattern.usefulcomponents.com/sitemaps.xml?sitemap=listings&offset=0
Sitemap: https://pattern.usefulcomponents.com/sitemaps.xml?sitemap=blogs&offset=0
Sitemap: https://pattern.usefulcomponents.com/sitemaps.xml?sitemap=pages&offset=0

#
# #   \
#
#    -----
#   | . . |
#    -----
#  \--|-|--/
#     | |
#  |-------|
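For what it's worth, you can sanity-check what these rules actually block with Python's standard-library robots.txt parser. The sketch below feeds it the exact rules quoted above; the listing and API paths are made-up placeholder URLs on the pasted domain, not real pages. One caveat: the stdlib parser does not understand the wildcard rule (`*/listing/*/similar`), while Google does, so under Google's interpretation only the "similar items" sub-pages are excluded, never the listings themselves.

```python
from urllib.robotparser import RobotFileParser

# The rules quoted above, copied verbatim (sitemap lines omitted;
# they don't affect crawling permissions).
rules = """\
User-agent: Spinn3r
Disallow: /

User-agent: *
Disallow: /api/
Disallow: /thanks
Disallow: */listing/*/similar
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

base = "https://pattern.usefulcomponents.com"  # domain from the pasted file

# An ordinary listing page is allowed for normal crawlers:
print(rp.can_fetch("Googlebot", base + "/listing/12345/some-item"))  # True
# The API and thank-you pages are blocked for everyone:
print(rp.can_fetch("Googlebot", base + "/api/v3/whatever"))          # False
print(rp.can_fetch("Googlebot", base + "/thanks"))                   # False
# Spinn3r (a feed-scraping bot) is blocked from the whole site:
print(rp.can_fetch("Spinn3r", base + "/listing/12345/some-item"))    # False
```

So the only crawler told to ignore everything is Spinn3r; the `User-agent: *` group blocks a handful of non-product paths, and the sitemap lines actively point Google at your listings.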