
Bug: sitemap.xml and robots.txt not generated as expected #5812

@tokenicrat

Description

Pre-submission Checklist

  • I have searched existing issues and confirmed this bug has not been reported
  • I can reproduce this bug on the latest version or the demo site
  • This is a bug, not a question (use Discussions for questions)

Memos Version

v0.26.2

Deployment Method

Docker

Database

SQLite

Browser & OS

Chrome 146 on Arch Linux (rolling, latest)

Bug Description

Commit ff53187 introduced sitemap.xml and robots.txt generation and was released in v0.18.1. As of v0.26.2, this feature is gone: visiting /sitemap.xml or /robots.txt on an instance simply returns 404 (this can be verified on the official demo site).

This is a crucial feature for public sites to work normally. Beyond SEO, many website directories require a valid robots.txt and sitemap.xml as part of site verification.

I tried to trace the release notes and code changes to find out when it was removed, but no release mentions this removal, and server/frontend/frontend.go appears to have been refactored and reorganized since. The documentation's link to the v0.18.1 release note is broken, and the "Admin" section no longer mentions this feature.

Is this intended behavior, or a regression introduced during the refactor? Thank you for your attention.

Steps to Reproduce

  1. Set up an instance (or use the official demo) with MEMOS_INSTANCE_URL configured correctly
  2. Visit /robots.txt and /sitemap.xml

Expected Behavior

These URLs exist and return valid content.

Screenshots, Logs & Additional Context

No response
