@floppy Could you elaborate on this? What exactly do you mean by local files as feeds: directories where each file is an entry, or files where each line is an entry?
When reading the #newsboat documentation, I found that the "urls" file can contain not only web URLs but also local files (lines starting with "file://").
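For the record, a urls file mixing the two might look something like this (the paths and URLs are made up for illustration):

```
# a regular web feed
https://newsboat.org/news.atom
# a feed that some local script regenerates on disk
file:///home/user/feeds/example-site.atom
```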
I think that might allow interesting possibilities. For example regularly fetching a website without RSS capabilities, turning it into a proper format, and storing it on disk. But I wondered whether there might be more interesting ways to use this feature.
I agree very much, but I might actually end up using the feature the way I described in my example (feedifying non-RSS websites).
So far I have only found solutions that required some online service or a locally running daemon of some sort. I find that annoying. Whipping up a script that fetches and converts a web page on demand suits me better.
@floppy Writing a file would also require a daemon of some sort, even if it's just cron ;)
I think exec: and filter: will suit your use case better: https://newsboat.org/releases/2.23/docs/newsboat.html#_scripts_and_filters_snownews_extensions For example, I have a program that converts GitHub search results into Atom feeds, which I use to find downstream issues about Newsboat: https://github.com/Minoru/github-search-to-rss
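To make the exec: idea concrete, here is a minimal sketch of such a script. The `fetch_items` function is a stand-in (it returns hard-coded example items); in a real script you would replace it with actual scraping of the site you want to feedify. The script prints a minimal Atom feed to stdout, which is what an exec: line expects.

```python
#!/usr/bin/env python3
# Hypothetical sketch: turn scraped items into a minimal Atom feed that
# Newsboat can consume via an "exec:" line in the urls file.

import html
from datetime import datetime, timezone


def fetch_items():
    # Stand-in for real scraping (e.g. urllib + an HTML parser).
    # Replace this with code that fetches and parses the target site.
    return [
        {"title": "First post", "link": "https://example.com/1"},
        {"title": "Second post", "link": "https://example.com/2"},
    ]


def to_atom(items, feed_title="Feedified site", feed_url="https://example.com/"):
    # Build a bare-bones Atom document; all user data is HTML-escaped.
    now = datetime.now(timezone.utc).isoformat()
    entries = []
    for item in items:
        entries.append(
            "<entry>"
            f"<title>{html.escape(item['title'])}</title>"
            f'<link href="{html.escape(item["link"], quote=True)}"/>'
            f"<id>{html.escape(item['link'])}</id>"
            f"<updated>{now}</updated>"
            "</entry>"
        )
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<feed xmlns="http://www.w3.org/2005/Atom">'
        f"<title>{html.escape(feed_title)}</title>"
        f"<id>{html.escape(feed_url)}</id>"
        f"<updated>{now}</updated>"
        + "".join(entries)
        + "</feed>"
    )


if __name__ == "__main__":
    print(to_atom(fetch_items()))
```

With the script saved somewhere executable, the corresponding urls line would be something like `"exec:~/bin/feedify.py"` (path made up).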