I need confirmation that I am thinking about this correctly.

If I store visitor's user agent, the page they visited and when they visited the page, I don't need to display "is it okay if we store this about you" before the data can be logged, right?

I know that if you log the IP address of a visitor, you need to let the visitor accept or deny this, or at least tell them that their IP address will be stored.

@edgren I think the general rule is that you need permission if the data can be traced back to the user. If you are storing "Firefox 86" plus some page-visit info, I'd say that's okay, but if a user has a unique browser string, it would actually be traceable back to them. So it may be seen as a borderline case.

Why is the user agent important to store?

@bjonte Alright. Many thanks for your answer 🙂 I will give the visitors a choice before logging and also give them full control over their own data while on the page.

The user agent is not that important, but it gives me handy information about devices and browsers, so I can make my blog better 🙂

@edgren I am not a lawyer, but here is my best guess:

The GDPR has other ways for you to lawfully process user data; these are listed in Art. 6(1). Logging IP addresses can usually fall under Art. 6(1)(f): "legitimate interest pursued by the controller" (the controller being the server admin). A legitimate interest can be to protect your server from malicious activity, e.g. by means of fail2ban. For fail2ban to work, you need to log IP addresses.
Of course, you also have to comply with the rest of Art. 5 GDPR.

@Johann150 Many thanks! 😃 Now I have some grounds to go on.

@sotolf Well, it is not that important, but it's useful information when it comes to developing the blog, so it renders better across devices and browsers 🙂

But I will let the visitors choose if this information is OK to log or not.

@edgren I just don't see the purpose of that. Why not just make something that renders okay in most browsers? I don't see how a browser string will help there. So if you get fewer people using Firefox, for example, you'll just let the page look bad in it? That makes zero sense to me.

@sotolf My blog already renders well in modern browsers 🙂 But if I log the user agent, I can see how many bots and similar things visit my website. I was forced to disable my crawler detector[1] (I haven't updated the source code on Codeberg yet) because it prevented the meta tags from showing when linking a blog post somewhere.


@edgren It doesn't render in my web browser. I only see a page that says "Hämtar inläggen - var god vänta" ("Fetching the posts - please wait") and navigation links on the side.

@sotolf Hm, OK. Please take a screenshot and send it here, so I can see.

@sotolf Thanks 😊 Look in the upper left corner. You have the reason there 🙂 I will make that message more visible.

@edgren So reading a post, or using the site at all, counts as "vissa funktioner" ("certain functions")? Doesn't that also mean the site will be impossible to use for people relying on a11y programs as well?

@sotolf Well. I wrote that message really fast, and I forgot about it 🙈

Now I want your opinion. Would you allow JavaScript for my blog, or would you refuse to let it load?

@edgren I don't surf with JavaScript activated on any site unless I really need to. I agree that people like me are few and far between, but I don't see why JavaScript should be needed just to show basic text.

But when I reach a site where at least the basic functions aren't usable without JavaScript, I just close it and use another one.

What is a real problem, though, is if, say, a blind person wants to access the site and their reader can't parse the JS.

@sotolf Alright. Thank you for your toot 🙂 I will try to make it load the posts without JavaScript 😊

I use JavaScript to load the posts because of the page load time, especially on the English part of my blog. With JS, the home page appears much faster because the posts are fetched after the initial load 🙂
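The approach described above can be sketched in a progressive-enhancement style. This is only an illustration; the `/posts.json` endpoint, the `#posts` container, and the post shape (`{ title, url }`) are all assumptions, not the blog's real API. The key point is that the server should still render the posts as plain HTML so the page works without JS, and the script only swaps in the fetched list when it can.

```javascript
// Build a simple HTML fragment from post objects ({ title, url }).
// Pure function, so it works the same with or without a browser.
function renderPosts(posts) {
  return posts
    .map(p => `<article><h2><a href="${p.url}">${p.title}</a></h2></article>`)
    .join('\n');
}

// Only runs in a browser. A no-JS client keeps the server-rendered HTML.
// '/posts.json' and '#posts' are hypothetical names for this sketch.
if (typeof document !== 'undefined') {
  fetch('/posts.json')
    .then(r => r.json())
    .then(posts => {
      document.querySelector('#posts').innerHTML = renderPosts(posts);
    })
    .catch(() => {
      // On any failure, silently keep the server-rendered fallback.
    });
}
```

Because the fallback HTML is already there, the JS becomes an enhancement rather than a requirement, which also addresses the accessibility concern raised earlier in the thread.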

@edgren Yeah, I'm of the opinion that a site should at least serve its basic function without relying on executing whatever code it wants on a person's machine :)

It sounds a bit strange that using an intermediate step like JavaScript would make it load faster, though?

@sotolf That is true 🙂

If you go to a website that is slow to load, your patience will run out after a few seconds (depending on how patient you are, of course). You will not be able to see what's happening in the background, and you can't use the menu on the website because one section of the page is blocking your access to it.

But if the website loads the posts via JS, you can interrupt the process and actually use the menu... much better, right? 🙂

@edgren Except when you're using a browser that doesn't have JS, or you need accessibility tools, and the site won't work at all. Then I'd prefer a slower load, or some performance testing to find the reason for the slow load (big pictures and so on), way more than a site that I can't use.

@sotolf Yepp. That's why I will load the posts without JavaScript if it is disabled 🙂

@sotolf @edgren it can be helpful in identifying bad web bots (if they are not so bad that they just use some Mozilla user agent anyway)

@Johann150 @edgren For that, you wouldn't need to save the user-agent string, would you? You'd just do a regex and deny access if it is a bad browser string; no need to save it.
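The deny-without-logging idea suggested here could look roughly like the sketch below. The patterns are illustrative assumptions, not a real or complete blocklist, and the function name is made up for this example; the point is simply that the user agent is tested in memory and never persisted.

```javascript
// Hypothetical blocklist of user-agent patterns; illustrative only,
// not an exhaustive or recommended set.
const badBots = [/curl\//i, /python-requests/i, /AhrefsBot/i];

// Returns true if the user agent matches any blocked pattern.
// The string is inspected and discarded, never written to a log.
function isBlockedUserAgent(ua) {
  return badBots.some(re => re.test(ua || ''));
}
```

In a typical server setup you would call this in a request handler and respond with 403 on a match; since the decision is made per request, no user-agent data needs to be stored, which sidesteps the GDPR question discussed earlier.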

@sotolf @edgren of course, if you do have a full list of all bad bots on the internet i would be very happy if you shared it 😁

@Johann150 @edgren How would saving the browser string help in any way?

@sotolf @edgren okay, maybe a better use case:

I can see that some link I posted on fedi is being hammered by a particular instance because Pleroma is broken (has happened).

@Johann150 @edgren Wouldn't an IP-based filter do exactly the same? I mean, the IPs are in the server logs anyway.

@sotolf You can't send an email to an IP address, but you can if you have
"Pleroma 2.3.50-502-g3f582136-develop; <[...]@[...].gf>; Bot"

@Johann150 whois? DNS lookup? That's kind of a dishonest argument, isn't it?

@sotolf I don't want to contact their hosting provider, though.

@Johann150 Well, you might have a point there; then just log the non-standard ones.

@sotolf What is a "standard one"? User agents are really bad to parse because they are not standardized and usually also contain "I'm compatible with this and this and this", etc.
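The compatibility mess described here is easy to demonstrate. A real-world Chrome user agent (the string below is a typical example, not tied to any particular site) simultaneously claims Mozilla, AppleWebKit, Chrome, and Safari, so a naive pattern match for any one browser also matches others.

```javascript
// A typical desktop Chrome user agent. It advertises compatibility
// with Mozilla, AppleWebKit and Safari all at once.
const chromeUA =
  'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 ' +
  '(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36';

// A naive "is this Safari?" check matches Chrome too:
const looksLikeSafari = /Safari/.test(chromeUA); // true, even though this is Chrome
```

This is why distinguishing "standard" from "non-standard" user agents with simple patterns is unreliable: almost every browser claims to be several others.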

@Johann150 Well, I guess, if it's really needed. It still feels kind of unnecessary to log all of them to defeat an edge case, though; there should be a better way.

Fosstodon is an English speaking Mastodon instance that is open to anyone who is interested in technology; particularly free & open source software.