
Google Tries to Defend Its Web Environment Integrity as Critics Slam It as Dangerous

Attacks and doxing make me personally MORE likely to support stronger safety features in Chromium, as such acts increase my suspicion that there is significant intimidation from criminals who fear this feature will disrupt their illegal and/or unethical businesses. I don't give in to criminals or bullies.

Kick a puppy
Get attacked for kicking a puppy
"These attacks make me MORE likely to keep kicking puppies, as I don't give in to intimidation from criminals and bullies that want healthy puppies for their nefarious ends."

90 comments
  • Quick correction: website scraping and ad blocking are not unlawful. Ad blocking makes the web more accessible and also reduces CO2 emissions by cutting the electricity spent serving irrelevant ads. A similar case can be made for web scraping: a user can build their own news feed without having to sift through hundreds of pages, and this can be done in a way that does not disrupt the pages' normal function.

    That is where the two larger issues come in:

    • people can argue that you need to pay for viewing a page/getting information through apps, and
    • branding power users as criminals ("unlawful") is unfair and false

    The "pay for information" argument is largely a philosophical problem. There is nothing wrong with paying for someone's book or online course, but the blanket statement that one has to pay for information is false. As an open source developer I give my work freely to others and in turn receive theirs freely as well (under the appropriate licenses, of course).

    We really have two sides forming: the "open internet" crowd, which works together for free or maybe accepts donations, and the proprietary crowd, which has huge influence right now.

    Google putting DRM into the web will cement that situation. It could mean you can only run vanilla software in your browser, and could ultimately even shut down access to open source platforms entirely, for instance by making it impossible to pass attestation on Ubuntu if Google only accepts Windows clients (this is a possible outcome, not a guaranteed one).

    All in all, we cannot perfectly anticipate the outcome of this, but if we see great potential for harm, it is fair to weigh it against the potential benefits (the lofty goal of weeding out bots and scammers). I think the cost-benefit relation is heavily tilted here.

    TL;DR: Tinkering with your browser is not illegal and should remain possible. The (potential) benefit of weeding out bots and scammers is not worth the risk of ruining the open source community.

  • Off-Topic: I saw this exact same post on lemmy.world.

    Are some posts shared across instances? How does that work?

    • I think that, if the local and remote instances are federated, posts on the local instance can be annotated with "cross-posted to:" links whenever the local instance is aware of other federated posts whose OP submitted a matching URL.

      A single OP can also manually cross-post to other communities using the cross-post button next to a post's title, though that auto-populates the body of the new post with quoted text from the original, plus an embedded hyperlink back to it.

      So cross-posts can be either auto-detected by Lemmy or manually created by OP(s).
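      The auto-detection idea described above can be sketched as grouping posts by their submitted link URL. This is a toy illustration, not Lemmy's actual implementation; the data and function names are made up for the example.

      ```python
      from collections import defaultdict

      # Illustrative sample data: posts that share the same link URL
      # across communities would be flagged as cross-posts of one another.
      posts = [
          {"id": 1, "community": "technology@lemmy.world", "url": "https://example.com/article"},
          {"id": 2, "community": "technology@lemmy.ml", "url": "https://example.com/article"},
          {"id": 3, "community": "opensource@lemmy.ml", "url": "https://example.com/other"},
      ]

      def find_cross_posts(posts):
          """Group post IDs by submitted URL; URLs seen more than once are cross-posts."""
          by_url = defaultdict(list)
          for post in posts:
              by_url[post["url"]].append(post["id"])
          return {url: ids for url, ids in by_url.items() if len(ids) > 1}

      print(find_cross_posts(posts))
      # {'https://example.com/article': [1, 2]}
      ```

      With that grouping in hand, an instance could render a "cross-posted to:" link on post 1 pointing at post 2, and vice versa.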
