From a technical and legal standpoint, ignoring ethics and dignity, is there anything preventing us from scripting a scraper that recreates reddit posts in a lemmy instance? Like maybe top 50 posts of the top 20 subreddits, without comments. I think it would help convince people to join, since the major argument for sticking with reddit is that it has more content. Thoughts?
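For concreteness, the loop being proposed could be sketched roughly like this. This is a minimal, hypothetical sketch assuming Reddit's public JSON listings (`/r/<sub>/top.json`); the function names (`top_posts_url`, `extract_posts`) are illustrative, not an existing tool, and the actual submission step to a Lemmy instance (a POST to `/api/v3/post` with an auth token) is omitted:

```python
from urllib.parse import urlencode

REDDIT_BASE = "https://www.reddit.com"

def top_posts_url(subreddit: str, limit: int = 50, window: str = "day") -> str:
    """Build the public JSON listing URL for a subreddit's top posts."""
    query = urlencode({"limit": limit, "t": window})
    return f"{REDDIT_BASE}/r/{subreddit}/top.json?{query}"

def extract_posts(listing: dict) -> list[dict]:
    """Pull title/url pairs out of a Reddit listing payload.

    Skips self-posts, since the proposal is to mirror links without comments.
    """
    posts = []
    for child in listing.get("data", {}).get("children", []):
        d = child.get("data", {})
        if d.get("is_self"):
            continue
        posts.append({"name": d.get("title"), "url": d.get("url")})
    return posts
```

Fetching each listing URL, extracting the posts, and re-submitting them to a Lemmy community is mechanically straightforward; whether it should be done is what the replies below debate.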

  • pixelpop3@programming.dev

    I don’t like the idea. It reminds me of those fake websites that scrape stackoverflow and use SEO to ruin Google search. Avoiding those sites is among the reasons people type “reddit” into searches. People want authentic interactions, and I think mirroring reddit into the Fediverse lacks that authenticity and undermines the Fediverse’s own. Content here should be from people who are here.

    If someone wants to assimilate content from reddit into something new and post it here that’s good. That means the person is here and can be interacted with.

    If someone wants to repost their own content here, that’s also fine. They are here to interact with.

    I just really think it’s a bad idea to deliberately build a ghost town and think people will move in.

    • sepiroth154@feddit.nl

      This is a good point. Also the good stuff will (probably) be manually posted so these tools only duplicate the bad stuff.

  • itchick2014 [Ohio]

    lemmit.online already has this and I blocked the communities of that instance because I don’t want Reddit content…I want Lemmy content.

    If you do decide to do it, use a bot specific instance with bot specific communities. Don’t flood existing communities with bot content.

  • auv_guy@programming.dev

    Why should someone switch when the content is the same but you cannot reach the OP? In fact you would need to go to reddit to reach the OP! I think this would drive people back to reddit.

    • feylec@mander.xyz

      One of Reddit’s main pressure points for forcing reopening was that it was “unfair for the users” to keep data hidden and inaccessible. Mirroring all that data takes away some of the leverage. So I can understand the value… we can move and we don’t have to worry about Reddit taking its ball and going home or claiming it deserves to be hostile because it is the steward of so much information.

      • pixelpop3@programming.dev

        Leverage for what purpose? To fix reddit? Let reddit die or not die.

        Reddit has always come after mirrors and they will easily get courts to take down the instances. Don’t forget that prior to the API change they came after pushshift.

        Additionally, anyone mirroring reddit on the moral basis that the content is owned by its creators and reddit is an exploitative rent-seeker has an obligation not to become a rent-seeker themselves. That means things like ensuring that content users voluntarily delete is also deleted in the mirrors. Reddit in fact fought a long battle with pushshift over this years ago, such that pushshift supposedly now only keeps history of moderator and admin edits. I agree with that ethically.

        And in many cases you may be legally required to do this. To be clear, Reddit made pushshift start respecting user delete requests because of legal exposure and compliance risks.

        Not to mention that you don’t really know that anyone intends their content to be mirrored on sites they do not use. Particularly now that Reddit seems to be forcing private subreddits to be open. There’s no moral high ground for doing this.

  • letsgo@lemm.ee

    Not directly/automatically, no. But I personally don’t see anything wrong with the same article being reported on HackerNews, Slashdot, Reddit and Lemmy; that’s just similar sites doing similar sitey things.

  • U+1F914 🤔@lemmy.world

    There is value in real people selecting what to post on a link aggregator like lemmy/reddit/… .
    I don’t want to lose that human feeling, both in posts and comments.
    Of course the voting mechanism can do a lot of the heavy lifting, but having a flood of robot posts with a score of one might have a negative effect on good posts getting discovered.

    Hopefully the community will grow naturally to a point where it can satisfy my doom-scrolling addiction.

  • liori@lemm.ee

    At such scale, a scraper wouldn’t be necessary, that’s easily doable by humans involved in these communities—with a human touch as well.

  • Double_A@discuss.tchncs.de

    I personally find those scraped posts annoying in other communities. First, the formatting looks ugly in the list view. Second, it just feels awkward to even interact with such a post, especially if it’s a question where the real content is in the comment section on reddit.

  • ThoughtGoblin@lemm.ee

    It would normalize bot submissions, which is bad for a lot of reasons. Not least, disproportionate bot activity is one of the criteria used to justify defederation by instances like lemm.ee.