• Darkenfolk@dormi.zone · 1 year ago

    Honestly, that title made me throw up in my mouth a little and the article made it worse.

    Not everything needs to be moderated, and fearmongering because terrorists are supposedly using certain social media doesn’t help much either.

    If criminals can’t use a certain platform due to moderation they can simply hop to some other platform.

      • danhakimi@kbin.social (OP) · 1 year ago

        Rationalization for silencing people that OP disagrees with. Just call them “terrorists” and now it’s moral.

        I’m sorry, is there another title you would like to use for mass murderers who engage in mass murder for the purpose of causing terror? Is there somebody here who agrees with ISIS, with their position that people who don’t love ISIS shouldn’t have heads?

        I don’t think I need to rationalize my position that scammers and killers are bad and should not be given a free platform upon which to reach hundreds of millions of people with unlimited video uploads. I think that’s a perfectly rational position as is.

        I don’t think opposing neo-Nazi conspiracies makes me a Nazi. I do think you said that in a weak attempt to shut down all rational discussion here.

    • danhakimi@kbin.social (OP) · 1 year ago

      If terrorists are forced to pay a sysadmin to host a slow, makeshift matrix server in their moms’ basement, rather than having free access and unlimited uploads to a global network of 550 million rubes stupid enough to fall for crypto scams, I consider that a win for the world.

      If crypto scammers move to some platform nobody’s ever heard of, and nobody uses, because it’s nothing but crypto scams, I consider that a win for the world.

      It’s not incoherent to want to make the lives of extremists less convenient. I’m not saying we can or should bother trying to eradicate their access to messaging altogether. I’m saying we should recognize it as a problem and try to address it instead of saying “oh, wow, terrorists use our platform? Cool. Fun. Neat.”

      • pelotron · 1 year ago

        Criminals also use things like public restrooms. Maybe we should gate those behind background checks.

        • danhakimi@kbin.social (OP) · 1 year ago

          If you find a pattern of serial killers using a particular set of public restrooms, you might, in fact, want to consider security of some kind at those public restrooms. Maybe cameras near the entrance and a security guard situated in the damn mall would help.

      • Darkenfolk@dormi.zone · 1 year ago

        But that is kind of the issue here, isn’t it? Bad elements are not inconvenienced at all by any of that; it’s normal people who suffer from getting censored.

        I personally am not willing to give up any online freedoms I have, just on the off chance that it might be inconvenient for criminals.

        • danhakimi@kbin.social (OP) · 1 year ago

          Bad elements are not inconvenienced at all by any of that,

          … how the fuck are they not inconvenienced by that? Can you not agree that Telegram is convenient for them? The unlimited fucking video uploads and built-in audience?

          it’s normal people who suffer from getting censored.

          Eh. I’m not sure what the real risk is there; I don’t know how Telegram is going to hear “let’s censor terrorists” and end up censoring normal people. And I’m also not sure having a post or channel shut down on Telegram would really lead to suffering; I’ve never seen or heard of a decent Telegram channel… The best ones mostly still seem to be charlatans, maybe a couple for, like, companies or reporters that are just present everywhere, but nothing that actually serves a valuable role in society… But yeah, I don’t think there’s any huge risk of those charlatans being banned, if they’re who you’re worried about.

          I personally am not willing to give up any online freedoms I have, just on the off chance that it might be inconvenient for criminals.

          Like, it is objectively convenient and useful and helpful to “criminals” who celebrate after burning children alive by calling their parents and asking them if they’re proud, like, kind of goes beyond “criminal,” but yeah, they’re going to send the videos they took to other terrorists and spread them to checks notes spread terror, which is the whole fucking thing they do.

          But also, you’re not willing to give up your “freedom” to share videos of people you beheaded on telegram? Is that an important freedom to you?

          • Darkenfolk@dormi.zone · 1 year ago

            First off, let me just say that I am not quite sure how Telegram works. As far as I know, it’s just a far safer WhatsApp that guarantees that your info isn’t being sold.

            Second, most of my issues with your ideas of censoring content are meant in a more general sense.

            [… how the fuck are they not inconvenienced by that?]

            You mean apart from the fact they can just hop over to some other platform?

            Also, I am not quite sure how you imagine Telegram actually doing something against terrorists while also guaranteeing the privacy aspect of the service they are providing. That’s literally their whole thing, privacy.

            [But also, you’re not willing to give up your “freedom” to share videos of people you beheaded on telegram? Is that an important freedom to you?]

            I am not even going to bother with all these other straw man fallacies. If you can’t have a civil discussion without putting words in my mouth and twisting the meaning of them we are done here.

            • danhakimi@kbin.social (OP) · 1 year ago · edited

              First off, let me just say that I am not quite sure how Telegram works. As far as I know, it’s just a far safer WhatsApp that guarantees that your info isn’t being sold.

              None of that is true. Telegram’s clients are open source, but the service is not safer than WhatsApp in any way. Telegram messages are not end-to-end encrypted (e2ee) by default, and group chats cannot be e2ee, unlike WhatsApp, which encrypts all messages end-to-end, always.

              There’s no real safeguard in place to prevent Telegram from selling your data. It claims that it doesn’t, but so does Facebook, and their privacy policies aren’t that different. Telegram has your data and could sell your data; it just doesn’t. Note that Telegram is not a nonprofit.

              Second, most of my issues with your ideas of censoring content are meant in a more general sense.

              I’m not calling for “more general” censorship, I’ve been quite specific.

              You mean apart from the fact they can just hop over to some other platform?

              How is that not an inconvenience? Moving your entire organization, finding a platform that won’t ban you, finding a platform that will host unlimited beheading videos at high resolution, finding a platform with the same broadcast feature so they can spread propaganda to hundreds of millions of people, finding all of that for free… What part of that is just as convenient as continuing to do what they’re doing?

              Also, I am not quite sure how you imagine Telegram actually doing something against terrorists while also guaranteeing the privacy aspect of the service they are providing.

              Telegram is literally already aware of which accounts belong to Hamas; Pavel Durov has publicly commented on why he doesn’t feel Hamas is a problem. This is not a question of discovering terrorism on the platform; it’s a question of figuring out what to do about it.

              That’s literally their whole thing, privacy.

              Privacy is aggressively not a thing Telegram is about. You seem to have fallen for some of their weird marketing. The only things about Telegram that might preserve your privacy are the option to sign up anonymously (a lot of messaging apps have this, and it’s not necessarily a good thing) and the fact that it isn’t owned by Facebook (most messaging apps aren’t).

              This is especially true of broadcast channels, which any schmuck can view without installing the app or having an account at all.

              I am not even going to bother with all these other straw man fallacies. If you can’t have a civil discussion without putting words in my mouth and twisting the meaning of them we are done here.

              But that’s what I’m talking about. I said Telegram shouldn’t provide terrorists with a means to share videos of beheadings, and you have a problem with the freedom-related implications of that; I’m really not sure what your point was.

  • PlexSheep@feddit.de · 1 year ago

    Criminals use the www! It’s time for moderation to get serious.

    It does not work like this. At least, it should not.

  • LainOfTheWired@lemy.lol · 1 year ago

    Why can’t we just better educate people on how to avoid online scams? Oh wait, that wouldn’t give the government an excuse to legislate another part of our lives into oblivion.

    And it’s funny how suddenly we are having all these terrorist problems; it’s like something else is causing it, but once again, solving it probably doesn’t benefit the government.

    • danhakimi@kbin.social (OP) · 1 year ago

      I’m all for better education, but there will always be people who don’t understand the technology, and scammers and extremists will always look for new ways to trick people and radicalize people.

      Most of these regulators are just asking the platforms what they’re doing to combat extremism, not actively regulating the platforms. Regulators are, by and large, afraid of technology, and afraid that they’ll regulate it incorrectly. But by questioning the companies, they can apply pressure to make sure the companies take moderation seriously. The fear of hypothetical regulation and strong negative PR is usually enough to get the companies to at least try to do better. That’s a good thing.

      And it’s funny how suddenly we are having all these terrorist problems; it’s like something else is causing it, but once again, solving it probably doesn’t benefit the government.

      I have no fucking idea what you’re trying to say here.

      • sic_semper_tyrannis@feddit.ch · 1 year ago · edited

        And it’s funny how suddenly we are having all these terrorist problems; it’s like something else is causing it, but once again, solving it probably doesn’t benefit the government.

        I have no fucking idea what you’re trying to say here.

        That the government is making up the terrorist problems so that, while people are scared, they can come to our rescue, legislate, and thus have more control.

    • danhakimi@kbin.social (OP) · 1 year ago

      Well, presumably the people who control all of the servers and encryption keys and directly profit from the app’s users. But whatever, as long as we see fewer scams and fewer terrorists, I’m not picky about who is shutting them up.

  • bloopernova@infosec.pub · 1 year ago

    Where is Telegram hosted/managed?

    Honestly I doubt even delisting from the play/app stores will stop people from using it, at least on Android.

    • danhakimi@kbin.social (OP) · 1 year ago

      According to Wikipedia: servers worldwide, HQ in Dubai.

      Honestly I doubt even delisting from the play/app stores will stop people from using it, at least on Android.

      It would make it much more difficult for scammers to reach victims, and dramatically stem its growth, but that’s not really what I’m calling for. I’m mostly just hoping that the world includes these messaging services when thinking about how to address and regulate social media and extremism, rather than excluding them because they misclassify them as not being social networks.

  • glad_cat@lemmy.sdf.org · 1 year ago

    Please send me all the private messages on your phone so that I can moderate them. You’re not a hypocrite, are you?

    • danhakimi@kbin.social (OP) · 1 year ago · edited

      No, my private messages are well-encrypted. But people voluntarily send their private messages to Telegram without e2ee and avail themselves of Telegram’s moderation, which they know exists. They just know that their crypto scams are profitable enough, given how little of a fuck Telegram gives, that they’re willing to put up with it. They know that sharing unlimited videos on private servers would cost money, and that money would mean less money to buy weapons with, so why not enjoy unlimited uploads and share them with half a billion people who think “yeah, that’s fine”?

        • danhakimi@kbin.social (OP) · 1 year ago

          Why don’t you start with the messages people have already voluntarily given you?

          Anybody who has received the messages I’ve voluntarily sent them is free to do with those messages as they see fit. I certainly haven’t asked any of them to store large videos for me for free without input.

    • HarkMahlberg@kbin.social · 1 year ago

      OP said something I disagree with, therefore all of kbin must be like him!

      How childish. I don’t see anyone assuming lemmy users are all tankies because they visited lemmygrad once.

        • danhakimi@kbin.social (OP) · 1 year ago

          The asshole who wrote the article and stands by it, how dare he argue for his position!

          If you have a response to something I said, I’d encourage you to actually make it rather than just being a dick about it.

            • danhakimi@kbin.social (OP) · 1 year ago

              My position is one I am quite proud of. Again, instead of being a dick, you could try responding to me, as though I were a person. There’s nothing hypocritical about my position, I expect all unencrypted services I use to be moderated as well.

              • ram@bookwormstory.social · 1 year ago

                Man, chill, we already know you’re from kbin.social, you don’t need to keep riding it into the ground.

  • atrielienz@lemmy.world · 1 year ago

    Facebook is a CP paradise because of the way private groups are set up. Been a problem for years. Even with all the moderation. What do you expect the moderation will do in this instance?

    • danhakimi@kbin.social (OP) · 1 year ago

      Facebook should be better at moderation. I’m not familiar with the child porn problem myself, but I think this is a problem with moderation that relies on reporting, versus somewhat more proactive moderation, such as automatically scanning content (with human review).

      I expect the moderation to frustrate the efforts of scammers, extremists, and terrorists so that scams are no longer profitable and so that they can no longer spread hate and terror as effectively.

      • atrielienz@lemmy.world · 1 year ago · edited

        Scams will always be profitable. The difference between scamming someone in real life and scamming them via the internet isn’t all that big.

        Scammers use phone calls to scam people too. Are you suggesting we tap and monitor everyone’s phones for keywords?

        The thing about privacy is that you seem to be willing to let people or organisations (that we can’t prove have our best interest at heart) violate people’s privacy in order to get the result you want. And there’s no proof that you will get that result.

        Meanwhile someone who’s human has to make the determination that something is criminal or something is CP and that means we have to pay people to comb through all that data.

        That’s very taxing on the individuals involved. It does harm to them.

        Now I’m sure you’ll say something about expectation of privacy when submitting anything to the web. But people do have the expectation of privacy online. Take a look at people who are deliberately de-googling or up in arms about web sites collecting their data to target them with ads.

        • danhakimi@kbin.social (OP) · 1 year ago

          Scams will always be profitable. The difference between scamming someone in real life and scamming them via the internet isn’t all that big.

          It is! People are wary of scams, it’s not worth wasting an hour trying to scam one random person anymore. Spending a few hours trying to spam millions of people is a worthwhile use of a scammer’s time, especially when there’s no risk of being banned or caught.

          Scammers use phone calls to scam people too. Are you suggesting we tap and monitor everyone’s phones for keywords?

          That’s not a solution I would pose for that problem, no, but there are laws in place intended to reduce spam calling, and many, many tech companies are proposing many, many solutions for that, and nobody is saying “oh no, what if you accidentally classify an innocent telemarketing-spammer as a scammer?” Good riddance.

          The thing about privacy is that you seem to be willing to let people or organisations (that we can’t prove have our best interest at heart) violate people’s privacy in order to get the result you want. And there’s no proof that you will get that result.

          I think people should use e2ee-enabled chat services if they expect privacy. Telegram users don’t bother turning on secret chats because they lack all of the features that make people want to use telegram. I think Whatsapp users have a much more reasonable expectation of privacy, and Whatsapp still goes through efforts to reduce spam and misinformation on its platform. I think Matrix users have a more reasonable expectation of privacy (since encryption is on by default, and can be used in group chats and spaces and on calls and with every other matrix feature), but Matrix servers still do not federate with Al Qaeda rocket.chat servers, because they know better.

          Meanwhile someone who’s human has to make the determination that something is criminal or something is CP and that means we have to pay people to comb through all that data.

          Telegram has to. It can afford to. I’ll remind you that I did not set criminality as the bar.

          That’s very taxing on the individuals involved. It does harm to them.

          Now think of what it does to the rest of the people who have to see it. If you can pay one person to willingly review content for child porn that couldn’t be identified in advance so a million others don’t have to see it, is that really the worst thing?

          Now I’m sure you’ll say something about expectation of privacy when submitting anything to the web. But people do have the expectation of privacy online. Take a look at people who are deliberately de-googling or up in arms about web sites collecting their data to target them with ads.

          Data collection feels very different to me, but you’re specifically sending your messages to Telegram. Like, Google scanning my emails is something I have trouble objecting to (although I object to the use of that information for ad-related purposes, of course).

          By “de-googling,” do you mean the whole “right to be forgotten” thing? I think that’s nonsense, clearly at odds with the right to remember, and largely used by wealthy assholes to deflect attention from shitty things they did.

          • HarkMahlberg@kbin.social · 1 year ago · edited

            I’m a mixed bag of agree and disagree on these points, but I’m only going to point out that the “De-Googling” trend doesn’t really have anything to do with the right to be forgotten. It has more to do with enshittification: Google shutting down services, making their current services harder to use, charging money for what used to be free services, charging more money for already paid services, adding ads, etc. Basically, people finding alternative software to Google because Google’s practices have become increasingly volatile and their services less and less reliable.

            • atrielienz@lemmy.world · 1 year ago

              To hear people on the privacy subreddits and even the privacy Lemmy communities tell it, it’s absolutely about the data these companies are collecting. I’ll grant you it’s about what the companies are perceived to be doing with the data they collect (serving ads), but I don’t think I personally ever made the point that OP did (that it was about the right to be forgotten).

              Either way, I think OP may have missed my point. As technology evolves, people will find new ways to abuse it. There’s a level of privacy people should have the expectation of, and our privacy laws don’t do enough as it is. OP is really suggesting that we further violate everyone’s privacy in the name of protecting them, and they don’t want to hear that it’s a bad idea, or one where we would have to put our trust in a company or companies to apply this monitoring.

              They also don’t seem to want to hear about the burn out rate of people tasked with moderating content and validating that that content is against TOS or breaks the law. Having humans trawl communities or even just messaging app text data for CP and scams is bound to have a detrimental effect.

              • danhakimi@kbin.social (OP) · 1 year ago

                To hear people on the privacy subreddits and even the privacy Lemmy communities tell it, it’s absolutely about the data these companies are collecting.

                Sure. But I can’t blame them for collecting data that I literally decide to send them for no reason but my own, I can only blame them for using that data in a shitty way.

                If I post something on Instagram, I know that they’re collecting the photo I post; that’s how posting works, and that’s not the issue. The issue comes if they try scanning people’s faces to invade their privacy, or build an advertising profile about me. Sending unencrypted chat messages is not that different.

                If I download WhatsApp, and I enable the contacts permission, and it uploads all of the contacts data on my phone, that’s super not okay, because I never wanted to give them that data in the first place; they just jacked it. (I disable the contacts permission for WhatsApp on my phone, but most users would never know that data gets uploaded to begin with.)

                • atrielienz@lemmy.world · 1 year ago

                  Users are responsible for the conduct and permissions they give to companies. Absolving them of that responsibility doesn’t make sense ethically or legally. We can’t just say “they didn’t know because WhatsApp didn’t tell them”. That’s not really an accurate statement. They more than likely agreed to use the app and in exchange they would receive free use and WhatsApp would receive that data. But they more than likely didn’t read the agreement before agreeing. That’s on them.

            • danhakimi@kbin.social (OP) · 1 year ago

              but I’m only going to point out that the “De-Googling” trend doesn’t really have anything to do with the right to be forgotten. It has more to do with enshittification: Google shutting down services, making their current services harder to use, charging money for what used to be free services, charging more money for already paid services, adding ads, etc. Basically, people finding alternative software to Google because Google’s practices have become increasingly volatile and their services less and less reliable.

              Ohhhhh that de-Googling. Yeah, I’ve done a bit of that, disabled the Google app on my phone entirely since Firefox does its job better, but I’m on Android and doing all that setup every time I get a new phone is just a headache.

          • atrielienz@lemmy.world · 1 year ago · edited

            You object to the business model of the free service you subscribe to? That’s what you basically said. You want to use Google’s free email service. You agreed to allow them to collect your data and target you with ads as is their business model. But you object to them collecting your data to target you with ads. That doesn’t make any sense.

            Spear phishing and so on are still a thing. Scams via regular SMS messages? Still a thing. It absolutely is profitable to target one person, depending on how you target them and what you get in return. These scams, and the businesses and companies that fight them, are constantly playing whack-a-mole. They wouldn’t bother to continue trying to scam via email and SMS if it wasn’t profitable still.

            Saying Telegram has to monitor their users and the content sent via the service and suggesting that they should (as an extension of that) violate the privacy of the users to monitor them all for illegal activity because they have “no reasonable expectation of privacy” is an interesting take. Even the police are supposed to subpoena your texts if they can show reasonable cause.

            You’re throwing reasonable cause out the window. If it’s an app for private messaging the people who use it have the expectation that their messages are private. This isn’t a forum we’re talking about. It’s not twitch. They are sending messages to a specific recipient. They aren’t making a public post on Facebook.

              • atrielienz@lemmy.world · 1 year ago

                And if Hamas is violating their TOS while using the service, then they should do something about it, in the same way that Google blocks what scammers it can find who are using Google Voice numbers to scam people. But what you are suggesting isn’t that they take action against people or organisations that are violating the TOS or using the service to break the law.

                You are suggesting they essentially listen in on every conversation or message sent on the service to find people breaking the law or violating the TOS. That’s not the same thing.

                • danhakimi@kbin.social (OP) · 1 year ago

                  You are suggesting they essentially listen in on every conversation or message sent on the service

                  I’m not suggesting that!

                  They don’t need to listen to private chats to see what Hamas is doing, it’s open, it’s public, they’re not being subtle, and Pavel Durov himself has publicly commented on it. He just doesn’t give a fuck.

    • kick_out_the_jams@kbin.social · 1 year ago

      The only time I see Medium is links to blog-type stuff.

      I’ve seen telegram spam all across the internet, the list of platforms I haven’t seen telegram spam on is basically the list of dead ones.

      • southsamurai@sh.itjust.works · 1 year ago

        Exactly. Medium is just a joke because it’s only people blabbering shit. It’s just extra long tweets.

        Not that comments on lemmy are any better, but they don’t require linking to an external source for unsourced opinion pieces.

      • danhakimi@kbin.social (OP) · 1 year ago

        The only time I see Medium is links to blog-type stuff.

        I’m so confused, what else do people expect to see on Medium, ads for sugary breakfast cereal?

  • yip-bonk@kbin.social · 1 year ago

    Totally agree that huge social media systems need to be understood as disproportionately amplifying misinformation. I don’t know anything about Telegram, though.

    Are the pushback people fReEzE PeAChErs or something? Is Telegram just lovely? Dunno.

    • HarkMahlberg@kbin.social · 1 year ago · edited

      I’m part of a few Telegram channels full of highly progressive IRL friends and colleagues. I also know Telegram is full of channels dedicated to crypto shilling, liveleak-esque video and imagery, piracy chats, privacy chats, QAnon forums, etc etc. I used it to communicate with family when I was out of the country and didn’t want to pay for roaming charges.

      Telegram itself is just a piece of software. Telegram’s community is wide and varied. Does it need moderation? Yeah probably. Who should be doing the moderating, not just of individual channels but of all the channels? Eh, I don’t have a good answer to that.

      • danhakimi@kbin.social (OP) · 1 year ago

        Telegram itself is just a piece of software. Telegram’s community is wide and varied. Does it need moderation? Yeah probably. Who should be doing the moderating, not just of individual channels but of all the channels? Eh, I don’t have a good answer to that.

        As long as you agree it should be happening, I appreciate that. I think Telegram should probably worry about it, and keep looking for solutions, but also that people should report the problematic groups and channels they come across, and be aware of the issue just to put a little more PR pressure on them to come up with a solution.