• BetaDoggo_@lemmy.world
    11 months ago · +163/−18

    Nobody cares until someone rich is impacted. Revenge porn has been circulating on platforms uninhibited for many years, but the second it happens to a major celebrity suddenly there’s a rush to do something about it.

    • givesomefucks@lemmy.world
      11 months ago · +90/−5

      What?

      This isn’t revenge porn, it’s fakes of celebrities.

      Something that was done for decades, and one of the biggest parts of early reddit. So it’s not “the second” either.

      The only thing that’s changed is people are generating it with AI.

      The ones made without AI (that have been made for decades) are a lot more realistic and a lot more explicit. It just takes skill and time, which is why people were only doing it for celebrities.

      The danger of AI is that any random person could take some pictures off social media and make explicit images. The technology isn’t there yet, but it won’t take much longer.

    • Ð Greıt Þu̇mpkin@lemm.ee
      11 months ago · +22/−1

      I think it’s more about the abject danger that unregulated AI replication of noteworthy figures represents to basically everything.

      Also, revenge porn is illegal in, I think, every state but South Carolina, and even that may have changed since I saw that stat.

    • Deceptichum@kbin.social
      11 months ago · +19/−1

      While I agree with the sentiment that rich people’s issues have more influence:

      How Many States Have Revenge Porn Laws?

      All states, excluding Massachusetts and South Carolina, have separate statutes specifically related to revenge porn. It’s important to note, however, that a person may still be prosecuted for revenge porn under other statutes in those two states.

      https://www.findlaw.com/criminal/criminal-charges/revenge-porn-laws-by-state.html

    • Mango@lemmy.world
      11 months ago · +8/−2

      You think it wasn’t celebrities first? The issue here is specifically Taylor Swift.

    • Fades@lemmy.world
      11 months ago · +10/−11

      What a braindead take. Both the US and many other countries have enacted AI safety and regulation rules; this is an extension of that effort. The idea is to set a precedent for this kind of behavior. They are also looking into how AI is being used for election interference, like having an AI Biden tell people not to vote.

      Everybody cares; just because it’s not all in place on day 0 doesn’t mean nobody does.

  • Aniki 🌱🌿@lemm.ee
    11 months ago · edited · +68/−17

    This wasn’t a problem until the rich white girl got it. Now we must do… something. Let’s try panic!

    -The White House, probably.

    • frickineh@lemmy.world
      11 months ago · +26/−10

      Honestly, I kind of don’t even care. If that’s what it takes to get people to realize that it’s a serious problem, cool. I mean, it’s aggravating, but at least now something might actually happen that helps protect people who aren’t megastars.

        • awwwyissss@lemm.ee
          11 months ago · +8/−18

          Blah blah blah so tiring to hear this thoughtless perspective constantly pushed in the fediverse.

          • PapaStevesy@midwest.social
            11 months ago · edited · +10

            I’m just saying, nothing about this should lead anyone to the conclusion that anyone in power is suddenly going to start caring about poor people. They’re literally only talking about this because a billionaire got its feelings hurt.

            • awwwyissss@lemm.ee
              11 months ago · +1

              Fair enough, and at the end of the day I probably hate billionaires as much as you do.

          • eskimofry@lemmy.world
            11 months ago · +5/−4

            If you don’t like discourse that is different from your beliefs, then plug your ears and shout lalala, as you have been doing for decades.

            • intensely_human@lemm.ee
              11 months ago · +1

              Someone complaining about the same thoughtless perspective is not complaining about discourse.

              Just once I’d love to have actual discourse about capitalism. I’ve never met a person who expressed hatred of capitalism who seemed capable of discourse, unfortunately.

      • intensely_human@lemm.ee
        11 months ago · +3/−3

        The only thing that could possibly happen to protect people from this is to make AI illegal. That would be (a) impossible to enforce without a draconian decrease in individual freedom, like keeping people stuffed in crates of packing foam instead of free to move around, and (b) absolutely horrible if it were successfully enforced.

        AI is cheaper and easier to proliferate than any drug. We have not succeeded in controlling drugs, despite their physical requirements of mass and volume making them visible in reality, a feature AI does not share.

        The attempt to control AI can and will destroy all our freedoms if we let it. Again, the only way to control something so ephemeral as computation is to massively restrict all freedom.

    • Fades@lemmy.world
      11 months ago · +8/−6

      It absolutely was a problem before and it’s not because Taylor is white. Revenge porn laws aren’t new and AI legislation has been in the works before this popped off.

      You also gonna say nobody cared about election interference until an AI recording of Biden told people not to vote?

      Just because you weren’t aware doesn’t mean it wasn’t happening

  • Zozano@lemy.lol
    11 months ago · +34/−2

    Do you want more AI gens of nude Taylor Swift? Because that’s how you get more AI gens of nude Taylor Swift.

  • guyrocket@kbin.social
    11 months ago · +27/−1

    This will be interesting.

    How do you write legislation that stops AI nudes but not photoshopping or art? I am not at all sure it can be done. And even if it can, will it withstand a courtroom free speech test?

    • macrocarpa@lemmy.world
      11 months ago · +14/−2

      I think it’s not feasible to stop or control it, for several reasons -

      1. People are motivated to consume ai porn
      2. There is no barrier to creating it
      3. There is no cost to create it
      4. There are multiple generations of people who have shared the source material needed to create it.

      We joke about rule 34, right? If you can think of it, there is porn of it. It’s now pretty straightforward to fulfil the second part of that, irrespective of the thing you thought of. Those pics of your granddad in his 20s in a navy uniform? Your high school yearbook picture? Six shots of your younger sister shared by an aunt on Facebook? Those are just as consumable by AI as Tay Tay is.

    • Grimy@lemmy.world
      11 months ago · edited · +4/−2

      You write legislation that bans all three because there is no difference between generating, photoshopping or drawing lewds of someone without their consent.

      Banning this on an individual level would be impossible, so you let the platforms that host it get sued.

      We have the technology to detect if an image is NSFW and if it includes a celebrity. Twitter is letting this happen on purpose.

      The images spread across X in particular on Wednesday night, with one hitting 45 million views before being taken down. The platform was slow to respond, with the post staying up for around 17 hours.

      It’s hard to pretend it wasn’t reported by Taylor’s fans many times during this period, or that the moderators didn’t know about this image within half an hour of it being posted.
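A minimal sketch of the moderation flow the comment above describes, flagging posts that are both NSFW and depict a known person. The classifier functions here are hypothetical stubs keyed on tags, not any platform’s real API; a real deployment would use trained image models in their place.

```python
# Hypothetical moderation sketch: the classifier functions are stand-in
# stubs (a real platform would use trained NSFW and face-match models).

def looks_nsfw(tags):
    # Stand-in for an NSFW image classifier.
    return "explicit" in tags

def matches_known_person(tags, watchlist):
    # Stand-in for a face/likeness matcher against protected names.
    return any(name in tags for name in watchlist)

def moderate(post, watchlist):
    """Return 'remove', 'review', or 'allow' for a post."""
    tags = post["tags"]
    if looks_nsfw(tags) and matches_known_person(tags, watchlist):
        return "remove"   # explicit + identifiable person: take down
    if looks_nsfw(tags):
        return "review"   # explicit but no match: human review queue
    return "allow"

posts = [
    {"id": 1, "tags": {"explicit", "taylor swift"}},
    {"id": 2, "tags": {"explicit"}},
    {"id": 3, "tags": {"landscape"}},
]
actions = [moderate(p, watchlist={"taylor swift"}) for p in posts]
print(actions)  # ['remove', 'review', 'allow']
```

The point of the sketch is only the control flow: both signals already exist in some form, so combining them for takedown or review is an engineering choice, not a missing capability.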

    • Susaga@ttrpg.network
      11 months ago · +2/−2

      If the image is even slightly convincing, it’s essentially just defamation with digital impersonation thrown in. Yeah, that might catch photoshop in its net, but you’d need to be a DAMN good artist to get caught in it as well.

      • NoIWontPickaName@kbin.social
        11 months ago · +2/−3

        So what level is slightly convincing?

        What about people that happen to look like someone famous?

        What level of accuracy is necessary?

        If I label some random blonde ai generated porn “Taylor Slow”, does that count?

        They are both blonde after all.

    • remotelove@lemmy.ca
      11 months ago · edited · +24/−8

      Well, it’s not really just about Swift. There are probably many other people that are going through this. Not every person who generates nudes of someone else is going to make it to the news, after all.

      I could see this being a problem in high schools as really mean pranks. That is not good. There are a million other ways I could see fake nudes being used against someone.

      If someone spread pictures of me naked, 1) I would be flattered, and 2) I’d really ask why anyone wants to see me naked in the first place.

      If anything, just an extension of any slander(?) laws would work. It’s going to be extremely hard to enforce any law though, so there is that.

      However, how long have revenge porn laws been a thing? Were they ever really a thing?

  • cosmicrookie@lemmy.world
    11 months ago · +25/−7

    Wait… They want to stop only Taylor Swift AI fakes? Not every AI fake representing a real person???

    • AngryCommieKender@lemmy.world
      11 months ago · +16/−4

      Only AI fakes of billionaires. They’re just admitting that there’s a two tiered legal system, and if you’re below a certain “value,” you will not be protected.

    • ehrik@lemmy.world
      11 months ago · +16/−4

      Y’all need to read the article and stop rage baiting. It’s literally a click away.

      “Legislation needs to be passed to protect people from fake sexual images generated by AI, the White House said this afternoon.”

  • Bonesy91@lemmy.world
    11 months ago · +33/−15

    This is what the White House is concerned about… Fuck them. There is so much worse going on in America, but oh no, one person has AI fake porn images, heaven forbid!

    • MirthfulAlembic@lemmy.world
      11 months ago · +24/−7

      The White House is capable of having a position on more than one issue at a time. There also doesn’t seem to be a particular bill they are touting, so this seems to be more of a “This is messed up. Congress should do something about it” situation than “We’re dropping everything to deal with this” one.

      • go_go_gadget@lemmy.world
        11 months ago · +9/−6

        The White House is capable of having a position on more than one issue at a time.

        Doubt.

    • XeroxCool@lemmy.world
      11 months ago · +12/−7

      Nice job reading the article (any one of these articles) to actually get context instead of just reacting to headlines.

      People are asking about Swift. The government isn’t buddying up to her specifically; Swift is just the most famous face of an issue that has been growing rapidly.

  • thantik@lemmy.world
    11 months ago · +23/−6

    I’d much rather that we do nothing, let it proliferate to the point where nobody trusts nudes at all any more.

      • thantik@lemmy.world
        11 months ago · +1

        You really need healthier relationships in your life I think; my wife would have no reason to do such a thing.

      • thantik@lemmy.world
        11 months ago · edited · +27/−6

        That’s perfect. It should be legal. Making pornography of someone illegal is just a different shade of grey from, say, making drawing Muhammad illegal.

        I can already hire an artist to make me some porn of …I dunno…Obama or something. Why should that be illegal just because someone does it with AI instead?

          • thantik@lemmy.world
            11 months ago · edited · +20/−3

            Hate to break it to you, this is already legal. “Non Consensual Porn” only applies to photographs. Nobody should have to consent to everything like that.

            If I draw you standing under the Eiffel Tower, fully clothed, the legality shouldn’t change just because you don’t LIKE what’s being drawn.

            • Эшли Карамель@discuss.tchncs.de
              11 months ago · +5/−8

              I’m aware it’s already legal; hence why action should be taken. Plus, videos are just a bunch of photos stitched together, so I don’t see your point about it only applying to photos.

              • thantik@lemmy.world
                11 months ago · edited · +12/−4

                Because being nude/etc. is the only thing that makes it different from people simply drawing others in art.

                Just because you don’t like pornography, shouldn’t change the legality of it. It’s prudism and puritanism at its finest.

                • Эшли Карамель@discuss.tchncs.de
                  11 months ago · +4/−9

                  It’s not porn in general that should be illegal, ONLY pornography where the person has not explicitly said they would like to be in it, such as deepfake porn, or drawn porn where the person has likewise not consented to being in it.

            • Ð Greıt Þu̇mpkin@lemm.ee
              11 months ago · +4/−13

              Nobody should have to consent to everything like that

              I’m sorry but holy fuck that is just morally bankrupt.

              Someone should have the ABSOLUTE right to control any distribution of their image of a sexual nature that they didn’t actively consent to being out there.

              Anything less is the facilitation of the culture of sexual abuse that lets the Fappening or age-of-consent countdown clocks happen.

              Drawing a picture of someone under the Eiffel Tower is a wildly different act from drawing them in the nude without them knowing and agreeing, with full knowledge of what you plan to do with that nude piece.

              • Fal@yiffit.net
                11 months ago · +11/−4

                Calling this sexual abuse is absolutely insulting and disgusting

                • Ð Greıt Þu̇mpkin@lemm.ee
                  11 months ago · +3/−10

                  Trying to pretend it’s not is feeding the culture of not listening to victims.

                  It’s like saying that catcalling is harmless; forcing people to be reminded they are seen as a sex object is a well-known and documented tool for keeping the victim “in their place.”

                  It’s harassment, and when done at the scale famous folks experience for the crime of being well known and also attractive, basically amounts to a campaign of terror via sexual objectification.

                  Never mind how tolerating it makes space for even more focused acts of terror like doxxing and making threats of sexual assault.

              • DreamerofDays@kbin.social
                11 months ago · +6/−1

                I’m wondering if the degree of believability of the image has, or should have any bearing on the answer here. Like, if a third party who was unaware of the image’s provenance came across it, might they be likely to believe the image is authentic or authorized?

                For another angle, we allow protections on the usage of fictional characters/their images. Is it so wild to think that a real person might be worthy of the same protections?

                Ultimately, people are going to be privately freaky how they’re gonna be privately freaky. It mostly only ever becomes a problem when it stops being private. I shouldn’t have to see that a bunch of strangers made porn to look like me, and neither should Taylor. And mine are unlikely to make it into tabloids.

                • thantik@lemmy.world
                  11 months ago · +4

                  From https://www.owe.com/resources/legalities/7-issues-regarding-use-someones-likeness/

                  A. The short answer is no. Individuals do not have an absolute ownership right in their names or likenesses. But the law does give individuals certain rights of “privacy” and “publicity” which provide limited rights to control how your name, likeness, or other identifying information is used under certain circumstances.

                  From that page, it actually looks like there are very specific criteria for this, and Taylor Swift HERSELF is protected because she is a celebrity.

                  However, there are still a lot of gotchas. So instead of making the product/art itself illegal, using it as harassment should be what’s illegal. Attaching someone’s name to it in an attempt to defame them is what’s already illegal here.

          • intensely_human@lemm.ee
            11 months ago · +2/−1

            Having an image exist somewhere of them isn’t the sort of thing a person should have to consent to.

            Consent is for things that affect that person.

  • CALIGVLA@lemmy.dbzer0.com
    11 months ago · edited · +33/−16

    U.S. government be like:

    Thousands of deep fakes of poor people: I sleep.

    Some deep fakes of some privileged Hollywood elite: R E A L S H I T.

  • iheartneopets@lemm.ee
    11 months ago · +24/−8

    Taylor is just trying to distract us from her jet emissions again, just like her new PR relationship with that Kelce guy was almost certainly to distract us from her dating that Matty Healy dude who openly said he enjoys porn that brutalizes black women (and also from her jet emissions).

    She’s not stupid. She’s a billionaire very aware of how news cycles work.

  • jaschen@lemmynsfw.com
    11 months ago · +16/−14

    It’s a victimless crime. I mean, making it illegal doesn’t stop people from doing it.

    I think once it gets to a point that nobody trusts AI porn, like how we don’t trust photoshopped porn, then nobody would care anymore.

    • Djtecha@lemm.ee
      11 months ago · +13/−9

      I hope you are joking. The target here is very much a victim. I don’t want people fucking up my reputation just cus.

      • jaschen@lemmynsfw.com
        11 months ago · +15/−6

        So someone photoshopping your face onto another person’s body is also illegal? What if someone talks with their friends in private about you doing a hypothetical sex act? Would that be illegal?

        In all 3 instances, your reputation is fucked up. But which one is illegal? When does it end?

        • Djtecha@lemm.ee
          11 months ago · +3/−7

          When they post it online in a way that is very hard for anyone to distinguish. If someone wants to jerk off to a made-up image of me at home, I think that’s kind of unhealthy, but not the biggest of issues. There are already cases of people committing suicide because someone generated AI porn of them and posted it online. It’s fucked. At least with Photoshop you had to know how to do some work. With this you can just go online, upload a picture, and the backend tools do all the work.

          • jaschen@lemmynsfw.com
            11 months ago · +9/−6

            First of all, what you think is unhealthy should stay in your home and not someone else’s home. You don’t get to decide what is unhealthy for everyone. Just yourself. It’s people like you that think it’s unhealthy for people to abort their babies that we lost Roe.

            AI-generated videos are currently distinguishable; they’re quite easy to spot. And they’re posted on sites that specifically call out that they’re AI generated, so there is little confusion about your reputation getting messed up.

            With Photoshop you don’t actually need any skills either. It’s all AI generated as well; it takes zero skill. As a matter of fact, I have software that can take an AI-generated image and turn it into a video with just prompts to an AI generator.

            But let’s get back to what I said earlier. Who is actually hurt here?

    • WarmSoda@lemm.ee
      11 months ago · edited · +2/−6

      People here are forgetting there is an entire industry full of models and actors (and producers and studios etc) whose sole purpose is to create nude images for money.

      The porn industry. You know, the one that basically decides how we all consume media. This type of problem that Swift is having will be an enormous issue for porn.

      It has to be stopped. It’s not just about one popular celebrity.

        • WarmSoda@lemm.ee
          11 months ago · +4/−3

          Porn decided VHS was the standard back in the day.
          PlayStation decided DVD was the standard afterwards.
          And so on and so forth.

          • ArcaneSlime@lemmy.dbzer0.com
            11 months ago · +3

            Well no, I don’t think porn “decided” that. Beta was better quality, but more expensive, and not “better” enough to warrant it for most home consumers. Sony also marketed it poorly and capped Beta tapes at 1 hour.

            Sure the porn industry sold home tapes, but they didn’t sell them on VHS because “you’ll use a VCR and like it by god,” they sold them on VHS because most home video consumers had VCRs due to them being cheaper than beta machines/tapes. Easier to sell them a VHS tape for their VCR than to convince them they should have bought the beta machine.

            CDs won over cassettes and carts because they are better: they sound better than tapes, (at the time it was believed) they last forever, and they could fit more than contemporary carts. They physically take up less space than (video) cassettes too, which mattered in the days of shelving (instead of HDDs). It wasn’t just because Nintendo said “nah fam, we like carts” and Sony said “we like the shiny bagel”; the consumer also liked the shiny bagel.

            Now people have stopped buying disks, and soon disks will stop being sold in favor of download/streaming only. This isn’t because some shadowy “they” decided; it’s because people have decided not to pick up physical copies anymore.

            • Aa!@lemmy.world
              11 months ago · +4

              For the record, the biggest reason VHS beat Betamax was the same thing Sony has struggled with in later years.

              They chose to keep the media format proprietary, while JVC opened the VHS format for any company to make. That’s how VHS became more ubiquitous, and it’s a similar story as to why MiniDisc never became mainstream and Memory Sticks are gone.

              Which arguably falls under “marketed poorly”, but I think the added context helps

          • jaschen@lemmynsfw.com
            11 months ago · +2/−3

            We still need their bodies for AI porn, so they are actually fine. This is no different from stunt actors, whose work we CGI faces onto.

          • jaschen@lemmynsfw.com
            11 months ago · +7/−3

            Nah, I’m close to 50 and remember a time when people had rights. Freedom to do what we want without government intervention. Hey, what I do in my own home does not concern you.

            Nobody is physically getting hurt here.

            • WarmSoda@lemm.ee
              11 months ago · edited · +1/−5

              Being a moron at 50 doesn’t negate having brain damage at an early age. You can’t possibly be this ignorant. There are more ways to be hurt than physically.

              • jaschen@lemmynsfw.com
                11 months ago · +4/−2

                Oh we are calling names now. My apologies for thinking you were mature.

                Please explain to this moron, then: besides physically, how would AI porn hurt someone?

                • WarmSoda@lemm.ee
                  11 months ago · +1/−3

                  You can very easily read the many many comments in this post for that information. But you won’t, and you’ll just ignore everything like you have been.

      • Meowoem@sh.itjust.works
        11 months ago · +3/−3

        The sad reality for commercial porn lovers like yourself is that the industry is doomed anyway. When we start fixing the poverty problem and transitioning away from a greed-based society, it’ll be very hard to coerce women into forms of work like this. If we had a good UBI and housing program, where people have access to resources to improve themselves and their material situation, there aren’t going to be many women who choose to get fucked in the ass for several unsexy hours so the crew can film from all the right angles.

        You should probably join the campaign to stop AI completely if you want to maintain a society where enough women find themselves desperate enough to do porn to sustain the industry. Imagine if automation efficiency gains enabled localized manufacturing from locally sourced materials, significantly lowering the cost of living for all! Even women in third-world countries wouldn’t be poor enough to need to join the commercial porn industry.

        Yes sir if there’s one industry we need to protect it’s the one that uses poor and substance addicted women as sex objects to be exploited for capitalistic gain.

        • jaschen@lemmynsfw.com
          11 months ago · +4/−1

          While I agree that we should have UBI and universal healthcare, your gross assumption that all women are “forced” into this trade because they lack funds is what I disagree with.

          While some women are doing it for the money, some women are doing it because it’s their own kink. Check out /gonewild. Most, if not all, of those women are there showing off their bodies because they want to. No funds, no other reason except that they want to share.

          Again, yes, there are sex-trafficked women in the porn industry. There should be more laws that help these women.

          Fun story: I went to CES about 10 years ago, and the AVN convention was next door, so as a joke my coworkers and I went to check it out. I had the pleasure of talking to some of the actresses, and some of them had paid money to be there. It was such a great experience talking to them and seeing the human side of porn. The women there genuinely love the craft.

        • WarmSoda@lemm.ee
          11 months ago · +2

          when we start fixing the poverty problem and transitioning from a greed based society

          Let me know when that happens

          • Meowoem@sh.itjust.works
            11 months ago · +1/−2

            I’ve started already. Learn about open source and Creative Commons if you want to be part of the solution instead of the problem.