It's sort of crazy to think about how big tech companies have a smaller and smaller window to be a "fun" and interesting story/idea. Facebook was pretty fun for a bit, Google was obviously an idea factory for a while, and even stuff like the doodles were a big deal.
Uber and Airbnb were controversial on some levels but still generally "game changers" in specific industries, and it was fun/interesting to be an early adopter.
OpenAI was under the radar in terms of public consciousness pre-GPT-3.5... we all had fun with ChatGPT... and then immediately OAI starts generating headlines that are not fun/inventive/quirky. A lot of regulatory stuff, governments around the world. Instant globalization + general horror.
Before Deepseek, Meta open-sourced a good LLM. At the time, the narrative pushed by OpenAI and Anthropic was centered on 'safety.' Now, with the emergence of Deepseek, OpenAI and Anthropic have pivoted to a national security narrative. It is becoming tiresome to watch these rent seekers attacking open source to justify their valuations.
>> In the proposal, OpenAI also said the U.S. needs "a copyright strategy that promotes the freedom to learn" and on "preserving American AI models' ability to learn from copyrighted material."
Perhaps also symmetric "freedom to learn" from OpenAI models, with some provisions / naming convention? U.S. labs are limited in this way, while labs in China are not.
The easiest logical way I can make sense of this problem is to apply it to humans. Copyrighted material has tremendously impacted my thinking and work, but I had to pay to access it. And as long as I'm not publishing copies of the copyrighted work, derivative work seems to be fair use. This seems fair for everyone, if they want to train on a resource then they should pay for it.
The only angle I can see this working for OpenAI is pushing the anti-China national security threat narrative, which I expect to see a lot more of this year (especially with this administration). While I personally hate that, I can definitely see how AI + drones are the obvious future in warfare, so I don't think it's that far-fetched to work.
It coincides with this: OpenAI calls DeepSeek "state-controlled," calls for bans on "PRC-produced" models
https://techcrunch.com/2025/03/13/openai-calls-deepseek-stat...
I'm surprised to see only one comment here addressing the issue of Chinese AI companies just flatly ignoring US copyright and IP laws/norms. I wonder if there is a viable path where we can facilitate some sort of economic remuneration for people who write and create visual art while not giving up the game to Chinese companies.
This seems to be a thorny dilemma.
If they want to avoid paying for the creative effort of authors and other artists then they should also not charge for the use of their models.
>liability protections
The industry that just intentionally ran roughshod over a couple million copyright holders, despite knowing it is on shaky legal ground, now wants liability protection for itself?
Bunch of immoral shysters...
The demand here for federal preemption of state law has nothing to do with copyright. Copyright is entirely federal level today. It has to do with preventing the use of AI to enable various forms of oppression.[1] Plus the usual child porno stuff.
What AI companies are really worried about is a right of appeal from decisions made by a computer. The EU has that. "Individuals should not be subject to a decision that is based solely on automated processing (such as algorithms) and that is legally binding or which significantly affects them."[2] This moves the cost of LLM errors from the customer to the company offering the service.
[1] https://calmatters.org/economy/technology/2024/09/california...
[2] https://commission.europa.eu/law/law-topic/data-protection/r...
Relevant (I don't know why the article doesn't link to them directly): https://openai.com/global-affairs/openai-proposals-for-the-u... https://cdn.openai.com/global-affairs/ostp-rfi/ec680b75-d539...
It probably needs to be a law not an executive order but I don't hate the idea.
States have the power to make it prohibitively expensive to operate in those states, leaving people either to go to VPNs or to use AIs hosted in other countries that don't care whether they follow whatever new AI law California decides to pass. And companies would choose to use datacenters outside the prohibitive states and ban IPs from those states.
Of course, if a company hosts in us-east-1 and allows access from California, wouldn't the interstate commerce clause take effect, leaving California with no power anyway?
Fun fact: The reason Hollywood is in California is because Edison's camera patents didn't apply there. Altman might actually have a good point: if your competition doesn't care about your laws, you're in trouble.
https://www.mentalfloss.com/article/51722/thomas-edison-drov...
I think there will be a huge change in public perception of copyright in general, as increasingly more people realise that everything is a derivative work.
It seems really weird that Congress isn't making a law about this. Instead, we're asking courts to contort old laws to apply to something which is pretty different from the things they were originally intended for. Or just asking the executive to make law by diktat. Maybe letting the wealthiest and most powerful people in the world write the rules will work out. Maybe not.
This issue is too complicated for Congress to handle? Too bad. Offloading it to the president or a judge doesnât solve that problem.
The world is becoming more and more complicated and we need smart people who can figure out how things work, not a retirement community.
How big was the check that came with this request? For the right price their logo can go on the rose garden lawn.
Free market, y'all!
OpenAI calls DeepSeek 'state-controlled,' calls for bans
Here's a direct link to the article: https://www.bloomberg.com/news/articles/2025-03-13/openai-as...
the GOP: "states' rights! states' rights!!"
also the GOP: "not those rights! only the rights we want to share"
The original link has apparently been changed to a content-free Yahoo post, for some reason only known to "moderators", which makes existing comments bizarre to read.
The original link pointed to this OpenAI document:
https://openai.com/global-affairs/openai-proposals-for-the-u...
It contains this remarkable phrase:
> For innovation to truly create new freedoms, America's builders, developers, and entrepreneurs - our nation's greatest competitive advantage - must first have the freedom to innovate in the national interest.
I don't think people need "new freedoms". They need their existing freedoms, that are threatened everywhere and esp. by the new administration, to be respected.
And I would argue that America's greatest strength isn't its "builders"; it's its ability to produce BS at such a massive scale (and believe in it).
This OpenAI "proposal" is a masterpiece of BS. An American masterpiece.
I've heard so many ridiculous stories about 'AI' that I'm at the point where I initially took this to mean the LLM and not the company had made the request.
I expect that interpretation won't seem outlandish in the future.
Am I the only one who thinks "freedom to learn" is an anthropomorphising euphemism?
"We want more regulation! AI is too dangerous, too powerful for any person off the street to use!"
Meanwhile, the exact same guy in Europe:
"Less regulation! You are strangling our innovation!"
JD Vance seems to be quite aware of OpenAI's meta-strategy, so I wouldn't be surprised if this is declined (i.e., semi-specifically, over something they want to force OpenAI to comply with).
DeepSeek/whoever training on OpenAI outputs is ... bad.
OpenAI training on every content creator's outputs is ... good.
The full 15-page proposal from OpenAI to the White House:
https://cdn.openai.com/global-affairs/ostp-rfi/ec680b75-d539...
Regulations were convenient for slowing down competitors - you know, the ones you heavily lobbied for - and it was all great. But now that you've done your part and others are finally catching up, suddenly it's all about easing restrictions to protect your lead? Beautiful.
So, why not pay the price of each copyrighted work ingested by the model?
The right loves states rights, unless it conflicts with their personal preferences.
Well funded companies want regulations because it stops up and coming companies from competing. Now they want exemptions from those regulations because it would be too restrictive.
> OpenAI has asked the Trump administration to help shield artificial intelligence companies from a growing number of proposed state regulations if they voluntarily share their models with the federal government.
That sounds like corruption
Still not convinced how a model training on data is not the same as a human looking at that data and then using it indirectly, as it's now a part of his knowledge base.
He should have offered for every purchase of OpenAI services, a portion would be used to purchase TrumpCoin. That would have been a more effective bribe.
Funny how fast those AI prophets went from:
- The government needs to prepare, because soon they will need to give money to all those people we made obsolete and unemployed. And there is nothing to stop us.
to:
- We need money from the government to do that thing we told you about.
Can anyone also use copyrighted source code, e.g. from OpenAI?
I really hope OpenAI fails in doing this. If this usage is allowed, then it means that there is no path towards me being OK with publishing anything on the internet again.
I'm assuming this has zero effect on non-US AI companies?
OpenAI (2023): Don't even bother trying to compete against us, you will not win and you will lose.
OpenAI (2025): pLeAse bAn dEEpSeEk!!11!, bAn poWerFulL oPen wEight Ai mOdeLs!!1
Move to a different state.
Is it so unrealistic? Many companies and people leave beautiful Cali due to over-regulation.
Wonder if the rules will protect the information providers or the consumers.
It is interesting that it is not the Hollywood/Music/Entertainment copyright lobby (RIAA, MPAA etc.) that is lobbying US states to go after OpenAI and other American AI companies.
It's the New York Times and various journalist and writers' unions that are leading the charge against American AI.
American journalists and opinion piece writers want to kill American AI and let China and Russia have the global lead. Why? Have they thought about the long-term consequences of what they are doing?
A clear attempt to circumvent the copyright violations of the AI era and kick the can down the road.
> National security hinges on unfettered access to AI training data, OpenAI says.
If it's a Republican administration, yell "national security." If it's Democratic, claim it's in the name of child safety.
Sounds great!
Write a law. We don't have an emperor.
First they should investigate the fake suicide!
"Please help us. We're only a little business worth $157 billion!" - the company ripping off everyone that's ever written or drawn anything. Companies like Airbnb and Uber breaking the rules, gaining control of the market, and then pushing up prices was bad. "Open" AI is just a whole other level of hubris.
If AI actually reaches human-level intelligence in the next few years, the Pentagon and congress are going to start yelling about National Security and grabbing control over the whole industry, so I doubt state regulations are going to matter much anyway.
(And if it doesn't reach human-level intelligence, then OpenAI's value will pop like a balloon.)
I just canceled my OpenAI subscriptions over this.
Closed ai
Putting legal issues aside for a moment, I argue copyrighted material should be considered fair use simply by virtue of the enormous societal benefits LLMs/AI bring in making the vast expanse of human knowledge accessible.
It's a major step forward for humanity.
I heard the theory that Elon Musk has a significant control over the current US government. They're not best pals with Sam Altman. This seems like it might be a good way to see how much power Elon actually has over the government?
Related:
Google's comments on the U.S. AI Action Plan
https://blog.google/outreach-initiatives/public-policy/googl...
Buried the lede:
> OpenAI also reiterated its call for the government to take steps to support AI infrastructure investments and called for copyright reform, arguing that America's fair use doctrine is critical to maintaining AI leadership. OpenAI and other AI developers have faced numerous copyright lawsuits over the data used to build their models.
"Freedom to make money"
Maybe this data constraint for America (vs. the GPU constraint for China) will force America to innovate. Maybe innovate in data generation.
I'm disgusted by the mindset that companies should be able to do whatever they want when it comes to technology as impactful and revolutionary as AI.
AI sucks up the collective blood, sweat and tears of human work without permission or compensation and then re-monetizes it. It's a model that is even more asymmetrical than Google Search, which at least gives back some traffic to creators (if they're lucky).
AI is going to decide on human lives if it drives your car or makes medical diagnoses or decisions. This needs regulation.
AI has the ability for convincing deepfakes, attacking the essence of information and communication in itself. This needs regulation, accountability, at least a discussion.
As AI grows in its capability, it will have an enormous impact on the work force, both white collar and blue collar. It may lead to a lot of social unrest and a political breakdown. "Let's see what happens" is wildly irresponsible.
You cannot point to foreign competition as a basis for a no-rule approach. You should start with rules for impactful/dangerous technology and then hold parties to account, both domestic and foreign.
And if it is true that we're in a race to AGI, realize that this means the invention of infinite labor. Bigger than the industrial revolution and information age combined.
Don't you think we should think that scenario through a little, rather than winging it?
The inauguration had the tech CEOs lined up directly behind Trump, clearly signaling who runs the country: tech and media. How can you possibly have trust in a technology even more powerful ending up in ever richer and more autocratic hands?
But I suppose the reality is that Altman should donate $100 million to Trump and tell him that he's the greatest man ever. Poof, regulation is gone.
Private property is sacrosanct, except when an exception that only applies to them would make a billionaire richer.
"If what we're doing is not fair use, then we can't operate"? OK, so? The world doesn't owe you the ability to operate the way you are. So whether it breaks your business model has no bearing on the question, which is, "is that fair use, or not?"
In the "just because everyone else is jumping off a bridge, should you do it":
> Pfizer Asks White House for Relief From FDA Drug Human Testing Rules
> Pfizer has asked the Trump administration to help shield pharmaceutical companies from a growing number of proposed state and federal regulations if they voluntarily share their human trial results with the federal government.
> In a 15-page set of policy suggestions released on Thursday, the Eliquis maker argued that the hundreds of human-testing-related bills currently pending across the US risk undercutting America's technological progress at a time when it faces renewed competition from China. Pfizer said the administration should consider providing some relief for pharmaceutical companies big and small from state rules - if and when enacted - in exchange for voluntary access to testing data.
> Chris Lehane, Pfizer's vice president of global affairs, said in an interview, "China is engaged in remarkable progress in drug development by testing through Uyghur volunteers in the Xinjiang province. The US is ceding our strategic advantage by not using untapped resources sitting idle in detention facilities around the country."
> George C. Zoley, Executive Chairman of GEO Group, said, "Our new Karnes ICE Processing Center has played an important role in helping ICE meeting the diverse policy priorities of four Presidential Administrations. We stand ready to continue to help the federal government, Pfizer, and other privately-held companies achieve their unmet needs through human trials in our new 1,328-bed Texas facility."
> OpenAI also proposed that AI companies get access to government-held data, which could include health-care information, Lehane said.
Yeah, straight up, go fuck yourselves. You want copyright laws changed to sanction your straight-up copyright whitewashing, and now you just want medical data "because."
Pay for it or go away. I'm tired of these technoweenies with their hands out. Peter Thiel needs a permanent vacation.
Isn't Elon Musk sort of in a tiff with OpenAI, and also seemingly very influential to Trump?
I feel like OpenAI is going to have to make some concessions to get favor from the Trump administration.
HAHAHA. Remember when Sam was absolutely frothing at the mouth to "regulate AI" two years ago?
> https://www.nytimes.com/2023/05/16/technology/openai-altman-...
> https://edition.cnn.com/2023/06/09/tech/korea-altman-chatgpt...
You see, American AI is going to take over the world. It's just that it's temporarily short of funds. I mean, GPUs. Uh, there are pesky laws in the way.
Totally not the fault of a gigantic overcommitment based on wishing, no.
Is it me or does it feel like most of what the federal government does nowadays is make it illegal for government to make things illegal?
I hate this game. I hate that Sam Altman publicly supported Trump (both financially and by showing up). Maybe I hate that he "had" to do this for the sake of his company, or maybe I hate that he _didn't_ have to do it and is a hypocrite. Maybe I just hate how easily laws can be shaped by $1M and a few nice words. Either way, I hate that it worked.
Tell you what: set up a federal-level online disclosure process of all the copyright-protected works used in training OpenAI, so the creators/rights holders can get equity (out of the pockets of the C-suite and board) by claiming their due, and we'll take you seriously.
All the profit and none of the liability is Coward Capitalism.
Related (adjacent content from the same report):
OpenAI urges Trump administration to remove guardrails for the industry (cnbc.com) - https://news.ycombinator.com/item?id=43354324
Maybe these idiot CEOs shouldn't have screamed from the rooftops about how they can't wait till AI lets them fire all the plebs; then maybe someone would actually care whether their company is over or not.
I know a lot of people will hate on things like this, but the reality is they are right that guardrails only serve to hurt us in the long run, at least at this pivotal point in time. (As a caveat, I don't personally like Trump.)
Yes it is a fact they did build themselves up on top of mountains of copyrighted material, and that AI has a lot of potential to do harm, but if they are forced to stop or slow down foreign actors will just push forward and innovate without guardrails and we will just fall behind as the rest of the world pushes forward.
It's easy to see how foreign tech is quickly gaining ground. If they truly cared about propping America up, they should allow some guardrails to be pushed past.
Steal content and then ask god for forgiveness. Works like a charm :)
All these whiney creatives who feel threatened just need to suck it up and deal with it. Even if they got their way in the US, another app in another country will just use their data without permission. All they are doing is ensuring those apps wouldnt be American.
If you can't play by the rules, don't play the game.
LLM race may be over, but the AI race surely isn't. My baby seems to have grown into a fully functioning intelligence without reading the entire content of the internet. AI is not equivalent to LLMs, silly, silly child.
Maybe in a present:
- Dominated by an intractable global manufacturer/technologist (China) that doesn't care about copyright
- Proliferated by a communication network that doesn't care about copyright (Internet)
and a future where:
- We have thinking machines on par with human creativity that get better based on more information (regardless of who owns the rights to the original synapses firing)
That maybe, just maybe, the whole "who should pay to use copyrighted work?" question is irrelevant, antiquated, impossible, redundant...
And for once we instead realize in the face of a new world, an old rule no longer applies.
(Similar to a decade ago, when we debated whether a warrant should apply to a personal file uploaded to a cloud provider)
For those who have used the image generation models and even the text models to create things, there is no way you can look at the Disney-look-alike images and NOT see that as copyright infringement...
>Chris Lehane, OpenAI's vice president of global affairs, said in an interview that the US AI Safety Institute - a key government group focused on AI - could act as the main point of contact between the federal government and the private sector. If companies work with the group voluntarily to review models, the government could provide them "with liability protections including preemption from state based regulations that focus on frontier model security," according to the proposal.
Given OpenAI's history and relationship with the "AI safety" movement, I wouldn't be surprised to find out later that they also lobbied for the same proposed state-level regulations they're seeking relief from.