
Discussion (204 Comments) | Read Original on HackerNews

tombertabout 3 hours ago
When my sister and I would play Monopoly as kids, we had lost the manual, so whenever we didn’t like the outcome of whatever happened, we would make up rules about what was right. Technically, then, it was very easy to stay compliant while still doing well, because we could rewrite the rules.

Also, since I was older I feel like I was able to get away with those redefinitions a lot more often…

caycep5 minutes ago
I was going to post about whether there were still "laws" in the US, but this post gets the point across much better
smallmancontrovabout 2 hours ago
The word "lawful" always seems to get dragged out when people in power are doing some especially heinous rulemaking, like throwing a hissy fit over a single company trying to voluntarily draw a line at domestic surveillance and fully automated killchains.
WarmWashabout 1 hour ago
Anthropic wanted the ability to verify compliance whereas OAI and Google are fine with "trust us". Which is how it always is, and always has been.

For better or worse, the government is the one who audits, and it has its own internal systems for self-audits. So no one except them tells them what they can or cannot do. The government would never put itself in a position where civilians died because Amodei didn't like the vibe of the case being worked.

In a way it's wild that people are upset that the government didn't put a billionaire megacorp CEO in the driver's seat of intelligence.

trhway5 minutes ago
>So no one except them tells them what they can or cannot do.

You're missing the "laundering the responsibility" approach - find a lawyer who writes that the thing is legal in his opinion, and voila.

bkoabout 1 hour ago
A private corporation can choose not to sell to the government. A lot of them do exactly this. A lot of hoops to jump through.

However, if they do sell to the government, they shouldn't have some sneaky way to exert control over decision making using their products. We're a country of laws, and for better or for worse, these laws are made by elected officials and those appointed by elected officials.

Why an American company wouldn't want American defense to have the most capable tools at their disposal is a different matter altogether, but here we are.

joshuamortonabout 1 hour ago
> they shouldn't have some sneaky way to exert control over decision making using their products.

Why not? Many companies have all sorts of rules you agree to when using their products, including rules against many legal ("lawful") things. Are you saying that the government as a client should be unbound by contractual obligations that apply to other clients?

tombertabout 1 hour ago
This administration has made it very clear that they will do what they can to change laws whenever convenient, without congressional oversight, whether or not they are "allowed" to.

Trump implemented tariffs he wasn't allowed to immediately, he started a war he probably wasn't allowed to in order to (allegedly) distract from associating with a pedophile, he wrote an executive order trying to undo the fourteenth amendment, he has actively been abducting and imprisoning lawful residents (and even citizens!) and actively pushed for racial profiling to do so.

If a company feels like the government will simply rewrite the laws in order to advance any kind of political whim (including to be weaponized against that very company!), it's not wrong or even weird for them to want to add safeguards to their product.

To be clear, this isn't weird or uncommon. Lots of the stuff you sign in a EULA isn't about preventing things that are "illegal".

bkoabout 1 hour ago
I'd prefer our elected officials own the manual, accepting the fact that [person I don't like] could be in power and re-write the rules, than a private billion-dollar corporation. Especially when it comes to defense.
mc32about 1 hour ago
Ha! If Congress did diddly squat about being eavesdropped on by organizations that aren't supposed to spy on citizens back in the Obama days (we also spied on allies' governments, but that's kind of what all of them do), there is no hope of them reining things back at all… for mere hoi polloi.
bkoabout 1 hour ago
I guess we have to appoint Amodei and Altman as our benevolent dictators to keep Congress in check!
cucumber3732842about 3 hours ago
The big reason it's "obvious" when tech megacorps do it is because big tech is new to the game and doesn't have an existing regulatory capture system already up and running and legitimized like medical, civil engineering, energy, agriculture, chemical, etc, do.

If this were 3M making nasty stuff for Northrop to put in bombs and drop on brown people or Exxon scheming up something bad in Alaska or bulldozing a national park for solar panels or some other legacy BigCo doing slimy things that are in the interests of them and the government but against the interest of the public they'd have 40yr of preexisting trade group publications, bought and paid for academic and media chatter, etc, etc, that they could point to and say "look, this is fine because the stuff we paid into in advance to legitimize these sorts of things as they come up says it is" though obviously they'd use very different words.

GeekyBearabout 1 hour ago
> big tech is new to the game and doesn't have an existing regulatory capture system already up and running

The career officials in the Obama FTC started proceedings for an antitrust lawsuit against Google over a decade ago.

The political appointees (of both parties) shut it down.

It seems to me that regulatory capture has been working for Google for some time now.

WarmWashabout 1 hour ago
Google has a monopoly because of the internet's insistence on ad blocking, and outright indignant refusal to dare pay a greedy company for thinking they could ask for money for a "free" web service.

It's basically impossible to get off the ground competing against Google when 30-40% of people are just freeloading on your service, and 80-90% think the internet is an ethereal realm that everyone could have ad- and subscription-free access to if we could only agree to starve these greedy middlemen.

tombertabout 1 hour ago
I mean it's basically an extremely high-stakes version of the (possibly apocryphal) Upton Sinclair quote: "It is difficult to get a man to understand something, when his salary depends upon his not understanding it."

Most people (at least the people I've talked to, which admittedly is somewhat of a lefty bubble but I think even more generally) agree that companies getting to or close to "monopoly" status is a pretty bad thing, and that they should be broken up. Political candidates get a lot of social credit for claiming that they're going to do exactly that. The moment that they actually get into a position where they actually could do something about it, they suddenly remember who their campaign contributors are, and can then create reasons to avoid actually solving any of these problems.

Very occasionally we have successes in this field, like the breakups of Standard Oil and AT&T, but of course both of these became toothless since we basically allowed the pieces of both companies to re-acquire each other and recreate the same problems again.

There are similar reasons why politicians will occasionally push for regulations barring themselves from investing in companies their policies affect, which somehow never manage to get through.

Politicians are very rarely punished for breaking political promises, but often rewarded for making the promises. They are also rewarded by their corporate overlords for breaking these promises.

SecretDreamsabout 2 hours ago
> If this were 3M making nasty stuff for Northrop to put in bombs and drop on brown people or Exxon scheming up something bad in Alaska or bulldozing a national park for solar panels or some other legacy BigCo doing slimy things that are in the interests of them and the government but against the interest of the public they'd have 40yr of preexisting trade group publications, bought and paid for academic and media chatter, etc, etc, they could point to and say "look, this is fine because the stuff we paid into in advance to legitimize these sorts of things as they come up says it is" though obviously they'd use very different words.

My friend, this paragraph needed some periods. I could not follow what you were trying to say - but it seemed interesting enough to consider retyping.

Vachyasabout 2 hours ago
Good comment, and I agree lol

I read it twice (admittedly quickly) but couldn't grasp the point even though I felt like it was there.

anematodeabout 3 hours ago
Who could have seen this one coming. From yesterday: https://www.cbsnews.com/news/google-ai-pentagon-classified-u... ("Hundreds of Google workers urge CEO to refuse classified AI work with Pentagon").

Any AI researcher who continues to work here is morally compromised.

orochimaaruabout 3 hours ago
Why is it morally wrong for a US citizen to work with their government?
finghinabout 2 hours ago
The acts of the government being wrong in an upsetting amount of cases would be a big reason.
tyreabout 2 hours ago
It’s not, but legal is not the same as ethical.

For a long time, and probably still, it was legal for the US to torture enemy combatants. It was never ethical.

rob74about 2 hours ago
If you add to that the very broad limits of what the current administration considers "legal" (as in "pretty much anything we want to do"), I can understand feeling uneasy as a Google employee...
gigatreeabout 2 hours ago
You’d need some shared ethical/moral framework to make that claim, which doesn’t really seem to exist anymore
fookerabout 2 hours ago
Because, we have pretty convincing historical precedent that 'just following orders' does not work as a defense when your government does something indefensible.
ReptileManabout 1 hour ago
Worked out just fine for the paperclip guys.
josefxabout 2 hours ago
What makes you think that Google's AI experts are US citizens?
hashmapabout 2 hours ago
Working to directly advance a product used substantially to oppress people via surveillance or war crimes, when you have many other choices, is immoral. Easy.
_vertigoabout 3 hours ago
It’s not morally wrong per se, but just because you are working with your government does not mean what you’re doing is necessarily moral
cooper_gangliaabout 3 hours ago
Just because you are working with your government does not mean what you’re doing is necessarily immoral, either.
SauciestGNU41 minutes ago
Because the government is comprised of Nazis now and is waging wars of expansionist conquest abroad and murdering domestic dissidents at home. Anyone working toward enabling that deserves to be on the receiving end of the systems they build.
IshKebababout 1 hour ago
Because their current government is immoral.
unethical_banabout 2 hours ago
Are you intentionally lumping in all civic service in one moral bucket? Is working at the post office morally equivalent to developing panopticon technology to suppress protest and track citizens?
hawk_about 1 hour ago
Sorry to Godwin the thread but the Third Reich would like a word.
mattnewtonabout 2 hours ago
Idk about morality, but it’s certainly a way to stop dystopian mass surveillance nightmares if everyone capable of building one refuses.

So if you live in the US and don’t want one government agency in the US to have this power (that is ambiguous under current law), one way you can try to avoid it is by refusing to sell it to them and urging others to do the same.

It’s a long shot, sure, but it certainly seems more effective than hoping the legislature wakes up and reins in the executive these days.

psychoslaveabout 2 hours ago
Given most governments' policies and direct engagement in all kinds of monstrosities over the last millennia, there is really no reason to limit the case to the USA, indeed.
tastyface35 minutes ago
Because the current government is a vindictive, murderous, proto-fascist government. (But you know that already.)
declan_robertsabout 3 hours ago
Thankfully Russia, China, etc. have the same qualms as we do in the United States and will refuse to send their brightest engineers to work on weapons so they don't become "morally compromised"!!!
titzerabout 2 hours ago
I don't think the long-term game theory of race to the bottom works out quite how you think.

"Our enemies would have no qualms building a weapon that will end life on earth! We better build it first because we're the good guys!"

declan_robertsabout 2 hours ago
Listen to this guy! [in Russian in the original]
yibgabout 1 hour ago
We also used to point to Russia and China as places we don't want to copy.
notJimabout 2 hours ago
This was the same logic that was used when building nuclear weapons, and many of the scientists involved in that tried to find a different path (most notably Niels Bohr). I think we would be in a much better world if they had been successful. It's good that we're trying again w/ LLMs.
mvelbaumabout 1 hour ago
it's a good thing nobody listened to them because it's supremely retarded.
genxyabout 1 hour ago
People in those countries do have qualms, they are people after all and they choose to work in other fields.
tensorabout 2 hours ago
The US is sure becoming an unfree scary place just like Russia. Keep it up following those role models!
gambitingabout 3 hours ago
I don't know if you're being sarcastic (sounds like you are!), but indeed a lot of engineers left Russia after the war in Ukraine started, as they didn't want to be drafted and didn't want to contribute to the war effort in some way, even if indirectly. Of course, many stayed or even willingly helped. See how many engineers from Iran work abroad too, for moral and other reasons.

The point is - this happens everywhere, it's not just some weird western thing.

JeremyNT19 minutes ago
Also yesterday, on Brin getting cozy with this administration:

https://www.nytimes.com/2026/04/27/us/politics/sergey-brin-g...

tjwebbnorfolkabout 3 hours ago
Why is it morally compromising to work with the military of the country you live in?
plaidthunderabout 3 hours ago
I'm not anti-military as a rule but... c'mon. Opinions on the US military vary.

In extremis, were the people working for Pol Pot just good patriots with no moral culpability?

We could surely at least agree that there are cases where working for the military of your home country doesn't fully excuse you from your actions.

In fact, I think international tribunals have existed which operated on just those principles.

mrexcessabout 3 hours ago
We can all agree that working for the Nazi government’s military would be morally compromising, right?

You propose that other governments' militaries would not be so compromising. Seems reasonable.

But the question then becomes, what is the operative distinction between the two?

boringgabout 1 hour ago
Why is it that this line item comes up EVERY TIME an article comes out, in a knee-jerk reaction? It's so incredibly absolute:

"Any AI researcher who continues to work here is morally compromised."

It feels like a constant campaign and the posters seem so incredibly self righteous and unthoughtful.

crumpledabout 1 hour ago
Probably because the articles are talking about how the AI will be used in immoral ways, and that the people who know that and continue doing the work must be morally compromised.

I know that there might be $several ways those highly-paid engineers might still rationalize their work. Some of them might have ideological reasons to treat entire classes of people as unworthy of life. Within the model of their ideologies, the most evil things might be perfectly moral.

I wonder what reasons you have to disagree with people's moral stance against using AI as a weapon.

devinabout 3 hours ago
That's what the 7 figure salaries are for.
testfrequencyabout 3 hours ago
It’s funny to me how many progressive people I know and am friends with, from marginalized demographics (trans, gay, Latino, Black), work at these AI companies.

Still have faded Bernie stickers on their cars, No Kings organizers, “fuck SF I’m in the east bay for life fuck tech” - and you all make 7 figures Monday through Friday by supporting the death of society and democracy.

I don’t dare say anything though because “money is money”, the bay is expensive..but I do sure as shit judge every single person I know who joined OAI, Anthropic, Google, and Meta.

foobar_______about 2 hours ago
Preach. The hypocrisy is startling. I think people started at these companies maybe years ago with "good intentions" and are willing to turn a blind eye. But now, given just how glaringly clear it is, I don't think it is really excusable anymore. To be clear, people can work wherever they want, including these companies, but what kills me is the hypocrisy. They are pathological liars to themselves if they somehow think they aren't complicit.
beernetabout 3 hours ago
Agreed. Just shows that big money doesn't dilute small character.
site-packages1about 3 hours ago
I would suggest looking inwards if this is how you really feel.
robrenaudabout 1 hour ago
Is every American tax payer morally compromised?
eks39118 minutes ago
Yes ;)

I agree with the intent of your rhetorical question, so I'm jesting with you. I'm justifying my "yes" with the hopefully humorous distraction that every person, including American taxpayers, has at some point made a nonsustainable/selfish (my definition of immoral) decision.

thisisauseridabout 2 hours ago
I agree that it is immoral to obey some laws. Which ones are you saying are immoral here?
ddtaylorabout 1 hour ago
An AI researcher can work anywhere they want, can't they? At the minimum they could work in a different field entirely. It seems like a false dichotomy to frame the question around laws.
ReptileManabout 1 hour ago
Morality is relative and malleable. And usually people are quite good at claiming that whatever suits my agenda is moral.
2OEH8eoCRo0about 3 hours ago
Is it any less moral than surveilling your neighbors and/or turning your neighbors against each other with social media?
mvelbaumabout 1 hour ago
Any AI researcher who refuses to support his own country in a technological arms race is morally bankrupt, foolishly naive, and does not deserve to enjoy the way of life created for him by those who sacrificed their lives.
site-packages1about 3 hours ago
> Any AI researcher who continues to work here is morally compromised.

Arguably it's exactly the opposite. In the same way we ask billionaires to pay their taxes because the regulatory regime is what allowed them the structure to make their billions in the first place, the national security of the country the AI researchers are in is what allows them to make a vast salary to work on interesting, leading edge capabilities like AI. They should feel obligated to help the military.

sailfastabout 3 hours ago
This all works if you assume that any action the government takes must be “lawful”. The assumption here is that the Pentagon is obeying the law and any unlawful use would go through normal reporting / violation channels - same as any illegal order or violation or whistleblower report.

The Pentagon does not want Google or anyone else deciding what they can and cannot use their AI for. They’re saying we won’t break the law, and that should be enough for you - pinky swear!

And that seems to be enough for Google. Though I might request some agentic auditing capability to verify, rather than taking them at their word.

Next step: is Google FedRAMP'd yet for this and for classified enclaves? Or do they also go through Palantir's AI vehicle?

gwbas1cabout 1 hour ago
I look at this as a case of "pick your battles."

In war, the civilians can't audit every move of the military. (It's impractical, both for reacting timely, and for keeping secrets from the enemy.)

If the military doesn't work with Google, they will work with someone else who might not put the same amount of pressure on the military about the practical limits on AI. Or, even worse, our enemy might use a significantly better AI that we do.

My hope is that "war" shifts to AI vs AI, machine vs machine. Calling people who work on AI for wartime purposes immoral is fundamentally immoral when AI in war replaces the need for human casualties.

mitthrowaway2about 1 hour ago
As a private contractor, you can sign a contract to deliver pizza or bandages to US soldiers, but also put into the contract that you won't deliver lethal weapons, if that's your own ethical stance. You don't need to audit every move of the military, just the stuff you're doing at their request.

And sure, maybe that just means the military decides to take their business elsewhere. But if you have confidence that your service is the best, then you sell based on that.

eks3918 minutes ago
I think you and your parent have great arguments. Your pizza deliverer chose his battle, which was to only deliver pizza, not materiel, and is commendable. Your parent seems to want to delegate death from humans to AI, which seems to me like a simplification that won't turn out exactly like that, but the premise of deciding whether that is a battle to pick is valid. If you want to start blurring the lines between the analogy and literality, if you choose to pick every battle to fight, there's not enough human bandwidth to do it all, and delegation to AI could be helpful. That last sentence is more loose, so I won't defend it, but I couldn't help not making a tie between picking your battles and literal battles. Perhaps a form of dark humor there.
ceejayozabout 3 hours ago
Who defines "lawful" if Google and the Pentagon disagree?

> The classified deal apparently doesn’t allow Google to veto how the government will use its AI models.

Seems concerning?

CobrastanJorjiabout 3 hours ago
That's presumably the trick, and it's not a subtle one; it's why the article puts it in quotes in the headline. Google gets to claim that it stood up for principles because it boldly insisted that the government obey the law, and the government will claim that whatever it decides to do is lawful. It's the same as what OpenAI did, except not handled buffoonishly.
f33d5173about 3 hours ago
Lawful is presumably defined in the usual, common sense, i.e., we can do whatever the f we want until a court physically forces us not to.
dmdabout 3 hours ago
And since the court has no way to physically force anything - that's the executive branch's function (it's right there in the name) - lawful has no meaning whatsoever if it's the executive branch that wants to break the law.
muvlonabout 2 hours ago
And the Pentagon has historically gotten away with damn near everything even in the judicial branch by appealing to national security.
impulser_about 2 hours ago
No it doesn't at all. Private corporations shouldn't be telling the government what it can and can't do. That's the job of the people. You want a private corporation overriding your vote?
xp84about 1 hour ago
Agree. It seems convenient on the surface right now, when people think the company (or rank and file employees?) are on their political “team”, but they’d get less comfortable when oil companies or other “bad” companies dictate terms to the government. “We’ll provide fuel for the military if and only if you overthrow the leader of $COUNTRY”

(Yes, I recognize that past military entanglements do read as favors for Big Oil, but that’s more because lobbyists directly purchased the corrupt and useless Congress)

ceejayozabout 1 hour ago
> “We’ll provide fuel for the military if and only if you overturn the leader of $COUNTRY”

A mechanism to address this exists, though.

https://en.wikipedia.org/wiki/Defense_Production_Act_of_1950

ceejayozabout 2 hours ago
> Private corporations shouldn't be telling the government what it can and can't do.

So Google can't tell the government it needs a warrant to perform a search? Google can't sue over something the government did?

It's Google's product they want to buy.

serial_devabout 2 hours ago
Just follow the orders, man!
impulser_about 2 hours ago
I'm talking about lawful, like it's written in the terms.
yibgabout 1 hour ago
Of course it can. Terms of service and contractual obligation (should) apply to governments as well. Google is perfectly capable of outlining what's acceptable use and what's not, and the government is free to accept or reject and not use the product. Google is choosing not to set the boundaries.
tdb7893about 3 hours ago
Especially concerning given how creative the executive branch can be about what laws mean. With little oversight, it seems guaranteed that it will be used for unlawful activities (despite whatever tortured argument some lawyer will have put into a memo somewhere).
xp84about 1 hour ago
Yeah, they’re really bad! Seems like it might be time to try convincing people to vote for someone else! Democrats haven’t tried that play since 2012, preferring the “scorn and insult anyone outside your base” strategy that’s worked so well since.
kingleopoldabout 2 hours ago
"Who watches the watchmen?"

A question as old as time itself.

cooper_gangliaabout 2 hours ago
Google should never be determining what is lawful or not.
belzebubabout 3 hours ago
There's big air quotes energy in their statement
ethagnawlabout 3 hours ago
The classified aspect is probably the most concerning. How can I write my representative (and expect a form letter response six weeks later) if I don't know what I'm objecting to or even if I should be objecting?
cooper_gangliaabout 2 hours ago
Why would you write a letter if you don't know what you're objecting to or even if you should be objecting?
ceejayozabout 2 hours ago
Can't I object to not knowing?
ethagnawlabout 1 hour ago
That's kind of my point? I'm concerned by what has been made available but can't form a complete opinion and decide if I need to take action without knowing the full extent of the agreement.
dismalafabout 1 hour ago
By definition "the law" is the set of laws that the government passes. So it's a roundabout way of saying the government can pretty much do what they want.

Also, this is probably the only acceptable arrangement when it comes to industry-government contracts. The government will always have more information than civilians.

jonathanstrangeabout 2 hours ago
One thing is sure, they don't have international law in mind...
ApolloFortyNineabout 3 hours ago
This has to be one of the strangest "debates" in history.

Congress and the courts obviously.

If you think there's a hole in the law, tell your congressman; don't, for some reason, try to put Google or any AI company above the government.

ceejayozabout 3 hours ago
> Congress and the courts obviously.

The first is fully neutered. The second is far too slow.

"Nothing unlawful" needing to be in the contract is inherently concerning, as it's typically the default, assumed state of such a thing.

deepsunabout 3 hours ago
"follow the law" in contracts IMO is there to be able to claim a "breach of contract" by one party.
calgooabout 3 hours ago
Please! That ship sailed a long time ago. Sure, tell your congressman, who is most likely bribed (lobbying is bribing, let's use the real words) by the same companies to accept the deal. The courts can try, but who is going to enforce it when the people above say that it's fine?
shevy-javaabout 3 hours ago
It kind of reminds me of a mix of Skynet in Terminator and Minority Report. But nowhere near as interesting. More annoying than anything else.

I am kind of mad at James Cameron here. Skynet was evil but interesting. Real life controlled by Google is evil but not interesting - it is flat out annoying.

pkilgore26 minutes ago
No remedies, no right.

What are the consequences of breach? Otherwise, Americans' only use for this is to wipe their asses, and only if they can find a paper version.

hgoelabout 3 hours ago
How well does this hold up under legal scrutiny when previous actions indicate that the Pentagon would retaliate against Google if they didn't accept this "lawful use only" farce?

Could Google back out of this agreement later by arguing that they were coerced?

Not trying to suggest that Google would be opposed to doing evil, but curious about how solid this agreement would be in practice.

john_strinlaiabout 3 hours ago
There is zero reason that the definitions of 'lawful' for the purposes of these agreements should be classified.
svachalekabout 3 hours ago
There's a reason, you just won't like it.
jlduggerabout 1 hour ago
"When the president does it, that means it is not illegal" -- a former president
mvelbaumabout 1 hour ago
"Turns out I'm really good at killing people. Didn't know that was gonna be a strong suit of mine." - Nobel Peace Laureate Barack Obama
jldugger27 minutes ago
I'm not sure what point you're making, or whether it actually contradicts my own.
kbelderabout 1 hour ago
That's how I'd like Google to behave in regards to dealing with me.
franciscator44 minutes ago
Do not get distracted, that technology is used to kill people.
ripvanwinkleabout 2 hours ago
One observation.

Having your work used by the govt in ways you disagree with feels similar to having your taxes used in ways you disagree with.

When you pay taxes, you have no say in the bombs acquired with them or where they are dropped. The latter, though, doesn't seem to provoke the same pushback.

dmitabout 2 hours ago
> When you pay taxes you have no say in the bombs acquired with that and where they are dropped.

Vote in elections, local and general.

jMylesabout 1 hour ago
> When you pay taxes you have no say in the bombs acquired with that and where they are dropped. The latter though doesn't seem to provoke the same push back

Indeed - paying "taxes" to a murderous entity is a horrible affront to morality and humanity. We do it because we're terrified; we are not perfect moral creatures. But we still know it's wrong.

Barrin92about 2 hours ago
You answered your own implicit question. You have a choice who you sell your work to; you don't have a choice what your taxes do. Seems pretty straightforward why the former elicits more pushback. The government forces you to pay taxes; it doesn't force you to build them tools of surveillance or weapons.
ripvanwinkleabout 2 hours ago
If the feds are a sufficiently large market, your viability as a business might depend on keeping them happy.

Btw, I am not making a judgment call on the AI usage issue itself, just saying that this and taxes are more equivalent than it might seem.

Barrin92about 1 hour ago
>IF the feds are a sufficiently large market your viability as a business

Sure, if you're Lockheed you might be screwed, but that's not the case for Google. Military contracts, or even government contracts as a whole, are a tiny fraction of the King Kong-sized gorilla that is Google.

The fact that Anthropic puts up a fight but OpenAI/Microsoft and Google don't, I find hard to characterize as anything other than pathetic. These guys could, if they wanted to, afford a lawyer or two to push back on the administration. They do that pretty successfully with their taxes in most places, btw.

flufluflufluffyabout 3 hours ago
> We remain committed to the private and public sector consensus that AI should not be used for domestic mass surveillance or autonomous weaponry without appropriate human oversight.

And starts the lying to our faces. The public and private (from your own employees!) consensus is that it should not be used for those things at all, regardless of “human oversight.”

calgooabout 3 hours ago
I hate this part: `domestic mass surveillance`

So the rest of the world is fine to spy on; it's the domestic part they don't agree with. So go on, destroy lives all around the world, helping the powers that be build the fascist state. It's fine to use Gemini to tell you what building to blow up; it's fine for Gemini to wrongly identify people and cause hundreds or thousands of deaths by telling the military who to attack.

ethinabout 2 hours ago
The fundamental problem with these "agreements" is that they are utterly nonsensical as written. Google has one idea of "lawful" and what it means; the Pentagon most definitely has a vastly different interpretation meaning "whatever we want". These companies make these agreements because they do not understand (either deliberately or just by the factor of them not understanding the intelligence sector) that when the intelligence community says "we will only use this for lawful purposes," what they are really telling you is something very, very different. With entities like the Pentagon your agreements should probably both define what "lawful" really means and should provide as few ambiguities as you can manage. Ideally you'd provide zero ambiguities but I'm not sure that's achievable in practice.
Havocabout 1 hour ago
And in love and war all is fair...

Reality is this ship sailed once the US/Palantir rolled out AI target selection

ctothabout 3 hours ago
Huh. I never realized the T-800 runs on Android. Makes sense, I guess.
chabesabout 2 hours ago
Snakes. All of them
CrzyLngPwd28 minutes ago
Meh.

Lawful didn't stop Project MKUltra, or attacking countless countries, or overthrowing countless governments, or murdering countless people, or kidnapping people and torturing them, or...

The USA can do anything it wants, to anyone, any time.

Imnimoabout 3 hours ago
Unsurprising from Google, but still bad. If Google has no right to object to a particular use, this is equivalent in practice to "any use, lawful or not".
anygivnthursdayabout 3 hours ago
Is Iran already a vibe war, or are those just coming?
mullingitoverabout 3 hours ago
Reminder that this administration has some absolute howler theories about what constitutes lawful behavior[1].

[1] https://www.nytimes.com/2025/09/20/us/politics/tom-homan-fbi...

OtomotOabout 1 hour ago
"Lawful" means nothing but "according to law", which is a meaningless statement...

Remember that even the Third Reich had laws!

qzncabout 3 hours ago
And that is news-worthy because unlawful use is normal?
HNisCISabout 2 hours ago
Refusing to participate WORKS.

I've had the unfortunate experience of working at a startup that started courting some autonomous weapons companies and HOLY SHIT were they the bottom of the barrel. Levels of incompetence you wouldn't believe, just good ol' boys who wanted to play with energetics. Then the company I was working for also hemorrhaged all their top engineers because they found the work unsettling.

The takeaway is that your refusal to assist these shitheads does have an impact, they have to pay more for talent and they have a much harder time courting good talent.

threeptsabout 2 hours ago
and the pentagon determines the law?
jcgrilloabout 3 hours ago
It's pretty funny how these guys are all becoming some kind of internet version of, like, Halliburton. It seems pretty desperate. B2C and B2B applications didn't pan out I guess?
zarzavatabout 3 hours ago
It's one of two identified uses for AI that is profitable today: writing code and blowing up schools. They are desperate to show the market that the technology is anything more than a money pit.
ctothabout 2 hours ago
The thing is we're in a new Cold War, and most of our adversaries have gotten the memo and most of us ... haven't. Yes, becoming a new Halliburton is a rational move if you see the board right now. I don't like it even one tiny bit.
a456463about 1 hour ago
"I don't like it even a tiny bit. But other people are doing it, so Imma go full steam ahead."

This is exactly what got us here.

morkalorkabout 3 hours ago
Will lawful use be determined in secret courts a la NSA and FISA?
Sanzigabout 3 hours ago
Doubtful it will even get that far; the DoJ will simply draft an appropriate fig-leaf memo with a predetermined conclusion and the government will plow on ahead.

https://en.wikipedia.org/wiki/Torture_Memos

stephbookabout 3 hours ago
They simply say they have that memo. Who knows whether they even drafted it for real? And if anyone starts looking, Gemini can quickly draft one itself. Nice!
vrganjabout 3 hours ago
Don't be silly.

"When the president does it, that means that it is not illegal." - Richard Nixon

kentmabout 2 hours ago
Also the Supreme Court, half of Congress, and apparently something like 40% of the American populace.
joering2about 2 hours ago
They sign a contract for any lawful use?? Can you sign a contract with the US government for some unlawful use??
cdrnsfabout 3 hours ago
Lawful is meaningless in the context of the Trump administration. Should Google waver (which they won't), they'll be declared a supply chain risk or otherwise bullied into submission.
Ritewutabout 2 hours ago
Google holds immense power in their position. Trump can make their life very difficult but Google can make life for Trump very difficult as well. They have no need to kneel, they are choosing to.
threeptsabout 2 hours ago
Google simply cannot justify this power struggle; just because it can doesn't mean it will. It got to the top by kissing the ring, and that's how it stays at the top.
Ritewutabout 1 hour ago
I know. I'm noting that Google could fight this. They just won't.
f33d5173about 2 hours ago
what immense power?
Ritewutabout 2 hours ago
You don't think Google having control over the most used email, most used browser, most used search engine, most used video website, and most used phone OS gives them immense power?
jMyles43 minutes ago
These deals / arrangements / affronts / conspiracies will continue as long as there are sums of money too large to say no to.

It's so unbelievably obvious at this point that the Pentagon, and everything like it across the globe, needs a deprecation plan. We don't need these massive states anymore for security or regularity; we can communicate around the world at the speed of light and bypass their notions of how we're supposed to relate to one another.

Enough is enough. Spin down the nukes. Bring home the ships. Send the money back.

psychoslaveabout 3 hours ago
Do no evil. Well, don't do anything illegal, at least. I mean, let's not do anything different from whatever we wish at the moment.
threeptsabout 1 hour ago
Evil to them is not making money. It's pretty subjective.
Brian_K_Whiteabout 3 hours ago
What a handy word "lawful".
shevy-javaabout 3 hours ago
The beginning of Skynet 6.0.
mattdeboardabout 2 hours ago
"don't be evil"
grafmaxabout 2 hours ago
There's a lot of money in genocide.
SpicyLemonZestabout 2 hours ago
As a big critic of the OpenAI deal, this kinda sounds like a nothingburger to me. Of course Google doesn't get a veto on operational decisions; no customer would ever agree to such a thing. The problem with OpenAI was that they took advantage of Anthropic standing their ground to wedge their way in, which was both bad on its own terms and raised serious concerns about whether they're being honest on the real terms of the deal.
vrganjabout 3 hours ago
See also: https://en.wikipedia.org/wiki/IBM_and_the_Holocaust

Capital and Big Tech have always been opportunistic enablers, not principled actors. Corporate Values have always been nothing but internal propaganda. "Don't be evil", what a farce.