Discussion (75 comments). Read the original on HackerNews.
[1] https://tritium.legal
https://knowledge.workspace.google.com/admin/gemini/ai-ultra...
Seems false.
When I first started, the environment you used depended entirely on the language. In the C++ and Python space there was the vim and emacs divide. With Java it was more complicated: some still used vim/emacs, but a lot of people used Eclipse.
Now Eclipse was a real problem at Google because of the source control system. Java IDEs are primarily built to import binaries, specifically jars. In the outside world, these dependencies are managed via Ant (very early days), Maven/Gradle or the like.
At Google there's a mono-repo (Perforce/Piper) and you check out parts of it locally and rely on the rest via a network connection (to SrcFS IIRC, it's been a while). This was neat because you could edit a file locally and the dependencies would just recompile (via Blaze).
So for Eclipse a whole lot of initialization had to be done and the IDE would fall over. A lot. It had a team of ~10 working on it at one point. Then somebody did a 20% project called magicjar. Magicjar took a Perforce client and built all the dependencies as jars that could be imported directly without parsing the entire source tree (which was usually huge). This made it possible, even preferred, to use IntelliJ, which is what I did. Magicjar was great.
Other people actually made CLion work reasonably well with C++ too. That was nice. This was a much bigger undertaking with many more corner cases just given how C++ works (i.e., headers and templates).
So checking out a client was relatively heavyweight, even with a minimal local tree. And, if you worked on Google3, you had to do this a lot. You might need to do a config file change. This was the real starting point for Cider because it was way nicer to do config file changes with it.
Obviously I don't know where all this went from there. VS Code as a Cider frontend? Ok, that was news to me. Engineers being unhappy when things change and when the slightest thing works differently is the least surprising thing I've ever heard.
Oh it's worth adding that in my time many people didn't use Perforce (P4) directly. They used somebody else's project, which was a Git frontend for it, called Git5. I believe it was already being deprecated while I was still there. But Git5 modelled a P4 change as a branch so you could play around with your Git commits locally and then squash them into a single P4 change. I actually liked this a lot.
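Git5's actual commands were internal, but its branch-per-change model can be approximated with stock git in a throwaway repo (file names and messages here are made up for illustration):

```shell
# Approximate the Git5 workflow with plain git: one branch per Perforce
# change, local commits as scratch work, squashed into one change at the end.
cd "$(mktemp -d)" && git init -q -b main
git config user.email dev@example.com && git config user.name dev
echo base > notes.txt && git add notes.txt && git commit -qm "base"

git checkout -qb my-change           # one branch <-> one Perforce change
echo draft >> notes.txt && git commit -qam "wip: first attempt"
echo more  >> notes.txt && git commit -qam "wip: second attempt"

# Squash: keep the final tree but collapse the scratch history into the
# single commit that would have been submitted to Perforce as one change.
git reset --soft main
git commit -qm "one squashed change, ready for review"
git rev-list --count main..my-change   # prints 1
```

The `reset --soft` trick moves the branch pointer back to `main` while keeping the working tree and index, so the follow-up commit contains the net effect of all the scratch commits.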
The history of Google's relationship to version control is even more interesting than the editors. It went from CVS in 1998 to Perforce (P4) in 2000, then gcheckout and g4 in ~2006; OverlayFS was invented in 2008, git5 came out in 2009, CitC obsoleted OverlayFS in ~2012, and Piper built this all into the VCS in ~2013-2014. While I was gone from 2014-2020 we apparently got hg and jujutsu frameworks, and when I got back in 2020 you'd just check out a .blazeproject from your IDE and everything would magically work. Many of these started as 20% projects (I used to have lunch with the guy who invented OverlayFS; interesting character and one of the best programmers I knew) and then got folded into the "official" way of doing things once grassroots adoption showed the execs that this was how people really wanted to work.
I do think Google will continue to get results out of their tooling, as long as they are investing in the tooling. But that is not zero cost. Is it worth it for what they are doing? Largely seems to be.
But it isn't like they are that much more successful at software projects than any other company? They are still largely an ads company, no?
They have a ton of other software in 2026. And they have a pretty diverse (and diversifying) income stream today. Like 30-40% from non-ads.
Is it worth it? That’s for them to say, but they can ramp up cloud services at scale pretty fast as a core competency.
So, sure, lots of spots for software there. But still nothing that would make me think of them as a software company. Or, worse, a lot of software that I don't have a strongly favorable view on. :D
Sure, the money is mostly in ads, but serving searches, AI, youtube, and all the rest at the scale Google does it requires a technical tour-de-force. Does Google do it better than everyone? Absolutely not. But it does it better than many.
Certainly it isn't the _only_ way to do it--other companies also manage to do it. But not all that many at the same scale. It's an existence proof that you can.
Consider that they spend more on building up and supporting this central IDE than most companies could dream of losing in productivity by not having it.
I re-read this several times trying to figure out where the irony was hidden. But... it's not there?
So, again, are they that much more successful at software than other companies? They have more hilarious flops than any other company.
Don't get me wrong. I still use some of the stuff. I don't hate them. I don't even think they are particularly bad at things. I just don't think they are any more successful than other software companies. Specifically at the software side of it.
The aspect I miss is the distributed compilation hinted at in the article. I remember using distcc and the like back at the end of the 1990s, but that never seemed to happen in the Java world, and tooling like Maven is structured to make everything one long dependency chain. Shame.
Our Bazel system is full of custom Skylark (now Starlark) code, so understanding the build means effectively reading a bunch of ad-hoc code written with varying degrees of competence and with confusing dependencies. I'm kind of ashamed I don't have a deep understanding of a tool I use daily, but every time I try reading the documentation I quickly give up.
The article is framed around "all Googlers" but there is still a very large contingent of Googlers who cannot use these tools.
For anything with native UIs, I suppose you could "remote desktop" into an app or a simulator running in the cloud but at that point you might as well run that locally and cut out all the issues introduced by networking.
I'd like to hear the perspective of the developer/user; the IDE provider has some incentive to take credit and imply high utilization reflects success rather than Google policy.
I'm interested in how tooling conditions developer expectations more broadly. I'd love to see a comparison of Linux OS development (all local+open+git, open but contributor hierarchy) vs Google (monorepo+required tooling, pre-allocated authority) from someone who's done both.
I don't know which team that was, but to add to that, official support for IntelliJ at Google started quite a bit earlier. I was the second person to join a team writing IntelliJ plugins. We wrote a Blaze plugin not too long after Blaze launched, as it was becoming more popular.
Google tells me that Blaze launched in 2006, so I think it must have been 2007 or 2008.
When Google wanted engineers to use AI features, it turned them on in Cider-V by default. And if you turned them off, later updates would turn them back on. This is very good for your adoption metrics, but might not tell you exactly what you want to know about engineer happiness.
Such a dominant IDE also allows management to ignore the long-tail of users who aren't using it.
I once worked at a place where VPs were looking at sprint burndown charts, and asked what happened if the line didn't look a lot like the line expected by JIRA. The telemetry is therefore often a curse, as any metric becomes a target. How many companies today have KPIs about having automated code reviews, which are then ignored by the devs, because said reviews are just wrong on almost everything?
The learnings of Seeing Like A State don't apply just to governments.
As the team had to collaborate with the VSCode team, we got clearance for sharing information about it. The screenshots in the article were posted publicly on GitHub (in vscode issues). You can also find screenshots in https://research.google/blog/smart-paste-for-context-aware-a...
More generally, a lot has been communicated on developer infrastructure at Google.
When it finally failed in the most annoying way possible (the touch screen, which I do not use, started creating phantom clicks in the upper right corner of the display) I went looking for another Chromebook that was light, powerful, and well-built. Finding none, I now use MacBook Air and weep for the time I lose every time it needs an OS update.
You have access to an extremely powerful remote workstation that from a UI perspective functions almost identically to a local workstation, via Chrome Remote Desktop. Plus, no one builds things locally, even on that machine. There is a huge, absolutely amazing distributed build system that everyone uses for everything. (Again, Android and Chromium are different.)
So you don't really need a powerful local machine. I held out for a long time--there were a lot of growing pains in the early days. But eventually it got really, really good.
One is a framework called Wiz, which renders the frontend for a bunch of Google web apps. You can imagine that the Wiz team might want to refactor an API, but not have to worry about different apps using different versions. In a monorepo, they can just find all the callsites and update them in the same commit that makes the API change. There's no package.json in google3 - everything builds from HEAD. Therefore, the commit that makes a breaking change is also the commit that fixes the would-be breakage.
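That atomic-refactor property can be sketched in miniature with plain grep/sed in a throwaway repo (the framework layout and `renderV1`/`renderV2` names are invented for illustration; real changes at that scale go through dedicated large-scale-change tooling, not sed):

```shell
# Toy model of a monorepo refactor: rename an API and fix every callsite
# in a single commit, since everything builds from HEAD.
cd "$(mktemp -d)" && git init -q -b main
git config user.email dev@example.com && git config user.name dev
mkdir -p wiz app1 app2
echo "export function renderV1() {}"  > wiz/api.js
echo "renderV1();"                    > app1/main.js
echo "renderV1();"                    > app2/main.js
git add . && git commit -qm "baseline: everyone builds from HEAD"

# The breaking change and every fix land in the same commit, so there is
# never an intermediate state where a caller still sees the old name.
grep -rl "renderV1" . --include='*.js' | xargs sed -i 's/renderV1/renderV2/g'
git add . && git commit -qm "rename renderV1 -> renderV2 across all callers"
grep -r "renderV1" . --include='*.js' || echo "no stale callers left"
```

Without version pinning there is no window where `wiz/api.js` and its callers disagree, which is exactly what a package.json-style setup cannot guarantee.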
This architecture evolved. Google used to use Perforce, which was a common commercial version control system before Git. Google had to figure out how to express the dependencies between packages in the monorepo (which can be in different languages with different build tools). They eventually created Bazel, which expresses those dependencies and orchestrates their build tools.
Build orchestration took a few attempts. Google3 is the third version of the monorepo, that is, the one that uses Bazel for dependency management.
Afterwards I was issued a 12" Pixelbook and it was surprisingly much more usable than I had expected! I could ssh into a Linux box for running builds and tests. Cider worked perfectly. It was snappy enough to serve as a thin client even on a 4K screen.
Now, ironically, with so many extensions and LLM features, users seem to forget that they chose Cider because it was lightweight.
My recollection from 2009-2011 is that emacs and vim were the dominant editors (just as the TV show Silicon Valley depicted), and there was a decent-sized minority using Eclipse and IntelliJ, both of which had official support for Google tooling. The command line still largely ruled, though, even though the official Google developer workstation was Goobuntu, Google-flavored Ubuntu. This reflected the overall developer population of the time.
I think Cider actually was invented a little earlier than the article describes. I have vague memories of some engineers experimenting with web-based IDEs that would integrate directly with Critique (the code-review software) as early as 2013-2014. Its use was not widespread when I left in 2014; there was still the impression that it wasn't powerful enough for daily driving.
When I came back in 2020, emacs/vim use was much lower, again probably reflecting differences in the general population of developers. Many more of the developers had been trained in the post-2010 developer ecosystem of VSCode, IntelliJ, etc, and this was reflected in tool usage at Google too. I'd say IntelliJ was the dominant IDE, with Cider a close second and Cider-V just starting to take market share. You still had to pry emacs and vim from a grizzled old veteran's hands.
By 2022 I'd transferred to an Android team, and Android Studio with Blaze was the dominant IDE, even as general IntelliJ usage in the company was falling. Cider just didn't have the same Android-specific support. Company-wide, Cider-V was growing the fastest, taking market share from both IntelliJ and Cider.
By 2024 Cider-V was dominant and there started to be a concerted push to standardize on it, particularly since new AI agent tools were coming out and they couldn't be supported on all editors that Googlers wanted to use.
As of my departure in 2026, the company-wide push was to standardize on Antigravity [1], which, as I understand it, won a turf war within the developer tools org and got blessed as the "official" Google AI coding agent. This also has the effect of concentrating developer time dogfooding Google's external AI coding offering, which hopefully should improve its quality. There's still significant Cider-V usage, but it's dropping, and execs are pushing Antigravity hard.
[1] https://antigravity.google/
I'm a UXE, so I tend to use the same tools an external developer might. But I never got the impression that Cider was a recent development.
I'm now thinking I may as well trade my brick of an M5 Pro for a 13" Chromebook. It's a strange time.
It's also nice that it stores all my preferences in the cloud, so switching machines is seamless (helpful when my macbook broke a couple weeks ago and I had to use a loaner chromebook for a day).
It's also well integrated with google3 and codesearch, and seamlessly runs tests on remote machines with tmux integration and all.
Not all of Google's tooling is my favorite (like their source control), but the IDE is great.