Hacker News | past | comments | ask | show | jobs | submit | kridsdale3's comments

It's the same thing with a different name and different default settings.

Are they actually cut from the same codebase? The internal version has workspace support and other features cut from Cider, I assume.

The settings in the internal version are "Antigravity User Settings". Pretty sure they're the same.

Consumer version:

BE_EVIL=true

Internal version:

BE_EVIL=false


As an employee, I'm using Antigravity (CLI version) every day (because we can't use Claude) and it rules. I am way more productive than I was with CIDER-V, which itself was very nice.

/me shudders. cider-v...

I can't think of a better manifestation of the biological male imperative.

12 months ago everyone agreed that Search was doomed, ChatGPT would kill Google, and Bard/Gemini were a joke.


When I went through YC in 2007, a founder whose name you know drunkenly told me at a party that Google Docs and MacBooks would have Microsoft out of business by 2012. Someone here told me in 2018 that I was nuts to buy a gas-powered car because, in less time than I'd own it, everyone would have switched to electric and there would be no gas stations left.

The impending deaths of most things are greatly exaggerated.


People overestimate the rate of change in the short run and underestimate the impact of change in the long run.



Huh, funny. I always associated ripgrep with the find/grep replacement written in Rust:

https://github.com/burntsushi/ripgrep

Wonder if it was an intentional pun.


Next: “SWE is largely solved, expect mass unemployment in the next few years”?


That's already half of the threads I see on reddit lately.


Except we're actually seeing a non-trivial reduction in SWE jobs, particularly entry-level roles.

It may be short term and turn around at some point, but the current trend definitely points lower rather than higher.


Are we? Compare to when? How about compared to the 30 year variation, not just the last 6.

In “big tech” or internet services, or also the non-tech companies that employ most engineers?


I wonder how much of that is driven by organic market forces or through anti-competitive practices.

For example, Chinese electric vehicles are selling like hotcakes in Europe but you'd be hard-pressed to find any in the US.


What was your startup?


Search is doomed for people creating content that depends on organic search traffic because Google's AI is providing the content directly to people doing the search.

My decade-old tech blog with 500+ posts now gets 10x less traffic than it did a few years ago, and I'm on the fence about pulling the plug on my 10-year-old business, because traffic is so low that hosting the video courses I sell now costs me more per month than they bring in. This comes with other implications, such as maybe stopping my YouTube channel and no longer contributing to open source, because paying bills takes priority over hobbies. I enjoy spending time on these things and was always morally OK with giving away almost everything I do and learn for free, but income requirements are very quick to slap you back into reality.


You may be right, but that was not the point. There were many voices who claimed that AI would make Google obsolete.


If Google Search hadn't adopted AI it would have quickly been rendered obsolete, so yeah.


Both can be true? You can be doing really well and still have long term risk. Dethroning incumbents takes longer than people think and it’s possible that search growth goes 20%, 10%, -10%, -50%


By everyone, maybe you mean "only people dumb enough to post on hacker news"


This loss of easter eggs in software, along with the rise of enshittification, both have the same source:

Software used to be made by Programmers, with taste and opinions, according to their talent and personality, solo or in small groups. Now it is run by Project Managers and Data Scientists chasing KPIs through engagement-measurement tools and A/B tests.

Fun easter eggs can't be justified, so they get cut. Personality doesn't move the metric as much as the bland, lowest-common-denominator version does. That's what ships.

All software and web content has gone this way in the last 13 years or so.


For big games, it’s a legal thing. Ever since the Hot Coffee debacle, we are not allowed to hide content, all of which has to be approved by legal.



Use Mimestream. It's a wonderful, fully native, Gmail-backend-only Mac app by one of the former lead engineers of the Apple Mail app.


$50/year? GTFO


I can (barely, but sustainably) run Q3.5 397B on my Mac Studio with 256GB unified. It cost $10,000 but that's well within reach for most people who are here, I expect.
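Back-of-the-envelope arithmetic supports the "barely, but sustainably" claim. As a hedged sketch (the bits-per-weight figure is an assumption, not the commenter's actual quant), a 397B-parameter model at roughly 4.5 bits per weight comes in around 220 GB, leaving only modest headroom in 256 GB of unified memory:

```python
# Rough size estimate for a quantized model's weights.
# bits_per_weight ~4.5 is an assumed mid-4-bit quantization level.
def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate in-memory size of the quantized weights, in GB."""
    return n_params * bits_per_weight / 8 / 1e9

print(f"{model_size_gb(397e9, 4.5):.0f} GB")  # ~223 GB out of 256 GB
```

That leaves on the order of 30 GB for the KV cache, OS, and everything else, which is why it only "barely" fits.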


Hacker News moment


$10k is well outside my budget for frivolous computer purchases.


It would be plenty in-budget if the software part of local AI was a bit more full-featured than it is at present. I want stuff like SSD offload for cold expert weights and/or for saved/cached KV-context, dynamic context sizing, NPU use for prefill, distributed inference over the network, etc. etc. to all be things that just work for most users, without them having to set anything up in an overly error-prone way. The system should not just explode when someone tries to run something slightly larger; it should undergo graceful degradation and let them figure out where the reasonable limits are.
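The "graceful degradation" idea above can be sketched concretely: instead of crashing when a request doesn't fit, the runtime could compute the largest context the remaining memory budget can afford. This is a hypothetical illustration (the layer/head counts are made-up example values, not any specific model's), using the standard KV-cache sizing formula:

```python
# Hypothetical "dynamic context sizing": fit the KV cache to the memory
# that remains after the weights, rather than failing outright.
def kv_bytes_per_token(n_layers: int, n_kv_heads: int, head_dim: int,
                       bytes_per_elt: int = 2) -> int:
    # Keys + values (factor of 2), one pair per layer, fp16 elements by default.
    return 2 * n_layers * n_kv_heads * head_dim * bytes_per_elt

def max_context(budget_bytes: int, weight_bytes: int, per_token: int) -> int:
    """Largest context length whose KV cache fits in the leftover budget."""
    free = budget_bytes - weight_bytes
    return max(free // per_token, 0)

# Illustrative numbers: 256 GB budget, ~223 GB of weights, a 60-layer
# model with 8 KV heads of dimension 128.
per_tok = kv_bytes_per_token(n_layers=60, n_kv_heads=8, head_dim=128)
ctx = max_context(256 * 1024**3, 223 * 1024**3, per_tok)
print(per_tok, ctx)  # ~240 KB per token, ~144k tokens of context
```

A runtime that did this (plus SSD offload for what doesn't fit) could shrink the context instead of exploding, which is exactly the degradation behavior being asked for.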


But it's well within the budget of a small company that wants to run a model locally. There are plenty of reasons to run one locally even if it's not state of the art, such as for privacy, being able to do unlimited local experiments, or refining it to solve niche problems.


yeah, but if you really, really wanted to and/or your livelihood depended on it, you probably could afford it.


99.97% of HN users are nodding… :)


There are way too many good uses of these models for local that I fully expect a standard workstation 10 years from now to start at 128GB of RAM and have at least a workstation inference device.


Or, if you believe a lot of the HN crowd, we're in an AI bubble, and in 10 years inference will be dirt cheap when it all crashes: we'll have all this hardware sitting in data centers and it won't make any sense to run monster workstations at home. (I work on a 128GB M4 but don't run inference; I just have too many Electron apps running at the same time...) :)


> I work on a 128GB M4 but don't run inference; I just have too many Electron apps running at the same time.

This is somewhat depressing: needing a couple of thousand bucks' worth of RAM just to run your chat app, code/text editor, API doc tool, forum app, and note-taking app all at the same time...


Crucial (Micron) sold 128GB of DDR5-5600 in SODIMM form for $280 a year ago. It would be slower than the same amount on an M4 Mac, but still, I object to characterizing either as "a couple thousand bucks' worth".


I get that number by optioning up a Mac Studio to 128GB at the Apple Store.

(Admittedly, Apple should be facing criminal price-gouging lawsuits for their RAM pricing.)


Inference will be dirt cheap for things like coding but you'll want much more compute for architectural planning, personal assistants with persistent real time "thinking / memory", as well as real time multimedia. I could put 10 M4s to work right now and it won't be enough for what I've been cooking.


That's kind of a specific percentage. What numbers did you use to get there?


Just have to reclassify it as non-frivolous then. $10k's not a lot for something as important as a car, if you live somewhere one is required. Housing is typically gonna cost you more than $10k to own. I probably spend close to $10k on food over 1.5 years.

So if you just huff enough of the AI Kool-Aid, you too can own a Mac Studio. Or an M5 MacBook. Or a dual-3090 rig.


For some reason you were being downvoted but I enjoy hearing how people are running open weights models at home (NOT in the cloud), and what kind of hardware they need, even if it's out of my price range.


I'm running it on my Intel Xeon W5 with 256GB of DDR5 and an Nvidia GPU with 72GB of VRAM. I paid $7-8k for this system; it would probably cost twice as much now.

Using UD-IQ4_NL quants.

Getting 13 t/s. Using it with thinking disabled.
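Figures like 13 t/s are roughly what a bandwidth-bound setup predicts. As a hedged sketch (the bandwidth and active-weight numbers below are illustrative assumptions, not measurements of this system): at batch size 1, decoding is usually limited by memory bandwidth, so tokens per second is approximately effective bandwidth divided by the bytes of weights read per token (for a MoE model, only the active experts count):

```python
# Crude decode-speed model for batch-1, memory-bandwidth-bound generation.
def tokens_per_sec(bandwidth_gb_s: float, active_weight_gb: float) -> float:
    """Upper-bound token rate: bytes streamed per second / bytes per token."""
    return bandwidth_gb_s / active_weight_gb

# Assumed numbers: ~230 GB/s effective mixed CPU+GPU bandwidth and
# ~17 GB of active weights per token at a ~4-bit MoE quant.
print(f"{tokens_per_sec(230, 17):.1f} t/s")  # ~13.5 t/s
```

The model ignores prefill and overlap effects, but it shows why offloading more of the active weights to VRAM (with its much higher bandwidth) raises t/s directly.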


I get 20 t/s on the UD-Q6_K_XL quant, Radeon 6800 XT.


Where I live, $10k is a little more than 3 years' worth of rent for a relatively new and convenient 2-bedroom apartment.


$277 a month for a two bedroom is literally 6-10% of what someone in the SF Bagholder Area pays.

Either you're in Africa, Southeast Asia, or South/Central America.

How do you even afford internet?


Yes, I am in SEA. Home internet here costs $10 per month.

My point was: not every person browsing this site has a high standard of living, and the ability to spend $10k on computing is a privilege.


you have proved my point


They already got swiss-cheesed by DOGE though.


yeah, none of the government is "protected" considering who's in charge of every aspect of it. Social engineering is the biggest technological and domestic weakness.

Bizarre watching people talk about the insecurity of technology.


> none of the government is "protected"

Nothing is absolute zero. Doesn’t make temperature meaningless.


They likely saw you as a narc.

The law is just an annoying thing getting in the way of our profits.

