22 Comments

So I don't think the number of transistors is what we should measure, but compute power. Quantum will continue the doubling of compute.

I agree about measuring power and not the number of transistors (arbitrary, although super interesting to track over time and easily quantifiable). I'm not sure quantum will be the next thing that keeps Moore's Law (the looser definition, not # of transistors per se) going, but I'm pretty confident that something will.

I keep it going in my novel using organic computing 😆

The critical question is whether we are actually getting double the performance from all this doubling. My experience says no, but I'd like others to chime in. I also believe it now takes more than two years for a doubling to happen. As the hardware gets better, the software gets bigger. Conversely, it lets people stop worrying about code performance and focus on productivity, since hardware doubling will take care of most bottlenecks.

I agree that for the foreseeable future we will see doubling, though probably not every two years; I believe 3-D integration and similar techniques will keep the trend going.

Well, it's lumpy for me. I 10x'd my productivity in researching and understanding more topics over the last 2 years, so for that it's way more than 2x, but I can't say I've seen comparable productivity boosts anywhere else.

I'm not sure about double the performance, but I think it's worth looking at something else: whether we are doubling in price-performance. In other words, over 6 years you'd normally expect 3 of these doublings, so something you could have bought 6 years ago should theoretically be 8x cheaper today.

I think this has happened consistently over a long period of time, but I'm not sure about the 2020s.
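The price-performance arithmetic above can be sketched in a few lines. This is a minimal illustration assuming a clean, fixed doubling cadence, which real hardware only approximates:

```python
# Price-performance under a hypothetical fixed doubling cadence.
def price_performance_multiple(years: float, doubling_period_years: float = 2.0) -> float:
    """How many times more compute per dollar after `years`."""
    return 2 ** (years / doubling_period_years)

# Over 6 years with 2-year doublings: 3 doublings -> 8x,
# i.e. the same compute should cost about 1/8 as much.
multiple = price_performance_multiple(6)
print(multiple)      # 8.0
print(1 / multiple)  # 0.125 -> the 6-year-old purchase at ~1/8 the price
```

The function name and the exact 2-year period are illustrative assumptions, not measurements; the point is just that the multiple compounds exponentially with elapsed time.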

I think most of the productivity gains will come from a higher average bar, which will rise even further with all the Copilots and other improvements in the future. I have worked with enough software engineers/programmers, from below average to highly talented (the mythical 10x), and I have not seen anyone achieve 10x productivity across the board. It is generally 2-3x. I also think the 10x engineer is more myth than reality; as humans, we waste time on other activities.

I am not an economist (I have taken only an Economics 101 course and read a few books), so I could be completely wrong, but I think most of the productivity gains in the future will come from an average-skilled or less experienced person's ability to use copilots to accomplish more. An above-average or highly talented person will see much smaller gains, and I saw a few studies recently saying as much. Would it be a 20-30% GDP gain? Not until almost 80-90% of work is done by AI or automated. Unless we hit an AI winter, sometime in the next 10-15 years we will start seeing 4-5% GDP growth in the US.

4% GDP growth in the US would be about double what we have today, so that would be just amazing to see. Standards of living will continue to rise rapidly, unless all the gains go to the very top (not such an easy thing to avoid, as it turns out). I'm also not an economist, and far from it, but I am enthusiastic and I do enjoy books about economics. I've probably listened to (reading them is way tougher) a few dozen books now, so I'm just about ready for my Audible PhD.

I think you're right, though - many of the gains are over-exaggerated. If you're in one of those narrow spots where you're benefiting disproportionately, you can become a lot more productive on a personal level, but this will be very uneven. It already is.

Here is the study I was talking about:

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4573321

Consultants across the skills distribution benefited significantly from AI augmentation, with those below the average performance threshold improving by 43% and those above improving by 17%, compared to their own prior scores.

Yeah, I've seen similar studies, and anecdotally, this kind of makes sense.

What has been surprising is the idea that the more routine, boring stuff would be replaced, but in fact, some of the more creative jobs are under threat (although I have much more complex thoughts on that whole situation, this is definitely the feeling at the moment). That was a surprise to me, but then again, so was generative AI's absolute explosion.

Unfortunately, most gains will go to the top 10-20%. That's how capitalism has worked so far. However, overall, it should improve everyone's standard of living, too.

"$666.66 from the 70s buys you something like a million times more computing power."

Meh. I'll wait for next Thursday, when I can get a billion times more computing power. I ain't wasting my money for no reason.

But it is crazy to think how consistently Moore's law has held up. And even crazier to think that AI, if it lives up to the positive expectations, may launch us into an even more exponential curve.
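As a rough sanity check on "a million times more computing power": a millionfold gain is about 20 doublings (2^20 ≈ 1,048,576), so spread over roughly 50 years since the mid-70s, that works out to one doubling every ~2.5 years, which is in the right neighborhood of Moore's law. A quick sketch (the 50-year span and the millionfold multiple are taken from the comment, not from measured data):

```python
import math

# How many doublings does a millionfold improvement imply,
# and what doubling period does that give over ~50 years?
multiple = 1_000_000
years = 50  # roughly mid-1970s to mid-2020s

doublings = math.log2(multiple)  # ~19.93 doublings
period = years / doublings       # ~2.51 years per doubling
print(round(doublings, 2), round(period, 2))  # 19.93 2.51
```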

Yeah. I think I could have taken all week to write this and speculated all about that, but maybe that's best left for another day.

No rush: Just wait for next Thursday, when your brain's a million times faster!

I'm going to sleep until then. Wake me up with telepathic brain waves from your nanobots when it's time.

Pffft, "nanobots," how old-fashioned of you. What is this, week 19 of 2024?

Buddy, here in week 20, we've got biomorphic sludge that does everything for us. In turn, we do everything for biomorphic sludge. King Sludge is the greatest. King Sludge is all. Long live King Sludge.

I understand King Sludge's reign has been relegated to the ancient past, and 42,945 other kings have reigned since I started writing this sentence.

I remember Moore's article, twice over. My grandfather was part of the movement, even though he did not believe. It was where the physics led him.

You remember Moore's original article?

My grandfather had the journal.

Ah, okay. I think some context was missing from your post, but that's pretty rad!

It fit right into the article. As you said: "Sometimes an article writes itself."
