
Google's Urs Holzle: Moore's Law Is Ending
User Rank: Ninja
11/30/2016 | 11:56:23 AM
Re: Other areas of opportunity
A nice thought. But if we saw 10% performance increases in software every 18 months, I'd be more than happy. There's only so much you can improve software; after a point, it's almost impossible. Software has depended on hardware improvements throughout computing history, so this will be a shock to the software industry. No longer will vendors be able to throw half-baked, not-really-needed features in just to sell an upgrade. By the mid-2020s it may become very difficult to sell upgrades, and software subscriptions sold with the implied promise of continuing improvements will also become a hard sell. Why pay every year when there's nothing really useful coming down the pipe?
User Rank: Ninja
11/24/2016 | 10:09:34 AM
Re: We no longer refer to "warehouse-scale"
I would be amazed if his 30% a year comes true beyond the first couple of years. Once the low-hanging fruit is picked, the rest will be much harder to achieve. As I keep reminding people, it's not yet assured that we will be able to get to 5nm. That's just a hope right now: some of the methods we're now using won't work at that size, and replacement technologies haven't been found yet. If 7nm is the practical limit, we're going to reach the end before 2021. After that, it will be a matter of finding ever more clever ways around design limits. We can see from Intel that eking out more performance is difficult, and other manufacturers have had even more trouble at the same nodes. More cores aren't a solution either. It will be an interesting decade.
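For a sense of scale, the gap between those two paces compounds quickly. A quick back-of-the-envelope sketch (assuming the 30% figure compounds annually; the 5-year horizon is just an illustration) comparing 30%/year against the classic Moore's Law doubling every 18 months:

```python
# Cumulative speedup over a fixed horizon under two growth rates:
# Holzle's 30%/year vs. a doubling every 18 months (~58.7%/year).
years = 5
gain_30pct = 1.30 ** years            # 30% per year, compounded
gain_moore = 2.0 ** (years / 1.5)     # one doubling per 1.5 years

print(f"30%/yr over {years} years:        {gain_30pct:.2f}x")
print(f"Moore's pace over {years} years:  {gain_moore:.2f}x")
```

At 30% a year you get roughly a 3.7x gain in five years, versus about 10x at the historical doubling pace, so even the optimistic 30% figure is a big step down from what the industry was used to.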
User Rank: Ninja
11/21/2016 | 7:43:19 AM
Other areas of opportunity
For the longest time, the solution to performance issues was to throw more memory and processors at the job. With that becoming less of an option, we need to shift to two areas of opportunity:

- faster and more reliable networking

- application performance and footprint

Better networking already exists, except that the US refuses to spend the money on better infrastructure. There is not enough competition in the market, especially at the consumer level. There is typically one provider for networking services, and that provider milks the outdated infrastructure as much as possible with high access fees. Great for their business, but overall a massive detractor from distributed computing and cloud services.

I see more opportunity in optimizing applications for performance and footprint. Developers need to go back to the mindset that they are coding on a Commodore 64: memory is at a premium, disk access is slow, and networking has huge latency, if it is available at all. Code within these constraints and get crafty. The FOSS folks at times move in that direction, while companies like Microsoft go the other way. The footprint of Windows without any apps installed is huge. Yes, there are versions that are very lean, but they drop us back into the Stone Age of command lines. Others can do better, and with a GUI.

Moore's Law needs to be applied to software: every 18 months, make your apps run twice as well using the same amount of resources as before.
User Rank: Strategist
11/14/2016 | 2:14:48 PM
Moore's Law is actually dead - but it can be resuscitated
Moore's Law has really been a guideline touting the economic benefit of physically shrinking IC feature widths. As such, that guideline - a halving of costs as device densities double - is effectively dead, because the costs to drive device density have skyrocketed. A basic fab is now $10B, up from $1B a decade ago. The tools to populate a fab have not changed radically in over a decade, as EUV scanners keep being another generation away.

The potential resuscitation for the data center will come as FPGA-based controllers provide the intelligence to manage scaled fabric in the data center, routing packets across server clusters.

But until we get off silicon, it is questionable whether a path to cost reductions or density increases will continue. It appears the path forward for many is to step back and reduce costs using "good enough" older fabs that are fully depreciated. It looks like a longer period of stasis in process technology while we look at changes in architecture.
Charlie Babcock
User Rank: Author
11/10/2016 | 2:44:54 PM
We no longer refer to "warehouse-scale"
Holzle is of course the co-author of one of the most important early cloud documents, The Data Center as a Computer: Introduction to the Design of Warehouse-Scale Machines, sometimes shortened to The Data Center Is the Computer.
