
Could you figure out the answer? To my frustration, I couldn't, without peeking at a few of the comments. It makes perfect sense now. Not a bad puzzle!
I've read Nick's newsletter for a long time and recommended it, his web site, and his book to many people. What I like is that much of it is more sensible and less trite than so many of the job-seeking rules of thumb that you read. I don't agree with him 100%, and some of his advice seems hard to implement, such as refusing to disclose your current salary. But, he has good reasons and it's all worthy of your consideration.
I hope you are enjoying the holiday season and wish you interesting projects and prosperity in 2011!
The history is crystal clear and his description spot on. His assessment of current and future value-add is plausible and thought provoking. Sounds dire for many silicon-centric companies, though. Will you be a survivor? What does it say for the fortunes of the EDA industry? His answer seems to be to focus on system/software, complemented by C-based hardware design.
Beyond 22nm, many things get exotic. Extreme UV (EUV) has been the next big lithography change for ages -- is it finally required for 15nm? FinFETs, Carbon Nanotubes, and Stacked Die, Oh My! We're not in Kansas any more.
But this is much more focused than a merchant ASIC business. FPGA itself is more like a Standard Product, with high volumes and lots of benefit from using advanced processes. Altera and Xilinx are the most leading-edge customers for the established foundries. It'll be interesting to see Intel's motives beyond just "further monetizing the fab".
It's exciting to see GPU Computing gaining traction and accolades for certain highly parallel applications. I'm eagerly looking forward to GPU Computing helping to solve EDA problems. So far, there's been some nibbling around the edges and some experimenting with algorithms, but I'm not aware of any production EDA products based on GPUs. Yet. What EDA applications would most benefit from massively parallel processing, such as is offered by a GPU?
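To make the appeal concrete, here's a minimal sketch (plain C, running on the CPU, and not drawn from any real EDA tool) of a Monte Carlo timing loop of the embarrassingly parallel sort that maps naturally to GPUs: every sample is independent, so each outer-loop iteration could become one GPU thread. The delay model and constants are invented purely for illustration.

```c
/* Illustrative only: a toy Monte Carlo path-delay sweep.
 * Each sample is independent -- the outer loop is what a GPU
 * implementation would parallelize, one sample per thread. */
#include <stdio.h>
#include <stdlib.h>

#define NUM_SAMPLES 100000
#define NUM_STAGES  10

/* crude per-stage delay: nominal delay plus a made-up random variation */
static double stage_delay(double nominal, double sigma)
{
    double u = (double)rand() / RAND_MAX;   /* not a real variation model */
    return nominal + sigma * (u - 0.5) * 2.0;
}

int main(void)
{
    double worst = 0.0;
    for (int s = 0; s < NUM_SAMPLES; s++) {   /* independent samples */
        double path = 0.0;
        for (int i = 0; i < NUM_STAGES; i++)
            path += stage_delay(0.1 /* ns */, 0.02 /* ns */);
        if (path > worst)
            worst = path;
    }
    printf("worst sampled path delay: %.3f ns over %d samples\n",
           worst, NUM_SAMPLES);
    return 0;
}
```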
Does the same thing happen with EDA software? The state of the EDA business in Asia is something I only know from the occasional rumor. How widespread is EDA piracy? How common is it to "crack" FlexLM? And, is EDA software already heavily discounted in Asian markets?
The last point concerns American workers most. Not only are Americans competing from a higher wage and cost-of-living base, but what if the essential tools of our trade are cheaper overseas as well? Strike two!
Though I'm not sure I'd always like to be his employee, it is pleasing to be his customer. Customers willing to pay a stiff premium are the sincerest form of flattery.
As my readers know, I find SNUG to be the most valuable conference for hands-on IC design engineers. I always leave with a list of new ideas to try back at work.
Why not launch your publishing career and boost your reputation by showing the cool stuff you've worked on?
Top themes:
Product-wise, Aart emphasized their custom design competitor to Virtuoso, which I don't find super exciting. It's always nice to have a more modern implementation of a workhorse tool, but it's not earth shattering. Synopsys' bread-and-butter tools, which pay the paychecks, didn't get much air time.
I think some of the market share claims could be misinterpreted (90% of 32 nanometer chips), but that's standard fare for the way vendors advertise such things.
Prof. Horowitz's solution is provocative and plausible, though not a "slam dunk". I'd like him to quantify how much his approach would reduce the total cost of nanometer semiconductor design. Also, how applicable is it to domains beyond processors? Isn't it very difficult to create an "architecture generator" for each domain? For another perspective, here's a blog post reviewing the talk.
Companies on the list relevant to EDA/ASIC engineers:
SURPRISE: EDA cost per transistor is coming down the same learning curve as all the other input costs like materials, chemicals, labor, etc. (above) and it has been doing so throughout semiconductor history.
What is Wally Rhines' prescription for increased EDA value-add (and profits)? EDA vendors
... must incorporate the embedded software development and system analysis costs into their design tools and flows. To the extent this is accomplished, there isn’t a cost problem and the 30%+ per year per transistor cost reduction can be achieved. That’s why EDA companies first became involved in embedded software in the mid 1990’s. That involvement will grow as EDA companies take on more responsibility for the total design challenge and its costs. It’s just part of the evolution of roles in the semiconductor industry. (Walden C. Rhines is chairman & CEO, Mentor Graphics Corporation.)
Although in the short term there are still plenty of us fighting it out for nanometer-scale timing (and now power) closure, the focus on software and systems is plausible for the long term. We must find quantum leaps in productivity to take advantage of Moore's law in semiconductors.
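As a back-of-the-envelope check on what that quoted 30%+ per-year cost reduction implies when it compounds, here's a tiny C snippet (my own illustration, not from Rhines' article): at 30% per year, cost per transistor falls to roughly 3% of its starting value after a decade.

```c
/* Compound a 30%/year cost-per-transistor reduction over ten years.
 * Illustrative arithmetic only; the 30% figure comes from the quote above. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double annual_reduction = 0.30;   /* 30% cheaper each year */
    for (int year = 0; year <= 10; year += 2) {
        double relative_cost = pow(1.0 - annual_reduction, year);
        printf("year %2d: cost per transistor = %5.1f%% of starting cost\n",
               year, 100.0 * relative_cost);
    }
    return 0;
}
```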
You can see other trip reports and interesting links in the delicious.com sidebar of my blog, where I post EDA and Semiconductor links.
With one notable exception. Did you notice that his list doesn't feature a certain EDA company that's hard to overlook? That's right, Synopsys. He doesn't feature one Synopsys product in his list! That's a curious oversight to me. While I do think that much of EDA innovation comes from smaller companies, Synopsys holds its own in innovation against Cadence and Mentor, and both of those companies did have products featured in John's list. Are John and Synopsys having a spat, or is he getting ready to roll out his exclusive must-see Synopsys list?
John Cooley is the original user's voice in EDA, and his DeepChip web site and mailing list continue to have the biggest following. I used to think of John as "the Michael Moore of EDA" -- a rabble-rouser, confronting the dominant companies and their executives, and sticking up for the little guy.
He still has that image, but I've come to realize that the in-depth tool evaluations that he posts aren't always the innocent sharings of chip engineers that they appear to be. There's definitely an element of the EDA vendors' PR machines in some write-ups. That's OK, as long as you take them as such. If these articles are really EDA vendor white papers or press releases, at least they're written in a language that speaks directly to the challenges we face in chip design. With that caveat, here's a list of his latest nuggets:
The blogosphere is afire with commentary on Cadence's acquisition of Denali, including
Though I'm not a financial analyst, I do have reasonable familiarity with the stock market. And, being the thrifty guy that I am, I'm having trouble justifying Cadence's valuation of Denali. Pulling a quote from Gabe's analysis post, Denali had trailing twelve-month revenues of $43 million, implying a 6.3x EV/sales multiple, which works out to an enterprise value of roughly $270 million. Priced at over 6X sales! And for a company that's been around a number of years and, while it has a healthy business, isn't forecast for hyper-growth? How do companies justify such valuations? (I'm asking sincerely--perhaps there's an analysis angle that makes this look like a good bet.)
The more I read, the more I scratch my head about whether this is a good venture for Cadence. Gotta give them credit for trying something outside the EDA product box, though. It seems they are gunning for Synopsys DesignWare IP business, and using Denali to give themselves a jump start.
Caveat: I haven't read Cadence's "EDA360 Manifesto", so I may not fully appreciate the cleverness of their overall strategy.
Though as an implementation guy, I don't use Denali products, the first things that sprang to mind are:
An off-the-top-of-my-head guess at the list of bigger deals: CCT (Cadence), Ambit (Cadence), Avant! (Synopsys). Maybe ViewLogic (Synopsys), Verisity (Cadence).
On a more somber note, let's hope that Cadence is able to preserve the technology and brain trust that was able to make Denali a well-established mid-level EDA vendor. Cadence has had mixed success in managing acquisitions, with some apparently not paying off (Ambit) and others holding up well (Verplex).
In the interest of preserving history, I'll list their picks here.
I always thought it curious that Apple supposedly pulled off such an ambitious chip with their first major in-house development. Pundits assumed that it was done with the help of Apple's P.A. Semi acquisition, but P.A. Semi had been focusing on the "Power" architecture, not ARM.
I love technology history more than the average person, but I am dumbfounded that with all the software engineering development, object-oriented design, and graphical user interface work that's happened over the decades, we are relying on a language designed before many practicing engineers were born! (People may complain about Verilog HDL's crustiness, but it's "only" 25 years old.)
Though software engineering is not my principal occupation, I do enjoy programming and languages. I know "C" well enough to quickly shoot myself in the foot, but I think Java is a far more elegant language. It's sort of like how I've heard Python is a better scripting language, yet everyone still uses Perl.
Anyone have explanations for why "C" remains so popular? Isn't this a significant reason why computers have so many security holes? When you hear about "buffer overrun" and "malicious code execution", think "C" pointers! I can see using "C" for embedded systems if resources are very limited, but otherwise, I'm truly surprised by this survey.
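For readers who haven't been bitten by this, here's a minimal and deliberately unsafe C sketch of the kind of unchecked copy behind many "buffer overrun" advisories. The function and strings are made up for illustration; don't use this pattern in real code.

```c
/* Deliberately unsafe example of an unchecked copy into a fixed buffer. */
#include <string.h>
#include <stdio.h>

static void greet(const char *name)
{
    char buf[8];
    strcpy(buf, name);   /* no bounds check: a long 'name' overruns buf */
    printf("hello, %s\n", buf);
}

int main(void)
{
    greet("ok");                               /* fits in the buffer */
    greet("this-string-is-much-too-long");     /* undefined behavior */
    return 0;
}
```

Nothing in the language stops the second call from scribbling past the end of `buf`, which is exactly the opening that "malicious code execution" exploits take advantage of.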
The conference featured record attendance (over 2,100) and a good balance of user papers on verification, implementation, signoff, and low-power design.
Some additional details, not quite as juicy, follow in How Apple’s A4 chip lets iPad run cooler, save battery life. I was most surprised at the claim from "a very trusted source" that PA Semi didn’t do the A4. It was the existing VLSI team. Could it be true? All other speculation I read was that this was the fruit of Apple's investment in PA Semi. (Although PA Semi was working on PowerPC designs, and the iPad's A4 is almost surely ARM-based.)
Update: Business Week's A4 story. I was never expecting Intel to be a contender for the iPad design win. And how does an analyst already estimate the cost of the A4 chip and the iPad's bill of materials? Has anyone got their hands on one outside of Apple?
As good tech consumers, we all have our opinions about the features, price point, and odds of it changing the world. Steve Jobs called it "magical". (Fake Steve says he uses neuro-linguistic programming.)
But I digress. What's the significance for chip-design engineers? Though we won't have the details from a full teardown for another couple of months, Apple did announce that they've designed the central chip in the iPad, dubbing it the "A4". Early coverage of A4 includes
It's intriguing that Apple has consciously decided to build an advanced chip design organization, since they are a consumer electronics company. There are comparable alternatives for single-chip mobile device computers, including the Qualcomm Snapdragon and NVIDIA Tegra. What's Apple's angle in doing it themselves? Is it just their obsessive need for secrecy? Or will they somehow integrate differentiating functions that justify the large investment and risk of running a chip design operation? It flies in the face of the industry trend (which PC manufacturer designs its own chips?), but let's see how this plays out. I'm not foolish enough to dismiss Steve Jobs' ability to Think Different.
A short but encouraging news article at X-bit labs, TSMC’s Problems with 40nm Process Technology Largely Over, reports that TSMC has solved its widely-rumored yield problems with its leading-edge 40nm process. Hallelujah! This would be great news for TSMC, their customers, their customers' customers, ...
An interesting side note is the claim that at present only ATI, graphics business unit of Advanced Micro Devices, Altera and NVIDIA Corp. use TSMC 40nm process technology. Really, only three production users of 40nm? Apparently the design pipeline is getting stretched out over many generations, including 65nm on up to 130nm. Ah, I miss the good old days of 130nm design.
(Tip o' the hat to Daniel Nenni for the "tweet" tip.)
[Salary chart methodology: annual pay for Bachelor's graduates without higher degrees. Typical starting graduates have 2 years of experience; mid-career have 15 years. See the full methodology for more.]
The other is ominous for the long-term: Is Moore's Law near its end? - Now Hear This! - Blog on EDN. But really, running out of steam in less than five years? Be sure to read the comment stream, as there are some good additions there.
My perspective is that a near-term uptick is certainly overdue. By all indications, the US economy is starting to recover, and with it, consumer spending on electronics will get a nice boost. There's certainly a lot of innovation being shown at the Consumer Electronics Show this week.
As for Moore's Law, I think it keeps working as long as we're using planar CMOS. If we have to transition to fundamentally different materials, or 3D structures, all bets are off.
Can you tell I had to reset my SolvNet password today? Does Synopsys really hate customers this much? My feelings haven't changed since I last changed my password.