
May be harmful if inhaled or swallowed

In the book “The World of _____” by Bennett Alan Weinberg and Bonnie K Bealer, there is a photograph of a label from a jar of pharmaceutical-grade crystals. It reads:

“WARNING: MAY BE HARMFUL IF INHALED OR SWALLOWED. HAS CAUSED MUTAGENIC AND REPRODUCTIVE EFFECTS IN LABORATORY ANIMALS. INHALATION CAUSES RAPID HEART RATE, EXCITEMENT, DIZZINESS, PAIN, COLLAPSE, HYPOTENSION, FEVER, SHORTNESS OF BREATH. MAY CAUSE HEADACHE, INSOMNIA, VOMITING, STOMACH PAIN, COLLAPSE AND CONVULSIONS.”

Fill in the blank.


Glory and Sadness, Beauty and Pain

X is a song written by Y and famously covered by Z. Time Magazine’s Josh Tyrangiel described it thus:

Y murmured the original like a dirge, but except for a single overwrought breath before the music kicks in, Z treated the 7-min. song like a tiny capsule of humanity, using his voice to careen between glory and sadness, beauty and pain, mostly just by repeating the word X. It’s not only Z’s best song — it’s one of the great songs, and because it covers so much emotional ground and is not (yet) a painfully obvious choice, it has become the go-to track whenever a TV show wants to create instant mood. ‘X can be joyous or bittersweet, depending on what part of it you use,’ says Sony ATV’s Kathy Coleman. ‘It’s one of those rare songs that the more it gets used, the more people want to use it.’

Name X, Y and Z.


Here Is Why the Indian Voter Is Saddled With Bad Economics

This is the 15th installment of The Rationalist, my column for the Times of India.

It’s election season, and promises are raining down on voters like rose petals on naïve newlyweds. Earlier this week, the Congress party announced a minimum income guarantee for the poor. This Friday, the Modi government released a budget full of sops. As the days go by, the promises will get bolder, and you might feel important that so much attention is being given to you. Well, the joke is on you.

Every election, HL Mencken once said, is “an advance auction sale of stolen goods.” A bunch of competing mafias fight to rule over you for the next five years. You decide who wins, on the basis of who can bribe you better with your own money. This is an absurd situation, which I tried to express in a limerick I wrote for this page a couple of years ago:

POLITICS

A neta who loves currency notes
Told me what his line of work denotes.
‘It is kind of funny.
We steal people’s money
And use some of it to buy their votes.’

We’re the dupes here, and we pay far more to keep this circus going than this circus costs. It would be okay if the parties, once they came to power, provided good governance. But voters have given up on that, and now only want patronage and handouts. That leads to one of the biggest problems in Indian politics: We are stuck in an equilibrium where all good politics is bad economics, and vice versa.

For example, the minimum guarantee for the poor is good politics, because the optics are great. It’s basically Garibi Hatao: that slogan made Indira Gandhi a political juggernaut in the 1970s, at the same time that she unleashed a series of economic policies that kept millions of people in garibi for decades longer than they should have been.

This time, the Congress has released no details, and the vagueness is understandable: it is hard to see how the scheme can make economic sense. Depending on how they define ‘poor’, how much income they offer and what it will cost, the plan will either be ineffective or unworkable.

The Modi government’s interim budget announced a handout for poor farmers that seemed rather pointless. Given our agricultural distress, offering a poor farmer 500 bucks a month seems almost like mockery.

Such condescending handouts solve nothing. The poor want jobs and opportunities. Those come with growth, which requires structural reforms. Structural reforms don’t sound sexy as election promises. Handouts do.

A classic example is farm loan waivers. We have reached a stage in our politics where every party has to promise them to assuage farmers, who are a strong vote bank everywhere. You can’t blame farmers for wanting them – they are a necessary anaesthetic. But no government has yet made a serious attempt at tackling the root causes of our agricultural crisis.

Why is it that Good Politics in India is always Bad Economics? Let me put forth some possible reasons. One, voters tend to think in zero-sum ways, as if the pie is fixed, and the only way to bring people out of poverty is to redistribute. The truth is that trade is a positive-sum game, and nations can only be lifted out of poverty when the whole pie grows. But this is unintuitive.

Two, Indian politics revolves around identity and patronage. The spoils of power are limited – that is indeed a zero-sum game – so you’re likely to vote for whoever can look after the interests of your in-group rather than care about the economy as a whole.

Three, voters tend to stay uninformed for good reasons, because of what Public Choice economists call Rational Ignorance. A single vote is unlikely to make a difference in an election, so why put in the effort to understand the nuances of economics and governance? Just ask, what is in it for me, and go with whatever seems to be the best answer.

Four, politicians have a short-term horizon, geared towards winning the next election. A good policy that may take years to play out is unattractive. A policy that will win them votes in the short term is preferable.

Sadly, no Indian party has shown a willingness to aim for the long term. The Congress has produced new Gandhis, but not new ideas. And while the BJP did make some solid promises in 2014, they did not walk that talk, and have proved to be, as Arun Shourie once called them, UPA + Cow. Even the Congress is adopting the cow, in fact, so maybe the BJP will add Temple to that mix?

“Democracy,” goes a saying often attributed to Benjamin Franklin, “is two wolves and a lamb voting on what to have for lunch.” This election season, my friends, the people of India are on the menu. You have been deveined and deboned, marinated with rhetoric, seasoned with narrative – now enter the oven and vote.




India’s Problem is Poverty, Not Inequality

This is the 16th installment of The Rationalist, my column for the Times of India.

Steven Pinker, in his book Enlightenment Now, relates an old Russian joke about two peasants named Boris and Igor. They are both poor. Boris has a goat. Igor does not. One day, Igor is granted a wish by a visiting fairy. What will he wish for?

“I wish,” he says, “that Boris’s goat should die.”

The joke ends there, revealing as much about human nature as about economics. Consider the three things that happen if the fairy grants the wish. One, Boris becomes poorer. Two, Igor stays poor. Three, inequality reduces. Is any of them a good outcome?

I feel exasperated when I hear intellectuals and columnists talking about economic inequality. It is my contention that India’s problem is poverty – and that poverty and inequality are two very different things that often do not coincide.

To illustrate this, I sometimes ask this question: In which of the following countries would you rather be poor: USA or Bangladesh? The obvious answer is USA, where the poor are much better off than the poor of Bangladesh. And yet, while Bangladesh has greater poverty, the USA has higher inequality.

Indeed, take a look at the countries of the world measured by the Gini Index, the standard metric used to measure inequality, and you will find that the USA, Hong Kong, Singapore and the United Kingdom all have greater inequality than Bangladesh, Liberia, Pakistan and Sierra Leone, which are much poorer. And yet, while the poor of Bangladesh would love to migrate to unequal USA, I don’t hear of too many people wishing to go in the opposite direction.

Indeed, people vote with their feet when it comes to choosing between poverty and inequality. All of human history is a story of migration from rural areas to cities – which have greater inequality.

If poverty and inequality are so different, why do people conflate the two? A key reason is that we tend to think of the world in zero-sum ways. For someone to win, someone else must lose. If the rich get richer, the poor must be getting poorer, and the presence of poverty must be proof of inequality.

But that’s not how the world works. The pie is not fixed. Economic growth is a positive-sum game and leads to an expansion of the pie, and everybody benefits. In absolute terms, the rich get richer, and so do the poor, often enough to come out of poverty. And so, in any growing economy, as poverty reduces, inequality tends to increase. (This is counter-intuitive, I know, so used are we to zero-sum thinking.) This is exactly what has happened in India since we liberalised parts of our economy in 1991.

Most people who complain about inequality in India are using the wrong word, and are really worried about poverty. Put a millionaire in a room with a billionaire, and no one will complain about the inequality in that room. But put a starving beggar in there, and the situation is morally objectionable. It is the poverty that makes it a problem, not the inequality.

You might think that this is just semantics, but words matter. Poverty and inequality are different phenomena with opposite solutions. You can solve for inequality by making everyone equally poor. Or you could solve for it by redistributing from the rich to the poor, as if the pie was fixed. The problem with this, as any economist will tell you, is that there is a trade-off between redistribution and growth. All redistribution comes at the cost of growing the pie – and only growth can solve the problem of poverty in a country like ours.

It has been estimated that in India, for every one percent rise in GDP, two million people come out of poverty. That is a stunning statistic. When millions of Indians don’t have enough money to eat properly or sleep with a roof over their heads, it is our moral imperative to help them rise out of poverty. The policies that will make this possible – allowing free markets, incentivising investment and job creation, removing state oppression – are likely to lead to greater inequality. So what? It is more urgent to make sure that every Indian has enough to fulfil his basic needs – what the philosopher Harry Frankfurt, in his fine book On Inequality, called the Doctrine of Sufficiency.

The elite in their airconditioned drawing rooms, and those who live in rich countries, can follow the fashions of the West and talk compassionately about inequality. India does not have that luxury.




To Escalate or Not? This Is Modi’s Zugzwang Moment

This is the 17th installment of The Rationalist, my column for the Times of India.

One of my favourite English words comes from chess. If it is your turn to move, but any move you make makes your position worse, you are in ‘zugzwang’. Narendra Modi was in zugzwang after the Pulwama attack a few days ago—as any Indian prime minister in his place would have been.

An Indian PM, after an attack for which Pakistan is held responsible, has only unsavoury choices in front of him. He is pulled in two opposite directions. One, strategy dictates that he must not escalate. Two, politics dictates that he must.

Let’s unpack that. First, consider the strategic imperatives. Ever since both India and Pakistan became nuclear powers, a conventional war has become next to impossible because of the threat of a nuclear war. If India escalates beyond a point, Pakistan might bring its nuclear weapons into play. Even a limited nuclear war could cause millions of casualties and devastate our economy. Thus, no matter what the provocation, India needs to calibrate its response so that Pakistan doesn’t take it all the way.

It’s impossible to predict what actions Pakistan might view as sufficient provocation, so India has tended to play it safe. Don’t capture territory, don’t attack military assets, don’t kill civilians. In other words, surgical strikes on alleged terrorist camps are the most we can do.

Pakistan knows that it is irrational for India to escalate, and that our leaders tend to be rational, so it can ‘bleed us with a thousand cuts’, as its doctrine states, with impunity. Both in 2001, when our parliament was attacked and the BJP’s Atal Bihari Vajpayee was PM, and in 2008, when Mumbai was attacked and the Congress’s Manmohan Singh was PM, our leaders considered all the options on the table—but were forced to do nothing.

But is doing nothing an option in an election year?

Leave strategy aside and turn to politics. India has been attacked. Forty soldiers have been killed, and the nation is traumatised and baying for blood. It is now politically impossible to not retaliate—especially for a PM who has criticised his predecessor for being weak, and portrayed himself as a 56-inch-chested man of action.

I have no doubt that Modi is a rational man, and knows the possible consequences of escalation. But he also knows the possible consequences of not escalating—he could dilute his brand and lose the elections. Thus, he is forced to act. And after he acts, his Pakistan counterpart will face the same domestic pressure to retaliate, and will have to attack back. And so on till my home in Versova is swallowed up by a nuclear crater, right?

Well, not exactly. There is a way to resolve this paradox. India and Pakistan can both escalate, not via military actions, but via optics.

Modi and Imran Khan, who you’d expect to feel like the loneliest men on earth right now, can find sweet company in each other. Their incentives are aligned. Neither man wants this to turn into a full-fledged war. Both men want to appear macho in front of their domestic constituencies. Both men are masters at building narratives, and have a pliant media that will help them.

Thus, India can carry out a surgical strike and claim it destroyed a camp, killed terrorists, and forced Pakistan to return a braveheart prisoner of war. Pakistan can say India merely destroyed two trees plus a rock, and claim the high moral ground by returning the prisoner after giving him good masala tea. A benign military equilibrium is maintained, and both men come out looking like strong leaders: a win-win game for the PMs that avoids a lose-lose game for their nations. They can give themselves a high-five in private when they meet next, and Imran can whisper to Modi, “You’re a good spinner, bro.”

There is one problem here, though: what if the optics don’t work?

If Modi feels that his public is too sceptical and he needs to do more, he might feel forced to resort to actual military escalation. The fog of politics might obscure the possible consequences. If the resultant Indian military action causes serious damage, Pakistan will have to respond in kind. In the chain of events that then begins, with body bags piling up, neither man may be able to back down. They could end up as prisoners of circumstance—and so could we.

***

Also check out:

Why Modi Must Learn to Play the Game of Chicken With Pakistan—Amit Varma
The Two Pakistans—Episode 79 of The Seen and the Unseen
India in the Nuclear Age—Episode 80 of The Seen and the Unseen




Lessons from an Ankhon Dekhi Prime Minister

This is the 19th installment of The Rationalist, my column for the Times of India.

A friend of mine was very impressed by the interview Narendra Modi granted last week to Akshay Kumar. ‘Such a charming man, such great work ethic,’ he gushed. ‘He is the kind of uncle I would want my kids to have.’ And then, in the same breath, he asked, ‘How can such a good man be such a bad prime minister?’

I don’t want to be uncharitable and suggest that Modi’s image is entirely manufactured, so let’s take the interview at face value. Let’s also grant Modi his claims about the purity of his neeyat (intentions), and reframe the question this way: when it comes to public policy, why do good intentions often lead to bad outcomes? To attempt an answer, I’ll refer to a story a friend of mine, who knows Modi well, once told me about him. 

Modi was chilling with his friends at home more than a decade ago, and told them an incident from his childhood. His mother was ill once, and the young Narendra was tending to her. The heat was enervating, so the boy went to the switchboard to switch on the fan. But there was no electricity. My friend said that as he told this story, Modi’s eyes filled with tears. Even after all these years, he was moved by the memory.

My friend used this story to make the point that Modi’s vision of the world is experiential. If he experiences something, he understands it. When he became chief minister of Gujarat, he made it his stated mission to get reliable electricity to every part of Gujarat. No doubt this was shaped by the time he flicked a switch as a young boy and the fan did not budge. Similarly, he has given importance to things like roads and cleanliness, since he would have experienced the impact of those as a young man.

My term for him, inspired by Rajat Kapoor’s 2014 film, is ‘the ankhon dekhi prime minister’. At one level, this is a good thing. He sees a problem and works for the rest of his life to solve it. But what of things he cannot experience?

The economy is a complex beast, as is society itself, and beyond a certain level, you need to grasp abstract concepts to understand how the world works. You cannot experience them. For example, spontaneous order, or the idea that society and markets, like language, cannot be centrally directed or planned. Or the positive-sum nature of things, which is the engine of our prosperity: the idea that every transaction is a win-win game, and that for one person to win, another does not have to lose. Or, indeed, respect for individual rights and free speech.

One understands abstract concepts by reading about them, understanding them, applying them to the real world. Modi is not known to be a reader, and this is not his fault. Given his background, it is a near-miracle that he has made it this far. He wasn’t born into a home with a reading culture, and did not have either the resources or the time when he was young to devote to reading. The only way he could learn about the world, thus, was by experiencing it.

There are two lessons here, one for Modi himself and others in his position, and another for everyone.

The lesson in this for Modi is a lesson for anyone who rises to such an important position, even if he is the smartest person in the world. That lesson is to have humility about the bounds of your knowledge, and to surround yourself with experts who can advise you well. Be driven by values and not confidence in your own knowledge. Gather intellectual giants around you, and stand on their shoulders.

Modi did not do this in the case of demonetisation, which he carried out against the advice of every expert he consulted. We all know the damage it caused to the economy.

The other learning from this is for all of us. How do we make sense of the world? By connecting dots. An ankhon-dekhi approach will get us very few dots, and our view of the world will be blurred and incomplete. The best way to gather more dots is reading. The more we read, the better we understand the world, and the better the decisions we take. When we can experience a thousand lives through books, why restrict ourselves to one?

A good man with noble intentions can make bad decisions with horrible consequences. The only way to hedge against this is by staying humble and reading more. So when you finish reading this piece, think of an unread book that you’d like to read today – and read it!




Can Amit Shah do for India what he did for the BJP?

This is the 20th installment of The Rationalist, my column for the Times of India.

Amit Shah’s induction into the union cabinet is such an interesting moment. Even partisans who oppose the BJP, as I do, would admit that Shah is a political genius. Under his leadership, the BJP has become an electoral behemoth in the most complicated political landscape in the world. The big question that now arises is this: can Shah do for India what he did for the BJP?

This raises a perplexing question: in the last five years, as the BJP has flourished, India has languished. And yet, the leadership of the party and of the nation is more or less the same. Why, then, hasn’t the ability to manage the party translated into the ability to govern the country?

I would argue that there are two reasons for this. One, the skills required in those two tasks are different. Two, so are the incentives in play.

Let’s look at the skills first. Managing a party like the BJP is, in some ways, like managing a large multinational company. Shah is a master at top-down planning and micro-management. How he went about winning the 2014 elections, described in detail in Prashant Jha’s book How the BJP Wins, should be a Harvard Business School case study. The book describes how he fixed the BJP’s ground game in Uttar Pradesh, picking teams for each of the state’s 147,000 booths, monitoring them, and keeping them accountable.

Shah looked at the market segmentation in UP, and hit upon his now famous “60% formula”. He realised he could not deliver the votes of Muslims, Yadavs and Jatavs, who were 40% of the population. So he focussed on wooing the other 60%, including non-Yadav OBCs and non-Jatav Dalits. He carried out versions of these caste reconfigurations across states, and according to Jha, covered “over 5 lakh kilometres” between 2014 and 2017, consolidating market share in every state in this country. He nurtured “a pool of a thousand new OBC and Dalit leaders”, going well beyond the posturing of other parties.

That so many Dalits and OBCs voted for the BJP in 2019 is astonishing. Shah went past Mandal politics, managing to subsume previously antagonistic castes and sub-castes into a broad Hindutva identity. And as the BJP increased its depth, it expanded its breadth as well. What it has done in West Bengal, wiping out the Left and weakening Mamata Banerjee, is jaw-dropping. With hindsight, it may one day seem inevitable, but only a madman could have conceived it, and only a genius could have executed it.

Good man to be Home Minister then, eh? Not quite. A country is not like a large company or even a political party. It is much too complex to be managed from the top down, and a control freak is bound to flounder. The approach needed is very different.

Some tasks of governance, it is true, are tailor-made for efficient managers. Building infrastructure, taking care of roads and power, building toilets (even without an underlying drainage system) and PR campaigns can all be executed by good managers. But the deeper tasks of making an economy flourish require a different approach. They need a light touch, not a heavy hand.

The 20th century is full of cautionary tales that show that economies cannot be centrally planned from the top down. Examples of that ‘fatal conceit’, to use my hero Friedrich Hayek’s term, include the Soviet Union, Mao’s China, and even the lady Modi most reminds me of, Indira Gandhi.

The task of the state, when it comes to the economy, is to administer a strong rule of law, and to make sure it is applied equally. No special favours to cronies or special interest groups. Just unleash the natural creativity of the people, and don’t try to micro-manage.

Sadly, the BJP’s impulse, like that of most governments of the past, is a statist one. India should have a small state that does a few things well. Instead, we have a large state that does many things badly, and acts as a parasite on its people.

As it happens, the few things that we should do well are all right up Shah’s managerial alley. For example, the rule of law is effectively absent in India today, especially for the poor. As Home Minister, Shah could fix this if he applied the same zeal to governing India as he did to growing the BJP. But will he?

And here we come to the question of incentives. What drives Amit Shah: maximising power, or serving the nation? What is good for the country will often coincide with what is good for the party – but not always. When they diverge, which path will Shah choose? So much rests on that.




Trump and Modi are playing a Lose-Lose game

This is the 22nd installment of The Rationalist, my column for the Times of India.

Trade wars are on the rise, and it’s enough to get any nationalist all het up and excited. Earlier this week, Narendra Modi’s government announced that it would start imposing tariffs on 28 US products starting today. This is a response to similar treatment meted out to us by the US.

There is one thing I would invite you to consider: Trump and Modi are not engaged in a war with each other. Instead, they are waging war on their own people.

Let’s unpack that a bit. Part of the reason Trump came to power is that he provided simple and wrong answers for people’s problems. He responded to the growing jobs crisis in middle America with two explanations: one, foreigners are coming and taking your jobs; two, your jobs are being shipped overseas.

Both explanations are wrong but intuitive, and they worked for Trump. (He is stupid enough that he probably did not create these narratives for votes but actually believes them.) The first of those leads to the demonising of immigrants. The second leads to a demonising of trade. Trump has acted on his rhetoric after becoming president, and a modern US version of our old ‘Indira is India’ slogan might well be, “Trump is Tariff. Tariff is Trump.”

Contrary to the fulminations of the economically illiterate, all tariffs are bad, without exception. Let me illustrate this with an example. Say there is a fictional product called Brump. A local Brump costs Rs 100. Foreign manufacturers appear and offer better Brumps at a cheaper price, say Rs 90. Consumers shift to foreign Brumps.

Manufacturers of local Brumps get angry, and form an interest group. They lobby the government – or bribe it with campaign contributions – to impose a tariff on import of Brumps. The government puts a 20-rupee tariff. The foreign Brumps now cost Rs 110, and people start buying local Brumps again. This is a good thing, right? Local businesses have been helped, and local jobs have been saved.

But this is only the seen effect. The unseen effect of this tariff is that millions of Brump buyers would have saved Rs 10-per-Brump if there were no tariffs. This money would have gone out into the economy, been part of new demand, generated more jobs. Everyone would have been better off, and the overall standard of living would have been higher.

That brings me to an essential truth about tariffs. Every tariff is a tax on your own people. And every intervention in markets amounts to a redistribution of wealth from the people at large to specific interest groups. (In other words, from the poor to the rich.) The costs of this are dispersed and invisible – what is Rs 10 to any of us? – and the benefits are large and worth fighting for: local manufacturers of Brumps can make crores extra. Much modern politics amounts to manufacturers of Brumps buying politicians to redistribute money from us to them.

There are second-order effects of protectionism as well. When the US imposes tariffs on other countries, those countries may respond by imposing tariffs back. Raw materials for many goods made locally are imported, and as these become expensive, so do those goods. That quintessential American product, the iPhone, uses parts from 43 countries. As local products become costlier because of expensive foreign parts, demand goes down, jobs are lost, and everyone is worse off.

Trump keeps talking about how he wants to ‘win’ at trade, but trade is not a zero-sum game. The most misunderstood term in our times is probably ‘trade deficit’. A country has a trade deficit when it imports more than it exports, and Trump thinks of that as a bad thing. It is not. I run a trade deficit with my domestic help and my local grocery store. I buy more from them than they do from me. That is fine, because we all benefit. It is a win-win game.

Similarly, trade between countries is really trade between the people of both countries – and people trade with each other because they are both better off. To interfere in that process is to reduce the value created in their lives. It is immoral. To modify a slogan often identified with libertarians like me, ‘Tariffs are Theft.’

These trade wars, thus, carry a touch of the absurd. Any leader who imposes tariffs is imposing a tax on his own people. Just see the chain of events: Trump taxes the American people. In retaliation, Modi taxes the Indian people. Trump raises taxes. Modi raises taxes. Nationalists in both countries cheer. Interest groups in both countries laugh their way to the bank.

What kind of idiocy is this? How long will this lose-lose game continue?




Farmers, Technology and Freedom of Choice: A Tale of Two Satyagrahas

This is the 23rd installment of The Rationalist, my column for the Times of India.

I had a strange dream last night. I dreamt that the government had passed a law that made using laptops illegal. I would have to write this column by hand. I would also have to leave my home in Mumbai to deliver it in person to my editor in Delhi. I woke up trembling and angry – and realised how Indian farmers feel every single day of their lives.

My column today is a tale of two satyagrahas. Both involve farmers, technology and the freedom of choice. One of them began this month – but first, let us go back to the turn of the millennium.

As the 1990s came to an end, cotton farmers across India were in distress. Pests known as bollworms were ravaging crops across the country. Farmers had to use increasing amounts of pesticide to keep them at bay. The costs of the pesticide and the amount of labour involved made it unviable – and often, the crops would fail anyway.

Then, technology came to the rescue. The farmers heard of Bt Cotton, a genetically modified type of cotton that kept these pests away, and was being used around the world. But it was illegal in India, even though no ill effects had ever been recorded. Well, who cares about ‘illegal’ when it is a matter of life and death?

Farmers in Gujarat got hold of Bt Cotton seeds from the black market and planted them. You’ll never guess what happened next. As 2002 began, all cotton crops in Gujarat failed – except the 10,000 hectares that had Bt Cotton. The government did not care about the failed crops. They cared about the ‘illegal’ ones. They ordered all the Bt Cotton crops to be destroyed.

It was time for a satyagraha – and not just in Gujarat. The late Sharad Joshi, leader of the Shetkari Sanghatana in Maharashtra, took around 10,000 farmers to Gujarat to stand with their fellows there. They sat in the fields of Bt Cotton and basically said, ‘Over our dead bodies.’ Joshi’s point was simple: all other citizens of India have access to the latest technology from all over. They are all empowered with choice. Why should farmers be held back?

The satyagraha was successful. The ban on Bt Cotton was lifted.

There are three things I would like to point out here. One, the lifting of the ban transformed cotton farming in India. Over 90% of Indian farmers now use Bt Cotton. India has become the world’s largest producer of cotton, moving ahead of China. According to agriculture expert Ashok Gulati, India has gained US$ 67 billion in the years since from higher exports and import savings because of Bt Cotton. Most importantly, cotton farmers’ incomes have doubled.

Two, GMO crops have become standard across the world. Around 190 million hectares of GMO crops have been planted worldwide, and GMO foods are accepted in 67 countries. The humanitarian potential is massive: Golden Rice, a variety of rice packed with minerals and vitamins, promises to prevent blindness in countless newborn kids as it is introduced in the Philippines.

Three, despite the fear-mongering of some NGOs, whose existence depends on alarmism, the science behind GMO is settled. No harmful side effects have been noted in all these years, and millions of lives have been impacted positively. A couple of years ago, over 100 Nobel Laureates signed a petition asserting that GMO foods were safe, and blasting anti-science NGOs that stood in the way of progress. There is scientific consensus on this.

The science may be settled, but the politics is not. The government still bans some types of GMO seeds, such as Bt Brinjal, which was developed by an Indian company called Mahyco, and used successfully in Bangladesh. More crucially, a variety called HT Bt Cotton, which fights weeds, is also banned. Weeding takes up to 15% of a farmer’s time, and often makes farming unviable. Farmers across the world use this variant – 60% of global cotton crops are HT Bt. Indian farmers are so desperate for it that they choose to break the law and buy expensive seeds from the black market – but the government is cracking down. A farmer in Haryana had his crop destroyed by the government in May.

On June 10 this year, a farmer named Lalit Bahale in the Akola District of Maharashtra kicked off a satyagraha by planting banned seeds of HT Bt Cotton and Bt Brinjal. He was soon joined by thousands of farmers. Far from our urban eyes, a heroic fight has begun. Our farmers, already victimised and oppressed by a predatory government in countless ways, are fighting for their right to take charge of their lives.

As this brave struggle unfolds, I am left with a troubling question: All those satyagrahas of the past by our great freedom fighters, what were they for, if all they got us was independence and not freedom?




For this Brave New World of cricket, we have IPL and England to thank

This is the 24th installment of The Rationalist, my column for the Times of India.

Back in the last decade, I was a cricket journalist for a few years. Then, around 12 years ago, I quit. I was jaded as hell. Every game seemed like déjà vu, nothing new, just another round on the treadmill. Although I would remember her fondly, I thought cricket and I were done.

And then I fell in love again. Cricket has changed in the last few years in glorious ways. There have been new ways of thinking about the game. There have been new ways of playing the game. Every season, new kinds of drama form, new nuances spring up into sight. This is true even of what had once seemed the dullest form of the game, one-day cricket. We are entering into a brave new world, and the team leading us there is England. No matter what happens in the World Cup final today – a single game involves a huge amount of luck – this England side are extraordinary. They are the bridge between eras, leading us into a Golden Age of Cricket.

I know that sounds hyperbolic, so let me stun you further by saying that I give the IPL credit for this. And now, having woken you up with such a jolt on this lovely Sunday morning, let me explain.

Twenty20 cricket changed the game in two fundamental ways. Both ended up changing one-day cricket. The first was strategy.

When the first T20 games took place, teams applied an ODI template to innings-building: pinch-hit, build, slog. But this was not an optimal approach. In ODIs, teams have 11 players over 50 overs. In T20s, they have 11 players over 20 overs. The equation between resources and constraints is different. This means that the cost of a wicket goes down, and the cost of a dot ball goes up. Critically, it means that the value of aggression rises. A team need not follow the ODI template. In some instances, attacking for all 20 overs – or as I call it, ‘frontloading’ – may be optimal.

West Indies won the T20 World Cup in 2016 by doing just this, and England played similarly. And some sides began to realise that they had been underestimating the value of aggression in one-day cricket as well.

The second fundamental way in which T20 cricket changed cricket was in terms of skills. The IPL and other leagues brought big money into the game. This changed incentives for budding cricketers. Relatively few people break into Test or ODI cricket, and play for their countries. A much wider pool can aspire to play T20 cricket – which also provides much more money. So it makes sense to spend the hundreds of hours you are in the nets honing T20 skills rather than Test match skills. Go to any nets practice, and you will find many more kids practising innovative aggressive strokes than playing the forward defensive.

As a result, batsmen today have a wider array of attacking strokes than earlier generations. Because every run counts more in T20 cricket, the standard of fielding has also shot up. And bowlers have also reacted to this by expanding their arsenal of tricks. Everyone has had to lift their game.

In one-day cricket, thus, two things have happened. One, there is better strategic understanding about the value of aggression. Two, batsmen are better equipped to act on the aggressive imperative. The game has continued to evolve.

Bowlers have reacted to this with greater aggression on their part, and this ongoing dialogue has been fascinating. The cricket writer Gideon Haigh once told me on my podcast that the 2015 World Cup featured a battle between T20 batting and Test match bowling.

This England team is the high watermark so far. Their aggression does not come from slogging. They bat with a combination of intent and skills that allows them to coast at 6-an-over, without needing to take too many risks. In normal conditions, thus, they can coast to 300 – any hitting they do beyond that is the bonus that takes them to 350 or 400. It’s a whole new level, illustrated by the fact that at one point a few days ago, they had seven consecutive scores of 300-plus to their name. Look at their scores over the last few years, in fact, and it is clear that this is the greatest batting side in the history of one-day cricket – by a margin.

There have been stumbles in this World Cup, but in the bigger picture, those are outliers. If England have a bad day in the final and New Zealand play their A-game, England might even lose today. But if Captain Morgan’s men play their A-game, they will coast to victory. New Zealand does not have those gears. No other team in the world does – for now.

But one day, they will all have to learn to play like this.




Cadence Genus Synthesis Solution – the Next Generation of RTL Synthesis

Physical synthesis has been around in various forms for many years. The basic idea is to bring some awareness of physical layout into synthesis. This week (June 3, 2015) Cadence is rolling out the Genus™ Synthesis Solution, a next-generation RTL synthesis tool that takes physical awareness in some new directions.

Here are four important things to know about Genus technology:

  • A massively parallel architecture improves turnaround time by up to 5X while maintaining quality of results
  • The Genus solution synthesizes designs of 10M+ instances flat without impacting power, performance and area (PPA)
  • The Genus solution provides tight correlation with the Innovus Implementation System, using the same placement and routing algorithms
  • Globally focused PPA optimization saves up to 20% datapath area and power

Compared to previous-generation products such as the Cadence Encounter RTL Compiler Advanced Physical Option, the Genus solution approaches physical synthesis in a different way. The Encounter solution applied physical optimization “at the tail end of synthesis,” said David Stratman, senior principal product manager at Cadence. “We were doing a final incremental push, but we could only do so much, since we had locked in a lot of the earlier steps from a logical-only synthesis perspective.”

Genus Synthesis Solution supports the physical synthesis features in the previous Encounter solution, but it also brings the full physical scope upstream to RTL logic designers. “It’s going to enable the unit-level RTL designer to gain the benefits of physical synthesis without having to understand it,” Stratman said. As an example, users can apply generic (unmapped) placement at the earliest stages of synthesis, using a lightweight version of the Innovus placement engine. The bottom line: “Genus is a full solution where every step of synthesis can be done physically.”

Getting Massively Parallel

If you bring physical data into synthesis, you need a way to improve capacity and runtimes, especially with today’s gigantic advance-node SoCs. That’s why a massively parallel architecture is the cornerstone of the Genus solution. In this way, the Genus solution is following in the footsteps of the Innovus Implementation System, which also provides a massively parallel architecture.

Both the Innovus and Genus solutions can handle blocks of 10M instances flat. Given that SoCs today may have up to 100M instances, and often up to 50-100 top-level blocks, this is an important capability. Many tools today will only handle blocks of 1M instances. As a result, design teams often have to constrain block sizes.

Genus technology offers timing-driven, multi-level design partitioning across multiple threads and machines, enabling near-linear runtime scaling without impacting PPA. According to Stratman, the Genus solution will scale well beyond 64 CPUs for a large design, with a “sweet spot” around 8-20 CPUs for today’s typical block sizes. Runs that used to take days, he noted, can now be done in hours.
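A rough mental model for why such a sweet spot exists (my gloss, not a formula from Cadence) is Amdahl's law: if a fraction $p$ of a synthesis run parallelizes across $n$ CPUs, the overall speedup is

$$S(n) = \frac{1}{(1-p) + p/n}$$

With $p = 0.95$, that works out to roughly 6X at 8 CPUs and 9X at 16, but only about 15X at 64: diminishing returns that fit the stated sweet spot, and a hint of why smarter partitioning (which effectively raises $p$) matters so much.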

As shown below, Genus technology leverages parallelism at three levels. The Genus solution can distribute design partitions to multiple threads or CPUs, and also supports local algorithm-level multithreading on each machine with shared memory. An adaptive scheduler ensures the best use of the available CPUs.


Fig. 1 – Genus Synthesis Solution provides three levels of parallelism

With its massive parallelism, Stratman said, Genus technology can obtain production-level quality of results (QoR) in runtimes typically seen in “prototype-level” synthesis runs. The “secret sauce,” he said, is in the partitioning. Cadence has found a way to generate partitions in a way that “slices the design more intelligently, and takes advantage of the Genus database to merge partitions without losing timing, power, or area,” Stratman said.

Playing in the Sandbox

In the Genus Synthesis Solution, a process called “sandboxing” allows any subset or partition of a design to be extracted along with full timing and a physical context. Optimization algorithms will treat a sandbox as a complete design.

The “Clipper” flow clips out or extracts the context of the larger SoC blocks. “It’s kind of a skeleton floorplan but it has all the timing information,” Stratman said. These extracted contexts include all the critical physical information to make the right RTL synthesis choices at the unit level. This information is used to streamline the handoffs between unit-level RTL designers, integration engineers, and implementation engineers. It’s a way for logic designers to gain some physical knowledge without having to be a physical synthesis expert, or without having to run a full top-level synthesis.

Fig. 2 – Clipper flow provides context for unit-level blocks

Correlation with Innovus Implementation System

Although Genus technology can work with third-party IC implementation systems, it shares algorithms and engines with the Innovus Implementation System, as well as a common user interface. As shown below, both the Genus and Innovus solutions use table-based Quantus QRC parasitic extraction, effective current source model (ECSM) and composite current source (CCS) delay calculations, and a unified global routing engine. Cadence claims that timing and wire length correlate to within 5%.

Fig. 3 – Genus Synthesis Solution offers tight correlation with Innovus Implementation System

Genus technology doesn’t model everything to the same level of accuracy as the Innovus solution, however. “We chose to be lighter weight and more nimble to get expected runtimes,” Stratman said. A tight correlation is possible because the Genus and Innovus solutions use a similar code base. This correlation will be tighter than that between Encounter RTL Compiler Advanced Physical Option and the Encounter Digital Implementation System today.

Genus Synthesis Solution uses a new Hybrid Global Router that provides the ability to resolve congestion and construct layer-aware, timing-driven wire topologies. This accelerates analysis and debug, and reduces iterations. Users can avoid blockages and see a full Manhattan route as opposed to “flight lines.” Layer awareness is particularly important, given the large RC variations within the metal stack at advanced process nodes.

A version of the Innovus GigaPlace engine is available within the Genus solution. Here, users can do an RTL-level generic gate placement early in the synthesis flow (“generic gate” means there is no mapping into standard cell libraries, but there’s still an area estimate). This helps designers understand PPA tradeoffs earlier.

While users can go all the way to a design-rule “legal” placement with Genus Synthesis Solution, this isn’t generally recommended. “You can do a placement and use the same algorithms as GigaPlace and get a nice correlation without all the runtimes and additional steps of doing a fully legal placement,” Stratman said.

So where does Genus technology end and Innovus technology begin? That’s up to the user. You could use the Genus solution for logical synthesis and run all physical implementation in the Innovus system. If you run physical synthesis within the Genus solution, there’s more work earlier in the flow, but you get better insights into downstream problems and reduce iterations.

“Physical synthesis should be no more than 2X [runtime] of logic synthesis,” Stratman said. “All of the runtime that moves up should be shaved off of the place-and-route stages, because now you can do lightweight incremental optimization and incremental placement. The overall flow should be runtime neutral or better.”

Be Globally Aware

Finally, Genus Synthesis Solution offers a globally focused early PPA optimization across the whole datapath, delivering up to a 20% area reduction in the datapath. Stratman noted that this capability is a follow-on to an RTL Compiler (RCP) feature called “globally focused mapping” that can determine the best cells to use in a library. What’s new with the Genus solution is that this concept has been applied at the arithmetic level.

For example, there are many ways to configure a multiplier – you may want to prioritize speed, power, or size. In the past, Stratman noted, synthesis tools have not been very good at globally optimizing the architecture selection for PPA optimization. “We can [now] find the most efficient global datapath implementation for a given region,” he said.
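To make the multiplier point concrete, here is a minimal SystemVerilog sketch (my illustration, not code from the article). The RTL specifies only the arithmetic; which multiplier architecture implements the `*` (fast, small, or low-power) is exactly the selection the globally focused optimization makes per region.

```systemverilog
// Illustrative multiply-accumulate datapath. The "*" below is the kind of
// operator whose hardware architecture (speed- vs. power- vs. area-optimized)
// is chosen by the synthesis tool, not written by the RTL author.
module mac16 (
  input  logic        clk, rst_n,
  input  logic [15:0] a, b,
  output logic [35:0] acc
);
  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n) acc <= '0;
    else        acc <= acc + a * b;  // architecture selected during mapping
  end
endmodule
```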

For further information about the Cadence Genus Synthesis Solution, including a datasheet and technical product brief, see this landing page.

Richard Goering

Related Blog Posts

Designer View – RTL Synthesis Success Strategies at 28nm and Below

Front-End Design Summit: The Future of RTL Synthesis and Design for Test

Physically-Aware Synthesis Helps Design a New Computer Architecture

 





DAC 2015 Cadence Theater – Learn from Customers and Partners

One reason for attending the upcoming Design Automation Conference (DAC 2015) is to learn about challenges other engineers have faced, and hear about their solutions. And the best place to do that is the Cadence Theater, located at the Cadence booth (#3515). The Theater will host continuous half-hour customer and partner presentations from 10:00 am Monday, June 8, to 5:30 pm Wednesday, June 10.

As of this writing, 43 presentations are scheduled: 17 customer presentations, 23 partner presentations, and 3 Cadence presentations. The presentations are open to all DAC attendees, and no reservations are required.

Cadence customers who will be speaking include engineers from AMD, ams, Allegro Micro, Broadcom, IBM, Netspeed, NVidia, Renesas, Socionet, and STMicroelectronics. Partner presentations will be provided by ARM, Cliosoft, Dini Group, GLOBALFOUNDRIES, Methodics, Methods2Business, National Instruments, Samsung, TowerJazz, TSMC, and X-Fab.

These informal presentations are given in an interactive setting with an opportunity for questions and answers. Audio recordings with slides will be available at the Cadence web site after DAC. To access recordings of the 2014 DAC Theater presentations, click here.

 

A Cadence DAC Theater presentation drawing a large audience at DAC 2014

Here’s a listing of the currently scheduled Cadence DAC Theater presentations. The latest schedule is available at the Cadence DAC 2015 site.

[Day-by-day schedule listings for Monday, June 8; Tuesday, June 9; and Wednesday, June 10 appeared here.]

In a Wednesday session (June 10, 10:00 am) at the theater, the Cadence Academic Network will sponsor three talks on academic/industry collaboration models. Speakers are Dr. Zhou Li, architect, Cadence; Prof. Xin Li, Carnegie-Mellon University; and Prof. Laleh Behjat, University of Calgary.

There will also be giveaways: a set of Bose noise-cancelling headphones, an iPad Mini, and a GoPro Hero3 video camera.

See the Cadence Theater schedule for further details. And be sure to view our Multimedia Site for live blogging and photos and videos from DAC. For a complete overview of Cadence activities at DAC, see our DAC microsite.

Richard Goering

Related Blog Posts

DAC 2015: See the Latest in Semiconductor IP at “IPTalks!”

Cadence DAC 2015 and Denali Party Update

DAC 2015: Tackling Tough Design Problems Head On





Cadence JasperGold Brings Formal Verification into Mainstream IC Verification Flows

Formal verification is a complex technology that has traditionally required experts or specialized teams who stood apart from the IC design and verification flow. Taking a different approach, a new release of the Cadence JasperGold formal verification platform (June 8, 2015) provides formal techniques that complement simulation, emulation, and debugging in the form of “Apps” or under-the-hood solutions that any design or verification engineer can use.

JasperGold was the initial (in fact only) product of Jasper Design Automation, acquired by Cadence in 2014. Jasper pioneered the formal Apps concept several years ago. While the company had previously sold JasperGold as a one-size-fits-all solution, Jasper began selling semi-automated JasperGold Apps that solved specific problems using formal analysis technology.

The new release is the next generation of JasperGold and will be available later this month. It includes three major improvements over previous Cadence and Jasper formal analysis offerings:

  • A unified Cadence Incisive and JasperGold formal verification platform delivers up to 15X performance gain over previous solutions.
  • JasperGold is integrated into the Cadence System Development Suite, where it provides formal-assisted simulation, emulation, and coverage. As a result, System Development Suite users can find bugs three months earlier than existing verification methods.
  • JasperGold’s formal analysis engines are integrated with the recently announced Indago debug platform, automating root cause analysis and on-the-fly, what-if exploration.

Best of Both Formal Verification Worlds

Taking advantage of technologies from both Cadence and Jasper, the new JasperGold represents a “best of both worlds” solution, according to Pete Hardee, product management director at Cadence. This solution combines technologies from the Cadence Incisive Enterprise Verifier and Incisive Formal Verifier with JasperGold formal analysis engines.

For example, to ease migration from Incisive formal tools, Cadence has integrated an Incisive common front end into the JasperGold apps platform. Jasper formal engines can run within the Incisive run-time environment. Cadence has also brought some selected Incisive formal engines into JasperGold.

As shown to the right, the JasperGold platform supports both the existing JasperGold front-end parser and the Incisive front-end parser. Hardee observed that this dual parser arrangement simplifies migration from Incisive formal tools to JasperGold, and provides a common compilation environment for people who want to use JasperGold with Incisive simulation. Further, the common run-time environment enables formal-assisted simulation.

The combination of JasperGold engines and Incisive engines supports two use models for formal analysis: formal proofs and bug hunting. In the first case, formal engines try all combinations of inputs without a testbench. The test is driven by formal properties written in languages such as SVA (SystemVerilog assertions) or PSL (Property Specification Language). Completion of a property is exhaustive proof that something can or cannot happen. This provides a “much stronger result” than simulation, Hardee said.
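For readers new to formal, here is a small illustration of the kind of SVA properties involved (invented signal names, not code from the Cadence release). If the first assertion completes, that is exhaustive proof that a grant can never appear without a request, under any input sequence:

```systemverilog
// Illustrative handshake properties; clk, rst_n, req and gnt are hypothetical.
module handshake_props (input logic clk, rst_n, req, gnt);

  // A grant must never appear without a request in the same cycle.
  no_spurious_gnt: assert property
    (@(posedge clk) disable iff (!rst_n) gnt |-> req);

  // Every request must be granted within 1 to 4 cycles.
  req_granted: assert property
    (@(posedge clk) disable iff (!rst_n) req |-> ##[1:4] gnt);

endmodule
```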

He also noted that formal analysis doesn’t necessarily require that all properties are completed. “You can get a lot of value even if proofs don’t complete,” he said. “Proofs that run deep enough to find bugs are just fine.”

Bug hunting involves random searches, and JasperGold bug hunting engines are very fast. However, these engines don’t necessarily use the most optimal path to get to a bug. So, Cadence engineers brought a constraint solver from Incisive and integrated it into JasperGold. “It looks at the constraints in the environment and gives you a better starting point,” Hardee said. “It takes more up-front time, but once you’ve done that the bug hunting engines can actually take a shorter path and find a bug a lot quicker.”

Another new JasperGold capability from the Incisive Formal Verifier is called “search pointing.” This uses simulation to penetrate deeply into the state space, and then kicks off a random formal search from a given point that you’ve reached in simulation. This technique makes it possible to find bugs that are very deep in the design.

It is probably clear by now that a number of different formal “engines” may be required to solve a given verification problem. Traditionally, a formal tool (or user) will farm a problem out to many engines and see which one works best. To put more intelligence into that process, Cadence launched the Trident “multi-cooperating engine” a couple of years ago. That has now been brought into JasperGold, where it helps “orchestrate” the engines according to what will work best for the design. This is a big part of the reason for the 15X speedup noted earlier in this post.

Integration with System Development Suite

The Cadence System Development Suite is an integrated set of hardware/software development and verification engines, including virtual prototyping, Incisive simulation, emulation, and FPGA-based prototyping. As shown below, JasperGold technology is integrated into the System Development Suite in several places, including formal-assisted debug, formal-assisted verification closure, formal-assisted simulation, formal-assisted emulation, and the Incisive vManager verification planning tool.

Formal-assisted emulation sounds like it should be easy, especially since Cadence has both accelerated verification IP (VIP) and assertion-based VIP. However, there’s a complication. Accelerated VIP represents less verification content than simulation VIP, because you have to remove many checkers to get VIP to compile on a Palladium emulator. That’s because the Palladium requires synthesizable code.

What you can do, however, is use assertion-based VIP in “snoop mode” as shown below. Assertion-based VIP coded in synthesizable SystemVerilog can replace the missing checkers in accelerated VIP. In this diagram, everything in the green box is running in the emulator and is thus completely accelerated.
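As a rough sketch of the idea (the module, instance, and signal names are invented for illustration), a synthesizable checker can be bound alongside the design so that it compiles onto the emulator with everything else:

// Synthesizable checker observing FIFO handshake signals
module fifo_checker (input logic clk, rst_n, wr_en, full);
  // Flag any write into a full FIFO
  a_no_overflow: assert property
    (@(posedge clk) disable iff (!rst_n) !(wr_en && full));
endmodule

// Attach the checker to every instance of the fifo module
bind fifo fifo_checker chk_i (.clk(clk), .rst_n(rst_n), .wr_en(wr_en), .full(full));

The checker stays within the synthesizable subset, so it can replace some of the checking that was stripped from the accelerated VIP.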


Another example of formal-assisted emulation has to do with deep traces. As Hardee noted, emulation will produce very long traces, and it can be very difficult to find a point of interest in the trace and determine what caused an error. With formal-assisted emulation, users can find interesting events within the traces and create properties that mark them, so a debugger can find these events and trace back to the root cause.
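Marking such an event can be as small as a cover statement (a hypothetical sketch; the signal names are invented):

// Mark the event of interest so the debugger can locate it in a deep trace
module trace_marker (input logic clk, wr_en, fifo_full);
  c_overflow_attempt: cover property (@(posedge clk) wr_en && fifo_full);
endmodule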

Formal-assisted verification closure is available with the new JasperGold release. This is possible because you can use the vManager product to determine which tasks were completed by formal engines. It’s important information for verification managers who are not used to formal tools, Hardee noted.

Another aspect of formal-assisted verification closure is the JasperGold Unreachability Analysis (UNR) App, which can save simulation users weeks of time and effort. This App takes in the simulation coverage database and RTL, and automatically generates properties to explore coverage holes and determine if holes are reachable or unreachable. The App then generates an unreachable coverage point database. If the unreachable code does something useful, there’s a bug in the design or the testbench; if not, you don’t have to worry about it. The diagram below shows how it works.

Formal-Assisted Debugging

The third major component of the JasperGold announcement is the integration of formal analysis into the Indago debugging platform. As shown below, this platform has several apps, including the Indago Debug Analyzer. Two formal debug capabilities from the Jasper Visualize environment have been added to the Indago Debug Analyzer:

  • Highlight Relevant Logic: This highlights the “cone of influence,” or the logic that is involved in reaching a given point
  • Why: This button highlights the immediate causes for a given event, and allows users to trace backwards in time


More formal capabilities will come with the Indago Advanced Debug Analyzer app, scheduled for release towards the end of 2015. This includes Quiet Trace, a Jasper capability that reduces trace activity to transactions relevant to an event. Also, a what-if analysis allows on-the-fly trace editing and recalculation to explore effects and sensitivities, without having to re-compile and re-execute the simulation.

Finally, Cadence has a Superlint flow that is now fully integrated with the JasperGold Visualize debugger. This two-tiered flow includes a basic lint capability as well as automated formal analysis based on the JasperGold Structural Property Synthesis app. “This could be a very good entry point for designers to start using formal,” Hardee said.

“Formal is taking off,” Hardee concluded. “People are no longer talking about return on investment for formal—they have established that. Now they’re supporting a proliferation of formal in their companies such that a wider set of people experience the benefit from that proven return on investment.”

Further information is available at the JasperGold Formal Verification Platform (Apps) page.

Richard Goering

Related Blog Posts

JUG Keynote—How Jasper Formal Verification Technology Fits into the Cadence Flow

Why Cadence Bought Jasper—A New Era in Formal Analysis

Q&A: An R&D Perspective on Formal Verification—Past, Present and Future




d

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions

First, the good news. The EDA industry will grow from $6.2 billion in 2015 to $9.0 billion in 2019, according to Gary Smith, chief analyst at Gary Smith EDA. Year-to-year growth rates will range from +4% to +11.2%.

But in his annual presentation on the eve of the Design Automation Conference (DAC 2015), Smith noted that Wall Street is unimpressed. “The people I talk to want long-term steady growth, no sharp up-turns, no sharp downturns,” Smith said. “To the rest of Wall Street, we’re boring.”

Smith spent the rest of his talk noting how EDA can be a lot less boring and, potentially, a whole lot bigger. For starters, what if we add semiconductor IP to EDA revenues? Now we’re looking at $12.2 billion in revenue by 2019, Smith said. (He acknowledged, however, that the IP market itself is going to take a “dip” due to the move towards platform-based IP and away from conventional piecemeal IP).

This still is not enough to get Wall Street’s attention. Another possibility is to bring embedded software development into the EDA industry. This is not a huge market – about $2.6 billion today – but it is an “easy growth market for us,” according to Smith.

Chasing the Big Bucks

But the “big bucks” are in mechanical CAD (MCAD), Smith said. In the past the MCAD market has always been bigger than EDA, but now EDA is catching up. The MCAD market is about $6.6 billion now. Synopsys and Cadence are larger than PTC and Siemens, two of the main players in MCAD.

There may be some good acquisition possibilities coming up for EDA vendors, Smith said – and if we don’t buy MCAD companies, they might buy EDA companies. Consider, for example, that ANSYS bought Apache and Dassault bought Synchronicity. (Note: Siemens PLM Software is a first-time exhibitor at DAC 2015).

What about other domains? Smith said that EDA companies could conceivably move into optical design, applications development software, biomedical design, and chemical design. The last of these is probably the most tenuous; Smith noted that EDA vendors have yet to look into chemical design.

Applications development software is the biggest market on the above list, but that means competing with Microsoft, IBM, and Oracle. “You’re in with the big boys – is that a good idea?” Smith asked.

Perhaps there’s an opening for a “big play” for an MCAD provider. Smith noted that mechanical vendors are focusing on product data management (PDM). This “is really the IT of design,” Smith said. “They have a lot of hope that the IoT [Internet of things] market is going to give them an opportunity to capture the software that goes from the ground to the cloud. Maybe we can let them have PDM and see if we can take the tool market away from them, or acquire it away from them.”

In conclusion, Smith asked, should the EDA industry accelerate its growth? “The mechanical vendors have already shown interest in acquiring EDA vendors,” he said. “We may not have a choice.”

Richard Goering

NOTE: Catch our live blog from DAC 2015, beginning Monday morning, June 8! Click here





d

DAC 2015: Google Smart Contact Lens Project Stretches Limits of IC Design

There has been so much hype about the “Internet of Things” (IoT) that it is refreshing to hear about a cutting-edge development project that can bring concrete benefits to millions of people. That project is the ongoing development of the Google Smart Contact Lens, and it was detailed in a keynote speech June 8 at the Design Automation Conference (DAC 2015).

The keynote speech was given by Brian Otis (right), a director at Google and a research associate professor at the University of Washington. The “smart lens” that the project envisions is essentially a disposable contact lens that fits on an eye and continuously monitors blood glucose levels. This is valuable information for anyone who has, or may someday have, diabetes.

Since he was speaking to an engineering audience, Otis focused on the challenges behind building such a device, and described some of the strategies taken by Google and its partner, Novartis. The project required new approaches to miniaturization, low-power design, and connectivity, as well as a comfortable and reliable silicon-to-human interface. Otis discussed the “why” as well and showed how the device could potentially save or improve millions of lives.

Millions of Users

First, a bit of background. Google announced the smart lens project in a blog post in January 2014. Since then it has been featured in news outlets including Forbes, Time, and the Wall Street Journal. In March 2015, Time reported that Google has been granted a patent for a smart contact lens.

The smart lens monitors the level of blood glucose by looking at its concentration in tears. The lens includes a wireless system on chip (SoC) and a miniaturized glucose sensor. A tiny pinhole in the lens allows tear fluid to seep into the sensor, and a wireless antenna handles communications with external devices.

“We figure that if we can solve a huge problem, it is probably worth doing,” Otis said. “Diabetes is one example.” He noted 382 million people worldwide have diabetes today, and that 35% of the U.S. population may be pre-diabetic. Today, diabetics must prick their fingers to test blood glucose levels, a procedure that is invasive and painful, and as a result monitoring is often infrequent.

According to Otis, the smart contact lens represents a “new category of wearable devices that are comfortable, inexpensive, and empowering.” The lens does sensor data logging and uses a portable instrument to measure glucose levels. It is thin, cheap, and disposable, he said.

Moreover, the lens is not just for people already diagnosed with diabetes—it’s for anyone who is pre-diabetic, or may be at risk due to genetic predisposition. “If we are pro-active rather than re-active,” Otis said, “instead of waiting until a person has full-fledged diabetes, we could make a huge difference in people’s lives and lower the costs of treating them.”

Technical Challenges

No one has built anything quite like the smart lens, so researchers at Google and Novartis are treading new ground. Otis identified three key challenges:

  • Miniaturization: Everything must be really small—the SoC, the passive components, the power supply. Components must be flexible and cheap, and support thin-film integration.
  • Platform: Google has developed a reusable platform that includes tiny, always-on wireless sensors, ultra low-power components, and standards-based interfaces.
  • Data: Researchers are looking for the best ways to get the resulting data into a mobile device and onto the cloud.

Comfort is another concern. “This is not intended to be for the most severe cases,” Otis said. “This is intended to be for all of us as a pro-active way of improving our lifestyles.”

The platform provides a bidirectional encrypted wireless link, integrated power management, on-chip memory, standards-based RFID link, flexible sensor interface, high-resolution potentiostat sensor, and decoupling capacitors. Most of these capabilities are provided by the standard CMOS SoC, which is a couple hundred microns on a side and only “tens of microns” thick.

Otis noted that unpackaged ICs are typically 250 microns thick when they come back from the foundry. Thus, post-processing is needed so the IC will fit into a contact lens.

Furthermore, the design requires precision analog circuitry and additional environmental sensors. “Some of this stuff sounds mundane but it is really hard, especially when you find out you can’t throw large decoupling capacitors and bypass capacitors onto a board, and all that has to be re-integrated into the chip,” Otis said.

Sensor Challenges

Getting information from the human body is challenging. The smart lens sensor does a direct chemical measurement on the surface of the eye. The sensor is designed to work with very low glucose concentrations. This is because the concentration of glucose in tears is an order of magnitude lower than it is in blood.

In brief, the sensor has two parallel plates that are coated with an enzyme that converts glucose into hydrogen peroxide, which flows around the electrodes of the sensor. This is actually a fairly standard way of doing glucose monitoring. However, the smart lens sensor has two electrodes compared to the typical three.

In manufacturing, it is essential to keep costs low. Otis outlined a three-step manufacturing process:

  • Start with the bottom layer, and mold a contact lens in the way you typically would.
  • Add the electronics package on top of that layer.
  • Build a second layer that encapsulates the electronics and provides the curvature needed for comfort and vision correction.

Beyond the technical challenges are the “clinical” challenges of working with human beings. The human body “is messy and very variable,” Otis said. This variability affects sensor performance and calibration, RF/electro-magnetic performance, system reliability, and comfort.

The final step is making use of the data. “We need to get the data from the device into a phone, and then display it so users can visualize the data,” Otis said. This provides “actionable feedback” to the person who needs it. Eventually, the data will need to be stored in the cloud.

As he concluded his talk, Otis noted that the platform his group developed may have many applications beyond glucose monitoring. “There is a lot you can do with a bunch of logic and sensing capability,” he said, “and there are hundreds of biomarkers beyond glucose.” Clearly this will be an interesting technology to watch.

Richard Goering

Related Blog Post

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions




d

DAC 2015: Lip-Bu Tan, Cadence CEO, Sees Profound Changes in Semiconductors and EDA

As a leading venture capitalist in electronics technology, as well as CEO of Cadence, Lip-Bu Tan has unique insights into ongoing changes that will impact EDA providers and users. Tan shared some of those insights in a “fireside chat” with Ed Sperling, editor in chief of Semiconductor Engineering, at the Design Automation Conference (DAC 2015) on June 9.

Topics of this discussion included industry consolidation, the need for more talent and more startups, Internet of Things (IoT) opportunities and challenges, the shift from ICs to full product development, and the challenges of advanced nodes. Following are some excerpts from this conversation, held at the DAC Pavilion theater on the exhibit floor.


Ed Sperling (left) and Lip-Bu Tan (right) discuss trends in semiconductors and EDA

Q: As you look out over the semiconductor and EDA industries these days, what worries you most?

Tan: At the top of my list is all the consolidation that is going on. Secondly, chip design complexity is increasing substantially. Time-to-market pressure is growing and advanced nodes have challenges.

The other thing I worry about is that we need to have more startups. There’s a lot of innovation that needs to happen. And this industry needs more top talent. At Cadence, we have a program to recruit over 10% of new hires every year from college graduates. We need new blood and new ideas.

Q: EDA vendors were acquiring companies for many years, but now the startups are pretty much gone. Where does the next wave of innovation come from?

Tan: I’ve been an EDA CEO for the last seven years and I really enjoy it because so much innovation is needed. System providers have very big challenges and very different needs. You have to find the opportunities and go out and provide the solutions.

The opportunities are not just in basic tools. Massive parallelism is critical, and the power challenge is huge. Time to market is critical, and for the IoT companies, cost is going to be critical. If you want to take on some good engineering challenges, this is the most exciting time.

Q: You live two lives—you’re a CEO but you’re also an investor. Where are the investments going these days and where are we likely to see new startups?

Tan: Clearly everybody is chasing the IoT. There is a lot of opportunity in the cloud, in the data center. Also, I’m a big believer in video, so I back companies that are video related. A big area is automotive. ADAS [Advanced Driver Assistance Systems] is a tremendous opportunity.

These companies can help us understand how the industry is transforming, and then we can provide solutions, either in terms of IP, tools, or the PCB. Then we need to connect from the system level down to semiconductors. I think it’s a different way to design.

Q: What happens as we start moving from companies looking to design a semiconductor to system companies who are doing things from the perspective that we have this purpose for our software?

Tan: We are extending from EDA to what we call system design enablement, and we are becoming more application driven. The application at the system level will drive the silicon design. We need to help companies look at the whole system including the power envelope and signal integrity. You don’t want to be in a position where you design a chip all the way to fabrication and then find the power is too high.

We help the customers with hardware/software co-design and co-verification. We have a design suite and a verification suite that can provide customers with high-level abstractions, as well as verify IP blocks at the system level. Then we can break things down to the component level with system constraints in mind, and drive power-aware, system-aware design.

We are starting to move into vertical markets. For example, medical is a tremendous opportunity.

Q: How does this approach change what you provide to customers?

Tan: Every year I spend time meeting with customers. I think it is very important to understand what they are trying to design, and it is also important to know the customer’s customer requirements. We might say, “Wait a minute, for this design you may want to think about power or the library you’re using.” We help them understand what foundry they should use and what process they should use. They don’t view me as a vendor; they view me as a partner.

We also work very closely with our IP and foundry partners. We work as one team; the ultimate goal is customer success.

Q: Is everybody going to say, FinFETs are beautiful, we’re going to go down to 10nm or 7nm? Or is it a smaller number of companies who will continue down that path?

Tan: Some of the analog/mixed-signal companies don’t need to go that far. We love those customers; we have close to 50% of that business. But we also have customers in the graphics or processor area who are really pushing the envelope, and need to be in 16nm, 14nm, or 10nm. We work very closely with those guys to make sure they can go into FinFETs.

We always want to work with the customer to make sure they have a first-time silicon success. If you have to do a re-spin, you miss the opportunity and it’s very costly.

Q: There’s a new market that is starting to explode: IoT. How real is that world to you? Everyone talks about large numbers, but is it showing up in terms of tools?

Tan: Everybody is talking about huge profits, but a lot of the time I think it is just connecting old devices that you have. Billions of units, absolutely yes, but if you look close enough the silicon percentage of that revenue is very tiny. A lot of the profit is on the service side. So you really need to look at the service killer app you are trying to provide.

What’s most important to us in the IoT market is the IP business. That’s why we bought Tensilica; it’s programmable, so you can find the killer app more quickly. The other challenges are time to market, low power, and low cost.

Q: Where is system design enablement going? Does it expand outside the traditional market for EDA?

Tan: It’s not just about tools. IP is now 11% of our revenue. At the PCB level, we acquired a company called Sigrity, and through that we are able to drive system analysis for power, signal integrity, and thermal. And then we look at some of the verticals and provide modeling all the way from the system level to the component level. We make sure that we provide a solution to the end customer, rather than something piecemeal.

Q: What do you think DAC will look like in five years?

Tan: It’s getting smaller. We need to see more startups and innovative IP solutions. I saw a few here this year, and that’s good. We need to encourage small startups.

Q: Where do we get the people to pull this off? I don’t see too many people coming into EDA.

Tan: I talk to a lot of university students, and I tell them that this small industry is a gold mine. A lot of innovation is needed. We need them to come in [to EDA] rather than join Google or Facebook. Those are great companies, but there is a lot of fundamental physical innovation we need.

Richard Goering

Related Blog Posts

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions

DAC 2015: Google Smart Contact Lens Project Stretches Limits of IC Design

Q&A with Nimish Modi: Going Beyond Traditional EDA




d

DAC 2015 Accellera Panel: Why Standards are Needed for Internet of Things (IoT)

Design and verification standards are critical if we want to get a new generation of Internet of Things (IoT) devices into the market, according to panelists at an Accellera Systems Initiative breakfast at the Design Automation Conference (DAC 2015) June 9. However, IoT devices for different vertical markets pose very different challenges and requirements, making the standards picture extremely complicated.

The panel was titled “Design and Verification Standards in the Era of IoT.” It was moderated by industry editor John Blyler, CEO of JB Systems Media and Technology. Panelists were as follows, shown left to right in the photo below:

  • Lu Dai, director of engineering, Qualcomm
  • Wael William Diab, senior director for strategy marketing, industry development and standardization, Huawei
  • Chris Rowen, CTO, IP Group, Cadence Design Systems, Inc.


In opening remarks, Blyler recalled a conversation from the recent IEEE International Microwave Symposium in which a panelist pointed to the networking and application layers as the key problem areas for RF and wireless standardization. Similarly, in the IoT space, we need to look “higher up” at the systems level and consider both software and hardware development, Blyler said.

Rowen helped set some context for the discussion by noting three important points about IoT:

  • IoT is not a product segment. Vertical product segments such as automotive, medical devices, and home automation all have very different characteristics.
  • IoT “devices” are components within a hierarchy of systems that includes sensors, applications, user interface, gateway application (such as cell phone), and finally the cloud, where all data is aggregated.
  • A bifurcation is taking place in design. We are going from extreme scale SoCs to “extreme fit” SoCs that are specialized, low energy, and very low cost.

Here are some of the questions and answers that were addressed during the panel discussion.

Q: The claim was recently made that given the level of interaction between sensors and gateways, 50X more verification nodes would have to be checked for IoT. What standards need to be enhanced or changed to accomplish that?

Rowen: That’s a huge number of design dimensions, and the way you attack a problem of that scale is by modularization. You define areas that are protected and encapsulated by standards, and you prove that individual elements will be compliant with that interface. We will see that many interesting problems will be in the software layers.

Q: Why is standardization so important for IoT?

Dai: A company that is trying to make a lot of chips has to deal with a variety of standards. If you have to deal with hundreds of standards, it’s a big bottleneck for bringing your products to market. If you have good standardization within the development process of the IC, that helps time to market.

When I first joined Qualcomm a few years ago, there was no internal verification methodology. When we had a new hire, it took months to ramp up on our internal methodology to become effective. Then came UVM [Universal Verification Methodology], and as UVM became standard, we reduced our ramp-up time tremendously. We’ve seen good engineers ramp up within days.

Diab: When we start to look at standards, we have to do a better job of understanding how they’re all going to play with each other. I don’t think one set of standards can solve the IoT problem. Some standards can grow vertically in markets like industrial, and other standards are getting more horizontal. Security is very important and is probably one thing that goes horizontally.

Requirements for verticals may be different, but processing capability, latency, bandwidth, and messaging capability are common [horizontal] concerns. I think a lot of standards organizations this year will work on horizontal slices [of IoT].

Q: IoT interoperability is important. Any suggestions for getting that done and moving forward?

Rowen: The interoperability problem is that many of these [IoT] devices are wireless. Wireless is interesting because it is really hard – it’s not like a USB plug. Wireless lacks the infrastructure that exists today around wired standards. If we do things in a heavily wireless way, there will be major barriers to overcome.

Dai: There are different standards for 4G LTE technology for different [geographical] markets. We have to make a chip that can work for 20 or 30 wireless technologies, and the cost for that is tremendous. The U.S., Europe, and China all have different tweaks. A good standard that works across the globe would reduce the cost a lot.

Q: If we’re talking about the need to define requirements, a good example to look at is power. Certainly you have UPF [Unified Power Format] for the chip, board, and module.

Rowen: There is certainly a big role for standards about power management. But there is also a domain in which we’re woefully under-equipped, and that is the ability to accurately model the different power usage scenarios at the applications level. Too often power devolves into something that runs over thousands of cycles to confirm that you can switch between power management levels successfully. That’s important, but it tells you very little about how much power your system is going to dissipate.

Dai: There are products that claim to be UPF compliant, but my biggest problem with my most recent chip was still with UPF. These tools are not necessarily 100% UPF compliant.

One other concern I have is that I cannot take Verilog code that passes on one simulator and have it pass on another. Even though we have a lot of tools, there is no certification process for a language standard.

Q: When we create a standard, does there need to be a companion compliance test?

Rowen: I think compliance is important. Compliance is being able to prove that you followed what you said you would follow. It also plays into functional safety requirements, where you need to prove you adhered to the flow.

Dai: When we [Qualcomm] sell our 4G chips, we have to go through a lot of certifications. It’s often a differentiating factor.

Q: For IoT you need power management and verification that includes analog. Comments?

Rowen: Small, cheap sensor nodes tend to be very analog-rich, lower scale in terms of digital content, and have lots of software. Part of understanding what’s different about standardization is built on understanding what’s different about the design process, and what does it mean to have a software-rich and analog-rich world.

Dai: Analog is important in this era of IoT. Analog needs to come into the standards community.

Richard Goering

Cadence Blog Posts About DAC 2015

Gary Smith at DAC 2015: How EDA Can Expand Into New Directions

DAC 2015: Google Smart Contact Lens Project Stretches Limits of IC Design

DAC 2015: Lip-Bu Tan, Cadence CEO, Sees Profound Changes in Semiconductors and EDA

DAC 2015: “Level of Compute in Vision Processing Extraordinary” – Chris Rowen

DAC 2015: Can We Build a Virtual Silicon Valley?

DAC 2015: Cadence Vision-Design Presentation Wins Best Paper Honors





d

DAC 2015: How Academia and Industry Collaboration Can Revitalize EDA

Let’s face it – the EDA industry needs new people and new ideas. One of the best places to find both is academia, and a presentation at the Cadence Theater at the recent Design Automation Conference (DAC 2015) described collaboration models that are working today.

The presentation was titled “Industry/Academia Engagement Models – From PhD Contests to R&D Collaborations.” It included these speakers, shown from left to right in the photo below:

  • Prof. Xin Li, Electrical and Computer Engineering, Carnegie-Mellon University (CMU)
  • Chuck Alpert, Senior Software Architect, Cadence
  • Prof. Laleh Behjat, Department of Electrical and Computer Engineering, University of Calgary


Alpert, who was filling in for Zhuo Li, Software Architect at Cadence, was the vice chair of DAC 2015 and will be the general chair of DAC 2016 in Austin, Texas. “My team at Cadence really likes to collaborate with universities,” he said. “We’re a big proponent of education because we really need the best and brightest students in our industry.”

Contests Boost EDA Research

One way that Cadence collaborates with academia is participation in contests. “It’s a great way to formulate problems to academia,” Alpert said. “We can have the universities work on these problems and get some strategic direction.”

For example, Cadence has been involved with the annual CAD contest at the International Conference on Computer-Aided Design (ICCAD) since the contest was launched in 2012. This is the largest worldwide EDA R&D contest, and it is sponsored by the IEEE Council on EDA (CEDA) and the Taiwan Ministry of Education. Its goals are to boost EDA research in advanced real-world problems and to foster industry-academia collaboration.

Contestants can participate in one or more problems in the three areas of system design, logic synthesis and verification, and physical design. The 2015 contest has attracted 112 teams from 12 regions. Cadence contributes one problem per year in the logic synthesis area. Zhuo Li was the 2012 co-chair and the 2013 chair. The awards will be given at ICCAD in November 2015.

Another step that Cadence has taken, Alpert said, is to “hire lots of interns.” His own team has four interns at the moment. One advantage to interning at Cadence, he said, is that students get to see real-world designs and understand how the tools work. “It helps you drive your research in a more practical and useful direction,” he said.

The Cadence Academic Network co-sponsors the ACM SIGDA PhD Forum at DAC, and Xin Li and Zhuo Li are on the organizing committee. This event is a poster session for PhD students to present and discuss their dissertation research with people in the EDA community. This year’s forum was “packed,” Alpert said, and it’s clear that the event needs a bigger room.

Finally, Alpert noted, Cadence researchers write and publish technical papers at DAC and other conferences, and Cadence people serve on the DAC technical program committee. “We try to be involved with the academic community on a regular basis,” Alpert said. “We want the best and the brightest people to go into EDA because there is still so much innovation that’s needed. It’s a really cool place to be.”

Research Collaboration Exposes Failure Rates

Xin Li presented an example of a successful research collaboration between CMU and Cadence. The challenge was to find a better way to estimate potential failure rates in memory. As noted in a previous blog post, PhD student Shupeng Sun met this challenge with a new statistical methodology that won a Best Poster award at the ACM SIGDA PhD Forum at DAC 2014.

The new methodology is called Scaled-Sigma Sampling (SSS). It calculates the failure rate and accounts for variability in the manufacturing process while only requiring a few hundred, or a few thousand, sample circuit blocks. Previously, millions of samples were required for an accurate validation of a new design, and each sample could take minutes or hours to simulate. It could take a few weeks or months to run one validation.

The SSS methodology requires greatly reduced simulation times. It makes it possible, Li noted, to run simulations overnight and see the results in the morning.

Li shared his secret for success in collaborations. “I want to emphasize that before the collaboration, you have to understand the goal. If you don’t have a clear goal, don’t collaborate. Once you define the goal, stick to it and make it happen.”

Contest Provides Learning Experience

Last year Laleh Behjat handed two of her new PhD students a challenge. “I told them there is an ISPD [International Symposium for Physical Design] contest on placement, and I expect you to participate and I expect you to win. Not knowing anything about placement, I don’t think they realized what I was asking them.”

The 2015 contest was called the Blockage-Aware Detailed Routing-Driven Placement Contest. Results were announced at the end of March at ISPD. And the University of Calgary team, despite its lack of placement experience, took second place.

Such contests provide a good learning tool, according to Behjat. Graduate students in EDA, she said, “have to be good programmers. They have to work in teams and be collaborative, be able to innovate, and solve the hardest problems I have seen in engineering and science. And they have to think outside the box.” A contest can bring out all these attributes, she said.

Further, Behjat noted, contest participants had access to benchmarks and to a placement tool. They didn’t have to write tools to find out if their results were good. Industry sponsors, meanwhile, got access to good students and new approaches for solving problems.

“You can see Cadence putting a big amount of time, effort and money to get students here and get them excited about doing contests,” she said. She advised students in the theater audience to “talk to people in the Cadence booth and see if you can have more ideas for collaboration.”

Richard Goering

Related Blog Posts

EDA Plus Academia: A Perfect Game, Set and Match

Cadence Aims to Strengthen Academic Partnerships

BSIM-CMG FinFET Model – How Academia and Industry Empowered the Next Transistor




d

DAC 2015: Jim Hogan Warns of “Looming Crisis” in Automotive Electronics

EDA investor and former executive Jim Hogan is optimistic about automotive electronics, but he has some concerns as well. At the recent Design Automation Conference (DAC 2015), he delivered a speech titled “The Looming Quality, Reliability, and Safety Crisis in Automotive Electronics...Why is it and what can we do to avoid it?"

Hogan gave the keynote speech for IP Talks!, a series of over 30 half-hour presentations located at the ChipEstimate.com booth. Presenters included ARM, Cadence, eSilicon, Kilopass, Sidense, SilabTech, Sonics, Synopsys, True Circuits, and TSMC. Held in an informal setting, the talks addressed the challenges faced by SoC design teams and showed how the latest developments in semiconductor IP can contribute to design success.

Jim Hogan delivers keynote speech at DAC 2015 IP Talks!

Hogan talked about several phases of automotive electronics. These include assisted driving to avoid collisions, controlled automation of isolated tasks such as parallel parking, and, finally, fully autonomous vehicles, which Hogan expects to see in 15 to 20 years. The top immediate priorities for automotive electronics designers, he said, will be government regulation, fuel economy, advanced safety, and infotainment.

More Code than a Boeing 777

According to Hogan, today’s automobiles use 50-100 microcontrollers per car, resulting in a worldwide automotive semiconductor market of around $40 billion. The global market for advanced automotive electronics is expected to reach $240 billion by 2020. Software is growing faster in the automotive market than it is in smartphones. Hogan quoted a Ford vice president who observed that there are more lines of code in a Ford Fusion car than a Boeing 777 airplane.

One unique challenge for automotive electronics designers is long-term reliability. This is because a typical U.S. car stays on the road for 15 years, Hogan said. Americans are holding onto new vehicles for a record 71.4 months.

Another challenge is regulatory compliance. Aeronautics is highly regulated from manufacturing to air traffic control, and the same will probably be true of automated cars. Hogan speculated that the Department of Transportation will be the regulatory authority for autonomous cars. Today, automotive electronics providers must comply with the ISO 26262 automotive functional safety specification.

So where do we go from here? “We’ve got to change our mindset,” Hogan said. “We’ve got to focus on safety and reliability and demand a different kind of engineering discipline.” You can watch Hogan’s entire presentation by clicking on the video icon below, or clicking here. You can also watch other IP Talks! videos from DAC 2015 here.

https://youtu.be/qL4kAEu-PNw


Richard Goering

Related Blog Posts

DAC 2015: See the Latest in Semiconductor IP at “IP Talks!”

Automotive Functional Safety Drives New Chapter in IC Verification




d

EDA Retrospective: 30+ Years of Highlights and Lowlights, and What Comes Next

In 1985, as a relatively new editor at Computer Design magazine, I was asked to go forth and cover a new business called CAE (computer-aided engineering). I knew nothing about it, but I had been writing about design for test, so there seemed to be somewhat of a connection. Little did I know that “CAE” would turn into “EDA” and that I’d write about it for the next 30 years, for Computer Design, EE Times, Cadence, and a few others.

Now that I’m about to retire, I’m looking back over those 30 years. What a ride it has been! By the numbers I covered 31 Design Automation Conferences (DACs), hundreds of new products, dozens of acquisitions and startups, dozens of lawsuits, and some blind alleys that didn’t work out (like “silicon compilation”). Chip design went from gate arrays and PLDs with a few thousand gates to processors and SoCs with billions of transistors.

In 1985 there were three big CAE vendors – Daisy Systems, Mentor Graphics, and Valid Logic. All sold bundled packages that included workstations and CAE software; in fact, Daisy and Valid designed and manufactured their own workstations. In the early 1980s a workstation with schematic capture and gate-level logic simulation might have set you back $120,000. In 1985 OrCAD, now part of Cadence, came out with a $500 schematic capture package running on IBM PCs.

Cadence and Synopsys emerged in the late 1980s, and by the 1990s the EDA industry was pretty much a software-only business (apart from specialized machines like simulation accelerators). Since the early 1990s the “big three” EDA vendors have been Cadence, Synopsys, and Mentor, giving the industry stability but allowing for competition and innovation.

Here, in my view, are some of the highlights that occurred during the past 30 years of EDA.

EDA is a Highlight

The biggest highlight in EDA is the existence of a commercial EDA industry! Marching hand in hand with the fabless semiconductor revolution, commercial EDA made it possible for hundreds of companies to design semiconductors, as opposed to a small handful that could afford large internal CAD operations and fabs. With hundreds of semiconductor companies as opposed to a half-dozen, there’s a lot more creativity, and you get the level of sophistication and intelligence that you see in your smartphone, video camera, tablet, gaming console, and car today.

CAE + CAD = EDA. This is not just a terminology issue. By the mid-1980s it became clear that front-end design (CAE) and physical design (CAD) belonged together. The big CAE vendors got involved in IC and PCB CAD, and presented increasingly integrated solutions. People got tired of writing “CAE/CAD” and “EDA” was born.

The move from gate-level design to RTL. This move happened around 1990, and in my view this is EDA’s primary technology success story during the past 30 years. Moving up in abstraction made the design and verification of much larger chips possible. Going from gate-level schematics to a hardware description language (HDL) revolutionized logic design and verification. Which would you rather do – draw all the gates that form an adder, or write a few lines of code and let a synthesis tool find an adder in your chosen technology?
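To make that concrete, the "few lines of code" are literally something like this (a generic sketch, not tied to any particular tool or library):

// RTL description of an 8-bit adder; synthesis maps the '+' to gates
module adder8 (
  input  [7:0] a, b,
  output [8:0] sum
);
  assign sum = a + b;
endmodule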

Two developments made this shift in design possible. One was the emergence of commercial RTL synthesis (or “logic synthesis”) tools from Synopsys and other companies, which happened around 1990. Another was the availability of Verilog, developed by Gateway Design Automation and purchased by Cadence in 1989, as a standard RTL HDL. Although most EDA vendors at the time were pushing VHDL, designers wanted Verilog and that’s what most still use (with SystemVerilog coming on strong in the verification space).

IC functional verification underwent huge changes in the late 1990s and early 2000s, largely due to new technology developed by Verisity, which was acquired by Cadence in 2005. Before Verisity, verification engineers were writing and running directed tests in an ad-hoc manner. Verisity introduced or improved technologies such as pseudo-random test generation, coverage metrics, reusable verification IP, and semi-automated verification planning. The Verisity “e” language became a widely used hardware verification language (HVL).

The biggest way that EDA has expanded its focus has been through semiconductor IP. Today Synopsys and Cadence are leading providers in this area. Thanks to the availability of design and verification IP, many SoC designs today reuse as much as 80% of previous content. This makes it much, much faster to design the remaining portion. While IP began with fairly simple elements, today commercially available IP can include whole subsystems along with the software that runs on them. With IP, EDA vendors are providing not only design tools but design content.

Finally, the EDA industry has done an amazing job of keeping up with SoC complexity and with advanced process nodes. Thanks to intense and early collaboration between foundries, IP, and EDA providers, tools and IP have been ready for process nodes going down to 10nm.

Where Does ESL Fit?

In some ways, electronic system level (ESL) design is both a lowlight and a highlight. It’s a lowlight because people have been talking about it for 30 years and the acceptance and adoption have come very slowly. ESL is a highlight because it’s finally starting to happen, and its impact on design and verification flows could be dramatic. Still, ESL is vaguely defined and can be used to describe almost anything that happens at a higher abstraction level than RTL.

High-level synthesis (HLS) is an ESL technology that is seeing increasing use in production environments. Current HLS tools are not restricted to datapaths, and they produce RTL code that gives better quality of results than hand-written RTL. Another ESL methodology that’s catching on is virtual prototyping, which lets software developers write software pre-silicon using SystemC models. Both HLS and virtual prototyping are made possible by the standardization of SystemC and transaction-level modeling (TLM). However, it’s still not easy to use the same SystemC code for HLS and virtual prototyping.

And Now, Some Lowlights

Every new industry has some twists and turns, and EDA is no exception. For example, the EDA industry in the 1980s and 1990s sparked a lot of lawsuits. At EE Times my colleagues and I wrote a number of articles about EDA legal disputes, mostly about intellectual property, trade secrets, or patent issues. Over the past decade, fortunately, there have been far fewer EDA lawsuits than we had before the turn of the century.

Another issue that was troublesome in the 1980s and 1990s was so-called “standards wars.” These would occur as EDA vendors picked one side or the other in a standards dispute. For example, power intent formats were a point of conflict in the early 2000s, but the Common Power Format (CPF) and the Unified Power Format (UPF) are on the road to convergence today with the IEEE 1801 effort. As mentioned previously, Verilog and VHDL were competing for adoption in the early 1990s. For the most part, Verilog won, showing that the designer community makes the final decision about which standards will be used.

How on earth did there get to be something like 30 DFM (design for manufacturability) companies 10-12 years ago? To my knowledge, none of these companies are around today. A few were acquired, but most simply faded away. A lot of investors lost money. Today, VCs and angel investors are funding very few EDA or IP startups. There are fewer EDA startups than there used to be, and that’s too bad, because that’s where a lot of the innovation comes from.

Here’s another current lowlight: not enough bright engineering or computer science students are joining EDA companies. They’re going to Google, Apple, Facebook, and the like. EDA is perceived as a mature industry that is still technically very difficult. We need to bring some excitement back into EDA.

Where Is EDA Headed?

Now we come to what you might call “headlights” and look at what’s coming. My list includes:

  • System Design Enablement. This term has been coined by Cadence to describe a focus on whole systems or end products including chips, packages, boards, embedded software, and mechanical components. There are far more systems companies than semiconductor companies, leaving a large untapped market that’s looking for solutions.
  • New frontiers for EDA. At a 2015 Design Automation Conference speech, analyst Gary Smith suggested that EDA can move into markets such as embedded software, mechanical CAD, biomedical, optics, and more.
  • Vertical markets. EDA has until now been “horizontal,” providing the same solution for all market segments. Going forward, markets like consumer, automotive, and industrial will have differing needs and will need optimized tools and IP.
  • Internet of Things. This is a current buzzword, but the impact on EDA remains uncertain. Many IoT devices will be heavily analog, use mature process nodes, and be dirt cheap. Lip-Bu Tan, Cadence CEO, recently pointed out that the silicon percentage of IoT revenue will be small and that a lot of the profits will be on the service side.

Moving On

For the past six years I’ve been writing the Industry Insights blog at Cadence.com. All things change, and with this post comes a farewell – I am retiring in late June and will be pursuing a variety of interests other than EDA. I’ll be watching, though, to see what happens next in this small but vital industry. Thanks for reading!

Richard Goering





d

Varying a digital IIR filter's poles&zeros over time

Is there a better approach to varying the coefficients of a digital IIR filter over time to adjust the values of its poles and zeros than just recalculating the whole thing every time it changes? For example, lots of synth programs can apply an LFO to the cutoff frequency of a low/high pass filter. I can do some polynomial multiplication to get the coefficients for an IIR filter given its poles and zeros, but am wondering if there is a better way to adjust them over time than simply doing all the calculations over again for new poles/zeros. Particularly, I'm curious if there is a method that will more or less work for an arbitrary number of poles and zeros.

You could use a filter implementation (state space) that directly uses the pole/zero values instead of a polynomial. That might be computationally more expensive, though (as you are taking a trip through the domain of complex numbers even though your inputs and output are real), and possibly numerically iffy.

As far as I am aware, modifying filter behavior while introducing as few artefacts as possible is still an area of research. You might get away with just adjusting the filter coefficients if you do it slowly, but this does not mean this is the best method. In an audio application, I assume they do not switch filter coefficients abruptly, but instead do a cross-fade between the (settled) first filter and the (mostly or completely settled) target filter to avoid audible artefacts.
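For the common biquad case, at least, the recomputation is cheap (a sketch using the standard factored form): a conjugate zero pair at radius r_z and angle w_z and a conjugate pole pair at radius r_p and angle w_p give

H(z) = (1 - 2 r_z cos(w_z) z^-1 + r_z^2 z^-2) / (1 - 2 r_p cos(w_p) z^-1 + r_p^2 z^-2)

so sweeping the cutoff with an LFO costs one cosine and a handful of multiplies per section per update, and factoring a higher-order filter into cascaded biquads keeps that true for an arbitrary number of poles and zeros.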




d

What's the difference between Cadence PCB Editor and Cadence Allegro?

Are they basically the same thing? I am trying to get as much experience with Allegro as possible, since a lot of jobs I am looking at right now are asking for Cadence Allegro experience (I wish they asked for Altium experience...). I currently have access to PCB Editor, but I don't want to commit to learning Editor if Allegro is completely different. Also, are the Cadence Allegro courses worth it? I won't be paying for it and if it's worth it, I figure I might as well use the opportunity to say I know how to use two complex CAD tools.




d

Cadence SoC Encounter 8.1 - Keyboard is not working

Hello, I am using Encounter 8.1. My mouse is working fine, but my keyboard is not working well in Encounter. I can type in some boxes, but in many others I cannot. The bindkeys are also not responding. How do I fix this issue? Thanks.




d

regarding digital flow

Respected sir,

How can I design and simulate a CMOS inverter using the digital flow? I also need to do pre-layout and post-layout simulation for the same CMOS inverter. Can I use Cadence Encounter for these experiments?




d

How do I write the LEF view of a power pad

I have a set of pads for use in a design and I was wondering which attributes I should put on each pin.

Let's say it has the following pins:

   - inh_vdd, inh_vss, CORE, PAD, where the first two are for the pad rings, the CORE pin is used inside the die, and the PAD pin is the bonding pad.

I guess CORE would need:

   CLASS CORE

   USE POWER  (or GROUND if this happened to be a ground pad)

What about the inh_vdd and inh_vss? They would not have the CLASS CORE, but would I use USE POWER/GROUND on them too?

   USE POWER (or GROUND)

   SHAPE ABUTMENT

And the bonding pad? Should I put it in the LEF? Or would that cause confusion to Innovus or Voltus? And what attributes would it use? USE POWER/GROUND only?

Do I need anything in the LEF to indicate that the pin CORE and the pin PAD are essentially the same thing, just different places on the same power pad?




d

Can Voltus do an IR drop analysis on a negative supply?

I have been using Voltus to do IR drop analysis, but I got stuck on one supply net: it is negative. When I use:

set_pg_nets -net negsupply -voltage -5 -threshold -4.5 -package_net_name NEGSUP -force

Voltus dies with a backtrace. The beginning of the trace suggests that the problem is that it set the maximum to -5 and the minimum to 0. Is there another way to express a negative voltage supply for IR drop analysis?




d

How do I setup a student License?

I recently received a student version of OrCAD, which I was able to download and install without trouble. However, I do not know how to set up my license.

I received the license file in an email. The instructions within the file were to include my hostname and the absolute path. I do not know what the path should point to, so I left it empty.

I was able to set up the license server using the license file without any issues. However, setting up the license configuration utility gives the following messages:

A user environment variable named CDS_LIC_FILE is found. The CDS_LIC_FILE settings you make will be overwritten by this user-level variable. Furthermore, I get the error:

ERROR: Unable to update the CDS_LIC_FILE license path environment variable. 

This is preventing me from using any of the software.

What are the steps to installing the license and how could I resolve this error?

Thank you




d

Verilog Code to Custom IC Layout generation

Hello everyone,

I am Vinay and I am currently developing some digital circuits for my chip design as part of my master's thesis at the University at Buffalo.

I am fairly new to Verilog and I don't seem to follow some of the things others find very easy.

Following are the things that I want to do but have no clue about:

1. Develop certain arithmetic functionality in Verilog

2. Generate a netlist for the Verilog code

3. Feed the netlist file to Cadence Encounter to generate the digital circuits' layout for my chip

I can use Cadence Virtuoso and Encounter for this but I don't know the exact procedure to get this done.
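As a concrete placeholder for step 1, a minimal synthesizable module would be enough to exercise the whole flow (just a sketch; any small arithmetic block will do):

// Tiny multiply-accumulate block to push through synthesis and P&R
module mac8 (
  input  wire        clk, rst_n,
  input  wire [7:0]  a, b,
  output reg  [19:0] acc
);
  always @(posedge clk or negedge rst_n)
    if (!rst_n) acc <= 20'd0;
    else        acc <= acc + a * b;
endmodule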

Could someone please describe the detailed process for doing the things mentioned above?

Thank you.




d

Which algorithm is used in Modus ATPG?

According to the book Electronic Design Automation For Integrated Circuits Handbook, there are multiple algorithms available. Quote from the book: "One of the first complete ATPG algorithms is the D-algorithm [9]. Subsequently, other algorithms were proposed, including PODEM [14], FAN [15], and SOCRATES [10]."

I was wondering which algorithms are used in Cadence Modus.




d

About modus design constraints

Hi! 

In my design, there is one hold violation on the scan path; test data is corrupted during scan cycles (when I run Verilog simulation of the test vectors). I created a 'falsepath' constraint to the 'TI' input of the violating flop and loaded it into Modus, but this has no effect.

Can anyone explain whether the 'falsepath' constraint affects the scan path (from Q to the TI/SI input, i.e., during the SCAN procedure), or whether it applies only to functional mode (i.e., affects the TEST cycle only, to the 'D' input)?

I hope to resolve this problem by using Modus design constraints or any other method.




d

Power pins unconnected

Hi,

When I import the top-level Verilog file generated by Genus into Virtuoso, the power pins are left unconnected. I tried different configurations in the "Global Net Options" tab. However, nothing changed.

The cell is imported with three views, namely functional, schematic, and symbol. In the functional view everything looks OK; that is the top-level Verilog file. In the schematic, I can see the digital cells but the VDD and VSS pins of the blocks are not connected. In the symbol view there are no pins for VDD and VSS.

Overall, we are trying to implement a digital block in Virtuoso. The technology is TSMC 65nm. In Genus and Innovus, everything goes smoothly and the layout is generated successfully.

Thanks.




d

Interaction between Innovus and Virtuoso through OA database

Hello,

I created a floorplan view in Virtuoso (it contains pins and blockages). I am trying to run PnR in Innovus for the floorplan created in Virtuoso. I used set vars(oa_fp) "Library_name cell_name view_name" to read the view from Virtuoso. I am able to see the pins in Innovus but not the blockages. How do I get the blockages created in Virtuoso into Innovus?

Regards,
Amuu 




d

How to write Innovus Gui command to a cmd/log file?

Hi, I have been using the Innovus GUI commands for several things and wonder if those commands can be written to a log or cmd file so I can use them in my flow script. Is there such an option that we can set?

Thanks




d

How to place pins inside of the edge in Innovus

Hi,

I am doing layout for a mixed-signal circuit in Innovus. I want to create a digital donut style of layout (i.e., put the analog circuit in the middle and surround it with digital circuits).

To do that, I need to place some pins inside the edge to connect to the analog circuit (as shown in my attachment), but the problem is that I cannot place pins inside the edge using the "pin editor" within Innovus. Any suggestions for placing pins inside?

Thank you so much for your time and effort.




d

Viewing RTL Code Coverage reports with XCELIUM

Hi,

There was a tool available with INCISIV called imc to view the coverage reports.

The question is: how can we view the code coverage reports generated with XCELIUM? I think imc is not available with XCELIUM.

Thanks in advance.




d

Reuse of Schematics across different Projects

Hi All,

I have one huge project (say, X) which has different reference power supply designs.

Now I am starting a new project and I require one specific reference power supply design from X.

What is the easiest way to do this, other than a copy-paste?

Is there a way to create, say, symbols or something similar, so that multiple people could use it in their projects if they need to?

Thanks for your help and suggestions.




d

Mouse wheel and [i][o] button doesn't zoom

Hi,

I recently encountered a problem where scrolling with the mouse wheel and the [i]/[o] keys does not zoom in or out in Allegro OrCAD Capture CIS 17.2-2016.

When I scroll the mouse wheel or press the [i]/[o] keys, nothing happens.

The thing is that it worked fine until yesterday.

Does anyone have an idea?

 

Thanks,

Dung.




d

How to customize default_hdl_checks/rules in CCD (Conformal Constraint Designer)

Dear all,

I am using Conformal Constraint Designer (version 17.1) to analyse a SystemVerilog-based design.

While performing the default HDL checks it finds some violations (issues) in the RTL and reports warnings for the RTL checks and others.

My questions:

Is there a directive I can add to the RTL (SystemVerilog) so that a particular line of code or signal is ignored, i.e. excluded from the HDL or RTL checks?

I can set ignore rules in the rule manager (GUI), but that does not seem robust if code line numbers change or new signals are introduced.

What is the best way to customize the default_hdl_rules?

I will be grateful for your guidance.

Thanks for your time.




d

SystemVerilog package used inside VHDL-2008 design?

Hi,

Is it possible to compile a SystemVerilog package into a library and then use it in a VHDL-2008 design file? Is such a mixed-language flow supported?

I'm considering the latest versions of Incisive / Xcelium available today (Oct 2019).
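
To make the question concrete, here is a sketch of the flow I have in mind (the file and library names are placeholders, and I am assuming the xmvlog/xmvhdl unit compilers):

xmvlog -sv -work common_lib my_pkg.sv
xmvhdl -v200x -work common_lib my_design.vhd

The VHDL-2008 file would then reference types or constants declared in the SystemVerilog package.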

Thank you,

Michal




d

Force cell equivalence between same-footprint and same-functionality hard-macros in Conformal LEC

For a netlist vs. netlist LEC flow we have to solve the following problem:

- in the RTL code we replicate a large N x M array of identical hard macros; let's call them MACRO_A

- MACRO_A is pre-assembled in Innovus and contains digital parts and analog parts (bottom-up hierarchical flow)

- at the top level (full-chip) we instantiate this array of identical macros

- in the top-level place-and-route flow we run ecoChangeCell to remaster the top row of this array with MACRO_B

- MACRO_B is just a copy of the original MACRO_A cell with the same pin positions, the same internal digital functionality, and the same digital layout; there are only slight differences in one analog block inside the macro

- MACRO_A and MACRO_B have the same .lib file, generated with the do_extract_model command at the end of the Innovus flow; they differ only in the macro name

- when comparing the post-synthesis netlist vs. the post-place-and-route netlist, we load the .lib files of both macros into Conformal LEC

- the LEC flow fails because Conformal LEC sees only MACRO_A instantiated in the post-synthesis netlist, but both MACRO_A and MACRO_B in the post-place-and-route netlist

Since both the digital functionality and the standard-cell layout are the same between MACRO_A and MACRO_B, we don't want to track this difference all the way back at the RTL stage; we just want to perform this ECO change in place-and-route and force Conformal to assume equivalence between MACRO_A and MACRO_B.

Basically, what I'm searching for is something similar to the add_instance_equivalences Conformal command, but one that works between the Golden and Revised designs on cell primitives/black boxes.

Is this flow supported?

Thanks in advance

Luca




d

Genus: including a `define file

I have a file that lists all the `defines used in the current design. This file (define.vh) is generated, like so:

`define MACRO_1 5

`define MACRO_2 1'h0

... etc

But in Genus, when I run the commands

read_hdl define.vh

read_hdl -sv top.sv

the tool behaves as if the defines were never parsed and returns unreferenced errors. How can I resolve this? Do I have to include 'define.vh' in all the design files?
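
The workaround I would try first (my assumption: Genus discards `define macros at the end of each read_hdl invocation, so macros read from define.vh in a separate command are gone by the time top.sv is parsed) is to read the define file and the sources that use it together:

read_hdl -sv {define.vh top.sv}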




d

GENUS can't handle parameterized ports?

The following is valid SystemVerilog:

module mmio
  #(parameter PORTS      = 2,
    parameter ADDR_WIDTH = 30)
   (input  logic [ADDR_WIDTH-1:0] addr[PORTS],
    output logic                  ben[PORTS],  // Bus enable
    output logic                  men[PORTS]); // Memory enable

  always_comb begin
    for (int i = 0; i < PORTS; i++) begin
      ben[i] = addr[i] >= 'h20080004 && addr[i] < 'h200c0000;
      men[i] = ~ben[i];
    end
  end

endmodule : mmio

And if you instantiate it:


mmio #(1, 30) MMIO(.addr('{scalar_addr}),
                   .ben('{ben}),
                   .men('{men}));

Genus returns an error: "Could not synthesize non-constant range values. [CDFG-231] [elaborate]". Is this just not possible in Genus, or could it be caused by something else?




d

How to dump waveform, fsdb in SimVision?

As the title says: how do I dump a waveform in FSDB format from SimVision?
(Simulation Analysis Environment SimVision(64) 18.09-s001)
Please help.
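
For reference, the native flow I know uses SHM rather than FSDB (dumping FSDB, a Verdi format, would require the Verdi PLI linked into the simulator, which is an assumption on my part, not something I have set up). The SHM version, given as Tcl input to the simulator:

database -open -shm -into waves.shm waves -default
probe -create -all -depth all
run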

Thanks.




d

About SDF file after synthesis in Genus Tool

Hello, this is Ganesh from NIT Hamirpur, pursuing an MTech in VLSI. I have a doubt regarding SDF: I am using the Genus tool for synthesis, and when I generate the SDF after synthesis it contains, by default, only the maximum delay values, but I want all the delays (minimum:typical:maximum). How can I do this? Is there any provision to set the PVT values manually for SDF generation so that I can get all the delay values?
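
To show what I have been looking for, here is a sketch based on the view-selection options of write_sdf on the Innovus side (I am assuming Genus accepts the same options and that min/typ/max analysis views are already defined in the MMMC setup; the view names are placeholders):

write_sdf -min_view view_min -typ_view view_typ -max_view view_max design.sdf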




d

About SDF file

How do I get minimum:typical:maximum values in the SDF? I am using the Genus synthesis tool, where the default setting gives only the max values, but I want all the values. Please guide me.




d

About SDC file

What do we have to specify in the SDC for a combinational design? How do we create a virtual clock?
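
For illustration, a minimal generic-SDC sketch (the period and delay numbers are placeholders): a clock created with -name and no source object is a virtual clock, and purely combinational I/O paths can then be constrained against it:

create_clock -name vclk -period 10
set_input_delay 2 -clock vclk [all_inputs]
set_output_delay 2 -clock vclk [all_outputs]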




d

SpectreRF Tutorials and Appnotes... Shhhh... We Have a NEW Best Kept Secret!

It's been a while since you've heard from me... it has been a busy year for sure. One of the reasons I've been so quiet is that I was part of a team working diligently on our latest best-kept secret: the MMSIM 12.1.1/MMSIM 13.1 documentation has...(read more)




d

Have You Tried the New Transmission Line Library (rfTlineLib)?

Happy New Year! Have you tried the new Transmission Line Library (rfTlineLib) yet? In case you missed it, rfTlineLib was introduced in IC 6.1.6 ISR1 plus MMSIM 12.1.1 -or- MMSIM 13.1. You may wonder... Why should I use the new rfTlineLib? Well...(read more)




d

New Memory Estimator Helps Determine Amount of Memory Required for Large Harmonic Balance Simulations

Hi folks, a question that I've often received from designers: "Is there a method to determine the amount of memory required before I submit a job? I use distributed processing and need to provide an estimate before submitting jobs." The answer...(read more)