Monday, September 1, 2008

6: The Great Divisional Wars

As a junior designer it all seemed straightforward; I was putting a few hundred logic gates onto a chip to make a more compact, cost-effective implementation of an electronic product. I remember it needed an A0 sheet of paper, a logic template, and it always took 3 months [1]. But now I come to think about it, I suppose I was actually designing the circuit for the whole product, just putting as much of it as possible into the chip.

... In the meantime my boss was doing something else. It seems he had been scheming the product and now, whilst steering me, was also directing mechanical designers to do casework, and cables, and PCBs, and assembly drawings, and test-jigs, and scheduling regulatory tests, and other even less exciting (management) stuff. When the chip came back I realised that the system was my baby too; the chip had to work, but for customers to buy it, it also had to work at least as well as expected in the Product. So we spent another 6 months getting the product ready for market and ready for manufacture [2].

Over the years integration density steadily increased and I was soon looking at designs of the order of hundreds of thousands of gates. Immensely more complex and expensive, their design was the responsibility of a specialist team. These experts had divested themselves of most 'external' issues to focus on getting the chip right and out on time. Design was complete when the simulations ran and a test pattern was delivered ... Somebody else put it into a product and made it work, and out of sight is out of mind (SEP [3]).

... Historians will identify this as the start of the Great Divisional Wars. The silos started as shallow depressions, but became deep pits with strong walls. Soon you were striving to do your bit right, so you could sit comfortably in the knowledge that as the ship sank, it was somebody else's fault!

But Moore's Law didn't stop, and as it progressed more specialisations emerged and the divisions grew and hardened. Soon it became possible to put computers onto a chip along with other stuff (early '90s), and the issue of embedded software raised its head in the hallowed halls of the Hardware Divisions. Fortunately 'we' knew how to handle it. Software lived in an external ROM, so it was clearly someone else's domain. So all we needed to do was put a compute engine on the chip and the Software Divisions could work out how to use it for themselves: QED.

... Alas, the exponential continued and today capacity has become so great that design of the whole system product is once again moving back towards being the designer's responsibility ... But which designer? By now the divisions have often organised as independent service companies, maintained in their traditional roles by their specialist EDA acolytes and investors. Broadly classified as Hardware and Software, none has the natural experience to grasp the system role, the natural ability to change direction or re-structure, or the inclination to bring in a third party. Canute-like altercations began between the factions for ownership of the system role whilst maintaining traditional roles and boundaries. War is never constructive.

Let's apply some logic to this situation. Design is actually a hierarchically recursive, pseudo-logical process of Requirements partitioning and refinement. It concludes when all the threads of Analysis have been matched to established Physical Mappings. Its partner, Verification, confirms back up through that hierarchy that the actual physical implementation meets or exceeds the Requirements for that level!

Taking the elephant a chunk at a time ...

Firstly, Design is not Hardware or Software, but Analysis: and it always commences at the System (Product) Level. This is the process today: the highest levels making use of that very powerful conceptual modelling engine, the human brain; the lower levels utilising a variety of mathematical modelling approaches.

Design is a process of assessing the implementation approaches available to meet the product Requirement. Because the task is complex and the detail must be complete, it is broken down into hierarchical sub-tasks based on various criteria or experience; and again; and again; until a viable implementation technology, and the static and dynamic 'configuration detail', have been identified for every thread of analysis.
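To make that recursion concrete, here is a minimal Python sketch. It is an illustration only: the Requirement class, the toy partitioning rules and the 'established mapping' table are all invented for this post, not drawn from any real product or EDA flow. Design descends, terminating a thread wherever an established Physical Mapping (or a Reuse candidate) exists; Verification then confirms back up the same hierarchy.

from __future__ import annotations
from dataclasses import dataclass, field

# 'Established Physical Mappings' a thread of Analysis may terminate on (including Reuse).
MAPPINGS = {
    "audio decode": "software: C on the embedded CPU",
    "sample-rate conversion": "hardware: DSP datapath",
    "display driver": "reuse: licensed IP block",
    "menu logic": "software: scripted on the CPU",
}

# How each as-yet-unmapped requirement is partitioned; the criteria and experience live here.
PARTITIONS = {
    "media player": ["audio decode", "sample-rate conversion", "user interface"],
    "user interface": ["display driver", "menu logic"],
}

@dataclass
class Requirement:
    name: str
    mapping: str | None = None
    children: list[Requirement] = field(default_factory=list)

def design(name: str) -> Requirement:
    # Recursive refinement: a thread terminates on an established mapping,
    # otherwise it is partitioned ... and again; and again.
    req = Requirement(name)
    if name in MAPPINGS:
        req.mapping = MAPPINGS[name]
    else:
        req.children = [design(sub) for sub in PARTITIONS[name]]
    return req

def verify(req: Requirement) -> bool:
    # Verification mirrors the hierarchy bottom-up: each level must meet or
    # exceed its Requirement (here reduced to 'every thread found a mapping').
    if not req.children:
        return req.mapping is not None
    return all(verify(sub) for sub in req.children)

product = design("media player")
print("all threads mapped and verified:", verify(product))

In a real flow all the engineering judgement sits inside the partitioning rules and the mapping table; the point of the sketch is only the shape of the process: refine downwards, verify upwards.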

... The need for different languages to cover the domains encountered in that analysis process, and for them to interwork, is apparent. So too is the role of Reuse, offering early termination for some of the threads ... But Reuse also warns of the emergent behavioural consequences of introducing functionality which exceeds that specified by the analysis process itself.

Partitioning decisions throughout will depend on functional and non-functional criteria: Power, Performance, Appearance, Weight, Colour, Availability of a Hardware Team, TTM, Cost, Form, Experience, Quality, etc. 'Hardware' and 'Software' will emerge during this analysis process, along with the platform requirements to support them.

At the extremes we know 'Pure Software' is good for handling state complexity, is easy to design with and can handle late fixes, but is power inefficient; whilst 'Pure Hardware' is great for signal processing, is power efficient and naturally concurrent, but is difficult to design and change. Less obvious is the continuous line between these extremes, along which all hybrid architectures can be located (e.g. CISC, VLIW, GPU, NoC, DSP, FPGA, etc.), each providing its unique partitioning values.
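That continuum can be pictured crudely in code. Everything below is an assumption made up for illustration: the candidate architectures, the 0-to-1 attribute scores and the weights are not measured data. It only shows how the criteria listed above pull a given partition towards one end of the line or the other.

# Candidate points along the line from 'Pure Software' to 'Pure Hardware'.
# Attributes (all invented): state complexity handling, tolerance of late fixes,
# power efficiency, natural concurrency. 0 = weak, 1 = strong.
CANDIDATES = {
    "Pure Software (CPU)":  (0.9, 0.9, 0.2, 0.2),
    "DSP":                  (0.6, 0.7, 0.5, 0.5),
    "FPGA":                 (0.4, 0.5, 0.6, 0.8),
    "Pure Hardware (ASIC)": (0.2, 0.1, 0.9, 0.9),
}

def choose(weights):
    # Pick the candidate whose attributes best match this partition's priorities.
    score = lambda attrs: sum(w * a for w, a in zip(weights, attrs))
    return max(CANDIDATES, key=lambda name: score(CANDIDATES[name]))

# A control-heavy, frequently patched function favours the software end ...
print(choose(weights=(0.4, 0.4, 0.1, 0.1)))   # -> Pure Software (CPU)
# ... a power-constrained signal-processing path favours the hardware end.
print(choose(weights=(0.1, 0.1, 0.4, 0.4)))   # -> Pure Hardware (ASIC)

In practice the weighting comes from the whole list of functional and non-functional criteria above (power, TTM, cost, team availability ...), and the decision is revisited at every level of the hierarchy.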

... It is apparent that traditional views of Hardware and Software are just arbitrary classifications of a modelled logical process. They are essentially the same thing, and both are located close to the bottom of the Analysis process!

But isn't this obvious?

... Just because it matches what we know doesn't mean its structure is obvious! If we can see how it is, then we are more able to visualise how to optimise it. Recognising that the process is hierarchical and recursive, with matching hierarchical needs for Requirement partitioning and Verification construction ... means that the process should not be seen as a one-off Software or Hardware EDA problem, but as a generic and recursive Analysis one.

All Divisions (Corporate and Business) are an administrative convenience of their time, so must evolve with the times. Truth will ultimately prevail, so fighting to maintain an arbitrary divisional status quo is not sustainable. We will not change this situation overnight, but if we can see where we need to go, we will get there ... albeit by baby steps.

Think of it as your opportunity to make love, not war!

Cheers.
ian

Ref:
3: "Somebody Else's Problem", Hitch Hiker's Guide to the Galaxy, D. Adams.
