MTE Annual tools review
When MTE asked me to write an item on “compilers and tools” I realised how fast a year has gone. I dragged out the piece I did last year and contemplated the changes over the year… that is what I wrote last year. This is becoming a habit: another year has rocketed by and the industry has moved on quite a way.
One change is that traditional emulators are on the wane yet at the same time becoming more sophisticated, versatile and less expensive in real terms. The traditional ICE was just getting to the point where it was reliable, fully featured, easy to use and sensibly affordable. On top of this there are now many tools that integrate with an ICE, giving an unprecedented level of modelling and testing for embedded development.
However, just as the majority start to see the point of using an ICE, the game has changed. Most new chips have some form of on-chip debug, whether it be JTAG, BDM, OCDS, CoreSight or something similar. Come to that, new versions of some old MCUs are getting JTAG or BDM. These systems can use far simpler debugger hardware and a standard connector to get the information out of the chip. On top of this the trace connectors are also becoming standardised, e.g. Nexus or the Mictor connector used for ARM. Therefore debug (also board test and programming) can be tacked on to a target board without having to have a particular tool in mind. No more worrying about clearances for large ICE pods when designing.
This had been predicted by many, including myself in my Embedded Debuggers paper (http://quest.phaedsys.org), though my final predicted solution was a single fibre-optic line carrying all the debug information, including trace, that would be on a level with a traditional ICE. At the moment the current crop of JTAG/BDM-type debuggers carry less information than a full ICE and are not real-time. They are “run-stop” and flash-programming tools.
This trend is reducing the cost and complexity of the debug interface hardware. There are many cheap JTAG and BDM tools out there. However, you get what you pay for. One of the more expensive JTAG manufacturers dynamically tunes its FPGAs to account for signal skew at higher speeds, something many of the cheap JTAG tools have no idea about, let alone make provision for. The other thing is that it is all well and good building a simple hardware interface for JTAG, but you have to process the data in real time. Run-stop is one thing, but trying to do anything more ambitious will show you why the more expensive tools are worth the money. Some can handle trace, and a good trace is not a luxury. Not all JTAG or BDM tools are equal.
Another problem is that things are getting multi-cored: more than one processor on the board or, in some cases, more than one processor in an ASIC. The lesser tools can’t really handle this, at least not with the synchronisation required. You still get what you pay for in this world. At least the widespread introduction of inexpensive entry-level tools should wean a lot more developers off printf and port-pin-wiggling debug methods: something that should have gone a long time ago.
On-chip debug and standardisation of debug hardware interfaces is the trend in 2005. The other tendency I have seen over 2003-2004 is for everyone to try to make their IDE all singing and dancing. Metrowerks did this with CodeWarrior, iSystem with WinIDEA, Wind River with Tornado. Others are starting to add bells and whistles to their IDEs and they are getting larger and larger. You could end up with several Integrated Development Environments that have a lot of similar functionality.
Well, most compilers and linkers can be command-line driven, so you can use an independent programmer’s editor, like Brief, as a standard software development IDE. Though most compiler IDEs now have most of the functionality the old programmer’s editors had, which is why Brief has all but gone.
However, adding other tools and functions to a compiler IDE is a different kettle of fish. They all have their own (almost) similar script system and API or tool interface, but there is more to adding a simulator or debugger than that. Each debugger-IDE combination will need different software.
There are also some tools with specialised functionality, such as the RistanCASE DAC (documentation and flow/structure charts etc.), that for ease of use are built around their own IDE. I have about ten separate IDEs… it’s too many integrated development environments. Ironically, I quite often get requests for an IDE for PC-Lint! PC-Lint is a command-line tool, which can be integrated into almost any IDE.
There is a new, quiet revolution on the horizon: Eclipse. This is an IDE that has been designed as a modular IDE system with open interfaces (and open source). This is a step up from the usual scripting and proprietary APIs that many IDEs have. Now, whilst it has been designed for Windows/Unix/Linux development, it is a major step in the right direction and the principle is sound. The other major advantage is that it was an IBM project, so the quality is there. I am always worried about the varying level of quality in many open-source projects.
It means that there will be a standard IDE interface system we can all work to. No, I am not suggesting that we will all be using Eclipse, but that the commercial IDEs will start to use the Eclipse interfaces and compatibility. They will be “Eclipse based”. Why will they do this?
There are already “several hundred” plug-ins for Eclipse and this number will grow. Certainly most (all?) of the low-end tool developers are going to use Eclipse because it saves much development effort and gives them a “universal” front end. Many will use it on principle because it is part of the open-source movement. The initiative is also being championed by some of the high-end companies, e.g. Fujitsu, Serena Software, Sybase, Borland, IBM, MERANT, QNX Software Systems, Rational Software, Red Hat, SuSE, TogetherSoft, LynuxWorks and WebGain, and others have taken it up. This means that many producers of utilities and other tools will want to capitalise on an instant market. They can concentrate on their core competences and not have to worry about keeping up with everyone else’s IDE.
It will also encourage people to write add-on modules at all levels. Companies with custom test and development equipment can create plug-ins without fear that the IDE interface is proprietary or will disappear next month. It should also encourage development of tools that can run on Linux or Windows, though I expect that some of the more sophisticated and hardware-dependent tools may not make the jump for a while.
So in 2005, and certainly moving into 2006, there should be a standard IDE [system] along with more integration and interoperability between tools. I have also seen a lot more interest in modelling, auto-code, static analysis and testing tools, which will promote that interoperability. That is what I expect next year’s review to be about. Long gone are the days when “printf debugging” was seen as the norm. At last it looks as though software engineering is coming of age, but there is still a way to go.
Eur Ing Chris Hills BSc CEng MIET MBCS MIEEE FRGS FRSA is a Technical Specialist and can be reached at This Contact
Copyright Chris A Hills 2003 -2008
The right of Chris A Hills to be identified as the author of this work has been asserted by him in accordance with the Copyright, Designs and Patents Act 1988