MTE Annual tools review
When MTE asked me to write an item on “compilers and tools” I realised how fast a year has gone. I dragged out the piece I did last year and contemplated the changes over the year…
Some interesting trends have surfaced. More parts have JTAG (or some other form of on-chip debugging system) on them, from 8-bit 8051s through the 16-bit families to the new 32-bit ARM parts. An “emulator” is now available from 50 pounds upwards and, what is more, one that is part of the compiler IDE. Well, this is not entirely accurate. Yes, the parts do have JTAG. Yes, JTAG debuggers can be inexpensive. Yes, they are usually integrated into the compiler IDE, so the write, compile, debug cycle is all in one place.
However, note that I said JTAG debuggers, and put “emulator” in quotes. People often think that a JTAG debugger is an ICE. It is not. JTAG was originally designed as a PCB test system where it performed boundary scan: that is, it serially transmitted, on one line, the state of the pins on the part. This was not just for MCUs but for many types of chip. It is considerably less expensive than bed-of-nails testing, especially if you have to buy the tester. The debugging side of JTAG, which is similar to Motorola’s Background Debug Mode (BDM), came when a debug block was added. This is still accessed in the same way as the boundary-scan data, i.e. serially, which means that the amount of data that can be retrieved is restricted. Facilities such as breakpoints, triggers and watches are also restricted; many systems only support one or two hardware breakpoints. In fact JTAG is often referred to as a “run-control” debugger, not an emulator.
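To make that serial bottleneck concrete, here is a minimal sketch (not taken from any real adapter, and with the pin-access functions purely hypothetical) of the bit-banged shift at the heart of every JTAG access. The only point it makes is that each bit of debug data costs at least one TCK cycle, which is why pulling large amounts of data over JTAG is slow compared with a parallel-bus ICE.

#include <stdint.h>

/* Hypothetical pin-level operations supplied by whatever drives the adapter. */
extern void set_tck(int level);   /* drive the TCK pin  */
extern void set_tms(int level);   /* drive the TMS pin  */
extern void set_tdi(int level);   /* drive the TDI pin  */
extern int  read_tdo(void);       /* sample the TDO pin */

/* Shift nbits through the scan chain while sitting in the Shift-DR (or
 * Shift-IR) TAP state. Exact edge timing varies between implementations;
 * the point is that every bit needs a full TCK cycle, so reading one
 * 32-bit register costs at least 32 clocks plus state-machine overhead. */
static uint32_t jtag_shift(uint32_t out, int nbits, int exit_on_last)
{
    uint32_t in = 0;

    for (int i = 0; i < nbits; i++) {
        set_tdi((out >> i) & 1);                   /* present next outgoing bit */
        set_tms(exit_on_last && (i == nbits - 1)); /* leave Shift on last bit   */
        set_tck(0);
        in |= (uint32_t)read_tdo() << i;           /* capture incoming bit      */
        set_tck(1);                                /* target samples TDI/TMS    */
    }
    return in;
}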
Some JTAG systems do have trace on them. The trace is a separate connector in addition to the JTAG connector. There are several standards, just as there are several JTAG connectors (10, 14, 16 and 20 pin). For example, on the ARM parts the trace may be implemented as a 4-, 8- or 16-bit port, so retrieving a 32-bit address could take 8, 4 or 2 transfers. This is also not the same as a full ICE in that it is possible for the trace module to lose information, much as a PC RS232 port would lose data over 9600 baud in the good old days. So remember, JTAG debuggers may not be the inexpensive solution you thought they might be. In this world you still get what you pay for.
Whilst more and more parts have JTAG on them and the cost of tools comes down, I don’t think that engineers are looking at how to use these tools correctly. You can’t use them in the same way you could a full-blown ICE. New strategies need to be developed.
At the other end of the tool chain more people are using auto-code generators, largely with UML, the current must-have on the CV. CASE tools are getting increasingly sophisticated and, whilst expensive, are coming down in cost in real terms. Although UML works well with C++, targeting embedded C is another matter. Choose your tool with care. Make sure it will work in an object-based (not OO) mode and can be adapted to the embedded C dialect you are using (and the target hardware). Remember, very few embedded C compilers are ISO C compliant; they are mostly halfway between the 1990 and 1999 standards. The other problem is debugging the model. Whilst the model can be simulated, most tools will now also work with emulators and debuggers. However, given the restrictions noted above for JTAG debuggers, care must be taken with the test strategy.
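To show what I mean by an object-based (not OO) mode, the fragment below is a sketch of the sort of C a UML-to-C generator typically produces with the OO features switched off: a class becomes a plain struct plus functions taking a pointer to it, with no inheritance, no virtual dispatch and no heap. The names are purely illustrative and do not come from any particular tool.

#include <stdint.h>

/* A "class" from the model becomes a plain struct holding its attributes. */
typedef struct {
    uint16_t count;
    uint16_t limit;
} PulseCounter;

/* The "constructor" just initialises an instance the caller provides,
 * so everything can be statically allocated on a small target. */
void PulseCounter_init(PulseCounter *self, uint16_t limit)
{
    self->count = 0;
    self->limit = limit;
}

/* An operation from the model; returns non-zero once the limit is reached. */
int PulseCounter_tick(PulseCounter *self)
{
    if (self->count < self->limit) {
        self->count++;
    }
    return self->count >= self->limit;
}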
At the other end of the scale, free compilers are gaining popularity. This is frightening many people, the tool vendors for one. Some are putting on a brave face; others are worried. Some are worried for the obvious reason, a dent in sales, but there is more to it than that. Just as everyone thinks they can write a novel, every programmer thinks they can write a compiler. Their boss regales them with stories of writing his own assembler way back in the dark ages before C, etc. Most of these budding compiler writers seem to have a dubious copy of “ANSI C”, K&R or a “good book on C”. The problem is that very few people understand C in the depth required to write a compiler. They don’t have the ISO C standards and TCs or, more importantly, the test suites. There are two test suites for C compilers: Plum Hall and Perennial. The authors of both are part of the ISO C committees, and they have access to all the committee debates on the precise meanings and implementations.
Several of the free compilers I have come across do not track the ISO C standard, are certainly not tested against any of the recognised test suites and, worse still, are open source. Why is open source bad in this case? Because when there is a bug the programmer can fix it.
As a support engineer I have had calls from people who have found bugs in a well-known commercial compiler. Granted, some are bugs. One complaint I recall was that the compiler did not correctly handle the 256th, and only the 256th, case in a switch statement. Lots of other complaints are that the compiler is not correctly handling C. In every case so far the complainant has been wrong: the compiler was correctly handling C. In several cases I was told that the 8-bit embedded cross compiler was not MS-C compatible… ignoring the fact that the embedded compiler was accurately implementing ISO C in the aspect in question.
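By way of illustration (this is my own example, not one of the actual complaints), the integer promotions are a frequent source of these “compiler bug” reports on small targets: both operands of the addition below are promoted to int before the arithmetic, exactly as ISO C requires, which surprises programmers expecting everything to stay in eight bits.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t a = 200;
    uint8_t b = 100;

    if (a + b > 255) {                 /* a + b is done as int, giving 300 */
        puts("promoted to int: 300");  /* so this branch is taken          */
    }

    uint8_t c = (uint8_t)(a + b);      /* wrap-around happens only here    */
    printf("stored back in uint8_t: %u\n", (unsigned)c);  /* prints 44     */

    return 0;
}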
So the fact that the programmer can “fix” bugs in a compiler because he has the source fills me with dread. It should also worry all users, and especially project managers. Unless you fully test the compiler on a recognised (and expensive) test suite, or go through the entire source line by line (with the appropriate C standard) and have a very good understanding of not only the C language but also compiler design, how do you know you have a good copy of the compiler? You could have the version some well-meaning programmer introduced subtle bugs into last week whilst fixing a “bug” that may or may not have been real. What is more, the fixes may or may not be documented.
Recently one commercial tool vendor said to me that there is not a lot wrong with the XYZ open source tool that couldn’t be sorted by a rewrite by a competent engineer to clean up some of the dubious parts, followed of course by a full test of the system. The problem is that they are not going to put an engineer on to it, because the resultant tool would have to be open source. They would do it if they could recoup the time and money spent on it. One company I know did replace part of a GPL tool with a much-improved version of their own, but as they would not release the source and wanted to sell the tool, they said they got blacklisted by the FSF.
One major software vendor has recently been commenting on Linux and the fact that, even though you do have all the source, you are no better off, as you are unlikely to find any backdoors, intentional or otherwise. Thus you have a security risk. The sheer size of something like Linux means that you cannot check it all without considerable cost in time and money. In my own office the time spent getting some free software to work actually made it more expensive than what, on the face of it, a commercial package would have cost.
The reason why tool vendors are worried (apart from the loss of revenue) is that the current commercial tools work to standards. Many of the free tools work approximately to their own standards, and many people can “fix” bugs in these tools and make them work differently. At least I know that if I get a compiler from a manufacturer, all copies of it will work in the same way; I am not going to get two supposedly identical versions behaving differently. Thus it becomes difficult to interface some tools to others, especially where some of the tools use techniques that are under NDAs. The Hooks emulation system, for example, is a licensed system that most 8051 ICE vendors use; who in the open source community is going to spend a lot of money to interface to it? The other problem is that even if they did, the source could not be open. The only solution is a clean-room open source hardware ICE. That is clearly not going to happen, so where are the new tools going to come from?
There is another, subtler problem that has been described to me by several sources. As some of the smaller commercial companies disappear and some of the larger ones cut back, the knowledge base disappears as well. There is an overall lowering of the standard of tools. The good ones are still there, but the number of less good ones is increasing as a percentage. At some point there will be a shortage of good tools for high-integrity development on some platforms. The observation came from someone who tests software, and to some extent the tools, and he is worried for his safety. How long will it be before the use of dubious tools kills someone? Embedded software is complex now and getting more so, and there is more chance of errors. Software engineering has to become just that: not a hobby or a cottage industry.
Eur Ing Chris Hills BSc CEng MIET MBCS MIEEE FRGS FRSA is a Technical Specialist and can be reached at This Contact
Copyright Chris A Hills 2003 -2008
The right of Chris A Hills to be identified as the author of this work has been asserted by him in accordance with the Copyright, Designs and Patents Act 1988