MTE Embedded Compiler Trends
Although I do not write embedded compilers, I know many embedded compiler writers. More to the point, I have used, in anger, many 8- to 64-bit embedded compilers over many years.
Having discussed current trends with several compiler writers, I have concluded that recent improvements in embedded compilers have generally been incremental: the compiler writers are gradually tightening up their code, improving the optimisations and so on. Some specific compilers do make significant improvements by replacing a module that uses an old algorithm with an improved one. However, these are not ground-breaking "new" algorithms but developments that they have just got round to fitting into their systems. Other compilers have improved by being recompiled as 32-bit applications in place of the old 16-bit systems, e.g. a Win9* application instead of a DOS or Win3* application.
Whilst the compiler itself has not undergone any recent ground-breaking transformations, there have been major developments in embedded compiler suites. Ten years ago embedded compilers (if not all compilers) were command-line systems working from a make file or batch file. Some had an editor from which you could launch the compiler or make. Some people still work this way.
Over the last ten years there have been two main changes. The first and obvious one is that everything is now point and click! The other is that everything looks the same. The second point hides a far greater change. The main transformation in embedded development systems has been integration. When Win9* arrived, the MS look and feel (along with the Win32 API) became entrenched. Eventually all compilers got an "MS"-style IDE. This might sound trivial to engineers, but think about how you used to work and how you can work now. My Keil 8051 compiler IDE will let you edit your files, but it also has full project control and will let you add menu items so that you can directly run, with parameters, your version control tool, the EPROM programmer or Flash download tool, PC-Lint, a CASE tool and so on.
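As an illustration only, and not any vendor's actual code, such a tools-menu entry boils down to splicing the current file name into a stored command line and handing it to the shell; the tool name, option file and %f placeholder below are all invented for the sketch:

/* Minimal sketch of an IDE "run external tool with parameters" command.
 * The %f placeholder convention, lint-nt.exe and std.lnt are assumptions. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static void run_tool(const char *cmd_template, const char *current_file)
{
    char cmd[512];
    const char *marker = strstr(cmd_template, "%f");   /* where the file name goes */

    if (marker == NULL) {
        system(cmd_template);                          /* no placeholder: run as stored */
        return;
    }
    snprintf(cmd, sizeof cmd, "%.*s%s%s",
             (int)(marker - cmd_template), cmd_template,   /* text before %f   */
             current_file,                                 /* the current file */
             marker + 2);                                  /* text after %f    */
    system(cmd);   /* e.g. "lint-nt.exe std.lnt main.c" handed to the shell */
}

int main(void)
{
    run_tool("lint-nt.exe std.lnt %f", "main.c");      /* lint the file being edited */
    return 0;
}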
The ability to run another application and transfer information to it via parameters is now common. At the same time as this feature was being refined, IDEs were adding functionality of their own. I remarked to one compiler writer recently that I had integrated grep on the toolbar. He was unhappy at this and pointed out that the new "Find" function in the IDE was in fact a grep engine. Most compilers now have much-improved static analysis, though I always prefer to use PC-Lint; I like the checking tool to be independent of the person who wrote the compiler. Many other features are now included, such as graphical calling trees, memory maps and variable cross-references. Many of the old small tools that were previously essential separate items are now part of the IDE suite. You rarely see "Programmers' Tool Kits" advertised any more.
The revolution (evolution?) has arrived in the next stage. This is partly due to Windows DLLs and OLE, and to some extent the cost of re-inventing the wheel. As mentioned, the small tools are now part of the IDE; the definition of "small" depends on the compiler vendor and the market it is in. The next step is the user being able to use, or integrate with, larger tools from third parties such as an in-circuit emulator (ICE). This is not the same as simply running a shell and passing parameters.
Most compiler IDEs have source-level debuggers and simulators. Some drive simple monitors, now converted to run wigglers. However, driving a full ICE directly from the IDE requires a bit more than opening a shell and passing parameters. The ICE vendors supply the compiler's debugger writer with a DLL and an API to drive the ICE directly. It is not just in driving ICEs that this is happening: there are integrations between the "compiler" and the SPICE engines that are part of CAD tools. For example, the Keil C51 compiler and the Proteus CAD tools: Proteus permits the design and simulation of a full electronic circuit, including analogue parts and the 8051. The interface DLL also permits source-level debugging of the code on that circuit with the Keil C51 (integrated) debugger, so the circuit can be designed and tested with the software before the board has even been made. The problem is that any faults that are still there are going to be so subtle or hidden that you are going to need an ICE to find them! Of course, the Keil C51 debugger can now directly drive the various 8051 ICEs.
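To make the contrast with simply shelling out concrete, here is a minimal sketch of the DLL mechanism. The DLL name icedrv.dll and the functions ice_connect and ice_read_mem are hypothetical; the real Keil and ICE vendor interfaces are richer and differently named, but the principle of loading the vendor's DLL and calling through an agreed set of functions is what is described above:

/* Sketch only: icedrv.dll, ice_connect and ice_read_mem are made-up names. */
#include <windows.h>
#include <stdio.h>

typedef int (*ice_connect_fn)(const char *port);
typedef int (*ice_read_mem_fn)(unsigned long addr, unsigned char *buf, unsigned len);

int main(void)
{
    HMODULE h = LoadLibraryA("icedrv.dll");        /* the vendor-supplied driver DLL */
    if (h == NULL) {
        printf("ICE driver DLL not found\n");
        return 1;
    }

    /* Resolve the agreed entry points and call the ICE directly - no shell,
     * no command-line parameters, just function calls into the vendor's code. */
    ice_connect_fn  ice_connect  = (ice_connect_fn)GetProcAddress(h, "ice_connect");
    ice_read_mem_fn ice_read_mem = (ice_read_mem_fn)GetProcAddress(h, "ice_read_mem");

    if (ice_connect && ice_read_mem && ice_connect("COM1") == 0) {
        unsigned char buf[16];
        ice_read_mem(0x8000UL, buf, sizeof buf);   /* read memory in the real target */
    }

    FreeLibrary(h);
    return 0;
}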
Whilst mentioning testing: the same thing is happening with the ICE vendors. Their software is also gaining open APIs so that, as well as compilers, code-testing tools can drive an ICE. This means that the code, instead of being tested in simulation for path coverage, timing analysis and so on, can be tested in the actual target MCU.
The IDEs' simulators are gaining "open" APIs so that, in the case of Keil's 8051 system, you can write additional peripheral modules for the [integrated] simulator. This is very useful where soft 8051 cores are used in ASICs or FPGAs, such as the Triscend E5. The on-chip peripheral mix, or indeed the peripheral itself, can be unique to your design. One university has written a number of peripherals, including LCD displays and keypads; see the Keil application note on their web site.
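To show the shape of the idea, and only that (the names below are invented, not Keil's actual simulator interface), a peripheral model is essentially a module the simulator loads that registers a function to be called whenever the simulated CPU touches the peripheral's addresses. Here a toy LCD data port simply echoes written characters to the console, with a small stand-alone harness standing in for the simulator:

/* Sketch only: the registration scheme and the LCD address are assumptions. */
#include <stdio.h>

#define LCD_DATA_REG 0xA000u          /* hypothetical XDATA address of the LCD port */

/* Called (by the simulator, in the real case) on every write to a watched address. */
static void lcd_write(unsigned addr, unsigned char value)
{
    if (addr == LCD_DATA_REG)
        putchar(value);               /* model the LCD by printing the character */
}

/* Hypothetical entry point the simulator would call when loading the model. */
void peripheral_init(void (**write_hook)(unsigned, unsigned char))
{
    *write_hook = lcd_write;          /* hand the simulator our write callback */
}

/* Stand-alone harness so the sketch runs on its own. */
int main(void)
{
    void (*hook)(unsigned, unsigned char) = 0;
    peripheral_init(&hook);
    hook(LCD_DATA_REG, 'O');
    hook(LCD_DATA_REG, 'K');
    hook(LCD_DATA_REG, '\n');
    return 0;
}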
Now I can do everything from within the IDE, or by linking directly to other tools. The diagram of the Keil system shows how the IDE has moved on from being an editor with features: it envelops the system and seamlessly connects it to other systems. Due to the way OLE and DLLs work, and the tendency for everyone to use the same MS style, integration has never been easier.
So there it is: The Awful Truth! Microsoft has had a major influence on embedded development tools.
Eur Ing Chris Hills BSc CEng MIET MBCS MIEEE FRGS FRSA is a Technical Specialist and can be reached at This Contact
Copyright Chris A Hills 2003 -2008
The right of Chris A Hills to be identified as the author of this work has been asserted by him in accordance with the Copyright, Designs and Patents Act 1988