Ask Slashdot: Sources For Firmware and Hardware Books? 88
First time accepted submitter cos(0) writes "Between O'Reilly, Wrox, Addison-Wesley, The Pragmatic Bookshelf, and many others, software developers have a wide variety of literature about languages, patterns, practices, and tools. Many publishers even offer subscriptions to online reading of the whole collection, exposing you to things you didn't even know you don't know — and many of us learn more from these publishers than from a Comp Sci curriculum. But what about publishers and books specializing in tech underneath software — like VHDL, Verilog, design tools, and wire protocols? In particular, best practices, modeling techniques, and other skills that separate a novice from an expert?"
The Finest Machine on kickstarter (Score:4, Funny)
Unfortunately... (Score:5, Insightful)
I am an embedded software engineer who does some hardware/electrical engineering too. Unfortunately there is very little material on this subject specifically. Basically you need to learn how to code in C (not C++ or C#, raw C), learn some electronics and then maybe learn VHDL/Verilog as well. You then put it all together in your own mind.
It is really hard to recruit people with those skills so they are worth having. You will need some hands on experience though. You can simulate wire protocols and some hardware but none of the simulations are particularly good practice for real life, and employers will want examples of work anyway. A university level course would be best.
Re: (Score:3)
With C++, there's no way to really tell...
Re: (Score:2)
You could ask the same of C code. It comes down to the developer. Remember, C++ compilers are supposed to be compatible with C as well.
Re: (Score:2)
Just because you CAN do something does not mean you SHOULD do it.
Re: (Score:2)
Considering that C++ came out of Bell Labs, that is unsurprising. They were programming satellites and real-time switches for years.
you've got a c++ runtime in your firmware? (Score:2)
How much space does that take exactly? Or were you talking about using a _subset_ of C++?
Re: (Score:3)
For projects with limited resources, it is quite normal to use a subset of the language. For example, you may eschew use of the heap library functions in C, or the use of floating point calculations.
I too use C++ (a subset), even for embedded systems as small as AVR. Having a restrictive environment doesn't mean you can't use object classes and templates.
The main risk with using C++ in a tight memory space is hidden use of the heap (for example in the STL). I don't find that code bloat is a problem if you k
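The parent's point about a heap-free C++ subset can be sketched concretely. This is a minimal illustration, assuming nothing beyond C++11; the container name and interface are made up, but the idea is that classes and templates cost nothing at runtime when all storage is fixed at compile time:

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical fixed-capacity container: classes and templates, but no heap.
// All storage is a plain array sized at compile time, so there is no hidden
// malloc/new behind the interface -- the usual worry with the STL on an MCU.
template <typename T, std::size_t N>
class FixedVector {
public:
    bool push(const T& value) {
        if (count_ == N) return false;  // full: fail loudly instead of reallocating
        data_[count_++] = value;
        return true;
    }
    T pop() { return data_[--count_]; }  // caller must check size() first
    std::size_t size() const { return count_; }
private:
    T data_[N];
    std::size_t count_ = 0;
};
```

On an AVR-class part you would instantiate this with a small N and never touch new, malloc, or most of the STL.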
Re: (Score:1)
This sounds a lot like what I do. I'm also an embedded software engineer working in a hardware shop. We have a few books on various buses and stuff. I often end up looking at Linux driver source for insight into particular hardware. Simply using powerful hardware tools like JTAG debuggers and PCIe analyzers gives some understanding of how the hardware actually works.
Sadly though, I agree with you for the most part, that there isn't a lot of good literature out there. A lot of the more specific informat
Re: (Score:3)
Xilinx actually has excellent and free data sheets and manuals available (as do other vendors) on their site. When I was transitioning from RF to digital years ago, I taught myself how to design with and program FPGAs entirely from Xilinx documentation.
Re: (Score:1)
You'll just spend a summer trying to track it down. A lot of the information is available in documents, it's just spread between several documents.
Re: (Score:3, Informative)
I have the same kind of background and would add that the manufacturer's documentation is a must-read. Once you have the foundation in programming from the types of books you mentioned, and an understanding of computer hardware, you need to get into the hardware specifics of your platform. Embedded/firmware differs from what I would call general programming in that it is much more hardware-specific. Many chip manufacturers have software examples, documentation, even training classes (sometimes even free) that g
Re: (Score:1)
> It is really hard to recruit people with those skills.
Are you talking US? I'm curious if there is any demand for such people in Europe. AFAIK Europe is no longer relevant. One example: not a single mass market digital camera was produced in Europe.
Re: (Score:2)
For embedded systems, maybe one or two of my peers in school really understood everything. Even at that level we were just poking a few memory addresses and sending data down the bus to flip a few latches. Precision, timing, and the code all in one big pot tended to throw some off. Made a killing tutoring for those classes, as my rates were extremely high. (No one was displeased.)
Most of it has rotted away, but I can still look at a spec sheet to get a feel for the chip. The things we do today are always impres
Re: (Score:2)
Nothing magic - say, find the position of the least-significant set bit in a uint32_t or something.
Oh hell. I've been doing this sort of thing for nearly 10 years and just today I was poking around on an FPGA and not only did I get the endianness wrong, I got the order of the bits in the bytes backwards! The fact that this was a 32-bit PPC machine talking over a 16-bit bus with Intel byte sex just confused everything.
Moral of the story: no one can find the LSB of any data type on any machine until they've gone and looked for it!
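For what it's worth, finding the least-significant set bit of a value is endian-independent, because bitwise operators work on values, not bytes; byte order only bites once those bits cross a bus or get reinterpreted as another type, as in the FPGA story above. A quick sketch (the function name is mine):

```cpp
#include <cstdint>

// 0-based index of the least-significant set bit of a nonzero uint32_t.
// x & (~x + 1) isolates the lowest set bit (two's complement negation);
// the loop then converts that single-bit value into a bit index.
unsigned lsb_index(std::uint32_t x) {
    std::uint32_t lowest = x & (~x + 1u);
    unsigned i = 0;
    while (lowest >>= 1) ++i;
    return i;
}
// lsb_index(0b1100) == 2, on any machine, whatever the byte order.
```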
Re: (Score:1)
As far as bits are concerned, bits of what?
Re: (Score:2)
Admittedly this doesn't answer the original poster's question of what publishers are best in this area, but I second the 'learn VHDL/Verilog' advice.
If you buy an inexpensive development board from a company like Xilinx, Altera or Digilent, you can immediately begin to experiment in developing your own digital circuits (there are some hugely expensive dev boards, but you really just need a cheap Spartan 3 board or similar to start out). Check out Opencores.org, which is sort of like the Sourceforge of digi
Gaisler VHDL style (Score:5, Interesting)
It's not a full book, but this chapter is more-or-less compulsory reading for someone planning to get into HDL programming:
A structured VHDL design method [gaisler.com]
Not Me (Score:5, Interesting)
and many of us learn more from these publishers than from a Comp Sci curriculum
Not me, man. And, don't get me wrong, I love pragprog and I worship O'Reilly and NoStarch. Hell, I review books for them on Slashdot! But no book would have been able to teach me about automata theory or linear algebra and differential equations like my college courses did. I'm sorry but I must argue that there's a lot of application and implementation to be gleaned from these books -- not so much on the theory and foundational concepts. At least for me there's something really difficult about reading a book about really "far out there" concepts and truly understanding them without human intervention. Maybe I'm just stupid but I find the best tech books show me "little jumps" while my college courses were the only way I could make "the big jumps" in knowledge quickly enough.
Plus, going to a liberal arts college meant general requirements that furthered me along in ethics, philosophy, etc. more than these books did. I wouldn't go selling a college education short, even though bashing it seems to be the popular thing these days.
Re: (Score:2)
I got my minor in philosophy, so I learned to bullshit pretty well.
And you'll always have a place in Sales, Customer Service, or tech Support :) You only get politics as an option if you majored in BS and failed Ethics & Morality.
Re: (Score:2)
I got my minor in philosophy, so I learned to bullshit pretty well.
And you'll always have a place in Sales, Customer Service, or tech Support :) You only get politics as an option if you majored in BS and failed Ethics & Morality.
Nah, I use my powers for good. One nice benefit of knowing bullshit, though, is also knowing how to spot it.
Re: (Score:2)
Maybe I'm weird, but I got more of a college education outside of college than in it. For instance, my school dropped their compiler design course due to lack of enrollment, so I bought a textbook and taught myself. I learned physics and linear algebra through MIT OCW (though I admit I didn't retain much of either after 5 years). I got a C in discrete math because the prof refused to give out the homework until 5-10 minutes after the bell rang, and I didn't have time to stick around that long, but I practice my knowledge of algorithms by doing Project Euler problems. I'm not calling college a total wash. I got my minor in philosophy, so I learned to bullshit pretty well. All the phil students thought I was weird, though, since the only area of philosophy I found interesting was epistemology. I also learned how to not be a dick, and how to talk to girls. But as far as my major was concerned, I would have been better off with a stack of books and some peace and quiet.
Epistemology is perhaps the most practical and interesting aspect of philosophy (at least to me). Maybe because of my CS background and practical uses of logic in design and argumentation (making a case for or against something).
Re: (Score:2)
But no book would have been able to teach me about automata theory or linear algebra and differential equations like my college courses did.
Does that include the course texts?
Re: (Score:2)
I learned more from reading my textbooks (and studying the problem solutions) than anything the professor taught as he mumbled into the chalkboard and erased the equations before I had a chance to copy them down. I had 4, maybe 5, good teachers in college and the other ~35 were poor.
Re: (Score:2)
Have a look at "Collective Intelligence". It's a great crash course in ML, comparable to Ng's course (though it skimps a little on the theory).
Re: (Score:2)
All you have to do is ask one question
Do you know what a Laplace transform is? You end up with two groups; both can be crap or brilliant, but one has a wall they will never cross.
Datasheets (Score:2)
Re: (Score:3, Informative)
Don't forget the appnotes -- those are great for picking up pieces of passed-down lore that you won't otherwise be exposed to unless you hang out with EE/hardware types. The problem for me is gaining awareness that a class of parts exists so that I can read the appnotes for them.
Addison Wesley (Score:3)
I would also look at what the Association for Computing Machinery has. I had a membership a while back and it seems they had some resources on this topic. In fact, looking back, they have a book called VHDL for Programmable Logic, which I have no idea if it is what you are looking for, but there you go.
Free Free Range VHDL (Score:3, Informative)
VHDL (Score:2)
I don't know why a software guy wants to learn VHDL, but here's where I started: VHDL For Designers http://www.amazon.com/VHDL-For-Designers-Stefan-Sjoholm/dp/0134734149 [amazon.com]
Re: (Score:2)
There's a weird breed (of which I happen to be a member) of people who aren't quite software guys, and aren't quite hardware guys, but do a little bit of both. My title is "Embedded Software Engineer", but I look over (and redline) schematics regularly. Some EE courses certainly didn't hurt.
OS software guys deal with datasheets a lot (Score:2)
Anyone dealing with device drivers, early bringup code, or exception handlers needs to be able to read datasheets. Generally not so much VHDL though.
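As a sketch of why the datasheet matters: a driver's or bringup coder's view of hardware is mostly memory-mapped registers. The address and bit position below are invented for illustration; the real ones come straight out of the part's datasheet.

```cpp
#include <cstdint>

constexpr std::uintptr_t UART_STATUS_ADDR = 0x40001004;  // hypothetical address
constexpr std::uint32_t  TX_READY_BIT     = 1u << 5;     // hypothetical bit

// 'volatile' tells the compiler every read is a real bus access that must not
// be cached, folded, or reordered away -- the register can change on its own.
bool tx_ready(volatile const std::uint32_t* status_reg) {
    return (*status_reg & TX_READY_BIT) != 0;
}

// On target you'd point it at the real register:
//   tx_ready(reinterpret_cast<volatile const std::uint32_t*>(UART_STATUS_ADDR));
```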
Why o Why? (Score:2)
VHDL seems to be thrown about Slashdot a lot. Of course the 1990s saw heated debate about how VHDL is better than Verilog, but if you look at the ground reality, hardly anybody is doing new VHDL design. Even Europe, the last bastion of VHDL, is moving to Verilog.
So if you want to upgrade your skills, and are new to the field, try Verilog and System Verilog.
Though SV started as a "simulation and testbench" language, it's being increasingly used in design.
SystemVerilog for Design (google it) is a popular book.
PS
Re: (Score:2)
VHDL seems to be thrown about Slashdot a lot. Of course the 1990s saw heated debate about how VHDL is better than Verilog, but if you look at the ground reality, hardly anybody is doing new VHDL design. Even Europe, the last bastion of VHDL, is moving to Verilog. So if you want to upgrade your skills, and are new to the field, try Verilog and SystemVerilog. Though SV started as a "simulation and testbench" language, it's being increasingly used in design.
SystemVerilog for Design (google it) is a popular book.
PS: I have been working in the EDA and VLSI field for the past 11 years, and have seen multiple designs from many large semiconductor companies.
Um, no. I work for a multi-billion dollar aerospace company in the USA and we are strictly VHDL. It is simply better. Verilog is a low-level ASIC gate wiring language. VHDL can do everything from high level to low level, and reads better. If you're doing FPGAs, VHDL is the best. I consider this http://www.amazon.com/Designers-Guide-VHDL-Systems-Silicon/dp/1558602704 [amazon.com] to be the absolute best book.
CE curriculums and Embedded Systems Conferences (Score:3)
Re: (Score:2)
A Computer Engineering curriculum is much better than a traditional CS degree for this type of work, so you might look at what texts are being used in high quality CE programs. The Embedded Systems Conferences [ubmdesign.com] from UBM are also a good source of training for low level firmware implementation.
Indeed. Now that I'm in my 40s and need to switch to a more hardware-y line of work, I'm finding myself needing to go back to grad school and work towards a CE degree. My advice for people going to school is to work on two separate majors -- CS and CE, or CS and EE. Or at the very least work on a double major or a minor in one of the two (or CS and MIS for the business/enterprise-inclined). An extra year or year and a half will open so many doors it's not even funny.
Joseph Cavanagh or Michael Ciletti (Score:1)
Look for books by Joseph Cavanagh or Michael Ciletti
Couple of recommendations to get started... (Score:5, Informative)
- Grab a couple of books on C/C++ and Verilog. I highly recommend "Fundamentals of Digital Logic with Verilog Design", great both for learning and for reference. For C/C++, I've always been a fan of the Sams "Learn __ in 24 Hours" books.
- Get yourself a FPGA development card, so you can get some "hardware play" in and familiarize yourself with some development tools. I have an Altera DE1 educational card that's a few years old, but it's got endless blocks on it (displays, LEDs, buttons, flash, SDRAM, VGA, sound... you name it) which makes it a great little card for embedded system learning. There's a whole set of Verilog and Nios (embedded processor) tutorials available for it, and lots of online hackers who have ported x86 processors (Zet project), hardware emulations of the NES, etc... to it. Xilinx and Actel also make some nice evaluation boards that seem to be targeted fairly often by hobbyists.
Other than that... you can study the heck out of wire protocols, but you'll probably forget everything you learn unless you end up implementing it. You're better off trying to learn as many general things as you can - how to create well organized C/C++ and Verilog code, making your designs meet timing and such - so that if you end up having to implement something, you've got the basics already in place and don't need too much incremental learning. Also if you have some fun ideas for FPGA projects, implement your heart out - that sort of stuff looks great on a resume.
Good luck!
"...tech underneath software..." (Score:5, Funny)
There isn't any. It's VMs all the way down now. Hardware is so 20th century.
Re: (Score:2)
While I disagree, even if you were right, someone has to write the hypervisors, which just happen to run on hardware.
What I used (Score:3)
I recently taught myself VHDL and used Pong P. Chu's [amazon.com] book. I liked it quite a bit. It did an especially good job of reinforcing the mindset of approaching VHDL programming as digital circuit design, not software design.
Vendor tools (Score:2, Informative)
Just like a software project, you'll need to actually dive in to learn anything beyond quoting books. Experience is what separates the junior engineers from the fresh-out-of-school.
You can get "free" CPLD/FPGA vendor tools from the big 3 chip suppliers - Xilinx, Altera and Lattice. There are restrictions on the design tools - either size or chip selection - but other than that they are more than generous for some large, non-trivial real-life designs. The environments also have simulation support, again with some
The Reuse Methodology Manual (Score:2)
Library.nu (Score:2)
Oh wait, those f-ing bastards from the "book cartel" got them shut down.
Never mind, good luck finding anything of value.
These I used myself (Score:1)
Being a hardware engineer myself, some books I have been reading from time to time are:
VHDL:
Pong P.Chu - RTL Hardware design using VHDL (2007)
Volnei A. Pedroni - Circuit Design with VHDL (2004)
One of the following is really good for VHDL, but I don't remember which of the two; I keep confusing them...
VHDL Handbook by HARDI electronics (1997)
Peter J. Ashenden - VHDL Cookbook (1990)
U. Meyer-Baese - Digital Signal Processing With FPGA (2007)
Laung-Terng Wang, Cheng-Wen Wu, Xiaoqing Wen - VLSI Test principles and
A few books to try (Score:2)
I'm really enjoying the textbook for the VHDL/FPGA RTL design class I'm taking now: RTL Hardware Design Using VHDL: Coding for Efficiency, Portability, and Scalability by Pong Chu. It doesn't bog down talking about all the possibilities the language allows for legal syntax. The author really seems to focus on common practice for coding into a chip. There's very little if any testbench/simulation in this book, so look elsewhere for that; this one is all about the circuit design. Rather than only explaining what
Re: (Score:2)
VHDL and Verilog create gates and flip-flops, not code. Learning the languages isn't all that difficult. Learning the particulars about the FPGA's your "software" is going to go into is not trivial and requires hardware design knowledge.
Your FPGA isn't running on a virtual machine somewhere in the cloud. You need to know how to use a multimeter to make an FPGA go.
When's the last time you had a C++ file that defined your pin locations?
I think a logic analyzer is a bit more important than a multimeter when it comes to digital design.
fpga4fun (Score:4, Interesting)
This is where I started about a year ago:
http://www.fpga4fun.com/ [fpga4fun.com]
Got a Xilinx dev kit, and it didn't take me too many weeks to get my project up to a stage where other people started using it.
Besides programming, a background in general electronics does help. Even if you're coding in a somewhat C-like language, it's nothing like a sequential program, but a description of hardware with real, physical parallelism. To me, it often helps to look at a circuit diagram to understand some debugging issues.
Data sheets from chip manufacturers are essential for some of the trickier points. If you need to choose between the two largest players, I recommend Altera over Xilinx, as they are somewhat more open, but mostly there are no huge differences.
This is what I use / suggest. (Score:1)
Second, pick a language; I use mostly VHDL.
I like:
"VHDL for Logic Synthesis" Rushton
"VHDL Coding Styles and Methodologies" Cohen
Also google "VHDL Math tricks of the trade" (pdf); you'll need that to stay sane if you actually do algorithms.
Jack Ganssle (Score:1)
Nobody has mentioned Jack Ganssle or Embedded Systems Magazine? Visit http://www.ganssle.com/ [ganssle.com], subscribe to The Embedded Muse, and if possible go to one of his Seminars.
I'm lucky, I started as an electronic tech in the late 70's. I learned to write software to wiggle wires for troubleshooting, engineering found out about it and drafted me. Went to school for my degree (math), and haven't had much trouble working since.
Here is a good book. (Score:1)
"Embedded Systems Firmware Demystified" by Ed Sutter (the one with the computerized toaster on the cover) is a pretty good book to start out reading, and of course doing the examples. :) The book is from 2002, but there is still a lot of good stuff in there. IIRC it was copyright Lucent Technologies, and it comes with the GNU compiler and many examples from Linux.
An oscilloscope or logic analyzer, and a few months working through the examples in the book with some real hardware will really help!
Cheers!
Choose another field (Score:1)
Check out 'Advanced FPGA Design' ( ISBN-10: 0470054379 ).
Pretty good fundamental overview about tradeoffs when designing for area/frequency/power etc.
Has some stuff about clock domain crossing too.
-Learn basic Verilog ( asic-world.com is good )
-Start getting a feel for how your logic will be synthesized into primitives like LUT, Block Ram, etc.
-Get into parametric defines, generate statements, more advanced Verilog language features (makes your code more valuable, adaptable, reusable, etc)
-Pick up a copy of
Are you sure you want HDL? (Score:3)
First, I'd suggest deciding what you want to focus on -- firmware (embedded programming) or HDL. If you're coming from a CS background, you might want to start with digital systems and computer architecture before plunging into HDL. Designing a CPU at the gate level will teach you more about how the hardware works than writing a page or two of behavioral HDL. These basics are also good for embedded programming. Without knowing more about your background, it's hard to know what to suggest. If you're coming from PC application programming (or, god forbid, web scripting) with no electronics or low-level background, get ready for a shock -- you're not in Kansas anymore. Personally, I'd suggest starting with some basic 8-bit AVR or PIC projects, since there's a lot of material on the net to help you. I've mostly learned on the job, so rather than giving you books, I'll suggest some topics to study:
Software
1. C programming in general and pointers in particular. Use this as a bridge to assembly.
2. Enough assembly to understand what sort of operations there are and how they're used. Don't bother writing huge programs in assembly; just make sure you can read your debugger's output. This is a good chance to figure out whether your CPU has a hardware multiplier (probably) and divider (probably not). This tells you which operations are fast and which are dog-slow.
3. Where all the pieces of your program go in memory (code and constants in flash, data in RAM). Learn about standard assembly sections like .text so you can parse the error messages when your program gets too big for the chip it's in. Read this article [muppetlabs.com] for some hints in a PC app context.
4. Bit twiddling. AND, OR, XOR, inversion, shifting. Basically, any C operator listed under the "bitwise" section.
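The operations in item 4 boil down to a handful of idioms you'll see in every register-poking codebase. A generic sketch (function names are mine, not any library's):

```cpp
#include <cstdint>

// The standard register twiddles: OR sets a bit, AND with the inverted mask
// clears it, XOR toggles it, shift-and-mask tests it. 'reg' stands in for
// any register value; bit positions are whatever the datasheet says.
std::uint8_t set_bit(std::uint8_t reg, unsigned n)    { return reg |  (1u << n); }
std::uint8_t clear_bit(std::uint8_t reg, unsigned n)  { return reg & ~(1u << n); }
std::uint8_t toggle_bit(std::uint8_t reg, unsigned n) { return reg ^  (1u << n); }
bool         test_bit(std::uint8_t reg, unsigned n)   { return (reg >> n) & 1u; }
```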
Hardware (theory)
1. Registers. CPU registers. Memory-mapped IO registers. Read-only vs. writable registers. Reset states of registers. You're going to be dealing with a lot of registers.
2. General-purpose IO pins and all their features. Look at a schematic in a microcontroller datasheet and understand input/high-impedance vs. output vs. input with an internal pull-up/down. Maybe driver current strength if you want to make *big* blinking lights.
3. Clocking. Where the clock comes from (crystal? internal oscillator?), how precise it is, how it's divided down, and how to turn off the division so the chip will run at full speed.
4. How to read a datasheet. Figure out what voltage(s) you need to power the chip and what (if anything) should be connected to each pin on power-up. Datasheets are very long. Learn to skim.
5. How to limit the current on LEDs with resistors so they don't blow up.
6. How to use a linear voltage regulator to get a clean 5V out of whatever DC power source you can Frankenstein together.
7. At least a cursory knowledge of voltage, current, and Ohm's law. Know how to determine power dissipation in a resistor (compute it) or an LED (read it out of the datasheet).
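Items 5 and 7 above combine into one worked calculation. The numbers here are typical rather than from any particular datasheet: a 5 V supply, a red LED with roughly a 2 V forward drop, and a 10 mA target current.

```cpp
// Size a current-limiting resistor for an LED and check its dissipation.
// The resistor drops whatever voltage the LED doesn't (Vsupply - Vf).
double resistor_ohms(double v_supply, double v_forward, double i_amps) {
    return (v_supply - v_forward) / i_amps;   // Ohm's law: R = V / I
}
double resistor_watts(double v_supply, double v_forward, double i_amps) {
    return (v_supply - v_forward) * i_amps;   // P = V * I across the resistor
}
// (5 - 2) / 0.010 = 300 ohms; (5 - 2) * 0.010 = 0.03 W, fine for a 1/4 W part.
```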
Hardware (bench-top)
1. How to use a multimeter. Spring for a nice auto-ranging one on eBay.
2. How to wire up a breadboard. Don't bother soldering yet; this is much easier. Hint: keep your wires short and neatly-arranged. Get a wire stripper and learn how to use it.
3. (Optional) If you have some money to throw around, pick up a bench-top power supply off of eBay. A triple-output supply is the most you'll ever need, but single-output is fine for simple MCU projects. This is more convenient and reliable than cutting up a wall wart.
4. (Even more optional) If you have a lot of money and are pretty serious, get a digital storage oscilloscope (NOT an analog-only scope, unless it's really cheap). This will do wonders for your debugging. Buy used unless you're rich or hard-core. Bench-top electronic tools are not cheap.
5. Find a local electronics store. Fry's is a decent choice (for tools, too!), and many cities have smaller but cheaper
Testbenches (Score:1)
There are lots of books on HDLs, but to get good results for non-trivial designs you also need to learn how to test them.
http://www.amazon.com/Writing-Testbenches-Functional-Verification-Edition/dp/1402074018 is a great guide to this.
Good books (Score:1)