
Polling vs Interrupt-driven Architectures



Al1970
08-03-2009, 02:50 AM
[Moderator Note: I can't insert a post or reorder them after the fact. This thread was split from here (http://forums.trossenrobotics.com/showthread.php?p=33000). No big deal, it just needed its own thread. For any reference that doesn't make sense, please see the other thread... You can see where it was split since I left a similar note at that point. -A]

Hi:

Sorry Adrenalynn & Inxfergy, I have to disagree with a lot of what you said. I read Inxfergy's write up on interrupts and did not care for it. If you care to see what I have to say about them you can go to my site at: http://diyrobots.webng.com

PiGuy, I didn't understand your post on what you are trying to do. I can say that I have used a 16F687 PIC to read the pulses from a "VEX" transmitter, get servo commands from the serial port, and run 6 servos, and never used an interrupt. You will never learn how to program a PIC well if you start programming them using "C"! It doesn't give you the feel of what is going on in the PIC. Learn assembler first; then, if you want to use "C" later, you can. Myself, I would never use "C" on a PIC at least not at the power they are now.

Al

Adrenalynn
08-03-2009, 03:32 AM
Hi Al,

I briefly browsed what you authored. Alas, having worked on realtime systems for twenty years, I can't get behind "interrupts aren't necessary". To do your writeup justice, I'd need to spend some time and actually read it for comprehension. If your basic premise is the lack of necessity of interrupts, we're never going to reach a consensus. And let me know if you ever work on any commercial/critical systems with that mindset. I hate it when my antilock brakes get ignored 'cause I was turning up the radio while hitting the windshield washer. ;)

>> Myself, I would never use "C" on a PIC at least not at the power they are now.

Why's that? C doesn't run on a PIC. In fact, C doesn't _run_ on anything at all. It compiles to object code and then links to machine code. Just like assembler.

Adrenalynn
08-03-2009, 03:46 AM
Having read it a little more completely - I suspect our experiences differ a little.

>> A good programmer knows the timing of all the signals in his project. Let me say that again. A good programmer knows the timing of all the signals in his project.

If only life were so perfect. The moment you interface to the "Real world", the ANALOG world, that goes right out the window. I know I've never been able to give my users a stopwatch and insist that they only press keystrokes at regular timed intervals. And my weather station project - that darned wind just doesn't give me any kind of a priori knowledge of changes in direction. And the credit card terminal OS I did, that darned modem negotiation just didn't sit still for the noisy variable line conditions - or the silly users tagging the cancel key while I was trying to read their keypresses and display them on the LCD and get a response back from the bank/processor.

In real world programming it is entirely impossible to know the "timing of all signals in a project".

"A really good programmer knows that they can never know, or presume to know, the timing of all signals in a project."

lnxfergy
08-03-2009, 07:07 AM
I read Inxfergy's write up on interrupts and did not care for it. If you care to see what I have to say about them you can go to my site at: http://diyrobots.webng.com

So since the PIC doesn't have a real interrupt controller, you hate interrupts? Wow. Thanks! The AVR does have a true interrupt vector table, with 2 dedicated external interrupt channels. Which are great for exactly the things I listed: encoders, detecting short pulses, and an "always-fast" bumper. Sure, interrupts aren't for every task, but for some, they are a good solution, maybe even the best or only solution.
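
Just to make that concrete, here's roughly what I mean in AVR C (a minimal sketch, assuming an ATmega328-class part and avr-gcc; the pin and edge choices are arbitrary):

#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t encoder_ticks = 0;      /* updated only inside the ISR */

/* INT0 fires on every rising edge, no matter what main() is busy doing. */
ISR(INT0_vect)
{
    encoder_ticks++;
}

int main(void)
{
    EICRA = (1 << ISC01) | (1 << ISC00);  /* INT0: trigger on rising edge */
    EIMSK = (1 << INT0);                  /* enable external interrupt 0 */
    sei();                                /* global interrupt enable */

    for (;;) {
        /* the main loop does motor control, comms, whatever -
           encoder_ticks keeps counting in the background */
    }
}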


Myself, I would never use "C" on a PIC at least not at the power they are now.

I'm thinking you should get a different compiler then. A good compiler should be able to build code that runs, on average, as fast as a human programmer's code. Obviously, for sections where timing is critical, you'll still have to use some inline assembly. I'm not that familiar with the PIC compilers available, but the AVR architecture is actually slightly tuned for C-code, since C is way more portable and maintainable than assembly.

-Fergs

Pi Guy
08-03-2009, 01:42 PM
Let's say you have the same setup as above but this time one of the pulses will only stay high for 0.5 us. If you only used polling you could not "see" a pulse that short. On the other hand, a PIC's interrupt circuit can "see" a pulse that short. So that statement is false.

Another place where an interrupt shines is in keeping things in sync. If you have 2 PICs that have no hardware for communications, the best way for the two to communicate is by using interrupts.


I think what he is trying to say is that sometimes people use interrupts when it is unnecessary, but what he says here suggests that the exception would be tasks such as serial communication, which is what I intend to do.
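
For the serial part, the usual interrupt-driven receive looks something like this (a rough AVR-C sketch, assuming an ATmega328-class USART; the buffer size is arbitrary and baud setup is omitted). The same shape applies to a PIC's USART receive interrupt, just with different register names.

#include <avr/io.h>
#include <avr/interrupt.h>

#define BUF_SIZE 32
volatile uint8_t rx_buf[BUF_SIZE];
volatile uint8_t rx_head = 0;              /* next free slot */

/* Fires the moment a byte arrives, even if main() is busy elsewhere. */
ISR(USART_RX_vect)
{
    rx_buf[rx_head] = UDR0;                /* reading UDR0 clears the flag */
    rx_head = (rx_head + 1) % BUF_SIZE;
}

int main(void)
{
    /* baud-rate setup omitted (UBRR0 depends on clock and baud) */
    UCSR0B = (1 << RXEN0) | (1 << RXCIE0); /* enable RX plus RX interrupt */
    sei();

    for (;;) {
        /* the main loop parses rx_buf at its leisure; no byte is missed
           because the hardware plus ISR caught it the instant it arrived */
    }
}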

Adrenalynn
08-03-2009, 06:09 PM
Fergs Wrote: >> Obviously, for sections where timing is critical, you'll still have to use some inline assembly

A really REALLY good programmer would hand-optimize the machine code their linker spits out if timing was that critical. :tongue:

Al1970
08-04-2009, 03:08 AM
Sorry, but I have to disagree with just about everything you say. Adrenalynn and Inxfergy, you are trying to twist everything in my write-up on using polling or interrupts. Nowhere does it say I hate interrupts.

"I hate it when my antilock brakes get ignored 'cause I was turning up the radio while hitting the windshield washer."

Learn how to use polling and it all will work just fine.

Guess what, Adrenalynn: the "Real world, the ANALOG world," is NOT just an analog world! The electron moving from one level to another, NOT analog. A tree branch breaking off, NOT analog. The way some parts of the brain works, NOT analog. The saturation point of a medium, NOT analog.

"I know I've never been able to give my users a stopwatch and insist that they only press keystrokes at regular timed intervals."

Hitting a key on a keyboard is an event. You had better know the timing of the signal made by the switch, or else either you will not be able to detect the key at all or it will show up as more than one key press.

I have seen programs written for PICs in "C" by people who never learned to program in assembler first. They will make timing loops that do nothing but blink LEDs on a robot and never think twice about it. They just wasted hundreds of thousands of instructions that could have been used for polling and hundreds of other things. It's not just that MPASM assembler will save you an instruction here and there over "C". That is the same thinking people use when it comes to using interrupts. I am working on a robot right now where the encoder wheel gives 2000 pulses per second; with two motors that's 4000 pulses. If I used interrupts here it would be a waste of computer power, since servicing an interrupt can take 10 times the software time a poll would take. That is why I disagree with what you said, Inxfergy.
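
In rough C, just to sketch the idea (the real thing would be PIC assembler, and the AVR-style pin names here are only placeholders), the polling version is nothing more than:

#include <avr/io.h>
#include <stdint.h>

int main(void)
{
    uint8_t last_a = 0, last_b = 0;
    uint16_t count_a = 0, count_b = 0;

    for (;;) {
        uint8_t a = PIND & (1 << PD2);   /* encoder channel, motor 1 */
        uint8_t b = PIND & (1 << PD3);   /* encoder channel, motor 2 */

        if (a && !last_a) count_a++;     /* rising edge on motor 1 */
        if (b && !last_b) count_b++;     /* rising edge on motor 2 */
        last_a = a;
        last_b = b;

        /* do the rest of the work here; at 2000 pulses per second a pulse
           only arrives every 500 us, so a short loop has plenty of spare
           passes between edges */
    }
}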

Al

Adrenalynn
08-04-2009, 06:42 AM
How exactly is a tree branch breaking off digital? I've never seen one spontaneously cease to exist... Have you?

The brain is currently thought to function as a series of threshold events - analog.

All silly examples since we're not programming spontaneously vanishing tree limbs or hacking synaptic junctions. If we were interested in the latter, we'd be doing extraordinarily non-digital forward and backward propagation models to approximate the analog interference of the signals and if we were interested in the former, we'd be doing some trippy trinary quantum computing.

Yes, hitting the key is an "event" - would you show me an event-driven (internally) MCU? If you do - it's an RTOS and the underlying RT message passing architecture is interrupt-driven. Events occur when an interrupt vectors execution to the event. Hey - here's an example: What happens when you press a key on the keyboard on your PC and the keyboard interrupt (IRQ1, through the 8259) stuffs it into the buffer? We'll skip the early PC and go more modern and relevant...



PtrHead equ 1Ah         ;offset of the keyboard buffer head pointer in the BIOS data area

        mov  ah, 11h
        int  16h            ;See if we've got a key available. Look, ma!
                            ; worthless interrupts!
        jnz  GetaKey        ;ZF clear means a key is waiting - jump off to our keystroke handler.
        ;...no key - carry on with whatever we were doing...

GetaKey:
        push ds
        push bx
        mov  ax, 40h        ;BIOS data area segment
        mov  ds, ax

        cli                 ;Clearing Interrupts! Don't linger here!
        mov  bx, ds:[PtrHead]           ;Pointer to next character.
        mov  ax, [bx]                   ;Get the character.
        add  word ptr ds:[PtrHead], 2   ;Bump up Head Pointer past this 2-byte entry
                                        ;(buffer wrap-around omitted for brevity)
        pop  bx
        pop  ds

        iret                ;let's return from this worthless interrupt we
                            ;don't need or want...



I suppose there is the slight chance that the millions of engineers that have been writing low-level code to the PC architecture might just have overlooked that whole "key presses should really be polling instead of interrupt-driven" issue...


The "signal made by a switch" has no timing. Which is, in large part, the reason you need to debounce it. You debounce a switch because you can NEVER KNOW the "timing" of a noisy electrical circuit. Because an electro-mechanical switch is anything but digital. Furthermore - I can hold a make/break switch in for as long or as short a period as I like. Which is near-random. Analog. Just like the rest of the world.

The take-away here is that _every experienced engineer_ uses interrupts for time-critical or real-time systems programming. Do you really believe that they are all wrong and you are the only one on the planet to discover that polling works fine for all time-critical applications? I suppose all of us that have put spacecraft in orbit are just inferior programmers that haven't learned "how to use polling", huh? The aforementioned antilock braking system engineers haven't learned "how to use polling". The RT OS message/stack passing architects haven't learned "how to use polling"? Intel, AMD, Atmel, STM, TI, Cypress, National, Fujitsu, Infineon, Freescale, Dallas, Toshiba, NEC,... All morons. Now we're just gettin' silly...


As far as executing an interrupt taking "10 times the time" compared to polling - would you detail which architecture you're referring to and give an example of a real-world, non-tight-looped, complex application where an interrupt that takes, say, two to eight clocks (RISC to CISC) is going to be ten times slower than, say, some 50,000-line application that needs to poll multiple events frequently? Sure - an ISR can be poorly implemented - just like anything else. Maybe the problem we're running into here is that you haven't implemented anything of substantial length and complexity?

Meh - forget it. I'll let you have the last word on this one. I'm done...

lnxfergy
08-04-2009, 06:43 AM
I still have to disagree with the basis of your argument:


Programming is NOT like going to the movies. It is NOT just a matter of taste. Programming is like math. An instruction takes X amount of time. Put some instructions together and make a routine. That routine is going to take X amount of time. So you can prove whether your program will work better using interrupts or not. What do I mean by better? If your program does everything you want it to and does it faster than another program, then that program is better.

If your program is sufficiently fast to accomplish what is necessary, it gets the job done, period. You are completely missing two very important points in deciding which code is 'better': portability of code (which using assembly will throw right out the window...), and the readability/ease-of-maintenance of code. Having a main loop, plus several easy-to-follow interrupts, is also a way of breaking up the logic of the program. Having massive spaghetti code with polling points slapped in everywhere can really be a pain (whereas the interrupt can run in the background at any time, regardless of which function is running).
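
The structure I'm talking about is nothing fancier than this skeleton (a rough AVR-C sketch; using INT0 as the "bumper" input is just an arbitrary example):

#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint8_t bumper_hit = 0;   /* shared flag, set by the ISR */

/* Keep the ISR tiny: note the event and get out. */
ISR(INT0_vect)
{
    bumper_hit = 1;
}

int main(void)
{
    EICRA = (1 << ISC01);          /* INT0 on falling edge */
    EIMSK = (1 << INT0);
    sei();

    for (;;) {
        /* readable, sequential program logic lives here... */
        if (bumper_hit) {
            bumper_hit = 0;
            /* ...and the reaction to the event lives here, instead of
               polling checks sprinkled through every function */
        }
    }
}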

Generally speaking, hobbyists aren't even close to fully utilizing the number of instructions per second that their micro can offer. That doesn't mean they should just run around and write piles of crud code, but it does mean they have some luxury over a true blue embedded designer. But, with well written code, especially a well written ISR that minimizes the number of variables used (and thus the # of instructions for a context switch), I can't see an interrupt being 10x slower, at least not on a modern architecture that doesn't suck.

As for my tutorial, sure, the "ISR based bumper" is a ridiculous example by itself, but it is easy to understand -- isn't that the point of a tutorial, to make something easier to understand? Having a physical entity such as the bumper attached to our interrupt is easier to wrap your head around than a mystery device that outputs a 0.5 us pulse. Of course, the main loop does nothing at all, but hey, it's an example to highlight how an interrupt works, and it's a stub of a program; it doesn't even make the robot move in an intelligent way yet.

-Fergs

Adrenalynn
08-04-2009, 07:16 AM
[clip!] but hey, it's an example to highlight how an interrupt works, and it's a stub of a program, it currently doesn't even make the robot move in an intelligent way.


I would go one step further: We always need to assume that our sensor package will grow more and more complex, more and more involved, more and more timing-critical. I look at what you have as a head-start of a skeleton with one simple sensor to set the stage. The remaining sensors are just left as an exercise for the learner.

Al1970
08-05-2009, 12:49 AM
It would be nice, Adrenalynn, if you stopped trying to change what I said. I said "The way SOME parts of the brain works, NOT analog." See the word "some", not "all".

As far as the PC goes, you left out the fact that the keyboard had its own microprocessor, because the engineers knew that wiring the keyboard directly to the PC and trying to run it by interrupts didn't work. It just took too much of the PC's computing power.

Inxfergy, you did not list one con about using interrupts in your write-up, and there are many.

I've had enough of this nonsense.

Al

Adrenalynn
08-05-2009, 01:08 AM
I've had enough of this nonsense.


Finally - we find common ground. ;)

Acidtech
08-05-2010, 11:08 PM
Seems the subject is moving far afield, but I'd like to point out a case where interrupts are definitely preferred over polling.

Multiple intermittent input signalling. Why would you want to waste the code space/time to poll a dozen input signals (as an example) when they could all be on interrupts? Instead you could be doing something useful with all those processor cycles you would otherwise use up polling for changes.

Another example. You have a single timing-critical function (producing NTSC video signal timings) and a lot of other non-timing-critical code (user input, game AI code, etc...). You could of course write your NTSC video signalling code inline with your user input handler code, but if you had to add or change any of that non-timing-critical code you would then have to rewrite large chunks of your timing-critical NTSC code, or at a minimum recalculate all the timings on that code. Instead, put all the NTSC signal generation in an interrupt handler and you've effectively disconnected that timing-critical code from the non-critical code.

These are of course two very different situations. One is for efficiency and the other is for ease of coding, and yet I suspect you'd find it hard to refute that interrupts are better in both cases than a polling solution. In general, if you have an appropriate interrupt source you should use it. Let the hardware handle the work it was designed for.
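
As a rough sketch of that second, timing-critical case (AVR-flavoured C; the compare value and output pin are placeholders, and real NTSC timing is far more involved than a pin toggle):

#include <avr/io.h>
#include <avr/interrupt.h>

/* The timer fires at a fixed rate set purely by hardware, so the edge
   timing below is untouched by whatever the main loop is doing. */
ISR(TIMER1_COMPA_vect)
{
    PORTB ^= (1 << PB0);                  /* timing-critical edge goes here */
}

int main(void)
{
    DDRB  |= (1 << PB0);                  /* signal pin as output */
    TCCR1B = (1 << WGM12) | (1 << CS10);  /* CTC mode, no prescaler */
    OCR1A  = 999;                         /* placeholder period */
    TIMSK1 = (1 << OCIE1A);               /* enable compare-match interrupt */
    sei();

    for (;;) {
        /* user input, game AI, anything non-critical - change it freely
           without recalculating the signal timing above */
    }
}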

Al1970
08-11-2010, 01:28 AM
I see we are back to this again.

"Seems the subject is moving far afield but I'd like to point out a case where interrupts are definitely prefered over polling.

Multiple intermittent input signalling. Why would you want to wast the code space/time to pull a dozen input signals(as an example) when they could all be on interrupts? Instead you could be doing something usefull with all those processor cycles you would otherwise use up polling for changes."

This is not true. It does NOT take a lot of space to poll 12 inputs. If you use something like a PIC 16F887, it would most likely take about 100 instructions to do it with interrupts, because there is no hardware to help you; you have to do it in the software.

That means you have to jump to an ISR and save the regs that are being used. Then poll, yes I did say poll, all the inputs that are using interrupts, since the PIC has no hardware to tell you which input caused the interrupt. Then you have to write more software to deal with what happens if another interrupt happens while you are still working on the 1st interrupt. This software has to know which interrupts should be allowed to interrupt the 1st and which should not. If the 1st interrupt gets interrupted, then the software has to know that it is no longer working on the 1st interrupt and is now working on the 2nd interrupt. The software has to save the regs, which cannot be in the same place where you saved the last set of regs. Of course all of this has to be undone as you go from the 2nd interrupt back to the 1st interrupt. You don't think all this takes time?
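
In rough C, a sketch of what that single ISR ends up looking like (the flag names here are placeholders, not real PIC symbols, and the real thing would be assembler):

#include <stdint.h>

/* Placeholder flag names - a real mid-range PIC would use its own
   INTCON/PIR flag bits, and the register saving around this routine
   is extra code on top of what is shown. */
volatile uint8_t EXT_INT_FLAG, TIMER_FLAG, PORT_CHANGE_FLAG;

void single_interrupt_service_routine(void)
{
    /* One vector for everything, so the first job is... polling. */
    if (EXT_INT_FLAG) {
        EXT_INT_FLAG = 0;
        /* handle the external pin event */
    }
    if (TIMER_FLAG) {
        TIMER_FLAG = 0;
        /* handle the timer event */
    }
    if (PORT_CHANGE_FLAG) {
        PORT_CHANGE_FLAG = 0;
        /* figure out WHICH of the port pins changed - more polling */
    }
    /* ...plus the context save/restore around all of this, which is
       overhead a straight polling loop never pays */
}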

There is no magic in using interrupts. If you use more than one interrupt on a PIC, you really have to know what you are doing, and in almost all cases polling would beat it hands down.

Al
http://diyrobots.webng.com

Acidtech
08-12-2010, 10:02 AM
It has nothing to do with space. It has to do with TIME. You are wasting time polling for those inputs that you could be spending doing other stuff. It may not seem like much to poll 12 inputs, but that's at least 12 instructions (most likely many more than that) during which you could be doing something else.

Polling takes resources; interrupts do not. Period. The blanket statement that interrupts are never needed is wrong. They aren't needed in every case, but there are cases where it is more efficient to use them. That is why they were invented.

Al1970
08-13-2010, 12:16 AM
These are your statements:

"Why would you want to wast the code space/time to pull a dozen input signals(as an example) when they could all be on interrupts?"

"It has nothing to do with space"

You can't even agree with yourself!

"The blanket statement that interrupts are never needed is wrong."

That is true; that is why I NEVER made that statement.

Al
http://diyrobots.webng.com