Author Topic: "The AGC wan't powerful enough"  (Read 24986 times)

Offline Abaddon

  • Saturn
  • ****
  • Posts: 1132
Re: "The AGC wan't powerful enough"
« Reply #15 on: November 30, 2019, 02:49:03 PM »
So how much power do they think is required?
Nobody knows. The claim is that the AGC was insufficient, and there it stops. As for what, in their opinion, would constitute sufficient power: that is a desert populated by tumbleweeds.

Boeing 747s flew before the moon landing, so I guess the state of computing power at the time was adequate for a large jet airliner, but inadequate for a spacecraft.  Unless Boeing 747s are fake too.
Well, they must be. If nobody could fly the first Apollo landing because it had not been tested, then nobody could have flown the first 747 untested either. Same stupid logic makes all aircraft impossible.

Much of the computing power used in a lot of modern applications is for a high resolution graphics interface.  If you don't have that, it's a lot easier.
Yup. Running a 7-segment LED is a trivial issue compared to a modern graphics card. Indeed, while you can run an AGC simulator on your PC, most of the time your PC will be doing something else. This is not the case with an actual AGC. It does nothing else at all. It doesn't have to.
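
To put a number on "trivial": here is a minimal sketch, in C, of roughly everything it takes to drive one 7-segment digit. The write_segments() routine is hypothetical, standing in for whatever port write a given board actually needs.
Code: [Select]
#include <stdint.h>

/* Hypothetical output routine: one bit per segment (a..g) written to the display port. */
extern void write_segments(uint8_t segment_bits);

/* Segment patterns for the digits 0-9; bit 0 = segment a ... bit 6 = segment g. */
static const uint8_t DIGIT_SEGMENTS[10] = {
    0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07, 0x7F, 0x6F
};

void show_digit(unsigned digit)
{
    if (digit < 10)
        write_segments(DIGIT_SEGMENTS[digit]);
}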

Offline JayUtah

  • Neptune
  • ****
  • Posts: 3787
    • Clavius
Re: "The AGC wan't powerful enough"
« Reply #16 on: November 30, 2019, 04:19:34 PM »
I could ask why you're doing this, given the low cost and ubiquity of microcontrollers and small microprocessors you could dedicate to the task...

It's a legacy algorithm the customer wants to run in its supercomputing cloud.  I gather they don't need the real-time capacity anymore, just the SIMD multiprocessing synchronization.  There is actually quite a lot of this sort of conversion happening, where data are gathered in the field with comparatively dumb methods and then uploaded for central processing.  Previously the data would have been processed locally.  This particular one would be a trivial job if the code had been written cleanly to start with.  From the bits I've seen, it's poorly modularized.

The point was that code like
Code: [Select]
while ( true )
   ;   /* busy-wait: spin here doing nothing until an interrupt fires */
while you're doing nothing but waiting for an interrupt was often acceptable in embedded systems.  In more primitive architectures like the AGC, you can't do any of the various things modern embedded microprocessors do, like going to sleep.  I'm guessing an "idle" CPU in the AGC would just be running a tight loop.  In a modern time-sharing system, the proper thing to do would be to yield the CPU and let the scheduler decide what to do.
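
For contrast, a minimal sketch of both alternatives I mentioned: the first assumes an ARM Cortex-M target whose CMSIS device header supplies the __WFI() intrinsic, the second assumes a POSIX scheduler.
Code: [Select]
#include <sched.h>   /* sched_yield(), POSIX -- for the time-sharing variant */

/* Modern embedded idle: halt the core until the next interrupt.
   Assumes an ARM Cortex-M part whose CMSIS device header provides __WFI(). */
void embedded_idle(void)
{
    for (;;)
        __WFI();            /* core sleeps; any enabled interrupt wakes it */
}

/* Time-sharing idle: hand the CPU back and let the scheduler decide. */
void timeshare_idle(void)
{
    for (;;)
        sched_yield();
}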
"Facts are stubborn things." --John Adams

Offline JayUtah

  • Neptune
  • ****
  • Posts: 3787
    • Clavius
Re: "The AGC wan't powerful enough"
« Reply #17 on: November 30, 2019, 04:35:13 PM »
Sure, but that is a hardware limitation, not a code limitation. Which was, ironically, implemented by interrupts.

Having only four bits for an interrupt bus was extremely short-sighted of Intel.  But the 1201 and 1202 alarms raise issues my senior software guys are always getting on the juniors' case over:  just because something is happening under the hood doesn't mean you get to ignore it.  Every option incurs overhead.  In the AGC case, they had worked out on paper how much time the computer would normally be expected to spend servicing interrupts.  And they planned the rest of the code with that "fixed" overhead in mind.  When the number of radar interrupts doubled unexpectedly, this had a cascade effect on the rest of the combined hardware-software system.
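
The back-of-the-envelope version of that planning looks something like the sketch below. The numbers are purely illustrative, not the actual AGC budget.
Code: [Select]
#include <stdio.h>

int main(void)
{
    /* Illustrative numbers only -- not the real AGC figures. */
    double irq_rate_hz    = 6400.0;   /* expected radar counter interrupts per second */
    double service_time_s = 12e-6;    /* CPU time spent servicing each one */

    double planned = irq_rate_hz * service_time_s;         /* fraction of CPU consumed */
    double actual  = 2.0 * irq_rate_hz * service_time_s;   /* interrupt rate doubles */

    printf("planned interrupt overhead: %4.1f%% of the CPU\n", 100.0 * planned);
    printf("overhead after the doubling: %4.1f%% of the CPU\n", 100.0 * actual);
    /* Everything else was scheduled against the planned figure, so the extra
       overhead comes straight out of the margin the rest of the code relied on. */
    return 0;
}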

Yes, you could have solved some of the Apollo 11 problems in hardware.  Provide more erasable memory, and you get more core sets.  That way a program is less likely to complain that it can't find an empty core set and raise a program alarm.  Similarly, always tie the radar power supply clocks and you'll never get duplicate interrupts.  But that's not how embedded systems designers think.  They're writing code for a computing instrument they designed, and whose operational behavior is something they can know intimately.  Today developers are taught to write portable code.  Those techniques seem ill-suited to even modern embedded systems design.  The AGC programmers wrote code that exactly fit their machine.  This was more the norm.
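
For the curious, a toy sketch of the core-set idea: a fixed pool of job slots, and a program alarm when a request finds none free. The pool size and the names here are invented for illustration; the real Executive's tables were laid out rather differently.
Code: [Select]
#include <stddef.h>

#define NUM_CORE_SETS 7              /* fixed at design time; size is illustrative */

struct core_set {
    int in_use;
    int priority;
    /* ...job registers, restart point, and so on... */
};

static struct core_set core_sets[NUM_CORE_SETS];

extern void raise_program_alarm(int code);   /* hypothetical alarm hook */

/* Try to start a new job; alarm if every core set is already occupied. */
struct core_set *allocate_core_set(int priority)
{
    for (size_t i = 0; i < NUM_CORE_SETS; i++) {
        if (!core_sets[i].in_use) {
            core_sets[i].in_use   = 1;
            core_sets[i].priority = priority;
            return &core_sets[i];
        }
    }
    raise_program_alarm(1202);       /* Executive overflow: no core sets available */
    return NULL;
}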
"Facts are stubborn things." --John Adams

Offline Abaddon

  • Saturn
  • ****
  • Posts: 1132
Re: "The AGC wan't powerful enough"
« Reply #18 on: November 30, 2019, 05:21:04 PM »
Sure, but that is a hardware limitation, not a code limitation. Which was, ironically, implemented by interrupts.

Having only four bits for an interrupt bus was extremely short-sighted of Intel.
Sure, in hindsight. Remember the notion that nobody could possibly need more than 640k RAM? Remember extended memory up to 1 meg of RAM? Put bluntly, developers will find a way to consume all available resources; it's just how it is.

But the 1201 and 1202 alarms raise issues my senior software guys are always getting on the juniors' case over:  just because something is happening under the hood doesn't mean you get to ignore it.  Every option incurs overhead.  In the AGC case, they had worked out on paper how much time the computer would normally be expected to spend servicing interrupts.  And they planned the rest of the code with that "fixed" overhead in mind.  When the number of radar interrupts doubled unexpectedly, this had a cascade effect on the rest of the combined hardware-software system.
I have always regarded the 1201 and 1202 as the AGC equivalent of "I don't know".

Yes, you could have solved some of the Apollo 11 problems in hardware.  Provide more erasable memory, and you get more core sets.  That way a program is less likely to complain that it can't find an empty core set and raise a program alarm.  Similarly, always tie the radar power supply clocks and you'll never get duplicate interrupts.  But that's not how embedded systems designers think.  They're writing code for a computing instrument they designed, and whose operational behavior is something they can know intimately.
And that makes sense in context. "I don't know" is a valid answer to any question. I would rather have that than an outright crash on a lunar mission.

Today developers are taught to write portable code.  Those techniques seem ill-suited to even modern embedded systems design.  The AGC programmers wrote code that exactly fit their machine.  This was more the norm.
Ah. That is a personal bugbear of mine. Cross-platform developers are becoming divorced from the actual hardware. The hardware matters no matter how one slices it. Why else would we have Apples and PCs? And software developed for each? Not to mention Linux.

On top of that we have a gaggle of devs working on apps for smartphones and such devices.

To my mind, an understanding of the underlying hardware is essential. Not so much to the new young dudes, apparently.

Offline raven

  • Uranus
  • ****
  • Posts: 1637
Re: "The AGC wan't powerful enough"
« Reply #19 on: November 30, 2019, 09:23:30 PM »
It's probably the case that it would have taken more "computer power" to fake the moon landings in 1969 than it did to actually go there.
Given that I've heard the claim it used CGI (a ridiculously preposterous idea) that would need to be decades ahead of its time to have even a hope of achieving what was seen on TV, let alone in photographs, I have no doubt of that.

Offline JayUtah

  • Neptune
  • ****
  • Posts: 3787
    • Clavius
Re: "The AGC wan't powerful enough"
« Reply #20 on: December 01, 2019, 04:45:09 PM »
Sure, in hindsight.

I remember it being discussed at the time.  Remember also, as you say, segmented memory addressing, a flash-in-the-pan technique that was very quickly superseded by flat address spaces in other, better architectures.  But we got stuck with segment registers for far longer than needed simply because Intel cemented it in place.  In deference to your defense, I'll point out that the AGC had a pretty bonkers memory banking scheme.  For all its beauty, a lot of it too was obsolete right out of the gates (multiple puns intended).  I'll quit ragging on Intel.
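
For anyone who never had the pleasure, a minimal sketch of what real-mode segmentation actually did: a 20-bit physical address computed from a 16-bit segment and a 16-bit offset, with plenty of aliasing.
Code: [Select]
#include <stdint.h>
#include <stdio.h>

/* 8086 real-mode translation: physical = segment * 16 + offset. */
static uint32_t real_mode_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* Two different segment:offset pairs landing on the same physical address. */
    printf("B800:0000 -> %05X\n", (unsigned)real_mode_address(0xB800, 0x0000));
    printf("B000:8000 -> %05X\n", (unsigned)real_mode_address(0xB000, 0x8000));
    return 0;
}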

Quote
Remember the notion that nobody could possibly need more than 640k RAM?

Apocryphal statement.  But yes, the history of computer technology is the history of people making bizarrely wrong guesses about what the future would bring.  It wasn't too long ago that 15 teraFLOPS was a pretty fast computer.  Almost half my senior software staff comes from the gaming industry.  Those guys know how to push hardware.  But they also know how to analyze the hardware and optimize for it so that they don't push beyond the hardware.  That talent is what I wish were more prevalent in the software industry, and I think that's what the AGC programmers exemplified.

Quote
I have always regarded the 1201 and 1202 as the AGC equivalent of "I don't know".
[...]
"I don't know" is a valid answer to any question. I would rather have that then an outright crash on a lunar mission.

I had to read this a couple of times before I understood it enough to agree with it.  Yes, I think an important consideration in any critical system -- however designed and built -- is not to promise (or insinuate) anything it can't deliver.  So on the one hand, an automated system should never behave as if it has things well in hand when it can know it doesn't.  On the other hand, it should do its best to fail gracefully.  And by that I mean fall back to successively less capable modes of operation rather than stop suddenly altogether.  Even sophisticated automotive controllers often have a "limp mode" that provides basic engine operation.  And for PGNS there was AGS.  But especially with highly qualified pilots, you don't want to err on the side of suppressing failure indications in a misguided attempt to limp along as if nothing was wrong.  One can make the case that certain large airframe manufacturers need to learn that lesson anew.
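
A crude sketch of the "successively less capable modes" idea; the mode names, health checks, and ordering are invented purely for illustration.
Code: [Select]
/* Fall back through progressively less capable modes instead of stopping dead.
   Everything named here is illustrative, not any particular flight system. */
enum control_mode { PRIMARY, DEGRADED, BACKUP, MANUAL_ONLY };

extern int primary_healthy(void);
extern int degraded_healthy(void);
extern int backup_healthy(void);

enum control_mode select_mode(void)
{
    if (primary_healthy())  return PRIMARY;      /* full capability */
    if (degraded_healthy()) return DEGRADED;     /* e.g. reduced update rate */
    if (backup_healthy())   return BACKUP;       /* e.g. AGS standing in for PGNS */
    return MANUAL_ONLY;                          /* tell the crew; never just go silent */
}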

The way the AGC was architected, we could discuss forever what a "crash" means, in the computer sense.  But the real genius was that while 1201 and 1202 simply signaled symptoms in terms of undesirable, detectable software states, a human could make a judgment.  The AGC didn't know why the Executive was overloaded, or why there were no available core sets.  That level of introspection was not provided by the programmers.  But Steve Bales knew.  Which is to say, he knew that the consequences of not running certain periodic tasks would be an accumulation of uncorrected error, but that as long as that condition did not persist, the entire vehicle could stay within flight tolerances even though not strictly within the designated deadband.  It's the equivalent of taking your eyes off the road for a minute to fiddle with the radio.  It's naturally not as safe as maintaining situational awareness, but it can be tolerated briefly.

Quote
Ah. That is a personal bugbear of mine. Cross-platform developers are becoming divorced from the actual hardware. The hardware matters no matter how one slices it.

Yeah, there's a lot of potential discussion to be had there, and if we had it I'd like the more professional software developers to weigh in on it.  I've rarely seen reuse done well, even with the best of intentions.  I've rarely seen portability done well, but I'm sure some of the open-source community could easily come up with good examples.  What irks me above all are some programmers who come from a certain language culture (which shall remain nameless) who are blithely unaware that any sort of hardware exists at all.  A few of these people -- a very few, thankfully -- seem to have no idea whatsoever how computers work.

That said, as ka9q points out, often the right answer is simply to throw more silicon at the problem.  If $2,000 worth of additional RAM will solve someone's problem in as much time as it takes to install the SIMMs, then why would any conscientious engineering company expend ten times that much or more in programmer time and money to bring the present solution under the existing hardware umbrella?  For many classes of problems in computing, there are severe limits to what can be optimized by programmers burning their neurons long into the night.  I've seen talented programmers achieve factors (not just margins) of increased resource efficiency -- admittedly in originally poor code.  But I've also seen expensive software improvement efforts that result in only marginal increases in performance or efficiency, sometimes at the expense of correctness in a complicated algorithm.  Whatever else an algorithm is, it has to be correct.

I've found that electrical engineers take a very different approach to software than computer scientists.  Historically they write only embedded software, and they don't think for a moment that they can change the hardware without also having to change the software, or that the software they write for one gadget will transfer seamlessly to some other gadget.  The commercial reality of reuse and standardization is changing this, but if you want to talk just in terms of what EEs think software is, it's instructive.
"Facts are stubborn things." --John Adams

Offline bknight

  • Neptune
  • ****
  • Posts: 3107
Re: "The AGC wan't powerful enough"
« Reply #21 on: December 02, 2019, 03:10:29 PM »
I can remember Derek's repeat of John's theory that the AGNC wasn't up to snuff in two threads (UM and here) prior to A13.  Of course he could not present any evidence to that end, but his buddy that worked for Hughes let him know.  ::)

We're going onto six months post A11 anniversary and still no dramatic proof that Apollo was fake and no D to show his stuff.
Truth needs no defense.  Nobody can take those footsteps I made on the surface of the moon away from me.
Eugene Cernan

Offline Zakalwe

  • Uranus
  • ****
  • Posts: 1588
Re: "The AGC wan't powerful enough"
« Reply #22 on: December 02, 2019, 04:36:14 PM »
I can remember Derek's repeat of John's theory that the AGNC wasn't up to snuff in two threads (UM and here) prior to A13.  Of course he could not present any evidence to that end, but his buddy that worked for Hughes let him know.  ::)

We're going onto six months post A11 anniversary and still no dramatic proof that Apollo was fake and no D to show his stuff.

Who was the other loon that was convinced that Aldrin was going to break down and admit it was all a hoax on the eve of (IIRC) the 40th anniversary (or was it the 45th)? We're still waiting for that one to happen too....
"The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.' " - Isaac Asimov

Offline Abaddon

  • Saturn
  • ****
  • Posts: 1132
Re: "The AGC wan't powerful enough"
« Reply #23 on: December 02, 2019, 05:56:00 PM »
Sure, in hindsight.

I remember it being discussed at the time.  Remember also, as you say, segmented memory addressing, a flash-in-the-pan technique that was very quickly superseded by flat address spaces in other, better architectures.  But we got stuck with segment registers for far longer than needed simply because Intel cemented it in place.  In deference to your defense, I'll point out that the AGC had a pretty bonkers memory banking scheme.  For all its beauty, a lot of it too was obsolete right out of the gates (multiple puns intended).  I'll quit ragging on Intel.
My opinion on that is that it was a triumph of marketing over engineering.
Quote
Remember the notion that nobody could possibly need more than 640k RAM?

Apocryphal statement.  But yes, the history of computer technology is the history of people making bizarrely wrong guesses about what the future would bring.  It wasn't too long ago that 15 teraFLOPS was a pretty fast computer.  Almost half my senior software staff comes from the gaming industry.  Those guys know how to push hardware.  But they also know how to analyze the hardware and optimize for it so that they don't push beyond the hardware.  That talent is what I wish were more prevalent in the software industry, and I think that's what the AGC programmers exemplified.
Alas, it is still all too common.

Quote
I have always regarded the 1201 and 1202 as the AGC equivalent of "I don't know".
[...]
"I don't know" is a valid answer to any question. I would rather have that then an outright crash on a lunar mission.

I had to read this a couple of times before I understood it enough to agree with it.  Yes, I think an important consideration in any critical system -- however designed and built -- is not to promise (or insinuate) anything it can't deliver.  So on the one hand, an automated system should never behave as if it has things well in hand when it can know it doesn't.  On the other hand, it should do its best to fail gracefully.  And by that I mean fall back to successively less capable modes of operation rather than stop suddenly altogether.  Even sophisticated automotive controllers often have a "limp mode" that provides basic engine operation.  And for PGNS there was AGS.  But especially with highly qualified pilots, you don't want to err on the side of suppressing failure indications in a misguided attempt to limp along as if nothing was wrong.  One can make the case that certain large airframe manufacturers need to learn that lesson anew.
Ah, the semi-mythical graceful exit. There still exists a surprising amount of software that refuses point-blank to provide any data in the event of something catastrophic. Too often one sees a helpful message like "There was an error." and that's it. Nothing else.
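
A minimal sketch of the difference, assuming POSIX errno; the function and file names are invented. Saying what failed, on what, and why costs almost nothing.
Code: [Select]
#include <stdio.h>
#include <string.h>
#include <errno.h>

/* The unhelpful version:
     fprintf(stderr, "There was an error.\n");
   The marginally more graceful version: */
static int load_config(const char *path)
{
    FILE *f = fopen(path, "r");
    if (!f) {
        fprintf(stderr, "load_config: cannot open '%s': %s\n",
                path, strerror(errno));
        return -1;
    }
    /* ...parse the file... */
    fclose(f);
    return 0;
}

int main(void)
{
    return load_config("/etc/example.conf") ? 1 : 0;
}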

The way the AGC was architected, we could discuss forever what a "crash" means, in the computer sense.  But the real genius was that while 1201 and 1202 simply signaled symptoms in terms of undesirable, detectable software states, a human could make a judgment.  The AGC didn't know why the Executive was overloaded, or why there were no available core sets.  That level of introspection was not provided by the programmers.  But Steve Bales knew.  Which is to say, he knew that the consequences of not running certain periodic tasks would be an accumulation of uncorrected error, but that as long as that condition did not persist, the entire vehicle could stay within flight tolerances even though not strictly within the designated deadband.  It's the equivalent of taking your eyes off the road for a minute to fiddle with the radio.  It's naturally not as safe as maintaining situational awareness, but it can be tolerated briefly.
Although he had backroom engineers supporting him, it was still a huge call to make.

Quote
Ah. That is a personal bugbear of mine. Cross-platform developers are becoming divorced from the actual hardware. The hardware matters no matter how one slices it.

Yeah, there's a lot of potential discussion to be had there, and if we had it I'd like the more professional software developers to weigh in on it.  I've rarely seen reuse done well, even with the best of intentions.  I've rarely seen portability done well, but I'm sure some of the open-source community could easily come up with good examples.  What irks me above all are some programmers who come from a certain language culture (which shall remain nameless) who are blithely unaware that any sort of hardware exists at all.  A few of these people -- a very few, thankfully -- seem to have no idea whatsoever how computers work.
Fundamental SQL is not half bad at that, right up until one gets to the more esoteric procedure calls.

That said, as ka9q points out, often the right answer is simply to throw more silicon at the problem.  If $2,000 worth of additional RAM will solve someone's problem in as much time as it takes to install the SIMMs, then why would any conscientious engineering company expend ten times that much or more in programmer time and money to bring the present solution under the existing hardware umbrella?
Hence the need to develop a minimum spec, and stick to it whatever marketing might say.

For many classes of problems in computing, there are severe limits to what can be optimized by programmers burning their neurons long into the night.  I've seen talented programmers achieve factors (not just margins) of increased resource efficiency -- admittedly in originally poor code.  But I've also seen expensive software improvement efforts that result in only marginal increases in performance or efficiency, sometimes at the expense of correctness in a complicated algorithm.  Whatever else an algorithm is, it has to be correct.
And that is a tightrope to walk. Chuck hardware or programming resources at the problem? I have been fortunate with most of my clients in that when I make that call, they simply say "OK" based on a solid case and authorise whatever loot is needed for whichever route forward. But not everybody is so fortunate.

I've found that electrical engineers take a very different approach to software than computer scientists.  Historically they write only embedded software, and they don't think for a moment that they can change the hardware without also having to change the software, or that the software they write for one gadget will transfer seamlessly to some other gadget.  The commercial reality of reuse and standardization is changing this, but if you want to talk just in terms of what EEs think software is, it's instructive.
I lack much experience with EEs in the programming realm, so I cannot comment much. What little I have suggests that they are concerned largely with naturally embedded solutions such as PLCs that control traffic lights, for example. I learned those ropes some 35 years ago and have not used them since, nor would I make the attempt. But I know it is an art to operate within those constraints.

Offline Abaddon

  • Saturn
  • ****
  • Posts: 1132
Re: "The AGC wan't powerful enough"
« Reply #24 on: December 02, 2019, 06:01:01 PM »
I can remember Derek's repeat of John's theory that the AGNC wasn't up to snuff in two threads (UM and here) prior to A13.  Of course he could not present any evidence to that end, but his buddy that worked for Hughes let him know.  ::)

We're going onto six months post A11 anniversary and still no dramatic proof that Apollo was fake and no D to show his stuff.

Who was the other loon that was convinced that Aldrin was going to break down and admit it was all a hoax on the eve of (IIRC) the 40th anniversary (or was it the 45th)? We're still waiting for that one to happen too....

Oh, that is ringing some bells. Can't dredge up the name from the memory banks. Sorry.

Offline Zakalwe

  • Uranus
  • ****
  • Posts: 1588
Re: "The AGC wan't powerful enough"
« Reply #25 on: December 03, 2019, 03:59:44 AM »

Who was the other loon that was convinced that Aldrin was going to break down and admit it was all a hoax on the eve of (IIRC) the 40th anniversary (or was it the 45th)? We're still waiting for that one to happen too....

Oh, that is ringing some bells. Can't dredge up the name from the memory banks. Sorry.

Was it the world's most boring accountant that had Neil Armstrong's ghost appearing to him in his dreams? Or the "atomic bombs don't work" loon? After a while they all start to blur into a mushy mess of stupidity, resentment and ignorance!  ;D
"The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.' " - Isaac Asimov

Offline rocketman

  • Mercury
  • *
  • Posts: 17
Re: "The AGC wan't powerful enough"
« Reply #26 on: December 03, 2019, 06:20:05 AM »
Or the "atomic bombs don't work" loon?

Seems like that one should have negative consequences on one's likelihood of survival, at least back in the days of more active testing.

Offline ApolloEnthusiast

  • Venus
  • **
  • Posts: 38
Re: "The AGC wan't powerful enough"
« Reply #27 on: December 03, 2019, 08:04:56 AM »
We're going onto six months post A11 anniversary and still no dramatic proof that Apollo was fake and no D to show his stuff.
He probably meant the 60th anniversary and typed 50th by mistake.  We'll see in another 10 years  ::)

Offline bknight

  • Neptune
  • ****
  • Posts: 3107
Re: "The AGC wan't powerful enough"
« Reply #28 on: December 03, 2019, 08:24:41 AM »
We're going onto six months post A11 anniversary and still no dramatic proof that Apollo was fake and no D to show his stuff.
He probably meant the 60th anniversary and typed 50th by mistake.  We'll see in another 10 years  ::)
LOL. Well, he indicated he was a physicist and could have made a fat-finger entry.  No, he stated the same BS in two threads (UM and here), so I conclude he believed that Apollo didn't land on the Moon, except for A14, A15 and A16.
Truth needs no defense.  Nobody can take those footsteps I made on the surface of the moon away from me.
Eugene Cernan

Offline ka9q

  • Neptune
  • ****
  • Posts: 3014
Re: "The AGC wan't powerful enough"
« Reply #29 on: December 03, 2019, 09:23:42 AM »
I have always regarded the 1201 and 1202 as the AGC equivalent of "I don't know".
They really mean "I'm running out of real time". The specific ways that's detected don't matter. Adding more memory wouldn't have fixed the problem because running out of memory was only a symptom of the real problem.

This became instantly obvious to me when I saw the recent YouTube demo of a recreated AGC re-running the Apollo 11 landing with spurious rendezvous radar interrupts added. The COMP ACTY light went on almost continuously, as it had to. Had I been in Aldrin's position, I like to think that I would have immediately realized what was going on, though not why or whether I could continue.  Any real-time computer system MUST have some idle time left over, or it won't keep up.
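
A toy illustration of that last point, with made-up numbers: once the work offered per cycle exceeds the cycle, the backlog grows without bound and the system never catches up.
Code: [Select]
#include <stdio.h>

int main(void)
{
    /* Made-up loads: work offered per 1.0-unit frame, versus a capacity of 1.0. */
    double offered[] = { 0.85, 1.15 };
    for (int c = 0; c < 2; c++) {
        double backlog = 0.0;
        for (int frame = 1; frame <= 5; frame++) {
            backlog += offered[c] - 1.0;       /* work left over after this frame */
            if (backlog < 0.0) backlog = 0.0;  /* idle time absorbs any surplus */
            printf("load %3.0f%%, frame %d: backlog %.2f frames\n",
                   100.0 * offered[c], frame, backlog);
        }
    }
    /* At 85% load the backlog stays at zero; at 115% it grows every frame. */
    return 0;
}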