Tuesday, September 3, 2024

TeleScript Software - Optimizing Performance for Windows 11

In earlier days, computer performance could largely be predicted by examining hardware specifications, and even from manufacturers’ model numbers. Today, virtually all of a particular brand’s offerings are highly configurable, offering a choice of processors, display cards, memory capacity, and internal drive specs. In addition, computers today are “driver centric” – that is, the performance of hardware may depend more on driver design than on the actual physical device specs. For this reason, it’s difficult to make a reliable recommendation when customers ask which computer model will work well with TeleScript Software.

The only real test is to buy and try. Since there are hundreds if not thousands of potential candidates, this is obviously not practical. Nevertheless, there are some general properties that make a specific system more likely to provide good performance. Over the years at Telescript West, Inc., we have configured several hundred systems for customers using a variety of brands, running operating systems from MS-DOS through the Microsoft Windows versions: Windows 95, Windows Vista, Windows 7, Windows 8/8.1, Windows 10, and, most recently, Windows 11. Each computer and each OS version has presented some unique considerations. In this document, I’ve summarized my findings for best performance on Windows 11.

Most experimental data is derived from the PerformanceTest software available from PassMark.com. Though several benchmarking applications are available, PassMark is the one I've found best for TeleScript Software, because the others seem to overlook 2D graphics performance, which is essential for smooth scrolling in TeleScript ProNEWS and TeleScript AV. (TeleScript TECH is more dependent on 3D graphics and is very forgiving of hardware configuration.) In addition, a large, searchable database is available on PassMark.com, comparing performance data from various system components. I highly recommend this web site and their inexpensive, comprehensive software for assessing computer performance.

I. Selecting a Computer

Higher price is not always better. There is no consistent correspondence between price and performance – at least not specifically with TeleScript Software. Some high-end, so-called “gaming computers” perform no better than some lower-end computers.

What is likely to be important? In the driver-centric world explained above, and depending on the manufacturer’s board design, the same components may perform differently in different systems. In general, here are some hardware specs that might be significant:

1. More RAM is better. Why? Because more memory means fewer system requests for virtual memory. A virtual memory request from the OS involves a read and/or write to the system drive. Drive access is always a high-priority OS request and may interrupt a thread important to smooth scrolling. On today’s complex multi-core CPUs, the frequency of virtual memory requests and their impact on program performance are not as predictable as in earlier single-core systems, but experiments show that systems with more RAM always perform better with TeleScript Software.

2. A faster hard drive is important, for much the same reason that more RAM helps: it’s largely about virtual memory. A fast SSD always performs better in experiments. Be certain not to run short of drive space, which will severely impact your computer’s performance. A quick check of both is sketched below.
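As a quick sanity check before a prompting session, a short Win32 program can report both memory pressure and free drive space. This is a minimal sketch of my own (not part of TeleScript Software), using only standard Win32 calls:

// mem_disk_check.cpp - minimal sketch: report memory load and free space on C:
// so you can spot conditions that force heavy virtual-memory traffic.
#include <windows.h>
#include <cstdio>

int main() {
    MEMORYSTATUSEX ms = {};
    ms.dwLength = sizeof(ms);
    if (GlobalMemoryStatusEx(&ms)) {
        printf("Memory load: %lu%%\n", ms.dwMemoryLoad);
        printf("Physical RAM free: %.1f GB of %.1f GB\n",
               ms.ullAvailPhys / 1073741824.0,
               ms.ullTotalPhys / 1073741824.0);
    }
    ULARGE_INTEGER freeBytes = {}, totalBytes = {};
    if (GetDiskFreeSpaceExW(L"C:\\", &freeBytes, &totalBytes, nullptr)) {
        printf("C: free: %.1f GB of %.1f GB\n",
               freeBytes.QuadPart / 1073741824.0,
               totalBytes.QuadPart / 1073741824.0);
    }
    return 0;
}

If memory load runs high or the system drive is nearly full while prompting, virtual-memory traffic is the likely culprit for scrolling hitches.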

3. 2D graphics performance is more impactful than 3D graphics performance. The algorithms used to display prompting text, and to smoothly update the scrolling display, depend more on 2D performance. I have used software provided by PassMark to find display hardware that reportedly has good 2D performance. Some systems, such as some gaming computers, have excellent 3D performance, important for games and graphics, but poorer 2D specs. 2D graphics hardware handles text display and is at the heart of TeleScript Software performance.

4. CPU speed is helpful, but not as critical. In earlier computers, the CPU was often involved in display buffering, but I don’t see any evidence that modern display card design uses this older technology.

5. High-resolution displays are NOT recommended. A display with a resolution of 1920x1080 is more than adequate for a good teleprompter display on all common monitors used for this application. The display card must fill EVERY pixel of an LCD display when the text scrolls. Some drivers may have algorithms to do minimal updates, but in a prompting script, virtually every displayed pixel changes as text scrolls.

It’s important to note that neither scaling nor changing resolution via the display card’s control app or Windows settings will actually change the monitor’s resolution. In fact, setting a lower resolution – such as setting the “resolution” of a high-DPI LCD monitor to a lower perceived resolution – may actually hurt scrolling performance. In order to make screen content look larger, the display driver, or possibly the OS, must rescale screen images. This adds an extra layer of processing for each pixel displayed – every pixel STILL has to be displayed. The LCD pixel mask can’t actually change.

The bottom line is that more pixels require more time to display. For example, a 2560x1600 laptop monitor has more than 4 million pixels to write. A 1920x1080 LCD display has only about 2 million pixels – roughly half as many. All else being equal, this means the computer will take about twice the time to write the display. Display resolution is a major contributor to scrolling smoothness. The arithmetic is spelled out in the sketch below.
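To put numbers on the pixel budget, here is a small illustration of my own (not from TeleScript Software) comparing per-frame and per-second pixel writes at a typical 60 Hz refresh:

// pixel_budget.cpp - illustration: the pixel-write budget at 60 Hz
#include <cstdio>

int main() {
    struct Mode { const char* name; long w, h; };
    const Mode modes[] = { {"2560x1600 laptop panel", 2560, 1600},
                           {"1920x1080 external LCD", 1920, 1080} };
    const int fps = 60;              // typical refresh rate
    for (const Mode& m : modes) {
        long pixels = m.w * m.h;     // pixels written per frame
        printf("%s: %ld pixels/frame, %.0f million pixels/second\n",
               m.name, pixels, pixels * (double)fps / 1e6);
    }
    // 2560x1600 = 4,096,000 pixels; 1920x1080 = 2,073,600 - about half.
    return 0;
}

At 60 Hz, the high-DPI panel demands roughly 246 million pixel writes per second versus about 124 million for 1920x1080, and every one of those writes competes for the same memory and GPU bandwidth.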

If you are using an external display, which is the normal teleprompter configuration, the same criterion applies regarding display resolution. Also, although what goes on inside any display card is a black box and the methods are not public, duplicating the display – it used to be called “cloning” – is achieved by filling one display buffer and “blitting” it to the second display buffer. However the second display is achieved, it’s not just a ‘Y’ cable… there are two separate display buffers that must be filled, and the size of each buffer depends upon that display’s physical resolution. More pixels means more time.

6. Update drivers on any system. On any computer system, it’s vitally important to update drivers for all components, but particularly the video/display driver. Some computers ship with the chip manufacturer’s generic driver; others, notably Dell computers, require the system manufacturer’s customized drivers. Unfortunately, in some cases, customized drivers may lag the chip manufacturer’s drivers. I have encountered many cases in which a driver update fixes a performance issue or a screen artifact. To repeat, today’s systems are driver-centric – identical hardware may perform better or worse depending upon the system drivers.

II. System Tweaks

While earlier iterations of Windows and their associated drivers often provided some simple adjustments that were almost certain to deliver acceptable performance, there are few, if any, exposed configurable display items in Windows 10/11. As an example, on Windows 7 it was possible to set the color depth to 16 bits, which cut the time required to draw the display by a factor of 2. Below are some system components that may be configured to improve performance.

1. BitLocker drastically affects 2D graphics performance. BitLocker is an effective tool for preventing unauthorized data access and has been around since Windows Vista. However, on some Windows 11 systems it is enabled by default, whereas on previous versions it had to be enabled by the user. In performance tests with BitLocker enabled vs. disabled, I discovered that 2D graphics performance was impacted by almost 50%. This is shown in the following results from PassMark software.

[PerformanceTest results with BitLocker disabled]

[PerformanceTest results with BitLocker enabled]

The impact on system performance is apparent:

CPU Mark reduced by 13%

3D Graphics Mark reduced by 20%

Memory Mark reduced by 10%

Disk Mark impact less than 1%

Overall system impact over 17%

HOWEVER, the all-important 2D Graphics Mark is impacted by 46%.

The 2D graphics performance reduction is almost equivalent to that of doubling the screen resolution. As mentioned, scrolling smoothness is highly dependent on 2D graphics performance.

Obviously, discretion must be used to determine whether the benefits of BitLocker outweigh its inevitable impact on TeleScript Software performance. My recommendation is to enable BitLocker for removable drives but to disable it for the primary drive. For best security, disable internet access on critical prompting computers. TeleScript Software does NOT require internet access. Private networks can be used to provide script edits. All imported data should be carefully checked for malware, which can be carried by many common data files. Also, as has been proven repeatedly in recent times, no existing technology can provide 100% protection against hackers other than air-gapping critical systems. Clearly, prompting the President is a critical application. So, disable BitLocker and air-gap the prompting system.
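Because an update can silently change protection status, it’s worth verifying BitLocker’s state before a critical session. Here is a minimal sketch of my own that shells out to the built-in manage-bde tool (run it from an elevated prompt):

// bitlocker_check.cpp - sketch: query BitLocker status via the manage-bde tool
#include <cstdio>
#include <cstring>

int main() {
    // "manage-bde -status C:" prints a "Protection Status" line per volume
    FILE* p = _popen("manage-bde -status C:", "r");
    if (!p) { printf("Could not run manage-bde\n"); return 1; }
    char line[512];
    while (fgets(line, sizeof(line), p)) {
        if (strstr(line, "Protection")) printf("%s", line);  // On or Off
    }
    _pclose(p);
    return 0;
}

A report of “Protection Off” on the system drive confirms the configuration recommended here.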

2. Memory Integrity and Core Isolation may degrade performance. I do not have objective evidence that Memory Integrity degrades TeleScript Software performance; however, my subjective assessment is that it has a negative impact, though not nearly to the degree measured from BitLocker or high-DPI displays. Memory Integrity should be considered a possible performance issue, and disabling it is worth testing. A sketch for checking its current state follows.
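To see whether Memory Integrity (HVCI) is currently on, you can read its toggle from the registry. This sketch of mine assumes the commonly documented key location, which you should verify on your own system:

// mi_check.cpp - sketch: read the Memory Integrity (HVCI) toggle
// Assumes the commonly documented registry location; verify on your system.
#include <windows.h>
#include <cstdio>

int main() {
    DWORD enabled = 0, size = sizeof(enabled);
    LONG rc = RegGetValueW(HKEY_LOCAL_MACHINE,
        L"SYSTEM\\CurrentControlSet\\Control\\DeviceGuard\\Scenarios\\"
        L"HypervisorEnforcedCodeIntegrity",
        L"Enabled", RRF_RT_REG_DWORD, nullptr, &enabled, &size);
    if (rc == ERROR_SUCCESS)
        printf("Memory Integrity is %s\n", enabled ? "ON" : "OFF");
    else
        printf("Value not found; Memory Integrity is likely OFF\n");
    return 0;
}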

3. All startup programs should be disabled. I do not suggest disabling malware protection. On the other hand, I HIGHLY recommend using PassMark software to measure the impact of anti-virus and other malware protection. Personally, I think Microsoft’s malware protection is sufficient if it is conscientiously updated for the most recent threats. Try to eliminate all applications running in the background – even innocuous-seeming system apps such as Indexing may have a significant impact. A sketch for listing per-user startup entries follows.
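Task Manager’s Startup page is the easy way to review these, but for completeness, here is a sketch of mine that enumerates the per-user Run key, one of the standard startup locations:

// startup_list.cpp - sketch: list per-user startup entries in the Run key
#include <windows.h>
#include <cstdio>

int main() {
    HKEY key;
    if (RegOpenKeyExW(HKEY_CURRENT_USER,
            L"Software\\Microsoft\\Windows\\CurrentVersion\\Run",
            0, KEY_READ, &key) != ERROR_SUCCESS)
        return 1;
    wchar_t name[256];
    BYTE data[1024];
    for (DWORD i = 0;; ++i) {
        DWORD nameLen = 256, dataLen = sizeof(data) - sizeof(wchar_t), type;
        if (RegEnumValueW(key, i, name, &nameLen, nullptr,
                          &type, data, &dataLen) != ERROR_SUCCESS)
            break;                                   // no more values
        data[dataLen] = data[dataLen + 1] = 0;       // ensure null termination
        printf("%ls = %ls\n", name, (wchar_t*)data); // command line to run
    }
    RegCloseKey(key);
    return 0;
}

(Services, scheduled tasks, and the machine-wide Run key under HKEY_LOCAL_MACHINE are other places background programs hide.)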

4. Keep the OS updated – but be aware of potential issues. Despite press reports of bugs in updates, it’s my opinion that these reports can generally be attributed to competition gone wrong. Even the highly publicized CrowdStrike incident, which many sources tried to blame on a Windows issue, was, in truth, due to the publisher’s inadequate testing. This is not to say that there are no problematic updates, or that patched systems should not be thoroughly tested before being put into use in critical applications. In general, however, updates are absolutely essential for best performance. Tests should include checks that previously set system configurations (such as BitLocker status) have not been affected by the update. As previously mentioned, drivers should be updated as frequently as needed.

5. Some settings can only be changed in the system BIOS. Just like the system drivers, the system BIOS should be kept up to date. How this is done depends upon the individual system. Some systems have detailed BIOS settings that will probably require some research to understand – manufacturers are known to use terms specific to their own architecture as if they were common, generic properties. Be certain to research any BIOS setting before you make a change, and save the existing configuration for restoration if needed.

These suggestions are largely based upon measurements using PassMark software and the experience of setting up a relatively small number of computer systems, two of which are detailed below. This is neither comprehensive – there may be other effective tweaks or configurable items – nor are the suggested items guaranteed to provide consistent results, but all the computers I have set up using these guidelines have performed as well as or better than anything I experienced with Windows 10 and earlier OS iterations.

SYSTEM 1 – Dell Latitude 9450 2-in-1 (new computer setup)

OS Name Microsoft Windows 11 Pro
Version 10.0.22631 Build 22631
System Dell Model Latitude 9450 2-in-1
Processor Intel(R) Core(TM) Ultra 7 165U, 2100 Mhz, 12 Core(s), 14 Logical Processor(s)
BIOS Version/Date Dell Inc. 1.4.1, 6/11/2024
BIOS Mode UEFI
Installed Physical Memory (RAM) 16.0 GB
Total Virtual Memory 16.1 GB
Available Virtual Memory 7.86 GB

[Display]

Adapter Type Intel(R) Graphics Family, Intel Corporation compatible
Adapter RAM 128.00 MB (134,217,728 bytes)
Driver Version 31.0.101.5382
Resolution 2560 x 1600 x 60 hertz
Bits/Pixel 32

Notes: BitLocker disabled, delivered with Windows 11 Pro, scrolling very good, occasional brief hitch (indicative of a virtual memory access) at about 10-15 second intervals, external display at 1920x1080 with no artifacts.


SYSTEM 2 – Dell Latitude 5420 (refurbished system, upgraded from Win10)

OS Name Microsoft Windows 11 Pro
Version 10.0.22631 Build 22631
System Model Latitude 5420
Processor 11th Gen Intel(R) Core(TM) i5-1145G7 @ 2.60GHz, 1498 Mhz, 4 Core(s), 8 Logical Processor(s)
BIOS Version/Date Dell Inc. 1.36.2, 4/1/2024
BIOS Mode UEFI
Installed Physical Memory (RAM) 8.00 GB

[Display]

Adapter Type Intel(R) Iris(R) Xe Graphics Family, Intel Corporation compatible
Adapter RAM 128.00 MB (134,217,728 bytes)
Driver Version 31.0.101.5333
Resolution 1920 x 1080 x 60 hertz
Bits/Pixel 32

Notes: BitLocker disabled, system updated from Windows 10, scrolling excellent on internal and external monitors.

Wednesday, August 8, 2012

A Good Article on Software Reliability

I was once asked by a non-programmer, "So you program computers? What does that mean... like you just know the right keys to press?"
I suppose you could possibly distill it to that... like the monkeys at typewriters: given enough pressings of the right keys -- say 20-50 million in the right order -- you might create a usable application.
Still, this comment, coming from a reasonably bright human, stunned me in its ignorance -- and I don't mean "ignorance" to be insulting, just descriptive. But how many of the billions of humans whose daily lives depend on computers, and computer programming, have any concept of what's involved in creating even a simple program, let alone a large, multi-featured application?
Well, certainly not most contributors to mainstream media. It's rare that I view or read anything in popular sources such as newspapers or TV regarding computers and software creation that reflects even a rudimentary understanding of the underlying concepts of automata. This piece, however, came from the New York Times. It's a good explanation of why it's so difficult to create software with no bugs.


Errant Code? It’s Not Just a Bug

We need code, and attentive human beings, to solve problems like the runaway stock trades at Knight Capital.

Thursday, December 23, 2010

Getting the Smoothest Teleprompter Scrolling

This article will tell you how to use TeleScript AV™ in a dual VGA monitor configuration to be sure your client sees the smoothest scroll on the teleprompter monitor. This discussion is fairly advanced, technically – but if you’re a teleprompting professional, these are things you really need to know!

Artifacts – what are they?

Video artifacts are, basically, anything in the display that is not intended. Artifacts may include hum bars, ghosting, and other periodic glitches.

HUM BARS are periodic “stripes” that move from top to bottom or vice versa. They are caused by “hum” induced into the video by interference from the power supply causing a “beat” with the vertical sweep in the display monitor. Common sources of hum bars are poor grounding, inadequate or broken shielding on coax cables, and running power cables in parallel and too close to signal cables. Hum bars can be hard to track down, but are generally the easiest artifact to diagnose and understand.

GHOSTING is a shadow of the primary image, usually appearing to the right or left (or both). Ghosting is usually caused by poor cables, impedance mismatch, or cable runs that are too long. The problem can usually be solved by replacing cables or reducing their length. Often, a line driver or distribution amp can solve ghosting problems.

The Artifact unique to teleprompting – Frame Sync Compression Bars

Even with the demise of CRT monitors, one powerful commonality remains: the manner in which each frame is synchronized with the display. A compression bar is an artifact caused by lack of synchronization between the computer and the display monitor; while not completely unique to teleprompting, it is the most troublesome, and most clearly visible, on a sharply delineated, steadily moving text display.

The computer maintains a “frame buffer”, which is generally an area of physical memory (RAM) provided on the user’s video card. The frame buffer is where the visual display is created by the system. The application programmer uses whatever tools are available to create the desired visual image in the memory provided. The video card then converts the frame buffer to signals that are meaningful to the display device, where the digital signal becomes an image -- and our eyes do the rest.

The images created in the frame buffer are released at a steady rate – generally 60 fps (frames per second) or 70 fps. Today’s monitors actually “converse” with the computer system and can operate at a variety of frame rates. For teleprompting, it has been common to adhere to the established NTSC rate of 60 fields per second to maintain compatibility, though this is NOT required in VGA (non-composite) systems. A sketch for querying the current rate follows.
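If you’re curious what mode and rate your system is actually running, a small Win32 sketch of my own can query the current display settings:

// refresh_rate.cpp - sketch: query the current display mode and refresh rate
#include <windows.h>
#include <cstdio>

int main() {
    DEVMODEW dm = {};
    dm.dmSize = sizeof(dm);
    // A null device name means the primary display
    if (EnumDisplaySettingsW(nullptr, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("%lu x %lu at %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    }
    return 0;
}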

Frame sync compression bars occur when a partially formed frame buffer is displayed – we see the top part of one frame and the bottom part of another. When this process is repeated, the “break” point generally moves toward the top or bottom. The visual effect to our eyes is similar to a hum bar – a line which moves up or down the screen. Frame sync compression bars are possibly the hardest artifact to understand, and certainly the hardest to tame, in the world of teleprompting.

In the paragraphs that follow, I’ll explain how the computer uses vertical and horizontal scanning to refresh pixels in the display. I’ll also describe how the VBI (vertical blanking interval) is used by the video card to prevent frame sync issues. Finally, I’ll tell how to set up TeleScript AV in a way that will totally avoid frame sync compression bars.

How the computer establishes sync with the display – the basics

Just as we were saddled with the low resolution of NTSC TV for many years because of compatibility issues, we’ve also inherited some baggage from early CRT (cathode ray tube) computer displays. CRTs create a display with a single moving electron beam which traces a very systematic path across the face of the picture tube. There, the electrons strike a “phosphor” coating which glows from the electron bombardment.

Composite video, as described by the National Television System Committee in 1953, consists of a single modulated signal which combines not only the value for the intensity of the moving electron beam, but also several signals to synchronize the end user’s display to the originating device. The display is divided into 30 frames per second – each frame consisting of 2 interlaced fields of 262.5 lines each. In actual practice, the field rate is very slightly below 60 per second – about 59.94. (The roughly 0.06 Hz difference between the 60 Hz power line and the 59.94 Hz field rate explains why 60 Hz power interference causes slowly, predictably moving hum bars in NTSC CRTs.)

Interlaced fields make small text harder to read. Thus, computer systems flatten the interlaced pattern and generally display every line on every pass, which makes them, effectively, 60 fps progressive.

The CRT electronics contain sweep generation circuitry which moves the electron beam smoothly from left to right, and from top to bottom. Each time the beam has completed one horizontal line, the signal is blanked to prevent an unsightly retrace line. A signal sent in the composite source tells the display to shut off the electron beam and to move the electron gun back to screen left. Horizontal sync tells the display that it’s time to start the beam moving screen right again.

Likewise, when the electron beam has reached the bottom right corner of its travel, the signal is blanked and moved back to the upper left of the screen. This is known as vertical retrace. When it’s time for the beam to start its travel again, a vertical sync signal is sent. The period during which the electron beam is blanked while moving back to the top of the screen is called the Vertical Blanking Interval, or VBI.

So, what does this have to do with modern displays and teleprompting?

Who knows what system computer engineers might have come up with if not for the constraints of CRTs and NTSC video. But when LCD displays became commonplace, it was still necessary to maintain compatibility with existing CRT displays. Modern video cards still use the same system… horizontal and vertical retrace, vertical blanking and all.

Of particular interest is the Vertical Blanking Interval. The designers of video editing equipment learned that all edits would have to be performed during the VBI. Otherwise, you’d see part of one scene on the top of the frame, and part of another scene on the bottom. The resulting glitch is very distracting and is not acceptable to video producers. Vertical interval switching is used on all modern video editing and switching equipment.

In teleprompter displays, the scrolling text is the equivalent of scene changes. The illusion of moving text is created by displaying successive frames, each displaced one scan line up or down from the previous frame. The glitch is exacerbated by the fact that it will be repeated with a frequency determined by the scrolling rate and the teleprompter program’s drawing rate. To get the smoothest scrolling, the display of successive frames must be synchronized with the VBI – the principle sketched in the code below.
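TeleScript’s internals aren’t shown here; as a modern illustration of the principle, DXGI lets a program block until the vertical blank of a given output before presenting the next frame. A minimal sketch of my own:

// vblank_wait.cpp - illustration of VBI-synchronized drawing using DXGI.
// Not TeleScript's actual code; a sketch of the principle. Link with dxgi.lib.
#include <windows.h>
#include <dxgi.h>

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;
    IDXGIAdapter1* adapter = nullptr;
    IDXGIOutput* output = nullptr;
    // First output of the first adapter: the primary display.
    if (factory->EnumAdapters1(0, &adapter) == S_OK &&
        adapter->EnumOutputs(0, &output) == S_OK) {
        for (int frame = 0; frame < 60; ++frame) {
            output->WaitForVBlank();  // returns at the start of vertical blank
            // ...draw the next scroll position here, so a partially updated
            //    frame is never scanned out to the display
        }
        output->Release();
    }
    if (adapter) adapter->Release();
    factory->Release();
    return 0;
}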

So what’s the problem – TeleScript AV is VBI synchronized, isn’t it?

True… all TeleScript software is VBI synchronized. At least, it is if the computer reliably reports the Vertical Retrace. Here’s where problems sneak in.

There’s a scan converter between the computer and the teleprompter display monitor.

In this case, the VGA signal may be resampled before it’s converted to composite form. If the VGA signal is 1024x768, for example, some horizontal lines must be dropped and interpolated. If the VGA is running at 60 Hz, then an occasional frame must be dropped to maintain the required 59.94 Hz NTSC vertical frequency (at a surplus of 0.06 frames per second, that’s roughly one dropped frame every 16-17 seconds). But most importantly, the VBI of the display device is no longer the same as the VBI reported by the computer’s video card. This is why it is impossible to guarantee that there will be no frame sync compression bars when using a scan converter! Removal of artifacts is entirely in the realm of the scan converter – not the application program!

So… staying away from scan converters in general, what’s the source of the problems in VGA mode?

Dual video cards

I recall my early conversations with ATI, nVidia, and other designers such as Paradise who have long since departed from the competitive arena. I asked for laptops that could simultaneously display on an external monitor and the internal screen. The problems involved in this design task are not, by any means, trivial, and the solutions were a long time in coming. However, virtually all notebook, netbook and laptop computers now feature simultaneous display.

End of problems - end of story --  right?

No… not at all. Beginning of new problems.

How portable computers create simultaneous display

Most – I’d say ALL, but that leaves room for error – modern computers that feature simultaneous display have a video subsystem with TWO video controllers. These controllers can either share a frame buffer (clone mode) or divide the video RAM and have their own frame buffers (extended desktop). Each of the two video controllers is truly independent, and the respective VBIs of the two controllers are not related or synchronized. This is GOOD because it allows you to use different manufacturers’ displays, different screen resolutions, etc. If the external video were just an electronically buffered “Y” connection, this would not be the case, and no one would be happy with the result.

Clone mode is problematic as far as vertical interval sync to the external display. Clone mode is established by a means that’s dependent on the manufacturer and the creator of the driver software. In some schemes, the video card itself uses the two controllers and frame buffers independently. It “clones” the display by “blitting” (quickly copying) the contents of the primary display’s frame buffer into the frame buffer of the secondary display. In other schemes, the video controllers are allowed to share the frame buffer, each scanning at an independent rate. The “blitting” scheme is actually capable of fewer artifacts – because a wise hardware/driver designer can specify that the blitting is performed and displayed with vertical interval sync… from both controllers. However, this is not frequently the case. Though individual manufacturers’ schemes are seldom public information, it would appear that virtually all either use the shared memory concept or use a non-synchronized blitting scheme.

In clone mode, frame sync compression bars appear on the external monitor because the video is READ by the secondary video controller but WRITTEN, and synchronized, to the VBI of the primary controller. Consequently, the secondary controller displays the frame buffer as it is being refreshed, resulting in a frame sync compression bar on the display.

Finally, the best bet for avoiding frame sync compression bars: use the extended desktop mode with TeleScript AV. The application senses which video controller is servicing the external display and synchronizes with that VBI (the sketch below shows the idea). This inquiry is done each time you move the window, so it’s generally transparent to the user. In extended desktop mode, using a VGA-DA to feed the prompter, and with a confidence monitor, the display will be synced with the external video controller’s VBI. Use the primary display for the Runlist, auxiliary on-screen dialog controller, timer, Find-Replace dialog – whatever will help you to be the best operator.
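For the curious, here is a sketch of my own (not TeleScript’s source) showing how a program can locate the output that services the monitor its window occupies, so it can wait on the right VBI:

// find_output.cpp - sketch: locate the DXGI output servicing a given window,
// a modern take on "sense which controller drives the external display".
// Link with dxgi.lib and user32.lib.
#include <windows.h>
#include <dxgi.h>
#include <cstdio>

IDXGIOutput* FindOutputForWindow(HWND hwnd) {
    HMONITOR mon = MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST);
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return nullptr;
    IDXGIOutput* found = nullptr;
    IDXGIAdapter1* adapter = nullptr;
    for (UINT a = 0; !found && factory->EnumAdapters1(a, &adapter) == S_OK; ++a) {
        IDXGIOutput* output = nullptr;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) == S_OK; ++o) {
            DXGI_OUTPUT_DESC desc;
            if (SUCCEEDED(output->GetDesc(&desc)) && desc.Monitor == mon) {
                found = output;   // caller can wait on found->WaitForVBlank()
                break;
            }
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return found;  // caller must Release() when done
}

int main() {
    HWND hwnd = GetConsoleWindow();  // any window will do for this demo
    IDXGIOutput* out = FindOutputForWindow(hwnd);
    if (out) {
        DXGI_OUTPUT_DESC d;
        out->GetDesc(&d);
        printf("Window is on output: %ls\n", d.DeviceName);
        out->Release();
    }
    return 0;
}

Calling a routine like this on the prompter window, then waiting on the returned output’s vertical blank before each draw, is one way to achieve the kind of synchronization described above.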

Sunday, August 22, 2010

My Dirty Little Secret

I actually have two dirty little secrets:

1. I'm a professional musician and I write computer programs; and,
2. I'm a professional computer programmer and I play the guitar professionally.

There... thank gawd I've got that off my chest.

In our right brain, left brain society, few people consider that there can be hideous hybrids among us who actually use both lobes of the brain, sometimes... gasp... AT THE SAME TIME!!! In my case, it's exacerbated by occasionally writing assembly code and dabbling in hillbilly music... dang ol' microprocessors anyway.

I don't even remember which activities are supposedly consigned to which brain half. I do know that, whether this is fair or not, a certain amount of distrust is created in others when they learn that you're not like them. The art students learn their faithful, former friend took a business class! The accountant's client realizes his once reliable reckoner writes poetry! Ah... the betrayal is beyond belief.

With a passion that I've only seen elsewhere in Canadians who discover other famous Canadians, I have collected a small, but representative group of these left/right brained mutants who threaten the fabric of society. Ready to name names?

Leonardo da Vinci -- painter, sculptor, mathematician, scientist; so diverse in his activities that the term Renaissance Man is virtually synonymous with his name. Don't you just wonder what sort of music he might gravitate towards in today's world... I'm guessin' Hillbilly.

Robert Pirsig -- though maybe a little inclined toward the technical side, his Zen and the Art of Motorcycle Maintenance is the primer on the pathology of Bilobalism. (Note: I just made up this word and I claim common law copyright as of this moment!)

Charles Ives -- American modernist composer and pioneer in (gasp) life insurance and estate planning. If this weren't shameful enough, Ives was also a standout athlete at Yale. He was fond of Stephen Foster's music, which was certainly the hillbilly of its time.

Brian May -- writer of "Fat Bottomed Girls" and "We Will Rock You", former Queen guitarist, astrophysicist and current Chancellor of Liverpool John Moores University. "Betrayer of purity... we will rock you, indeed! Let us begin the gathering of the stones!", shouts the rabble in Monty Python voices. "Crazy Little Thing Called Love" is arguably akin to hillbilly.

Hedy Lamarr - Paragon of beauty and major contract star during MGM's "Golden Age". She was a math whiz and developed the basis for spread-spectrum communications technology used in WiFi and Bluetooth. Lamarr co-starred with Bob Cummings, who was born in Joplin, MO, Gateway to the Ozarks... you can't get more hillbilly than that!

And here's a good collection of Musicians with PhDs.

Thok Have Mammoth on Face -- Stone Age Programming

Welcome to my computer programming blog. I'm a first -- maybe second, depending on definitions -- generation programmer. If the "art" of programming were, say, visual art, my entry time would be about the equivalent of the point where prehistoric humans began to create images with dyes derived from plants, rather than just to chisel images on rocks using harder rocks.

My first programming language was FORTRAN IV. This sounds like it might be quite a ways up the generational ladder, and, in computer life terms where a second is like an eon, it is. However, in human years, the interval between the initial introduction of FORTRAN (IBM's Mathematical Formula Translation System) and the arrival of FORTRAN IV was only a blink -- less than the elapsed time between the introduction of the iPod and the iPad. In fact, when I was diligently creating programs in FORTRAN IV, it had already been replaced by FORTRAN 66, though this was primarily a standardized version of FORTRAN IV.

FORTRAN IV or FORTRAN Googol -- it didn't matter to me... I quickly moved on to much cooler computer concepts -- but more about that later. For now, let me assert that I discovered not only did the computer do things for me that I really hated to do for myself -- like long division -- but that I was really good at writing programs. This awakening was akin to something I'd experienced earlier when I first played an electric guitar. It just clicked. My set of genes was custom tailored to the task.

To appreciate how big a difference the computer made in the life of scientists, engineers, and, most importantly to me, college students, you have to consider the timeline. The first affordable personal desktop calculators were not available until the early seventies. My first one (which I still have, by the way) was a Radio Shack EC-375... nice big buttons, with several simple scientific-like capabilities such as square root. By this time, however, my years of cranking out the solutions to dreaded "word problems" were a hideous memory. Take your typical basic physics word problem (PLEASE, in Henny Youngman voice). It might require a few seconds to see how to get to the answer... but another 10 minutes to grind through to the ten-digit number which represented the continuation of your student deferment. If you don't know what I'm talking about, just consider yourself lucky -- both about the ten-digit number and the student deferment.

To me, it was fascinating that a means existed of describing to a machine how to work a problem, and then the machine would do the heavy lifting and spit out the answer in a millisecond or two. But how did the machine do this? All I knew at the outset was that I could type my program on a keypunch, give a stack of cards to an acne-scarred sociopath with B.O. (called a sysop), and only a day or two later, get a ten-pound sheaf of paper which somewhere within contained information about the errors that, if corrected by repeating the keypunch/submission process, would provide the very answer that I requested.

There were issues here that very soon caused a hypersensitivity in me. The keypunch was like the Godzilla of typewriters. The user would insert a bunch of IBM-style punch cards into a hopper, and the machine would punch holes (in Hollerith code, named for Herman Hollerith, the founder of the company that became IBM) that could be read by the computer's card reader. The sound created by the keypunch for each keypress was similar to the sound of a bad snare drum struck way too hard by an inexperienced drummer. I was already well acquainted with this noise by dint of a decade of participation in garage bands. An entire room of keypunches operating simultaneously was the drum corps nightmare that only became reality in the hands of Fleetwood Mac years later. Oh yeah... and the machines were always busy, requiring at least an hour of waiting time.

The noise and wait were a simple necessity. However, nothing could have prepared me for the anxiety produced in anticipation of potential damage to the deck of cards, sometimes numbering in the thousands, that could be inflicted by rain, wind, beer (always beer), or, gawd forbid, dropping the deck, particularly when said dropping was accompanied by rain, wind or beer. After one dropping-in-beer episode (a DIBE), I learned to insert sequence numbers on the cards. There was a card sorter that could read the sequence numbers and re-order the deck after the damaged cards were retyped. Yikes... do you appreciate your Netbook a little more now?

The big mystery to me, though, was what did the computer do? How the hell did it make sense of those little punches, do the arithmetic, store the data, and print error diagnostics (usually) and answers (occasionally)? For my life, this was a jumping-off point -- much like the reaction to the sound of the electric guitar's E-string that I first heard in 1957. This jumping-off point, surprisingly, led me back in time, not forward, for a certain period. From that point, I could move forward with a foundation in the basics.

In future blog articles, I hope to give non-technical readers an idea of what goes on in the computer, and just what manifestation of the mental malady called computer programming has led to writing this blog.