Glossary of Computer Industry Terms, Companies and other stuff

By Nick Holland, with possible help from other people, who will be credited when they provided me with stuff!
Last update: 6-27-2007

This is NOT any attempt to be an all-inclusive computer industry glossary.  In fact, to be honest, it might be more humorous than useful, but if you aren't careful, you might learn something...

A  B  C  D  E  F  G  H  I  J  K  L  M  N  O  P  Q  R  S  T  U  V  W  X  Y  Z


Apple: One of the older "home" computer manufacturers still producing computers.  Noted for many things, including re-inventing systems which all the rest of the world thought were standardized in the name of "Innovation", using software to do what the rest of the industry did in hardware, and making many features we take for granted in today's computers popular in the mass market (and often wrongly claiming credit for their invention instead of just their popularization).  Once considered a great evil in the computer industry for their vicious lawsuits over "look and feel", now considered the "good guys" and "victims" of the former good guys and victims.

Atari: Maker of early computers such as the Atari 400 and 800, which brought computers home for low prices, and later the Atari ST, which again earned attention for power at a low price, but also at a low quality...

Altair: One of the very earliest computers, although your typical kid would in no way recognize it as a computer, for it basically consisted of a large number of lights and switches.  Very good at balancing the checkbook -- just set all switches to zero, for it pretty well drained your checking account when it came out in the mid 1970s.  For reference, an Altair with 4k of RAM was a BIG system...

Amiga: A computer originally developed by an independent company, which was purchased by Commodore before it hit the marketplace in the mid-to-late 1980s.  A very graphical machine, it developed a very loyal following even before it hit the market.  The temptation would be to compare the fierce loyalty of Amiga owners to that of Macintosh owners, but while the loyalty is comparable, most Amiga fans were poor college students, whereas the Macintosh had a sizable following in business.  Hint: Poor college students are POOR!  Not a good market for most products!  A very notable trait of the early Amigas: They did their graphics using NTSC (U.S. television) timing standards, which meant with VERY little effort, they could be used to create very good animations and graphic displays which could be recorded OR BROADCAST using standard video equipment.  One of the only business niches the Amiga seemed to have penetrated was the TV news and weather broadcast business as a graphics generation system.  The company (Commodore) faded a few years back, but the Amiga name is making some kind of comeback, exactly what, I'm not sure.  I could look it up, but based on past track records, I must say don't believe ANYTHING until it ships and is evaluated.  Microsoft may have learned much of what they know of vaporware and unkept promises from Amiga.

Asynchronous Communications, or ASYNC:  Typically, RS-232 serial communications.  The characters can be sent or received at any time, without regard to fitting into a rigid timing frame.  This is done by means of a start bit and a stop bit, transitions which indicate the data is coming.  While it is convenient for many applications and the protocols are simple, the start and stop bits add approximately 20% of overhead to the data stream.
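The framing can be sketched in a few lines of Python (my own illustration of common "8-N-1" framing, not any particular UART's behavior): ten bits on the wire carry eight bits of data, which is where the 20% figure comes from.

```python
def frame_byte(value):
    """Return the line states (0/1) for one byte, 8-N-1 async framing."""
    bits = [0]                                    # start bit: line pulled low
    bits += [(value >> i) & 1 for i in range(8)]  # 8 data bits, LSB first
    bits.append(1)                                # stop bit: line idles high
    return bits

frame = frame_byte(ord('A'))                  # 'A' = 0x41 = 0b01000001
overhead = (len(frame) - 8) / len(frame)      # 2 framing bits out of 10 = 20%
```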

ATM:  Probably the most over-used three letters in the computer industry.  Most commonly: Automatic Teller Machine:  Those wondrous devices which dispense money for new computer stuff on demand.  Unfortunately, using computer technology, they tend to do TOO accurate a job of tracking the money you extract from them.
    Asynchronous Transfer Mode:  One of those Computer Industry three-letter combinations that when you find out what it means, doesn't help you in the slightest.  ATM is the Up and Coming thing in high-speed network backbones and will soon be attaching all computers in the world.  And, it has been this for at least five years (keep in mind that a computer "generation" is about two years, so ATM has been right around the corner for something like 50 "computer years").  Yeah, right.  ATM probably deserves the award for the product with the most hype that actually happened, and the whole world still said "yeah, so what?".  Everything else keeps getting faster, better, and cheaper faster than ATM.
    Adobe Type Manager: Now somewhat forgotten, ATM is a technology Adobe came out with to display PostScript fonts on your computer's screen and to allow them to be printed to non-PostScript printers.
    For quite some time, I was trying to figure out what Adobe Type Manager or Automatic Teller Machines had to do with networking...

Artificial Intelligence:  Years ago, it was hoped that computers would someday be taught to think, and this would be Artificial Intelligence or "AI".  In the 1980s, as computers steadfastly refused to think for themselves and instead insisted upon doing as instructed, AI was given a new definition: Being able to do things that couldn't be done by computers a few years ago, and thus, AI became a success!  People in the computer industry rarely admit defeat, they just redefine success.  Now, once again, people are starting to talk about "thinking computers".  Personally, the more we learn about how the brain works, the more I believe the digital computer won't be able to do it, any more than a hand saw can be used as a delivery truck.  Wrong tool for the job.


Bit: a  "Binary digIT", the basic unit of digital information.  As a digital signal, a single bit represents a logical True or False, a numeric One or Zero, a state of being On or Off, a high or low voltage, and I'm sure other things I'm not thinking of at this time.  Computers, however, usually consider a bit to be a logical True or False, or a quantity one or zero.  This is worth spending some time on, for without an understanding of the basic unit of digital information, pretty much everything else in a computer is hopelessly abstract.

Byte: A collection of 8 bits running around together in some sort of order.  Often, one byte can represent one character of text.  Since a byte is 8 bits, and each bit has two possible states, there are 2^8 = 256 distinct single byte values, for example, 0 through 255 (or -128 to 127, or 7 through 262, or 256 different colors, or anything else you wish to define the bit combinations to mean).  Note:  Some older (1970s vintage) literature defines the byte as the basic unit of information of a particular computer design.  So, by this definition, a PDP-11 has a 16 bit byte, a PDP-8 has a 12 bit byte, an IBM PC has either an 8 or 16 bit byte (depending on what you count), a Commodore 64 has an 8 bit byte, etc.  In the late 70s, some kind of consensus was reached that a byte is 8 bits, and the basic unit of information of a processor became defined as the "word length".  In the 1980s, an attempt was made to define a "word" as two bytes, or 16 bits, but if someone says "word" and you really care, you might want to ask how big of a word they mean.
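The arithmetic above can be checked directly; the two's-complement helper below is my own illustration of the signed (-128 to 127) interpretation, not the only possible one.

```python
distinct_values = 2 ** 8      # every possible pattern of 8 bits

def as_signed(b):
    """Interpret an unsigned byte value (0-255) as two's-complement signed."""
    return b - 256 if b >= 128 else b
```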

Bug: What we have when a computer doesn't work as we hope.  According to computer lore, the term came from a problem discovered in the Mark II electro-mechanical computer at Harvard in the 1940s.  A program malfunction was traced to a moth which had been smashed in the relay contacts, preventing their proper operation.  This means the first debugging tool was a pair of tweezers.  (An early small computer debugging tool was named "DDT").  This story may be true, but it isn't likely to be the source of the word "bug" in systems failures, as I have seen people cite references to "bugs in the system" long before this story allegedly took place.


CMOS: Properly: Complementary Metal Oxide Semiconductor.  An integrated circuit fabrication process which produces components which use very little power, in theory using power only when switching states, and in practice, amazingly close to the theory.  For this reason, it is the preferred technology for most modern computer circuitry.  Improperly:  The battery-backed memory which stores configuration information in an IBM PC compatible.  YES, this particular circuit does use CMOS technology, but as already pointed out, in a modern PC, the entire computer is CMOS.  A better term would be battery-backed memory, or configuration memory.

Consultant: An otherwise unemployable jerk, brought in by management to tell them the exact same things their underlings have been saying for months, but since they paid this guy somewhere between $100 and $500/hr to tell them, they are more likely to listen to his faulty advice than to listen to the faulty advice of their employees.

Cracker: One who gets pleasure out of breaking into or otherwise damaging other people's computer systems.  The truly ignorant (including the media) praise the skills of these people.  Sorry, but the computer skills of a typical cracker are about on par with the artistic skills of the typical graffiti vandal.  Most work off scripts guiding them step by step through the process of breaking into the target machine, or use the ignorance of the maintainers of the system to achieve their goal (one does not praise the cleverness of a burglar who noticed the back door was unlocked, and the key to the front door was under the door mat!)  These people deserve no praise; if a cracker wishes to prove they are skilled with the computer, there are many things they can do that are creative and productive.  Clever and creative people don't have to turn to destruction and vandalism.  Sometimes, the term "Hacker" is used synonymously; this is wrong.  Hackers are the creative and constructive true masters of the computer.  They have better things to do than to crack systems...unless it is at the owner's request as a security check.  See Script Kiddie

Caffeine: One of the Geek's four basic food groups.  The others include fat, sugar, and cholesterol.

CP/M:   "Control Program/Monitor", the very popular disk operating system developed by Digital Research.  CP/M-80 was an 8 bit OS capable of running on 8080 processors with as little as 16K of RAM (the more famous CP/M v2.2 required 32K).  Typically, the OS took up less than 15k of RAM, which sounds impressive until you remember the processor maxed out at 64k.  Later versions of this well-designed and well-implemented OS included MP/M (a multi-user version!), CP/M-86 (16 bit version for 8086), CP/M 68k (for the Motorola 68000 series), Concurrent CP/M (Multi-tasking), Concurrent DOS (multi-tasking with MS-DOS emulation and windowing).


Data:  Information.  Often, the stuff computers crunch on.  One could divide the stuff computers work with into two things: Data and Programs.  Data being the information which is being processed, programs being the stuff that controls what is done to the data.  One of the on-going arguments of the 20th century was "Is 'data' singular or plural?"  Answer: Whatever sounds best, and if this is all you have to worry about, your life is doing quite well.

Disk:  A storage system for computers.  There are two types of disks (not discs...) in common use on computers: Floppy and Hard.  Floppy disks are the limited storage devices used to initially boot up your computer, and used by many as a convenient way to lose data and spread viruses.  The name "Floppy" comes from the flexible mylar used as a backing for the magnetic material, whereas hard disks use rigid aluminum platters.  Many people mistakenly consider the 3.5" floppies "Hard disks" because of their "hard" shell surrounding the media, compared to the old 5.25" and 8" floppies, which were very clearly floppy.  This is incorrect, but understandable if you have never shredded a 3.5" disk to see the guts.

Disc:  Apparently, CD-ROMs are 'Discs', hard drives and floppies are 'Disks'.  I suppose there should be some elaboration on this, but I think it is silly.  In my book, a disk is a round, flat thing; a 'Disc' is a round, flat thing attempting to look sophisticated.

Dynamic RAM: The most popular kind of memory in computers today.  There are two basic divisions of RAM: Static and Dynamic.  Static RAM holds its data in a logic gate called a "Flip-Flop", which holds the value as long as power is applied.  In other words, the data is held accurately as long as power remains applied and it isn't told to change the data; imagine a good light switch.  You forcibly flip the switch from one position to the other, and it stays there.  Dynamic RAM, on the other hand, stores data in tiny capacitors, which lose their charge over time (typically, 1-2ms -- that's not much time for a person, but for a computer, it is a relatively long time period).  Imagine here a cheap light switch -- you flip it up to on, and it starts to sag back down to off.  So, you have to keep going back, and if the switch is on, you have to push it back up towards on, and do your work in between runs to the switch.  Dynamic RAM sounds very inefficient, and it DOES take a performance penalty on the system, but it has MUCH greater chip densities than Static RAM, meaning lower costs and greater storage, so Dynamic RAM is the standard.  Now some people might say "Oh, but DRAM isn't the standard any more!"  Well, before the current age of micro-specializing terms that used to be more broad, Dynamic RAM was just any kind of RAM which required periodic refreshes.  Now, they have subdivided DRAM into SDRAM, EDO DRAM, RDRAM, etc.  Yes, they are substantially different in compatibility, but they use much the same basic storage technology (the primary difference is in how the data is retrieved from the storage and delivered to the processor).
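The sagging-light-switch analogy can be written as a toy simulation; the decay rate and threshold below are made-up illustration numbers, not real device physics.

```python
THRESHOLD = 0.5   # charge level below which the cell reads as a 0

class DramCell:
    """A toy 'leaky' DRAM cell: charge decays unless refreshed."""
    def __init__(self):
        self.charge = 0.0
    def write(self, bit):
        self.charge = 1.0 if bit else 0.0
    def tick(self):
        self.charge *= 0.7          # charge leaks away over time
    def refresh(self):
        if self.charge > THRESHOLD: # read the fading value...
            self.charge = 1.0       # ...and write it back at full strength
    def read(self):
        return 1 if self.charge > THRESHOLD else 0

refreshed, neglected = DramCell(), DramCell()
refreshed.write(1)
neglected.write(1)
for _ in range(10):
    refreshed.tick()
    refreshed.refresh()   # the periodic "run to the switch"
    neglected.tick()      # never refreshed: the bit fades out
```

After ten ticks, the refreshed cell still reads as a 1, while the neglected one has decayed to a 0.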


EPROM: Erasable, Programmable Read Only Memory.  See ROM, PROM, FLASH ROMs, EEPROM.  Most computers have used EPROMs to store their initial boot loaders for most of recorded history (or at least the mid 1970s).  This initially seems rather strange, as the EPROM is never erased in most older systems (the advent of "Flash ROM" has changed this in the last five or so years).  The story is that an EPROM was used not for the convenience of the user, but of the manufacturer.  The alternative was the Mask Programmable ROM, which was cheaper in bulk than EPROMs, but much more expensive in small quantities, having a high setup cost.  Manufacturers generally found they were changing the ROMs so often to add new features or remove bugs that very few would ever see the benefit of mass-produced mask programmed ROMs.  For this reason, usually they chose to factory program the EPROM, and if a bug was found, erase the pre-programmed EPROMs and reprogram them.  As the manufacturing costs have dropped for EPROM chips, a curious device has surfaced: the non-erasable EPROM.  These chips are programmable on a standard EPROM programmer, but they are not erasable, as they lack the quartz window used to erase them.
      EPROMs are erased by exposure to ultraviolet light.  Sunlight does the job over a longer period of time, and fluorescent lights can erase them over many months of exposure -- that sticker covering the window isn't there to hide the pretty contents of the chip, but to keep it from erasing in light!  The entire EPROM is erased in one shot, although it can be programmed a little at a time, if desired.
     EPROMs store their data by holding a charge on a floating gate -- basically, a capacitor which holds its charge for a very long time.  The "capacitor" is charged by a higher-than-normal voltage applied in a particular way.  They are a limited-life device -- manufacturers typically suggest a life expectancy of around 10 years; experience shows this is a conservative estimate, but don't count on an EPROM for indefinite storage.

EEPROM:  Electrically Erasable Programmable Read Only Memory.  VERY similar to an EPROM, except the chip can be erased by electricity, rather than with ultraviolet light.  Unlike Flash memory, an EEPROM can typically be erased and rewritten a single byte at a time, rather than a whole bank at once.

Endless Loop: See Loop, Endless


Flash Memory: A variant on EEPROMs, where banks of the chip are erased at once.  This type of chip has become popular for computer ROMs, offering "easy" field reprogramming.  The process of reprogramming the ROMs on a computer has now started to be called "Flashing the ROMs".  Not exactly proper usage (as flashing the ROMs is erasing them, not reprogramming them!).  They are also commonly used in things like digital cameras or MP3 players, where data can be stored for a long period of time with no power consumption from the batteries, and yet, individual items can be deleted and replaced.


'G' or 'Giga' (pronounced with a hard 'g', like 'gone' or 'goofy'...yes, by the rules of English, it should be a soft 'g', I have no explanation): Geek suffix for 2^30 = 1,073,741,824 or, more commonly, one billion (1,000,000,000).  When measuring hard disks, assume it means one billion.  When measuring RAM, assume it means 2^30.  See rant on 'K' and 'M'.

Geek: Social undesirable who, if not for the computer industry, would often be considered unemployable, but because of the computer industry, they often take home more money than people who work for a living.

Google: The "standard" Internet Search Engine. Originated the concept of the "just search" screen, with minimal graphics, and minimal "other stuff". Gone from being "just another search engine" to being a verb ("to Google" for something, as in "Google for it, dammit!"), and the closest thing to the index of all human knowledge. Also provides many other services, such as mail, mapping, advertising, and much more.


Hard Disk: Mass storage device.  Holds programs and data on most computer systems.  Data is stored on iron oxide coated aluminum platters, which are typically spun at speeds between 3600RPM and 11,000RPM.  Data is stored and recovered by movable heads, which store data on a large number of tracks on the platters.  The platters are often stacked; a hard disk drive might have anywhere between one and ten platters, with two heads on each (top and bottom).  Data is stored as a set of concentric rings (tracks) broken up into a fixed or variable number of sectors.  All the data from the same track of each surface on each platter is collectively called a cylinder, and about here, I bet you really wish I had a picture handy...
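In place of the missing picture, here is the arithmetic: capacity is cylinders times heads times sectors per track times bytes per sector.  The geometry below is a made-up example (it happens to land on the old "504 meg" BIOS limit), not any particular drive.

```python
cylinders = 1024          # concentric track positions the heads can seek to
heads = 16                # 8 platters, two surfaces (and heads) each
sectors = 63              # sectors per track, fixed-sector layout
bytes_per_sector = 512    # the usual sector size

capacity = cylinders * heads * sectors * bytes_per_sector
megs = capacity // 2 ** 20    # capacity in 2^20-byte megabytes
```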

Hacker: A person who has mastered the art and science of making computers and software do much more than the original designers intended.  A true hacker is to be respected, even if socially a bit undesirable.  Not to be confused with "crackers".  A true hacker can find plenty of constructive projects to work on, breaking things is more a mark of children (of any age).


Intel: A designer and manufacturer of microprocessors.  After a rather ignorant but well-read computer columnist labeled their 80286 processor used in the IBM AT "brain dead" because it didn't emulate the inferior 8088 (a chip Intel was attempting to phase out) absolutely perfectly, they learned the way to market success wasn't to build a technologically good chip, but a marketable chip.  The consumer has been paying ever since, getting exactly what they deserve: a very souped-up version of the 1972 vintage 8008 processor.

Internet:  The most effective productivity destroyer in business today.  Designed originally as a way for geeks to communicate with each other, it was later found that geeks were barely able to communicate, so it was opened up for the general public.


Junk:  The quality of products produced by the computer industry today.


'K':  Geek suffix for 2^10 = 1024.  Under the Metric system, it is the abbreviation for 'Kilo' or 1000.  Very early on in the computer industry, it was discovered that 2^10 was conveniently close enough to 1000 to be adopted by the computer industry.  As storage capacities grew, those 24 extra bytes per 'k' started adding up, to the point that by the later 1970s, it was not uncommon to see some computers advertising a memory capacity of 64k, others advertising RAM capacities of 65k, and yet they were both referring to the same capacity.  64k = 64 x 2^10 = 2^16 = 65,536, which sure looks like 65 thousand, and thus, 65k, even though it would take 64 1K banks of RAM to get to that point.  By 1980, industry people had decided that a 'k' was 1024.  Period.  At that time, engineers ruled the industry.  By the 1990s, the marketing departments ran the industry, and things changed when it was time to talk about 'M' and 'G'.
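The 64k-versus-65k confusion, worked out: both ads describe the same 65,536 bytes, one counting in 1024-byte 'k's, the other rounding down to thousands.

```python
KB = 2 ** 10                  # one "geek" k
sixty_four_k = 64 * KB        # sixty-four 1K banks of RAM
as_thousands = sixty_four_k // 1000   # ...which reads as "65k" in decimal
```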


Link:  Connecting one thing to another.  A common use is in programming, where the linking stage of compiling connects all the modules of the program into a runnable chunk of code.  Another use is in file systems -- a "link" is a connection between things in the file system.  A "symbolic link" in Unix looks like a file, but actually is just a "tag" which points to the actual file.  This way, one file can appear to be in several different places at once... sometimes very useful, other times very confusing.
    Entertainingly, in my first pass at writing an entry for the word "link", I neglected to think of a Link on a web page.  On the other hand, if you are not familiar with links on web pages, you would probably NOT be here right now...
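The file-system sort of link can be demonstrated in a few lines of Python (Unix-style; the file names here are made up, and a temporary directory is used so the example cleans up after itself):

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    real = os.path.join(d, "real.txt")
    link = os.path.join(d, "alias.txt")
    with open(real, "w") as f:
        f.write("the actual data")
    os.symlink(real, link)            # the "tag" pointing at the real file
    was_link = os.path.islink(link)   # the link is not a regular file itself
    with open(link) as f:             # reading the link follows the pointer
        via_link = f.read()
```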

Loop, Endless:  See Endless Loop.


'M', 'Meg', 'Mega': One of two things: Either 10^6 = 1,000,000 or 1Kx1K = 2^20 = 1,048,576.  If you are talking about memory, 1M is 2^20.  Before the mid 1990s, if you were talking about hard disks, 1M also was 2^20.  In the mid 1990s, however, several hard disk manufacturers discovered that if they re-defined 1M to be 1,000,000, they could make their hard disks look bigger without any engineering changes.  As you can imagine, as soon as the rest of the industry realized these manufacturers didn't get penalized in the marketplace, but rather got rewarded with increased sales, they did it too.  So now, we have two totally different definitions of 'M', based on what it is you are measuring.  Measuring RAM (which is highly influenced by powers of 2)? 1M = 2^20.  Measuring disk space (where data is organized in ways not clearly influenced by powers of 2)? 1M = 1,000,000.  The marketing jerks have managed a partial victory, but the geeks have held on to some degree.  This could be considered something like nautical miles and "land" miles: measuring similar but different things with different but similar units, and confusion of the units or the item being measured can drive you nuts.
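Worked out for a hypothetical drive advertised as "500 MB": the marketing megabyte and the memory megabyte disagree by almost 5%.

```python
MB_marketing = 1_000_000    # the decimal megabyte on the box
MB_binary = 2 ** 20         # the 1,048,576-byte megabyte RAM uses

drive_bytes = 500 * MB_marketing      # what the sticker promises
real_megs = drive_bytes // MB_binary  # what your OS is likely to report
```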

Microsoft: Originator of the mass market BASIC interpreter.  At one point, someone had counted MS as being responsible for over 200 different dialects of the BASIC language on a host of very different computers (and that was BEFORE the IBM PC came out!).  Legend has it that Bill Gates's first BASIC interpreter (for the Altair) was shipped sans copyright notice, and thus was freely distributed at the computer clubs in the mid 1970s, probably spreading the MS name far further than it ever would have if Bill had remembered to copyright his $150 program.  In 1981, they purchased the code for what became MS-DOS v1, basically a 16 bit re-write and strip-down of CP/M-80.  The company also deserves credit for a FORTRAN compiler which gave erroneous results on simple math problems, Windows v1, v2, 386, v3.0, and the better-known 3.1, 95, 98, NT, and 2000.  Noted for blatantly unfair monopolistic trade practices, and software which sells better than it works.


Nibble:  A virtually unused term now; rarely used ever, actually.  Refers to half a byte, or four bits.  Under 'Byte', I mentioned that once upon a time, the size of a byte was actually not eight bits, but the word size of the computer... this leads to the question: was a nibble four bits, or half a word?  I don't have an answer for this; I *think* it was four bits, but I don't recall having seen anything of that vintage to indicate a firm definition either way.  This is the computer industry, we make it up as we go along.

Nerd: Something akin to the term 'Geek'; the exact distinction has been lost (at least by me, and this is my list!).  Ages and ages ago, a group of friends, some of them Electrical Engineering majors and others Computer Science majors, decided that EEs were one, and CS majors were the other.  I don't recall which was which.  I have also seen it said that a nerd is someone who is immersed in technology, and a geek is a nerd that enjoys it.  Whatever.


Open Architecture:  Something good, but rarely defined past that.  Our system is Open Architecture; the competition's system is not.  Few people will argue with that definition.  I've seen almost every computer, including the IBM AS/400, called "Open Architecture" at one time or another.  It is supposed to mean something about being able to put other manufacturers' hardware and software on a system, but truly closed systems no longer exist, so now it is mostly a way of self-praise and competitor bashing.

Open Software: 1) Software published with complete source code.  2) A religion practiced by advocates of definition 1.  Advocates of Open Software say that because the source code to the programs is in the user's hands, bugs can be fixed by talented users, resulting in better quality software.  Critics of the Open Software movement point out that there is no financial motivation for the programmers to improve or perfect their programs.  Both sides have some points; although the Open Software people are pretty clearly ahead in quality of product, the traditional software publishers are certainly making more money, and probably making a product more "user friendly".


Plug-and-Play: A technology developed by Microsoft and Intel to make people like me a lot of money.  Also a damned lie.  See Truth in Advertising.

PC:  Originally, a "Personal Computer", a computer used by one person, typically small enough to fit on or under a desk.  IBM adopted the name for their first commercially successful small computer, the IBM PC.  It has since come to mean a computer crippled by the 1981 design IBM came up with, as in "Do you prefer PCs or Macs?"

PCI: The "modern" high-performance microcomputer expansion bus.  Characteristics: Resources are allocated by slot (unlike the ISA bus, where the slots were indistinguishable from each other).  32 Bit bus.  Some processor independence (Intel, PowerPC, Alpha, etc).  132MBps theoretical data transfer rate.
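For the curious, the 132MBps figure falls out of the arithmetic: a 32 bit bus moves 4 bytes per clock, and the (nominal) PCI clock is 33 MHz.

```python
bus_width_bytes = 32 // 8       # 32-bit bus = 4 bytes per transfer
clock_hz = 33_000_000           # nominal 33 MHz PCI clock
peak_bytes_per_sec = bus_width_bytes * clock_hz   # best case: one word/clock
```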

PROM: Programmable, Read Only Memory.  A ROM which can be (typically) programmed at the factory (and sometimes the field).  Generically, it is any ROM whose data could be set sometime after the chip is fabricated.  It often means, however, "fusible link" PROMs, where the data is stored in microscopic "fuses" which can be blown (with a high voltage/current pulse)  or left intact to represent the desired data.  Almost never seen anymore, for their programming time was long (heat), the costs were high, and EPROMs got cheap.


QWERTY: A name given to the standard US keyboard layout, named after the "first" six letters on the keyboard.  Some claim the keyboard layout was chosen to slow the typist so they wouldn't jam the early, slow typewriter mechanism.  These people are usually trying to sell you a keyboard with a different layout.   Later reports declare this explanation as nonsense.  The speed difference found by those who learn more "scientifically" designed keyboards is usually real, but very small, around 10%.


RAM: Random Access Memory.  That didn't help much.  I really dislike it when people throw around three and four letters and assume everyone knows what they mean.  On the other hand, RAM is probably best just pronounced "RAM", the explanation doesn't really help the layperson, and to the technical person, the term is too vague.  RAM is the changeable storage in your computer, could also be called the "memory" of the computer, although this is also a term which doesn't warrant too close of an examination.  The Random Access part of the title indicates that any block may be accessed in any order without a huge speed penalty (as opposed to old technology, such as mercury delay lines, or shift registers, which put data out in a certain order, and if you wanted a particular piece of data, wait for it).  RAM has come to mean the read/write memory used in computers to store transient (i.e., not permanently loaded) programs and data, the stuff that typically goes "bye-bye" when the power goes out.  And, there are exceptions to virtually anything in this entry.  Frustrating thing to define, actually.

RTFM: Geek-speak for Read The Flaming Manual.  Exasperated cry of many a weary support person.  Other words are often substituted for Flaming...

Recursion:  See Recursion.

ROM:  Read Only Memory.  Memory which holds programs and data which can not be changed, and maintains its data without power.  Generically, these cover PROMs, EPROMs, EEPROMs, etc., but often, it means specifically mask-programmed ROMs.  These ROMs are very cheap, but they require huge quantities of identical chips, for the program is actually encoded in the masks which are used to fabricate the ROMs.  Mask-programmed ROMs have virtually fallen out of existence, which is unfortunate, as they are one of the only truly permanent storage media, but only standard-setting manufacturers (IBM, Apple) could normally justify the setup costs and permanence of design of mask-programmed ROMs.


SCSI:  Small Computer Systems Interface.  A popular interface used to attach "small" computers to devices such as hard disks, tape drives, CD-ROM and other mass storage devices, scanners, and less commonly, printers, terminals, and other devices.  See also Wide SCSI, Ultra SCSI.

Static RAM:  RAM which holds its data with nothing more than the application of power (see Dynamic RAM).  Typically used only where the additional performance or lower power consumption is more important than the higher cost and lower density.

Script Kiddie:  A cracker, typically rather young of either body or mind (hence, the 'kiddie' portion), who delights in breaking into people's computers by following someone else's script.  They typically have no understanding of what they are actually accomplishing, except that by following the instructions, they get someplace they know they aren't supposed to be, and thus, this is cool.


TLA: Three Letter Acronym.  Something the computer industry is obsessed with, mostly to impress/intimidate others.  CPU, RAM, NIC are all examples of TLAs.  For the most part, I really recommend avoiding them, just say the blooming words.

Truth in Advertising:  A concept which has no place on this page, as it does not apply to the computer industry.


Ultra SCSI: A faster variant of SCSI, typically capable of a bus speed of 20MBps for narrow SCSI and 40MBps for wide SCSI.  Good way to frustrate yourself if your cables and terminators are not very high quality.

USB: Universal Serial Bus.  Probably a cool idea, basically, a high-speed, multi-device serial interface.  Actually, closer perhaps to a simplistic network system for peripherals.  Crams a lot of devices into a single interrupt, which is something we REALLY need on the PC architecture.  Claims to be "plug-and-play", and comes closer than other things describing themselves that way.  Currently being used for keyboards, mice, scanners, printers, digital cameras, etc.  One port on the computer can be "split" into multiple ports using hubs.  USB ports actually also provide a 5v power supply, so many devices can be powered directly from your computer (or the hub), so the device may connect to the outside world with only one cable.

ULOS: Unix Like Operating System.  This is my own creation.  This is a catch-all for all the operating systems which in one way or another emulate, look like, act like, or arguably are Unix, but can't say that in the fear of being sued by Unix System Labs or whoever "owns" Unix this week.  Examples: Linux, OpenBSD, FreeBSD, NetBSD, Cromix, Dynix, Coherent.  Many people use *nix as a shorthand, but you see, that leaves out a number of significant choices.


VRAM: Video RAM:  Very loosely defined; typically, a type of "dual ported" RAM, where the computer can be stuffing data in it at the same time the video controller is pulling data out.  Usually it just means a manufacturer is trying to impress you with a TLA.


Wide SCSI:  NOT the fat slob next door.  Some time back, someone decided that the standard SCSI interface could be sped up.  The obvious choice was to "widen" it to 16 bits wide from the previous 8 bit, effectively doubling the data throughput, and in the process, doubling the number of devices which can be supported from 7 to 15.

WYSIWYG:  "What You See Is What You Get".  Indicates that the image you see on your computer screen represents what you will see on paper.  I first became aware of this phrase (not the "acronym") in 1982, in an ad for the then cutting-edge word processor, WordStar.  Now, WordStar is considered an example of the antithesis of WYSIWYG.  Some examples of applications nobody would argue are WYSIWYG: PostScript (the printer language), HTML (the Web language), TeX, and many main-frame page-layout applications.


XYZZY: 1) A password in the original Adventure game, one of the very early computer games.  2) An undocumented command in some releases of MS-DOS v2.  XYZZY ON would set the command processor to display the "Return code" of an exiting application.  Cool feature for debugging, too bad it was dropped, and it was never documented.

XMODEM:  An early file transfer protocol, developed by Ward Christensen, an early God of the small computer world.  Basically, 128 bytes of data and a checksum or CRC were sent between the sender and the receiver, and the receiver sent an acknowledgement back to the sender, saying "Got it, send more".  Fairly efficient in the days of 110bps and 300bps modems, found to be rather inefficient by the time modems had reached 2400bps, and almost useless today.  Still, it was an elegant and simple protocol for its day, and most of the later protocols basically took the idea of XMODEM and fixed the problems, rather than starting from scratch, by improving the error detection, enlarging the packet size and in some cases, allowing the acknowledgement for a packet to be accepted long after later packets have been sent.
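The packet idea can be sketched in Python (the simple-checksum variant only; the SOH byte, block numbers, ACK/NAK exchange, and retransmission logic are left out for brevity):

```python
def make_packet(data):
    """Build an XMODEM-style data packet: 128 payload bytes + 1 checksum byte."""
    assert len(data) == 128              # XMODEM always sends 128-byte blocks
    checksum = sum(data) % 256           # arithmetic sum, truncated to a byte
    return bytes(data) + bytes([checksum])

def verify_packet(packet):
    """Receiver's check: does the payload still match its checksum?"""
    data, checksum = packet[:-1], packet[-1]
    return sum(data) % 256 == checksum

payload = bytes(range(128))
pkt = make_packet(payload)
ok = verify_packet(pkt)
# simulate line noise: clobber one byte in transit
damaged = verify_packet(pkt[:10] + b"\x00" + pkt[11:])
```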


YMODEM: An extension of the XMODEM protocol, increasing the packet size to 1k (from 128 bytes).

Y2K: Short for "Year 2000" (and keep in mind abbreviations like that are why we got into trouble in the first place!)  A hoax the computer industry put upon the rest of the world.  We got rich, and you were stuffing money in our pockets.  We are all happy, right?  Mostly, people got caught being worried about how other people were doing their jobs, rather than doing their own jobs.


Zoo: The computer industry, except this defames the otherwise upstanding Zoo industry.

ZMODEM: An extension of YMODEM and XMODEM; most significantly, it uses a "sliding window" protocol, where several packets can be sent to the receiver before an acknowledgement is required.  For example, packets 1, 2, 3, and 4 could be on their way to the receiver before the first acknowledgement is received, and if packet 2 turned out to be corrupted, it could be resent later.
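The sliding window idea can be sketched as a toy sender (my own illustration, with a lossless link and a window of 4; real ZMODEM framing, CRCs, and error recovery are far more involved):

```python
WINDOW = 4   # packets allowed "in flight" before an ack is required

def send_all(packets):
    """Return the send/ack event log for a lossless sliding-window transfer."""
    acked = 0          # everything before this count is acknowledged
    in_flight = []     # packet numbers sent but not yet acknowledged
    next_pkt = 0
    log = []
    while acked < len(packets):
        # sender: keep transmitting until the window is full
        while next_pkt < len(packets) and len(in_flight) < WINDOW:
            in_flight.append(next_pkt)
            log.append(("send", next_pkt))
            next_pkt += 1
        # receiver: acknowledge the oldest outstanding packet
        log.append(("ack", in_flight.pop(0)))
        acked += 1
    return log

log = send_all(list(range(6)))
```

Note that four packets go out before the first acknowledgement comes back, where XMODEM would have stopped and waited after each one.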


$Id: glossary.html,v 1.7 2007/06/27 19:14:42 nick Exp $