In computer science, many fields of study carve out different pieces of the computer pie. One obvious continuum runs from the theory of computation on one end (working with idealized pictures of computing machines and doing mathematical proofs about computing in the very abstract) to electrical and computer engineering on the other (building real computing machines and doing physical experiments about computing in the very concrete). Wherever you sit on that continuum, to study computing you use levels of abstraction to hide the details you don't care about so you can focus on the ones you do.
This is nothing new; it is a proven engineering principle. You drive a car without worrying about how it works, dealing only in the abstractions of steering wheel, pedals, signals, parking brake, and gears. An automatic transmission adds yet another abstraction over what many people no doubt consider unnecessary details: shifting gears, working the clutch, internals your car should really take care of for you. A century of this, applied over and over, has turned drivers from mechanics into naive consumers who care more about their lattes and cellphones than about what they are doing with their Jetta's user interface. There are still people who look under the hood and work with the details, but you might call them technicians or scientists.
There is another continuum in computer science, from researchers on one end to users on the other end. One rule of thumb I read somewhere said that the researchers are about ten years ahead of the IT industry (if I find that reference I'll post it). That is, the conventional wisdom of today's computer science users has been around since before the birth of the World Wide Web. But whole industries (like desktop publishing, where I work now) are just using tools that somebody else created based on principles that somebody else researched.
(Aside: the programming language Lisp is an extreme example of what I am talking about. From Paul Graham, a Lisp hacker who made millions by creating the technology that is now Yahoo Store: "It's 2002, and programming languages have almost caught up with 1958... What I mean is that Lisp was first discovered by John McCarthy in 1958, and popular programming languages are only now catching up with the ideas he developed then.")
One of these researchers' principles is the object model of computing. Have you ever wondered how a word processor works? It seems very natural to me to type into an empty field and see text appear. Behind the scenes, though, what the computer is doing is much more complicated. Every keystroke, every button you press, everything you do is being logged so it can support multiple levels of Undo; everything in Word has a name. More than that, you are always operating on "objects": typing into the current Document, making the current Selection of text bold.
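To make that concrete, here is a minimal sketch of poking at Word's object model from Python. It assumes you are on Windows with Word and the pywin32 package installed; Documents, Selection, Font.Bold, and Undo are part of Word's published COM object model, though the exact incantations vary by version.

```python
# Minimal sketch: driving Word's object model from Python via COM.
# Assumes Windows, Microsoft Word, and the pywin32 package (win32com).
import win32com.client

word = win32com.client.Dispatch("Word.Application")   # the top-level Application object
word.Visible = True

doc = word.Documents.Add()                             # a new Document object
word.Selection.TypeText("Hello, object model.")        # type into the current Selection
word.Selection.WholeStory()                            # extend the Selection over the whole document
word.Selection.Font.Bold = True                        # make the current Selection bold

doc.Undo()                                             # every action was logged, so it can be undone
```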
(Aside: I always think of objects, for some reason, as undifferentiated lumps of clay I can then sculpt, sort of like making a piece of paper to write on out of pure data (as close as you can get to a pile of hyle without having pure chaos or Being or something). One way to think about computer science is that you are making machines out of clay. The computer science "atom", it turns out, is the most formless distinction possible: 0 or 1, on or off, existent or nonexistent. If the bit had any less form, computer science would be a monism.)
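For what it's worth, you can watch the clay bottom out in its atoms from any language; this little Python sketch just prints the bits behind a couple of characters:

```python
# The computer science "atom": everything on the machine, text included,
# bottoms out in bits.
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))  # e.g. H -> 01001000, i -> 01101001
```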
To become a powerful user of a computer program like this (for instance, to program it yourself to do repetitive things while you sip your iced tea), you need to learn the objects in the system. You learn what they are, how they can be accessed and messed with, how to work with them in combinations and build your own mini-programs. Basically, this is like stumbling upon a UFO and trying to get it to fly. You come to the project without any helpful background knowledge, and a given button can mean almost anything, but if you could just get it to work it would do something awesome. Also, computer programmers are aliens.
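As a sketch of the kind of mini-program I mean (same pywin32 assumption as above; the folder path and phrases are made up), here is a loop that does one repetitive chore, a find-and-replace across a folder of documents, while you sip your iced tea:

```python
# Sketch of a repetitive chore automated through Word's object model:
# replace a phrase in every .doc file in a folder. Assumes Windows,
# Word, and pywin32; the folder and phrases are hypothetical.
import glob
import os
import win32com.client

word = win32com.client.Dispatch("Word.Application")
word.Visible = False

for path in glob.glob(r"C:\reports\*.doc"):             # hypothetical folder
    doc = word.Documents.Open(os.path.abspath(path))    # each file becomes a Document object
    doc.Content.Find.Execute(FindText="Gee Whiz",       # Find lives on a Range object
                             ReplaceWith="GW",
                             Replace=2)                  # 2 = wdReplaceAll
    doc.Save()
    doc.Close()

word.Quit()
```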
The object model for a program like Microsoft Word (which has accreted a lot of bells and whistles over the years) is staggeringly huge. It would take a lifetime to understand all of its objects and use them perfectly and efficiently. Nobody does this, of course, partly because Microsoft is a monopolist who ran a bunch of great companies out of business (that's a fact for legal purposes, I understand), and also because in a few years Microsoft Word GW will come out (GW for Gee Whiz) and it is likely to break compatibility with everything you did before. Your old programs just won't work anymore, because the objects' names and characteristics change around on you.
The complexity of software systems is growing too large for any one person, or even any group of people, to fully grasp and predict. There are online games now whose virtual economies have been estimated to have per-capita GNPs just behind Russia's.
There is a very interesting program that never seems to break compatibility with its prior versions, always runs perfectly, and never crashes.
It is so large and complex that it is hard to understand how it accomplishes this staggering feat; people have been trying to understand it as long as there have been people.
Though they are but users of the program trying to understand the model, we call them "scientists".
We named the program "the universe".