This book is a best-seller, which is a little surprising considering that the public had not yet quite warmed to the Internet when the book was published in 1996. Sure, the Web was getting big just then, and people were starting to see the Internet mentioned in the newspaper all the time, but that kind of hip new high-tech trend usually takes a little longer to send a book off the shelves and into the minds of the general public. Clearly, though, people were interested in the Internet, and they wanted the human touch. They didn't want to read about how to browse Usenet newsgroups or how to connect to the Library of Congress telnet server so much as they wanted to understand what was behind it all: what had started this whole Internet thing, and why.
Katie Hafner is already well-known to readers of books about computer culture as the co-author of Cyberpunk. This time, she teams up with her husband, Matthew Lyon, to write what may well be the definitive history of the Internet to date. Like every really good, thorough history, WWSUL begins with subject matter that seems unrelated to the book's main theme, so the reader can understand the true forces at work behind the subsequent events. In this case, WWSUL begins with the creation of ARPA (Advanced Research Projects Agency), the government-funded research agency founded to give the United States a boost in its technological development, a boost that was much needed at a time when the USSR had just beaten it into orbit with the first space satellite. From there, it proceeds to describe the very first form of the Internet, which began as any new invention does: as an idea. From the original conceptions of linking a handful of computers together across the United States (nobody was quite thinking of a global network yet), to the very first IMPs (Interface Message Processors, the machines we would call "routers" today), to the long process of building a real, working network around a growing infrastructure, WWSUL never stops being interesting.
There simply is no other book quite like this one. No other book so exquisitely captures the days when computers were brand new, and the world's first hackers were beginning to warm to them and do things with them that nobody had ever intended. You will not find another book on early hackerdom so captivating, so detailed, and so vibrant.
Levy is a good author, but that doesn't come through much when reading this book, because the story is so fascinating that it feels like it is writing itself. It's easy to forget, when reading Hackers, that this book is the result of a lot of research (mainly personal interviews with people who were involved in the dawn of hackerdom) and great writing; the story actually transcends the feeling of being well-written, because it would be great even if it were badly written. So even though Levy is good at putting the tale together, his subject outshines his talent, at least in this book.
If you know about the computer industry, you've probably heard of Soul Of A New Machine. You know, the book that won a 1982 Pulitzer Prize. The book that everybody who read it absolutely loved. The book that offers a humanizing glimpse into the lives of computer engineers, focusing equally on the machine itself and the people who made it, turning the story into something interesting even for non-technical readers.
Well, SOANM is all of these things. However, if you have not already read it, you should put it into context before you pick it up. Remember that SOANM was published in 1981. Microcomputers had just barely begun to have any significant impact on the home consumer market back then, and most of the great historical computer chronicles did not yet exist, not even Steven Levy's "Hackers". Thus, *any* book that tried to approach computer engineering in the style of an everyday, casual-reading novel would have been hailed as remarkable, since no such book existed at the time. The idea of talking about computers in plain English, instead of tech-speak, was still a rather new and strange one back then, and so perhaps SOANM received slightly more praise from both critics and readers than it was due.
In other words, SOANM cannot quite be expected to hold its own against some of the other great chronicles listed here, like "Hackers" or "Where Wizards Stay Up Late". This is not to say that it is inferior to those books, but rather that it would not have been regarded as quite so revolutionary or unique in its tone if those books had been around.
Having said that, if you liked those books, you'll like SOANM, so go get it. It's a great book. It is the story of a group of computer engineers at Data General in the minicomputer era, a time when minicomputers were beginning to go 32-bit and Data General was struggling to develop a new machine that would help it maintain its position as a leading manufacturer in the industry. The computer is discussed in deep detail (for a non-technical book, anyway), but the book never loses sight of the people in it; the engineers, with all their quirks and characteristics, are described as the story rolls along. SOANM still stands as a triumph of writing, making the world of computers a little more human.
In theory, there's a story behind almost every invention. Any invention needs, at a bare minimum, an idea, and there's usually a story behind how someone came up with that idea. In addition, many inventions require much more than just the idea; they require a long period of research and development to turn the idea into reality.
The transistor is one of those inventions. The idea behind it is fairly simple: a channel for electrical current to flow through, with a "control lead" that lets another circuit regulate how much current can flow through that channel. In fact, a device already existed to serve this function long before the transistor: the vacuum tube. The transistor doesn't really do anything that a vacuum tube couldn't do; the difference is that the transistor can be made far smaller and more efficient than any vacuum tube could ever be.
Despite its simplicity, the road to the development of the transistor was long and difficult, with many failed experiments along the way. Part of this has to do with the non-intuitive nature of the transistor and the quantum physics that underlie its operation; you can't see it work, because everything happens at the subatomic level, so researchers had to do a lot of educated guessing to figure out what was going on with their early experiments. Then, even after a working transistor was developed, it took a lot more refinement before the device could be mass-produced and manufactured into a commercially available package that engineers could buy and use right off the shelf.
Unlike the making of the personal computer or the Internet, the story of the transistor's development is one that doesn't resonate with the general public. Most folks probably have only a vague idea of what a transistor is--if indeed they have any idea at all--and although the transistor is quite directly responsible for the development of the modern personal computer as it exists today, the principles behind its operation and invention remain a mystery to many.
It's perhaps not surprising, therefore, that Crystal Fire, the only major attempt I'm aware of at seriously chronicling the history of the transistor, is clearly not written for the general public. Unlike Where Wizards Stay Up Late or Soul Of A New Machine, in which technical details are only occasionally tossed in as adjuncts to the story (resulting in books that are accessible to almost anyone, including non-technical folks), this is quite the opposite: a science-oriented book in which non-technical facts occasionally pop up as sidelines. The first couple of chapters of Crystal Fire are humanist in nature, tracing the birth and early life of Walter Brattain, John Bardeen, and William Shockley--the three men generally credited with the invention of the transistor--but after that, the book focuses mainly on the science behind the research it discusses. Casual readers may still find a few interesting bits in this book, but it is most certainly aimed at serious scientists and engineers.
That said, this book is of great value as a more literate introduction to the principles of semiconductors. By examining the social and technological backgrounds against which the transistor was developed, the reader gains an appreciation of the principles and applications of the device that an academic textbook simply can't convey. Crystal Fire explains in great detail the principles and development of P-N junctions, field-effect transistors, and the quantum physics associated with them, along with the experiments that led to these breakthroughs, in a way that's quite accessible. By tracing the history of the different ideas, you discover how and why scientists first started putting them together, and you can therefore build in your mind a picture of the transistor taking shape. This is often a more intuitive approach than that taken by textbooks, which almost invariably start with a fully-formed transistor and then attempt to explain how each part works.
If you intend to learn the principles of semiconductor electronics, you might want to read this book first; it'll give you some grounding in the relevance of these subjects that will probably be helpful later on when you study them in more detail. I have a few issues with the clarity of the writing in some portions of this book, but I'll grant that the subject matter is sometimes extremely difficult to write about; the authors have probably done the best job possible of making this subject as accessible as it can be to a general audience. This book will be of some interest to historians, but I believe that it will be of greatest interest and benefit to scientists.
Xerox PARC (Palo Alto Research Center) is famous mostly as the birthplace of a surprising number of technological innovations that are an integral and ubiquitous part of our world today. Well-known examples include the laser printer and Ethernet, but perhaps the most familiar PARC innovation of all was the GUI (Graphical User Interface). The story of how Xerox, the photocopier company, developed a computer called the Alto that could almost surely have blown every other PC on the market out of the water for several years to come is well known. And yet the Alto never even came close to being a commercial success. The reasons why make a fascinating true-life business story.
Stranger yet is the fact that this story has not been well-documented; Fumbling The Future is the book that fills the gap. A chronicle of Xerox's history, it begins with the company's origins as the Haloid Company, a photographic supplies firm that persistently researched and developed xerography, and, with plenty of relevant business insight, tells the tale of how calculated risk-taking and leadership made the company the leader in its field, only for it to spectacularly drop the ball when it came to its next big endeavor: digital technology.
Fumbling The Future is an amazingly readable book. It's not just that the authors are both Harvard graduates (Mr. Smith from Harvard Law School and Mr. Alexander from Harvard Business School). It's also that these guys can write. Not only is Fumbling The Future an expertly insightful book that catalogues the things Xerox did right and wrong, it's also a page-turning story of ambition, triumph, and defeat, with the classic underdog fighting against the established powers. Of course, this story doesn't have as much of a happy ending, but it's a satisfying journey, from the first page to the last. If you want some insights into why companies succeed and fail in our fast-paced information economy, this book is an important read.
If you haven't heard of Neuromancer, you probably won't recognize most of the other books on this page either. For the uninitiated, this is the book that's credited with envisioning the Internet as it exists today; this was the first science-fiction novel to popularize the idea of the "Matrix", a global computer network that was as much a part of everyday life as television.
It's more likely, however, that you've heard of Neuromancer but never read it. Everybody talks about this book, but not that many people seem to have actually read it. That may be one good reason why it's so highly regarded, because in my personal opinion, Neuromancer really isn't much of a book. Sure, it's got vision, but that's about all it has. William Gibson has a maddening writing style that's excessively descriptive of the small details and non-descriptive of the big picture. Neuromancer, like all of Gibson's books, is minute in its descriptions of people, places, and things, but never really explains exactly what the plot is. The reader is thus left to decide what exactly is going on, inferring it from deliberately opaque conversations. Besides this, the characters tend to be irritatingly pretentious; the Sprawl is a dark, dirty place where everybody seems to have a drug addiction and compulsive swearing is the norm. This is amusing at first, but fill a whole book with this tough-guy machismo and it wears thin before too long. Having read Neuromancer a while ago, I honestly cannot think of a memorable scene in the book. Nothing ever really seemed to happen; it was mostly a series of incomprehensible conversations, interspersed with periods of brutal violence and multi-page location descriptions. Like 1984 (another book which I am planning to read), this book had some interesting ideas, and is remembered for them, but it's got no substance.
The sequel to Neuromancer, Count Zero is reportedly the best of Gibson's books. I am still in the process of reading it, so I won't comment too much just yet, but I am told by people familiar with his writing that CZ is generally considered the most coherent of the Sprawl trilogy. Certainly better than Mona Lisa Overdrive (Count Zero's sequel), and less muddled than Neuromancer, CZ seems to have promise. I already like it better than Neuromancer.
One of the most praised documentaries on the cracker underground, Cyberpunk is actually three case studies of famous crackers: the first is Kevin Mitnick, the second is "Pengo" and the German Chaos Computer Club (CCC), and the third is Robert Tappan Morris, the man behind the famous worm that crashed many of the systems on the Internet. Interestingly, the book was written before Mitnick became famous, and so is based on his earlier activities, before his well-known escapades with DMV computers.
People seem to like Cyberpunk because it's neutral. Whereas every other cracker documentary I've read is written by an author who puts plenty of personal opinion into the book, Hafner and Markoff are both journalists, and as such, Cyberpunk reads more like a news story than a personal perspective. There is little in the way of taking sides here; the authors do not have an agenda. They simply tell the story, and the reader is left to draw his or her own conclusions. The result is an informative, readable look into some of the most famous cracker cases ever.
The world of computer break-ins has its superstars just like any other; the two Kevins (Mitnick and Poulsen) and the legendary Captain Crunch (John Draper), for example, are well-known figures about whom several books have been written. But @ Large is a book about a much lesser-known figure, who indeed has not (to the best of my knowledge) been documented in any other book. This alone makes the subtitle of this book suspicious; if the cracker documented here really did effect the world's biggest Internet invasion, wouldn't it have been better documented in the media? As the reader eventually discovers, there are reasons given why the media would not want to report on this story, but the fact that none of the key characters in the story ever have their names revealed (the mystery cracker, as well as his family, are all given fictional names to protect their identities) makes the story seem even more questionable. Yet the book openly claims in its opening pages, before the main text begins, that it is a work of nonfiction.
@ Large is about one cracker in Portland, Oregon, who, through sheer persistence, managed to penetrate many computer systems on the Internet. The story reaches greater and greater heights, until eventually the cracker gets into the routers that direct traffic between individual systems on the Net, the theory being that with access to the passageways the data travels on, he could intercept any communications on the Internet whatsoever without having to break into any host machines (since the hosts must still transmit over the network to get their data from one place to another). Along the way, the book seems to take every opportunity to try to convince the reader that the security of the Internet is dangerously vulnerable. Every chapter begins with a quote from some other source, and most of these are comments to that effect. Indeed, the reader quickly gets the impression that @ Large's main goal is to scare; paranoia is the order of the day here.
After its conclusion, @ Large also carries a long "epilogue", which is really not a final chapter to the story at all, but rather a commentary by the authors on the sorry state of modern-day Internet security, further adding to the impression that the authors have a strong agenda here.
If we ignore the realism of this book for just a moment, it is actually a pretty good book. The story is very interesting and well-told, and the technical details are accurate and well-explained, so that even not-very-technical readers can enjoy this thrilling story. It makes a great piece of reading that anyone casually interested in the subject of computer break-ins can get into.
However, I cannot deny the sense that something rings false here. The fact that this story has not been reported anywhere else, in any medium, makes it questionable. The single-mindedness of the authors also does serious damage to their credibility.
If you are wondering how feasible it is that a youngster who is physically and mentally handicapped--as is the cracker of this book--could penetrate the Internet on such a thorough level, the answer is that it's certainly possible. The authors stress that if a person with these handicaps could so effectively get through the security online, what could a regular person, or a group of people, with a real agenda accomplish? Yet the feasibility of a large-scale Internet invasion is significantly lowered when you think about the details of this book.
What made "Matt Singer" so successful was the huge expanse of free time that he had, and his robotic persistence. @ Large effectively identifies these as the cracker's most effective tools, for he did not employ any brilliant tricks, just simple "brute force" methods and almost limitless time on his hands. Indeed, it becomes clear that a person given enough time can penetrate almost any system, but the question then becomes how much time is required to do so; the Infinite Monkey Theorem applies here. In Singer's case, a lot of time was required. And in his single-minded attacks on the systems he encountered, he was detected many, many times. Before the book is even one-third of the way through, the cracker has already been identified, and long before the book reaches its conclusion, he in fact meets face-to-face with the system admins of some of the very computers he has cracked. The ending is completely anticlimactic: Singer is arrested, an event you could see coming from miles away. So what was the point? Sure, he broke into several systems, but he got caught many times over.
@ Large emphasizes in its epilogue that if you're tempted to think Internet security has improved since the events documented in the book, you should think again. That may have been true when the book was written, but it's worth remembering that this book was published in 1997, when the Internet boom was still just beginning and the big-business world of e-commerce had yet to really take flight. While the Internet certainly had its insecurities back then, the computer world is a fast-changing one, and in the few short years since 1997, the world around us in general has changed tremendously, and so has the way things work on the Internet. I do not wish to debate Internet security here (that is another subject for another place), but suffice it to say that no matter what the authors may tell you, the face of computer security has changed considerably since Matt Singer's escapades in this book. Singer would have been arrested much more quickly if he had tried to break into sensitive sites (those run by the government or military, for example), but since he wisely chose to stick to places with little time or money to take legal action against him, he got away with his antics for much longer. Today, much of that still holds (most businesses still don't have the time or money to chase down every cracker who probes their systems), but with effective, cheap hardware firewalls readily available to end users, fully capable of filtering all traffic that passes through them at the packet level, even beginning any kind of crack attack over a computer network is vastly more difficult today than it was in Singer's time.
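To give a feel for the packet-level filtering mentioned above, here is a minimal, hypothetical rule set sketched in iptables syntax (a software packet filter; the specific ports and policy shown are illustrative assumptions, not a recommended configuration):

```shell
# Default-deny packet filter, sketched with iptables (run as root).
# Drop all inbound packets unless a later rule explicitly allows them.
iptables -P INPUT DROP

# Allow replies to connections this machine initiated.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Allow local loopback traffic.
iptables -A INPUT -i lo -j ACCEPT

# Explicitly permit one service (here, a web server on TCP port 80).
iptables -A INPUT -p tcp --dport 80 -j ACCEPT
```

Under a default-deny policy like this, the kind of patient port-by-port probing Singer relied on sees nothing but the handful of services deliberately left open, which is why even cheap packet filters raise the bar so dramatically.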
Having said that, this is still an excellent and interesting book, worth reading by anyone interested in computer security. But don't give in to the paranoia; the one-sided picture painted here is not the whole story. Read it, enjoy it, then read some more technical (and current) books about Internet security, and you'll understand.
Another very well-known book in computer culture, TCE is Mr. Stoll's true story of how he tracked a German cracker through the Internet (or what passed for the Internet back then) and brought him to justice. Although this book has received plenty of praise from nearly everyone who's read it, I found myself irritated by it more than anything. Mr. Stoll has decidedly left-leaning politics, and he constantly tries to present himself as an anti-government hippie while at the same time constantly calling various Federal law enforcement agencies to aid him in his hunt. While the book does acknowledge the hypocrisy of this, the fact that it happens time and time again makes it seem ridiculous rather than humorously ironic. I'm also irked that (whether he intended it that way or not) the book makes Stoll out to be the guy who caught the cracker, when in fact he didn't do much in the end except track a few of the cracker's actions on computer systems and then get on the phone to the phone company, the FBI, or whatever other agency he happened to be calling for help at that point. The book is not without its clever insights and interesting moments, and it has value simply for its real-world revelations (remember, this *is* a true story), but I couldn't manage to derive much enjoyment from it.
I should mention that Mr. Stoll's politics are very different from mine, and the clear difference in mindset may be why I couldn't enjoy this book. While I am no lover of big government, I have never cared much for the hippie-esque, Grateful Dead-loving lifestyle that he seems to live. (He actually mentions attending a Grateful Dead concert in the book.) His later books on why computers should not be allowed in classrooms inflamed me even further, for several reasons. One of them is the (to me) blindingly obvious effect that students who are not allowed to learn with computers become more sheep-like, more inclined to follow the system and become bricks in the wall rather than individuals, which seems to go directly against Stoll's overwhelming emphasis on anti-institutionalism. I guess in the end I can't help but feel that everything Stoll has written is self-contradictory, but draw your own conclusions; even if you don't agree with him, the guy has a lot of interesting opinions.
This is one of the most popular of all the cracker history books. Unfortunately, it becomes clear after spending a little time with this book that that's not a good thing: MoD is a sound-bite junkie's account of the cracker underworld. The book is virtually content-free, trying too hard to be cool and not doing enough to convey any real information. That's an even bigger shame when you realize that the book is actually not badly written; this really is an interesting story, and Slatalla and Quittner tell it well, but their main goal is clearly to mesmerize the reader, much as @ Large had an agenda that consisted mainly of vague attempts to instill paranoia. MoD focuses on the eponymous Masters Of Deception, a relatively unknown cracker group that was spun off from the much better-known LoD (Legion Of Doom). Since it's an important group that hasn't been well-documented anywhere else, this book is still worth a read, but if you haven't read much on cracker culture yet, don't make this the first book you pick up on the subject.
This is one of several books in a sub-genre of modern fiction that's popped up about computer crime. The field is littered with mediocre efforts, including Raw Data (which was actually a Harlequin novel, which should tell you something), Bad Memory, The Blue Nowhere, and innumerable less well-known but similarly cheesy novels. In my opinion, while it's not a great book, The Termination Node is one of the best of them, simply because it seems to be written by an author who actually knows something about computers. As such, the technical details are at least somewhat more accurate than in most of the other books, making this a sort of "easy reading Neal Stephenson" (in terms of technical accuracy, that is, not the deep philosophical meaning that tends to be present in Stephenson's books). It's got some entertainment value, and if you just want a quick fictional story about computers and crackers, it fits the bill.
Upon reading this book, my first reaction was: Wow.
Of course, the obvious question, then, is: Good wow or bad wow? Well, it's both.
The Blue Nowhere is a suspense/crime novel by Jeffery Deaver, a man who's got quite a few respected suspense thrillers under his belt by now, including The Bone Collector and The Empty Chair. It's one of the better "computer hacker" novels, but like any attempt at fictionalizing the world of computer people, it's got its idiosyncrasies.
It's clear that The Blue Nowhere wants to be credible as a book about technology written by somebody who has some clue of what they're talking about, as quickly evidenced by touches like the chapter numbers being written in binary. Like most novels about high-tech, this one feels the need to constantly explain aspects of technology that come up in the story. A good portion of the book is thick with explanations of every new piece of technology that arises, from what a "scram switch" is to expansions of the ubiquitous chat-room acronyms used online. The result feels almost like a technology primer written by somebody who happened to glance through the key points of a few books just before writing this novel. (That may be exactly what it is, as Deaver credits several books for helping him write this one, including The New Hacker's Dictionary and staples of real-world-cracker literature like The Cuckoo's Egg.) Thankfully, as the book nears its conclusion, new bits of technology stop popping up as frequently and the story starts to proceed relatively unhindered.
Also like most novels about high-tech, The Blue Nowhere requires some suspension of disbelief, especially if you're a real-life techie. Some of the technology here is just not possible. For example, the much-discussed "Trapdoor" program, a worm that spreads by embedding itself into Internet traffic, almost sounds like it might work at first; Deaver even uses the word "steganography" to describe it, which is a real concept similar to what he describes. In reality, however, you cannot propagate a worm just by embedding its code in some data, because for a worm to spread, that code must be executed, and that doesn't happen if you simply tack the code onto a data transmission. The biggest suspension of disbelief required, however, relates to a police detective named Frank Bishop and the constant concessions he makes to Wyatt Gillette, a cracker who's under temporary release from prison. It's the classic story of an outlaw so good that the police need his help to bring down a dangerous killer who's better than they are. While this in itself rarely happens in real life, what's really ridiculous is that even after Gillette escapes from custody and they have to bring him back, Bishop decides to "give him one more chance". This kind of thing happens constantly, with Bishop sticking up for a convicted criminal at the risk of his job. For example, the following is an actual passage from the book, right after Bishop has decided to let Gillette borrow a police car so he can go visit his ex-wife:
"Wait," Bishop asked, frowning. "You have a driver's license?"
Gillette laughed. "No, I don't have a driver's license."
Bishop shrugged and said, "Well, just don't get stopped."
Yes, Bishop is supposed to be the "good cop" in this story who's kind to people and gives them a break, but there's a difference between giving somebody a break and letting an unlicensed driver who's still a convicted felon borrow a police car so he can go on a joyride. Deaver notes that he's "taken some significant liberties with the structure and operation of federal and California state law enforcement agencies", and he's not kidding. The driver's license conversation is probably the most extreme example of Bishop's ridiculously easygoing nature, but it's implausible enough that I couldn't help but shake my head.
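Incidentally, the real steganography concept mentioned earlier in this review is easy to sketch, and the sketch makes clear why "Trapdoor" couldn't work: hidden bytes ride along inside other data (here, the least-significant bits of a byte buffer standing in for an image), but they come back out only as data. Nothing in the channel ever executes them. All names below are illustrative, not from the book:

```python
def embed(carrier: bytearray, payload: bytes) -> bytearray:
    """Hide payload bits in the least-significant bit of each carrier byte."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for payload")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the lowest bit
    return out

def extract(carrier: bytes, n_bytes: int) -> bytes:
    """Read the hidden bytes back out of the carrier's low bits."""
    result = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (carrier[b * 8 + i] & 1) << i
        result.append(byte)
    return bytes(result)

# The "image" is just a buffer of bytes; the secret rides along as data.
carrier = bytearray(range(256))
stego = embed(carrier, b"worm code")
assert extract(stego, 9) == b"worm code"
# Crucially, nothing here *runs* the hidden bytes: a receiver that doesn't
# deliberately extract and execute them just sees slightly altered data.
```

That last point is the whole objection: a steganographic channel can smuggle a worm's code past an observer, but it cannot make any machine along the way execute that code.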
On the good side, the book tells a good story. Deaver is skilled at creating suspense, and the book does a lot in several places to play on the reader's assumptions and tensions without actually painting a grisly picture. Although the police are after a murderer who kills by stabbing, there are only a couple of actual murders in the book; the rest of the time, the reader is simply led into a state of dread leading up to a planned murder, playing up the sick nature of the killer they're after. The killer is a hacker who has long been detached from the real world and lost the distinction between cyberspace (what Deaver calls the "Blue Nowhere") and reality, and the book carries an important, though obvious, message: staying well-grounded matters when you're a hacker, and although the Machine World is a fascinating place that's become an important part of our world, what really matters in this life is still people.
It's really just another thriller, and it won't be the most engrossing book you ever read. But it's a page-turner that reads through pretty quickly. (The movie rights to The Blue Nowhere have been bought by Warner Brothers, which is turning it into a film, so that should say something.) If you want a novel about computer hackers and crackers, this is one to consider.
There comes a point in the life of the electronics enthusiast when you realize that to really understand electricity, you're going to need to study at the level of a university physics student. Books that tell you what capacitors do or rehash the same old concepts like Kirchhoff's laws are fun and relatively easy to understand, but they don't give you the full picture of what's really happening to the electrons in the system. Discerning the real truth behind the behavior of electrons is a subject more in the realm of physics than electrical engineering. As such, if you're an electronics tech who wants to gain a genuinely above-average understanding of your field, you'll need to step off the beaten path for a bit.
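(For reference, the Kirchhoff's laws mentioned above really do fit in two lines; they are just conservation statements, which is part of why introductory books lean on them so heavily:

```latex
\begin{aligned}
\text{KCL (at any node):}   \quad & \sum_k I_k = 0 \\
\text{KVL (around any closed loop):} \quad & \sum_k V_k = 0
\end{aligned}
```

The current law says charge doesn't pile up at a junction; the voltage law says the potential drops around a loop cancel out. What they don't tell you is *why*, and that's where the physics-level treatment comes in.)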
Of course, there are countless physics textbooks out there which explain electricity and magnetism (often shortened to "E&M" among physics students). What you want is a textbook that's better than the rest. I can't say that this book is the best in the world for its subject, but it's won a fair amount of praise from students and teachers alike, and I feel that praise is deserved.
One of the more unusual things about this book is its writing style: It lacks the stiff, formalized structure found in most postsecondary textbooks, instead offering a conversational, informal style that often reads like a text file from an underground BBS explaining some technical concept. In tune with this writing style is the pace at which new material is introduced: The book progresses at a pace that's steady, but relatively gentle by most physics textbooks' standards. The result is a surprisingly readable and enjoyable book that's challenging, but manageable even for the person with no formal physics background. That's not to say that this book is necessarily easy; it covers a sometimes-difficult subject, and like almost any effort to explain such a subject, there are times when the writing can seem muddy and unhelpful. And make no mistake, you'll need at least a little grounding in differential and integral calculus to make sense of much of this book (though if you don't have it already, you might be able to painfully work it out as you go along, in which case this book could be the excuse you've been waiting for to actually learn calculus).
Along the way, you'll find that electrodynamics is one of the most elegant fields of physics, resolving into a few relatively simple equations and laws in a way that other fields of physics don't. As the book also notes near the beginning, electrical forces are the reason behind most of the forces we experience on an everyday basis (with the exception of gravity)--virtually every sensation that you feel and see is the result of electrical forces at the nano scale. The book then proceeds to introduce you to all the key concepts of E&M: Electrostatics (including electric fields, Coulomb's Law, and voltage), magnetostatics, and Maxwell's equations. And that's only the first half of the book! Once these fields of "classical" electrodynamics have been covered, the book veers into comparatively advanced material, including electromagnetic waves, radiation, and how electrodynamics relates to the theory of relativity. This book is definitely much more than you need to be a competent electronics technician, but if you're the kind of person who enjoys really knowing things, it's tough to think of a better starting point for launching into the field of electricity and magnetism. It can also serve well as a primer for more advanced books on the subject, like John David Jackson's famously brutal Classical Electrodynamics, Third Edition (Wiley, ISBN 0-471-30932-X). Of course, the entire field of physics pretty much goes on forever (or at least, beyond the scope of the human mind), but between these two books, you should have more than enough material to get through the E&M portions of any undergraduate physics class.
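For the curious, the "few relatively simple equations" that classical electrodynamics resolves into are Maxwell's equations, which in the standard differential (SI) form read:

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```

Essentially everything in the first half of the book builds up to these four lines, which, together with the Lorentz force law $\mathbf{F} = q(\mathbf{E} + \mathbf{v} \times \mathbf{B})$, summarize all of classical electromagnetism.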
Within any field of academia, it tends to become the case that a certain textbook, or sometimes a few textbooks, will become the de facto standard for study in that field. In electrical engineering, if one had to pick such a book, a book whose name seems to come up time and time again and which is commonly used both as the text for a college class and as a self-study guide by students who can't afford college, it would likely be The Art Of Electronics, by Horowitz and Hill. This book receives endless praise from teachers and students alike, who love its easygoing style and logical organization of subjects, allowing the reader to grasp fundamental concepts and proceed to more advanced concepts only when the necessary preliminaries have been mastered. From my own perspective, the biggest strength of this book is its completeness: It spans quite a lot of material (as you might expect, given that it's around 1,000 pages), and although the book seems somewhat analog-centric, it has a significant section towards the back on digital and computer electronics. No doubt it's a pretty good book, but for some people, it may be a little difficult to follow. The book tries hard to challenge the reader, and challenged the reader will be! This is not a simple book; it covers a lot of ground, and if you read it cover to cover, you'll be ever pushing yourself to get your mind around new concepts. If that's what you want, this is a great book.
For those who might want something a little slower-paced, though, it's tough to imagine a better book than Stan Gibilisco's Teach Yourself Electricity And Electronics. As is often the case with McGraw-Hill's books, this one seems targeted more at the hobbyist than at the aspiring professional. As such, the book is very light on math and the serious academic aspects of EE, instead concentrating on the practical essentials you need to know to get into electronics as smoothly as possible. The downside to this book is that it's written at a lower level: It contains approximately the equivalent of a community college education in electronics, whereas The Art Of Electronics would set you on the path to a 4-year degree. You'll be able to do a lot of cool stuff after reading Gibilisco, but your knowledge won't quite be equivalent to that of an engineer.
Also worth mentioning is Clair A. Bayne's Applied Electricity And Electronics, a very popular book in community colleges which has more of an electrical focus. This book will teach you about the big-power stuff that "electronics" books won't even mention. Bayne is perhaps a little easier than Horowitz and Hill or even Gibilisco, but this book will give you a broader perspective of the electrical industry. It may be the book to read if you'd rather work for the local power company than a chip shop.
Although it's not really an "electronics textbook", per se, Albert Paul Malvino's Electronic Principles is definitely worth a mention for the serious electronics student. The reason I say it's not really an electronics textbook is that it is entirely focused on devices; this book will teach you great stuff about diodes, transistors, and op amps, but precious little circuit theory. This shouldn't be the first book you read, but at some point you'll need to learn about devices, and when you're ready to do so, this is the authority. Malvino consistently does an excellent job of introducing new concepts without pushing the reader too hard, letting you learn lots of things but never forcing you to learn too much or too fast.
Besides the above-mentioned Electronic Principles, Dr. Malvino's other main contribution to the bookosphere is Digital Computer Electronics, a book which purports to explain how computer hardware works at the lowest possible level. While this might not be the only book that attempts to do this, it was the first well-known book to do this successfully and in language that almost any person could understand.
Digital Computer Electronics assumes no background in digital electronics, and so the first part of the book, comprising 9 chapters and over 100 pages, has little to do with computers specifically, but rather introduces the fundamental concepts of logic gates, flip-flops, registers, and memory circuits. This is all great stuff, and if you don't know it yet, you should learn it before reading any further. But if you already have a grounding in digital electronics and are familiar with these subjects, and want this book for its computer-specific content, then you'll want to skip ahead to chapter 10, which begins the real meaty stuff. This chapter fully explains a computer architecture called SAP (Simple As Possible), which appears to be the author's own invention. SAP comprises a full computer system, including several registers (program counter, accumulator, MAR (Memory Address Register), B register (essentially a second accumulator), instruction register, and output register), an ALU, some RAM, and a simple output display consisting of 8 LEDs. The SAP architecture shows how the CPU is actually constructed; all of its "black box" ICs are 7400-series logic chips (except for a 555 used to generate the clock, and a voltage regulator used on the power supply), meaning that the full logic functionality of the system is easily grasped. This system is the best I've ever seen published in any book for showing how a CPU works internally. In fact, probably the most valuable part of the book is the SAP schematic diagrams in chapter 10; the schematics are mostly so simple and clear that you don't really need to read the chapter text.
Once you've understood SAP, you've reached the high note, and you can honestly say that you understand how a CPU works inside. You can put the book away, or if you want to expand on your knowledge a bit, you can follow the book into subsequent chapters on SAP-2, which adds jump opcodes, and SAP-3, which adds some registers and opcodes in an effort to be a system comparable to the 8085 CPU. No small task, but this book pulls it off with admirable lucidity; clearly, the book is written with the intent of actually being understood, and it shows. After SAP-3, the book moves on to explain the use of popular real-world CPUs, including the 6502, 6800, 8080, and Z80. The book contains a reasonably large amount of material, but the main reason to get it is for the SAP-1 through SAP-3 schematics, and they're reason enough to get it.
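To give a feel for what the SAP machine actually does, here's a rough sketch in Python of the fetch-decode-execute cycle that the book builds out of 7400-series logic. The opcode values and memory layout below are my own simplified illustration of a SAP-1-style machine, not Malvino's actual design: 16 bytes of RAM, with each 8-bit word split into a 4-bit opcode and a 4-bit address.

```python
# A SAP-1-style machine in miniature (an illustrative sketch, not
# Malvino's actual listing). Opcode values here are assumptions.
LDA, ADD, SUB, OUT, HLT = 0x0, 0x1, 0x2, 0xE, 0xF

def run(ram):
    """Fetch-decode-execute loop over 16 bytes of RAM."""
    pc, acc, out = 0, 0, None          # program counter, accumulator, output register
    while True:
        ir = ram[pc]                   # fetch into the instruction register
        op, addr = ir >> 4, ir & 0x0F  # decode: high nybble opcode, low nybble address
        pc += 1
        if op == LDA:                  # load RAM into the accumulator
            acc = ram[addr]
        elif op == ADD:                # B register feeds the ALU in hardware
            acc = (acc + ram[addr]) & 0xFF
        elif op == SUB:
            acc = (acc - ram[addr]) & 0xFF
        elif op == OUT:                # latch the accumulator into the output
            out = acc                  # register (the 8 LEDs)
        elif op == HLT:
            return out

# Program: load ram[9] (5), add ram[10] (7), display the sum, halt.
program = [0x09, 0x1A, 0xE0, 0xF0, 0, 0, 0, 0, 0, 5, 7, 0, 0, 0, 0, 0]
print(run(program))  # prints 12
```

The point of the exercise, as in the book, is that nothing here is magic: each line of this loop corresponds to registers, counters, and gates you can point to on the SAP schematic.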
Semiconductor design--the level of electronics design at which you actually specify the lines of silicon that will go into a microchip--is a touchy business. It requires a lot of highly specialized knowledge, and that knowledge is not readily attained anywhere but on the job. There is a conspicuous lack of "entry-level" jobs in this field, just as there's really no such thing as an "entry-level" physician or astronaut. This creates a significant barrier to entry for those who want to get into the field, and the result hurts people on both sides: Engineers have trouble fitting into jobs because there's no good place to start from, and companies have a hard time finding qualified design engineers because companies are rarely willing to do on-the-job training in electronics anymore.
Since most people who move into a semiconductor design role migrate over from other professional electrical engineering fields, there's a notable lack of books intended for people just starting out in semiconductor design. Enter these twin books, which pretty much start from zero, i.e. they assume no prior knowledge of electronics whatsoever. IC Layout Basics' first chapter is actually an overview of fundamental non-semiconductor electronics principles like Ohm's Law. This should probably be considered the "first" book, then, and IC Mask Design is sort of a follow-up. These books are by far the easiest-to-read introductions I've seen to semiconductor design, clearly intended for audiences without an engineering background. As such, they're probably the simplest possible jumping-off points for anyone who wants to get into semiconductor design but has no experience whatsoever.
This is really a ridiculously important book, simply because it's one of surprisingly few books that bothers to teach true 3D graphics programming.
Before 3D APIs came along, programmers writing 3D engines would need to know a fair bit of math, particularly matrix operations, as well as some more basic things like trigonometry and other geometry. This book does a generally decent job of imparting this knowledge to the reader. Many of the explanations could benefit from being slightly clearer and having better examples, but overall this book is still one of a rare breed. LaMothe is a well-known author in the subject of graphics programming, and this is one of his earlier works. (He later attained the much-deserved position of being the Series Editor for Premier Press' Game Development Series of books.)
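As a taste of the kind of math a pre-API 3D engine needs, here's a small sketch of the two core operations of a software 3D pipeline: rotating a point with a 3x3 matrix and perspective-projecting it onto the screen. This is my own minimal example, not code from the book (which works in C and assembler), and the viewer distance is an arbitrary assumed constant.

```python
import math

def rotate_y(angle):
    """3x3 matrix for a rotation of `angle` radians about the y axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def transform(m, p):
    """Multiply a 3x3 matrix by a 3-component point (row times column)."""
    return [sum(m[r][i] * p[i] for i in range(3)) for r in range(3)]

def project(p, viewer_dist=256.0):
    """Perspective projection: points farther from the viewer
    (larger z) shrink toward the center of the screen."""
    x, y, z = p
    scale = viewer_dist / (viewer_dist + z)
    return (x * scale, y * scale)

# Rotate a point a quarter turn about the y axis, then project it.
point = [100.0, 50.0, 0.0]
rotated = transform(rotate_y(math.pi / 2), point)  # roughly [0, 50, -100]
print(project(rotated))
```

An engine repeats exactly this per vertex, per frame, which is why so much of the book's effort goes into making the matrix math fast.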
Most 3D games today simply use artificial crutches like Direct3D or OpenGL to do all their dirty work. While these are useful tools that certainly should be supported by any modern 3D game, it's still beneficial to developers to understand the principles behind genuine 3D modeling. It is, indeed, a black art. Perhaps the best part about this book is that it's aimed squarely at DOS (the only real operating environment for PCs) as a development platform.
This was an early book by LaMothe (who went on to become known for writing thousand-page tomes on game programming), but it already showcases his huge drive to learn about computers and turn that knowledge into fun games. A prime example is a passage on page 474, tossed rather casually into the middle of a discussion about interfacing with COM ports for multiplayer gaming, which I'd like to quote here because it perfectly illustrates his mindset:
Video game programming is a Black Art because many people no longer care to understand how everything works. Programmers and engineers today would rather use OPT (other people's technology). True Cybersorcerers use other technology only when they fully understand the technology and could re-create it if necessary. Using software and techniques we don't understand breeds laziness, which is the root of poor programming. So, the moral of the story is to be a master of all trades and a jack of none.
Easier said than done, perhaps, but I can't help but admire such a dedication to actually understanding the technology you implement, be it hardware or software. LaMothe wants you to genuinely understand the things he's talking about, and he writes the book with this goal. The result is a book that's sometimes surprisingly lucid.
It's also worth mentioning that despite the title, this book is actually not strictly focused on 3D graphics; it contains chapters on device (keyboard, joystick, and mouse) I/O, sound, artificial intelligence, interrupt programming, and multiplayer programming (albeit using modems, not TCP/IP; this was before the Internet took off), much of it rendered in C and assembler routines embedded in C. The result is a remarkably complete book on real-mode game programming for the PC that any true game programmer ought to read.
Ten years, a few huge books, and no doubt a fair amount of money later, André LaMothe has produced his first really hardware-oriented book; his others have all been about programming. As you might imagine, however, LaMothe continues to bring his particular brand of get-up-and-go to his subject.
The Black Art Of Video Game Console Design is something of an enigma--the title clearly suggests a book about designing video game consoles, but most of the book is actually more like an introduction to electronics. This book is clearly targeted at readers who have absolutely no background in electronics, which is good if that applies to you. But unlike software, where you can write a "Hello, world!" program in 5 minutes and be writing and using slightly more complex programs within hours, hardware design doesn't really happen without a comparatively large amount of introduction and theory. In recognition of this, LaMothe opts to introduce you to electronics, starting at the very basics (the first chapter actually has a section explaining electrons, protons, and neutrons) and proceeding from there, right up to how CPUs are structured. Because LaMothe is an eminently practical guy and this book wants to teach you how to actually build stuff, there's also a chapter around halfway through on construction techniques, discussing some details of various board types (breadboards, perfboards, etc.) and how to solder and wire-wrap. It isn't until you're almost two-thirds through the book that it starts getting to things that are really specific to video game console design.
If you're familiar with LaMothe's writing style but haven't yet read this book, the above paragraph should tell you that this book is worth reading for comedic value alone, and indeed, such is the case: When André LaMothe decides to teach you everything about electronics in half a book, hang on tight, because the ride keeps charging ahead and doesn't stop until you know enough to get a college degree in this stuff. This sounds exciting, and it usually is, but as you'd probably expect, there are times when the book suffers from this approach, introducing subjects in far too little page space when they could really benefit from some elaboration.
All of this means that the book is a little tough to recommend as a practical learning tool, because it appears to serve two separate purposes, and it doesn't do either of them especially well: First, this book intends to teach you all about electronics; however, it does this using an excessively condensed approach, which is understandable given both space constraints and the fact that industry-standard books already exist for this purpose (such as Horowitz and Hill's The Art Of Electronics). Secondly, the book wants to tell you about developing video game consoles on both the hardware and software level, which is great, except that less than half of the book is actually on this titular subject matter.
So who should read this book? The target audience is clear: People who truly have no background in electronics and don't want to become electronic engineers, yet who want to design video game consoles and want just enough electronics theory to do exactly that. If this describes you, here's a book that's written explicitly for you. If you'd like to design video game consoles but also want a more thorough understanding of electronics, you're probably best off building a foundation of knowledge with more traditional electronics textbooks before you pick up LaMothe. There are rewards to be found in this book, but you'll want to supplement it with some others to make the most of it.
There once was a time when computers were simpler. There once was a time when, like cars, computers came with repair manuals (or at least had repair manuals separately available). There once was a time when an average person, with a small workshop of not-particularly-sophisticated-or-expensive tools, could (and would) actually repair their computers when something broke. This is a far cry from the throwaway computer culture of today, which exists partly because most computer parts are sufficiently difficult to fix (and cheap enough to replace) that it just doesn't pay to fix things anymore.
This book is a classic from the age when people routinely opened and fixed their computers. The Commodore 64 and its ilk were a fairly robust set of computers and rarely required repair, but the Commodore 1541, the 5.25" floppy drive commonly used with these computers, was notorious for going out of alignment and requiring repair. While this operation could be done at a local shop, there's nothing quite like the sense of satisfaction and understanding gained from doing such repair yourself. This book tells you exactly how to align your 1541; the procedure is right there, on page 33, where it is also acknowledged that the drive "frequently goes out of alignment because of the track select mechanism".
There are, of course, other things in this book as well, resulting in a volume that's a bit over 200 pages when you include the appendices, which include, among other things, a complete list of every component (right down to the resistors and capacitors) that goes into the 1541. If only computer manuals could be this thorough today. There's also a separate manual of this style for the Commodore 64 itself (ISBN 0-672-22363-5), and that's worth reading too, but the C64 is a largely solid-state entity, and in any case it rarely fails. For a true hands-on electromechanical experience, it's hard to beat working on a 1541. Even if you don't have such a drive, it's worth getting this book as a genuine piece of history, and as a model of how detailed all computer documentation should be.
Although this was technically Neal Stephenson's third novel, it was infinitely more successful than his first, The Big U, and significantly more so than his second, Zodiac: The Eco-Thriller, which never gained much more than a cult readership. Single-handedly responsible for making Stephenson a household name (at least among sci-fi enthusiasts), Snow Crash is one of the most talked-about books of the modern age. There's not much to be said about this book that hasn't already been said, but I may as well sum up the highlights.
For starters, it seems clear to me that Stephenson richly deserves his reputation as the father of second-generation cyberpunk. He is to William Gibson what Half-Life was to Doom: The manifestation of a culture that was once primitive, strange, and mindless, but has become mature and thought-provoking. Stephenson's writing mirrors Gibson's in terms of minute detail (Stephenson has been called "the hacker Hemingway"), but Snow Crash is made so much more enjoyable by the fact that everything is clearly explained. In fact, things are explained almost to a fault; whereas Gibson never really bothered to explain what the Matrix was, let alone what it was like to be there, Snow Crash often stops reading like a novel and starts reading like an instruction manual or product specification. The earliest example of this comes when the book's protagonist (amusingly named Hiro Protagonist; yes, really) is in the Metaverse (Stephenson's vision of the Matrix). The book actually describes the mechanics of the computer Hiro is using to connect to the Metaverse, and waxes almost philosophical on the programming algorithms used to represent a virtual-reality world in real time. It's plain to see that while William Gibson was mostly about flash with little substance, Neal Stephenson's writing is meaty and clear.
This is not an attempt at putting down Mr. Gibson or his work, but it must be understood that different audiences will have different tastes. Science-fiction lovers who don't know much about technology and don't care, but still want to read about a cool futuristic world, love Neuromancer and its ilk. But while Gibson freely admits that he doesn't know (and doesn't care) much about computers, Neal Stephenson is a real-life techie, and this shows so clearly in his writing. His fascination with gadgets, his love of clever hardware/software hacks, these are things that make a person a geek, and Snow Crash is a novel by a geek, for geeks.
This alone would make Stephenson a standout writer, but he's got all the other qualities that make a good writer too. He knows his way around a plot; Snow Crash has scarcely a boring moment, as things never seem to stop happening, even as the next gizmo in the book is being detailed. And everything Stephenson writes is imbued with one of the rarest, most precious gifts a writer can carry: The ability to be consistently, intelligently, and genuinely funny. This is a guy who can be simultaneously technical, philosophical, and hilarious, stimulating both the scientific and the artistic sides of your brain while making you laugh at the same time. Add it all up, and it's not hard to understand why Neal Stephenson is one of the foremost authors of today, and why Snow Crash has been so well-received by virtually everyone who reads it.
It should be added that the purported strengths of Stephenson's writing can and do sometimes work against him. If you read this review and thought that reading descriptions of how make-believe technology works sounds like it would interrupt the flow of the writing... Well, you're right; in some cases Stephenson does get so bogged down in description that the story starts to feel disjointed and dull. It feels weird when an author starts writing about something interesting, then needs to stop and explain what it is. This could explain part of William Gibson's appeal; assuming you can understand what's going on, the reading is much easier and the story remains uninterrupted. There are portions of Snow Crash which explain things in so much detail that they become almost laborious to read, especially the parts where Hiro is learning about ancient cultures and their languages and beliefs. I found myself just wishing that the impromptu lessons in history could be over so the storyline could progress. Having said that, I still prefer Stephenson's much more intelligent writing style; just understand that both authors' styles have their strengths and weaknesses, and some people genuinely prefer one over the other.
One other caveat about Snow Crash: This is Stephenson's "fun" book. In other words, there isn't a deep moral to be taken away from this book, as most of it is simply a story. Zodiac was about environmentalism, The Diamond Age had some interesting observations about culture, and Cryptonomicon, being set largely in World War II, had a lot of insights about a lot of things (particularly war). If you're looking for a book from which you can take something away with you, this might not be it, and you should try one of Stephenson's other books instead.
The worst thing about a book like Snow Crash is that it sets a precedent. After its soaring success, anyone could be fully forgiven for assuming that the book was a fluke and that Stephenson would never again write a successful novel. A book like that, which balances almost everything perfectly -- action, adventure, intrigue, and tech -- can hardly be expected to be duplicated any time soon by the same author. It just doesn't happen. Sure, people get lucky and pull off a good story sometimes, but even the greatest authors can't do it consistently.
But in this case, it happened. After dithering for a time and writing The Diamond Age, which was good, but not quite as famous as Snow Crash, Neal Stephenson produced the absolutely stunning Cryptonomicon. This book is just about everything that a book-loving techie could want in a novel: It's huge, it's intelligent, and it's sprawling in its scope. The plot follows three separate characters: The first is Robert Shaftoe, a hard-driven World War II Marine who writes poetry in his head while simultaneously killing hordes of the enemy. Next is Lawrence Pritchard Waterhouse, a well-meaning but out-of-it idiot savant with brilliant science skills (in this case, he's a math guru) but little in the way of social skills or common sense, cast as an information expert during the same war. Last is Randy Waterhouse (yes, the name relation is revealed later), a modern-day nerd who's trying to get his latest business venture off the ground, with a long string of high-potential but utterly-failed high-tech businesses behind him. These three characters are narrated separately at first, although their individual plot threads slowly begin to come together. This, in itself, would make the book an ambitious project, made more difficult by the fact that two of the characters live in the 1940s and one exists more than 50 years later. But Cryptonomicon is also sweeping in scope, taking place in many geographical locations, dipping into many cultures (Japanese figuring prominently among these, or "Nipponese" as Stephenson insists on putting it), and offering interesting commentary on just about everything in its course.
There are two sides to every coin, however; Although Cryptonomicon seems to have everything going for it at first glance, there are those who believe that it is simply too big and long, that it is excessive verbosity for verbosity's own sake. This is probably true, but the question then becomes whether this is good or bad. There are certainly many people who like a big long book that will take a fair amount of reading to get through, and those are the people who will read and enjoy Cryptonomicon. The people who will dislike this book are those who prefer to-the-point writing. It must be admitted that there is actually a remarkable lack of plot development in this book: Like the airplane engines in Lawrence Waterhouse's reflection on life, the book turns around and around, and makes a lot of noise, but not much else seems to happen. If you're not going to enjoy reading 900 pages just because you love to read, it's probable that you're going to be disappointed with Cryptonomicon, and wish that you could have back the time that you wasted reading it. These exact same criticisms can be (and already have been) leveled at The Baroque Cycle, Stephenson's three-volume follow-up to Cryptonomicon. Is Stephenson getting carried away with himself, writing way, way too much just because he can? Or does more equal better in this case? You must make that decision for yourself.
As you can imagine, when anything becomes popular, hordes of people try to cash in on it. Apparently recognizing the market for the techno-thriller novel, several people have stepped up and submitted their own contributions to the genre. The techno-thriller novel is a funny genre, however, because there are different kinds of technology that can thrill. Tom Clancy has been famous for years as a master of the genre, but his books are more about military hardware clashing than computer hardware; Thick with politics and international relations, his books are geared for a less recent generation. The generation that's coming of age on the Internet revolution, however, is more interested in the intrigue that takes place in cyberspace than the intrigue of real-world, physical warfare. The modern techno-thriller is a book like Dan Brown's Digital Fortress.
This "thriller" about a new computer encryption code received little attention when it was first written by Brown in 1998. More recently, however, Brown has gotten a lot of publicity as some of his books--most notably The Da Vinci Code and Angels & Demons--suddenly became bestsellers. As Brown became the hot author of the moment, his previous work on Digital Fortress received renewed attention. This book may make you think of Cryptonomicon at first, owing to its focus on computer encryption and how encryption affects matters of national security. Indeed, it might seem to be "Cryptonomicon-lite", a somewhat less mathematical novel for the masses who'd rather read a story than a manual on codes.
However, it quickly becomes apparent that this is not even the little cousin of Stephenson's book. While Cryptonomicon is an unusually long and carefully-written book by a real-world science, math, and crypto nut who just happens to be better known as an author, Digital Fortress is a book by a guy who's an author first and foremost, and as such, this book at first seems more palatable and less like a university textbook in its style. However, the writing in Digital Fortress, while pretty good in terms of describing characters, events, and places, becomes almost painful to read in many portions of dialogue. Near the beginning of the book, there's a passage in which the female lead, who's taken it as a given all her life that no code is unbreakable, is suddenly informed (and needs to be repeatedly assured) that an unbreakable code has been found. A few chapters are wasted on her carrying on a line of questioning to the effect of "Somebody made an unbreakable code?! Oh wow! Are you sure? But that's impossible! I can't believe it! *gasp* Do you really mean that this is a code that can't be broken?! Remember, no code is unbreakable! It really is? Amazing!"
Silly (or downright embarrassing) dialogue aside, it's worth noting that the "rotating cleartext" encryption method that the book attributes to its new, unbreakable code is exactly what it sounds like: Fiction. In fact, if you search for "rotating cleartext" on Google, the ONLY hits you'll get are ones relating to this book and complaining that there's no such thing outside the world of Digital Fortress. This book also seems to be desperately guilty of the ludicrous techno-paranoia that permeates novels like this, using half-truths and sometimes outright fictions to try to convince the reader that privacy and security in real-world computer networks are totally nonexistent. I get tired of people who sincerely believe this.
Okay, so this book is an easy target for criticism. There's a lot wrong with it, but if you overlook some of its quirks and appreciate it for what it is, it's a reasonably entertaining novel about a world we don't normally see: That of high-stakes government code-breaking operations and the implications of a code the Feds can't crack. The book has its flaws, but it's actually fairly well-written most of the time, and if you can accept that this is a fictionalization, you might enjoy it. However, it's clear that Dan Brown is more of a people's writer than a mathematician's writer; if you want something more technical to sink your teeth into, keep in mind that Neal Stephenson is an author who really does come from a math and science background. For everyone else who wants a shorter, more accessible book, this might fit the bill.
This book, Richard K. Morgan's first novel, has gotten a lot of attention recently, putting Morgan on the scene as the hot new cyberpunk writer. The main plot premise isn't startlingly original, but it's more than good enough to carry the book: In the 2500s, technology has been developed to store a person's mind in a "cortical stack", a small storage device implanted at the back of the neck. The idea is that if someone dies, their stack still contains their memories, so all that needs to be done is transplant the stack into a new body (physical bodies are called "sleeves" in this book), and the person will wake up again, albeit in a different body. (The process of transferring a stack to a new body is called "resleeving".) This essentially creates immortality (unless someone's stack is destroyed), although it does mean that whenever someone dies and is resleeved, they have to get used to living in a different body.
As you can probably imagine, the implications of a technology like this are enormous, and Morgan does a good job of exploring them, examining ideas about immortality, how much of a person's identity is tied to their physical body, and other sociological and psychological factors. Morgan is a decent writer, even though overall, the book feels like it could have been written by a somewhat more coherent, less posturing William Gibson: Drug use, swearing, and high-tech data crime abound in this book as they do in Gibson's work, but Morgan doesn't use the vague, half-formed conversations and descriptions that Gibson tends to go for, instead fleshing out characters and events more fully so that the reader can get a clearer idea of just what's going on. Morgan is something of a cross between the realistic, articulate writing of Neal Stephenson and the gritty, bleak style of William Gibson. I'm not sure I'm prepared to call him a "third-generation cyberpunk" writer, though.
Morgan did the logical thing with a good, popular book like this and wrote a sequel: Broken Angels. The movie rights to Altered Carbon have also been optioned.
All things considered, I'm surprised how good this book is. However, given the book's first impressions, that's not saying very much.
It's immediately apparent that this is another silly novel about computer culture, from the age when computers were still popular and cool. This in itself is a warning flag, since publishing, like many other fields of endeavor, briefly tried to get in on the dot-com bandwagon while it could. A further examination of the book's cover reveals more to be skeptical about: This is the author's first novel (not necessarily a bad thing, but still often a sign of an undeveloped writing style), and in fact, it wasn't even originally intended to be a novel: Most of the book began its life as a series of short stories on salon.com. But the real clincher comes in the author's bio on the back cover; the last sentence reads: "He lives in Providence."
Providence? As in, Rhode Island? Yes, that Providence. Okay, that does it. Here's a guy who lives on the East Coast, and he's trying to write a novel about Silicon Valley. Yeah, okay. If that doesn't spell lack of credibility, it's hard to say what does.
I read it anyway. Just because I'm that much of a sucker for books about computers or computer culture.
Well, they say you shouldn't judge a book by its cover. In this case, they're partly right. It turns out that Thomas Scoville actually used to live and work in Silicon Valley for several years; he just moved to Providence later, when everything fell apart in California (can't say I blame him for that). And yes, he was a programmer, among other things, so he's not a William Gibson: He actually knows something about computers.
Scoville is more than just a coder, however; he also appears to fancy himself something of a genuine culture and art dilettante. The book is filled with deference--reverence, even--for the artistic side of humanity, the aspects of life so frequently neglected or outright forgotten whenever there's work to be done and money to be made (as in the dot-com boom, for instance). The result is a novel that seems cliche and predictable, yet is something of a chimera. Chapters swing wildly back and forth, from scenes of programmers in hardcore geek mode writing shell scripts to caterers talking about painting and music. These worlds come together in an unlikely romance between a programmer and an art lover, yet the book never quite stops feeling divided. Perhaps this is deliberate; the book is clearly making some sort of commentary on how far removed the world of code and silicon is from the once-cherished things that make people human.
Silicon Follies borrows significantly from some relatively obscure avenues of real-world media, to the point where it sometimes becomes difficult to tell what's been invented wholesale and what's borrowed from reality. For example, at one point in the book, Scoville mentions a movie described as "an edgy but highly problematic cinematic opus involving nonlinear math, stock market prediction, the Cabala, computing, and do-it-yourself brain surgery". Anyone who hasn't seen this movie will probably dismiss this list as random elements that popped into Scoville's imagination, but anyone who has will instantly recognize that Scoville is referring to Darren Aronofsky's first movie, Pi, an enduring favorite of the art-house crowd. Similarly, the book contains a character known for epic, fiery performance art involving giant robots; this character is clearly a fictionalized version of Survival Research Labs' founder Mark Pauline, right down to the missing digits on his right hand. This blending of the real with the fictional further blurs the scope and intent of the book.
So is this a book worth reading? Probably, if you're the kind of person who will appreciate the basic message that Scoville is trying to send. That message is, clearly, "Computers are fun and useful, but they're not all there is to life". Some people might feel like the message is a tad heavy-handed at times, and the book is fluffy overall, but anyone who likes these kinds of mostly-light-hearted books about computer culture will probably find this an enjoyable enough read.
A great many books have been written by now about the information revolution and how computers and the Internet are changing people's lives. What makes NOTLS different? Simply that it's the only book I've seen that truly focuses on the lives of the people who drive this revolution. There are plenty of books out there on Microsoft, IBM, and Intel, along with the leaders of these companies, but very few books take a peek into the lives of the little guys, the ones who do the actual work in this information revolution. But NOTLS does. And the result, of course, is alternately hilarious and shocking.
This is what it's really like to live in Silicon Valley. This isn't about the multi-billionaires and the business decisions they make, it's about the programmers and the other workers who live in one of the strangest business cultures in America. This is where people routinely sleep at work instead of going home at the end of the day, where personal hygiene and business suits take a backseat to productivity, and where money comes and goes swiftly, but never seems to stay around for very long. When a new generation of bright young people have the opportunity to make more money in their 20s than most people have previously had a chance to make all their lives, but these same young people actually view their work as the most fun thing in their lives, what happens? You get Silicon Valley. And you'll read all about it in this book.
After reading the book, I have only one small quibble with it. Although it's certainly entertaining and interesting, it seems a little too optimistic and rosy. It talks mostly about the successful people. Even the first chapter, which contains several case studies of young people who're out of money and desperately seeking any way to support themselves, ends on a positive note for everyone, as everybody mentioned eventually gets lucky and lands an opportunity. That's great for them, but the hard truth is that for every successful person in Silicon Valley, there are plenty who will lose all their money and end up without anywhere to go. NOTLS is a great book, but after reading it, don't be tempted to run off to Santa Clara or Mountain View, believing that you'll eventually get rich if you work for 18 hours a day.
After being impressed with the insight and wit that filled NOTLS, I decided to scroll back in time a bit and read the two novels that began Po Bronson's career as a writer. Bronson's first book was Bombardiers, which was certainly well-received by critics. It's a look at the cutthroat world of investment traders, specifically an office of traders in San Francisco's famous financial district. Bronson used to work for First Boston, the investment banking firm, so he is well qualified to offer a satirical insider's view of the investment culture, and he does a good job, creating a cast of believable characters and portraying their everyday lives.
But I was a bit surprised by how haphazard the writing in Bombardiers is. The book proceeds in a rather meandering way, as if it were written by someone in either a haze or a hurry. It is not a bad book, but I didn't like the atmosphere it created; rather than being a constantly intelligent observation of people (as Bronson's other books are), Bombardiers is a rushed jumble from start to finish. This may be intentional; it may well have been written this way to capture the air of a real investment office, where people are usually given more work than time to go through it all, and life ends up taking a backseat to business. Or it may just be that this is Bronson's first novel, and his writing style was not quite polished yet. Regardless, although I like this book a bit, I liked his next book, The First $20 Million Is Always The Hardest, much better. (It's worth noting, though, that Bombardiers comes to a fairly conclusive and satisfying--even uplifting--ending, while the ending to The First $20 Million... is abrupt and leaves some plot threads untied. The last third of Bombardiers is better than the ending of Bronson's second book.)
When this novel was released, Po Bronson and his readers must surely have realized that he had finally arrived. After the success of his first novel, Bombardiers, came this fictional story of a Silicon Valley research group that doesn't read like fiction: The story is so plausible that it sounds like a real Silicon Valley story, and the characters are so well-drawn that they feel like real people. This is the book that Bombardiers should have been. It constantly examines the psychology behind the characters and their motives for what they do, the plot is much more structured and meaningful, and the writing is clearer and more readable. If you're a real-life worker in the Silicon Valley culture, you'll instantly recognize the types of people that populate this book and empathize with their thoughts and feelings; if you're not a computer person, you'll probably still identify with the people here, because they're realistic people with realistic behaviour living in a crazy world. The plot, about a small group of techies who set to work on a research project that seems doomed from the start and faces setbacks at every turn, is both depressing and uplifting. The group is determined, and you can almost feel the drive these people have to succeed against all odds. Definitely a book that established Bronson's status as a talented writer, and did a fine job of setting the stage for his next book, Nudist On The Late Shift, a non-fictional account of the same culture.
The First $20 Million may be a bit of a surprise, however, because it's probably not what you'd expect from the way the book has been marketed. Especially after Bombardiers, a day-to-day look at the fast-paced real world of investing, you might expect this book to be similarly concerned with hot-shot young kids who drive BMWs or Porsches and have three meals a day at Starbucks. In actuality, however, this book is a bit of a departure; most of it takes place in a relatively quiet, non-profit research organization, and focuses on people who lead relatively normal working lives. If you're expecting a thriller about the big-bucks world of Silicon Valley, you might be surprised and perhaps a little disappointed; but this is a book that captures a side of the Valley that's perhaps more "real", since it focuses on the people and motives that drive Silicon Valley rather than simply the wealth and glamour. As such, it's a book that succeeds at being interesting and insightful from beginning to end.
A lot of people love this book. It's pure pop-culture fun, about a bunch of (fictional) techies who work for Microsoft and are annoyed at the lack of meaning in their lives, but lack the resolve or direction to do anything about it. The book is easy to read (in the sense that a book for young children is easy to read; in other words, easy to understand, difficult to tolerate) and funny, but I had a hard time enjoying it because the writing is so agonizingly weird that it's almost painful to look at. It's clear this book wanted to be a clever fictionalization of computer techs, and it succeeded on some level, but it falls short of being great prose. For a (in my opinion) much more intelligent and readable book that tries to do the same things, read Po Bronson's The First $20 Million Is Always The Hardest.
First published in 1944, Stick And Rudder remains to this day one of the most popular books for those getting started in aviation. It's amazing to think that this book, written so long ago (in the middle of World War II!), is still recommended by some flight instructors today. But although SAR is certainly a somewhat dated book, most of what's written in it remains fundamentally true. The majority of the book consists of simple observations made by the author rather than scientific fact, and indeed, SAR is not a complete student pilot manual by any means. The book is very easy to read and understand, and not at all bogged down with the charts, diagrams, and formulas that tend to populate student manuals, but this is because it does not aim to teach you everything you need to know about flying. It rather assumes that you are already a pilot (or at least have a basic knowledge of flying) and gives you some timely advice on how to stay alive in the air. This is a book that was truly ahead of its time. Many of the things in this book should have been taught to students when the book was written, but weren't. Today most people probably understand these things better, and that may well be because of the book's popularity. Some of the most strongly emphasized points in the book (for example, that you use the throttle to control your altitude and the elevators to control your speed, NOT the other way around) should be taken with a heavy dose of common sense, and they may be somewhat controversial to some people, but there is a real truth to what Langewiesche writes here, although it's hard to say how much of it was mostly true in the 1940s and is less applicable today. Still, Stick And Rudder is an interesting, well-written book that is worth a look from any aviation enthusiast, whether a pilot or just a student.
This is quite possibly the best automotive tech textbook in existence. It's huge (well over a thousand pages), giving detailed coverage of all major systems in a modern automobile and explaining them clearly with the help of lots of diagrams. Although other definitive texts on this subject also exist, notably William H. Crouse's Automotive Mechanics, 10th Edition (McGraw-Hill, ISBN 0-0280-0943-6) and Stockel and Johanson's Auto Fundamentals (Goodheart-Wilcox, ISBN 1-5663-7577-0), Erjavec's book is not only about twice the length of both these other offerings, it also seems to have done the best job of staying up-to-date with the latest changes in auto technology; the other two books are comparatively behind the times.
Note that the third edition of Erjavec's book was the last one to include chapters on carburetors; carbs were removed beginning with the fourth edition. While this is understandable, carbs will never leave classic cars, so it's useful to keep around a copy of the third edition as well.
Suppose you wake up one day (not necessarily in the literal sense of awakening from nocturnal slumber, but rather in the metaphorical sense of mentally realizing something which was previously unclear) and realize that you've spent much of your life focusing on a particular thing, and have missed the sort of broad-based education that the literati often like to boast of having. Maybe you regret not being familiar with a certain period of history, or having not read a classic book. Or perhaps you just wish that you knew more in general about the key items of the humanities which liberal-arts majors are more apt to be familiar with.
If you're such a person and you'd like to get up to speed on the most important points of knowledge in the humanities, books like these exist specifically for this purpose. They attempt the daunting--but viable--task of listing all the key items that literate, cultured people are expected to know. Reading these books alone won't make you cultured; they are, fundamentally, lists, and much of what they list is other books for you to read. You can read a one-page introduction on Plato or Shakespeare, but that's no substitute for reading some of the things they actually wrote. These books do not contain everything you need to know about the humanities; they simply function as road maps.
Of the three books listed here, the New York Times guide is the largest and most complete, as befits a work by the world's most respected newspaper. An Incomplete Education is less to-the-point and also less comprehensive, but worth considering as an alternate perspective. The Intellectual Devotional is one of those books with 365 single-page entries, the kind from which you're supposed to read one page a day for a year, the same way some people are careful to read a chapter from the Bible every day. While I usually avoid such books as gimmicky and incomplete, I like this particular book's selection of topics; it really does seem to pick the most important subjects and present them in a clear, concise way, so it succeeds as a more manageable alternative for people who aren't as absolutely devoted to gaining cultural exposure.
Whichever book you use (or even if you obtain and read through all of them), the danger of books like this is that they can lead you to think you've learned all you need to know about culture. Not so; such a goal is fundamentally unattainable by any human being. Even after you've made your way through a book like this and read all the literature it references, there's still a world full of important books to read. Consider books like these starting points, good introductions to the humanities; where your exploration of culture leads you from here is up to you.