This opinion piece was published in The New York Times on June 15, 2011.
Educating students or the general public about computer science isn't easy. Teaching theory can be interesting and mind-expanding, but it may be no more applicable to most people's lives and careers than high school algebra or calculus. Teaching specific programming languages for more concrete purposes risks having students lose sight of the bigger picture, confining them to rote work without much prospect for intellectual growth.
That bigger picture is what makes mastery of today's technology so special: unlike many other fields of endeavor, anyone with an idea can try it out and garner an audience around it without having to ink a business plan or raise prohibitive amounts of money. Thanks to PCs that run any software they're given, and an Internet that allows anyone to set up shop and start communicating with the world without having to do the equivalent of buying a television broadcast tower and license, we've seen amazing and disruptive ventures from humble beginnings.
Tim Berners-Lee invented the World Wide Web by writing the code for a browser and a Web server and seeing if the world wanted to take it up. No business model, venture capital rounds, or patents were involved. Ward Cunningham invented something called a wiki, where people could collectively edit a document. Jimmy Wales used it to create Wikipedia, a project that was considered foolish at first but has completely reshaped the way people document and share information about the world.
Even the Internet and PC came from unexpected origins. The inventors of the Internet protocol were experimenting; they didn't know what would be done with the network. The Internet didn't and doesn't have a "main menu," but rather, it is raw connectivity waiting for its users to do something with it -- to connect with one another.
The inventors of the first consumer PC -- Steve Wozniak and Steve Jobs -- unveiled the Apple II in 1977 with a blinking cursor instead of bundled software; other technologists were de facto invited to design applications and then share or sell them to one another. (Two years later, to Apple's surprise, the personal computer became a business when Dan Bricklin and Bob Frankston invented VisiCalc, the first digital spreadsheet, and companies around the world suddenly craved PCs.)
What's notable about these and other game-changing inventions is that most of their creators weren't computer science majors. They were self-taught or learned their craft through apprenticeship to other coders.
Computer science curricula that lack the spirit of exploration and experimentation -- that stick too closely to the textbooks, whether ones of theory or practice -- won't speed the overall pace of innovation. That's why all-night hackathons are good ideas: they encourage people to see that the world can be changed, and that a small but determined handful of people can do it.
Sputnik, after its launch in 1957, was first tracked not by professionally trained astronomers but by bands of amateurs spread across the nation, heeding a call to build or acquire their own telescopes and to document what they saw. Our challenge is to keep alive Sputnik's rallying cry in a world where coding is becoming more confined. Web servers where new ideas might take root are consolidating under a few corporate hosts. Today's coders are naturally more interested in writing for the Facebook or Apple iOS platforms, where truly disruptive ideas can be banned or diminished by gatekeepers who want to protect their own business models, or who are compelled to carry water for regulatory purposes.
The reason to teach computer science isn't to turn everyone into a coder. It's to share the insight that today, more than ever, the world can be shaped by good and rigorous ideas hatched by "mere" teenagers or other outsiders, and that these ideas transcend technology. There's a social and legal dimension to this as well. That's why computer science should include a look at the policy implications of the code we forge. Moreover, computer science education shouldn't be limited to college. Programs like the Sprouts help make the craft available to everyone, empowering people to affect the world rather than merely marveling at shiny gizmos.
Jonathan Zittrain is a professor at Harvard Law School and a professor of computer science at the university's School of Engineering and Applied Sciences. He is a co-founder of the Berkman Center for Internet and Society.