Top 10 Tech Stories of the Year

Looking back, 2010 wasn’t marked by technological innovation so much as technological maturation. This wasn’t the year that social networking started, but it was the year that your grandmother joined Facebook. This wasn’t the year that Google got huge, but it was the year Google started wrestling with the problems market dominance entails. And it wasn’t the year touch-screen mobile devices hit the market, but it was the year that they started to change how we use computers.

But enough of what the year wasn’t! Here’s the TechNewsDaily countdown of the year that was, and the ten stories that made it so.

Andre Geim and Konstantin Novoselov win the Nobel Prize for graphene research.

News outlets rarely cover materials chemistry, and this story highlighted why that’s such a big mistake.

The world we live in is enabled by breakthroughs such as this. Polyethylene, the world’s most common plastic, is found in countless modern consumer products, but its original industrial synthesis in 1933 didn’t make the news. Graphene, with its diamond-like strength, amazing conductive ability and strange quantum-mechanical properties, may prove as transformative in the 21st century as plastics like polyethylene did in the 20th.

Graphene research began in earnest in 2004, when Geim and Novoselov showed that single layers of the material could be isolated with a simple, inexpensive technique. Now that the Nobel Prize has recognized their work, people will look back at 2010 as the year graphene research began getting the attention it deserves.

More people use Droids than iPhones.

Even if more people used RIM’s BlackBerry, the iPhone changed the game so profoundly that Apple owned the intellectual high ground in the mobile operating system competition. Then came Google.

By replicating Apple’s touch interface and adding features such as Adobe Flash support and availability on a far wider range of handsets, the Android mobile OS overtook Apple’s iOS in new smartphone sales worldwide.

Google and Apple are competing on a number of fronts, but dethroning Apple’s flagship product helped swing the momentum back toward Google.

Social coupon sites get us 25 percent off everything.

The idea is so simple, it’s shocking it didn’t happen earlier. This year, supported by very little technological innovation, a number of coupon companies tapped the collective buying power of the Internet masses to secure deals on everything from manicures to baseball tickets to restaurant meals.

It couldn’t have come at a better time. As the recovery from the Great Recession proved shallower than hoped, shoppers clamored for discounts wherever they could find them. Initially, the coupons focused on local businesses, targeting people where they live and work. Then, in August, Groupon offered a countrywide coupon for the clothing retailer Gap.

The Gap deal crashed the Groupon system, announcing with a bang that the coupon site was a nationwide phenomenon that’s here to stay.

Entertainment enters the third dimension. Again.

Didn’t we try this already? Didn’t 3-D fizzle in the 1950s? Clearly, someone forgot to tell that to James Cameron and his horde of blue cat people. At the beginning of 2010, the megahit “Avatar” and a slew of advanced 3-D televisions unveiled at the Consumer Electronics Show changed how we watched our media and played our games.

Sure, the glasses still look stupid, but they’re working on that. Eyewear notwithstanding, in 2010 one would be hard-pressed to find a major action movie that wasn’t adapted for 3-D viewing. Then, this summer, the World Cup was broadcast in 3-D.

When a technology has been deployed in the highest-grossing movie of all time and the world’s most widely watched sporting event, it has arrived.

China hogs all the rare earth minerals.

Ever heard of the mineral xenotime? What about aeschynite? No? Well, the U.S. Army and Intel have, because the hardware in everything from iPads to missile guidance systems needs those minerals to work. And the biggest exporter of those minerals is China.

With almost every digital device requiring a rare earth mineral of some kind, China is to rare earth minerals as Saudi Arabia is to oil. We need them, and they’ve got them. And this imbalance has members of the U.S. security apparatus worried.

Luckily, China’s monopoly is only temporary. America also has huge reserves of rare earth minerals; it will just take a few years to ramp up production to the level China already maintains.

Google pays the cost to be the boss.

In 2010, Google learned that heavy is the head that wears the crown. After becoming the world’s most-used search engine, most-visited website and, arguably, most innovative company, Google spent the last year fighting cyberattacks from China, antitrust scrutiny from the European Union and encroachment from competitors like Facebook.

The year began with the Chinese infiltration of Gmail, followed by a standoff between the company and Beijing over the censorship of search results. Eventually, Google rerouted its Chinese search service to Hong Kong rather than continue censoring. Then, in the spring, Facebook took aim at Google’s dominance of search with its Open Graph system, which surfaces websites based on users’ recommendations. By November, the headaches had piled up further when the European Union, a body notorious for handing out billion-dollar fines, began investigating possible antitrust violations.

Eric Schmidt may hope 2011 treats his company better, but he should probably just get used to it. After all, when you’re the biggest player on the Internet, you’re also the biggest target.

WikiLeaks on America’s leg.

On his own, U.S. Army Private First Class Bradley Manning was simply a disgruntled soldier with access to sensitive data. Combined with Julian Assange and his organization WikiLeaks, Manning became the source that set the Internet on fire.

From his Bond-villain-like lair in Sweden, Assange released a steady stream of classified military and diplomatic documents throughout 2010. Many of the documents only confirmed what traditional news outlets and even the government itself had reported. However, revelations about the scale of Chinese hacker attacks, the U.S. relationship with drug-dealing Afghan officials and helicopter attacks on civilians turned the leaks into a divisive issue.

As a result of the WikiLeaks revelations, the Obama administration barred federal employees from reading the documents, Air Force bases blocked access to The New York Times website for fear that officers would read about the leaks, and the future of online journalism became that much murkier.

Social networking pokes the world.

It’s not just for college kids anymore. It’s not just a way to waste time at work. In 2010, social networking in general, and Facebook in particular, ceased being a diversion for the young and became a communications tool on par with the telephone. Mix in two critically acclaimed movies and a Time “Person of the Year” nod for Facebook founder Mark Zuckerberg, and you’ve got the birth of a new normal.

Fifteen years after “The Net” portrayed Sandra Bullock as a crazed shut-in for ordering pizza online, “Catfish” and “The Social Network” showed a world where the computer was the most natural means for interpersonal communication. By mid-year, Facebook had more than 500 million members, and other social networking sites such as Friendster still retained tens of millions of users, primarily in Asia.

But simply displaying which books and movies people like isn’t enough for multibillion-dollar companies. Forays into Internet search, e-mail and mobile software suggest that connectivity may only be the first step in a long evolution of these services.

Your computer starts to disappear.

Before 2010, the tablet market barely existed. Now, after the debut of the iPad, tablets could overtake laptops as the preferred tool for mobile computing. Before 2010, consumers didn’t care about cloud computing. Now, Microsoft touts it in its ads and Amazon offers GPU-powered cloud computing to everybody. Put those two things together, and you have the birth of a new age in computer use.

It’s called ubiquitous computing, and although researchers first predicted the convergence almost 20 years ago, 2010 will be remembered as the year it hit the mainstream. The year even ended with the pilot launch of Google’s Chrome operating system, which embodies the capabilities of ubiquitous computing more fully than any product so far.

This change is as profound as the switch from mainframe computers to PCs. Say goodbye to your computer, because after this year, computing will be a service, not a product.

Stuxnet earns tenure for dozens of think-tank cyberwar Cassandras.

The Internet was invented for war, by a Department of Defense looking for a communications tool that could survive a nuclear attack. For years, cyberwar analysts have predicted that countries would use the Internet as a weapon. And for years, that meant nothing more than information theft, the occasional website defacement and some denial-of-service attacks. Then came Stuxnet, the first cyberweapon that did the same job as a conventional missile.

We still don’t know who designed the innovative piece of malware, but we do know its intended target: Iran’s nuclear program. Stuxnet, engineered to sabotage the industrial controllers running centrifuges at Iran’s uranium enrichment plant, set back Tehran’s nuclear ambitions just as surely as a bomb through the roof. Except in this case, no one was killed, and no one could figure out who sent the virus.

Iran recovered from the Stuxnet attack, but the event left the world of warfare profoundly changed. Like the discovery of a long-predicted particle or the fossil of a missing link, the appearance of Stuxnet validated all the theories that cyberwar prognosticators had spun for decades. Stuxnet proved that ones and zeros could be as damaging as bombs and bullets, and that cyberspace was as active a battlefield as the land, sea or air.

Stuart Fox currently researches and develops physical and digital exhibit experiences at the Liberty Science Center. His news writing has appeared on several Purch sites, including Live Science and Live Science’s Life’s Little Mysteries.