The Future of Computers - Optical Computers

An optical computer (also called a photonic computer) is a device that performs its computation using photons of visible light or infrared (IR) beams, rather than electrons in an electric current. The computers we use today use transistors and semiconductors to control electricity, but computers of the future may utilize crystals and metamaterials to control light.

An electric current creates heat in computer systems, and as processing speed increases, so does the amount of electricity required; this extra heat is extremely damaging to the hardware. Photons, however, create substantially less heat than electrons on a given size scale, so the development of more powerful processing systems becomes possible. By applying some of the advantages of visible and/or IR networks at the device and component scale, a computer might someday be developed that can perform operations significantly faster than a conventional electronic computer.

Coherent light beams, unlike electric currents in metal conductors, pass through each other without interfering; electrons repel each other, while photons do not. For this reason, signals over copper wires degrade rapidly while fiber optic cables do not have this problem. Several laser beams can be transmitted in such a way that their paths intersect, with little or no interference among them - even when they are confined essentially to two dimensions.

Electro-Optical Hybrid computers

Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical/electronic hybrid. However, optoelectronic devices lose about 30% of their energy converting electrons into photons and back, and this conversion process slows down the transmission of messages.

Pure Optical Computers

All-optical computers eliminate the need for such conversions. These computers would use multiple frequencies to send information through the computer as light wave packets; with no electron-based subsystems, no conversion from electrical to optical signals would be needed, greatly increasing speed.

The Future of Computers - Quantum nanocomputers

A quantum computer uses quantum mechanical phenomena, such as entanglement and superposition, to process data. Quantum computation uses the quantum properties of particles to represent and structure data, and quantum mechanics to design the operations that can be performed on this data.

The quantum mechanical properties of atoms or nuclei allow these particles to work together as quantum bits, or qubits. These qubits together form the computer's processor and memory; they can interact with each other while remaining isolated from the external environment, which enables them to perform certain calculations much faster than conventional computers can. By computing with many different numbers simultaneously and then interfering the results to obtain a single answer, a quantum computer can perform a large number of operations in parallel, making it much more powerful than a digital computer of the same size.
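The bookkeeping behind this parallelism can be made concrete with a toy classical simulation of a quantum register. This is only an illustration, not a quantum computer: simulating n qubits classically takes 2**n numbers, which is exactly the resource a real quantum machine gets "for free". The Hadamard gate used below is standard; the code itself is a from-scratch sketch.

```python
import math

def hadamard(state, target):
    """Apply a Hadamard gate to one qubit of a state vector.

    The list `state` holds 2**n amplitudes for an n-qubit register,
    which is why each extra qubit doubles the classical storage needed
    to describe the machine.
    """
    h = 1 / math.sqrt(2)
    new_state = state[:]
    for i in range(len(state)):
        if not (i >> target) & 1:       # basis states where the target bit is 0
            j = i | (1 << target)       # the partner state where that bit is 1
            a, b = state[i], state[j]
            new_state[i] = h * (a + b)
            new_state[j] = h * (a - b)
    return new_state

n = 3
state = [0.0] * (2 ** n)
state[0] = 1.0                          # start in the |000> basis state
for q in range(n):                      # put every qubit into superposition
    state = hadamard(state, q)

# The register now holds all 2**n basis states at once, each with
# equal probability: the parallelism described above.
probabilities = [abs(a) ** 2 for a in state]
```

After three Hadamard gates, all eight basis states carry probability 1/8; a quantum algorithm would then arrange interference so that wrong answers cancel and a right answer reinforces.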

A quantum nanocomputer would work by storing data in the form of atomic quantum states or spin. Technology of this kind is already under development in the form of single-electron memory (SEM) and quantum dots. By means of quantum mechanics, waves would store the state of each nanoscale component; information would be stored as the spin orientation or state of an atom. With the correct setup, constructive interference would emphasize the wave patterns that held the right answer, while destructive interference would suppress any wrong answers.

The energy state of an electron within an atom, represented by the electron energy level or shell, can theoretically represent one, two, four, eight, or even 16 bits of data. The main problem with this technology is instability. Instantaneous electron energy states are difficult to predict and even more difficult to control. An electron can easily fall to a lower energy state, emitting a photon; conversely, a photon striking an atom can cause one of its electrons to jump to a higher energy state.
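The relationship between distinguishable states and bits is just a base-2 logarithm, which is worth making explicit because it shows how demanding the "16 bits per electron" end of that range is (the function below is a generic sketch, not part of any particular device proposal):

```python
import math

def bits_per_component(distinguishable_states: int) -> float:
    """Bits one component can store, given how many reliably
    distinguishable states (e.g. electron energy levels) it has."""
    return math.log2(distinguishable_states)

# Two distinguishable states store 1 bit; 16 levels store only 4 bits.
# Storing 16 bits in a single component would instead require
# 2**16 = 65,536 reliably distinguishable energy states, a hint at
# why instability is the main obstacle for this approach.
one_bit = bits_per_component(2)
levels_for_16_bits = 2 ** 16
```

The more bits packed into one atom, the more finely its energy levels must be resolved, and the more easily a stray photon ruins the stored value.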

I promise to write a series of articles on quantum issues very shortly...

The Future of Computers - Mechanical nanocomputers

Pioneers such as Eric Drexler proposed as far back as the mid-1980s that nanoscale mechanical computers could be built via molecular manufacturing, through a process of mechanically positioning atoms or molecular building blocks one atom or molecule at a time, a process known as "mechanosynthesis."

Once assembled, the mechanical nanocomputer, other than being greatly scaled down in size, would operate much like a complex, programmable version of the mechanical calculators used during the 1940s to 1970s, preceding the introduction of widely available, inexpensive solid-state electronic calculators.

Drexler's theoretical design used rods sliding in housings, with bumps that would interfere with the motion of other rods. It was a mechanical nanocomputer that would use tiny mobile components, sometimes described as nanogears, to encode information.

Drexler and his collaborators favored designs resembling miniature versions of Charles Babbage's 19th-century analytical engine: mechanical nanocomputers that would calculate using moving molecular-scale rods and rotating molecular-scale wheels, spinning on shafts and bearings.

This resemblance to Babbage's machines is one reason mechanical nanocomputer technology has sparked controversy; some researchers even consider it unworkable. All the problems inherent in Babbage's apparatus, according to the naysayers, are magnified a millionfold in a mechanical nanocomputer. Nevertheless, some futurists are optimistic about the technology, and have even proposed the evolution of nanorobots that could operate, or be controlled by, mechanical nanocomputers.

The Future of Computers - Chemical and Biochemical nanocomputers

Chemical Nanocomputer

In general terms, a chemical computer is one that processes information by making and breaking chemical bonds, storing logic states or information in the resulting chemical (i.e., molecular) structures. In a chemical nanocomputer, computing is based on chemical reactions (bond breaking and forming): the inputs are encoded in the molecular structure of the reactants, and the outputs can be extracted from the structure of the products. In these computers, the interactions between different chemicals and their structures are used to store and process information.


These computing operations would be performed selectively among molecules taken just a few at a time, in volumes only a few nanometers on a side. To create a chemical nanocomputer, therefore, engineers need to be able to control individual atoms and molecules so that they can be made to perform controllable calculations and data-storage tasks. The development of a true chemical nanocomputer will likely proceed along lines similar to genetic engineering.


Biochemical nanocomputers

Both chemical and biochemical nanocomputers would store and process information in terms of chemical structures and interactions. Proponents of biochemically based computers can point to an "existence proof" for them in the commonplace activities of humans and other animals with multicellular nervous systems. Biochemical nanocomputers, in other words, already exist in nature; they are manifest in all living things. But these systems are largely uncontrollable by humans, which makes the artificial fabrication or implementation of this category of "natural" biochemically based computers seem far off, because the mechanisms of animal brains and nervous systems are still poorly understood. We cannot, for example, program a tree to calculate the digits of pi, or program an antibody to fight a particular disease (although medical science has come close to this ideal in the formulation of vaccines, antibiotics, and antiviral medications).

DNA nanocomputer

In 1994, Leonard Adleman took a giant step towards a different kind of chemical or artificial biochemical computer when he used fragments of DNA to compute the solution to a complex graph theory problem, the directed Hamiltonian path problem.


Using the tools of biochemistry, Adleman was able to extract the correct answer to the graph theory problem from the many random paths represented by the product DNA strands. Like a computer with many processors, this type of DNA computer is able to consider many solutions to a problem simultaneously. Moreover, the DNA strands employed in such a calculation (approximately 10^17) are many orders of magnitude greater in number, and more densely packed, than the processors in today's most massively parallel electronic supercomputer. As a result of Adleman's work, the chemical nanocomputer is the only one of the aforementioned four types to have been demonstrated in an actual calculation.
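Stripped of the chemistry, Adleman's trick amounts to generating a huge number of candidate paths at random and then filtering out everything that is not a valid path through the graph. A brute-force sketch of that generate-and-filter idea (the graph below is invented for illustration; Adleman's actual instance had seven cities):

```python
from itertools import permutations

# A small directed graph, loosely in the spirit of Adleman's
# seven-city experiment; these particular edges are made up.
edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (2, 4), (3, 4),
         (3, 5), (4, 5), (4, 6), (5, 6)}
n, start, end = 7, 0, 6

def hamiltonian_paths():
    """Enumerate every ordering of the vertices (the analogue of the
    pot of random DNA strands) and keep only the orderings whose
    consecutive vertex pairs are all real edges."""
    for middle in permutations(set(range(n)) - {start, end}):
        path = (start,) + middle + (end,)
        if all((a, b) in edges for a, b in zip(path, path[1:])):
            yield path

solutions = list(hamiltonian_paths())
```

The DNA version wins not by being clever but by massive parallelism: every candidate "path" strand forms simultaneously in the test tube, while this loop checks them one at a time.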

These computers use DNA to store information and perform complex calculations. DNA has a vast storage capacity, which is what enables it to hold the complex blueprints of living organisms. In principle, a single gram of DNA can hold as much information as roughly a trillion compact discs.
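A back-of-envelope calculation shows where figures like this come from. The molar mass and the 2-bits-per-base encoding below are rough textbook assumptions (real DNA storage schemes achieve much less), so treat the result as an order-of-magnitude estimate:

```python
AVOGADRO = 6.022e23      # base pairs per mole
BP_MOLAR_MASS = 650.0    # approximate grams per mole of DNA base pairs
BITS_PER_BP = 2          # one of four bases per position encodes 2 bits
CD_BYTES = 700e6         # nominal capacity of one compact disc

base_pairs_per_gram = AVOGADRO / BP_MOLAR_MASS          # ~9.3e20 bp
bytes_per_gram = base_pairs_per_gram * BITS_PER_BP / 8  # ~2.3e20 bytes
cd_equivalents = bytes_per_gram / CD_BYTES              # hundreds of billions of CDs
```

The estimate lands within a factor of a few of the trillion-CD figure quoted above, which is as close as such comparisons get.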

The Future of Computers - Electronic nanocomputers

Thanks to fifty years of experience with electronic computing devices, including the extensive research and industrial infrastructure built up since the late 1940s, advances in nanocomputing technology are likely to come from this direction. Electronic nanocomputers therefore appear to present the easiest and most likely path along which to continue nanocomputer development in the near future.
Electronic nanocomputers would operate in a manner similar to the way present-day microcomputers work. The main difference is one of physical scale. More and more transistors are squeezed into silicon chips with each passing year; witness the evolution of integrated circuits (ICs) capable of ever-increasing storage capacity and processing power.

The ultimate limit to the number of transistors per unit volume is imposed by the atomic structure of matter. Most engineers agree that technology has not yet come close to pushing this limit. In the electronic sense, the term nanocomputer is relative; by 1970s standards, today's ordinary microprocessors might be called nanodevices.

How it works

The power and speed of computers have grown rapidly because of rapid progress in solid-state electronics dating back to the invention of the transistor in 1947. Most important, there has been an exponential increase in the density of transistors on integrated-circuit computer chips over the past 40 years. In that time span, though, there has been no fundamental change in the operating principles of the transistor.
Even microelectronic transistors no more than a few microns (millionths of a meter) in size are bulk-effect devices. They still operate using small electric fields imposed by tiny charged metal plates to control the mass action of many millions of electrons.

Although electronic nanocomputers will not use the traditional concept of the transistor for their components, they will still operate by storing information in the positions of electrons.

At the current rate of miniaturization, conventional transistor technology will reach a minimum size limit in a few years. At that point, small-scale quantum mechanical effects, such as the tunneling of electrons through barriers made from matter or electric fields, will begin to dominate the effects that permit a mass-action semiconductor device to operate. Still, an electronic nanocomputer will continue to represent information in the storage and movement of electrons.
Nowadays, most electronic nanocomputers are built as microscopic circuits using nanolithography.


Nanolithography is a term used to describe the branch of nanotechnology concerned with the study and application of a number of techniques for creating nanometer-scale structures, meaning patterns with at least one lateral dimension between the size of an individual atom and approximately 100 nm.

A nanometer is a billionth of a meter, much smaller than the width of a single human hair. The word lithography is used because the method of pattern generation is essentially the same as writing, only on a much smaller scale. Nanolithography is used during the fabrication of leading-edge semiconductor integrated circuits (nanocircuitry) or nanoelectromechanical systems (NEMS).

One common method of nanolithography, used particularly in the creation of microchips, is photolithography. This is a parallel method, in which the entire surface is exposed in a single step. Photolithography is limited in the feature size it can reach, however, because if the wavelength of light used is made too small, the lens simply absorbs the light in its entirety. This means that photolithography cannot reach the super-fine sizes of some alternative technologies.

A technology that allows for smaller sizes than photolithography is electron-beam lithography. Using an electron beam to draw a pattern nanometer by nanometer, incredibly small sizes (on the order of 20 nm) may be achieved. Electron-beam lithography is much more expensive and time-consuming than photolithography, however, making it a difficult sell for industrial applications of nanolithography. Since electron-beam lithography functions more like a dot-matrix printer than a flash photograph, a job that would take five minutes using photolithography can take upwards of five hours with electron-beam lithography.
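The parallel-versus-serial trade-off can be captured in a couple of lines. All numbers below are invented purely to match the five-minutes-versus-five-hours comparison above; real exposure times depend on the tool, resist, and pattern:

```python
# Illustrative numbers only, chosen to reproduce the text's comparison.
FEATURES = 1_000_000            # pattern elements on one chip layer
FLASH_SECONDS = 300.0           # photolithography exposes everything at once
SECONDS_PER_FEATURE = 0.018     # an e-beam writer draws features serially

photolith_seconds = FLASH_SECONDS                 # 5 minutes, regardless of FEATURES
ebeam_seconds = FEATURES * SECONDS_PER_FEATURE    # 18,000 s = 5 hours
speedup = ebeam_seconds / photolith_seconds       # parallel wins by ~60x
```

The key point is that the flash time is constant while the e-beam time grows with the number of features, so the gap widens as patterns get denser.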

New nanolithography technologies are constantly being researched and developed, leading to smaller and smaller possible sizes. Extreme ultraviolet lithography, for example, is capable of using light at wavelengths of 13.5 nm. While hurdles still exist in this new field, it promises the possibility of sizes far below those produced by current industry standards. Other nanolithography techniques include dip-pen nanolithography, in which a small tip is used to deposit molecules on a surface. Dip-pen nanolithography can achieve very small sizes, but cannot currently go below 40 nm.

The Future of Computers - Nanocomputers

Scientific discussion of the development and fabrication of nanometer-scale devices began in 1959 with an influential lecture by the late, renowned physicist Richard Feynman. Feynman observed that it is possible, in principle, to build and operate submicroscopic machinery. He proposed that large numbers of completely identical devices might be assembled by manipulating atoms one at a time.

Feynman's proposal sparked an initial flurry of interest but it did not broadly capture the imagination of the technical community or the public. At the time, building structures one atom at a time seemed out of reach. Throughout the 1960s and 1970s, advances in diverse fields prepared the scientific community for the first crude manipulations of nanometer-scale structures. The most obvious development was the continual miniaturization of digital electronic circuits, based primarily upon the invention of the transistor by Shockley, Brattain, and Bardeen in 1947 and the invention of the integrated circuit by Kilby, Noyce, and others in the late 1950s. In 1959, it was only possible to put one transistor on an integrated circuit. Twenty years later, circuits with a few thousand transistors were commonplace.

Scientists are trying to use nanotechnology to make very tiny chips, electrical conductors and logic gates. Using nanotechnology, chips can be built up one atom at a time, so no space is wasted, enabling much smaller devices to be built. Using this technology, logic gates will be composed of just a few atoms, electrical conductors (called nanowires) will be merely an atom thick, and a data bit will be represented by the presence or absence of an electron.

A nanocomputer is a computer whose physical dimensions are microscopic; smaller than the microcomputer, which is smaller than the minicomputer. (The minicomputer is called "mini" because it was a lot smaller than the original (mainframe) computers.)

A component of nanotechnology, nanocomputing will, as suggested or proposed by researchers and futurists, give rise to four types of nanocomputers: electronic, chemical and biochemical, mechanical, and quantum.

Keep reading the next posts…


What is nanotechnology?

In simple terms, nanotechnology can be defined as 'engineering at a very small scale', a term that can be applied to many areas of research and development – from medicine to manufacturing to computing, and even to textiles and cosmetics. Nanotechnology (sometimes shortened to "nanotech") is a multidisciplinary science that studies how we can manipulate matter at the molecular and atomic scale; in other words, it is the engineering of tiny machines, the projected ability to build things from the bottom up, using techniques and tools being developed today to make complete, highly advanced products.


Take a random selection of scientists, engineers, investors and the general public, ask them what nanotechnology is, and you will receive a range of replies as broad as nanotechnology itself. For many scientists, it is nothing startlingly new; after all, we have been working at the nanoscale for decades, through electron microscopy, scanning probe microscopies or simply growing and analyzing thin films. For most other groups, however, nanotechnology means something far more ambitious: miniature submarines in the bloodstream, little cogs and gears made out of atoms, space elevators made of nanotubes, and the colonization of space.


Nanorobot designers sometimes look to microscopic organisms for propulsion inspiration, such as the flagellum of an E. coli cell.
Nanotechnology is very diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, from developing new materials with dimensions on the nanoscale to investigating whether we can directly control matter on the atomic scale. It can be difficult to imagine exactly how this greater understanding of the world of atoms and molecules has and will affect the everyday objects we see around us, but some of the areas where nanotechnologies are set to make a difference are described below.

Nanotechnology, in one sense, is the natural continuation of the miniaturization revolution we have witnessed over recent decades, in which millionth-of-a-meter (10⁻⁶ m) tolerances (microengineering) became commonplace, for example in the automotive and aerospace industries, enabling the construction of higher-quality and safer vehicles and planes.

It was the computer industry that kept pushing the limits of miniaturization, and many electronic devices we see today have nano-scale features that owe their origins to the computer industry – such as cameras, CD and DVD players, car airbag pressure sensors and inkjet printers.

Nanotechnology is fundamentally a materials science that has the following characteristics:
  • Research and development at molecular or atomic levels, with lengths ranging between about 1 to 100 nanometers;
  • Creation and use of systems, devices, and structures that have special functions or properties because of their small size;
  • Ability to control or manipulate matter on a molecular or atomic scale.

Undoubtedly, nanotechnology is going to be the future, as research moves from materials with nanoscale dimensions toward materials structured at the atomic scale. New methods such as molecular self-assembly have been developed to make this possible. There may come a future in which all the common basic needs, such as food, shelter and even costly diamonds, will be made by nanorobots.


What is the nanoscale?

The word "nanotechnology" derives from the Greek word for "dwarf". A nanometer is one billionth of a meter, which is tiny: only the length of ten hydrogen atoms laid side by side. Although a meter is defined in the International System of Units as "the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second", and a nanometer is by definition 10⁻⁹ meters, this does not help scientists to communicate the nanoscale to non-scientists.

Although scientists have manipulated matter at the nanoscale for centuries, calling it physics or chemistry, it was not until a new generation of microscopes was invented in the 1980s that the world of atoms and molecules could be visualized and manipulated. It is human nature to relate sizes to everyday objects, and the most common way of conveying the nanoscale is by comparison with the width of a human hair.

Examples of the nanoscale

In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale.

Rather than asking anyone to imagine a millionth or a billionth of something, which few people can do with ease, relating nanotechnology to atoms often makes the nanometer easier to imagine. The comparison shows just how small a nanometer is: smaller than the wavelength of visible light, and about a hundred-thousandth the width of a human hair. But it is still big when compared to the atomic scale.

Let us compare different units with respect to meters.
  • 1 centimeter – 100th of a meter
  • 1 millimeter – 1,000th of a meter
  • 1 micrometer – 1,000,000th of a meter
  • 1 nanometer –  1,000,000,000th of a meter

As small as a nanometer is, it is still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom's nucleus is much smaller - about 0.00001 nm.
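These units are simply successive powers of ten, so the conversions are easy to make concrete in code. The atom diameter below is the rough textbook figure used in the paragraph above:

```python
units_in_meters = {
    "centimeter": 1e-2,
    "millimeter": 1e-3,
    "micrometer": 1e-6,
    "nanometer":  1e-9,
}

ATOM_DIAMETER_M = 1e-10    # a typical atom is about 0.1 nm across

# Roughly ten atoms fit side by side across one nanometer.
atoms_per_nanometer = units_in_meters["nanometer"] / ATOM_DIAMETER_M
```

Each step down the table is a factor of ten or a thousand, which is why "a billionth of a meter" is so hard to picture without such anchors.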


Generally, nanotechnology deals with structures sized between 1 and 100 nanometers in at least one dimension, and involves developing materials or devices within that size range. Quantum mechanical effects are very important at this scale, which is in the quantum realm. Larger than that is the microscale; smaller is the atomic scale.

Nanotechnology Now

The Future of Computers - Overview

Computers' Evolution

The history of computers and computer technology thus far has been a long and fascinating one, stretching back more than half a century to the first primitive computing machines. These machines were huge and complicated affairs, consisting of row upon row of vacuum tubes and wires, often filling several rooms.

In the past twenty years, there has been a dramatic increase in the processing speed of computers, network capacity and the speed of the internet. These advances have paved the way for revolutions in fields such as quantum physics, artificial intelligence and nanotechnology. They will have a profound effect on the way we live and work; the virtual reality we see in movies like The Matrix may actually come true in the next decade or so.
Today's computers operate using transistors, wires and electricity. Future computers might use atoms, fibers and light. Take a moment to consider what the world might be like if computers the size of molecules become a reality. These are the types of computers that could be everywhere, but never seen: nano-sized bio-computers that could target specific areas inside your body; giant networks of computers in your clothing, your house, your car. Entrenched in almost every aspect of our lives, and yet you may never give them a single thought.

As anyone who has looked at the world of computers lately can attest, the size of computers has been reduced sharply, even as the power of these machines has increased at an exponential rate. In fact, the cost of computers has come down so much that many households now own not only one, but two, three or even more, PCs.
As the world of computers and computer technology continues to evolve and change, many people, from science fiction writers and futurists to computer workers and ordinary users have wondered what the future holds for the computer and related technologies. Many things have been pictured, from robots in the form of household servants to computers so small they can fit in a pocket. Indeed, some of these predicted inventions have already come to pass, with the introduction of PDAs and robotic vacuum cleaners.

Understanding the theories behind these future computer technologies is not for the meek. My research into quantum computers was made all the more difficult after I learned that in light of her constant interference, it is theoretically possible my mother-in-law could be in two places at once.

Nanotechnology is another important part of the future of computers, expected to have a profound impact on people around the globe. Nanotechnology is the process whereby matter is manipulated at the atomic level, providing the ability to “build” objects from their most basic parts. Like robotics and artificial intelligence, nanotechnology is already in use in many places, providing everything from stain resistant clothing to better suntan lotion. These advances in nanotechnology are likely to continue in the future, making this one of the most powerful aspects of future computing.

In the future, the number of tiny but powerful computers you encounter every day will number in the thousands, perhaps millions. You won't see them, but they will be all around you. Your personal interface to this powerful network of computers could come from a single computing device that is worn on or in the body.

Quantum computers are also likely to transform the computing experience, for both business and home users. These powerful machines are already on the drawing board, and they are likely to be introduced in the near future. The quantum computer is expected to be a giant leap forward in computing technology, with exciting implications for everything from scientific research to stock market predictions.

Moore's law

Visit any site on the web writing about the future of computers and you will most likely find mention of Moore's Law. Moore's Law is not a strictly adhered-to mathematical formula, but a prediction made by Intel co-founder Gordon Moore in 1965, in a paper where he noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and predicted that the trend would continue "for at least ten years".

Moore predicted that computing technology would increase in capability at the same time as it decreased in cost, describing a long-term trend in the history of computing hardware. More specifically, he predicted that innovation in technology would allow the number of transistors that can be placed inexpensively on an integrated circuit to double approximately every two years. The trend has continued for more than half a century and is not expected to stop until 2015 or later.
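The doubling rule is easy to turn into a formula. The 1971 starting point below is the commonly quoted figure for the Intel 4004; the projection is a sketch of the trend, not measured data:

```python
def transistor_count(base_count, base_year, year, doubling_years=2.0):
    """Projected transistor count under Moore's law: the count doubles
    every `doubling_years` years starting from a known data point."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Starting from roughly 2,300 transistors on the Intel 4004 in 1971,
# doubling every two years projects a few hundred million transistors
# by the mid-2000s, which matches chips of that era reasonably well.
projection_2006 = transistor_count(2300, 1971, 2006)
```

The exponent is what makes the law so dramatic: 35 years at a two-year doubling period is about 17.5 doublings, a factor of nearly 200,000.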

A computer transistor acts like a small electronic switch. Just like the light switch on your wall, a transistor has only two states, On or Off. A computer interprets this on/off state as a 1 or a 0. Put a whole bunch of these transistors together and you have a computer chip. The central processing unit (CPU) inside your computer probably has around 500 million transistors.

Shrinking transistor size not only makes chips smaller, but faster. One benefit of packing transistors closer together is that the electronic pulses take less time to travel between transistors. This can increase the overall speed of the chip. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras. All of these are improving at (roughly) exponential rates as well.

Not everyone agrees that Moore's Law has been accurate throughout the years, (the prediction has changed since its original version), or that it will hold true in the future. But does it really matter? The pace at which computers are doubling their smarts is happening fast enough for me.

Thanks to the innovation and drive of Gordon Moore and others like him, computers will continue to get smaller, faster and more affordable.

Who Are Anonymous?

Anonymous' organization

Anonymous isn’t truly an organization; not in any traditional sense. They are a large, decentralized group of individuals who share common interests and web haunts. There are no official members, guidelines, leaders, representatives or unifying principles. Rather, Anonymous is a word that identifies the millions of people, groups, and individuals on and off of the internet who, without disclosing their identities, express diverse opinions on many topics.

Being "Anonymous" is much more a quality or a self-definition than a membership. Each project under the Anonymous banner may have a whole different set of instigators. Leadership, when it exists, is informal and carried out in chat channels, forums, IM and public calls to action online. No one's meeting in a board room.

The name Anonymous itself is inspired by the perceived anonymity under which users post images and comments on the Internet. Usage of the term Anonymous in the sense of a shared identity began on imageboards.

Origin of Anonymous

They are loosely affiliated with 4chan and other smaller "chan" boards (like 7chan, 2chan and 711chan) due to these sites' anonymous posting feature, which allows them to plan attacks without revealing any identifying information.

A tag of Anonymous is assigned to visitors who leave comments without identifying the originator of the posted content. Users of imageboards sometimes jokingly acted as if Anonymous were a real person. As the popularity of imageboards increased, the idea of Anonymous as a collective of unnamed individuals became an internet meme.
They coordinate raids through forums like 4chan.org and ICQ chat rooms, among other venues. 4chan.org is a major hub, but I wouldn't call it Anonymous' home. Anonops.net was as close to an HQ as they had, but it has been shut down.

Some claim Anonymous to be the first internet-based superconsciousness. Anonymous is a group, in the sense that a flock of birds is a group. How do you know they're a group? Because they're travelling in the same direction. At any given moment, more birds could join, leave, peel off in another direction entirely.
Anonymous doesn't like to be called a 'group', and much less do its members like to be called 'hackers'. They are, to use the manifesto's excitable, sci-fi-tinged terminology, an "Online Living Consciousness".

Anonymous hierarchy

They have been described as a leaderless, anarchic group of "hacktivists" but inside Anonymous, some have found that the organization is more hierarchical – with a hidden cabal of around a dozen highly skilled hackers coordinating attacks across the web.

One member said the group's "command and control" centers are invite-only, adding: "It's to protect people, but if you have proven trustworthy you get invited – it's not hard to do. It's not some elitist structure but a way to keep the press and the odd bit of law enforcement seeing who issues commands."

"Our project has no leader structure, only different roles. The degree of leadership and organization in the various projects varies a lot," one long-term insider explained. "It's all very chaotic, but we communicate and co-operate with each other. I see us as different cells of the same organism."

The leaders of the group use internet relay chat (IRC) technology, which can allow groups of people to communicate clandestinely. Some in the upper echelons are understood to have control over "botnets" comprising more than 1,000 Windows PCs that have been infected with a virus and can be controlled without the user's knowledge to direct "distributed denial of service" (DDoS) attacks against target organizations.

History of Anonymous Hacktivism

We are Anonymous. We are legion. We do not forgive. We do not forget. Expect us.

Long before they became vigilantes in the Wikileaks cyberwars, Anonymous was conducting large-scale “raids” against their enemies.

The Habbo Hotel raids

Probably the first time 4chan users banded together under the moniker Anonymous was to harass the users of Habbo Hotel, a cartoonish social network designed as a virtual hotel.
As early as 2006, Anonymous would "raid" Habbo, blocking its usual users from moving around. The first major raid is known as the "Great Habbo Raid of '06," and a subsequent raid the following year as the "Great Habbo Raid of '07." There appears to be some connection to news of an Alabama amusement park banning a two-year-old toddler with AIDS from the park's swimming pool.

Habbo Hotel

Users signed up to the Habbo site with avatars of a black man in a grey suit and an Afro hairstyle, blocked entry to the pool, declaring that it was "closed due to AIDS," flooded the site with internet sayings, and formed swastika-like formations. Then, when all their black cartoon avatars got banned, they'd call Habbo racist. This was all done "for the lulz," or just for fun.
At this point, Anonymous’ actions had not taken on a political bent. Some members of Anon would argue it was better that way.

Hal Turner

Anonymous targeted Hal Turner, a white supremacist with a talk radio show, in December 2006. According to Turner, in December 2006 and January 2007 individuals who identified themselves as Anonymous overloaded his website, costing him thousands of dollars in bandwidth bills.
Some Anons seem to have a distaste for actual racism, though they frequently invoke it in jest. But of course you can never tell if it's the same people. "Anon is legion," they like to say.
As a result, Turner sued 4chan, eBaum's World, 7chan, and other websites for copyright infringement. He lost his plea for an injunction, however, and failed to receive letters from the court, which caused the lawsuit to lapse.

Chris Forcand

Anonymous helped catch an internet child predator, Chris Forcand, by reporting information to the police in 2007. By this time, Anonymous began to see themselves as a group of internet vigilantes fighting for assorted noble causes, rather than a band of merry pranksters.
On December 7, 2007, the Canada-based Toronto Sun newspaper published a report stating that Forcand was already being tracked by "cyber-vigilantes who seek to out anyone who presents with a sexual interest in children" before police investigations commenced. Forcand, 53, was charged with two counts of luring a child under the age of 14, attempting to invite sexual touching, attempted exposure, possessing a dangerous weapon, and carrying a concealed weapon.
Anonymous contacted the police after some members were "propositioned" by Forcand with "disgusting photos of himself." The report also stated that this was the first time a suspected Internet predator had been arrested as a result of Internet vigilantism.

The Church of Scientology

With Project Chanology, its protest campaign against the Church of Scientology, Anonymous exploded into popular culture and gained worldwide press.
On January 14, 2008, a video produced by the Church featuring an interview with Tom Cruise was leaked to the Internet and uploaded to YouTube. When the Church tried to get this embarrassing footage taken down from YouTube, Anonymous formed a splinter group called Project Chanology that dedicated itself to stopping censorship and harassing the exploitative church.
Calling the action by the Church of Scientology a form of Internet censorship, members of Project Chanology organized a series of denial-of-service attacks against Scientology websites, prank calls, and black faxes to Scientology centers.

There were also several organized protests in many cities worldwide against Scientology. They took to the streets, protesting outside of Scientologist churches wearing "V-masks," the disguise used by Alan Moore's vigilante comic book hero, V, originally inspired by would-be British terrorist and folk hero Guy Fawkes.

Epilepsy Foundation

Anonymous has also trolled the Epilepsy Foundation of America, uploading seizure-inducing content to its forums.
On March 28, 2008, the epilepsy support forum run by the Epilepsy Foundation of America was attacked: JavaScript code and flashing computer animations were posted with the intention of triggering migraine headaches and seizures in photosensitive and pattern-sensitive epileptics.

SOHH and AllHipHop

In late June 2008, users who identified themselves as Anonymous claimed responsibility for a series of attacks against the SOHH (Support Online Hip Hop) website. The attack was reported to have begun in retaliation for insults made by members of SOHH's "Just Bugging Out" forum against 4chan's users.

The attack against the website took place in stages, as Anonymous users flooded the SOHH forums, which were then shut down. On June 23, 2008, the group which identified themselves as Anonymous organized DDoS attacks against the website, successfully eliminating over 60% of the website's service capacity. On June 27, 2008, the hackers utilized cross-site scripting to alter the website's main page with satirical images and headlines referencing numerous racial stereotypes and slurs, and also successfully stole information from SOHH employees.

No Cussing Club

Anonymous also went after the No Cussing Club, for obvious reasons. In January 2009, members targeted its founder, California teen McKay Hatch, whose website campaigns against profanity.

After Hatch's home address, phone number, and other personal information were leaked online, his family received hate mail, obscene phone calls, and even bogus pizza and pornography deliveries.

YouTube porn day

On May 20, 2009, members of Anonymous uploaded numerous pornographic videos onto YouTube. Many of these were disguised as children's or family-friendly videos, with tags such as "jonas brothers." YouTube has since removed the videos.

The BBC contacted one of the uploaders who stated that it was a "4chan raid" organized due to the removal of music videos from YouTube. BBC News reported that one victim posted a comment saying: "I'm 12 years old and what is this?" which went on to become an internet meme.

Iranian Election protests

Following allegations of vote rigging after the results of the June 2009 Iranian presidential election were announced, declaring Iran's incumbent President Mahmoud Ahmadinejad as the winner, thousands of Iranians participated in demonstrations.

Persian Bay

Anonymous teamed up with Pirate Bay and various Iranian hackers to offer Iranian dissidents a way to plan demonstrations and connect with the outside world. The result was Anonymous Iran, an Iranian Green Party Support site and a successful information-freedom project.

The site has drawn over 22,000 supporters worldwide and allows for information exchange between the world and Iran, despite attempts by the Iranian government to censor news about the riots on the internet. The site offers Iranian activists tools and advice on how to remain anonymous and avoid detection. Forums are provided for coordinating activities and communicating with the west.

Operation Didgeridie

In September 2009 the group reawakened "in order to protect civil rights" after several governments began to block access to its imageboards. The tipping point was the Australian government's plans for ISP-level censorship of the internet.

Early in the evening of September 9, Anonymous took down the prime minister's website with a distributed denial-of-service attack. The site was taken offline for approximately one hour.

Operation Titstorm

On the morning of February 10, 2010, Anonymous launched a better-prepared attack, hilariously titled "Operation Titstorm."

It defaced the prime minister's website, took down the Australian Parliament House website for three days, and nearly managed to take down the Department of Communications' website. The attack was a protest against the Australian government's forthcoming internet filtering legislation, which would also censor pornography depicting female ejaculation and small-breasted women (on the grounds that they could be perceived as under age).

Other entities they’ve tangled with include AT&T, Gene Simmons, KnowYourMeme, Hot Topic, Jessi Slaughter, and Tumblr, but I would consider these only skirmishes.

And all of this without mentioning the recent Operation Payback, and the strikes against Tunisia, Zimbabwe and Egypt.

Who knows where the next target might be...