
Signcryption standard increases cyber security

Signcryption, a technology created by University of North Carolina at Charlotte professor Yuliang Zheng, protects confidentiality and authenticity seamlessly and simultaneously, and is particularly useful in cloud computing for ensuring confidential, authenticated transmissions. The International Organization for Standardization (ISO) has published a standard for this public-key technology, which combines signing and encrypting a message in a single step.

The first signcryption scheme was introduced by Professor Yuliang Zheng in 1997, and he continued his research into this revolutionary new technology at the College of Computing and Informatics. After a nearly three-year process, the International Organization for Standardization (ISO) has formally recognized his research efforts as an international standard.

News of the ISO adoption comes amid daily reports of cyber attacks and cyber crime around the world. Zheng says the application will also enhance the security and privacy of cloud computing. "The adoption of signcryption as an international standard is significant in several ways," he said. "It will now be the standard worldwide for protecting confidentiality and authenticity during transmissions of digital information."

"This will also allow smaller devices, such as smartphones and PDAs, 3G and 4G mobile communications, as well as emerging technologies, such as radio frequency identifiers (RFID) and wireless sensor networks, to perform high-level security functions," Zheng said. "And, by performing these two functions simultaneously, we can save resources, be it an individual's time or be it energy, as it will take less time to perform the task."

Many other signcryption schemes have been proposed over the years, each with its own problems and limitations while offering different levels of security services and computational cost.
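For readers who want a feel for what signcryption collapses into one operation, here is a minimal sketch (in Python, using the third-party cryptography package) of the conventional two-step sign-then-encrypt composition that a signcryption scheme replaces. The key types, message and omission of key exchange are illustrative assumptions, not Zheng's actual construction.

```python
# Sketch of the classic two-step "sign, then encrypt" composition that
# signcryption replaces with a single combined operation.
# Assumes the third-party 'cryptography' package; keys and message are
# illustrative, and delivery of the symmetric key to the recipient is omitted.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()   # sender's long-term signing key
symmetric_key = Fernet.generate_key()        # fresh key for this message

message = b"confidential, authenticated payload"

signature = signing_key.sign(message)                            # pass 1: authenticity
ciphertext = Fernet(symmetric_key).encrypt(message + signature)  # pass 2: confidentiality

# A signcryption scheme achieves both goals in one pass, at a combined cost
# lower than doing the signature and the encryption separately.
```

That lower combined computational and bandwidth cost is exactly the saving Zheng describes above for small devices.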

35 Facebook phishing websites


Security Web-Center found 35 Facebook phishing websites. These spammers create fake pages that look like the Facebook login page. If you enter your email and password on one of these pages, the spammer records your information and keeps it. The fake sites, like the one below, use a similar URL to Facebook.com in an attempt to steal people's login information.


The people behind these websites then use the information to access victims' accounts and send messages to their friends, further propagating the illegitimate sites. In some instances, the phishers make money by exploiting the personal information they've obtained. Check out the list:

New botnets arrive


The recent breakup of the DNSChanger botnet, which infected more than 4 million computers and was under the control of a single ring of criminals, raised a new set of concerns. The biggest effect of the commoditization of botnet tools and other computer security exploits might be a new wave of major botnet attacks, driven by people who simply buy their malware from the equivalent of an app store—or who rent it as a service.

The botnet market is nothing new—it's been evolving for years. But what is new is the business model of botnet developers, which has matured to the point where it begins to resemble other, legitimate software markets. One example of this change is a Facebook and Twitter CAPTCHA bypass bot called JET, which is openly for sale online.

JET Facebook Wall Poster
The JET Facebook posting bot from jetbots.com

Tor Project turns to Amazon


The Tor Project offers a channel for people wanting to route their online communications anonymously; it has been used by activists to avoid censorship as well as by those seeking anonymity for more nefarious reasons. Now the people involved in this project to maintain a secret layer of the internet have turned to Amazon to add bandwidth to the service. According to some experts, the use of Amazon's cloud service will make the network harder for governments to track.

Amazon's cloud service - dubbed EC2 (Elastic Compute Cloud) - offers virtual computer capacity. The Tor developers are calling on people to sign up to the service in order to run a bridge - a vital point of the secret network through which communications are routed. According to Tor developers, by setting up a bridge you donate bandwidth to the Tor network and help improve the safety and speed at which users can access the internet.


US military ready for cyber warfare


The US military is now legally in the clear to launch offensive operations in cyberspace, the commander of the US Strategic Command said Wednesday, less than a month after terming this a work in progress.

In the latest sign of quickening U.S. military preparations for possible cyber warfare, Air Force General Robert Kehler said: "I do not believe that we need new explicit authorities to conduct offensive operations of any kind." "I do not think there is any issue about authority to conduct operations," he added, referring to the legal framework.

Cyber-Warfare

But he said the military was still working its way through rules of engagement for cyber warfare beyond "areas of hostilities," or battle zones, for which rules have already been approved.
The US Strategic Command is in charge of a number of areas for the US military, including space operations (like military satellites), cyberspace concerns, 'strategic deterrence' and combating WMDs. The U.S. Cyber Command, a sub-command, began operating in May 2010 as military doctrine, legal authorities and rules of engagement were still being worked out for what the military calls the newest potential battle "domain."

"When warranted, we will respond to hostile acts in cyberspace as we would to any other threat to our country," the DoD said in the report. "All states possess an inherent right to self-defense, and we reserve the right to use all necessary means – diplomatic, informational, military, and economic – to defend our nation, our allies, our partners, and our interests."
The Office of the National Counterintelligence Executive, a U.S. intelligence arm, said in a report to Congress last month that China and Russia are using cyber espionage to steal U.S. trade and technology secrets and that they will remain "aggressive" in these efforts.
It defined cyberspace as including the Internet, telecommunications networks, computer systems and embedded processors and controllers in "critical industries."

cyber-warrior cartoon

The Pentagon, in the report to Congress made public Tuesday, said it was seeking to deter aggression in cyberspace by building stronger defenses and by finding ways to make attackers pay a price.

Windows 8 first malware bootkit

An independent programmer and security analyst, Peter Kleissner, is planning to release the world's first Windows 8 bootkit in India, at the International Malware Conference (MalCon).



 
A bootkit is built upon the following broad parts:
  • Infector
  • Bootkit
  • Drivers
  • Plugins (the payload)

A bootkit is a rootkit that is able to load from a master boot record and persist in memory all the way through the transition to protected mode and the startup of the OS. It is a boot virus that can hook and patch Windows to get loaded into the Windows kernel, thus gaining unrestricted access to the entire computer. It is even able to bypass full volume encryption, because the master boot record (where Stoned is stored) is not encrypted. The master boot record contains the decryption software, which asks for a password and decrypts the drive. This is the weak point - the master boot record - which is used to take over the whole system.

Stuxnet 3.0 released at MalCon?

Security researchers were shocked to see in a Twitter update from MalCon that one of the shortlisted research paper submissions is on possible features of Stuxnet 3.0. While this may just be a discussion and not a release, it is interesting to note that the speaker presenting the paper, Nima Bagheri, is from Iran.

The research paper abstract discusses rootkit features, and the authors will likely demonstrate new research at MalCon related to hiding rootkits and advanced Stuxnet-like malware.


Stuxnet is a computer worm discovered in June 2010. It targets Siemens industrial software and equipment running Microsoft Windows. While it is not the first time that hackers have targeted industrial systems, it is the first discovered malware that spies on and subverts industrial systems, and the first to include a programmable logic controller (PLC) rootkit.

What is alarming is the recent discovery, on 1 September 2011, by the Laboratory of Cryptography and System Security (CrySyS) of the Budapest University of Technology and Economics of a new worm thought to be related to Stuxnet. After analyzing the malware, they named it Duqu. Symantec, based on this report, continued the analysis of the threat, calling it "nearly identical to Stuxnet, but with a completely different purpose", and published a detailed technical paper. The main component used in Duqu is designed to capture information such as keystrokes and system information, and this data may be used to enable a future Stuxnet-like attack.

FBI's Operation Ghost Click

Resulting from a two-year investigation by the FBI, dubbed "Operation Ghost Click", a gang of internet bandits who stole $14 million after hacking into at least 4 million computers in an online advertising scam have been arrested.

Six Estonian nationals have been arrested and charged with running a sophisticated Internet fraud ring that infected millions of computers worldwide with a virus and enabled the thieves to manipulate the multi-billion-dollar Internet advertising industry. Users of infected machines were unaware that their computers had been compromised—or that the malicious software rendered their machines vulnerable to a host of other viruses.



Computers in more than 100 countries were infected by the “DNSChanger” malware, which redirected searches for Apple’s iTunes store to fake pages pretending to offer Apple software for sale, as well as sending those searching for information on the U.S. Internal Revenue Service to accounting company H&R Block, which allegedly paid those behind the scam a fee for each visitor via a fake internet ad agency. Beginning in 2007, the cyber ring used DNSChanger to infect approximately 4 million computers in more than 100 countries.

Trend Micro, which helped supply information to the FBI on DNSChanger, hailed the law enforcement operation as the "biggest cyber criminal takedown in history." While the rogue DNS servers have been replaced, many machines may still be infected. Head here to learn how to check whether your system is part of the DNSChanger botnet.

The Future of Computers - Artificial Intelligence


What is Artificial Intelligence?


The term "Artificial Intelligence" was coined in 1956 by John McCarthy for the Dartmouth Conference; he defined it as the science and engineering of making intelligent machines.

Nowadays it is a branch of computer science that aims to make computers behave like humans. The field is often defined as the study and design of intelligent agents, where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success.

This new science was founded on the claim that a central property of humans, intelligence—the sapience of Homo Sapiens—can be so precisely described that it can be simulated by a machine. This raises philosophical issues about the nature of the mind and the ethics of creating artificial beings, issues which have been addressed by myth, fiction and philosophy since antiquity.

Artificial Intelligence includes programming computers to make decisions in real-life situations (for example, some "expert systems" help physicians diagnose diseases based on symptoms); programming computers to understand human languages (natural language processing); programming computers to play games such as chess and checkers (game playing); programming computers to hear, see and react to other sensory stimuli (robotics); and designing systems that mimic human intelligence by attempting to reproduce the types of physical connections between neurons in the human brain (neural networks).
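As a toy illustration of the neural-network idea in the last item above, the sketch below trains a single artificial neuron (a perceptron) to reproduce the logical AND function. It is a minimal example in plain Python; the task, weights and learning rate are my own illustrative choices, not a method described in the post.

```python
# Minimal perceptron: one artificial "neuron" learning the logical AND function.
# Real neural networks stack millions of such units and train them with
# gradient-based methods; this is only a toy.

def step(x):
    return 1 if x >= 0 else 0

# Training data: inputs and expected output for logical AND.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):                      # a few passes over the data
    for (x1, x2), target in samples:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out
        w[0] += lr * err * x1            # nudge weights toward the target
        w[1] += lr * err * x2
        b += lr * err

# After training, the neuron reproduces the AND truth table.
print([(x, step(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in samples])
```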



History of Artificial Intelligence


The Greek myth of Pygmalion is the story of a statue brought to life for the love of her sculptor. The Greek god Hephaestus' robot Talos guarded Crete from attackers, running the circumference of the island 3 times a day. The Greek Oracle at Delphi was history's first chatbot and expert system.


In the 3rd century BC, Chinese engineer Mo Ti created mechanical birds, dragons, and warriors. Technology was being used to transform myth into reality.

Much later, the Royal courts of Enlightenment-age Europe were endlessly amused by mechanical ducks and humanoid figures, crafted by clockmakers. It has long been possible to make machines that looked and moved in human-like ways - machines that could spook and awe the audience - but creating a model of the mind was off limits.



However, writers and artists were not bound by the limits of science in exploring extra-human intelligence, and the Jewish myth of the Golem, Mary Shelley's Frankenstein, all the way through to Forbidden Planet's Robbie the Robot and 2001's HAL9000, gave us new - and troubling - versions of the manufactured humanoid.


In the 1600s, engineering and philosophy began a slow merger that continues today. From that union the first mechanical calculator was born, at a time when the world's philosophers were seeking to encode the laws of human thought into complex, logical systems.

The mathematician, Blaise Pascal, created a mechanical calculator in 1642 (to enable gambling predictions). Another mathematician, Gottfried Wilhelm von Leibniz, improved Pascal's machine and made his own contribution to the philosophy of reasoning by proposing a calculus of thought.

Many of the leading thinkers of the 18th and 19th century were convinced that a formal reasoning system, based on a kind of mathematics, could encode all human thought and be used to solve every sort of problem. Thomas Jefferson, for example, was sure that such a system existed, and only needed to be discovered. The idea still has currency - the history of recent artificial intelligence is replete with stories of systems that seek to "axiomatize" logic inside computers.

From 1800 on, the philosophy of reason picked up speed. George Boole proposed a system of "laws of thought," Boolean logic, which uses "AND", "OR" and "NOT" to establish how ideas and objects relate to each other. Nowadays most Internet search engines use Boolean logic in their searches.
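To make the search-engine remark concrete, here is a small illustrative Python sketch of AND/OR/NOT filtering over a handful of made-up document titles (the documents and the query are assumptions for the example, not how any particular search engine is implemented).

```python
# Tiny illustration of Boolean (AND / OR / NOT) filtering, the kind of logic
# search engines apply to decide which documents match a query.
documents = [
    "history of artificial intelligence",
    "boolean logic in search engines",
    "mechanical calculators of the 1600s",
]

def matches(doc, must=(), either=(), must_not=()):
    words = set(doc.split())
    return (all(w in words for w in must)                          # AND: every required word
            and (not either or any(w in words for w in either))    # OR: at least one of these
            and not any(w in words for w in must_not))              # NOT: none of these

# Query: logic AND (search OR engines) NOT mechanical
print([d for d in documents
       if matches(d, must=["logic"], either=["search", "engines"],
                  must_not=["mechanical"])])
```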

Recent history and the future of Artificial Intelligence


In the early part of the 20th century, multidisciplinary interests began to converge and engineers began to view brain synapses as mechanistic constructs. A new word, cybernetics, i.e., the study of communication and control in biological and mechanical systems, became part of our colloquial language. Claude Shannon pioneered a theory of information, explaining how information was created and how it might be encoded and compressed.

Enter the computer. Modern artificial intelligence (albeit not so named until later) was born in the first half of the 20th century, when the electronic computer came into being. The computer's memory was a purely symbolic landscape, and the perfect place to bring together the philosophy and the engineering of the last 2000 years. The pioneer of this synthesis was the British logician and computer scientist Alan Turing.

AI research is highly technical and specialized, deeply divided into subfields that often fail to communicate with each other. Subfields have grown up around particular institutions, the work of individual researchers, the solution of specific problems, longstanding differences of opinion about how AI should be done and the application of widely differing tools. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects.


Natural-language processing would allow ordinary people who don't have any knowledge of programming languages to interact with computers. So what does the future of computer technology look like after these developments? Through nanotechnology, computing devices are becoming progressively smaller and more powerful, and everyday devices with embedded technology and connectivity are becoming a reality.

This has led to the idea of pervasive computing, which aims to integrate software and hardware into all man-made and some natural products. It is predicted that almost any item - clothing, tools, appliances, cars, homes, coffee mugs, even the human body - will be embedded with chips that connect it to an infinite network of other devices.


Hence, in the future, network technologies will be combined with wireless computing, voice recognition, Internet capability and artificial intelligence, with the aim of creating an environment where connectivity is embedded in such a way that it is always available yet never inconvenient or outwardly visible. In this way, computer technology will saturate almost every facet of our life. What seems like virtual reality at the moment will become the human reality in the future of computer technology.

The Future of Computers - Optical Computers


An optical computer (also called a photonic computer) is a device that performs its computation using photons of visible light or infrared (IR) beams, rather than electrons in an electric current. The computers we use today use transistors and semiconductors to control electricity but computers of the future may utilize crystals and metamaterials to control light.


An electric current creates heat in computer systems, and as processing speed increases, so does the amount of electricity required; this extra heat is extremely damaging to the hardware. Photons, however, create substantially less heat than electrons on a given size scale, so the development of more powerful processing systems becomes possible. By applying some of the advantages of visible and/or IR networks at the device and component scale, a computer might someday be developed that can perform operations significantly faster than a conventional electronic computer.

Coherent light beams, unlike electric currents in metal conductors, pass through each other without interfering; electrons repel each other, while photons do not. For this reason, signals over copper wires degrade rapidly while fiber optic cables do not have this problem. Several laser beams can be transmitted in such a way that their paths intersect, with little or no interference among them - even when they are confined essentially to two dimensions.


Electro-Optical Hybrid computers


Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical/electronic hybrid. However, optoelectronic devices lose about 30% of their energy converting electrons into photons and back, and this conversion process slows down the transmission of messages.

Pure Optical Computers



All-optical computers eliminate the need for such switching. These computers would use multiple frequencies to send information throughout the computer as light waves and packets, having no electron-based systems and needing no conversion from electrical to optical, which greatly increases speed.

The Future of Computers - Quantum nanocomputers

A quantum computer uses quantum mechanical phenomena, such as entanglement and superposition, to process data. Quantum computation aims to use the quantum properties of particles to represent and structure data, and to use quantum mechanics to perform operations on this data.



The quantum mechanical properties of atoms or nuclei allow these particles to work together as quantum bits, or qubits. These qubits work together to form the computer's processor and memory; they can interact with each other while being isolated from the external environment, and this enables them to perform certain calculations much faster than conventional computers. By computing many different numbers simultaneously and then interfering the results to get a single answer, a quantum computer can perform a large number of operations in parallel and ends up being much more powerful than a digital computer of the same size.
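One way to get a feel for qubits is to simulate a single one on an ordinary computer. The NumPy sketch below applies a Hadamard gate to put a qubit into an equal superposition and reads off the measurement probabilities; the library choice and the example are mine, purely for illustration.

```python
# Simulating one qubit on a classical machine: the state is a vector of two
# complex amplitudes, and a gate is a matrix acting on that vector.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # definite state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                    # equal superposition of |0> and |1>
probabilities = np.abs(state) ** 2  # Born rule: |amplitude|^2

print(probabilities)   # ~[0.5, 0.5]: measuring gives 0 or 1 with equal chance
```

The catch is that simulating n qubits classically requires tracking 2^n amplitudes, which is exactly the workload a real quantum computer carries natively.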



A quantum nanocomputer would work by storing data in the form of atomic quantum states or spin. Technology of this kind is already under development in the form of single-electron memory (SEM) and quantum dots. By means of quantum mechanics, waves would store the state of each nanoscale component and information would be stored as the spin orientation or state of an atom. With the correct setup, constructive interference would emphasize the wave patterns that held the right answer, while destructive interference would prevent any wrong answers.



The energy state of an electron within an atom, represented by the electron energy level or shell, can theoretically represent one, two, four, eight, or even 16 bits of data. The main problem with this technology is instability. Instantaneous electron energy states are difficult to predict and even more difficult to control. An electron can easily fall to a lower energy state, emitting a photon; conversely, a photon striking an atom can cause one of its electrons to jump to a higher energy state.



I promise to write a series of articles on quantum issues very shortly...

The Future of Computers - Mechanical nanocomputers

Pioneers such as Eric Drexler proposed as far back as the mid-1980s that nanoscale mechanical computers could be built via molecular manufacturing, through a process of mechanically positioning atoms or molecular building blocks one at a time, a process known as "mechanosynthesis."


Once assembled, the mechanical nanocomputer, other than being greatly scaled down in size, would operate much like a complex, programmable version of the mechanical calculators used during the 1940s to 1970s, preceding the introduction of widely available, inexpensive solid-state electronic calculators.

Drexler's theoretical design used rods sliding in housings, with bumps that would interfere with the motion of other rods: a mechanical nanocomputer that uses tiny mobile components called nanogears to encode information.


Drexler and his collaborators favored designs resembling miniature versions of Charles Babbage's 19th-century analytical engine: mechanical nanocomputers that would calculate using moving molecular-scale rods and rotating molecular-scale wheels, spinning on shafts and bearings.


For this reason, mechanical nanocomputer technology has sparked controversy and some researchers even consider it unworkable. All the problems inherent in Babbage's apparatus, according to the naysayers, are magnified a million fold in a mechanical nanocomputer. Nevertheless, some futurists are optimistic about the technology, and have even proposed the evolution of nanorobots that could operate, or be controlled by, mechanical nanocomputers.

The Future of Computers - Chemical and Biochemical nanocomputers


Chemical Nanocomputer


In general terms, a chemical computer is one that processes information by making and breaking chemical bonds, storing logic states or information in the resulting chemical (i.e., molecular) structures. In a chemical nanocomputer, computing is based on chemical reactions (bond breaking and forming): the inputs are encoded in the molecular structure of the reactants and the outputs can be extracted from the structure of the products. In other words, the interaction between different chemicals and their structures is used to store and process information.

nanotubes

These computing operations would be performed selectively among molecules taken just a few at a time, in volumes only a few nanometers on a side. To create a chemical nanocomputer, then, engineers need to be able to control individual atoms and molecules so that they can be made to perform controllable calculations and data storage tasks. The development of a true chemical nanocomputer will likely proceed along lines similar to genetic engineering.


Biochemical nanocomputers


Both chemical and biochemical nanocomputers would store and process information in terms of chemical structures and interactions. Proponents of biochemically based computers can point to an "existence proof" for them in the commonplace activities of humans and other animals with multicellular nervous systems. So biochemical nanocomputers already exist in nature; they are manifest in all living things. But these systems are largely uncontrollable by humans, which makes the artificial fabrication or implementation of this category of "natural" biochemically based computers seem far off, because the mechanisms of animal brains and nervous systems are still poorly understood. We cannot, for example, program a tree to calculate the digits of pi, or program an antibody to fight a particular disease (although medical science has come close to this ideal in the formulation of vaccines, antibiotics, and antiviral medications).

DNA nanocomputer


In 1994, Leonard Adleman took a giant step towards a different kind of chemical or artificial biochemical computer when he used fragments of DNA to compute the solution to a complex graph theory problem.

DNA computer

Using the tools of biochemistry, Adleman was able to extract the correct answer to the graph theory problem out of the many random paths represented by the product DNA strands. Like a computer with many processors, this type of DNA computer is able to consider many solutions to a problem simultaneously. Moreover, the DNA strands employed in such a calculation (approximately 10^17) are many orders of magnitude greater in number and more densely packed than the processors in today's most massively parallel electronic supercomputer. As a result of Adleman's work, the chemical nanocomputer is the only one of the aforementioned four types to have been demonstrated for an actual calculation.
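The graph theory problem Adleman attacked was finding a directed Hamiltonian path, a route that visits every vertex exactly once. The brute-force Python sketch below shows what a conventional machine has to do, trying candidate orderings one at a time (the small example graph is an assumption, not Adleman's original instance), whereas the DNA experiment generated and filtered enormous numbers of candidate paths chemically, in parallel.

```python
# Brute-force search for a directed Hamiltonian path: visit every vertex once.
# A DNA computer explores candidate paths in parallel as molecules; here we
# must enumerate the orderings one by one.
from itertools import permutations

# Illustrative directed graph given as a set of (from, to) edges.
edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3), (3, 0)}
vertices = range(4)

def hamiltonian_paths(vertices, edges):
    for order in permutations(vertices):
        # Keep an ordering only if every consecutive pair is an edge.
        if all((a, b) in edges for a, b in zip(order, order[1:])):
            yield order

print(list(hamiltonian_paths(vertices, edges)))  # includes e.g. (0, 1, 2, 3)
```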


These computers use DNA to store information and perform complex calculations. DNA has a vast storage capacity, which enables it to hold the complex blueprints of living organisms. A single gram of DNA can hold as much information as one trillion compact discs.

The Future of Computers - Electronic nanocomputers


Given our fifty years of experience with electronic computing devices, including the extensive research and industrial infrastructure built up since the late 1940s, advances in nanocomputing technology are likely to come in this direction; electronic nanocomputers appear to present the easiest and most likely path for continuing nanocomputer development in the near future.
Electronic nanocomputers would operate in a manner similar to the way present-day microcomputers work. The main difference is one of physical scale. More and more transistors are squeezed into silicon chips with each passing year; witness the evolution of integrated circuits (ICs) capable of ever-increasing storage capacity and processing power.

The ultimate limit to the number of transistors per unit volume is imposed by the atomic structure of matter. Most engineers agree that technology has not yet come close to pushing this limit. In the electronic sense, the term nanocomputer is relative; by 1970s standards, today's ordinary microprocessors might be called nanodevices.

How it works


The power and speed of computers have grown rapidly because of steady progress in solid-state electronics dating back to the invention of the transistor in 1948. Most important, there has been an exponential increase in the density of transistors on integrated-circuit computer chips over the past 40 years. In that time span, though, there has been no fundamental change in the operating principles of the transistor.
Even microelectronic transistors no more than a few microns (millionths of a meter) in size are bulk-effect devices. They still operate using small electric fields imposed by tiny charged metal plates to control the mass action of many millions of electrons.

Although electronic nanocomputers will not use the traditional concept of the transistor for their components, they will still operate by storing information in the positions of electrons.

At the current rate of miniaturization, conventional transistor technology will reach a minimum size limit in a few years. At that point, small-scale quantum mechanical effects, such as the tunneling of electrons through barriers made from matter or electric fields, will begin to dominate the effects that permit a mass-action semiconductor device to operate. Still, an electronic nanocomputer will continue to represent information in the storage and movement of electrons.
Nowadays, most electronic nanocomputers are created as microscopic circuits using nanolithography.

Nanolithography


Nanolithography is a term used to describe the branch of nanotechnology concerned with the study and application of a number of techniques for creating nanometer-scale structures, meaning patterns with at least one lateral dimension between the size of an individual atom and approximately 100 nm.

A nanometer is a billionth of a meter, much smaller than the width of a single human hair. The word lithography is used because the method of pattern generation is essentially the same as writing, only on a much smaller scale. Nanolithography is used during the fabrication of leading-edge semiconductor integrated circuits (nanocircuitry) or nanoelectromechanical systems (NEMS).


One common method of nanolithography, used particularly in the creation of microchips, is known as photolithography. This technique is a parallel method of nanolithography in which the entire surface is drawn on in a single moment. Photolithography is limited in the size it can reduce to, however, because if the wavelength of light used is made too small the lens simply absorbs the light in its entirety. This means that photolithography cannot reach the super-fine sizes of some alternate technologies.
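The wavelength limit described above is commonly quantified with the Rayleigh-style resolution relation used in lithography, R ≈ k1 × λ / NA. The Python sketch below plugs in ballpark values; the k1 factor and numerical aperture are assumed for illustration, not taken from the post.

```python
# Rough lithography resolution estimate: R = k1 * wavelength / NA.
# The constants below are common textbook ballpark values, assumed for illustration.
def min_feature_nm(wavelength_nm, k1=0.35, numerical_aperture=0.9):
    return k1 * wavelength_nm / numerical_aperture

print(min_feature_nm(193))    # deep-UV light: roughly 75 nm features with these values
print(min_feature_nm(13.5))   # extreme-UV light: roughly 5 nm with the same assumptions
```

Shrinking the wavelength (or raising the numerical aperture) is what pushes the printable feature size down, which is why the extreme ultraviolet systems mentioned below matter.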

A technology that allows for smaller sizes than photolithography is that of electron-beam lithography. Using an electron beam to draw a pattern nanometer by nanometer, incredibly small sizes (on the order of 20nm) may be achieved. Electron-beam lithography is much more expensive and time consuming than photolithography, however, making it a difficult sell for industry applications of nanolithography. Since electron-beam lithography functions more like a dot-matrix printer than a flash-photograph, a job that would take five minutes using photolithography will take upwards of five hours with electron-beam lithography.




New nanolithography technologies are constantly being researched and developed, leading to smaller and smaller possible sizes. Extreme ultraviolet lithography, for example, is capable of using light at wavelengths of 13.5nm. While hurdles still exist in this new field, it promises the possibility of sizes far below those produced by current industry standards. Other nanolithography techniques include dip-pen nanolithography, in which a small tip is used to deposit molecules on a surface. Dip-pen nanolithography can achieve very small sizes, but cannot currently go below 40nm.


The Future of Computers - Nanocomputers

Scientific discussion of the development and fabrication of nanometer-scale devices began in 1959 with an influential lecture by the late, renowned physicist Richard Feynman. Feynman observed that it is possible, in principle, to build and operate submicroscopic machinery. He proposed that large numbers of completely identical devices might be assembled by manipulating atoms one at a time.


Feynman's proposal sparked an initial flurry of interest but it did not broadly capture the imagination of the technical community or the public. At the time, building structures one atom at a time seemed out of reach. Throughout the 1960s and 1970s advances in diverse fields prepared the scientific community for the first crude manipulations of nanometer-scale structures. The most obvious development was the continual miniaturization of digital electronic circuits, based primarily upon the invention of the transistor by Shockley, Brattain, and Bardeen in 1948 and the invention of the integrated circuit by Noyce, Kilby, and others in the late 1950s. In 1959, it was only possible to put one transistor on an integrated circuit. Twenty years later, circuits with a few thousand transistors were commonplace.


Scientists are trying to use nanotechnology to make very tiny chips, electrical conductors and logic gates. Using nanotechnology, chips can be built up one atom at a time and hence there would be no wastage of space, enabling much smaller devices to be built. Using this technology, logic gates will be composed of just a few atoms and electrical conductors (called nanowires) will be merely an atom thick and a data bit will be represented by the presence or absence of an electron.

A nanocomputer is a computer whose physical dimensions are microscopic; smaller than the microcomputer, which is smaller than the minicomputer. (The minicomputer is called "mini" because it was a lot smaller than the original (mainframe) computers.)

A component of nanotechnology, nanocomputing will, as suggested or proposed by researchers and futurists, give rise to four types of nanocomputers:

Keep reading the next posts…

The Future of Computers - Overview


Computers' Evolution


The history of computers and computer technology thus far has been a long and a fascinating one, stretching back more than half a century to the first primitive computing machines. These machines were huge and complicated affairs, consisting of row upon row of vacuum tubes and wires, often encompassing several rooms to fit it all in.
 



In the past twenty years, there has been a dramatic increase in the processing speed of computers, network capacity and the speed of the internet. These advances have paved the way for revolutions in fields such as quantum physics, artificial intelligence and nanotechnology, and they will have a profound effect on the way we live and work; the virtual reality we see in movies like The Matrix may actually come true in the next decade or so.
Today's computers operate using transistors, wires and electricity. Future computers might use atoms, fibers and light. Take a moment to consider what the world might be like if computers the size of molecules become a reality. These are the types of computers that could be everywhere, but never seen: nano-sized bio-computers that could target specific areas inside your body; giant networks of computers in your clothing, your house, your car - entrenched in almost every aspect of our lives, and yet you may never give them a single thought.



As anyone who has looked at the world of computers lately can attest, the size of computers has been reduced sharply, even as the power of these machines has increased at an exponential rate. In fact, the cost of computers has come down so much that many households now own not only one, but two, three or even more, PCs.
As the world of computers and computer technology continues to evolve and change, many people, from science fiction writers and futurists to computer workers and ordinary users have wondered what the future holds for the computer and related technologies. Many things have been pictured, from robots in the form of household servants to computers so small they can fit in a pocket. Indeed, some of these predicted inventions have already come to pass, with the introduction of PDAs and robotic vacuum cleaners.

Understanding the theories behind these future computer technologies is not for the meek. My research into quantum computers was made all the more difficult after I learned that in light of her constant interference, it is theoretically possible my mother-in-law could be in two places at once.

Nanotechnology is another important part of the future of computers, expected to have a profound impact on people around the globe. Nanotechnology is the process whereby matter is manipulated at the atomic level, providing the ability to “build” objects from their most basic parts. Like robotics and artificial intelligence, nanotechnology is already in use in many places, providing everything from stain resistant clothing to better suntan lotion. These advances in nanotechnology are likely to continue in the future, making this one of the most powerful aspects of future computing.

In the future, the number of tiny but powerful computers you encounter every day will number in the thousands, perhaps millions. You won't see them, but they will be all around you. Your personal interface to this powerful network of computers could come from a single computing device that is worn on or in the body.

Quantum computers are also likely to transform the computing experience, for both business and home users. These powerful machines are already on the drawing board, and they are likely to be introduced in the near future. The quantum computer is expected to be a giant leap forward in computing technology, with exciting implications for everything from scientific research to stock market predictions.

Moore's law


Visit any site on the web writing about the future of computers and you will most likely find mention of Moore's Law. Moore's Law is not a strictly adhered-to mathematical formula, but a prediction made by Intel co-founder Gordon Moore in 1965, in a paper where he noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and predicted that the trend would continue "for at least ten years".



Moore predicted that computing technology would increase in power at the same time as it decreased in cost, describing a long-term trend in the history of computing hardware: innovation would allow the number of transistors that can be placed inexpensively on an integrated circuit to double approximately every two years. The trend has continued for more than half a century and is not expected to stop until 2015 or later.

A computer transistor acts like a small electronic switch. Just like the light switch on your wall, a transistor has only two states, On or Off. A computer interprets this on/off state as a 1 or a 0. Put a whole bunch of these transistors together and you have a computer chip. The central processing unit (CPU) inside your computer probably has around 500 million transistors.
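As a quick back-of-the-envelope of what doubling every two years implies, the sketch below projects a transistor count forward from the roughly 500-million-transistor CPU mentioned above; the 2011 baseline year is my assumption for illustration.

```python
# Moore's law as arithmetic: double the transistor count every two years.
def projected_transistors(start_count, start_year, target_year, period_years=2):
    doublings = (target_year - start_year) / period_years
    return start_count * 2 ** doublings

# Assuming ~500 million transistors in 2011 (the figure quoted above):
for year in (2011, 2015, 2021):
    print(year, int(projected_transistors(500e6, 2011, year)))
# 2011 -> 500 million, 2015 -> 2 billion, 2021 -> 16 billion
```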

Shrinking transistor size not only makes chips smaller, but faster. One benefit of packing transistors closer together is that the electronic pulses take less time to travel between transistors. This can increase the overall speed of the chip. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras. All of these are improving at (roughly) exponential rates as well.



Not everyone agrees that Moore's Law has been accurate throughout the years (the prediction has changed since its original version), or that it will hold true in the future. But does it really matter? The pace at which computers are doubling their smarts is fast enough for me.

Thanks to the innovation and drive of Gordon Moore and others like him, computers will continue to get smaller, faster and more affordable.

The future of cyberspace



Is cyberspace infinite?


Of all the things we now take for granted, cyberspace is near the top of the list. The promise of the Internet for the twenty-first century is to make everything always available to everyone everywhere. All of human culture and achievement, the great and the not so great, may, one day soon, be just a click away.
cyberspace

When one is online, cyberspace can seem a lot like outer space or, to use the latest jargon, 'the cloud'. It appears infinite and ethereal. The information is simply out there. If, instead, we thought more about the real-world energy and the real estate that the Internet uses, we would start to realize that things are not so simple. Cyberspace is in fact physical space. And the longer it takes us to change our concept of the Internet—to see quite clearly its physical there-ness—the closer we'll get to blogging our way to oblivion.
cyberspace

Now, everything that we upload—all the Facebook photos, all the Youtube videos—is always available on demand to everyone. What does it take to keep up that commitment? It takes many huge buildings, with square footage in the hundreds of thousands of feet, called data centers or, more appropriately given the Internet's relentless growth, server farms.
server farm

In order to maintain total, ubiquitous availability, as today's Internet users have come to expect, a lot of things have to be happening simultaneously. The millions of hard disc drives that store the Internet's contents have to be powered up and spinning at thousands of revolutions per minute, not just in one place but at backup mirror sites elsewhere. Air conditioning keeps the whirring servers cool. Real estate has to be acquired and developed to house it all. Electrical grids have to be extended to the sites. And lots of electricity has to be generated, which means lots of carbon dioxide gets produced.


Cyberspace's consumption

How much? According to some experts, the cloud already consumes 1 to 2 percent of the world's electricity. That is what it takes to maintain Facebook's reported 15 billion photos, entertain Microsoft's 20 million Xbox Live subscribers, and host all the other always-on content that we use. But at what cost?
The economics of cyberspace and server farms provides no automatic curb to their growth. The key questions for business are how to get energy cheaply and how to keep transmission times in the low milliseconds. Revenues for services like Facebook and Youtube do not come from costs to users. From a naive user's perspective, cyberspace is infinite, free, and clean. As long as people perceive no cost in uploading their photos and videos, they will do so—and their content will stay there without expiring. Free video is like free petrol or free air conditioning: anyone not paying the bill for a resource will use it without restraint. And that is exactly what is happening in cyberspace.
server farm

If no one is watching your Youtube video, does it need to be occupying physical space on multiple, electrically-powered hard drives around the world? On a deeper level, our inexorable drive to create an eternal, all-encompassing memory demonstrates our fear of forgetting. But is forgetting so awful that we must drive the planet closer to the abyss in order to avoid losing any scrap of information, no matter how trivial? Can we let go before we are killed by the need to preserve all of our experience?


Am I suggesting that we stop using the Internet? No, of course not. I wrote this for a blog after all. We invented the Internet because we need to communicate, to share, to learn, to exchange goods and ideas. That is what makes us human. Hence the pathos of our dilemma: that gorgeous, insatiable yearning we have to communicate across all distances, literal and otherwise, is also driving us towards our destruction. If we turn the system off and turn our backs on the dream of global communication, then we may as well die off for we will have sacrificed our common human dream. This is the heightened drama of existence in the twenty-first century: the grandeur of our brave, new world comes at a cost. We can at least face it honestly.

 

How can we make a better cyberspace?

So what can we do? We can mitigate the Internet's emissions by finding alternative energy sources, but its galloping growth will wipe out whatever improved efficiencies we can discover. Can we evolve the new models of business and government that we need fast enough to head off global warming's tipping point? Probably not. We have not done it yet despite all that we know. Recent efforts to achieve something as simple as health insurance for all Americans do not inspire confidence.
Cyberspace is one place where our own actions can make a big difference. Those 15 billion Facebook photos and who-knows-how-many Youtube videos were not posted by the petrol companies. We posted them. We, the users, have the power to slow the Internet's planet-choking growth. We have to see the externalities—the CO2 emissions produced by our online activities—as internal costs to the planet. We can start by raising consciousness about the problem, restricting our uploads, and even pulling some down. Instead of 1000 CO2-emitting photos on Facebook, just keep your 200 best. If no one watches your karaoke video on Youtube, delete it. At least store it on something that does not need to stay plugged in.
server farm



What if consciousness-raising and voluntary self-discipline are not enough? Despite the cyberpunk mantra that 'Information wants to be free', real estate does not. With that truth in mind, here comes my modest proposal. This will be a very unpopular thing to say, but it needs to be said: there ought to be a cost for sharing too much information about oneself, i.e., an uploading tax. That is the only way that most people will stop uploading huge files to cyberspace—they need to pay for the real energy and space that they are using. Information can still be free to take. What I am proposing is that information should not necessarily be free to distribute by occupying space on Internet servers. If you want to post more than a certain amount, you should have to pay rent for the physical space that your megabytes and gigabytes occupy. If uploaders had to pay, many of the photos and videos that no one looks at would come down a lot faster, and Internet-associated CO2 emissions from server farms would start to decline. The money from the rents can go towards development of alternative energy sources, or whatever.

Cyberspace's future has a cost

server farm

We need to get our heads out of the cloud and back on solid ground where we know that renting someone else's space costs money. The proliferating server farms that create the illusion of cyberspace will swallow more and more land and spit out more and more heat-trapping gases. If the life experience that we are preserving online comes at the cost of life itself, then we would be better off entrusting it to the imperfect, ephemeral storage space known as the human brain and taking our chances. We have lost most of the plays of Aeschylus, Sophocles, and Euripides. I think we will still be alright if we lose the video of Kevin crushing the beer can against his head and chugging marinara sauce. The question is, should the Kevins of the world be allowed to colonize our land and use up our energy without paying for it? Maybe we should make Kevin pay for the privilege…
Or maybe we should embrace an old-fashioned solution to the waste of cyberspace on various posting sites, one still used by that archaic form of communication, television. When only a few people watch a show, it gets canceled. If no one is looking at a YouTube (or similar site's) posting, it should be canceled by the provider.
What about freedom of speech?
And some say that energy-wise, the consumption of the internet is trivial, and is probably more than compensated by the high level of global awareness of energy issues that has been generated through use of the Web.
What do you think?