
Nanotechnology Applications


Nanobiotechnology


Nanobiotechnology is the unification of biotechnology and nanotechnology: a hybrid discipline concerned with making atomic-scale machines by imitating or incorporating biological systems at the molecular level, or with building tiny tools to study or alter the properties of natural structures atom by atom. This goal can be attained by combining classical micro-technology with a molecular biological approach.

Nanobiotechnology

Tiny medical hardware can interact with tissue in the human body at the molecular level to diagnose and treat malignancy more precisely. Many illnesses and injuries have their origins in nanoscale processes. Accordingly, applying nanotechnology to medical care and biomedical research opens opportunities to treat illnesses, repair injuries, and enhance human functioning beyond what is possible with larger-scale techniques.

How nanotechnology affects everything


Nanotechnology scientific impact


While there is a commonly held belief that nanotechnology is a futuristic science with applications 25 years in the future and beyond, nanotechnology is anything but science fiction. In recent years, over a dozen Nobel Prizes have been awarded for nanotechnology-related work, from the development of the scanning probe microscope (SPM) to the discovery of fullerenes. Almost every university in the world has a nanotechnology department or is seeking the funds to create one.
Nanotechnology offers opportunities in creating new features and functions and is already providing the solutions to many long-standing medical, social and environmental problems. Because of its potential, nanotechnology is of global interest, attracting more public funding than any other area of technology.

There is an unprecedented multidisciplinary convergence of scientists dedicated to the study of a world so small that we cannot see it - even with a light microscope. That world is the field of nanotechnology, the realm of atoms and nanostructures, something so new that no one is really sure what will come of it.

One of the exciting and challenging aspects of the nanoscale is the role that quantum mechanics plays in it. The rules of quantum mechanics are very different from those of classical physics, which means that the behavior of substances at the nanoscale can sometimes contradict common sense. You cannot walk up to a wall and teleport to the other side of it, but at the nanoscale an electron can - the phenomenon is called electron tunneling. Substances that are insulators, meaning they cannot carry an electric current, in bulk form might become semiconductors when reduced to the nanoscale. Melting points can change because of the much greater surface-area-to-volume ratio. Much of nanoscience requires that you forget what you know and start learning all over again.
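As a rough illustration of the tunneling effect mentioned above, the sketch below uses the standard textbook approximation for transmission through a rectangular barrier, T ≈ exp(−2κL), with κ = sqrt(2m(V−E))/ħ. The barrier height, electron energy and widths are arbitrary example values chosen for this sketch, not figures from the article.

```python
import math

# Physical constants (SI units)
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # 1 electronvolt in joules

def tunneling_probability(energy_ev, barrier_ev, width_nm):
    """Approximate transmission through a rectangular barrier, T ~ exp(-2*kappa*L)."""
    if energy_ev >= barrier_ev:
        return 1.0  # classically allowed: no tunneling needed in this crude model
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR  # 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Illustrative numbers: a 1 eV electron meeting a 2 eV barrier of various widths
for width in (0.5, 1.0, 2.0):  # barrier widths in nanometers
    print(f"{width} nm barrier: T = {tunneling_probability(1.0, 2.0, width):.2e}")
```

Even over this toy range, the probability falls off exponentially with barrier width, which is why tunneling only matters at nanometer distances.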

History of Nanotechnology

 
Traditionally, the origins of nanotechnology are traced back to December 29, 1959, when Professor Richard Feynman (a 1965 Nobel Prize winner in physics) presented a lecture entitled “There’s Plenty of Room at the Bottom” during the annual meeting of the American Physical Society at the California Institute of Technology (Caltech). In this talk, Feynman spoke about the principles of miniaturization and atomic-level precision and how these concepts do not violate any known law of physics. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale.

He described a field that few researchers had thought much about, let alone investigated. Feynman presented the idea of manipulating and controlling things on an extremely small scale by building and shaping matter one atom at a time. He proposed that it was possible to build a surgical nanoscale robot by developing quarter-scale manipulator hands that would build quarter-scale machine tools analogous to those found in machine shops, continuing until the nanoscale is reached, eight iterations later.
 
Richard Feynman

Printed Carbon Nanotube Transistor

Researchers from Aneeve Nanotechnologies have used low-cost ink-jet printing to fabricate the first circuits composed of fully printed back-gated and top-gated carbon nanotube-based electronics for use with OLED displays. OLED-based displays are used in cell phones, digital cameras and other portable devices.

But developing a lower-cost method for mass-producing such displays has been complicated by the difficulties of incorporating thin-film transistors that use amorphous silicon and polysilicon into the production process.

In this innovative study, the team made carbon nanotube thin-film transistors with high mobility and a high on-off ratio, completely based on ink-jet printing. They demonstrated the first fully printed single-pixel OLED control circuits, and their fully printed thin-film circuits showed significant performance advantages over traditional organic-based printed electronics.

This distinct process utilizes an ink-jet printing method that eliminates the need for expensive vacuum equipment and lends itself to scalable manufacturing and roll-to-roll printing. The team solved many material integration problems, developed new cleaning processes and created new methods for negotiating nano-based ink solutions.
Ink-jet-printed circuit. (Credit: University of California - Los Angeles)
For active-matrix OLED applications, the printed carbon nanotube transistors will be fully integrated with OLED arrays, the researchers said. The encapsulation technology developed for OLEDs will also keep the carbon nanotube transistors well protected, as the organics in OLEDs are very sensitive to oxygen and moisture.

(Adapted from PhysOrg)

Graphene bubbles improve lithium-air batteries

A team of scientists from the Pacific Northwest National Laboratory and Princeton University used a new approach to build a graphene membrane for use in lithium-air batteries, which could, one day, replace conventional batteries in electric vehicles. Resembling coral, this porous graphene material could replace the traditional smooth graphene sheets in lithium-air batteries, which become clogged with tiny particles during use.


Resembling broken eggshells, graphene structures built around bubbles produced a lithium-air battery with the highest energy capacity to date. As an added bonus, the team’s new material does not rely on platinum or other precious metals, reducing its potential cost and environmental impact.


World’s lightest material


Ultra light (<10 milligrams per cubic centimeter) cellular materials are desirable for thermal insulation, battery electrodes, catalyst supports, and acoustic, vibration, or shock energy damping. A team of researchers from UC Irvine, HRL Laboratories and the California Institute of Technology have developed the world's lightest material – with a density of 0.9 mg/cc. The new material redefines the limits of lightweight materials because of its unique "micro-lattice" cellular architecture. The researchers were able to make a material that consists of 99.99 percent air by designing the 0.01 percent solid at the nanometer, micron and millimeter scales. "The trick is to fabricate a lattice of interconnected hollow tubes with a wall thickness 1,000 times thinner than a human hair," said lead author Dr. Tobias Schaedler of HRL.

Ultra Light Material
Photo by Dan Little, HRL Laboratories

The material's architecture allows unprecedented mechanical behavior for a metal, including complete recovery from compression exceeding 50 percent strain and extraordinarily high energy absorption. "Materials actually get stronger as the dimensions are reduced to the nanoscale," explained UCI mechanical and aerospace engineer Lorenzo Valdevit, UCI's principal investigator on the project. "Combine this with the possibility of tailoring the architecture of the micro-lattice and you have a unique cellular material."

William Carter, manager of the architected materials group at HRL, compared the new material to larger, more familiar edifices: "Modern buildings, exemplified by the Eiffel Tower or the Golden Gate Bridge, are incredibly light and weight-efficient by virtue of their architecture. We are revolutionizing lightweight materials by bringing this concept to the nano and micro scales."

Adapted from PhysOrg

Butterfly wings inspire new design


Engineers have been trying to create water repellent surfaces, but past attempts at artificial air traps tended to lose their contents over time due to external perturbations. Now an international team of researchers from Sweden, the United States, and Korea has taken advantage of what might normally be considered defects in the nanomanufacturing process to create a multilayered silicon structure that traps air and holds it for longer than one year.

Blue Mountain Swallowtail

Researchers mimicked the many-layered nanostructure of blue mountain swallowtail (Papilio ulysses) wings to make a silicon wafer that traps both air and light. The brilliant blue wings of this butterfly easily shed water because of the way ultra-tiny structures in the wings trap air and create a cushion between water and wing.

Blue Mountain Swallowtail

The researchers used an etching process to carve out micro-scale pores and sculpt tiny cones from the silicon. The team found that features of the resulting structure that might usually be considered defects, such as undercuts beneath the etching mask and scalloped surfaces, actually improved the water repellent properties of the silicon by creating a multilayered hierarchy of air traps. The intricate structure of pores, cones, bumps, and grooves also succeeded in trapping light, almost perfectly absorbing wavelengths just above the visible range.

New biosensor made of nanotubes

Standard sensors employ metal electrodes coated with enzymes that react with compounds and produce an electrical signal that can be measured. However, the inefficiency of those sensors leads to imperfect measurements. Now, scientists at Purdue University have developed a new method for stacking synthetic DNA and carbon nanotubes onto a biosensor electrode.

Carbon nanotubes, cylindrically shaped carbon molecules known to have excellent thermal and electrical properties, have been seen as a possibility for improving sensor performance. The problem is that the materials are not fully compatible with water, which limits their application in biological fluids.



Marshall Porterfield and Jong Hyun Choi have found a solution and reported their findings in the journal The Analyst, describing a sensor that essentially builds itself.

"In the future, we will be able to create a DNA sequence that is complementary to the carbon nanotubes and is compatible with specific biosensor enzymes for the many different compounds we want to measure," Porterfield said. "It will be a self-assembling platform for biosensors at the biomolecular level."

Choi developed a synthetic DNA that will attach to the surface of the carbon nanotubes and make them more water-soluble. "Once the carbon nanotubes are in a solution, you only have to place the electrode into the solution and charge it. The carbon nanotubes will then coat the surface," Choi said.

The electrode coated with carbon nanotubes will attract the enzymes to finish the sensor's assembly. The sensor described in the findings was designed for glucose. But Porterfield said it could be easily adapted for various compounds. "You could mass produce these sensors for diabetes, for example, for insulin management for diabetic patients," Porterfield said.

Electron Tweezers

A recent paper by researchers from the National Institute of Standards and Technology (NIST) and the University of Virginia (UVA) demonstrates that the beams produced by modern electron microscopes can be used to manipulate nanoscale objects.

The tool is an electron version of the laser "optical tweezers" that have become a standard tool in biology, physics and chemistry for manipulating tiny particles, except that electron beams could offer a thousand-fold improvement in sensitivity and resolution.

If you just consider the physics, you might expect that a beam of focused electrons -- such as that created by a transmission electron microscope (TEM) -- could do the same thing. However that's never been seen, in part because electrons are much fussier to work with. They can't penetrate far through air, for example, so electron microscopes use vacuum chambers to hold specimens.

So Vladimir Oleshko and his colleague James Howe, were surprised when, in the course of another experiment, they found themselves watching an electron tweezer at work. They were using an electron microscope to study, in detail, what happens when a metal alloy melts or freezes. They were observing a small particle -- a few hundred microns wide -- of an aluminum-silicon alloy held just at a transition point where it was partially molten, a liquid shell surrounding a core of still solid metal.

"This effect of electron tweezers was unexpected because the general purpose of this experiment was to study melting and crystallization," Oleshko explains. "We can generate this sphere inside the liquid shell easily; you can tell from the image that it's still crystalline. But we saw that when we move or tilt the beam -- or move the microscope stage under the beam -- the solid particle follows it, like it was glued to the beam."

Potentially, electron tweezers could be a versatile and valuable tool, adding very fine manipulation to wide and growing lists of uses for electron microscopy in materials science. "Of course, this is challenging because it requires a vacuum," he says, "but electron probes can be very fine, three orders of magnitude smaller than photon beams -- close to the size of single atoms. We could manipulate very small quantities, even single atoms, in a very precise way."

(Adapted from ScienceDaily)

Memory at the nanoscale

Metallic alloys can be stretched or compressed in such a way that they stay deformed once the strain on the material has been released. However, shape memory alloys can return to their original shape after being heated above a specific temperature.

Now, for the first time, two physicists from the University of Constance determined the absolute values of temperatures at which shape memory nanospheres start changing back to their memorized shape, undergoing the so-called structural phase transition, which depends on the size of particles studied. To achieve this result, they performed a computer simulation using nanoparticles with diameters between 4 and 17 nm made of an alloy of equal proportions of nickel and titanium.


To date, research efforts to establish structural phase transition temperature have mainly been experimental. Most of the prior work on shape memory materials was in macroscopic scale systems and used for applications such as dental braces, stents or oil temperature-regulating devices for bullet trains.

Thanks to a computerized method known as molecular dynamics simulation, Daniel Mutter and Peter Nielaba were able to visualize the transformation process of the material during the transition. As the temperature increased, they showed that the material's atomic-scale crystal structure shifted from a lower to a higher level of symmetry. They found that the strong influence of the energy difference between the low- and high-symmetry structure at the surface of the nanoparticle, which differed from that in its interior, could explain the transition.

Potential new applications include the creation of nanoswitches, where laser irradiation could heat up such shape memory material, triggering a change in its length that would, in turn, function as a switch.

Quantum Cloning Advances

Quantum cloning would be the process of taking an arbitrary, unknown quantum state and making an exact copy without altering the original state in any way. Perfect quantum cloning is forbidden by the laws of quantum mechanics, as shown by the no-cloning theorem. It is, however, possible to perform imperfect cloning, where the copies have a non-unit fidelity with the state being cloned.

The quantum cloning operation is the best way to make copies of quantum information, so cloning is an important task in quantum information processing, especially in the context of quantum cryptography. Researchers are seeking ways to build quantum cloning machines that work at the so-called quantum limit. Quantum cloning is difficult because the laws of quantum mechanics only allow an approximate copy, not an exact copy, of an original quantum state to be made, since measuring such a state prior to cloning would alter it. The first cloning machine relied on stimulated emission to copy quantum information encoded into single photons.

Scientists in China have now produced a theory for a quantum cloning machine able to produce several copies of the state of a particle at the atomic or sub-atomic scale, i.e., of a quantum state. The work comes from a team at Henan Universities in China, in collaboration with a team at the Institute of Physics of the Chinese Academy of Sciences. The advance could have implications for quantum information processing methods used, for example, in message encryption systems.

In this study, researchers have demonstrated that it is theoretically possible to create four approximate copies of an initial quantum state, in a process called asymmetric cloning. The authors have extended previous work that was limited to quantum cloning providing only two or three copies of the original state. One key challenge was that the quality of the approximate copy decreases as the number of copies increases.
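That trade-off between the number of copies and their quality can be made concrete with the well-known bound for universal symmetric cloning of qubits, F(N→M) = (MN + M + N) / (M(N + 2)). The study above concerns asymmetric cloning, so this standard symmetric formula is included only as an illustration of how fidelity falls as the number of copies grows; it is not taken from the paper described here.

```python
def optimal_symmetric_fidelity(n_in: int, m_out: int) -> float:
    """Optimal fidelity for universal symmetric N -> M qubit cloning (Gisin-Massar bound)."""
    return (m_out * n_in + m_out + n_in) / (m_out * (n_in + 2))

# Cloning a single unknown qubit into 2, 3 and 4 copies:
for copies in (2, 3, 4):
    print(f"1 -> {copies} copies: fidelity = {optimal_symmetric_fidelity(1, copies):.3f}")
# Output: 0.833, 0.778, 0.750 -- copy quality drops as the number of copies grows.
```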

The authors were able to optimize the quality of the cloned copies, thus yielding four good approximations of the initial quantum state. They have also demonstrated that their quantum cloning machine has the advantage of being universal and is therefore able to work with any quantum state, ranging from a photon to an atom. Asymmetric quantum cloning has applications in analyzing the security of message encryption systems based on shared secret quantum keys.

The Future of Computers - Artificial Intelligence


What is Artificial Intelligence?


The term “Artificial Intelligence” was coined in 1956 by John McCarthy at the Dartmouth Conference, who defined it as the science and engineering of making intelligent machines.

Nowadays it is a branch of computer science that aims to make computers behave like humans. This field of research is defined as the study and design of intelligent agents, where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success.
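As an illustration of that definition, here is a minimal sketch of an "agent" that perceives a toy environment and learns to prefer the action that maximizes its reward. The environment, action names and payoff probabilities are invented for this example; they are not part of any particular AI system described in this post.

```python
import random

class GreedyAgent:
    """Toy intelligent agent: tries actions and prefers the one that has paid off best."""
    def __init__(self, actions):
        self.actions = actions
        self.value = {a: 0.0 for a in actions}   # running estimate of each action's worth
        self.count = {a: 0 for a in actions}

    def act(self):
        # Mostly exploit the best-known action, occasionally explore a random one.
        if random.random() < 0.1:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.value[a])

    def learn(self, action, reward):
        self.count[action] += 1
        n = self.count[action]
        self.value[action] += (reward - self.value[action]) / n  # incremental average

def environment(action):
    """Hypothetical environment: action 'b' pays off more often than 'a'."""
    return 1.0 if random.random() < {"a": 0.3, "b": 0.7}[action] else 0.0

agent = GreedyAgent(["a", "b"])
for _ in range(1000):
    chosen = agent.act()
    agent.learn(chosen, environment(chosen))
print(agent.value)   # the estimate for 'b' should settle near 0.7
```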

This new science was founded on the claim that a central property of humans, intelligence—the sapience of Homo Sapiens—can be so precisely described that it can be simulated by a machine. This raises philosophical issues about the nature of the mind and the ethics of creating artificial beings, issues which have been addressed by myth, fiction and philosophy since antiquity.

Artificial Intelligence includes:
  • programming computers to make decisions in real-life situations (for example, “expert systems” that help physicians diagnose diseases based on symptoms);
  • programming computers to understand human languages (natural language processing);
  • programming computers to play games such as chess and checkers (game playing);
  • programming computers to hear, see and react to other sensory stimuli (robotics);
  • designing systems that mimic human intelligence by attempting to reproduce the types of physical connections between neurons in the human brain (neural networks).



History of Artificial Intelligence


The Greek myth of Pygmalion is the story of a statue brought to life for the love of her sculptor. The Greek god Hephaestus' robot Talos guarded Crete from attackers, running the circumference of the island 3 times a day. The Greek Oracle at Delphi was history's first chatbot and expert system.


In the 3rd century BC, Chinese engineer Mo Ti created mechanical birds, dragons, and warriors. Technology was being used to transform myth into reality.

Much later, the Royal courts of Enlightenment-age Europe were endlessly amused by mechanical ducks and humanoid figures, crafted by clockmakers. It has long been possible to make machines that looked and moved in human-like ways - machines that could spook and awe the audience - but creating a model of the mind was off limits.



However, writers and artists were not bound by the limits of science in exploring extra-human intelligence, and the Jewish myth of the Golem, Mary Shelley's Frankenstein, all the way through to Forbidden Planet's Robbie the Robot and 2001's HAL9000, gave us new - and troubling - versions of the manufactured humanoid.


In the 1600s, engineering and philosophy began a slow merger that continues today. From that union the first mechanical calculator was born, at a time when the world's philosophers were seeking to encode the laws of human thought into complex logical systems.

The mathematician, Blaise Pascal, created a mechanical calculator in 1642 (to enable gambling predictions). Another mathematician, Gottfried Wilhelm von Leibniz, improved Pascal's machine and made his own contribution to the philosophy of reasoning by proposing a calculus of thought.

Many of the leading thinkers of the 18th and 19th century were convinced that a formal reasoning system, based on a kind of mathematics, could encode all human thought and be used to solve every sort of problem. Thomas Jefferson, for example, was sure that such a system existed, and only needed to be discovered. The idea still has currency - the history of recent artificial intelligence is replete with stories of systems that seek to "axiomatize" logic inside computers.

From 1800 on, the philosophy of reason picked up speed. George Boole proposed a system of "laws of thought," Boolean Logic, which uses "AND" and "OR" and "NOT" to establish how ideas and objects relate to each other. Nowadays most Internet search engines use Boolean logic in their searches.
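As a toy illustration of how those Boolean operators combine in retrieval, the sketch below evaluates a query over a tiny invented document collection; the documents and the query are made up for this example and are not drawn from any particular search engine.

```python
# Toy Boolean retrieval: each document becomes a set of words, and a query
# combines set operations just as AND / OR / NOT combine propositions.
docs = {
    1: "nanotechnology builds machines atom by atom",
    2: "boolean logic drives modern search engines",
    3: "quantum computers use qubits and superposition",
}
index = {doc_id: set(text.split()) for doc_id, text in docs.items()}

def AND(a, b): return a & b
def OR(a, b): return a | b
def NOT(a): return set(index) - a

def having(word):
    """IDs of documents containing the word."""
    return {doc_id for doc_id, words in index.items() if word in words}

# Query: (atom OR qubits) AND NOT boolean
result = AND(OR(having("atom"), having("qubits")), NOT(having("boolean")))
print(sorted(result))   # -> [1, 3]
```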

Recent history and the future of Artificial Intelligence


In the early part of the 20th century, multidisciplinary interests began to converge and engineers began to view brain synapses as mechanistic constructs. A new word, cybernetics, i.e., the study of communication and control in biological and mechanical systems, became part of our colloquial language. Claude Shannon pioneered a theory of information, explaining how information was created and how it might be encoded and compressed.

Enter the computer. Modern artificial intelligence (albeit not so named until later) was born in the first half of the 20th century, when the electronic computer came into being. The computer's memory was a purely symbolic landscape, and the perfect place to bring together the philosophy and the engineering of the last 2000 years. The pioneer of this synthesis was the British logician and computer scientist Alan Turing.

AI research is highly technical and specialized, deeply divided into subfields that often fail to communicate with each other. Subfields have grown up around particular institutions, the work of individual researchers, the solution of specific problems, longstanding differences of opinion about how AI should be done and the application of widely differing tools. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects.


Natural-language processing would allow ordinary people who don’t have any knowledge of programming languages to interact with computers. So what does the future of computer technology look like after these developments? Through nanotechnology, computing devices are becoming progressively smaller and more powerful. Everyday devices with embedded technology and connectivity are becoming a reality. Nanotechnology has led to the creation of increasingly smaller and faster computers that can be embedded into small devices.

This has led to the idea of pervasive computing, which aims to integrate software and hardware into all man-made and some natural products. It is predicted that almost any item, such as clothing, tools, appliances, cars, homes, coffee mugs and even the human body, will be embedded with chips that connect it to a vast network of other devices.


Hence, in the future, network technologies will be combined with wireless computing, voice recognition, Internet capability and artificial intelligence to create an environment where device connectivity is embedded so that it is unobtrusive, not outwardly visible, and always available. In this way, computer technology will saturate almost every facet of our lives. What seems like virtual reality at the moment will become the human reality of the future of computer technology.

The Future of Computers - Optical Computers


An optical computer (also called a photonic computer) is a device that performs its computation using photons of visible light or infrared (IR) beams, rather than electrons in an electric current. The computers we use today use transistors and semiconductors to control electricity but computers of the future may utilize crystals and metamaterials to control light.


An electric current creates heat in computer systems, and as processing speed increases, so does the amount of electricity required; this extra heat is extremely damaging to the hardware. Photons, however, generate substantially less heat than electrons at a given size scale, making the development of more powerful processing systems possible. By applying some of the advantages of visible and/or IR networks at the device and component scale, a computer might someday be developed that can perform operations significantly faster than a conventional electronic computer.

Coherent light beams, unlike electric currents in metal conductors, pass through each other without interfering; electrons repel each other, while photons do not. For this reason, signals over copper wires degrade rapidly while fiber optic cables do not have this problem. Several laser beams can be transmitted in such a way that their paths intersect, with little or no interference among them - even when they are confined essentially to two dimensions.


Electro-Optical Hybrid computers


Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical/electronic hybrid. However, optoelectronic devices lose about 30% of their energy converting electrons into photons and back and this switching process slows down transmission of messages.

Pure Optical Computers



All-optical computers eliminate the need for this electro-optical switching. These computers would use multiple frequencies to send information throughout the computer as light waves and packets; with no electron-based subsystems, they would require no conversion from electrical to optical signals, greatly increasing speed.

The Future of Computers - Quantum nanocomputers

A quantum computer uses quantum mechanical phenomena, such as entanglement and superposition, to process data. Quantum computation aims to use the quantum properties of particles to represent and structure data, and to use quantum mechanics to work out how to perform operations with this data.



The quantum mechanical properties of atoms or nuclei allow these particles to work together as quantum bits, or qubits. These qubits together form the computer’s processor and memory and can interact with each other while being isolated from the external environment, which enables them to perform certain calculations much faster than conventional computers. By computing many different numbers simultaneously and then interfering the results to get a single answer, a quantum computer can perform a large number of operations in parallel and ends up being much more powerful than a digital computer of the same size.
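A minimal numerical sketch of that superposition-and-interference idea, not tied to any specific hardware: a qubit is represented as a two-component complex vector, a Hadamard gate puts it into an equal superposition, and a second Hadamard makes the amplitudes interfere back into a definite outcome.

```python
import numpy as np

# A qubit state is a 2-component complex vector; |0> = [1, 0].
ket0 = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: creates an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0        # amplitudes (1/sqrt(2), 1/sqrt(2))
interfered = H @ superposed  # second Hadamard: amplitudes interfere

def probabilities(state):
    return np.abs(state) ** 2  # Born rule: |amplitude|^2

print(probabilities(superposed))  # [0.5, 0.5] -- both outcomes equally likely
print(probabilities(interfered))  # [1.0, 0.0] -- interference restores |0> with certainty
```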



A quantum nanocomputer would work by storing data in the form of atomic quantum states or spin. Technology of this kind is already under development in the form of single-electron memory (SEM) and quantum dots. By means of quantum mechanics, waves would store the state of each nanoscale component and information would be stored as the spin orientation or state of an atom. With the correct setup, constructive interference would emphasize the wave patterns that held the right answer, while destructive interference would prevent any wrong answers.



The energy state of an electron within an atom, represented by the electron energy level or shell, can theoretically represent one, two, four, eight, or even 16 bits of data. The main problem with this technology is instability. Instantaneous electron energy states are difficult to predict and even more difficult to control. An electron can easily fall to a lower energy state, emitting a photon; conversely, a photon striking an atom can cause one of its electrons to jump to a higher energy state.



I promise to write a series of articles on quantum issues very shortly...

The Future of Computers - Mechanical nanocomputers

Pioneers such as Eric Drexler proposed as far back as the mid-1980s that nanoscale mechanical computers could be built via molecular manufacturing, mechanically positioning atoms or molecular building blocks one atom or molecule at a time, a process known as “mechanosynthesis.”


Once assembled, the mechanical nanocomputer, other than being greatly scaled down in size, would operate much like a complex, programmable version of the mechanical calculators used during the 1940s to 1970s, preceding the introduction of widely available, inexpensive solid-state electronic calculators.

Drexler's theoretical design used rods sliding in housings, with bumps that would interfere with the motion of other rods. It was a mechanical nanocomputer that would use tiny mobile components called nanogears to encode information.


Drexler and his collaborators favored designs resembling miniature versions of Charles Babbage's 19th-century analytical engine: mechanical nanocomputers that would calculate using moving molecular-scale rods and rotating molecular-scale wheels, spinning on shafts and bearings.


For this reason, mechanical nanocomputer technology has sparked controversy and some researchers even consider it unworkable. All the problems inherent in Babbage's apparatus, according to the naysayers, are magnified a million fold in a mechanical nanocomputer. Nevertheless, some futurists are optimistic about the technology, and have even proposed the evolution of nanorobots that could operate, or be controlled by, mechanical nanocomputers.

The Future of Computers - Chemical and Biochemical nanocomputers


Chemical Nanocomputer


In general terms, a chemical computer is one that processes information by making and breaking chemical bonds, storing logic states or information in the resulting chemical (i.e., molecular) structures. In a chemical nanocomputer, computing is based on chemical reactions (bond breaking and forming): inputs are encoded in the molecular structure of the reactants and outputs are extracted from the structure of the products. In other words, the interaction between different chemicals and their structures is used to store and process information.

nanotubes

These computing operations would be performed selectively among molecules taken just a few at a time, in volumes only a few nanometers on a side. To create a chemical nanocomputer, engineers therefore need to be able to control individual atoms and molecules so that they can be made to perform reliable calculations and data storage tasks. The development of a true chemical nanocomputer will likely proceed along lines similar to genetic engineering.


Biochemical nanocomputers


Both chemical and biochemical nanocomputers would store and process information in terms of chemical structures and interactions. Proponents of biochemically based computers can point to an “existence proof” for them in the commonplace activities of humans and other animals with multicellular nervous systems. So biochemical nanocomputers already exist in nature; they are manifest in all living things. These systems, however, are largely uncontrollable by humans, which makes artificial fabrication or implementation of this category of “natural” biochemically based computers seem far off, because the mechanisms of animal brains and nervous systems are still poorly understood. We cannot, for example, program a tree to calculate the digits of pi, or program an antibody to fight a particular disease (although medical science has come close to this ideal in the formulation of vaccines, antibiotics, and antiviral medications).

DNA nanocomputer


In 1994, Leonard Adleman took a giant step towards a different kind of chemical or artificial biochemical computer when he used fragments of DNA to compute the solution to a complex graph theory problem.

DNA computer

Using the tools of biochemistry, Adleman was able to extract the correct answer to the graph theory problem out of the many random paths represented by the product DNA strands. Like a computer with many processors, this type of DNA computer is able to consider many solutions to a problem simultaneously. Moreover, the DNA strands employed in such a calculation (approximately 10¹⁷) are many orders of magnitude greater in number and more densely packed than the processors in today's most massively parallel electronic supercomputer. As a result of the Adleman work, the chemical nanocomputer is the only one of the aforementioned four types to have been demonstrated for an actual calculation.
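As a conventional-computer caricature of Adleman's generate-and-filter strategy, the sketch below brute-forces a path on a small made-up graph. The graph, vertex names and start/end points are invented for illustration; a real DNA computation encodes vertices and edges as strands and performs the generation and filtering chemically rather than in Python.

```python
import itertools

# Hypothetical directed graph (Adleman's actual 1994 instance was larger).
edges = {("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")}
vertices = {v for e in edges for v in e}
start, end = "A", "D"

# Step 1 (done massively in parallel by DNA ligation): generate candidate paths.
candidates = itertools.permutations(vertices)

# Steps 2 onward (done chemically by filtering strands): keep only valid paths
# that start at 'start', end at 'end', and follow existing edges.
def is_valid(path):
    return (path[0] == start and path[-1] == end and
            all((a, b) in edges for a, b in zip(path, path[1:])))

print([p for p in candidates if is_valid(p)])   # -> [('A', 'B', 'C', 'D')]
```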


These computers use DNA to store information and perform complex calculations. DNA has a vast storage capacity, which is what enables it to hold the complex blueprints of living organisms. A single gram of DNA can hold roughly as much information as one trillion compact discs.
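A back-of-the-envelope check of that comparison, using round figures (about 330 g/mol per nucleotide, 2 bits per base, 700 MB per CD) that are assumptions of this sketch rather than numbers taken from the text:

```python
AVOGADRO = 6.022e23          # molecules per mole
BASE_MASS_G_PER_MOL = 330.0  # approximate average mass of one DNA nucleotide
BITS_PER_BASE = 2            # A, C, G, T -> 2 bits each
CD_BYTES = 700e6             # ~700 MB per compact disc

bases_per_gram = AVOGADRO / BASE_MASS_G_PER_MOL
bytes_per_gram = bases_per_gram * BITS_PER_BASE / 8

print(f"bytes per gram of DNA : {bytes_per_gram:.1e}")              # ~4.6e20 bytes
print(f"equivalent CDs        : {bytes_per_gram / CD_BYTES:.1e}")   # ~6.5e11, i.e. on the order of a trillion
```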

The Future of Computers - Electronic nanocomputers


Given our decades of experience with electronic computing devices, and the extensive research and industrial infrastructure built up since the late 1940s, electronic nanocomputers appear to be the easiest and most likely direction in which to continue nanocomputer development in the near future.
Electronic nanocomputers would operate in a manner similar to the way present-day microcomputers work. The main difference is one of physical scale. More and more transistors are squeezed into silicon chips with each passing year; witness the evolution of integrated circuits (ICs) capable of ever-increasing storage capacity and processing power.

The ultimate limit to the number of transistors per unit volume is imposed by the atomic structure of matter. Most engineers agree that technology has not yet come close to pushing this limit. In the electronic sense, the term nanocomputer is relative; by 1970s standards, today's ordinary microprocessors might be called nanodevices.

How it works


The power and speed of computers have grown rapidly because of rapid progress in solid-state electronics dating back to the invention of the transistor in 1948. Most important, there has been exponential increase in the density of transistors on integrated-circuit computer chips over the past 40 years. In that time span, though, there has been no fundamental change in the operating principles of the transistor.
Even microelectronic transistors no more than a few microns (millionths of a meter) in size are bulk-effect devices. They still operate using small electric fields imposed by tiny charged metal plates to control the mass action of many millions of electrons.

Although electronic nanocomputers will not use the traditional concept of transistors for their components, they will still operate by storing information in the positions of electrons.

At the current rate of miniaturization, conventional transistor technology will reach a minimum size limit in a few years. At that point, small-scale quantum mechanical effects, such as the tunneling of electrons through barriers of matter or electric fields, will begin to dominate the effects that allow a mass-action semiconductor device to operate. Still, an electronic nanocomputer will continue to represent information in the storage and movement of electrons.
Nowadays, most electronic nanocomputers are created as microscopic circuits using nanolithography.

Nanolithography


Nanolithography is a term used to describe the branch of nanotechnology concerned with the study and application of a number of techniques for creating nanometer-scale structures, meaning patterns with at least one lateral dimension between the size of an individual atom and approximately 100 nm.

A nanometer is a billionth of a meter, much smaller than the width of a single human hair. The word lithography is used because the method of pattern generation is essentially the same as writing, only on a much smaller scale. Nanolithography is used during the fabrication of leading-edge semiconductor integrated circuits (nanocircuitry) or nanoelectromechanical systems (NEMS).


One common method of nanolithography, used particularly in the creation of microchips, is known as photolithography. This is a parallel method in which the entire surface is patterned in a single exposure. Photolithography is limited in the feature sizes it can reach, however, because if the wavelength of light used is made too small, the lens simply absorbs the light in its entirety. This means that photolithography cannot reach the super-fine sizes of some alternative technologies.

A technology that allows for smaller sizes than photolithography is that of electron-beam lithography. Using an electron beam to draw a pattern nanometer by nanometer, incredibly small sizes (on the order of 20nm) may be achieved. Electron-beam lithography is much more expensive and time consuming than photolithography, however, making it a difficult sell for industry applications of nanolithography. Since electron-beam lithography functions more like a dot-matrix printer than a flash-photograph, a job that would take five minutes using photolithography will take upwards of five hours with electron-beam lithography.




New nanolithography technologies are constantly being researched and developed, leading to smaller and smaller possible sizes. Extreme ultraviolet lithography, for example, is capable of using light at wavelengths of 13.5nm. While hurdles still exist in this new field, it promises the possibility of sizes far below those produced by current industry standards. Other nanolithography techniques include dip-pen nanolithography, in which a small tip is used to deposit molecules on a surface. Dip-pen nanolithography can achieve very small sizes, but cannot currently go below 40nm.
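The wavelength limits discussed above can be made concrete with the Rayleigh resolution criterion, minimum feature ≈ k1 · λ / NA. The k1 factor and numerical apertures used below are typical illustrative values assumed for this sketch, not figures from the article.

```python
def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.4):
    """Rayleigh resolution criterion: smallest printable feature ~ k1 * lambda / NA."""
    return k1 * wavelength_nm / numerical_aperture

# Illustrative comparison of light sources (NA and k1 values are assumptions):
print(f"193 nm ArF laser, NA 1.35 : ~{min_feature_nm(193, 1.35):.0f} nm")
print(f"13.5 nm EUV,      NA 0.33 : ~{min_feature_nm(13.5, 0.33):.0f} nm")
```

The shorter the wavelength, the smaller the printable feature, which is why extreme ultraviolet sources are attractive despite the engineering hurdles mentioned above.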


The Future of Computers - Nanocomputers

Scientific discussion of the development and fabrication of nanometer-scale devices began in 1959 with an influential lecture by the late, renowned physicist Richard Feynman. Feynman observed that it is possible, in principle, to build and operate submicroscopic machinery. He proposed that large numbers of completely identical devices might be assembled by manipulating atoms one at a time.


Feynman's proposal sparked an initial flurry of interest but it did not broadly capture the imagination of the technical community or the public. At the time, building structures one atom at a time seemed out of reach. Throughout the 1960s and 1970s advances in diverse fields prepared the scientific community for the first crude manipulations of nanometer-scale structures. The most obvious development was the continual miniaturization of digital electronic circuits, based primarily upon the invention of the transistor by Shockley, Brattain, and Bardeen in 1948 and the invention of the integrated circuit by Noyce, Kilby, and others in the late 1950s. In 1959, it was only possible to put one transistor on an integrated circuit. Twenty years later, circuits with a few thousand transistors were commonplace.


Scientists are trying to use nanotechnology to make very tiny chips, electrical conductors and logic gates. Using nanotechnology, chips can be built up one atom at a time and hence there would be no wastage of space, enabling much smaller devices to be built. Using this technology, logic gates will be composed of just a few atoms and electrical conductors (called nanowires) will be merely an atom thick and a data bit will be represented by the presence or absence of an electron.

A nanocomputer is a computer whose physical dimensions are microscopic; smaller than the microcomputer, which is smaller than the minicomputer. (The minicomputer is called "mini" because it was a lot smaller than the original (mainframe) computers.)

A component of nanotechnology, nanocomputing will, as suggested or proposed by researchers and futurists, give rise to four types of nanocomputers: electronic, chemical and biochemical, mechanical, and quantum nanocomputers.

Keep reading the next posts…

Nanotechnology


What is nanotechnology?


In simple terms, nanotechnology can be defined as ‘engineering at a very small scale’, and the term can be applied to many areas of research and development – from medicine to manufacturing to computing, and even to textiles and cosmetics. Nanotechnology (sometimes shortened to "nanotech") is a multidisciplinary science that studies how we can manipulate matter at the molecular and atomic scale; in other words, it is the engineering of tiny machines: the projected ability to build things from the bottom up, using techniques and tools being developed today to make complete, highly advanced products.

Nanotools

Take a random selection of scientists, engineers, investors and the general public and ask them what nanotechnology is and you will receive a range of replies as broad as nanotechnology itself. For many scientists, it is nothing startlingly new; after all we have been working at the nanoscale for decades, through electron microscopy, scanning probe microscopies or simply growing and analyzing thin films. For most other groups, however, nanotechnology means something far more ambitious, miniature submarines in the bloodstream, little cogs and gears made out of atoms, space elevators made of nanotubes, and the colonization of space.

Nanorobot
The robot in this illustration swims through the arteries and veins using a pair of tail appendages.

E. coli
Nanorobot designers sometimes look at microscopic organisms for propulsion inspiration, like the flagellum on this E. coli cell.
 
Nanotechnology is very diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, from developing new materials with dimensions on the nanoscale to investigating whether we can directly control matter on the atomic scale. It can be difficult to imagine exactly how this greater understanding of the world of atoms and molecules has and will affect the everyday objects we see around us, but some of the areas where nanotechnologies are set to make a difference are described below.



Nanotechnology, in one sense, is the natural continuation of the miniaturization revolution we have witnessed over recent decades, in which millionth-of-a-meter (10⁻⁶ m) tolerances (microengineering) became commonplace, for example in the automotive and aerospace industries, enabling the construction of higher-quality and safer vehicles and planes.

It was the computer industry that kept pushing the limits of miniaturization, and many electronic devices we see today have nano-scale features that owe their origins to the computer industry – such as cameras, CD and DVD players, car airbag pressure sensors and inkjet printers.

Nanotechnology is fundamentally a materials science that has the following characteristics:
  • Research and development at molecular or atomic levels, with lengths ranging from about 1 to 100 nanometers;
  • Creation and use of systems, devices, and structures that have special functions or properties because of their small size;
  • Ability to control or manipulate matter on a molecular or atomic scale.

Undoubtedly, nanotechnology is going to shape the future, as research continues to push the technology from materials with nanoscale dimensions toward materials structured at the atomic scale. New methods such as molecular self-assembly have been developed to make this possible. There may come a future in which common basic needs like food, shelter and even costly diamonds are made by nanorobots.

Nanorobots
Driller nanorobots (Graphics by Erik Viktor)

What is the nanoscale?


The prefix “nano” originates from the Greek word for “dwarf”. A nanometre is one billionth of a meter - tiny, only about the length of ten hydrogen atoms. Although a meter is defined by the International Standards Organization as "the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second" and a nanometre is by definition 10⁻⁹ of a meter, this does not help scientists communicate the nanoscale to non-scientists.

Although scientists have manipulated matter at the nanoscale for centuries, calling it physics or chemistry, it was not until a new generation of microscopes was invented in the 1980s that the world of atoms and molecules could be visualized and managed. It is human nature to relate sizes to everyday objects, and the most common way of conveying the nanoscale is in relation to the width of a human hair.

Nanoscale
Examples of the nanoscale

In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale.

Rather than asking anyone to imagine a millionth or a billionth of something, which few sane people can accomplish with ease, relating nanotechnology to atoms often makes the nanometre easier to imagine. This clearly shows how small a nanometre is: even smaller than the wavelength of visible light, and about a hundred-thousandth the width of a human hair. But it is still big when compared to the atomic scale.
 Nanoscale

Let us compare different units with respect to meters.
  • 1 centimeter – 100th of a meter
  • 1 millimeter – 1,000th of a meter
  • 1 micrometer – 1,000,000th of a meter
  • 1 nanometer –  1,000,000,000th of a meter

As small as a nanometer is, it is still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom's nucleus is much smaller - about 0.00001 nm.
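A quick numeric sanity check of the scale comparisons above; the roughly 100-micrometer hair width is a commonly quoted rough figure assumed for this sketch, not a number taken from the text.

```python
METER = 1.0
MILLIMETER = 1e-3 * METER
MICROMETER = 1e-6 * METER
NANOMETER = 1e-9 * METER

hair_width = 100 * MICROMETER     # a commonly quoted rough figure for a human hair
atom_diameter = 0.1 * NANOMETER   # typical atomic diameter mentioned above

print(f"nanometers across a human hair : {hair_width / NANOMETER:,.0f}")    # 100,000
print(f"atoms across one nanometer     : {NANOMETER / atom_diameter:.0f}")  # ~10
```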

Nanoscale

Generally, nanotechnology deals with structures sized between 1 and 100 nanometers in at least one dimension, and involves developing materials or devices within that size range. Quantum mechanical effects are very important at this scale, which is in the quantum realm. Larger than that is the microscale, and smaller than that is the atomic scale.

References:
HowStuffWorks
Nanotechnology Now
YouTube