
Nanotechnology Applications


Nanobiotechnology


Nanobiotechnology is the unification of biotechnology and nanotechnology. This hybrid discipline deals with making atomic-scale machines by imitating or incorporating biological systems at the molecular level, or with building tiny tools to study or change the properties of natural structures atom by atom. This goal can be attained by combining classical micro-technology with a molecular biological approach.

Nanobiotechnology

Tiny medical hardware can interact with tissue in the human body at the molecular level to carry out more precise diagnosis and treatment of malignancies. Many illnesses and injuries have their origins in nanoscale processes. Accordingly, the practical application of nanotechnology to medical care and biomedical research opens up opportunities to treat illnesses, repair injuries, and enhance human functioning beyond what is possible with larger-scale techniques.

How nanotechnology affects everything


Nanotechnology scientific impact


While there is a commonly held belief that nanotechnology is a futuristic science with applications 25 years in the future and beyond, nanotechnology is anything but science fiction. In recent years, over a dozen Nobel Prizes have been awarded for nanotechnology-related work, from the development of the scanning probe microscope (SPM) to the discovery of fullerenes. Almost every university in the world has a nanotechnology department, or is seeking the funds to create one.
Nanotechnology offers opportunities in creating new features and functions and is already providing the solutions to many long-standing medical, social and environmental problems. Because of its potential, nanotechnology is of global interest, attracting more public funding than any other area of technology.

There is an unprecedented multidisciplinary convergence of scientists dedicated to the study of a world so small that we cannot see it - even with a light microscope. That world is the field of nanotechnology, the realm of atoms and nanostructures, something so new that no one is really sure what will come of it.

One of the exciting and challenging aspects of the nanoscale is the role that quantum mechanics plays in it. The rules of quantum mechanics are very different from those of classical physics, which means that the behavior of substances at the nanoscale can sometimes contradict common sense. You cannot walk up to a wall and immediately teleport to the other side of it, but at the nanoscale an electron can - it's called electron tunneling. Substances that are insulators in bulk form, meaning they cannot carry an electric current, might become semiconductors when reduced to the nanoscale. Melting points can change because of the much greater surface area relative to volume. Much of nanoscience requires that you forget what you know and start learning all over again.

History of Nanotechnology

 
Traditionally, the origins of nanotechnology are traced back to December 29, 1959, when Professor Richard Feynman (a 1965 Nobel Prize winner in physics) presented a lecture entitled “There’s Plenty of Room at the Bottom” during the annual meeting of the American Physical Society at the California Institute of Technology (Caltech). In this talk, Feynman spoke about the principles of miniaturization and atomic-level precision and how these concepts do not violate any known law of physics. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale.

He described a field that few researchers had thought much about, let alone investigated. Feynman presented the idea of manipulating and controlling things on an extremely small scale by building and shaping matter one atom at a time. He proposed that it was possible to build a surgical nanoscale robot by developing quarter-scale manipulator hands that would build quarter-scale machine tools analogous to those found in machine shops, continuing until the nanoscale is reached, eight iterations later.
 
Richard Feynman

Printed Carbon Nanotube Transistor

Researchers from Aneeve Nanotechnologies have used low-cost ink-jet printing to fabricate the first circuits composed of fully printed back-gated and top-gated carbon nanotube-based electronics for use with OLED displays. OLED-based displays are used in cell phones, digital cameras and other portable devices.

But developing a lower-cost method for mass-producing such displays has been complicated by the difficulties of incorporating thin-film transistors that use amorphous silicon and polysilicon into the production process.

In this innovative study, the team made carbon nanotube thin-film transistors with high mobility and a high on-off ratio, completely based on ink-jet printing. They demonstrated the first fully printed single-pixel OLED control circuits, and their fully printed thin-film circuits showed significant performance advantages over traditional organic-based printed electronics.

This distinct process utilizes an ink-jet printing method that eliminates the need for expensive vacuum equipment and lends itself to scalable manufacturing and roll-to-roll printing. The team solved many material integration problems, developed new cleaning processes and created new methods for negotiating nano-based ink solutions.
Ink-jet-printed circuit. (Credit: University of California - Los Angeles)
For active-matrix OLED applications, the printed carbon nanotube transistors will be fully integrated with OLED arrays, the researchers said. The encapsulation technology developed for OLEDs will also keep the carbon nanotube transistors well protected, as the organics in OLEDs are very sensitive to oxygen and moisture.

(Adapted from PhysOrg)

Graphene bubbles improve lithium-air batteries

A team of scientists from the Pacific Northwest National Laboratory and Princeton University used a new approach to build a graphene membrane for use in lithium-air batteries, which could, one day, replace conventional batteries in electric vehicles. Resembling coral, this porous graphene material could replace the traditional smooth graphene sheets in lithium-air batteries, which become clogged with tiny particles during use.


Resembling broken eggshells, graphene structures built around bubbles produced a lithium-air battery with the highest energy capacity to date. As an added bonus, the team’s new material does not rely on platinum or other precious metals, reducing its potential cost and environmental impact.


World’s lightest material


Ultra light (<10 milligrams per cubic centimeter) cellular materials are desirable for thermal insulation, battery electrodes, catalyst supports, and acoustic, vibration, or shock energy damping. A team of researchers from UC Irvine, HRL Laboratories and the California Institute of Technology have developed the world's lightest material – with a density of 0.9 mg/cc. The new material redefines the limits of lightweight materials because of its unique "micro-lattice" cellular architecture. The researchers were able to make a material that consists of 99.99 percent air by designing the 0.01 percent solid at the nanometer, micron and millimeter scales. "The trick is to fabricate a lattice of interconnected hollow tubes with a wall thickness 1,000 times thinner than a human hair," said lead author Dr. Tobias Schaedler of HRL.
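The "99.99 percent air" figure follows directly from the densities involved. Here is a quick back-of-the-envelope check in Python - a minimal sketch that assumes, purely for illustration, that the solid phase is a nickel-based alloy with a bulk density of roughly 8.9 g/cc:

# Rough check of the "99.99 percent air" claim for the micro-lattice.
# Assumption for illustration: the solid is a nickel-based alloy with a
# bulk density of about 8.9 g/cc (8,900 mg/cc).
lattice_density = 0.9      # mg per cubic centimeter (reported density)
bulk_density = 8900.0      # mg per cubic centimeter (assumed bulk alloy)

solid_fraction = lattice_density / bulk_density   # volume fraction of solid
air_fraction = 1.0 - solid_fraction

print(f"Solid fraction: {solid_fraction:.4%}")    # about 0.01%
print(f"Air fraction:   {air_fraction:.4%}")      # about 99.99%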

Ultra Light Material
Photo by Dan Little, HRL Laboratories

The material's architecture allows unprecedented mechanical behavior for a metal, including complete recovery from compression exceeding 50 percent strain and extraordinarily high energy absorption. "Materials actually get stronger as the dimensions are reduced to the nanoscale," explained UCI mechanical and aerospace engineer Lorenzo Valdevit, UCI's principal investigator on the project. "Combine this with the possibility of tailoring the architecture of the micro-lattice and you have a unique cellular material."

William Carter, manager of the architected materials group at HRL, compared the new material to larger, more familiar edifices: "Modern buildings, exemplified by the Eiffel Tower or the Golden Gate Bridge, are incredibly light and weight-efficient by virtue of their architecture. We are revolutionizing lightweight materials by bringing this concept to the nano and micro scales."

Adapted from PhysOrg

Butterfly wings inspire new design


Engineers have been trying to create water repellent surfaces, but past attempts at artificial air traps tended to lose their contents over time due to external perturbations. Now an international team of researchers from Sweden, the United States, and Korea has taken advantage of what might normally be considered defects in the nanomanufacturing process to create a multilayered silicon structure that traps air and holds it for longer than one year.

Blue Mountain Swallowtail

Researchers mimicked the many-layered nanostructure of blue mountain swallowtail (Papilio ulysses) wings to make a silicon wafer that traps both air and light. The brilliant blue wings of this butterfly easily shed water because of the way ultra-tiny structures in the wings trap air and create a cushion between water and wing.

Blue Mountain Swallowtail

The researchers used an etching process to carve out micro-scale pores and sculpt tiny cones from the silicon. The team found that features of the resulting structure that might usually be considered defects, such as undercuts beneath the etching mask and scalloped surfaces, actually improved the water repellent properties of the silicon by creating a multilayered hierarchy of air traps. The intricate structure of pores, cones, bumps, and grooves also succeeded in trapping light, almost perfectly absorbing wavelengths just above the visible range.

New biosensor made of nanotubes

Standard sensors employ metal electrodes coated with enzymes that react with compounds and produce an electrical signal that can be measured. However, the inefficiency of those sensors leads to imperfect measurements. Now, scientists at Purdue University have developed a new method for stacking synthetic DNA and carbon nanotubes onto a biosensor electrode.

Carbon nanotubes, cylindrically shaped carbon molecules known to have excellent thermal and electrical properties, have been seen as a possibility for improving sensor performance. The problem is that the materials are not fully compatible with water, which limits their application in biological fluids.



Marshall Porterfield and Jong Hyun Choi have found a solution and reported their findings in the journal The Analyst, describing a sensor that essentially builds itself.

"In the future, we will be able to create a DNA sequence that is complementary to the carbon nanotubes and is compatible with specific biosensor enzymes for the many different compounds we want to measure," Porterfield said. "It will be a self-assembling platform for biosensors at the biomolecular level."

Choi developed a synthetic DNA that will attach to the surface of the carbon nanotubes and make them more water-soluble. "Once the carbon nanotubes are in a solution, you only have to place the electrode into the solution and charge it. The carbon nanotubes will then coat the surface," Choi said.

The electrode coated with carbon nanotubes will attract the enzymes to finish the sensor's assembly. The sensor described in the findings was designed for glucose. But Porterfield said it could be easily adapted for various compounds. "You could mass produce these sensors for diabetes, for example, for insulin management for diabetic patients," Porterfield said.

Electron Tweezers

A recent paper by researchers from the National Institute of Standards and Technology (NIST) and the University of Virginia (UVA) demonstrates that the beams produced by modern electron microscopes can be used to manipulate nanoscale objects.

The tool is an electron version of the laser "optical tweezers" that have become a standard tool in biology, physics and chemistry for manipulating tiny particles. Except that electron beams could offer a thousand-fold improvement in sensitivity and resolution.

If you just consider the physics, you might expect that a beam of focused electrons -- such as that created by a transmission electron microscope (TEM) -- could do the same thing. However, that's never been seen, in part because electrons are much fussier to work with. They can't penetrate far through air, for example, so electron microscopes use vacuum chambers to hold specimens.

So Vladimir Oleshko and his colleague James Howe were surprised when, in the course of another experiment, they found themselves watching an electron tweezer at work. They were using an electron microscope to study, in detail, what happens when a metal alloy melts or freezes. They were observing a small particle -- a few hundred microns wide -- of an aluminum-silicon alloy held just at a transition point where it was partially molten, a liquid shell surrounding a core of still-solid metal.

"This effect of electron tweezers was unexpected because the general purpose of this experiment was to study melting and crystallization," Oleshko explains. "We can generate this sphere inside the liquid shell easily; you can tell from the image that it's still crystalline. But we saw that when we move or tilt the beam -- or move the microscope stage under the beam -- the solid particle follows it, like it was glued to the beam."

Potentially, electron tweezers could be a versatile and valuable tool, adding very fine manipulation to wide and growing lists of uses for electron microscopy in materials science. "Of course, this is challenging because it requires a vacuum," he says, "but electron probes can be very fine, three orders of magnitude smaller than photon beams -- close to the size of single atoms. We could manipulate very small quantities, even single atoms, in a very precise way."

(Adapted from ScienceDaily)

Memory at the nanoscale

Metallic alloys can be stretched or compressed in such a way that they stay deformed once the strain on the material has been released. However, shape memory alloys can return to their original shape after being heated above a specific temperature.

Now, for the first time, two physicists from the University of Constance determined the absolute values of temperatures at which shape memory nanospheres start changing back to their memorized shape, undergoing the so-called structural phase transition, which depends on the size of particles studied. To achieve this result, they performed a computer simulation using nanoparticles with diameters between 4 and 17 nm made of an alloy of equal proportions of nickel and titanium.


To date, research efforts to establish structural phase transition temperature have mainly been experimental. Most of the prior work on shape memory materials was in macroscopic scale systems and used for applications such as dental braces, stents or oil temperature-regulating devices for bullet trains.

Thanks to a computerized method known as molecular dynamics simulation, Daniel Mutter and Peter Nielaba were able to visualize the transformation process of the material during the transition. As the temperature increased, they showed that the material's atomic-scale crystal structure shifted from a lower to a higher level of symmetry. They found that the strong influence of the energy difference between the low- and high-symmetry structure at the surface of the nanoparticle, which differed from that in its interior, could explain the transition.

Potential new applications include the creation of nanoswitches, where laser irradiation could heat up such shape memory material, triggering a change in its length that would, in turn, function as a switch.

The Future of Computers - Quantum nanocomputers

A quantum computer uses quantum mechanical phenomena, such as entanglement and superposition, to process data. Quantum computation aims to use the quantum properties of particles to represent and structure data, and to exploit quantum mechanics in devising the operations performed on that data.



The quantum mechanical properties of atoms or nuclei allow these particles to work together as quantum bits, or qubits. These qubits together form the computer's processor and memory; they can interact with each other while remaining isolated from the external environment, which enables them to perform certain calculations much faster than conventional computers. By computing many different numbers simultaneously and then interfering the results to get a single answer, a quantum computer can perform a large number of operations in parallel and ends up being far more powerful than a digital computer of the same size.
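To make superposition and interference concrete, here is a minimal single-qubit sketch in Python using numpy. It is a generic textbook illustration, not a description of any particular quantum computer: one Hadamard gate puts a qubit into an equal superposition, and a second Hadamard makes the two computational paths interfere.

import numpy as np

# A qubit is a 2-element complex state vector: |0> = [1, 0], |1> = [0, 1].
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0                    # amplitudes 1/sqrt(2) for both states
print(np.abs(superposed) ** 2)           # [0.5, 0.5]: 50/50 measurement odds

# A second Hadamard makes the two paths interfere: the |1> amplitudes cancel
# (destructive interference) and the qubit returns to |0> with certainty.
back = H @ superposed
print(np.abs(back) ** 2)                 # [1.0, 0.0]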



A quantum nanocomputer would work by storing data in the form of atomic quantum states or spin. Technology of this kind is already under development in the form of single-electron memory (SEM) and quantum dots. In quantum mechanical terms, waves would store the state of each nanoscale component, and information would be stored as the spin orientation or state of an atom. With the correct setup, constructive interference would reinforce the wave patterns that hold the right answer, while destructive interference would cancel out the wave patterns holding wrong answers.



The energy state of an electron within an atom, represented by the electron energy level or shell, can theoretically represent one, two, four, eight, or even 16 bits of data. The main problem with this technology is instability. Instantaneous electron energy states are difficult to predict and even more difficult to control. An electron can easily fall to a lower energy state, emitting a photon; conversely, a photon striking an atom can cause one of its electrons to jump to a higher energy state.
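The relationship between energy states and bits is simple arithmetic: encoding n bits requires at least 2^n reliably distinguishable states, which is part of why the instability described above matters so much. A small illustration in Python:

import math

# n bits require 2**n reliably distinguishable states.
for bits in (1, 2, 4, 8, 16):
    print(f"{bits:>2} bits -> {2 ** bits:>6} distinguishable states needed")

# Conversely, a system offering k usable states stores log2(k) bits.
print(math.log2(16))   # 4.0 bits from 16 states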



I promise to write a series of articles on quantum issues very shortly...

The Future of Computers - Mechanical nanocomputers

Pioneers such as Eric Drexler proposed, as far back as the mid-1980s, that nanoscale mechanical computers could be built via molecular manufacturing, through a process of mechanically positioning atoms or molecular building blocks one atom or molecule at a time - a process known as “mechanosynthesis.”


Once assembled, the mechanical nanocomputer, other than being greatly scaled down in size, would operate much like a complex, programmable version of the mechanical calculators used during the 1940s to 1970s, preceding the introduction of widely available, inexpensive solid-state electronic calculators.

Drexler's theoretical design used rods sliding in housings, with bumps that would interfere with the motion of other rods. It was a mechanical nanocomputer that would use tiny mobile components, called nanogears, to encode information.


Drexler and his collaborators favored designs resembling miniature versions of Charles Babbage's 19th-century analytical engine: mechanical nanocomputers that would calculate using moving molecular-scale rods and rotating molecular-scale wheels, spinning on shafts and bearings.
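As a rough illustration of the rod-and-bump idea, here is a toy Boolean model in Python. It is only a sketch of the general principle - an output rod that is blocked whenever any extended input rod lies in its path behaves like a NOR gate - and not a rendering of any specific published rod-logic design.

# Toy model of rod logic: an output probe rod can slide forward only if no
# extended input rod blocks its path (an illustrative sketch, not a real design).
def rod_gate(*inputs_extended):
    """Return True if the probe rod moves, i.e. no input rod blocks it."""
    blocked = any(inputs_extended)
    return not blocked            # behaves like a NOR gate

# Truth table for two input rods.
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", rod_gate(a, b))

# NOR is a universal gate, so in principle any logic circuit could be built
# from enough interlocking rods of this kind.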


For this reason, mechanical nanocomputer technology has sparked controversy and some researchers even consider it unworkable. All the problems inherent in Babbage's apparatus, according to the naysayers, are magnified a million fold in a mechanical nanocomputer. Nevertheless, some futurists are optimistic about the technology, and have even proposed the evolution of nanorobots that could operate, or be controlled by, mechanical nanocomputers.

The Future of Computers - Chemical and Biochemical nanocomputers


Chemical Nanocomputer


In general terms, a chemical computer is one that processes information by making and breaking chemical bonds, and it stores logic states or information in the resulting chemical (i.e., molecular) structures. In a chemical nanocomputer, computing is based on chemical reactions (bond breaking and forming): the inputs are encoded in the molecular structure of the reactants, and the outputs are extracted from the structure of the products. In other words, the interactions between different chemicals and their structures are used to store and process information.

nanotubes

These computing operations would be performed selectively among molecules taken just a few at a time, in volumes only a few nanometers on a side. To create a chemical nanocomputer, therefore, engineers need to be able to control individual atoms and molecules so that they can be made to perform controllable calculations and data storage tasks. The development of a true chemical nanocomputer will likely proceed along lines similar to genetic engineering.

nanotubes

Biochemical nanocomputers


Both chemical and biochemical nanocomputers would store and process information in terms of chemical structures and interactions. Proponents of biochemically based computers can point to an “existence proof” for them in the commonplace activities of humans and other animals with multicellular nervous systems: biochemical nanocomputers already exist in nature and are manifest in all living things. But these systems are largely uncontrollable by humans, which makes the artificial fabrication or implementation of this category of “natural” biochemically based computers seem far off, because the mechanisms of animal brains and nervous systems are still poorly understood. We cannot, for example, program a tree to calculate the digits of pi, or program an antibody to fight a particular disease (although medical science has come close to this ideal in the formulation of vaccines, antibiotics, and antiviral medications).

DNA nanocomputer


In 1994, Leonard Adleman took a giant step towards a different kind of chemical or artificial biochemical computer when he used fragments of DNA to compute the solution to a complex graph theory problem.

DNA computer

Using the tools of biochemistry, Adleman was able to extract the correct answer to the graph theory problem out of the many random paths represented by the product DNA strands. Like a computer with many processors, this type of DNA computer is able to consider many solutions to a problem simultaneously. Moreover, the DNA strands employed in such a calculation (approximately 10^17) are many orders of magnitude greater in number, and more densely packed, than the processors in today's most massively parallel electronic supercomputers. As a result of Adleman's work, the chemical nanocomputer is the only one of the aforementioned four types to have been demonstrated for an actual calculation.
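The generate-and-filter idea behind the experiment can be imitated with a small in-silico sketch in Python. The graph, the pool size and the helper function below are invented for illustration; Adleman's actual experiment encoded a specific seven-vertex graph in DNA strands and did the filtering with laboratory steps, not list comprehensions.

import random

# A made-up directed graph; we look for a Hamiltonian path from 0 to 4.
edges = {(0, 1), (0, 2), (1, 2), (2, 3), (1, 3), (3, 4), (2, 4)}
n_vertices, start, end = 5, 0, 4

def random_path(length):
    """A random walk standing in for one randomly assembled DNA strand."""
    path = [start]
    while len(path) < length:
        path.append(random.randrange(n_vertices))
    return tuple(path)

pool = {random_path(n_vertices) for _ in range(200_000)}   # the "test tube"

# Keep only strands that start and end correctly, visit every vertex once,
# and use only edges that actually exist in the graph.
solutions = [p for p in pool
             if p[0] == start and p[-1] == end
             and len(set(p)) == n_vertices
             and all(e in edges for e in zip(p, p[1:]))]

print(solutions)   # e.g. [(0, 1, 2, 3, 4)]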


These computers use DNA to store information and perform complex calculations. DNA has a vast storage capacity, which is what allows it to hold the complex blueprints of living organisms; a single gram of DNA can hold as much information as one trillion compact discs.

The Future of Computers - Electronic nanocomputers


Thanks to our fifty years of experience with electronic computing devices, including the extensive research and industrial infrastructure built up since the late 1940s, advances in nanocomputing technology are likely to come in this direction. Electronic nanocomputers therefore appear to be the easiest and most likely direction in which to continue nanocomputer development in the near future.
Electronic nanocomputers would operate in a manner similar to the way present-day microcomputers work. The main difference is one of physical scale. More and more transistors are squeezed into silicon chips with each passing year; witness the evolution of integrated circuits (ICs) capable of ever-increasing storage capacity and processing power.

The ultimate limit to the number of transistors per unit volume is imposed by the atomic structure of matter. Most engineers agree that technology has not yet come close to pushing this limit. In the electronic sense, the term nanocomputer is relative; by 1970s standards, today's ordinary microprocessors might be called nanodevices.

How it works


The power and speed of computers have grown rapidly because of rapid progress in solid-state electronics dating back to the invention of the transistor in 1948. Most important, there has been exponential increase in the density of transistors on integrated-circuit computer chips over the past 40 years. In that time span, though, there has been no fundamental change in the operating principles of the transistor.
Even microelectronic transistors no more than a few microns (millionths of a meter) in size are bulk-effect devices. They still operate using small electric fields imposed by tiny charged metal plates to control the mass action of many millions of electrons.

Although electronic nanocomputers may not use the traditional concept of the transistor for their components, they will still operate by storing information in the positions of electrons.

At the current rate of miniaturization, conventional transistor technology will reach a minimum size limit within a few years. At that point, small-scale quantum mechanical effects, such as the tunneling of electrons through barriers made from matter or electric fields, will begin to dominate the effects that permit a mass-action semiconductor device to operate. Still, an electronic nanocomputer will continue to represent information in the storage and movement of electrons.
Nowadays, most electronic nanocomputers are created as microscopic circuits using nanolithography.

Nanolithography


Nanolithography is a term used to describe the branch of nanotechnology concerned with the study and application of a number of techniques for creating nanometer-scale structures, meaning patterns with at least one lateral dimension between the size of an individual atom and approximately 100 nm.

A nanometer is a billionth of a meter, much smaller than the width of a single human hair. The word lithography is used because the method of pattern generation is essentially the same as writing, only on a much smaller scale. Nanolithography is used during the fabrication of leading-edge semiconductor integrated circuits (nanocircuitry) or nanoelectromechanical systems (NEMS).


One common method of nanolithography, used particularly in the creation of microchips, is known as photolithography. This technique is a parallel method of nanolithography, in which the entire surface is patterned in a single exposure. Photolithography is limited in how far it can shrink features, however, because if the wavelength of the light used is made too small, the lens simply absorbs the light in its entirety. This means that photolithography cannot reach the super-fine sizes of some alternative technologies.
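The wavelength limit described above is commonly quantified with the Rayleigh resolution criterion: the minimum feature size is approximately k1 * wavelength / NA, where NA is the numerical aperture of the lens and k1 is a process-dependent factor. The short Python sketch below uses typical textbook values for k1 and NA, chosen purely for illustration.

# Rayleigh criterion: smallest printable feature ~ k1 * wavelength / NA.
# The k1 and NA values are illustrative, not tied to any specific tool.
def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.4):
    return k1 * wavelength_nm / numerical_aperture

print(min_feature_nm(193, 0.93))    # deep-UV (ArF) lithography: roughly 80 nm
print(min_feature_nm(13.5, 0.33))   # extreme-UV lithography: roughly 16 nm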

A technology that allows for smaller sizes than photolithography is electron-beam lithography. Using an electron beam to draw a pattern nanometer by nanometer, incredibly small sizes (on the order of 20 nm) may be achieved. Electron-beam lithography is much more expensive and time-consuming than photolithography, however, making it a difficult sell for industrial applications of nanolithography. Since electron-beam lithography functions more like a dot-matrix printer than a flash photograph, a job that would take five minutes using photolithography can take upwards of five hours with electron-beam lithography.




New nanolithography technologies are constantly being researched and developed, leading to smaller and smaller possible sizes. Extreme ultraviolet lithography, for example, is capable of using light at wavelengths of 13.5 nm. While hurdles still exist in this new field, it promises the possibility of sizes far below those produced by current industry standards. Other nanolithography techniques include dip-pen nanolithography, in which a small tip is used to deposit molecules on a surface. Dip-pen nanolithography can achieve very small sizes, but cannot currently go below 40 nm.


The Future of Computers - Nanocomputers

Scientific discussion of the development and fabrication of nanometer-scale devices began in 1959 with an influential lecture by the late, renowned physicist Richard Feynman. Feynman observed that it is possible, in principle, to build and operate submicroscopic machinery. He proposed that large numbers of completely identical devices might be assembled by manipulating atoms one at a time.


Feynman's proposal sparked an initial flurry of interest but it did not broadly capture the imagination of the technical community or the public. At the time, building structures one atom at a time seemed out of reach. Throughout the 1960s and 1970s advances in diverse fields prepared the scientific community for the first crude manipulations of nanometer-scale structures. The most obvious development was the continual miniaturization of digital electronic circuits, based primarily upon the invention of the transistor by Shockley, Brattain, and Bardeen in 1948 and the invention of the integrated circuit by Noyce, Kilby, and others in the late 1950s. In 1959, it was only possible to put one transistor on an integrated circuit. Twenty years later, circuits with a few thousand transistors were commonplace.


Scientists are trying to use nanotechnology to make very tiny chips, electrical conductors and logic gates. Using nanotechnology, chips can be built up one atom at a time, with no wasted space, enabling much smaller devices to be built. With this technology, logic gates will be composed of just a few atoms, electrical conductors (called nanowires) will be merely an atom thick, and a data bit will be represented by the presence or absence of an electron.

A nanocomputer is a computer whose physical dimensions are microscopic; smaller than the microcomputer, which is smaller than the minicomputer. (The minicomputer is called "mini" because it was a lot smaller than the original (mainframe) computers.)

A component of nanotechnology, nanocomputing will, as suggested or proposed by researchers and futurists, give rise to four types of nanocomputers: electronic, chemical and biochemical, mechanical, and quantum nanocomputers.

Keep reading the next posts…

Nanotechnology


What is nanotechnology?


In simple terms, nanotechnology can be defined as ‘engineering at a very small scale’, and the term can be applied to many areas of research and development – from medicine to manufacturing to computing, and even to textiles and cosmetics. Nanotechnology (sometimes shortened to "nanotech") is a multidisciplinary science that studies how we can manipulate matter at the molecular and atomic scale. In other words, it is the engineering of tiny machines – the projected ability to build things from the bottom up, using techniques and tools being developed today, to make complete, highly advanced products.

Nanotools

Take a random selection of scientists, engineers, investors and the general public and ask them what nanotechnology is, and you will receive a range of replies as broad as nanotechnology itself. For many scientists, it is nothing startlingly new; after all, we have been working at the nanoscale for decades, through electron microscopy, scanning probe microscopies or simply growing and analyzing thin films. For most other groups, however, nanotechnology means something far more ambitious: miniature submarines in the bloodstream, little cogs and gears made out of atoms, space elevators made of nanotubes, and the colonization of space.

Nanorobot
The robot in this illustration swims through the arteries and veins using a pair of tail appendages.

E-coli
Nanorobot designers sometimes look at microscopic organisms for propulsion inspiration, like the flagellum on this E. coli cell.
 
Nanotechnology is very diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, from developing new materials with dimensions on the nanoscale to investigating whether we can directly control matter on the atomic scale. It can be difficult to imagine exactly how this greater understanding of the world of atoms and molecules has and will affect the everyday objects we see around us, but some of the areas where nanotechnologies are set to make a difference are described below.



Nanotechnology, in one sense, is the natural continuation of the miniaturization revolution that we have witnessed over the last decade, in which tolerances of a millionth of a meter (10^-6 m) - microengineering - became commonplace, for example in the automotive and aerospace industries, enabling the construction of higher-quality and safer vehicles and planes.

It was the computer industry that kept pushing the limits of miniaturization, and many electronic devices we see today have nano-scale features that owe their origins to the computer industry – such as cameras, CD and DVD players, car airbag pressure sensors and inkjet printers.

Nanotechnology is fundamentally a materials science that has the following characteristics:
  • Research and development at molecular or atomic levels, with lengths ranging between about 1 and 100 nanometers;
  • Creation and use of systems, devices, and structures that have special functions or properties because of their small size;
  • Ability to control or manipulate matter on a molecular or atomic scale.

Undoubtedly, nanotechnology is going to shape the future, as research moves from materials with dimensions at the nanoscale toward materials structured at the atomic scale. New methods such as molecular self-assembly have been developed to make this possible. There may come a future in which all the common basic needs - food, shelter and even costly diamonds - will be made by nanorobots.

Nanorobots
Driller nanorobots (Graphics by Erik Viktor)

What is the nanoscale?


The prefix “nano” in nanotechnology originates from the Greek word for “dwarf”. A nanometre is one billionth of a meter - tiny, only the length of ten hydrogen atoms. Although a meter is defined by the International Organization for Standardization as "the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second" and a nanometre is by definition 10^-9 of a meter, this does not help scientists to communicate the nanoscale to non-scientists.

Although scientists have manipulated matter at the nanoscale for centuries, calling it physics or chemistry, it was not until a new generation of microscopes was invented in the 1980s that the world of atoms and molecules could be visualized and managed. It is in human nature to relate sizes to everyday objects, and the commonest way of conveying the nanoscale is by reference to the width of a human hair.

Nanoscale
Examples of the nanoscale

In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale.

Rather than asking anyone to imagine a millionth or a billionth of something, which few sane people can accomplish with ease, relating nanotechnology to atoms often makes the nanometre easier to imagine. This comparison shows just how small a nanometre is: it is even smaller than the wavelength of visible light and a hundred-thousandth the width of a human hair. But it is still big when compared to the atomic scale.
 Nanoscale

Let us compare different units with respect to meters.
  • 1 centimeter – 100th of a meter
  • 1 millimeter – 1,000th of a meter
  • 1 micrometer – 1,000,000th of a meter
  • 1 nanometer –  1,000,000,000th of a meter

As small as a nanometer is, it is still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom's nucleus is much smaller - about 0.00001 nm.
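A few of these comparisons, worked out explicitly in Python. The hair width used below (about 100,000 nanometers, i.e. 0.1 mm) is an assumed typical value for illustration.

# Working out the scale comparisons with rough, illustrative numbers.
nanometers_per_meter = 1_000_000_000
atom_diameter_nm = 0.1      # rough diameter of a single atom
hair_width_nm = 100_000     # assumed typical human hair width (0.1 mm)

print(f"Atoms spanning one nanometer: {1 / atom_diameter_nm:.0f}")                  # ~10
print(f"Atoms across a human hair:    {hair_width_nm / atom_diameter_nm:,.0f}")     # ~1,000,000
print(f"Hair widths in one meter:     {nanometers_per_meter / hair_width_nm:,.0f}") # 10,000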

Nanoscale

Generally, nanotechnology deals with structures sized between 1 and 100 nanometers in at least one dimension, and involves developing materials or devices within that size range. Quantum mechanical effects are very important at this scale, which is in the quantum realm. Larger than that is the microscale, and smaller than that is the atomic scale.

References:
HowStuffWorks
Nanotechnology Now
YouTube

The Future of Computers - Overview


Computers' Evolution


The history of computers and computer technology thus far has been a long and fascinating one, stretching back more than half a century to the first primitive computing machines. These machines were huge and complicated affairs, consisting of row upon row of vacuum tubes and wires, often taking up several rooms.
 



In the past twenty years, there has been a dramatic increase in the processing speed of computers, network capacity and the speed of the internet. These advances have paved the way for revolutions in fields such as quantum physics, artificial intelligence and nanotechnology, and they will have a profound effect on the way we live and work; the virtual reality we see in movies like The Matrix may actually come true in the next decade or so.
Today's computers operate using transistors, wires and electricity. Future computers might use atoms, fibers and light. Take a moment to consider what the world might be like if computers the size of molecules became a reality. These are the types of computers that could be everywhere, yet never be seen: nano-sized bio-computers that could target specific areas inside your body, and giant networks of computers in your clothing, your house, your car - embedded in almost every aspect of our lives, and yet you might never give them a single thought.



As anyone who has looked at the world of computers lately can attest, the size of computers has been reduced sharply, even as the power of these machines has increased at an exponential rate. In fact, the cost of computers has come down so much that many households now own not only one, but two, three or even more, PCs.
As the world of computers and computer technology continues to evolve and change, many people, from science fiction writers and futurists to computer workers and ordinary users have wondered what the future holds for the computer and related technologies. Many things have been pictured, from robots in the form of household servants to computers so small they can fit in a pocket. Indeed, some of these predicted inventions have already come to pass, with the introduction of PDAs and robotic vacuum cleaners.

Understanding the theories behind these future computer technologies is not for the meek. My research into quantum computers was made all the more difficult after I learned that in light of her constant interference, it is theoretically possible my mother-in-law could be in two places at once.

Nanotechnology is another important part of the future of computers, expected to have a profound impact on people around the globe. Nanotechnology is the process whereby matter is manipulated at the atomic level, providing the ability to “build” objects from their most basic parts. Like robotics and artificial intelligence, nanotechnology is already in use in many places, providing everything from stain resistant clothing to better suntan lotion. These advances in nanotechnology are likely to continue in the future, making this one of the most powerful aspects of future computing.

In the future, the number of tiny but powerful computers you encounter every day will number in the thousands, perhaps millions. You won't see them, but they will be all around you. Your personal interface to this powerful network of computers could come from a single computing device that is worn on or in the body.

Quantum computers are also likely to transform the computing experience, for both business and home users. These powerful machines are already on the drawing board, and they are likely to be introduced in the near future. The quantum computer is expected to be a giant leap forward in computing technology, with exciting implications for everything from scientific research to stock market predictions.

Moore's law


Visit any site on the web writing about the future of computers and you will most likely find mention of Moore's Law. Moore's Law is not a strictly adhered-to mathematical formula, but a prediction made by Intel co-founder Gordon Moore in 1965, in a paper where he noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and predicted that the trend would continue "for at least ten years".



Moore predicted that computing technology would increase in value at the same time as it decreased in cost, describing a long-term trend in the history of computing hardware. More specifically, he predicted that innovation would allow the number of transistors that can be placed inexpensively on an integrated circuit to double approximately every two years. The trend has continued for more than half a century and is not expected to stop until 2015 or later.

A computer transistor acts like a small electronic switch. Just like the light switch on your wall, a transistor has only two states, On or Off. A computer interprets this on/off state as a 1 or a 0. Put a whole bunch of these transistors together and you have a computer chip. The central processing unit (CPU) inside your computer probably has around 500 million transistors.
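As a worked example of the doubling rule, the short Python sketch below projects that transistor count forward, assuming an idealized doubling every two years; it is an illustration of the trend, not a forecast for any particular chip.

# Moore's law as a simple doubling rule: the transistor count doubles
# roughly every two years. Starting figure taken from the paragraph above.
transistors_today = 500_000_000
doubling_period_years = 2

for years_from_now in range(0, 11, 2):
    count = transistors_today * 2 ** (years_from_now / doubling_period_years)
    print(f"In {years_from_now:2d} years: about {count:,.0f} transistors")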

Shrinking transistor size not only makes chips smaller, but faster. One benefit of packing transistors closer together is that the electronic pulses take less time to travel between transistors. This can increase the overall speed of the chip. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras. All of these are improving at (roughly) exponential rates as well.



Not everyone agrees that Moore's Law has been accurate throughout the years (the prediction has changed since its original version), or that it will hold true in the future. But does it really matter? The pace at which computers are doubling their smarts is happening fast enough for me.

Thanks to the innovation and drive of Gordon Moore and others like him, computers will continue to get smaller, faster and more affordable.