New Explanation for Key Step in Anthrax Infection Proposed by NIST and USAMRIID

A new hypothesis concerning a crucial step in the anthrax infection process has been advanced by scientists at the National Institute of Standards and Technology (NIST) and the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) at Fort Detrick, Md.

Anthrax toxins are sequestered from the cell surface (top) in a bubble-like endosome. The toxins have been thought to escape the endosome by threading their way through a hole in the endosome (lower left), but a new hypothesis suggests they may rupture the endosome (lower right).
Credit: Robertson/NIST

The research teams explored the behavior of the toxins that rapidly overwhelm the body as the often-fatal disease progresses. Their findings* suggest a new possible mechanism by which anthrax bacteria deliver the protein molecules that poison victims. Because anthrax is easily weaponized, the findings could help lead to more effective treatments.

Anthrax bacteria kill by releasing three toxins that work in concert to destroy cells. One toxin, called PA, attaches to the cell membrane, where its surface serves as a sort of landing pad for the other two toxins, called LF and EF. Once several molecules of LF and EF have latched onto PA, the cell membrane tries to destroy these unwanted hangers-on by wrapping them up in an “endosome,” a small bubble of membrane that gets pinched off and moved into the cell’s interior. There, the cell attempts to destroy its contents by a process that includes making the interior of the endosome more acidic. But before the cell can fully carry out its plan, the LF and EF escape from the endosome and wreak havoc in the cell’s interior. The question is: how do these toxins escape?

“A recent hypothesis is that LF and EF completely unfold and then squeeze through the narrow hole that PA forms in the endosomal membrane,” says NIST physical scientist John Kasianowicz. “However, the studies used to support this concept make use of short segments of the toxins, not their native full-length versions. The results don’t show that the complete LF and EF are transported through the pore or whether they refold into functional enzymes once they reach the other side. So, we decided to look at other possible explanations.”

The NIST/USAMRIID team explored the behavior of full-length toxins using an artificial membrane that mimics a cell’s exterior. They put the toxins mixed in salt water on one side of this barrier and slowly rendered this fluid more acidic, resembling conditions within an endosome. But the change in chemistry apparently altered the physical characteristics of the LF and EF toxins, because it caused them to bind irreversibly to the PA pore, creating a “complex” of multiple toxins. This result alone suggested it would be difficult, if not impossible, for LF and EF to thread through the pore.

In addition, the team discovered that the bound toxins tend to rupture membranes. This finding led them to suggest that perhaps it is complexes of LF or EF bound to PA that get into cells, and that these complexes are the active toxins inside cells.

Kasianowicz says this new hypothesis could explain previous experimental results, in which the complex was found in the blood of animals that died of anthrax. But he emphasizes that the matter is not yet settled.

“We don’t know enough to choose between these theories—and in fact it’s possible that the toxins escape the endosome by more than one mechanism,” he says. “But it’s important that we better understand this step in the process to thwart anthrax more effectively.”

*B.J. Nablo, R.G. Panchal, S. Bavari, T.L. Nguyen, R. Gussio, W. Ribot, A. Friedlander, D. Chabot, J.E. Reiner, J.W.F. Robertson, A. Balijepalli, K.M. Halverson and J.J. Kasianowicz. Anthrax toxin-induced rupture of artificial lipid bilayer membranes. Journal of Chemical Physics, Vol. 139, Issue 6, Aug. 8, 2013. DOI: 10.1063/1.4816467.

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


JILA Researchers Discover Atomic Clock Can Simulate Quantum Magnetism

Researchers at JILA have for the first time used an atomic clock as a quantum simulator, mimicking the behavior of a different, more complex quantum system.*

Artist’s conception of interactions among atoms in JILA’s strontium atomic clock during a quantum simulation experiment. The atoms appear to all interact (indicated by the connections), leading to correlations among the atoms’ spins (indicated by arrows), according to patterns JILA scientists found in collective spin measurements. The interacting atoms might be harnessed to simulate other quantum systems such as magnetic materials.
Credit: Ye group and Brad Baxley, JILA

Atomic clocks now join a growing list of physical systems that can be used for modeling and perhaps eventually explaining the quantum mechanical behavior of exotic materials such as high-temperature superconductors, which conduct electricity without resistance. All but the smallest, most trivial quantum systems are too complicated to simulate on classical computers, hence the interest in quantum simulators. Sharing some of the features of experimental quantum computers—a hot research topic—quantum simulators are “special purpose” devices designed to provide insight into specific challenging problems.

JILA is operated jointly by the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.

As described in the Aug. 9 issue of Science, the JILA experiment was performed with an atomic clock made of about 2,000 neutral strontium atoms trapped in intersecting laser beams. The researchers were surprised to discover that, under certain conditions, the clock atoms interact like atoms in magnetic materials.

“This was completely unexpected,” JILA/NIST Fellow Jun Ye says. “We were not looking for this at all; we were just naively trying to understand the particle interactions as part of our effort to further improve the clock. We were pleasantly surprised to find we can now use a clock as a powerful quantum apparatus to study magnetic spin interactions.”

The strontium clock atoms are arranged like a stack of 100 pancakes, each containing about 20 atoms. Normally the atoms react individually to red laser pulses, switching between two energy levels. But the researchers discovered that the atoms also can interact with each other, first in pairs and eventually all together. Until now, researchers had been trying to eliminate these interactions, which are undesirable in atomic clocks,** but they can become a powerful feature for a quantum simulator.

Strontium atoms have two energy levels used for clock purposes, each with a particular configuration of electrons. In the JILA simulation, all the atoms start out at the same energy level with the same electron configuration, also called a spin-down state. A quick pulse from a very stable red laser places all the atoms in a “superposition” of spins pointing both up and down at the same time. The possibility of superposition is one of the most notable features of the quantum world. When the laser is turned off, the atoms start to interact. One second later, another pulse from the same laser hits the atoms to prepare them for a collective spin measurement, and then a different laser measures the final spin states of all the atoms based on their detected fluorescence.
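
To see why the noise in such a measurement is so telling, consider what the same pulse-wait-pulse-measure sequence would look like if the atoms did not interact at all. The short Python sketch below is an illustration only, not the JILA analysis: the atom number is taken from the article, everything else is simplified, and the atoms are assumed to be independent, so each readout is just a run of coin flips.

```python
import numpy as np

rng = np.random.default_rng(0)

N_ATOMS = 2000    # roughly the number of strontium atoms in the JILA clock
N_RUNS = 5000     # repeated pulse-wait-pulse-measure cycles (hypothetical count)

# For NON-interacting atoms, each atom in an equal superposition is found
# spin-up or spin-down with 50/50 probability, independently of the others,
# so a collective spin readout is equivalent to N fair coin flips.
n_up = rng.binomial(N_ATOMS, 0.5, size=N_RUNS)
s_z = n_up - N_ATOMS / 2          # collective spin projection, (N_up - N_down) / 2

print(f"mean spin projection   : {s_z.mean():+.2f}")
print(f"measured noise (std)   : {s_z.std():.2f}")
print(f"projection-noise limit : {np.sqrt(N_ATOMS) / 2:.2f}   # sqrt(N)/2")

# In the real experiment the atoms DO interact, so the observed noise and the
# correlations between repeated measurements deviate from this independent-atom
# baseline; that deviation is the magnetic-like behavior described above.
```

Independent atoms produce only random "projection noise" of about sqrt(N)/2; noise patterns and correlations beyond that baseline are the kind of signature that points to atom-atom interactions in the clock.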

In the world of classical physics such measurements would have definite results, without any “noise,” or uncertainty. However, in the quantum world a spin measurement usually has a random amount of noise. In the JILA experiment, correlations appear over time between the noise patterns of some of the atoms’ spins. Ye says these correlations suggest the atoms become entangled, another unusual quantum feature that links the properties of separated particles. JILA researchers have not yet performed the definitive test proving entanglement, however.

JILA theorist Ana Maria Rey helped to explain what Ye’s experimental team observed. For small numbers of particles, about 30 atoms, Rey calculated that the clock atom interactions obey mathematical formulas similar to those describing the behavior of electrons in magnetic materials. When more atoms are included, however, classical calculations can no longer keep up with the experimental results. In the future the JILA team hopes to perform more complicated simulations while continuing to develop a theory explaining the findings.

The atomic clock joins a growing list of quantum simulators demonstrated recently at NIST*** and elsewhere.

The JILA research is supported by NIST, the Defense Advanced Research Projects Agency, Air Force Office of Scientific Research, National Science Foundation, and Army Research Office.

*M.J. Martin, M. Bishof, M.D. Swallows, X. Zhang, C. Benko, J. von-Stecher, A.V. Gorshkov, A.M. Rey and J. Ye. 2013. A quantum many-body spin system in an optical lattice clock. Science. August 9.
**See 2009 NIST news release, “JILA/NIST Scientists Get a Grip on Colliding Fermions to Enhance Atomic Clock Accuracy,” at www.nist.gov/pml/div689/fermions_041609.cfm.
***See 2012 NIST Tech Beat article, “NIST Physicists Benchmark Quantum Simulator with Hundreds of Qubits,” at www.nist.gov/pml/div688/qubits-042512.cfm.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


NIST Study Advances Use of Iris Images as a Long-Term Form of Identification

A new report* by biometric researchers at the National Institute of Standards and Technology (NIST) uses data from thousands of frequent travelers enrolled in an iris recognition program to determine that no consistent change occurs in the distinguishing texture of their irises for at least a decade. These findings inform identity program administrators on how often iris images need to be recaptured to maintain accuracy.

A frequent traveler uses an iris recognition camera to speed her travel across the U.S.-Canadian border. NIST researchers evaluated data from millions of images taken over a decade from the iris-based NEXUS program to gauge iris stability.
Credit: Canada Border Services Agency

For decades, researchers seeking biometric identifiers other than fingerprints believed that irises were a strong biometric because their one-of-a-kind texture meets the stability and uniqueness requirements for biometrics. However, recent research has questioned that belief. A study of 217 subjects over a three-year period found that the recognition of the subjects’ irises became increasingly difficult, consistent with an aging effect.**

To learn more, NIST biometric researchers used several methods to evaluate iris stability.

Researchers first examined anonymous data from millions of transactions from NEXUS, a joint Canadian and American program that frequent travelers use to move quickly across the U.S.-Canadian border. As part of NEXUS, members’ irises are enrolled in the system with an iris camera and then scanned and matched against the enrolled images each time they cross the border. NIST researchers also examined a larger but less well-controlled set of anonymous statistics collected over a six-year period.

In both large-population studies, NIST researchers found no evidence of a widespread aging effect, said Biometric Testing Project Leader Patrick Grother. A NIST computer model estimates that, for the typical person, iris recognition will remain usable for decades after the initial enrollment.

“In our iris aging study we used a mixed effects regression model, for its ability to capture population-wide aging and individual-specific aging, and to estimate the aging rate over decades,” said Grother. “We hope these methods will be applicable to other biometric aging studies such as face aging because of their ability to represent variation across individuals who appear in a biometric system irregularly.”
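
For readers curious what such an analysis looks like in practice, here is a minimal sketch of a mixed-effects aging regression in Python using the statsmodels library. The data are synthetic and the column names are hypothetical; this illustrates the general approach Grother describes, not the IREX VI analysis itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subjects, n_visits = 200, 8

df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_visits),
    "years": np.tile(np.linspace(0.0, 6.0, n_visits), n_subjects),
})

# Synthetic genuine-match scores: a small population-wide drift over time,
# plus a subject-specific offset, subject-specific drift and noise.
offset = rng.normal(0.0, 0.5, n_subjects)
drift = rng.normal(-0.01, 0.005, n_subjects)
df["match_score"] = (10.0 + offset[df["subject"]]
                     + drift[df["subject"]] * df["years"]
                     + rng.normal(0.0, 0.2, len(df)))

# Fixed effect: the population-wide aging rate (the slope on "years").
# Random effects: a per-subject intercept and a per-subject aging rate.
model = smf.mixedlm("match_score ~ years", df,
                    groups=df["subject"], re_formula="~years")
result = model.fit()
print(result.summary())   # the fixed-effect "years" coefficient is the aging rate
```

The appeal of this kind of model for biometrics is exactly the point Grother makes: it separates a population-wide trend from person-to-person variation, even when individuals appear in the system at irregular intervals.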

NIST researchers then reanalyzed the images from the earlier studies of 217 subjects, this time evaluating the population-wide aspect. Those studies reported an increase in false rejection rates over time—that is, the original images enrolled in the first year of the study increasingly failed to match images taken later. While the rejection rates were high, the results did not necessarily demonstrate that the iris texture itself was changing. In fact, a study by another research team identified pupil dilation as the primary cause of the false rejections.*** This prompted the NIST team to examine dilation more closely.

NIST researchers showed that pupil dilation in the original pool of subjects increased in the second year of the test and decreased the next, but they were not able to determine why. When they accounted for the dilation effect, the researchers observed no change in iris texture and no aging effect. Some iris cameras normalize dilation by using shielding or by varying the illumination.

NIST established the Iris Exchange (IREX) program in 2008 to give quantitative support to iris recognition standardization, development and deployment. Sponsors for this research include the Criminal Justice Information Systems Division of the Federal Bureau of Investigation, the Office of Biometric Identity Management in the Department of Homeland Security (DHS) and the DHS Science and Technology Directorate.

*The NIST results are reported in IREX VI – Temporal Stability of Iris Recognition Accuracy, NIST Interagency Report 7948, at www.nist.gov/manuscript-publication-search.cfm?pub_id=913900.
**S. Fenker and K.W. Bowyer. Experimental evidence of a template aging effect in iris biometrics. IEEE Computer Society Workshop on Applications of Computer Vision, November 2012.
***M. Fairhurst and M. Erbilek. Analysis of physical ageing effects in iris biometrics. IET Computer Vision, 5(6):358–366, 2011. www.ietdl.org.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Two Updated Guides Provide Latest NIST Recommendations for System Patches, Malware Avoidance

The National Institute of Standards and Technology (NIST) has updated two of its series of computer security guides to help computer system managers protect their systems from hackers and malware. Vulnerabilities in software and firmware are among the easiest routes for attacking a system, and the two revised publications address the problem by providing updated guidance on patching software and warding off malware.

A common method to avoid attacks is to “patch” the vulnerabilities as soon as possible after the software company develops a piece of repair software—a patch—for the problem. Patch management is the process of identifying, acquiring, installing and verifying patches for products and systems.
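
As a purely illustrative sketch of that four-step cycle (identify, acquire, install, verify), the short Python program below walks a toy software inventory through it. Every name in it (the inventory, the advisory list, the download stub) is hypothetical; real enterprise patching is driven by automated tools such as the SCAP-based systems discussed below.

```python
from dataclasses import dataclass

@dataclass
class Patch:
    product: str
    version: str
    url: str

def identify(installed: dict[str, str], advisories: list[Patch]) -> list[Patch]:
    """Find advisories whose product is installed at an older version
    (naive string comparison, adequate for this toy example)."""
    return [p for p in advisories
            if p.product in installed and installed[p.product] < p.version]

def acquire(patch: Patch) -> bytes:
    print(f"downloading {patch.url}")          # stand-in for a real download
    return b"payload"

def install(patch: Patch, payload: bytes) -> None:
    print(f"installing {patch.product} {patch.version} ({len(payload)} bytes)")

def verify(patch: Patch, installed: dict[str, str]) -> bool:
    installed[patch.product] = patch.version   # stand-in for re-scanning the host
    return installed[patch.product] == patch.version

installed = {"examplelib": "1.0.2"}            # toy software inventory
advisories = [Patch("examplelib", "1.0.3", "https://vendor.example/examplelib-1.0.3")]

for patch in identify(installed, advisories):
    payload = acquire(patch)
    install(patch, payload)
    print("verified:", verify(patch, installed))
```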

The earlier guidance on patching, Creating a Patch and Vulnerability Management Program, was written when patching was a manual process. The revision, Guide to Enterprise Patch Management Technologies,* is designed for agencies that take advantage of automated patch management systems such as those based on NIST’s Security Content Automation Protocol (SCAP).

Guide to Enterprise Patch Management Technologies explains the technology basics and covers metrics for assessing the technologies’ effectiveness.

The second security document provides guidance to protect computer systems from malware—malicious code. Malware is the most common external threat to most systems and can cause widespread damage and disruption.

NIST’s Guide to Malware Incident Prevention and Handling for Desktops and Laptops** was updated to help agencies protect against modern malware attacks that are more difficult to detect and eradicate than when the last version was published in 2005. The new guidance reflects the growing use of social engineering and the harvesting of social networking information for targeting attacks.

The new malware guide provides information on how to modernize an organization’s malware incident prevention measures and offers recommendations for enhancing an organization’s existing incident response capability to handle modern malware.

*Guide to Enterprise Patch Management Technologies (NIST Special Publication 800-40, Revision 3) is available at: http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-40r3.pdf .
**Guide to Malware Incident Prevention and Handling for Desktops and Laptops (Special Publication 800-83 Revision 1) can be found at: http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-83r1.pdf.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Two NIST Standards Experts Are Honored With 2013 ANSI Leadership and Service Awards

The American National Standards Institute (ANSI) has recognized two staff members from the National Institute of Standards and Technology (NIST) for their significant contributions to national and international standardization activities, as well as their ongoing commitment to their industry, their nation and the enhancement of the global voluntary consensus standards system.

Patrick Grother
Credit: NIST
Elaine Newton
Credit: NIST

Biometric Testing Project Leader Patrick Grother, of NIST’s Information Technology Laboratory, will receive the Edward Lohse Information Technology Medal, which recognizes outstanding effort to foster cooperation among the bodies involved in global IT standardization. Grother has supported biometrics standardization, particularly with respect to standards for biometric data interchange and testing performance of biometric technologies.

Deputy Standards Liaison Elaine Newton, also in NIST’s Information Technology Laboratory, has been named one of three recipients of the Next Generation Award, which is presented to outstanding members who have been engaged with ANSI for less than eight years. Newton is being honored for demonstrating vision, leadership, dedication and significant contributions to standards activities since 2006.

The awards will be presented October 2, 2013, during World Standards Week 2013.

ANSI is a private, nonprofit organization whose mission is to enhance U.S. global competitiveness and the American quality of life by promoting, facilitating and safeguarding the integrity of the voluntary standardization and conformity assessment system. The organization is the official U.S. representative to the International Organization for Standardization (ISO).

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


E.U. and U.S. to Extend Scientific Cooperation on Measurements and Standards

The European Commission’s Joint Research Centre (JRC) and the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) yesterday* agreed to expand their current scientific cooperation to include new areas of research, such as energy, healthcare and clinical measurements, and food safety and nutrition.

Congressman Chaka Fattah (D-PA), Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher, Joint Research Centre (JRC) Director General Dominique Ristori and European Parliament Member Edit Herzog at a signing ceremony July 17, 2013. NIST and the JRC agreed to expand their current scientific cooperation to include new areas of research.
Credit: K. Delak/NIST

Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher and JRC Director General Dominique Ristori held a signing ceremony during Transatlantic Week, an annual event intended to raise the profile of the transatlantic relationship as well as to foster a dialogue on shared purpose and joint action among U.S. and E.U. policymakers.

NIST and the commission have collaborated on many projects since the signing of the U.S.-EU Agreement on Scientific and Technological Cooperation in 1997. The new Implementing Arrangement expands on previous collaboration and provides joint access to scientific infrastructure, the exchange of scientific and technological information and experts, and support for training scientists, engineers and technical experts. The arrangement is initially for five years and can be extended.

While NIST and the JRC, the commission’s in-house science service, have a history of working together, this overarching agreement replaces individual agreements on each project. It also provides additional focus on shared research priorities, including potential new areas such as security technology and systems, and environment and climate.

Speaking at the signing ceremony, JRC’s Ristori said, “The Implementing Arrangement we signed will create an overarching framework for a cross-Atlantic cooperation on standards and measurements in a wide range of areas. It will also serve as a leading example in the process towards setting global standards.”

NIST’s Gallagher highlighted current collaborative projects, including foundational research that supports the measurements underpinning manufacturing and standards, and also work developing measurement protocols and standards in fields such as homeland security technology.

“The new agreement affirms our relationship, and we look forward to new and exciting areas of interaction that will ultimately serve to support the relationship between the European Union and the United States,” said Gallagher. “By working together, we can take advantage of each other’s respective strengths to further the science. And ultimately, we will benefit from being on the same page as the science matures and allows us to establish and implement these new technologies.”

Both organizations have the strategic goal to support competitiveness and economic growth, and have cooperated on standardization since 2007. The Implementing Arrangement encompasses 10 areas related to standards and measurements. Environment and climate, energy, transportation and security are high on the collaborative research agenda. In addition to reference materials in a range of areas, the cooperation will include research on civil engineering structures (such as bridges, roads and dams) and emerging information and communication technologies, as well as marine optical radiometry.

As a non-regulatory agency of the U.S. Department of Commerce, NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life. To learn more about NIST, visit www.nist.gov.

For more information on the JRC’s research on standards and measurements, visit http://ec.europa.eu/dgs/jrc/downloads/jrc_science_for_standards_reports.pdf. And to learn more about JRC-U.S. collaborations, visit http://ec.europa.eu/dgs/jrc/downloads/jrc_country_leaflet_us_en.pdf.

* Originally issued on July 18, 2013.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


New NIST Standard Reference Material to Help Calibrate Hospital CAT Scanners

Scientists at the National Institute of Standards and Technology (NIST) have developed a new standard reference material (SRM), the first such measurement tool to enable hospitals to link important tissue density measurements made by CAT scans to international standards.

Standard Reference Material 2088 makes it possible to link CAT scans to the International System of Units for the first time. The SRM – a collection of five blocks of polyurethane foam of different densities, shown here in front of its delivery case – allows scans to be tied to measurement units important for diagnosing lung disease.
Credit: Boutin/NIST

Computed tomography (“CT” or “CAT”) uses computer processing to combine multiple X-ray images into three-dimensional scans that resemble slices of the body. These cross-sections are useful for spotting changes that are difficult to discern from ordinary two-dimensional X-ray images alone, such as the changes in lung tissue that indicate cancer or emphysema. Millions of CAT scans take place every year in the U.S. alone, but over time, the devices’ outputs have a tendency to “drift,” according to NIST physicist Zachary Levine.

“Scanners are calibrated daily, but over periods of months you still see variations,” says Levine. “Some of these come from the physical degradation of the machine, or even from changes made by software upgrades. Doctors ‘get to know’ their own machine’s idiosyncrasies and make good diagnoses, but they’d still like to be more certain about what they’re seeing, especially when it comes to lung tissue.”

Calibrating a CAT scanner is not difficult in principle. All that is needed are physical objects of known density that can be run through the scanner together with a patient so the scan shows both the patient and the reference. Finding an appropriate material was a bit challenging, though: lung tissue is among the lightest in the body, and its density varies constantly depending on how much air is trapped in it.

A lung tissue SRM must span a range of densities not only for this reason, but for diagnostic purposes as well. Emphysema makes lung tissue less dense, while a tumor makes it denser. Scientists who work on CAT scans often express this density range as a “CT number,” and a change of just a few CT numbers can mean the difference between one diagnosis and another.
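
For context, CT numbers are conventionally reported on the Hounsfield scale, on which water is defined as 0 and air as roughly -1000, based on a material’s X-ray attenuation. The snippet below illustrates that standard definition with made-up attenuation values; it is not drawn from the NIST report.

```python
MU_WATER = 1.0   # linear attenuation coefficients in arbitrary units,
MU_AIR = 0.0     # chosen so water = 1 and air = 0 for this illustration

def ct_number(mu: float) -> float:
    """Hounsfield-scale CT number for a material with attenuation coefficient mu."""
    return 1000.0 * (mu - MU_WATER) / (MU_WATER - MU_AIR)

# Healthy, air-filled lung sits far below zero on this scale; denser tissue,
# such as a tumor, pushes the CT number upward toward water (0) and beyond.
for label, mu in [("air", 0.0), ("lung-like foam", 0.2), ("water", 1.0)]:
    print(f"{label:15s} CT number = {ct_number(mu):+7.1f}")
```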

SRM 2088, a collection of five small blocks that the NIST team fashioned from polyurethane foam of different densities, makes it possible to link scans to the International System of Units (SI) for the first time. Together with a second reference material (SRM 2087) developed last year, SRM 2088 allows CAT scans to be tied to SI units for length, density and mass attenuation coefficient—three characteristics that are particularly important for diagnosing lung diseases effectively.

The NIST team chose polyurethane foam, which Levine says is ideal because it is both inexpensive and available off the shelf in a range of densities corresponding to that of lung tissue. The SRM needed to be accurate to within 10 CT numbers to be useful, and density tests indicate that the blocks are accurate to within 0.15 CT numbers, roughly 100 times better than necessary.

SRM 2088, “Density Standard for Medical Computed Tomography,” and SRM 2087, “Dimensional Standard for Medical Computed Tomography,” are available from the NIST Standard Reference Materials office at www.nist.gov/srm/.

*Z.H. Levine, H.H. Chen-Mayer, A.L. Pintar and D.S. Sawyer IV. A low-cost density reference phantom for computed tomography. Medical Physics 36, pp. 286-288 (2009).

Media Contact: Chad Boutin, boutin@nist.gov, 301-975-4261


New NIST Nanoscale Indenter Takes Novel Approach to Measuring Surface Properties

Researchers from the National Institute of Standards and Technology (NIST) and the University of North Carolina have demonstrated a new design for an instrument, an “instrumented nanoscale indenter,” that makes sensitive measurements of the mechanical properties of thin films—ranging from auto body coatings to microelectronic devices—and biomaterials. The NIST instrument uses a unique technique to precisely measure the depth of the indentation in a test surface, with nothing contacting the surface other than the probe tip itself.*

Good vibrations: Close-up image shows the tip of the new NIST nanoscale indenter flanked by two tuning forks that provide a stable, noncontact reference relative to the specimen, a piece of single-crystal silicon. Using a pair of tuning forks allows the system to compensate for any tilt.
Credit: Nowakowski/NIST

Indenters have a long history in materials research. Johan August Brinell devised one of the first versions in 1900. The concept is to drop or ram something hard onto the test material and gauge the material’s hardness by the depth of the dent. This is fine for railway steel, but modern technology has brought more challenging measurements: the stiffness of micromechanical sensors used in auto airbags, the hardness of thin coatings on tool bits, the elasticity of thin biological membranes. These require precision measurements of depth in terms of nanometers and force in terms of micronewtons.

Instead of dents in metal, says NIST’s Douglas Smith, “We are trying to get the most accurate measurement possible of how far the indenter tip penetrates into the surface of the specimen, and how much force it took to push it in that far. We record this continuously. It’s called ‘instrumented indentation testing’.”
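
As a rough illustration of what is done with such a load-depth record, the sketch below applies the widely used Oliver-Pharr analysis to a single made-up data point to estimate hardness and reduced modulus. The numbers are invented (roughly polymer-like), and this is a generic textbook calculation, not the analysis from the NIST paper.

```python
import numpy as np

# Illustrative peak-load values; all numbers are made up for this sketch.
P_max = 1.2e-3     # peak load: 1.2 millinewtons, in newtons
h_max = 600e-9     # indentation depth at peak load: 600 nanometers, in meters
S = 8.0e3          # unloading stiffness dP/dh at peak load, in newtons per meter

# Oliver-Pharr contact depth: subtract the elastic deflection of the surface.
epsilon = 0.75                     # geometry factor for a Berkovich-type tip
h_c = h_max - epsilon * P_max / S

# Projected contact area for an ideal Berkovich indenter.
A_c = 24.5 * h_c**2

hardness = P_max / A_c                                  # pascals
E_reduced = np.sqrt(np.pi) * S / (2.0 * np.sqrt(A_c))   # reduced modulus, pascals

print(f"contact depth   h_c = {h_c * 1e9:.0f} nm")
print(f"hardness        H   = {hardness / 1e9:.2f} GPa")
print(f"reduced modulus E_r = {E_reduced / 1e9:.1f} GPa")
```

The accuracy of any such analysis rests on knowing the load, the depth and the location of the surface precisely, which is exactly the measurement problem the NIST design addresses.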

A major challenge, Smith says, is that at the nanoscale you need to know exactly where the surface of the test specimen is relative to the indenter’s tip. Some commercial instruments do this by touching the surface with a reference part of the instrument that is a known distance from the tip, but this introduces additional problems. “For example, if you want to look at creep in polymer—which is one thing that our instrument is particularly good at—that reference point itself is going to be creeping into the polymer just under its own contact force. That’s an error you don’t know and can’t correct for,” says Smith.

The NIST solution is a touchless surface detector that uses a pair of tiny quartz tuning forks—the sort used to keep time in most wrist watches. When the tuning forks get close to the test surface, the influence of the nearby mass changes their frequency—not much, but enough. The nanoindenter uses that frequency shift to “lock” the position of the indenter mechanism at a fixed distance from the test surface, but without exerting any detectable force on the surface itself.

“The only significant interaction we want is between the indenter and the specimen,” says Smith, “or at least, to be constant and not deforming the surface. This is a significant improvement over the commercial instruments.”

The NIST nanoindenter can apply forces up to 150 millinewtons with an uncertainty lower than 2 micronewtons, taking readings a thousand times a second, while measuring tip penetration of up to 10 micrometers to within about 0.4 nanometers. All of this is done in a way that can be traceably calibrated against basic SI units for force and displacement in a routine manner.

The instrument is well suited for high-precision measurements of hardness, elasticity, creep and similar properties for a wide range of materials, including often difficult-to-measure soft materials such as polymer films, says Smith, but one of its primary uses will be in the development of reference materials that can be used to calibrate other instrumented indenters. “There still are no NIST standard reference materials for this class of instruments because we wanted to have an instrument that was better than the commercial instruments for doing that,” Smith explains.

*B.K. Nowakowski, D.T. Smith, S.T. Smith, L.F. Correa and R.F. Cook. Development of a precision nanoindentation platform. Review of Scientific Instruments 84(7), 075110 (2013). DOI: 10.1063/1.4811195. Published online July 18, 2013.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


New Nanoscale Imaging Method Finds Application in Plasmonics

Researchers from the National Institute of Standards and Technology (NIST) and the University of Maryland have shown how to make nanoscale measurements of critical properties of plasmonic nanomaterials—the specially engineered nanostructures that modify the interaction of light and matter for a variety of applications, including sensors, cloaking (invisibility), photovoltaics and therapeutics.

Infrared laser light (purple) from below a sample (blue) excites ring-shaped nanoscale plasmonic resonator structures (gold). Hot spots (white) form in the rings’ gaps. In these hot spots, infrared absorption is enhanced, allowing for more sensitive chemical recognition. A scanning atomic force microscope (AFM) tip detects the expansion of the underlying material in response to absorption of infrared light.
Credit: NIST

Their technique is one of the few that allows researchers to make actual physical measurements of these materials at the nanoscale without affecting the nanomaterial’s function.

Plasmonic nanomaterials contain specially engineered conducting nanoscale structures that can enhance the interaction between light and an adjacent material, and the shape and size of such nanostructures can be adjusted to tune these interactions. Theoretical calculations are frequently used to understand and predict the optical properties of plasmonic nanomaterials, but few experimental techniques are available to study them in detail. Researchers need to be able to measure the optical properties of individual structures and how each interacts with surrounding materials directly in a way that doesn’t affect how the structure functions.

“We want to maximize the sensitivity of these resonator arrays and study their properties,” says lead researcher Andrea Centrone. “In order to do that, we needed an experimental technique that we could use to verify theory and to understand the influence of nanofabrication defects that are typically found in real samples. Our technique has the advantage of being extremely sensitive spatially and chemically, and the results are straightforward to interpret.”

The research team turned to photothermal induced resonance (PTIR), an emerging chemically specific materials analysis technique, and showed it can be used to image the response of plasmonic nanomaterials excited by infrared (IR) light with nanometer-scale resolution.*

The team used PTIR to image the absorbed energy in ring-shaped plasmonic resonators. The nanoscale resonators focus the incoming IR light within the rings’ gaps to create “hot spots” where the light absorption is enhanced, which makes for more sensitive chemical identification. For the first time, the researchers precisely quantified the absorption in the hot spots and showed that for the samples under investigation, it is approximately 30 times greater than areas away from the resonators.
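
One simple way to picture that kind of quantification: compare the average signal inside the hot-spot regions of an absorption map with the average signal far from the resonators. The toy example below does exactly that on a synthetic map; it is only an illustration of the idea, not the analysis used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic PTIR-style absorption map: a uniform background with one small
# region whose signal is boosted ~30x, standing in for a resonator-gap hot spot.
absorption = rng.normal(1.0, 0.05, size=(64, 64))
hot_spot = np.zeros((64, 64), dtype=bool)
hot_spot[30:34, 30:34] = True
absorption[hot_spot] *= 30.0

# Enhancement factor: mean signal in the hot spot vs. mean signal elsewhere.
enhancement = absorption[hot_spot].mean() / absorption[~hot_spot].mean()
print(f"hot-spot enhancement factor: about {enhancement:.0f}x")
```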

The researchers also showed that plasmonic materials can be used to increase the sensitivity of IR and PTIR spectroscopy for chemical analysis by enhancing the local light intensity, and thereby, the spectroscopic signal.

Their work further demonstrated the versatility of PTIR as a measurement tool that allows simultaneous measurement of a nanomaterial’s shape, size, and chemical composition—the three characteristics that determine a nanomaterial’s properties. Unlike many other methods for probing materials at the nanoscale, PTIR doesn’t interfere with the material under investigation; it doesn’t require the researcher to have prior knowledge of the material’s optical properties or geometry; and it returns data that are more easily interpreted than data from techniques that require separating the response of the sample from the response of the probe.

For background on PTIR, see the February 2013 NIST Tech Beat story, “NIST Captures Chemical Composition with Nanoscale Resolution” at www.nist.gov/public_affairs/tech-beat/tb20130220.cfm#ptir.

*B. Lahiri, G. Holland, V. Aksyuk and A. Centrone. Nanoscale imaging of plasmonic hot spots and dark modes with the photothermal-induced resonance technique. Nano Letters. June 18, 2013. DOI: 10.1021/nl401284m.

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


NIST Shows How to Make a Compact Frequency Comb in Minutes

Laser frequency combs—high-precision tools for measuring different colors of light in an ever-growing range of applications such as advanced atomic clocks, medical diagnostics and astronomy—are not only getting smaller but also much easier to make.

Physicists at the National Institute of Standards and Technology (NIST) can now make the core of a miniature frequency comb in one minute.* Conventional microfabrication techniques, by contrast, may require hours, days or even weeks.

The NIST technique involves laser machining of a quartz rod (a common type of glass) to shape and polish a small, smooth disk within which light can circulate. The user controls the size and shape of this optical cavity, or resonator: its diameter can be varied from about one-fifth of a millimeter to 8 millimeters, and its thickness and curvature can be shaped as well. The quality factor, or Q factor, a measure of how long light circulates inside the cavity before leaking out, equals or exceeds that of cavities made by other methods.

NIST physicists have developed a one-minute process for creating optical microcavities made of fused quartz. The photo shows four cavities with diameters (top to bottom) of 0.36 millimeters (mm), 0.71 mm, 1.2 mm, and 1.5 mm. When excited with laser light, the cavities can be used for many applications, including the generation of frequency combs used to precisely measure different colors of light. Smaller cavities produce wider spacing between the comb teeth (specific colors).
Credit: Del’Haye/NIST

After machining the quartz, NIST scientists use a small, low-power infrared laser to pump light into it. A primary benefit of the high Q factor is that only a few milliwatts of laser light are required to generate a comb.

“We make a resonator in one minute, and one minute after that we are making a frequency comb,” NIST researcher Scott Papp says.

NIST’s one-minute method is simple and far less expensive than conventional microfabrication. The system for the NIST process costs about $10,000—most of that for purchase of a carbon dioxide laser used for cutting—compared to between $1 million and $10 million for a microfabrication system that must be used in a cleanroom.

A full-size frequency comb uses high-power, ultrafast lasers and is generally the size of a small table. NIST researchers have been making compact frequency combs for several years and often make cavities out of bulk fused quartz, an inexpensive glass material.**

By confining light in a small space, the optical cavity—which, confusingly enough, is solid—enhances optical intensity and interactions. The comb itself is the light, which starts out as a single color or frequency that through optical processes is transformed to a set of additional shades, each sharply defined and equally spaced on the spectrum. A typical NIST microcomb might have 300 “teeth,” or ticks on the ruler, each a slightly different color. A key advantage of microcombs is the ability to tune the spacing between the teeth, as needed, for applications such as calibrating astronomical instruments. The spacing is determined by the size of the cavity; a smaller cavity results in wider spacing between the comb teeth.
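
The relationship between cavity size and tooth spacing can be estimated from the resonator’s free spectral range, roughly the speed of light divided by the optical path length around the disk. The sketch below applies that rule of thumb to the diameters quoted in this article, using an approximate refractive index for fused quartz; it is a back-of-the-envelope illustration, not a NIST calculation.

```python
import numpy as np

C = 299_792_458.0          # speed of light, m/s
N_QUARTZ = 1.44            # approximate refractive index of fused quartz near 1550 nm

def comb_spacing_ghz(diameter_mm: float) -> float:
    """Approximate comb-tooth spacing (free spectral range), FSR ~ c / (n * pi * d), in GHz."""
    circumference_m = np.pi * diameter_mm * 1e-3
    return C / (N_QUARTZ * circumference_m) / 1e9

# Diameters quoted in the article and photo caption, in millimeters.
for d in (0.36, 0.71, 1.2, 1.5, 8.0):
    print(f"diameter {d:4.2f} mm -> tooth spacing ~ {comb_spacing_ghz(d):6.1f} GHz")
```

The trend matches the statement above: shrinking the disk shortens the round trip for the circulating light and pushes the comb teeth farther apart.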

Scientists plan to apply for a patent on the machining technique, which could be applied to a variety of other glassy materials. Future NIST research will focus on continuing improvements in comb performance and use of the resonators in other compact applications such as optical frequency standards and low-noise microwave oscillators.

The research is supported, in part, by the Defense Advanced Research Projects Agency and National Aeronautics and Space Administration.

*S.B. Papp, P. Del’Haye and S.A. Diddams. Mechanical control of a microrod-resonator optical frequency comb. Physical Review X 3, 031003 (2013). DOI: 10.1103/PhysRevX.3.031003. Published online July 8, 2013.
P. Del’Haye, S.A. Diddams and S.B. Papp. Laser-machined ultra-high-Q microrod resonators for nonlinear optics. Applied Physics Letters. Published online June 7, 2013.
**See 2011 NIST Tech Beat article, “Future ‘Comb on a Chip': NIST’s Compact Frequency Comb Could Go Places,” at www.nist.gov/pml/div688/comb-102511.cfm.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880
