October 31, 2015

New artificial fingerprints feel texture, hear sound


PARK ET AL., ULSAN NATIONAL INSTITUTE OF SCIENCE AND TECHNOLOGY
Scientists create an artificial fingerprint that detects pressure, temperature, texture,
and sound for the first time.

(October 31, 2015)  Fake fingerprints might sound like just another ploy to fool the feds. But the world’s first artificial prints—reported today—have even cooler applications. The electronic material, which mimics the swirling designs imprinted on every finger, can sense pressure, temperature, and even sound. Though the technology has yet to be tested outside the lab, researchers say it could be key to adding sensation to artificial limbs or even enhancing the senses we already have.

“It’s an interesting piece of work,” says John Rogers, materials scientist at the University of Illinois, Urbana-Champaign, who was not involved in the study. “It really adds to the toolbox of sensor types that can be integrated with the skin.”

Electronic skins, known as e-skins, have been in development for years. Several technologies are used to mimic the sensations of real human skin, including sensors that can monitor health factors like pulse or temperature. But previous e-skins have been able to “feel” only two sensations: temperature and pressure. Replicating fingertips poses additional challenges, especially mimicking their ability to sense even minuscule changes in texture, says Hyunhyub Ko, a chemical engineer at Ulsan National Institute of Science and Technology in South Korea.

So in the new study, Ko and colleagues started with a thin, flexible material with ridges and grooves much like natural fingerprints. This allowed them to create what they call a “microstructured ferroelectric skin” (expanded in the figure below). The e-skin’s perception of pressure, texture, and temperature comes from a highly sensitive structure called an interlocked microdome array—the tiny domes sandwiched between the bottom two layers of the e-skin, also shown in the figure below.

Park et al., Ulsan National Institute of Science and Technology

Here’s how it works: When pressure from the outside squishes the two layers together, an electric current is created, running through the thickness of the material. The current is then monitored through electrodes and registered as pressure—the larger the current, the stronger the pressure. The skin can also sense temperature, the team reports today in Science Advances, though in ways similar to other technologies. When exposed to warmth, the e-skin material relaxes, but when cooled, it stiffens. In both cases, subtle changes in stiffness generate currents that scientists can record as temperature spikes or drops, much in the same way that they infer pressure from currents.
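As a rough illustration of the readout scheme described above, the sketch below maps a measured current to a pressure reading and screens out noise. The function names, calibration constant and noise threshold are hypothetical, chosen for illustration rather than taken from the paper.

```python
# Hypothetical sketch of the e-skin readout: a larger measured current
# registers as stronger pressure via an assumed linear calibration.

def current_to_pressure(current_nA, sensitivity_nA_per_kPa=2.5):
    """Convert a measured current (nA) to pressure (kPa), assuming linearity."""
    return current_nA / sensitivity_nA_per_kPa

def classify_event(current_nA, noise_floor_nA=0.5):
    """Register contact only when the current rises above a noise floor."""
    return "pressure" if abs(current_nA) > noise_floor_nA else "no contact"

print(current_to_pressure(5.0))  # 2.0 kPa for a 5 nA signal
print(classify_event(0.2))       # no contact
```

In the real device, the same current channel also carries the temperature signal (via stiffness changes), so a practical readout would have to separate fast pressure spikes from slow thermal drift.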


journal reference (Open Access) >>

NASA Study: Mass Gains of Antarctic Ice Sheet Greater than Losses



A new NASA study says that Antarctica is overall accumulating ice. Still, areas of the continent,
like the Antarctic Peninsula photographed above, have increased their mass loss in the last
decades. Credits: NASA's Operation IceBridge

(October 31, 2015)  A new NASA study says that an increase in Antarctic snow accumulation that began 10,000 years ago is currently adding enough ice to the continent to outweigh the increased losses from its thinning glaciers.

The research challenges the conclusions of other studies, including the Intergovernmental Panel on Climate Change’s (IPCC) 2013 report, which says that Antarctica is overall losing land ice.

According to the new analysis of satellite data, the Antarctic ice sheet showed a net gain of 112 billion tons of ice a year from 1992 to 2001. That net gain slowed to 82 billion tons of ice per year between 2003 and 2008.

“We’re essentially in agreement with other studies that show an increase in ice discharge in the Antarctic Peninsula and the Thwaites and Pine Island region of West Antarctica,” said Jay Zwally, a glaciologist with NASA Goddard Space Flight Center in Greenbelt, Maryland, and lead author of the study, which was published on Oct. 30 in the Journal of Glaciology. “Our main disagreement is for East Antarctica and the interior of West Antarctica – there, we see an ice gain that exceeds the losses in the other areas.”  Zwally added that his team “measured small height changes over large areas, as well as the large changes observed over smaller areas.”

Map showing the rates of mass changes from ICESat 2003-2008 over Antarctica. Sums are
for all of Antarctica: East Antarctica (EA, 2-17); interior West Antarctica (WA2, 1, 18, 19, and 23);
coastal West Antarctica (WA1, 20-21); and the Antarctic Peninsula (24-27).
A gigaton (Gt) corresponds to a billion metric tons, or 1.1 billion U.S. tons.
Credits: Jay Zwally/ Journal of Glaciology

Scientists calculate how much the ice sheet is growing or shrinking from the changes in surface height that are measured by the satellite altimeters. In locations where the amount of new snowfall accumulating on an ice sheet is not equal to the ice flow downward and outward to the ocean, the surface height changes and the ice-sheet mass grows or shrinks.
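The bookkeeping described above can be sketched in a few lines: a mean surface-height change over an area converts to a mass change once a density for the ice is assumed. The density and example numbers below are illustrative assumptions, not figures from the study.

```python
ICE_DENSITY = 900.0  # kg per cubic metre, an assumed value for glacier ice

def mass_change_gt(area_km2, mean_height_change_m):
    """Convert a mean surface-height change over an area into gigatons."""
    area_m2 = area_km2 * 1e6
    mass_kg = area_m2 * mean_height_change_m * ICE_DENSITY
    return mass_kg / 1e12  # 1 Gt = 1e12 kg = one billion metric tons

# A 1 cm mean rise over a million square kilometres of ice sheet:
print(mass_change_gt(1_000_000, 0.01))  # about 9 Gt
```

At continental scale, centimetre-level height changes translate into gigatons, which is why Zwally's team emphasizes measuring small height changes over large areas.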

But it might only take a few decades for Antarctica’s growth to reverse, according to Zwally. “If the losses of the Antarctic Peninsula and parts of West Antarctica continue to increase at the same rate they’ve been increasing for the last two decades, the losses will catch up with the long-term gain in East Antarctica in 20 or 30 years -- I don’t think there will be enough snowfall increase to offset these losses.”

The study analyzed changes in the surface height of the Antarctic ice sheet measured by radar altimeters on two European Space Agency European Remote Sensing (ERS) satellites, spanning from 1992 to 2001, and by the laser altimeter on NASA’s Ice, Cloud, and land Elevation Satellite (ICESat) from 2003 to 2008.

read entire press release >>

October 30, 2015

Researchers design a full-scale architecture for a quantum computer in silicon


The UNSW members of the team: L-R: Dr Matthew House, Sam Hile (seated),
Scientia Professor Sven Rogge and Scientia Professor Michelle Simmons of the
CQC2T laboratories at UNSW. Image: UNSW

(October 30, 2015)  Researchers at UNSW and the University of Melbourne have designed a 3D silicon chip architecture based on single atom quantum bits, providing a blueprint to build a large-scale quantum computer.

Australian scientists have designed a 3D silicon chip architecture based on single atom quantum bits, which is compatible with atomic-scale fabrication techniques – providing a blueprint to build a large-scale quantum computer.

Scientists and engineers from the Australian Research Council Centre of Excellence for Quantum Computation and Communication Technology (CQC2T), headquartered at UNSW, are leading the world in the race to develop a scalable quantum computer in silicon – a material well-understood and favoured by the trillion-dollar computing and microelectronics industry.

Teams led by UNSW researchers have already demonstrated a unique fabrication strategy for realising atomic-scale devices and have developed the world’s most efficient quantum bits in silicon using either the electron or nuclear spins of single phosphorus atoms. Quantum bits – or qubits – are the fundamental data components of quantum computers.

Australian researchers have figured out a way to deal with errors in quantum computers,
giving them the essential architecture that may help this team become the first
to build a functioning quantum computer in silicon.

One of the final hurdles to scaling up to an operational quantum computer is the architecture. Here it is necessary to figure out how to precisely control multiple qubits in parallel, across an array of many thousands of qubits, and constantly correct for ‘quantum’ errors in calculations.

Now, the CQC2T collaboration, involving theoretical and experimental researchers from the University of Melbourne and UNSW, has designed such a device. In a study published today in Science Advances, the CQC2T team describes a new silicon architecture, which uses atomic-scale qubits aligned to control lines – which are essentially very narrow wires – inside a 3D design.

read entire press release >>

Simple mathematical formula models lithium-ion battery aging


Hybrid electric vehicles combine the efficiency of electric vehicles with the power and longevity of
gasoline-powered vehicles because they have both a gasoline-fueled conventional internal combustion engine and an electric motor powered by batteries. Image: Volvo

(October 30, 2015)  Hybrid electric vehicles, cell phones, digital cameras, and the Mars Curiosity rover are just a few of the many devices that use rechargeable lithium-ion batteries. Now a team of Penn State researchers has a simple mathematical formula to predict what factors most influence lithium-ion battery aging.

Lithium-ion batteries function by moving lithium ions from the negative electrode to the positive electrode during discharge, and in the opposite direction when the battery charges. How often and exactly how a battery is used determines the length of its life. Complex models that predict battery aging exist and are used for battery design. However, faster, simpler models are needed to understand the most important factors that influence aging so that battery management systems in hybrid electric vehicles, for example, can better control lithium-ion batteries.

"We started out by making models specifically for Volvo's batteries that were tuned to their specific chemistry and showed that the models matched experimentally," said Christopher Rahn, professor of mechanical engineering, Penn State. "We then focused on simplifying the aging models. Now, we have the ultimate simplified aging model down to a formula."

According to Rahn, a battery ages, or degrades, whether it is sitting on a shelf or used. The main cause of lithium-ion battery aging is the continuous formation of the solid electrolyte interphase (SEI) layer in the battery. The SEI layer must form for the battery to work because it controls the amount of chemical reactions that occur in the battery. As the battery is continually used, however, small-scale side reactions build up at the SEI layer, which decreases battery capacity -- how much of a charge the battery can hold. Models allow researchers to understand how different factors affect this degradation process so that longer-lasting, more cost-efficient batteries can be made.
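The article does not give the Penn State formula itself, but SEI-driven capacity fade is commonly approximated as growing with the square root of time, because the layer's growth is diffusion-limited. The sketch below uses that generic textbook form with made-up constants, purely to illustrate what a simple aging formula looks like; it is not the team's model.

```python
import math

def remaining_capacity(days, initial_capacity_ah=2.0, fade_coeff=0.002):
    """Capacity (Ah) after `days`, under a generic sqrt-of-time fade law."""
    return initial_capacity_ah * (1.0 - fade_coeff * math.sqrt(days))

# Fade slows over time: most of the SEI-driven loss happens early.
for days in (0, 100, 400):
    print(days, round(remaining_capacity(days), 3))
```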

read entire press release >>

New metal alloy could yield green cooling technologies
RIT scientist explores alternatives to rare-earth magnets


Casey Miller

(October 30, 2015)  A promising new metal alloy system could lead to commercially viable magnetic refrigerants and environmentally friendly cooling technologies, according to a scientist at Rochester Institute of Technology.

Casey Miller, head of RIT’s materials science and engineering program, and his colleagues published their findings in the Oct. 28 issue of Scientific Reports, an online open-access journal from the publishers of Nature. Miller’s work in this area also led to an international collaboration that published in Applied Physics Letters on Oct. 6, and which was selected as an Editor’s Pick, making it free to any reader.

The study published in Scientific Reports explores an iron-based alloy as a component of next-generation cooling technologies. The materials use magnetic fields to change a refrigerant’s temperature without the coolant gases associated with global warming. The thermodynamic phenomenon, called the “magnetocaloric effect,” makes magnetic refrigeration an environmentally friendly and efficient alternative to current cooling technologies.

The alloy is a substitute for metals made from rare-earth elements, predominantly produced in China and increasingly used in modern magnets. The supply and cost of rare-earth metals are susceptible to geopolitical tensions that hamper the commercial viability of new magnetic refrigeration technologies, the authors reported. Transition metals typically offer supply chain stability and are cheaper by weight than rare-earths, they said.

“Our work is a great example of President Obama’s Materials Genome Initiative in action,” Miller said. “We created alloys containing four and five different elements whose properties helped our theory collaborators develop a calculation that predicts the magnetic properties of a larger set of compounds that have not yet been synthesized. Now we have identified hundreds of new alloy combinations that could be useful.”


journal reference (Open Access) >>

Chemical complexity promises improved structural alloys for next-gen nuclear energy


In complex alloys, chemical disorder results from a greater variety of elements than found
in traditional alloys. Traces here indicate electronic states in a complex alloy; smeared traces
indicate reduced electrical and thermal conductivity. Image credit: Oak Ridge National Laboratory,
U.S. Dept. of Energy. Image by G. Malcolm Stocks

(October 30, 2015)  Designing alloys to withstand extreme environments is a fundamental challenge for materials scientists. Energy from radiation can create imperfections in alloys, so researchers in an Energy Frontier Research Center led by the Department of Energy’s Oak Ridge National Laboratory are investigating ways to design structural materials that develop fewer, smaller flaws under irradiation. The key, they report in the journal Nature Communications, is exploiting the complexity that is present when alloys are made with equal amounts of up to four different metallic elements.

“Chemical complexity gives us a way to modify paths for energy dissipation and defect evolution,” said first author Yanwen Zhang, who directs an Energy Frontier Research Center, called “Energy Dissipation to Defect Evolution,” or “EDDE,” funded by the U.S. Department of Energy Office of Science. The growing center is nearly 15 months old and brings together more than two dozen researchers with experimental and modeling expertise. EDDE has partners at Oak Ridge, Los Alamos and Lawrence Livermore national laboratories and the universities of Michigan, Wisconsin–Madison and Tennessee–Knoxville.

Radiation can harm spacecraft, nuclear power plants and high-energy accelerators. Nuclear reactions produce energetic particles—ions and neutrons—that can damage materials as their energy disperses, causing the formation of flaws that evolve over time. Advanced structural materials that can withstand radiation are a critical national need for nuclear reactor applications. Today, nuclear reactors provide one-fifth of U.S. electricity. Next-generation reactors will be expected to serve over longer lifetimes and withstand higher irradiation levels.

In a reactor, thousands of atoms can be set in motion by one energetic particle that displaces them from sites in a crystal lattice. While most of the displaced atoms return to lattice sites as the energy is dissipated, some do not. Irradiation can damage structural materials made of well-ordered atoms packed in a lattice—even obliterating their crystallinity. Existing knowledge of radiation effects on structural materials is mostly about reactor-core components. Over the life of a typical light water reactor, all atoms in the structural components can be displaced on average 20 times, and accumulated damage may threaten material performance. To prepare for new reactor concepts, scientists will have to design next-generation nuclear materials to withstand atoms displaced more than 200 times—a true “grand challenge.”

read entire press release >>

UW-MADISON ENGINEERS REVEAL RECORD-SETTING FLEXIBLE PHOTOTRANSISTOR


(a), Bending setup for RF measurements. (b), A bent device array on a bending fixture.
(c), Calculated mobility values from measured transconductance as a function of bending induced strain
for both unstrained and strained devices.
(d), fT and fmax of both unstrained and strained devices as a function of bending induced external strain.

(October 30, 2015)  Inspired by mammals' eyes, University of Wisconsin-Madison electrical engineers have created the fastest, most responsive flexible silicon phototransistor ever made.

The innovative phototransistor could improve the performance of myriad products - ranging from digital cameras, night-vision goggles and smoke detectors to surveillance systems and satellites - that rely on electronic light sensors. Integrated into a digital camera lens, for example, it could reduce bulkiness and boost both the acquisition speed and quality of video or still photos.

Developed by UW-Madison collaborators Zhenqiang "Jack" Ma, professor of electrical and computer engineering, and research scientist Jung-Hun Seo, the high-performance phototransistor far exceeds all previous flexible phototransistors on key parameters, including sensitivity and response time.

The researchers published details of their advance this week in the journal Advanced Optical Materials.

Like human eyes, phototransistors essentially sense and collect light, then convert that light into an electrical charge proportional to its intensity and wavelength. In the case of our eyes, the electrical impulses transmit the image to the brain. In a digital camera, that electrical charge becomes the long string of 1s and 0s that create the digital image.
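The proportionality described above (photocurrent scaling with incident light power) can be written as a one-line model. The responsivity figure below is an assumption for illustration, not a value from the paper.

```python
def photocurrent_uA(optical_power_uW, responsivity_A_per_W=0.5):
    """Photocurrent (microamps) for a given optical power (microwatts): I = R * P."""
    return optical_power_uW * responsivity_A_per_W

print(photocurrent_uA(10.0))  # 5.0 microamps from 10 microwatts of light
```

A higher responsivity means a larger electrical signal from the same light, which is one reason sensitivity and response time are the headline figures for a phototransistor.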

While many phototransistors are fabricated on rigid surfaces, and therefore are flat, Ma and Seo's are flexible, meaning they more easily mimic the behavior of mammalian eyes.



Long distance love affair

 


Qualities admired in another from far away can be threatening as that person approaches, according to UB research

(October 30, 2015)  What people believe they want and what they might actually prefer are not always the same thing. And in the case of being outperformed as an element of romantic attraction, the difference between genuine affinity and apparent desirability becomes clearer as the distance between two people gets smaller.

In matters of relative performance, distance influences attraction. For example, someone of greater intelligence seems attractive when they’re distant, or far away in your mind, but less so when that same person is right next to you, according to a new study by a University at Buffalo-led research team published in the latest edition of the journal Personality and Social Psychology Bulletin.

“We found that men preferred women who are smarter than them in psychologically distant situations. Men rely on their ideal preferences when a woman is hypothetical or imagined,” said Lora Park, associate professor in the UB Department of Psychology and the study’s principal investigator. “But in live interaction, men distanced themselves and were less attracted to a woman who outperformed them in intelligence.”

Previous research has shown that similarities between individuals can affect attraction. This new set of studies suggests that psychological distance — whether someone is construed as being near or far in relation to the self — plays a key role in determining attraction.

read entire press release >>

Can changes in the brain affect your microbiome? UB researchers are investigating


Study of patients with irritable bowel syndrome undergoing behavioral
self-management may strengthen understanding of brain-gut connections

(October 30, 2015)  The microbiome in your gut can affect your brain: More and more data have recently shown that. But can it go the other way? Can brain changes affect your gut microbiome? And if so, do these changes affect your health and well-being?

A University at Buffalo researcher is leading a pilot study to answer that question. The goal is to determine whether behavioral self-management of a painful and common gastrointestinal disorder may lead to fundamental changes in the gut microbiome, the digestive system’s bacterial ecosystem.

The study is being conducted with a subset of patients enrolled in a large, National Institutes of Health-funded study being led by Jeffrey Lackner, PsyD, professor in the Department of Medicine in the Jacobs School of Medicine and Biomedical Sciences at the University at Buffalo and director of the Behavioral Medicine Clinic. That multicenter study focuses on whether a specific, non-drug treatment — a cognitive behavior therapy program — can relieve the often-debilitating symptoms of irritable bowel syndrome (IBS) for which there is no satisfactory medical treatment.

read entire press release >>

How to make Web advertising more effective




(October 30, 2015)  Every day, users are bombarded with animated ads across the Web, and companies fight to cut through the clutter. New research from the University at Buffalo School of Management has pinpointed one attribute online ads should have to influence consumers’ perceptions of a new product—and their willingness to pay for it.

Consumers who see a Web ad in which the product changes direction while moving across the screen are more likely to perceive the product as innovative, according to forthcoming research in the Journal of Marketing.

“Psychologically, we don’t expect inanimate objects to be able to change directions,” says co-author Arun Lakshmanan, PhD, assistant professor of marketing in the UB School of Management. “As a result, when we see something do that in an advertisement, it stands out as atypical and causes us to make judgments instantaneously about the product’s novelty, without even thinking about it.”

Product novelty is critical for success in the marketplace. Research shows that products perceived as innovative are adopted faster by consumers and bring in higher profits.

read entire press release >>

New class of DNA repair enzyme discovered


Top and side view showing the difference in the way that a normal DNA repair glycosylase
enzyme (AAG) and the new enzyme (AlkD) recognize a damaged DNA base. The AAG enzyme
bends the DNA in a way that forces the damaged base to rotate from its normal position inside the
double helix to an outside position where the enzyme binds to it and removes it. In contrast,
the AlkD enzyme senses the chemical features of the damaged base through the DNA backbone,
without physically contacting the damaged base itself. The enzymes are shown in grey, the DNA
backbone is orange, normal DNA base pairs are yellow, the damaged base is purple and its pair
base is green. (Brandt Eichman / Vanderbilt)

(October 30, 2015)  This year’s Nobel Prize in chemistry was given to three scientists who each focused on one piece of the DNA repair puzzle. Now a new study, published online Oct. 28 in the journal Nature, reports the discovery of a new class of DNA repair enzyme.

When the structure of DNA was first discovered, scientists imagined it to be extremely chemically stable, which allowed it to act as a blueprint for passing the basic traits of parents along to their offspring. Although this view has remained prevalent among the public, biologists have since learned that the double helix is in fact a highly reactive molecule that is constantly being damaged and that cells must make unceasing repair efforts to protect the genetic information that it contains.

 

“It’s a double-edged sword,” said Brandt Eichman, associate professor of biological sciences and biochemistry at Vanderbilt University, who headed the research team that made the new discovery. “If DNA were too reactive then it wouldn’t be capable of storing genetic information. But, if it were too stable, then it wouldn’t allow organisms to evolve.”

The DNA double-helix has a spiral staircase structure with the outer edges made from sugar and phosphate molecules joined by stair steps composed of pairs of four nucleotide bases (adenine, cytosine, guanine and thymine) that serve as the basic letters in the genetic code.

There are two basic sources of DNA damage or lesions: environmental sources including ultraviolet light, toxic chemicals and ionizing radiation and internal sources, including a number of the cell’s own metabolites (the chemicals it produces during normal metabolism), reactive oxygen species and even water.

The new type of DNA repair enzyme, AlkD on the left, can identify and remove a damaged DNA base
without forcing it to physically "flip" to the outside of the DNA backbone, which is how all the other
DNA repair enzymes in its family work, as illustrated by the human AAG enzyme on the right. The
enzymes are shown in grey, the DNA backbone is orange, normal DNA base pairs are yellow, the
damaged base is blue and its pair base is green. (Brandt Eichman / Vanderbilt)

“More than 10,000 DNA damage events occur each day in every cell in the human body that must be repaired for DNA to function properly,” said first author Elwood Mullins, a postdoctoral research associate in the Eichman lab.

The newly discovered DNA repair enzyme is a DNA glycosylase, a member of a family of enzymes discovered by Tomas Lindahl, who received this year’s Nobel Prize for recognizing that these enzymes remove damaged DNA bases through a process called base-excision repair. It was the first of about 10 different DNA repair pathways that biologists have identified to date.


journal reference >>

October 29, 2015

BRIGHT IDEA FOR LOWLIGHT PHOTOGRAPHY


In this illustration, light passes through the new camera color filter developed by
University of Utah Electrical and Computer Engineering professor Rajesh Menon
before it reaches the digital camera sensor. Since all of the light reaches the sensor,
unlike conventional digital camera filters where only a third of the light passes through,
photos taken with Menon's new filter are much cleaner and brighter in lowlight.

(October 29, 2015)  UTAH ENGINEERS DEVELOP CAMERA FILTER THAT PRODUCES SHARPER, BRIGHTER PHOTOS IN LOW LIGHT.

Anyone who’s taken a picture of birthday candles being blown out or a selfie during a romantic candlelit dinner knows how disappointing it is when the photo comes out dark and grainy.

But University of Utah Electrical and Computer Engineering professor Rajesh Menon has developed a new camera color filter that lets in three times more light than conventional filters, resulting in much cleaner, more accurate pictures taken in lowlight. The new filter can be used for any kind of digital camera, but Menon is developing it specifically for smartphone cameras. Menon and doctoral student Peng Wang describe the invention today in the journal Optica.

“Overall, camera phones are very good, but they are not very good in lowlight,” says Menon. “If you go out on a hike in the evening and take a picture of the sky you will see that it’s very grainy. Lowlight photography is not quite there and we are trying to fix that. This is the last frontier of mobile photography.”

Traditional digital cameras, whether they are point-and-shoot cameras or the now-ubiquitous smartphone cameras, use an electronic sensor that collects light to make the picture. Over that sensor is a filter designed to pass the three primary colors: red, blue and green. But as natural light hits the filter, the filter absorbs two thirds of the color spectrum in order to let through each of the three primary colors.

University of Utah Electrical and Computer Engineering professor Rajesh Menon has
developed a new camera color filter for digital cameras that lets in three times more light
than conventional filters, resulting in much cleaner, more accurate pictures taken in lowlight.
The new filter can be used for any kind of digital camera, but Menon is developing it
specifically for smartphone cameras.

“If you think about it, this is a very inefficient way to get color because you’re absorbing two thirds of the light coming in,” Menon says. “But this is how it’s been done since the 1970s. So for the last 40 years, not much has changed in this technology.”
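The photon budget Menon describes can be made concrete with an idealized comparison: an absorptive filter passes roughly one of three primary bands at each pixel, while a filter that transmits everything keeps the full budget. The numbers below are idealized; real filters lose somewhat more.

```python
def photons_reaching_sensor(incident_photons, filter_type):
    """Idealized per-pixel throughput for the two filter designs."""
    transmission = {"absorptive": 1.0 / 3.0, "transmit_all": 1.0}
    return incident_photons * transmission[filter_type]

print(photons_reaching_sensor(900, "absorptive"))    # roughly 300 photons
print(photons_reaching_sensor(900, "transmit_all"))  # all 900 photons
```

That factor of three is the "three times more light" claimed for the new filter, and in lowlight it is the difference between a grainy image and a usable one.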

Menon’s solution is to use a color filter that lets all light pass through to the camera sensor. He does this with a combination of software and hardware.

Menon has designed a new color filter that is about a micron thick (100 times thinner than a human hair). It is a wafer of glass with precisely designed microscopic ridges etched on one side that bend the light in certain ways as it passes through, creating a series of color patterns or codes. Software then reads the codes to determine what colors they are.

read entire press release >>

Battery Mystery Solved: Atomic-Resolution Microscopy Answers Longstanding Questions About Lithium-Rich Cathode Material


On the right the cube represents the structure of lithium- and manganese- rich transition
metal oxides. The models on the left show the structure from three different directions,
which correspond to the STEM images of the cube.

(October 29, 2015)  Berkeley Lab scientists unravel structural ambiguities in lithium-rich transition metal oxides.

Using complementary microscopy and spectroscopy techniques, researchers at Lawrence Berkeley National Laboratory (Berkeley Lab) say they have solved the structure of lithium- and manganese-rich transition metal oxides, a potentially game-changing battery material and the subject of intense debate in the decade since it was discovered.

Researchers have been divided into three schools of thought on the material’s structure, but a team led by Alpesh Khushalchand Shukla and Colin Ophus spent nearly four years analyzing the material and concluded that the least popular theory is in fact the correct one. Their results were published online in the journal Nature Communications in a paper titled, “Unraveling structural ambiguities in lithium- and manganese- rich transition metal oxides.” Other co-authors were Berkeley Lab scientists Guoying Chen and Hugues Duncan and SuperSTEM scientists Quentin Ramasse and Fredrik Hage.

This material is important because the battery capacity can potentially be doubled compared to the most commonly used Li-ion batteries today due to the extra lithium in the structure. “However, it doesn’t come without problems, such as voltage fade, capacity fade, and DC resistance rise,” said Shukla. “It is immensely important that we clearly understand the bulk and surface structure of the pristine material. We can’t solve the problem unless we know the problem.”

Colin Ophus (left) and Alpesh Khushalchand Shukla in front of the TEAM 0.5 microscope
at the Molecular Foundry. (Photo by Roy Kaltschmidt/Berkeley Lab)

A viable battery with a marked increase in storage capacity would not only shake up the cell phone and laptop markets, it would also transform the market for electric vehicles (EVs). “The problem with the current lithium-ion batteries found in laptops and EVs now is that they have been pushed almost as far as they can go,” said Ophus. “If we’re going to ever double capacity, we need new chemistries.”

Using state-of-the-art electron microscopy techniques at the National Center for Electron Microscopy (NCEM) at Berkeley Lab’s Molecular Foundry and at SuperSTEM in Daresbury, United Kingdom, the researchers imaged the material at atomic resolution. Because previous studies have been ambiguous about the structure, the researchers minimized ambiguity by looking at the material from different directions, or zone axes. “Misinterpretations from electron microscopy data are possible because individual two-dimensional projections do not give you the three-dimensional information needed to solve a structure,” Shukla said. “So you need to look at the sample in as many directions as you can.”

read entire press release >>

Making cars of the future stronger, using less energy


Microscope view of copper (top) welded to titanium (bottom) using a new technique developed
at The Ohio State University. Image by Glenn Daehn, courtesy of The Ohio State University.

(October 29, 2015)  New welding technique can weld “un-weldable” metals

Engineers at The Ohio State University have developed a new welding technique that consumes 80 percent less energy than a common welding technique, yet creates bonds that are 50 percent stronger.

The new technique could have a huge impact on the auto industry, which is poised to offer new cars that combine traditional heavy steel parts with lighter alternative metals to reduce vehicle weight.

Despite recent advances in materials design, alternative metals still pose a challenge to manufacturers in practice. Many are considered un-weldable by traditional means, in part because high heat and re-solidification weaken them, said Glenn Daehn, professor of materials science and engineering at Ohio State, who helped develop the new technique.

Microscope view of steel (top) welded to aluminum alloy (bottom) using a new technique
invented at The Ohio State University. Image by Glenn Daehn, courtesy of The Ohio State University.

“Materials have gotten stronger, but welds haven’t. We can design metals with intricate microstructures, but we destroy the microstructure when we weld,” he said.

“With our method, materials are shaped and bonded together at the same time, and they actually get stronger.”

Daehn explained the new process in a keynote address at the Materials Science & Technology 2015 meeting recently in Columbus.

A diagram showing vaporized foil actuator welding, a technique developed at The
Ohio State University. Image by Glenn Daehn, courtesy of The Ohio State University.

In a common technique called resistance spot welding, manufacturers pass a high electrical current through pieces of metal, so that the metals’ natural electrical resistance generates heat that partially melts them together and forms a weld. The drawbacks: generating high currents consumes a lot of energy, and the melted portions of metal are never as strong afterward as they were before.

Over the last decade, Daehn and his team have been trying to find ways around those problems. They’ve amassed more than half a dozen patents on a system called vaporized foil actuator (VFA) welding.

In VFA, a high-voltage capacitor bank creates a very short electrical pulse inside a thin piece of aluminum foil. Within microseconds (millionths of a second), the foil vaporizes, and a burst of hot gas pushes two pieces of metal together at speeds approaching thousands of miles per hour.
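For a rough sense of scale, the energy delivered by such a pulse follows the standard capacitor relation E = ½CV². The sketch below uses hypothetical bank values (the article does not give the actual capacitor specifications):

```python
# Back-of-the-envelope sketch with illustrative numbers only.
# Energy stored in a capacitor bank: E = 1/2 * C * V^2,
# released here over a microsecond-scale pulse.
def capacitor_energy_joules(capacitance_farads, voltage_volts):
    return 0.5 * capacitance_farads * voltage_volts ** 2

def average_pulse_power_watts(energy_joules, pulse_seconds):
    return energy_joules / pulse_seconds

# Hypothetical bank: 400 microfarads charged to 8 kV.
energy = capacitor_energy_joules(400e-6, 8000)    # 12,800 J
power = average_pulse_power_watts(energy, 10e-6)  # ~1.3 GW averaged over a 10 us pulse
print(f"{energy:.0f} J released at roughly {power:.2e} W")
```

Even a modest stored energy, released in microseconds, yields the enormous instantaneous power that vaporizes the foil and accelerates the workpieces.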

read entire press release >>

October 28, 2015

Nano-scale magnets could compute complex functions significantly faster than conventional computers


The artist’s illustration shows a nanomagnetic coprocessor solving complex optimization problems,
highlighting the shape-engineered nanomagnets’ two unique energy-minimum states:
vortex and single domain. Illustration by Ryan Wakefield.

(October 28, 2015)  Researchers from the College of Engineering at the University of South Florida have proposed a new form of computing that uses circular nanomagnets to solve quadratic optimization problems orders of magnitude faster than a conventional computer. This research could potentially accelerate a wide range of application domains, from finding patterns in social media and error-correcting codes for Big Data to the biosciences.

Magnets have been used for computer memory and data storage since as early as 1920; they even made their way into common hardware terminology, as in magnetic “core” memory. The field of nanomagnetism has recently attracted tremendous attention because it can potentially deliver low-power, high-speed, dense non-volatile memories. It is now possible to engineer the size, shape, spacing, orientation, and composition of sub-100 nm magnetic structures. This has spurred the exploration of nanomagnets for unconventional computing paradigms.

In this work, “Non-Boolean computing with nanomagnets for computer vision applications,” published in Nature Nanotechnology, the USF research team harnessed the energy-minimizing nature of nanomagnetic systems to solve the quadratic optimization problems that arise in computer vision applications, which are computationally expensive. By exploiting two magnetization states of nanomagnetic disks, the vortex and the in-plane single domain, as state representations, the team created a unified modeling framework for both states and developed a magnetic Hamiltonian that is quadratic in nature.

The implemented magnetic system can identify the salient features of a given image with a true positive rate above 85%. This form of computing is, on average, 1,528 times faster than IBM ILOG CPLEX (an industry-standard software optimizer) with sparse (four-neighbor) affinity matrices, and 468 times faster with denser (eight-neighbor) affinity matrices. These results show the potential of this alternative computing method to yield a magnetic coprocessor that might solve complex problems in fewer clock cycles than traditional processors.
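The energy-minimization idea can be sketched in miniature. The toy code below is not the team’s actual magnetic Hamiltonian; it only shows how a system of binary “magnet” states, relaxing by single flips, settles into the minimum of a quadratic energy:

```python
import itertools
import random

# Each "nanomagnet" holds a binary state (say, vortex = 0, single domain = 1).
# The system relaxes toward the minimum of a quadratic energy:
#   E(x) = sum_i h[i]*x[i] + sum_{i<j} J[i][j]*x[i]*x[j]
def energy(x, h, J):
    n = len(x)
    e = sum(h[i] * x[i] for i in range(n))
    e += sum(J[i][j] * x[i] * x[j] for i in range(n) for j in range(i + 1, n))
    return e

def relax(h, J, steps=1000, seed=0):
    """Greedy single-flip descent: flip one state whenever it lowers the energy."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in h]
    for _ in range(steps):
        i = rng.randrange(len(x))
        flipped = x[:]
        flipped[i] ^= 1
        if energy(flipped, h, J) < energy(x, h, J):
            x = flipped
    return x

# Tiny 3-variable example; brute force gives the true optimum for comparison.
h = [1, -2, 1]
J = [[0, -3, 0], [0, 0, 2], [0, 0, 0]]
best = min(itertools.product([0, 1], repeat=3), key=lambda x: energy(x, h, J))
print(relax(h, J), list(best))
```

A physical nanomagnet array performs this relaxation in parallel as it settles into its ground state, which is the source of the speedup over iterating in software.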

read entire press release >>

journal reference >>

UK’s first trial of self-healing concrete



(October 28, 2015)  A University-led project is testing ways of automatically repairing concrete without human intervention.

The first major trial of self-healing concrete in the UK, led by a team of researchers from the School of Engineering, is being undertaken at a site in the South Wales Valleys.

The project, entitled Materials for Life (M4L), is piloting three separate concrete-healing technologies for the first time in real-world settings, with a view to incorporating them into a single system that could be used to automatically repair concrete in the built environment.

At present, billions of pounds are spent every year maintaining, fixing and restoring structures such as bridges, buildings, tunnels and roads.

It is estimated that around £40 billion a year is spent in the UK on the repair and maintenance of structures, the majority of which are made from concrete.

read entire press release >>

Bioengineers cut in half time needed to make high-tech flexible sensors



(October 28, 2015)  Bioengineers at the University of California, San Diego, have developed a method that cuts down by half the time needed to make high-tech flexible sensors for medical applications. The advance brings the sensors, which can be used to monitor vital signs and brain activity, one step closer to mass-market manufacturing.

The new fabrication process will allow bioengineers to broaden the reach of their research to more clinical settings. It also makes it possible to manufacture the sensors with a process similar to the printing press, said Todd Coleman, a bioengineering professor at the Jacobs School of Engineering at UC San Diego. Researchers describe their work in a recent issue of the journal Sensors.

“A clinical need is what drove us to change our fabrication process,” Coleman said.


Coleman’s team at UC San Diego has been working in medical settings for four years. Their sensors have been used to monitor premature babies, pregnant women, patients in Intensive Care Units and patients suffering from sleep disorders.

Coleman and colleagues quickly found out that nurses wanted the sensors to come in a peel-and-stick form, like a medical-grade Band Aid. The medium on which the sensors were placed also needed to be FDA-approved.

The sensors’ original fabrication process involved 10 steps, five of which had to take place in a clean room. The steps to remove the sensors from the silicon wafer on which they’re built took, on their own, anywhere from 10 to 20 minutes. And the sensors remained fragile and susceptible to rips and tears.


But what if you could use the adhesive properties of a Band Aid-like medium to help peel off the sensors from the silicon wafer easily and quickly? Wouldn’t that make the process much simpler—and faster? That was the question that Dae Kang, a Jacobs School Ph.D. student in Coleman’s research group, set out to answer. The result of his efforts is a process that comprises only six steps—three of them in the clean room. The steps that took 10 to 20 minutes before now take just 35 seconds.
Kang created a coating about 20 to 50 micrometers thick, made of a silicone-like material called an elastomer, to easily remove the sensors, made of gold and chromium, from the silicon wafer. This was tricky work. The coating had to be sticky enough to allow researchers to build the sensors in the first place, but loose enough to let the sensors peel off the wafer.

“It’s a Goldilocks problem,” Coleman said.

The new process doesn’t require any chemical solvents. That means the sensors can be peeled off with any kind of adhesive, from scotch tape to a lint roller, as researchers demonstrated in the study.

read entire press release >>

BANG & OLUFSEN - BEOLAB 90


(October 28, 2015)  Vision of Sound

It will not be for everybody, but it will be for the right somebody. BeoLab 90 is our proudest loudspeaker to date, built for optimum precision in sound. It will blow the roof off your house and knock the socks off your guests.

CHANGING THE FUTURE OF SOUND

THE INTELLIGENT LOUDSPEAKER

BeoLab 90 contains a multitude of technologies. It’s a perfect mix of world-class design and acoustics in what may well be the most complete and powerful digital loudspeaker designed for use in your home. This highly intelligent loudspeaker provides you with clarity, range, and a sound stage that is second to none. BeoLab 90 features an impressive 360-degree design and a variety of settings, and regardless of its placement, the room, or your listening position, it will give you mind-blowing sound.

THE SPEAKER, NOT THE ROOM, DEFINES THE SOUND

Resonances in your room and boundary effects from your walls affect the sound you get from your loudspeakers. BeoLab 90 is fitted with Bang & Olufsen’s new Active Room Compensation technology, which makes up for the impact of your room, your furniture, the placement of the loudspeakers, and the location of the listening position. This advanced technology guarantees you a sensational sound experience exactly where you want it.

TECHNOLOGY MADE SIMPLE

Control the settings of your BeoLab 90 by using the Bang & Olufsen remote or by using the dedicated app on your phone. It is easy to control the beam width and direction. In addition, you can pre-program all of BeoLab 90’s features and save presets that best suit your preferences for different occasions. These presets can also be automatically selected by your Bang & Olufsen television, or triggered by devices connected to BeoLab 90’s many inputs.

source >>

Scientists identify main component of brain repair after stroke


Sprouting connections in the brain: Adding GDF10 to neurons in a dish results
in the formation of new connections between brain cells. This process may lead
to recovery after stroke. Image courtesy of S. Thomas Carmichael, M.D., Ph.D.,
David Geffen School of Medicine at the University of California Los Angeles.

(October 28, 2015)  NIH-funded research pinpoints protein that sprouts into action, activating stroke repair

Looking at brain tissue from mice, monkeys and humans, scientists have found that a molecule known as growth and differentiation factor 10 (GDF10) is a key player in repair mechanisms following stroke. The findings suggest that GDF10 may be a potential therapy for recovery after stroke. The study, published in Nature Neuroscience, was supported by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health.

“These findings help to elucidate the mechanisms of repair following stroke. Identifying this key protein further advances our knowledge of how the brain heals itself from the devastating effects of stroke, and may help to develop new therapeutic strategies to promote recovery,” said Francesca Bosetti, Ph.D., stroke program director at NINDS.

Stroke can occur when a brain blood vessel becomes blocked, preventing nearby tissue from getting essential nutrients. When brain tissue is deprived of oxygen and nutrients, it begins to die. Once this occurs, repair mechanisms, such as axonal sprouting, are activated as the brain attempts to overcome the damage. During axonal sprouting, healthy neurons send out new projections (“sprouts”) that re-establish some of the connections lost or damaged during the stroke and form new ones, resulting in partial recovery. Before this study, it was unknown what triggered axonal sprouting.

read entire press release >>

VTT’s project supports the future human missions to Mars



(October 28, 2015)  The international UNISONO project, coordinated by VTT Technical Research Centre of Finland, has developed a communication solution that allows an orbiting space station to maintain uninterrupted contact with robots working on the surface of a planet. The technology also has potential industrial applications, such as reducing lag and jitter in mobile gaming.

The technology developed in the course of the UNISONO project is an important step forward for initiatives such as the human mission to Mars. Before humans can land on Mars, the planet needs infrastructure, such as housing and laboratories, which need to be built by robots. These robots need to be controlled by astronauts from a space station orbiting the planet.

Astronauts can currently practice controlling robots on Earth from the International Space Station (ISS). The ISS is in constant orbit around Earth, which means that the astronauts frequently lose direct contact with the robot. This interrupts the data and video transmission, preventing astronauts from maintaining control of the robot.

"Losing control of the robot during a critical task can cause damage to the task or the robot itself. The UNISONO project has developed a solution which can keep the astronaut in constant contact with the robot during the entire orbit", explains Dr Ali Muhammad, Principal Investigator in Robotics Systems at VTT Technical Research Centre of Finland.

The time window during which the ISS is in direct contact with a robot on Earth is much shorter than what is planned for an orbiter around Mars. The UNISONO project has shown how this time window can be widened by seamlessly switching between relay stations on the ground. This allows astronauts to realistically simulate future robotic missions on Mars, the Moon, or other celestial bodies.
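Conceptually, widening the contact window by switching between ground relay stations resembles a make-before-break handover. The sketch below is purely illustrative; the article does not describe UNISONO’s actual protocol, and the station names and visibility windows are hypothetical:

```python
# Illustrative "make-before-break" handover between ground relay stations.
def pick_relay(stations, t):
    """Return the visible station whose visibility window extends furthest past time t."""
    visible = [s for s in stations if s["start"] <= t < s["end"]]
    return max(visible, key=lambda s: s["end"]) if visible else None

def plan_handover(stations, t_start, t_end, step=1):
    """Build a schedule that always connects via the best visible relay,
    switching to the next station before the current one drops out."""
    schedule, current = [], None
    for t in range(t_start, t_end, step):
        best = pick_relay(stations, t)
        if best is not None and best is not current:
            schedule.append((t, best["name"]))
            current = best
    return schedule

# Overlapping visibility windows allow a switch before contact is lost.
stations = [
    {"name": "ground-A", "start": 0,  "end": 40},
    {"name": "ground-B", "start": 30, "end": 80},
    {"name": "ground-C", "start": 70, "end": 120},
]
print(plan_handover(stations, 0, 120))
# Handovers occur inside the overlap windows, so the link never drops.
```

The key design point is the overlap: as long as the next station becomes visible before the current one sets, the switch can be made seamlessly.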

At this stage, the project has demonstrated a seamless switching concept which can be further developed to become reality for future human missions to Mars.

read entire press release >>

Brain Imaging Can Predict the Success of Large Public Health Campaigns



(October 28, 2015)  It’s a frustrating fact that most people would live longer if only they could make small changes: stop smoking, eat better, exercise more, practice safe sex. Health messaging is one important way to change behavior on a large scale, but while a successful campaign can improve millions of lives, a failed one can be an enormous waste of resources.

"The problem is that people are notoriously bad at guessing which ads will be effective and ineffective at changing their behavior,” says Emily Falk, Associate Professor of Communication at the Annenberg School for Communication at the University of Pennsylvania.

So Falk and her team take a different approach: They look inside people’s brains.
  

The red highlighted area depicts the brain area of interest in this study.

In a study soon to be published in the journal Social Cognitive and Affective Neuroscience, the researchers found that brain activity in just 50 smokers in Michigan was able to predict the outcome of an anti-smoking email campaign sent to 800,000 smokers in New York State.

In collaboration with the University of Michigan Center for Health Communications Research and Research Center for Group Dynamics, the researchers recorded 50 smokers’ brain activity using functional magnetic resonance imaging (fMRI) as each smoker viewed 40 anti-smoking images, one by one.

In particular, Falk’s team focused on the medial prefrontal cortex (MPFC), an area of the brain that helps us decide what information is relevant and valuable to us.

The researchers predicted that the more activity an image stimulated in the brain’s MPFC, the more motivating and self-relevant that image would be in prompting a person to stop smoking.
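The prediction logic amounts to asking whether the ranking of images by group-average MPFC activation agrees with the ranking of images by campaign outcome. A minimal sketch with made-up numbers (not the study’s data), using Spearman rank correlation:

```python
# Rank images by average MPFC activation in the small fMRI group, then compare
# that ranking with each image's performance in the large campaign.
def rank(values):
    """Return 1-based ranks (assumes no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(xs, ys):
    """Spearman rank correlation (no ties): rho = 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical per-image values: mean MPFC activation vs. campaign click-through rate.
mpfc_activation = [0.12, 0.45, 0.30, 0.51, 0.22]
click_through   = [0.010, 0.031, 0.024, 0.029, 0.015]
print(round(spearman(mpfc_activation, click_through), 2))  # high rho = good prediction
```

A strong positive correlation of this kind is what would support using a small neuroimaging sample to forecast the success of a large-scale campaign.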

read entire press release >>