Melbourne researchers are working towards a potential treatment to slow the progression of motor neuron disease (MND), offering hope to people with this debilitating and incurable illness.
The research team have uncovered how inflammation in MND is triggered. Pinpointing the molecules involved in this pathway could be a first step towards a new treatment for MND.
They found that blocking an immune sensor called STING dramatically reduced inflammation in cells from MND patients, paving the way for a new class of drugs to be developed for people with neurodegenerative disorders such as MND.
The discovery, published today in Cell, was led by Walter and Eliza Hall Institute researchers Associate Professor Seth Masters and Dr Alan Yu, with colleagues from the University of Melbourne and Hudson Institute.
Halting the inflammatory response
MND is an incurable condition in which the nerve cells controlling the muscles that enable us to move, speak, swallow and breathe, fail to work. One in 10,000 Australians will be diagnosed with MND in their lifetime and the average life expectancy from diagnosis is just two years.
Most people suffering from MND have an accumulation of a protein called TDP-43 within cells of the central nervous system. This build-up is associated with an inflammatory response that precedes major symptoms of MND.
Institute researchers investigated how the disease-causing inflammation is triggered in MND, said Associate Professor Masters. “This unexpectedly identified that an immune sensor called STING is activated downstream of TDP-43. Fortuitously, our team had already studied the role of STING in other inflammatory diseases and are now working out how to block it.”
The team then used new inhibitors — drug-like compounds — to block different components of this inflammatory pathway.
“Using cells from patients with MND that we can turn into motor neurons in a dish, we showed that blocking STING dramatically reduced inflammation and kept the cells alive longer. This is an exciting first step towards taking these inhibitors into the clinic as a treatment for MND.”
Vital first step towards a treatment
Associate Professor Masters said his research had also established activation of STING in people who had passed away due to MND.
“We are now aiming to validate a biomarker of the pathway earlier in the disease progression. Once this neuroinflammatory biomarker is validated, we will better understand which patients will benefit the most from treatments targeting the pathway,” he said.
“With this knowledge, there is the potential to develop a treatment for patients with MND.
“Interestingly, our preclinical models suggest that although the anti-inflammatory drugs that inhibit STING did not prevent disease onset, they did slow the degenerative progression of disease.”
Hope for people with MND and other neurodegenerative disorders
Associate Professor Masters said this discovery offered hope for people diagnosed with the debilitating condition.
“We are hopeful this research could lead to a treatment for people with established MND, who currently have very few treatment options and a life expectancy post diagnosis of just two to five years,” he said.
“While it isn’t a cure, we hope it might extend life expectancy and dramatically improve the quality of life for people diagnosed with MND.”
Associate Professor Masters said a future treatment might also be effective in slowing the progression of other neurodegenerative disorders.
“We are hoping to develop a new class of drugs that would act as STING inhibitors to stop the progression of neurodegenerative disorders, such as MND, Frontotemporal Dementia and Parkinson’s disease.”
Espresso is hard to resist. When properly pulled, shots of this wonderful drink have powerful charms. Superconcentrated, rich yet balanced, espresso’s complex flavors may hook you like no other coffee style. Making it at home can be a tall order, though. A lot of coffee makers billed as domestic espresso machines are that in name only. If you don’t do your homework, chances are good you’ll wind up with a terrible appliance, one that slings awful drinks. Make sure to avoid this pitfall and buy a machine that produces superb shots all day long.
The best home espresso machines have an advanced brewing process and handy bells and whistles like a double portafilter basket for double-shot drinks, and a milk frother and steam wand for a cup of cappuccino or a latte. These automatic machines don’t come cheap, and you can expect to pay at least $500 for something that whips up legit cafe-caliber espresso drinks (or an espresso shot, if that’s your thing). But when in doubt, try to remember how much you’ll be saving on all the lattes, cappuccinos and double shots you get from your coffee shop thanks to your espresso and cappuccino maker.
You can also drop as little as $100, if you’re willing to settle for a mediocre espresso, but I urge you not to pounce on products that cost less, especially if you plan on drinking espresso regularly. Seemingly affordable espresso machines may look like a bargain at first blush, but they’re often a waste of money and counter space.
For those on a budget, “espresso brewers” (in the $30 to $50 price range) typically lack motorized pumps and are powered by steam pressure alone. What they produce is really moka pot coffee, the sort of drink made by simple stovetop brewers; it won’t taste quite like the espresso you’re used to from the barista at your local coffee shop or cafe. That’s not inherently bad — it’s just not really espresso.
To find the best espresso machine for espresso lovers, I spent over 80 hours putting 10 espresso machines through their paces. I limited my testing to manual espresso machines, not the ones that make espresso from pods or capsules, and I also revisited three other espresso machines I had reviewed previously. During the process, I made and sampled scores of espresso shots, double shots, lattes, cappuccinos and pitchers of steamed milk and milk froth. Basically, if it was a coffee drink, I made it. I also took into account things like water reservoir size, water filtration, the control panel, grinding capabilities and the milk frother (including how well it steams and froths milk).
After my experience, these are the three I’d pick as the best home espresso machines. While they all get the job done and offer the essential features you need — like a steam milk frother, drip tray, substantial water reservoir and easy-to-clean stainless-steel base — the key differentiating factor between them is the price. And how much you spend on an espresso machine does have a major impact on what type of coffee you’ll ultimately get.
I limited this list to automatic machines and semiautomatic espresso machines. I excluded superautomatic espresso makers, as sold by Krups, Philips, Miele and others. Those models are a breed apart, costing many times more ($2,000 to $3,000). I update this list periodically, and you’ll find my testing methodology below.
Still with me? Keep going, delicious espresso will soon be yours!
You can’t beat the Breville Barista Express and its combination of performance, features and price. For $600, the machine’s formidable grinder pulverizes espresso beans and smart technology doses grounds directly into its portafilter basket, plus its sturdy frother steams milk well and makes thick foam. It also consistently pulled the best-tasting shots of espresso in my test group.
The control panel may be a little intimidating at first, but once you get the hang of things, a delicious shot (or double shot) of espresso, latte or other coffee-based drink of choice will be your reward. Made from stainless steel, the Barista Express is a cinch to clean as well. And to seal the deal, Breville includes premium metal tools such as a handy dose trimmer and tamper.
I will note, though, that this machine is not small. If counter space is at a premium in your kitchen, you may want to look at the next machine on the list instead.
For those who crave great espresso at home but are nervous about getting the technique down, the Breville Bambino Plus is the perfect choice. It’s dead simple to use and to keep clean, and it’s compact in size — and I found it pulled delicious shots of espresso second only to Breville’s Barista Express. I especially appreciate how easy it is to froth milk with the Bambino. Just insert the steam wand into the Bambino’s stainless-steel milk pitcher (included), then press one button. Less than a minute later, you’ll have expertly steamed milk foam ready for lattes and cappuccinos.
While it lacks its own coffee grinder, the Cuisinart EM-100 has plenty going for it when it comes to making an espresso, cappuccino or latte. This espresso machine has a compact design but is powerful enough to brew from fine coffee grounds. It also pulled flavorful espresso shots of good quality and strength. The machine features a long stainless-steel frother for steaming milk and a built-in cup warmer heating element too. A solid espresso machine at about a third the price of the Breville.
How we test espresso machines
My evaluation process for espresso machines is similar to how I test standard drip coffee makers. First, I hand-wash and dry all removable parts and accessories. For most espresso products, that includes the portafilter basket, metal portafilter inserts, water tank and so on. Next, I run one brewing cycle with just hot water to flush away any residual material from manufacturing.
Most espresso machines, save for fancy superautomatic models, lack an integrated coffee grinder, and I prefer to test with freshly ground coffee. So I supply my own grinder: the Breville Smart Grinder Pro. I chose this grinder for two reasons. First, it’s calibrated more for espresso and less for drip or other brewing styles. That means it produces a grind that’s quite fine. Second, its grind size is also consistently uniform. Both factors are critical for a proper espresso brewing process.
To pull shots, I start with the suggested method outlined in a given machine’s product manual. Usually that covers the amount of coffee grounds expected per shot, along with any guidelines regarding coarseness level. Likewise, I follow tamping instructions (light, medium or hard tamp) if the manual provides them.
Whenever possible, I brew double shots of espresso for all my test runs. I make sure to record the weight of the grounds I use, plus the weight of espresso for each shot I pull. This data, along with readings from a portable refractometer, allows me to calculate two important percentages: total dissolved solids and extraction percentage.
Just as for any coffee brew, the ideal extraction percentage for espresso falls between 18% and 22%. This yields a balanced cup, assuming you perform an even and efficient extraction of flavor and caffeine compounds from your grounds.
If you overextract, you run the risk of leaching out unpleasant, bitter flavors after the good ones. On the opposite end of the scale, underextracted brews tend to have undeveloped flavors. Lacking sugars and other caramelized organic compounds, these shots will taste sour, weak and watery.
Unlike drip coffee, espresso should be concentrated. While excellent drip typically has a TDS percentage of 1.3% or 1.4%, great espresso has a much higher percentage. The Breville Barista Express, for example, produced shots with TDS percentages as high as 12.4%.
The shots I pulled were balanced, though, with an extraction of 18.6%. The test beans I use are the same variety I employ for standard coffee makers — Costco Kirkland Colombian. It’s a medium dark roast, suitable for brewing espresso as well.
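The two percentages above come from simple arithmetic: extraction yield is the dissolved coffee mass in the cup (shot weight times TDS) divided by the weight of the dry dose. Here is a minimal sketch in Python; the 18 g dose and 27 g shot weight are illustrative numbers I chose for the example, not measurements from any particular machine:

```python
def extraction_percent(dose_g: float, shot_g: float, tds_percent: float) -> float:
    """Extraction yield: dissolved solids in the cup as a share of the dry dose.

    dose_g       -- weight of dry grounds in the portafilter
    shot_g       -- weight of the pulled shot
    tds_percent  -- total dissolved solids from a refractometer, in percent
    """
    dissolved_g = shot_g * tds_percent / 100.0  # coffee mass actually in the cup
    return dissolved_g / dose_g * 100.0

# A double shot: 18 g dose yielding a 27 g shot measured at 12.4% TDS
ey = extraction_percent(18.0, 27.0, 12.4)
print(f"extraction: {ey:.1f}%")  # extraction: 18.6% -- inside the 18-22% window
```

Plugging in the Barista Express reading of 12.4% TDS with those weights reproduces the 18.6% extraction figure, which is why a very concentrated shot can still be balanced.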
Lastly, I try my hand at frothing milk with each coffee machine equipped with a steam wand. I record the overall experience with the steam wand, whether the process is a snap, a tricky chore or somewhere in between.
Want more options for your cup of coffee? Check out this list of espresso machines I’ve tested in addition to the ones above.
Diamonds have a firm foothold in our lexicon. Their many properties often serve as superlatives for quality, clarity and hardiness. Aside from the popularity of this rare material in ornamental and decorative use, these precious stones are also highly valued in industry where they are used to cut and polish other hard materials and build radiation detectors.
More than a decade ago, a new property was uncovered in diamond: when high concentrations of boron are introduced into it, the material becomes superconducting. Superconductivity occurs when two electrons with opposite spin form a pair (called a Cooper pair), causing the electrical resistance of the material to drop to zero. This means a large supercurrent can flow in the material, bringing with it the potential for advanced technological applications. Yet little work has been done since to investigate and characterise the nature of diamond's superconductivity, and therefore its potential applications.
New research led by Professor Somnath Bhattacharyya in the Nano-Scale Transport Physics Laboratory (NSTPL) in the School of Physics at the University of the Witwatersrand in Johannesburg, South Africa, details the phenomenon of what is called “triplet superconductivity” in diamond. Triplet superconductivity occurs when electrons move in a composite spin state rather than as a single pair. This is an extremely rare, yet efficient form of superconductivity that until now has only been known to occur in one or two other materials, and only theoretically in diamonds.
“In a conventional superconducting material such as aluminium, superconductivity is destroyed by magnetic fields and magnetic impurities, however triplet superconductivity in a diamond can exist even when combined with magnetic materials. This leads to more efficient and multifunctional operation of the material,” explains Bhattacharyya.
The team’s work has recently been published in the New Journal of Physics, in an article titled “Effects of Rashba-spin-orbit coupling on superconducting boron-doped nanocrystalline diamond films: evidence of interfacial triplet superconductivity.” This research was done in collaboration with Oxford University (UK) and Diamond Light Source (UK). Through these collaborations, the atomic arrangement of diamond crystals and interfaces could be visualised in detail never seen before, supporting the first claims of ‘triplet’ superconductivity.
Practical proof of triplet superconductivity in diamond came with much excitement for Bhattacharyya and his team. “We were even working on Christmas day, we were so excited,” says Davie Mtsuko. “This is something that has never before been claimed in diamond,” adds Christopher Coleman. Both Mtsuko and Coleman are co-authors of the paper.
Despite diamonds’ reputation as a highly rare and expensive resource, they can be manufactured in a laboratory using a specialised piece of equipment called a vapour deposition chamber. The Wits NSTPL has developed their own plasma deposition chamber which allows them to grow diamonds of a higher than normal quality — making them ideal for this kind of advanced research.
This finding expands the potential uses of diamond, which is already well-regarded as a quantum material. “All conventional technology is based on semiconductors associated with electron charge. Thus far, we have a decent understanding of how they interact, and how to control them. But when we have control over quantum states such as superconductivity and entanglement, there is a lot more physics to the charge and spin of electrons, and this also comes with new properties,” says Bhattacharyya. “With the new surge of superconducting materials such as diamond, traditional silicon technology can be replaced by cost effective and low power consumption solutions.”
The induction of triplet superconductivity in diamond is important for more than just its potential applications. It speaks to our fundamental understanding of physics. “Thus far, triplet superconductivity exists mostly in theory, and our study gives us an opportunity to test these models in a practical way,” says Bhattacharyya.
In 2018, 701 cases of severe invasive listeriosis were reported to the Robert Koch Institute (RKI), equivalent to 0.8 cases per 100,000 inhabitants. Most reported listeriosis illnesses are severe and are associated with blood poisoning, meningitis or miscarriages, for example. In 2018, the disease was fatal in 5% of cases. Elderly people, people with weakened immune defences, pregnant women and their new-born babies are particularly vulnerable. Listeria can be found in a large variety of foods of plant and animal origin. Cold- or hot-smoked fish are often contaminated and are, therefore, also suspected of transmitting this illness. Other fish products and seafood eaten raw, such as sushi, sashimi and oysters, or cured products such as graved fish, may also be affected. “Pregnant women, elderly people or those with weakened immune defences should only eat fish and seafood that have been thoroughly heated,” says BfR President Professor Dr. Dr. Andreas Hensel.
Not all Listeria bacteria cause illness. Of the 20 Listeria species described, only Listeria (L.) monocytogenes is a significant cause of infection in humans. Infections during pregnancy can lead to miscarriage, premature birth, stillbirth or the birth of a sick child. Furthermore, listeriosis mainly develops in people whose immune system is weakened by old age, pre-existing medical conditions or medication intake. They often suffer from blood poisoning, encephalitis or meningitis as well as e.g. from endocarditis or bacterial joint inflammation. Listeriosis is associated with relatively high mortality in risk groups. In healthy individuals who do not belong to one of the risk groups, an infection can lead to inflammation of the gastrointestinal tract plus a fever, with progression generally being mild.
The bacterium L. monocytogenes is widespread in the environment and can be found in many foods. High detection rates are found in minced meat, raw meat dishes (e.g. tartare), raw sausage meat (e.g. “Mettwurst” raw minced pork) and raw milk, for example. However, numerous other ready-to-eat foods of animal and plant origin, which are not subjected to further germicidal treatment (e.g. heating) after processing, may also contain L. monocytogenes. Examples include cheese (made from raw or pasteurised milk), pre-cut salads and vegetables, deli salads or sliced sausage products. This is because listeria can survive for a long time in food processing plants in recesses that are difficult to reach for cleaning and disinfection. As a result, the continuous entry of the germs during food production is possible.
Raw, smoked or cured fish products and seafood such as sushi, sashimi, oysters, cold- or hot-smoked fish (e.g. smoked salmon) and cured fish (e.g. graved salmon) are frequently contaminated with listeria. Between 2007 and 2017, 7% to 18% of the samples of cold-smoked or cured fish products examined by the food monitoring authorities in Germany contained L. monocytogenes, as did 3% to 9% of the samples of hot-smoked fish products. Even low germ concentrations can become hazardous to risk groups, for example when products are stored at home above the temperatures recommended by the manufacturer or eaten after their best-before date. What’s more, handling contaminated products risks transferring listeria to other foods.
The German Nutrition Society (DGE) recommends at least one fish meal every week. Fish is a notable source of the long-chain omega-3 fatty acids docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA).
The BfR recommends that people who have an increased risk of developing listeriosis should not generally avoid fish, but rather only eat fish or seafood that has been thoroughly heated. Listeria can be reliably killed off by heating food to a core temperature of 70 °C for at least two minutes. Risk groups should refrain from eating raw, smoked and cured fish products and seafood.
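The 70 °C-for-at-least-two-minutes guidance can be expressed as a simple check over a cooking-thermometer log. The sketch below is illustrative only; the function name and log format are my own assumptions, not part of any BfR recommendation:

```python
def meets_listeria_kill_step(core_temp_log, threshold_c=70.0, hold_s=120):
    """Return True if the core temperature stayed at or above `threshold_c`
    for at least `hold_s` consecutive seconds.

    core_temp_log -- list of (seconds_elapsed, core_temp_celsius) readings,
                     ordered by time.
    """
    hold_start = None
    for t, temp in core_temp_log:
        if temp >= threshold_c:
            if hold_start is None:
                hold_start = t  # start of the current above-threshold stretch
            if t - hold_start >= hold_s:
                return True
        else:
            hold_start = None  # dipped below threshold: restart the clock
    return False

# Core temperature sampled every 30 s while steaming a piece of fish
log = [(0, 55), (30, 68), (60, 71), (90, 72), (120, 73), (150, 74), (180, 74)]
print(meets_listeria_kill_step(log))  # True: at or above 70 C from t=60 to t=180
```

Note that the check requires the hold time at the *core* of the food, which is exactly why briefly searing the surface is not enough to make a product safe for risk groups.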
A new intraoperative imaging technique, Cerenkov luminescence imaging (CLI), can accurately assess surgical margins during radical prostatectomy, according to first-in-human research published in the October issue of the Journal of Nuclear Medicine. The feasibility study showed that 68Ga-PSMA CLI can image the entire excised prostate specimen’s surface to detect prostate cancer tissue at the resection margin.
Radical prostatectomy is one of the primary treatment options for men with localized prostate cancer. The goal of a radical prostatectomy is to completely resect the prostate without positive surgical margins. Incomplete removal of the cancer tissue during radical prostatectomy is often associated with poorer patient outcomes, including increased likelihood of recurrence and prostate cancer-related mortality.
Prostate-specific membrane antigen (PSMA) ligand positron emission tomography (PET) has emerged as an accurate tool to detect prostate cancer both in primary staging and at time of biochemical recurrence. As PET imaging agents also emit optical photons via a phenomenon called Cerenkov luminescence, researchers sought to evaluate the feasibility and diagnostic accuracy of CLI in detecting prostate cancer.
“Intraoperative radioguidance with CLI may help surgeons in the detection of extracapsular extension, positive surgical margins and lymph node metastases with the aim of increasing surgical precision,” stated Christopher Darr, PhD, resident at the Department of Urology of the University Medical Center Essen in Essen, Germany. “The intraoperative use of CLI would allow the examination of the entire prostate surface and provide the surgeon with real-time feedback on the resection margins.”
The single-center study included 10 patients with high-risk primary prostate cancer. 68Ga-PSMA PET scans were performed followed by radical prostatectomy and intraoperative CLI of the excised prostate. CLI images were analyzed postoperatively to determine regions of interest based on signal intensity, and tumor-to-background ratios were calculated. CLI tumor margin assessment was performed by analyzing elevated signals at the surface of the intact prostate images. To determine accuracy, tumor margin status as detected by CLI was compared to postoperative histopathology.
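A tumor-to-background ratio of this kind is simply the mean signal inside a region of interest divided by the mean signal of a background region. A minimal sketch follows; the function name and the per-pixel values are illustrative, not data from the study:

```python
def tumor_to_background_ratio(roi_counts, background_counts):
    """Mean Cerenkov signal in a region of interest relative to mean background."""
    roi_mean = sum(roi_counts) / len(roi_counts)
    bg_mean = sum(background_counts) / len(background_counts)
    return roi_mean / bg_mean

# Illustrative per-pixel radiance samples from a CLI image of the specimen surface
roi = [420.0, 455.0, 430.0]          # pixels over a suspected tumor region
background = [100.0, 95.0, 105.0]    # pixels over benign tissue

print(f"TBR = {tumor_to_background_ratio(roi, background):.2f}")  # TBR = 4.35
```

In practice a ratio threshold would then be chosen so that elevated-signal regions flag possible positive margins for histopathological confirmation.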
Tumor cells were successfully detected on the incised prostate CLI images and confirmed by histopathology. Three patients had positive surgical margins, and in two of them, elevated signal levels enabled correct identification on CLI. Overall, 25 of 35 CLI regions of interest were confirmed as tumor signal by standard histopathology.
Boris A. Hadaschik, PhD, director of the Clinic for Urology of the University Medical Center Essen, added, “Radical prostatectomy could achieve significantly higher accuracy and oncological safety, especially in patients with high-risk prostate cancer, through the intraoperative use of radioligands that specifically detect prostate cancer cells. In the future, a targeted resection of lymph node metastases could also be performed in this way. This new imaging brings together urologists and nuclear medicine specialists in the local treatment of patients with prostate cancer.”
The authors of “Intraoperative 68Gallium-PSMA Cerenkov Luminescence Imaging for Surgical Margins in Radical Prostatectomy — A Feasibility Study” include Christopher Darr, Nina N. Harke, Jan Philipp Radtke, Leubet Yirga, Claudia Kesch and Boris A. Hadaschik, Department of Urology, University Hospital Essen, Essen, Germany; Maarten R. Grootendorst, Clinical Research, Lightpoint Medical Ltd., Chesham, United Kingdom; Wolfgang P. Fendler, Peter Fragoso Costa, Christopher Rischpler, Christine Praus, Ken Herrmann and Ina Binse; Department of Nuclear Medicine, University Hospital Essen, Essen, Germany; Johannes Haubold, Institute of Diagnostics and Radiology, University Hospital Essen, Essen, Germany; and Henning Reis and Thomas Hager, Institute of Pathology, University of Duisburg-Essen, Essen, Germany.
This study was made available online in February 2020 ahead of final publication in print in October 2020.
The compound thymoquinone (TQ) selectively kills prostate cancer cells at advanced stages, according to a new study published in Oncogene. Led by researchers at Kanazawa University, the study reports that prostate cancer cells with a deletion of the SUCLA2 gene can be therapeutically targeted. SUCLA2-deficient prostate cancers represent a significant fraction of those resistant to hormone therapy or metastatic, and a new therapeutic option for this disease would have immense benefits for patients.
Hormone therapy is often chosen for the treatment of metastatic prostate cancer but nearly half of patients develop resistance to the treatment in as little as 2 years. A mutation in RB1, a tumor suppressor gene that keeps cell growth under control, has been pegged as a particularly strong driver of treatment resistance and predicts poor outcome in patients.
“Mutations in tumor suppressor genes are enough to induce initiation and malignant progression of prostate cancer, but so far we haven’t been able to directly target these mutations with drugs to treat prostate cancer,” says the lead author Susumu Kohno. “We wanted to find a genetic aberration associated with that of a tumor suppressor gene which we could target therapeutically.”
In the genome, SUCLA2 neighbors RB1. An analysis of prostate cancer cells showed that cells with a RB1 deletion were also missing SUCLA2, pairing up the SUCLA2 deletion with the RB1 deletion present in advanced stage prostate cancer. Kohno and colleagues analyzed prostate cancer tissue and found that 11% of cases were missing both SUCLA2 and RB1.
The researchers screened compounds to identify drugs that would selectively kill cells with a SUCLA2 deletion. Out of around 2,000 compounds, TQ emerged as a hit compound. TQ already has known anti-cancer effects and was shown to be safe in a phase I clinical trial. Kohno and colleagues applied the TQ treatment to a mouse model of SUCLA2-deficient prostate cancer and TQ selectively suppressed tumor growth.
“These findings show that TQ treatment could be an effective therapy for treating prostate cancer cells that harbor SUCLA2 deficiency” says the senior author Chiaki Takahashi.
In a search of genetic databases from patients with prostate cancer, the researchers found that the frequency of SUCLA2 loss was almost perfectly aligned with RB1 loss at every disease stage — meaning the SUCLA2 deletion could identify people with prostate cancer needing advanced therapy.
Finding this drug-targetable vulnerability opens a crack in the barrier of treatment resistance for prostate cancer. More work needs to be done to improve efficacy of TQ and identify patients that would benefit from this type of treatment, but the compound provides a promising route for new treatment options for advanced prostate cancer.
A research team has investigated the consequences of changes in plant biodiversity for the functioning of ecosystems. The scientists found that the relationships between plant traits and ecosystem functions change from year to year. This makes predicting the long-term consequences of biodiversity change extremely difficult.
“We found that — over the longer term — the links between plant traits and ecosystem functions were indeed very weak, as we could only explain about 12 per cent of the variance in ecosystem functioning,” said the paper’s lead author, Dr Fons van der Plas from the Institute of Biology at Leipzig University. Together with colleagues from the German Centre for Integrative Biodiversity Research (iDiv) and other research institutions in Germany and abroad, he found different patterns than in previous studies — which had focused on short-term links between plant traits and ecosystem functions. These had previously assumed much stronger links between plant traits and ecosystem functioning.
“The main difference between our study and earlier ones was that we carried out our work over a period of ten years, while most other studies were based on data measured in just one year,” said the biologist. The relationships between plant traits and ecosystem functions changed from year to year as some species became locally extinct and others replaced them.
Scientists often ask themselves how this change in biodiversity affects the way ecosystems function, for example in terms of biomass production, carbon sequestration and pollination. In predicting these consequences, they rely on the traits in which plants differ. For example, some plant species are pollinated by insects, and others by the wind. They hope that knowing which species will be more common in the future and what traits these species have will enable them to make more precise predictions.
The research team led by van der Plas has now discovered, for example, that plant biomass production was maximised in plant communities dominated by species with thick roots in some years and by completely different plant communities in others. In almost every year, a different plant trait was found to have been important for maximising biomass production. According to van der Plas, it is therefore extremely difficult to predict exactly how changes in plant communities affect the functioning of ecosystems over long periods of time.
The surface of metals plays a key role in many technologically relevant areas, such as catalysis, sensor technology and battery research. For example, the large-scale production of many chemical compounds takes place on metal surfaces, whose atomic structure determines if and how molecules react with one another. At the same time, the surface structure of a metal influences its electronic properties. This is particularly important for the efficiency of electronic components in batteries. Researchers worldwide are therefore working intensively on developing new kinds of methods to tailor the structure of metal surfaces at the atomic level.
A team of researchers at the University of Münster, consisting of physicists and chemists and led by Dr. Saeed Amirjalayer, has now developed a molecular tool which makes it possible to change the structure of a metal surface at the atomic level. Using computer simulations, the team predicted that the restructuring of the surface by individual molecules, so-called N-heterocyclic carbenes, proceeds like a zipper: at least two carbene molecules cooperate to rearrange the structure of the surface atom by atom. As part of the study, the researchers experimentally confirmed this “zipper-type” mechanism, in which the carbene molecules work together on a gold surface to join two rows of gold atoms into one. The results of the work have been published in the journal Angewandte Chemie International Edition.
In earlier studies the researchers from Münster had shown the high stability and mobility of carbene molecules at the gold surface. However, no specific change of the surface structure induced by the molecules could previously be demonstrated. In their latest study, the researchers proved for the first time that the structure of a gold surface is modified very precisely as a result of cooperation between the carbene molecules. “The carbene molecules behave like a molecular swarm — in other words, they work together as a group to change the long-range structure of the surface,” Saeed Amirjalayer explains. “Based on the ‘zipper’ principle, the surface atoms are systematically rearranged, and, after this process, the molecules can be removed from the surface.”
The new method makes it possible to develop materials with specific chemical and physical properties, entirely without macroscopic tools. “In industrial applications, macroscopic tools such as presses or rollers are often used,” Amirjalayer continues. “In biology, these tasks are performed by certain molecules. Our work demonstrates a promising class of synthesized molecules that modifies surfaces in a similar way.” The researchers hope that their method will be used in future to develop, for example, new types of electrodes or to optimize chemical reactions on surfaces.
Ever since the Nobel Prize in Physics was awarded for research on graphene in 2010, 2D materials, nanosheets of atomic thickness, have been a hot topic in science.
This significant interest is due to their outstanding properties, which have enormous potential for a wide variety of applications. For instance, combined with optical fibres, 2D materials can enable novel applications in the areas of sensors, non-linear optics, and quantum technologies. However, combining these two components has so far been very laborious: typically, the atomically thin layers had to be produced separately and then transferred by hand onto the optical fibre. Together with Australian colleagues, researchers in Jena have now succeeded for the first time in growing 2D materials directly on optical fibres, an approach that significantly simplifies the manufacture of such hybrids. The results of the study were reported recently in the materials science journal Advanced Materials.
Growth through a technologically relevant procedure
“We integrated transition metal dichalcogenides — a 2D material with excellent optical and photonic properties, which, for example, interacts strongly with light — into specially developed glass fibres,” explains Dr Falk Eilenberger of the University of Jena and the Fraunhofer Institute for Applied Optics and Precision Engineering (IOF) in Germany. “Unlike in the past, we did not apply the half-nanometre-thick sheet manually, but grew it directly on the fibre,” says Eilenberger, a specialist in the field of nanophotonics. “This improvement means that the 2D material can be integrated into the fibre more easily and on a large scale. We were also able to show that the light in the glass fibre strongly interacts with its coating.” A practical application of the intelligent nanomaterial created in this way is therefore not far off.
The success has been achieved thanks to a growth process developed at the Institute of Physical Chemistry of the University of Jena, which overcomes previous hurdles. “By analysing and controlling the growth parameters, we identified the conditions under which the 2D material can grow directly on the fibres,” says Jena 2D materials expert Prof. Andrey Turchanin, explaining the method, which is based on chemical vapour deposition (CVD). Among other things, growing the 2D material requires a temperature of over 700 degrees Celsius.
Hybrid material platform
Despite this high temperature, the optical fibres can be used for the direct CVD growth: “The pure quartz glass that serves as the substrate withstands the high temperatures extremely well. It is heat-resistant up to 2,000 degrees Celsius,” says Prof. Markus A. Schmidt of the Leibniz Institute of Photonic Technology, who developed the fibres. “Their small diameter and flexibility enable a variety of applications,” adds Schmidt, who also holds an endowed professorship for fibre optics at the University of Jena.
The combination of 2D material and glass fibre has thus created an intelligent material platform that combines the best of both worlds. “Due to the functionalisation of the glass fibre with the 2D material, the interaction length between light and material has now been significantly increased,” says Dr Antony George, who is developing the manufacturing method for the novel 2D materials together with Turchanin.
Sensors and non-linear light converters
The team envisages potential applications for the newly developed materials system in two particular areas. Firstly, the materials combination is very promising for sensor technology. It could be used, for example, to detect low concentrations of gases. To this end, a green light sent through the fibre picks up information from the environment at the fibre areas functionalised with the 2D material. As external influences change the fluorescent properties of the 2D material, the light changes colour and returns to a measuring device as red light. Since the fibres are very fine, sensors based on this technology might also be suitable for applications in biotechnology or medicine.
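The readout principle described above can be illustrated with a toy model. Everything in this sketch is an illustrative assumption rather than a parameter of the actual device: gas-dependent changes in the coating's fluorescence are modelled here with a simple Stern-Volmer quenching law.

```python
# Toy model of the fibre-sensor readout described above. The Stern-Volmer
# quenching law and every constant here are illustrative assumptions, not
# parameters of the actual sensor.

def returned_intensity(gas_ppm: float, i0: float = 1.0, k_sv: float = 0.02) -> float:
    """Relative fluorescence intensity returned by the fibre after the
    2D coating is quenched by an adsorbed gas (Stern-Volmer model)."""
    return i0 / (1.0 + k_sv * gas_ppm)

for ppm in (0, 10, 100):
    print(f"{ppm:>3} ppm -> relative intensity {returned_intensity(ppm):.3f}")
```

Inverting such a calibrated response curve would recover the gas concentration from the measured return signal; the real device additionally exploits the colour shift between the green excitation light and the red fluorescence.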
Secondly, such a system could also be used as a non-linear light converter. Due to its non-linear properties, the hybrid optical fibre can be employed to convert monochromatic laser light into white light for spectroscopy applications in biology and chemistry. The Jena researchers also envisage applications in the areas of quantum electronics and quantum communication.
Exceptional interdisciplinary cooperation
The scientists involved in this development emphasise that the success of the project was primarily due to the exceptional interdisciplinary cooperation between various research institutes in Jena. Based on the Thuringian research group “2D-Sens” and the Collaborative Research Centre “Nonlinear Optics down to Atomic Scales” of Friedrich Schiller University, experts from the Institute of Applied Physics and Institute of Physical Chemistry of the University of Jena; the University’s Abbe Center of Photonics; the Fraunhofer Institute for Applied Optics and Precision Engineering IOF; and the Leibniz Institute of Photonic Technology are collaborating on this research, together with colleagues in Australia.
“We have brought diverse expertise to this project and we are delighted with the results achieved,” says Eilenberger. “We are convinced that the technology we have developed will further strengthen the state of Thuringia as an industrial centre with its focus on photonics and optoelectronics,” adds Turchanin. A patent application for the interdisciplinary team’s invention has recently been filed.
Ultraviolet light from giant stellar flares can destroy a planet’s habitability. New research from the University of North Carolina at Chapel Hill will help astrobiologists understand how much radiation planets experience during super flares and whether life could exist on worlds beyond our solar system.
Super flares are bursts of energy that are 10 to 1,000 times larger than the biggest flares from Earth’s sun. These flares can bathe a planet in enough ultraviolet light to doom the chances of life surviving there.
Researchers from UNC-Chapel Hill have for the first time measured the temperature of a large sample of super flares from stars, and the flares’ likely ultraviolet emissions. Their findings, published Oct. 5 ahead of print in The Astrophysical Journal, will allow researchers to put limits on the habitability of planets that are targets of upcoming planet-finding missions.
“We found planets orbiting young stars may experience life-prohibiting levels of UV radiation, although some micro-organisms might survive,” said lead study author Ward S. Howard, a doctoral student in the Department of Physics and Astronomy at UNC-Chapel Hill.
Howard and colleagues at UNC-Chapel Hill used the UNC-Chapel Hill Evryscope telescope array and NASA’s Transiting Exoplanet Survey Satellite (TESS) to simultaneously observe the largest sample of super flares.
The team’s research expands upon previous work that has largely focused on flare temperatures and radiation from only a handful of super flares from a few stars. In expanding the research, the team discovered a statistical relationship between the size of a super flare and its temperature. The temperature predicts the amount of radiation that potentially precludes on-surface life.
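The kind of statistical relationship described above can be sketched with a minimal fitting example: estimating a power-law exponent between flare energy and temperature in log-log space. The power-law form, the parameter values, and the synthetic data are all assumptions for demonstration; they are not the relationship published by the study.

```python
import numpy as np

# Illustrative sketch only: synthetic flare "observations" follow an assumed
# power law T ~ E^a with scatter, and a least-squares fit in log-log space
# recovers the assumed exponent. None of these numbers come from the study.
rng = np.random.default_rng(0)

log_e = rng.uniform(33.0, 36.0, size=200)        # log10 flare energy (erg), assumed range
a_true, b_true = 0.25, 3.6                       # assumed exponent and normalisation
log_t = b_true + a_true * (log_e - 33.0)         # log10 flare temperature (K)
log_t += rng.normal(0.0, 0.05, size=log_e.size)  # assumed measurement scatter

# Fit log T = a * log E + b; the slope a estimates the power-law exponent.
a_fit, b_fit = np.polyfit(log_e - 33.0, log_t, 1)
print(f"fitted exponent a = {a_fit:.3f} (assumed {a_true})")
```

Once a relationship like this is calibrated on real flares, an observed flare's size alone yields a temperature estimate, and hence an estimate of its UV output, which is the sense in which temperature "predicts the amount of radiation" above.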
Super flares typically emit most of their UV radiation during a rapid peak lasting only five to 15 minutes. The simultaneous Evryscope and TESS observations were obtained at two-minute intervals, ensuring multiple measurements were taken during the peak of each super flare.
This is the first time the temperatures of such a large sample of super flares have been studied. The frequency of observations allowed the team to discover the amount of time super flares can cook orbiting planets with intense UV radiation.
The flares observed have already informed the TESS Extended Mission to discover thousands of exoplanets in orbit around the brightest dwarf stars in the sky. TESS is now targeting high priority flare stars from the UNC-Chapel Hill sample for more frequent observations.
“Longer term these results may inform the choice of planetary systems to be observed by NASA’s James Webb Space Telescope based on the system’s flaring activity,” said study co-author Nicholas M. Law, associate professor of physics and astronomy at UNC-Chapel Hill and principal investigator of the Evryscope telescope.