Long-sought decay of Higgs boson

Long-sought decay of Higgs boson observed

Posted by Ana Lopes on 28 Aug 2018

Six years after its discovery, the Higgs boson has at last been observed decaying to fundamental particles known as bottom quarks. The finding, presented today at CERN by the ATLAS and CMS collaborations at the Large Hadron Collider (LHC), is consistent with the hypothesis that the all-pervading quantum field behind the Higgs boson also gives mass to the bottom quark. Both teams have submitted their results for publication today.

The Standard Model of particle physics predicts that about 60% of the time a Higgs boson will decay to a pair of bottom quarks, the second-heaviest of the six flavours of quarks. Testing this prediction is crucial because the result would either lend support to the Standard Model – which is built upon the idea that the Higgs field endows quarks and other fundamental particles with mass – or rock its foundations and point to new physics.

Spotting this common Higgs-boson decay channel is anything but easy, as the six-year period since the discovery of the boson has shown. The reason for the difficulty is that there are many other ways of producing bottom quarks in proton–proton collisions. This makes it hard to isolate the Higgs-boson decay signal from the background “noise” associated with such processes. By contrast, the less-common Higgs-boson decay channels that were observed at the time of discovery of the particle, such as the decay to a pair of photons, are much easier to extract from the background.

To extract the signal, the ATLAS and CMS collaborations each combined data from the first and second runs of the LHC, which involved collisions at energies of 7, 8 and 13 TeV. They then applied complex analysis methods to the data. The upshot, for both ATLAS and CMS, was the detection of the decay of the Higgs boson to a pair of bottom quarks with a significance that exceeds 5 standard deviations. Furthermore, both teams measured a rate for the decay that is consistent with the Standard Model prediction, within the current precision of the measurement.
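A side note on what “5 standard deviations” means here: it is the conventional particle-physics threshold at which the probability that background fluctuations alone mimic the signal becomes vanishingly small. A minimal sketch of the standard conversion between a one-sided p-value and the equivalent Gaussian significance (the numbers are illustrative, not the collaborations’ actual values):

    from scipy.stats import norm

    def sigma_to_p_value(z):
        """Significance in Gaussian standard deviations -> one-sided p-value."""
        return norm.sf(z)

    def p_value_to_sigma(p):
        """One-sided p-value -> equivalent number of Gaussian standard deviations."""
        return norm.isf(p)

    # The conventional "observation" threshold used in particle physics:
    print(sigma_to_p_value(5.0))   # ~2.9e-7: chance of background alone faking the signal
    # Purely illustrative p-value, not a number from the papers:
    print(p_value_to_sigma(1e-7))  # ~5.2 sigma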

A CMS candidate event for the Higgs boson (H) decaying to two bottom quarks (b), in association with a Z boson decaying to an electron (e-) and an antielectron (e+). (Image: CMS/CERN)

“This observation is a milestone in the exploration of the Higgs boson. It shows that the ATLAS and CMS experiments have achieved deep understanding of their data and a control of backgrounds that surpasses expectations. ATLAS has now observed all couplings of the Higgs boson to the heavy quarks and leptons of the third generation as well as all major production modes,” said Karl Jakobs, spokesperson of the ATLAS collaboration.

“Since the first single-experiment observation of the Higgs boson decay to tau-leptons one year ago, CMS, along with our colleagues in ATLAS, has observed the coupling of the Higgs boson to the heaviest fermions: the tau, the top quark, and now the bottom quark. The superb LHC performance and modern machine-learning techniques allowed us to achieve this result earlier than expected,” said Joel Butler, spokesperson of the CMS collaboration.

With more data, the collaborations will improve the precision of these and other measurements and probe the decay of the Higgs boson into a pair of much-less-massive fermions called muons, always watching for deviations in the data that could point to physics beyond the Standard Model.

“The experiments continue to home in on the Higgs particle, which is often considered a portal to new physics. These beautiful and early achievements also underscore our plans for upgrading the LHC to substantially increase the statistics. The analysis methods have now been shown to reach the precision required for exploration of the full physics landscape, including hopefully new physics that so far hides so subtly,” said CERN Director for Research and Computing Eckhard Elsen.

For more information, see the ATLAS and CMS websites.

 

Laser-shocking deuterium into metal

Insulator-metal transition in dense fluid deuterium

Science  17 Aug 2018:
Vol. 361, Issue 6403, pp. 677-682

Abstract

Dense fluid metallic hydrogen occupies the interiors of Jupiter, Saturn, and many extrasolar planets, where pressures reach millions of atmospheres. Planetary structure models must describe accurately the transition from the outer molecular envelopes to the interior metallic regions. We report optical measurements of dynamically compressed fluid deuterium to 600 gigapascals (GPa) that reveal an increasing refractive index, the onset of absorption of visible light near 150 GPa, and a transition to metal-like reflectivity (exceeding 30%) near 200 GPa, all at temperatures below 2000 kelvin. Our measurements and analysis address existing discrepancies between static and dynamic experiments for the insulator-metal transition in dense fluid hydrogen isotopes. They also provide new benchmarks for the theoretical calculations used to construct planetary models.

The transformation of hydrogen from a molecular insulator to an atomic metal at high densities has been a longstanding focus in physics and planetary science. The unique quantum metallic properties of the low-temperature solid (i.e., below 300 K) have drawn sustained interest, and characterizing the transformation in the hot, dense fluid is crucial for understanding the internal structure and dynamics of giant planets, including the origin of their large magnetic fields. Numerous studies of the insulator-metal (IM) transition in dense fluid hydrogen, beginning with theoretical work five decades ago, predicted a first-order transition in the fluid with a critical point at very high temperatures (~13,000 to 15,000 K) and 60 to 90 GPa. However, the first experimental work on the IM transition in the fluid, carried out using dynamic compression techniques, provided evidence for a continuous transition with metallic states reached in the pressure range P = 50 to 140 GPa and temperatures T = 3000 to 8000 K. More recent predictions placed the critical point at a much lower temperature (~2000 K). This motivated several experimental studies using static diamond anvil cell (DAC) techniques and dynamic compression to probe the fluid properties below 2000 K and up to several hundred GPa.

Dynamic compression can explore a broad range of thermodynamic paths with time-varying manipulations of the applied pressure and controlled reverberation of pressure waves through the sample. This includes probing the dense fluid at temperatures below 2000 K, for example, with an initial jump in pressure delivered by a shock wave followed by shock reverberation or gradual ramp compression. The first demonstration of this strategy was carried out on deuterium with a magnetic compression technique at the Z facility. The results showed strong optical absorption beginning in the range 100 GPa < P < 130 GPa, followed by weak fluctuating reflectance in the range 130 GPa < P < 300 GPa, and culminated in abrupt jumps to high reflectance near 300 GPa. Knudson et al. attributed the absorption to band gap closure and determined that the reflectance jumps were associated with the first-order IM transition. The reflectance jumps occurred at higher pressures upon compression than upon decompression, plausibly as a result of thermal conduction. Meanwhile, improvements in static compression methods have allowed the exploration of the behavior of the fluid over part of this pressure-temperature (P-T) range (up to 170 GPa and >1800 K). Changes in optical properties from 120 to 170 GPa depending on temperature were attributed to the IM transition, whereas other experiments suggest the persistence of a finite (~1 eV) band gap at similar conditions.

The IM transition is the subject of a number of continuing theoretical studies that consistently predict a discontinuous transition below a critical point near ~2000 K, but over a broad range of pressures. Density functional theory (DFT)–based calculations show a spread in the transition pressure spanning 150 GPa, arising from the sensitivity of the boundary to the choice of exchange-correlation functional used and whether zero-point energy is accounted for. Quantum Monte Carlo (QMC) calculations should provide improved bounds on the transition pressures, although they disagree with a recent benchmarking experiment. Transition pressures for hydrogen and deuterium are expected to be different because of isotope effects, but with a small relative magnitude. The transition in deuterium from QMC simulations is 30 GPa higher than in hydrogen at 600 K, decreasing to 10 GPa higher at 1200 K. Despite experimental support for a first-order IM transition, the critical point has not been experimentally identified. Furthermore, the broad discrepancies in the measured transition pressure and character have made resolving the differences between the theoretical models challenging.

We completed a series of five dynamic compression experiments at the National Ignition Facility (NIF) to probe the IM transition up to 600 GPa at temperatures ranging from 900 K to 1600 K. The experiments were carried out using 168 laser beams to deliver up to 300 kJ of ultraviolet light that drove a near-isentropic reverberation compression of a cryogenic liquid deuterium sample. We adjusted the time dependence of the laser delivery (pulse shape) to control the compression sequence imposed on the sample as a function of time. Line-imaging Doppler velocimetry recorded both the compression history and the evolution of the optical properties of the D2 sample during the nanosecond compression process, using a probe laser operating at 660 nm.

I was led to this article about “technology” on dr.dk:

168 super-lasers transform gas into a shining metal

“A research team has used the world's most powerful laser to create the same material that is found in the cores of large stars.”

At first glance this sounds mysterious, but there is a picture of Jupiter with the caption:

In a laboratory in the USA, a research team has subjected hydrogen gas to the same conditions it experiences deep inside large planets such as Jupiter. The result was metallic hydrogen – a substance with enormous potential for the technology of the future.

Here it is “the technology of the future” that sounds mysterious. Further down one reads:

It could revolutionize everything from maglev trains and electric cars to all other electronics. In addition, metallic hydrogen is the most powerful rocket fuel we know of.

So how is the metallic hydrogen supposed to remain in this state once the pressure of 200 billion pascals is removed?

The article says absolutely nothing about this. The purpose of the paper is to improve the models of large gas planets such as Jupiter and Saturn. The temperature was varied between 900 K and 1600 K.

The remark about the “powerful rocket fuel” originates from this speculative Phase I study:

Metallic Hydrogen: A Game Changing Rocket Propellant

Atomic metallic hydrogen, if metastable at ambient pressure and temperature could be used as the most powerful chemical rocket fuel, as the atoms recombine to form molecular hydrogen. This light-weight high-energy density material would revolutionize rocketry, allowing single-stage rockets to enter orbit and chemically fueled rockets to explore our solar system. To transform solid molecular hydrogen to metallic hydrogen requires extreme high pressures, but has not yet been accomplished in the laboratory. In the proposed new approach electrons will be injected into solid hydrogen with the objective of lowering the critical pressure for transformation.  This new approach may scale down the pressures needed to produce this potentially revolutionary rocket propellant.

This would truly be a sensation! But note the wording: “if metastable”, “would revolutionize” and “This new approach may scale down the pressures needed to produce this potentially revolutionary rocket propellant”. I have only found this one paper on solid metallic hydrogen:

Observation of the Wigner-Huntington Transition to Solid Metallic Hydrogen

We have studied solid hydrogen under pressure at low temperatures. With increasing pressure we observe changes in the sample, going from transparent, to black, to a reflective metal, the latter studied at a pressure of 495 GPa. We have measured the reflectance as a function of wavelength in the visible spectrum finding values as high as 0.90 from the metallic hydrogen. We have fit the reflectance using a Drude free electron model to determine the plasma frequency of 30.1 eV at T = 5.5 K, with a corresponding electron carrier density of 6.7×10²³ particles/cm³, consistent with theoretical estimates. The properties are those of a metal. Solid metallic hydrogen has been produced in the laboratory.
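The carrier density quoted in that abstract follows directly from the Drude plasma frequency. A minimal back-of-the-envelope sketch using only the numbers given above (not the authors' full Drude fit):

    import scipy.constants as const

    # Plasma frequency from the abstract, quoted as an energy (hbar * omega_p)
    E_p_eV = 30.1
    omega_p = E_p_eV * const.e / const.hbar            # rad/s

    # Drude relation: omega_p^2 = n e^2 / (eps0 m_e)  =>  n = eps0 m_e omega_p^2 / e^2
    n_m3 = const.epsilon_0 * const.m_e * omega_p**2 / const.e**2
    print(f"electron carrier density ~ {n_m3 * 1e-6:.2e} cm^-3")  # ~6.6e23, close to the quoted 6.7e23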

But this was achieved at 495 billion pascals, not at low pressure. Metastable solid metallic hydrogen at low pressure remains pie in the sky.

 

PANOSETI: science program

Panoramic optical and near-infrared SETI instrument: overall specifications and science program

We present overall specifications and science goals for a new optical and near-infrared (350 – 1650 nm) instrument designed to greatly enlarge the current Search for Extraterrestrial Intelligence (SETI) phase space. The Pulsed All-sky Near-infrared Optical SETI (PANOSETI) observatory will be a dedicated SETI facility that aims to increase sky area searched, wavelengths covered, number of stellar systems observed, and duration of time monitored. This observatory will offer an “all-observable-sky” optical and wide-field near-infrared pulsed technosignature and astrophysical transient search that is capable of surveying the entire northern hemisphere. The final implemented experiment will search for transient pulsed signals occurring between nanosecond to second time scales. The optical component will cover a solid angle 2.5 million times larger than current SETI targeted searches, while also increasing dwell time per source by a factor of 10,000. The PANOSETI instrument will be the first near-infrared wide-field SETI program ever conducted. The rapid technological advance of fast-response optical and near-infrared detector arrays (i.e., Multi-Pixel Photon Counting; MPPC) now makes this program feasible. The PANOSETI instrument design uses innovative domes that house 100 Fresnel lenses, which will search concurrently over 8,000 square degrees for transient signals (see Maire et al. and Cosens et al., this conference). In this paper, we describe the overall instrumental specifications and science objectives for PANOSETI.
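To put those coverage numbers in perspective, here is a small sketch converting the quoted instantaneous field of view into steradians and into a fraction of the full sky; the per-lens figure is only an illustrative division of the quoted total by the 100 lenses, not a PANOSETI specification:

    import math

    FULL_SKY_DEG2 = 4 * math.pi * (180 / math.pi) ** 2      # ~41,253 square degrees

    def deg2_to_sr(area_deg2):
        """Convert an area on the sky from square degrees to steradians."""
        return area_deg2 * (math.pi / 180) ** 2

    panoseti_deg2 = 8000                   # instantaneous coverage quoted in the abstract
    per_lens_deg2 = panoseti_deg2 / 100    # illustrative: total divided over 100 Fresnel lenses

    print(f"PANOSETI field: {deg2_to_sr(panoseti_deg2):.2f} sr, "
          f"{100 * panoseti_deg2 / FULL_SKY_DEG2:.0f}% of the full sky")
    print(f"per lens (illustrative): {per_lens_deg2:.0f} square degrees")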

The past, present and future of SETI, the search for extraterrestrial intelligence:

Is there anybody out there?

Jason Davis, October 25, 2017

About 2,000 years ago, just before the start of the Common Era, the Romans conquered Spain. The Roman Empire was powered by money, and the currency of the time was silver. Fortunately for the Romans, there were an ample number of silver mines in their new Spanish territory.

It takes a lot of energy to smelt silver into coins, so the Romans cut down vast swaths of Spain’s forests to burn the wood for fuel. A byproduct of the smelting process is lead, which the Romans used for plumbing. For the first time, our species was engaged in large-scale industrial manufacturing—and also large-scale pollution. Signs of all this can be found in Greenland ice cores.

Pete Worden is the executive director of Breakthrough Initiatives, which funds efforts to search for life beyond Earth. He recently told me Roman silver mining is arguably the first time humans’ impact on the planet was noticeable from outer space.

“If you were sitting at a nearby star and had the ability to take a spectrum of the atmosphere, with technology that we can imagine in the next few decades, you would detect these things that are at least, from our understanding, clearly industrial pollutants,” he said.

Simon Peter “Pete” Worden

 

UEFI for the RaspberryPi 3B

64-bit Tiano Core UEFI for the Raspberry Pi 3 (with devices, Linux, FreeBSD and Windows on Arm!)

Last updated Jul 8th, 2018.

This is a port of 64-bit Tiano Core UEFI firmware for the RPi3/RPi3B+ platforms, based on Ard Biesheuvel’s 64-bit and Microsoft’s 32-bit implementations.

Initially, this was supposed to be an easy walk in the park, where the Microsoft drivers just sorta slide into Ard’s UEFI implementation, and I call it a day. It turned out to be a bit more frustrating of an experience than that :-).

This is meant as a generally useful 64-bit UEFI implementation for the Pi3, good enough for most kinds of UEFI development and good enough for running real operating systems. It has been validated to install and boot Linux (SUSE, Ubuntu) and FreeBSD, and there is experimental Windows on Arm support as well.

Andrej Warkentin

Master student at Bernstein Center for Computational Neuroscience Berlin.

Andrej Warkentin on Twitter

Here is a comparison table between different available EFI firmware implementations for the RPi3.

Feature | This Implementation | Ard's | Microsoft's | U-Boot | Minoca
Bitness | 64-bit | 64-bit | 32-bit | Either | 32-bit
PSCI CPU_ON | Yes | No | No | No | No
PSCI SYSTEM_RESET | Yes | Yes | No | No | No
PSCI SYSTEM_OFF | Yes | No | No | No | No
DT | Yes | Yes | No | Yes | No
Pass-through DT | Yes | No | N/A | Yes | No
NVRAM | Limited | No | No | No | No
RTC | Limited | No | No | No | No
ACPI | Yes | No | Yes | No | Yes
Serial | Yes | Yes | Yes | Yes | Yes
HDMI GOP | Yes | No | No | Yes | No
SMBIOS | Yes | No | Yes | No | Yes
uSD | Yes | No | Yes | Yes | Yes
uSD SdHost and Arasan | Yes | No | Yes | ? | No
USB1 | Limited | No | No | Yes | No
USB2/3 | Yes | No | No | Yes | No
USB Mass Storage | Yes | No | No | Yes | No
USB Keyboard | Yes | No | No | Yes | No
USB Ax88772b PXE/Network | Yes | No | No | Yes | No
USB SMSC95xx PXE/Network | No | No | No | Yes | No
Tiano | Yes | Yes | Yes | No | No
AArch32 Windows IoT | No | No | Yes | No | No
AArch64 Windows on Arm | Limited | No | No | No | No
AArch64 Linux | Yes | Limited | No | Yes | No
AArch32 Linux | No | No | No | Yes | No
AArch64 FreeBSD | Yes | No | No | Yes | No
AArch32 Minoca | No | No | No | No | Yes

OpenRISC Project

OpenRISC Project Overview

Welcome to the project overview of the OpenRISC project. The major goal of the project is to create a free and open processor for embedded systems. This includes:

  • a free and open RISC instruction set architecture with DSP features
  • a set of free, open source implementations of the architecture
  • a complete set of free, open source software development tools, libraries, operating systems and applications
  • a variety of system-on-chip and system simulators

The project is driven by a very active community and has a long history. This has unfortunately led to scattered and partly outdated information. The goal of this page is to provide an overview of the active parts of the project and the current development, to ease the entry for newcomers or people seeking basic information.

System-on-Chip

While a processor core is still the heart of every system, the peripherals, memory, etc. are of course equally important. There are a number of systems-on-chip available that you can use to perform RTL simulations, SystemC simulations or an FPGA synthesis of an entire OpenRISC-powered system:

  • fusesoc is a new SoC generator that not only supports OpenRISC. It also manages the available peripheral cores and allows you to easily configure and generate your system-on-chip.
  • minsoc is a minimal OpenRISC-based system-on-chip, that is easy to configure and implement, but still uses the OR1200 processor implementation.
  • OpTiMSoC is a flexible multicore system-on-chip that is based on a network-on-chip and connects a configurable number of OpenRISC (mor1kx) processors to arbitrarily large platforms.
  • MiSoC is a SoC generator using the Python-based Migen which can use the mor1kx processor. Both high performance and optimized for a small FPGA footprint, it supports a large number of development boards out of the box.

Operating Systems

If you want to run an operating system on your OpenRISC you have a few options:

  • Linux has been ported and is now upstream in the standard Linux repositories (“upstream” means that the changes were submitted to an open source project, accepted, and are now part of that software)
  • RTEMS has been ported during a Google Summer of Code project and is also upstream.

Why are RISC processors interesting?

microcode – Intel / AMD CPU Microcode
Since the PentiumPro, Intel CPUs are made of a RISC chip and of a microcode whose purpose is to decompose “old” ia32 instructions into new RISC ones. The P6 family is concerned: PPro, PII, Celeron, PIII, Celeron2. Recent kernels have the ability to update this microcode.
The microcode update is volatile and needs to be uploaded on each system boot, i.e. it doesn’t reflash your CPU permanently. Reboot and it reverts back to the old microcode.
This package contains microcode for Intel and AMD CPUs.

This text is taken from Thomas Backlund's (tmb) microcode package for Mageia 6. The user's assembly code is not executed directly. Instead it is translated, via a hidden RISC code, into the actual instructions. The reason is that modern processors are much faster than RAM, so the designers try to exploit the idle time while the processor waits for the memory devices. OpenRISC wants to open this hidden RISC code to everyone by designing an open processor entirely from scratch.
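Since the loaded microcode is volatile, it can be useful to check which revision the kernel has actually applied on a running system. A minimal sketch that reads the standard “microcode” field from /proc/cpuinfo on x86 Linux:

    def loaded_microcode_revisions(path="/proc/cpuinfo"):
        """Return the set of microcode revisions reported for the logical CPUs (x86 Linux)."""
        revisions = set()
        with open(path) as f:
            for line in f:
                if line.startswith("microcode"):
                    revisions.add(line.split(":", 1)[1].strip())
        return revisions

    if __name__ == "__main__":
        print("Loaded microcode revision(s):", loaded_microcode_revisions() or "not reported")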

Strictly speaking, OpenRISC only concerns embedded systems, that is, not laptop or desktop systems.

Closed boxes are insecure

Intel has admitted to three new security holes in their silicon, and probably nobody will be surprised if I claim that more are already on the way.

The problem with semiconductors is exactly the same as with software: closed boxes are by definition insecure for everyone other than the one who closed them.

The first microprocessors used a couple of thousand transistors, and it took no more than a couple of days with a microscope to get a really good idea of what they contained.

The first commodity CPU with more than 20 billion transistors has been “taped out”[1], and designs with 10% more are already in the pipeline.

The central performance limitation today is the bottleneck between RAM and CPU, which is the direct cause of Spectre, Meltdown and all their successors, both the known security holes and those not yet disclosed.

The security holes arise because, seen from the CPU, it takes an eternity to fetch anything from RAM, and so CPU designers have introduced ever more speculative “tricks” to try to advance the workload while waiting for the answer from RAM.

Exactly how many transistors are spent on these optimizations only Intel and AMD know, but it lies in the very nature of what they are used for that they are among the hardest things to make sense of with a look through the microscope.

The microscope remark is of course a pure joke: nobody is going to reverse-engineer 20 billion transistors hidden under 30 layers of metal, neither now nor later.

20 billion really is a lot of transistors, enough to hide an i486 running Minix3 with a full network stack and web server in a corner without anyone noticing.

That kind of mandatory extra equipment is the other way CPU manufacturers try to make new products attractive.

Under obscure newspeak acronyms such as “Active Management Technology”, “Intel Security Assist” and “Software Guard Extensions”, millions of transistors hide in extras that fundamentally all serve masters other than the person who bought and paid for the CPU.

The intentions behind these extras are good enough: you should be able to brick a stolen laptop, you should be able to work with personal data on a cloud computer without it leaking all over the place, and so on and so forth.

But the implementations are secret, full of bugs and, until the opposite has been proven: deliberately placed backdoors.

Intel and AMD have so many chip designers that any intelligence service that is not incompetent has at least a handful of them on its payroll, partly to obtain insider knowledge of what is actually on the chip, but no doubt also to make it easier to gain access to computers based on the chips in question.

The good thing about competent intelligence services is that they always try to use as little gunpowder as possible, so as not to reveal what they are really capable of. Two fine examples are ECHELON, which turned out to be total surveillance of all telecommunications across the entire globe, including tapping of undersea fibre cables, and the STUXNET virus, which was used to sabotage Iran’s nuclear programme.

That there is also good contact with the NSA at the management level is obvious to anyone who bothers to look: neither Intel nor AMD hit upon the perfectly obvious idea of spending a single million transistors on CPU support for AES encryption before Taiwanese VIA did. Even funnier: when they finally did, it looked most of all as if it was simply a matter of enabling something that was already in the chip, something that customers with access to a different “microcode firmware” had apparently had access to for about 10 years.

Russia, China, Israel, the USA and a number of commercial actors have long since recognized the problem, and all of them have developed private chip designs for applications where Intel’s and AMD’s closed boxes cannot be trusted.

But the day a fascist government, a terrorist, a mafioso or an IT libertarian with neo-Nazi tendencies finds a leak in one of Intel’s or AMD’s many closed boxes, it will be a whole different ballgame.

It is no longer enough to insist, as all competent IT security designers have done since Ken Olson’s days, that Closed Source is not to be trusted; we must rule out “Closed Silicon” in the same way.

phk

[1] In the old days the masks for semiconductors were made by hand with deep-red “rubylith” adhesive film, and when the masks were finished the chip was said to have been “taped out”. The expression has stuck, even though the masks are made today with computer-controlled electron-beam lithography.

The open source digital design conference

ORConf 2018 will be held from September 21st to 23rd in Gdansk, Poland, at the Gdansk University of Technology

 

Faster computation on confidential data

Researcher receives European grant to develop computation on private and confidential data

There is a growing need for computations on private inputs in a way that ensures the inputs are kept confidential and the results correct, for example in income analyses (confidential health data is another example). Now a researcher from Aarhus has received a grant to develop the method.

The European Research Council (ERC) has awarded Associate Professor Claudio Orlandi from the Department of Computer Science at Aarhus University a Starting Grant of 1.5 million euros for research into confidential, efficient and secure multiparty computation (MPC).

Claudio Orlandi will spend the money and the next five years devising a new and more efficient approach to secure MPC (MultiParty Computation), or secure distributed computation, the university writes in an announcement via Ritzau.

He believes the technology is reaching the limit of how large a task it can handle.

MPC is a cryptographic technology that makes it possible to compute on encrypted values and thereby resolve the inherent conflict that can exist between utility and privacy when sensitive data must be processed.

In this way, parties who do not trust each other can compute any joint function of their private inputs in a way that ensures the inputs remain confidential and the results correct. For example, two people can find out which of them is the richer without having to disclose their fortunes to each other.
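As a taste of how such protocols work, here is a minimal sketch of additive secret sharing, one of the standard building blocks of MPC: each party splits its private income into random shares, only shares and share-sums are ever exchanged, and yet the joint total comes out correct. The party names and amounts are hypothetical, and a real protocol needs much more (authenticated channels, security against cheating parties, and non-linear operations such as the comparison in the millionaires example above).

    import secrets

    MODULUS = 2**64  # all arithmetic is done modulo a fixed ring size

    def share(value, n_parties):
        """Split a private value into n additive shares that sum to the value mod MODULUS."""
        shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % MODULUS)
        return shares

    def reconstruct(shares):
        return sum(shares) % MODULUS

    # Hypothetical private incomes of three mutually distrusting parties
    incomes = {"A": 310_000, "B": 475_000, "C": 289_000}

    # Each party shares its income; party i ends up holding one share from everyone
    all_shares = {name: share(v, 3) for name, v in incomes.items()}
    held_by_party = [[all_shares[name][i] for name in incomes] for i in range(3)]

    # Each party adds up the shares it holds and publishes only that local sum
    local_sums = [sum(col) % MODULUS for col in held_by_party]

    # The joint total is revealed, but no individual income ever was
    print("joint sum:", reconstruct(local_sums))   # 1074000
    print("check:    ", sum(incomes.values()))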

A growing need for data confidentiality

The need for such confidential data computations is becoming ever more pressing, because there is a growing need to analyse the large amounts of data being collected – in order to create new knowledge and contribute to a better society.

But the data can also be leaked and misused, the university notes, and that is why methods that ensure confidentiality are needed.

Over the past decade the efficiency of MPC has improved considerably, and these advances have allowed several companies around the world to start implementing and including MPC solutions in their products – used, for example, for secure auctions, benchmarking, protection of personal data, data processing and so on.

But MPC is still far from efficient enough for this promising technology to be used in everyday computations.

“It now looks as if we have hit a wall when it comes to the possibilities for improving the current building blocks of MPC, which prevents MPC from entering critical large-scale applications,” explains Claudio Orlandi.

He believes that a radical paradigm shift in MPC research is needed to make MPC truly practical.

With the ERC grant, Claudio Orlandi will take a step back, challenge current assumptions in MPC research and design new MPC solutions. His hypothesis is that if MPC is to be taken to the next stage, it requires a more realistic modelling of the way security, privacy and efficiency are defined and measured.

“The special thing about an ERC Starting Grant is that it gives the researcher a very high degree of freedom, which is necessary in order to throw oneself into potentially risky but promising research directions. The keywords are high risk and high reward.”

 

Asteroid Explorer “Hayabusa2”

Asteroid Explorer “Hayabusa2”

Hayabusa2 to clarify the origin and evolution of the solar system as well as life matter

Asteroid Explorer “Hayabusa2” is a successor of “Hayabusa” (MUSES-C), which revealed several new technologies and returned to Earth in June 2010.
While establishing a new navigation method using ion engines, Hayabusa brought back samples from the asteroid “Itokawa” to help elucidate the origin of the solar system. Hayabusa2 will target a C-type asteroid “Ryugu” to study the origin and evolution of the solar system as well as materials for life by leveraging the experience acquired from the Hayabusa mission.
To learn more about the origin and evolution of the solar system, it is important to investigate typical types of asteroids, namely S-, C-, and D-type asteroids. A C-type asteroid, which is a target of Hayabusa2, is a more primordial body than Itokawa, which is an S-type asteroid, and is considered to contain more organic or hydrated minerals — although both S- and C- types have lithologic characteristics. Minerals and seawater which form the Earth as well as materials for life are believed to be strongly connected in the primitive solar nebula in the early solar system, thus we expect to clarify the origin of life by analyzing samples acquired from a primordial celestial body such as a C-type asteroid to study organic matter and water in the solar system and how they coexist while affecting each other.

Establishing deep space exploration technology and new challenges

Hayabusa2 will utilize new technology while further confirming the deep space round-trip exploration technology by inheriting and improving the already verified knowhow established by Hayabusa to construct the basis for future deep-space exploration.
The configuration of Hayabusa2 is basically the same as that of Hayabusa, but we will modify some parts by introducing novel technologies that evolved after the Hayabusa era. For example, the antenna for Hayabusa was in a parabolic shape, but the one for Hayabusa2 will be flattened. Also, a new function, “collision device”, is considered to be onboard to create a crater artificially. An artificial crater that can be created by the device is expected to be a small one with a few meters in diameter, but still, by acquiring samples from the surface that is exposed by a collision, we can get fresh samples that are less weathered by the space environment or heat.
Hayabusa2 was launched on December 3, 2014. It should arrive at the C-type asteroid in mid 2018, staying around there for one and a half years before leaving the asteroid at the end of 2019 and returning to Earth around the end of 2020.

Stereo image of asteroid Ryugu by Dr. Brian May

Brian May, the lead guitarist from the British rock band Queen, has created a stereoscopic image of Ryugu from photographs captured with the ONC-T camera onboard Hayabusa2, so that the asteroid can be viewed in three dimensions. Brian May is an astronomer, with a doctoral degree in astrophysics from Imperial College London. He has a strong interest in planetary defense or space guard, which considers the potential threat to the Earth from meteorites. As part of this, May is a core member of “Asteroid Day”, which began about three years ago to increase awareness of asteroids and action that can be taken to protect the Earth.

Image to be used with red/blue stereo glasses.

JAXA Hayabusa2 Project

 

Rotation-invariant CNN for SN detection

Enhanced Rotational Invariant Convolutional Neural Network for Supernovae Detection

In this paper the authors propose an improved CNN model (deep learning) for detecting supernovae. This is done by applying a new method for obtaining rotational invariance that exploits cyclic symmetry. The authors also use a visualization method, layer-wise relevance propagation (LRP), which makes it possible to locate the relevant pixels that contribute to the separation between supernova candidates and bogus objects. A comparison with the original Deep-HiTS model shows that the improved method achieves the best results so far on a HiTS dataset, reaching a mean accuracy of 99.53%. The improvement over Deep-HiTS is significant both statistically and in practice.

In this paper, we propose an enhanced CNN model for detecting supernovae (SNe). This is done by applying a new method for obtaining rotational invariance that exploits cyclic symmetry. In addition, we use a visualization approach, the layer-wise relevance propagation (LRP) method, which allows finding the relevant pixels in each image that contribute to discriminate between SN candidates and artifacts. We introduce a measure to assess quantitatively the effect of the rotational invariant methods on the LRP relevance heatmaps. This allows comparing the proposed method, CAP, with the original Deep-HiTS model. The results show that the enhanced method presents an augmented capacity for achieving rotational invariance with respect to the original model. An ensemble of CAP models obtained the best results so far on the HiTS dataset, reaching an average accuracy of 99.53%. The improvement over Deep-HiTS is significant both statistically and in practice.

This abstract does not hold back when it comes to insider acronyms. It is nevertheless immediately clear that the paper concerns the use of improved deep learning for the detection of supernovae.

INTRODUCTION

Astronomy is entering a new Big Data era driven by large telescopes such as the Large Synoptic Survey Telescope (LSST), an 8.4 m telescope with a 3.2-gigapixel camera that will begin observing in 2022. LSST is a robotic telescope that will scan the entire southern sky every 3 days. The telescope will collect photometry for 50 billion objects over a 10-year period. Time-domain astronomy studies objects that vary in time or position, e.g. supernovae. The High Cadence Transient Survey (HiTS) aims to detect supernovae in their early stages in order to gain increased knowledge of the physics of these exploding stars. HiTS has a dedicated pipeline for detecting transients (in American English, a transient is also a person passing through). This pipeline subtracts a reference image from each new image, detects sources and classifies them. The number of supernovae grows with distance, but more distant objects have a lower signal-to-noise ratio, which is why it is important to be able to reduce both false negative and false positive detections. The authors have previously introduced a convolutional neural network (CNN) for classifying detected sources from HiTS as either true transients or bogus candidates. A conventional architecture introduces convolutional layers in the neural network to achieve translational invariance: a feature that is shifted to another position in the input image yields the same classification as the original feature. The classification of the neural network should, however, also be independent of the detector's rotation relative to the sky. In 2017 the authors introduced partial rotational invariance in their CNN. They named this particular deep-learning model Deep-HiTS (DH).

In this paper the authors improve Deep-HiTS by applying a new method for obtaining rotational invariance. They also use a new visualization method, layer-wise relevance propagation (LRP), in order to find the relevant pixels in each image that help separate a supernova candidate from a bogus object. The authors assess the effect of the improved rotation-invariant method with LRP by comparing against the original Deep-HiTS model.
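The cyclic-symmetry idea can be illustrated without the full Deep-HiTS architecture: run the four 90-degree rotations of each candidate stamp through the same feature extractor and pool over the rotated copies, so that rotating the input cannot change the output. A minimal sketch in which the tiny "feature extractor" is only a stand-in for the shared convolutional layers, not the authors' network:

    import numpy as np

    def features(stamp):
        """Stand-in feature extractor, applied identically to every input
        (in the paper this role is played by the shared convolutional layers)."""
        return np.array([stamp.mean(), stamp.std(), np.abs(np.diff(stamp, axis=0)).mean()])

    def cyclic_pooled_features(stamp):
        """Apply the same extractor to all four 90-degree rotations and average:
        the pooled result is invariant under 90-degree rotations of the input."""
        rotations = [np.rot90(stamp, k) for k in range(4)]
        return np.mean([features(r) for r in rotations], axis=0)

    rng = np.random.default_rng(0)
    stamp = rng.normal(size=(21, 21))                 # hypothetical candidate image stamp

    same = np.allclose(cyclic_pooled_features(stamp),
                       cyclic_pooled_features(np.rot90(stamp)))
    print(same)   # True: rotating the input leaves the pooled features unchanged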

The reason deep learning (also referred to as artificial intelligence) works so well in this case is that there are only two classes: a) supernova candidates and b) bogus objects (detector defects, cosmic-ray muons).

 

21cm absorption without CDM

Predictions for the sky-averaged depth of the 21cm absorption signal at high redshift in cosmologies with and without non-baryonic cold dark matter

Stacy McGaugh

We consider the 21-cm absorption signal expected at high redshift in cosmologies with and without non-baryonic cold dark matter. The expansion of the early universe decelerates strongly with dark matter, but approximately coasts without it. This results in a different path length across the epochs when absorption is expected, with the consequence that the absorption is predicted to be a factor of ∼ 2 greater without dark matter than with it. Observation of such a signal would motivate consideration of extended theories of gravity in lieu of dark matter.
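The factor of ~2 traces back to the different expansion histories: with cold dark matter the early universe is matter dominated and decelerating, while without it the coasting case expands as a ∝ t, giving a longer path length across the absorption epoch. A rough numerical sketch of that comparison, with illustrative parameter values and redshift window that are not taken from the paper:

    import numpy as np
    from scipy.integrate import quad

    C_KM_S = 299792.458
    H0 = 70.0        # km/s/Mpc, illustrative
    OMEGA_M = 0.31   # illustrative matter density for the CDM case

    def H_matter(z):
        """Matter-dominated (decelerating) expansion rate, adequate at z ~ 15-20."""
        return H0 * np.sqrt(OMEGA_M) * (1 + z) ** 1.5

    def H_coasting(z):
        """Coasting expansion, scale factor proportional to time."""
        return H0 * (1 + z)

    def proper_path_length(H, z_lo, z_hi):
        """Proper path length (in Mpc) of a light ray crossing redshifts z_lo..z_hi."""
        return quad(lambda z: C_KM_S / ((1 + z) * H(z)), z_lo, z_hi)[0]

    z_lo, z_hi = 15.0, 20.0                       # illustrative 21-cm absorption window
    l_cdm = proper_path_length(H_matter, z_lo, z_hi)
    l_coast = proper_path_length(H_coasting, z_lo, z_hi)
    print(f"CDM: {l_cdm:.0f} Mpc, coasting: {l_coast:.0f} Mpc, ratio ~ {l_coast / l_cdm:.1f}")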

 

Half-billion-year-old animals

These half-billion-year-old creatures were animals—but unlike any known today

By Colin Barras |


Artist’s reconstruction of Stromatoveris, an ancient marine animal
J. Hoyal Cuthill

So-called Ediacaran organisms have puzzled biologists for decades. To the untrained eye they look like fossilized plants, in tube or frond shapes up to 2 meters long. These strange life forms dominated Earth’s seas half a billion years ago, and scientists have long struggled to figure out whether they’re algae, fungi, or even an entirely different kingdom of life that failed to survive. Now, two paleontologists think they have finally established the identity of the mysterious creatures: They were animals, some of which could move around, but they were unlike any living on Earth today.

Scientists first discovered the Ediacaran organisms in 1946 in South Australia’s Ediacara Hills. To date, researchers have identified about 200 different types in ancient rocks across the world. Almost all appear to have died out by 541 million years ago, just before fossils of familiar animals like sponges and the ancestors of crabs and lobsters appeared in an event dubbed the Cambrian explosion. One reason these creatures have proved so tricky to place in the tree of life is that some of them had an anatomy unique in nature. Their bodies were made up of branched fronds with a strange fractal architecture, in which the frond subunits resembled small versions of the whole frond.

Jennifer Hoyal Cuthill at the Tokyo Institute of Technology and the University of Cambridge in the United Kingdom and Jian Han at Northwest University in Xi’an, China, have now found key evidence that the Ediacaran organisms were animals. They analyzed more than 200 fossils of a 518-million-year-old marine species named Stromatoveris psygmoglena. Paleontologists had previously concluded that the 10-centimeter-tall species was some sort of animal—in part, says Hoyal Cuthill, because it was found alongside other known animals, and all of the fossils are preserved in a similar way. Hoyal Cuthill and Han argue S. psygmoglena was also an Ediacaran organism, a rare “survivor” that somehow clung on through the Cambrian explosion.

The Stromatoveris fossils, which were all unearthed in Yunnan province in southwestern China, are beautifully preserved, Hoyal Cuthill says. As she examined specimen after specimen she became increasingly excited. “I began thinking: My goodness, I’ve seen these features before.” Like some of the strange Ediacaran organisms, Stromatoveris was made up of several radially repeated, branched fronds with a fractal internal architecture.

To find out what sort of animals Stromatoveris and the other Ediacaran organisms were, Hoyal Cuthill and Han ran a computer analysis that uses anatomical features to reconstruct evolutionary relationships. They found that Stromatoveris and the other Ediacaran organisms don’t belong to any living animal group or “phylum.” Instead, they cluster on their own branch in the animal evolutionary tree, between the sponges and complex animals with a digestive cavity like worms, mollusks, and vertebrates, the team reports today in Palaeontology. “This branch, the Petalonamae, could well be its own phylum, and it apparently lacks any living descendants,” Hoyal Cuthill says.

“It looks very likely [the Ediacaran organisms] are animals,” says Simon Conway Morris, a paleontologist at the University of Cambridge, who worked with Han on the first description of Stromatoveris in 2006, but who was not involved in the current study. At that point there were just a handful of known Stromatoveris fossils. The researchers argued that they were similar to some Ediacaran organisms, although others later questioned that link. Conway Morris says the new study “extends the story very nicely” by exploring the Ediacaran nature of Stromatoveris in more detail.

Geobiologist Simon Darroch at Vanderbilt University in Nashville is also comfortable with the idea that the Ediacaran organisms were animals and that a few survived into the Cambrian. But on a first look he is not convinced that Stromatoveris was one such survivor; he thinks the evidence that it had the fractal architecture of an Ediacaran organism isn’t strong—yet he’s open to persuasion.

If the new conclusion settles one mystery, though, it introduces another. The Ediacaran organisms represent the first major explosion of complex life on Earth, and they thrived for 30 million years. Their demise has been linked to the appearance of animals in the Cambrian Explosion, Hoyal Cuthill says. But that simple explanation doesn’t work as well if Ediacaran organisms were animals themselves, and some were still alive tens of millions of years later. “It’s not quite so neat anymore,” she says. “As to what led to their eventual extinction I think it’s very hard to say.”