
Robert Crease Awarded 2021 Institute of Physics William Thomson, Lord Kelvin Medal and Prize | | SBU News – Stony Brook News

Robert Crease, professor and chair of the Department of Philosophy in the College of Arts and Sciences, has been awarded the 2021 Institute of Physics (IOP) William Thomson, Lord Kelvin Medal and Prize. Crease received the award for his 21 years writing Physics World's outstanding Critical Point column, devoted to describing key humanities concepts for scientists and explaining the significance of key scientific ideas for humanists.

Crease has written, co-written, translated, and edited nearly 20 books on the history and philosophy of science, several of which sprang from material in Critical Point. These books have been reviewed in places as diverse as The Economist, the London Review of Books, and the Wall Street Journal, and translated into a total of 10 languages. One book in particular, The Quantum Moment: How Planck, Bohr, Einstein, and Heisenberg Taught Us to Love Uncertainty, is about the real and fanciful impact that quantum mechanics has had on philosophy, culture, and life. The book stemmed from an innovative class that Crease and physicist Alfred Goldhaber co-taught at Stony Brook University.

"My sincere congratulations to Bob on his receipt of the William Thomson, Lord Kelvin Medal and Prize," said Nicole Sampson, dean of the College of Arts and Sciences and distinguished professor of chemistry. "His decades-long contribution to the sciences from a humanist's perspective, through his Critical Point column and numerous publications as well as inventive course offerings that blend the arts and sciences, is a testament to the importance of interdisciplinary collaboration as we navigate our ever-changing world. I applaud Bob for his commitment to communicating ideas and knowledge from his own area of expertise for the benefit of other disciplines."

Crease is also co-editor-in-chief of Physics in Perspective, whose mission is to bridge the gulf between physicists and non-physicists through historical and philosophical studies that typically display the unpredictable as well as the cross-disciplinary interplay of observation, experiment, and theory that has occurred over extended periods of time in academic, governmental, and industrial settings and in allied disciplines such as astrophysics, chemical physics, and geophysics.

"I'm thrilled to get this award," said Crease. "The IOP, a century-old scientific organization, gave it to me for more than 20 years of writing the Critical Point column for Physics World. It's a good sign for the humanities, for the column explores the numerous intersections between the humanities and the sciences. In a science-dominated world, I think, the vitality of the humanities is threatened not by interacting too much with the sciences, but by interacting too little. By the way, Kelvin, the scientist for whom the award is named, occupied what at the time was called a Chair of Natural Philosophy."

Other books include Philosophy of Physics, an IOP ebook, and the final portion of J. Robert Oppenheimer: A Life, which was begun by physicist Abraham Pais and left incomplete at his death. Crease also edited Science Policy Up Close by physicist and U.S. Presidential Science Advisor John H. Marburger III. For these and other contributions to the history and philosophy of science, Crease was elected a fellow of both the APS and the IOP.

Crease received his BA from Amherst College and his PhD from Columbia University. He has written more than 75 reviews, editorials, and other short pieces on philosophy, history, and science for academic journals and for The New York Times, the Wall Street Journal, Nature, Newsday, and more. Crease has also appeared on a range of radio programs, from the BBC to the offbeat Talk Nerdy.

"On behalf of the Institute of Physics, I warmly congratulate all of this year's award winners," said Professor Sheila Rowan, president of the Institute of Physics. "Each and every one of them has made a significant and positive impact in their profession, whether as a researcher, teacher, industrialist, technician or apprentice. Recent events have underlined the absolute necessity to encourage and reward our scientists and those who teach and encourage future generations. We rely on their dedication and innovation to improve many aspects of the lives of individuals and of our wider society."

The Institute of Physics is the professional body and learned society for physics, and the leading body for practising physicists, in the UK and Ireland. The IOP awards celebrate physicists at every stage of their career, from those just starting out to physicists at the peak of their careers and those with a distinguished career behind them. They also recognize and celebrate companies which are successful in the application of physics and innovation, as well as employers who demonstrate their commitment and contribution to scientific and engineering apprenticeship schemes.

Read this article:

Robert Crease Awarded 2021 Institute of Physics William Thomson, Lord Kelvin Medal and Prize | | SBU News - Stony Brook News

Read More..

LMU: Tandem actions in the particle world – India Education Diary

Hopes are high that quantum theory could yield revolutionary applications. Physicist Jasmin Meinecke is working on the physical foundations of particle systems and studying the central phenomenon of entanglement, which is still a mystery to science.

It's not long into the conversation with Jasmin Meinecke that the magic word comes up: entanglement. Without it, nothing seems possible in the quantum world. "It is a mysterious connection between things like photons or atoms," Meinecke explains. "It means that these particles cannot be regarded as separate, even if they are far apart." Albert Einstein once called this invisible force "spooky action at a distance." To this day, it is still not really understood.

Jasmin Meinecke is interested in such strange phenomena. She wants to learn more about a world that existed mainly only in theory for decades, partly because it is so difficult to observe. Quantum theory is considered one of the most powerful physical theories of the 20th century. Although the finer details of many of its peculiar features remain unknown and still puzzle scientists to this day, there has recently been a great deal of interest around potential applications: from ultra-powerful quantum computers that are superior to today's computers, to communication encrypted with quantum cryptography, making it tap-proof, to super-precise quantum sensors that could enable completely new kinds of measuring instruments to become reality. The current euphoria is driven by the fascinating possibilities offered by the quantum world itself: the theory promises dramatic improvements over the classical equipment we know and use today. "Right now we're learning more about what you can actually do with entangled particles," says Meinecke.

Entanglement in a waveguide: Jasmin Meinecke in her laboratory in Garching. (Image: Christoph Hohmann / LMU)

The young physicist is one of many scientists to feel torn between the conflicting priorities of ambitious basic research and the great hopes for practical applications in the near future in areas like quantum computing. Meinecke, who heads a junior research group at the Max Planck Institute of Quantum Optics (MPQ) in Garching, is keen to conduct basic research, and so she is studying photons (light particles) in that context. Yet at the same time, she also wants to keep an eye on possible later applications, for instance by understanding how quantum systems can be used to measure the physical and chemical properties of matter.

Jasmin Meinecke completed her doctorate in Bristol in 2015 before moving to LMU Munich. She is also one of the researchers in the cluster of excellence MCQST (the Munich Center for Quantum Science and Technology), and in January 2020 she was awarded a START Fellowship. This is a program designed to enable excellent postdocs to set up their own project within two years and receive 300,000 euros in funding for it. Meinecke is currently working in LMU physicist Harald Weinfurter's group, where she is using this funding to study open quantum systems, which are systems that interact with their environment, something that scientists normally try to avoid at all costs. That is because such systems are very sensitive and are quickly disturbed by external influences; all it takes is a change in temperature or a vibration.

Comparatively simple experimental setup. (Image: Christoph Hohmann / LMU)

Wave patterns in glass

To conduct her experiments, Meinecke uses photons that she observes under controlled conditions in integrated waveguides. Waveguides are small, unremarkable-looking glass plates with a kind of pattern inscribed inside them; the patterns are predetermined pathways for the light particles to travel along. Various entangled photons move towards and away from each other along these pathways.

Meinecke has sketched such wave patterns in her lab book and on whiteboards in the lab: curved paths, pretty to look at. Located inside the glass, they are not visible from the outside. The aim of these experiments is for the researcher to begin to understand, for example, how the quantum system and its environment exchange information and how that affects the quantum properties within it. "I'm fascinated by how and when quantum properties like coherence or entanglement are lost," says Meinecke. "Quantum properties don't just suddenly disappear. The question is, where does the information go?"

For some years now, researchers have had increasing success with comparatively simple experiments like these as a way to better understand some of the concepts of quantum mechanics, such as superposition and entanglement, and to use them in technological applications. Entanglement is not just one of the most important properties of quantum particles; it is also the central resource for promising quantum technologies.

Meinecke's field, experimental quantum optics, is ultimately based on the ability of scientists to control light, matter, and how they interact to an ever better degree. It's a field in which the Munich cluster of excellence brings together a huge amount of expertise and offers broad-ranging opportunities for cooperation. Meinecke chose photons as an experimental platform partly, she says, because light particles are easy to control, easier than solid-state systems with atoms, which are often very sensitive to their environment and in many cases can only serve as a platform for experiments at extremely low temperatures close to absolute zero. Added to that, photons have many properties that will be needed for future applications. For example, they can be readily transmitted in fiber optic cables like those used in telecommunications, and they are also easy to generate.

Several of the properties of photons can in principle be used for entanglement: the polarization of the light particles, the color of the light (the wavelength, in other words), the energy, the spin. Part of the reason why Meinecke is so excited about the possibilities offered by photons is because, unlike her colleagues who work with ultracold atoms, her experiments can be realized in a small space and she is able to take advantage of advances in the miniaturization of optical components.

The secret life of photons

The setups in her lab are, in fact, astonishingly simple compared to other quantum experiments: a laser, a crystal, a small glass plate, and a little bit of electronics to analyze the experiments. All of it fits into a space the size of a kitchen table. The laser beams its light through a fiber optic cable into a nonlinear crystal, in which the laser's light particles can be used to generate photon pairs that are entangled. Whether an entangled pair is created from a photon coming out of the laser is pure chance.

That's the mysterious bit about the experiment. It is a process that has only a certain probability of working. Once the particles have become entangled, that's when things get exciting for the physicist. The particles can then move along different paths through the waveguides. But what do they really do? Which pathway do they choose? Do they remain entangled? These seemingly simple questions transport you deep into the jungle of the quantum world. Meinecke has a more fundamental way of putting all these questions: What actually happens to the particles when they are not being observed?

For the LMU physicist, this is not just a philosophical question. Observation is an important thing in the quantum world. On the one hand, you don't know anything about what's happening in the waveguide until you observe the particles. On the other hand, the entanglement between the photons is usually lost very suddenly if you only measure one state of a photon, such as its energy.

The peculiarities of the quantum walk

So what can be done? "We're working on weak measurement too," says Meinecke. That means measuring without completely destroying the entanglement. Many research groups are interested in these same topics: fellow scientists at the MPQ recently developed a method for detecting the entanglement of two distant atomic qubits, the quantum stores of information.

The scientists' findings are not always easy for laypeople to grasp. A conversation with Jasmin Meinecke is like taking a high-speed ride through many exotic topics around quantum physics. She talks about Bell states and the peculiarities of the quantum walk, a kind of random walk that particles perform, and of course the integrated waveguides that can be used for a wide variety of applications thanks to the complex structures inside the glass plates. "They can now provide the level of stability and miniaturization we need to build larger experimental arrays," says Meinecke.

High-precision work. (Image: Christoph Hohmann / LMU)

Qubits for quantum computers

Integrated waveguides are an ideal tool for exploring fundamental questions of quantum physics. This is another reason why Meinecke is now keen to test the possibilities of integrated waveguide structures as quantum simulators in several further series of experiments. She says that entangled photons make a good study platform precisely because quantum systems are sensitive to the tiniest of disturbances in the environment. And that is what Meinecke wants to develop measurement techniques for.

She evidently prefers the small experiments to experiments using large-scale lasers, such as those still set up from years past in the Laboratory for Multiphoton Physics in Garching. One of these lasers, a powerful, high-performance machine, has been dubbed Tsunami. Fitted with much more complex attachments, it can be used to generate up to six entangled photons; only a few laboratories are capable of generating more entangled pairs. The current record is twelve. The hope here is that these could be used as qubits for quantum computers. But the effort is immense, especially considering that useful applications would require many times more photons. "China still has research groups that keep reporting new records with more and more entangled particles," she says. This is not for her: "The amount of resources and lab time it consumes is enormous. And it doesn't tell me anything more about physics," she says. A scientist at a university needs to be generating more fundamental insights anyway.

Despite all the current interest in areas like quantum computing, the researcher tries to keep her focus on the fundamentals. Quantum simulators and perhaps, later, quantum computers are not just faster computers that can solve problems like how to optimize delivery routes in logistics or work out the structure and effect of new molecules, says Meinecke. "It's not just a matter of switching over from something like diesel engines to electric powertrains. It's a long road. It's quite simply something totally different, a new way of computing and understanding the world."

The rest is here:

LMU: Tandem actions in the particle world - India Education Diary

Read More..

Simplified quantum computer can be made with off-the-shelf components – New Atlas

Quantum computers could one day blow boring old classical computers out of the water, but so far their complexity limits their usefulness. Engineers at Stanford have now demonstrated a new relatively simple design for a quantum computer where a single atom is entangled with a series of photons to process and store information.

Quantum computers tap into the weird world of quantum physics to perform calculations far faster than traditional computers can handle. Where existing machines store and process information in bits, as either ones or zeroes, quantum computers use qubits, which can exist as one, zero, or a superposition of both one and zero at the same time. That means their power scales exponentially with each added qubit, allowing them to tackle problems beyond the reach of classical computers.
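To make the scaling claim concrete, here is a minimal sketch in Python with NumPy (not tied to any hardware or software mentioned in the article): the state of an n-qubit register is described by 2^n complex amplitudes, which is why each added qubit doubles the amount of classical information needed to describe the machine.

```python
import numpy as np

# A single qubit in an equal superposition of 0 and 1:
# two complex amplitudes describe its state.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

# The joint state of several qubits is the tensor (Kronecker) product of the
# individual states, so the number of amplitudes doubles with each added qubit.
state = plus
for n in range(1, 6):
    print(f"{n} qubit(s): {state.size} amplitudes")
    state = np.kron(state, plus)
```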

Of course, quantum computers bring their own challenges. For one, the quantum effects they run on are sensitive to disturbances like vibration or heat, so quantum computers need to be kept at temperatures approaching absolute zero. As such, their complexity scales with the computing power of the machine, so they become physically larger and more cumbersome as more processing power is added.

But the Stanford team says their new design is deceptively simple. It's a photonic circuit made using a few components that are already available (a fiber optic cable, a beam splitter, two optical switches, and an optical cavity), and it can reduce the number of physical logic gates needed.

"Normally, if you wanted to build this type of quantum computer, you'd have to take potentially thousands of quantum emitters, make them all perfectly indistinguishable, and then integrate them into a giant photonic circuit," says Ben Bartlett, lead author of the study. "Whereas with this design, we only need a handful of relatively simple components, and the size of the machine doesn't increase with the size of the quantum program you want to run."

The new design is made up of two main parts: a ring that stores photons, and a scattering unit. The photons represent qubits, with the direction that they travel around the ring determining whether their value is a one or a zero (or both, if a photon travels in both directions at once, thanks to the quirks of quantum superposition).

To encode information on the photons, the system can direct them out of the ring into the scattering unit, where they enter a cavity containing a single atom. When the photon interacts with the atom, they become entangled, a quantum state where the two particles can no longer be described separately, and changes made to one will affect its partner, no matter how large a distance separates them.

In practice, after the photon is returned to the storage ring, it can be written to by manipulating the atom with a laser. The team says that the one atom can be reset and reused, manipulating many different photons in the one ring. That means the quantum computer's power can be scaled up by adding more photons to the ring, rather than needing to add more rings and scattering units.

"By measuring the state of the atom, you can teleport operations onto the photons," says Bartlett. "So we only need the one controllable atomic qubit and we can use it as a proxy to indirectly manipulate all of the other photonic qubits."

Importantly, this system should be able to run a variety of quantum operations. The team says that different programs can be run on the same circuit, by writing new code to change how and when the atom and photons interact.

"For many photonic quantum computers, the gates are physical structures that photons pass through, so if you want to change the program that's running, it often involves physically reconfiguring the hardware," says Bartlett. "Whereas in this case, you don't need to change the hardware; you just need to give the machine a different set of instructions."

Better still, photonic quantum computer systems can operate at room temperature, removing the bulk added by the extreme cooling systems.

The research was published in the journal Optica.

Source: Stanford University

Original post:

Simplified quantum computer can be made with off-the-shelf components - New Atlas

Read More..

Spectral Lines From A Dying Nation: The Molecular Spectrometry Of Hertha Sponer – Women You Should Know

It is hard to imagine a time and place outside of Charles Dickens's Revolutionary France that more embodies the spirit of the Best of Times and Worst of Times than early 20th century Göttingen. At its best, in the 1920s, it boasted a heady selection of the planet's greatest mathematicians and physicists, clustered together to rewrite entirely the rules for atomic and molecular structure and behavior. At its worst, in the 1930s, it gave itself over to the anti-Semitic ravings of the rising Nazi party, and tore its magnificence to tatters in the name of party ideology and racial purity. The list of those who directly experienced those high highs and low lows reads like a roll call of the modern scientific pantheon: Max Born, James Franck, Emmy Noether, Maria Goeppert-Mayer, Richard Courant, Edward Teller, and Germany's second ever woman to hold a professorship in physics, Hertha Sponer.

Historically, Sponer's story has tended to get lost in the deep shadow of her contemporary Lise Meitner, whose tale features many of the same contours as that of Sponer, but with both the triumphs and tragedies amplified. We have decided which woman physicist victimized by the Nazi era and the systemic gender discrimination of early 20th century Germany we have chosen to remember, and it's Meitner, thank you very much. And that's rather too bad, because not only did Hertha Sponer do some gorgeous physics over the course of four personally challenging decades, but her story also highlights the rapid social change occurring in the Western world in the early 20th century, and how a professional and highly educated woman was able to successfully navigate that evolving culturescape.

Hertha Dorothea Elisabeth Sponer was born, the eldest of five children, on September 1, 1895, in the small east German (currently Polish) town of Neisse. Her family was of a Protestant background that highly valued education, and in her elementary school Sponer was exposed to an impressively broad array of subjects, including French, German, geography, natural science, writing, singing, arithmetic, religion, gymnastics, and needlework. Somewhere around the year 1906, her family moved to the town of Zittau, which still remains, if just barely, within the borders of Germany.

It was, for Sponer, an unfortunate move. The school in Zittau was not as advanced as Neisse's, and Sponer believed that, between the slow pace and unchallenging material, she would never learn enough at Zittau to be able to take the Abitur, the competitive exam for the German university system, where her ambitions lay. As such, she downgraded her expectations from a life of scientific research to that of a workaday private teacher, and attended a governess training school, from which she graduated in 1913.

She worked as a governess for two years, and in the middle of World War I took up a substitute teaching appointment. If you've ever acted as a substitute teacher, you won't be surprised by the fact that she did not remain long in that generally thankless profession and, in 1916, she made the decision to make up for lost time and attend a cram school for the Abitur. She worked hard, mastered the two-year condensed curriculum in one year, and took the exam in March of 1917, excelling in its physics portion, and becoming one of only 570 women that year to pass. Her parents had expected, after the Abitur, that she would take up a position as a teacher at a girls secondary school, but after having read in a newspaper story about Sponer's performance how remarkable it was to have passed the exam with so few years of preparation, they wrote to her proudly giving their full permission for her to undertake whatever university studies she liked.

Sponer began her university career at Tübingen University in May of 1917, but was quickly frustrated by the lack of theoretical physics courses on offer, or indeed any physics beyond an experimental methods course and a lab section. Within a year, she was preparing to transfer, and by April of 1918 she was settling in at the University of Göttingen. Here, she came to the notice of Peter Debye, a future Nobel Prize winner, when she complained to one of his teaching assistants about the low difficulty level of the labs he had assigned, and the poor quality of the equipment allotted for the students' use. She was told to go to Debye's office and present her criticisms in person, which, amazingly, she did, and even more amazingly, which Debye responded positively to, setting her on a path of independent research which culminated in her writing her dissertation with him.

After only four semesters of study at Göttingen, Sponer took the oral examinations for her PhD in March of 1920. Ordinarily, she would have taken longer to learn more material, but Debye was planning on leaving the university shortly, and in his absence, her examination would have fallen to Robert Pohl, who was notoriously against the higher education of women, and might therefore have done his level best to stack the examination against her. Ultimately, she passed her orals with an overall mark of "Very Good."

Rather than remaining at Göttingen, where the focus was on theoretical physics, Sponer decided to find an institute where she could perfect her experimental technique, and Debye fatefully recommended that she go to Berlin and the Kaiser Wilhelm Institute, where Fritz Haber (the Nobel laureate we have met before in the tragic and sinister guise of Clara Immerwahr's husband) was her supervisor and, more monumentally, where she met a young man named James Franck, thirteen years her senior, who would play a pivotal role in her life for the next four decades.

Before World War I, Franck had become famous for his work in verifying an early quantum prediction about how electrons of certain energies interact with atoms and now, after a brief stint as a soldier in the German army, he was returning to his research. Sponer's first paper from her time at the Institute was an extension of this work. Sponer and Franck worked well together, and in 1921, when Franck was made the head of an experimental physics institute in Göttingen created by his friend Max Born, he took Sponer along with him as his Assistentin.

A gregarious and inspiring teacher and leader, Franck created a legendary intellectual atmosphere in Göttingen that set aside the rigid formalities of the usual German university system. Here it was that the new quantum physics was applied to the interpretation of atomic and molecular structures, and where the experiments were performed that dramatically supported even the most extreme predictions of the new paradigm.

To begin with, Sponer continued her work in measuring how often electron collisions with atoms induce the electrons in those atoms to undergo a quantum leap. It was not until 1924 that she pushed off into the research that would cement her name in the global physics community, the study of molecular spectra.

So, what is a molecular spectrum? You might remember from your high school chemistry courses that, when a photon of light of a particular frequency is absorbed by a lone atom, it can take that energy and use it to promote one of its electrons to a higher energy level. Since most atoms have many different electrons, there are many different promotions that can take place, each of a very particular energy. Those energies are so specific that, just by looking at the energies absorbed by an atom, or those emitted by one, you can identify what the atom is. An atom's absorption spectrum, which is a tally of all the different energies absorbed by it, acts for scientists like a tell-tale fingerprint.

When you send photons of energy at something containing more than a single atom, however, the story of what is absorbed and what that energy is used for becomes much more complicated, and molecular spectra in the early 20th century were correspondingly much harder to interpret than atomic spectra. Not only can molecules (consisting of two or more atoms bonded together) use incoming energy to promote electrons to higher energy orbits, but they can also use it to change how their atoms vibrate relative to one another, AND how they rotate around different possible axes. The more atoms there are in a molecule, the more possible vibrational states there will be, and the more asymmetric the molecule is, the more likely it is that the molecule's rotational states will require distinctly different amounts of energy, all summing to a rich cornucopia of absorbed energies which it was Sponer's task to untangle.

What Sponer did as a physicist throughout the 1920s, and up through the 1960s, was to take the complicated patterns produced by gaseous molecules exposed to different energies of light, and interpret them employing the theoretical mathematical models emerging from the great modern physics capitals of Berlin, Göttingen, and Copenhagen. This branch of physics, called molecular spectroscopy, was among our most powerful techniques for determining the bond lengths between the different atoms of a molecule, and the angles between them. By knowing what different energies were required to induce different types of rotation, for example, you could determine the moments of inertia around those different axes, and once you know that, coupled with the masses of the atoms involved, you can start mapping out the lengths of all the bonds and thus create geometrical pictures of increasingly complicated molecular structures.
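As a concrete illustration of the kind of inversion described above, here is a minimal sketch in Python (the helper name and the carbon monoxide numbers are illustrative textbook values, not data from the article): a measured rotational constant gives the moment of inertia, and the moment of inertia plus the atomic masses gives the bond length of a diatomic molecule.

```python
import math

# Physical constants (SI units, with c in cm/s since rotational constants are in cm^-1)
h = 6.62607015e-34      # Planck constant, J*s
c = 2.99792458e10       # speed of light, cm/s
amu = 1.66053907e-27    # atomic mass unit, kg

def bond_length(B_cm, m1_amu, m2_amu):
    """Bond length (m) of a rigid diatomic from its rotational constant B (cm^-1).

    B = h / (8 * pi^2 * c * I) and I = mu * r^2, where mu is the reduced mass,
    so r = sqrt(h / (8 * pi^2 * c * mu * B)).
    """
    mu = (m1_amu * m2_amu) / (m1_amu + m2_amu) * amu   # reduced mass, kg
    I = h / (8 * math.pi**2 * c * B_cm)                # moment of inertia, kg*m^2
    return math.sqrt(I / mu)

# Illustrative values for CO: B ~ 1.93 cm^-1, atomic masses 12 and 16 amu
r = bond_length(1.93, 12.0, 16.0)
print(f"CO bond length ~ {r * 1e10:.2f} angstrom")     # ~1.13 angstrom
```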

On the strength of her work in molecular spectroscopy, Sponer received her Habilitation in 1925, which brought with it the ability to lecture at the university level. In the 25 years from 1908 to 1933, of the 10,595 women in Germany who obtained their doctorate, only 54 received a Habilitation, less than half of whom went on to become professors. Predictably, Robert Pohl was vehemently against the appointment, but relented when Franck promised that, should he ever leave the university, Sponer would as well.

That seemed an unlikely possibility in that banner year of 1925, however, and soon after receiving her Habilitation, Sponer also received a Rockefeller grant to study at the University of California, Berkeley, for a year, where she wished to learn their highly successful techniques in the field of vacuum ultraviolet spectroscopy. She learned a great deal about science while there (and developed a technique with Raymond Birge for using measured vibrational frequencies to calculate the energy required to break a molecule apart, known today as the Birge-Sponer Method), and the development of her English language skills would prove to be deeply important to her in the lean years ahead, but it was also an unfortunate time to leave Göttingen, as 1925 saw an explosion of quantum mathematical progress there, including Heisenberg's landmark paper producing a matrix mechanical explanation of electron orbitals and quantum transitions. It was also the year that James Franck won the Nobel Prize for Physics.
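For readers curious what the Birge-Sponer method looks like in practice, here is a minimal sketch in Python (the function name and the vibrational spacings are made up for illustration, not real measurements): successive vibrational energy spacings are fit to a straight line against the vibrational quantum number, and the area under that line out to where the spacings vanish approximates the dissociation energy.

```python
import numpy as np

def birge_sponer_dissociation(v, delta_G):
    """Estimate the dissociation energy (same units as delta_G) by linear extrapolation.

    v       : vibrational quantum numbers
    delta_G : observed spacings G(v+1) - G(v) at those v
    Fits delta_G against (v + 1/2), extrapolates to delta_G = 0, and returns the
    triangular area under the fitted line, the classic Birge-Sponer estimate.
    """
    x = np.asarray(v) + 0.5
    slope, intercept = np.polyfit(x, np.asarray(delta_G), 1)
    x_zero = -intercept / slope            # where the spacings reach zero
    return 0.5 * intercept * x_zero        # area of the triangle under the line

# Made-up example: spacings (cm^-1) that shrink linearly with v
v = np.arange(6)
delta_G = 2000 - 40 * (v + 0.5)
print(f"Estimated dissociation energy ~ {birge_sponer_dissociation(v, delta_G):.0f} cm^-1")
```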

The late 1920s were an era of mounting professional respect and international reputation for Sponer, and in 1930 she was commissioned by the prestigious German science publishing house Springer to produce a book on molecular structure. That work would have to wait, however, as in January of 1933 the Nazi party came to power in Germany, and on April 7 promulgated the Law for the Restoration of the Professional Civil Service whereby all Jews serving in universities were to be removed from their positions, with the exception of World War I veterans.

James Franck was half-Jewish, but as a war veteran he could have maintained his position at Göttingen. Courageously, however, he elected to renounce his position in protest of the treatment of Jewish intellectuals in Germany. The decision made Franck a hero of the international intellectual community, but left Sponer in an awkward position. Classified as an Aryan by the Nazi state, she could have kept her position, but with Franck gone, Pohl expected her to leave in accord with Franck's 1925 promise.

Even if Pohl had been willing to keep Sponer on at Göttingen, in May of 1933 the German state passed the Law for the Modification of Regulations Concerning Civil Servants, Their Salaries, and Benefits, which restricted women's ability to advance in their professions and downgraded their pay scale, sending a clear indication of things to come for women at work in government. Sponer saw that her position at Göttingen was fast becoming untenable, and started making plans for emigration, either to Norway or the United States. In 1934, she moved to Oslo, where she would spend the next two years trying to get the University of Oslo on some kind of footing in the world of experimental physics. Though not the dismal situation that Lise Meitner found herself in when she moved to the University of Stockholm in 1938, it was nonetheless soon clear to her that this was not the place to continue her life's work, and in 1936 she accepted an appointment at Duke University.

At Duke, Sponer had the funding and administrative support she needed to build a new department that married the best of European experimental techniques with the best of American educational principles. As the situation in Germany deteriorated for scientists, she became active in the émigré community, organizing resources to bring as many imperilled scientists and their families to the United States as possible, thereby rescuing some of the continent's greatest minds from the calamities to come. During World War II, she worked in the V-12 Program, training Navy and Marine officer candidates in accelerated education programs, while sporadically continuing her work on the spectroscopy of aromatic compounds (molecules composed of rings of atoms, like benzene).

In 1946, Hertha Sponer married James Franck, who had come to America the year before she had, and who had worked on the Manhattan Project during the war on the condition that he would be able to directly advise the government against the bomb's actual use should they succeed in building it. His wife of 35 years had died in 1942, and in the ensuing years he sank deeper into ill health and depression. The marriage of Sponer and Franck in 1946 was a tonic to both, pulling them from their respective loneliness and heartbreak over the state of their home country, and giving them both something to look forward to in the future.

Franck remained at the University of Chicago, while Sponer stayed at Duke, so the newly married pair saw each other primarily during vacations and scientific conferences, which made their time together all the more precious. Franck died in 1964, by which point Sponer was suffering from an Alzheimer's-like form of dementia, which her friends and loyal students increasingly had to compensate for. At Franck's funeral, she was unable to deliver a brief eulogy, and one year later, in 1965, Hertha Sponer retired from Duke University after nearly three decades of service in which she built the experimental physics program up from the ground to become one of the nation's leading centers of molecular investigation. She was brought back to Germany by her family in 1966, declared incapable of independent self-care in 1967, and died in a sanatorium in Ilten on February 17, 1968.

FURTHER READING: Marie-Ann Maushart's 1997 Um nicht zu vergessen: Hertha Sponer - ein Frauenleben für die Physik im 20. Jahrhundert was translated in 2011 by Ralph A. Morris and is the source to have for the career of Hertha Sponer; it also has a good deal of highly interesting information about women's evolving place in the German university system in the early 20th century.

Lead image credit: Hertha Sponer included in the photo "Physikalische Institut der Universität Göttingen anlässlich der Franckfeier 1923" by GFHund (Friedrich Hund), CC BY-SA 4.0, via Wikimedia Commons

See the original post here:

Spectral Lines From A Dying Nation: The Molecular Spectrometry Of Hertha Sponer - Women You Should Know

Read More..

Data Mining in Business Analytics – Online College | WGU

Simply put, data mining is the process that companies use to turn raw data into useful information. They utilize software to look for patterns in large batches of data so they can learn more about customers. It pulls out information from data sets and compares it to help the business make decisions. This eventually helps them to develop strategies, increase sales, market effectively, and more.

Data mining sometimes gets confused with machine learning and data analysis, but these terms are all very different and unique.

While both data mining and machine learning use patterns and analytics, data mining looks for patterns that already exist in data, while machine learning goes beyond that to predict future outcomes based on the data. In data mining, the rules or patterns aren't known from the start. In many cases of machine learning, the machine is given a rule or variable to understand the data. Additionally, data mining relies on human intervention and decisions, but machine learning is meant to be started by a human and then learn on its own. There is quite a bit of overlap between data mining and machine learning; machine learning processes are often utilized in data mining in order to automate those processes.

Similarly, data analysis and data mining aren't interchangeable terms. Data mining is used in data analytics, but they aren't the same. Data mining is the process of getting the information from large data sets, and data analytics is when companies take this information and dive into it to learn more. Data analysis involves inspecting, cleaning, transforming, and modeling data. The ultimate goal of analysis is discovering useful information, informing conclusions, and making decisions.

Data mining, data analysis, artificial intelligence, machine learning, and many other terms are all combined in business intelligence processes that help a company or organization make decisions and learn more about their customers and potential outcomes.
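As a rough illustration of the distinction drawn above, here is a short sketch using scikit-learn and a toy, made-up customer dataset (none of the names or numbers come from the article): unsupervised clustering is closer to data mining, where patterns are discovered without predefined labels, while a supervised classifier is closer to machine learning, where a model learns from labelled history to predict future outcomes.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Toy customer data: columns are [annual spend, store visits per month]
spend = np.vstack([rng.normal([200, 2], [30, 1], (50, 2)),
                   rng.normal([900, 8], [60, 1], (50, 2))])

# "Data mining" flavour: discover groupings that already exist in the data.
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spend)
print("Discovered segment sizes:", np.bincount(segments))

# "Machine learning" flavour: learn from labelled history to predict the future.
churned = (spend[:, 1] < 4).astype(int)            # made-up historical labels
model = LogisticRegression().fit(spend, churned)
print("Predicted churn for a new customer:", model.predict([[500, 3]])[0])
```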

Go here to read the rest:

Data Mining in Business Analytics - Online College | WGU

Read More..

Data Mining – Applications & Trends


Data mining is widely used in diverse areas. There are a number of commercial data mining systems available today, and yet there are many challenges in this field. In this tutorial, we will discuss the applications and trends of data mining.

Here is the list of areas where data mining is widely used:

The financial data in the banking and financial industry is generally reliable and of high quality, which facilitates systematic data analysis and data mining. Some of the typical cases are as follows:

Design and construction of data warehouses for multidimensional data analysis and data mining.

Loan payment prediction and customer credit policy analysis.

Classification and clustering of customers for targeted marketing.

Detection of money laundering and other financial crimes.

Data mining has great applications in the retail industry because retailers collect large amounts of data on sales, customer purchasing history, goods transportation, consumption, and services. It is natural that the quantity of data collected will continue to expand rapidly because of the increasing ease, availability, and popularity of the web.

Data mining in the retail industry helps in identifying customer buying patterns and trends, which lead to improved quality of customer service and good customer retention and satisfaction. Here is a list of examples of data mining in the retail industry:

Design and Construction of data warehouses based on the benefits of data mining.

Multidimensional analysis of sales, customers, products, time and region.

Analysis of effectiveness of sales campaigns.

Customer Retention.

Product recommendation and cross-referencing of items.

Today the telecommunication industry is one of the fastest-growing industries, providing various services such as fax, pager, cellular phone, internet messenger, images, e-mail, web data transmission, etc. Due to the development of new computer and communication technologies, the telecommunication industry is rapidly expanding. This is the reason why data mining has become very important in helping to understand the business.

Data mining in the telecommunication industry helps in identifying telecommunication patterns, catching fraudulent activities, making better use of resources, and improving quality of service. Here is a list of examples for which data mining improves telecommunication services:

Multidimensional Analysis of Telecommunication data.

Fraudulent pattern analysis.

Identification of unusual patterns.

Multidimensional association and sequential patterns analysis.

Mobile Telecommunication services.

Use of visualization tools in telecommunication data analysis.

In recent times, we have seen tremendous growth in the field of biology, in areas such as genomics, proteomics, functional genomics, and biomedical research. Biological data mining is a very important part of bioinformatics. Following are the aspects in which data mining contributes to biological data analysis:

Semantic integration of heterogeneous, distributed genomic and proteomic databases.

Alignment, indexing, similarity search, and comparative analysis of multiple nucleotide sequences.

Discovery of structural patterns and analysis of genetic networks and protein pathways.

Association and path analysis.

Visualization tools in genetic data analysis.

The applications discussed above tend to handle relatively small and homogeneous data sets for which statistical techniques are appropriate. Huge amounts of data have been collected from scientific domains such as geosciences, astronomy, etc. Large data sets are also being generated because of fast numerical simulations in various fields such as climate and ecosystem modeling, chemical engineering, fluid dynamics, etc. Data mining techniques are increasingly being applied to these scientific domains as well.

Intrusion refers to any kind of action that threatens the integrity, confidentiality, or availability of network resources. In this world of connectivity, security has become a major issue. Increased usage of the internet and the availability of tools and tricks for intruding into and attacking networks have prompted intrusion detection to become a critical component of network administration. Here is a list of areas in which data mining technology may be applied for intrusion detection:

Development of data mining algorithm for intrusion detection.

Association and correlation analysis, aggregation to help select and build discriminating attributes.

Analysis of Stream data.

Distributed data mining.

Visualization and query tools.

There are many data mining system products and domain specific data mining applications. The new data mining systems and applications are being added to the previous systems. Also, efforts are being made to standardize data mining languages.

The selection of a data mining system depends on the following features:

Data Types: The data mining system may handle formatted text, record-based data, and relational data. The data could also be ASCII text, relational database data, or data warehouse data. Therefore, we should check what exact formats the data mining system can handle.

System Issues: We must consider the compatibility of a data mining system with different operating systems. One data mining system may run on only one operating system or on several. There are also data mining systems that provide web-based user interfaces and allow XML data as input.

Data Sources: Data sources refer to the data formats on which the data mining system will operate. Some data mining systems may work only on ASCII text files, while others work on multiple relational sources. A data mining system should also support ODBC connections or OLE DB for ODBC connections.

Data Mining Functions and Methodologies: Some data mining systems provide only one data mining function, such as classification, while others provide multiple functions, such as concept description, discovery-driven OLAP analysis, association mining, linkage analysis, statistical analysis, classification, prediction, clustering, outlier analysis, similarity search, etc.

Coupling Data Mining with Databases or Data Warehouse Systems: Data mining systems need to be coupled with a database or a data warehouse system, so that the coupled components are integrated into a uniform information processing environment. The degree of coupling can range from no coupling and loose coupling to semi-tight and tight coupling.

Scalability: There are two scalability issues in data mining:

Row (Database Size) Scalability: A data mining system is considered row scalable if, when the number of rows is enlarged 10 times, it takes no more than 10 times as long to execute a query.

Column (Dimension) Scalability: A data mining system is considered column scalable if the mining query execution time increases linearly with the number of columns.

Visualization Tools: Visualization in data mining can be categorized as data visualization, mining results visualization, mining process visualization, and visual data mining.

Data Mining Query Language and Graphical User Interface: An easy-to-use graphical user interface is important to promote user-guided, interactive data mining. Unlike relational database systems, data mining systems do not share an underlying data mining query language.

Data mining concepts are still evolving. Here are the latest trends that we get to see in this field:

Application Exploration.

Scalable and interactive data mining methods.

Integration of data mining with database systems, data warehouse systems and web database systems.

Standardization of data mining query language.

Visual data mining.

New methods for mining complex types of data.

Biological data mining.

Data mining and software engineering.

Web mining.

Distributed data mining.

Real time data mining.

Multi database data mining.

Privacy protection and information security in data mining.


See the rest here:

Data Mining - Applications & Trends

Read More..

3 Cryptomining Stocks to Profit From The Bitcoin Boom – Motley Fool

It's hard to ignore the signs that Bitcoin (CRYPTO:BTC) is gaining mainstream acceptance. Retailers from cinema chains to grocery stores and others are starting to accept cryptocurrencies for payments. According to a Blockdata report, Bitcoin processed 62% more transactions by dollar value than PayPal (NASDAQ:PYPL) in 2021. Now, investors can benefit from bitcoin's potential without ever owning a coin directly, thanks to bitcoin mining stocks.

Among the multiple bitcoin miners, we will look at three in particular: Argo Blockchain (NASDAQ:ARBK), Hut 8 Mining (NASDAQ:HUT), and Riot Blockchain (NASDAQ:RIOT).


We'll look at each company's operations, efficiency, potential growth, and a few intangibles. The company with the highest score at the end should take the pole position for investors to do further research.

The first comparison we'll make between the three companies is their current in-service mining power, measured by exahashes per second. More power means faster calculations. Since the reward for bitcoin mining is based on solving a complex mathematical problem, the faster to the finish line, the better.

Company             Mining Rate (EH/s)
Argo Blockchain     1.3
Hut 8 Mining        1.7
Riot Blockchain     2.8

(Data source: Argo, Hut 8, Riot financials. Table by author.)

Since the Argo figure comes from the end of October, and both Hut 8 and Riot are from the middle of November, we know this is a current comparison. Riot scores a 3, Hut 8 gets a 2, and Argo gets a 1.
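As a rough sketch of why raw hash rate matters, here is a toy proof-of-work loop in Python (hugely simplified compared with real Bitcoin mining, and the helper name is mine): miners repeatedly hash candidate data until the hash meets a difficulty target, so whoever can attempt more hashes per second wins the race to a valid block more often.

```python
import hashlib

def mine(block_data: str, difficulty_zeros: int = 4) -> tuple[int, str]:
    """Search for a nonce whose SHA-256 hash starts with the required number of zeros."""
    target = "0" * difficulty_zeros
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("toy block header")
print(f"Found nonce {nonce}: {digest}")
```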

A current leader is important, but it's at least as important for investors to know who might lead in the future. Each mining company is forthright about what machines they are buying and what they expect their hash rate to be in the future.

(Source: Argo, Hut 8, Riot earnings. Chart by author.)

At first, this appears to be a decisive victory for Riot. However, investors need to remember that each company's growth rate must also be considered.

Company             EH/s One-Year Growth
Argo Blockchain     185.7%
Hut 8 Mining        164.7%
Riot Blockchain     207.1%

(Data source: Argo, Hut 8, Riot earnings reports. Table by author.)

Riot comes out ahead in both growth percentage and its final predicted EH/s rate. However, Argo actually expects to grow its mining rate faster than Hut 8. For this calculation, Riot scores a 3, Argo gets a 2, and Hut 8 gets a 1.
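For readers who want to see where these growth figures come from, they follow from a simple ratio of planned to current hash rate. The sketch below (the function name is mine) uses Argo's current 1.3 EH/s and its planned 3.7 EH/s, the only planned figure quoted in the text; the small gap versus the reported 185.7% presumably reflects rounding in the published EH/s numbers.

```python
def hash_rate_growth(current_ehs: float, planned_ehs: float) -> float:
    """One-year growth in mining power, as a percentage."""
    return (planned_ehs / current_ehs - 1) * 100

# Argo: 1.3 EH/s today, 3.7 EH/s planned (per the article)
print(f"Argo growth: {hash_rate_growth(1.3, 3.7):.1f}%")   # ~184.6%, vs 185.7% reported
```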

With the price of bitcoin over $50,000, it's easy for investors to forget about the cost each miner pays to generate their profits. The most efficient miner makes more money per bitcoin, so potential shareholders should pay close attention to this metric as well. Mining companies use average cost per bitcoin mined as a point of comparison.

Company             Average Cost per Bitcoin Mined
Argo Blockchain     $6,393
Hut 8 Mining        $17,700
Riot Blockchain     $10,096

(Data source: Argo, Hut 8, Riot earnings reports. Table by author.)

In a somewhat surprising twist, the company with the smallest existing and planned EH/s rate is also the most efficient. Argo can post such a low cost partly thanks to the company's focus on renewable energy sources. The company's facility in Quebec, Canada, primarily uses hydroelectric power, while its site in Texas takes advantage of the area's extensive wind and solar power. Going green for Argo's power needs is also providing more green for the company's bottom line. This metric gives Argo a 3, Riot a 2, and Hut 8 a 1.
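To see what those cost figures imply, here is a trivial sketch of gross margin per mined coin at an assumed bitcoin price of $50,000 (the article only says the price is above that level, so treat the output as illustrative).

```python
costs = {"Argo Blockchain": 6_393, "Hut 8 Mining": 17_700, "Riot Blockchain": 10_096}
btc_price = 50_000  # assumed; the article says only that bitcoin trades "over $50,000"

for miner, cost in costs.items():
    margin = btc_price - cost
    print(f"{miner}: ~${margin:,} gross margin per bitcoin mined")
```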

There are a few other considerations for investors that might not be obvious without some digging. Argo seems to have some catching up to do. The company has the smallest starting and smallest planned EH/s rates. However, Argo has begun construction of a new bitcoin mining facility it calls Helios, in Dickens County, Texas, that could change the company's future.

The Helios facility will ultimately have access to up to 800 megawatts of power. If Argo were to use this entire amount, its Helios facility would become the largest bitcoin mining facility in North America. A power source this size would allow Argo to go far beyond the 3.7 EH/s that it has planned over the next year or so. This massive potential gives Argo a score of 3.

Hut 8 has two intangibles that investors should keep an eye on. First, the company has 10,000 highly efficient GPUs by NVIDIA (NASDAQ:NVDA) that are being installed. These units are mining at a total cost of under $3,000 per bitcoin. Given that Hut 8 has the highest average cost per bitcoin of the three companies, this should help improve efficiency. Second, the mining company's CFO, Shane Downey, said, "We have already surpassed our goal of 5,000 bitcoin held in reserve." This is the largest amount of the three companies and could provide additional potential profits if the price of bitcoin continues to rise. The company's huge bitcoin reserve gives it a score of 2.

Where Riot is concerned, the company presently uses about 200 megawatts of its power to mine bitcoin for other institutions. Riot runs the machines, pays out bitcoin to the other companies, and collects a percentage, plus enough to cover its costs. This decision technically lowers Riot's risk: It will cover its costs, and receive a profit, even if the price of Bitcoin declines. However, using 200 megawatts of the 750 megawatts at North America's largest bitcoin mining facility means nearly 27% of Riot's largest asset is working for someone else. The inability to directly mine with 200 megawatts of power means Riot scores a 1 for this section.

The tally shows Riot scores the highest, based primarily on its current and projected lead in mining power. The company's lead in these two segments gave Riot half of its points.

Company             Total Score
Argo Blockchain     10
Hut 8 Mining        8
Riot Blockchain     12

(Data source: article scores. Table by author.)

Each of these companies has the potential to be a good investment, with the obvious caveat that their values will be closely tied to the price of bitcoin. If you believe that bitcoin will continue to rise in the future, but want an alternative to holding the coin, this initial scorecard suggests starting your research with Riot Blockchain.

This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.

Go here to read the rest:

3 Cryptomining Stocks to Profit From The Bitcoin Boom - Motley Fool

Read More..

Adventus Mining Announces Commencement of Drilling on Rathkeale Property and Initial Results of Kingscourt Drill Program Under The South32 Earn-In…

TORONTO, Dec. 3, 2021 /CNW/ - Adventus Mining Corporation ("Adventus")(TSXV: ADZN) (OTCQX: ADVZF) is pleased to announce the commencement of exploration scout hole drilling on its Rathkeale block in County Limerick in the Republic of Ireland. The work is being done under the earn-in agreement with a wholly owned subsidiary of South32 Limited ("South32"), which has a right to acquire a 70% interest in the Kingscourt, Rathkeale and Fermoy blocks, which are 100% owned by Adventus through its wholly owned subsidiary, Adventus Zinc Ireland Limited ("AZIL"). These three project areas encompass 1,277 km2 of favourable strata known to host Irish-type zinc-lead-silver systems. South32 is required to fund EUR 3,500,000 in exploration on the three blocks over a four-year period with AZIL acting as operator during the earn-in period (see January 13, 2020, news release).

Highlights:

Rathkeale Block Geology

The Rathkeale Block consists of a total of eight contiguous prospecting licences covering an area of approximately 255 km2 of prospective ground for base of Waulsortian Zn-Pb mineralization in west County Limerick. The block lies immediately west of Glencore's Tobermalug deposit (1) (45.4 Mt inferred mineral resources grading 7% zinc and 1% lead), and Group Eleven's Pallas Green West and Stonepark projects. The rocks underlying the Rathkeale Block are primarily Lower Carboniferous in age. The geological history of the area progressively records the early Carboniferous marine transgression across the Old Red Sandstone continent, development of Viséan basin and shelf palaeoenvironments, and later deposition of the late Namurian strata of the Newcastle West area. Historical drilling and geological interpretation suggest that the Rathkeale syncline is a complex of half graben structures associated with a series of ENE trending, Early Carboniferous extensional faults. Thick breccias and conglomerates along with associated alteration including dolomitization and black matrix breccia have been intersected in historical drilling on this zone. This polymict sequence of Rathkeale Limestone and Waulsortian Mudbank clasts indicates large scale faulting following the initial deposition of the Waulsortian. Regional extension was accompanied by gravitational collapse and excavation of footwall scarps during the Chadian to Arundian, a similar age to that recognized in the north Dublin Basin and, more significantly, coeval with development of the Boulder Conglomerate at Navan.

Rathkeale Block Target Refinement

Over the last year, the target generation compilation was updated to include all the new exploration data collected over the Rathkeale Block. Large datasets for surficial geochemistry (including soil and lithogeochemical samples), historical drilling, historical geophysical studies (gravity, magnetics, electromagnetic), and geological mapping were incorporated and developed into working models. Each target that was developed required rock exposures to be checked and mapped; in high-priority areas, a combination of geochemical and geophysical techniques (ionic leach soils, SGH, ground MT) was also applied to further refine targets for drilling.

To visualize the Rathkeale geological and structural framework, detailed cross-sections were constructed and digitized using historical drill hole and recent seismic information. These cross-sections were then added to a Leapfrog Geo 3D model developed for visualization of all target generation initiative ("TGI") datasets. This 3D model allows for enhanced target selection in the 3D environment, notably for drill hole planning. A key component of the Phase 1 scout hole drilling at Rathkeale is the refinement and verification of the current geological and structural interpretation, which will lead to the overall enhancement of target evaluation. The seven Phase 1 scout hole drill targets were selected based on a combination of pre-existing targeting utilising 2017 seismic data, updated structural-stratigraphic targeting, historical mineral occurrences lacking a base-of-Waulsortian test, and current geochemical-geophysical targeting.

The first drill hole in the program, collared in late November 2021, is drilling on the Killeen target in licence PLA 3368. The Killeen target is in the hanging wall of the GB fault, close to an original 2018 seismic target (Attyflin). This scout hole is aimed at an area of anomalous LGC (hydrothermal pyrite and barite signature) and at a conductive feature identified in the ground MT survey in the hanging wall of a key structure on the block, while also providing essential geological context. The siting of the hole, 1.5 km from the seismic line ADV17-01, will allow the collection of downhole velocity data, which is the next key step in the interpretation of seismic data. Additionally, the hole will act as a test of the MT method as a subsurface interpretation tool in this environment.

Kingscourt Drilling Update

Two holes, 21-3609-01 and 21-3732-01, were completed in mid-2021 on the Marl Hill (Julianstown) and Marvelstown targets, respectively, as described in the May 6, 2021, news release. Although only trace levels of zinc-lead mineralization were intersected in each drill hole, the key target horizon, known as the Pale Beds, was present and well developed, which confirmed the geological modelling. The presence of slumped sedimentary breccias at Marl Hill in drill hole 21-3609-01 further enhances the prospectiveness of the target area.

The pXRF data collected on drill core indicates elevated base metal concentrations in the Micrite Unit, which appears to be the most prospective assemblage intersected in 21-3732-01. The pXRF data also indicates a greater degree of hydrothermal alteration in this part of the basin, denoted by hydrothermal dolomite signatures (with enrichments in manganese, iron, and magnesium). The profiles of the Pale Beds in 21-3732-01 show a broad pattern of mineralization and alteration, with a galena-dominant signature towards the base of the Pale Beds and sphalerite mineralization at a higher elevation. Both weakly mineralized zones are interpreted as fault-controlled mineralization, with the lower zone being more sulphide deficient (lead and barium enriched), whereas the higher zone appears to have formed in a more reduced environment, leading to low-arsenic pyrite, low-cadmium sphalerite, and more sphalerite relative to galena, with associated dolomitization.

With these two drill holes, only a very small portion of the prospective area from the Kingscourt Fault east along the broad hanging wall of the Ardee Moynalty Fault Zone has been tested. This area is thought to be a compartmentalized basin with a large area of untested prospective ground yet to be drilled. A key area of interest is the ground between Marvelstown and the Kingscourt Fault, where pXRF data indicates a greater degree of hydrothermal activity. The use of innovative processing of pXRF data to model stratigraphy has confirmed the absence of an approximately 60 m section of the ABL in 21-3609-01, which is observed both in drill core and in chemical profiles.

Next Steps

Adventus and South32 plan to continue drilling the remaining scout hole targets on the Rathkeale Block, as each target is considered stand-alone. Key targeting information derived from the completion of the current drill hole on the Killeen target will be incorporated into the Rathkeale TGI, and the geological and geophysical modelling will be updated using the downhole data obtained, with future drill planning adjusted accordingly. The technical team is also continuing to work on the remaining Kingscourt targets and assessing follow-up drilling based upon results from the first two drill holes. Additional drilling at Kingscourt is expected in the first half of 2022.

Geochemical sampling is ongoing on the Fermoy block, with results expected in Q1 2022 for incorporation into the Fermoy TGI; detailed follow-up will be carried out ahead of developing targets for drilling in the second half of 2022. In addition, five new prospecting licences totalling 122 km2 have now been issued for the Fermoy block, expanding coverage over prospective lands, with field work commencing in 2022.

Technical Information and Quality Control & Quality Assurance ("QAQC")

Project work in Ireland is being managed and reviewed by Jason Dunning, M.Sc., P.Geo., Vice President of Exploration for Adventus and a Qualified Person within the meaning of NI 43-101, who has also reviewed and approved the technical and scientific information of this news release as accurate. Technical staff collect and process samples that are securely sealed and shipped to ALS Global ("ALS") in Loughrea, Ireland for sample preparation, which includes crushing and milling to prepare pulps that are then split for analysis. All assay data have undergone internal QAQC validation; an established sampling control program is in place for the project, with blind insertion of assay blanks, certified industry standards, and sample duplicates. A QAQC program is also in place at ALS and includes insertion of blanks, standards, and duplicate reanalysis of selected samples. ALS' quality system complies with the requirements of the International Standards ISO 9001:2000 and ISO 17025:1999. At ALS, elements are analyzed using a 48-element, four-acid digestion ICP-MS technique, with overlimit testing in place using ore-grade methodologies, also with four-acid digestion.

Qualified Person

The technical information of this news release has been reviewed and verified as accurate by Mr. Jason Dunning, M.Sc., P.Geo., Vice President Exploration for Adventus, a non-Independent Qualified Person, as defined by NI 43-101.

About Adventus

Adventus Mining Corporation is an Ecuador-focused copper-gold exploration and development company. Its strategic shareholders include Altius Minerals Corporation, Greenstone Resources LP, Wheaton Precious Metals Corp., and the Nobis Group of Ecuador. Adventus is advancing the Curipamba copper-gold project through a feasibility study, while continuing to explore the broader 215-square-kilometre district. In addition, Adventus is engaged in a country-wide exploration alliance with its partners in Ecuador, which has incorporated the Pijili and Santiago copper-gold porphyry projects to date. Adventus also controls an exploration project portfolio in Ireland with South32 Limited as funding partner. Adventus is based in Toronto, Canada, and is listed on the TSX Venture Exchange under the symbol ADZN and trades on the OTCQX under the symbol ADVZF.

About South32

South32 is a globally diversified mining and metals company. Our purpose is to make a difference by developing natural resources, improving people's lives now and for generations to come. We are trusted by our owners and partners to realise the potential of their resources. We produce bauxite, alumina, aluminium, metallurgical coal, manganese, nickel, silver, lead, and zinc at our operations in Australia, Southern Africa, and South America. With a focus on growing our base metals exposure, we also have two development options in North America and several partnerships with junior explorers around the world.

Neither the TSX Venture Exchange nor its Regulation Services Provider (as that term is defined in the policies of the TSX Venture Exchange) accepts responsibility for the adequacy or accuracy of this news release.

This press release contains "forward-looking information" within the meaning of applicable Canadian securities laws. Any statements that express or involve discussions with respect to predictions, expectations, beliefs, plans, projections, objectives, assumptions or future events or performance (often, but not always, identified by words or phrases such as "believes", "anticipates", "expects", "is expected", "scheduled", "estimates", "pending", "intends", "plans", "forecasts", "targets", or "hopes", or variations of such words and phrases, or statements that certain actions, events or results "may", "could", "would", "will", "should", "might", "will be taken", or "occur" and similar expressions) are not statements of historical fact and may be forward-looking statements.

Forward-looking information herein includes, but is not limited to, statements that address activities, events, or developments that Adventus and Salazar expect or anticipate will or may occur in the future. Although Adventus and Salazar have attempted to identify important factors that could cause actual actions, events, or results to differ materially from those described in forward-looking information, there may be other factors that cause actions, events or results not to be as anticipated, estimated, or intended. There can be no assurance that such information will prove to be accurate, and actual results and future events could differ materially from those anticipated in such information. Accordingly, readers should not place undue reliance on forward-looking information. Adventus and Salazar do not undertake to update any forward-looking information except in accordance with applicable securities laws.

Please also visit the Adventus website at http://www.adventusmining.com and the LinkedIn page at https://www.linkedin.com/company/adventus-mining-corporation.

Table 1: Drill Collar Information, Rathkeale block

Hole ID       EAST     NORTH    ELEV (m)   AZIMUTH   DIP   DEPTH (m)   COMMENT
21-3368-01    550316   647065   24         152       -70   N/A         In progress; target depth of Pale Beds ~900 m

Notes: (1) All drill holes are surveyed in UTM Datum (ITM / IRENET95)

Table 2: Drill Collar Information, Kingscourt block

Hole ID       EAST     NORTH    ELEV (m)   AZIMUTH   DIP   DEPTH (m)   COMMENT
21-3609-01    685436   787317   61         160       -75   657         Successfully completed as planned
21-3732-01    680528   782619   70         160       -75   534         Successfully completed as planned

Notes: (1) All drill holes are surveyed in UTM Datum (ITM / IRENET95)

Read more:

Adventus Mining Announces Commencement of Drilling on Rathkeale Property and Initial Results of Kingscourt Drill Program Under The South32 Earn-In...


The Ins and Outs of Privacy in Clinical Trials for Patients With Cancer – Curetoday.com

Keeping your personal information safe is always required

It seems like every day there is news about groundbreaking developments in cancer treatments. But the thing about research is that it can show amazing possibilities in the laboratory and still not make a difference in real life. Even if those incredible ideas could work, it takes people willing to be clinical trial participants to transform the ideas of researchers into better treatments.

Though I haven't been on a clinical trial due to cancer progression, I have been a participant in an exercise/diet intervention for people with metastatic breast cancer and part of the control group in a cardiac intervention trial for people prescribed Herceptin (trastuzumab). I've also been a research team member and have reviewed clinical trial consent forms.

Despite all of this, I still find the idea of participating in a clinical trial a hurdle. Part of it is that I want to know that my personal data will be safe and remain private.

Personal health privacy is a big deal. There is a history of misused data and unethical behavior in the U.S. that can't be ignored. The hospitals we trust can give our de-identified health care data to researchers and sell it to other companies (or create their own data-mining companies), and they don't always have to let us know.

Yet, if we want to take down cancer, we have to find confidence in a system that wasn't designed to protect us. In clinical trials, there are steps that researchers and institutions are required to take to ensure that private information is kept as safe as possible. However, the words your doctor or clinical trial researcher uses can be unfamiliar. Here is a quick rundown of terms to know when it comes to clinical trial privacy:

Institutional Review Board (IRB): The IRB is made up of people from different backgrounds: some have scientific training, but there are also people without scientific training as well as community members. It is meant to keep research participants as safe as possible. In terms of data, the IRB asks why specific data is needed, who collects and can look at that data, and how it will be protected. IRB itself is short for Institutional Review Board of Biomedical Research Using Human Subjects.

Protected Health Information (PHI): Also called Personal Health Information, this is all the information--demographic, medical history, test results, mental health conditions, insurance information, and so on--that your doctors collect to identify you and provide care.

De-identified: This means that your personal health information isn't easily linked to you, but it is not anonymous. Each person is given a patient identifier for the study (not the number used with your medical records), and that alone is used when the research team is looking at all the collected information. With de-identified data, a select person, such as the main investigator, is able to link your specific information (for example, tissue or blood samples) to you. This is also called coded data and is used in clinical trials.
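To make the distinction concrete, here is a minimal sketch, in Python and with hypothetical field names, of how coded (de-identified) study data might be organized: the research team sees records carrying only a study code, while a separate key table that can link codes back to identities is held apart for the main investigator.

```python
# Minimal sketch of "coded" (de-identified) study data; field names are hypothetical.
import secrets

def code_participants(patients):
    """Split identifiable records into a restricted key table and coded study records."""
    key_table = {}      # participant code -> identity; held by the main investigator only
    study_records = []  # what the wider research team works with
    for p in patients:
        code = "PT-" + secrets.token_hex(4)     # study identifier, not the medical record number
        key_table[code] = {"name": p["name"], "mrn": p["mrn"]}
        study_records.append({
            "participant_code": code,           # no direct identifiers
            "age_band": (p["age"] // 10) * 10,  # coarsened rather than exact age
            "lab_results": p["lab_results"],
        })
    return key_table, study_records

patients = [{"name": "Jane Doe", "mrn": "12345", "age": 57, "lab_results": [1.2, 0.9]}]
keys, records = code_participants(patients)
print(records)  # coded, not anonymous: the key table can still link codes back to people
```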

Anonymous: In research, this means that not even the investigator can link your information to you. You might see this word used to describe some patient surveys.

HIPAA (Health Insurance Portability and Accountability Act): For clinical trials where patient data will be de-identified, this Act requires that 18 personal identifiers be removed from the information collected. HIPAA and the Health Information Technology for Economic and Clinical Health Act limit the PHI that healthcare providers, health insurance companies, and the companies they work with can collect. Those rules also limit what can be done with the data in terms of sharing it with other organizations or using it in marketing.

Data Storage: In the clinical trial plan, researchers must describe the specifics of how they will keep de-identified data safe from being tracked back to the patient. This is done by using technology to disguise (encrypt) the information, limiting who can see it and keeping paper records in a locked location.

Consent Form: This form ensures that the patient explicitly agrees to participate and outlines how their identity and information will be kept safe. It must be written in a way that middle schoolers can understand. Often, it will include information about where and how the de-identified patient information might be shared.


Go here to read the rest:

The Ins and Outs of Privacy in Clinical Trials for Patients With Cancer - Curetoday.com


Preserve Data History In The Cloud — Or Lose It – ITPro Today

In a discussion I had with a spokesperson for a cloud services vendor several months ago, the representative said something that stuck with me. "One thing you don't get with the cloud is a history of how your data changes over time," the spokesperson said. "There's no way you can look at a record [at a point in time] and compare it to other time periods. What we're doing is we're preserving the historical data that is [otherwise] lost in the cloud."

The spokesperson was not affiliated with a cloud data lake, data warehouse, database or object storage vendor. It seemed that the company hadn't previously considered that cloud subscribers could use one of these services to collect and preserve the historical data produced by cloud applications. Or, if the company had previously considered this, it was rejected as an undesirable, or nonviable, option.

In my conversations specific to the data mesh architecture and data fabric markets, the terms "history" and "historical" tend to come up infrequently. Instead, the emphasis is on (a) enabling business domain experts to produce their own data and (b) making it easier for outsiders -- experts and non-experts alike -- to discover and use data. And yet, discussion of the technologies that underpin both data mesh architecture and the data fabric (viz., data virtualization, metadata cataloguing, and knowledge discovery) focuses on connectivity to operational databases, applications and services; i.e., resources that do not preserve data history.

Historical data is of crucial importance to machine learning (ML) engineers, data scientists and other experts, of course. It is not an afterthought. And there are obvious schemes you can use to accommodate historical data in data mesh architecture -- a historical repository that is instantiated as its own domain, for example.

But these two data points got me wondering: Are we forgetting about data history? In the pell-mell rush to the cloud, are some organizations poised to reprise the mistakes of past decades?

Most cloud apps and services do not preserve historical data. That is, once a field, value or record changes, it gets overwritten with new data. Absent a routinized mechanism for preserving it, this data is lost forever. That said, some cloud services do give customers a means to preserve data history.
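As a rough illustration of what such a routinized mechanism could look like, the sketch below (Python, with a hypothetical fetch_current_records() standing in for whatever export or reporting API the cloud app provides) appends a timestamped copy of the app's current records to customer-owned storage on each run, so values the app later overwrites are not lost.

```python
# Minimal sketch of a scheduled history-capture job; fetch_current_records() is a
# placeholder for the cloud app's actual export or reporting API.
import csv, datetime, pathlib

HISTORY_FILE = pathlib.Path("customer_history.csv")  # customer-owned storage

def fetch_current_records():
    # Placeholder: in practice, call the app's export/reporting endpoint here.
    return [{"record_id": "42", "status": "open", "amount": "199.00"}]

def snapshot():
    captured_at = datetime.datetime.now(datetime.timezone.utc).isoformat()
    rows = [{"captured_at": captured_at, **r} for r in fetch_current_records()]
    write_header = not HISTORY_FILE.exists()
    with HISTORY_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["captured_at", "record_id", "status", "amount"])
        if write_header:
            writer.writeheader()
        writer.writerows(rows)  # append-only: earlier snapshots are never overwritten

if __name__ == "__main__":
    snapshot()  # schedule at a fixed cadence (cron, workflow orchestrator, etc.)
```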

This option to preserve data history might seem convenient, at least so far as the customer is concerned. But there are myriad reasons why organizations should consider taking on and owning the responsibility of preserving historical data themselves.

The following is a quick-and-dirty exploration of considerations germane to the problem of preserving, managing and enabling access to historical data produced by cloud applications and services. It is not in any sense an exhaustive tally, but it does aspire to be a solid overview.

Then you need a plan to preserve, manage and use the data produced by your cloud apps and services.

The good news is that it should be possible to recover historical data from extant sources. Back in the early days of decision support, for example, recreating data history for a new data warehouse project usually involved recovering data from backup archives, which, in most cases, were stored on magnetic tape.

In the cloud, this legacy dependency on tape may go away, but the process of recreating data history is still not always straightforward. For example, in the on-premises environment, it was not unusual for a backup archive to tie into a specific version of an application, database management system (DBMS) or operating system (OS). This meant that recovering data from an old backup would entail recreating the context in which that backup was created.

Given the software-defined nature of cloud services, virtual abstraction on its own does not address the problem of software dependencies. So, for example, in infrastructure as a service, you have the same dependencies (OS, DBMS, etc.) as you did in the on-premises data center. With respect to platform as a service (PaaS) and software as a service (SaaS), changes to newer versions of core cloud software (e.g., deprecated or discontinued APIs) could also complicate data recovery.

The lesson: Develop a plan to preserve and manage your data history sooner rather than later.

You should still have a plan. When you use your provider's offerings to preserve data history, it creates an unnecessary dependency. That is, do you really own your data if it lives in the provider's cloud services?

Moreover, your access to your own data is mediated by the tools and APIs -- and the terms of service -- that are specified by your cloud provider. But what if the provider changes its terms of service? What if you decide to discontinue use of the provider's services? What if the provider is acquired by a competitor or discontinues its services? How much will it cost you to move your data out of the provider's cloud environment? What formats can you export it in?

In sum: Are you comfortable with these constraints? This is why it is incumbent upon customers to own and take responsibility for the historical data produced by their cloud apps and services.

Even in the era of data scarcity -- first, scarcity with respect to data volumes; second, scarcity with respect to data storage capacity -- savvy data warehouse architects preferred to preserve as much raw historical data as possible, in some cases using change-data capture (CDC) technology to replicate all deltas to a staging area. Warehouse architects did this because having raw online transaction processing (OLTP) data on hand made it relatively easy to change or to maintain the data warehouse. For example, they could add new dimensions or rekey existing ones.

Today, this is more practicable than ever, thanks to the availability (and cost-effectiveness) of cloud object storage. It is likewise more necessary than ever, due to the popularity of disciplines such as data science and machine learning engineering. These disciplines, along with traditional practices such as data mining, typically require raw, unconditioned data.

A caveat, however: If you use CDC to capture tens of thousands of updates an hour, you will ingest tens of thousands of new, time-stamped records each hour. Ultimately, this adds up.

The lesson is that not all OLTP data is destined to become historical. If for some reason you need to capture all updates -- e.g., if you are using a data lake to centralize access to current cloud data for hundreds of concurrent consumers -- you do not need to persist all these updates as part of your data history. (Few customers could afford to persist updates at this volume.) What you should do is persist a sample of all useful OLTP data at a fixed interval.
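A minimal sketch of that idea, assuming a hypothetical change-event shape, might look like the following: rather than persisting every CDC delta, only the latest version of each record per fixed interval is kept as history.

```python
# Minimal sketch of fixed-interval sampling over change-data-capture output.
from datetime import datetime

def sample_changes(change_events, interval_minutes=60):
    """Keep one version per (record key, interval bucket): the latest change seen."""
    latest = {}
    for ev in change_events:                    # ev: {"key", "ts" (ISO 8601), "payload"}
        ts = datetime.fromisoformat(ev["ts"])
        epoch_min = int(ts.timestamp() // 60)
        bucket = epoch_min - (epoch_min % interval_minutes)  # start of the interval, in epoch minutes
        slot = (ev["key"], bucket)
        if slot not in latest or ts > latest[slot][0]:
            latest[slot] = (ts, ev["payload"])
    return {slot: payload for slot, (ts, payload) in latest.items()}

events = [
    {"key": "order-1", "ts": "2021-12-03T10:05:00+00:00", "payload": {"status": "new"}},
    {"key": "order-1", "ts": "2021-12-03T10:55:00+00:00", "payload": {"status": "paid"}},
    {"key": "order-1", "ts": "2021-12-03T11:10:00+00:00", "payload": {"status": "shipped"}},
]
print(sample_changes(events))  # two history rows survive instead of three raw deltas
```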

On its own, preserved historical data from cloud applications or services can be queried to establish a history of how it has changed over time. Data scientists and ML engineers can trawl historical data to glean useful features, assuming they can access the data. But data is also useful when it is combined with (historical) data from other services to create different kinds of multidimensional views: you know, analytics.

For example, by combining data in Salesforce with data from finance, logistics, supply chain/procurement, and other sources, analysts, data scientists, ML engineers and others can produce more useful analytics, design better (more reliable) automation features and so on.
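As a toy illustration of that kind of cross-source view, the following sketch (hypothetical column names; pandas assumed to be available) joins preserved CRM history to finance data on a shared account key and summarizes the result.

```python
# Minimal sketch of combining preserved CRM history with finance data for analytics.
import pandas as pd

crm = pd.DataFrame({                  # e.g., history captured from Salesforce
    "account_id": ["A1", "A2", "A2"],
    "opportunity_stage": ["won", "won", "lost"],
    "captured_at": ["2021-11-01", "2021-11-01", "2021-12-01"],
})
finance = pd.DataFrame({              # e.g., invoices exported from the ERP system
    "account_id": ["A1", "A2"],
    "invoiced_amount": [12000, 8500],
})

combined = crm.merge(finance, on="account_id", how="left")
summary = (combined[combined["opportunity_stage"] == "won"]
           .groupby("account_id")["invoiced_amount"].sum())
print(summary)  # invoiced revenue for accounts with won opportunities
```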

By linking sales and marketing, finance, HR, supply chain/procurement, logistics, and other business function areas, executive decision makers can obtain a complete, synoptic view of the business and its operations. They can make decisions, plan and forecast on that basis.

This is just to scratch the surface of its usefulness.

The purpose of this article was to introduce and explore the problem of capturing and preserving the data that is produced by cloud apps and services -- specifically, the historical operational data that typically gets overwritten when new data gets produced. There are several reasons organizations will want to preserve and manage this data.

There is another reason that organizations will want to capture and preserve all the data their cloud apps produce, however. Most SaaS apps (and even many PaaS apps) are not designed for accessing, querying, moving, and/or modifying data. Rather, they are designed to be used by different kinds of consumers who work in different types of roles. The apps likewise impose constraints, such as API rate limits and, alternatively, per-API charges, that can complicate the process of accessing and using data in the cloud.
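To make that constraint concrete, here is a minimal sketch, assuming a hypothetical endpoint and the widely used requests library, of pulling data from a SaaS API while respecting rate limits: it backs off on HTTP 429 responses and honours the Retry-After header when the provider supplies one.

```python
# Minimal sketch of rate-limit-aware extraction from a SaaS API; endpoint is hypothetical.
import time
import requests

def fetch_with_backoff(url, params=None, max_retries=5):
    delay = 1.0
    for attempt in range(max_retries):
        resp = requests.get(url, params=params, timeout=30)
        if resp.status_code == 429:                      # rate limit hit
            wait = float(resp.headers.get("Retry-After", delay))
            time.sleep(wait)
            delay = min(delay * 2, 60)                   # exponential backoff, capped
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError(f"Rate limited after {max_retries} attempts: {url}")

# records = fetch_with_backoff("https://api.example-saas.com/v1/records", {"page": 1})
```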

In a follow-up article, I will delve into this problem, focusing specifically on API rate limits.

Link:

Preserve Data History In The Cloud -- Or Lose It - ITPro Today
