Category Archives: Deep Mind

Why the algorithms assisting medics is good for health services (Includes interview) – Digital Journal

As Digital Journal's Ken Hanly reported, Google has developed artificial intelligence aimed at helping doctors identify breast cancer faster and more accurately. According to experimental data, the algorithm, which scans mammograms, reduces false negatives by 9.4 percent (for U.S. patients) and 2.7 percent (for U.K. patients).

The new technology comes from DeepMind and Google Health. The artificial intelligence is designed not only to reduce error rates but also to enable doctors to detect breast cancer early. This is based on six radiologists assessing 500 randomly selected mammograms.

According to expert Joseph Mossel, CEO and co-founder of Ibex Medical Analytics: "The DeepMind algorithm utilises the same technology that we use, deep learning, to improve the accuracy and efficiency of diagnoses at a different stage of the cancer pathway."

He notes that while "Google's technology is chiefly concerned with radiology, which is the initial detection of whether cancer is present or not," his company's own "Second Read System looks at pathology, which is the final diagnosis of what is suspected and any subsequent grading, and is already used in a live clinical setting in several labs globally."

Commenting on the potential benefits of artificial intelligence for healthcare, Mossel says: "Both systems address the same underlying problem, a chronic shortage of trained staff, which is putting an increased strain on the current system and is impacting the quality of diagnoses."

Taking the U.K. as an example, Mossel notes: "Just 3 percent of NHS labs have enough pathologists, which means that 97 percent don't have sufficient resources. This can lead to error rates (where benign diagnoses are actually cancerous) as high as 12 percent; a truly alarming statistic."

Extolling the overall benefits, Mossel concludes: "With more than two million new cases identified each year and as many as one in eight women diagnosed with breast cancer, it is a disease that impacts many lives. But with more people leaving the professions that help identify cancer than joining, it is vital that we empower those left with the tools to do their jobs as effectively as possible, and AI technology can do just that."


2020: The Rise of AI in the Enterprise – IT World Canada

This year looks set to become the breakout year in Canada for artificial intelligence (AI) initiatives in the enterprise, with the accelerated adoption of public cloud, particularly in Western Canada.

Why? In the fall of 2019, British Columbia amended legislation related to the use of cloud computing, enabling the public sector (inclusive of crown corporations, healthcare, etc.) to pursue digital transformation programs with modern technology. Alberta, for its part, has become one of Canada's leading AI hubs, with numerous research labs and accelerators. For example, Google DeepMind opened its first-ever international AI research office in Alberta. With this progress, Western Canada is poised for a year of innovation and digital transformation initiatives coming to fruition.

From an overarching data analytics perspective, in many of my conversations with CIOs, CTOs and senior IT leaders across Western Canada over the past year, I found the viewpoints around the benefits of AI and machine learning (ML) to be quite consistent. Namely, the next decade will be about data-driven decision making, and those organizations that invest time in assessing a variety of business use cases early and often will have the highest probability of success.

Even at this early stage of adoption within the enterprise, there is much to be learned from both the successful and not-as-successful attempts. It's important for CEOs to understand that the insights and learnings gleaned from their digital transformation teams and data science teams are the foundations of experience: embrace it.

To highlight the rise of AI in the enterprise across Western Canada in 2020, along with providing insights from a variety of industries, I will be taking a holistic view, featuring interviews ranging from CIOs of provincial governments to heads of innovation in the oil & gas sector, among others. We'll tackle both the business benefits and the challenges many face by discussing the following topic areas:

Additionally, from the vendor landscape perspective, I'll look at the leading platforms helping organizations along their data analytics journey, such as AWS, Databricks, Microsoft, Snowflake and ThoughtSpot, among others. We'll discuss top use cases across a variety of industries as well, from leveraging geospatial data in the public sector to predicting equipment failure in the oil & gas and energy sectors.

It will be an exciting year in Western Canada as many embark on their journey over the next decade.

Open Invitation: If you have innovative AI and machine learning initiatives that you are interested in potentially having featured in an article, you may contact me via LinkedIn to discuss further.


DeepMind’s new AI can spot breast cancer just as well as your doctor – Wired.co.uk

PA Images / Kristan Lieb/Chicago Tribune/TNS/Sipa USA

One in eight women will be diagnosed with breast cancer during their lifetime. In an effort to help with quicker detection, researchers have trained a deep-learning algorithm to spot breast cancer in screening scans as accurately as, or better than, a radiologist.

While still at an early stage, the research could eventually help reduce incorrect results in the US and help alleviate the shortage of radiologists in the UK. As early detection is key to treatment, women over the age of 50 are tested in the US and UK even if they don't show signs of the disease. False negatives, when cancer is present but not spotted, can prove deadly, while false positives can be distressing.

Google-owned DeepMind has already worked with NHS organisations to develop AI to read eye scans and spot neck cancer. Over the past two years, researchers from Cancer Research UK Imperial College, Northwestern University, Royal Surrey County Hospital, and Google Health have used a deep-learning system developed by DeepMind on two different datasets of breast scans, one from the US and one from the UK, suggesting the AI could help read mammograms accurately.

"This is another step along the way of trying to answer some of the questions that will be critical for us to actually deploying this in the real world," says Dominic King, director and UK lead of Google Health. "This is another step closer to trying to deploy this type of technology safely and effectively."

The system was first trained on de-identified mammograms from 76,000 British women, using Cancer Research UK's OPTIMAM dataset, as well as 15,000 scans from the US. Once trained, the algorithm was tested on 25,000 scans in the UK, and a further 3,000 in the US. Four images from each mammogram were fed into a neural network, which output scores between zero and one for three different models, with one indicating a high risk of cancer.
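The paper itself describes the full architecture only in its supplementary materials. As a loose, hypothetical illustration of the final step (the model names, equal weighting and 0.5 threshold below are assumptions for the sketch, not DeepMind's actual design), combining several per-model risk scores into one prediction might look like this:

```python
# Hypothetical sketch: average the 0-1 risk scores from several sub-models
# and flag the case if the combined score crosses a threshold. The model
# names, equal weighting and threshold are illustrative assumptions only.

def combine_scores(scores: dict[str, float], threshold: float = 0.5) -> tuple[float, bool]:
    """Return the mean risk score and whether it exceeds the threshold."""
    if not scores:
        raise ValueError("need at least one model score")
    for name, score in scores.items():
        if not 0.0 <= score <= 1.0:
            raise ValueError(f"score for {name!r} outside [0, 1]")
    mean = sum(scores.values()) / len(scores)
    return mean, mean >= threshold

# One mammogram's (made-up) scores from three sub-models:
risk, flagged = combine_scores({"local": 0.82, "global": 0.74, "case": 0.61})
```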

The AI's conclusions were then compared against real-life results in the future and what radiologists said at the time, says Christopher Kelly, clinician scientist at Google Health, and co-author of the research, which has been published in the journal Nature. "Our ground truth was based on biopsy results and follow-ups, so if they had a normal screen two or three years later," he explains.

The research says the AI model could predict breast cancer with the same level of accuracy as a single expert radiologist. Compared to human experts, the system saw a reduction in false positives by 5.7 per cent in the US and 1.2 per cent in the UK, and in false negatives of 9.4 per cent in the US and 2.7 per cent in the UK.

However, those results don't necessarily reflect how such scans are read in real life. In the US, breast scans are normally checked by a single radiologist, while in the NHS mammograms are checked by a minimum of two radiologists. If those two "readers" disagree on the result, the scan is checked by a third and potentially even a fourth.

The study claims the DeepMind algorithm performs better than a single radiologist, and is "non-inferior" versus two. "The model performs better than an individual radiologist in both the UK and the US," Kelly says. "In the UK we have this double reading system, where two radiologists, or maybe three or four, look at each scan; we're statistically the same as that, but not better than that."

However, the Royal College of Radiologists says its workforce modelling research suggests the UK is short of at least 1,104 radiologists; there are currently 542 expert breast radiologists in the UK, but eight per cent of hospital posts for such roles are unfilled.

If the role of the second reader could be partially replaced by AI, that could alleviate some of the staff shortages, notes King. Indeed, he says radiologists asked Google Health to look into AI for screening scans for just this reason. "We had a group of senior breast radiologists in the UK contact us three or four years ago to say that this was an area they felt was amenable to artificial intelligence, but also that it was critical to start thinking about how technology could start supporting the sustainability of the service, because currently there can be very lengthy delays," he says.

To test that idea, the researchers ran a side project, simulating how the algorithm could work with a human radiologist. The AI and the human radiologists agreed 88 per cent of the time, meaning only 12 per cent of scans would then have to be read by another radiologist. However, the reader study was run with a more limited data set and only six radiologists, all of whom were US-trained and only two of whom had fellowship-level training in breast imaging. "We'd like to test this with further work, but as a kind of simulation, it was quite exciting to see this as a suggestion towards a potential system in the future," says Kelly.
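The 88 per cent agreement figure comes straight from the article; as a rough back-of-the-envelope simulation of that workflow (the random draw and the escalation logic are assumptions for illustration, not the study's actual protocol), one might estimate how many second human reads remain:

```python
# Rough simulation: treat the AI as the second reader and escalate a scan
# to a human second reader only when the AI and the first radiologist
# disagree. Only the 88% agreement rate is from the article; the rest of
# this workflow model is an assumption for illustration.
import random

def second_reads_needed(n_scans: int, agreement: float = 0.88, seed: int = 0) -> int:
    rng = random.Random(seed)
    return sum(1 for _ in range(n_scans) if rng.random() >= agreement)

escalated = second_reads_needed(10_000)
fraction_saved = 1 - escalated / 10_000  # roughly 88% of second reads avoided
```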

Regardless of the success of such research, radiologists can't be fully replaced by AI but they could be assisted, stresses Caroline Rubin, vice president for Clinical Radiology at the Royal College of Radiologists, who was not involved in the research. "Like the rest of the health service, breast imaging and UK radiology more widely is under-staffed and desperate for help," she says. "AI programmes will not solve the human staffing crisis, as radiologists and imaging teams do far more than just look at scans, but they will undoubtedly help by acting as a second pair of eyes and a safety net."

Alongside the accuracy checks and the reader simulation, the researchers also examined whether the system could be generalised, meaning it could be trained on a single data set and used everywhere. To test this, they ran the algorithm trained on UK data on the American scans. The results were not as good as those from the system trained on US data (a 3.5 per cent reduction in false positives, versus 5.7 per cent using local data), but they were still positive, suggesting some generalisation may be possible, though training on a localised data set remains preferable.

While the results are positive, the researchers stress the work remains in the early stages. The Google researchers say they would like to see more research done not on retrospective, historical data, but with current patients. "Prospective studies are the only way you find out how these things perform in the real world," Kelly says, in particular how clinicians would interact with the system. Plans for such a project are in the works. "That's a different programme of research that we're now excited to be exploring," King says.

The RCR's Rubin agrees, calling for rigorous testing and careful regulation. "The next step for promising [breast screening AI] products is for them to be used in clinical trials, evaluated in practice and used on patients screened in real-time, a process that will need to be overseen by the UK public health agencies that have overall responsibility for the breast screening programmes," she says.

Thanks in part to the Royal Free debacle, concerns around data privacy have stalked Google's latest foray into medical research. The researchers stressed that the mammograms were de-identified, adding that the algorithm looked only at the scans and no other patient information. The UK data was sourced from a set collected specifically for research by Cancer Research UK.

Another concern with AI development is the spectre of bias, and the paper says that checks for bias found none. That suggests the algorithm should work equally well on any scan, regardless of the specific details of the individual (though it's actually better at spotting invasive cancers; that's positive, as humans find them more troublesome). To check that, Kelly says the team looked at metadata associated with each image to ensure the AI wasn't "underperforming" on minority subgroups, adding that more in-depth analysis can and should be done to ensure no bias.

Andrew Holding, senior research associate at Cancer Research UK's Cambridge Institute, who was not involved with the research, says the best way to avoid such bias is training with a diverse data set. "In this study, data from the US and the UK was used in recognition of these challenges; nonetheless, we're still a long way from representing the full diversity of people who present in the clinic," he says.

"A clinician would rapidly adapt to something as simple as skin pigmentation by drawing on their wider life experiences, but an AI, having never seen it, would diagnose in an unpredictable manner. Similar problems could occur because one hospital uses a slightly older piece of equipment to take the mammogram, and that might lead to different patient outcomes. These problems aren't unsolvable, but they do present a huge challenge."

One bias to be considered in future research is the manufacturer of the scanning machines. The study coincidentally used scans that were predominantly produced via Hologic machines, and future work should ensure the algorithm works as well with other scanners.

There is one concern raised by the paper, says Holding: the code used by the algorithm has not been released. "The code used for training the models has a large number of dependencies on internal tooling, infrastructure and hardware, and its release is therefore not feasible," the paper notes, saying it's described in enough detail in the supplementary materials to be replicated with non-proprietary libraries.

"The work presents a fantastic effort, but it's a shame that the authors have decided only to include instructions on how the AI was built and not provide the source code," Holding says, pointing to a campaign for reproducible science. "Including source code is vital for increasing the impact of the research. It allows other scientists to build on the work rather than having to start again from scratch and provides for a better understanding of how the results were obtained. It also helps the researchers who do the work. By working reproducibly, you avoid costly mistakes and help your own research group build on your results."

Beyond research itself, Holding argues researchers owe it to patients whose data they use to release such information freely. "If patients are generous enough to consent to their data being used by companies like Google for research purposes, ideally the results and methods generated from the data should be available to them for free," he says. "The research simply isn't possible without that consent."

Google Health's Kelly admits that black-box algorithms are less useful in clinical settings, as it's helpful to physicians to see their workings. This particular system is made up of three different models, which are combined into a single score. One of those, the local model, is perhaps the most important to clinicians as it highlights areas of concern. "When you look at workflows, localisation is actually really important to a radiologist they look at the images and draw on the area that is suspicious," says Kelly.

"When that goes to a consensus process, they compare all these diagrams." That means that while the "global model", which includes the local and the other data, has the most accurate cancer predictions, it may not be as useful to human radiologists. "Although it might perform the best, it's a black box." That said, Holding adds, we don't always know how radiologists or physicians make their decisions, either. "While it's in my nature to be sceptical of 'black-box' software, there is the counterpoint that we don't really know how each individual clinician pulls together years of experience to make a decision about a patient."

Work remains on this particular project as well as in the wider field of AI to read medical images, but such progress not only could cut costs, but improve care, Holding says. "These studies provide the path to a second digital pair of eyes that are never tired and see every patient," he says. "By then using these AIs to catch potential mistakes, we can avoid concerns of putting all our faith in software, and still apply the technology to give better patient outcome. And that is really exciting."


The Most Mind-Boggling Scientific Discoveries Of 2019 Include The First Image Of A Black Hole, A Giant Squid Sighting, And An Exoplanet With Water…

In 2019, scientists around the world pulled off some impressive feats: They imaged a supermassive black hole for the first time, debuted two treatments for the Ebola virus, and launched a spacecraft into orbit that's powered by sunlight alone.

Over the past year, researchers have also discovered a hidden continent, captured video of a giant squid in its deep-sea habitat, and sent a probe to an asteroid 5.5 million miles from Earth.

These and other accomplishments are improving scientists' understanding of our own biology, our planet, and the surrounding cosmos.

As a new year and a new decade approaches, here's a look back at some of the most mind-boggling scientific discoveries from 2019.

On New Year's Day, NASA's nuclear-powered New Horizons spacecraft flew past a mysterious, mountain-sized object 4 billion miles from Earth.

The object, called MU69, is nicknamed Arrokoth, which means "sky" in the Powhatan/Algonquian language (it was previously nicknamed Ultima Thule). It's the most distant object humanity has ever visited.

The New Horizons probe took hundreds of photographs as it flew by the space rock at 32,200 miles per hour.

Images revealed that Arrokoth is flat like a pancake, rather than spherical in shape. The unprecedented data will likely reveal new clues about the solar system's evolution and how planets like Earth formed, though scientists are still receiving and processing the information from the distant probe.

Just days after New Horizons' fly-by, China's Chang'e-4 mission put a rover and lander on the far side of the moon, the part we can't see from Earth.

Before Chang'e-4's success, no country or space agency had ever touched the far side of the moon.

The name "Chang'e" is that of a mythical lunar goddess, and the "4" indicates that this is the fourth robotic mission in China's decade-long lunar exploration program.

The rover landed in the moon's South Pole-Aitken Basin, which is the site of a cataclysmic collision that occurred about 3.9 billion years ago. The celestial smash-up left a 1,550-mile-wide impact site that likely punched all the way through the moon's crust. Landing the spacecraft in this crater could therefore enable scientists to study some of the moon's most ancient rocks.

Elsewhere in the solar system, NASA scientists learned about Mars quakes, the red planet's version of earthquakes.

NASA's InSight lander, which touched down on Mars in November 2018, has given scientists the unprecedented ability to detect and monitor Mars quakes.

The lander's built-in seismometer detected its first Mars quake in April. Since then, researchers have recorded more than 100 seismic events, about 21 of which were likely quakes. By reading the seismic waves on Mars, scientists hope to reveal clues about what the planet's interior looks like.

Over 5.5 million miles from Earth, a Japanese spacecraft landed on the surface of an asteroid called Ryugu in July.

The Japan Aerospace Exploration Agency (JAXA) launched its Hayabusa-2 probe in December 2014. Hayabusa-2 arrived at Ryugu in June 2018, but didn't land on the asteroid's surface until this year.

In order to collect samples from deep within the space rock, Hayabusa-2 blasted a hole in the asteroid before landing. The mission plan calls for it to bring those samples back to Earth. By studying Ryugu's innermost rocks and debris, which have been sheltered from the wear and tear of space, scientists hope to learn how asteroids like this may have seeded Earth with key ingredients for life billions of years ago.

NASA's Voyager 2 spacecraft left our solar system and entered the depths of interstellar space.

The probe beamed back unprecedented data about previously unknown boundary layers at the far edge of our solar system, an area known as the heliopause.

The discovery of these boundary layers suggests there are stages in the transition from our solar bubble to interstellar space that scientists did not know about until now.

In December, the European Space Agency launched a new space telescope into orbit to examine known exoplanets in more detail.

The CHaracterizing ExOPlanets Satellite (CHEOPS) has a foot-wide camera lens designed specifically to study the size and mass of known exoplanets smaller than Saturn.

CHEOPS will also look for atmospheres on those far-away worlds, a requirement for any planet to host life.

Kate Isaak, a physicist on the CHEOPS team, said in a press release that the telescope will "take us one step closer to answering one of the most profound questions we humans ponder: Are we alone in the universe?"

This was also a watershed year for the study of black holes. In April, the Event Horizon Telescope team published the first-ever image of a black hole.

The unprecedented photo shows the supermassive black hole at the center of the Messier 87 galaxy, which is about 54 million light-years away from Earth. The black hole's mass is equivalent to 6.5 billion suns.

Though the image is somewhat fuzzy, it showed that, as predicted, black holes look like dark spheres surrounded by a glowing ring of light.

Scientists struggled for decades to capture a black hole on camera, since black holes distort space-time, ensuring that nothing can break free of their gravitational pull, not even light. That's why the image shows a unique shadow in the form of a perfect circle at the center.

A catastrophic collision nearly a billion years ago created ripples in space-time, also known as gravitational waves. They passed through Earth this year.

This was the third event scientists observed using gravitational-wave detectors. In 2015, researchers detected waves from the collision of two black holes, and in 2017 they observed two neutron stars merging.

Einstein predicted the existence of gravitational waves in 1915, but thought they'd be too weak to ever pick up on Earth. New tools have proved otherwise.

This year saw many innovations in space-travel technology, too. In March, SpaceX launched Crew Dragon, a commercial spaceship designed for NASA astronauts, into orbit for the first time.

The maiden flight of Crew Dragon marked the first time that a commercial spaceship designed for humans has left Earth.

It was also the first time in eight years that any American spaceship made for people launched into orbit. Crew Dragon's successful test flight was a critical milestone for the US. Since NASA retired its fleet of space shuttles in 2011, the US has relied on Russian rockets and ships to taxi astronauts to and from the ISS.

Scientists also successfully harnessed the power of sunlight to propel a spacecraft.

This summer, the Planetary Society, led by science communicator Bill Nye, launched a satellite called LightSail 2 into orbit, where it then unfurled a 344-square-foot solar sail.

As light particles reflect off that sail, they transfer momentum to the spacecraft.

A spacecraft that utilizes a solar sail in this way has an almost unlimited supply of energy. Advancing this type of propulsion technology could one day help spacecraft reach nearby star systems that aren't currently accessible due to the finite amount of fuel we can launch off the planet.
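For a sense of scale, the photon-pressure force on an ideal, perfectly reflective sail facing the Sun is F = 2PA/c. Using textbook values for the solar constant and the speed of light (nothing beyond the 344-square-foot sail area comes from the article), a rough estimate of the thrust on LightSail 2 might look like this:

```python
# Back-of-the-envelope estimate of sunlight thrust on a LightSail 2-sized
# sail near Earth. Assumes perfect reflection and the sail facing the Sun;
# the real spacecraft sees less than this ideal figure.
SOLAR_IRRADIANCE = 1361.0   # W/m^2, the solar constant near Earth (assumed)
SPEED_OF_LIGHT = 2.998e8    # m/s
SAIL_AREA_M2 = 344.0 * 0.09290304  # 344 sq ft (from the article) in m^2

# Perfect reflection transfers twice each photon's momentum: F = 2 * P * A / c
force_newtons = 2 * SOLAR_IRRADIANCE * SAIL_AREA_M2 / SPEED_OF_LIGHT
# A fraction of a millinewton: tiny, but continuous and fuel-free.
```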

On Earth, scientists have made monumental, though often troubling, discoveries. Climate researchers found that the Antarctic and Greenland ice sheets are melting at unprecedented rates.

In April, a study revealed that the Greenland ice sheet is sloughing off an average of 286 billion tons of ice per year. Two decades ago, the annual average was just 50 billion.

In 2012, Greenland lost more than 400 billion tons of ice.

Antarctica, meanwhile, lost an average of 252 billion tons of ice per year in the last decade. In the 1980s, by comparison, Antarctica lost 40 billion tons of ice annually.

What's more, parts of Thwaites Glacier in western Antarctica are retreating by up to 2,625 feet per year, contributing to 4% of sea-level rise worldwide. A study published in July suggested that Thwaites' melting is a time bomb that is likely approaching an irreversible point, after which the entire glacier could collapse into the ocean. If that happened, global sea levels would rise by more than 1.5 feet.

Researchers' predictions about coming sea-level rise are getting more accurate and scarier. Estimates suggest the world's oceans could rise 3 feet by 2100.

A September report from the United Nations Intergovernmental Panel on Climate Change projected that sea levels could rise by more than 3 feet by the end of the century. The rising water could affect hundreds of millions of people who live on small islands and in coastal regions.

Another study suggested that the number of people displaced by sea-level rise could reach 630 million if greenhouse-gas emissions continue to rise through 2100.

Another landmark UN report revealed that between 500,000 and 1 million plant and animals species face extinction, many within decades.

The report, published in April, estimated that 40% of amphibian species, more than 33% of all marine mammals and reef-forming corals, and at least 10% of insect species are threatened, largely as a result of human actions. Researchers also found that more than 500,000 land species already don't have enough natural habitat left to ensure their long-term survival.

This finding contributes to a rapidly growing body of evidence that suggests Earth is in the midst of a sixth mass extinction, the sixth time in the planet's history that species have experienced a major global collapse in numbers.

One nearly long-lost species, however, emerged from the wilderness this year. In June, scientists spotted a giant squid in its deep-sea habitat in the Gulf of Mexico.

The giant squid, which inspired the legend of the Kraken monster, has only been caught on video one other time. The creatures almost never leave the icy depths of their habitat, up to 3,300 feet (about 1,000 meters) beneath the waves.

In 2012, scientists from Japan's National Museum of Nature and Science filmed a giant squid in its natural habitat in the Ogasawara archipelago.

Another hidden part of nature, a lost continent, was found hiding under Europe.

Hundreds of millions of years ago, Earth had one giant supercontinent named Pangea, which eventually broke up into our modern-day continents. A recent study showed that in that process, an eighth continent slid under what is now southern Europe about 120 million years ago.

It's still hidden deep within the Earth.

The researchers named this continent Greater Adria. Its uppermost regions formed mountain ranges across Europe, like the Alps.

Anthropologists dug deep into the Earth to make incredible discoveries in 2019. In August, researchers announced they'd found the oldest skull ever seen from one of our human ancestors.

The nearly intact skull, which belonged to the species Australopithecus anamensis, is 3.8 million years old. The fossil, nicknamed "MRD," revealed that these ancient people had protruding faces with prominent foreheads and cheekbones, much like other Australopithecus species in the fossil record.

"The MRD find is an iconic cranium," paleoanthropologist Tim White told Nature.

MRD's age also suggested that these human ancestors coexisted with another species of human ancestor, Australopithecus afarensis, for at least 100,000 years. The nearly complete skeleton "Lucy" was a member of the latter group, which roamed Africa between 3.9 million and 3 million years ago.

In April, anthropologists discovered teeth and a finger bone from a new species of human ancestor.

The new species, named Homo luzonensis after the Philippine island on which it was discovered, lived between 50,000 and 67,000 years ago.

A study described how this human ancestor shared traits with older human ancestors like Australopithecus and Homo erectus, as well as with modern-day humans.


Review: In the Vale is full of characters who linger long in the mind – Nation.Cymru

In the Vale. Background image, Parish Church, St Nicholas, Vale of Glamorgan/ Picture by John Lord (CC BY-SA 2.0)

Jon Gower

The eponymous vale of this evocative historical novel is the Vale of Glamorgan, though at times it might just as easily be the Vale of Tears, as children die prematurely, hunger stalks the land and the shadowy clouds of the Napoleonic Wars lengthen.

The central characters are George, a cash-strapped curate, and his wife Sarah, who conveniently marries him after becoming pregnant with the child of a local landowner, Richard Aubrey. Hiding his paternity, Aubrey pulls some strings and George is gifted the ancient church at Llantrithyd to look after, and thus a scant but secure living, as this is still a period when the church was granted tithes from its parishioners. They tilled the land and the churchmen prayed for them in turn, and took their scant money.

With the sure authority of Anthony Trollope writing about the clergy of Barchester, Adams details the life of the parish and a clergyman's responsibilities therein. He is especially good on the patterns, rigours and terminology of the agricultural life. So we find George weeding a barley crop: "His companion in labour was one of a pair of oxen from the hovel, which needed the flick of a long, thin whip on its flank as a reminder to lift one broad hoof after another with the same measured pace as the share would spill out fresh furrows through the day."

On a warm day in August, the communal act of haymaking sees landowners and tenants alike working in tandem, having fallen into line and into the rhythm of cutting, "the scythe close to the ground, arms straight, the body twisting left to right, slicing the grass neatly, adding stroke by stroke to the swathe lengthening across the field."

All this detail, all these rhythms of the farming year are delivered with quiet authority. Here is a novelist who knows how to employ his research without show or flourish, allowing his evocations of London life, or autumnal shooting parties out to bag hares and partridges, or the rituals of rural funerals to settle fully in the reader's mind.

Defences

Sam Adams is a long-established poet as well as a novelist so naturally, the book is shot through with passages of quiet lyricism:

Nearing Penlline, the lane climbed steeply and passed between tall trees and hedges so high and weighted by foliage they bowed inwards, hiding the sky. In the damp warmth of May, blossoms on the hedge banks, primrose and dandelion, bluebell and celandine, strained up towards chinks of light in the canopy.

One of the unexpected subjects of the book is that of inoculation, as Jenner's ideas for ensuring children should not be affected by the disfigurements and threat of death by smallpox are trialled by George and Sarah on their child Bloom, with tragic consequences. Life and death, plantings and harvests, blossoming and decay: these are the patternings of the novel.

As the story proceeds so too do Britain's preparations to deal with Napoleon's commandeering of Europe, and a threat to British shores. Richard Aubrey becomes the commandant of the Glamorgan Militia, which finds itself on the south coast of England, digging underground defences to ward off French invasion across the channel.

For a brief while, George joins him as a curate, where he learns the men are a different breed. When he asks them where they come from, which parishes and who their vicars are, one of them replies: "No vicars," said the Dowlais man, "chapel us." And, as work began, "the deep voice from the gloom rose again in a hymn tune, in a moment taken up it seemed by the whole body of the men, and the tune, the strong Welsh words, followed the chaplain…"

Meanwhile, back in Wales, the effect of the war is felt deeply as bread prices rise and food shortages spread and many already poor people find themselves on starvation rations. A rebellious spirit spreads among them and George, in his role as magistrate, has to stand up to them and in particular to one of the ringleaders, his surly, obnoxious gardener who turns out to have the blackest heart of all the book's characters.

The Vale offers an absorbing insight into life in both grand houses and near-derelict hovels, setting the lives of working folk in a small corner of Wales in the context of sweeping events such as wars in Spain or Napoleon's attempt to annexe Austria.

Sam Adams has won his spurs as a writer about history with previous works such as the novel Prichard's Nose and Where the Stream Ran Red, a memoir of life in Gilfach Goch, the mining village where he grew up. Here, once again he proves himself to be an artist who is happy to work with a broad and busy canvas, filling it with finely detailed landscapes and cityscapes and characters who linger a long time in the mind.

Sam Adams himself has been writing since the 1960s and this latest novel shows no signs of his tale-teller's gifts diminishing. In fact, quite the opposite.

In the Vale by Sam Adams costs £9.99 and can be bought here.


10 Gifts That Cater to Your Loved One’s Basic Senses – Wide Open Country

I feel extremely blessed to be born with five working senses. I don't take them for granted, and I sure do love gifts that allow me to feel thankful for them. For each sense, there are gifts you can never go wrong with. For sight, it's movies. I can watch movies all day. For touch, it's soft throw blankets. Fuzzy blankets definitely pair with a good movie. For hearing, it's music. CDs, vinyl, and even concert tickets are always welcomed.

When it comes to smell, I'm a sucker for Yankee Candles. I'll never get over someone telling me that my house smells good. It's a wonderful compliment, and I love having the smell of vanilla candles linger on me. Last but not least, taste. Well, I'm a big eater. Take me to dinner, bake me a cake, or send me chocolate covered strawberries. I'll devour it all. Gift-giving feels good when you put a lot of thought into it, and keeping a loved one's senses in mind adds a sentimental touch that makes a gift exchange feel magical.

Headphones are a perfect gift for any time of the year. Wires get faulty, and there's always a new feature on these gadgets that makes us want an upgraded option. These Cowin Bluetooth headphones have a noise-canceling feature. What's not to love about them?

For a smaller gadget, consider AirPods.

I am obsessed with my Google Nest Mini. It's in the living room with me all day so I can ask it to play my favorite podcasts and songs while I work. Then it sits in the bathroom later in the evening so I can sit in my lavender bubble bath and ask Google to play my favorite songs. I don't miss having to skip songs manually, or having to find playlists on my own. Your Google Assistant will do it for you.

Many people are choosing to cut the cord and I don't blame them. As long as you have an HDTV and internet connection, you can enjoy streaming apps such as Netflix, YouTube, Prime Video, Disney+, HBO and more. It makes a great birthday gift or Christmas gift.

A camera is a thoughtful gift for the nostalgic woman in your life. Especially a Polaroid camera. What I like about Polaroid cameras is that I'm able to give the pictures to my loved ones right after taking them. Polaroid pictures also make great DIY activities. Making collages or hanging them from clothespins is a great way to display fun and candid photos. Be sure to check out more instant print cameras on Amazon and Walmart.

Men can be picky about cologne, so I always play it safe and get a nice car freshener for their vehicle. This one smells like whisky and oak. Surely the man in your life will love it! This is a great stocking stuffer or something great to include in a 5 senses gift basket.

Essential oils are really something! It's crazy what these oils can do for you when you're congested or can't sleep. I recommend lavender oil in the diffuser when you need to sit back and relax.

A restaurant gift card is perfect for your significant other. They will definitely keep it in mind for your next date night or Valentine's Day. Sharing bread with the people you love is a great feeling. Especially sharing Texas Roadhouse rolls. That butter is addicting!

Ladies, I think we all love devouring chocolate here and there. Chocolates are my go-to snack for a Lifetime movie marathon. Add a bit of bourbon to the chocolate and I am having a blast. These make great gifts for the woman who enjoys a lazy Saturday afternoon on the couch.

Who doesn't love a foot massage? Leave the massage oil and candles for an anniversary gift; this foot massager is perfect for an everyday foot rub. There are 18 nodes for a deep massage, so you'll be able to find a setting that really takes the edge off.

Skincare is a vital nighttime routine for me. I love to lotion my body up before bed. Especially my feet! (You'll sleep better with moisturized feet.) Most importantly, I take care of my face. At night, I want my skin to be clean and have all of the necessary serums and moisture to keep my face glowing and even. Vitamin B and C are great for glowy even skin, but it's important to top it off with a good moisturizer.

The 5 senses gift concept really says, "I love you." It's so wild to me that thinking of a loved one's basic senses amps up a gift. Five senses gifts are perfect romantic gifts for Valentine's Day, unique gifts for birthdays, and even sweet gifts for the holidays. Keep the senses in mind next time you're shopping for family and friends.



Re: Your Account Is Overdrawn – Thrive Global

Bigger, better, more and now.

This is the society we live in today. Christmas trees up before Halloween, shopping, parties, spending, eating, drinking, rushing, worrying, all in the name of holidays.

The word holiday comes from the word haligdaeg (holy day) and refers to a day of rest & relaxation, without work. And we work so hard for that day without work.

And then, in the blink of an eye, it's here. Christmas morning, the kids rush down the stairs, begging to open gifts. We desperately pour another cup of coffee, and act surprised by what Santa left under the tree. We watch as they tear open gift after gift. Through tired eyes, we smile from the inside out seeing the joy on their faces. And then, in an instant, it's over. We're left with an empty tree, a floor covered in paper, and decorations demanding to be taken down. And despite all the smiles, we can't help but feel a little deflated.

But what if there was more? What if the end was only the beginning? What if we've been doing it wrong this whole time?

Now before you start saying, "More?! No. I'm done. I can't take any more, Alicia!" just hear me out. I'd like to propose we start a new tradition in 2020. Bigger, better, longer. I call it "Big Rest."

And heres how it works:

1. Question everything:

You know that obnoxious Christmas song about the 12 Days of Christmas that makes no sense? Did you know that the 12 days don't end on Christmas? The 12 days begin on Christmas Day and last through January 6th (Three Kings Day). So what does that mean?

"But I don't celebrate Christmas, Alicia."

That's OK. Christmas came from the Yule Festival, a Nordic tradition celebrated in Scandinavia and other parts of northern Europe. It begins on the winter solstice (roughly December 21 or 22) and lasts for… 12 days! Later, Christians adapted these traditions to the celebration of Christ's birth.

The point is, this isn't about traditions or religion. It's about digging a little deeper into why we do the things we do. And just because we've always done things a certain way doesn't mean we can't change. After all, don't we love bigger, better, faster and more?

2. Lean In:

Now that our holiday boundaries have expanded, let's take a deep breath and lean into the season longer. We, in the Northern Hemisphere, are in the middle of the winter season. It neither begins nor ends with the holiday.

"The winter solstice time is no longer celebrated as it once was, with the understanding that this is a period of descent and rest… like nature and the animal kingdom around us, this time of hibernation is so necessary for our tired limbs, our burdened minds… people are left feeling that winter is hard, because for those of us without burning fires and big festive families, it can be lonely and isolating. But winter is kind. She points us in her quiet, soft way towards our inner self, towards this annual time of peace and reflection, embracing the darkness and forgiving, accepting and embracing the past year… and then, just around the corner, the new year will begin again, and like a seed planted deep in the earth, we will all rise with renewed energy once again."

-Bridget Anna McNeil

3. Rest:

In the month of December, we rush and eat and spend and consume, and suddenly, it's all over in a day. Then we hit the ground running on January 1st and resume our busy lives. Pay off the credit card, lose weight, make more money, hit the quota.

What if instead of ending and beginning the year in a rush, we ended and began with rest? What if we embraced winter, from December through February, as a time of peace and renewal? What if instead of reacting to the previous year, we took time to prepare for the next one? To recharge, to listen, to read, to explore, to plan.

Everything in nature is cyclical: phases of the moon, the seasons, even our own bodies. So why, then, do we constantly push and rush and drive without ceasing? The moon isn't full all the time. The earth isn't in spring year-round (unless you live in San Diego). The sun doesn't shine 24 hours a day. No, the sun sets, and we rest. In fact, we should spend about 30% of our day in rest (approximately 8 hours). And yet we view it as a burden to productivity instead of the fuel that enables us to create.

When we sleep, our bodies perform vital functions to recharge, replenish, and heal themselves. And all of this happens without our conscious mind to plan, execute, manage and analyze: when we rest, our bodies recharge with no effort from us.

So, if you're left feeling drained at the end of the year, like an overdrawn bank account, I challenge you to prioritize Big Rest in 2020. Let go of unceasing effort, lean in, and trust the power and process of rest. Bigger, better, more and now.

To help illustrate the power of resting a little longer, I'll leave you with this old but relevant '80s commercial featuring the kid from A Christmas Story.

Happy Holidays! May they be full of rest and last a little longer.


Want to dive into the lucrative world of deep learning? Take this $29 class. – Mashable

Just to let you know, if you buy something featured here, Mashable might earn an affiliate commission. From AI to sentiment analysis, the Ultimate Deep Learning class covers it all.

Image: pexels

By StackCommerce, Mashable Shopping, 2019-12-24 10:00:00 UTC

TL;DR: Become a machine learning expert with The Ultimate Deep Learning and NLP Certification Bundle for $29, a 97% savings.

Artificial intelligence and deep learning might bring to mind pictures of Will Smith's misadventures in I, Robot, but in reality, these are the technologies behind tomorrow's world. In fact, by 2030, PwC predicts nearly 40 percent of all U.S. jobs could be replaced by AI and automation.

That's a bit frightening, but if you're one of the pros who understands the technology behind these next-generation concepts, you'll be in good shape. And luckily, this Ultimate Deep Learning and NLP Certification Bundle is on sale and a great place to start.

This bundle provides six premium courses and over 300 lessons introducing you to machine learning, neural networks, and core tools like Keras, TensorFlow, and Python. You'll get a core understanding of deep learning from a practical viewpoint and deal with issues that newcomers to the field face all for just $29.

Once you get the gist of neural networks, you'll dive deeper into NLP, which helps computers understand, analyze, and manipulate human language. And eventually, you'll build your own applications for problems like text classification, neural machine translation, and stock prediction.

Usually retailing for $1,200, this learning bundle is currently on sale for just $29. And don't worry; you can complete all of the lessons on your own time and with lifetime access, you can return to it anytime you need to.


AI from Google is helping identify animals deep in the rainforest – Euronews

A simple device, just a heat and movement sensor attached to a digital camera, has revolutionised the way that conservationists learn about animals in the wild.

Camera traps are a very simple solution to the task of working out when, where and how wildlife interacts with its environment. Monitoring populations without damaging habitats, these relatively simple devices have provided some astonishing finds, including revealing species previously hidden in the untouched depths of the forest. Elusive new creatures aren't their only speciality, however: in 2015, similar devices helped reveal that the critically endangered Javan rhinoceros was breeding and significantly adding to its tiny population.

After identifying a likely area for a sighting, usually with the help of local guides, traps are placed at animal height on trees and posts and left to wait until wildlife walks by. A memory card stores the images for conservationists to fetch, with these cameras often deployed for months at a time and collecting hundreds of thousands of images with invaluable information about biodiversity.

And herein lies the problem. Whilst the traps have allowed millions of photographs worth of data to be amassed, the time it takes to weed out empty frames and label the species that cross their lens is astronomical. The delay in processing data can mean that vital information about biodiversity could be out of date before it can be used.

Watch More | Baby dolphin trapped in debris rescued from drowning

Tanya Birch of Google Earth Outreach and Jorge Ahumada from Conservation International explain in a blog post that, while a highly trained human being can label somewhere between 300 and 1,000 of these images in an hour, new AI technology could analyze 3.6 million. Despite the challenging task of identifying species, Wildlife Insights is pretty accurate as well. Looking at data from the 614 species it has currently been trained on, the chance that it will guess correctly is over 80%. Perhaps more importantly, the AI system is very good at removing camera trap images that feature no animals at all, which can be up to 80% of the images taken.
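The triage workflow described above can be sketched in a few lines of Python. This is only an illustration of the idea, not the Wildlife Insights implementation: the `classify` function is a hypothetical stand-in for a trained neural network, and the filenames, labels and confidence threshold are invented for the example.

```python
def classify(image):
    """Hypothetical stand-in for a trained model: returns (label, confidence).

    A real system would run a neural network over the pixels; here we
    look the answer up in a table so the pipeline itself can run.
    """
    fake_model = {
        "frame_001.jpg": ("empty", 0.97),
        "frame_002.jpg": ("jaguar", 0.91),
        "frame_003.jpg": ("tapir", 0.55),  # low confidence
    }
    return fake_model.get(image, ("empty", 0.99))


def triage(images, min_confidence=0.8):
    """Split frames into auto-labelled detections and frames for human review."""
    detections, needs_review = [], []
    for image in images:
        label, conf = classify(image)
        if label == "empty":
            continue  # discard empty frames (often the bulk of the footage)
        if conf >= min_confidence:
            detections.append((image, label))
        else:
            needs_review.append(image)  # uncertain: leave to a human labeller
    return detections, needs_review


detections, review = triage(["frame_001.jpg", "frame_002.jpg", "frame_003.jpg"])
print(detections)  # [('frame_002.jpg', 'jaguar')]
print(review)      # ['frame_003.jpg']
```

The speed-up comes from the first branch: if most frames are classified as empty and dropped automatically, human labellers only ever see the small remainder.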

This rapid computer analysis would make camera trap data available in near real-time, removing the delay in processing that can make performing even basic research an incredibly slow process. Speed is vital in conservation to allow scientists and officials to react to ecological changes and rapid environmental disruption. Making this wealth of information rapidly available will help to support those attempting to save critically endangered species and their habitats.

Google has highlighted the story of scientists in Colombia who, after the demobilization of the FARC in 2017, have been gradually rediscovering parts of the Amazon that they could not enter during the conflict in the region. The short film shows how field biologist Angélica Diaz-Pulido of the Instituto Humboldt is using camera traps to monitor umbrella species like the jaguar in the forest that skirts the unique Rainbow River region.

"The challenges we biodiversity researchers currently face are focused on carrying out research quickly and ensuring it gets to decision-makers in time," Diaz-Pulido explains in the short film. Threats to forests in this region are looming, and intervention needs data to be processed rapidly so that recommendations can be passed on to the Minister of Environment. Leaving camera traps for just one month, however, can generate nearly 150,000 images, which would take the Instituto Humboldt's researchers up to a year to sort through.

Machine learning helps to speed up the processing of this massive amount of data so that the scientists on the raw edge of this impending mass extinction can use the most recent biodiversity figures to inform policymaking. This speedy analysis could help save species on the brink of extinction by getting the news of their decline out there before the images become irrelevant.

Read More | I visited an ecoresort helping to protect Brazil's biodiversity

The launch of Wildlife Insights not only revolutionises the use of camera trap data by scientists but also allows open access to 4.5 million photos that date back to 1990. Ahumada hopes that the platform will help close the gap between the revolution technology has brought about in allowing this data to be captured and our inability to use it. "AI can dramatically increase the speed at which camera traps can be processed and analyzed," he told euronews Living. "This will allow people to quickly assess wildlife populations from camera trap data and focus on actions that can help save and manage these species."

By making the data accessible, Ahumada hopes that everyone from managers of anti-poaching programmes to governments to children will use the wealth of images to identify wildlife in their area and grow the database. "We want citizen scientists, teachers and children to use the platform," he says. "These are the future generations who will benefit from wildlife conservation."

Using AI to process camera trap imagery is by no means a new idea. Recently DeepMind detailed their work with Tanzania's Serengeti National Park, which will see the technology used to identify behaviours impacted by encroaching human activity in the area. As with many other projects, processing the millions of images collected by the team previously relied on volunteers using Zooniverse, a site where members of the public help scientists by identifying and counting species.


Inside The Political Mind Of Jerry Brown – Radio Ink

The former California governor, now retired, has strong feelings about what's needed for California and for the country, and about his own legacy. San Francisco Bay Area's KQED has produced The Political Mind of Jerry Brown, a limited-run radio and podcast series hosted by KQED's Senior Editor for Politics and Government Scott Shafer.

KQED producer Guy Marzorati and historians from UC Berkeley's Oral History Center spent over 40 hours interviewing Brown at his ranch for the series.

"This is a deep dive into the life and career of someone who has had a profound impact on California over the course of 50 years," Shafer said. "Our conversations reveal a reflective side of Jerry Brown: how he plotted his rise in politics, how he both leveraged his father's name and distanced himself from Pat Brown's approach to politics, and what he learned from the mistakes he made along the way."

The Political Mind of Jerry Brown premieres January 8 with hour-long episodes on KQED 88.5 FM on Wednesdays at 8 p.m. through January 29. Listeners can binge on the entire series on Apple Podcasts, NPR One, and other podcast platforms beginning January 11.

Listen to the trailer HERE.

Shafer and Brown will participate in a live one-on-one conversation at the Herbst Theatre in San Francisco on Monday, January 13. For tickets and information go HERE.
