
Uncovering the Long-Term Cloud Archive Equation – TV Technology

Deciding, for the long run, where to keep an organization's media-centric assets (e.g., original content, EDLs, finished masters, copies, versions, and releases) is reshaping how data is stored and archived. The continued popularity of cloud-based solutions for ingest/content collection, playout from the cloud, and processing in a virtual environment leads many to rethink storage in the cloud.


Yet cloud concerns still leave the door open to storage alternatives, tier-based migration, automated management and more.

What are the possible alternatives? Backup or archive? On-prem or in a co-lo? Private, hybrid or public cloud? Knowing the differences could change how you approach archive management, regardless of the size, location or types of data libraries.

BACK IT UP OR ARCHIVE IT

Let's look first at the differences between backup and archive.

Backup is a duplicate copy of data used to restore lost or corrupted data in the event of unexpected damage or catastrophic loss. By definition, all original data is retained even after a backup is created; original data is seldom deleted. Many back up just in case something happens to the original version, while for others it is a routine process, mandated either by policy or because they've previously suffered through a data disaster and pledged never to live through that again.

On a small scale, for a local workstation or laptop, practice suggests a nightly copy of the computer's data be created on another storage medium, e.g., a NAS or a portable 4-8 TB drive. Travel makes this difficult, so alternative online solutions prevail.

Businesses routinely back up their file servers (as unstructured data) and their databases (as structured data) as a precaution against a short-term issue, such as data on a local drive being corrupted. Snapshots or images of an entire drive (OS, applications and data) are often suggested by administrators, software vendors, and portable hard drive manufacturers.

Incremental backups, which copy only data that is new or has changed since the last backup, are made because of the large storage volumes and the time a full backup requires.
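As a sketch of the idea, an incremental pass can compare each file's modification time against the last backup run and copy only what changed (illustrative Python; real backup tools also track deletions, keep catalogs, and verify checksums):

```python
import os
import shutil

def incremental_backup(src_dir, dest_dir, last_backup_time):
    """Copy only files modified since the last backup run.

    last_backup_time is a Unix timestamp from the previous run.
    Returns the relative paths of the files that were copied.
    """
    copied = []
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            src_path = os.path.join(root, name)
            if os.path.getmtime(src_path) > last_backup_time:
                rel = os.path.relpath(src_path, src_dir)
                dest_path = os.path.join(dest_dir, rel)
                os.makedirs(os.path.dirname(dest_path), exist_ok=True)
                shutil.copy2(src_path, dest_path)  # copy2 preserves timestamps
                copied.append(rel)
    return copied
```

A full backup is then just this pass with `last_backup_time = 0`.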

STORAGE AS A SERVICE

Archived data could be placed on a local NAS, transportable disk drive, an on-prem protected storage array partition, or linear data tape. Since an archive is about putting the data on a shelf (so to speak), the choices vary based on need.

Cloud archiving is about storage as a service, and is intended for long-term retention and preservation of data assets. Archives are where data isn't easily accessed and remains for a long, uninterrupted time.

Archiving used to mean pushing data to a digital linear tape (DLT) drive and shipping those tapes to an Iron Mountain-like storage vault. In this model, recovering any needed data was a lengthy process: retrieving the information from the vault, copying it to another tape, putting it onto a truck and returning the tape to the mothership, where it was then copied back onto local storage, indexed against a database (MAM), and made available on a NAS or SAN.

This method involves risks, including the loss of or damage to tapes in transport, tapes that went bad, and data that was corrupted when first written to tape. Obsolescence of either the tape media or the actual drives meant that every few years a refresh of the data tapes was required, adding more risk of data corruption or other unknowns.

As technology moved onward, robotic tape libraries pushed the process to creating two (tape) copies: one for the on-prem applications and one to place safely in a vault under a mountain somewhere. While this reduced some risks, such as by placing duplicate copies in storage at diverse locations, it didn't eliminate the refresh cycle, and it meant additional handling (shipping tapes back from the vault for updating). Refresh always added costs: tape library management, refresh-cycle transport, tape updates including drives and physical media, plus the labor to perform those migrations and updates.

STORAGE OVERLOAD

Images keep getting larger. Formats above HD are now common and quality keeps improving, pushing native-format editing (true 4K and 5K) upward; add dozens to hundreds more releases per program, and the storage archive equation is in continual flux. As files get larger, the amount of physical media needed to store a master, along with protect copies and one or more archive copies of every file, is causing storage overload. Decisions about what to archive are balanced against costs and the unpredictable reality that the content may never need to be accessed again.

High-capacity, on-prem storage vaults can only grow so large; a hardware refresh on thousands of hard drives every couple of years can be overwhelming from a cost and labor perspective. Object-based storage is solving some of those space and protection issues, but having all your organization's prime asset eggs in one basket is risky and not very smart business. This opens the door to cloud-based archiving.

MANY SHAPES AND SIZES

Fee-based products such as iCloud, Carbonite and Dropbox are good for some. These products, while cloud based, have varying schemes and work well for many users or businesses. Private iPhone users get Apple iCloud almost cost-free, but with limited storage sizes. Other users prefer interfaces built specifically for a drive or computer device, with unlimited storage or file counts.

So why pick one service over another? Is one a better long-term solution than another? Do we really want an archive, or a readily accessible copy of the data in case of an HDD crash?

One common denominator of most commercial backup/archive services is that they keep your data in the cloud. Data is generally accessible from any location with an internet connection and is replicated in at least three locations. However, getting your data back (from a less costly archive) has a number of cost- and non-cost-based considerations. Recovering a few files or photos is relatively straightforward, but getting gigabytes of files back is another question. So beware of what you sign up for, and know what you're paying for and why.

Beyond these common points is where the divisions in capabilities, accessibility, cost, serviceability and reliability become the key indicators that in turn drive differing uses or applications.

ACCESS FROM THE CLOUD

Cloud-stored data costs have a direct relationship to how accessible the data is and how quickly it can be retrieved.

If you rarely need the data and can afford to wait a dozen hours or more for recovery, choose a deep-storage or cold-storage solution. If you simply need more physical storage and intend regular daily or weekly access, select a near-term (probably not archive) storage solution. Options include on-prem (limited storage, rapid accessibility) or off-prem in a co-located (co-lo) storage environment, referred to as a private cloud, or alternatively a public (commercial) cloud provider such as Google, Azure, AWS, IBM or others.
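The decision rule above can be sketched as a small lookup. The tier names and thresholds here are illustrative assumptions, standing in for provider-specific labels such as S3 Standard, Standard-IA, Glacier, and Glacier Deep Archive:

```python
def pick_storage_tier(accesses_per_month, max_retrieval_hours):
    """Map an expected access pattern to a generic storage tier.

    accesses_per_month: how often you expect to read the data back.
    max_retrieval_hours: the longest you can afford to wait for recovery.
    Tier names are illustrative, not any provider's terminology.
    """
    if accesses_per_month >= 4:        # regular daily or weekly access
        return "hot"                   # on-prem or standard cloud storage
    if max_retrieval_hours >= 12:      # can wait half a day or more
        return "deep-archive"          # cheapest per GB, slowest retrieval
    if max_retrieval_hours >= 1:
        return "cold"                  # archive tier with faster restores
    return "nearline"                  # infrequent access, rapid retrieval
```

For example, a finished master you may never touch again maps to "deep-archive", while working material accessed weekly stays "hot".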

Cloud services continue to grow in usage and popularity, yet there remains a degree of confusion as to which kind of cloud service to deploy, and for which kinds of assets. Many users prefer regular access to their archived material; for that access pattern, deep or cold storage is the wrong approach, and it can be more costly (as much as 4:1) than a short-term, easily accessible environment (a temporary storage bucket).

Fig. 1: Example of sharing storage services among varying cloud providers and for multiple purposes, some on-prem, some in the cloud. Concept courtesy of Spectra Logic.

Fig. 1 shows a hybrid managed solution with local cache, multiple cloud storage providers, and local/on-prem primary archive serviced by an object-based storage bucket manager. The concept allows migration, protection, and even retention of existing storage subsystems.

CHOICES AND DECISIONS

Picking cloud service providers for your archive is no easy decision; comparisons of services and costs can be like selecting a gas or electricity provider. Plans change, sometimes often. Signing onto a deep archive becomes a long-term commitment, due primarily to the cost of retrieving the data, even though the initial upload costs are much lower. If your workflows demand continual data migration, don't pick a deep or cold-storage plan; look at a near-line solution instead. Be wary of long-term, multi-year contracts; cloud vendors are very competitive, offering advantages that can adjust annually.

The amount of data you store will continually grow. Be selective about the types of data stored and the duration you expect to keep that data, and choose wisely as to what is really needed. Your organization's policies may dictate "store everything," so know what the legal implications are, if any.

Carefully look at the total cost of ownership (TCO) of the platform, then weigh the long-term vs. short-term model appropriately.

Karl Paulsen is CTO at Diversified and a SMPTE Fellow. He is a frequent contributor to TV Technology, focusing on emerging technologies and workflows for the industry. Contact Karl at kpaulsen@diversifiedus.com.


Is Cloud Computing the Answer to Genomics Big Data… – Labiotech.eu

The success of the genomics industry has led to generation of huge amounts of sequence data. If put to good use, this information has the potential to revolutionize medicine, but the expense of the high-powered computers needed to achieve this is making full exploitation of the data difficult. Could cloud computing be the answer?

Over the last decade, genomics has become the backbone of drug discovery. It has allowed scientists to develop more targeted therapies, boosting the chances of successful clinical trials. In 2018 alone, over 40% of FDA-approved drugs could be personalized to patients, largely based on genomics data. As that percentage has doubled over the past four years, the trend is unlikely to slow down anytime soon.

The ever-increasing use of genomics in the realm of drug discovery and personalized treatments can be traced back to two significant developments over the past decade: plunging sequencing costs and, consequently, an explosion of data.

As sequencing technologies are constantly evolving and being optimized, the cost of sequencing a genome has plummeted. The first sequenced genome, part of the Human Genome Project, cost €2.4B and took around 13 years to complete. Fast forward to today, and you can get your genome sequenced in less than a day for under €900.

According to the Global Alliance for Genomics and Health, more than 100 million genomes will have been sequenced in a healthcare setting by 2025. Most of these genomes will be sequenced as part of large-scale genomic projects stemming from both big pharma and national population genomics initiatives. These efforts are already garnering immense quantities of data that are only likely to increase over time. With the right analysis and interpretation, this information could push precision medicine into a new golden age.

Are we ready to deal with enormous quantities of data?

Genomics is now considered a legitimate big data field: just one whole human genome sequence produces approximately 200 gigabytes of raw data. If we manage to sequence 100 million genomes by 2025, we will have accumulated over 20 billion gigabytes of raw data. The massive amount of data can partially be managed through data compression technologies, from companies such as Petagene, but that doesn't solve the whole problem.

What's more, sequencing is futile unless each genome is thoroughly analyzed to achieve meaningful scientific insights. Genomics data analysis normally generates an additional 100 gigabytes of data per genome for downstream analysis, and requires massive computing power supported by large computer clusters, a feat that is economically unfeasible for the majority of companies and institutions.
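The raw arithmetic behind these figures is easy to check, using the article's own per-genome numbers as a back-of-envelope sketch:

```python
# Back-of-envelope totals using the article's figures
# (a sketch, not independent measurements).
GENOMES = 100_000_000        # genomes projected in healthcare by 2025
RAW_GB = 200                 # raw data per genome
ANALYSIS_GB = 100            # extra downstream data per genome

raw_total = GENOMES * RAW_GB                    # GB of raw data alone
full_total = GENOMES * (RAW_GB + ANALYSIS_GB)   # raw plus analysis output

# 1 exabyte = 1e9 GB, so 20 billion GB of raw data is ~20 exabytes,
# rising to ~30 exabytes once downstream analysis is included.
print(f"raw: {raw_total / 1e9:.0f} billion GB, "
      f"with analysis: {full_total / 1e9:.0f} billion GB")
```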

Researchers working with large genomics datasets have been searching for other solutions, because relying solely on such high-performance computing (HPC) clusters for data analysis is economically out of the question for many. Large servers require exorbitant amounts of capital upfront and incur significant maintenance overheads. Not to mention, specialized high-end hardware, such as graphics processing units, requires constant upgrades to remain performant.

Furthermore, as most HPCs have different configurations, ranging from technical specs to required software, the reproducibility of genomics analyses across different infrastructures is not a trivial feat.

Cloud computing: a data solution for small companies

Cloud computing has emerged as a viable way to analyze large datasets fast without having to worry about maintaining and upgrading servers. Simply put, cloud computing is a pay-as-you-go model that lets you rent computational power and storage, and it's pervasive across many different sectors.

According to Univa, an industry leader in workload scheduling in the cloud and HPC, more than 90% of organizations requiring high-performance computing capacity have moved, or are looking into moving, to the cloud. Although this is not specific to companies in the life sciences, Gary Tyreman, Univa's CEO, suggests that pharmaceutical companies are ahead of the market in terms of adoption.

The cloud offers flexibility, an alluring characteristic for small life science companies that may not have the capital on hand to commit to large upfront expenses for IT infrastructure: HPC costs can make or break any company. As a consequence, many opt to test their product in the cloud first; if the numbers look profitable, they can then invest in an in-house HPC solution.

The inherent elasticity of cloud resources enables companies to scale their computational resources to the amount of genomic data they need to analyze. Unlike with in-house HPC, this means there is no risk that money will be wasted on idle computational resources.

Elasticity also extends to storage: data can be downloaded directly to the cloud and removed once the analyses are finished, with many protocols and best practices in place to ensure data protection. Cloud resources are allocated in virtualized slices called instances. Each instance's hardware and software is pre-configured according to the user's demands, ensuring reproducibility.

Will Jones, CTO of Sano Genetics, a startup based in Cambridge, UK, offering consumer genetic tests with support for study recruitment, believes the cloud is the future of drug discovery. The company carries out large data analyses for researchers using its services in the cloud.

In a partnership between Sano Genetics and another Cambridge-based biotech, Jones's team used the cloud to complete the study at a tenth of the cost, and in a fraction of the time, it would have taken with alternative solutions.

Besides economic efficiency, Jones says that moving operations to the cloud has provided Sano Genetics with an additional security layer, as the leading cloud providers have developed best practices and tools to ensure data protection.

Why isn't cloud computing more mainstream in genomics?

Despite all of the positives of cloud computing, we haven't yet seen global adoption of the cloud in the genomics sector.

Medley Genomics, a US-based startup using genomics to improve diagnosis and treatment of complex heterogeneous diseases such as cancer, moved all company operations to the cloud in 2019 in a partnership with London-based Lifebit.

Having spent more than 25 years at the interface between genomics and medicine, Patrice Milos, CEO and co-founder of Medley Genomics, recognized that cloud uptake has been slow in the field of drug discovery, as the cloud has several limitations that are preventing its widespread adoption.

For starters, long-term cloud storage is more expensive than its HPC counterpart: cloud solutions charge per gigabyte per month, whereas with HPC, once you've upgraded your storage disk, you have no additional costs. The same goes for computing costs: while the cloud offers elasticity, Univa's CEO Tyreman says that in many scenarios the computation cost of a single analysis is five times that of an HPC solution. However, as cloud technologies continue to progress and the market becomes increasingly competitive among providers, the ongoing cloud war will likely bring prices down.
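The storage trade-off can be framed as a simple break-even calculation. This is a sketch with illustrative prices, not rates quoted in the article, and it ignores power, admin labor, and hardware refresh, all of which favor the cloud:

```python
def breakeven_months(disk_cost_per_gb, cloud_cost_per_gb_month):
    """Months after which owned disk becomes cheaper than cloud rental.

    Compares a one-time purchase price per GB against a recurring
    per-GB-per-month cloud charge. Maintenance, power, and refresh
    costs are deliberately left out of this sketch.
    """
    return disk_cost_per_gb / cloud_cost_per_gb_month

# Illustrative (assumed) prices: $0.02/GB to buy disk capacity,
# $0.004 per GB-month for an archive storage tier.
months = breakeven_months(0.02, 0.004)
```

With these assumed numbers, owned disk pays for itself after five months of storage, which is why long-retention data tilts the equation toward in-house hardware until the "cloud war" pushes rental prices down.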

Furthermore, in the world of drug discovery, privacy and data safety are paramount. While cloud providers have developed protocols to ensure the data is safe, some risks still exist, for example, when moving the data. Therefore, large pharmaceutical companies prefer internal solutions to minimize these risks.

According to Milos, privacy remains the main obstacle for pharmaceutical companies to fully embrace the cloud, while the cost to move operations away from HPCs is no longer a barrier. While risks will always exist to a certain extent, Milos highlighted that the cloud allows seamless collaboration and reproducibility, both of which are essential for research and drug discovery.

Current players in the cloud genomics space

Cloud computing is a booming business and 86% of cloud customers rely on three main providers: AWS (Amazon), Azure (Microsoft) and Google Cloud. Although the three giants currently control the market, many other providers exist, offering more specialized commercial and academic services.

Emerging companies are now leveraging the technology offered by cloud providers to deliver bioinformatics solutions in the cloud, such as London-based Lifebit, whose technology allows users to run any bioinformatics analysis through any cloud provider with a user-friendly interface, effectively democratizing bioinformatics for all researchers, regardless of skill set.

Federation is a concept from computing now used in the field of genomics. It allows separate computers in different networks to work together to perform secure analysis without having to expose private data to others, effectively removing any potential security issues.

The amount of data organizations are now dealing with has become absolutely unmanageable with traditional technologies, and is too big to even think about moving, explained Maria Chatzou Dunford, Lifebits CEO and co-founder.

When data is moved, you increase the chances of it being intercepted by third parties, essentially putting it at significant risk. Data federation is the only way around this: unnecessary data storage and duplication costs, and painstakingly slow data transfers, become a thing of the past.

Getting ready for the genomics revolution

It's no secret that genomics is key to enabling personalized medicine and advancing drug discovery. We are now seeing a genomics revolution in which we have an unprecedented amount of data ready to be analyzed.

The challenge now is: are we ready for it? Big data requires massive computational power to analyze, effectively becoming an entry barrier for most small organizations. Cloud computing provides an alternative way to scale analyses while facilitating reproducibility and collaboration.

While the cost and security limitations of cloud computing are preventing companies from fully embracing the cloud, these drawbacks are technical and are expected to be resolved within the next few years.

Many believe that the benefits of the cloud heavily outweigh its limitations. With major tech giants competing to offer the best cloud solutions a market valued at $340 billion by 2024 we might be able to expect a drastic reduction in costs. While some privacy concerns may still exist, leading genomics organizations are developing new tools and technologies to protect genomic data.

Taken as a whole, it is likely that the cloud will be increasingly important in accelerating drug discovery and personalized medicine. According to Univa's Tyreman, it will take around 10 to 15 years to see the accelerated transition from HPC to cloud, as large organizations are often conservative in embracing novel approaches.

Distributed big data is the number one overwhelming challenge for life sciences today, the major obstacle impeding progress for precision medicine, Chatzou Dunford concluded.

The cloud and associated technologies are already powering intelligent data-driven insights, accelerating research, discovery and novel therapies. I have no doubt we are on the cusp of a genomics revolution.

Filippo Abbondanza is a PhD candidate in Human Genomics at the University of St Andrews in the UK. While doing his PhD, he is doing an internship at Lifebit and is working as marketing assistant at Global Biotech Revolution, a not-for-profit company growing the next generation of biotech leaders. When not working, he posts news on LinkedIn and Twitter.



HPC In 2020: Acquisitions And Mergers As The New Normal – The Next Platform

After a decade of vendor consolidation that saw some of the world's biggest IT firms acquire first-class HPC providers such as SGI, Cray, and Sun Microsystems, as well as smaller players like Penguin Computing, WhamCloud, Appro, and Isilon, it is natural to wonder who is next. Or maybe, more to the point, who is left?

As it turns out, there are still plenty of companies, large and small, that can fill critical holes in the product portfolios of HPC providers, or those who want to be HPC players. These niche acquisitions will be especially important to these same providers as they expand into HPC-adjacent markets such as artificial intelligence, data analytics and edge computing.

One company that can play into all of these markets is FPGA-maker Xilinx. Since Intel acquired Altera in 2015, Xilinx has been the only standalone company of any size that makes reconfigurable logic devices. Given that, the natural buyer for Xilinx would be AMD, Intel's arch-nemesis. AMD, of course, already has a highly competitive lineup of CPUs and GPUs to challenge its much larger rival, and the addition of an FPGA portfolio would open a third front. It would also provide AMD entry into a whole array of new application markets where FPGAs operate: ASIC prototyping, IoT, embedded aerospace/automotive, 5G communications, AI inference, database acceleration, and computational storage, to name a few.

The only problem is Xilinx's current market cap of around $25 billion, about half the current market cap of AMD. And if you're wondering about AMD's piggy bank, the chipmaker had $1.2 billion cash on hand as of September 2019. That means any deal would probably take the form of a merger rather than a straight acquisition. There's nothing wrong with that, but a merger is a more complex decision and has greater ramifications for both parties. That's why the rumors of a Xilinx acquisition have tended to center on larger semiconductor manufacturers that might be looking to diversify their offerings, like Broadcom or Qualcomm. Those acquisitions wouldn't offer the HPC and AI technology synergies that AMD could provide, but they would likely be easier to execute.

Another area that continues to be ripe for acquisitions is the storage market. In HPC, Panasas and DataDirect Networks stand alone (well, stand together) as the two HPC specialists left in the market. And of those two, the more modest-sized Panasas would be easier to swallow. But most HPC OEMs, including the biggies like Hewlett Packard Enterprise, Dell Technologies, and Lenovo, already have their own HPC storage and file system offerings of one sort or another, although Lenovo is probably most deficient in this regard. For what it's worth, though, Panasas, which has been around since 1999, has never attracted the kind of suitor willing to fold the company's rather specialized parallel file system technologies into its own product portfolio. In all honesty, we don't expect that to change.

The real storage action in the coming years in HPC, as well as in the enterprise and the cloud, is going to be in the software-defined space, where companies like WekaIO, VAST Data, Excelero, and DataCore Software have built products that can virtualize all sorts of hardware. That's because the way storage is being used and deployed in the datacenter these days is being transformed by cheaper capacity (disks) and cheaper IOPS (NVM-Express and other SSD devices), the availability of cloud storage, and the inverse trends of disaggregation and hyperconvergence.

As we noted last July: "While there are plenty of NAS and SAN appliances being sold into the enterprise to support legacy applications, modern storage tends to be either disaggregated, with compute and storage broken free of each other at the hardware level but glued together on the fly with software to look local, or hyperconverged, with the compute and block storage virtualized and running on the same physical server clusters and atop the same server virtualization hypervisors."

Any of the aforementioned SDS companies, along with others, may find themselves courted by OEMs, storage makers, and even cloud providers. DDN has been busy in that regard, having acquired software-defined storage maker Nexenta in May 2019. We expect to see more such deals in the coming years. Besides DDN, other storage companies like NetApp should be looking hard at bringing more SDS in-house. The big cloud providers (Amazon, Microsoft, Google, and so on) will also be making some big investments in SDS technologies, even if they're not buying such companies outright.

One market that is nowhere near the consolidation stage is quantum computing. However, that doesn't mean companies won't be looking to acquire some promising startups in this area, even at this early stage. While major tech firms such as IBM, Google, Intel, Fujitsu, Microsoft, and Baidu have already invested a lot in in-house development and are busy selecting technology partners, other companies have taken a more wait-and-see approach.

In the latter category, one company that particularly stands out is HPE. For now, the company is more focused on near-term R&D, like memristors and other memory-centric technologies. While there may be some logic in letting other companies spend their money figuring out the most promising approaches to quantum computing, and then swooping in to copy (or buy) whatever technology proves most viable, there is also the risk of being left behind. That's something HPE cannot afford.

That said, HPE has recently invested in IonQ, a promising quantum computing startup that has built a workable prototype using ion trap technology. The investment was provided via Pathfinder, HPE's investment arm. In an internal blog post on the subject penned by Abhishek Shukla, managing director of global venture investments, and Ray Beausoleil, Senior Fellow of large-scale integrated photonics, the authors extol the virtues of IonQ's technical approach:

IonQ's technology has already surpassed all other quantum computers now available, demonstrating the largest number of usable qubits in the market. Its gate fidelity, which measures the accuracy of logical operations, is greater than 98 percent for both one-qubit and two-qubit operations, meaning it can handle longer calculations than other commercial quantum computers. We believe IonQ's qubits and methodology are of such high quality, they will be able to scale to 100 qubits (and 10,000 gate operations) without needing any error correction.

As far as we can tell, HPE has no plans to acquire the company (and it shares investment in the startup with other companies, including Amazon, Google, and Samsung, among others). But if HPE is truly convinced IonQ is the path forward, it would make sense to pull the acquisition trigger sooner rather than later.

We have no illusions that any of this will come to pass in 2020, or ever. As logical as the deals we have suggested seem to us, the world of acquisitions and mergers is a lot more mysterious and counterintuitive than we'd like to admit (cases in point: Intel buying Whamcloud, or essentially buying Cloudera through heavy investment). More certain is the fact that these deals will continue to reshape the HPC vendor landscape in the coming decade as companies go after new markets and consolidate their hold on old ones. If anything, the number of businesses bought and sold will increase as high performance computing, driven by AI and analytics, extends into more application domains. Or, as the Greeks put it more succinctly, the only constant is change.


From the Ground Up: Using Artificial Intelligence for Weed Control – KBTX

BRYAN, Tex. (KBTX) - Almost everyone knows what a smartphone is, but there are also new generations of smart machines being built that will be used in agriculture to help manage a crop. Shannon Pickering is a market development manager for Blue River Technology.

We are working on several projects but primarily focused on spraying. So precision spraying using computer vision systems and artificial intelligence in order to be able to identify every plant in the field and determine what is the crop versus the weeds and only spray the weeds.

John Deere acquired Blue River Technology in 2017 to help make its Ag equipment smarter.

We hope to do several things all at once basically. Number one is become more efficient. Utilize resources wisely. Be able to spray less pesticides on crops. If we can identify the weeds in the field and only spray the weeds instead of spraying the entire field then that's a big deal. That's a lot of chemical savings that's not going into the soil or onto the plant. So being able to provide a more sustainable solution for our farmers going forward is really a big deal for us.

One of their early conceptual sprayers showed up to 95 percent fewer chemicals being sprayed in the field, since the machines were being very precise and applying chemical only where it needed to go: on the weeds.

It has to provide value to the grower. It has to provide efficiency to pay for itself, and so that's a must and it will do that for sure. The technology is here. We definitely have the capability of doing this today. It's just a matter of integrating it into the machinery. We're a few years away yet probably from seeing it in a go-to-market form but the potential is there. The technology works and it's coming for sure.


World’s First ‘Living Machine’ Created Using Frog Cells and Artificial Intelligence – Livescience.com

What happens when you take cells from frog embryos and grow them into new organisms that were "evolved" by algorithms? You get something that researchers are calling the world's first "living machine."

Though the original stem cells came from frogs the African clawed frog, Xenopus laevis these so-called xenobots don't resemble any known amphibians. The tiny blobs measure only 0.04 inches (1 millimeter) wide and are made of living tissue that biologists assembled into bodies designed by computer models, according to a new study.

These mobile organisms can move independently and collectively, can self-heal wounds and survive for weeks at a time, and could potentially be used to transport medicines inside a patient's body, scientists recently reported.

Related: The 6 Strangest Robots Ever Created

"They're neither a traditional robot nor a known species of animal," study co-author Joshua Bongard, a computer scientist and robotics expert at the University of Vermont, said in a statement. "It's a new class of artifact: a living, programmable organism."

Algorithms shaped the evolution of the xenobots. They grew from skin and heart stem cells into tissue clumps of several hundred cells that moved in pulses generated by heart muscle tissue, said lead study author Sam Kriegman, a doctoral candidate studying evolutionary robotics in the University of Vermont's Department of Computer Science, in Burlington.

"There's no external control from a remote control or bioelectricity. This is an autonomous agent; it's almost like a wind-up toy," Kriegman told Live Science.

Biologists fed a computer constraints for the autonomous xenobots, such as the maximum muscle power of their tissues, and how they might move through a watery environment. Then, the algorithm produced generations of the tiny organisms. The best-performing bots would "reproduce" inside the algorithm. And just as evolution works in the natural world, the least successful forms would be deleted by the computer program.
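The loop described above (generate candidate designs, score them, let the best "reproduce," delete the least successful) is a classic evolutionary algorithm, and it can be sketched in miniature. This is a hedged illustration, not the team's actual pipeline: the fitness function, mutation scheme, and all parameters here are invented stand-ins for the physics simulation the researchers used.

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=40, seed=0):
    """Toy evolutionary search: score designs, keep the fittest half, refill with mutants."""
    rng = random.Random(seed)
    # Each candidate "design" is just a vector of parameters in [0, 1].
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # The best-performing half "reproduces"; the least successful are deleted.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Refill the population with slightly mutated copies of the survivors.
        children = [
            [min(1.0, max(0.0, g + rng.gauss(0, 0.1))) for g in parent]
            for parent in survivors
        ]
        pop = survivors + children
    return max(pop, key=fitness)

# Invented stand-in for a physics-simulation score: "distance traveled"
# peaks when every parameter sits near 0.7.
def travel_distance(design):
    return -sum((g - 0.7) ** 2 for g in design)

best = evolve(travel_distance)
print([round(g, 2) for g in best])
```

In the study, the scoring step was a full physical simulation of cells moving through water, and only the designs that survived this winnowing were built from real tissue.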

"Eventually, it was able to give us designs that actually were transferable to real cells. That was a breakthrough," Kriegman said.

The study authors then brought these designs to life, piecing stem cells together to form self-powered 3D shapes designed by the evolution algorithm. Skin cells held the xenobots together, and the beating of heart tissue in specific parts of their "bodies" propelled the 'bots through water in a petri dish for days, and even weeks at a stretch, without needing additional nutrients, according to the study. The 'bots were even able to repair significant damage, said Kriegman.

"We cut the living robot almost in half, and its cells automatically zippered its body back up," he said.

"We can imagine many useful applications of these living robots that other machines can't do," said study co-author Michael Levin, director of the Center for Regenerative and Developmental Biology at Tufts University in Massachusetts. These might include targeting toxic spills or radioactive contamination, collecting marine microplastics or even excavating plaque from human arteries, Levin said in a statement.

Creations that blur the line between robots and living organisms are popular subjects in science fiction; think of the killer machines in the "Terminator" movies or the replicants from the world of "Blade Runner." The prospect of so-called living robots and using technology to create living organisms understandably raises concerns for some, said Levin.

"That fear is not unreasonable," Levin said. "When we start to mess around with complex systems that we don't understand, we're going to get unintended consequences."

Nevertheless, building on simple organic forms like the xenobots could also lead to beneficial discoveries, he added.

"If humanity is going to survive into the future, we need to better understand how complex properties, somehow, emerge from simple rules," Levin said.

The findings were published online Jan. 13 in the journal Proceedings of the National Academy of Sciences.

Originally published on Live Science.

Read more:
World's First 'Living Machine' Created Using Frog Cells and Artificial Intelligence - Livescience.com


Artificial Intelligence Expert Neil Sahota Says AI Will Have Major Impact On 2020 Elections And In Medicine – PRNewswire

LOS ANGELES, Jan. 15, 2020 /PRNewswire/ -- Artificial intelligence, or AI, will play a significant role in the 2020 election campaign and may also lead to major breakthroughs in solving personal medical issues, according to futurist and AI expert Neil Sahota.

"I'm increasingly concerned about the impact of fake news, photo scams and other deceits designed to negatively influence voting this year," says Sahota, who works closely with the United Nations and other organizations to foster innovation and develop next generation products/solutions to be powered by AI. "We will see the effect of more AI tools generating fraudulent information and influencing voters. Thankfully, there will also be new tools to fight this kind of disinformation. What is certain is that machine vs machine battles will become more prevalent."

The author of the influential book Own the AI Revolution (McGraw Hill), Sahota is also an IBM Master Inventor, who led the IBM Watson Group and is a professor at the University of California/Irvine.

In addition to its potential impact on the election campaigns, Sahota predicts AI will be responsible for significant medical advances. "We will see more use of AI that will accelerate solutions for doctors, nurses, clinicians and researchers in providing personalized care," he said. "Each of us is genetically unique and there isn't a one-size-fits-all solution for us. But AI can solve this dilemma by providing personalized medicine based on a specific person's genomic sequence, lifestyle, medical history, environment and other differences. I think there will be great strides in these areas in the coming year."

"The election and medicine are only two areas where we will feel the impact of AI, which is coming into its own as an emerging technology," Sahota says. "We are likely to see it help combine tools such as blockchain, virtual reality and artificial reality. For example, I envision a virtual reality courtroom where a law student interacts with an AI 'judge,' opposing counsel and jury. AI simulation is not only more 'real world' but has great variability, meaning each time the VR module is used, it's different. There's no memorization or 'cheat sheet' for the law student. It's a dynamic, highly interactive learning module, and 2020 will start the wave of convergence: combining these technologies together."

About Neil Sahota: Neil Sahota is a futurist and leading expert on Artificial Intelligence (AI) and other next generation technologies. He is the author of Own the AI Revolution (McGraw Hill) and works with the United Nations on the AI for Good initiative. Sahota is also an IBM Master Inventor, former leader of the IBM Watson Group and professor at the University of California/Irvine. His work spans multiple industries, including legal services, healthcare, life sciences, retail, travel, transportation, energy, utilities, automotive, telecommunications, media, and government.

SOURCE Neil Sahota


See the rest here:
Artificial Intelligence Expert Neil Sahota Says AI Will Have Major Impact On 2020 Elections And In Medicine - PRNewswire


Companies Use Artificial Intelligence to Help With Hiring. Korean Consultants Teach You How to Beat It – Inc.

Artificial intelligence is supposed to free the hiring process from prejudices and biases. We can have a totally neutral system that evaluates candidates and selects the best possible one, regardless of race, gender, or any other characteristic.

It sounds fantastic, but it's been an abysmal failure in that regard. Artificial intelligence is only as good as the programmers, who, of course, are actual humans with flaws. Amazon, which, of course, has gobs of money to pour into development, had to scrap its A.I. recruiting process because the bot didn't like women.

HireVue faces pressure from rights groups over its hiring systems, which, according to The Washington Post,

use video interviews to analyze hundreds of thousands of data points related to a person's speaking voice, word selection and facial movements. The system then creates a computer-generated estimate of the candidates' skills and behaviors, including their "willingness to learn" and "personal stability."

This model of gaming the system has been in place for as long as people have applied for jobs. There are thousands of articles on the internet that tell you how to answer standard interview questions ("Where do you see yourself in five years?") or extol the virtues of a firm handshake. This is really no different from the training these consultants give. Except, instead of trying to convince a human, you're trying to convince a machine.

And that makes this training so much more valuable. I can tell you "firm handshakes are important!" and then you interview with someone who prefers the dead-fish version of shaking hands, and my advice harms instead of helps. But if two companies use the same software, the information from these consultants will help you shine regardless of who the hiring manager is.

That's the goal, of course, to take the human biases out of interviews. But the biases still exist in A.I.--it's just that every job requires you to overcome the same preferences. Which means it will be easier to beat the system. Once the consultants figure out what the algorithms want, they can train you to respond the right way.

While A.I. screening potentially levels the playing field, people who can afford training will still do better in the interviews. Interviewers already discriminate by class, so this doesn't solve that problem at all.

Can artificial intelligence potentially make hiring better? Probably. But, as these consultants understand--anytime there is a system, there is a way to beat it. While humans are fallible, at least we all know they are. Artificial intelligence allows you to think the process is bias-free, but it's not. It just makes for consistent bias.

Published on: Jan 15, 2020

The opinions expressed here by Inc.com columnists are their own, not those of Inc.com.

Read more here:
Companies Use Artificial Intelligence to Help With Hiring. Korean Consultants Teach You How to Beat It - Inc.


The future is intelligent: Harnessing the potential of artificial intelligence in Africa – Brookings Institution

The future is intelligent: By 2030, artificial intelligence (AI) will add $15.7 trillion to the global GDP, with $6.6 trillion projected to be from increased productivity and $9.1 trillion from consumption effects. Furthermore, augmentation, which allows people and AI to work together to enhance performance, will create $2.9 trillion of business value and 6.2 billion hours of worker productivity globally. In a world that is increasingly characterized by enhanced connectivity and where data is as pervasive as it is valuable, Africa has a unique opportunity to leverage new digital technologies to drive large-scale transformation and competitiveness. Africa cannot and should not be left behind.

There are 10 key enabling technologies that will drive Africa's digital economy, including cybersecurity, cloud computing, big data analytics, blockchain, the Internet of Things, 3D printing, biotechnology, robotics, energy storage, and AI. AI in particular presents countless avenues for both the public and private sectors to optimize solutions to the most crucial problems facing the continent today, especially for struggling industries. For example, in health care, AI solutions can help scarce personnel and facilities do more with less by speeding initial processing, triage, diagnosis, and post-care follow up. Furthermore, AI-based pharmacogenomics applications, which focus on the likely response of an individual to therapeutic drugs based on certain genetic markers, can be used to tailor treatments. Considering the genetic diversity found on the African continent, it is highly likely that the application of these technologies in Africa will result in considerable advancement in medical treatment on a global level.

In agriculture, Abdoulaye Banir Diallo, co-founder and chief scientific officer of the AI startup My Intelligent Machines, is working with advanced algorithms and machine learning methods to leverage genomic precision in livestock production models. With genomic precision, it is possible to build intelligent breeding programs that minimize the ecological footprint, address changing consumer demands, and contribute to the well-being of people and animals alike through the selection of good genetic characteristics at an early stage of the livestock production process. These are just a few examples that illustrate the transformative potential of AI technology in Africa.


However, a number of structural challenges undermine rapid adoption and implementation of AI on the continent. Inadequate basic and digital infrastructure seriously erodes efforts to activate AI-powered solutions as it reduces crucial connectivity. (For more on strategies to improve Africa's digital infrastructure, see the viewpoint on page 67 of the full report.) A lack of flexible and dynamic regulatory systems also frustrates the growth of a digital ecosystem that favors AI technology, especially as tech leaders want to scale across borders. Furthermore, the lack of relevant technical skills, particularly among young people, is a growing threat. This skills gap means that those who would have otherwise been at the forefront of building AI are left out, preventing the continent from harnessing the full potential of transformative technologies and industries.

Similarly, the lack of adequate investments in research and development is an important obstacle. Africa must develop innovative financial instruments and public-private partnerships to fund human capital development, including a focus on industrial research and innovation hubs that bridge the gap between higher education institutions and the private sector to ensure the transition of AI products from lab to market.

At the same time, we must be careful that priority sectors drive the AI strategy in Africa with accompanying products, not the other way around. We believe the health care industry presents by far the most urgent need and promising market opportunity, and, as such, should be put at the top of the list for the continent's decisionmakers. A large portion of the African population is still unable to access proper health care, with a low ratio of one physician per 5,000 patients, and there is almost no country with a fully integrated health management platform. AI could intervene directly to improve personalized health care and product development. Importantly, the health management platform precedes the leveraging of AI, so we must equally invest in cybersecurity, big data, cloud computing, and blockchain.

Artificial intelligence for Africa presents opportunities to put the continent at the forefront of the Fourth Industrial Revolution. Before Africa can lead this transformation, though, there are important steps that must be undertaken. First, the region needs to formulate a comprehensive continental blueprint to guide its AI strategy by involving key Pan-African institutions, academia, and the private and public sectors in its conception.

In addition, these stakeholders must also invest in creating a digital identity platform for all Africans with reliable data banks for AI to be a viable economic option. For this, it is imperative to leverage readily available local talent as a means to promote and democratize AI technology continent-wide. Finally, we must harmonize regulatory policies that encourage ethically built AI systems so as to guarantee a more inclusive economic development for Africa. With these important steps, the next decade for Africa will be intelligent.

Read the original:
The future is intelligent: Harnessing the potential of artificial intelligence in Africa - Brookings Institution


Asia Pacific Artificial Intelligence in Fashion Market to 2027 – Regional Analysis and Forecasts by Offerings; Deployment; Application; End-User…

The Asia Pacific artificial intelligence in fashion market accounted for US$55.1 Mn in 2018 and is expected to grow at a CAGR of 39.0% over the forecast period 2019-2027, to account for US$1,015.8 Mn in 2027.
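As a quick sanity check on those headline figures, the implied compound annual growth rate from US$55.1 Mn (2018) to US$1,015.8 Mn (2027) can be computed directly. Taking nine annual periods, it lands near the reported 39.0%; the small gap is likely rounding or a different base-year convention in the report.

```python
start, end = 55.1, 1015.8      # US$ Mn: 2018 base and 2027 forecast from the report
periods = 2027 - 2018          # nine annual compounding periods
cagr = (end / start) ** (1 / periods) - 1
print(f"implied CAGR: {cagr:.1%}")  # prints roughly 38.2%, close to the reported 39.0%
```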

New York, Jan. 15, 2020 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Asia Pacific Artificial Intelligence in Fashion Market to 2027 - Regional Analysis and Forecasts by Offerings; Deployment; Application; End-User Industry" - https://www.reportlinker.com/p05833586/?utm_source=GNW

Real-time consumer behavior insights and increased operational efficiency are driving the adoption of artificial intelligence in the fashion industry. Moreover, the availability of a large amount of data originating from different data sources is one of the key factors driving the growth of AI technology across the fashion industry. Artificial intelligence has already disrupted several industries, including retail and fashion, and the fashion industry has so far been one of the primary adopters of the technology. Fashion retailers these days are leveraging several revolutionary technologies, including machine learning, augmented reality (AR), and artificial intelligence (AI), to create seamless shopping experiences across channels, from online models to brick-and-mortar stores. Fashion retailers are progressively moving toward AI integration within their supply chains, with more focus being placed on customer-facing AI initiatives. Further, AI-integrated search engines are expected to reshape the way fashion designers develop new product designs. Store operations and in-store services will also benefit greatly from AI integration. The artificial intelligence in fashion market is fragmented in nature due to the presence of several end-user industries, and the competitive dynamics in the market are anticipated to change during the coming years. In addition to this, various initiatives are undertaken by governmental bodies to further accelerate the artificial intelligence in fashion market.

The governments of various countries in this region are trying to attract FDI in the technology sector, given the increasing need for enhanced technology-related services. For instance, China's government relaxed restrictions on new entries with the objective of encouraging overseas and private capital to invest in its economy.

This factor is anticipated to drive the demand for artificial intelligence in fashion in this region. The artificial intelligence in fashion market by deployment type is segmented into on-premise and cloud. During the forecast period of 2019 to 2027, the cloud-based segment is anticipated to be the largest contributor to the artificial intelligence in fashion market.

The artificial intelligence in fashion market is experiencing a paradigm shift from traditional on-premise deployment to cloud-based deployments in the current scenario. This trend is predominantly driven by the presence of a new category of cloud-only solutions, which help in minimizing integration complexities and installation costs with quick setup.

The overall artificial intelligence in fashion market size has been derived using both primary and secondary sources. The research process begins with exhaustive secondary research using internal and external sources to obtain qualitative and quantitative information related to the artificial intelligence in fashion market.

It also provides an overview and forecast for the artificial intelligence in fashion market based on all the segmentation provided with respect to the Asia Pacific region. Also, primary interviews were conducted with industry participants and commentators to validate data and analysis.

The participants who typically take part in such a process include industry experts such as VPs, business development managers, market intelligence managers, and national sales managers, and external consultants such as valuation experts, research analysts, and key opinion leaders specializing in the artificial intelligence in fashion market. Some of the players present in the artificial intelligence in fashion market are Adobe Inc., Amazon Web Services, Inc., Catchoom Technologies S.L., Facebook, Inc., Google LLC, Huawei Technologies Co., Ltd., IBM Corporation, Microsoft Corporation, Oracle Corporation, and SAP SE, among others.

Read the full report: https://www.reportlinker.com/p05833586/?utm_source=GNW

About Reportlinker

ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.


Clare: clare@reportlinker.com
US: (339)-368-6001
Intl: +1 339-368-6001

View original post here:
Asia Pacific Artificial Intelligence in Fashion Market to 2027 - Regional Analysis and Forecasts by Offerings; Deployment; Application; End-User...


Artificial intelligence and the future of rep visits – – pharmaphorum

As access to healthcare professionals (HCPs) declines, the challenges facing sales representatives continue to increase: less time with HCPs, Sunshine Act restrictions, and integration of practices into larger health systems. It can be daunting.

Once, influence was based on interactions between reps and HCPs more than just about anything else. But today, influence is spread across a variety of touch points, many digital, which can be accessed by an HCP at any time and place. To reinforce their value, sales reps are expected to have deep knowledge of the market and their customers, so that they can tailor their interactions to the unique needs of each.

How can today's rep succeed? It's all about data.

Data gathered judiciously, digested accurately, analysed rapidly, and used wisely makes the sales force more efficient and productive. This concept is nothing new: it dates back to the beginnings of CRM in the 20th century.

But today's digital world offers new possibilities, enabling connections and predictions that yesterday's rep never even dreamed of.

What if reps could anticipate relevance?

By combining the best in industry expertise, brand strategy, CRM technology, and artificial intelligence (AI) and machine learning, reps can have the tools to make anticipated relevance possible.

At a recent Digital Health Coalition Midwest Summit, Intouch demonstrated examples of what this could look like for a brand, using their AI assistant, EVA, which is short for embedded virtual assistant.

How does it work?

EVA connects with Veeva to access a rep's calendar of appointments to obtain information about where they need to go and who they need to see. Combined with marketing segmentation, EVA tells a sales rep the segmentation of today's calls. Data further informs the conversation with helpful facts like script-writing history, marketing plan, prior messages presented, and online activity, giving our rep a prediction of what their next best actions should be. These suggestions can be offered through the voice assistant, or sent by text or email for later reference, and can power the flow of the in-office detail. After the call, EVA can help a rep record a call quickly and easily in the CRM system.

An AI-powered ecosystem makes sure no pertinent data goes to waste. Whether it's an email open, a website visit, a rep conversation, a script, or any other activity, the rep can quickly and easily understand what their HCP cares about and what information will be most helpful to their practice.

By anticipating relevance, the rep can provide an HCP with information that's useful to them, in the format, time, and place that helps them most. And EVA is able to use the most relevant assets efficiently and minimise the burden of administrative tasks. Time is used wisely on both sides, making it possible for the right information to help patients that much sooner.
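The general pattern described here (calendar, segmentation, and engagement history feeding a next-best-action suggestion) can be illustrated with a deliberately simplified sketch. None of the names, fields, or rules below come from Intouch or Veeva; they are invented stand-ins for whatever signals a real system would ingest and whatever model would rank them.

```python
from dataclasses import dataclass

@dataclass
class HcpProfile:
    """Invented stand-in for the data a CRM might hold on one HCP."""
    name: str
    segment: str              # marketing segmentation label
    recent_email_opens: int   # engagement signals
    recent_site_visits: int
    scripts_last_quarter: int

def next_best_actions(profile: HcpProfile) -> list[str]:
    """Rank suggested talking points from simple, hypothetical rules."""
    suggestions = []
    if profile.recent_site_visits > 0:
        suggestions.append("Follow up on topics viewed on the brand site")
    if profile.recent_email_opens == 0:
        suggestions.append("Reconfirm preferred contact channel")
    if profile.scripts_last_quarter == 0:
        suggestions.append(f"Revisit core efficacy message for segment {profile.segment!r}")
    # Fall back to the standard detail when no signal stands out.
    return suggestions or ["Standard detail for segment " + profile.segment]

calls_today = [HcpProfile("Dr. Rivera", "high-volume", 2, 3, 0)]
for hcp in calls_today:
    print(hcp.name, next_best_actions(hcp))
```

A production system would replace these hand-written rules with a learned model, but the shape of the interface (profile in, ranked suggestions out) is the same.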

Want to learn more about AI and modern pharma marketing? Download Intouch's comprehensive ebook.

Interested in learning how AI can work for your reps? Reach out to the Intouch team today.

Read more:
Artificial intelligence and the future of rep visits - - pharmaphorum
