
Why being data-centric is the first step to success with artificial intelligence – Tech Wire Asia

Successful AI deployment starts with a data-centric mindset. Source: Shutterstock.

REGARDLESS of industry, artificial intelligence (AI) is a disruptive technology that is greatly sought after.

Many organizations are looking to deploy AI projects at scale, in hopes of boosting performance and ultimately increasing revenues.

However, many fail to see returns on their AI investments. Often, this is because AI projects are not approached in the right manner.

To be AI-first, organizations need to adopt a data-first mindset. Here's how and why:

Using the right methodologies and technologies is crucial for the successful deployment of AI solutions.

It is not enough to just rely on agile methods, as they focus heavily on functionality and application logic delivery. Instead, data-centric methodologies such as the Cross Industry Standard Process for Data Mining (CRISP-DM) should be used, as they concentrate on the steps needed for a successful data project.

Depending on organizational needs, a hybrid methodology can also be deployed by merging the non-agile CRISP-DM with agile methodologies, keeping the process responsive to change.
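As a rough illustration, CRISP-DM's six phases and their cyclical order can be sketched in a few lines. The phase names below follow the CRISP-DM standard; the helper function is purely illustrative:

```python
# The six CRISP-DM phases, in their standard order.
CRISP_DM_PHASES = [
    "Business Understanding",
    "Data Understanding",
    "Data Preparation",
    "Modeling",
    "Evaluation",
    "Deployment",
]

def next_phase(current: str) -> str:
    """Return the phase that follows `current`. The cycle wraps
    around after Deployment, reflecting CRISP-DM's iterative nature."""
    i = CRISP_DM_PHASES.index(current)
    return CRISP_DM_PHASES[(i + 1) % len(CRISP_DM_PHASES)]

print(next_phase("Evaluation"))  # Deployment
print(next_phase("Deployment"))  # Business Understanding
```

In practice teams also move backward (for example, from Evaluation back to Business Understanding) when results fall short; the wrap-around here only captures the forward loop.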

Data-centric methodologies must be followed by the use of data-centric technologies. For any AI projects, organizations must always keep the end in mind, and have clarity on what the desired outcomes are.

Methodology and technology will not be of use without a data-proficient team.

There must be a specialized AI team in place that can effectively collect, compile, and extract key information from seemingly haphazard data sets.

Ideally, the team should have a good mix of data scientists, engineers, and specialists that possess the skills to put models into operation.

There is no room for guesswork in AI deployment: randomly changing data sets wastes time and resources and is simply disastrous.

For a successful AI project to materialize, organizations ought to continuously invest for the long term.

Staying complacent is not an option. They must seek to refine the methodologies in place. If the technologies used are no longer relevant, they should be replaced.

AI projects will not work if employees lack the skills and tools needed to deploy them. Thus, employees should be upskilled, and also made to understand the value of AI, and how it can augment the work that they do.

While the technology is still in its infancy for large-scale projects, it is only a matter of time before AI is deployed at scale, across organizations and markets.

Ultimately, it all boils down to agility and resilience in the midst of change. Those that adopt the right mindset will succeed; those that resist the change will suffer.

Emily Wong

Emily is a tech writer constantly on the lookout for cool technology and writes about how small and medium enterprises can leverage it. She also reads, runs, and dreams of writing in a mountain cabin.


AI, machine learning, robots, and marketing tech coming to a store near you – TechRepublic

Retailers are harnessing the power of new technology to dig deeper into customer decisions and bring people back into stores.

The National Retail Federation's 2020 Big Show in New York was jam-packed with robots, frictionless store mock-ups, and audacious displays of the latest technology now available to retailers.

Dozens of robots, digital signage tools, and more were available for retail representatives to test out, with hundreds of the biggest tech companies in attendance offering a bounty of eye-popping gadgets designed to increase efficiency and bring the wow factor back to brick-and-mortar stores.

SEE: Artificial intelligence: A business leader's guide (free PDF) (TechRepublic)

Here are some of the biggest takeaways from the annual retail event.

With the explosion in popularity of Amazon, Alibaba, and other e-commerce sites ready to deliver goods right to your door within days, many analysts and retailers figured the brick-and-mortar stores of the past were on their last legs.

But it turns out billions of customers still want the personal, tailored touch of in-store experiences and are not ready to completely abandon physical retail outlets.

"It's not a retail apocalypse. It's a retail renaissance," said Lori Mitchell-Keller, executive vice president and global general manager of consumer industries at SAP.

As leader of SAP's retail, wholesale distribution, consumer products, and life sciences industries division, Mitchell-Keller said she was surprised to see that retailers had shifted their stance and were looking to find ways to beef up their online experience while infusing stores with useful but flashy technology.

"Brick-and-mortar stores have this unique capability to have a specific advantage against online retailers. So despite the trend where everything was going online, it did not mean online at the expense of brick-and-mortar. There is a balance between the two. Those companies that have a great online experience and capability combined with a brick-and-mortar store are in the best place in terms of their ability to be profitable," Mitchell-Keller said during an interview at NRF 2020.

"There is an experience that you cannot get online. This whole idea of customer experience and experience management is definitely the best battleground for the guys that can't compete in delivery. Even for the ones that can compete on delivery, like the Walmarts and Targets, they are using their brick-and-mortar stores to offer an experience that you can't get online. We thought five years ago that brick-and-mortar was dead and it's absolutely not dead. It's actually an asset."

In her experience working with the world's biggest retailers, companies that have a physical presence actually have a huge advantage because customers are now yearning for a personalized experience they can't get online. While e-commerce sites are fast, nothing can beat the ability to have real people answer questions and help customers work through their options, regardless of what they're shopping for.

Retailers are also transforming parts of their stores into fulfillment centers for their online sales, which has the dual benefit of bringing customers into the store, where they may spend even more on things they see.

"The brick-and-mortar stores that are using their stores as fulfillment centers have a much lower cost of delivery because they're typically within a few miles of customers. If they have a great online capability and good store fulfillment, they're able to get to customers faster than the aggregators," Mitchell-Keller said. "It's better to have both."

SEE: Feature comparison: E-commerce services and software (TechRepublic Premium)

But one of the main trends, and problems, highlighted at NRF 2020 was the sometimes difficult transition many retailers have had to make to a digitized world.

NRF 2020 was full of decadent tech retail tools like digital price tags, shelf-stocking robots and next-gen advertising signage, but none of this could be incorporated into a retail environment without a basic amount of tech talent and systems to back it all.

"It can be very overwhelmingly complicated, not to mention costly, just to have a team to manage technology and an environment that is highly digitally integrated. The solution we try to bring to bear is to add all these capabilities or applications into a turn key environment because fundamentally, none of it works without the network," said Michael Colaneri, AT&T's vice president of retail, restaurants and hospitality.

While it would be easy for a retailer to leave NRF 2020 with a fancy robot or cool gadget, companies typically have to think bigger about the changes they want to see, and generally these kinds of digital transformations have to be embedded deep throughout the supply chain before they can be incorporated into stores themselves.

Colaneri said much of AT&T's work involved figuring out how retailers could connect the store system, the enterprise, the supply chain and the consumer to both online and offline systems. The e-commerce part of a retailer's business now had to work hand in hand with the functionality of the brick-and-mortar experience because each part rides on top of the network.

"There are five things that retailers ask me to solve: Customer experience, inventory visibility, supply chain efficiency, analytics, and the integration of media experiences like a robot, electronic shelves or digital price tags. How do I pull all this together into a unified experience that is streamlined for customers?" Colaneri said.

"Sometimes they talk to me about technical components, but our number one priority is inventory visibility. I want to track products from raw material to where it is in the legacy retail environment. Retailers also want more data and analytics so they can get some business intelligence out of the disparate data lakes they now have."

The transition to digitized environments is different for every retailer, Colaneri added. Some want slow transitions and gradual introductions of technology while others are desperate for a leg up on the competition and are interested in quick makeovers.

While some retailers have balked at the thought, and price, of wholesale changes, the opposite approach can end up being just as costly.

"Anybody that sells you a digital sign, robot, Magic Mirror or any one of those assets is usually partnering with network providers because it requires the network. And more importantly, what typically happens is if someone buys an asset, they are underestimating the requirements it's going to need from their current network," Colaneri said.

"Then when their team says 'we're already out of bandwidth,' you'll realize it wasn't engineered and that the application wasn't accommodated. It's not going to work. It can turn into a big food fight."

Retailers are increasingly realizing the value of artificial intelligence and machine learning as a way to churn through troves of data collected from customers through e-commerce sites. While these tools require the kind of digital base that both Mitchell-Keller and Colaneri mentioned, artificial intelligence (AI) and machine learning can be used to address a lot of the pain points retailers are now struggling with.

Mitchell-Keller spoke of SAP's work with Costco as an example of the kind of real-world value AI and machine learning can add to a business. Costco needed help reducing waste in their bakeries and wanted better visibility into when customers were going to buy particular products on specific days or at specific times.

"Using machine learning, what SAP did was take four years of data out of five different stores for Costco as a pilot and used AI and machine learning to look through the data for patterns to be able to better improve their forecasting. They're driving all of their bakery needs based on the forecast and that forecast helped Costco so much they were able to reduce their waste by about 30%," Mitchell-Keller said, adding that their program improved productivity by 10%.
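SAP's actual forecasting pipeline isn't described in detail, but the underlying idea of learning demand patterns from historical sales can be illustrated with a toy seasonal baseline. The data and function names below are hypothetical, not part of the Costco pilot:

```python
from collections import defaultdict
from statistics import mean

def fit_weekday_profile(history):
    """history: list of (weekday, units_sold) pairs.
    Returns average demand per weekday -- a trivial
    seasonal baseline, the simplest form of the pattern
    mining described in the article."""
    buckets = defaultdict(list)
    for weekday, units in history:
        buckets[weekday].append(units)
    return {day: mean(units) for day, units in buckets.items()}

def forecast(profile, weekday):
    """Predict demand for a weekday from the fitted profile."""
    return profile[weekday]

# Hypothetical bakery sales: (day, croissants sold)
history = [("Mon", 100), ("Mon", 120), ("Sat", 300), ("Sat", 340)]
profile = fit_weekday_profile(history)
print(forecast(profile, "Sat"))  # 320.0 -- bake more on Saturdays
```

A production system would add features such as holidays, promotions, and weather, and use a proper regression or time-series model, but the waste reduction comes from the same principle: bake to the forecast, not to a fixed schedule.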

SAP and dozens of other tech companies at NRF 2020 offered AI-based systems for a variety of supply chain management tools, employee payment systems and even resume matches. But AI and machine learning systems are nothing without more data.

SEE: Managing AI and ML in the enterprise 2019: Tech leaders expect more difficulty than previous IT projects (TechRepublic Premium)

Jeff Warren, vice president of Oracle Retail, said there has been a massive shift toward better understanding customers through increased data collection. Historically, retailers simply focused on getting products through the supply chain and into the hands of consumers. But now, retailers are pivoting toward focusing on how to better cater services and goods to the customer.

Warren said Oracle Retail works with about 6,000 retailers in 96 different countries and that much of their work now prioritizes collecting information from every customer interaction.

"What is new is that when you think of the journey of the consumer, it's not just about selling anymore. It's not just about ringing up a transaction or line busting. All of the interactions between you and me have value and hold something meaningful from a data perspective," he said, adding that retailers are seeking to break down silos and pool their data into a single platform for greater ease of use.

"Context would help retailers deliver a better experience to you. It's petabytes of information about what the US consumer market is spending and where they're spending. We can take the information that we get from those interactions that are happening at the point of sale about our best customers and learn more."

With the Oracle platform, retailers can learn about their customers and others who may have similar interests or live in similar places. Companies can do a better job of targeting new customers when they know more about their current customers and what else they may want.

IBM is working on similar projects with hundreds of different retailers, all looking to learn more about their customers and tailor their e-commerce as well as in-store experience to suit their biggest fans.

IBM global managing director for consumer industries Luq Niazi told TechRepublic during a booth tour that learning about consumer interests was just one aspect of how retailers could appeal to customers in the digital age.

"Retailers are struggling to work through what tech they need. When there is so much tech choice, how do you decide what's important? Many companies are implementing tech that is good but implemented badly, so how do you help them do good tech implemented well?" Niazi said.

"You have all this old tech in stores and you have all of this new tech. You have to think about how you bring the capability together in the right way to deploy flexibly whatever apps and experiences you need from your store associate, for your point of sale, for your order management system that is connected physically and digitally. You've got to bring those together in different ways. We have to help people think about how they design the store of the future."



The Best 17 AI and Machine Learning TED Talks for Practitioners – Solutions Review

The editors at Solutions Review curated this list of the best AI and machine learning TED talks for practitioners in the field.

TED Talks are influential videos from expert speakers in a variety of verticals. TED began in 1984 as a conference where Technology, Entertainment and Design converged, and today covers almost all topics from business to technology to global issues in more than 110 languages. TED is building a clearinghouse of free knowledge from the world's top thinkers, and their library of videos is expansive and rapidly growing.

Solutions Review has curated this list of AI and machine learning TED talks to watch if you are a practitioner in the field. Talks were selected based on relevance, ability to add business value, and individual speaker expertise. We've also curated TED talk lists for topics like data visualization and big data.

Erik Brynjolfsson is the director of the MIT Center for Digital Business and a research associate at the National Bureau of Economic Research. He asks how IT affects organizations, markets and the economy. His books include Wired for Innovation and Race Against the Machine. Brynjolfsson was among the first researchers to measure the productivity contributions of information and communications technology (ICT) and the complementary role of organizational capital and other intangibles.

In this talk, Brynjolfsson argues that machine learning and intelligence are not the end of growth; they're simply the growing pains of a radically reorganized economy. He makes a riveting case for why big innovations are ahead of us if we think of computers as our teammates. Be sure to watch the opposing viewpoint from Robert Gordon.

Jeremy Howard is the CEO of Enlitic, an advanced machine learning company in San Francisco. Previously, he was the president and chief scientist at Kaggle, a community and competition platform of over 200,000 data scientists. Howard is a faculty member at Singularity University, where he teaches data science. He is also a Young Global Leader with the World Economic Forum, and spoke at the World Economic Forum Annual Meeting 2014 on Jobs for the Machines.

Technologist Jeremy Howard shares some surprising new developments in the fast-moving field of deep learning, a technique that can give computers the ability to learn Chinese, or to recognize objects in photos, or to help think through a medical diagnosis.

Nick Bostrom is a professor at the University of Oxford, where he heads the Future of Humanity Institute, a research group of mathematicians, philosophers and scientists tasked with investigating the big picture for the human condition and its future. Bostrom was honored as one of Foreign Policy's 2015 Global Thinkers. His book Superintelligence advances the ominous idea that the first ultraintelligent machine is the last invention that man need ever make.

In this talk, Nick Bostrom calls machine intelligence the last invention that humanity will ever need to make. Bostrom asks us to think hard about the world we're building right now, driven by thinking machines. Will our smart machines help to preserve humanity and our values, or will they have values of their own?

Li's work with neural networks and computer vision (with Stanford's Vision Lab) marks a significant step forward for AI research, and could lead to applications ranging from more intuitive image searches to robots able to make autonomous decisions in unfamiliar situations. Fei-Fei was honored as one of Foreign Policy's 2015 Global Thinkers.

This talk digs into how computers are getting smart enough to identify simple elements. Computer vision expert Fei-Fei Li describes the state of the art, including the database of 15 million photos her team built to teach a computer to understand pictures, and the key insights yet to come.

Anthony Goldbloom is the co-founder and CEO of Kaggle. Kaggle hosts machine learning competitions, where data scientists download data and upload solutions to difficult problems. Kaggle has a community of over 600,000 data scientists. In 2011 and 2012, Forbes named Anthony one of the 30 under 30 in technology; in 2013 the MIT Tech Review named him one of the top 35 innovators under the age of 35, and the University of Melbourne awarded him an Alumni of Distinction Award.

This talk by Anthony Goldbloom describes some of the current use cases for machine learning, far beyond simple tasks like assessing credit risk and sorting mail.

Tufekci is a contributing opinion writer at the New York Times, an associate professor at the School of Information and Library Science at University of North Carolina, Chapel Hill, and a faculty associate at Harvard's Berkman Klein Center for Internet and Society. Her book, Twitter and Tear Gas, was published in 2017 by Yale University Press.

Machine intelligence is here, and we're already using it to make subjective decisions. But the complex way AI grows and improves makes it hard to understand and even harder to control. In this cautionary talk, techno-sociologist Zeynep Tufekci explains how intelligent machines can fail in ways that don't fit human error patterns, and in ways we won't expect or be prepared for.

In his book The Business Romantic, Tim Leberecht invites us to rediscover romance, beauty and serendipity by designing products, experiences, and organizations that make us fall back in love with our work and our life. The book inspired the creation of the Business Romantic Society, a global collective of artists, developers, designers and researchers who share the mission of bringing beauty to business.

In this talk, Tim Leberecht makes the case for a new radical humanism in a time of artificial intelligence and machine learning. For the self-described business romantic, this means designing organizations and workplaces that celebrate authenticity instead of efficiency and questions instead of answers. Leberecht proposes four principles for building beautiful organizations.

Grady Booch is Chief Scientist for Software Engineering as well as Chief Scientist for Watson/M at IBM Research, where he leads IBM's research and development for embodied cognition. Having originated the term and the practice of object-oriented design, he is best known for his work in advancing the fields of software engineering and software architecture.

Grady Booch allays our worst (sci-fi induced) fears about superintelligent computers by explaining how we'll teach, not program, them to share our human values. Rather than worry about an unlikely existential threat, he urges us to consider how artificial intelligence will enhance human life.

Tom Gruber is a product designer, entrepreneur, and AI thought leader who uses technology to augment human intelligence. He was co-founder, CTO, and head of design for the team that created the Siri virtual assistant. At Apple for over 8 years, Tom led the Advanced Development Group that designed and prototyped new capabilities for products that bring intelligence to the interface.

This talk introduces the idea of Humanistic AI. Gruber shares his vision for a future where AI helps us achieve superhuman performance in perception, creativity and cognitive function, from turbocharging our design skills to helping us remember everything we've ever read. The idea of an AI-powered personal memory also extends to relationships, with the machine helping us reflect on our interactions with people over time.

Stuart Russell is a professor (and formerly chair) of Electrical Engineering and Computer Sciences at the University of California, Berkeley. His book Artificial Intelligence: A Modern Approach (with Peter Norvig) is the standard text in AI; it has been translated into 13 languages and is used in more than 1,300 universities in 118 countries. He also works for the United Nations, developing a new global seismic monitoring system for the nuclear-test-ban treaty.

His talk centers around the question of whether we can harness the power of superintelligent AI while also preventing the catastrophe of robotic takeover. As we move closer toward creating all-knowing machines, AI pioneer Stuart Russell is working on something a bit different: robots with uncertainty. Hear his vision for human-compatible AI that can solve problems using common sense, altruism and other human values.

Dr. Pratik Shah's research creates novel intersections between engineering, medical imaging, machine learning, and medicine to improve health and diagnose and cure diseases. Research topics include: medical imaging technologies using unorthodox artificial intelligence for early disease diagnoses; novel ethical, secure and explainable artificial intelligence based digital medicines and treatments; and point-of-care medical technologies for real-world data and evidence generation to improve public health.

TED Fellow Pratik Shah is working on a clever system to train diagnostic algorithms with far less data. Using an unorthodox AI approach, Shah has developed a technology that requires as few as 50 images to develop a working algorithm, and can even use photos taken on doctors' cell phones to provide a diagnosis. Learn more about how this new way to analyze medical information could lead to earlier detection of life-threatening illnesses and bring AI-assisted diagnosis to more health care settings worldwide.
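Shah's specific technique isn't detailed in the talk summary, but a classic baseline for learning from only a handful of labeled examples is a nearest-centroid classifier. The sketch below, with hypothetical 2-D "image features", illustrates the small-data idea rather than his actual system:

```python
import math
from statistics import mean

def centroid(samples):
    """Average each feature dimension across a class's samples."""
    return [mean(dim) for dim in zip(*samples)]

def nearest_centroid(x, centroids):
    """Classify x by Euclidean distance to each class centroid --
    a simple method that can work with very few samples per class."""
    return min(centroids, key=lambda label: math.dist(x, centroids[label]))

# Hypothetical 2-D features extracted from a few labeled images
healthy = [[0.10, 0.20], [0.20, 0.10], [0.15, 0.15]]
lesion  = [[0.80, 0.90], [0.90, 0.80], [0.85, 0.85]]
cents = {"healthy": centroid(healthy), "lesion": centroid(lesion)}

print(nearest_centroid([0.82, 0.88], cents))  # lesion
```

Real medical-imaging systems operate on far richer learned features, but the principle is the same: when the features are informative, even tens of examples can separate the classes.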

Margaret Mitchell's research involves vision-language and grounded language generation, focusing on how to evolve artificial intelligence towards positive goals. Her work combines computer vision, natural language processing and social media with statistical methods and insights from cognitive science. Before Google, Mitchell was a founding member of Microsoft Research's Cognition group, focused on advancing artificial intelligence, and a researcher in Microsoft Research's Natural Language Processing group.

Margaret Mitchell helps develop computers that can communicate about what they see and understand. She tells a cautionary tale about the gaps, blind spots and biases we subconsciously encode into AI and asks us to consider what the technology we create today will mean for tomorrow.

Kriti Sharma is the founder of AI for Good, an organization focused on building scalable technology solutions for social good. Sharma was recently named in the Forbes 30 Under 30 list for advancements in AI. She was appointed a United Nations Young Leader in 2018 and is an advisor to both the United Nations Technology Innovation Labs and to the UK Government's Centre for Data Ethics and Innovation.

AI algorithms make important decisions about you all the time, like how much you should pay for car insurance or whether or not you get that job interview. But what happens when these machines are built with human bias coded into their systems? Technologist Kriti Sharma explores how the lack of diversity in tech is creeping into our AI, offering three ways we can start making more ethical algorithms.

Matt Beane does field research on work involving robots to help us understand the implications of intelligent machines for the broader world of work. Beane is an Assistant Professor in the Technology Management Program at the University of California, Santa Barbara and a Research Affiliate with MIT's Institute for the Digital Economy. He received his PhD from the MIT Sloan School of Management.

The path to skill around the globe has been the same for thousands of years: train under an expert and take on small, easy tasks before progressing to riskier, harder ones. But right now, we're handling AI in a way that blocks that path, sacrificing learning in our quest for productivity, says organizational ethnographer Matt Beane. Beane shares a vision that flips the current story into one of distributed, machine-enhanced mentorship that takes full advantage of AI's amazing capabilities while enhancing our skills at the same time.

Leila Pirhaji is the founder of ReviveMed, an AI platform that can quickly and inexpensively characterize large numbers of metabolites from the blood, urine and tissues of patients. This allows for the detection of molecular mechanisms that lead to disease and the discovery of drugs that target these disease mechanisms.

Biotech entrepreneur and TED Fellow Leila Pirhaji shares her plan to build an AI-based network to characterize metabolite patterns, better understand how disease develops and discover more effective treatments.

Janelle Shane is the owner of AIweirdness.com. Her book, You Look Like a Thing and I Love You, uses cartoons and humorous pop-culture experiments to look inside the minds of the algorithms that run our world, making artificial intelligence and machine learning both accessible and entertaining.

The danger of artificial intelligence isn't that it's going to rebel against us, but that it's going to do exactly what we ask it to do, says AI researcher Janelle Shane. Sharing the weird, sometimes alarming antics of AI algorithms as they try to solve human problems, like creating new ice cream flavors or recognizing cars on the road, Shane shows why AI doesn't yet measure up to real brains.

Sylvain Duranton is the global leader of BCG GAMMA, a unit dedicated to applying data science and advanced analytics to business. He manages a team of more than 800 data scientists and has implemented more than 50 custom AI and analytics solutions for companies across the globe.

In this talk, business technologist Sylvain Duranton advocates for a "Human plus AI" approach (using AI systems alongside humans, not instead of them) and shares the specific formula companies can adopt to successfully employ AI while keeping humans in the loop.

For more AI and machine learning TED talks, browse TED's complete topic collection.

Timothy is Solutions Review's Senior Editor. He is a recognized thought leader and influencer in enterprise BI and data analytics. Timothy has been named a top global business journalist by Richtopia. Scoop? First initial, last name at solutionsreview dot com.


REPLY: European Central Bank Explores the Possibilities of Machine Learning With a Coding Marathon Organised by Reply – Business Wire

TURIN, Italy--(BUSINESS WIRE)--The European Central Bank (ECB), in collaboration with Reply, leader in digital technology innovation, is organising the Supervisory Data Hackathon, a coding marathon focussing on the application of Machine Learning and Artificial Intelligence.

From 27 to 29 February 2020, at the ECB in Frankfurt, more than 80 participants from the ECB, Reply and other companies will explore ways to gain deeper and faster insights into the large amount of supervisory data the ECB gathers from financial institutions through regular financial reporting for risk analysis. The coding marathon provides a protected space to co-creatively develop new ideas and prototype solutions based on Artificial Intelligence within a short timeframe.

Ahead of the event, participants submit projects in the areas of data quality, interlinkages in supervisory reporting and risk indicators. The most promising submissions will be worked on for 48 hours during the event by the multidisciplinary teams composed of members from the ECB, Reply and other companies.

Reply has proven its Artificial Intelligence and Machine Learning capabilities with numerous projects in various industries and combines this technological expertise with in-depth knowledge of the financial services industry and its regulatory environment.

Coding marathons using the latest technologies are a substantial element in Reply's toolset for sparking innovation through training and knowledge transfer, internally and with clients and partners.

Reply [MTA, STAR: REY] specialises in the design and implementation of solutions based on new communication channels and digital media. As a network of highly specialised companies, Reply defines and develops business models enabled by the new models of big data, cloud computing, digital media and the internet of things. Reply delivers consulting, system integration and digital services to organisations across the telecom and media; industry and services; banking and insurance; and public sectors. http://www.reply.com


VUniverse Named One of Five Finalists for SXSW Innovation Awards: AI & Machine Learning Category – PRNewswire

NEW YORK, Feb. 5, 2020 /PRNewswire/ -- VUniverse, a personalized movie and show recommendation platform that enables users to browse their streaming services in one app, a channel guide for the streaming universe, announced today it's been named one of five finalists in the AI & Machine Learning category for the 23rd annual SXSW Innovation Awards.

The SXSW Innovation Awards recognizes the most exciting tech developments in the connected world. During the showcase on Saturday, March 14, 2020, VUniverse will offer first-look demos of its platform as attendees explore this year's most transformative and forward-thinking digital projects. They'll be invited to experience how VUniverse utilizes AI to cross-reference all streaming services a user subscribes to and then delivers personalized suggestions of what to watch.

"We're honored to be recognized as a finalist for the prestigious SXSW Innovation Awards and look forward to showcasing our technology that helps users navigate the increasingly ever-changing streaming service landscape," said VUniverse co-founder Evelyn Watters-Brady. "With VUniverse, viewers will spend less time searching and more time watching their favorite movies and shows, whether it be a box office hit or an obscure indie gem."

About VUniverse
VUniverse is a personalized movie and show recommendation platform that enables users to browse their streaming services in one app, a channel guide for the streaming universe. Using artificial intelligence, VUniverse creates a unique taste profile for every user and serves smart lists of curated titles using mood, genre, and user-generated tags, all based on content from the user's existing subscription services. Users can also create custom watchlists and share them with friends and family.

Media Contact: Jessica Cheng, jessica@relativity.ventures

SOURCE VUniverse

Continue reading here:
VUniverse Named One of Five Finalists for SXSW Innovation Awards: AI & Machine Learning Category - PRNewswire

Read More..

Top Machine Learning Projects Launched By Google In 2020 (Till Date) – Analytics India Magazine

It may be that time of the year when new year resolutions start to fizzle, but Google seems to be just getting started. The tech giant has been building tools and services to bring the benefits of artificial intelligence (AI) to its users. The company has begun upping its arsenal of AI-powered products with a string of new releases this month alone.

Here is a list of the top products launched by Google in January 2020.

First introduced in 2014, sequence-to-sequence (seq2seq) AI models have, in their latest iterations, strengthened key text-generating tasks including sentence formation and grammar correction. Google's LaserTagger, which the company has open-sourced, speeds up the text generation process and reduces the chance of errors.

Compared to traditional seq2seq methods, LaserTagger computes predictions up to 100 times faster, making it suitable for real-time applications. Furthermore, it can be plugged into an existing technology stack without adding any noticeable latency on the user side because of its high inference speed. These advantages become even more pronounced when applied at a large scale.
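One reason text-editing models of this kind can be so much faster than traditional seq2seq generation is that they predict a small edit operation per input token instead of generating every output word from scratch. The toy sketch below illustrates the general edit-tagging idea; the tag names (KEEP, DELETE, ADD_&lt;word&gt;) are hypothetical and are not LaserTagger's actual tag vocabulary.

```python
# Toy sketch of the edit-tagging idea behind text-editing models such as
# LaserTagger. Tag names are invented for illustration only.

def apply_edit_tags(tokens, tags):
    """Apply one edit tag per token: KEEP it, DELETE it, or ADD_<word> before it."""
    out = []
    for token, tag in zip(tokens, tags):
        if tag == "KEEP":
            out.append(token)
        elif tag == "DELETE":
            continue
        elif tag.startswith("ADD_"):
            out.append(tag[len("ADD_"):])  # insert the new word...
            out.append(token)              # ...then keep the original token
    return " ".join(out)

# Grammar correction expressed as tagging: "he go to school" -> "he goes to school"
print(apply_edit_tags(
    ["he", "go", "to", "school"],
    ["KEEP", "DELETE", "ADD_goes", "KEEP"],
))  # he goes to school
```

Because the output vocabulary is a small, fixed set of tags rather than the whole language, inference is much cheaper, which is what makes the claimed real-time speedups plausible.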

The company has expanded its Coral lineup by unveiling two new Coral AI products: the Coral Dev Board Mini and the Coral Accelerator Module. Announced ahead of the Consumer Electronics Show (CES) this year, the latest additions to the Coral family followed a successful beta run of the platform in October 2019.

The Coral Accelerator Module is a multi-chip package that encapsulates the company's custom-designed Edge Tensor Processing Unit (TPU). The chip inside the Coral Dev Board is designed to execute multiple computer vision models at 30 frames per second or a single model at over 100fps. Users of this technology have said that it is easy to integrate into custom PCB designs.

Coral Accelerator Module, a new multi-chip module with Google Edge TPU.

Google has also released the Coral Dev Board Mini which provides a smaller form-factor, lower-power, and a cost-effective alternative to the Coral Dev Board.

Caption: The Coral Dev Board Mini is a cheaper, smaller and lower power version of the Coral Dev Board

Officially announced in March 2019, the Coral products were intended to help developers work more efficiently by reducing their reliance on connections to cloud-based systems by creating AI that works locally.

Chatbots are one of the hottest trends in AI owing to tremendous growth in their applications. Google has added to the mix with Meena, its human-like, multi-turn, open-domain chatbot. Meena has been trained end-to-end on data mined from public social media conversations, totalling more than 300GB of text. Furthermore, it is massive in size, with a 2.6-billion-parameter neural network, and has been trained to minimize the perplexity of the next token.

Furthermore, Google's human evaluation metric, called Sensibleness and Specificity Average (SSA), also captures the key elements of a human-like multi-turn conversation, making this chatbot even more versatile. In a blog post, Google claimed that Meena can conduct conversations that are more sensible and specific than existing state-of-the-art chatbots.
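The training objective mentioned above, minimizing perplexity, has a compact definition: perplexity is the exponential of the average negative log-probability a model assigns to each next token, so lower values mean the model is less "confused" about what comes next. A minimal sketch:

```python
import math

def perplexity(token_probs):
    """Exp of the average negative log-probability assigned to each token."""
    n = len(token_probs)
    return math.exp(-sum(math.log(p) for p in token_probs) / n)

# A model that assigns uniform probability over a 4-word vocabulary is
# "4 ways confused" about every next token:
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # approximately 4.0
```

A perfectly confident model (probability 1.0 on every correct token) would score a perplexity of exactly 1, the theoretical floor.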

Billed as an important development of Google's Transformer, the novel neural network architecture for language understanding, Reformer is intended to handle context windows of up to 1 million words, all on a single AI accelerator using only 16GB of memory.

Google first mooted the idea of a new transformer model in a 2019 research paper written in collaboration with UC Berkeley. The core idea behind this model was self-attention, the ability to attend to different positions of an input sequence to compute a representation of that sequence, as elaborated in one of our articles.

Today, Reformer can process whole books concurrently, and on a single device at that, exhibiting great potential.

Google has time and again reiterated its commitment to the development of AI. Seeing it as more profound than fire or electricity, it firmly believes that this technology can eliminate many of the constraints we face today.

The company has also delved into AI research spread across a host of sectors, whether detecting breast cancer or protecting whales and other endangered species.


See the original post here:
Top Machine Learning Projects Launched By Google In 2020 (Till Date) - Analytics India Magazine

Read More..

Reinforcement Learning (RL) Market Report & Framework, 2020: An Introduction to the Technology – Yahoo Finance

Dublin, Feb. 04, 2020 (GLOBE NEWSWIRE) -- The "Reinforcement Learning: An Introduction to the Technology" report has been added to ResearchAndMarkets.com's offering.

These days, machine learning (ML), which is a subset of computer science, is one of the most rapidly growing fields in the technology world. It is considered to be a core field for implementing artificial intelligence (AI) and data science.

The adoption of data-intensive machine learning methods like reinforcement learning is playing a major role in decision-making across various industries such as healthcare, education, manufacturing, policing, financial modeling and marketing. The growing demand for more complex machine behavior is driving demand for learning-based methods in the ML field. Reinforcement learning also presents a unique opportunity to address the dynamic behavior of systems.

This study was conducted to understand the current state of reinforcement learning, track its adoption across various verticals, and put forth ways to fully exploit the benefits of this technology. It will serve as a guide and benchmark for technology vendors, manufacturers of the hardware that supports AI, and the end users who will ultimately use this technology. Decision-makers will find the information useful in developing business strategies and in identifying areas for research and development.

The report includes:

Key Topics Covered

Chapter 1 Reinforcement Learning

Chapter 2 Bibliography

List of Tables
Table 1: Reinforcement Learning vs. Supervised Learning vs. Unsupervised Learning
Table 2: Global Machine Learning Market, by Region, Through 2024

List of Figures
Figure 1: Reinforcement Learning Process
Figure 2: Reinforcement Learning Workflow
Figure 3: Artificial Intelligence vs. Machine Learning vs. Reinforcement Learning
Figure 4: Machine Learning Applications
Figure 5: Types of Machine Learning
Figure 6: Reinforcement Learning Market Dynamics
Figure 7: Global Machine Learning Market, by Region, 2018-2024

For more information about this report visit https://www.researchandmarkets.com/r/g0ad2f

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

CONTACT: ResearchAndMarkets.com
Laura Wood, Senior Press Manager
press@researchandmarkets.com
For E.S.T Office Hours Call 1-917-300-0470
For U.S./CAN Toll Free Call 1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

View original post here:
Reinforcement Learning (RL) Market Report & Framework, 2020: An Introduction to the Technology - Yahoo Finance

Read More..

SwRI, SMU fund SPARKS program to explore collaborative research and apply machine learning to industry problems – TechStartups.com

Southwest Research Institute (SwRI) and the Lyle School of Engineering at Southern Methodist University (SMU) announced the Seed Projects Aligning Research, Knowledge, and Skills (SPARKS) joint program, which aims to strengthen and cultivate long-term research collaboration between the organizations.

Research topics will vary for the annual funding cycles. The inaugural program selections will apply machine learning, a subset of artificial intelligence (AI), to solve industry problems. A peer review panel selected two proposals for the 2020 cycle, with each receiving $125,000 in funding for a one-year term.

"Our plan for the SPARKS program is not only to foster a close collaboration between our two organizations but, more importantly, to also make a long-lasting impact in our collective areas of research," said Lyle Dean Marc P. Christensen. "With the growing demand for AI tools in industry, machine learning was an obvious theme for the program's inaugural year."

The first selected project is a proof of concept that will lay the groundwork for drawing relevant data from satellite and other sources to assess timely surface moisture conditions applicable to other research. SwRI will extract satellite, terrain and weather data that will be used by SMU Lyle to develop machine learning functions that can rapidly process these immense quantities of data. The interpreted data can then be applied to research for municipalities, water management authorities, agricultural entities and others to produce, for example, fire prediction tools and maps of soil or vegetation water content. Dr. Stuart Stothoff of SwRI and Dr. Ginger Alford of SMU Lyle are principal investigators of "Enhanced Time-resolution Backscatter Maps Using Satellite Radar Data and Machine Learning."

The second project tackles an issue related to the variability of renewable energy from wind and solar power systems: effective management of renewable energy supplies to keep the power grid stable. To help resolve this challenge, the SwRI-SMU Lyle team will use advanced machine learning techniques to model and control battery energy storage systems. These improved battery storage systems, which would automatically and strategically push or draw power instantly in response to grid frequency deviations, could potentially be integrated with commercial products and tools to help regulate the grid. Principal investigators of "Machine Learning-powered Battery Storage Modeling and Control for Fast Frequency Regulation Service" are Dr. Jianhui Wang of SMU Lyle and Yaxi Liu of SwRI.

"To some extent, the SPARKS program complements our internal research efforts, which are designed to advance technologies and processes so they can be directly applied to industry programs," said Executive Vice President and COO Walt Downing of SwRI. "We expect the 2020 selections to do just that, greatly advancing the areas of environmental management and energy storage and supply."

The program will fund up to three projects each year, seeking to bridge the gap between basic and applied research.

Read the original here:
SwRI, SMU fund SPARKS program to explore collaborative research and apply machine learning to industry problems - TechStartups.com

Read More..

How to handle the unexpected in conversational AI – ITProPortal

One of the biggest challenges for developers of natural language systems is accounting for the many and varied ways people express themselves. There is a reason many technology companies would rather we all spoke in simple terms: it makes humans easier to understand and narrows the chances of machines getting it wrong.

But it's hardly the engaging conversational experience that people expect of AI.

Language has evolved over many centuries. As various nations colonised and traded with one another, our language, whatever your native tongue, changed. And thanks to radio, TV, and the internet, it's continuing to expand every day.

Among the hundreds of new words added to the Merriam-Webster dictionary in 2019 were "vacay," a shortening of vacation; "haircut," which gained a new sense meaning a reduction in the value of an asset; and "dad joke," a corny pun normally told by fathers.

In a conversation, we as humans would probably be able to deduce what someone meant, even if we'd never heard a word or expression before. Machines? Not so much. Or at least, not if they rely solely on machine learning for their natural language understanding.

While adding domain specialism, such as a product name or industry terminology, to an application helps a machine recognise specific words, understanding all of the general, everyday phrases people use in between those words is where the real challenge lies.

Most commercial natural language development tools today don't offer the intelligent, humanlike experience that customers expect in automated conversations. One of the reasons is that they rely on pattern-matching words using machine learning.

Although humans, at a basic level, pattern-match words too, our brains add a much higher level of reasoning that lets us do a better job of interpreting what a person meant: considering the words used, their order, synonyms and more, plus understanding when a word such as "book" is being used as a verb or a noun. One might say we add our own more flexible form of linguistic modelling.

As humans, we can zoom in on the vocabulary that is relevant to the current discussion. So, when someone asks a question using a phrasing we've not heard before, we can extrapolate from what we do know to understand what is meant. Even if we've never heard a particular word before, we can guess with a high degree of accuracy what it means.

But when it comes to machines, most statisticians will tell you that accuracy isn't a great metric. It's too easily skewed by the data it's based on. Instead of accuracy, they use precision and recall. In simple terms, precision is about quality: it marks how often your predictions were actually correct. Recall is about quantity: how many of all the cases you should have caught were predicted correctly.
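The two metrics can be made concrete with a small sketch in the intent-classification setting discussed here; the labels and example predictions below are invented for illustration.

```python
def precision_recall(predicted, actual, positive):
    """Precision and recall for one class, given parallel label lists."""
    tp = sum(p == positive == a for p, a in zip(predicted, actual))  # true positives
    fp = sum(p == positive != a for p, a in zip(predicted, actual))  # false positives
    fn = sum(a == positive != p for p, a in zip(predicted, actual))  # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0  # quality: correct when predicted
    recall = tp / (tp + fn) if tp + fn else 0.0     # quantity: caught of all real cases
    return precision, recall

# A classifier that over-predicts "yes": high recall, lower precision.
p, r = precision_recall(
    predicted=["yes", "yes", "no", "yes"],
    actual=["yes", "no", "no", "yes"],
    positive="yes",
)
print(p, r)  # 0.666..., 1.0
```

The trade-off shown here is exactly the one the article goes on to describe: rules buy precision, machine learning buys recall.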

The vast majority of conversational AI development tools available today rely purely on machine learning. However, machine learning isn't great at precision, not without massive amounts of data on which to build its model. The end result is that the developer has to code in each and every way someone might ask a question. Not a task for the faint-hearted when you consider there are at least 22 ways to say yes in the English language.

Some development tools rely on linguistic modelling, which is great at precision because it understands sentence constructs and the common ways a particular type of question is phrased, but it often doesn't stack up to machine learning's recall ability. This is because linguistic modelling is based on binary rules: they either match or they don't, which means inputs with minor deviations, such as different word ordering or spelling mistakes, will be missed.

Machine learning, on the other hand, provides a probability of how closely the input matches the training data for a particular intent class, and is therefore less sensitive to minor variations. Used alone, neither approach is conducive to delivering a highly engaging conversation.

However, by taking a hybrid approach to conversational AI development, enterprises can benefit from the best of both worlds. Rules increase the precision of understanding, while machine learning delivers greater recall by recovering the data missed by the rules.
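The division of labour described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual implementation: the intent names, patterns, keyword sets and threshold are all invented, and the keyword-overlap scorer stands in for a trained machine learning model.

```python
import re

# High-precision rules, tried first. Binary: they match or they don't.
RULES = {
    "book_flight": re.compile(r"\bbook\b.*\bflight\b"),
    "cancel_order": re.compile(r"\bcancel\b.*\border\b"),
}

# Stand-in for ML training data: keywords associated with each intent.
KEYWORDS = {
    "book_flight": {"book", "fly", "flight", "ticket"},
    "cancel_order": {"cancel", "order", "refund"},
}

def classify(text, threshold=0.3):
    text = text.lower()
    # 1) Rules: exact-construct matches give maximum confidence.
    for intent, pattern in RULES.items():
        if pattern.search(text):
            return intent, 1.0
    # 2) Probabilistic fallback: score each intent by keyword overlap,
    #    recovering phrasings the rules missed.
    words = set(re.findall(r"\w+", text))
    best_intent, best_score = None, 0.0
    for intent, keywords in KEYWORDS.items():
        score = len(words & keywords) / len(keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    if best_score >= threshold:
        return best_intent, best_score
    # 3) Safety net: hand off to a live agent.
    return "fallback_to_agent", best_score

print(classify("please book me a flight"))   # rule hit: ('book_flight', 1.0)
print(classify("i need a flight ticket"))    # recovered by the fallback scorer
```

Note how "i need a flight ticket" matches no rule but is still recovered, which is precisely the recall benefit the hybrid approach is claimed to deliver.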

Not only does this significantly speed up the development process, it also allows the application to deal with examples it has never seen before. In addition, it reduces the number of customers sent to a safety net, such as a live chat agent, merely because they've phrased their question slightly differently.

By enabling the conversational AI development platform to decide where each model is used, the performance of the conversational system can be optimised even further, making it easier for the developer to build robust applications by automatically mixing and matching the underlying technology to achieve the best results, while allowing technology to more easily understand humans no matter what words we choose to use.

Andy Peart, CMSO, Artificial Solutions

Link:
How to handle the unexpected in conversational AI - ITProPortal

Read More..

Global Altcoin Breakout Could Usher Return of Alt Season Crypto Riches – newsBTC

The altcoin market has been ablaze all of 2020 thus far, after the total crypto market and many individual altcoins broke out from downtrend resistance and went on massive rallies.

Next, altcoins are preparing for a global breakout above major, horizontal resistance, which, if it holds, could lead to a return of alt season and riches beyond investors' wildest imaginations.

Across the top ten cryptocurrencies by market cap, altcoins have been exploding in value, with some gaining over 400% during their recent local rallies. Ethereum, Litecoin, and even XRP are now through their diagonal downtrend lines.

Related Reading | Altcoin Market Preparing Shocking Disbelief Rally According to Wall Street Cheat Sheet

The collective bullish price action across the altcoin space is causing the total altcoin market to contest horizontal, overhead resistance, potentially the last remaining hurdle before a new bull market is in full effect.

According to one crypto analyst, if the altcoin market cap breaks out and holds above $90 billion USD for a few daily candle closes, the market will be primed for additional, powerful upside in the coming months.

It's revived discussion across the crypto community about an alt season coming soon.

Alt seasons are often short yet explosive periods in which altcoins outperform Bitcoin by a significant margin and often go on rallies resulting in gains of a few thousand percent or more.

The idea of alt season turned into a meme over 2019: each time it was mentioned, it almost immediately led to a deep selloff in altcoins.

Before the breakout occurred, the bottoming pattern, according to another analyst, appeared to resemble early Bitcoin price charts, suggesting that life-changing wealth could be ahead for altcoin investors.

If an alt season does occur and history repeats itself, that's exactly what will happen.

During a previous alt season, Litecoin rallied from $96 to $420 in just five days, a 337% increase.

In roughly the same amount of time, XRP rallied from 16 cents apiece to as much as $3.50 per XRP token, resulting in an over 2,000% gain for Ripple investors.

Ethereum during this time rallied from $300 to $1,400 per ETH. A year earlier each ETH token was just $5.
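The gains quoted above are straightforward percentage increases from a starting price to an ending price; a quick sketch checks the arithmetic against the article's figures.

```python
def percent_gain(start, end):
    """Percentage increase from a starting price to an ending price."""
    return (end - start) / start * 100

print(percent_gain(96, 420))     # Litecoin: 337.5
print(percent_gain(0.16, 3.50))  # XRP: roughly 2,087, i.e. "over 2,000%"
print(percent_gain(300, 1400))   # Ethereum: roughly 367
```

The Litecoin and Ethereum figures line up with the 337% and $300-to-$1,400 moves described in the text, and XRP's run from 16 cents to $3.50 is indeed an over 2,000% gain.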

It wasn't uncommon to see such incredible returns on investment. It was this buying frenzy that caused the crypto hype bubble in late 2017, a bubble that popped and erased much of those astronomical gains.

Related Reading | 10 Factors Confirm a New Crypto Bull Market Has Officially Begun

Although altcoins are pumping as of late, just as analysts are claiming, their primary use remains little more than speculation. So the next time your altcoin portfolio reaches gains of 1,000% or more, consider taking some profit, because there's always a chance that what goes up comes back down.

Follow this link:
Global Altcoin Breakout Could Usher Return of Alt Season Crypto Riches - newsBTC

Read More..