
Structure-based AI tool can predict wide range of very different reactions – Chemistry World

New software can predict a wide range of reaction outcomes while remaining more flexible than other programs when confronted with completely different chemical problems. The machine-learning platform, which uses structure-based molecular representations instead of big reaction-based datasets, could find diverse applications in organic chemistry.

Although machine-learning methods have been widely used to predict the molecular properties and biological activities of target molecules, their application in predicting reaction outcomes has been limited because current models usually can't be transferred to different problems. Instead, complex parameterisation is required for each individual case to achieve good results. Researchers in Germany are now reporting a general approach that overcomes this limitation.

"Previous models for accurately predicting reaction results have been highly complex and problem-specific," says Frank Glorius of the University of Münster, Germany, who led the study. "They are mostly based on a previously gained understanding of the underlying processes and cannot be transferred to other problems. In our approach, we use a universal representation of the involved compounds, which is solely based on their molecular structures. This allows for a general applicability of our program to diverse problem sets."

The new tool is based on the assumption that reactivity can be directly derived from a molecule's structure and uses an input based on multiple fingerprint features as an all-round molecular representation. Frederik Sandfort, who also participated in the research, explains that organic compounds can be represented as graphs on which simple structural (yes/no) queries can be carried out. "Fingerprints are number sequences based on the combination of many such successive queries," he says. "They have originally been developed for structural similarity searches and were proven to be well-suited for application in computational models. We use a large number of different fingerprints to represent the molecular structure of each compound as accurately as possible."
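The article includes no code, but the idea of stacking several structure-derived fingerprints into one feature vector is easy to sketch with the open-source RDKit toolkit. The snippet below illustrates that general approach only, not the Münster group's implementation; the SMILES strings, fingerprint choices and sizes are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' code): build an "all-round"
# molecular representation by concatenating several RDKit fingerprints.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem, MACCSkeys

def featurize(smiles: str) -> np.ndarray:
    """Turn a SMILES string into one numeric vector of stacked fingerprints."""
    mol = Chem.MolFromSmiles(smiles)
    morgan = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=1024)  # circular fingerprint
    maccs = MACCSkeys.GenMACCSKeys(mol)                                 # 166 structural keys
    return np.concatenate([np.array(list(morgan)), np.array(list(maccs))])

# Hypothetical reaction components; a reaction is represented by stacking
# the vectors of its substrate, reagent, catalyst, etc. into one input row.
substrate = featurize("Brc1ccccc1")                  # aryl bromide
ligand = featurize("CC(C)(C)P(C(C)(C)C)C(C)(C)C")    # trialkylphosphine
x = np.concatenate([substrate, ligand])
print(x.shape)   # one row of the design matrix for a downstream model
```

A standard regressor trained on rows like this against measured yields or selectivities is the kind of reaction-agnostic setup the article describes.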

Glorius points out that their platform is very versatile. "While our model can be used to predict molecular properties, its most important application is the accurate prediction of reaction results," he says. "We could predict enantioselectivities and yields with comparable accuracy to previous problem-specific models. Furthermore, the model was applied to predicting relative conversion based on a high-throughput data set which was never tackled using machine learning before."

The program is also easy to use, the researchers say. "It only requires the input data in a very simple form and some problem-specific settings," explains Sandfort. He adds that the tool is already online and will be updated further with the team's most recent developments.

Robert Paton at Colorado State University and the Center for Computer Assisted Synthesis, US, who was not involved in the study, notes that machine-learning methods are being increasingly used to identify patterns in data that can help to predict the outcome of experiments. "Chemists have managed to harness these techniques by converting molecular structures into vectors of numbers that can then be passed to learning algorithms," he says. "Representations using information only from a molecule's atoms and their connectivity are agnostic to the particular reaction and as a result may be used across multiple reaction types for different types of predictions. Future developments in interpreting these predictions, a challenge shared by all machine learning approaches, will be valuable."

See original here:
Structure-based AI tool can predict wide range of very different reactions - Chemistry World

Read More..

With Launch of COVID-19 Data Hub, The White House Issues A ‘Call To Action’ For AI Researchers – Machine Learning Times – machine learning & data…

Originally published in TechCrunch, March 16, 2020

In a briefing on Monday, research leaders across tech, academia and the government joined the White House to announce an open data set full of scientific literature on the novel coronavirus. The COVID-19 Open Research Dataset, known as CORD-19, will also add relevant new research moving forward, compiling it into one centralized hub. The new data set is machine readable, making it easily parsed for machine learning purposes, a key advantage according to researchers involved in the ambitious project.

In a press conference, U.S. CTO Michael Kratsios called the new data set "the most extensive collection of machine readable coronavirus literature to date." Kratsios characterized the project as a "call to action" for the AI community, which can employ machine learning techniques to surface unique insights in the body of data. To provide guidance for researchers combing through the data, the National Academies of Sciences, Engineering, and Medicine collaborated with the World Health Organization to come up with high-priority questions about the coronavirus related to genetics, incubation, treatment, symptoms and prevention.

The partnership, announced today by the White House Office of Science and Technology Policy, brings together the Chan Zuckerberg Initiative, Microsoft Research, the Allen Institute for Artificial Intelligence, the National Institutes of Health's National Library of Medicine, Georgetown University's Center for Security and Emerging Technology, Cold Spring Harbor Laboratory and the Kaggle AI platform, owned by Google.

The database brings together nearly 30,000 scientific articles about the virus known as SARS-CoV-2, as well as related viruses in the broader coronavirus group. Around half of those articles make the full text available. Critically, the database will include pre-publication research from resources like medRxiv and bioRxiv, open access archives for pre-print health sciences and biology research.
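For a sense of what "machine readable" buys researchers in practice, here is a minimal, hedged sketch that loads the dataset's article metadata with pandas and runs a crude keyword filter. The metadata.csv file name and column names reflect common public CORD-19 releases and may differ from the exact version announced here.

```python
# Minimal exploration sketch for CORD-19 (assumed file and column names).
import pandas as pd

# Public releases ship a metadata table alongside the full-text JSON files.
meta = pd.read_csv("CORD-19/metadata.csv", low_memory=False)

# Crude keyword filter over abstracts, e.g. for incubation-period studies.
with_abstract = meta.dropna(subset=["abstract"])
mask = with_abstract["abstract"].str.contains("incubation period", case=False)
print(with_abstract.loc[mask, ["title", "publish_time"]].head())
```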


Link:
With Launch of COVID-19 Data Hub, The White House Issues A 'Call To Action' For AI Researchers - Machine Learning Times - machine learning & data...

Read More..

Fritz brings on-device AI to Android and iOS – VentureBeat

Fritz AI, a startup providing an AI and machine learning development platform for Android and iOS, today announced that it has raised $5 million. CEO Dan Abdinoor says the capital will accelerate Fritz's expansion as it launches its product out of early access, a platform he asserts addresses the challenges of mobile AI for businesses with toolkits that facilitate development, management, and execution.

Successfully deploying AI and machine learning models to production isn't often a walk in the park. In a recent study conducted by IDC analysts, only 25% of organizations said they'd successfully adopted an enterprise-wide AI strategy, and it's estimated that 50% of companies spend between 8 and 90 days developing a single AI model.

To address this challenge, Fritz provides a cross-platform software development kit (SDK) with pretrained models for object detection, image segmentation, image labeling, style transfer, pose estimation, and more baked in. Using its end-to-end suite for building and deploying custom trained models, developers can generate and collect labeled data sets and train optimized models without code, and they can improve those models with fresh data uploaded continuously.

Apple and Google offer mobile machine learning solutions in Core ML and ML Kit, respectively, but they're platform-specific. Plus, Fritz's prebuilt models don't require an internet connection, and they run atop live video with a fast frame rate. All of them optionally perform inference on-device and come in several sizes, from small models tailored for size and bandwidth to fast models optimized for processing speed.
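Fritz's SDK itself is not shown here, but the general pattern its models follow (bundled with the app, run offline, frame by frame) can be sketched with TensorFlow Lite's Python interpreter as a stand-in; the model file name and input shape below are placeholders, not anything shipped by Fritz.

```python
# Generic on-device inference sketch using TensorFlow Lite (a stand-in,
# not Fritz's SDK). The model path and input shape are placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="pose_estimation.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# One video frame, resized to whatever the bundled model expects (assumed 256x256 RGB).
frame = np.random.rand(1, 256, 256, 3).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()                       # runs entirely on-device, no network call
keypoints = interpreter.get_tensor(output_details[0]["index"])
print(keypoints.shape)
```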

Fritz enables customers to generate synthetic data or collect data for annotation and to benchmark on-device model performance before deployment. It supports the deployment of model versions to test devices while training and tracks the configurations, and it protects models from attackers while improving those running on-device by analyzing platform, device, and processor performance.

Among Fritz's customers are Momento, One Bite, Video Star, PlantVillage, MDacne, Superimpose X, and Instasaber, who've used its workflows to develop models that change hair color in real time, replace photo backgrounds, identify food like pizza, detect pets, and create stickers for messaging apps. The company has a rival in Polarr, which last year raised $11.5 million for its offline, on-device computational photography that's used by companies including Qualcomm, Oppo, and Hober. But Abdinoor asserts that Fritz has a competitive advantage in the breadth of its product portfolio.

Foundry Group led Boston, Massachusetts-based Fritz's latest round with participation from NextGen Venture Partners, Inner Loop Capital, Eniac Ventures, Uncork Capital, and Hack VC, which brings the company's total raised to $7 million. Fritz has about 24 employees.

Read the original:
Fritz brings on-device AI to Android and iOS - VentureBeat

Read More..

So only 12% of supply chain pros are using AI? Apparently. – Supply Chain Dive

Chart: Matt Leonard / Supply Chain Dive, data from MHI Annual Industry Report

One problem with pinning down how many people are using AI is that if you ask two people what they consider to be AI, you'll get two different answers. I know because I did just that.

Thomas D. Boykin, a supply chain specialist at Deloitte and leader of the MHI white paper, said he considered AI to be not just predictive analytics and prescriptive analytics, but a system where the human is taken completely out of the loop. An example would be a system used by a waste management company to reroute vehicles based on sensor data from waste receptacles around the service area, Boykin said.

"There's some things that can be executed systematically without human intervention," he said in an interview. "And for us, that's where AI comes in."

But the definition is quite different for Stefan Nusser, the VP of product at Fetch Robotics who used to run the Cloud AI team for Google in Europe.

"In my mind, any data-driven, model-based machine learning approach that to me is AI,"Nusser said in an interview with Supply Chain Dive. This would include an algorithm based on historical data that provides outputs with a certain level of accuracy, he said.

For Nusser, if AI is the car then machine learning is the engine. In this case, many of the methods analysts currently use for predictive analytics (clustering, classification, etc.) would be considered AI.
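Under that broad definition, even a small classifier fit to historical data qualifies. The sketch below is a generic scikit-learn example in that spirit; the "late shipment" features and labels are synthetic and invented for illustration, not drawn from the MHI report or either interviewee.

```python
# Generic predictive-analytics example in the spirit of Nusser's definition:
# a model trained on historical data that predicts with measurable accuracy.
# The shipment features and labels below are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # e.g. distance, order size, carrier lead time
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))
```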

But definitions aside, Boykin and Nusser agree that AI is far from widespread within the supply chain at this point.

"I think penetration is just slow,"Nusser said.

There are also two ways a company could be using AI: relying on AI capabilities built into vendor software, or building and training its own models in-house.

The latter of these approaches is probably even rarer in the world of supply chain right now unless you're a transportation company trying to predict traffic and optimize route planning, Nusser said.

"I really doubt 12% of companies have that level of investment in AI," he said about in-house modeling capabilities.

This doesn't mean companies aren't interested in using more of the technology. However, bringing the required talent on board can be a struggle. Fifty-six percent of respondents considered hiring a top challenge in the current environment and 78% said there was high competition for the talent available.

Access to data is another issue.

AI applications are trained on historical data and, depending on the application, a company will need to ensure access to its data as a first step. But the MHI report found that only 16% of respondents consider their organization's data stream management to be either "good" or "excellent."

Data is more available thanks to cheap sensors and other Internet of Things technology, but "it also presents a problem with being able to synthesize it and filter it and understand what data is needed to drive what insights," Boykin said.

Putting this data in the cloud can make it easier to share with vendors and other business partners when looking to create an AI application with outside help, Nusser said. These struggles aside, he still considers AI a good technology for companies to invest in.

While some technologies like blockchain might have been overhyped, AI has proven itself.

"I do think that it has the potential to even exceed what people are expecting from it,"Boykin said. "And I do think it is a worthwhile investment."

So what is AI good for? It's great for understanding unstructured data like images or language, Nusser said.

Within a warehouse this could mean using cameras to get a better understanding of inventory, the use of robotics or anything else in the physical world, he said.

"The value I see us bringing to the table is the physical world: an understanding of the physical world, an understanding of what get's touched ... how the environment changes over time?" he said.

This story was first published in our weekly newsletter, Supply Chain Dive: Operations. Sign up here.

See more here:
So only 12% of supply chain pros are using AI? Apparently. - Supply Chain Dive

Read More..

AI Is Changing Work and Leaders Need to Adapt – Harvard Business Review

Executive Summary

Recent empirical research by the MIT-IBM Watson AI Lab provides new insight into how work is changing in the face of AI. Based on this research, the author provides a roadmap for leaders intent on adapting their workforces and reallocating capital while also delivering profitability, arguing that the key to unlocking AI's productivity potential while meeting business objectives lies in three strategies: rebalancing resources, investing in workforce reskilling and, on a larger scale, advancing new models of education and lifelong learning.

As AI is increasingly incorporated into our workplaces and daily lives, it is poised to fundamentally upend the way we live and work. Concern over this looming shift is widespread. A recent survey of 5,700 Harvard Business School alumni found that 52% of even this elite group believe the typical company will employ fewer workers three years from now.

The advent of AI poses new and unique challenges for business leaders. They must continue to deliver financial performance, while simultaneously making significant investments in hiring, workforce training, and new technologies that support productivity and growth. These seemingly competing business objectives can make for difficult, often agonizing, leadership decisions.

Against this backdrop, recent empirical research by our team at the MIT-IBM Watson AI Lab provides new insight into how work is changing in the face of AI. By examining these findings, we can create a roadmap for leaders intent on adapting their workforces and reallocating capital, while also delivering profitability.

The stakes are high. AI is an entirely new kind of technology, one that has the ability to anticipate future needs and provide recommendations to its users. For business leaders, that unique capability has the potential to increase employee productivity by taking on administrative tasks, providing better pricing recommendations to sellers, and streamlining recruitment, to name a few examples.

For business leaders navigating the AI workforce transition, the key to unlocking the productivity potential while delivering on business objectives lies in three key strategies: rebalancing resources, investing in workforce reskilling and, on a larger scale, advancing new models of education and lifelong learning.

Our research report offers a window into how AI will change workplaces through the rebalancing and restructuring of occupations. Using AI and machine learning techniques, our MIT-IBM Watson AI Lab team analyzed 170 million online job posts between 2010 and 2017. The study's first implication: while occupations change slowly (over years and even decades), tasks become reorganized at a much faster pace.

Jobs are a collection of tasks. As workers take on jobs in various professions and industries, it is the tasks they perform that create value. With the advancement of technology, some existing tasks will be replaced by AI and machine learning. But our research shows that only 2.5% of jobs include a high proportion of tasks suitable for machine learning. These include positions like usher, lobby attendant, and ticket taker, where the main tasks involve verifying credentials and allowing only authorized people to enter a restricted space.

Most tasks will still be best performed by humans, whether by craft workers like plumbers, electricians and carpenters, or by those who do design or analysis requiring industry knowledge. And new tasks will emerge that require workers to exercise new skills.

As this shift occurs, business leaders will need to reallocate capital accordingly. Broad adoption of AI may require additional research and development spending. Training and reskilling employees will very likely require temporarily removing workers from revenue-generating activities.

More broadly, salaries and other forms of employee compensation will need to reflect the shifting value of tasks all along the organization chart. Our research shows that as technology reduces the cost of some tasks because they can be done in part by AI, the value workers bring to the remaining tasks increases. Those tasks tend to require grounding in intellectual skill and insight, something AI isn't as good at as people.

In high-wage business and finance occupations, for example, compensation for tasks requiring industry knowledge increased by more than $6,000, on average, between 2010 and 2017. By contrast, average compensation for manufacturing and production tasks fell by more than $5,000 during that period. As AI continues to reshape the workplace, business leaders who are mindful of this shifting calculus will come out ahead.

Companies today are held accountable not only for delivering shareholder value, but for positively impacting stakeholders such as customers, suppliers, communities and employees. Moreover, investment in talent and other stakeholders is increasingly considered essential to delivering long-term financial results. These new expectations are reflected in the Business Roundtable's recently revised statement on corporate governance, which underscores corporations' obligation to support employees through training and education that help develop new skills for a rapidly changing world.

Millions of workers will need to be retrained or reskilled as a result of AI over the next three years, according to a recent IBM Institute for Business Value study. Technical training will certainly be a necessary component. As tasks requiring intellectual skill, insight and other uniquely human attributes rise in value, executives and managers will also need to focus on preparing workers for the future by fostering and growing people skills such as judgement, creativity and the ability to communicate effectively. Through such efforts, leaders can help their employees make the shift to partnering with intelligent machines as tasks transform and change in value.

As AI continues to scale within businesses and across industries, it is incumbent upon innovators and business leaders to understand not only the business process implications, but also the societal impact. Beyond the need for investment in reskilling within organizations today, executives should work alongside policymakers and other public and private stakeholders to provide support for education and job training, encouraging investment in training and reskilling programs for all workers.

Our research shows that technology can disproportionately impact the demand and earning potential for mid-wage workers, causing a squeeze on the middle class. For every five tasks that shifted out of mid-wage jobs, we found, four tasks moved to low-wage jobs and one moved to a high-wage job. As a result, wages are rising faster in the low- and high-wage tiers than in the mid-wage tier.

New models of education and pathways to continuous learning can help address the growing skills gap, providing members of the middle class, as well as students and a broad array of mid-career professionals, with opportunities to build in-demand skills. Investment in all forms of education is key: community college, online learning, apprenticeships, or programs like P-TECH, a public-private partnership designed to prepare high school students for new collar technical jobs like cloud computing and cybersecurity.

Whether it is workers who are asked to transform their skills and ways of working, or leaders who must rethink everything from resource allocation to workforce training, fundamental economic shifts are never easy. But if AI is to fulfill its promise of improving our work lives and raising living standards, senior leaders must be ready to embrace the challenges ahead.

View original post here:
AI Is Changing Work and Leaders Need to Adapt - Harvard Business Review

Read More..

Transitioning Your Enterprise Workload? Read This First – Forbes

It's a familiar story: IT leaders are desperately trying to solve the complex problems of their workloads. They are pressured to deliver better, faster performance and are scrambling to manage their workloads within budget. As they do this, IT departments must consider whether they want to fully transition their workloads to the public cloud or adopt a hybrid IT strategy. Before going one way or the other, understanding the differences will eliminate the common "boomerang" back and forth between both worlds.

Workload Data Storage: Which Location Is Best?

Every IT workload has different needs, costs and complexity profiles, meaning there is no such thing as a one-size-fits-all solution. As an organization begins to consider migrating its data to cloud-hosting environments, it should first conduct a formal, objective IT assessment to understand which workloads would be best hosted in private, public, hybrid or multicloud environments. Factors to consider when determining which workloads should go where include security and compliance, latency, cost and performance.

The factors that play into this decision will be different for every business. Companies that are planning to go or have already gone all-in on public cloud should ensure they have thoroughly evaluated the elements that will determine which workloads are best to remain in hyperscale environments and which can succeed in public atmospheres. Subject to enterprise needs, the benefits and drawbacks of both public cloud and hybrid include the following.

Public Cloud Pros

Collaboration: For businesses seeking a collaborative environment that provides continuous new workload service offerings, a multitenant managed public cloud environment may be the best option. The high capacity of these environments means this is an ideal environment for applications like web servers and websites, which are continually being modified.

Complex Capabilities: Performance needs will always be based on the type of workload you are storing. High-touch and mission-critical applications can thrive in public community clouds.

Seasonal Workloads (Scalability): With public cloud environments, it is easy to provide additional compute space for ever-growing workloads.

Flexibility: For environments where there is a need for rapid build and teardown of VMs (in other words, short life cycle workloads), public cloud provides great flexibility to customers without much lead time.

New Features And Functionality: Often, a customer would benefit from the constant influx of new and improved features in the public cloud.

Public Cloud Cons

High Cost: Public cloud structures often mean low front-end costs, but given the high levels of compute, memory, bandwidth and storage that workloads require, these costs can rise quickly. The failure to understand how these needs affect cost frequently results in monthly invoices that are higher than initially anticipated.

Low Privacy: Anytime you are storing private information, sensitivity is paramount. With public cloud infrastructure, the level of privacy desired is not always available.

Resource-Intensive Workarounds: Legacy systems often cannot run in cloud environments. If a business is looking to host a legacy system here, it will probably mean that it needs to implement intricate workarounds.

Low Control: Public cloud offerings can make it more challenging for businesses to maintain full administrative control of new and existing workloads in their IT environment.

Hybrid IT Pros

Low Cost: A business's success, at the end of the day, comes down to its bottom line. For companies that have a tighter budget, hybrid IT will likely be more beneficial than public cloud. The right hybrid IT application mix enables businesses to scale their infrastructure solutions and match the right cost model to each workload.

High Control: Companies that want to maintain control, integrate new tools, and monitor and troubleshoot their workloads should turn to a hybrid approach.

Performance Management And Scalability: For workloads that require extremely low latency, hybrid IT is more beneficial given the higher performance management and edge computing capabilities provided, especially for individuals who need direct access. Moreover, with hybrid IT, businesses can make quick resource additions or adjustments based on demand.

Flexible Compliance Options: Because of regulations like GDPR and HIPAA, it's critical for organizations to be ultra-aware of how and where they are storing their data so they can ensure compliance. Businesses often return to hybrid IT infrastructure because it offers them the flexibility to move and manage the data they want protected.

Hybrid IT Cons

Long Lead Time: Due to the nature of the solution, procuring, provisioning and delivering a hybrid IT environment may take a bit longer.

Complexity: With interwoven solutions comes complexity. With so many moving parts, finding a solution to an issue can sometimes be tricky.

Looking To The Future

Storing data at the edge is becoming increasingly important. Businesses realize the importance of decentralizing their workloads to get users the data that they need quickly and efficiently. From streaming content to processing data from IoT devices, ensuring that workloads are strategically located is essential. This can be best achieved through the flexibility of a hybrid IT environment, where workload storage is matched to location based on individualized needs.

But still, hybrid IT environments don't work for all workloads, especially those that can benefit from the extensive product catalogs and service offerings of public cloud environments.

Hybrid IT is best for enterprises that need value, a low tolerance for downtime, long life cycles, mission-critical platforms and cross-product integrations, all with the utmost attention to security and compliance. These requirements demand that multiple teams work in harmony with a standard set of tools and processes to follow, something that is achieved through best practices and continuous improvement.

As organizations look to implement their infrastructure environments, taking the time first to identify their workload needs will enable them to create an environment that best fits their business.

Continued here:
Transitioning Your Enterprise Workload? Read This First - Forbes

Read More..

Hybrid cloud – The business infrastructure of tomorrow – Techerati

Senior Vice President EMEA at Nutanix, Sammy Zoghlami explores recent research into multi-cloud and hybrid cloud adoption

Technology has become a core component of a customer's experience with an organisation, no matter the vertical market. As a result, businesses require technology that provides flexibility and security, and that cost-effectively allows the organisation to change according to shifting customer behaviour. A recent study, the Nutanix Enterprise Cloud Index 2019, reveals that hybrid cloud is becoming business infrastructure; indeed, hybrid cloud is providing the necessary security and agility that businesses in 2020 require.

Poor customer experience will damage 30% of digital business projects, according to analyst house Gartner, placing the technology team and its infrastructure choices right at the heart of the bottom line and brand perception of every business, every day. "Your business results depend on your brand's ability to retain and add customers," says Olive Huang, Research Director at Gartner.

Against a backdrop of an increased demand to meet the expectations of customers and to deliver new digital products and services, it is not surprising that the Enterprise Cloud Index 2019 shows that there will be significant adoption of hybrid cloud across the EMEA region over the next three to five years. Indeed, 53% of respondents stated a plan to adopt hybrid cloud by 2024 and 84% of EMEA respondents cited hybrid cloud as the ideal IT operating model.

With businesses requiring increasing levels of flexibility, the portability of hybrid cloud can deliver the same level of flexibility to an organisation's infrastructure. The Enterprise Cloud Index further demonstrates that customer-centric technology teams realise and value this portability. Nearly 20% of respondents cited interoperability as a key benefit of hybrid cloud and 16% said application mobility was a major benefit. As organisations require applications to scale up and down according to customer demands, the business requires an infrastructure that can move an application back and forth between private and public cloud with ease.

Promotions, major events and partnerships require an infrastructure that can scale up significantly to meet a marketing programme or scale back and reduce operational costs during a national holiday, for example. Over 10% of respondents therefore cited the ability to match the right cloud to the right application and the right use case as one of the benefits they were seeking from adopting hybrid cloud.

This same need for flexible and customer friendly infrastructure is driving a shift from multi-cloud to hybrid cloud, according to the respondents to the Enterprise Cloud Index. Over nine per cent of respondents are currently using multi-cloud environments and over 22% plan to be using multi-cloud in the next 12 to 24 months. However, as digital customer services mature, the study reveals that in three to five years, more than half (52%) of respondents will have moved to hybrid cloud and only 19% will continue with multi-cloud environments.

Interestingly, the Enterprise Cloud Index reveals a drop in hybrid cloud adoption during 2019 compared to what the same study for 2018 said was going to happen. 2019 witnessed an increase in traditional datacentre usage for desktop application hosting, CRM and ERP applications, databases, analytics, as well as backup and recovery. There was also a corresponding drop in the use of private cloud during this period. 2019 was the year that saw regulations such as the General Data Protection Regulations (GDPR) come into force and some corresponding major changes in operating models, which may be behind this trend.

The Enterprise Cloud Index reveals challenging approaches to security when it comes to the adoption of hybrid cloud, as well as some real business and, therefore, customer concerns.

Security was cited as one of the main ways hybrid cloud can benefit an organisation, but interestingly there remains a high level of concern that hybrid cloud decreases the security of an organisation. Nearly 20% of respondents believe hybrid cloud will increase data security and compliance in the organisation.

Hybrid cloud was said to be inherently secure by 26.5% of respondents in EMEA, two per cent lower than respondents in the Americas and three per cent lower than their peers in the Asia Pacific and Japan market. Hybrid cloud is seen as significantly more secure than public cloud by 8.7% of respondents and than multi-cloud by 7.7%, but not significantly more secure than an on-premise private cloud, a view backed by 20.4% of respondents.

The Enterprise Cloud Index highlighted the common concern of business technology leaders towards access to cloud and cybersecurity skills. Across all markets, organisations are finding these skills scarce.

Across EMEA, the Enterprise Cloud Index demonstrates that public cloud spending is exceeding budgets, according to 31.7% of respondents, with five per cent reporting they are majorly over budget. That said, it is clear that organisations looking for an infrastructure that gives them customer-centricity are looking at hybrid cloud as the method that will enable the flexibility they require.

As the Enterprise Cloud Index reveals, hybrid cloud is seen by business technology leaders as the way to provide organisations with the flexibility they require for today's ever-changing market dynamics. The results reveal that organisations expect agility and interoperability as customer demands flex. Old, rigid forms of infrastructure fail to meet the needs of businesses that are focused on their customers. Security remains a topic of debate in the modern enterprise, but as the Enterprise Cloud Index demonstrates, the infrastructure of tomorrow will be both secure and adaptable.

Read the original post:
Hybrid cloud - The business infrastructure of tomorrow - Techerati

Read More..

On the horizon: 57 meetups, conferences & networking events across NC in April – WRAL Tech Wire

Plenty of technology and life science events and deadlines are on tap for April, despite the spreading coronavirus. The schedule follows as of March 23.

If you're interested in events coming up in the immediate future, check out our two-part list of:

Also, check out our list of over 100 meetups in the Triangle.

These columns accompany our interactive calendar, along with a comprehensive resource package for startups in the Triangle.

Keeping up with WRAL TechWire's continued initiative to track events happening across North Carolina, here's a look at what's to come in April:

Held every month, 1 Million Cups Charlotte features a presentation from a local startup followed by a Q&A from the community. Free coffee is included.

Asheville's startup community meets weekly to hear presentations and support one another in continuing to grow.

1 Million Cups, presented by Kauffman, is a weekly informal pitch event for the startup community. Join for free coffee and entrepreneurial support as local startups deliver their presentations.

Bunker Labs Wilmington is celebrating the growth of the chapter over the past year. Join for food, drinks, prizes and networking with the local military community.

The Charlotte chapter of the Ellevate network is hosting a casual social for both members and non-members in Matthews.

This free event features six hardware startups pitching to local VCs and investors. The winner gets $3,000 cash and prizes, as well as a chance to win the $50,000 grand prize at the international finals in May.

At this meetup, John Tuders, UNCC professor and executive director at Skookum, will share his research on talent and decision-making in payments innovation.

Join an open tour of the Skookum office to get an introduction to the tech and ideas powering the community. The Skookum team will be available for any questions or comments participants may have. Lunch will be provided.

Held on a weekly basis, this Venture Café event series provides all sorts of programming for Piedmont Triad entrepreneurs and innovators. Every Thursday evening, the community gathers for networking, panel talks, workshops, presentations, product demos, interviews, and more.

This free monthly interactive webinar provides participants with an overview of NC TECH's activities, resources and member offerings.

This weekly meetup brings together developers, IT professionals and tech enthusiasts who are interested in the Google Cloud Platform.

Join this event to see first-hand software demonstrations from the latest cohort of engineers in Project Shift's immersive full-stack development program.

Join the Code for Chapel Hill meetup to network with like-minded individuals and work on civic hacking projects. Meetings are held every two weeks on Tuesdays.

Every month, a handful of Queen City-based companies headline Charlotte PitchBreakfast, pitching their products and services for a total of five minutes, after which they will answer questions from the audience and a panel of entrepreneurs and experts.

Asheville's startup community meets weekly to hear presentations and support one another in continuing to grow.

1 Million Cups, presented by Kauffman, is a weekly informal pitch event for the startup community. Join for free coffee and entrepreneurial support as local startups deliver their presentations.

Held on a weekly basis, this Venture Café event series provides all sorts of programming for Piedmont Triad entrepreneurs and innovators. Every Thursday evening, the community gathers for networking, panel talks, workshops, presentations, product demos, interviews, and more.

Bunker Labs Raleigh-Durham is hosting an event to build stronger connections across the veteran community. Hear insights from local entrepreneur Randy H. Nelson on his experience as a leader, mentor and community builder.

The Charlotte chapter of the Ellevate network is hosting a casual social for both members and non-members.

The Charlotte chapter of the Ellevate network is hosting a casual social for both members and non-members in Cornelius.

In this talk, Berryville Institute of Machine Learning Co-Founder Dr. Gary McGraw will use the story of static analysis for code review and its decade-long evolution as a driver for discussion. The presentation will cover startups, big companies, venture capital, research agencies and subject matter expertise.

This free bi-monthly event offers a space for local tech professionals to build connections and find potential job opportunities.

In this Demo Day event, Momentum Learning's seventh cohort of Immersive Web Development students will present the projects they've created during the 12-week full-stack web development program.

This weekly meetup brings together developers, IT professionals and tech enthusiasts who are interested in the Google Cloud Platform.

During this event, students will gather to pitch their ventures before a panel of judges, receive feedback and celebrate the end of their work at BLUE.

The Duke Incubation Fund provides funding for innovative projects that demonstrate potential for future financial support, company formation, licensing and not-for-profit partnering.

Code for Durham brings together technologists, designers, developers, data scientists, map makers and activists to collaborate on civic technology projects. Meetings are held every two weeks on Tuesdays. Pizza will be provided.

All Things Open is now seeking talks from technologists to be featured at the 2020 conference, set for October 18-20 in Raleigh.

This weekly event brings together entrepreneurs in the Wilmington and Cape Fear community to gather for coffee, casual startup pitches and conversation.

Asheville's startup community meets weekly to hear presentations and support one another in continuing to grow.

1 Million Cups, presented by Kauffman, is a weekly informal pitch event for the startup community. Join for free coffee and entrepreneurial support as local startups deliver their presentations.

On the third Wednesday of every month, the Queen City's entrepreneurial community joins together for an evening of networking and connections over drinks.

The City of Raleigh's Impact Partner Grant program provides funding for new programs and resources that aim to assist entrepreneurs and small businesses.

NC IDEA LABS is a four-week customer discovery program that helps idea-stage entrepreneurs take their first steps toward building a successful startup. The next cohort will run from May 26 to June 19.

In this webinar, the Institute for Emerging Issues is bringing together representatives from Service Year organizations, nonprofits and government agencies to discuss the innovative ways in which communities are using service years to tackle problems and add opportunities.

The Regional Transportation Alliance's Innovations and Solutions Forum series will highlight new innovations to improve roads and streets quickly and safely.

Held on a weekly basis, this Venture Café event series provides all sorts of programming for Piedmont Triad entrepreneurs and innovators. Every Thursday evening, the community gathers for networking, panel talks, workshops, presentations, product demos, interviews, and more.

This free monthly interactive webinar provides participants with an overview of NC TECH's activities, resources and member offerings.

This weekly meetup brings together developers, IT professionals and tech enthusiasts who are interested in the Google Cloud Platform.

NC TECH's Government Vendor Network is a forum for member companies who are interested in doing business with state government.

At Future Fund 10 LIVE, 10 nonprofits will pitch innovative program ideas, competing for over $40,000 in grants and cash prizes.

Join the Code for Chapel Hill meetup to network with like-minded individuals and work on civic hacking projects. Meetings are held every two weeks on Tuesdays.

This weekly event brings together entrepreneurs in the Wilmington and Cape Fear community to gather for coffee, casual startup pitches and conversation.

Asheville's startup community meets weekly to hear presentations and support one another in continuing to grow.

1 Million Cups, presented by Kauffman, is a weekly informal pitch event for the startup community. Join for free coffee and entrepreneurial support as local startups deliver their presentations.

For the third cycle of its Flash Grant program, the North Carolina Biotech Center is searching for innovative projects aiming to address the global coronavirus (COVID-19) outbreak, precision health, and digital and data-driven life science technologies.

Held on a weekly basis, this Venture Café event series provides all sorts of programming for Piedmont Triad entrepreneurs and innovators. Every Thursday evening, the community gathers for networking, panel talks, workshops, presentations, product demos, interviews, and more.

The NC Sustainable Energy Association is hosting a discussion on North Carolina's Clean Energy Plan and what it means for the state's future.

This free bi-monthly event offers a space for local tech professionals to build connections and find potential job opportunities.

This weekly meetup brings together developers, IT professionals and tech enthusiasts who are interested in the Google Cloud Platform.

Code for Durham brings together technologists, designers, developers, data scientists, map makers and activists to collaborate on civic technology projects. Meetings are held every two weeks on Tuesdays. Pizza will be provided.

Bring your ideas and opinions to the next Midtown Techies meetup. Events are held on the last Tuesday of every month.

This weekly event brings together entrepreneurs in the Wilmington and Cape Fear community to gather for coffee, casual startup pitches and conversation.

Asheville's startup community meets weekly to hear presentations and support one another in continuing to grow.

1 Million Cups, presented by Kauffman, is a weekly informal pitch event for the startup community. Join for free coffee and entrepreneurial support as local startups deliver their presentations.

Join an open tour of the Skookum office to get an introduction to the tech and ideas powering the community. The Skookum team will be available for any questions or comments participants may have. Lunch will be provided.

Held on a weekly basis, this Venture Café event series provides all sorts of programming for Piedmont Triad entrepreneurs and innovators. Every Thursday evening, the community gathers for networking, panel talks, workshops, presentations, product demos, interviews, and more.

Continue reading here:
On the horizon: 57 meetups, conferences & networking events across NC in April - WRAL Tech Wire

Read More..

How cloud providers are changing the outlook for IoT data and analytics management – Cloud Tech

CIOs and CTOs are exploring new ways to extract insights from their enterprise data assets with analytics tools. While they continue to invest in on-premises solutions, they're also looking to public cloud service providers.

As cloud computing providers grow their footprint in the Internet of Things (IoT) value chain, their investments in data and analytics services are accelerating.

Based on the review of cloud service provider offerings, recent acquisitions, and the competitive outlook, ABI Research now forecasts that cloud suppliers will grow their share of IoT data and analytics management revenues from $6 billion in 2019 to $56 billion in 2026.

While the growth is impressive, cloud vendor services today are focused on data management complemented by a generic analytics toolset. As a result, cloud computing vendor revenues come primarily from streaming, storage, and the orchestration of data.

In contrast, most analytics service offerings across cloud vendors are less differentiated, as reflected in pre-built templates -- such as AWS Sagemaker and Microsoft Azure Notebooks -- which leverage the Project Jupyter open-source software, standards and services initiative.

Considering that many cloud vendors are in the early stages of their analytics investment, they are relying on their specialized channel partners for addressing more specific 'advanced analytics' and vertical market needs.

"The overall approach shown by cloud suppliers in their analytics services reflects the dilemma they face in the complex IoT partnership ecosystem," says Kateryna Dubrova, analyst at ABI Research. "Effectively, do they rely on partners for analytics services, or do they build analytics services that compete with them?"

Interestingly, streaming is the one analytics technology that all cloud vendors are building into their solution portfolios to blend data management with near-real-time analytics on streamed IoT data.

Companies such as AWS, Microsoft, Google, IBM, and Oracle, for example, are promoting their proprietary streaming solutions to differentiate, accelerate time-to-market, and win over customers.

According to the ABI assessment, companies including Cloudera, Teradata, and C3.ai are introducing streaming analytics services that are reliant upon open-source technology, such as Spark and Flink.
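As a rough, vendor-neutral illustration of the streaming pattern these services build on, the sketch below uses Spark Structured Streaming to compute per-device averages over one-minute windows. It uses Spark's built-in rate source rather than a real IoT broker, and the device-id derivation is invented for the example; it is not any particular vendor's offering.

```python
# Generic Spark Structured Streaming sketch (illustrative only, no vendor API).
# The built-in "rate" source lets it run without external infrastructure; a
# real IoT pipeline would read from Kafka, Kinesis, or a similar broker.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("iot-streaming-sketch").getOrCreate()

# The rate source emits (timestamp, value) rows; derive a fake device id from them.
readings = (
    spark.readStream.format("rate").option("rowsPerSecond", 10).load()
    .withColumn("device_id", col("value") % 5)
)

# Near-real-time aggregation: average reading per device over 1-minute windows.
averages = (
    readings.groupBy(window(col("timestamp"), "1 minute"), col("device_id"))
    .avg("value")
)

query = averages.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```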

However, by choosing to focus on data management and streaming technologies, cloud vendors are ceding the advanced analytics market to other suppliers. That emerging market is an example of the 'coopetition' in the IoT ecosystem, where cloud vendors partner with advanced analytics experts.

This vendor coopetition enables them to promote an end-to-end IoT technology stack. For example, Azure and AWS have partnered with Seeq to leverage its advanced analytics capabilities. Other vendors, such as Oracle, Cisco, and Huawei, are pushing intelligence and analytics closer to the devices, expanding their edge computing portfolios.

Such divergent analytics strategies represent the reality and challenges for serving a very diverse IoT ecosystem with IoT analytics services.

"Ultimately, businesses are moving to an analytics-driven business model which will require both infrastructure and services for continuous intelligence. Cloud vendor strategies need to align with this reality to take advantage of analytics value and revenues that will transition to predictive and prescriptive solutions," Dubrova concludes.

There is a significant upside opportunity for vendors that are exploring the numerous applications for IoT analytics services. Solutions will include both on-premises IT infrastructure and cloud offerings.


Read more from the original source:
How cloud providers are changing the outlook for IoT data and analytics management - Cloud Tech

Read More..

Amazon and Microsoft join White House team to unleash high-performance computing on COVID-19 – GeekWire

Among the high-performance computing resources that will be made available for coronavirus research is Oak Ridge National Laboratory's Summit, the world's fastest supercomputer. (ORNL Photo)

Less than a week after the White House's Office of Science and Technology Policy organized a consortium to focus the power of artificial intelligence on addressing the coronavirus outbreak, another tech team is joining the fight, this time armed with supercomputers and the cloud.

The COVID-19 High-Performance Computing Consortium includes the Seattle area's powerhouses of cloud computing, Amazon Web Services and Microsoft, as well as IBM and Google Cloud.

There are also academic partners (MIT and Rensselaer Polytechnic Institute), federal agency partners (NASA and the National Science Foundation) and five Department of Energy labs (Argonne, Lawrence Livermore, Los Alamos, Oak Ridge and Sandia).

Among the resources being brought to bear is the world's most powerful supercomputer, the Oak Ridge Summit, which packs a 200-petaflop punch.

"America is coming together to fight COVID-19, and that means unleashing the full capacity of our world-class supercomputers to rapidly advance scientific research for treatments and a vaccine," Michael Kratsios, the White House's chief technology officer, said in a news release.

The research projects supported by the consortium are expected to range from studies of the SARS-CoV-2 virus's molecular makeup, to the bioinformatics behind the workings of the virus, to the epidemiology behind COVID-19's spread and the strategies for stopping it. The common thread is the need for high-performance computing resources to get a handle on the complex challenges of such studies.

Microsoft will be providing grants to researchers through its AI for Health program, to ensure additional access to Azure Cloud and the company's high-performance computing capabilities. AI for Health's data science experts will also make themselves available to collaborate on COVID-19 research projects.

"We want to make sure researchers working to combat COVID-19 have access to the tools they need," said John Kahan, Microsoft Global AI for Health lead.

Amazon Web Services is offering research institutions and companies technical support and promotional credits for the use of AWS services to advance research on diagnosis, treatment and vaccine studies relating to the coronavirus and its effects.

"We're proud to support this critical work and stand ready with the compute power of AWS to help accelerate research and development efforts," Teresa Carlson, vice president for AWS Worldwide Public Sector, said in a statement.

Similar access to high-performance computing resources is being made available to researchers via Google Cloud HPC, the IBM Research WSC Cluster, NASA's High-End Computing Capability, Rensselaer's AIMOS supercomputer system, the MIT/Massachusetts Green HPC Center, NSF's Office of Advanced Cyberinfrastructure (which can provide access to Frontera, the fastest supercomputer deployed on a U.S. academic campus) and the five national labs.

To take advantage of those resources, researchers should submit a simple project proposal via the consortium's website. A steering committee will review proposals for potential impact, feasibility, resource requirements and timeline.

Each proposal that's selected will be matched up with computing resources from one of the consortium's members. That member will then get in contact with the research team to discuss the process for obtaining access to the resources.

Researchers making their proposals should expect to produce a regularly updated blog of their activities during the course of their work, and have publishable results come out of their efforts.

Microsoft is also part of the team behind the COVID-19 Open Research Dataset, or CORD-19, which is offering an AI-enabled database to provide faster, surer, wider-ranging access to coronavirus research.

View post:
Amazon and Microsoft join White House team to unleash high-performance computing on COVID-19 - GeekWire

Read More..